A college is removing its vending machines after a student discovered they were using facial recognition technology
A photo shared on Reddit showed one of the vending machines with an error code suggesting it used facial recognition tech.

  • ohto@lemmy.sdf.org · ↑119 · edited · 10 months ago

    Based on the quotes from the vending company, at first I thought this was just a dumb way to detect when a human is standing there. But it’s worse than that.

    So first we get this from a company representative:

    The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface

    Ok, fine. Overkill, but fine. But then their company’s FAQ tells us this:

    only the final data, namely presence of a person, estimated age and estimated gender, is collected without any association with an individual.

    So they ARE collecting data, and they are trying to obfuscate that fact by saying they are just “activating the purchasing interface”. This isn’t just turning on a lighted display when a person is standing there. “Activating the purchasing interface” means activating the algorithms to analyze my appearance. They are trying to figure out who is buying their product. That’s different.

    So they are being shady about their true intentions. They aren’t being up front, and they expect us to trust that they aren’t storing or transmitting anything other than estimated age and sex. Hmm, maybe. But their actions don’t build trust.

    Plus, now I have to worry about VENDING MACHINES getting hacked and being used as surveillance devices too?? Can I just buy a candy bar without being reminded we live in a dystopia?

    • douglasg14b@lemmy.world · ↑13 ↓10 · 10 months ago

      To be fair that is perfectly valid and benign data to collect to determine what demographics use your service.

      I’m sure this will be controversial on here, but when you build a service or a device it’s usually pretty valuable to know who uses it, in order to determine what features to work on next or how to change it.

      Of course, the potential for them to abuse it is quite high, and it would be difficult to trust them not to transmit more information than they’re supposed to.

      • JJROKCZ@lemmy.world · ↑32 · 10 months ago

        You don’t need to add features to a vending machine, though; it just needs to take currency in exchange for snacks and drinks. Any metrics of what sells better or worse can be gathered by watching inventory, like we’ve done for all of commercial history. They’re overcomplicating this for no valid reason.

        • nymwit@lemm.ee · ↑14 ↓1 · 10 months ago

          It’s not just what sells, but who buys what. “Demographic X buys this one product more than others so how can we advertise this product to them where they will see it?” Growth is their “valid” reason, you know, like malignant cancer cells.

        • Cringe2793@lemmy.world · ↑2 ↓2 · 10 months ago

          There is a valid reason. It’s easier. You don’t need to get a person to watch/count/analyse the inventory.

            • Cringe2793@lemmy.world · ↑1 · 10 months ago

              Yeah I think you’re just paranoid. Maybe you need to understand how this works better before making a snap judgement. No personally identifying pictures are stored. What’s wrong with a bit of market research? Knowing which (estimated) gender buys what is not that big a deal.

              Y’all can downvote me and argue till the cows come home, but y’all are just paranoid. I see nothing wrong with this.

      • NaibofTabr@infosec.pub · ↑13 · 10 months ago

        I think a big question here is whether or not this feature was disclosed to the university when the machines were installed. It’s one thing for the university to place its own security cameras that it has control over, but if a third party is placing surveillance devices on the property, they should be giving very clear written notice.

      • OpenStars@startrek.website · ↑11 · edited · 10 months ago

        Ofc it could have been benign, but there is no evidence that it was, while conversely everything that we currently know points to a breach of ethics.

        One, they did not fully disclose that a camera was even there (unless I am mixing up this story with another one just like it?). That also makes it impossible to…

        Two, they did not obtain proper (or any) consent. A banking ATM that needs to use your face to verify your identity could be an example of a benign use, and, setting aside the enormous potential security implications of that for the moment, it could do so with pop-ups on the screen: "Do you consent to having your face observed?", "Do you consent to storage of your facial data in our database?", "Do you consent to us selling the marketing data we collect from analysis of your facial data?". They did none of this.

        Three, when asked about it, they lied. Technically they obfuscated the truth, which is just another way of stating that they lied.

        Ofc it COULD have been benign, but so far they are zero out of three already towards that end - and that is even from just what we know so far.

      • LifeInMultipleChoice@lemmy.world · ↑4 ↓1 · 10 months ago

        Many of these were also set up in areas where there are no employees. While some may be "vending machines" by definition, many colleges and work sites installed areas with food and drink items set out on shelves, where you grab what you want, self-checkout on your own, and walk off. Cameras and recognition of who is taking items without paying have been the regulating power since they were set up. Many don’t accept cash; you use a card, your phone, or even an account tied to your fingerprint to grab a banana/cookies/Gatorade/iced coffee/whatever and pay quickly. Knowing who you were was used to balance their costs against the number of lost or stolen items.

        It is foolish to think they weren’t identifying individuals, but it would still be wrong to sell the data.

      • ohto@lemmy.sdf.org · ↑2 · 10 months ago

        I agree this is a legitimate goal. I guess I’m just thinking they need to be transparent about it. The representative should be clear about what they are doing and not insinuate they are only detecting the presence of a human and nothing more. They probably should even have a sign on the machine to notify people they are being recorded. When I get into my Ford Escape, the touch screen tells me I’m supposed to notify my passengers of privacy concerns because I have location services turned on. This sort of privacy notification seems standard these days.

  • Swordgeek@lemmy.ca · ↑82 ↓1 · edited · 10 months ago

    I’m surprised nobody has discussed the most obvious “marketing” use of this data: Differential pricing.

    Someone walks up to the machine. Based on the image seen by the machine, they determine which product is most likely to sell, and bump that product’s price up by a quarter or 50 cents.

    If they’re not doing it now, they’re preparing to do it in the near future.

    EDIT If you watch Invenda’s marketing videos, they talk about how the ‘optical sensor’ provides a ‘bespoke purchasing experience.’

    Sounds exactly like dynamic pricing is their model.
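
    The logic wouldn’t even need to be sophisticated. A purely speculative sketch of what demographic price-bumping could look like (nothing here is from Invenda’s code; the lookup table and prices are invented):

        # Speculative: nudge up the price of whatever the estimated demographic
        # is most likely to buy. BASE_PRICES and LIKELY_PICK are invented examples.
        BASE_PRICES = {"cola": 2.00, "energy_drink": 3.00, "granola_bar": 1.75}
        LIKELY_PICK = {("18-24", "male"): "energy_drink",
                       ("25-34", "female"): "granola_bar"}

        def display_prices(est_age, est_gender, markup=0.50):
            prices = dict(BASE_PRICES)
            target = LIKELY_PICK.get((est_age, est_gender))
            if target:
                # Bump only the product this shopper is predicted to want.
                prices[target] = round(prices[target] + markup, 2)
            return prices

        print(display_prices("18-24", "male"))
        # {'cola': 2.0, 'energy_drink': 3.5, 'granola_bar': 1.75}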

    • NegativeInf@lemmy.world · ↑15 · 10 months ago

      Alternatively, I have seen the vending machines with giant screens on the front that play ads for different drinks. Perhaps they will use it for advertising decisions as well.

      • jabjoe@feddit.uk · ↑1 · 10 months ago

        It’s not either/or. It can easily be both, plus selling whatever data they can on top of that.

    • Myrbolg@lemmy.world · ↑2 · 10 months ago

      That’s my assumption as well. Man in a suit checks out the machine? Bump it up. Couple checks out? Bump it up.

  • dhork@lemmy.world · ↑80 ↓2 · edited · 10 months ago

    On the one hand, I can totally understand that there is a difference between recognizing a face and recognizing your face. Algorithms that recognize a face are really easy to implement now.

    On the other hand, though, why should a vending machine need to recognize a face? So it shuts off its lighting when no one is looking at it? I’m not sure there is any practical benefit besides some project manager justifying a new feature with buzzword-compliant tech.

    I believe the company when they say there is nothing problematic here, but they deserve the bad press for thinking it would be a good idea in the first place.

    • OpenStars@startrek.website · ↑77 · 10 months ago

      Their corporate website mentions that they use the data for marketing purposes. Whatever type of face they see - e.g. male or female, large or skinny, etc. - gets correlated with what was purchased, and then they sell that data for marketing purposes. Exactly like Google selling your search history, except with likely fewer restrictions in place.

      Their website doesn’t mention how often they get hacked and give away that data for free - to be clear, that data meaning A PICTURE OF YOUR ACTUAL FUCKING FACE. I don’t know at what resolution, or even what someone would do with it later; I am focusing here on the fact that the picture-taking seems nonconsensual, especially if it’s stored in a database rather than simply used in the moment.

      • dhork@lemmy.world · ↑9 ↓4 · edited · 10 months ago

        They claim to be GDPR compliant, and while I am not an EUian, I think if that claim is accurate, they can’t be doing any of those things you mention.

        My point is, even if we take them at their word that the facial recognition is benign, it was still a dumb choice.

        • mosiacmango@lemm.ee · ↑13 · 10 months ago

          GDPR only applies in the EU, and this happened in Canada. They may actually be GDPR compliant in Europe, but have they stated whether they are following those laws where they aren’t legally required to?

          • dhork@lemmy.world · ↑5 · 10 months ago

            Most companies that sell worldwide won’t bother developing one set of firmware that is GDPR compliant for the EU and another set for the rest of the world, unless there’s an explicit business reason to do so. So when they replied to this incident in Canada with their GDPR status, I took it as implied that they have only one codebase, which is GDPR compliant, and they ship it in Canada not because they have to but because it’s all they have.

            • mosiacmango@lemm.ee · ↑4 · edited · 10 months ago

              That assumption is exactly what they are hoping for, and exactly the problem. They say they adhere to the GDPR, but not that they adhere to it everywhere, regardless of legal requirement. If they did adhere to its requirements everywhere, it would be an easy thing to state.

              The article has comments from the manufacturer and from the company that stocks the machines, and both state that they don’t take or store pictures, but are purposely vague about what data they do take and store. I expect this is because it’s still a creepy level of information about their customer base, and another revenue stream they exploit.

      • Vlyn@lemmy.zip · ↑5 · 10 months ago

        A PICTURE OF YOUR ACTUAL FUCKING FACE

        That’s not how this works. The most likely implementation is taking a picture of your face, letting the algorithm run (which estimates whether you’re male or female and roughly how old), and then throwing the picture away. The actual collected data is anonymous, so if they do that it might even be GDPR compliant in the EU (otherwise they’d be breaking several laws).

        There really is no value in keeping a picture of your actual face; it’s just trouble waiting to happen.
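
        Roughly, a pipeline like the one they describe would look something like this - a minimal sketch of my reading of their claims, not Invenda’s actual code; the demographic model call is a stand-in:

            # Sketch: detect a face, estimate coarse demographics, keep only the summary.
            # estimate_age_gender() is a hypothetical stand-in for an on-device model.
            import time
            import cv2  # OpenCV, assumed available on the machine

            face_cascade = cv2.CascadeClassifier(
                cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

            def estimate_age_gender(frame, box):
                return "25-34", "male"  # placeholder buckets a real model would produce

            def sense_once(camera):
                ok, frame = camera.read()
                if not ok:
                    return None
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
                if len(faces) == 0:
                    return None
                age, gender = estimate_age_gender(frame, faces[0])
                # Only this summary survives; the frame itself is discarded.
                return {"ts": int(time.time()), "presence": True,
                        "est_age": age, "est_gender": gender}

        Whether the image really is discarded is exactly the part we have to take on trust.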

    • ohto@lemmy.sdf.org · ↑18 · 10 months ago

      They need to recognize a face because they explicitly state in their FAQ they are estimating purchasers’ age and sex. This isn’t just adjusting lighting. I would not be so quick to say there is nothing problematic here. I’m highly skeptical.

      • Cringe2793@lemmy.world · ↑1 ↓2 · 10 months ago

        And yet, you’re quick to jump to the conclusion that there is something problematic? I don’t really see anything wrong with this. It’s not personal information. It’s demographics.

    • bionicjoey@lemmy.ca · ↑17 · 10 months ago

      Of note, it’d be pretty easy to push an OTA software update to take it from recognizing a face to recognizing your face.
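
      To illustrate how small that jump is, a rough sketch (face_recognition is a real Python library; the enrolled database and threshold here are hypothetical):

          # Same camera, same pipeline - recognition is just one extra step:
          # an embedding model plus a lookup table of enrolled faces.
          import face_recognition
          import numpy as np

          enrolled = {}  # name -> stored 128-d face embedding (the "update" ships this)

          def identify(image_path, tolerance=0.6):
              image = face_recognition.load_image_file(image_path)
              encodings = face_recognition.face_encodings(image)
              if not encodings:
                  return None                        # detection-only firmware stops here
              probe = encodings[0]
              for name, known in enrolled.items():   # recognition adds only this loop
                  if np.linalg.norm(known - probe) < tolerance:
                      return name
              return "unknown face"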

      • the post of tom joad@sh.itjust.works · ↑3 · 10 months ago

        Then of course comes linking your card/phone to your face. Maybe you can get a text message reminding you that you ate one this time last week and that "you’re not yourself when you’re hungry."

  • pr06lefs@lemmy.ml · ↑34 · 10 months ago

    Lesson learned: don’t name your surveillance tool EvilFaceRecognition.exe

    • clif@lemmy.world · ↑16 · 10 months ago

      We got a phishing campaign at work a while back with an attachment named "OktaAccountStealer.pdf".

      … I was impressed. What I really want to know is how many people opened it anyway.

      • dhork@lemmy.world · ↑8 · 10 months ago

        They do shit like that on purpose. Someone who is aware enough to read the names of attachments probably won’t fall for the rest of their scam. It’s a filter to make sure they don’t waste their effort on anyone other than the most gullible.

    • Gork@lemm.ee · ↑5 · 10 months ago

      Motion sensor is all well and good until they start to implement gait recognition and sell that data as well.

      It’ll soon know you by your sick dance moves.

  • nymwit@lemm.ee · ↑22 · 10 months ago

    How about some consent and payment for my info? Put a swingy peephole cover over the camera. Offer a discount if the machine can take a picture of you. Oh, that’s right, it’s only worth something when you amass a ton of the data. 0.004 cents off isn’t that appealing, is it?

  • TherouxSonfeir@lemm.ee · ↑13 ↓2 · 10 months ago

    As facial recognition becomes easier and cheaper, you are going to find it in all sorts of things, from your refrigerator to your child’s toy. Better get used to it, because that is the future. We should all invest in stickers to place on all the cameras, or guillotines… for the tech ceos.

    • demonsword@lemmy.world · ↑11 · 10 months ago

      We should all invest in stickers to place on all the cameras, or guillotines… for the tech ceos

      why not both

  • Rentlar@lemmy.ca · ↑10 · edited · 10 months ago

    I wasn’t against the idea of using facial recognition (age/gender estimation) to do things like recommend products when I first saw it in Japan on train station vending machines.

    The difference is that in my example the camera was in a very obvious spot; here it seems the machine was covertly collecting facial recognition data, and users weren’t aware it was happening until a student noticed this error.

    • OpenStars@startrek.website · ↑3 · 10 months ago

      I wouldn’t mind if, say, I told my phone to send out a signal like "I am a man, but I like cold tea". Partnering with the machine to help me buy something I will enjoy is truly helpful.

      Consent makes all the difference in the world.

  • SuperSynthia@lemmy.world · ↑10 ↓1 · 10 months ago

    It’s funny how much I love cyberpunk fiction but how much I hate cyberpunk reality. Now if the vending machine becomes sentient? Then we are good, until then I guess fuck these guys?

    • dhork@lemmy.world · ↑2 · 10 months ago

      I’ll gladly welcome the sentient machines if they can make me a sandwich now and then

  • ProxyZeus@lemmy.world · ↑7 · edited · 10 months ago

    Glad the college I went to is too cheap for these fancy things. Their vending machines are just barely smart enough to use tap to pay, and by barely I mean it sometimes doesn’t even work.

  • Desistance@lemmy.world · ↑7 · edited · 10 months ago

    A solution in search of a problem. They didn’t need that tech for a payment interface.

    • Swordgeek@lemmy.ca · ↑2 ↓1 · 10 months ago

      No, but they need it to dynamically adjust pricing based on customer demographics.

      • Cringe2793@lemmy.world · ↑3 ↓1 · 10 months ago

        Do they actually adjust pricing dynamically? Or is that just fearmongering over technology?

  • AutoTL;DR@lemmings.world [bot] · ↑6 · 10 months ago

    This is the best summary I could come up with:


    A university in Canada is expected to remove a series of vending machines from campus after a student discovered a sign that they used facial recognition technology.

    “The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface — never taking or storing images of customers.”

    “The software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.”

    Representatives for the University of Waterloo, Invenda Group, Adaria Vending Services, and Mars did not respond to Business Insider’s request for comment sent over the weekend ahead of publication.

    Facial recognition technology on college campuses is an ongoing tension point for students and staff members, with examples popping up globally.

    Tensions heightened in March 2020 when students at dozens of US universities protested facial recognition on college campuses, The Guardian reported.


    The original article contains 610 words, the summary contains 159 words. Saved 74%. I’m a bot and I’m open source!

  • ArbitraryValue@sh.itjust.works · ↑3 ↓11 · 10 months ago

    Honestly, what’s the big deal? Your face is not secret and anyone who feels like it can photograph you while you’re out in public. Vending machines already know who you are if you use a credit card.

    However, this is a good reminder to programmers: customers might sometimes see your error messages even if you didn’t intend them to. Don’t write anything Marketing wouldn’t like.
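
    The standard pattern is to keep the ugly details in an internal log and show the customer something bland. A generic sketch (names are illustrative, not from any particular vendor’s code):

        # Log the real error privately; display a neutral message on the kiosk screen.
        import logging

        log = logging.getLogger("vending.kiosk")

        def run_screen(update_fn, show_message_fn):
            try:
                update_fn()
            except Exception:
                # Full traceback goes to the internal log, not the customer display.
                log.exception("screen update failed")
                # Nothing here hints at "FacialRecognitionApp" or anything else internal.
                show_message_fn("Temporarily unavailable. Please try another machine.")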