Silly Google's Photos app labelled black people as gorillas

Google's new Photos software automatically labelled images of black people as "gorillas". The ad giant has since apologised. Mountain View's hugely embarrassing blunder comes just one month after it launched its cloud-hosted photo storage service, and made a big deal out of its machine-learning features. Google also warned …


  1. Anonymous Coward

    Maybe

    Google should hire a black employee or two, for a change. This is what you get when there are no black folks working there to test this kind of rubbish app on.

    1. Anonymous Coward

      Re: Maybe

      Perhaps they did it deliberately in order to get massive free publicity coverage for their product?

    2. Anonymous Coward

      Re: Maybe

      I can't speak for Google's diversity policy, but I'm fairly sure there's a large enough sample of pictures online for almost every ethnicity you care to mention.

    3. Meerkatjie

      Re: Maybe

      Like the time we had to test an off-the-shelf IVR system that failed to recognise female voices. Was a bit of a problem for us since the majority of our callers were female.

    4. Anonymous Coward

      Re: Maybe

      > This is what you get when there are no black folks working there to test this kind of rubbish app on.

      Awesome ... consulting insult analysts.

      "Hey Pete, c'mere. Are you insulted by this."

      "Nope"

      "And this?"

      "No."

      "How about this?"

      "Err ... nope."

  2. John Robson Silver badge

    Question?

    "However, the question has to be asked: why did Google release such a half-baked app for showtime in the first place?"

    That's not a question - they needed test subjects...

    1. ratfox

      Answer

      It is not half baked. It probably works as it should 99.9999% of the time. However, when you have hundreds of millions of users, that still means hundreds of cases that are wrong.

      It's like saying Google Maps is half baked because there is a street that is missing in your town. It's still damn useful.
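
      (Rough arithmetic, with assumed numbers rather than anything Google has published: a 99.9999% hit rate is a one-in-a-million miss rate, so across, say, 300 million users that still works out to roughly 300 wrong labels, before you even count multiple photos per user.)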

      1. Jeffrey Nonken

        Re: Answer

        Ratfox, thank you for expressing thoughtful conjecture instead of just bashing Google.

    2. 142

      Re: Question?

      And indeed, it's not even half baked!

      By all accounts, the ratio of correct categorisation to mistakes is extremely good for an image detection system of this sort.

    3. Anonymous Coward

      Re: Question?

      "That's not a question - they needed test subjects..."

      Yep, we are all Google's Software Quality Assurance and Test (SQuAT) team.....

      Anon - I know of one organisation that really is thick enough to have a software test team with that name.

      1. Charles Osborne

        Re: Question?

        Software Quality Uh...ssurance And Test?

    4. Matt Bryant Silver badge
      Facepalm

      Re: John Robson Re: Question?

      Welcome to the World of Agile! Letting users find your embarrassing bugs in the name of "flexibility".

    5. Anonymous Coward

      Re: Question?

      Test "subjects" or test "monkeys"?

    6. Charles Manning

      Is the software really broken?

      The whole point of machine learning software is that it gets fed input, does a classification and generates output.

      The software is not broken, it just has not been fed with good data. It clearly needs more black people in its learning set so it can tell the difference between a gorilla and a black person. This is no different from the recent NSFW classification by, IIRC, FB that classified pictures of girly bits as butterflies.

      But no, the numpties think there's racist software that goes

      if (image_property.black_face) printf("gorilla.\n");

      These classifications come from what people type in. As black people can call other black people "nigger" without the PC alarms going off, we'll also see these classification engines generate outputs like "nigger", "bro"... and no doubt the technically illiterate will think Google added more code that says

      if (image_property.black_face) printf("nigger");
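
      A minimal sketch of that point, as toy Python rather than anything Google actually runs: a nearest-centroid classifier whose only possible output labels are the ones present in its training examples. The feature numbers below are invented purely for illustration.

      import numpy as np

      def train(examples):
          """examples: list of (feature_vector, label) pairs. Returns label -> centroid."""
          centroids = {}
          for label in {lbl for _, lbl in examples}:
              vectors = np.array([vec for vec, lbl in examples if lbl == label])
              centroids[label] = vectors.mean(axis=0)
          return centroids

      def classify(centroids, vec):
          """Return the trained label whose centroid lies nearest to vec."""
          return min(centroids, key=lambda lbl: np.linalg.norm(centroids[lbl] - vec))

      # Hand-made, hypothetical "features": the training set under-represents part
      # of the range of human faces, so nearby inputs fall into the wrong class.
      training = [
          (np.array([0.9, 0.8]), "person"),   # stand-in for the face photos it was fed
          (np.array([0.8, 0.9]), "person"),
          (np.array([0.2, 0.1]), "gorilla"),  # stand-in for gorilla photos
          (np.array([0.1, 0.2]), "gorilla"),
      ]
      model = train(training)
      print(classify(model, np.array([0.4, 0.4])))  # nearer the "gorilla" centroid: mislabelled

      There is no printf-style special case anywhere; whatever "gorilla" means to the model was put there entirely by the examples it was shown.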

  3. A Non e-mouse Silver badge
    Holmes

    Testing

    ...the question has to be asked: why did Google release such a half-baked app for showtime in the first place?

    Because in the agile web 2.0 world, you are the alpha & beta testers.

    1. Mark 85

      Re: Testing

      I think it goes back further than the web... many companies ran the risk assessment and decided to let their customers/users do their testing. It's just taken the software companies time to figure out how in-house testing affects the bottom line. The difference is that, generally, software won't kill or maim people, compared to, say, a car or other piece of equipment.

  4. Irongut

    "However, the question has to be asked: why did Google release such a half-baked app for showtime in the first place?"

    Come on Kelly, you've been in IT journalism for long enough to know the answer to that question. Everything Google does is a half-baked "beta" that may be cancelled at any time and with minimum notice, even services that no longer have the beta tag, like GMail.

  5. Anonymous Coward

    You're not really going to put much development time into what is not classed as your target audience. Which is wrong on quite a few levels.

  6. Bob Wheeler

    AI is hard

    How do you define an algorithm to describe a chair, something to sit on? Is that algorithm good enough to correctly distinguish a dining table chair, a stool, a sofa and a park bench?

    1. Turtle

      @Bob Wheeler Re: AI is hard

      "How do you define an algorithm to describe a chair, something to sit on. Is that algorithm good enough to correctly distinguish a dining table chair, a stool, sofa, a park bench?"

      That's a good question. And what's particularly interesting is that scientists do not even know how the human mind is able to distinguish the incredible variety of things that are subsumed under the heading "chairs".

      Because intelligence of any sort, artificial or otherwise, is hard.

    2. Anonymous Coward

      Re: AI is hard

      Does that mean that white people are polar bears? What about people after a night on the town, are they pandas? AI is hard. I'd understand if Cruella was identified as a skunk.

      1. JoshOvki

        Re: AI is hard

        I wouldn't be offended if I was tagged as a panda. Probably be fairly accurate actually.

        1. Anonymous Coward

          Re: AI is hard

          Question: Why is being called a gorilla an insult? What's wrong with gorillas?

          And the AI appeared to do a reasonable job in identifying that the image was indeed an animal, with a black face, eyes, mouth and nose... The mistake was just the type of animal.

          So everyone decides that this should be treated as racist, or as an insult. C'mon folks, the problem lies with the meatbags, not with the AI.

          1. Mark 85

            Re: AI is hard

            Well... there's a whole lot of racial slurs involving apes and monkeys that still show up in the US. So yes, I can see where it's an insult. Unintentional from an AI standpoint, but... unintended consequences and all that.

            1. TeeCee Gold badge
              Thumb Up

              Re: AI is hard

              Unintentional from an AI standpoint...

              Yup, the important bit about an insult is the intent. A "perceived insult" isn't actually an insult at all, just a misunderstanding.

              Trouble is that the legal system tends to reward people for acting pigshit-thick and misunderstanding as much as possible.

          2. Graham Marsden
            Boffin

            Re: AI is hard

            > Question: Why is being called a gorilla an insult?

            Seriously? Try looking at some history, and at how people who don't have white skin have been constantly and repeatedly dismissed over the centuries as "less than human", and then you may find the answer to your question.

          3. Anonymous Coward

            Re: AI is hard

            In the absence of the context of how those of African origin have been treated, you would be quite right to point out elements of faux outrage or victim identification.

            But as they were often portrayed as animals, or as less than human, there is good reason why the inadvertent linking may make some bridle.

            1. Anonymous Coward

              Re: AI is hard

              Bananas are thrown at players on the pitch all over Europe. It's maybe not reported so much, having become de rigueur.

        2. Charlie_Manson
          Joke

          Re: AI is hard

          JoshOvki

          Is that because you eat, shoot and leave?

    3. Anonymous Coward

      Re: AI is hard

      Yes, AI is hard. Google excels at coming up with fast and useful answers to questions of nearly impossible scale. It's in their designs, their software, and their mentality. As a result, Googlers have trouble comprehending situations where one imperfect answer may have dire consequences.

      1. Anonymous Coward

        Re: AI is hard

        Many of the above comments bear out what I initially wrote. The problem is with the meatbags, not the AI.

        The AI has no concept of "insult"; it did not intentionally create this situation. The AI analysed an image and found a match for what it characterises as a gorilla. Nothing racial here.

        The only time it becomes a racial problem is when the PC start shouting, because up until that point it was, and is, just a computer algorithm trying to determine the real-world equivalent object from a bitmap.

        Reading anything more into it than that should really make you think about your own mind's processes.

        Unless, of course, certain amongst the El Reg forum believe in a more biblical approach to evolution... I can hear Darwin sobbing to himself with his face cupped in his hands...

  7. Crisp

    Stuff kids say

    To be fair to Google, that machine can only be a few years old, and what toddler hasn't said something inappropriate?

    1. TRT Silver badge

      Re: Stuff kids say

      My daughter reached the linguistic ability of being able to put adjectives and nouns together quite early on, but her speech was somewhat indistinct. "White car" sounded more like "Whan Car"... which she shouted very loudly whilst pointing at a passing BMW with windows wound down, lowered suspension, trailing "aromatic" smoke like a Pacific 4-6-2, and at a gap between tracks 19 and 20 of the album "Murder Junkies" that had been pounding out of the ICE as it approached... one can only be grateful that the driver was probably momentarily suffering hearing loss as a result of the incredible volume.

      1. Sir Runcible Spoon
        Coat

        Re: Stuff kids say

        She sounds quite advanced to me, can't find fault with either her timing or what she said.

      2. Anonymous Coward

        You *sure* she meant "white car"?

        @TRT; Sounds like your daughter hit the nail on the head with "whan car". Perhaps you're just not giving her enough credit? ;-)

        1. TRT Silver badge

          Re: You *sure* she meant "white car"?

          Accuracy and not getting your dad gunned down in a drive-by for teaching his kids to 'dis the local drug dealer are two different skill sets.

      3. Dan 10

        Re: Stuff kids say

        Genius. I once blagged a ticket for a Liverpool-Fulham football match, got separated from my friend and ended up in a stand with the opposing Fulham supporters. Liverpool won 2-0, met with much hurling of obscenities by those around me, including the man next to me who was with his son (I know, great example!). At the end, this lad, probably about 7 yrs old, turned to his Dad and, pointing at me, said "he's not made a sound for 90 minutes, do you think he's a Liverpool fan?" His Dad said something like "No, don't be silly, be quiet", and I just thought that he was one of the most observant people there. I wanted to tell his Dad, but chickened out!

      4. Anonymous Coward

        Re: Stuff kids say

        My 2 year old likes watching Play Doh videos on YouTube (don't ask) and has seen me use Google Now. So the other day she was trying to tell my phone 'Okay googoo.... paedo videos'. She also likes 'surprise egg videos' (again, don't ask), and Google Now interprets 'egg videos' as 'xvideos', so that's one feature that won't be returning to my phone any time soon.

        1. Sir Runcible Spoon

          Re: Stuff kids say

          "so thats one feature that won't be returning to my phone any time soon."

          Superb :)

          I think all* products should be tested by toddlers.

          *There will obviously need to be *some* restrictions!

    2. Anonymous Coward

      Re: Stuff kids say

      Very true... There were no black people where I lived, so when I did talk to a guy (I must have been 6 or 7) I asked why the palms of his hands were white (Mum and Dad's gobs wide open!!!)... He said it was due to the spray paint job he had from God: he'd had to put his hands on the wall... Fab. That must have been great, and I carried on being a 6 year old... Only a plonker would see this as racist... work in progress... I would love to be compared to such a great animal!!

  8. TRT Silver badge

    Reminds me a bit...

    of that scene in Die Hard With A Vengeance where John McClane has to walk around Harlem with a sandwich board...

    Except Google don't have any sort of noble rationale behind why they are doing something so utterly stupid and offensive.

    1. Turtle

      @ TRT

      "Except Google don't have any sort of noble rationale behind why they are doing something so utterly stupid and offensive."

      Look, no one hates Google as much as I do, but they didn't do this intentionally. And when they say "We’re appalled and genuinely sorry that this happened" I actually believe them. And I don't believe much of what they say, I promise you.

      1. Anonymous Coward

        @Turtle - Re: @ TRT

        You believe in Google?!! Good, how about the Tooth Fairy?

        Sorry, pal, I simply can't stop my guffaws.

  9. Mephistro
    Happy

    On the other hand, if I am included in their images database, ...

    ... I'm probably classified as an albino silverback.

    This is clearly NOT a case of racism. It's plain stupidity without any additives! ;-)

    1. Brewster's Angle Grinder Silver badge

      Re: On the other hand, if I am included in their images database, ...

      The charge of racism is directed towards the programmers for not having used enough photos of black people. But, obviously, none of us are in a position to judge since we don't know what it was trained on.

      1. Destroy All Monsters Silver badge

        Re: On the other hand, if I am included in their images database, ...

        The charge of racism is directed towards the programmers for not having used enough photos of black people

        So "racism" now extends to using a biased training set? Oh brave new world.

        "Be offended often. It helps in not noticing the real problems."

        1. Anonymous Coward

          Re: On the other hand, if I am included in their images database, ...

          Implicit racism informed the selection bias so subtly that those developing the tool failed to notice that the samples fed into their tool were not representative of the extent of variation existing outside the confines of the Silicon Valley/Bay Area.

        2. Brewster's Angle Grinder Silver badge

          Re: On the other hand, if I am included in their images database, ...

          >So "racism" now extends to using a biased training set?

          Let's read the first sentence of Wikipedia's current entry on racism: "Racism consists of ideologies and practices that seek to justify, or cause, the unequal distribution of privileges, rights or goods among different racial groups."

          So we have the right---or perhaps privilege---of being recognised as an instance of H. sapiens sapiens apparently being caused to be distributed unequally via the use of a biased training set. QED

      2. roselan

        Re: On the other hand, if I am included in their images database, ...

        Don't dismiss the possibility of racist circlejerks tagging pictures in bulk to "train" the system whilst it is young.

        I doubt these idiots are capable of such insight though. Google's only mistake was probably to use only pictures from Google+ ...

        1. Triggerfish

          Re: On the other hand, if I am included in their images database, ...

          This link shows some of the ways they are teaching the image recognition software; it ends up with some surreal photos when they try to get it to interpret certain images. Clouds seem to give it some problems: it sees faces and things in the patterns.

          http://googleresearch.blogspot.co.uk/2015/06/inceptionism-going-deeper-into-neural.html?m=1
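
          For anyone curious how that works in practice, the general idea in the post can be sketched in a few lines of Python with PyTorch: run gradient ascent on the input image itself so that whatever a mid layer of an off-the-shelf network already responds to gets amplified. The model, layer and step size below are arbitrary choices for illustration, and it skips details such as ImageNet normalisation and multi-scale processing; this is not Google's code.

          import torch
          import torchvision.models as models
          import torchvision.transforms as T
          from PIL import Image

          model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()
          layer = model.features[:20]                    # an arbitrary mid-level slice of the network

          img = Image.open("clouds.jpg").convert("RGB")  # any photo will do
          x = T.Compose([T.Resize(224), T.CenterCrop(224), T.ToTensor()])(img)
          x = x.unsqueeze(0).requires_grad_(True)

          for _ in range(20):
              model.zero_grad()
              layer(x).norm().backward()                 # "how strongly does this layer respond?"
              with torch.no_grad():
                  x += 0.01 * x.grad / (x.grad.abs().mean() + 1e-8)  # nudge the image to make it respond more
                  x.grad.zero_()

          out = x.detach().squeeze(0).clamp(0, 1)
          T.ToPILImage()(out).save("dreamed.jpg")        # the patterns the network "sees" get exaggerated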

