Racist computers
Google and Flickr's automatic photo-tagging programs took flak for being racist (early to mid 2015). Labeling black people "gorilla" or "ape" is the worst un-PC gaffe imaginable. Other labeling mistakes are simply funny or dumb. These labeling programs are self-learning, so no person could be blamed for the mistake. Racists say that such mistakes are normal due to apparent visual similarities, and that human infants make exactly the same embarrassing mistakes.
New Google app blunders, calls black people ‘gorillas’ | Times of Israel
Company officials apologize for mortifying blunder, explain that technology still needs more development
<bq>SAN FRANCISCO (AP) — Google’s new image-recognition program misfired badly this week by identifying two black people as gorillas, delivering a reminder that even the most intelligent machines still have a lot to learn about human sensitivity.</bq>
R: Computers, and babies, need special instruction in PC sensitivities. Other mistakes, by computers or babies, are funny or dumb. But offending PC sensitivities is worse than many violent crimes.
<bq>The blunder surfaced in a smartphone screen shot posted online Sunday by a New York man on his Twitter account, @jackyalcine. The images showed the recently released Google Photos app had sorted a picture of two black people into a category labeled as “gorillas.”</bq>
<bq>The account holder used a profanity while expressing his dismay about the app likening his friend to an ape, a comparison widely regarded as a racial slur when applied to a black person.
“We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said. “We are taking immediate action to prevent this type of result from appearing.”</bq>
R: Google is fully aware that blatantly offending a disadvantaged minority with a slur is the worst sin possible, so much so that the offended party was invited to an interview.
<bq> A tweet to @jackyalcine requesting an interview hadn’t received a response several hours after it was sent Thursday.
Despite Google’s apology, the gaffe threatens to cast the Internet company in an unflattering light at a time when it and its Silicon Valley peers have already been fending off accusations of discriminatory hiring practices. Those perceptions have been fed by the composition of most technology companies’ workforces, which mostly consist of whites and Asians with a paltry few blacks and Hispanics sprinkled in.</bq>
<bq>The mix-up also surfaced amid rising US racial tensions that have been fueled by recent police killings of blacks and last month’s murder of nine black churchgoers in Charleston, South Carolina.
Google’s error underscores the pitfalls of relying on machines to handle tedious tasks that people have typically handled in the past. In this case, the Google Photos app released in late May uses recognition software to analyze images in pictures to sort them into a variety of categories, including places, names, activities and animals.
When the app came out, Google executives warned it probably wouldn’t get everything right — a point that has now been hammered home. Besides mistaking humans for gorillas, the app also has been mocked for labeling some people as seals and some dogs as horses.
“There is still clearly a lot of work to do with automatic image labeling,” Watson conceded.
Some commentators in social media, though, wondered if the flaws in Google’s automatic-recognition software may have stemmed from its reliance on white and Asian engineers who might not be sensitive to labels that would offend black people. About 94 percent of Google’s technology workers are white or Asian and just 1 percent is black, according to the company’s latest diversity disclosures.
Google isn’t the only company still trying to work out the bugs in its image-recognition technology.
Shortly after Yahoo’s Flickr introduced an automated service for tagging photos in May, it fielded complaints about identifying black people as “apes” and “animals.” Flickr also mistakenly identified a Nazi concentration camp as a “jungle gym.”</bq>
R: Google reacted swiftly to the mess created by its machines, long before the media began writing about it.
<bq>Less than two hours after @jackyalcine posted his outrage over the gorilla label, one of Google’s top engineers had posted a response seeking access to his account to determine what went wrong. Yonatan Zunger, chief architect of Google’s social products, later tweeted: “Sheesh. High on my list of bugs you never want to see happen. Shudder.”</bq>
Copyright 2015 The Associated Press.
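The mislabelings described above follow a simple pattern: a classifier assigns whichever learned label best matches an image's features, with no notion of which mistakes are offensive. A minimal sketch below uses a nearest-centroid classifier on toy, hypothetical feature vectors (real systems like Google Photos use deep neural networks trained on millions of labeled images) to show how a confusion between visually similar classes arises purely from geometry:

```python
# Minimal sketch: a classifier picks the label whose learned class centroid
# is closest in feature space. All data here is toy and hypothetical.
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_label(features, centroids):
    """Return the label whose centroid is closest (Euclidean distance)."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))

# Toy training data: 2-D features, e.g. (fur_texture, face_roundness).
training = {
    "dog":   [[0.8, 0.3], [0.7, 0.4]],
    "horse": [[0.6, 0.2], [0.5, 0.1]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}

# A dog whose features happen to sit closer to the horse centroid gets the
# wrong label -- no intent involved, just distance in feature space.
print(nearest_label([0.55, 0.15], centroids))  # "horse"
```

With too few or unrepresentative training examples per class, the centroids sit in the wrong places, which is why such systems confuse people with seals and dogs with horses.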
Google ‘appalled’ as Photos app labels black people ‘gorillas’ | Irish Times
Google removes ‘gorilla’ tag from app after it misidentified images of black people
Google apologises after photo app tagged black couple 'gorillas'
<bq>Google apologized and went to work fixing the problem earlier in the week after the offensive blunder was pointed out in a Twitter message from @JackyAlcine.
"Google Photos, y'all (messed) up," Jacky Alcine said in a series of emphatic messages.
"My friend's not a gorilla."
OVERHAULED PHOTO APP
Google released its overhauled photo app for smartphones in May, touting it as a major advancement in sorting, organizing, and handling pictures.
Google engineer Yonatan Zunger put the blame for the labelling on the artificial intelligence software designed to let machines learn how to recognise places, people and objects in pictures.
"Sheesh," Zunger said in the exchange of Twitter messages. "High on my list of bugs you never want to see happen. Shudder."
Zunger told of problems such as not seeing faces in pictures at all, or even identifying some people as dogs.
Picture recognition has proven challenging for computers and numerous companies are working on programs to improve identification.
Google and Facebook are among Silicon Valley technology giants investing heavily in artificial intelligence to get machines to think more like the way people do.
"There is still clearly a lot of work to do with automatic image labelling, and we're looking at how we can prevent these types of mistakes from happening in the future," the Google representative said of the photo gaffe.
"I understand how this happens," Alcine said in the online exchange. "The problem is more so on the why."</bq>
Google says sorry for racist auto-tag in photo app
Flickr's new auto-tags are racist and offensive | CNN Money
Some <a href="https://www.flickr.com/photos/30926561@N02/17730858426/in/photolist-t1PhtC-t9UdPS-s6sGU9-smmvJp-sPvwCJ-tgaLPN-teWtUw-sZttTb-sKNEiW-eHmxpn-7Vk6Ej-dt8Tcc-d5A1RW-smkQik-tfXmoY-t2WmHi-s3KiFZ-sZkEgy-t767zr-sKnhzW-gqKgNE-85X9ph-8ekV5V-5RaoV2-bURk9k-aC1Qdt-55ns6z-4bm4LE-4JKVz1-55rDAd-4zsDnW-dBCSe-cJXXbs-55ns5r-a2pVNe-6s2ZM2-6s38R6-6sdaGL-apBu25-ccdzgb-dQmvU5-6g232V-c9pxY3-eTpvDQ-fCgvGy-4zsDHw-2QyZUa-4zsCK9-4JFDHi-6s79UA">concentration camp photos</a> received inappropriate tags, including "<a href="https://www.flickr.com/photos/30926561@N02/17730858426/in/photolist-t1PhtC-t9UdPS-s6sGU9-smmvJp-sPvwCJ-tgaLPN-teWtUw-sZttTb-sKNEiW-eHmxpn-7Vk6Ej-dt8Tcc-d5A1RW-smkQik-tfXmoY-t2WmHi-s3KiFZ-sZkEgy-t767zr-sKnhzW-gqKgNE-85X9ph-8ekV5V-5RaoV2-bURk9k-aC1Qdt-55ns6z-4bm4LE-4JKVz1-55rDAd-4zsDnW-dBCSe-cJXXbs-55ns5r-a2pVNe-6s2ZM2-6s38R6-6sdaGL-apBu25-ccdzgb-dQmvU5-6g232V-c9pxY3-eTpvDQ-fCgvGy-4zsDHw-2QyZUa-4zsCK9-4JFDHi-6s79UA">sport</a>" and "<a href="https://www.flickr.com/photos/laughingsquid/8201658514/in/photolist-duKDkq-9PKSj-cXnmV3-fTKqzx-a7YLG-bkwWS9-W7SqH-4H8i7U-maaRAe-4H847H-8mzCSf-dBzph-2ZdhDw-4he7Rw-kgxaY7-5hALSZ-5yyGsr-b4HESt-4S1hBe-nypSAa-6iVXQd-54wFUE-9YqupJ-5xS4xE-wqoBQ-8CMD9P-7erYPR-b9xF1-9HXGYE-JzjuB-bUHrXa-92saU-75dzRW-e5NfKv-dodUBM-c8aVUW-4VAMGs-bs1fes-cSWgM-6p1LkQ-6T9fQ3-ahjXEK-5MUzNh-9NERVS-gF9q4s-9iPPE2-c3HUcG-4S17ci-6riRgJ-cRxAKE">jungle gym</a>."
Flickr had also been tagging some images of people as "ape" and "animal," including a <a href="https://www.flickr.com/photos/thirteenthfloormedia/14570569401">photo of a black man</a> named William taken by photographer Corey Deshon, according to <a href="http://www.theguardian.com/technology/2015/may/20/flickr-complaints-offensive-auto-tagging-photos">the Guardian</a>.
The photo service had also labeled a <a href="https://www.flickr.com/photos/132452869@N04/17801937502/">white woman wearing face paint</a> as "ape" and "animal," so Flickr's algorithm does not appear to be taking a person's skin color into consideration when auto-tagging them.
Flickr has since corrected both of those mistakes, but the concentration camp errors remain.
"We are aware of issues with inaccurate auto-tags on Flickr and are working on a fix," a spokesman for Flickr said in a statement. "While we are very proud of this advanced image-recognition technology, we're the first to admit there will be mistakes and we are constantly working to improve the experience."
Flickr noted that deleting incorrect tags teaches the new algorithm to learn from its mistakes and improve its results in the future. The company also noted that Flickr staff do not personally tag photos -- it's all automated.
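Flickr's point about deleted tags feeding back into the model can be sketched as a negative-feedback update. The update rule, learning rate, class names, and feature numbers below are illustrative assumptions, not Flickr's actual algorithm: a deleted tag is treated as a negative example, and that class's centroid is pushed away from the photo's features until the model stops predicting the deleted tag.

```python
# Hedged sketch of tag-deletion feedback on a toy nearest-centroid model.
# Classes, features, and the update rule are hypothetical; Flickr's real
# model is not public.
import math

centroids = {
    "jungle gym": [0.9, 0.8],
    "memorial":   [0.2, 0.3],
}

def predict(features):
    """Nearest-centroid label (Euclidean distance)."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))

def delete_tag(features, wrong_label, lr=0.5, max_steps=50):
    """User removed `wrong_label` from a photo: treat it as a negative
    example and push that centroid away until the prediction changes."""
    steps = 0
    while predict(features) == wrong_label and steps < max_steps:
        c = centroids[wrong_label]
        centroids[wrong_label] = [ci + lr * (ci - fi) for ci, fi in zip(c, features)]
        steps += 1

photo = [0.85, 0.75]             # features of a misidentified photo
before = predict(photo)          # initially mislabeled "jungle gym"
delete_tag(photo, "jungle gym")  # user deletes the offensive tag
after = predict(photo)           # model now prefers the other class
print(before, "->", after)       # jungle gym -> memorial
```

The design point this illustrates: with no humans in the tagging loop, user corrections are the only supervised signal the system receives after deployment, which is why Flickr asks users to delete wrong tags rather than just ignore them.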