Google and Flickr's automatic photo tagging programs took flak for being racist (early to mid 2015). Labeling black people "Gorilla" or "ape" is the worst unPC gaffe imaginable. Other labeling mistakes are simply funny or dumb. These labeling programs are self-learning, so no person could be blamed for the mistake. [[Racists say]] that such mistakes are normal, due to apparent similarities, and that babies make exactly the same embarrassing mistakes. [[Racist babies]] raised in a non-diverse white family sometimes blurt out loudly "monkey, monkey" when they see a black person.
 
[[File:Racist-computer-calls-black-gorilla.jpg|frameless|left|Racist google algorithm calls Black "gorilla"]]
<blockquote>SAN FRANCISCO (AP) — Google’s new image-recognition program misfired badly this week by identifying two black people as gorillas, delivering a reminder that even the most intelligent machines still have a lot to learn about human sensitivity.</blockquote>
[[racists say|R:]] Computers, and babies, need special instructions in [[Politically correct |PC]] [[sensitivities]]. Other mistakes, by computers or babies, are funny or dumb. But offending PC sensitivities is worse than many violent crimes.
<blockquote>The blunder surfaced in a smartphone screen shot posted online Sunday by a New York man on his Twitter account, @jackyalcine. The images showed the recently released Google Photos app had sorted a picture of two black people into a category labeled as “gorillas.”</blockquote>
<blockquote>The account holder used a profanity while expressing his dismay about the app likening his friend to an ape, a comparison widely regarded as a racial slur when applied to a black person.

“We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said. “We are taking immediate action to prevent this type of result from appearing.”</blockquote>
Google is fully aware that blatantly offending a [[disadvantaged minority]] with a slur is the worst sin possible. So much so that the offended party was asked for an interview.
<blockquote>
A tweet to @jackyalcine requesting an interview hadn’t received a response several hours after it was sent Thursday.
Despite Google’s apology, the gaffe threatens to cast the Internet company in an unflattering light at a time when it and its Silicon Valley peers have already been fending off accusations of discriminatory hiring practices. Those perceptions have been fed by the composition of most technology companies’ workforces, which mostly consist of whites and Asians with a paltry few blacks and Hispanics sprinkled in.
The mix-up also surfaced amid rising US racial tensions that have been fueled by recent police killings of blacks and last month’s murder of nine black churchgoers in Charleston, South Carolina.
The incident points to the problem tech companies face as computers get smarter and are expected to take on more tasks a human normally would do. Those areas of computer science -- such as artificial intelligence or machine learning -- are some of the biggest engineering focuses in Silicon Valley. But ''with that focus comes another task that computers have not traditionally tackled: grappling with the challenge of sensitivity''.
</blockquote>
=== [http://internet.gawker.com/flickrs-auto-tagging-feature-accidentally-labeled-a-bla-1706045425 Flickr's Auto-Tagging Feature Accidentally Labeled a Black Man "Ape"] ===
<blockquote>
Search for “ape” on Flickr and you’ll witness an endlessly scrolling cavalcade of primate photography, from monkeys glimpsed on safari to those held in captivity at the zoo. Until recently, you’d also see [https://www.flickr.com/photos/thirteenthfloormedia/14570569401 a portrait of a middle-aged black man] named William. Flickr thought William was an ape, too.

All of the offending examples listed here have since been corrected, though the two portraits are still labeled with “animal,” which is I suppose technically accurate. And users can manually remove bad auto-tags from their pictures. As the ''Guardian'' notes, Flickr appears to have wisely removed “ape” entirely from its auto-tagger’s list of choices. Maybe leave this stuff to humans with eyes next time.
</blockquote>
* [http://www.theguardian.com/technology/2015/jul/01/google-sorry-racist-auto-tag-photo-app Google says sorry for racist auto-tag in photo app]
=== [http://money.cnn.com/2015/05/21/technology/flickr-racist-tags/ Flickr's new auto-tags are racist and offensive | CNN Money]===
* [http://www.cnet.com/news/flickr-misfires-with-automated-photo-tags/ Flickr misfires with automated photo tags]
 
----
[https://vdare.com/posts/artificial-intelligence-the-robot-becky-menace Artificial Intelligence--The Robot Becky Menace]
 
Not surprisingly, the conventional wisdom increasingly believes Artificial Intelligence needs a dose of Artificial Stupidity to keep it from being as racist and sexist as Natural Intelligence. Otherwise, the Robot Permit Patties will run amok, says Nature:
 
[https://www.nature.com/articles/d41586-018-05707-8?utm_source=briefing-wk&utm_medium=email&utm_campaign=briefing&utm_content=20180720 AI can be sexist and racist — it’s time to make it fair]
 
<blockquote>
… As much as possible, data curators should provide the precise definition of descriptors tied to the data. For instance, in the case of criminal-justice data, appreciating the type of ‘crime’ that a model has been trained on will clarify how that model should be applied and interpreted. …
 
Lastly, computer scientists should strive to develop algorithms that are more robust to human biases in the data.
 
Various approaches are being pursued. One involves incorporating constraints and essentially nudging the machine-learning model to ensure that it achieves equitable performance across different subpopulations and between similar individuals.
 
A related approach involves changing the learning algorithm to reduce its dependence on sensitive attributes, such as ethnicity, gender, income — and any information that is correlated with those characteristics.
</blockquote>
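A minimal sketch of the constraint-based "nudging" described in the quoted passage (this is not code from the Nature article; the synthetic data, the group attribute <code>g</code>, and the penalty weight <code>LAMBDA_FAIR</code> are all illustrative assumptions): an ordinary logistic regression is trained with NumPy, and an extra penalty on the squared gap between the average scores of two subpopulations pushes the optimizer toward equitable performance across groups.

<syntaxhighlight lang="python">
# Sketch: logistic regression "nudged" toward demographic parity.
# All data and constants below are illustrative, not from any cited source.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2 features, a binary label y, and a binary group attribute g.
n = 2000
g = rng.integers(0, 2, size=n)                   # sensitive attribute (0 or 1)
X = rng.normal(size=(n, 2)) + g[:, None] * 0.8   # features correlated with g
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(float)

w = np.zeros(X.shape[1])
b = 0.0
LR = 0.1           # gradient-descent step size
LAMBDA_FAIR = 2.0  # strength of the fairness "nudge"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    p = sigmoid(X @ w + b)

    # Standard (mean) logistic-loss gradient.
    grad_w = X.T @ (p - y) / n
    grad_b = np.mean(p - y)

    # Demographic-parity penalty: (mean score in group 1 - mean score in group 0)^2.
    gap = p[g == 1].mean() - p[g == 0].mean()
    s = p * (1 - p)  # derivative of sigmoid w.r.t. its logit
    d_gap_w = ((X[g == 1] * s[g == 1][:, None]).mean(axis=0)
               - (X[g == 0] * s[g == 0][:, None]).mean(axis=0))
    d_gap_b = s[g == 1].mean() - s[g == 0].mean()
    grad_w += LAMBDA_FAIR * 2 * gap * d_gap_w
    grad_b += LAMBDA_FAIR * 2 * gap * d_gap_b

    w -= LR * grad_w
    b -= LR * grad_b

p = sigmoid(X @ w + b)
print("accuracy:", ((p > 0.5) == y).mean())
print("score gap between groups:", p[g == 1].mean() - p[g == 0].mean())
</syntaxhighlight>

Raising <code>LAMBDA_FAIR</code> shrinks the score gap between the groups at some cost in raw accuracy. The second approach mentioned in the quote works differently: rather than constraining the outputs, it tries to reduce the model's dependence on the sensitive attribute and on anything correlated with it.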
 
[[Category:Under construction]]