Racist machines

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
https://www.theguardian.com/technol...gence-beauty-contest-doesnt-like-black-people
The first international beauty contest judged by “machines” was supposed to use objective factors such as facial symmetry and wrinkles to identify the most attractive contestants. After Beauty.AI launched this year, roughly 6,000 people from more than 100 countries submitted photos in the hopes that artificial intelligence, supported by complex algorithms, would determine that their faces most closely resembled “human beauty”.

But when the results came in, the creators were dismayed to see that there was a glaring factor linking the winners: the robots did not like people with dark skin.
 

wayneh

Joined Sep 9, 2010
17,498
Is that possibly just the result of statistics? I mean, if dark faces are a minority and the software seeks to find the faces with the widest appeal, this result is predictable. In China, you'd expect a different outcome than in Sweden, than in Kenya.

It may be obvious that I didn't read the article.
 

Thread Starter

nsaspook

Joined Aug 27, 2009
13,265
Is that possibly just the result of statistics? I mean, if dark faces are a minority and the software seeks to find the faces with the widest appeal, this result is predictable. In China, you'd expect a different outcome than in Sweden, than in Kenya.

It may be obvious that I didn't read the article.
That's pretty much the drift of the article. There were plenty of dark faces submitted (11 percent of participants were black), but the machine had 'learned' that those sorts of faces lacked the “human beauty” quality its existing programming was aiming for. These so-called AI 'deep learning' machines are still just mirrors of the data biases fed to them.
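To make the "mirror of the data" point concrete, here's a toy sketch (not the Beauty.AI code, which was never published, and with a made-up single "feature" standing in for a real face): a scorer that learns its ideal from a skewed training set ends up ranking the minority group lower, with no racist intent anywhere in the code.

```python
# Toy demonstration of training-data bias: the "ideal" is learned as the
# mean of the training samples, so a skewed sample skews the ideal.

def train_ideal(samples):
    """Learn the 'ideal' value of a single numeric feature as the mean."""
    return sum(samples) / len(samples)

def score(face, ideal):
    """Higher is 'more beautiful': negative distance from the learned ideal."""
    return -abs(face - ideal)

# Hypothetical feature: group A clusters near 0.2, group B near 0.8.
# 11% of the training set is group B, matching the article's figure.
training = [0.2] * 89 + [0.8] * 11
ideal = train_ideal(training)   # about 0.266: pulled toward the majority

print(score(0.2, ideal) > score(0.8, ideal))   # True: majority faces win
```

The scorer never sees a group label; the skew comes entirely from who is in the training set.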
 

MaxHeadRoom

Joined Jul 18, 2013
28,683
It seems pretty obvious from reading the article that the data fed in was subject to the biases of those supplying it.
IOW, it pretty much proves the old adage: beauty is in the eye of the beholder!
Max.
 

Sinus23

Joined Sep 7, 2013
248
These so-called AI 'deep learning' machines are still just mirrors of the data biases fed to them.
Yeah, I've yet to see an AI that learns even at the pace of the slow kid who got picked on... The problem with AI is that it doesn't think like any of us, not even the programmers who set the wheels in motion. We draw unforeseen, outlandish conclusions and make wild assumptions that a strictly logical machine can't compete with. We see solutions and problems that are of no value to the AI but of great value to us. Which is why we build the machines that build other machines, but they don't build us... yet, that is ;)

However, yes: those who programmed that AI did not foresee that the sample pool would contain only a small percentage of certain ethnic groups versus much larger shares of others.

Or all the programmers are simply racists, with the exception of those two guys who threw off the curve a bit... ;)
 

dannyf

Joined Sep 13, 2015
2,197
It is all driven by the data set the computer is trained on. The data likely come from a population that's majority white, which, everything else being equal, tends to like attributes that white people have.

Calling the algorithm a racist reflects one's own stupidity.
 

Sinus23

Joined Sep 7, 2013
248
It is all driven by the data set the computer is trained on. The data likely come from a population that's majority white, which, everything else being equal, tends to like attributes that white people have.

Calling the algorithm a racist reflects one's own stupidity.
IMHO, if the programmers didn't take that into account beforehand (that is, that the data could become one-sided), they aren't the brightest light bulbs in the house... Far from being stupid, though; they were just more programmers than sociologists/psychologists/statisticians. If at least one of them had vaguely thought of it that way, they might have predicted the problem.

But hey, hindsight, meet 20/20 :cool:. Now it's just a matter of correcting that error. ;)
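"Correcting that error" usually means making the skewed sample count fairly. One common fix (a sketch under the same toy single-feature setup as above, not what the Beauty.AI team actually did) is to reweight each sample by the inverse of its group's size, so every group contributes equally to the learned ideal:

```python
# Reweighting sketch: each sample is weighted by 1 / (its group's count),
# so a group of 89 and a group of 11 each contribute total weight 1.
from collections import Counter

def train_ideal_balanced(samples, groups):
    """Group-balanced weighted mean of a single numeric feature."""
    counts = Counter(groups)
    weights = [1.0 / counts[g] for g in groups]
    total = sum(w * s for w, s in zip(weights, samples))
    return total / sum(weights)

faces  = [0.2] * 89 + [0.8] * 11
groups = ["A"] * 89 + ["B"] * 11
print(train_ideal_balanced(faces, groups))   # ~0.5: midway between groups
```

With the balanced ideal at the midpoint, neither group is scored lower just for being underrepresented. This requires knowing group membership at training time, which raises its own design questions.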
 

atferrari

Joined Jan 6, 2004
4,768
And what is the ultimate goal of that machine, once it becomes non-racist and perfect at determining who is the most beautiful?

To send it to a museum on behalf of someone who could stay at home drinking a beer?
 

Sinus23

Joined Sep 7, 2013
248
And what is the ultimate goal of that machine, once it becomes non-racist and perfect at determining who is the most beautiful?

To send it to a museum on behalf of someone who could stay at home drinking a beer?
Yeah, I hadn't even posted my thoughts on the obscure reason behind that program. All I can say is that no machine can tell me which women I find beautiful or attractive. Not today, not tomorrow... I think that applies to most people, men or women. That gave me a thought: run that shit with only men as subjects :cool:

That could be a bit interesting. ;)
 

wayneh

Joined Sep 9, 2010
17,498
IMHO, if the programmers didn't take that into account beforehand (that is, that the data could become one-sided), they aren't the brightest light bulbs in the house...
Meh. I wouldn't expect even a brilliant programmer to understand much about statistics, sampling biases, doing science, and such. And who knows what their goal was? It's likely the program performed exactly as intended, just with unintended side effects.
 

Sinus23

Joined Sep 7, 2013
248
Meh. I wouldn't expect even a brilliant programmer to understand much about statistics, sampling biases, doing science, and such. And who knows what their goal was? It's likely the program performed exactly as intended, just with unintended side effects.
Oh, I would think that they knew in one way or another (the programmers, that is). But they weren't exactly the project managers or the holders of the budget, were they :p So the program became what it is.
 

GopherT

Joined Nov 23, 2012
8,009
Maybe the Computer had a great previous experience with a Scandinavian woman. It's difficult to become unbiased after something like that.
 