Racist machines

Discussion in 'Off-Topic' started by nsaspook, Sep 9, 2016.

  1. nsaspook

    Thread Starter AAC Fanatic!

    Aug 27, 2009
    2,908
    2,169
    https://www.theguardian.com/technol...gence-beauty-contest-doesnt-like-black-people
     
  2. wayneh

    Expert

    Sep 9, 2010
    12,123
    3,048
    Is that possibly just the result of statistics? I mean, if dark faces are a minority and the software seeks to find the faces with the widest appeal, this result is predictable. In China you'd expect a different outcome than in Sweden, or in Kenya.

    It may be obvious that I didn't read the article.
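
    For anyone who does read it: here's a quick back-of-the-envelope on the "just statistics" question. A sketch only, using the ~11% black-participant figure quoted later in this thread and the article's count of roughly 44 winners; none of this reflects how the contest actually scored anyone.

    Code:
    # Back-of-the-envelope, not the contest's method: if ~11% of entrants are
    # dark-skinned and the judge were completely color-blind, how often would
    # at most one dark-skinned face land among ~44 winners purely by chance?
    from math import comb

    n, p = 44, 0.11
    p_at_most_one = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in (0, 1))
    print(f"Expected dark-skinned winners: {n * p:.1f}")            # ~4.8
    print(f"P(at most one, by chance alone): {p_at_most_one:.3f}")  # ~0.04

    So a perfectly neutral judge would produce a nearly all-white winner list only a few percent of the time; under-representation alone doesn't quite explain it.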
     
  3. SLK001

    Well-Known Member

    Nov 29, 2011
    818
    228
    Maybe not racist machines, but racist programmers. Now racist programmers, I can believe.
     
  4. nsaspook

    Thread Starter AAC Fanatic!

    Aug 27, 2009
    2,908
    2,169
    That's pretty much the drift of the article. There were plenty of dark faces submitted (about 11 percent of participants were black), but the machine had 'learned' that those sorts of faces lacked the “human beauty” quality the existing programming was aiming for. These so-called AI 'deep learning' machines are still just mirrors of the data biases fed to them.
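
    For anyone who wants to see what "mirrors of the data biases fed to them" means in practice, here's a minimal sketch. It is not the Beauty.AI code; the features, proportions and label rule are all invented just to show that if the training labels carry a bias, the trained model reproduces it.

    Code:
    # Toy "beauty scorer" trained on labels that reflect biased human ratings.
    # Everything here (features, proportions, label rule) is made up for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical features: col 0 = skin tone (0 = dark, 1 = light),
    # col 1 = facial symmetry, the thing the contest claims to measure.
    skin_tone = rng.choice([0, 1], size=n, p=[0.11, 0.89])  # ~11% dark faces
    symmetry = rng.normal(0.0, 1.0, size=n)
    X = np.column_stack([skin_tone, symmetry])

    # Biased training labels: raters reward symmetry, but also (unfairly) skin tone.
    y = (symmetry + 1.5 * skin_tone + rng.normal(0, 1, n) > 1.0).astype(int)

    model = LogisticRegression().fit(X, y)

    # Identical symmetry, different score: the model has absorbed the raters' bias.
    print("P(win | dark,  symmetry=1):", model.predict_proba([[0, 1.0]])[0, 1])
    print("P(win | light, symmetry=1):", model.predict_proba([[1, 1.0]])[0, 1])

    Rebuild the labels without the skin-tone term (or reweight the data) and the gap disappears; the algorithm has no opinion of its own.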
     
    Sinus23 likes this.
  5. MaxHeadRoom

    Expert

    Jul 18, 2013
    10,547
    2,373
    It seems pretty obvious from reading the article that the data fed in was subject to the biases of those supplying it.
    IOW, it pretty much proves the old adage: beauty is in the eye of the beholder!
    Max.
     
  6. SLK001

    Well-Known Member

    Nov 29, 2011
    818
    228
    I thought that adage was, "Beauty is in the eye of the beer holder".
     
  7. MaxHeadRoom

    Expert

    Jul 18, 2013
    10,547
    2,373
    That's at closing time!:p
    Max.
     
    cmartinez, Sinus23 and nsaspook like this.
  8. nsaspook

    Thread Starter AAC Fanatic!

    Aug 27, 2009
    2,908
    2,169


    Mathematical Beauty is a ratio?
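
    (Assuming the ratio meant here is the golden ratio, the number usually trotted out in "mathematical beauty" face claims, a quick check as an aside:)

    Code:
    # The golden ratio, phi, and the classic Fibonacci approach to it.
    import math

    phi = (1 + math.sqrt(5)) / 2
    print(f"phi = {phi:.6f}")  # 1.618034

    a, b = 1, 1
    for _ in range(20):        # ratios of consecutive Fibonacci numbers -> phi
        a, b = b, a + b
    print(f"F(n+1)/F(n) after 20 steps = {b / a:.6f}")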
     
  9. joeyd999

    AAC Fanatic!

    Jun 6, 2011
    2,683
    2,744
    I looked at their leaderboard. Each finalist's eyes are spread too wide on their face. Very unattractive. The algorithm requires tweaking.
     
  10. Sinus23

    Member

    Sep 7, 2013
    161
    409
    Yeah, I've yet to see an AI that learns at the pace of the slow kid who got picked on... The problem with AI is that it doesn't think like any of us, not even like the programmers who set the wheels in motion. We draw unforeseen/outlandish conclusions and make wild assumptions that a strictly logical machine can't compete with. We see solutions and problems that are of no value to the AI but are of great value to us. Which is why we build the machines that build other machines, and not them us... yet, that is ;)

    However, yes, those who programmed that AI did not foresee that the sample pool would/could contain only a small X% of certain ethnic groups vs. another, bigger one.

    Or all the programmers are simply racists, with the exception of those two guys who threw the curve a bit... ;)
     
  11. dannyf

    Well-Known Member

    Sep 13, 2015
    1,811
    362
    It is all driven by the data set the computer is trained on. The data likely come from a population that's majority white, which, everything else being equal, tends to favor attributes that white people have.

    Calling the algorithm racist reflects one's own stupidity.
     
  12. Sinus23

    Member

    Sep 7, 2013
    161
    409
    IMHO, if the programmers didn't take that into account beforehand (that is, that the data could end up one-sided), then they aren't the brightest light bulbs in the house... Far from being stupid, just more programmers than sociologists/psychologists/statisticians... If even one of them had vaguely thought of it that way, they might have predicted the problem.

    But hey, hindsight, meet 20/20 :cool:. Now it's just a matter of correcting that error. ;)
     
  13. atferrari

    AAC Fanatic!

    Jan 6, 2004
    2,648
    764
    And what is the ultimate goal of that machine once it becomes non-racist and perfect at determining who is the most beautiful?

    To send it to a museum on behalf of someone who would rather stay at home drinking a beer?
     
    Sinus23 likes this.
  14. Sinus23

    Member

    Sep 7, 2013
    161
    409
    Yeah, I hadn't even posted my thoughts on the obscure reason behind that program. All I can say is that no machine can tell me which women I find beautiful or attractive. Not today, not tomorrow... I think that applies to most people, men or women. That gave me a thought: run that shit with only men as subjects :cool:

    That could be a bit interesting.;)
     
  15. nsaspook

    Thread Starter AAC Fanatic!

    Aug 27, 2009
    2,908
    2,169
    These sorts of programs appear when a lot of money has been invested in something that has little to show for it that will actually make money.
     
    Sinus23 likes this.
  16. wayneh

    Expert

    Sep 9, 2010
    12,123
    3,048
    Meh. I wouldn't expect even a brilliant programmer to understand much about statistics, sampling biases, doing science and such. And who knows what their goal was. It's likely it performed exactly as intended, just with unintended side effects.
     
  17. Sinus23

    Member

    Sep 7, 2013
    161
    409
    Oh, I would think that they knew, in one way or another (the programmers, that is). But they weren't exactly the project managers or the handlers of the budget, were they? :p So the program became what it is.
     
  18. GopherT

    AAC Fanatic!

    Nov 23, 2012
    6,052
    3,817
    Maybe the Computer had a great previous experience with a Scandinavian woman. It's difficult to become unbiased after something like that.
     
  19. nsaspook

    Thread Starter AAC Fanatic!

    Aug 27, 2009
    2,908
    2,169