
OMG it's totally unsurprising that Amazon's recruitment AI was biased against women


Amazon CEO Jeff Bezos. 

  • Amazon abandoned a project to build an AI recruitment tool, which engineers found was discriminating against female candidates.
  • Dr Sandra Wachter, an AI researcher at Oxford University, told Business Insider that the gender bias was hardly surprising.
  • "You feed an AI with garbage and it will spit garbage out," she said. In Amazon's case, the machine may have reflected the fact that the historical data it was fed consisted predominantly of male résumés.
  • Nonetheless, Wachter believes algorithms could become better decision-making tools than humans.
Amazon admitted this week that it experimented with using machine learning to build a recruitment tool. The trouble is, it didn't produce fantastic results, and the project was later abandoned.
According to Reuters, Amazon engineers found that besides churning out totally unsuitable candidates, the so-called AI project showed a bias against women.
To Oxford University researcher Dr Sandra Wachter, the news that an artificially intelligent system had taught itself to discriminate against women was nothing new.
"From a technical perspective it's not very surprising, it's what we call 'garbage in and garbage out,'" she told Business Insider.

Garbage in, garbage out

The problem boils down to the data Amazon fed its algorithm, Wachter speculated.

"What you would do is you go back and look at historical data from the past and look at successful candidates and feed the algorithm with that data and try to find patterns or similarities," said Wachter.
"You ask the question who has been the most successful candidates in the past [...] and the common trait will be somebody that is more likely to be a man and white."
Reuters reported that the engineers building the program used résumés from a 10-year period, which were predominantly from men. Amazon did not provide Business Insider with the gender split in its engineering department but sent us a link to its diversity pages. Its global gender balance is 60% men, with 74% of managerial roles held by men.
"So if then somebody applies who doesn't fit that profile, it's likely that that person gets filtered out just because the algorithm learned from historical data," said Wachter. "That happens in recruitment, and that happens in basically everywhere where we use historical data and this data is biased."
Garbage in, garbage out (sometimes abbreviated to "GIGO") just means that bad input will result in bad output, and it's the same with bias. The problem is that it's incredibly difficult to filter out algorithmic bias, because the algorithms we build pick up on human prejudices.
"What is the algorithm supposed to do? It can only learn from our semantics and our data and how we interact with humans, and the moment there is no gender parity yet, unfortunately," said Wachter.
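The dynamic Wachter describes can be sketched in a few lines of toy code. This is not Amazon's system, whose details are unknown; it is a deliberately crude word-frequency scorer trained on invented historical data in which most past hires were men, so words correlated with female candidates end up concentrated in the "not hired" pile.

```python
# Toy "garbage in, garbage out" illustration -- NOT Amazon's system.
# The training data below is invented and skewed: most historical hires
# are male, so the word "women's" appears only on rejected résumés.
from collections import Counter

history = [
    ("captain chess club software engineer", True),
    ("software engineer open source contributor", True),
    ("varsity football software engineer", True),
    ("captain women's chess club software engineer", False),
    ("women's coding society software engineer", False),
]

hired_words = Counter()
rejected_words = Counter()
for resume, hired in history:
    (hired_words if hired else rejected_words).update(resume.split())

def score(resume: str) -> float:
    """Crude score: +1 for each time a word appeared on a hired résumé,
    -1 for each time it appeared on a rejected one. Real systems are far
    subtler, but the failure mode is the same."""
    return float(sum(hired_words[w] - rejected_words[w] for w in resume.split()))

# Two résumés identical except for one word: having learned from biased
# history, the model scores the "women's" résumé lower.
print(score("captain chess club software engineer"))          # 2.0
print(score("captain women's chess club software engineer"))  # 0.0
```

No rule about gender was ever written into the scorer; the penalty on "women's" emerges entirely from the skewed historical data, which is exactly the filtering-out effect Wachter describes.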
