Addressed this before, so summary version: the AI was fed data on past hires in order to make its selections, and those past hires had, by and large, been young white males. It was discriminating against black men too, iirc.
It was basing its decisions on the data it was fed, and that data was unintentionally skewed, which caused the AI to develop biases.
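To make the mechanism concrete, here's a minimal, made-up sketch (not Amazon's actual system; all résumé snippets, labels, and scoring logic are invented for illustration): if past hires skew male, a token like "women's" mostly co-occurs with rejections, so a naive model learns a negative weight for it.

```python
# Hypothetical illustration of bias learned from skewed historical data.
# "history" is invented: label 1 = past hire, 0 = past rejection. Because the
# hires skew male, the token "women's" appears only in rejections.
from collections import Counter

history = [
    ("captain of chess club", 1),
    ("led software team", 1),
    ("captain of women's chess club", 0),
    ("women's college graduate", 0),
    ("led software team", 1),
    ("data analysis experience", 0),
]

# Naive per-token score: (hires - rejections) / total, estimated from counts.
hire_counts, reject_counts = Counter(), Counter()
for text, label in history:
    for tok in text.split():
        (hire_counts if label else reject_counts)[tok] += 1

def token_score(tok):
    h, r = hire_counts[tok], reject_counts[tok]
    return (h - r) / (h + r) if (h + r) else 0.0

def resume_score(text):
    return sum(token_score(tok) for tok in text.split())

# "women's" never co-occurred with a hire in the skewed history, so it gets
# a negative weight -- the model penalizes it with no explicit gender rule.
print(token_score("women's"))  # -1.0
```

No one told the model to downgrade "women's"; the penalty falls out of the skewed counts, which is exactly why the bias was hard to notice and why removing one biased feature doesn't guarantee the model won't find another proxy.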
https://www.google.com/amp/s/amp.businessinsider.com/amazon-built-ai-to-hire-people-discriminated-against-women-2018-10
"The company created 500 computer models to trawl through past candidates' résumés and pick up on about 50,000 key terms. The system would crawl the web to recommend candidates.
"They literally wanted it to be an engine where I'm going to give you 100 résumés, it will spit out the top five, and we'll hire those," one source told Reuters.
"A year later, however, the engineers reportedly noticed something troubling about their engine - it didn't like women. This was apparently because the AI combed through predominantly male résumés submitted to Amazon over a 10-year period to accrue data about whom to hire.
Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the word "women's" and filtered out candidates who had attended two women-only colleges.
Amazon's engineers apparently tweaked the system to remedy these particular forms of bias but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.
Gender bias was not the only problem, Reuters' sources said. The computer programs also spat out candidates who were unqualified for the position."