KCBS News Radio San Francisco's Stan Bunger and Susan Leigh Taylor sat down with Paul Teich, DoubleHorn's Principal Analyst, to discuss the news that Amazon shut down an artificial intelligence tool it was using for hiring after discovering the tool was biased against women. Paul has previously written about how artificial intelligence can reinforce bias in the same way Amazon's AI tool did.
Reuters reports that Amazon's machine-learning tool was trained on previous applications, which were overwhelmingly male, and "concluded" that men should be preferred for new hires.
Listen to the full podcast below or visit the KCBS website.
About the Shutdown
Amazon shut down its experimental artificial intelligence (AI) recruitment engine after the tool, trained on skewed data, began preferring male candidates and discriminating against women.
Amazon reported Tuesday that the AI tool penalized resumes containing certain words like "women's" or references to women's colleges. The AI also favored applications featuring words more often used by male candidates, such as "executed".
The tool was designed to identify patterns in top applicants' resumes, using the past 10 years of Amazon applications as training data.
Amazon reports that because the majority of those past applicants were male, the AI learned to associate male candidates with success and developed a bias against women.
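The mechanism described above can be illustrated with a toy sketch. This is not Amazon's actual system; it is a minimal word-scoring model over invented example resumes, showing how a model trained on skewed historical outcomes attaches a penalty to words like "women's" that merely correlate with past rejections:

```python
import math
from collections import Counter

# Hypothetical toy history: "hired" resumes skew male in wording,
# "rejected" resumes happen to contain the word "women's".
# All example text is invented for illustration.
hired = [
    "executed project led team",
    "executed captured results",
    "led executed delivery",
]
rejected = [
    "women's chess club captain",
    "graduated women's college",
    "women's society president",
]

def word_scores(pos_docs, neg_docs):
    """Log-odds per word with add-one smoothing: positive means the
    word co-occurred with hires, negative with rejections."""
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    return {w: math.log((pos[w] + 1) / (neg[w] + 1)) for w in vocab}

scores = word_scores(hired, rejected)

def rank(resume):
    # Sum the learned word scores; unseen words contribute nothing.
    return sum(scores.get(w, 0.0) for w in resume.split())

# A new resume mentioning "women's" inherits the historical penalty,
# even though the word says nothing about job performance.
print(rank("executed strategy") > rank("women's chess club captain"))  # True
```

The model never sees gender directly; it simply rewards whatever vocabulary correlated with past hires, which is how proxy features like "women's" end up penalized.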
The AI began showing bias against female candidates as early as 2015, and Amazon ultimately disbanded the team behind it in 2017.