
Algorithms Learn Our Workplace Biases. Can They Help Us Unlearn Them?

With the help of intelligent machines, humans can be nudged to make choices that make workplaces fairer for everyone

Source | www.nytimes.com | Corinne Purtill




In 2014, engineers at Amazon began work on an artificially intelligent hiring tool they hoped would change hiring for good — and for the better. The tool would bypass the messy biases and errors of human hiring managers by reviewing résumé data, ranking applicants and identifying top talent.

Instead, the machine simply learned to make the kind of mistakes its creators wanted to avoid.

The tool’s algorithm was trained on data from Amazon’s hires over the prior decade — and since most of the hires had been men, the machine learned that men were preferable. It prioritized aggressive language like “execute,” which men use in their CVs more often than women, and downgraded the names of all-women’s colleges. (The specific schools have never been made public.) It didn’t choose better candidates; it just detected and absorbed human biases in hiring decisions with alarming speed. Amazon quietly scrapped the project.

Amazon’s hiring tool is a good example of how artificial intelligence — in the workplace or anywhere else — is only as smart as the input it gets. If sexism or other biases are present in the data, machines will learn and replicate them on a faster, bigger scale than humans could do alone.
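The dynamic is easy to reproduce in miniature. The sketch below is purely illustrative and is not Amazon's tool: it assumes synthetic data and an ordinary logistic regression, and shows how a model trained on historically biased hiring decisions ends up penalizing a feature that merely correlates with gender (here, attendance at a women's college), even though gender itself is never passed to the model.

# Illustrative sketch only -- synthetic data, not Amazon's system.
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic resume features.
skill = rng.normal(size=n)                                  # genuine qualification signal
is_woman = rng.integers(0, 2, size=n)                       # hidden attribute, never a feature
womens_college = is_woman * rng.integers(0, 2, size=n)      # proxy: only women can have it
aggressive_verbs = rng.poisson(1 + 2 * (1 - is_woman))      # e.g. "executed", more common for men here

# Biased historical labels: past managers weighed skill but also favored men.
hired = (skill + 1.5 * (1 - is_woman) + rng.normal(size=n) > 1).astype(int)

# The model sees only the proxies, not gender.
X = np.column_stack([skill, womens_college, aggressive_verbs])
model = LogisticRegression().fit(X, hired)

print(dict(zip(["skill", "womens_college", "aggressive_verbs"],
               model.coef_[0].round(2))))
# Typical result: positive weights on skill and aggressive_verbs, a negative
# weight on womens_college -- the bias baked into the labels, relearned as a rule.

In this toy setup the model is rewarded for reproducing past decisions, so any feature correlated with those decisions, fair or not, becomes a shortcut it will exploit.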

