New NYC law restricts hiring based on artificial intelligence
When a new law in New York City takes effect at the start of 2023, employers won’t be allowed to use artificial intelligence to screen job candidates unless the tech has gone through an audit to check for bias.
The potential for algorithmic discrimination in hiring has been the target of state laws in Illinois and Maryland. The federal Equal Employment Opportunity Commission also recently formed a working group to study the issue.
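Audits of this kind often compare hiring outcomes across demographic groups. As a rough, hypothetical illustration of one such comparison (a selection-rate "impact ratio," computed on invented numbers rather than any real audit methodology):

```python
from collections import defaultdict

# Invented screening outcomes: (demographic group, selected for interview?)
screening_results = [
    ("group_a", True), ("group_a", False), ("group_a", True), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, selected = defaultdict(int), defaultdict(int)
for group, was_selected in screening_results:
    totals[group] += 1
    selected[group] += was_selected

# Selection rate per group, then each rate relative to the best-performing group
rates = {g: selected[g] / totals[g] for g in totals}
best = max(rates.values())
for group, rate in rates.items():
    print(f"{group}: selection rate {rate:.2f}, impact ratio {rate / best:.2f}")
```

A large gap between groups' ratios is the kind of red flag an audit would be looking for.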
The internet has made applying for jobs easier than ever, but it’s also made the process less human, said Joseph Fuller, a professor at Harvard Business School.
“When you open the faucet, all of a sudden a lot of applications started coming in, and no one’s gonna hit print 250 times,” he said.
So most big companies use some sort of automated recruiting system, which narrows down the candidate pool using algorithmic filters. “If you don’t have this, you’re out. If you don’t have that, you’re out,” Fuller said.
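To make that concrete, here is a minimal sketch of the kind of hard-coded screening rule Fuller describes, where applications missing a required credential never reach a human. The field names and cutoffs are invented for illustration, not drawn from any real recruiting system:

```python
# Toy resume filter: drop applicants who miss any hard requirement.
applicants = [
    {"name": "A. Rivera", "years_experience": 6, "keywords": {"logistics", "sql"}},
    {"name": "B. Chen", "years_experience": 2, "keywords": {"logistics"}},
]

REQUIRED_YEARS = 5
REQUIRED_KEYWORDS = {"sql"}

def passes_filter(applicant):
    # "If you don't have this, you're out. If you don't have that, you're out."
    return (applicant["years_experience"] >= REQUIRED_YEARS
            and REQUIRED_KEYWORDS <= applicant["keywords"])

print([a["name"] for a in applicants if passes_filter(a)])  # ['A. Rivera']
```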
Those filters can key on almost anything, from years of experience to your choice of words. Companies are also increasingly using automated video interviews, said Lindsey Cameron at the Wharton School.
“And it’s sort of monitoring your tone and your facial expressions and, you know, the depth and quality of your responses as best as they can,” she said.
Which, though maybe a bit creepy, isn’t necessarily bad, she said. Automated systems have the potential to bypass some human biases, but too often bias is just built into the tech, said Nicol Turner Lee, director of the Center for Technology Innovation at the Brookings Institution.
“Computers are programmed by humans, so they come with the same values, norms and assumptions that humans hold,” she said.
Amazon reportedly scrapped the AI recruiting system it was using a few years ago because of concerns about gender bias. Turner Lee said the algorithm was trained on historical data about successful candidates.
“Because the data was trained on men, it kicked out any resume that suggested a woman’s name, a woman’s college or a woman’s extracurricular activity, like the women’s lacrosse team,” she said.
Likewise, facial recognition software can disadvantage people with darker skin when algorithms are trained on white faces. There needs to be greater oversight, Turner Lee said, to make sure AI complies with civil rights law.