Credit scores and the bias behind them
While your credit score is generated from data about you, the algorithms that produce it are often designed around broader financial trends.
“Many times credit scores are built on history of all kinds of other aggregate data, so people who look like you,” said Safiya Noble, a professor of gender studies and African American studies at the University of California, Los Angeles.
Noble, who wrote the book “Algorithms of Oppression,” researches how algorithms can perpetuate racism and gender bias.
“And this is where we start to get in trouble,” Noble said. “If you are part of a group that has traditionally been denied credit or been offered predatory products, then your profile may, in fact, look more like those people and you will be dinged.”
As a result, consumers might not have access to a loan, a mortgage or better rates on insurance.
David Silberman, a senior fellow at the Center for Responsible Lending, said this is part of a bigger problem.
“Credit scores very much are reflecting of the history of discrimination in the country,” he said.
Silberman, who spent a decade at the Consumer Financial Protection Bureau and years in the financial services industry, has thought about how algorithms can reflect privilege, or the lack thereof.
“If one starts out without any wealth, with limited income prospects, the kinds of credit you can get is going to be affected,” he said.
For instance, payday lenders concentrate in African American and Latino neighborhoods and tend to offer loans with less favorable terms, so borrowers who use those lenders could be more likely to default.
“Your ability to repay that credit is going to be affected, and that’s then going to itself find its way into credit scores,” Silberman said.
According to payments processor Shift, white Americans have an average FICO score of 734, a relatively good score for most financial products. But for Black Americans, it's 677. A lower score can mean higher interest rates or a denied loan.
Because even accurate historical data can produce biased algorithms, many researchers and businesses are looking for new ways to determine creditworthiness. But those alternatives can be risky too.
Nicholas Schmidt, CEO of SolasAI, vets algorithms for disparate impact. He said bias can “creep in” anywhere.
“Most people talk about bias in the data. And that’s somewhat of an obvious thing,” he said.
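Vetting for disparate impact can be done in many ways, and this is not necessarily how SolasAI does it, but one common first-pass check is the adverse impact ratio: compare each group's approval rate to the most-approved group's rate and flag anything below 0.8, the "four-fifths rule" drawn from U.S. employment guidance. A minimal sketch in Python, with hypothetical numbers:

```python
# Minimal sketch of an adverse impact ratio (AIR) check on loan approvals.
# Illustrative only: group names, counts and the 0.8 threshold (the
# "four-fifths rule") are assumptions, not SolasAI's actual method.

def adverse_impact_ratio(approvals):
    """approvals maps group name -> (approved_count, applicant_count)."""
    rates = {g: ok / total for g, (ok, total) in approvals.items()}
    benchmark = max(rates.values())  # most-approved group as the reference
    return {g: rate / benchmark for g, rate in rates.items()}

# Hypothetical approval counts by group.
approvals = {
    "group_a": (820, 1000),  # 82% approved
    "group_b": (590, 1000),  # 59% approved
}

for group, air in adverse_impact_ratio(approvals).items():
    status = "potential disparate impact" if air < 0.8 else "ok"
    print(f"{group}: AIR = {air:.2f} -> {status}")
```

On these invented numbers, group_b's ratio comes out around 0.72, below the four-fifths threshold, which is the kind of signal that would prompt a closer look at the model.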
One example he shared involved an algorithm a lender built to assess credit risk, predicting which consumers would fail to pay their credit card debt. The best predictor, he said, was how often consumers shopped at convenience stores.
Whether at a gas station, in a strip mall or at a freestanding shop like Patron Convenience Store in southeast D.C., a convenience store can be busy even on a Wednesday morning, with people buying lottery tickets and snacks.
“And I thought about it. What do you get at a convenience store — cheap beer, cigarettes, bad candy and lottery tickets?” Schmidt said. “Those are all probably pretty well correlated with risky behavior, which is probably well correlated with bad credit card outcomes.”
But then Schmidt and his team thought about it some more and realized there was a gaping hole in that analysis: food deserts. These are areas where residents are low-income and lack easy access to supermarkets or large grocery stores, according to the U.S. Department of Agriculture.
In 2021, about 13.5 million people lived in American food deserts — and many of them shopped at convenience stores.
Ekram Aman is a cashier at the Penn Way Market, a strip mall convenience store in a food desert in southeast Washington, D.C.
She said most of her patrons use electronic benefit transfer, a tool to access government food aid programs, to buy groceries.
“They say because it’s convenient for them. And for people especially who don’t drive, it’s very convenient,” Aman said.
Most customers come from the neighborhood and walk to Penn Way, she said. Sometimes they send their kids to pick up food for dinner or some of the household goods packed into the shelves of the narrow store.
Schmidt of SolasAI said the use of data generated this way is a form of discrimination that could creep through when an algorithm lumps all these people together.
“What you’re going to do is capture the risky behavior of whites in the suburbs, who are going to convenience stores and buying lottery tickets and bad candy and bad beer,” he said.
But, Schmidt said, you're also going to capture creditworthy people: low-income people and people of color in cities, as well as wealthier people in dense urban neighborhoods who shop at bodegas.
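To see how a facially neutral variable can act as a proxy, consider a toy simulation. It assumes, purely for illustration, that food desert residents visit convenience stores more often because that's where they buy groceries; a rule built on visit counts then penalizes them without ever seeing neighborhood or race. A hypothetical sketch with invented numbers:

```python
# Synthetic illustration of proxy discrimination. Every number here is
# invented; this is not drawn from any real lender's model.
import random

random.seed(0)

def visits_per_month(in_food_desert):
    """Return a simulated count of monthly convenience store visits."""
    if in_food_desert:
        # Food desert residents buy groceries at convenience stores, so
        # visit counts are high regardless of creditworthiness.
        return random.randint(15, 30)
    # Elsewhere, frequent visits skew toward the "lottery tickets and
    # cheap beer" pattern Schmidt describes.
    return random.randint(0, 20)

desert = [visits_per_month(True) for _ in range(3000)]
other = [visits_per_month(False) for _ in range(7000)]

# A naive credit rule: deny anyone with more than 12 visits a month.
THRESHOLD = 12
denied_desert = sum(v > THRESHOLD for v in desert) / len(desert)
denied_other = sum(v > THRESHOLD for v in other) / len(other)
print(f"Denial rate, food desert residents: {denied_desert:.0%}")
print(f"Denial rate, everyone else:         {denied_other:.0%}")
# The rule never sees neighborhood, income or race, yet it denies food
# desert residents at a far higher rate: visit count acts as a proxy.
```

Under these made-up assumptions, the rule denies nearly all food desert residents while denying only a minority of everyone else, even though neighborhood never appears as an input.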
Schmidt doesn't know whether that particular variable ended up in the lender's final model, since financial services firms often adjust their models to account for built-in biases.
But, said David Silberman at the Center for Responsible Lending, there’s only so much these algorithms can do.
“There may be tweaks that will, on the margins, bring more people into the system or give a fuller picture of their creditworthiness by looking at a richer set of data,” he said. “But I think that’s marginal. It’s not going to deal with the fundamental problems of inequality that we have to deal with.”