Collateral Damage: Landing Credit


Posted on April 16th, 2021 by Emilio Miles

In chapter 8 of Weapons of Math Destruction, author Cathy O’Neil looks at how WMDs affect people’s ability to get credit, loans, and jobs. She discusses the impact that big data has on the decisions credit companies make when extending credit, compares FICO scores with e-scores, and goes in depth on how e-scores affect more than just your ability to get credit.

O’Neil says FICO scores are relatively transparent and easy to tweak to be more accurate. They are calculated by looking only at a borrower’s finances, mostly their debt load and bill-paying record. The score is also color-blind: it doesn’t use race or ethnicity to make predictions. Because of this, it predicted risk far more accurately than a human loan officer could. However, since companies are legally prohibited from using credit scores for marketing purposes, e-scores were created instead. These are a mishmash of data that can include a person’s zip code, Internet surfing patterns, and even recent purchase history.

E-scores are arbitrary, unaccountable, unregulated, and often unfair. For example, a company called Neustar used e-scores to build a hierarchy of callers, quickly funneling a customer to a human operator if the data suggested they were more profitable or wealthy. Those at the bottom of the hierarchy faced longer wait times or were dispatched to an outsourced overflow center. Capital One used a similar model, drawing on the data it had about customers to target its credit card offers. People who were shopping for cars, for instance, were categorized by the type of car they were looking at: someone browsing new Jaguars was judged more likely to be wealthy than someone searching for a used car. The same was done with real estate data, so that someone seen surfing the web from a place like Beverly Hills looked like a better prospect than someone from Compton.
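O’Neil doesn’t publish any of these companies’ actual formulas, but a minimal sketch of the kind of proxy-driven scoring and call routing she describes might look like the following. Every field, weight, and threshold here is hypothetical.

```python
# Toy sketch of proxy-based e-scoring and call-center triage.
# Nothing here comes from Neustar's or Capital One's real systems;
# the fields, weights, and thresholds are all made up.

def e_score(customer: dict) -> float:
    """Score a caller on crude proxies rather than their actual finances."""
    score = 0.0
    # Proxy 1: where the person is browsing from (stands in for wealth).
    if customer.get("zip_code") == "90210":      # e.g. Beverly Hills
        score += 40
    elif customer.get("zip_code") == "90220":    # e.g. Compton
        score -= 20
    # Proxy 2: recent browsing history (stands in for purchasing power).
    if "new_jaguar" in customer.get("pages_viewed", []):
        score += 30
    elif "used_cars" in customer.get("pages_viewed", []):
        score -= 10
    return score

def route_call(customer: dict) -> str:
    """Send 'profitable-looking' callers to a human; everyone else waits."""
    return "human_operator" if e_score(customer) >= 30 else "overflow_center"

print(route_call({"zip_code": "90210", "pages_viewed": ["new_jaguar"]}))  # human_operator
print(route_call({"zip_code": "90220", "pages_viewed": ["used_cars"]}))   # overflow_center
```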

E-scores thus create a pernicious feedback loop. Sticking with the real estate example, an e-scoring system is likely to give a borrower from a rough section of Detroit a low score simply because many people there default. The credit card offers that pop up for that borrower are then the ones targeted at a riskier demographic, which means less available credit and higher interest rates. That makes things even harder for someone who is already struggling, which makes further defaults, and even lower scores, more likely. Having a human at the finishing end of these algorithms to review their decisions can help avoid these types of problems.
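To make the loop concrete, here is a toy simulation built on purely made-up assumptions: lower scores draw costlier credit offers, costlier credit raises the chance of default, and observed defaults push the area’s score down again.

```python
# Toy simulation of the feedback loop O'Neil describes.
# All mappings and numbers below are hypothetical, chosen only to
# illustrate how the loop feeds on itself.

def offered_interest_rate(score: float) -> float:
    """Lower scores get costlier credit (hypothetical mapping)."""
    return 0.10 + max(0.0, 600 - score) * 0.0005

def default_probability(rate: float) -> float:
    """Costlier credit makes default more likely for a struggling borrower."""
    return min(0.9, 0.05 + rate * 2)

def update_score(score: float, default_rate: float) -> float:
    """The area's e-score falls as observed defaults rise."""
    return score - 200 * default_rate + 10

score = 550.0  # a neighborhood that starts out slightly below average
for year in range(5):
    rate = offered_interest_rate(score)
    defaults = default_probability(rate)
    score = update_score(score, defaults)
    print(f"year {year}: score={score:5.1f}, rate={rate:.1%}, default_prob={defaults:.1%}")
```

Run for a few iterations, the score spirals downward even though nothing about any individual borrower’s behavior has been measured, which mirrors the dynamic O’Neil describes.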

The biggest takeaway from this chapter is that WMDs can be managed with the help of human error checking. Left to run by themselves, these algorithms are capable of causing destructive harm. When checked, however, they can prove to be useful solutions to tough problems.
