Ineligible to Serve


Posted on March 25th, 2021 by Emilio Miles

In chapter 6 of Weapons of Math Destruction, Cathy O’Neil talks about the ways that companies use AI to screen out applicants unfairly. She also mentions how St. George’s Hospital Medical School blackballed foreign students using such software. These programs, like all the WMDs discussed in the book, were meant to help humans do their jobs more efficiently. Like many other Big Data systems, they settle for proxies to accomplish this task. However, as we’ve seen, proxies are often inexact and unfair.

O’Neil tells the story of Kyle Behm, a student with a perfect SAT score who previously attended Vanderbilt University, yet had a tough time finding a job. He suffers from bipolar disorder and had to take time off from school to get treatment. When he applied for jobs, he was required to take a personality test made by Kronos. The test is used to filter out applicants, and because of answers tied to his mental health, he was screened out of almost every job he applied for. This is an example of a feedback loop created by a WMD: red-lighting people with certain mental health issues prevents them from getting a job and leading a normal life, thereby isolating them even further.

She also talks about how St. George’s Hospital Medical School used a program to help screen applicants to its program. The school needed to filter out enough applicants for humans to take over the interviewing process, so the program would automatically cull thousands of applications down to roughly 500 to make the pool manageable. However, even though one of its goals was fairness, the software learned to discriminate from the records of previous years’ screenings. It rejected applicants from regions where they would be less likely to speak English, instead of considering the possibility that they could learn it.
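The details of St. George’s actual model were never published, so the Python sketch below is only a toy illustration of the mechanism O’Neil describes: a screener that “learns” from biased historical decisions ends up reproducing that bias, auto-rejecting a flagged group regardless of qualification. All the data, rates, and function names here are hypothetical.

```python
import random

random.seed(0)

# Hypothetical historical records: (flagged_by_proxy, qualified, admitted).
# The proxy stands in for something like a non-European name or birthplace;
# past human screeners admitted far fewer flagged applicants.
history = []
for _ in range(1000):
    flagged = random.random() < 0.3
    qualified = random.random() < 0.7
    admit_rate = 0.25 if flagged else 0.90   # the human bias being encoded
    admitted = qualified and random.random() < admit_rate
    history.append((flagged, qualified, admitted))

def admit_rate_for(flag_value):
    """'Training': estimate the historical admission rate for a group."""
    group = [r for r in history if r[0] == flag_value]
    return sum(r[2] for r in group) / len(group)

def screen(flagged, qualified, threshold=0.5):
    """A screener that simply imitates past decisions."""
    return qualified and admit_rate_for(flagged) > threshold

print(f"historical admit rate, flagged:     {admit_rate_for(True):.2f}")
print(f"historical admit rate, not flagged: {admit_rate_for(False):.2f}")
print(screen(flagged=True, qualified=True))    # False: culled by learned bias
print(screen(flagged=False, qualified=True))   # True
```

Nothing in the learned rule ever evaluates the merit of a flagged applicant; the model just inherits the pattern baked into its training records, which is exactly how a system built for efficiency ends up automating past discrimination.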

What can be done about this? The book offers a solution that is very simple, yet would likely be the best course of action: instead of using these systems to reject and punish people, we should use them to reach out to people with the resources they need. That way, people with great potential aren’t overlooked and are given the chance to grow. The biggest takeaway from the chapter: artificial intelligence should be used to help others rather than to punish them.


Emilio Miles

Computer Science student at the University of Kansas