This system failure does not stop at housing.
Similar software tools can embed hidden discrimination in decisions that affect your life at every level: a university screening you out, a company not hiring you, a bank denying you credit or a loan, or even a hospital refusing you painkillers.
This problem – humans outsourcing important decisions about people’s lives to automated formulas built on faulty or biased data – is known as “algorithmic discrimination.” And we in DC have a chance to pass a bill that bans it.
The Stop Discrimination by Algorithms Act (SDAA) is a 21st century civil rights law. The bill – which is now in committee at the DC Council and had a hearing last month – prohibits algorithmic discrimination based on protected characteristics in education, employment, housing, health care, credit and insurance. It also requires companies to be transparent to the public whenever they use the types of software described above to inform their decisions.
Does every new technology require new laws? Not always, but this one does. Here’s why.
First, back to that computer-generated score, the one that kicks you out of a job, apartment, medical treatment, or school. This score is the result of statistical models drawn from massive amounts of personal data collected from all directions: social media, shopping behavior, employment history, police records, and a myriad of other public and commercial databases. If this data is wrong, so is the resulting digital sketch. Yet this distorted sketch is what can have a huge impact on your future.
For example, the NarxCare addiction risk algorithm may disproportionately and erroneously recommend against prescribing painkillers to women, because the algorithm gives weight to gender-correlated factors such as a history of trauma.
This is also why a job interview platform may mistakenly reject a qualified candidate. This could happen if the candidate does not display specific behavioral cues that the software algorithm has been trained to look for as signs of competence for the role. If the developers haven’t considered neurodivergence or cultural diversity, for example, then that person’s algorithmic data sketch looks unqualified, even though the real person is qualified.
The law already protects people from being discriminated against by other people. It must also protect people from being discriminated against by a machine.
Second, algorithmic decision-making hides discrimination behind a veneer of so-called scientific “objectivity.” This is called “mathwashing”.
For example, a Black person may have always lived in a particular zip code due to historical redlining (the discriminatory institutional practice of denying financial services and other important resources to people living in predominantly Black neighborhoods). If that person were to seek housing elsewhere, landlords and banks could not categorically refuse to approve a mortgage or a lease for that person because of their race. However, they could use a tenant-scoring tool or a mortgage “risk assessment” tool that produces the same result.
The risk-scoring algorithm takes into account the person’s previous place of residence, but because of the legacy of redlining and other forms of historical oppression, the Black applicant could receive a “scientifically calculated” low score as a potential tenant or homeowner. Lifting the mathematical veil of the tool would reveal a centuries-old core of systemic racism.
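To make that mechanism concrete, here is a deliberately toy Python sketch of proxy discrimination. It does not depict any real scoring product; the zip codes, income threshold, and weights are all invented for illustration. The point is that a model that never sees race can still penalize applicants by race when it weighs a feature, such as a formerly redlined zip code, that correlates with race for historical reasons.

```python
# Hypothetical illustration only: a "risk score" with no race input
# can still encode racial disparity through a correlated feature.

REDLINED_ZIPS = {"20019", "20020"}  # invented stand-ins for formerly redlined areas

def risk_score(income: int, prior_zip: str) -> int:
    """Toy score: higher means 'riskier'. Race is never an input."""
    score = 0
    if income < 50_000:
        score += 1
    if prior_zip in REDLINED_ZIPS:  # the proxy: penalizes where you used to live
        score += 2
    return score

# Two applicants identical in every respect except prior address:
a = risk_score(income=60_000, prior_zip="20019")  # lived in a redlined area
b = risk_score(income=60_000, prior_zip="20815")  # did not
print(a, b)  # a exceeds b purely because of the zip-code proxy
```

The disparity here is baked in by the feature choice, which is exactly why an audit of outcomes, rather than a check for an explicit “race” input, is needed to detect it.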
This is why the SDAA would require companies and other organizations using algorithmic decision-making tools to proactively audit their systems for discriminatory impacts – to prevent willful ignorance.
Third, these algorithmic decision tools are widespread, influence many essential spheres of life, and require and generate massive amounts of data. This data then feeds into ever deeper and more intimate profiling. The risk of inaccurate data, the demonstrated likelihood of bias, and the inescapability of these tools have compounding ramifications on your life as a student, worker, patient, tenant, and borrower. These forks in the road throughout each of our lives, if chosen for us by an algorithmic decision-making tool at every turn, would ultimately create a discriminatory society worse than the sum of its technological, rights-violating parts.
If civil rights protections are to keep pace with this kind of technological threat to equality, they require an updated legal framework. We urge the DC Council to pass the Stop Discrimination by Algorithms Act and hope that other state and federal lawmakers will soon follow.