Mary Louis, a Black woman, was excited to move into an apartment in Massachusetts in the spring of 2021. That excitement turned to disappointment when she received an email saying a third-party service had rejected her lease application.
Louis went on to lead a class action lawsuit over the algorithm that third-party firm used to score rental applicants, claiming the program discriminated against applicants based on their ethnicity and income.
Under a settlement approved by a federal judge on Wednesday, one of the first of its kind, the algorithm's developer agreed to pay more than $2.2 million and to withdraw certain screening products that the lawsuit claimed were discriminatory.
The settlement includes no admission of fault by SafeRent Solutions, which said in a statement that while it continues to believe its SRS Scores comply with all applicable laws, litigation is costly and time-consuming.
The use of algorithms or artificial intelligence programs to screen or score Americans isn't new; such systems have quietly shaped important decisions about Americans' lives for years. Lawsuits like this one, however, may be.
When someone applies for a job, a home loan, or even certain medical care, there is a chance an AI system or algorithm is scoring or assessing them, just as one did Louis. Yet even though some of these systems have been shown to discriminate, they remain largely unregulated.
According to Todd Kaplan, one of Louis' lawyers, management companies and landlords are now on notice that the systems they take for granted will be challenged.
According to the lawsuit, SafeRent's algorithm failed to account for the benefit of housing vouchers, which the plaintiffs argued is a crucial factor in a renter's ability to pay the monthly rent, and it therefore discriminated against low-income applicants who qualified for that assistance.
The lawsuit also claimed that SafeRent's algorithm relied too heavily on credit information. The plaintiffs argued that credit history does not give a complete picture of an applicant's ability to pay rent on time, and that it unfairly disadvantages Black and Hispanic applicants with housing vouchers, in part because their lower median credit scores can be traced to historical injustices.
Christine Webber, one of the plaintiffs' lawyers, said that even when an algorithm or artificial intelligence is not designed to discriminate, the data it uses, or the weight it gives that data, can have the same effect as if it had been instructed to do so.
Louis tried to appeal the denial of her application, submitting references from two landlords attesting to 16 years of on-time or early rent payments, despite her lack of a strong credit history.
Louis, who had a housing voucher, was in a hurry: she had already told her previous landlord she was leaving, and she was responsible for caring for her grandchild.
The management company that used SafeRent's screening service replied, "We do not accept appeals and cannot override the outcome of the Tenant Screening."
Louis said the rejection demoralized her; the algorithm, she said, didn't know her.
"Everything is based on numbers," Louis said. "You don't get the individual empathy from them. The system cannot be beaten. We will always lose to the system."
State lawmakers have proposed aggressive regulations for these kinds of AI systems, but the measures have not gained enough support. That means lawsuits like Louis' are beginning to lay the groundwork for AI accountability.
In a motion to dismiss, SafeRent's defense lawyers argued that the company should not be held liable for discrimination because it did not make the final decision on whether to accept or reject a tenant. The service screened applicants, scored them, and submitted a report, but left it to landlords or management companies to accept or reject an applicant.
Louis' lawyers, along with the U.S. Department of Justice, which filed a statement of interest in the lawsuit, argued that SafeRent's algorithm could be held accountable because it still shapes access to housing. The judge denied SafeRent's motion to dismiss those counts.
Under the settlement, SafeRent is barred from using its score component on tenant screening reports in certain situations, including when the applicant is using a housing voucher. If SafeRent develops a new screening score, it must be validated by a third party and approved by the plaintiffs.
Louis has since moved into an affordable apartment that her son found for her on Facebook Marketplace, though it cost $200 more and was in a less desirable neighborhood.
"I'm not hopeful that I'll get a break, but I have to keep on keeping on," Louis said. "Too many people depend on me."
— Jesse Bedayn, The Associated Press/Report for America
Jesse Bedayn is a corps member with The Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.