Black woman wins landmark $2.2M settlement against discriminatory housing algorithm

SafeRent is accused of using an AI algorithm that disproportionately penalized Black and Hispanic applicants

On Nov. 20, 2024, Mary Louis, a Black woman, secured a significant legal victory with a $2.2 million settlement in her lawsuit against SafeRent Solutions, a third-party service that provides resident screening and applicant risk scores to landlords and property managers. The lawsuit was initiated after Louis was rejected for an apartment in Massachusetts in 2021, despite her eagerness to move into a new home.

According to reports from the Associated Press, Louis was devastated when she received an email informing her that her application had been denied. The core of her lawsuit was the algorithm SafeRent used to evaluate rental applicants: Louis claimed it discriminated against her based on her race and income, leading to an unjust denial of housing.


Discriminatory algorithms and housing access

The lawsuit highlighted critical flaws in SafeRent’s algorithm, particularly its failure to consider the benefits of housing vouchers, which are essential for many low-income applicants. Louis argued that this oversight unfairly disadvantaged those who rely on such vouchers to secure housing. The suit further contended that the algorithm placed excessive emphasis on credit information, which provided an incomplete picture of an applicant’s ability to pay rent on time.

Moreover, the lawsuit pointed out that the algorithm disproportionately penalized Black and Hispanic applicants, who often have lower median credit scores due to systemic inequities. This bias made it increasingly difficult for these individuals to obtain housing, perpetuating a cycle of discrimination and exclusion.


SafeRent’s response and future changes

In response to the lawsuit, SafeRent has agreed to revise provisions within its AI algorithm and screening tools that were identified as discriminatory. Notably, the settlement does not include any admission of wrongdoing. SafeRent stated that while it remains confident its scoring systems comply with relevant laws, it acknowledged the challenges and costs associated with litigation.

The broader impact of AI algorithms on Black communities

The implications of this case extend beyond Mary Louis. AI algorithms have long been scrutinized for their disproportionate impact on Black communities. A 2021 study from the University of California, Berkeley, revealed that AI-based mortgage systems consistently charged Black and Latinx borrowers higher interest rates than their white counterparts for identical loans. Additionally, in 2023, the Los Angeles Homeless Services Authority faced accusations of using an algorithmic scoring system that resulted in Black and Latinx individuals experiencing homelessness receiving lower priority scores for assistance.

Todd Kaplan, one of Louis’ attorneys, expressed hope that this landmark victory would send a strong message to mortgage companies and landlords. He emphasized the need for changes to their algorithms. This sentiment underscores the growing awareness and demand for accountability in the use of technology within housing and finance.

Louis’ case is a pivotal moment in the fight against discriminatory practices in housing. It not only highlights the challenges faced by Black individuals in securing housing but also serves as a call to action for companies to reevaluate their algorithms and practices. As the conversation around racial equity and technology continues to evolve, it is crucial for stakeholders to prioritize fairness and inclusivity in their systems.
