Your Tenant Screening Software is a Lawsuit Waiting to Happen

When the Algorithm Denies Housing, Who Takes the Blame?

Here’s a scenario that should keep every property manager up at night: A qualified tenant with 16 years of on-time rent payments applies for an apartment. Your AI screening tool spits out a score of 324. She needed 443. Application denied. No explanation. No appeal.

That’s exactly what happened to Mary Louis in Massachusetts. The twist? She’s now the lead plaintiff in a class-action lawsuit that could reshape how every property manager in America screens tenants.

Are you ready for that phone call from your attorney?

The Speed-Accuracy Paradox: What Your Screening Tool Isn’t Telling You

AI tenant screening promises the dream: faster decisions, consistent criteria, reduced vacancies. And let’s be honest—it delivers on speed. What used to take days now takes minutes. One survey found that two-thirds of landlords now receive AI-generated scores or recommendations in their tenant screening reports.

But here’s the uncomfortable question nobody wants to ask: How many of those fast decisions are wrong?

A December 2024 Government Accountability Office report raised serious red flags about AI in rental housing. The GAO found that these technologies could potentially violate fair lending, fair housing, and consumer protection laws. Even more concerning? The algorithms that determine whether someone gets housing are often complete mysteries—even to the people using them.

The Electronic Privacy Information Center puts it bluntly: tenant screening companies are generating reports that contain serious errors and biases, and they’re neither vetting the third-party information they use nor monitoring their services for mistakes. When the NACA v. RentGrow lawsuit survived a motion to dismiss in November 2024, the D.C. Superior Court effectively said: we need to take a closer look at how these systems actually work.

If your current screening process can’t explain why it rejected an applicant, you’re not just risking a bad tenant decision. You’re building a legal exposure file one denial at a time.

The Black Box Problem: You Can’t Defend What You Can’t Explain

Think about the last applicant your system rejected. Can you explain exactly why? Not just “the score was too low”—but what specific data points drove that score? What weight did each factor carry? Was there any data that should have been excluded because it fell outside the Fair Credit Reporting Act’s seven-year reporting window for most adverse information?

Most property managers can’t answer these questions. And that’s precisely the problem.
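
At least one of those questions is mechanically checkable. Here’s a minimal Python sketch of how a reviewer could flag adverse items that fall outside the FCRA’s usual seven-year reporting window. The record format is made up, since every vendor’s report layout differs, and the limits shown reflect the general FCRA rules, not legal advice:

```python
from datetime import date

# Hypothetical record format: every vendor's report layout differs.
# The FCRA (15 U.S.C. § 1681c) generally bars reporting most adverse
# items older than 7 years; bankruptcies may be reported for 10.
LIMIT_YEARS = {
    "eviction_filing": 7,
    "collection": 7,
    "civil_judgment": 7,
    "bankruptcy": 10,
}

def stale_items(records, as_of=None):
    """Return the records that fall outside the FCRA reporting window."""
    as_of = as_of or date.today()
    flagged = []
    for rec in records:
        limit = LIMIT_YEARS.get(rec["type"], 7)  # default to the stricter window
        age_years = (as_of - rec["date"]).days / 365.25
        if age_years > limit:
            flagged.append(rec)
    return flagged

report = [
    {"type": "eviction_filing", "date": date(2015, 3, 1)},   # ~10 years old: flag
    {"type": "collection",      "date": date(2021, 6, 15)},  # inside window: keep
]
for rec in stale_items(report):
    print(f"FLAG: {rec['type']} from {rec['date']} exceeds the reporting window")
```

If your vendor can’t give you even this much visibility into record dates and types, that tells you something about the rest of the pipeline.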

HUD’s May 2024 guidance made this crystal clear: housing providers are responsible for ensuring that the AI tools they use comply with the Fair Housing Act. Using a third-party screening company doesn’t get you off the hook. If that company’s algorithm discriminates—even unintentionally—you’re potentially liable.

The guidance specifically calls out the dangers of automated decision-making systems that perpetuate racial biases. Criminal records, eviction filings, credit histories, even address histories—these data sources can reflect historical discrimination from practices like redlining. When an AI learns from this biased data, it encodes inequality into its recommendations.

One lawsuit against SafeRent alleges the algorithm assigns disproportionately lower scores to Black and Hispanic rental applicants compared to white applicants. Another lawsuit, filed in Jacksonville, claims a screening tool flagged tenants with dropped or cleared eviction filings—penalizing them for cases that never resulted in actual evictions.

Meanwhile, TechEquity’s research found that 37% of landlords rely solely on the algorithm’s recommendation to make rental decisions. No human review. No consideration of context. Just whatever the black box spits out. Is that really the process you want to defend in court?

The Hidden Costs of “Efficiency”

Let’s do the math that screening tool vendors don’t want you to do.

When an automated screening wrongly rejects a qualified applicant, you don’t just lose a good tenant. You pay for extended vacancy, additional marketing, more staff time processing applications. And if that wrongly rejected applicant files a complaint? Now you’re paying for legal review, response documentation, and potentially settlement costs.
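
To make that math concrete, here’s a back-of-the-envelope sketch in Python. Every dollar figure below is an illustrative assumption; swap in your own numbers:

```python
# Back-of-the-envelope cost of one wrongful denial.
# Every dollar figure below is an illustrative assumption.
monthly_rent         = 1800
extra_vacancy_months = 1.5    # time to re-list and re-screen
marketing_cost       = 400    # extra listing and advertising spend
staff_hours          = 10     # reprocessing applications
staff_rate           = 35     # fully loaded hourly cost
legal_review         = 2500   # if the denial draws a complaint

lost_rent  = monthly_rent * extra_vacancy_months
staff_cost = staff_hours * staff_rate
baseline   = lost_rent + marketing_cost + staff_cost

print(f"Cost without a complaint: ${baseline:,.0f}")                     # $3,450
print(f"Cost if a complaint is filed: ${baseline + legal_review:,.0f}")  # $5,950
```

Even with conservative assumptions, one bad automated rejection can erase months of the efficiency gains the tool was supposed to deliver.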

In 2023, the CFPB ordered TransUnion to pay $23 million, in part over inaccurate data in its tenant screening products. Three years earlier, the FTC had settled with AppFolio over faulty tenant screening reports that resulted in qualified tenants being turned away from homes. Then-FTC Commissioner Rohit Chopra didn’t mince words, stating that sloppy, inaccurate credit reporting practices can reinforce discrimination and foreclose opportunities for individuals seeking a better home.

The D.C. Housing Authority has used RentGrow’s screening services since 2018 for its Housing Choice Voucher Program—meaning the most vulnerable residents, those who depend on housing assistance, have been subject to automated decisions that allegedly contain serious errors. That’s not just a business problem. That’s a human problem.

The question isn’t whether AI screening saves time. The question is whether the time you save is worth the trust you destroy.

Wondering how to implement AI screening that actually protects you? Property managers who want to leverage AI without the legal exposure are finding that the right implementation strategy makes all the difference. Reach out to learn how we help real estate professionals deploy AI-powered marketing and operations workflows that are both effective and defensible.

What the Regulators Are Actually Saying

Let’s cut through the noise and look at what federal agencies are telling property managers right now.

HUD’s 2024 guidance documents lay out clear expectations. First, AI tools must not discriminate, even unintentionally. Second, landlords are responsible for ensuring the tools they use are fair and compliant. Third, transparency and accountability are essential—if someone is denied housing, they should understand why.

The guidance specifically recommends that property managers screen applicants only for information relevant to the likelihood that they will comply with tenancy obligations. It emphasizes ensuring records are accurate, since certain types of inaccuracies are more likely to occur for members of some demographic groups. And it calls for transparency with applicants by making screening policies written, public, and readily available.

The GAO’s December 2024 report goes further. Federal agencies have pursued legal action against companies for misleading and discriminatory advertising on rental platforms. They’ve taken enforcement actions against companies that screen out tenants based on inaccurate or outdated data. And they’re beginning to change how they oversee these technologies for compliance with fair housing laws.

Consider California, where fair housing testing conducted across Los Angeles and Ventura Counties in 2024 produced sobering results: 54% of tested properties demonstrated discrimination based on source of income, 26% showed discriminatory practices against families with children, and 22% demonstrated discrimination based on race.

These aren’t abstract regulatory concerns. They’re active enforcement priorities.

The Predictive Analytics Problem: Guilty Before Moving In?

Here’s something that should genuinely concern every property manager using AI screening: almost one in five landlords surveyed by TechEquity reported receiving “predictive” information from screening companies. These aren’t scores based on what an applicant has done. They’re predictions about what the applicant might do—whether they’ll break their lease, fail to pay rent, or cause property damage.

Think about that for a moment. You’re denying someone housing based on something they haven’t done yet.

The European Union has moved to prohibit this kind of predictive scoring of individuals precisely because of its suspect nature. Yet one quarter of North Carolina landlords, 16% of California landlords, and 15% of Georgia landlords are receiving these predictions and using them to make housing decisions.

And here’s the transparency nightmare: only 3% of tenants surveyed could name the screening company that assessed them. The rest didn’t know who was evaluating them or how. If renters can’t identify who screened them, how can they enforce their rights under the Fair Credit Reporting Act? How can they dispute inaccurate information? How can they prove discrimination?

More importantly for property managers: if your tenants can’t explain the process that denied them, neither can you.

A Smarter Path Forward

None of this means you should abandon technology in tenant screening. That would be impractical and probably uncompetitive. But it does mean you need to be strategic about how you implement and oversee these tools.

Demand explainability. If your screening provider can’t tell you exactly how their algorithm works, that’s a red flag. You need to be able to articulate—and defend—every screening decision.

Maintain human oversight. Don’t rely solely on algorithmic recommendations. Build in human review for edge cases, appeals, and any situation where context matters—which is most situations involving housing.

Audit for bias regularly. If you’re not testing your screening outcomes across demographic groups, you’re flying blind. The patterns that indicate disparate impact may not be visible until you look for them. (A minimal audit sketch follows these recommendations.)

Document everything. Your screening policies should be in writing, publicly available, and consistently applied. When a denial is challenged, you need a paper trail that shows fair, lawful process.

Consider the source data. Criminal records, eviction filings, and credit histories all carry potential for bias. Understand what data your screening tool uses, where it comes from, and how often it’s updated.

Create appeals processes. Give applicants a meaningful way to provide additional context or dispute inaccurate information. This isn’t just good practice—it’s increasingly becoming a regulatory expectation.
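
For the bias audit recommended above, here’s a minimal Python sketch applying the four-fifths rule, a conventional disparate-impact heuristic borrowed from employment law. The counts are hypothetical placeholders, and a ratio below 0.80 is a prompt for human investigation, not a legal finding:

```python
# Four-fifths rule: compare each group's approval rate to the
# highest group's rate. Ratios below 0.80 are a conventional
# red flag for disparate impact, not a legal verdict.
# The counts below are hypothetical placeholders.
outcomes = {
    # group: (applications, approvals)
    "group_a": (220, 154),
    "group_b": (180, 99),
    "group_c": (140, 91),
}

rates = {g: approved / applied for g, (applied, approved) in outcomes.items()}
benchmark = max(rates.values())

for group, rate in sorted(rates.items()):
    ratio = rate / benchmark
    flag = "  <-- review" if ratio < 0.80 else ""
    print(f"{group}: approval {rate:.0%}, impact ratio {ratio:.2f}{flag}")
```

Run on real outcome data at a regular cadence, a report like this turns “audit for bias” from a slogan into a repeatable check, and any flag it raises should go to a human reviewer and, where warranted, to counsel.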

The Bottom Line: AI Is a Tool, Not a Shield

The lawsuit against RentGrow that survived dismissal in November 2024 sends a clear message: consumer protection laws can hold tenant screening companies liable for inaccurate and biased report data. The courts are beginning to take these claims seriously.

But here’s what too many property managers miss: these lawsuits don’t just target screening companies. They can target the landlords and property managers who use these tools without adequate oversight.

AI tenant screening isn’t going away. It’s too efficient, too scalable, and too embedded in modern property management operations. But the era of “set it and forget it” AI is ending. The property managers who thrive will be those who implement these tools thoughtfully, maintain meaningful human oversight, and can actually explain their decisions when challenged.

The technology exists to screen tenants quickly, fairly, and legally. The question is whether you’re willing to implement it that way—or whether you’ll keep trusting a black box that might be building your lawsuit file one rejection at a time.

Ready to Implement AI the Right Way?

The property managers who are winning with AI aren’t just adopting technology—they’re implementing it with proper oversight, documentation, and compliance frameworks. That’s exactly what we help real estate professionals do.

Our AI marketing implementation services help property managers, brokers, and real estate professionals leverage automation and artificial intelligence in ways that are transparent, defensible, and genuinely effective. From lead generation to tenant communications to marketing workflows, we help you capture the efficiency benefits of AI while maintaining the human judgment that keeps you protected.

Contact us today to learn how we can help you implement AI agents and workflows that save time, reduce risk, and improve the real estate transaction experience for everyone involved.

Sources

• EPIC: NACA v. RentGrow Case – https://epic.org/documents/naca-v-rentgrow/

• GAO WatchBlog: AI Is Changing Home Buying and Renting – https://www.gao.gov/blog/ai-changing-home-buying-and-renting-not-always-better

• HUD Fair Housing Act Guidance on AI – https://archives.hud.gov/news/2024/pr24-098.cfm

• TechEquity: Screened Out of Housing Research – https://techequity.us/2024/07/24/screened-out-of-housing-research-paper/

• American Bar Association: Ghosts in the Machine – https://www.americanbar.org/groups/crsj/resources/human-rights/2024-june/how-past-present-biases-haunt-algorithmic-tenant-screening-systems/

• CFPB: Tenant Screening Report Errors – https://www.consumerfinance.gov/about-us/blog/errors-in-your-tenant-screening-report-shouldnt-keep-you-from-finding-a-place-to-call-home/

AI Disclosure: This article was generated with AI assistance and may contain inaccuracies or factual errors. Readers are encouraged to verify information independently and consult with legal or industry professionals for specific guidance. The information provided is for educational purposes and does not constitute legal advice.