To paraphrase Mark Twain, history may not repeat itself, but it does rhyme. The U.S. mortgage industry was recently reminded of this fact when a CNN analysis found that the nation's largest credit union, Navy Federal, had the widest disparity in mortgage approval rates between white and Black borrowers of any major lender. The report, released in December, noted that Navy Federal Credit Union approved more than 75% of white borrowers who applied for a new conventional home purchase mortgage in 2022, compared with less than 50% of Black borrowers.
As ever, redlining — the intentional, systematic effort by American banks and government to refuse mortgages to African Americans and segregate U.S. cities — looms in the background. The practice wasn’t outlawed until 1968, and ongoing research at the University of California, Berkeley, School of Public Health finds that redlining affects non-white communities to this day in the form of air pollution, reproductive health disorders, and fewer urban amenities.
Is the mortgage industry offering fair access to loans?
The report suggests that the mortgage industry may be neglecting its duty to offer all applicants fair access to loans. With civil rights bills like the Fair Housing Act (FHAct) and Equal Credit Opportunity Act (ECOA) now the law of the land, most lenders will note that they rely on supposedly objective borrower screening algorithms to make lending decisions. But the results tell a different story.
The Federal Reserve's Consumer Compliance Handbook observes that "evidence of discriminatory intent is not necessary to establish that a lender's adoption or implementation of a policy or practice is in violation of the FHAct or ECOA." If a lender has a supposedly neutral policy that results in denying loans to members of a protected class (e.g., a racial, religious, or gender minority) at a greater rate, that policy may constitute lending discrimination. To defend such a policy, the lender must show that it is justified by "business necessity."
Meanwhile, repeated, heavily publicized evidence has dispelled the myth that algorithms are inherently neutral actors. Those in the data analytics field have long known that algorithms can not only encode but amplify bias. Amazon, for example, scrapped an experimental hiring algorithm after discovering in 2015 that it was systematically discriminating against women for technical jobs. Trained on the resumes of existing employees, most of whom were male, the algorithm reproduced that same bias when evaluating candidates.
Algorithmic bias in mortgage lending is just another example of the phenomenon that data analysts call "garbage in, garbage out." Institutions that feed their lending algorithm data that encodes structural racism should not be surprised when the lending algorithm produces a disparate impact.
One of the biggest offenders in algorithmic bias is the almighty credit score.
Lenders rapidly expanded their use of computerized credit scores in the 1970s and '80s, ironically, to protect themselves against discrimination lawsuits. Experts point out that the supposedly objective credit scoring system still bakes in intentional discrimination from decades ago. Leaning too heavily on credit scores biases lenders against a large swath of potential homebuyers, including foreign buyers, younger buyers, and buyers from families with low financial literacy, all of whom often lack robust U.S. credit histories.
To avoid discriminating against these groups, lenders need to scrap the credit score and start looking at cashflow. Cashflow underwriting is a transparent, data-driven approach built on an individual's core financial behavior metrics: account balances, cashflow trends, and the ratio of discretionary-to-core spending. It also puts income verification where it belongs: at the front of the process. Because these behavioral metrics are drawn from real-time financials in the applicant's bank data, cashflow underwriting is blind to race and age. People qualify based on their ability to pay, not their placement in some opaque scoring system.
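To make the idea concrete, here is a minimal sketch of how those three behavioral metrics might be computed from linked bank data. The transaction schema, field names, and category labels are illustrative assumptions, not any particular vendor's format.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, simplified record of a bank transaction; the field
# names and categories are illustrative, not any vendor's schema.
@dataclass
class Transaction:
    posted: date     # date the transaction cleared
    amount: float    # positive = deposit, negative = spend
    category: str    # e.g., "rent", "groceries", "dining", "travel"

# Assumed split between core obligations and discretionary spending.
CORE_CATEGORIES = {"rent", "utilities", "groceries", "insurance", "transit"}

def cashflow_metrics(daily_balances: list[float], txns: list[Transaction]) -> dict:
    """Compute the three factors named above: average balance,
    cashflow trend, and the discretionary-to-core spending ratio."""
    avg_balance = sum(daily_balances) / len(daily_balances)

    # Trend: net cashflow in the later half of the window minus the
    # earlier half; a positive value means cashflow is improving.
    ordered = sorted(txns, key=lambda t: t.posted)
    mid = len(ordered) // 2
    trend = sum(t.amount for t in ordered[mid:]) - sum(t.amount for t in ordered[:mid])

    core = sum(-t.amount for t in ordered
               if t.amount < 0 and t.category in CORE_CATEGORIES)
    discretionary = sum(-t.amount for t in ordered
                        if t.amount < 0 and t.category not in CORE_CATEGORIES)
    ratio = discretionary / core if core else float("inf")

    return {
        "avg_balance": avg_balance,
        "cashflow_trend": trend,
        "discretionary_to_core": ratio,
    }
```

In practice, these inputs would come from a bank-data aggregator, and the thresholds applied to the resulting metrics would be set by each lender's credit policy.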
Cashflow underwriting also addresses another problem with traditional screening: the overreliance on paystubs. According to the Bureau of Labor Statistics, over 10% of Americans are self-employed. With the growth of the gig and sharing economies, as well as the rise of social media influencers, an increasing number of Americans get their income from nontraditional sources. A cashflow-first approach recognizes applicant income from all sources, based on bank deposit history rather than payroll alone, making it easier for applicants to demonstrate their ability to pay.
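As a rough sketch of that deposit-based view, the fragment below averages monthly inflows per payer, treating payroll and gig payouts identically. The deposit records and payer descriptors are made up for illustration; real bank feeds vary by institution.

```python
from collections import defaultdict
from datetime import date

# Hypothetical deposit records: (posted date, payer descriptor, amount).
deposits = [
    (date(2024, 1, 5),  "ACME PAYROLL",     2400.00),
    (date(2024, 1, 19), "ACME PAYROLL",     2400.00),
    (date(2024, 1, 22), "RIDESHARE PAYOUT",  610.50),
    (date(2024, 2, 2),  "ACME PAYROLL",     2400.00),
    (date(2024, 2, 11), "RIDESHARE PAYOUT",  575.25),
]

def monthly_income_by_source(deps, months: int) -> dict:
    """Average monthly inflow per payer, counting every deposit
    source (payroll, gig payouts, client invoices) equally."""
    totals = defaultdict(float)
    for _, source, amount in deps:
        totals[source] += amount
    return {src: total / months for src, total in totals.items()}

income = monthly_income_by_source(deposits, months=2)
print(income)                 # {'ACME PAYROLL': 3600.0, 'RIDESHARE PAYOUT': 592.875}
print(sum(income.values()))   # total estimated monthly income: 4192.875
```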
With today's artificial intelligence (AI) and machine learning technologies, lenders can make the switch to cashflow underwriting far more easily than they might think. AI-powered verification algorithms already on the market can evaluate an applicant's income, assets, employment history, and cashflow without invading anyone's privacy. The applicant's race, creed, sexual orientation, gender identity, current neighborhood, and place of origin never enter the picture.
Navy Federal may have received the bulk of the bad press this time around, but these problems (overreliance on credit scores, outdated expectations about employment) are industry-wide. To avoid becoming the next Navy Federal, lenders must evaluate the whole picture. By adopting cashflow underwriting backed by AI, lenders can make safe bets, free from prejudice or the appearance of bias, and help more people attain homeownership.
Tim Ray is co-founder and CEO of VeriFast, an identity and financial verification platform that reduces underwriting costs while eliminating fraud. A serial entrepreneur and angel investor, Tim is an influential voice in the real estate and property management sectors.
This column does not necessarily reflect the opinion of HousingWire's editorial department and its owners. To contact the author of this story:
Tim Ray at [email protected]
To contact the editor responsible for this story:
Tracey Velt at [email protected]