The alchemy of equality: Combining intention and technology for Underwriting for Racial Justice
- Historical practices, as well as modern "black box" credit scoring models, effectively bar communities of color from equal access to credit.
- Making credit underwriting fairer and more representative requires combining technology with intent, and that is exactly what a new partnership aims to do.

Recent economic pressures have hit hardest those who have less access to credit or are credit invisible. Historically, Black consumers have made up a significant portion of those most at financial risk.
For example, in 2019, the average wealth of a Black household was an eighth that of a white household. Black mortgage applicants are denied at a rate 84% higher than their white counterparts, and 6% of Black applicants overall are denied due to their credit history.
This is because credit scoring mechanisms have assumptions and processes in place that negatively impact Black consumers. For example, the ability to secure a mortgage and pay it down on time can positively impact your credit score. But mortgages are historically difficult for Black consumers to obtain. The practice of redlining, which designated Black neighborhoods as too risky, is illegal today, but academic research has found that formerly redlined districts still experience particularly high interest rates. Moreover, institutional practices like the FHA's mortgage risk maps of Chicago denied mortgages to large sections of the city.
In the absence of mortgages, Black consumers often rent, but some credit scoring models don't take rental payments into account. Being denied a mortgage because of a lower credit score, and being unable to improve that score without a mortgage, is a self-perpetuating cycle that leaves many Black consumers vulnerable to financial instability and riskier lending practices like payday loans.
Left unchecked, these practices result in mounting denials for communities that need access to credit. Earlier this year, the Justice Department accused the Los Angeles-based City National Bank of discrimination, saying the bank refused to underwrite mortgages in areas with large Black and Latino populations. The bank paid $31 million in one of the largest redlining settlements in history.
While City National Bank's case appears to involve an intentional policy of avoiding mortgages for non-white borrowers, the problem of inequitable access to credit has two heads: intention and technology. The technologies that underlie credit underwriting models often depend on black box machine learning algorithms, which rely heavily on historical data, and that data can be skewed against non-white communities.
Technology: The problem with credit underwriting models
Historical data: AI models are very data-dependent and are often considered only as good as the data given to them. And historical data is not neutral: it carries the imprint of the disparity between Black and white Americans. “Historical data is almost by definition biased. And without the right ability for overrides and intervention would just further propagate or perpetuate that bias going forward,” said Laura Kornhauser, co-founder and CEO of Stratyfy, a company that provides underwriting technology for lenders.
Credit scores: Credit scores, one of the most common ways of assessing the risk associated with a borrower, have problems of their own. Some lenders have sharp “cut-off” points below which they do not consider lending, according to Kornhauser. This can adversely impact non-white communities, which typically have lower credit scores.
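To make the effect of such a hard cut-off concrete, here is a minimal, hypothetical sketch. The threshold, the rent-history rule, and the function names are invented for illustration and do not reflect any particular lender's policy:

```python
# Hypothetical illustration of the hard "cut-off" practice described above.
# The threshold and the rent-history rule are invented for this sketch.

CUTOFF = 620  # an arbitrary hard floor, not any specific lender's policy

def hard_cutoff_decision(credit_score: int) -> str:
    """Deny any applicant below the cutoff, regardless of context."""
    return "deny" if credit_score < CUTOFF else "review"

def nuanced_decision(credit_score: int, on_time_rent_months: int) -> str:
    """Let a consistent rent-payment history offset a near-miss score."""
    if credit_score >= CUTOFF:
        return "review"
    if credit_score >= CUTOFF - 40 and on_time_rent_months >= 24:
        return "review"  # near-miss score, strong payment history
    return "deny"

# A 600-score applicant with two years of on-time rent is denied outright
# under the hard cutoff, but reaches manual review under the nuanced rule.
print(hard_cutoff_decision(600))  # deny
print(nuanced_decision(600, 24))  # review
```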
Transparency: Most machine learning algorithms are black boxes, the machinations of which are impervious to scrutiny. This is a problem from a regulatory and accuracy standpoint. When a model takes in a barrage of data and spits out a borrower’s risk categorization, there is little to no understanding of how it came to that conclusion. The opacity of ML models makes it hard to ascertain which parameters impacted the end result most heavily and how different parameters interacted with each other to produce a result. If lenders can’t tell what parameters may be introducing bias into the system, they cannot correct for it.
To counteract this lack of clarity in traditional black box models, some firms use approaches like explainable and interpretable AI, which provide insight into how a model works. Explainable AI runs on top of existing black box models to render them comprehensible, while an interpretable AI model is built to avoid the black box state to begin with. For example, Stratyfy's model is an interpretable one: “It does not require the post hoc or after the fact explainers to be layered on top of it,” said Kornhauser.
To make its scoring interpretable, Stratyfy exposes its internal parameters to the lender and allows the lender to adjust them. Since each decision on a loan application can be scrutinized to understand what factors impacted it, areas where bias might have seeped in can be identified and then adjusted.
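As a rough illustration of what an interpretable scoring model looks like, consider the point-based sketch below. This is not Stratyfy's actual implementation; the factors, weights, and helper names are invented. The key property is that every parameter is visible and directly editable, so each decision decomposes factor by factor without a post hoc explainer:

```python
# Minimal sketch of an interpretable scorecard. Every weight is explicit,
# so the contribution of each factor to a decision is directly readable.
# Factors and weights here are invented for illustration.

weights = {
    "on_time_payments_24m": 3.0,   # points per month of on-time payment
    "utilization_ratio":   -50.0,  # penalty scaled by revolving utilization
    "months_since_default": 0.5,   # gradual recovery credit over time
}

def score(applicant: dict) -> float:
    """Transparent linear score: a sum of visible factor contributions."""
    return sum(w * applicant.get(k, 0.0) for k, w in weights.items())

def explain(applicant: dict) -> dict:
    """Per-factor contributions; no after-the-fact explainer needed."""
    return {k: w * applicant.get(k, 0.0) for k, w in weights.items()}

applicant = {"on_time_payments_24m": 22, "utilization_ratio": 0.4,
             "months_since_default": 36}
print(score(applicant))    # 64.0
print(explain(applicant))  # shows exactly what drove the score

# If a factor turns out to proxy for a protected class, the lender can
# adjust or zero out its weight directly instead of retraining a black box.
weights["utilization_ratio"] = -30.0
```

The contrast with a black box model is the last line: correcting a biased parameter becomes an auditable, one-line policy change rather than a retraining exercise.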
However, as the City National Bank case made apparent, the issue of unfair credit underwriting practices is not one of technology alone. To enable equitable access to credit, both intention and technology have to come together. The Beneficial State Foundation, a nonprofit that focuses on financial justice, has partnered with Stratyfy on the Underwriting for Racial Justice (URJ) program. Under this program, the partnership will run a two-year pilot with 20 lenders that will leverage Stratyfy's technology to make more nuanced underwriting decisions and revise their loan policies. “Justice is well overdue and if we can use technology to help more people get access to capital suited to their needs and financial situations, we should certainly try,” said a spokesperson from Beneficial State Foundation.
Combining intent and technology: Underwriting for Racial Justice
Lenders like Berkshire Bank, Texas National Bank, and Eastern Bank, which are part of the URJ program, were selected because they have already demonstrated an intention to improve access to credit within their communities. Through URJ, these lenders will be able to learn from Stratyfy's technology and convene to discuss how they are overcoming their individual challenges.
“Through the insights they gain in working with Stratyfy staff and using their tool, we expect the lenders will have a much better sense of what specific changes they can make to their underwriting to realize this goal. We expect this to mean more capital to borrowers of color more quickly, which, in turn, means more business growth, homeownership, potential for wealth building, and less financial stress and trauma for families and individuals in various cities, rural communities and Tribal communities across the country,” a spokesperson from Beneficial State Foundation said.
Each lender's commitment comes with a stated impact goal that Stratyfy will help track. Kornhauser hopes that once the pilot program ends, lenders will see the benefits of the change and double down on the practice.
The technology, which lets lenders see whether data parameters correlate with a protected class and how different parameters interact, will also cut down on the issues with traditional black box models. Although Stratyfy can take alternative data into account, Kornhauser added that the lenders it is currently working with are not yet ready to leverage other data sources.
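What such a correlation check might look like in its simplest form is sketched below. This is an illustrative approximation, not Stratyfy's method; the column names and threshold are invented, and real systems would use more robust dependence measures than plain correlation:

```python
# Rough sketch: flag input features that correlate with a protected class
# and may therefore act as proxies for it. Illustrative only; column names
# and the 0.3 threshold are invented for this example.

import pandas as pd

def flag_proxy_features(df: pd.DataFrame, protected: str,
                        threshold: float = 0.3) -> dict:
    """Return features whose absolute correlation with the protected
    attribute exceeds the threshold."""
    flags = {}
    for col in df.columns:
        if col == protected:
            continue
        corr = df[col].corr(df[protected])
        if abs(corr) >= threshold:
            flags[col] = round(corr, 2)
    return flags

# Synthetic example: zip-code median income can proxy for race even when
# race itself is never used as a model input.
df = pd.DataFrame({
    "protected_class":   [1, 1, 0, 0, 1, 0, 0, 1],
    "zip_median_income": [30, 35, 80, 75, 32, 90, 70, 28],
    "loan_amount":       [12, 10, 12, 10, 12, 10, 12, 10],
})
print(flag_proxy_features(df, "protected_class"))
# {'zip_median_income': -0.97} -- a feature the lender should scrutinize
```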
The big picture
Perhaps one of the biggest upsides of this program will be the generation of two years of data by 20 different financial institutions. This data will provide an excellent basis for analyzing how FIs fare when they course correct for financial justice. How do 20 lenders of different types and sizes solve for equitable access to credit within their localities and their infrastructure? Which solutions work, and where does correction start eating into risk tolerance? Given the current economic uncertainty, a shorter pilot could have produced skewed results, but two years may be the right window into how lenders actually progress.
Beyond the numbers, there is also the value of giving institutions a forum where they can troubleshoot their respective problems. Retooling for financial justice requires a shift away from traditional technologies and practices, which can quickly become risky and daunting for organizations that have limited scale but strict regulatory requirements. Troubleshooting together in a pilot program can offer a sustainable way of forging a path toward financial justice.
Lastly, for those who are denied access to credit and mortgages, 20 communities across America will experience how things may evolve when FIs retool, casting a web of change bigger than any one organization could create on its own.