Where Credit’s Due Ep. 8: How AI is changing lending, with Zest AI CEO Mike de Vere and Informed.IQ CEO Justin Wickett
- We're exploring this topic of AI in lending today with my guests: Zest AI CEO Mike de Vere and Informed.IQ CEO Justin Wickett.
Join us at Tearsheet’s Power of Payments Conference, on September 15th at Current, Chelsea Piers, NYC for a day full of critical insights and discussions, as well as in-person networking opportunities. Apply here: https://bit.ly/3R6oENt
When we talk about lending nowadays, it’s no longer just about big banks with large underwriting teams. Advances in technology are completely changing lending and loan-management practices.
Take a look at FICO, which has long been one of the most reliable credit-scoring systems out there. It was developed with some of the most sophisticated algorithms available at the time, and it is still widely trusted today. And yet it now feels outdated, and software engineers are out to design a better model.
It’s all about AI these days: a cost-effective way to cut the long time it usually takes to close a loan, boosting competitiveness and profitability. Automation lets lenders and banks focus more on the customer experience and less on poring over data from all kinds of standardized forms. Digital lenders have shown that their bet is on technology, rather than balance sheets.
Plus, through the use of alternative data, lenders can access a wider customer base, as folks who maybe didn’t qualify for a loan under FICO can now be found eligible.
The following excerpts have been edited for clarity.
What’s your definition of AI and how it relates to the financial services industry?
Mike de Vere: AI has been around since the 1950s. When you say AI, most people maybe think a bit about the Terminator, and artificial intelligence coming alive. But really, you’re teaching a computer to do something that a human might have done. Today, AI gives us great promise to fix a trillion-dollar issue in the financial system, which is that the current credit system is failing America; it’s wildly inaccurate. And by applying AI, which is just better math, and consuming more data, you can fix a lot of issues in the economy itself.
Justin Wickett: There are so many different applications for AI. This is technology that’s been around for quite some time. But with the advent of cloud computing, with more and more CPU and GPU resources available in the cloud, financial institutions are able to tap into machine learning and AI to automate processes that historically have required a lot of human input, or just haven’t been able to be solved by people. People’s memory is limited, whereas a computer can do so much processing that simply hasn’t been within reach before.
FICO was initially developed to be the best credit-scoring system out there, but now a lot of the focus is on its limitations. How could AI change the credit system?
Mike de Vere: If we look at the current industry scores that are out there, most of them leverage logistic regression, which limits the variables, or signal, you can have on a borrower to 20 to 30 variables. With AI or machine learning, you’re able to consume thousands of points of data. And I’m not talking about creepy data like social media or anything like that; I’m talking about traditional raw credit data. But you’re able to consume that much more information. And frankly, it’s like a television: would you rather have 20 to 30 pixels or 1,000 pixels? 1,000 pixels would give a far more accurate prediction, or picture, of a borrower.
I think the time for change is long overdue: it’s time to transition away from this old math that served us well when it started in the 1950s, because there is a new way. And that new way is to leverage machine learning. But the secondary aspect, when we talk about the failure of the current credit system, is not only the accuracy and the ability to predict whether one should provide a loan, but also the bias. I’d say it’s actually a bit racist, and there’s gender disparity; this should be concerning. All of America should be screaming about this. I know for some, the topic of AI can be scary, but it has such great promise to solve serious issues that are plaguing our society.
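The contrast de Vere draws, a logistic-regression scorecard limited to a few dozen variables versus a machine-learning model that consumes far more signal, can be sketched on synthetic data. This is purely illustrative (scikit-learn on made-up data, not Zest AI’s actual models or credit data):

```python
# Illustrative sketch only: compare a "scorecard-style" logistic regression
# restricted to 20 variables against a non-linear model that sees all 200.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic "credit file": 200 signals, 40 of them actually informative.
X, y = make_classification(n_samples=2000, n_features=200,
                           n_informative=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Traditional approach: a linear model on ~20 variables.
narrow = LogisticRegression(max_iter=1000).fit(X_tr[:, :20], y_tr)
auc_narrow = roc_auc_score(y_te, narrow.predict_proba(X_te[:, :20])[:, 1])

# ML approach: a gradient-boosted model consuming every signal.
wide = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc_wide = roc_auc_score(y_te, wide.predict_proba(X_te)[:, 1])

print(f"20-variable logistic AUC: {auc_narrow:.3f}")
print(f"200-variable ML AUC:      {auc_wide:.3f}")
```

On data like this, the model that sees all the signal separates good and bad borrowers more accurately, which is the "more pixels" point in miniature; real credit models also carry fairness and explainability requirements this toy omits.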
Justin Wickett: Lenders are looking to go much further than FICO’s inherent limitations. You talked about immigrants coming to the United States with thin files, or maybe no credit at all; maybe there’s no FICO score on them. And very often, financial institutions will underwrite and put a lot of emphasis on the individual’s income. Maybe immigrants coming to the United States are technology workers, aspiring entrepreneurs, and they have the ability to generate income, but they don’t have any credit history established. What Informed.IQ has been doing is automating the verification of income. We see lenders increasingly looking to remove bias, not just from the FICO process, but also from the verification process, especially when it comes to income, given its importance in generating an interest rate and a credit decision; payment-to-income and debt-to-income ratios are very important. And AI is able to validate and verify an applicant’s income more accurately than the historical manual process. So I’m excited to share more on that.
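The two ratios Wickett mentions are simple quotients of monthly figures, which is why an accurately verified income matters so much to the final decision. A minimal sketch, with made-up numbers (not Informed.IQ’s methodology):

```python
# Illustrative only: the two underwriting ratios referenced above,
# computed from hypothetical monthly figures.
def payment_to_income(monthly_payment: float, gross_monthly_income: float) -> float:
    """Proposed loan payment as a share of gross monthly income (PTI)."""
    return monthly_payment / gross_monthly_income

def debt_to_income(total_monthly_debt: float, gross_monthly_income: float) -> float:
    """All monthly debt obligations as a share of gross monthly income (DTI)."""
    return total_monthly_debt / gross_monthly_income

income = 6000.0                            # verified gross monthly income
pti = payment_to_income(450.0, income)     # 450 / 6000 = 0.075
dti = debt_to_income(2100.0, income)       # 2100 / 6000 = 0.35
print(f"PTI={pti:.1%}  DTI={dti:.1%}")
```

Because verified income sits in the denominator of both ratios, an error in income verification shifts PTI and DTI directly, and with them the rate and decision the applicant receives.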
When it comes to what should qualify as data to be used in a financial process, or data that generates an outcome with financial or social implications, do you think about alternative data through this kind of ethical lens?
Justin Wickett: Informed.IQ is all about enabling the consumer information that Americans are putting forth to be taken into consideration holistically. We’re not an alternative data company; we are about enabling applicant income to really be considered. We hear from financial institutions how they give the same loan to 10 of their different loan funders, and each of those 10 loan funders happens to calculate applicant income in a different manner. Some folks working at the bank take into account overtime pay, others disregard tips; some staff take into account ATM deposits and counter deposits, others don’t. So what we’ve continually heard from lenders is not so much that they’re begging for alternative sources of data; they just want a better and more consistent manner of processing the information that Americans are providing when they apply for credit.
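The consistency problem Wickett describes, ten funders each counting overtime, tips, or deposits differently, comes down to making the income policy explicit and applying it once. A hypothetical sketch of that idea (the policy fields and numbers are invented for illustration, not Informed.IQ’s actual rules):

```python
# Hypothetical sketch: one written-down income policy, applied identically
# to every application, instead of each funder's individual habits.
from dataclasses import dataclass

@dataclass
class PayStub:
    base: float       # base monthly pay
    overtime: float   # overtime pay for the period
    tips: float       # reported tips

# A single explicit policy for what counts as qualifying income.
POLICY = {"count_overtime": True, "count_tips": False}

def qualifying_monthly_income(stub: PayStub, policy: dict = POLICY) -> float:
    total = stub.base
    if policy["count_overtime"]:
        total += stub.overtime
    if policy["count_tips"]:
        total += stub.tips
    return total

stub = PayStub(base=4000.0, overtime=300.0, tips=150.0)
print(qualifying_monthly_income(stub))  # 4300.0 under this policy
```

Whatever the policy says, encoding it once means every applicant’s income is computed the same way, which is the consistency, auditability, and reduced bias lenders are asking for.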
I think this is a trend that’s really getting noticed. For example, the Consumer Financial Protection Bureau is making some amendments to Section 1033, which is part of the Dodd-Frank Act. And what it’s all about is how a financial institution governed by the CFPB can provide greater transparency to consumers, so that as an American, when I’m applying for credit, I know what information is actually being taken into consideration, not just my FICO score.
Mike de Vere: I think you just have to be mindful about the information that you’re using. At times, you can have an unintended outcome. Another fintech in recent years got in trouble because they were including the university that you graduated from. Well, if I graduated from Harvard, and someone else graduated from Harvard, that can end up being a proxy for race. Being mindful about what your intentions are, with the decisions you’re trying to make with the model you’re building, that’s mindful AI, a concept popularized by a professor up near you, Justin, in Berkeley. And so being very purpose-driven about how you’re building the model and what data you include, I think, is super important.
How do each of you think about regulation in this space?
Mike de Vere: From my perspective, it’s a balance between providing greater guidance and, at the same time, allowing space for innovation, because if the pendulum swings too far one way, we’ll have no innovation, and if it swings too far the other, the credit system will not be fixed. It can’t be the wild wild west where we’re all scraping social media data, as is acceptable in other countries. Scraping social data to actually make decisions around credit is wildly dangerous, and not acceptable here in the US, so I think it’s really about having a balanced approach.
Justin Wickett: I completely echo that. Part of Informed’s mission is to lower the cost of credit using AI, through real-time transparency, wider access to capital, and improved compliance. And along those lines, I think the advance notice of proposed rulemaking coming out of the CFPB, mandating additional transparency, is very much in that spirit. Informed’s policy is such that we don’t go out and scrape third-party data; we do not do that at all.
Today, what we do is help lenders and financial institutions process data that applicants are already providing, and consenting to provide, to them. We’ve been blown away by how manual that process has been to date, and how biased and error-prone it currently is. By taking advantage of AI and machine learning automation, we’ve been able to free up staff and bring more consistency, transparency, and auditability to that existing manual verification process, which in turn allows hundreds of thousands of Americans to qualify for better interest rates.