The Consumer Financial Protection Bureau (CFPB) issued new guidance on lenders' legal obligations when using artificial intelligence (AI) and other complex models. The guidance emphasizes the importance of providing clear and specific reasons for any adverse actions taken against consumers.
With the proliferation of AI technologies, there has been a marked increase in the types of data used for lending decisions. "Technology marketed as artificial intelligence is expanding the data used for lending decisions and also growing the list of potential reasons for why credit is denied," said CFPB Director Rohit Chopra. "Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence."
At issue is creditors' growing reliance on complex algorithms fed by vast datasets, sometimes sourced from consumer surveillance. Such algorithms may deny credit based on factors not obviously connected to an individual's financial history. While some creditors have been using the CFPB's sample checklists to provide reasons for adverse actions, this approach has been flagged as potentially insufficient. The Equal Credit Opportunity Act does not permit a simplistic check-the-box exercise when notifying consumers about adverse actions, especially if the boxes checked do not accurately convey the reasons for the decision.
Reiterating a point from a circular last year, the CFPB emphasized creditors' need to explain specific reasons for adverse actions. This requirement stands firm even when black-box credit models, which can be opaque in their decision-making, are employed.
The latest guidance elaborates that specific and accurate reasons must be given even for decisions rooted in complex algorithms. A generic reason, such as "purchasing history," would not suffice. If a credit line was lowered due to behavioral spending data, creditors need to detail the particular negative behaviors that led to the decision.
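To make the distinction concrete, here is a minimal sketch of how a lender's compliance tooling might translate granular model outputs into specific adverse-action reasons rather than a generic checklist entry. All feature names, attribution values, and reason wording below are hypothetical illustrations, not CFPB-prescribed language or any actual lender's system.

```python
# Hypothetical feature attributions from a lender's explainability tooling
# (feature -> negative contribution to the credit decision). Illustrative only.
attributions = {
    "purchasing_history": -0.02,  # too generic to report on its own
    "frequent_overdrafts_last_90_days": -0.31,
    "rising_balance_on_revolving_accounts": -0.18,
    "short_credit_history": -0.05,
}

# Map granular model features to specific, consumer-readable reasons.
# Features lacking a specific mapping (e.g. "purchasing_history") are skipped
# rather than reported as a vague catch-all.
SPECIFIC_REASONS = {
    "frequent_overdrafts_last_90_days":
        "Frequent overdrafts on your deposit account in the last 90 days",
    "rising_balance_on_revolving_accounts":
        "Balances on your revolving accounts have been increasing",
    "short_credit_history":
        "Limited length of credit history",
}

def adverse_action_reasons(attributions, top_n=2):
    """Return the top_n most negative factors, using specific wording only."""
    ranked = sorted(attributions.items(), key=lambda kv: kv[1])
    reasons = [SPECIFIC_REASONS[f] for f, _ in ranked if f in SPECIFIC_REASONS]
    return reasons[:top_n]

print(adverse_action_reasons(attributions))
```

The design choice mirrors the guidance: the generic "purchasing_history" factor is deliberately excluded from the notice because it would not tell the consumer what actually drove the decision.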
Selecting the closest-fitting factors from sample reason checklists satisfies legal requirements only if those reasons accurately reflect the actual rationale for the action. The CFPB stresses that consumers deserve to know the genuine reasons, even if those reasons might upset or surprise them.
This guidance builds upon the CFPB's ongoing efforts to ensure fairness at the intersection of lending and technology. Amid the growing use of algorithmic tenant screening by corporate landlords, the CFPB underscored that prospective tenants are entitled to adverse action notices, such as when they are denied housing. Collaborating with other federal agencies, the CFPB has proposed rules on automated valuation models and is working to ensure that opaque models do not result in digital discrimination in mortgage markets.