BID® Daily Newsletter
Jan 8, 2021

Staying Compliant With Artificial Intelligence

Summary: Are you wondering what regulators are saying about artificial intelligence and machine learning? We give you some of the regulatory recommendations.

Have you noticed that many of the AI-driven assistants (Alexa, Siri, Erica, etc.) are female? Apparently, that is not accidental. Research has shown that people prefer hearing a female voice over a male one. There seems to be more to learn about artificial intelligence (AI) these days than ever before.
As the banking industry delves into all the ways that AI and machine learning (ML) can be used (e.g., detecting fraud, creating efficiencies, tailoring products), bankers are also trying to understand the regulation and oversight of these tools.
First of all, according to the Financial Stability Board, AI is defined as “the application of computational tools to address tasks traditionally requiring human sophistication.” ML, which is considered a subset of AI, is “a method of designing a sequence of actions to solve a problem, known as algorithms, which optimize automatically through experience and with limited or no human intervention.”
Knowing that financial institutions are looking for guidance, staff members from the Federal Deposit Insurance Corporation (FDIC), the Federal Reserve, the Office of the Comptroller of the Currency (OCC), and the Consumer Financial Protection Bureau (CFPB) recently held a virtual panel discussion, “Ask the Regulators: Banks’ Use of Artificial Intelligence, including Machine Learning.” The regulatory agencies touched on some of the considerations financial institutions should weigh before utilizing AI/ML. We provide a few of their recommendations as a starting point.
Examine challenges and mitigate them
Before using AI/ML within your institution, the board and senior managers should look closely at any potential risks that could result and determine how to tailor risk management to each specific area of risk identified. Investigate specifically any biases or issues that could result from utilizing AI/ML. To mitigate these challenges, be sure to use high-quality data, ensure adequate transparency on how the information you collect is used by the algorithms, and implement adequate security measures that protect your customers’ personal information. Be sure your AI/ML approaches are transparent with a clear understanding of how the data goes from input to output.
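To make the transparency point concrete, here is a minimal, hypothetical sketch of what input-to-output traceability can look like for a simple linear scoring model. The feature names, weights, and applicant data are invented for illustration only and do not represent any actual scoring model.

```python
# Hypothetical illustration: a simple linear credit-scoring model whose
# output can be traced back to each input feature -- the kind of
# input-to-output transparency regulators described. All names and
# numbers below are invented for illustration.

def explain_score(features, weights, bias=0.0):
    """Return the score and each feature's contribution to it."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

applicant = {"income_k": 85, "debt_ratio": 0.3, "years_employed": 6}
weights = {"income_k": 0.5, "debt_ratio": -40.0, "years_employed": 2.0}

score, contribs = explain_score(applicant, weights, bias=10.0)
# Each contribution shows exactly how an input moved the output,
# making the decision auditable from input to output.
```

More complex models need more sophisticated explainability tooling, but the principle is the same: every output should be decomposable into documented, reviewable inputs.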
Match risk management to complexity
As your AI/ML efforts become more complex, your risk measures will need to increase. To avoid inadvertently violating regulations such as fair lending requirements or consumer protection laws, regulators suggest an in-depth, independent review of the various data elements by an objective third party. Also helpful in keeping your institution in compliance: routine monitoring, benchmarking, and regular performance reviews and analysis. Regulators will be keeping a particularly close eye on the internal standards that organizations create regarding transparency and explainability of the data and processes. The greater your reliance on data, the stronger your data controls should be. Make sure your AI/ML initiatives can be modeled, tracked, and replicated, or regulators will raise red flags.
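One widely used rule of thumb in fair-lending monitoring is the “four-fifths” adverse impact ratio, which compares approval rates across applicant groups. The sketch below is a hypothetical illustration of that single check, with invented numbers; a real monitoring program would cover many more metrics and use actual lending data.

```python
# Hypothetical sketch of one routine fair-lending monitoring check:
# the "four-fifths" adverse impact ratio, comparing the approval rate
# of a protected group against a control group. All figures are invented.

def adverse_impact_ratio(protected_approved, protected_total,
                         control_approved, control_total):
    """Ratio of the protected group's approval rate to the control group's."""
    protected_rate = protected_approved / protected_total
    control_rate = control_approved / control_total
    return protected_rate / control_rate

ratio = adverse_impact_ratio(protected_approved=60, protected_total=100,
                             control_approved=80, control_total=100)
needs_review = ratio < 0.8  # common four-fifths rule of thumb
# ratio = 0.75, so this result would be flagged for further review
```

Running a check like this on a regular schedule, and documenting the results, is one way to operationalize the routine monitoring and benchmarking the regulators recommend.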
Ensure staff expertise and be inclusive
It is critical that your organization has employees with the necessary expertise to both develop and manage such initiatives. When you are developing your AI/ML approach, include bank staff responsible for consumer protection and fair lending compliance. They can keep your approaches in tight alignment with laws and regulations to avoid any pitfalls down the road during implementation. Also, diverse staff may be able to see some of these pitfalls more easily.