Recent CFPB Releases Continue Focus on Bank Fees and Identify CFPB Concerns with Use of AI in Customer Service

The Consumer Financial Protection Bureau (CFPB) issued releases in May and June that reflect its continued focus on consumer protection issues associated with both bank fees and the use of artificial intelligence (AI) by financial institutions. On May 10, 2023, the CFPB issued Circular 2023-02 (the Circular) advising that a financial institution’s unilateral reopening of a deposit account to process a debit or deposit received after account closure can constitute an unfair act or practice under the Consumer Financial Protection Act (CFPA). On June 6, 2023, the CFPB published the issue spotlight “Chatbots in Consumer Finance” (Issue Spotlight), identifying potential risks in using chatbots for customer service functions based on recent customer complaints and relevant laws.

The Circular confirms the position taken by the CFPB in prior enforcement actions, but the Issue Spotlight differs from other recent statements by the CFPB on AI, which have focused on the use of AI in credit decisions, marketing, and automated valuation models. The Issue Spotlight specifically addresses risks the CFPB has identified with virtual assistants and other AI-driven customer service, which, as the CFPB notes, are among the most prevalent uses of AI by financial institutions.

Circular on Unilateral Reopening of Deposit Accounts

CFPB circulars are intended to promote consistency among agencies with authority to enforce consumer protection laws. To that end, the Circular addresses the practice of reopening a previously closed deposit account and concludes it meets each element of an unfair act or practice under the CFPA. Under the CFPA, an unfair act or practice is one that (i) causes, or is likely to cause, consumers substantial injury that (ii) is not reasonably avoidable by consumers, and (iii) is not outweighed by countervailing benefits to consumers or competition.

  • Substantial Injury: The CFPB states that substantial injury can include monetary harm, such as fees resulting from the unfair practice, and does not require actual injury, just a “significant risk of concrete harm.” Monetary harm in this context comes from non-sufficient funds (NSF) and overdraft fees that may result from processing a debit from an account that was taken to a zero balance as part of the account closure process, as well as account maintenance fees. Other potential harm includes the risk of unauthorized persons accessing funds that are credited to a reopened account and potential furnishing of negative information to consumer reporting companies if the account is overdrawn.
  • Not Reasonably Avoidable by Consumers: The CFPB states that the consumer cannot control one or more of the following:
    • A third-party’s attempt to debit or deposit money, which may be inadvertent or incorrect, but nonetheless result in the financial institution reopening the account;
    • The process and timing of account closure, which may involve multiple steps and waiting periods, making it difficult for the consumer to predict when exactly debits from, and credits to, the account need to stop; and
    • The terms of the deposit account agreement, which may not even reference a financial institution’s ability to reopen the account and is presented on a take-it-or-leave-it basis, leaving consumers with no practical ability to negotiate its terms.
  • Likely Not Outweighed by Countervailing Benefits: The CFPB states that financial institutions have alternatives to reopening an account to minimize any costs to the institution, including declining any transactions after the account is closed. The CFPB notes that declining transactions has the added benefit of reducing fraud risk for the institution and, for the consumer, making the sender of the deposit or debit aware of the need to contact the consumer for updated information. The CFPB acknowledges there may be some benefit to consumers if deposits into the reopened account become available to them. However, the CFPB reasons any such benefits are outweighed by the potential injury, including from third parties depleting those funds before the consumer can access them. 

The Circular also notes that, depending on the circumstances, reopening a closed deposit account may fall under the CFPA’s prohibition on deceptive or abusive acts or practices as well. The CFPB does not define what constitutes unilateral reopening of a deposit account, but it references prior enforcement action in this space, where the CFPB found customers had not provided authorization for, and did not receive notice of, the reopening.

Issue Spotlight on Use of Chatbots in Consumer Finance

The CFPB defines chatbots as “computer programs that mimic elements of human conversation” by “ingest[ing] a user’s input and us[ing] programming to produce an output.” Chatbots range in sophistication from rules-based chatbots, which rely on keywords to trigger one of a set menu of options, to AI-driven chatbots that simulate natural dialogue or use large language models (LLMs) to analyze patterns and predict which words should come next in response to a question.
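To illustrate the rules-based end of that spectrum, the sketch below shows how a keyword-triggered chatbot maps user input to a preset menu of scripted responses. This is a hypothetical, simplified example (the keywords and responses are invented for illustration), not any institution's actual implementation:

```python
# Minimal sketch of a rules-based chatbot: keywords map to canned
# responses from a preset menu; anything unrecognized falls through
# to a generic default reply.

RESPONSES = {
    "balance": "Your current balance is available under Accounts > Summary.",
    "dispute": "To dispute a transaction, we will connect you with a representative.",
    "hours": "Our branches are open 9 a.m. to 5 p.m., Monday through Friday.",
}

DEFAULT = "Sorry, I didn't understand. Please rephrase or type 'agent'."

def rules_based_reply(user_input: str) -> str:
    """Return the first scripted response whose keyword appears in the input."""
    text = user_input.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    return DEFAULT
```

The sketch also makes the buzzword limitation concrete: “I want to dispute a charge” matches the scripted path, while “I think this charge is wrong” expresses the same intent but contains no recognized keyword and receives only the default reply.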

The CFPB notes that use of chatbots is widespread across the financial industry, with the 10 largest commercial banks all using chatbots for customer service and 37% of the U.S. population having engaged with a bank’s chatbot in 2022 alone. Financial institutions have increased their reliance on chatbots and other technologies on their websites, mobile applications, and social media platforms over traditional customer interfaces such as contact and call centers. This is, in part, a cost-saving mechanism and also due to chatbots’ ability to respond to customers immediately and at any time of day. Some financial institutions have created their own proprietary chatbot programs, while others have partnered with technology companies to use existing technology, including Amazon Web Services, Microsoft, and Alphabet’s Google Cloud.

The CFPB examined consumer complaints regarding their interactions with chatbots and identified several common themes:

  • Limited ability to solve complex problems: Chatbots are limited in their capacity to understand a wide array of human communication and to provide output outside of predetermined programming. This may present issues when a consumer raises a dispute, which triggers legal obligations for the financial institution to investigate and provide a response within a certain time, but does so without using the specific buzzwords the chatbot is programmed to recognize. Additionally, chatbots may provide incorrect responses because they are unable to distinguish between accurate and inaccurate information in the datasets on which they are trained. Finally, even when the information is accurate, it may be unhelpful since chatbots are often limited to preset scripts that may not address the consumer’s specific question.
  • Hindering access to timely human intervention: As the CFPB notes, consumers often contact customer support when they are under financial pressure and require timely resolution of their questions. Limitations of chatbots may interfere by increasing consumer frustration and making it more difficult for consumers to reach human representatives with the ability to resolve their issues. Linking back to its recent focus on fees, the CFPB provides the specific example from a consumer complaint of credit card late fees being imposed due to the consumer’s inability to reach a human capable of accepting the payment on its due date.
  • Technical limitations and associated security risks: Like other forms of technology, chatbots are susceptible to system crashes and security concerns. The CFPB reiterates that financial institutions’ obligation to safeguard personally identifiable information extends to personal details input into chat platforms. Beyond this, financial institutions need to consider how to protect against phishing or scamming schemes that may target chatbots, the risks of disclosure of personal or confidential information through chatbots’ training sets, and auditing of third-party service providers.

The report urges financial institutions to be thoughtful in their use of AI and related technology for customer support based on the risks of (i) noncompliance with federal consumer financial laws, (ii) eroding consumer confidence and trust due to consumers’ inability to obtain responses to their questions or access to a human representative, and (iii) harm to consumers who receive inaccurate information or are assessed inappropriate fees. The CFPB does not suggest that banks should avoid using chatbots for customer service but cautions against severely reducing or replacing human support altogether. Additionally, the CFPB signals that it will continue to be active in this space by monitoring consumer complaints and, more broadly, the implications of moving to AI-based customer service. 

The Issue Spotlight marks the first time the CFPB has commented on use of AI in customer service, although the CFPB has been active in addressing AI in other contexts. The CFPB previewed in April that it intended to publish a white paper on AI limitations and use by financial institutions, without noting it would focus on customer service specifically. In a blog post the day after releasing the Issue Spotlight, the CFPB suggested it may pursue financial institutions for UDAAP and other violations for providing inaccurate information through chatbots, failing to resolve problems related to consumers accessing money and making payments, and data security breaches or deficiencies.

About MVA White Collar Defense, Investigations, and Regulatory Advice Blog

As government authorities around the world conduct overlapping investigations and bring parallel proceedings in evolving regulatory environments, companies face challenging regulatory and criminal enforcement dynamics. We help keep our clients up to date in these fast-moving areas and serve as a thought leader.
