
30 April 2024 | 7 minute read

FCA and PRA AI Update: Senior Managers’ accountability for the use of AI

Following the publication of the Government’s pro-innovation strategy earlier this year, the Financial Conduct Authority’s (FCA) AI Update, published last week, sets out how its existing regulatory framework maps to the Government’s five principles on Artificial Intelligence (AI) regulation (AI Principles). The Bank of England and the Prudential Regulation Authority (PRA) also wrote to the Government last week, setting out the work they are doing to deliver safe and responsible AI and Machine Learning within their regulatory remit.

The FCA confirms it is monitoring the situation and will consider over the next 12 months whether any future regulatory adaptations are needed. The PRA is adopting a similar approach. Yet the clear message from these latest updates is that the current regulatory framework, including the Senior Managers & Certification Regime (SMCR), already applies to the use of AI, so financial services firms need to ensure they have the necessary systems in place to demonstrate compliance now.

Their updates address each of the following AI Principles:

  1. Safety, security and robustness;
  2. Appropriate transparency and explainability;
  3. Fairness;
  4. Accountability and governance; and
  5. Contestability and redress.

It is in respect of the accountability and governance principle that the FCA and the PRA see the existing SMCR framework playing a key role.


Governance and Accountability and the SMCR

The SMCR is an individual accountability regime that aims to promote the safety and soundness of regulated financial firms. Under this regime, regulated firms are required to ensure that one or more of their Senior Managers (i.e., key decision-makers within the firm) hold a senior manager function and are allocated prescribed responsibilities. This applies to all firms subject to the SMCR, but where a firm is dual regulated by the FCA and the PRA, or is an “Enhanced Firm” under the SMCR, it is also subject to additional responsibilities and requirements.

This AI Principle requires governance measures to be in place to ensure effective oversight of the supply and use of AI systems, with clear lines of accountability established across the AI life cycle. The FCA and the PRA consider that the SMCR already applies to ensure such accountability and governance of the safe and responsible use of AI. The FCA considers it applies in the following ways:

  • For firms who are dual regulated by the FCA and PRA or who are “Enhanced Firms” under the SMCR:
    • Where larger financial institutions are required to appoint Senior Managers who are personally accountable for certain functions, activities or business areas, the FCA makes clear that this means the individual is responsible for the use of AI in that area or function.
    • Technology systems, including AI, would fall within the responsibility of the Chief Operations function and the Risk function, particularly as the latter has responsibility for the overall management of the firm’s risk controls, including setting and managing risk exposure.

  • The obligation on all Senior Managers to take reasonable steps to ensure the business for which they are responsible is effectively controlled would cover the safe and responsible use of AI. This derives from the requirement for all Senior Managers to have a statement of responsibilities and to be subject to the Senior Manager Conduct Rules. Importantly, the FCA expressly states that it considers this applies even to those smaller FCA-regulated firms that are ‘Core Firms’ or ‘Limited Scope Firms’ under the SMCR.

In their October 2022 discussion paper on AI and machine learning, the FCA and PRA asked for feedback on whether to add a specific Prescribed Responsibility, allocated to a Senior Manager, for oversight of AI development, testing, deployment, monitoring and control. The general feedback received was against such a proposal. As such, the PRA’s letter to the Government indicates it is not contemplating adding a specific responsibility for AI. It is not clear whether the FCA has made a decision on this, but until any changes are made, existing Senior Managers will remain responsible for AI as set out above.


Practical Takeaways

In the context of the FCA and PRA’s latest position, the key practical takeaways for firms who are subject to the SMCR are as follows:

  • Senior Managers will need to be aware of the use of AI in their function, activity or business area. Given the rapid growth in the use of AI, some organisations may not yet have that visibility in place.
  • Having clear governance procedures and protocols for approving the use and implementation of AI will be critical. This is particularly so because multiple Senior Managers could have responsibility for the same AI system if it is used across several business areas or functions, so consistent standards are needed.
  • Senior Managers need to be able to show they took reasonable steps to prevent the use of AI from breaching any regulatory requirements; otherwise, they can be held personally accountable. The actions above would be relevant. Unhelpfully, the FCA has not yet provided any specific guidance on what this involves in the AI context, so there is a level of ambiguity, although it has indicated that further guidance may be published.
  • Given all of the above, AI literacy will be key. Firms will need to ensure Senior Managers are equipped and capable of being accountable for the use of AI in the areas for which they are responsible.


What can we expect next?

The FCA’s AI Update sets out what it plans to do in the next 12 months, both in terms of how the FCA plans to use AI internally as part of its regulatory functions and how it will regulate its use externally. Consistent with the message from the current Government, the FCA will continue to build an in-depth understanding of how AI is deployed in UK financial markets. As part of this monitoring, it may consider future regulatory adaptations if needed. The PRA explains that it too will continue to monitor and analyse the position and, as part of this, intends to run the third instalment of the ‘Machine Learning in UK financial services’ survey in conjunction with the FCA.

The FCA has said it plans to publish a consultation paper on the SMCR in June 2024 in response to its March 2023 discussion paper on ways to improve the SMCR. The discussion questions did not refer to the use of AI. Yet the reference to the upcoming consultation paper in this AI Update suggests AI is a likely topic, so it may well provide further insight into whether the SMCR will be adapted to expressly address AI or whether the position will remain as set out above.

If you have any questions, please contact the author of this post.
