Organisationally, banks appear to be adopting a range of models to ensure that different AI risk management stakeholders are brought together. Some have given lead responsibility to one control function, usually model risk management. Others have followed a more collegiate 'forum' approach. Both options bring advantages and disadvantages: centralisation can support faster decision-making, while broader committees may better ensure that all voices are heard. The right structure for each bank will depend on its overall business model and the range of use cases where AI is deployed.
So far supervisors have not prescribed specific models or structures for managing AI risk. (Nor does the AI Act specify precisely how firms should comply with its governance requirements.) Indeed, the European Central Bank (ECB) has not yet issued any specific guidance on banks' use of AI. (The ECB's forthcoming guidance should include some expectations on the use of machine learning: this is expected to be published in the summer.)
Ensuring effective governance is, however, a key priority for the ECB. Supervisors will likely see inadequate AI governance as symptomatic of poor governance more broadly. This is a further reason why banks should put a robust AI governance framework in place. At a minimum this should include these five key elements:
- AI principles: Banks should adopt a clear set of principles and commitments to responsible and trustworthy use of AI;
- AI risk appetite: Banks should include AI risk in their overall Risk Appetite Framework, to ensure proper attention to AI risk management;
- AI catalogue: Banks should compile and maintain a thorough and up-to-date inventory of all their AI systems and the different risks associated with each (a sketch of what such an inventory entry might record follows this list);
- AI risk management: Banks should adopt a comprehensive framework of policies and procedures for managing and mitigating AI risk, in accordance with their risk appetite framework, including roles for each of the three lines of defence;
- AI oversight: Banks should establish a clear structure for oversight of AI applications to assess and ensure compliance with internal policies as well as legal obligations.
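For illustration only, the sketch below shows one way an entry in such an AI catalogue could be recorded. The field names, risk tiers and example values are assumptions made for the purpose of the sketch; they are not prescribed by the AI Act, the ECB or any supervisor.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class AISystemRecord:
    """Illustrative entry in a bank's AI system inventory (assumed schema)."""
    system_name: str                 # internal name of the AI system
    business_owner: str              # accountable first-line owner
    use_case: str                    # e.g. credit scoring, AML monitoring
    ai_act_risk_category: str        # e.g. "high-risk" under the AI Act
    model_risk_tier: str             # internal materiality tier (bank-defined)
    identified_risks: List[str] = field(default_factory=list)   # e.g. bias, drift
    mitigations: List[str] = field(default_factory=list)        # controls in place
    last_validation: Optional[date] = None   # most recent independent review
    next_review_due: Optional[date] = None   # planned oversight checkpoint


# Hypothetical example entry
record = AISystemRecord(
    system_name="retail-credit-scoring-v3",
    business_owner="Retail Lending",
    use_case="Creditworthiness assessment",
    ai_act_risk_category="high-risk",
    model_risk_tier="Tier 1",
    identified_risks=["bias", "data drift", "limited explainability"],
    mitigations=["fairness testing", "drift monitoring", "human review of declines"],
    last_validation=date(2024, 3, 1),
    next_review_due=date(2025, 3, 1),
)
```

In practice such a catalogue would typically sit in a governance, risk and compliance tool or model inventory system rather than in application code; the point is simply that each entry links an AI system to its owner, its AI Act classification and its risk treatment.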
Compliance with the AI Act is the foundation of AI governance. But good governance is not only a matter of compliance: it is also key to winning trust and acceptance of a bank's AI deployment in the eyes of customers, staff and the wider public. This should allow banks to capture the full benefits of a revolutionary new technology.