Roles and responsibilities in governing AI
Using Saidot, you will be able to:
- Invite colleagues from product, business, process, legal, and compliance teams to collaborate on governing your AI system.
- Assign specific system-level responsibilities to team members to ensure well-informed governance decisions.
- Sign up for role-based notifications, tasks, and access to effectively contribute to the AI system.
- Regularly report the status of your AI system and risk portfolio to your organisation's governing body.
AI governance is a responsibility shared by a broad group of stakeholders across an organisation. It is a collaboration between AI product teams, legal and compliance departments, and business and product owners. The executive level and the board are responsible for unlocking AI opportunities responsibly by prioritising alignment with corporate values and ensuring quality and compliance.
Accountability for AI governance should be allocated across an organisation's three lines of defence. We build on the Three Lines of Defence model (The Institute of Internal Auditors (IIA), 2024; Schuett, 2023) to define and assign roles and responsibilities across the entire AI governance process, leveraging existing governance structures and strengthening the effectiveness of AI accountability. This model helps organisations identify the structures and processes that best support corporate objectives while ensuring strong AI governance and risk management.
The First Line
The First Line is responsible for the governance of the AI system. The members of this group develop and deploy AI systems and maintain appropriate structures and processes for managing risks. Because this group is closest to the context in which AI systems are used, it is also closest to AI risks, which allows it to play an essential role in identifying, evaluating, and mitigating them. Well-defined roles in the First Line are a key element of effective AI accountability. Examples include:
Roles | Responsibilities |
---|---|
Managers (e.g. Product Owners, Business Owners) | Create or procure AI products and ensure AI systems' governance. They are also responsible for managing AI systems' risks and compliance. Lead the actions and resource allocation to accomplish AI governance objectives. |
Technical Specialists (e.g. Data Scientists, ML Engineers) | |
Procurement Specialists | |
The Second Line
The Second Line of Defence establishes AI governance and compliance practices, offers complementary expertise to the First Line, and monitors and challenges the results of AI governance. AI governance and risk management require a high level of multidisciplinary knowledge, for example in policy, legal, ethical, and risk matters, which may be needed to complement the expertise of the First Line. This group operates by developing AI governance policies, frameworks, processes, and tasks, as well as by designing and delivering training and providing context-specific expert advice. Its membership varies by organisational context, but examples are described in the table below.
Roles | Responsibilities |
---|---|
AI Governance Managers | Establish practices and provide expertise and support in AI governance, risk management, and compliance. |
AI Validation Specialists | |
Risk Managers | |
Cybersecurity Managers | |
Policy and Legal Specialists | |
The Third Line
The Third Line of Defence is responsible for independently evaluating the effectiveness of AI governance. The members of this group assess the work of the first two lines and report the results to the Governing Body. The Governing Body relies on the Third Line for independent, objective assurance of AI governance and for advice on addressing potential shortcomings. The Third Line can, for instance, be formed by members of the Internal Audit team. Examples of responsibilities that can be assigned to the Third Line:
Roles | Responsibilities |
---|---|
Internal Auditors | Independently assure and evaluate the effectiveness of AI governance. |
The Governing Body
A governing body is a general term for a decision-making forum within the organisation (e.g., a board, committee, or equivalent). Each organisation should map this concept onto its existing accountability structures and entities in order to leverage its current corporate governance model.
The members of a Governing Body are ultimately accountable for the decisions made throughout the organisation, including those made through the use of AI, and for the adequacy of governance and compliance wherever AI is developed or deployed. They are thus accountable for the uses of AI the organisation considers acceptable and, consequently, for potential AI-related harms.
Clarity, certainty, and ownership are necessary to implement accountability in organisations effectively. Therefore, it is crucial to assign roles and define responsibilities clearly within the Governing Body.
Examples of responsibilities that can be assigned to the Governing Body:
Roles | Responsibilities |
---|---|
Governing Body | Provide direction by setting targets for responsible AI use, evaluating the AI portfolio, and approving intended use cases, considering the organisation's risk appetite. Own the AI policy and company-level procedures and processes for the responsible development and use of AI. Take responsibility for the oversight and outcomes of AI, as well as the adequacy of governance. Sign off on high-risk use cases for potential regulatory processes. |
Apart from the roles within organisations' three lines of defence, our methodology also acknowledges External Assurance Providers, who can provide additional support to safeguard stakeholder interests and complement internal assurance.
Sources:
Adapted from ISO/IEC 38507:2022, Information technology — Governance of IT — Governance implications of the use of artificial intelligence by organizations.
Schuett, J. (2023). Three lines of defense against risks from AI. AI & SOCIETY. Advance online publication. https://doi.org/10.1007/s00146-023-01811-0
The Institute of Internal Auditors (IIA). (2013). IIA Position Paper: The Three Lines of Defense in Effective Risk Management and Control.
The Institute of Internal Auditors (IIA). (2024). The IIA's Three Lines Model: An Update of the Three Lines of Defense. Position Paper.