AI Policies

Using Saidot, you will be able to

  • Build your organisation’s AI policy tailored to your business needs, anchored in your culture and values, and aligned with other relevant internal policies (e.g. Responsible Business Conduct, Corporate Sustainability).

  • Identify applicable AI policies according to your objectives of procuring, developing, using, and/or selling AI systems.

  • Directly apply ready-to-use AI policy templates to seamlessly address and monitor all the relevant obligations, requirements, and controls for legal compliance and responsible governance of your AI system.

What we mean by AI policy

AI policy at Saidot is defined as an umbrella term for the different types of normative documents that shape responsible AI governance. Examples of documents falling under this definition include AI-related public policies, national AI strategies, technical standards, hard law instruments (at national, regional, and international level), and soft law instruments such as principles, guidelines, recommendations, declarations, and best practices.

These instruments are introduced, established, or enacted by a wide range of actors and stakeholders in the AI ecosystem, such as governments, international and intergovernmental organisations, industry representatives, standardisation organisations, non-profits, and research and civil society organisations.

Policy analysis

Irrespective of the type of AI policy, it is important to be aware of and understand which AI policies are relevant for a given AI system and what requirements those policies prescribe for it. The process of identifying the relevant AI policies and applying them in the context of an AI system is called policy analysis.

Identifying AI policies

Policy analysis starts with identifying the AI policies applicable to your AI system. The first step in this process is scoping, which narrows down the policies relevant to your system. It is helpful to start scoping by considering where the AI system is planned to be taken into use or put into service, as this narrows down the regions and jurisdictions that need to be considered in the policy analysis.

Once the relevant jurisdictions have been clarified, it is helpful to consider the wider context of the AI system, its intended purpose, and the context in which it functions. Useful questions to consider include:

  • Do the relevant jurisdictions have any AI-specific laws that could apply to your system and its context of use?

  • Given the intended purpose of your AI system and the context in which it functions, do any industry-specific AI laws apply?

  • Does the system use or process personal data, so that laws on the use and processing of personal data apply?

  • Are there internal policies, frameworks, or codes of conduct within the organisation that are relevant to your system?

These are only examples of the types of policies to consider when identifying relevant AI policies for your system. The identification of relevant AI policies is best conducted by legal experts.
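For teams that want to track scoping in a structured way, the questions above can be captured in a simple checklist object. The sketch below is purely illustrative; the class and field names are our own assumptions and not part of Saidot's product or API.

```python
from dataclasses import dataclass, field


@dataclass
class PolicyScoping:
    """Illustrative scoping checklist for the policy analysis of one AI system."""
    jurisdictions: list[str] = field(default_factory=list)        # where the system is taken into use or put into service
    ai_specific_laws: list[str] = field(default_factory=list)     # AI-specific laws identified in those jurisdictions
    industry_specific_laws: list[str] = field(default_factory=list)
    processes_personal_data: bool = False                         # triggers a review of data protection laws
    internal_policies: list[str] = field(default_factory=list)    # internal frameworks or codes of conduct

    def open_questions(self) -> list[str]:
        """Return the scoping questions that still need review by legal experts."""
        questions = []
        if not self.jurisdictions:
            questions.append("Where will the system be taken into use or put into service?")
        if not self.ai_specific_laws:
            questions.append("Do the relevant jurisdictions have AI-specific laws?")
        if not self.industry_specific_laws:
            questions.append("Do any industry-specific AI laws apply?")
        if self.processes_personal_data:
            questions.append("Which laws on the use and processing of personal data apply?")
        if not self.internal_policies:
            questions.append("Which internal policies or codes of conduct are relevant?")
        return questions


# Example: a system planned for use in the EU that processes personal data
scoping = PolicyScoping(jurisdictions=["EU"], processes_personal_data=True)
print(scoping.open_questions())
```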

Applying AI policies through policy templates

Once the relevant AI policies for your system have been identified, it is important to understand the obligations stemming from these policies and start working on ensuring compliance with them. Saidot enables this through the application of policy templates. A template is a structured framework derived from a policy and includes the requirements outlined in that policy. Templates provide a standardised format for individuals to document and manage their compliance with the relevant policy. Templates are specific to the roles and categories of AI systems or types of actions that are regulated under a policy.

Selecting templates

When attempting to understand the obligations applicable to an AI system under a given policy, a good starting point is to understand the scope of the policy and how the organisation developing or taking the AI system into use, as well as the AI system itself, falls under the policy’s scope. Legal experts and the technical experts responsible for the technical implementation of the AI system should come together to understand how the AI policy applies to the system.

It is helpful to consider, for example, what your organisation's role is relative to the AI system: do you develop the AI system as a provider, or merely take it into use as a deployer?

It is also helpful to understand the type of AI system your organisation is either developing or taking into use, and in what context the AI system will function. The obligations under the AI policy can vary depending on the type of AI system.

Once you have an understanding of the role of your organisation relative to the AI system and the AI policy, as well as whether and how the AI system falls under any potential categories of AI systems or types of actions regulated under the AI policy (also known as ‘Policy coverage’), you can move on to selecting the relevant template(s) in the AI policy for your system. Note that this process should be repeated separately for each identified AI policy.
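As a rough illustration of this selection logic, the sketch below filters a catalogue of templates by policy, role, and policy coverage category. The catalogue entries and names are hypothetical examples, not Saidot's API or actual policy content.

```python
# Each entry describes one template in a policy: the regulated role and the policy coverage category it targets.
catalogue = [
    {"policy": "Example AI Act", "role": "provider", "coverage": "high-risk AI system"},
    {"policy": "Example AI Act", "role": "deployer", "coverage": "high-risk AI system"},
    {"policy": "Example AI Act", "role": "provider", "coverage": "general-purpose AI model"},
]


def select_templates(catalogue, policy, role, coverage):
    """Return the templates in a given policy that match the organisation's role and the system's coverage category."""
    return [
        t for t in catalogue
        if t["policy"] == policy and t["role"] == role and t["coverage"] == coverage
    ]


# A deployer taking a high-risk system into use; the selection is repeated for each identified policy.
print(select_templates(catalogue, "Example AI Act", "deployer", "high-risk AI system"))
```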

Ensuring compliance through templates

Once the relevant templates have been selected, the AI system’s compliance with the relevant requirements can be documented. Each template contains the requirements relevant to the role and policy coverage category displayed in the template, and this information guides its structure, as each template is tailored specifically for that role and coverage category. Note that in some cases the AI system may have multiple applicable templates from one policy; compliance with each template should be documented separately.

The templates consist of requirement sections, requirements, and controls that help structure the relevant obligations under the template. Organisations document and evidence their compliance with the requirements and related controls displayed in the template.
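In practice, documenting and evidencing compliance can be pictured as keeping a record per requirement or control. The sketch below is a minimal, hypothetical structure for such a record; the field names and status values are assumptions, not the format used in Saidot.

```python
from dataclasses import dataclass, field


@dataclass
class ComplianceRecord:
    """Hypothetical record of how one requirement or control is evidenced."""
    item: str                        # requirement or control identifier from the template
    status: str = "not started"      # e.g. "not started", "in progress", "compliant"
    evidence: list[str] = field(default_factory=list)  # links or references to supporting documentation
    owner: str = ""                  # person or team responsible for the item


record = ComplianceRecord(
    item="Risk management / Requirement 1",
    status="in progress",
    evidence=["risk-register-v2.xlsx"],
    owner="AI governance lead",
)
print(record)
```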

The key concepts used in templates are described below.
Template

A template is a structured framework derived from a policy. It includes the requirements outlined in the policy and provides a standardised format for individuals to document their compliance with the given policy. Templates allow for an easy and efficient way to document and manage the AI system’s compliance with a given policy. One policy can have one or more templates. Templates are specific to the roles and categories of AI systems or types of actions that are regulated under a policy, and some policies may have multiple regulated roles and policy coverage categories.

Templates consist of requirement sections, requirements, contextual requirements, and controls.

Requirement section

Requirement sections group together related requirements found in a policy and a template. They help structure the policy by grouping requirements that share a common topic, e.g. “Risk management” or “Data governance and management”.

Requirement

A requirement is a specific condition, rule, or standard that must be met or adhered to. Policies include requirements that outline the necessary actions, behaviour, or criteria that individuals and/or organisations must follow to comply with the policy. Requirements are displayed in requirement sections.

Contextual requirement

A contextual requirement is a type of requirement that is only applicable in a certain context. Contextual requirements can either apply to the AI system or be non-applicable depending on the intended purpose and use context of the AI system.

Control

A control is an organisation’s action taken to comply with the respective requirement. Controls are derived directly from requirements.
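To tie these concepts together, the sketch below models the hierarchy described above (template, requirement section, requirement, contextual requirement, control) as nested data types. It is an illustrative reading of the definitions, not Saidot's actual data model; all names in it are our own assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class Control:
    """An organisation's action to comply with a requirement."""
    description: str


@dataclass
class Requirement:
    """A condition that must be met; contextual requirements apply only in certain contexts."""
    description: str
    contextual: bool = False    # True if applicability depends on the system's intended purpose and use context
    applicable: bool = True     # assessed per AI system when the requirement is contextual
    controls: list[Control] = field(default_factory=list)


@dataclass
class RequirementSection:
    """Groups requirements sharing a common topic, e.g. 'Risk management'."""
    topic: str
    requirements: list[Requirement] = field(default_factory=list)


@dataclass
class PolicyTemplate:
    """A structured framework derived from a policy for a given role and policy coverage category."""
    policy: str
    role: str
    coverage: str
    sections: list[RequirementSection] = field(default_factory=list)


# Example: one template with a single section, requirement, and control
template = PolicyTemplate(
    policy="Example AI policy",
    role="provider",
    coverage="high-risk AI system",
    sections=[
        RequirementSection(
            topic="Data governance and management",
            requirements=[
                Requirement(
                    description="Document the data sources used by the system.",
                    controls=[Control("Maintain a data inventory for the system.")],
                )
            ],
        )
    ],
)
print(template.sections[0].requirements[0].controls[0].description)
```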
