Ensure accountability, governance and "AI compliance by design"

Generative AI

Companies should utilise the knowledge and infrastructure built through their GDPR compliance programmes to help discharge the accountability, governance and "AI compliance by design" obligations that underpin the pending AI regulations.

All relevant stakeholders should be involved in assessing the risks as they apply to your business, measured against the specific purpose for which you intend to use the generative AI solution.

There are many ways generative AI solutions can be used within a business, and each purpose will generate a different risk matrix. For example, using ChatGPT to help produce the first draft of an internal comms message that contains no personal data or confidential information is very different from incorporating ChatGPT as an interface to a core product that you subsequently license to your customers.

You should clearly define the purpose/business case for the use of a generative AI solution and assess whether it is necessary and proportionate to use such a solution, or whether an alternative approach could achieve the same results. A new purpose or use should trigger a new risk assessment, and the risks and mitigation steps should be well documented (in line with the AI impact assessment concept under the pending EU AI Act).
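By way of illustration only, that documentation could be kept as a structured register with one entry per purpose/use, so that each new purpose triggers a fresh assessment record. The sketch below is a hypothetical example; the class name, fields and values are assumptions rather than a prescribed format, and should be adapted to your own risk framework.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AIUseCaseRecord:
    """Hypothetical register entry for one generative AI purpose/use case."""
    purpose: str                  # the defined business case, e.g. "first draft of internal comms"
    solution: str                 # the generative AI tool relied on, e.g. "ChatGPT"
    involves_personal_data: bool  # flags GDPR / data protection considerations
    customer_facing: bool         # e.g. embedded in a product licensed to customers
    necessity_rationale: str      # why an alternative approach would not achieve the same result
    identified_risks: list[str] = field(default_factory=list)
    mitigations: list[str] = field(default_factory=list)
    assessment_date: date = field(default_factory=date.today)
    approved_by: str = ""         # accountable stakeholder who signed off

# A new purpose/use means a new record, and hence a new documented assessment.
record = AIUseCaseRecord(
    purpose="Generative AI interface embedded in a core product licensed to customers",
    solution="ChatGPT",
    involves_personal_data=True,
    customer_facing=True,
    necessity_rationale="No non-AI alternative delivers equivalent functionality",
    identified_risks=["inaccurate output reaching customers", "personal data entered in prompts"],
    mitigations=["human review of outputs", "prompt filtering and data minimisation"],
    approved_by="Data Protection Officer",
)
```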

Implementing good accountability and governance processes, and training personnel on them, will be particularly important for larger businesses to ensure guardrails are in place to prevent unmonitored adoption and misuse.