
Let’s give a warm welcome to the AI Act!

Once again, the European Union takes the global lead in modernising and shaping the future of data use, this time by rolling out the AI Act. The AI Act is the first ever overarching legal framework on Artificial Intelligence, and it aims to address the risks arising from the rapid growth of AI solutions.

Together with the ‘AI Innovation Package’ and the ‘Coordinated Plan on AI’, the AI Act forms the European Union’s three-pronged approach, which aims to guarantee the safety and the fundamental rights of people and businesses in the European Union.

As an industry leader, you want your company to have a competitive advantage, so you’re probably looking to hop on the AI hype train. At the same time, you’re not quite sure of the risks, or how this so-called ‘AI Act’ will impact AI tools now or in the future.

We’re here to offer a helping hand in this upcoming series of blogs! First and foremost: who are you in the eyes of the AI Act, and how should you prepare accordingly? To simplify an already complicated issue, let’s identify you as either a Provider or a Deployer.

Provider

If you develop an AI tool and place it on the market in the EU, the AI Act will see you as a Provider. As the party that develops the AI tool from scratch all the way to commercialisation, you will bear most of the obligations. Depending on the classification of your tool, you might have to register it in the centralised EU database, implement human oversight, meticulously clean and label your dataset, draft policies to transparently inform users, apply for certification, and more.

Implementing these obligations after commercialisation will either be impossible or cost you a fortune, so it is advisable to start preparing sooner rather than later. How, you ask? Just follow this simple four-step action plan:

1. Classify your solution in accordance with the AI Act
2. Implement the necessary technical and organisational measures
3. Impress your prospect-deployers in the procurement phase
4. Profit!

Deployer

Just because you did not develop the AI tool you are implementing does not mean you are free from responsibility. If your company merely uses an AI tool, the AI Act qualifies it as a Deployer. Deployers need to have a detailed overview of the active AI tools within the company, their classification and the compliance level of their providers.

In about a year, all providers will be bound by the AI Act. And believe us, there is nothing worse as a deployer than being forced to replace a (by then deeply embedded) AI tool just because your provider clearly forgot to read our blog and therefore did not work on its compliance. So how do you protect yourself as a deployer?

1. Start today by making an inventory of all active AI tools people rely on within the company (yes, even ChatGPT)
2. Reach out to the existing providers to discuss their classification and the measures they have taken
3. Adopt AI compliance checks in your provider assessment or procurement process
4. Safely enjoy all AI has to offer!


And if I don't?

Unfortunately, failing to comply, whether as a provider or a deployer, has more than just the negative operational consequences mentioned above. The Act provides for fines of up to 7% of your total worldwide annual turnover or €35 million, whichever is higher. Risky business indeed...

Through this series of blogs, we will provide insights into the core obligations of the Act. If you need practical assistance with drafting and implementing your action plan, or if you are unsure about how to classify AI solutions, feel free to send us a quick (AI-generated) message. 😉
