AI Act | Fieldfisher

What is “AI literacy”?

According to Article 4 of the AI Act, managing directors of affected companies are obliged to take measures to ensure AI literacy. This is intended to ensure that both staff and other persons involved in the use of AI systems on behalf of the company have a sufficient level of AI competence. The EU legislator is particularly focused on the technical knowledge, experience, education and training of employees as well as the context in which the AI systems are to be used (Art. 4 AI Act).

But what exactly does the legislator mean by “AI literacy”? Article 3(56) of the AI Act defines AI literacy as the skills, knowledge and understanding necessary to use AI systems in an informed manner, as well as to gain awareness about the opportunities and risks of AI and the possible harm it can cause.

According to Recital 20 of the AI Act, AI literacy should help all actors along the AI value chain to make informed decisions and ensure compliance with the AI Act. This includes both technical knowledge and the ability to understand the potential impact of the use of AI.


What do companies need to be aware of?

  • Take Action on AI Literacy: Train employees to ensure they use AI tools with technical expertise and legal compliance. This includes raising awareness of AI's opportunities and risks, and the potential harm it may cause.
  • Conduct Risk Analyses for AI Systems: Perform context-specific risk assessments for AI tools.
  • Implement Transparency, Labeling, and Documentation Requirements: Depending on the type of AI system, user information, reports, documentation or registration with the authorities may be required.
  • Ensure Legally Compliant Contracts: When using AI systems, particular attention must be paid to aspects of copyright, privacy and data protection law.

Who needs an AI officer?

Unlike data protection law, which requires the appointment of a data protection officer when processing personal data (Art. 37 GDPR / § 38 BDSG), the AI Act contains no explicit obligation to appoint a designated AI officer. Nevertheless, the more actively a company uses AI systems, the more advisable it becomes to nominate one. Simply appointing a colleague with a particular affinity for technology as the AI officer is not enough, however. Rather, the entire team must be trained and interdisciplinary AI expertise ensured across all areas of the company.

The challenge is that not every company has the internal resources or know-how to ensure the legally required AI expertise. For small and medium-sized companies in particular, the most pragmatic solution may therefore be to appoint an external AI officer and to use external AI training services.


Fieldfisher's AI Officer Package

Needs and GAP Analysis: Identify the legal obligations for your AI tools and pinpoint areas requiring action.

Internal AI Governance Guidelines: Establish clear responsibilities and processes for AI use within your company.

Employee Training: Equip your team to understand and implement new compliance requirements effectively.

Registrations and Communication with Authorities: Let us handle the paperwork and talk with regulators so you can focus on your core business.

Contract Review and Drafting: We create or review contracts to minimize your risks and protect your rights.