Are you ready? The first provisions of the AI Act come into force in less than a month
The first provisions of the AI Act come into force from 2 February 2025. If you haven’t already, it is time to start thinking about what you need to do to ensure you are compliant with the first wave of provisions – AI literacy requirements and prohibited AI practices.
Article 4 - AI Literacy
From 2 February, providers and deployers of AI systems will have to take measures to ensure that there is a sufficient level of AI literacy amongst their staff and other persons dealing with the operation and use of AI systems on their behalf. These measures will have to take into account the technical knowledge, experience, education and training of the personnel or other persons, as well as the context in which the AI system will be used.
AI literacy is defined widely under the AI Act. Staff or relevant persons should have the requisite skills, knowledge and understanding that allow providers, deployers and affected persons, taking into account their respective rights and obligations, to make an informed deployment of AI systems, as well as to gain awareness about the opportunities and risks of AI and the possible harm it can cause.
Providers and deployers will have to assess the current level of knowledge, skill and expertise of their staff and identify and fill any gaps in this regard. This may involve:
- implementing a training programme;
- considering whether any further provisions are needed in contracts for procuring AI systems or with third party service providers deploying AI systems; and/or
- documenting their organisation’s training practices in an AI Policy.
Article 5 - Prohibited AI practices
From 2 February, the following AI practices, subject to some limited exceptions, will be entirely prohibited under the AI Act:
- use of subliminal, or purposefully manipulative or deceptive, techniques to materially distort the behaviour of a person or group of persons by impairing their ability to make an informed decision;
- exploiting any vulnerabilities of a person or a specific group of persons (e.g. personality traits, age, or physical or mental ability) with the objective or the effect of distorting their behaviour;
- use of ‘social scoring’ evaluations or classifications of persons or groups of persons based on their social behaviour or personality characteristics, leading to detrimental or unfavourable treatment: (i) in social contexts unrelated to the contexts in which the data was originally generated or collected; and/or (ii) that is unjustified or disproportionate to their social behaviour or its gravity;
- predicting or assessing the likelihood of individuals committing criminal offences, based solely on profiling or an assessment of their personality traits and characteristics;
- creating or expanding facial recognition databases through the untargeted scraping of facial images from the internet or CCTV footage;
- inferring the emotions of a person in the workplace or in educational institutions (unless such use is intended for medical or safety reasons);
- employing biometric categorisation that categorises individuals based on biometric data to deduce or infer, for example, race, political opinions or religious or philosophical beliefs; or
- using ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purposes of law enforcement unless used for certain specific objectives, such as searching for missing persons or prevention of terrorist attacks.
Failure to comply with the prohibition under Article 5 could expose organisations to significant fines. Non-compliance may be subject to administrative fines of up to €35 million or 7% of total worldwide annual turnover (if the offender is an undertaking), whichever is higher.
Next Steps
If your organisation needs advice on compliance with the AI literacy requirement or the prohibitions on certain AI practices ahead of 2 February, please reach out to our Technology & Innovation Group or to your usual contact in McCann FitzGerald.
This document has been prepared by McCann FitzGerald LLP for general guidance only and should not be regarded as a substitute for professional advice. Such advice should always be taken before acting on any of the matters discussed.