The EDPS takes a tough stance on some of the solutions envisaged in the proposal. For instance, the authority once again emphasised that classifying certain uses of AI as "high risk" is not sufficient where such uses pose unacceptable risks to fundamental rights (see para. 7 of the opinion). These include, among others:
- any use of AI to carry out any type of "social scoring";
- any use of AI for automated recognition of human features in publicly accessible spaces, such as recognition of faces, gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals;
- the use of AI to infer the emotions of a natural person, except for certain well-specified use cases, namely health or research purposes;
- any use of AI systems that rely on biometrics to categorise individuals into clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the EU Charter of Fundamental Rights.
According to the EDPS, such uses should be prohibited because they are intrusive and affect human dignity.
The EDPS also notes that the AI Act proposal exempts operators of high-risk AI systems that are already on the market or in use before the AI Act becomes applicable, unless those systems undergo significant changes in their design or intended purpose, or are subject to "substantial modifications" (para. 12 of the opinion; see also Article 83(2) of the AI Act proposal). The EDPS finds this solution unclear, as it creates legal uncertainty and could result in some high-risk AI systems never falling within the scope of the AI Act. The authority therefore recommends removing this exemption, so that the AI Act applies to existing high-risk AI systems from the date of its applicability.
What is more, the EDPS suggests that the notion of AI "providers" should be further clarified and should probably explicitly include operators who retrain pre-trained AI systems. Although training is a fundamental part of AI development, the current proposal does not clearly state whether activities such as retraining or continuous training form part of an AI system's "development". As a result, it is uncertain whether operators engaged in such activities could be assigned the status of "providers" of AI systems (paras 15-19 of the opinion).
Finally, the authority shared specific recommendations on how to clarify the proposal's provisions on the EDPS's roles and tasks as a notified body, a market surveillance authority, and the competent authority for supervising the development, provision and use of AI systems by EU institutions, bodies, offices and agencies (para. 29 et seq.).
* Updated information on the legislative process can be found here.