EDPB Adopts Guidelines on Facial Recognition in the Area of Law Enforcement

On May 17, 2023, the European Data Protection Board (EDPB) adopted the final version of its Guidelines on facial recognition technologies in the area of law enforcement (the “Guidelines”). The Guidelines are addressed to lawmakers at the EU and EU Member State level, as well as to law enforcement authorities and their officers implementing and using facial recognition technology.

The Guidelines consist of the main body of guidance and three annexes: (1) a template for assessing the severity of the interference with fundamental rights caused by facial recognition technology; (2) practical guidance for authorities wishing to procure and run facial recognition technology; and (3) a set of hypothetical scenarios and relevant considerations based on certain uses of facial recognition technology.

The EDPB considers the applicable legal framework, focusing on the EU Charter of Fundamental Rights (the “Charter”) and the Law Enforcement Directive 2016/680 (the “Directive”). With respect to the Charter, for example, the EDPB explores the interference with Charter rights caused by processing biometric data and how that interference can be justified in accordance with Article 52 of the Charter. With respect to the Directive, the EDPB examines several areas, including the lawfulness of processing special categories of data for law enforcement purposes and the interplay between facial recognition technology and the rules on automated decision-making.

The EDPB notes that it “understands the need for law enforcement authorities to benefit from the best possible tools to quickly identify the perpetrators of terrorist acts and other serious crimes” but that “such tools should be used in strict compliance with the applicable legal framework and only in cases when they satisfy the requirements of necessity and proportionality.” It also identifies several potential uses of facial recognition technology which it considers to pose a high risk to individuals and their private lives, or to be “highly undesirable,” such as remote biometric identification of individuals in publicly accessible spaces and the use of facial recognition to infer the emotions of an individual.
