Constitutional Safeguards in the Age of AI. A Study on the Fundamental Rights Impact Assessment of Facial Recognition Technology
PAOLUCCI, FEDERICA
2025
Abstract
This thesis explores the regulation of Facial Recognition Technology (FRT) as a focal point for analysing the profound challenges posed by Artificial Intelligence (AI) to constitutional principles and fundamental rights in the digital age. By positioning FRT at the intersection of technological innovation and legal frameworks, the study examines how the adoption of this technology, particularly in law enforcement and public surveillance, amplifies risks to privacy, data protection, freedom of expression and assembly, and, especially, access to effective remedies. The research introduces the Fundamental Rights Impact Assessment (FRIA), a methodological tool designed to anticipate and mitigate the risks posed by high-risk AI systems. The FRIA framework represents a critical innovation, extending constitutional safeguards to digital technologies by operationalising rights protection in a structured, pre-emptive manner. This model is applied to the specific context of biometric systems, demonstrating its capacity to address systemic risks, including algorithmic bias, surveillance creep, and the chilling effect on democratic freedoms. Through an in-depth review of the European Union's regulatory landscape, the thesis evaluates the interaction between the AI Act, the General Data Protection Regulation (GDPR), and the Law Enforcement Directive (LED). It underscores the fragmented and reactive nature of current frameworks, particularly their limited ability to address the opacity, accountability, and societal impact of FRT. Case studies, drawn in particular from the case law of the Court of Justice of the European Union and of the European Court of Human Rights, together with legal analysis, reveal critical gaps in procedural safeguards, judicial oversight, and the provision of effective remedies for individuals affected by the deployment of biometric technologies.
By situating FRT within the broader paradigm of fundamental rights, the thesis interrogates the tension between the harmonisation objectives of the European Union and the procedural autonomy of Member States. This duality is particularly pronounced in areas where jurisdictional boundaries blur and regulatory fragmentation persists. The analysis further highlights the lex specialis nature of the AI Act in regulating FRT and its limited capacity to address the structural challenges posed by these technologies. Finally, the study argues for a recalibrated regulatory approach that moves beyond compliance-based models to prioritise robust procedural safeguards, effective remedies, and greater transparency. By framing FRT as a litmus test for AI regulation, the thesis proposes a model for the FRIA that aims to balance innovation and fundamental rights in the digital era. The findings contribute to the growing discourse on AI regulation by providing an in-depth examination of FRT as a unique lens for understanding the constitutional, regulatory, and societal dimensions of emerging technologies.
File: PAOLUCCI_Revised Thesis_vf.pdf (doctoral thesis, open access, 3.01 MB, Adobe PDF)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.