Balancing Security Requirements and Fundamental Rights Protection
By Rebekah, Miruna, Mihai and Vesa
The EU Commission’s 2023 proposal to expand Europol’s authority, in particular the systematic processing of biometric data, aims to strengthen security in the fight against serious crime. However, it also raises serious legal concerns for individuals. This blog post critically explores the tension biometric data creates between expanding surveillance powers and the fundamental rights protected by the Charter, and asks: can this expansion of enforcement powers be justified under the principles of necessity and proportionality, or does it risk going too far?

The Commission proposed a complementary Regulation for Europol addressing migrant smuggling and trafficking in human beings. The proposal aims to improve coordination and information sharing between Europol and the Member States, which entails Member States providing Europol with citizens’ data so that these crimes can be addressed effectively.
But what kind of data does an agency such as Europol need to process? Europol processes biometric data, which the EU defines as personal data relating to unique physical characteristics of a person, such as facial features and fingerprints. Through technological advances, law enforcement authorities in the EU have used biometric data to surveil citizens in public spaces. This has prompted concerns among citizens that the EU is granting law enforcement authorities the right to interfere with their fundamental rights and freedoms.

How does Europol’s processing of biometric data place it at the center of fundamental rights concerns?
When Europol becomes involved with biometric data, it finds itself in the deep end of some of the EU’s most sensitive fundamental rights. Article 7 of the Charter protects our private and family life, while Article 8 guarantees a fundamental right to the protection of personal data. These two rights shape how the EU must handle individuals’ privacy and data protection in practice, and Europol is not subject to different rules: it must respect them too.
Biometric data gained particular significance with the adoption of the GDPR and the Law Enforcement Directive, which place it in a “special category” of data due to its sensitivity; it cannot be handled lightly. Europol can only process this type of data when it is necessary for law enforcement purposes, such as preventing or solving serious crimes, and even then only with solid legal safeguards as set out in those instruments.

Over the years, the Court has made it clear that interference with fundamental rights is only allowed if it complies with the principles of necessity and proportionality. That means Europol must justify why biometric data is truly essential for carrying out its work and ensure it is not over-collecting or casting too wide a net.
Without tight rules and accountability, data processing can start to look a lot like surveillance. That is especially concerning when the individuals being monitored are not suspects at all, merely people who happen to get caught in the digital sweep.
Enhancing security but challenging privacy?
Proponents argue that allowing Europol to process biometric data is crucial in modernising law enforcement and bolstering our security. They claim that by tapping into advanced technologies, such as AI-powered facial recognition systems and machine learning algorithms, Europol can quickly identify and track criminal networks involved in migrant smuggling and human trafficking. This, they argue, helps prevent crimes before they escalate. For everyday citizens, this might mean faster responses during emergencies and more efficient coordination between national police forces across the EU; at the same time, it also raises legitimate concerns regarding individual privacy and data protection.

This approach is similar to the rationale behind the Court’s landmark ruling on mass data retention, where the Court underscored the need for a careful balance between state security and individual rights. While that case concerned data retention specifically, it illustrates the broader principle that any interference with privacy must be necessary and proportionate.
However, without mandatory measures like independent oversight by the European Data Protection Supervisor (EDPS), robust data retention rules, and enforceable accountability mechanisms, the 2023 proposal risks creating a surveillance apparatus that goes far beyond its intended scope.
How can Europol balance security requirements with fundamental rights, then?
In contrast to the previous framework, the proposal mandates that Member States consistently provide Europol with biometric data, with no clear limitations on volume, purpose or retention.
The EDPS has raised concerns, stressing that the mass collection of biometric data such as fingerprints or facial scans without proper safeguards and guidelines could interfere with fundamental rights under the Charter. Citizens share these concerns: according to a survey conducted by the EU Agency for Fundamental Rights, only 17% of Europeans are willing to provide their facial photographs to public authorities for identification purposes. The findings also reveal significant differences among the Member States: countries such as Germany and Austria show greater resistance to the sharing and processing of their biometric data, while others, such as Portugal and Spain, take a more open approach. Additionally, the Biometrics Institute’s 2023 Industry Survey found that 54% of respondents consider privacy and data protection significant challenges when developing biometric technologies.
As Europol’s powers expand, how do we protect our fundamental rights?

A path forward should come with precise safeguards, not shortcuts. To balance Europol’s power to process large volumes of sensitive data, such as biometric data, with fundamental rights, the Europol Regulation should be amended to include more explicit criteria on how biometric data is collected, stored and used. It should also contain provisions guaranteeing greater transparency, in order to prevent misuse of such data or the profiling of individuals with no criminal links. Lastly, an independent fundamental rights impact assessment should be carried out before any new powers are adopted, defining when biometric data can be used and for how long it may be retained, and strictly limiting its use to migrant smuggling and human trafficking.
Hence, with clearly defined safeguards in place, the EU and law enforcement agencies, such as Europol, could strike a balance between technological developments and the protection of fundamental rights.