Updated on 16 June 2021. Estimated reading time: over 12 minutes
Despite the very particular character of biometric data, virtually no legal provisions worldwide are specific to its protection.
Legal texts instead rely on provisions relating to personal data protection and privacy broadly. But such legislation sometimes proves to be poorly adapted to biometric data.
Assuming – that is – there is any such legislation at all.
Biometric data and privacy: what the law says
However, the General Data Protection Regulation (GDPR) for European Member States does address biometric data.
It represents a significant step forward for data protection and privacy with a real international impact.
The result?
Twenty-eight countries, including the U.K., now have a new regulation in place.
In the United States, no single, comprehensive federal law regulates the collection and use of biometric data.
However, following Illinois and Texas, Washington passed a biometric privacy law in 2017. California enhanced its privacy protection regulation at the end of 2018. The law (CCPA and now its new layer named CPRA) is frequently presented as a potential model for a U.S. data privacy law.
New York State and Virginia now stand beside California.
U.S. regulators are also increasingly focusing on the use of biometric data.
In August 2017, India's supreme court ruled privacy a "fundamental right" in a landmark case, illustrating that biometric data protection is now on top of the regulators' agenda in the largest democracy in the world.
Let's dig in.
In this web dossier, we will focus on six topics:
- Biometric data within the GDPR (the E.U. privacy law)
- Main objectives and provisions of the GDPR (including a video)
- The GDPR and the U.K.
- GDPR after 2.5 years: The Marriott and British Airways data breaches and the Schrems II ruling (video)
- U.S. legal landscape for biometric data protection in 2021 (including CCPA, CPRA, and another video)
- India and China, and the emerging consensus on biometric data protection.
Biometric data and GDPR
The EU GDPR establishes a harmonized framework within the European Union, the right to be forgotten, unambiguous and affirmative consent, and, amongst other things, severe penalties for failure to comply with these rules.
- Regulation (EU) 2016/679 was officially adopted on 27 April 2016.
- The provisions of the Regulation have applied since 25 May 2018.
National governments do not have to pass any enabling legislation. The new legislation replaces the existing national laws.
So, yes, you read that right.
The law is now the same for 500 million people.
What is biometric data?
The E.U. data privacy law treats biometric data as a "special category of personal data" and prohibits its "processing."
Biographical data or personal history data like date of birth, marital status, gender, name, or address are also protected by the GDPR.
The Regulation protects E.U. citizens and long-term residents from having their information shared with third parties without their consent.
The processing of biometric data for the purpose of "uniquely identifying a natural person" is prohibited.
However, it does contain some exceptions:
- If consent has been given explicitly
- If biometric information is necessary for carrying out the obligations of the controller or the data subject in the field of employment, social security, and social protection law
- If it's essential to protect the individual's vital interests and he/she is incapable of giving consent.
- If it's necessary for the establishment, exercise, or defense of legal claims
- If it's necessary for reasons of public interest in the area of public health.
Moreover, the Regulation permits the Member States to introduce other limitations regarding biometric information processing.
How does GDPR protect privacy?
The text's main objective is to give back to European citizens control over their data while simplifying companies' regulatory framework.
More precisely, as we said earlier, as of 25 May 2018, only one set of rules directly applies to all the European Member States regarding the protection of personal data.
But wait, there's more.
E.U. residents are gaining more control over their personal and biometric data.
The right to be forgotten
The Regulation states that the consent must be explicit before data collection.
It also states that "the data subject shall have the right to withdraw his or her consent at any time" and grants a right to erasure, better known as "the right to be forgotten."
Data breaches must be notified within 72 hours.
It establishes a clear set of consumer rights, and the GDPR also includes measures to boost enterprise security. For example, if a company discovers a data breach, the controller must notify the supervisory authority within 72 hours of becoming aware of it.
GDPR's penalties
Companies managing biometric information could be hit with massive penalties if they do not secure that data. These could reach 20 million euros or 4% of annual worldwide turnover, whichever is higher.
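To make the "whichever is higher" rule concrete, here is a minimal sketch of how the theoretical maximum exposure scales with company size. This is our own illustration with hypothetical turnover figures, not text from the Regulation itself:

```python
# Illustrative sketch: the top tier of GDPR fines is capped at the greater of
# EUR 20 million or 4% of total annual worldwide turnover.
def max_gdpr_fine(annual_worldwide_turnover_eur: float) -> float:
    """Return the theoretical upper bound of a top-tier GDPR fine, in euros."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

# Hypothetical examples: a mid-size firm versus a large multinational.
for turnover in (100_000_000, 10_000_000_000):
    print(f"Turnover EUR {turnover:,} -> maximum fine EUR {max_gdpr_fine(turnover):,.0f}")
```

For a company with €100m in turnover, the €20m floor dominates; for a €10bn multinational, the 4% rule pushes the ceiling to €400m.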
A global law
And here is why this law has a truly international impact.
Non-EU-established organizations will be subject to the GDPR if they process personal data about E.U. data subjects. This makes the GDPR a global law.
The extra-territorial scope of the GDPR is illustrated below (see GDPR and the U.K.) with the Marriott data breach case.
Privacy by design and by default
Data usage should be limited to what is necessary. The Regulation states that personal data shall be collected for "specified, explicit and legitimate purposes."
It shall not be further processed "in a manner incompatible with those purposes."
Data protection must be built into business processes for products and services from the outset, to avoid the well-known "function creep" effect.
With a clear focus on biometric data privacy
For biometric security to work well, citizens' rights must be protected appropriately, and the data collected by private and public organizations must be managed carefully and sensibly.
The GDPR pays particular attention to biometrics, recognizing the technology's immense potential.
What does the GDPR mean for businesses?
One of the goals of the GDPR is to simplify the requirements for companies working in several European Member States.
The GDPR establishes a "one-stop-shop" for companies active in several European countries. They will only have to deal with the Supervisory Authority of the country where their "main establishment" is located (e.g., where the main processing activities occur).
This Supervisory Authority will then play the "lead authority" role and supervise all the company's processing activities in the European Union.
Moreover, one of the most critical new obligations is the appointment of Data Protection Officers (D.P.O.s) in specific cases, notably public authorities and organizations whose core activities involve large-scale processing of sensitive data or systematic monitoring of individuals. The D.P.O.'s role is to monitor the compliance of the company's activities with the GDPR.
The Article 29 Data Protection Working Party (WP29) adopted guidelines on the subject on 13 December 2016, providing more details regarding D.P.O.s.
Let's see how the U.K. has been preparing for the GDPR.
The U.K. Data Protection Bill and biometric data
In June 2017, the British government presented its legislative program for the next two years, bringing GDPR into U.K. law and the country into line with the E.U.
The U.K.'s decision to leave the E.U. did not affect the implementation of the GDPR.
Of course, some post-Brexit amendments are necessary for the role of the U.K. supervisory authority and its relationship with the E.U. authorities, for example.
The notes to the Queen's speech (page 46) underlined the importance of maintaining data flow from the E.U. after Brexit to "cement the U.K.'s position at the forefront of technical innovation, international data sharing and protection of personal data."
The September 2017 Data Protection Bill
The Data Protection Bill also covers topics beyond the provisions of the GDPR. The Information Commissioner's Office (I.C.O.), the U.K.'s Data Protection Authority, explained that the GDPR and the Data Protection Bill must be read side by side.
Just let us show you what this means with two examples.
The Marriott data breach
In July 2019, the I.C.O. announced its intention to fine Marriott £99m (more than €109m or $128m) after the personal data of 339m guests were stolen in a hack dating back as far as 2014.
The information stolen included personal data such as passport numbers, log-in, payment card, and travel booking details.
Why did the Maryland-based company get fined?
It makes sense when you think about it.
About 30m of the hacked guest records were related to residents of 31 countries in the European Economic Area. The lack of protection is an infringement of the GDPR.
Needless to say, the case serves as a barometer to see how the GDPR is enforced against U.S.-based businesses, as stated by law.com in an article dated 27 July 2020.
The British Airways data breach
To top it off, the I.C.O. revealed its intention to fine British Airways (B.A.) £183m the same week. The breach occurred in 2018 and affected personal and credit card data.
In October 2020, the I.C.O. reduced the fine to £20m ($26m), according to the B.B.C. The I.C.O. said that "the economic impact of Covid-19 has been taken into account".
The breach affected the personal data of more than 400,000 customers, stolen via the B.A. mobile app and website.
Does the GDPR apply in the U.K. after Brexit?
Since 1 January 2021, the U.K. has had its own data privacy regime, known as 'the UK GDPR.' Find more information here.
GDPR experience so far (May 2018 - May 2020)
GDPR assessment after one year
As reported by the E.U. Commission in May 2019:
- Three countries (Greece, Slovenia, and Portugal) were still adapting their national legislation to the GDPR
- 144,376 queries and complaints to data protection authorities have been filed
- 89,271 data breaches have been reported
- Five fines have been issued (for a total of €52m)
What can we see here?
- There is no avalanche of multi-million fines as predicted by scaremongers.
- GDPR did not change or block everything, as many feared. It's an evolutionary process. Over 1,000 U.S. sites blocked E.U. citizens in 2018; this is no longer the case.
- The GDPR has an influence worldwide and, more specifically, in the United States. The debate is heating up in the country for two reasons: the introduction of the GDPR and California's CCPA.
- The GDPR is not only about consent. That's probably one aspect that has been misunderstood.
- However, the U.K. and France have seen a flood of businesses reporting themselves for violations.
- It's interesting to note that in January 2019, Japan and the E.U. adopted an adequacy decision to bridge the differences between Japan's data protection system and the GDPR.
- Download our infographic (Happy Birthday GDPR)
GDPR assessment after two years
According to the June 2020 E.U. Commission's report, it is an "overall success" but remains a work in progress.
In particular, the regulation appears to be a burden for small and medium-sized enterprises, according to the E.U. report.
Slovenia is the only country still in the process of adopting the law.
But don't get left behind.
The Schrems II ruling (July 2020)
What is Schrems II?
First of all, who is Schrems?
Maximilian Schrems is an Austrian jurist and privacy activist born in 1987. He founded a non-profit organization named noyb (none of your business).
In July 2020, he obtained the invalidation of the US-EU Privacy Shield agreement from the Court of Justice of the European Union (CJEU). The ruling is known as Schrems II.
This former framework organized the transfer of personal data from the European Economic Area (EEA) to the United States to support transatlantic commerce.
It provided a mechanism to comply with data protection laws on both sides of the Atlantic.
In other words, the Privacy Shield is dead as of 16 July 2020.
What does the Schrems II ruling mean?
It clearly says that the U.S. Privacy Shield does not provide adequate protection for E.U. data exported to the U.S. under the GDPR.
US-based companies such as hosting providers must adapt to the Schrems II ruling. They must either comply with the GDPR or put additional safeguards in place where equivalent protection is lacking.
The protection afforded to E.U. citizens' data cannot be weakened by a data transfer to a "third country."
Similarly, the Schrems I ruling (Maximillian Schrems v Data Protection Commissioner) invalidated the Safe Harbor data privacy exchange framework (U.S. Department of Commerce 21 July 2000, the European Commission's Decision 2000/520/E.C. of 26 July 2000) in October 2015.
Schrems’ latest win: the Grindr case
On 24 January 2021, the Norwegian Data Protection Authority (D.P.A.) imposed a fine of €9.6m ($11.7m or £8.6m) on Grindr, the Californian dating app, over a privacy breach.
The Norwegian D.P.A. clarified it has jurisdiction and issued the fine.
The company shared personal data with third-party advertisers, effectively tagging users (including European users) as LGBTQ, without valid and explicit consent. Grindr users were not informed about the data sharing and had no opportunity to opt out of sharing data with third parties.
This is a violation of European privacy rights.
This leads us to…
Biometric data protection in the United States
In the United States, no single, comprehensive federal law regulates the collection and use of personal data in general or biometric data in particular.
Instead, the country has a patchwork system of federal and state laws and regulations that sometimes overlap or contradict one another.
But that's not all.
Government agencies and industry groups have developed self-regulatory guidelines drawn from best practices, and these are now taken into account by regulators.
Apple, Facebook, Google, and Microsoft have been self-regulating for some time, even though they have invested heavily in creating powerful facial recognition technologies.
Facebook, for example, has an agreement with the Federal Trade Commission. Under this, the company must obtain "affirmative, express consent" before going beyond a user's specified privacy settings.
There's more.
In July 2018, Microsoft President Brad Smith called for federal regulation for facial recognition software use and urged Congress to oversee its implementation.
This unusual blog post illustrates how powerful artificial intelligence technologies — such as facial recognition — have set off a controversial battle among tech executives (and employees.)
Identification without consent in 45 states
It is legal in 45 states for software to identify an individual using images taken without consent while they are in public.
New York, California, Washington, Illinois, and Texas don't allow it for commercial use.
A New York State law called the Stop Hacks and Improve Electronic Data Security (SHIELD) Act became effective on 21 March 2020.
This evolution of New York State's existing data security law defines private information as personal data, including:
- Social security number
- Driver's license or state I.D. number
- Financial account information
- Biometric data
- Username or email address in combination with a password or security question.
The law (New York State bill S5575B, aka the SHIELD Act) requires businesses to implement a cybersecurity program and protective measures for N.Y. State residents.
The act applies to businesses that collect the personal information of N.Y. residents.
With the act, New York now stands beside California.
California became the fourth state with a biometric privacy law when the CCPA took effect in 2020. It covers any business entity that collects biometric identifiers for commercial purposes.
So what's the situation in most states?
Facial recognition, for example, can be performed inconspicuously from a distance without the individual actively providing any information.
There's already facial recognition software that shops can use to signal pre-identified shoplifters or identify customers who often return goods.
And it doesn't take much to imagine that - thanks to Facebook - these shops could quickly get immediate information on their customers when they enter the store: who they are, where they live, income, or credit score.
These practices conflict with critical principles such as anonymity, consent, and purpose from a privacy perspective.
Let's dig a little deeper.
Many parties are addressing the issue.
The question of consent and how to manage biometric data is sensitive, and it seems as if virtually every agency in Washington is addressing at least part of the issue:
- The National Institute of Standards and Technology for the evaluation of biometric technologies.
- The Federal Trade Commission for data security with the F.T.C. Act (15 U.S.C. §§41-58). This consumer protection law prohibits unfair or deceptive practices. It's been applied to offline and online privacy and data security policies.
- The Food and Drug Administration for the security of implants.
- The Department of Health and Human Services with the Health Insurance Portability and Accountability Act (42 U.S.C. §1301 et seq.) for medical information. The HIPAA Privacy Rule of 2003 regulates the use and disclosure of Protected Health Information (PHI) held by "covered entities."
Five states have enacted a protection law for biometric identifiers, and several others are debating one.
So, U.S. regulators increasingly have to focus on the use of biometric data.
Four significant steps in 2019-2020
Things have been moving fast in the last months in the U.S.
At least four significant privacy legislation fronts are worth mentioning:
- The California Consumer Privacy Act, the California Privacy Rights Act, and N.Y.'s SHIELD
- The 2008 Illinois Biometric Information Privacy Act (BIPA) and the 25 January 2019 ruling in the Rosenbach v. Six Flags Entertainment Corporation case.
- Federal legislative hearings
- The anti-surveillance ordinance signed on 6 May 2019 by San Francisco's Board of Supervisors.
#1 California's new privacy laws
CCPA
The California Consumer Privacy Act (CCPA) is a bill passed in June 2018. It enhances privacy rights and consumer protection for residents of California. The CCPA became effective on 1 January 2020.
California is the fifth-largest economy in the world and home to many tech giants.
It is also traditionally a trend-setting state for data protection and privacy in the U.S.
The result?
Effective since 1 January 2020, the law is frequently presented as a potential model for a U.S. data privacy law.
In that sense, the CCPA can potentially become as consequential as the GDPR.
The GDPR, for its part, has inspired many national laws outside the E.U., in countries including Chile, Japan, Brazil, South Korea, Argentina, and Kenya.
CCPA's definition of biometric data is a bit broader than that of GDPR: "an individual's physiological, biological or behavioral characteristics, including an individual's D.N.A., that can be used, singly or in combination with each other or with other identifying data, to establish individual identity."
The rights provided to California consumers to protect their personal information and biometric data include:
- Accessing the data (right of disclosure or access),
- Deleting them (right to be forgotten),
- Taking them (data portability – the data must be received in a commonly used and readable format),
- Requesting businesses not to sell their personal information,
- Opting out (Opt-in is the primary consent standard mandated by European GDPR),
- Right of action (penalties).
CPRA
At the end of 2020, Californian voters made another step forward on the data privacy road.
The California Privacy Rights Act (CPRA), passed into law on 3 November 2020, will take effect on 1 January 2023, with a lookback period starting 1 January 2022.
It's a supplement to CCPA.
It creates a series of new amendments to the existing text.
In particular, it creates new rights and expands existing ones for California residents.
CPRA creates a new category of personal information named sensitive personal information.
Biometric data are included in this new group, along with race, ethnicity, sexual orientation, religious beliefs, geolocation, and social security numbers, to name a few.
We suggest this excellent document for a more detailed comparison between CCPA and GDPR.
Virginia's CDPA
Virginia passed the Consumer Data Protection Act (CDPA) (SB 1392) in March 2021.
It was signed on 2 March 2021 and will become effective on 1 January 2023, synchronized with the CPRA.
The law specifies genetic and biometric data as sensitive data to be protected.
#2 Illinois's BIPA and the Rosenbach v. Six Flags case
Illinois' BIPA is the most robust biometric privacy law in the United States.
The case was significant because the Illinois Supreme Court ruled that a plaintiff didn't need to show additional harm to impose penalties on a BIPA violator. A loss of statutory biometric privacy rights is enough.
The Electronic Frontier Foundation praised the ruling, calling it a key privacy victory.
#3 Federal hearings and activity
It seems California has strongly motivated members of Congress.
Federal legislative hearings and activities aim to combat the challenge created by a "patchwork" of different, individual state privacy laws.
The House Oversight and Reform Committee held its third hearing on facial recognition on 15 January 2020.
But could California's privacy law be a model for the U.S., as Government Technology put it?
According to Fortune Magazine (May 2020 issue), at this point, we're looking at something for 2021.
More on Representative Suzan DelBene's (D-WA) proposal here: Information Transparency and Personal Data Control Act
#4 San Francisco's ban on facial recognition
The anti-surveillance ordinance signed on 6 May 2019 by San Francisco's Board of Supervisors is the first ban by a major city on the use of face recognition technology.
It prohibits the city government, including the SFPD, from using facial recognition technology.
Since the ordinance's passage, the debate has been hot in many cities and states.
Should other localities follow this example? Is this a step backward for public safety? Is the ban just a "pause button" to better analyze the risks of such technology?
Somerville (Massachusetts) in June and Oakland (California) in July 2019 made the same decision. San Diego went the same way in December 2019 with a three-year moratorium.
Boston banned using facial surveillance technology by police on 24 June 2020.
Lather, rinse, repeat.
Portland, Oregon, made the same decision in September 2020. The ban will come into force in January 2021.
So, stay tuned for the outcome of all these discussions, and, in the meantime.. let's move to India.
India and the emerging global consensus on biometric data protection
On 24 August 2017, India made its position very clear when the Supreme Court ruled privacy a 'fundamental right' in a landmark case.
In September 2018, a Supreme Court ruling eventually declared it unconstitutional for private companies to use Aadhaar data, impacting the country's massive biometric identification program.
Just think about the size of this project.
Aadhaar was first unveiled back in 2009.
In June 2021, some 1.29 billion people had an Aadhaar number, accounting for more than 99% of India's adult population.
The principle is simple.
Biographic and biometric data are captured from all Indian residents aged over 18. This means name, date of birth, gender, address, a photograph, ten fingerprints, and two iris scans.
Each resident is then issued their own unique 12-digit Aadhaar number. It's proof of residence, not of citizenship, and it is not compulsory.
It's a single, universal digital identity number that any registered entity can use to "authenticate" an Indian resident.
But the I.D. is not the card; it's the number, and it's purely digital and hence verifiable online.
So should this project be limited to a national I.D. scheme? It seems not.
On 28 February 2019, India's Modi government approved changes to the law governing the country's biometric I.D. program. In particular, the changes allow Aadhaar to be used by private entities, even though a September 2018 Supreme Court ruling had declared that unconstitutional.
New Aadhaar amendments were passed in July 2019.
They allow private parties to use the Aadhaar number for verification (such as the electronic know-your-customer process, or e-KYC) while preventing them from collecting and storing an individual's Aadhaar details.
According to the German DW website (November 2020), Personal Data Protection (PDP) legislation, similar to the E.U.'s GDPR and California's CCPA, was expected to be passed by 2021.
However, the Houses of Parliament granted another delay to the working committee on the Personal Data Protection Bill in March 2021.
There is still no enacted legal text as of today (6 June 2021).
Data Protection Laws in China (2018-2021)
In China, biometric data privacy is protected by several laws.
The two most important pieces include:
- The Cybersecurity Law (C.S.L.) came into force on 1 June 2017. It acts as the baseline for present guidelines.
- The PIPL (Personal Information Protection Law) is effective as of 1 November 2021.
China at first followed a path similar to the U.S. approach.
It's now converging with the more rigorous E.U. rules on many legal aspects, primarily through the C.S.L. and the Personal Information Security Specification, according to legal expert E. Pernot-Leplay in his comparative law study of May 2020.
The C.S.L. includes biometric data in its definition of personal information:
"personal data are all types of information recorded electronically or through other means, that taken alone or with other information, is sufficient to identify a natural person's identity, including, but not limited to, full names, birth dates, identification numbers, personal biometric information, addresses, telephone numbers, and so forth."
The Chinese paradox (as far as data privacy is concerned)
China's approach is unique.
The country increases consumer privacy (data stored by companies and third parties) AND state surveillance (data collected by authorities).
This "data privacy with Chinese characteristics" has a third dimension: the cyber sovereignty principle (data localization, in particular, i.e., data have to be stored in China).
China's first facial-biometrics-related litigation
In April 2021, the Hangzhou Fuyang People's Court ordered a local safari park to delete the facial information it had collected without the consent of the plaintiff, Mr. Guo Bing.
The park must also pay 1,038 yuan ($158) in compensation for economic losses.
According to the Chinese Global Times website, both parties have decided to appeal.
There's more.
A new and awaited personal information protection law was drafted in 2020 and was open to public comment from February to May 2021.
It was eventually passed in June 2021.
The (Chinese) personal information protection law (PIPL) took effect on 1 November 2021.
Alongside China's PIPL and the Cybersecurity Law of 2017, the new Data Security Law (DSL) was issued on 10 June 2021.
The DSL will become effective on 1 September 2021.
Its goal is to push governments, institutions, and companies to increase investment in data security as it clarifies the responsibility of all stakeholders.
A global consensus on privacy?
Thales and digital security
An expert in strong identification with more than 200 civil I.D., population registration, and law enforcement projects incorporating biometrics, Thales can act as an independent authority in proposing and recommending the most suitable solution for each application.
Thales attaches great importance to assessing risks and private operators' capacity to manage such risks. Similarly, legal and social implications are also significant.
Although Thales keeps an open mind concerning biometric techniques, it remains no less convinced that this technology offers significant benefits for guaranteeing identity, whatever the choice of biometric.
More resources on privacy laws
- We recommend Global Tables of Data Privacy Laws and Bills (2019 edition) by Graham Greenleaf, University of New South Wales, Faculty of Law, for a broader view of privacy laws.
- January 2020 E.U. whitepaper draft on artificial intelligence and facial recognition
- February 2020: Massachusetts halted its consumer privacy bill until a future legislative session
- March 2020: Data Privacy Law in China: Comparison with the E.U. and U.S. Approaches
- April 2020: N.Y. State's Stop Hacks and Improve Electronic Data Security (SHIELD)
- Brazil has a new privacy law (effective August 2020)
- Brazil's LGPD (September 2020)
- September 2020: Portland becomes the first to ban companies from using face recognition in public
- September 2020: South Africa's Protection of Personal Information Act (POPIA)
- September 2020: Japan and the use of facial recognition to fight crime
- October 2020: Data privacy predictions for 2021
- December 2020: CPRA explained
- Our web privacy policy (2014 revision) and other terms for apps