Face/Off: The future of biometric data vs. consumer privacy
Posted: June 21, 2024
Necessity has always driven invention.
Traditional access to the digital world, where a username and password pair verifies one’s identity, has long been questionable and prone to theft.
Biometrics has created a paradigm shift in identity verification and authentication. This revolutionary technology harnesses an individual’s unique physical and behavioral characteristics, like fingerprints, iris scans, voiceprints, and facial features, to verify their identity.
Transcending access to a website or application, biometrics have found a presence in digital devices, smart homes, and automobiles. The accuracy with which they verify identity is unparalleled, and their enhanced security makes them applicable across diverse industries and applications, including healthcare, travel, government, and finance.
Their rapid proliferation, though, raises a new breed of data privacy and security risks that need to be addressed. Unlike compromised passwords, biometric identifiers cannot be revoked or reset, which makes them a much higher-stakes form of identification.
With more and more companies collecting biometrics, there is a risk of security breaches, identity theft, impersonation, and fraud.
How biometric data weighs in against consumer privacy remains a puzzle, which we look to solve below…
The importance of obtaining explicit consent in the digital age
With your biometrics now at stake, in a world where data fuels the digital economy, consent is one of the most important mechanisms for balancing data utility and data privacy.
While businesses want to make the most of user data to personalize experiences and generate revenue, users expect businesses to respect their privacy. To bridge the gap between the two, data privacy regulations come into play.
Data protection laws require businesses to obtain informed, specific, and freely given consent from users before collecting and processing their data. Explicit consent sits at the intersection of all these principles, requiring a clear opt-in mechanism through which users agree to a website’s terms for processing their data.
The requirement for a clear, affirmative action puts the power to control data sharing in users’ hands. Users can make informed decisions about their data, becoming active participants rather than passive subjects in the data ecosystem. Additionally, businesses achieve compliance with data protection regulations like the GDPR and CCPA, which emphasize clear, unambiguous consent mechanisms that respect user privacy and autonomy.
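To make the opt-in mechanism concrete, here is a minimal sketch in TypeScript of what an explicit consent record for biometric processing might capture, and how it could be checked before any processing takes place. The type names and fields are illustrative assumptions, not a specific regulation’s or vendor’s schema.

```typescript
// Illustrative sketch only: field and type names are hypothetical.
type BiometricPurpose = "device_unlock" | "payment_authorisation" | "identity_verification";

interface ConsentRecord {
  userId: string;
  purposes: BiometricPurpose[]; // the specific purposes the user agreed to
  grantedAt: Date;              // when the clear, affirmative action was taken
  method: "explicit_opt_in";    // pre-ticked boxes or silence never qualify
  withdrawnAt?: Date;           // consent must be as easy to withdraw as to give
}

// Consent is valid only if it was an explicit opt-in, covers the requested
// purpose, and has not been withdrawn.
function hasValidConsent(
  record: ConsentRecord | undefined,
  purpose: BiometricPurpose
): boolean {
  return (
    record !== undefined &&
    record.method === "explicit_opt_in" &&
    record.purposes.includes(purpose) &&
    record.withdrawnAt === undefined
  );
}

// Example: only enrol a user's biometrics for the purpose they opted in to.
const record: ConsentRecord = {
  userId: "user-123",
  purposes: ["device_unlock"],
  grantedAt: new Date(),
  method: "explicit_opt_in",
};

if (hasValidConsent(record, "device_unlock")) {
  // proceed with biometric enrolment for this purpose only
}
```

The point of the sketch is simply that consent is recorded per purpose and checked before processing, mirroring the specificity that regulations such as the GDPR demand.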
Challenges in the face of biometric consent
There has been growing public resistance to the use of biometrics. The public’s perception of biometrics as a security risk is genuine, and their reluctance gets in the way of consent for the following reasons:
Covert collection
Biometrics can be collected without a person’s knowledge, participation, or consent. Covert collection violates the principle of informed consent. Users cannot consent to what they’re not aware of.
Impressions of fingerprints on hard surfaces can be lifted to collect biometric information about individuals without their knowledge. Footage from public or private surveillance cameras can be harnessed for facial recognition technology. This constant monitoring without awareness leaves people with little control over their privacy and autonomy.
Moreover, facial biometric information can be captured from photographs of individuals available in the public domain. For example, Clearview AI amassed massive databases of facial images by scraping social media platforms without users’ consent, raising significant privacy concerns.
Undisclosed purposes
A business collecting biometric data might change its business model or be acquired by another company. The original consent may have been given to the first company for a specified purpose, while the acquiring company may have different business priorities and use the biometric data for purposes other than those for which consent was originally obtained.
Scenarios in which the purposes of data processing change with a new business model or an acquisition give rise to function creep. Companies may use biometric data for purposes other than those consented to, or combine it with other personal information to build a complete profile of users.
Even with explicit consent obtained, the scope of emerging technologies makes it impossible to disclose every potential use case. Moreover, companies’ use of vague terms like ‘to improve our services’ or ‘for security purposes’ can undermine the specificity of consent, leaving users unable to gauge the true extent to which a company intends to use their data.
Some jurisdictions have tighter data privacy regulations (e.g., the GDPR in the European Union) and closely control changes in the purposes of data processing. In nations with weaker regulations, however, companies can abuse biometric data in unforeseen ways, undermining people’s informed decision-making or even producing biased algorithms due to technical limitations.
Balancing convenience and privacy in biometric authentication
Finding a happy balance between ease of use and protecting biometrics is a constant tussle in today’s digital world. Biometrics makes everything lightning fast: unlocking a phone with a fingerprint or face, completing transactions in a few taps, or accessing information instantly all elevate the user experience.
However, the other side of the equation, privacy, often comes with trade-offs that need careful handling. Convenience comes at the cost of users surrendering some level of personal information. Companies collecting individuals’ physical characteristics may use them for targeted advertising, third-party reselling, or in ways beyond anyone’s anticipation.
In view of the evolving societal expectations for data privacy and to maintain compliance with data protection regulations, it is high time to balance convenience and privacy. A balance can be established with the following measures:
- User control and choice: Alternative authentication methods should be in place for users who prefer not to use biometrics, allowing them to choose a method that matches their preferences and concerns. Clear information about the collection, storage, and usage of biometrics enables users to make informed decisions. Mechanisms for explicit consent, such as opting in to or out of biometric authentication, let users share data on their own terms and limit organizations’ data collection practices.
- Robust security measures: Under the GDPR, biometric data is categorized as a special category of sensitive personal data. Strong security measures like encryption protocols, secure storage practices, and multi-factor authentication help keep data confidential and inaccessible to unauthorized parties. Furthermore, storing only a template derived from the biometric, rather than the raw data itself, mitigates potential misuse and the impact of breaches. Wherever possible, biometrics should be processed on the user’s device itself, which minimizes the risks of transmitting biometric data to external servers (a sketch of this approach follows this list).
- Ongoing evaluation and improvement: Technological advancements never sit idle; new sensors, algorithms, and data analysis techniques emerge constantly and bring unique challenges with them. Manufacturers of biometric systems and the organizations that deploy them should review and update those systems regularly. A proactive risk management approach can identify areas for improvement, whether in accuracy or user experience, and regular evaluations keep biometric systems up to date and able to withstand security vulnerabilities as they arise.
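As one illustration of on-device processing, the sketch below uses the browser’s WebAuthn API (TypeScript) to ask a platform authenticator to verify the user. The fingerprint or face match happens entirely on the device; the server only ever receives a signed assertion, never the biometric itself. The challenge value and the /webauthn/verify endpoint are placeholders assumed for the example, and a real deployment would also verify the assertion server-side against the registered public key.

```typescript
// Sketch: biometric user verification kept on-device via WebAuthn.
// The platform authenticator (e.g. fingerprint or face unlock) matches the
// biometric locally; only a signed assertion leaves the device.
// `challengeFromServer` and the /webauthn/verify endpoint are placeholders.

function toBase64(buffer: ArrayBuffer): string {
  const bytes = new Uint8Array(buffer);
  let binary = "";
  bytes.forEach((b) => (binary += String.fromCharCode(b)));
  return btoa(binary);
}

async function verifyWithPlatformBiometrics(challengeFromServer: Uint8Array): Promise<void> {
  const credential = (await navigator.credentials.get({
    publicKey: {
      challenge: challengeFromServer, // random challenge issued by the server
      userVerification: "required",   // require on-device user verification
      timeout: 60_000,
    },
  })) as PublicKeyCredential | null;

  if (!credential) {
    throw new Error("User declined or no credential is available");
  }

  const assertion = credential.response as AuthenticatorAssertionResponse;

  // Only the signed assertion is sent for server-side signature verification.
  await fetch("/webauthn/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      credentialId: credential.id,
      clientDataJSON: toBase64(assertion.clientDataJSON),
      authenticatorData: toBase64(assertion.authenticatorData),
      signature: toBase64(assertion.signature),
    }),
  });
}
```

In this pattern the service never handles raw biometric data at all, which both reduces the impact of a breach and narrows the scope of the consent it needs to collect.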
Final thoughts…
Biometrics are indeed the future of identity verification, but they don’t come without privacy risks. Businesses should treat biometric data as a valuable asset entrusted to them by users and conduct risk assessments to understand what a breach or unauthorized access would mean for those users.
Balancing convenience with privacy requires giving users control over their privacy settings, implementing robust security measures, and reviewing systems regularly. Covert collection and undisclosed purposes call for stringent regulations to protect user privacy. To address the vagueness that can creep into explicit consent, businesses should clearly state each specific purpose for which biometrics will be used and limit their use to those purposes alone. They should collect consent using a robust and scalable technology solution that can meet not only the requirements of today but also those of tomorrow.
Read our latest guide: A privacy professional’s AI checklist
Though AI technology and legislation are rapidly evolving, there is enough of a trend for savvy businesses to get ahead of the AI train. To help your organization make privacy-sensitive and future-proofed AI decisions, use our AI top 10 checklist to support:
- Identifying data goals, strategy, and tactics
- Determining a legal basis
- Solving transborder data flow concerns
- Considering data sets