Respecting privacy when implementing new technologies
Every day, organisations are finding new ways to apply technology to help them grow, solve problems, and become more efficient. And many of these new business ideas involve processing personal data.
Fields like AI, connected devices, and VR are advancing quickly. But before adopting such technologies, take a step back and consider the potential impacts on people’s privacy and data protection rights.
This article covers some key privacy issues to weigh when implementing new technologies. We’ll also explore some analysis from the UK’s data protection regulator about privacy challenges in emerging tech.
Putting privacy first
Embracing privacy can help businesses succeed when taking risks and exploring new initiatives. Privacy and data protection law exist to support the safe use of personal data—not to block business growth.
For example, take a look at Recital 6 of the EU and UK General Data Protection Regulation (GDPR):
“Technology has transformed both the economy and social life, and should further facilitate the free flow of personal data… while ensuring a high level of the protection of personal data.”
In other words: we shouldn’t resist technological progress or the use of tech to process people’s data—as long as there are safeguards.
If you’re using tech in new and innovative ways, integrating privacy should start from day one.
The UK’s data protection regulator, the Information Commissioner’s Office (ICO), talks a lot about empowering organisations to innovate and use technology in safe and responsible ways.
The ICO published its Tech Horizons Report in January 2023. The report looks at four types of tech that should have a major social impact in the coming years:
- Consumer HealthTech
- Next-generation Internet of Things (IoT)
- Immersive technology
- Decentralised finance
Privacy risks manifest differently across these four areas. But there are some common challenges.
Many new technologies are driven by vast amounts of personal data.
Organisations using or developing these technologies might not reasonably be able to inform people that their personal data is being used for these purposes. Some companies might even collect data indiscriminately.
This means some people may not know what is happening with their personal data.
Facilitating people’s rights
People have rights over their personal data, such as the right to access or delete it in some circumstances.
But big data ecosystems can get messy. Data can be combined and stored in unstructured form.
Collecting and storing vast amounts of unstructured data can make it difficult to help people exercise their rights.
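One practical mitigation is to track where each person's data ends up as it is collected, so that access and erasure requests remain answerable even when the underlying stores are unstructured. The sketch below is a hypothetical illustration, not a prescribed ICO approach; the class and location names are invented for the example.

```python
from collections import defaultdict


class PersonalDataIndex:
    """Maps a subject identifier to the data stores holding their data.

    A minimal sketch: real systems would persist this index and
    integrate it with each store's export/delete mechanisms.
    """

    def __init__(self) -> None:
        self._locations: defaultdict[str, set[str]] = defaultdict(set)

    def record(self, subject_id: str, location: str) -> None:
        # Call this whenever personal data about the subject is written.
        self._locations[subject_id].add(location)

    def access_request(self, subject_id: str) -> list[str]:
        # Right of access: every store to export this person's data from.
        return sorted(self._locations[subject_id])

    def erasure_request(self, subject_id: str) -> list[str]:
        # Right to erasure: stores to delete from; the index entry is
        # dropped once the request has been handled.
        return sorted(self._locations.pop(subject_id, set()))


index = PersonalDataIndex()
index.record("user-42", "crm/contacts")
index.record("user-42", "analytics/events")
print(index.access_request("user-42"))  # ['analytics/events', 'crm/contacts']
```

Keeping such an index up to date is itself a design discipline: it only works if every write path for personal data goes through it.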
Under most data protection laws, organisations must not collect personal data unless they need it for a specific purpose. This is called the principle of “data minimisation”.
The data-hungry nature of some new technologies means that organisations might cast their data-gathering net too wide, scooping up more data than they need or can realistically use. This would violate the data minimisation principle.
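Data minimisation can be enforced mechanically at the point of collection: declare which fields each purpose actually needs, and discard everything else before storage. The field and purpose names below are hypothetical, purely to illustrate the pattern.

```python
# Each declared purpose maps to the only fields it justifies collecting.
PURPOSE_FIELDS = {
    "order_fulfilment": {"name", "delivery_address", "email"},
    "payment": {"card_token", "billing_address"},
}


def minimise(record: dict, purpose: str) -> dict:
    """Return only the fields needed for the stated purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {key: value for key, value in record.items() if key in allowed}


raw = {
    "name": "A. Person",
    "email": "a@example.com",
    "delivery_address": "1 High St",
    "date_of_birth": "1990-01-01",
}
stored = minimise(raw, "order_fulfilment")
# date_of_birth is discarded: it isn't needed to fulfil an order
```

The benefit of making the allow-list explicit is that adding a new field forces a conscious decision about which purpose justifies it.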
Respecting sensitive data
Most data protection and privacy laws recognise that some types of personal data require a higher level of protection than others.
Under the GDPR, such “special category data” includes information about religion, health and other highly personal characteristics.
Organisations need a good reason to collect sensitive data and must treat it with particular care. When using new technologies, some organisations might be collecting sensitive data carelessly and failing to protect it properly.
Steps to protect privacy when using new technology
The Tech Horizons Report considers the steps organisations should take when implementing these technologies.
Again, these measures vary across the four types of tech the ICO examines. But some of the regulator’s advice applies to any new technology.
- Use your privacy notice to provide clear, intelligible explanations about your practices.
- Inform people about how you draw inferences about them using AI.
- Conduct a data protection impact assessment (DPIA) where your processing is likely to result in a high risk to individuals.
- Ensure AI is accurate, fair, and checked for bias. Consult the ICO’s AI guidance.
- Consider how to make your tech more secure, such as by consulting the European Telecommunications Standards Institute’s consumer IoT security standard, ETSI EN 303 645.
- Implement the principles of “privacy by design and by default”.
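To make the DPIA step concrete, some teams use a simple screening check to flag projects that warrant a full assessment. The questions below loosely mirror common high-risk triggers (innovative technology, sensitive data, large-scale monitoring, automated decision-making) but are an illustrative sketch, not the ICO's screening criteria; a real decision should follow the regulator's DPIA guidance.

```python
# Hypothetical screening criteria; any positive answer flags the
# project for a full data protection impact assessment.
SCREENING_QUESTIONS = {
    "uses_new_technology": "Does the project use innovative technology?",
    "special_category_data": "Is sensitive (special category) data processed?",
    "large_scale_monitoring": "Is there systematic monitoring at scale?",
    "automated_decisions": "Are significant decisions fully automated?",
}


def dpia_recommended(answers: dict[str, bool]) -> bool:
    """Recommend a DPIA if any high-risk trigger applies."""
    return any(answers.get(key, False) for key in SCREENING_QUESTIONS)


project = {"uses_new_technology": True, "special_category_data": False}
print(dpia_recommended(project))  # True
```

Even a toy check like this has value as documentation: it records that the question was asked, and why the answer came out the way it did.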
Putting privacy first can be an advantage as you develop or adopt new technologies. Transparency, fairness, and security are crucial elements of any new initiative.
By consistently assessing and mitigating data protection risks, you can protect people’s privacy and help create long-term, sustainable growth.