Response to Privacy Commission’s 2023 Biometrics Discussion Paper

Editor’s note: We wrote this response to the discussion document in 2023, but did not publish it until we started preparing our response to the next consultation.


Dear Office of the Privacy Commissioner,

We appreciate your efforts in leading this work examining the privacy implications of biometrics and how best to regulate their use. We broadly support the position taken in the discussion paper and believe that a privacy code of practice for biometric data would benefit the freedom and privacy of New Zealanders.

This submission explains our thinking on this topic and answers some of the specific questions in the discussion document.

Our approach

We share the concerns of many others about the use of biometric technology. We have no issue with some common uses, such as unlocking a phone with a fingerprint, and we also accept the carefully regulated use of biometric information in passports and the associated e-gates at the border.

Rather, we are worried about the widespread use of biometrics for surveillance, tracking, and invading people’s privacy. The technology is now cheap enough that almost anyone can set up systems that watch individuals, track their movements from place to place, and categorise them by emotion, health, race, and other characteristics.

We are also concerned about the accuracy of biometrics, the racial bias built into these systems (both from recognition shortcomings and from training on existing, biased data), and the general difficulty of managing and securing biometric data.

A key point that differentiates this technology from others, and justifies a different approach to regulation, is that many biometric scans, such as facial recognition, can be done without the consent or even the knowledge of the person being scanned. There is also often no escape: you can leave your mobile phone at home, but you can’t leave your face.

Biometric technology greatly amplifies the already present risks of pervasive surveillance by turning it from watching anonymous people into watching identifiable individuals with measurable characteristics.

We believe that New Zealand is falling behind other comparable countries with our lack of regulation. We further believe that the unconstrained use of biometric technology is a threat to our freedoms and civil liberties. We support appropriate regulation so that we can take advantage of this technology while avoiding the associated risks.

Code of practice vs other regulation

We have always believed that biometric information is personal information and therefore comes under the Privacy Act. We have been surprised at the lack of action taken by the Privacy Commissioner over some uses of facial recognition, such as in supermarkets, which we believe are contrary to the Act. The lack of suitable regulation is a real problem as people rush to deploy the technology for their own ends, with little consideration of the wider ramifications.

Thus we are pleased at the prospect of a biometric code of practice being issued under the Privacy Act. This is a chance to limit some of the more egregious uses of biometric technology and protect the privacy and freedom of New Zealanders.

However, we also recognise that regulating the use of biometric technology is a significant step. Ideally, such a significant step would be taken through legislation passed by a democratically elected Parliament.

The use of a privacy code of practice is also somewhat limiting. Not all issues with biometric technology are strictly privacy related, and the Privacy Act imposes limitations on both the scope of the regulation and the measures that can be taken. This tension can be seen in the somewhat novel approaches in the discussion document, such as the requirement for explicit consent to collect biometric data. Legislation would enable a wider scope and more flexibility of approach than a code of practice.

That said, on balance we believe that a code of practice is a reasonable and worthwhile approach. As stated earlier, biometric data is unquestionably personal information and therefore subject to the Privacy Act. It’s also important to note that a code of practice would not limit the ability of a future government to legislate as required.

We also acknowledge that the OPC is consulting widely and that any code of practice would have to be available for public consultation before it could be adopted, thus providing at least some level of democratic input.

Tempting databases and watchlists

There is no longer such a thing as an isolated technological system or information database. If someone collects data, there is inevitable pressure to share it with others. A particular concern of ours is the sharing of data with the state, in the form of law enforcement and spies, relying on the “maintenance of law” exceptions in the Privacy Act or on provisions in the Search and Surveillance Act and the Intelligence and Security Act. The data also appeals to those who can’t get it legally and will simply attempt to steal it.

When we consider the use of biometric technology, we must be careful to avoid creating what we call “tempting databases” of biometric data. The only sure way to protect sensitive personal information is to avoid collecting it in the first place. While this is a reiteration of a well-understood privacy principle, the connectedness of everything makes it ever more important.

Automatic number plate recognition (ANPR) systems in Auckland are a good example of how a database established for one purpose has been turned into a source of data for law enforcement that they could not otherwise obtain. We have been told that the number plate recognition data is now being retained for longer than originally needed, in case law enforcement wishes to access it in the future, a case of the surveillance tail wagging the traffic-flow dog.

The discussion paper suggests that one allowable exception for unconsented mass facial recognition might be scanning patrons against a list of banned people, so that problem gamblers or trespassed shoplifters can be excluded. Setting aside the many other privacy and civil liberties problems with this approach, this exception alone would be used to justify a significant and widespread deployment of facial recognition systems at retail stores across the country.

The very nature of such a system means that all people will have to be scanned and analysed so that their faces can be compared against the database of banned people. At that point they’re in the system and we’ve created a “tempting database”, resulting in pressure to start keeping the data for longer and making it available to law enforcement under the “maintenance of law” exception.

The general point we want to make is that the code of practice must take into account the inevitability that any significant collection of biometric surveillance data will eventually be accessed by the state. We must avoid the creation of tempting databases wherever possible.

The specific point is that, if there are to be exceptions allowing watchlist matching (a position we are not yet persuaded by), any captured biometric information that fails to match must be discarded immediately.
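To make the discard requirement concrete, here is a minimal sketch, in Python, of what watchlist matching with immediate discarding could look like. The embedding representation, similarity threshold, and data shapes are all hypothetical illustrations rather than a description of any real system; the point is only that a failed match produces no stored record at all.

    # Hypothetical sketch of watchlist matching with immediate discard.
    # The embedding representation and 0.6 threshold are illustrative
    # stand-ins, not a reference to any real product or standard.
    from dataclasses import dataclass

    import numpy as np


    @dataclass
    class WatchlistEntry:
        person_id: str
        embedding: np.ndarray  # pre-enrolled face template


    def check_against_watchlist(capture: np.ndarray,
                                watchlist: list[WatchlistEntry],
                                threshold: float = 0.6) -> str | None:
        """Return a watchlist person_id on a match, otherwise None.

        When None is returned, the caller must not persist the capture
        (neither the embedding nor the source image): it is discarded
        immediately, so no "tempting database" of passers-by accumulates.
        """
        for entry in watchlist:
            # Cosine similarity between live capture and enrolled template.
            sim = float(np.dot(capture, entry.embedding)
                        / (np.linalg.norm(capture)
                           * np.linalg.norm(entry.embedding)))
            if sim >= threshold:
                return entry.person_id
        return None  # No match: discard the capture, log nothing.

The design point is that the non-match path leaves no trace at all, not even an anonymised log entry that could later be re-identified.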

Consent

We are pleased to see the requirement for specific and explicit consent for the capture of biometric information. Anything less than this, such as consent by walking past a small sign, will surely lead to the widespread and indiscriminate deployment of facial recognition systems.

We acknowledge that this focus on consent is a change in emphasis for our Privacy Act. We have spoken before about the problems with consent when it comes to privacy: there can be no meaningful consent when an individual has to agree to the use of their data in order to use public transport, access banking services, or otherwise participate in modern society.

While we cannot rely on consent alone, it can be useful as an additional layer built on top of mandatory strong protections. Furthermore, the fact that biometric information can be captured and used without people’s knowledge justifies requiring explicit consent as a way to stop over-surveillance and abuse.

Categorising people

One of the concerning uses of modern biometric surveillance systems is that they can be used not just to identify people but also to classify them. This includes:
● Attributes such as race, age, gender
● States such as mood or emotions
● Conditions such as health status

We are concerned that automatically detecting these attributes will be used to unlawfully discriminate against people. It’s worth noting that many of them are prohibited grounds of discrimination under the Human Rights Act. With security being one of the driving needs behind increased surveillance, it’s also very easy to see how a biometric-equipped surveillance system, combined with an AI trained on historical data, will end up enabling prejudiced actions against minority communities.

We believe it is a gross invasion of privacy to use technology to infer people’s internal emotional states. Surely we should be able to feel things without having that information analysed and recorded. This technology is an affront to human dignity.

The same applies to people’s health status. Health information is generally regarded as highly sensitive and deserving of strong privacy protection. Is all this to be undone because technological systems are reaching the point where they can detect and diagnose certain medical conditions? Capturing people’s health information without their knowledge or consent is a gross invasion of privacy.

In general we support prohibiting the use of biometric systems to categorise people in this way. However, we note that there may be positive uses of this capability. For example, it is possible to imagine a system that detects that a user is blind and automatically changes its mode of communication to suit. This topic may be worth exploring to better define whether there are circumstances or purposes for which this information may be captured.

Tracking

We recommend adding “location tracking” to the list of forbidden purposes.

Any form of biometric capture typically makes it possible to identify that a person is at a particular place at a particular time. This becomes more concerning as additional capture locations are added, giving the operator the ability to track people’s movements over time.

We believe that people are entitled to privacy of their location and that location tracking should be a forbidden purpose.
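To illustrate why additional capture locations matter, the following sketch (with invented data) shows how trivially per-camera sightings combine into a movement trail once they share a common biometric identifier.

    # Hypothetical sketch: per-camera biometric sightings combine into a
    # movement trail with a simple group-and-sort, which is why we argue
    # location tracking should be a forbidden purpose.
    from collections import defaultdict
    from datetime import datetime

    # Each sighting: (biometric_id, location, timestamp). Invented data.
    sightings = [
        ("face-1042", "supermarket", datetime(2023, 5, 1, 9, 15)),
        ("face-1042", "train station", datetime(2023, 5, 1, 9, 40)),
        ("face-1042", "shopping mall", datetime(2023, 5, 1, 17, 5)),
    ]

    trails: dict[str, list[tuple[datetime, str]]] = defaultdict(list)
    for biometric_id, location, ts in sightings:
        trails[biometric_id].append((ts, location))

    for biometric_id, trail in trails.items():
        # Sorting by time turns isolated sightings into a day's movements.
        for ts, location in sorted(trail):
            print(f"{biometric_id} seen at {location} at {ts:%H:%M}")

No new capability is needed to build such a trail: once identified sightings exist in one place, tracking is a by-product of ordinary data handling.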

Technical security

Specific technical recommendations for securing biometric data are out of place in a code of practice. We accept that it may be worth defining biometric data as sensitive personal data deserving a higher level of protection, but particular methods and technologies should not be included in a code of practice, as they will quickly become dated.

Rather, it seems that it would be useful for the OPC to provide best-practice guidance for the storage of all private data, not just biometric data. This guidance could then be updated as required.

There are also issues with how biometric data is processed. It seems likely that while some users will process their own data, others will use national or international Facial Recognition as a Service providers. There is already a requirement that data stored in other countries only be kept in countries with comparable privacy laws; this should also apply when the data is processed in other countries.
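As one example of the kind of baseline measure such guidance might recommend (and which would date too quickly to belong in a code of practice itself), here is a sketch of encrypting a stored biometric template at rest. It uses the Fernet recipe from the Python “cryptography” package; the key handling shown is deliberately simplified and purely illustrative.

    # Hypothetical sketch: authenticated encryption of a biometric template
    # at rest, the sort of measure best-practice guidance might recommend.
    # Key management is simplified here; a real system would obtain keys
    # from a key management service rather than generating them inline.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # simplified: would come from a KMS
    cipher = Fernet(key)

    template = b"serialised biometric template, never a raw image"
    encrypted = cipher.encrypt(template)

    # Only the encrypted blob is ever written to disk or a database;
    # decryption happens just-in-time inside the matching service.
    assert cipher.decrypt(encrypted) == template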