Facing up to reality – New Zealand is failing on controlling facial recognition technology
The drive of the surveillance age is towards more. More cameras. More tracking. More detail. More data. More aggregation. More sharing. More analysis. Advances in technology drive the increase in capability, and there are people willing to turn these new capabilities into products, and people willing to buy and implement them.
The only things stopping or slowing the surveillance machine are the rules we choose to impose. We can allow people to encrypt their data. We can decide that the privacy of medical records outweighs the need to feed the machine. We can stop companies from collecting more data than they need and from sharing it with others. Our rules, our laws, are the only things stopping us from living in a panopticon where governments, corporations, and other snoops monitor everything we do.
Which is why the draft Biometrics Code from the Office of the Privacy Commissioner is such a disappointment.
Remote Biometric Identification
Remote biometric identification (RBI) is where our attributes – our face, the way we walk, our height, our emotions – are captured by cameras and then processed in a computer in order to identify us. It turns camera-based surveillance of the behaviour of relatively anonymous people into a system that identifies individuals and stores information about them.
The most obvious example is facial recognition technology (FRT). Supermarkets use it to recognise people on their watchlists, marketers use it to decide which ads to serve people, and police forces use it to look for criminals.
When RBI identifies you, it captures more than just your identity. The people who own the system know where you were at that time and what you did while you were there, and they may even judge you by what you’re wearing, your facial expressions, or how you move. They know who you were with, who you talked to, and for how long.
RBI is worrying enough in corporate and retail environments, but it becomes even more troubling when used in places where you might want more privacy. It can be used to identify protesters at a political demonstration, patients entering a health clinic, or people attending a social function.
Facial recognition systems take away our:
- Privacy of location – the ability to be somewhere without others knowing where we are. FRT allows anyone with a camera and access to the technology to determine where we were at a particular time.
- Privacy of identity – the ability to be anonymous in public places.
Importantly, RBI is something that is done to you with no opportunity to consent or refuse. The technology is cheap and cameras keep getting better. You can’t exactly change the way you walk or leave your face at home. You can be captured by an RBI system just by existing.
Data captured using RBI isn’t the same as most other forms of private data we talk about. We know that government agencies and private companies capture data about us when we deal with them. We give them the data and generally have a reasonable understanding of what use it’s being put to.
There’s no choosing when it comes to RBI; you’ve simply walked past a camera you might not have even noticed. You’re identified and recorded in a database without your consent or knowledge.
We believe that RBI is a fundamentally different class of threat to our privacy, one that calls for a different kind of response.
Enter the Office of the Privacy Commissioner
In 2022 we were pleased to see that the Office of the Privacy Commissioner was taking a proactive approach to addressing remote biometric facial recognition under existing privacy law. The technology was already starting to show up in New Zealand, and it was clear that existing law and practice were insufficient. Some other jurisdictions were already talking about controls or even bans on the use of facial recognition technology and other forms of remote biometric identification.
In 2023 the OPC published a Biometrics Discussion Paper (PDF) which proposed establishing a Biometrics Code of Practice (a legally recognised specialised subset of the Privacy Act) which, recognising that biometric information was particularly sensitive, would be used to set standards for its capture and use.
A key point in this discussion paper was the requirement for specific and explicit consent for the capture of biometric information. This meant more than the implied consent of walking past a small sign; it was real consent that would require people to actually agree that their likeness could be captured and analysed. As such, it would have eliminated the use of FRT for surveillance of passers-by while still allowing it in some circumstances for identification (e.g. unlocking your phone).
We felt heartened to see that the OPC seemed to be taking the threat seriously and our response supported this important requirement. You can read our full response to the OPC’s 2023 discussion paper, which covers a wider range of topics than this piece, on our website.
Draft Biometrics Code
But the draft Biometrics Code (PDF) released in April 2024 contained an unpleasant surprise. The requirement for consent was gone, replaced with a scheme where people could implement surveillance systems using facial recognition technology after completing their own privacy assessment.
Some might compare this to the current scheme for other forms of personal data, where companies are obliged to assess their own actions in light of the requirements of the Privacy Act.
But as we’ve already discussed, and as the OPC admitted in their first discussion paper, the acquisition and use of biometric data is different. We shouldn’t treat it like any other data collected in the normal course of business, such as your delivery address and payment details. Rather, it’s information taken from anyone passing within range of a camera, whatever their relationship with the entity that controls the camera.
Self-assessment is a woefully insufficient check on the implementation of a surveillance system. How many companies wanting to use FRT are going to do the assessment and then decide that they can’t justify its use? We suspect very few.
The draft code did propose some interesting things that go beyond the standard requirements of the Privacy Act. It set out a long list of considerations that would have to be taken into account when self-assessing a proposed use of RBI, and it prohibited using RBI for:
- Detecting someone’s emotions or health
- Classifying people into categories that relate to prohibited grounds of discrimination in the Human Rights Act (ethnicity, disability, sex/gender, age)
The draft code also stated that any specific issues Māori might have would be too difficult to deal with, because these systems collect data indiscriminately and can’t differentiate at the point of capture.
Our submission
We met with people from the OPC and other civil society groups to make sure we understood this new draft code before writing our own submission.
We pointed out that remote biometric identification is not the same as normal transaction data, or even existing video surveillance. We expressed concern that captured data can rapidly be aggregated, both by surveillance tech companies offering their services to businesses and individuals, and by government agencies such as the police and intelligence services.
We quoted European Digital Rights in saying that these systems “destroy the possibility of anonymity in public, and undermine the essence of our rights to privacy and data protection, the right to freedom of expression, rights to free assembly and association (leading to the criminalisation of protest and causing a chilling effect), and rights to equality and non-discrimination”.
We noted that identification can discourage people from attending political protests, thus infringing on our right to freedom of expression and association. This damages our democracy.
We noted that RBI systems can be used to track people’s location and build up a database of their movements over time, and recommended that this be added to the list of banned uses.
We strongly criticised the self-assessment requirement, noting that companies wishing to use RBI will naturally be biased in their own assessments. Even worse, there is no requirement to share a self-assessment until someone starts the complaints process, giving companies ample opportunity for after-the-fact justification.
Finally, and most importantly, we argued that the use of remote biometric identification such as facial recognition is a threat to our liberty and our democracy, and that rather than weakly constraining it we should ban it.
The OPC response
The Office of the Privacy Commissioner has released their summary of the submissions they received.
The analysis of submissions, after acknowledging that “members of the public are concerned about the use of biometrics, particularly around use for surveillance”, concentrated on the technical flaws in the draft code.
Unfortunately, it seems that they are so committed to the route they’re going down that they failed to engage with, or even mention, some of the substantive criticisms we made.
We’re particularly disappointed that they again refused to address the use of biometric data for location tracking. We fail to see why this is not being given serious consideration as one of the forbidden uses of RBI. If it’s not forbidden it seems highly likely that someone or some group will try to use it.
What we’re going to end up with
While this was only a summary of the submissions, we can’t help feeling that the OPC has already decided that they can’t or won’t stop the widespread use of facial recognition, whether we want it or not, and that they are merely going to put some rather weak rules around its use.
It looks like we’re going to end up with a permissive biometric code that will enable companies to use facial recognition and other forms of RBI as much as they feel comfortable with. The requirement to self-certify might stop some of the worst abuses, but there is still a huge area for companies, private individuals and government agencies to play in. There are going to be a lot of these systems out there.
Limitations in the Privacy Act?
But is this the fault of the Office of the Privacy Commissioner? They only have the powers that Parliament granted them through the Privacy Act.
For example, the earlier discussion document’s requirement that consent be obtained before scanning people’s biometrics was seen as a bit of a stretch of the current law. If so, then the current Privacy Act isn’t fit for managing these new forms of privacy-invading behaviour and needs to be reformed.
Another, separate, problem is that there are very large exceptions in the Privacy Act for actions to support the “maintenance of law”. This means that many of its requirements are eased or removed when private data is handed to the police, intelligence services, and other agencies. If we want to reduce the threat of remote biometric identification, we need to make sure that these rules apply to the police and other agencies as well.
What is required
We believe that remote biometric identification in general, and facial recognition technology in particular, are too big a threat. They enable new forms of surveillance, remove our ability to be anonymous in public, and make it easier for snoops to track our location and movements. They threaten our privacy and thus our freedom.
We don’t need them and we don’t have to put up with them.
We believe that we should ban the use of these technologies. If the Privacy Act as it stands won’t allow this, we need to either amend it or pass a new law that does ban the technology.
Our future freedom depends on it.