Privacy Commissioner Biometrics Consultation
About the New Zealand Council for Civil Liberties
- The New Zealand Council for Civil Liberties (‘the Council’) is a voluntary, not-for-profit organisation that advocates for the promotion of human rights and the maintenance of civil liberties.
Threat to Liberty
- The Council believes that biometric processing is a significant threat to our liberties and our democracy. The Council reiterates our call for a ban on Remote Biometric Identification (RBI). We define RBI as the use of biometric information by a third party at a distance in a public space without a person’s explicit knowledge and consent.1 The Council considers all uses of RBI to be mass surveillance.
- Privacy may not be listed in the New Zealand Bill of Rights Act, but it is a fundamental human right, at the core of human dignity.2 By allowing everyone to be identified everywhere they go, RBI removes our right to be anonymous in public.
- If we allow private organisations to implement RBI, it is seemingly inevitable that government organisations such as the police and the spy agencies will seek to access and aggregate this data. Such aggregated systems would give their users data about the movements of everyone in New Zealand, placing us all under unnecessary and unwarranted surveillance.
- This is a violation of our right not to be subject to unreasonable search and seizure under section 21 of the New Zealand Bill of Rights Act 1990 (BORA).
- In an open letter co-signed by 52 civil society organisations, European Digital Rights said:
The remote use of [biometric] systems destroys the possibility of anonymity in public, and undermines the essence of our rights to privacy and data protection, the right to freedom of expression, rights to free assembly and association (leading to the criminalisation of protest and causing a chilling effect), and rights to equality and non-discrimination.
Without an outright ban on the remote use of these technologies in publicly accessible spaces, all the places where we exercise our rights and come together as communities will be turned into sites of mass surveillance where we are all treated as suspects.
- It is well documented that our BORA s14 right to freedom of expression is limited when we are identified. People naturally and rationally speak more freely when they do not fear undue consequences from their speech.
- RBI is also widely used overseas to deter public protests by identifying those who attend them. This is one of the ways in which RBI interferes with our BORA s17 right to freedom of association.
- The risks from RBI are not hyperbole. The Council notes with concern China’s use of RBI to oppress minorities within its borders. The Council further notes that the ongoing genocide in Palestine has been assisted by the use of RBI in that conflict. We would be hypocritical to allow our police to act in ways we condemn overseas.
Failings of biometric technologies
- The Council wishes to be clear that even if RBI worked perfectly we would still staunchly oppose it for the reasons above. However, as the EFF says, “Errors, both human and technical, abound in every step in the process.” Therefore we also oppose RBI as it does not work well enough.
- RBI creates information security risks. The massive databases RBI requires are vulnerable to leaks and hacks. Further, they are frequently misused for illegal purposes by people who have legitimate access, primarily for stalking women.
- RBI is not reliable enough for the uses typically proposed. Last month’s incident in Rotorua, where a woman was treated badly after being misidentified, will not be an isolated event. RBI is not accurate enough for activities which have legal consequences.
- Moreover, RBI errors are far more pronounced against already marginalised groups, whose protection the OPC ought to be prioritising. Finally, RBI’s use overseas and its initial uses in Aotearoa are within systems in which racism is already deeply embedded. Automating decisions based on that data codifies and perpetuates that racism.
- The vendors of RBI acknowledge that the technology is not ready. Just this week Microsoft announced it would no longer supply facial recognition technology to police.
Feasibility of a code under the Privacy Act
- The Council previously believed that a code under the Privacy Act 2020 would provide protection from RBI. Having seen how the code has been weakened since the 2023 draft, we no longer believe this is the case. While the Biometric Code may have some utility, it will fail to prevent the most concerning risks from the use of biometrics in our society.
- The earlier consultation discussed the pros and cons of a code versus legislative change. A privacy code is inherently limiting: as we have identified above, not all issues with biometric technology are privacy related. The Privacy Act imposes limitations on both the scope of the regulation and the measures that can be taken. These constraints prevent the OPC from creating the outcomes our communities need.
- Legislative change by a democratically elected Parliament is also an appropriate method to make significant changes such as banning RBI.
- We are concerned that adopting a biometrics code would send the false signal that this problem has been dealt with, when it now seems clear that the fundamental issues cannot be addressed by a code under the Privacy Act. Therefore the Council recommends that a biometrics code under the Privacy Act should not be adopted.
Consent
- The 2023 draft required specific and explicit consent for the capture of biometric information. The Council strongly backed explicit consent in our submission, saying “Anything less than this, such as consent by walking past a small sign, will surely lead to the wide-spread and indiscriminate deployment of facial recognition systems.”
- Consent has been removed from the exposure draft. The consultation document says that consent “wasn’t practical” and “wasn’t the best tool for the job”, and continues that “The Privacy Act doesn’t always require consent”, which makes it difficult for a code under the Act to require consent.
- The Council believes that the majority of acceptable uses of biometrics are those in which explicit, informed, and written consent is easily obtained. The Venn diagram of applications where obtaining informed consent is challenging and applications which should be opposed as attacks on our liberties is not quite a circle, but lack of consent should be a flag for considerable extra scrutiny.
Problems with the Exposure Draft
Non-Privacy Harms
- As the exposure draft is a code under the Privacy Act, it understandably focuses on privacy. However, as we addressed in the first section of this submission, there are many other harms, including harms to rights protected by BORA.
- The code acknowledges the existence of non-privacy harms in its definition of adverse action. However, our interpretation of the thoroughly convoluted exposure draft is that Rules 1(2)(b) and (d) either exclude or unreasonably downplay non-privacy harms. Both of those provisions should be strengthened with the explicit inclusion of non-privacy harms.
Algorithm Governance
- Rule 3 of the exposure draft provides for some information to be made available to the people whose biometrics are being collected, such as the purpose of the collection and their right to complain to the Privacy Commissioner.
- The information listed in Rule 3 falls far short of the information stipulated in the transparency section of Aotearoa’s Algorithm Charter, which we believe should be taken as a model. However, even that is insufficient: we believe greater transparency is required concerning the design, testing, and implementation of such systems.
- The code should require public agencies which use biometric systems to commit to not withholding information about those systems under section 9(2)(b) of the Official Information Act 1982, the commercial secrets exclusion. Given the dire consequences inherent in this technology, the use of this frequently invoked loophole should be denied.
- Almost the entirety of the exposure draft overlaps with algorithm governance. Despite this overlap, it appears that the drafters may not be familiar with the work of the Government Chief Data Steward (GCDS) at Statistics New Zealand, who is the systems lead for this subject under the Public Service Act 2020. The Council urges the OPC to seek detailed guidance from the GCDS.
Self-assessment
- Rule 1 of the code requires an assessment of alternatives, benefits, impacts, and risks of a biometric system before the system is used. However, that is a self-assessment performed by people who have a clear conflict of interest, and who often will already have decided to implement the system before they perform the assessment. Self-assessment is inappropriate given the stakes: it is self-policing, with no downside to being too lenient on yourself.
- No one gets to see the self-assessment unless there is a complaint. Practically, there is nothing to verify that the self-assessment actually occurred before the first complaint, or that it is not altered to defend against that complaint between the complaint being filed and the self-assessment being sent to the OPC.
- Furthermore, the exposure draft is quite technical and its meanings are subtle. People writing their own assessments are going to make mistakes, and there needs to be some form of quality assurance even for those who are doing their best to comply.
- The Council thinks that a better process is required. One option, recognising the uniquely invasive nature of this technology, would be to require anyone wishing to deploy a system under the code to have an external, accredited auditor review their system and proportionality assessment before the system may be used.
Suppliers of biometric services
- Interpretation (4) says that the “benefit of an agency achieving its lawful purpose outweighs the privacy risk of biometric processing” where “(c) the private benefit to the agency outweighs the privacy risk to a substantial degree”. For a supplier of biometric services, whose very existence depends on being able to process biometric data, won’t this always be true? The code needs to account for this by centring the interests of the people who are being targeted by the system: the organisation which controls the site being surveilled, not the supplier, should be the one whose trade-offs are assessed.
- Despite stretching to 327 words, or perhaps because of it, Rule 12 fails to sufficiently regulate cross-border uses of biometric information. Rule 12 should be rewritten to make clear that service providers in Aotearoa are bound by the code regardless of where the people whose biometrics are being processed are located. Further, every use of the biometric information of people in Aotearoa should be bound by the code, regardless of the location of the sensors or processors.3
Location tracking
- The Council believes the biometric processing code should include an explicit ban on location tracking. While the immediate use of a biometric system inherently establishes a location, that location should not be retained, and the only immediate use of that information which should be allowed is to direct the system’s on-site staff to the individual identified.
Privacy Harms do not require identification
- The exposure draft uses the phrase “is to be used in a form in which the individual concerned is not identified” in Rules 10(3)(b)(i) and 11(1)(h)(i). The Council agrees with Privacy Foundation NZ’s assessment that identification is not necessary for privacy harm to occur.
- At its simplest, someone persecuting a minority does not need a positive identification, particularly if they are comfortable with collateral damage. The Council suggests that “is to be used in a form from which no privacy harm could come to the individuals” is a better phrasing.
Agencies above the law
- The Council objects to the explicit exclusion of the intelligence and security agencies from the code, and the tacit exclusion of the Police by the repetition of a “maintenance of law” exception. None of these agencies should be placed above the law, nor should they be allowed to trample on our rights.
- Indeed, were it not for the existing uses of biometrics by Police and our spy agencies, the Council would likely not have taken part in this consultation. Aotearoa needs governance of biometrics which shuts down unjust uses, not a code that says trusted agents are free to be unjust.
- Rather, the Council asks for the code to be amended such that biometric information may not be collected by or provided to the Police or the intelligence services unless there is a warrant which specifically authorises the collection or use of that biometric information.
- The Council thanks the OPC for its work on this important topic, and for running this consultation.
- The Council is not opposed to the use of biometrics by a person on themselves, for example unlocking a device. ↩︎
- Privacy: Concepts and Issues, Law Commission, January 2008, NZLC SP19, chapter 4. https://www.lawcom.govt.nz/our-work/review-of-the-law-of-privacy/tab/study-paper ↩︎
- This aligns with p. 4 of https://edri.org/wp-content/uploads/2024/04/EUs-AI-Act-fails-to-set-gold-standard-for-human-rights.pdf ↩︎