Submission: Digital Identity Services Trust Framework Bill
About the New Zealand Council for Civil Liberties
- The New Zealand Council for Civil Liberties is a voluntary, not-for-profit organisation that works to promote human rights and maintain civil liberties.
- We wish to make an oral submission before the Committee.
Summary of Recommendations
- The Council recommends that clause 82 be amended to include provisions similar to those outlined in Part 5 of the Privacy Act 2020 (Complaints, investigations, and proceedings).
- The Council recommends that clause 103 be removed from the Bill.
- We recommend that clause 19 be amended to require Trust Framework (TF) rules to include minimum standards for accessibility and inclusion, and that services be co-designed and pre-tested with input from suitable advisors on the inclusion of disabled people.
- We are concerned that clause 20 does not require public consultation on rules for a system that will affect the public.
- We also recommend that either clause 46 be amended so that the TF Board must include expertise in accessibility for disabled people, or that clause 20 be amended to add such experts to the list of those who must be consulted on the rules.
- We recommend that the Committee investigate options to appoint an independent oversight agency, as we do not think the structures proposed in clauses 43 and 58 will adequately deal with the conflicts of interest created.
Introduction
- The Council recognises the importance of trust in achieving an open and just democratic society, but believes that people's trust should be earned by being trustworthy.
- Trustworthiness is shown through transparency, openness, and justice.
- This Bill makes some positive attempts at establishing the trustworthiness of digital identity services, but also makes missteps that erode it.
Concerns
Remedies
- Clause 82 outlines remedies for faults in practice by a service provider but does not include reparative action or damages for any harms caused or financial losses suffered. The Council recommends that provisions similar to those outlined in Part 5 of the Privacy Act 2020 (Complaints, investigations, and proceedings) be included.
Immunity from civil liability
- The Council sees no justification for exempting providers from accountability or liability for the consequences of their operations. The Council opposes clause 103 and recommends that it be deleted. The clause significantly reduces people's access to justice, and by doing so diminishes the strength of our rights under the Privacy Act 2020, the Human Rights Act 1993, and other laws.
- We should be very wary of any service provider accredited under this framework, as accreditation effectively reduces our rights should we suffer harm. This immunity reduces the trustworthiness of accredited service providers in our eyes and therefore runs counter to the goals of the Bill.
- We note that a Regulatory Impact Statement attempts to justify this immunity in two main ways. The first is to incentivise participation in the framework. This goal cannot possibly justify limiting the rights and access to justice of users who suffer harm. The second is that service providers want assurance against “risks that do not exist in the unregulated market.” These risks are primarily the liability offences established in the Bill should providers opt in. However, none of these offences relate to the rights (such as privacy) diminished by the immunity. Immunity for providers is not a fair trade for the risks they face, as the cost of that immunity is borne by the users.
- The assumption appears to be that the users' loss of access to civil action is countered by their access instead to dispute procedures. However, these procedures enforce rules that are to be established by Order. It is impossible to determine whether this is an adequate alternative when the rules are unknown and subject to change. It is also not at all clear that the complaints procedure set out in the Bill, or any unknown future Alternative Dispute Resolution option, is an adequate replacement for access to civil action in the courts.
Accessibility and Inclusion
- Research by the Department of Internal Affairs indicates that digital technologies such as digital identity services increase barriers to accessing services for some people. These barriers are often referred to as the “digital divide” or “digital exclusion”. This exclusion is particularly difficult for service users when offline options for accessing services are diminished or removed. It is further exacerbated by a lack of a co-design process to ensure services are digitally accessible from the start, and that alternative support is provided for those who cannot access digital services. The Government has a duty to include people by ensuring their needs will be considered when approving pathways for digital identity services.
- Barriers to accessing digital services include issues such as poverty or low literacy that are outside the scope of this Bill to resolve. It is nevertheless important that Parliament consider how people will be served by a digital identity service. Citizens’ Advice Bureau (CAB) reports that approximately 30% of client visits now require assistance with digital services that clients are otherwise unable to access. This is placing a strain on CAB resources, and particularly on the volunteers providing support. Blind and low-vision people are particularly poorly served by bad design and the use of tools like CAPTCHA.
- We recommend that clause 19 be amended to require TF rules to include minimum standards for accessibility and inclusion, and that services be co-designed and pre-tested with input from suitable advisors on the inclusion of disabled people. While this Bill may precede the Accessibility for New Zealanders Bill announced by the Minister for Disability Issues, Carmel Sepuloni, it would be unwise to create accessibility barriers for future service users that will need to be removed later. We also recommend that either clause 46 be amended so that the TF Board must include expertise in accessibility for disabled people, or that clause 20 be amended to add such experts to the list of those who must be consulted on the rules. We are concerned that clause 20 does not require public consultation on rules for a system that will affect the public, not least because the limited list provided relies on officials identifying people to invite to participate, when in reality there are likely to be sources of expertise and experience amongst the public that officials do not know about.
Insufficient independence of the Board and Authority
- Clauses 43 and 58 require the Prime Minister to nominate a public service department responsible for the Board and Authority. Only the Authority is required to act independently, and only in respect of its enforcement functions.
- Our view is that public service departments, as likely relying parties for digital identity services, are significantly interested parties and therefore not sufficiently independent to carry out either of these roles. The Department of Internal Affairs in particular has a major conflict of interest as the provider of the government’s most significant existing digital identity service, RealMe. Despite the stated requirement to act independently, we question the degree to which the Authority can be both independent and seen to be independent under such an arrangement.
- We recommend the Committee consider instead placing both the Board and the Authority within an Independent Crown Entity.