
Privacy in one’s image and biometrics

03 November 2022 03:10


Adelaide Hayes, Tom Jury and Kate Hearnden from Cooper Grace Ward



Introduction

New technologies are making it easier to collect and use the images and biometrics of individuals. This article examines the types of biometric technologies being used, their legal ramifications and how Australia’s privacy law framework is responding.

Background

Biometrics are the physical or behavioural characteristics that can be used to identify an individual, including their fingerprints, facial patterns, and voice. New technologies have made it easier to identify a person using their biometrics.

Images and biometrics are considered to fall within the definition of “personal information” under the Privacy Act 1988 (Cth) (Privacy Act). For businesses subject to the Privacy Act, this means that their collection, use and disclosure of images and biometrics must comply with the Australian Privacy Principles (APPs).

Commercial Use

While biometric technology has an established role in law enforcement, its use in a commercial context is becoming increasingly prevalent. Employers are using facial recognition services to identify individuals more easily and accurately, enabling better employee access and check-in systems. Similarly, organisations are using the information gathered by these services to create customer profiles and even profile customer demographics.

As the adoption of these technologies grows, organisations must be aware of privacy concerns. The Australian Law Reform Commission raised concerns about the use of these technologies as early as 2008, stating that the technology:

  • is capable of enabling extensive monitoring activities
  • can be used to identify individuals without their knowledge or consent
  • could be used to reveal sensitive information about individuals, including their health and religious beliefs

Clearview AI

The Office of the Australian Information Commissioner’s (OAIC) investigation into Clearview provides an example of how the use of biometrics in technology is creating strong privacy concerns among stakeholders. Clearview developed a facial recognition app in 2017 that allowed users to upload an image of an individual, which was then converted into a biometric template and matched against images in the company’s database.

The app provided links to where those images appear on the internet, enabling users to identify the individual.

One of the main concerns with this practice is that Clearview’s database of images was developed by scraping over 3 billion images from various websites, including social media pages, without the knowledge or consent of the individuals in the images.

Australian Investigation

In July 2020, the OAIC announced that it was commencing a joint investigation with the United Kingdom’s Information Commissioner’s Office (ICO) into Clearview and its information handling practices.

The Privacy Commissioner found that Clearview:

  • failed to take reasonable steps to implement practices, procedures and systems to ensure compliance with the APPs, in contravention of APP 1.2
  • collected sensitive information of individuals without their consent, in contravention of APP 3.3
  • failed to collect personal information only by lawful and fair means, in contravention of APP 3.5
  • failed to take reasonable steps to notify individuals of the collection of personal information, in contravention of APP 5
  • failed to take reasonable steps to ensure the personal information it used or disclosed was accurate, up-to-date, complete and relevant, in contravention of APP 10.2

What does the decision show?

This decision demonstrates that Australia’s privacy regime is capable of responding to the misuse of images and biometrics depicting Australian individuals. The Privacy Commissioner stated that this case demonstrates the need to strengthen protections for individuals as part of the current review of the Privacy Act, including prohibiting practices such as the data scraping of personal information from online platforms.

The Commissioner’s scathing review of Clearview’s data handling practices sends a strong regulatory message to online platforms, both in Australia and overseas, looking to collect biometric information, especially where the collection is covert, widespread and unjustified.

Identity-matching Services Bill

Despite the backlash against Clearview, Australia’s public sector is continuing to use the biometrics of individuals for identity-matching services. In 2017, the Council of Australian Governments (COAG) agreed to facilitate a hub for the exchange of government images between the states, territories and the Commonwealth.

The purpose of this agreement was to promote the “secure, automated and accountable” exchange of identity information, in part to fulfil the objectives of law enforcement and community safety.

Lee v Superior Wood

Another example of the existing commercial use of biometric technology is illustrated in the case series of Lee v Superior Wood Pty Ltd. These cases concerned Mr Lee, a general factory hand who was employed by Superior Wood at one of its sawmill sites.

In 2017, Superior Wood introduced a policy requiring all workers to use the company’s new biometric fingerprint scanners to sign in and out of the site each day.

When directed to provide his fingerprint data, Mr Lee refused, expressing his concerns that the company could not guarantee that his biometric data would be safe from third parties. Superior Wood eventually terminated Mr Lee’s employment for refusing to follow its directions.

Mr Lee applied to the Fair Work Commission for an unfair dismissal remedy and was initially unsuccessful. However, the decision was overturned on appeal, with the Full Bench determining that Superior Wood’s direction was in breach of the Privacy Act.

Implications

This decision demonstrates that Australia’s privacy regime seems to provide protection to individuals from the unlawful collection of their biometric data in certain circumstances.

The decision also serves as a reminder to businesses that they should implement appropriate safeguards (such as easily accessible and transparent privacy policies and collection notices) to ensure that any collection of biometric information complies with the Privacy Act.

Some consider that Mr Lee’s dispute could have been avoided if Superior Wood had included a clause in his initial employment contract requiring compliance with all future workplace policies. While this possibility was not explored in the decision, it suggests that an individual’s right to privacy may be weaker than expected.

Conclusion

The expansion of biometric technology and systems has the potential to erode the privacy of one’s own image and biometrics.

However, the OAIC’s investigation into Clearview, the initial criticisms of the Identity-matching Services Bill and the Lee v Superior Wood Pty Ltd cases indicate that Australia’s privacy regime has the means to offer some protection in the face of advancing technologies.

It will be interesting to see whether and how Australia’s privacy regime adapts to keep pace with these evolving technologies and practices across the private and public sectors.

In the meantime, private businesses should remain vigilant in complying with the Privacy Act and be transparent with individuals about the collection and use of their biometric information.
