Founded by an Australian tech entrepreneur, Clearview AI owns and operates a facial recognition app. A user uploads an image of a person and receives results in the form of photos of that person (including photos in which they appear in the background of someone else's shot) and links to where those photos were posted.

Clearview generates the search results from its database of over 3 billion images, which it reportedly created by scraping social media platforms including Facebook, Instagram, Twitter, and YouTube. Most social media platforms expressly prohibit scraping without consent; Twitter's developer policy even expressly prohibits using Twitter data for facial recognition activities.

Following an investigation by BuzzFeed News, the Australian Federal Police initially denied using the app, but recently admitted to trialling it in 2019. This is despite the fact that the use of facial recognition by law enforcement and government agencies is unsettled in Australia. A 2019 bill seeking to establish a framework for a national facial recognition database was rejected by the parliament's intelligence and security committee on the basis that it needed more robust privacy protections, parliamentary oversight and reporting requirements.

The privacy implications of such technology are profound, and if apps like Clearview enter the market, privacy as we know it will fundamentally shift.

Under the Privacy Act 1988 (Cth), biometric information (which includes any features of your face) is categorised as sensitive information. Sensitive information carries a higher level of protection than other personal information, including a requirement of express consent (consent given 'openly and obviously'), with some exceptions. Exceptions include the collection of sensitive information without an individual's consent by an enforcement body that reasonably believes the collection is reasonably necessary for, or directly related to, one or more of its functions or activities.

Regarding images of individuals (for example in photos or videos), an image is only personal information where the individual's identity is clear or can reasonably be determined from it. Without apps like Clearview in the picture (pun intended), this means that a photo of someone in a crowd won't be considered personal information if that person cannot be identified by taking 'reasonable' steps. However, fast forward to a time when the app is installed on all our phones, and you will always be able to reasonably work out a person's identity from an image using the app.

In January, the OAIC commenced an investigation into whether Clearview is being used in Australia and whether the personal information of Australians is being used in the app's database. While we await the outcome of that investigation, what we do know is that biometric matching technology should be seriously vetted and regulated when it comes to (among other things) privacy, security, accountability and accuracy.

We should have listened to Winston. #DOWNWITHBIGBROTHER

We do not disclaim anything about this article. We're quite proud of it really.