Your facial data can be legally collected and sold by anyone without needing your permission
My day began with a friend telling me they had used my photos to train a local version of Midjourney. They then sent me AI-generated images of myself in a playful steampunk outfit.
Actually, I felt violated. Surely that isn't right? I bet Taylor Swift felt the same when deepfakes of her hit the internet. But is the legal status of my face any different from that of a celebrity?
Your facial data is uniquely sensitive personal information: it can identify you. Intensive profiling and government mass surveillance receive a lot of attention, but companies and individuals are also using tools that collect, store and edit facial data, and we are facing an unexpected wave of photos and videos created with AI tools.
Legal regulation of these uses has lagged behind. To what extent, and in what ways, should our facial data be protected?
Is implied consent enough?
Australian privacy law treats biometric data (which includes your face) as part of our sensitive personal information. However, the law does not define biometric data.
Despite its shortcomings, the Privacy Act is currently Australia's most important piece of legislation for protecting facial data. Under the Act, biometric data cannot be collected without a person's consent.
However, the law does not specify whether this consent must be express or implied. Express consent is given explicitly, either verbally or in writing. Implied consent means consent can reasonably be inferred from a person's actions in a given context. For example, if you enter a store with a sign saying "facial recognition camera on premises", your consent is implied.
But relying on implied consent leaves our facial data open to potential abuse. Bunnings, Kmart and Woolworths have all displayed signs in their stores to indicate that facial recognition technology is in use.
Valuable and unprotected
Our facial data has become so valuable that data companies such as Clearview AI and PimEyes relentlessly scrape the internet for it without our permission.
These companies compile databases for sale, which are used not only by police in various countries, including Australia, but also by private companies.
Even if you remove all your facial data from the internet, you can easily end up in public view again and appear in some database anyway. Being in someone's TikTok video without your consent is a good example – in Australia, this is legal.
On top of this, we now have to contend with generative AI programs such as Midjourney, DALL-E 3 and Stable Diffusion. Beyond being collected, our facial data can now also be easily altered by anyone.
Our faces are unique to us; they are part of how we experience ourselves. Yet they have no special legal status or special legal protection.
The only action you can take to protect your facial data from aggressive collection by a business or private entity is to file a complaint with the Office of the Australian Information Commissioner, which may or may not lead to an investigation.
The same applies to deepfakes. The Australian Competition and Consumer Commission will only consider conduct that relates to trade, for example if a deepfake is used in false advertising.
And privacy law does not protect us from the actions of other people. I did not agree to someone training an artificial intelligence on my facial data and producing fake photos of me. Yet the use of generative AI tools by individuals is likewise unregulated.
There are currently no laws preventing other people from collecting or editing your facial information.
Embracing the law
We need a comprehensive set of regulations on the collection and processing of facial data, as well as a stronger stance on facial data itself. Fortunately, some developments in this area look promising.
Experts from the University of Technology Sydney have proposed a comprehensive legislative framework to regulate the use of facial recognition technology under Australian law.
It includes proposals to regulate the first step of any non-consensual activity – the collection of personal data – and could help guide the development of new laws.
As for photo editing with AI, we’ll have to wait for announcements from the government’s newly formed AI expert group, which is working to develop “safe and responsible AI practices.”
More broadly, there has been little specific discussion of giving our facial data a higher level of protection. However, the government's recent response to the Attorney-General's review of the Privacy Act contains some promising proposals.
The government has agreed to consider enhanced risk-assessment requirements for facial recognition technology and other uses of biometric data. This work should be coordinated with the government's ongoing digital identity work and the national identity resilience strategy.
Regarding consent, the government has agreed in principle that the definition of the consent required to collect biometric data should be amended so that it must be voluntary, informed, current, specific and unambiguous.
With the increasing use of facial data, we're all waiting to see whether these discussions become law – hopefully sooner rather than later.