
Face-off against surveillance

Opinion: We are constantly being watched and recorded. Thanks to the proliferation of AI facial recognition technologies we are being watched even more: on the street, in the bank, in shopping malls, on public transport, at the supermarket check-out; even our phones are likely watching us. Are AI facial recognition technologies keeping us safer and making life easier, as many would say they are, or should we be wary of them, even if we think we’ve got nothing to hide?
The Privacy Commissioner evidently thinks the increasing recording of our faces is enough of a concern that it is consulting on a Biometrics Code, to see if greater regulation of the collection and use of ‘sensitive’ biometric information is needed. Biometric information includes your face as well as other identifiers, such as the way you walk (your gait), your voice and so on, but the focus of this article is the face, and how we could and should protect it.
It’s true that facial recognition can make life easier. Many use AI facial recognition technology to unlock their smartphones, laptops and other devices, even their front doors – no key necessary, no password to remember. If you want to open an account at some banks, you can send through a scan of your driver’s licence or passport, let your face be scanned and the AI does its magic.
If you have travelled overseas recently and used the smart gates at the airport, you have used facial recognition technology. Your face was scanned and checked against the database of New Zealand passport photos.
But as with most things in life, every pro comes with a con. Scammers and fraudsters can get around AI facial recognition technology through spoofing, using a photo or series of photos or even a mask to fool the system. Or worse, by using AI deepfakes.
AI systems in turn are being developed to prevent spoofing and are getting more sophisticated, but they’re not 100 percent foolproof. The American technological research and consulting firm Gartner has predicted that by 2026, 30 percent of enterprises will no longer consider biometric tools reliable on their own.
Scammers are cunning, however. In Thailand, scammers used the iOS malware Gold Pickaxe (the name says it all). Pretending to be from Thai government agencies, they contacted people and asked them to download a “digital pension” application so they could receive their pensions online. To set up the digital pension, the victims recorded a video for facial recognition purposes, which the scammers then used to get into their bank accounts.
The more we depend on facial recognition, the greater the risk of it being used against us. If someone steals your credit card, driver’s licence, passport details, you can eventually get them replaced. What happens if someone hacks into a system and steals your face – that is, the data that allows AI to know that you’re you? You can’t exactly go and get a new face.
Even if you are unusually alert to scams, or wary enough of facial recognition technology to refuse to provide facial scans or photographs, your face can still be recorded without your consent or knowledge when you’re out in public, online, or anywhere else your face is showing.
Apart from hiding under a rock, what can we do to protect ourselves?
If you use AI facial recognition to access your bank accounts, for example, you should set up another verification method, such as two-factor authentication, so that the bank will also need to send a message to your phone to confirm that you were the one trying to log in using AI facial recognition.
Or you could refuse to use facial recognition to set up or use your bank accounts or any other service, and stand in line at the airport to be seen by a customs officer rather than going through a smart gate.
However, avoiding AI facial recognition technology is likely to become increasingly difficult, as we’re nudged or even obliged to use it.
It is quite common for governments to roll out new technology on people who have no choice about whether they want to use it. In New Zealand, the Identity Check system, which includes facial recognition, is becoming the primary way New Zealanders can verify who they are online, for access to thousands of public and private services, from benefits to banks.
In New Zealand we should not be forced to provide biometric information, including photographs and facial scans. People should be given another way to prove their identity. For example, with the Identity Check system, people can verify their identity in person at a service centre. And organisations should not be permitted to make it prohibitively difficult to prove your identity without using facial recognition. Unfortunately, the proposed Biometrics Code does not require that an alternative to AI facial recognition be provided.
Because of the problems surrounding the use of AI facial recognition technology, it makes sense to avoid it wherever possible. If a business or government agency provides an alternative to AI facial recognition technology, use the alternative, even if it is less convenient. And if a business does not provide an alternative, take your custom to another business that does.
