General
Who’s watching you? With CHOICE magazine's Kate Bower
Transcript
Your face is a bit different than other types of personal information. You can change your email address if it gets involved in a hack. You can change your phone number, but you can't change your face. So we do need to be extra careful about how we treat that information.
[Val] Hello and welcome to the Be Connected Podcast. I'm Val Quinn, and I'm a technology commentator, broadcaster, publisher, and your host of the Be Connected Podcast. So facial recognition has become one of the most widely used technologies today. It's pretty much everywhere: you can use it to unlock your phone, open your front door or even organise your photos. And in places like airports, stadiums and banks, it's being used to add security, extra features and convenience. But the big question is, who exactly is watching you? Who owns your face print, whatever that is? What are they using it for, and how did they even get your consent in the first place? Well, thankfully, here to help us understand the benefits and concerns of facial recognition technology is consumer data advocate at CHOICE, Kate Bower. Kate investigates data misuse to help ensure fairness and safety for all Australians, regardless of their technical knowledge. She's uncovered how some of our most trusted retailers have been using facial recognition on their customers, and shone a light on why laws around this emerging technology really do need to change. So Kate, welcome to the podcast.
[Kate] Thank you for having me, I'm glad to be here.
[Val] Well, okay, help us explain, I guess, or help me understand what is facial recognition technology and how does it actually work?
[Kate] So I think the simplest way to think about facial recognition technology is as a kind of add-on feature to cameras. We're all familiar with cameras, CCTV cameras for example. Facial recognition is basically a software, an AI software, that sits on top of those cameras. What it does is use the images it captures, along with some measurements, some mathematical calculations of people's faces, to be able to identify those faces. So what it produces is something called a face print. The best way to think of a face print is that it's like your fingerprint. Your face print is unique to you as an individual; it's another way of identifying you. And it's basically created from the image that the camera captures. So there's a couple of different ways that it can be used. The first is what you would call a one-to-one match. For example, you might have an existing database of faces, images of people's faces. If you think about the smart gate at the airport, all of people's passport photos are already on file in that database. Then you come up to the gate, it looks at the image of your face, and it matches you directly to the face in the database. The other way it might be used is by detecting people within a crowd; we call that "one to many". For example, they might have a small database of, say, just 10 people that they're looking for. It might be 10 terrorism suspects, and they wanna see if those people are entering a public space, which could be an airport, or a stadium, or somewhere like that. In that case, the facial recognition software scans the whole crowd, and then tries to identify whether anyone there matches the 10 people in the database. And then there's a third way that it can be used, which is what they call "facial analysis".
This is probably one of the more controversial uses, in that it's assessing people's faces, not to identify who they are, but to identify what they might be thinking and feeling. So this might be saying, "This person's feeling sad, or this person's aggressive, maybe you need to watch out for them."
[Val] Wow, so you mean that we can now actually tell how someone might be feeling just by using facial recognition, like if I'm happy or sad? That's incredible.
[Kate] Yeah, I think we're certainly seeing very big advances in computer vision.
[Val] Another thing, and I had my own experience of this, is when I was flying back from Los Angeles to Sydney. Normally when you get to the gate, you have to show your boarding pass and your passport before you get on the plane. But I was amazed that this time, and actually a couple of times now, I just walked up to the gate. I stood in front of a camera, my face appeared on a screen, it scanned it, and then I was free to walk onto the plane. So I didn't even show my passport or my boarding pass. It really is amazing to see where this technology is popping up, and how trusted it actually is.
[Kate] Yeah, that's right. In that kind of one-to-one matching, they already have your image on file, 'cause they'll have a copy of your passport, or they'll have that image available to them, and then it matches that directly with your face. It is a very secure and a very accurate method. The accuracy of that type of facial recognition is around 99%, so it is much more secure than many other ways of identifying people. Certainly more than the old-fashioned way, where you'd greet a customs officer at the airport and they'd have a look just with their eyes: they'd glance down at your passport for two seconds, and then they'd look up at you. This kind of technology's much more accurate than, say, a human in that exact same scenario. It's when we get to spotting a face in a crowd that you see the accuracy go way down. But certainly, that one-to-one matching is very, very accurate and very secure.
[Val] I see. Well, I do miss getting the passport stamp, but I guess the accuracy is better and it doesn't involve having a person have to check, so.
[Kate] So yeah, it's about the amount of data points that you can now collect using facial recognition. When we think about a fingerprint, it might be thousands of lines. What we're talking about with a face print is millions and millions of tiny dots on your face, almost as if someone had drawn millions of dots across it, so it can detect even the smallest movements. That means it may be able to detect things like a slightly raised eyebrow, or your eyes darting away instead of looking directly at the subject, which might suggest that you're lying. So signs of deception, but certainly things like smiles and frowns. The other thing they claim to be able to do is identify people's gender and their race from those sorts of indicators as well. But as I said, that is a slightly controversial use. Probably the less controversial use is the direct one-to-one matching, when it's matching one photo to another.
[Val] Right, okay. But what other ways do we have facial recognition technology sort of used in our everyday lives? I think my phone uses something like that to unlock. Is that true too?
[Kate] That's right. So probably the most common one is using your facial information to unlock your phone. That's actually quite a useful, beneficial security feature. No one's going to steal your face, whereas they might look over your shoulder and see your passcode when you're entering it. So it's certainly a more secure way to unlock your phone. Another thing you might see is in your photo collection: you might be looking through your photos and it'll say, "Oh, is this so-and-so?" for a person you've already tagged, like your mum, or your dad, or your grandchild. That's using facial recognition to identify those people in your photo collection. And it's also pretty commonly used for things like filters in social media to manipulate people's faces. Again, that's taking an image of your face, working out all of the different points, and then applying a filter over the top. So those are some pretty benign uses of facial recognition in our everyday lives.
[Val] What happens if I grow a beard, or I'm wearing glasses, or change my glasses? I mean, how does facial recognition figure out that it's still me? Is it still accurate that way?
[Kate] Yeah, so this is one of the kind of, I guess, amazing things about where the technology has gotten to now, is that, yes, it can still identify you with a beard, if you're wearing a mask, if you put on sunglasses and a hat, and try and disguise yourself. It is still very good at recognising faces, and that's just because of the large number of data points that it collects. So as I said, it's millions and millions of data points across your face. So it doesn't in fact need your whole face to be able to determine who you are, because it's got so many of those data points.
[Val] So Kate, even banks use facial recognition technology, so I assume that it's quite a robust and secure technology. How do they use it?
[Kate] So yeah, we've seen some banks, particularly overseas, use it for authentication. In the same way that you would use it to unlock your phone, it's a secure way to make sure that you are who you say you are. So instead of using something like a password that can be stolen, you can use your face ID or your face print to authenticate a transaction, whether that's a deposit, or a withdrawal, or just moving some money around. So that's another way that it's being used. We're likely to see that extend here. There's certainly a lot of interest at the moment in Australia in what they call a digital ID, which is using your facial information as a form of identification. So you might be able to use it not just at the bank, but in place of your licence or your passport when you need to identify yourself in a verified way.
[Val] Let's try playing out some scenarios. So how would facial recognition be used for policing, for example?
[Kate] So yeah, one of the ways that police use it most commonly is to identify people who might be on a watch list. So for example, a terrorism watch list. They might have a group of people, and particularly if they know that there's a crowded event coming up, maybe a sporting event like the Commonwealth Games, or maybe a political event, they can use facial recognition in real time to identify people who are on the watch list as they come into the space. So obviously, that has some good benefits in terms of security and trying to prevent those sorts of incidents before they happen.
[Val] And what about clubs? I understand that they're an advocate of using, obviously CCTV and facial recognition technology as well.
[Kate] Yeah, so we've seen clubs start to roll this out. One of the reasons why clubs are using it, if we look at South Australia for example, is that facial recognition is actually mandatory for any gaming venue with more than 30 poker machines. And that's for the purposes of gambling harm prevention. So the way that works is that someone who is on the self-excluded list gives their photo over to be included in the database. That way, if they then go into a gaming venue, it's able to identify them quickly and remove them from the venue. But some other potential uses, and ones that I think we might be a little bit more suspicious of, are that it can also be used to identify high rollers, for example, or patrons that they think might be more likely to spend more money. They can build a profile of where that person moves throughout the club and what they buy, and then potentially suggest future things for them to purchase, or offer them certain incentives to stay in the club and spend more money.
[Val] I see, so this is like taking the loyalty card idea to the next level, whereas now they just have to see you, and they can pretty much connect all the dots together and build an even fuller picture of how you behave in there.
[Kate] That's right. And I think the main difference between using your loyalty card and using facial recognition is you choose when to use your loyalty card, but you don't always choose when they're using facial recognition. And it can be something that's used on you without your knowledge. And again, that's I think another case for why we need some pretty good strong legal guardrails around what's an appropriate use, and what isn't, and what you should seek consent for.
[Val] Well, why don't we talk now about how your image or your face print can be collected even without your knowledge, and then you can be scanned or searched for without your permission. I mean, this is possible, right?
[Kate] Yeah, that's right. So with a lot of those examples that we've just talked about, for example the one where you open your phone, you've consented to that. You know about it because you were asked when you set up your phone. It said, "Would you like to use your face?" and you could say yes or no at that point, and it asked to do a little scan of your face two or three times to make sure it got the right image. But there are more and more uses of facial recognition popping up where you don't know about it. The most common is when it's an addition to CCTV monitoring. CCTV is pretty widely used in many, many public spaces, and also in what you'd call commercial spaces, for example retail settings, or stadiums, or other kinds of open-access spaces. So we're pretty used to having CCTV. What we're not used to is a technology like facial recognition, its accuracy, and what that enables in public spaces. And that's why there's been quite a lot of debate, and concern, and good discussion around the risks and the things we need to be aware of when it's used in these public spaces.
[Val] So as you were mentioning before, in terms of organisations needing to let us know that they're using facial recognition technology or surveillance, I mean, what are their current obligations right now?
[Kate] So we do have some laws around facial recognition, to the extent that it's considered sensitive information under our privacy laws. What that means is that organisations are supposed to not just inform people, not just notify people that the system's in use, but seek their consent, and seek it in an informed and direct way. And this is where it can be quite tricky with facial recognition, particularly in those crowd-based systems. For example, if you think about the context of a stadium, you might have up to 10,000 or 20,000 people in that space. It's actually very, very hard to ensure that all 20,000 people are aware of the use of facial recognition in that space, fully understand what it means, and then consent to that situation. Particularly when you think that children, and potentially people with intellectual disabilities or different levels of cognitive ability, are also coming into the space, and perhaps can't always consent to or fully understand the technology that's in use. And that's why we need some stronger laws, really, to protect people in those circumstances.
[Val] I guess, I mean, do we know who is doing this? I mean, is there a way that we can tell it's happening? I mean, what's currently out there today?
[Kate] So if you wanna find out whether someone is using it, what you can do is review the privacy policy online before you go. Obviously, looking up the privacy policy before we go down to the local supermarket is not something we're all in the habit of doing. But it should be in the privacy policy, and they should also have some notifications up at the place where it's being used, and there might be some other kinds of conditions of entry. Certainly, this is one of the things that CHOICE is most concerned about: keeping consumers really informed about where this technology is in use. And I think there's a lot of work that we can do to get better consent and to inform people. So last year, following our CHOICE investigation, we actually made a complaint to the Office of the Australian Information Commissioner, who is also the Privacy Commissioner, alleging that Kmart and Bunnings were breaching the Privacy Act by not getting adequate consent. What they were doing was just having a very small notice at the front of their stores as a way to inform their customers. We surveyed over a thousand Australians at the time about their knowledge of retailers using facial recognition, and basically no one knew about it. So we think that demonstrated that they didn't do enough to get people's consent to collect this kind of information. On the basis of that complaint, the Office of the Australian Information Commissioner has opened an investigation, and we're expecting to see some findings before the end of the year. A lot of people were quite angry, I think, with Kmart and Bunnings. These are pretty trusted brands; I think there wouldn't be a person in Australia who hasn't shopped at at least one of those stores. And they said they were using it to identify people engaging in antisocial behaviour, and also for loss prevention, so to identify thieves, which seem like legitimate reasons.
But the question is, who's overseeing these databases? How do you end up on one? What if they get the wrong person? What kind of remedies would you have if they do misidentify you or if you've been falsely placed on that database? Are they referring this information to the police, for example? What kind of real world consequences might there be for people? And then again, how securely are they storing this information on these databases? I think one of the biggest problems at the moment is that we're just trusting businesses to do the right thing. And that's why we need stronger regulations specifically around facial recognition, because it's your face, it's your face print, it's a sensitive part of you. Your face is a bit different than other types of personal information. You can change your email address if it gets involved in a hack, you can change your phone number, but you can't change your face. So we do need to be extra careful about how we treat that information.
[Val] Absolutely. I think if we think about our fingerprints, we'd certainly want some control over who's taking them, where they're going and how they're being used, and our faces are obviously exactly the same. Plus, it'd be really frustrating if, say, you were going to Bunnings to do some shopping and they say, "Well, we're recording you and we'll be using facial recognition to identify you", and your only option is to not go to Bunnings. And it's like, well, I need to go there to get some tools, so what are my options here? Do I just not go?
[Kate] Yeah, when we've spoken to consumers, when we've spoken to the everyday people shopping in the Bunnings and Kmart, and certainly other retail spaces or spaces where facial recognition is being used, they would like the option to opt out. And in that use in those crowd settings, it's very, very difficult to opt out. Unless, as you say, you just don't go to Bunnings, or you just don't go and see the football game, or you just don't shop in that shopping mall. So it is really, really hard to avoid. We don't have that option at the moment to opt out.
[Val] Absolutely, and I think that's why it's great to hear that the Privacy Act is actually under review right now. Because our privacy needs to be protected, but at the same time, technology just moves at such a fast pace and it can really change the way we do things. So it's great to see that there's some opportunities there to protect us when it comes to facial recognition.
[Kate] That's right. I think this Privacy Act review is a really good opportunity to reset the balance in favour of consumers. The Privacy Act's been around a really long time; it was created in the '80s, and I think we can all see how rapidly technology has changed. The kind of information that can be found out now, that can be collected, is radically different from when the Act was first designed. So I think the timing is really right for us to get the settings right, and to introduce some clear guardrails so that this technology can benefit everybody without harming people.
[Val] Hmm, because I think the more information that is collected about us, the more our privacy is at risk. And what I mean by at risk is, for example, that information being stolen. A hacker or a thief could then steal your identity, so there's always a greater risk of identity theft there. And with the recent data breaches that we've had, some of Australia's biggest companies have lost our private information to hackers and thieves. So it really does make you think about what happens if a facial recognition database gets hacked. What would happen there, what does that look like?
[Kate] Yeah, so as we mentioned earlier, you can change your username or your password for your email pretty easily. You can even change your address. You probably don't want to, but maybe you could use a PO Box or something to get around the fact that your address may have been leaked in a data hack. But you can't change your facial features, you can't change your digital face. So for that reason, it is critically important to keep it secure. If we were to see a hack of a facial recognition database, that could potentially be quite damaging, because it means that people may be able to spoof, that is, make a false version of, your face. And as we were talking about earlier, there are all the different ways that you use your face to authenticate your identity: getting into your phone, authenticating your banking information, using it in place of your passport. So that could potentially be quite dangerous and quite problematic. It's really critically important that we have very strict rules around facial recognition databases, and that we have really strong oversight by regulators of how these databases are used and how secure they are. And it's important to have the threat of a big penalty if businesses don't do the right thing and don't keep your information secure.
[Val] Yeah, that's right. Just like your credit card details or your postal address, companies that have collected this information from you have a duty of care to protect it. And I think we need to be mindful about who we're giving our face print away to. An example of this was something that happened a while back with an app, I think it's called FaceApp. It can make you look like you're 50 years older, or how you might have looked as a child. So it's just this fun toy that you can play with, but what you're doing is supplying your face print to the app developer, and it goes into a database somewhere. This was such a popular app that suddenly it had millions and millions of faces in there, but you have no idea what they're gonna do with it, or whether they're keeping it safe, or anything like that. So I guess we need to be mindful of who we give our face prints to.
[Kate] Yeah, that's another really important consideration. There's certainly been some concern around the ownership of some of the apps and the social media apps. One of the largest facial recognition companies, and one that was in fact successfully prosecuted by the Office of the Australian Information Commissioner, that people might have heard of is Clearview AI, a facial recognition company founded by an Australian, which claims to have the largest database of faces in the world. And that was, actually...
[Val] Yes, and where did they get that database from?
[Kate] Exactly.
[Val] Where did they get all of these face prints?
[Kate] So they scraped social media. That's in fact what the Information Commissioner found against them, and they agreed to delete all Australian users from their database. But essentially what they did is look across people's social media, so potentially their Facebook profile images, but also people's images elsewhere on the internet. So if they'd been, say, in a news article, or maybe their social club, cricket club or bowling club had put up some images of, say, their Christmas party on its website, it would scrape all those sorts of images off the web without people's knowledge or consent. It then used those images to create the database, and used the database to create a product that they sold, quite profitably, to governments and to police forces around the world.
[Val] I remember hearing about that. It was even used by Australian police organisations to help find criminals. So even though the intent, I guess, was for good, how they collected that database to begin with was the real issue there with Clearview.
[Kate] Yeah, that's right. And that's why the Information Commissioner ultimately found against them, and requested that they delete that information, because that was an inappropriate and unethical way of collecting that kind of information. And those are the kinds of rules that we need: people need to know what they're giving their face print away for. They need to know how the databases are made, who's got oversight of them, and how secure that information is. And then, when is that technology gonna be used to identify me, and in what way? Can I opt out, and can I know who and what is accessing it?
[Val] Right, well, it does seem that even some of the big tech companies acknowledge that the regulation's currently lagging behind the technology, and are even pulling back from some facial recognition products, like Microsoft, or Amazon, or Facebook.
[Kate] Yeah, so I think the fact that we've seen some of those really large tech companies, as you mentioned, pull back from the technology kind of indicates how potentially risky it is. And certainly, we've seen a large public outcry in response to the Bunnings and Kmart investigation. So I do think the time is right to pause the use. Certainly, the Human Rights Commission has called for what they're calling a moratorium on all use of facial recognition until we can have appropriate laws in place. So yeah, I do think we're likely to see an increasing level of support for good regulation in this space.
[Val] Well, it's great to hear that there really is some attention and changes to the regulation coming to help put some safety in place for us, and a good framework for companies to operate under, and to give us more access and control over what might be happening with facial recognition. But I guess, I mean, as an expert, Kate, I mean, you've studied this for a long time, you've got great insight into how it works and where it's heading. I mean, what's your opinion of facial recognition? Is it something that we should be happy about? Is it something that we should be fearful about? I mean, how do you view it in the long term?
[Kate] Yeah, I think it's a great question. I think it's pretty much like any other type of technology: you get the good and the bad together. There's nothing inherently good or evil about facial recognition; it's all about the purpose and the use. Here at CHOICE, our purpose is to fight for fair, safe and just markets, so what we wanna see is a fair, safe and just use of facial recognition technology. And that's certainly something that I think is possible if we get the right laws, guardrails and regulation in place. But some of the ways that we've seen it used recently, those examples in the retail space, and even some of these concerning uses, say to identify high rollers in gaming venues, are certainly on the more problematic end, and certainly what we think is a bit more unethical. But the technology itself, I think like most technologies, is pretty benign. It's all about whether we put it to good use or bad use.
[Val] Well, thanks Kate. I think this has been a really interesting chat, and it really helps bring people up to speed on where facial recognition is now, how it can be used, and where the legislation is going. It was also really excellent to hear how CHOICE identified the way it was being used by those retailers, protecting our privacy that way. So again, thanks for joining us, Kate. I really appreciate having you on the show.
[Kate] Thank you so much for having me. It was great to talk about it.
[Val] Well, thanks for joining me for this super interesting episode of the Be Connected Podcast. And Kate, once again, really appreciate you sharing your insights and knowledge. And if you like what you've heard, please subscribe to receive all of the latest episodes, and even leave us a review to help others find us if you're listening on a podcast platform. And remember to visit the show notes for more information on anything we've covered here today, including links and other useful material. And for more about today's subject, and to discover other great topics too, go to www.beconnected.esafety.gov.au. That's www.beconnected.esafety.gov.au. I'm Val Quinn and I look forward to your company next time.
[Voiceover] Be Connected is an Australian government initiative developed by the Department of Social Services, the eSafety Commissioner, and Good Things Foundation Australia. Be Connected builds the digital skills, confidence, and online safety of all Australians with engaging online learning resources and a network of over 3,500 community organisations to support them to thrive in a digital world.
[Val] Any views expressed in this episode are strictly personal views only, and do not in any way reflect the opinion of the Australian government, the eSafety Commissioner, or the Be Connected program.
You can also listen on your favourite podcast platform.
Show notes
Guest: Kate Bower
Facial recognition. You can use it to unlock your phone and organise your photos, and in places like airports, stadiums, and banks it’s being used to add security features and convenience. But who exactly is watching you, and why? And do they have your permission? In this episode, Consumer Data Advocate at CHOICE, Kate Bower, highlights some of the concerns and benefits of the technology.
Disclaimer: Any views expressed in this episode are strictly personal views only and do not in any way reflect the opinion of the Australian government, the eSafety Commissioner or the Be Connected program.
Discover more:
- Sign up and build your digital skills for free at Be Connected: https://beconnected.esafety.gov.au
- Watch a video about using FaceID on your iPhone
- Listen to our podcast episode on Artificial Intelligence (AI)
- Learn how facial recognition is used to secure your bank accounts
- Read the CHOICE magazine article about retailer use of facial recognition
- Learn about Kate Bower
- Kate Bower on Twitter
- Valens Quinn on Instagram
- Valens Quinn on LinkedIn
- Listen to more podcast episodes
- Join us on Facebook
Be Connected is an Australian Government Initiative developed by the Department of Social Services, the eSafety Commissioner and Good Things Foundation Australia. Be Connected builds the digital skills, confidence, and online safety of all Australians with engaging online learning resources, and a network of over 3,500 community organisations to support them to thrive in a digital world.
Be Connected acknowledges the Traditional Owners of the land on which we live and work, and pays respect to their Elders, past, present and emerging.