I mean, it's dizzying. There is Teams and there is BlueJeans and there's Skype and there's Zoom and they all work differently. And then we started seeing more seniors asking, "How do I do online banking? How do I do online shopping?" So we saw a huge take-up of our services and modules.
[Val] Hello, and welcome to the "Be Connected" podcast. I'm Val Quinn, and I'm a technology commentator, broadcaster, publisher, and the host of the "Be Connected" podcast. Australia led the world by establishing the first specialist agency dedicated to online safety. Ever since, the independent eSafety Commissioner has been educating and delivering services with the purpose of helping to safeguard all Australians from online harms. It's an initiative that is inspiring similar models in other countries, and which also fosters the Be Connected program and podcast you're listening to. Our guest today is Australia's inaugural eSafety Commissioner, Julie Inman Grant. She took up the role following a career that included working with the US Congress and senior positions at Microsoft, Adobe, and Twitter. In this episode, Julie talks about the mission to ensure positive online experiences and create healthy internet habits for everyone. Welcome, Julie.
[Julie] Excellent. Thank you so much for hosting the podcast and for having me as a guest.
[Val] Well, Julie, maybe we'll just jump in a little bit and just talk about the eSafety Commissioner and really why it was set up and what does it do. So can you help take us through that?
[Julie] Yeah, sure. Well, the listeners might remember that there was a reality show called "Australia's Next Top Model", and there was a beautiful compere named Charlotte Dawson, who was very active on social media, and she was also very open about her struggles with mental health. And she was terribly trolled on Twitter. Had a nervous breakdown that was quite public, got some help, came back on to Twitter. I remember that because I was interviewing for a role with Twitter at the time. They were just opening an Australian office and they were looking for a head of public policy, philanthropy, and trust and safety, and I remember seeing some of the commentary. Just terrible, terrible invective. And tragically, she ended up taking her own life and it was referred to as the Twitter suicide. That started a petition that went to government. And at the time, Malcolm Turnbull was the ICT minister, the parliamentary secretary was Paul Fletcher, and Tony Abbott was PM. But the Australian public said, "Enough is enough. We need to draw a line in the sand here and government needs to intervene, because with this online abuse we have no recourse and the platforms aren't doing enough." And so what Malcolm and Paul did, as the primary architects of the eSafety Office, is they started with the Children's eSafety Commissioner. You know, we have a long history of real pushback on anything that resembles censorship. And so they thought, well, we've got a longstanding scheme around illegal online content, particularly around child sexual abuse material and pro-terrorist content. Let's set up the first youth-based cyberbullying scheme so when companies fail to act when young people report to them, this eSafety Commissioner can serve as a safety net and compel that takedown. We'll start with children because nobody can argue that they're not more vulnerable. Alastair MacGibbon was the first eSafety Commissioner.
He was only in that role for about nine months, and I started the role in January 2017, and by June, I went from the Children's eSafety Commissioner to the eSafety Commissioner. And one of the first programs that we had covering all adults was Be Connected for older adults, because research had told us that the least represented population online was not children, was not Indigenous communities, it was Australians over the age of 65. And we also knew that they were a much more trusting generation, and that scams and all forms of social engineering were starting to take off and target older Australians, and we were seeing terrible situations. So that's where we actually started engaging with adults on that upper end of the spectrum. And then we started taking on programs like eSafetyWomen, addressing technology-facilitated abuse against women and children, who are disproportionately targeted with online harassment.
[Val] Okay, so Julie, how can older Australians help adopt these healthy habits for all generations and their families? So not just for themselves, but also maybe for their grandkids. You know, what's happening there and how can they learn about what to do?
[Julie] Well, at eSafety, we talk a lot about "sharenting". I suppose we should be talking about grand-sharenting a bit. And, you know, the whole idea is that your digital footprint or your digital reputation is created by all the things you do and say online as well as what others share about you. And, you know, this actually starts before a child is born. I mean, how many sonograms have you seen on Facebook? And then there's that first obligatory baby shot and the first day of school and all of this type of stuff, which is great. I remember being mocked by the media when we came out four years ago with an early childhood learning guide that basically said, "Hey, you should model good behaviour for your children. When you're taking a picture of them, you know, on their first day of kindergarten, you might want to ask them for their consent. Ask them if it's okay if they post it to Facebook so that their grandma overseas can see it." And people said, "Oh, you know, you have to ask your three-year-old to take a picture?" And that wasn't really the point. It was really about modelling responsibility, the idea of consent. I mean, you can't tell your kid to get off Fortnite when you're scrolling through your Twitter feed at the dinner table. But the other thing is, you know, I was joking about grand-sharenting, but there's a very, very high proportion of grandparents in Australia that take on caring responsibilities for their grandchildren because childcare is so expensive. I think it's something like 25%. And so I was saying one day to the team, "Gee, I loved going to my grandma's house 'cause she'd give me candy and all this food that I couldn't eat at home, and I got to watch TV that I couldn't watch at home. I bet kids are getting away with a lot when they're using technology." So we came up with this whole idea of a grandparents' guide, and we had two very famous grandparents help us launch it.
One was Jimmy Barnes and the other was Malcolm Turnbull. So that was pretty exciting.
[Val] Absolutely. The grandparents' guide helps educate and inform grandparents so when they are with their grandkids, they're across these things that you mentioned and they can model the right behaviour, because children are online now before we know it, and it's just as important for grandparents to model safe behaviour. And without that awareness, that's very difficult to do. So there's certainly great stuff in the guide, anything from managing online time to protecting that digital footprint, like you said, to sorting fact from fiction, especially with generative AI. That's becoming a really, really big topic. Cyberbullying, all of that stuff. So this is a great guide written in easy-to-follow language. I think it's such a good idea. So a really, really good one there, I think, Julie, to help educate older Australians.
[Julie] Thank you. I mean, I think it's also important to note one of the key challenges for us: when I came into this role, only 50% of children would confide in a trusted adult when something went wrong online. And so a huge focus of ours has been encouraging reporting to the website, reporting to eSafety, talking to a trusted adult. And often children will prefer to talk to their grandparent rather than their parent, because the parent takes on that disciplinary role. So we're providing those grandparents with the information they need: what is cyberbullying? What is image-based abuse, or the non-consensual sharing of intimate images and videos? And if they are confided in, what kind of guidance can they give their grandchildren? What can they do to help them work through it, including reporting to eSafety if something has gone wrong? And I have to note here that I often talk to my own kids about the grandma test. I've got two 11-year-olds, twins that don't have phones yet, but, you know, they're occasionally using mine, so I can mostly see what they're doing. But just because something can be said or can go online doesn't mean it should. And so I often ask them, "Well, what would Grandma Jean or Granny Glenda think?" Use that as a barometer for what you want to say about yourself or what you would share online.
[Val] Yeah, that's a really good way of putting it in perspective, because I think we are living in a world of oversharing, potentially, and online is a big, big audience.
[Julie] And once it's there, it could be potentially there forever. It's kind of out of your control.
[Val] Well, that's right. And I think that's a really good point. And that kind of segues into what the commission actually can help do, and I know has done, to help remove that kind of information online, and how eSafety can interface with big technology companies and set, I guess, the boundaries for what they are able to do. Can you talk a little bit about that and how eSafety can actually inform big tech and keep an eye on them?
[Julie] Yep, that's right. Well, I talked about that paradigm of prevention, protection, and proactive change. Through the Online Safety Act, we have a number of what we call individual complaint schemes where we can help individual Australians who are experiencing different forms of personal harm get that content taken down. But we've also got powers to compel removal and fine both the platforms and the perpetrators if they fail to respond to a formal regulatory notice. And so these schemes are in the form of a youth-based cyberbullying scheme. We've just used some powers around what we call end-user notices, which serve as a deterrent: it's basically sending a notice to a young person, with the support of their parents and their schools, to say, "Cease and desist. Apologise to the person." And this was around a case of a 14-year-old girl. There were six young boys, and she said no to a date with one of their friends, and they started making dating and rape threats and piled on her. And we thought it was important enough, working with the school, to actually send a message that you can't do this with total impunity and there can be penalties and we're watching, but to do it in a way that's supportive. We also have image-based abuse powers, for when intimate images are shared without consent. We've had seniors report to us. We have a 90% success rate in terms of getting that content taken down from sites all over the world. And a lot of it is done in collaboration with the industry, because often it is violating their terms of service and it's violating the law. One of the newer schemes, which is a little bit trickier, is the serious Adult Cyber Abuse Scheme. The bar there is set very high, because it doesn't cover defamation or harm to reputation.
But when content is reported to a platform and doesn't come down, and is then reported to us, and we can prove serious intent to harm an Australian individual, and that it is menacing, harassing, and offensive in all cases to an ordinary, reasonable person, we can compel the takedown of that. And we've used those powers a number of times. We also have what we call systemic powers. So the Basic Online Safety Expectations, which are the rules of the road we expect the companies to follow in the way of safety. And I've been able to issue transparency notices to everyone from Microsoft, Meta, who owns Facebook and Instagram, Apple, Twitter, TikTok, Google, and others, to say, "What are you actually doing to detect child sexual abuse material and prevent sexual extortion, and are your algorithms being used to serve up harmful content?" So really potent powers that don't exist anywhere else in the world.
[Val] And you can issue legal notices to big tech companies to get them to answer tough questions about how they're tackling these types of things as well. Is that something that is happening now?
[Julie] Yes. Yes, and we just tested those powers for the first time last year. We put out a pretty damning report in December of 2022 about the extent to which Microsoft, Apple, and Meta and a few other companies were not using technologies to identify and remove child sexual abuse material. They were basically letting crime scenes happen on the platforms. And that also applied to, you know, Skype and FaceTime and these videoconferencing services, which have long been known to host livestreamed child sexual abuse, particularly from countries like Indonesia and the Philippines. So that's resulted, actually, I mean, one, it lifted the lid on what is and isn't happening, that crime scenes are literally being allowed to happen on these platforms. But we'll be able to measure over time whether or not they're making improvements. And it's also led to really constructive conversations, because we've been asking these questions for years, and we can fine them up to $700,000 a day if they don't answer these questions truthfully. So they are potent powers. You know, we have to put in a lot of research, we have to do a lot of statements of reasons, use a huge evidence base. It's not something that you just wield willy-nilly. But it's important to get that transparency. They say that transparency, or sunlight, is the best disinfectant, and we've certainly found that. And companies do respond to risk to their reputation, because as you've seen with Twitter, when you rely on advertising, if your brand is bad and advertisers aren't paying, there goes your business model.
[Val] Hmm, and just for the average person who runs into some online abuse, or has something shared that they didn't want shared, what would you say is their best port of call? Where should they go? What should they do? Should they come to the eSafety Commissioner, or should they look to, say, the social networking platform first? Where would you advise-
[Julie] Well, I'd say the most expedient way to get any form of abusive or objectionable content taken care of is to report in the first instance to the platform, whether it's a social media platform or a gaming platform. We've even seen bullying on Spotify, you know, songs and song lists created to target specific children and to humiliate them. So it can happen anywhere. That's the most expedient way to do it. It's also their responsibility. Really, we're set up as a safety net. We're not moderating the entire internet. We're not proactively monitoring. We're there to support Australians when, in most cases, they report and it's not taken care of. And that's what these powers also do. If companies aren't consistently and effectively enforcing their own policies, we'll call them up on it. If they're making it too hard for people to report different kinds of abuse, that's part of what safety by design is. There are three principles there. Service provider responsibility, and being very clear about what that is. Providing user empowerment and autonomy: making sure that there are parental control tools that are easy to use, that there are easy and intuitive and discoverable reporting pathways that are clear and that actually get responses. And then transparency and accountability. You can't have accountability if we're not transparent about what's happening under the hood. So all of these things are important. You know, we've just set up a global online safety regulators network. We now have an online safety commissioner that just started in Ireland and in Fiji, and the UK's got an online safety bill. The Canadians are considering legislation. Europe has the DSA. So soon we'll have a network of online safety regulators that we can work with.
We'll all have different powers, we'll all have different vantage points, but, I mean, we've got to work together to really counteract the stealth, the wealth, and the power of a hugely previously unregulated technology sector to do this effectively. I mean, the internet's global. We're a small agency. You know, we don't have thousands of investigators trawling the internet. So, you know, we do need to encourage people to report to the platform first and then come to us if it's not taken care of.
[Val] Okay. That's right. So they should look to the platform that they're on. There's usually a method of reporting abusive content, or of actually getting in touch with the platform, that should be obvious and clear. There should be clear community guidelines about what's okay and what isn't. And then from there, the eSafety Commissioner makes sure that there's a framework in place and that these companies are making it possible for you to get what needs to be done done, I guess.
[Julie] Yep, esafety.gov.au's all you need to remember. I think right on the front page, it'll say Report Abuse. It explains our different regulatory schemes. The other thing that's worth noting, if you ever feel like you're being physically threatened or there's criminality, of course you go to the police, but you can also come to us and we have MOUs with every state, territory, and the federal police and can assess and then refer on. But we also refer on to mental health services or to legal aid and try and provide what I call compassionate citizen service and wraparound care. Sometimes people just need to talk to someone and we'll provide strategies to people who come to us about how to better protect themselves given whatever their situation is.
[Val] So Julie, I've also heard about the Young Mentors program. Can you explain how that works and what the benefits are?
[Julie] You know, believe it or not, gosh, it was almost 25 years ago when I was at Microsoft, I was involved in starting a seniors online program. I remember being in high school, and one of the volunteer activities we used to do that I loved was going to the local aged care home for the dances and just to talk and engage. As we were talking to older Australians, we kept hearing, "Most of what I learn is from my children or my grandchildren, and sometimes I feel sheepish about asking or talking to them." But when you talk to young people, they want to help. They want to be able to impart their knowledge. They love being able to have this intergenerational connection. I know I did, and I do. And that's precisely what we found. So we wanted to come up with a framework and a program where we could facilitate that in a more formal way. And we were about to launch the Young Mentors program right before COVID hit. So very bad timing, when we couldn't actually physically be together. But now we've reinstated that program, and we have lots of different schools and aged care facilities engaged, and we would love to get the word out about that. People want to learn and not be left behind, to be able to communicate with their loved ones and be productive online. And, you know, I think that we've got lots of great games, memory games, and things to stimulate thinking on the Be Connected site. I think there's really something for everyone.
[Val] Absolutely. I mean, what would you say is your favourite topic? Is it banking or, I mean, just how to connect your devices? Do you have one in particular?
[Julie] I've actually used it with my own parents. Like I said, there's something for everyone from the very, very basics, but even really important things about passwords and password security and how to spot scams. Really important because, unfortunately, scammers do play on our best instincts, but also our worst fears. And we just need to make sure that we're staying a step ahead and engaging in protective behaviours.
[Val] Absolutely. I mean, even as a tech expert, there are plenty of holes in my knowledge, and it's important to take the time to fill in those holes. And that's what I love about Be Connected as well: you can do it at your own pace, at any level. It's free, it's easy to access, and there's lots to learn, and there will continue to be lots of things going on in technology that we need to keep up to speed with. So yeah, it's a great place to be.
[Julie] A hundred percent.
[Val] Well, thanks for joining me for this episode of the "Be Connected" podcast, and thank you, Julie, for your company today. It's been really, really great having you with us and learning about all the things you've been doing with the Be Connected program as well as eSafety. If you've liked what you've heard, please subscribe to receive all of the latest episodes, and even leave a review to help others find us. And please remember also that we have show notes with information on anything we've covered here, including a link to the grandparents' guide and some information about how you can be involved with the Young Mentors program. So check out the show notes for these links and other useful material. There have been some discussions of suicide and self-harm in this chat. We care about your wellbeing and want to ensure that everyone in our community feels safe and supported. So if you or someone you know is struggling, please know that help is available. In Australia, you can contact Lifeline, Beyond Blue, or the Suicide Call Back Service for support. You can find links to these resources in the show notes. For more about today's subject and to discover other great topics, go to www.beconnected.esafety.gov.au. That's www.beconnected.esafety.gov.au. I'm Val Quinn, and I look forward to your company next time.
[Announcer] Be Connected is an Australian government initiative developed by the Department of Social Services, the eSafety Commissioner, and Good Things Foundation Australia. Be Connected builds the digital skills, confidence, and online safety of all Australians with engaging online learning resources and a network of over 3,500 community organisations to support them to thrive in a digital world.