In this episode of the Technology & Security podcast, Dr. Miah Hammond-Errey speaks with privacy expert Anna Johnston about the evolving landscape of data privacy, AI governance, and security. With AI dominating the global conversation, this episode explores why strong data governance is the essential—yet often overlooked—foundation for innovation. Anna warns that companies rushing into AI without mature privacy practices risk legal challenges and regulatory crackdowns, including 'algorithmic disgorgement'.
With new privacy reforms on the horizon, 2025 is set to be a defining year for digital rights and compliance. The discussion covers major legal shifts in 2025, including Australia's new statutory privacy tort and a landmark Victorian court decision recognising a common law right to privacy. The episode also explores the links between privacy, national security, algorithmic polarisation, mis- and disinformation and tech-facilitated gender-based violence, and why privacy law reform is essential to safeguarding democracy. Johnston shares insights on how privacy reform could fix some of these intractable problems. Whether you're a tech leader, policymaker, or privacy-conscious citizen, this episode unpacks critical issues shaping our digital future.
Transcript, check against delivery:
Dr Miah Hammond-Errey: My guest today is Anna Johnston. Anna Johnston is one of Australia's most respected and well recognised privacy law experts. Her blogs and newsletters regularly introduce new ideas and influence policy and legislative reform. She is a partner at Helios Salinger, where she leads a team helping organisations comply with privacy law. She has previously worked as an in-house privacy officer and a privacy regulator. She founded and ran her own company for 20 years, specialising in privacy, compliance and data ethics consulting, training and resources, until its recent merger with Helios. She's won numerous prestigious awards and fellowships. I can't think of someone better to join me and explore privacy with. Thanks so much for coming on, Anna.
Anna Johnston: Well thanks Miah. I'm excited to be here.
Dr Miah Hammond-Errey: We're coming to you today from the lands of the Gadigal people. We pay our respects to elders past, present and emerging and acknowledge their continuing connection to land, sea, and community.
Dr Miah Hammond-Errey: So Anna, research shows that Australians do continue to care deeply about privacy and expect more business and government action to protect it. I want to start by asking you the same question I asked Australia's Privacy Commissioner, Carly Kind. Can you imagine an ideal privacy landscape in Australia in 2050?
Anna Johnston: 2050 feels like a really long way away at this point. I can probably only think about five years into the future, but I would really love to see an environment in Australia and globally where we have really clear, consistent and robust regulation around the use of data about humans. Thinking really big picture, I think that means not only respecting privacy; it's about respecting copyright and human intellectual endeavour. It's about respecting our environment and the carbon footprint of the use of data. I would love to see a world, as I said, not just Australia, where we value and respect both humans and the environment, and our laws, the enforcement of those laws and business practices all reflect that. Unfortunately, at the moment, we're a long way off that point.
Dr Miah Hammond-Errey: Yeah. What do we need to change now to narrow that gap and get us a little bit closer to that vision?
Anna Johnston: Yeah. So we need, I guess, three big things. The first is for legislation to catch up. We need laws that are fit for the digital age; they're not fit at the moment. We need effective and robust enforcement of those laws. That means regulators with decent budgets and the powers, the tools and the willpower to enforce those laws. And then we need both public and private sector organisations to respect and comply with those laws. One of the reasons we say our laws are not currently fit for the digital age is that, whether we're talking privacy, copyright, whatever it is, they were primarily drafted in an era before the internet, or very early in the internet, when the power of big data and complex data linkage and the surveillance capitalist economy was really only in its infancy. For the most part, if we focus on Australia and on privacy laws in particular, our privacy laws have done a reasonably good job of keeping up. They were drafted originally in the late 1980s, deliberately to be technology neutral. If you just focused on the technology that was relevant in the 80s, we'd have laws that still talked about VHS and Beta recordings.
Anna Johnston: So they were deliberately drafted to be technology neutral. They have flexible principles that should be capable of being applied across a range of technologies, a range of use cases, a range of entities, if you like. But there are some critical ways in which our laws have failed to keep up. One of those is what's effectively a threshold definitional issue, which impacts the scope of the laws and how far they apply. Basically, our laws only regulate what fits within the definition of personal information, and that definition is constantly being challenged. Businesses are pushing the envelope about what does or does not fit within that definition. I think it's absolutely critical that we have an updated definition, so that it is clear that all of the activity of businesses and governments operating online, where they're collecting data about humans, is in scope for privacy regulation.
Dr Miah Hammond-Errey: Excellent. So let's get to privacy reform. It's obviously been a long time coming in Australia, and amendments to the Privacy Act were passed in November last year. They've been touted as the first tranche of reform, but it's fair to say there's also been a lot of criticism that the provisions don't go far enough in protecting individual privacy. Before we get to some of the definitional stuff, what are your low points and high points?
Anna Johnston: So my high point for tranche one was giving the regulator more effective civil penalty provisions. I guess a low point of tranche one is just that everything else, the really meaningful stuff, was left to tranche two. Tranche one is useful because it gives the current regulator better power to enforce the current law, but it doesn't really shift anything very significant in terms of what the actual legal obligations are on regulated entities. It doesn't broaden the scope of who is regulated. It doesn't broaden the scope of what kind of data is regulated. And it doesn't lift the small business exemption. All of that is still up for debate in tranche two.
Dr Miah Hammond-Errey: When Australians talk about privacy, we really tend to talk more about the concept of privacy rather than the legislation, and I want to get into the detail with you on some of the provisions. What is the definition of personal information in the Privacy Act, and why does it matter?
Anna Johnston: Yeah. So privacy is such a funny concept, right? There's no single neat definition. Some jurisdictions call what we call privacy law 'data protection law', as a way of distinguishing it from the broader concept of privacy as a human right. Those of us who work in the area tend to think about privacy as having a few different elements. Sometimes people think of privacy as the right to be left alone, or something to do with secrecy, or solitude, or anonymity, or controlling how our information is used. And it's all of those things, but there is no neat definition. We've got a piece of legislation called the Privacy Act, and it doesn't define what the word privacy means. I tend to separate out concepts like the privacy of your body, your physical, personal bodily integrity, versus the privacy of communications, versus the privacy of your behaviour, which is impacted by surveillance, versus the privacy of your personal information. And when we talk about the Privacy Act, it's really only talking about the management of your personal information. I really wish that definition just said all data about humans; it would be much easier. But the actual definition talks about information or an opinion about an individual who is identified or reasonably identifiable. And this is...
Dr Miah Hammond-Errey: Where it gets tricky.
Anna Johnston: This is where it gets tricky, exactly. So first of all, what information is about a person? Is it about a person or is it about something else? We've had some legal challenges to just the word 'about'. We had a case which found, for example, that mobile phone metadata was data about a phone and the network, so the device and the telephone network, not about the human being who used that phone. So that's a contested area. That's one of the proposals: to change the word 'about' to 'relating to', to make it clear that something can be about a phone and about a human, and tell you something about the human.
Dr Miah Hammond-Errey: Yeah. Does that threshold come first, before reasonable identification? Right. Okay. So can you explain how that works?
Anna Johnston: There are three elements. It needs to be information or an opinion, so it can be objective or subjective, if you like, and it can be inaccurate. It doesn't have to be true; it can be inaccurate information or an inaccurate opinion about an individual. So we are talking about human beings. We're not talking about companies, for example; companies don't have a right to privacy. They might have secrecy, they might have confidentiality, but that's different to privacy. So it's about an individual, and as I said, that word 'about' is itself contested. Is mobile phone metadata about just a phone and the company that owns the poles and wires it connects to, or is it about the person using that phone and what it can reveal about that person's movements and patterns of behaviour? And then the individual must be either identified or reasonably identifiable. So you've got to meet all of those different legal tests.
Anna Johnston: So there are two big areas of contention. The first is the proposal to change the word 'about' to 'relating to'. The second is what we mean by 'reasonably identifiable'. There are plenty of businesses that benefit from big data, from the corporate online surveillance economy, who will argue: well, we don't know the person's real name because they're on the internet. To us, they're just a collection of attributes. We know that this person likes to buy yoga mats, that they live in this general area and that they catch a bus to work, but we don't know their name. Plenty of business practices rely on saying, well, we don't know the person's name, therefore we don't know their identity, therefore they're not reasonably identifiable, therefore this definition does not apply to us, and therefore we don't have to comply with any of the rules actually set out in the privacy principles, which are, if you like, the rules for how you manage that type of data.
Dr Miah Hammond-Errey: It's so fascinating, because being identifiable isn't static. It's a transient situation: as soon as more data is aggregated, you can easily be identified. It's an interesting threshold to have as the last threshold, given its significance, I guess.
Anna Johnston: Yeah. This has been a real area of passion of mine, to have this part of the definition reformed.
Dr Miah Hammond-Errey: Before we move on, though, I do want to ask you a little bit about the definition of consent in the Privacy Act. Why does it matter? And could you take us through the recent cases, like the use of facial recognition in Bunnings and Kmart, as examples?
Anna Johnston: So it's important to realise that consent is sometimes described as the gold standard: that you should get someone's consent before you collect, use or disclose their personal information. In reality, if you treat consent as something serious and valuable, which I believe it should be, then you realise that actually the vast majority of things that we do, both offline and online, are not really based on consent at all. I mean, we don't have a choice about whether or not we interact with the tax office. We don't have a choice about whether or not we give our details over to a bank. Other than people who live completely off grid, completely as hermits, we don't really have a choice about handing over our data in order to live a normal, healthy life. So most of the time, routine government or business activities are conducted not on the basis of consent in the sense of someone having absolute freedom to say no and still get the benefit of the good or service that they're after. Most of the time the Privacy Act actually says, well, you don't need consent as long as what you're doing is for the primary purpose for which you collected it, or it's a directly related secondary purpose that the person would expect, or it's authorised by another law.
Anna Johnston: So there's a whole bunch of pathways for organisations to collect, use and disclose personal information, and the consent of the individual is only one of those lawful pathways. Consent tends to be reserved for the unusual scenarios: not the routine activity, but where an organisation wants to do something that is unrelated to its normal business activity, or not necessary in order to deliver that good or service. That's when you should stop and ask for the person's permission, and ensure that their choice is genuinely free. Say I want to transact with an organisation: I want to buy a book online, for example. I'm going to have to hand over some details for payment. I'm going to have to hand over some details to have the book shipped to me. Do I now want to go on the mailing list for the future? That is, or should be, a consent-based question. I should be free to say yes or no to going on the mailing list, but still get the book that I'm ordering online. So when I said you need to value and respect consent, this is what the regulator says consent means in law, but it's not currently spelled out in the letter of the legislation itself, and one of the proposals for reform is to make sure it is. Consent should be voluntary, meaning you're free to say yes or no without some negative impact on you if you say no. It should be proactive, meaning the default is no unless the person gives some kind of positive affirmation. It needs to be current, it needs to be informed and it needs to be specific.
Dr Miah Hammond-Errey: And it needs to happen prior to the collection as well?
Anna Johnston: Well, it depends whether you're seeking consent to the collection or consent to a later use or disclosure. Quite often organisations will have collected someone's information for a certain purpose, and then some time later they want to use it for a secondary purpose. Maybe they want to do data analytics, they want to do research, they want to start training some machine learning. The question is, do you have a lawful authority to do that new thing with the data? Can you rely on any of the other lawful pathways set out in the Privacy Act? Usually consent is your last resort. If you're treating consent seriously, it's difficult to get and it's easy to lose. It's fragile. You need every single person to agree. One of the elements of it being voluntary is not only that someone is free to say no; it means they're free to say yes and then change their mind later and withdraw or revoke their consent. So consent is extremely difficult. It's fragile, and it doesn't scale, because to get someone's genuine consent requires the question to be specific and requires them to be informed; they need to understand all the risks involved. That's extremely difficult, especially in the context of, let's say, reusing data to train AI for machine learning purposes, which people didn't necessarily anticipate at the time the data was first collected. It's really hard to even explain what the possible risks are. We don't even know; it's all in the future, it's all unknown. Often for our clients, we do a lot of work trying to ensure that they have a legal authority to do what they want to do that doesn't rely on consent.
Anna Johnston: Because, as I said, consent is so fragile. It's hard to get and easy to lose. The problem we have now is that a whole bunch of business practices are built around pretty poor practices that claim to rely on consent, but don't really have all those different elements I've spelt out in terms of being voluntary, informed, specific and given by a person with the mental capacity to do so. We see a lot of practices where it's opt out rather than opt in, or something is buried in mandatory terms and conditions, or you have to accept the privacy policy or the terms and conditions in order to do the thing that you're trying to do. Those sorts of practices, to the extent that organisations say, it's okay, we can now go and monetise this data because we got our customers' consent, are increasingly being challenged by the regulator and will be under even more scrutiny and challenge if this proposed reform passes. Now, that's not to say that those business practices will all suddenly become unlawful, but organisations will need to go back and ask, well, what's our lawful authority to do this? If we've been relying on consent, do we need to turn around and actually look at other options? And if we have no other options and we do need to rely on consent, then do we need to change the way that we seek and manage those consents?
Dr Miah Hammond-Errey: So I want to move to talk about data privacy and surveillance. Recent reports have shown the use of real-time bidding data in relation to defence, political decision makers and intelligence officers and their families. What privacy reforms would be needed to minimise these capabilities?
Anna Johnston: So we need several things. We need that definition of personal information to be modernised, so that the Privacy Act can actually do the work that it's supposed to do. That's the first thing. The second is the rules in the privacy principles themselves. The principles are the backbone that explains what you, as a regulated entity, can and can't do in terms of collecting, using and disclosing personal information, and we need those privacy principles also to be updated. Now, what's been proposed as part of the review of the Australian Privacy Act is not a lot of tinkering with the principles themselves, so when I talk about the lawful pathways to collect, use and disclose, there are not a lot of changes there. But the proposal is to lay what's called the fair and reasonable test over the top. It's almost like a screening device: no matter what you're doing in terms of collecting, using or disclosing, no matter which of those lawful pathways you're relying on, your activity still has to pass the fair and reasonable test as well. The nickname is the privacy pub test: what would be considered fair and reasonable by the average reasonable person.
Anna Johnston: So it's not what's considered fair and reasonable by the entity itself, or by a particular customer who's made a complaint; it's what would, overall across the community, be considered fair and reasonable. This proposed reform, I think, has the benefit of being extremely flexible. That obviously makes it more difficult for organisations to comply with than, say, a really prescriptive law, but privacy law has never been prescriptive. It has to be able to flex to the circumstances. If you think about all the types of organisations, all the types of data, all the types of technology and all the types of use cases applying that data to that technology by that organisation, you obviously need to work at a fairly general, principle-based level. Where the fair and reasonable test could do the heavy lifting of bringing our privacy rules into a more modern digital age is that it would enable us to reflect emerging technologies, but also community sentiment towards those technologies, and to reflect different use cases. Take an example like the Bunnings use of facial recognition technology: we've had a determination on this from the Commissioner.
Anna Johnston: And Bunnings, the retailer, has said it will appeal, so we'll see where that appeal lands. This determination is not based on the fair and reasonable test, because we don't have that test yet; it's based on the law as it exists now. But what you can start to see in the determination of Commissioner Carly Kind is that she's starting to get people to think more deeply about the application of certain technologies to certain use cases, because you can't just say all facial recognition technology is bad, or all facial recognition technology is amazing and should be allowed. You actually need to look at the use case and think about what is reasonable in that use case. What is the problem you're trying to solve? Does the technology actually work to solve that problem? You've got to balance that against the privacy harms that might be done in using that technology for that use case, and also think about whether there are less invasive alternatives available. You need to throw all of those things into the mix to come up with an answer about whether the application of this technology, in this way, in this use case, in these circumstances, passes the privacy pub test or not.
Anna Johnston: It just doesn't pass that privacy pub test. We know that the vast majority of people do not read privacy policies. I don't blame them; they're lengthy and they're boring, and I say that as someone whose job is often to write privacy policies for our clients. My point is, consumers shouldn't have to read them. We don't read the airline safety manual before we step on a plane. We trust that the plane has been built and maintained by people with expertise, and will be flown by people with expertise. We don't ask consumers to sign away their rights to safety when they get on a plane or get on a bus or buy medicine.
Anna Johnston: Why are we asking consumers to effectively sign away their right to privacy?
Dr Miah Hammond-Errey: I'm going to move to a segment. What are some of the interdependencies and vulnerabilities of privacy and technology that you wish were better understood?
Anna Johnston: That's a really good question. I guess I'd call out how vulnerable organisations can be if they don't have mature privacy practices. Whether you're a small business, a large business, a nonprofit or government, you can no longer really afford to treat privacy as a nice-to-have. Organisations that don't have really mature privacy practices, meaning they don't really know what data they collect, or they don't have good rules around what data they collect, how long they keep it for, what they're using it for, or ways to test whether it's appropriate to use it in a particular way, I think are really vulnerable to cyber security risk. That's the first point. Sloppy data practices like over-collecting data and retaining it for too long really increase your exposure, your attack surface. I think the big Optus and Medibank data breaches in Australia a few years ago were a pretty clear illustration of that. We've certainly seen a real shift in business thinking since those two big data breaches, with people realising that poor privacy practices equal cyber security risk.
Dr Miah Hammond-Errey: What are some of the new technologies that you see impacting privacy and security?
Anna Johnston: It's hard to have any conversation at the moment without talking about AI. I mean, it's the new gold rush. Plenty of organisations are rushing to capitalise and asking, how can we use AI to improve our efficiency, or build new products, or earn more money, or whatever it is. Coming back to that question about privacy maturity and vulnerabilities, though, I think organisations are making rookie errors and wasting time if they're trying to do things with AI, whether developing or deploying, without first tackling their privacy practices. You need a workforce with really high data literacy, and you need your organisation to have really robust data governance, in order to actually grasp those opportunities and capitalise on those technology advances. So that's...
Dr Miah Hammond-Errey: Unfortunately, the boring part. You know, for most people, they want to talk AI, but not data governance. But it's essential, right?
Anna Johnston: I mean, it's just the building blocks. Good data governance, good data literacy, good mature privacy practices: all of that is the foundation stone. You need to make sure that you're not going to be building something that you have to re-engineer some way down the track, when you realise that the data you used to train your model was illegally collected, or that maybe the collection was fine but its secondary use to train the model was a breach of the law, and you've got a regulator coming in saying, well, you now need to throw out that algorithm and all the data and start again. And we've seen regulators do that, particularly in the US. The Federal Trade Commission has been using a tool they call algorithmic disgorgement: if they find that the basis on which you built your algorithm was data that you didn't have permission to collect or use in the first place, you've got to throw everything out.
Dr Miah Hammond-Errey: I'm going to go to a new segment for 2025. I've got a new question which looks at the contest spectrum. So what's a new cooperation, competition or conflict you see coming in 2025?
Anna Johnston: I think the last couple of years have been marked by a lot of cooperation between the consumer, competition and privacy spaces; both the advocates and the regulatory community have been aligning their interests and their areas of activity, and I would expect that to continue. I think what will be particularly new in 2025 is more on the conflict side: we're expecting to see contested ideas being argued in the courts. There are a few reasons for that. One we've already alluded to: the Australian privacy regulator's, the OAIC's, determination on Bunnings and the use of facial recognition. We know Bunnings has said they're going to appeal, so that's going to be fascinating to see play out.
Anna Johnston: There's another specific reason why I think we're going to start seeing more conflict, in terms of litigation, coming in 2025, and that's because of two developments very late in 2024. One of the tranche one reforms was to introduce what's called a statutory tort of privacy, which will come into effect in June 2025. A statutory tort means, effectively, that any person can sue, instead of having to complain to the OAIC. At the moment, a complaint has to be framed in terms of involving personal information, so we've got that definitional threshold issue there. It must be about a regulated entity, so small businesses are exempt, for example. And it must allege that there has been a breach of one of the privacy principles. That's our current regime. A statutory tort says any person can sue anybody for a breach of privacy: not limited to personal information, not limited to certain regulated entities, not limited to the privacy principles. It does have to be serious.
Dr Miah Hammond-Errey: Not limited to digital...
Anna Johnston: No, it could be physical privacy. It's only for serious invasions of privacy, and there are some big carve-outs for media organisations and journalism in particular, and for national security agencies, as you would expect. It's going to be really interesting to see how that plays out, as a tool not only for individuals to seek redress for harms done to themselves, but for strategic litigation by advocates to try and force more systemic or structural change. I did mention there were two developments late last year. The first was the introduction of the statutory tort, but only a few weeks prior to that there was also recognition of a common law tort of privacy, or a common law right to privacy, in the Victorian County Court. And because that's common law, there's no exemption for media organisations under a common law tort. A statutory tort had been proposed for decades, and it finally got passed, but with this big carve-out, and meanwhile a Victorian court beat them to the punch by a couple of weeks and said, well, actually, the common law already recognises a right to privacy, and therefore people can sue other people or other entities. It's like buses: you wait years for a bus to come along and then two come along at once. It's going to be really interesting to see how people use the torts to address privacy harms, and how these two torts actually compare and conflict with each other.
Dr Miah Hammond-Errey: I want to go to a segment now. What do you see as the biggest shifts for leadership from the introduction of technology?
Anna Johnston: Whether leaders are in government or in business, they really need to understand the foundations of technology: how it works, where the data comes from, how it's used. I think for a very long time it was easy for corporate and business leaders to think that technology was the job of the tech guy, and it's usually a guy. Whether or not they're trying to grasp those opportunities offered by technologies like AI, senior executives, leaders and board members can no longer afford to think that they don't need to understand technology, that it's someone else's job to understand technology and advise them. I think anything to do with data, anything to do with technology, is a potential privacy risk and a potential cyber security risk, and that makes it a business risk.
Dr Miah Hammond-Errey: One of the key segments we have is a segment on alliances. We talked a little bit earlier about the global role of privacy, particularly in relation to things like climate change. Do you see any role for alliances in the privacy space, and what could we be focusing on to try and bring people together to solve this problem?
Anna Johnston: We're certainly seeing a shift over the last couple of decades. It's been a long time now since I was in a privacy regulator role, but there's been a really noticeable shift in terms of cooperation between regulators. By that I mean, globally, cooperation between privacy regulators, and locally, cooperation between privacy regulators and other regulators. Privacy regulators, like many regulators, often have their resources stretched very thin, especially if they're going up against big tech companies that have billions and billions of dollars. Regulators need to always do more with less, so the more they can share knowledge, share resources and conduct joint investigations, the better. The challenge, though, is that they then have to apply the law in their own jurisdiction, and you've got these subtleties of difference in the law between different jurisdictions.
Dr Miah Hammond-Errey: I want to just touch on security and privacy. Do you see any major national security implications for how we regulate privacy?
Anna Johnston: Absolutely. I started in privacy in the late 90s, and the kind of issues that we were concerned about then are just so vastly different to now. I came in very much from a law background, very much caring about human rights. Everything shifted for us on September 11th, 2001, and for a very long time after that it was very difficult to get anyone to move away from the idea that there was an absolute dichotomy between national security and privacy, that these interests were absolutely in opposition. I think there's now a real growing recognition that the privacy harms that can be done, whether to the integrity of our free elections or at an individual level through micro-targeting that drives extremism, polarisation and radicalisation, are themselves posing a threat to national security. The current Director-General of ASIO, Mike Burgess, has warned repeatedly of this rising threat, not just from foreign states, but from individuals intent on politically motivated violence.
Anna Johnston: So vulnerable young men in particular are becoming polarised and radicalised, at speed and at scale. There's a correlation there with the e-safety risks posed to women in particular, and a lot of gender-based violence. All of these threats absolutely pose national security risks to us as countries, they pose safety risks to us as individuals, and they pose threats to the stability and integrity of our societies. And these problems come from data scraping, profiling and micro-targeting; they are privacy concerns. So you have to tackle the root cause: the business models that reward conflict with clicks and likes, and the algorithmic engines that spread hate and misinformation. We need to tackle those business models, and we need to tackle them through privacy law reform.
Dr Miah Hammond-Errey: The extractive data economy requires privacy reform because it fuels the algorithmic curation of basically every major platform. That's what's driving information warfare, extremism and polarisation, and it's also driving threats to our democratic and electoral processes. So it's a foundational problem.
Dr Miah Hammond-Errey: Coming up is a segment called Eyes and Ears. What have you been reading, listening to, or watching lately that might be of interest to the audience?
Anna Johnston: I've just come off the back of my summer holiday, where I deliberately read and listened to anything other than work-related material. Having said that, late last year at the IAPP (International Association of Privacy Professionals) Australia New Zealand Summit, author Anna Funder was a keynote speaker. If your listeners don't already know, Anna Funder wrote a fantastic book called Stasiland about the former East German regime. She gave this keynote speech about how the power of the Stasi was all about surveillance; it was all built on invasions of privacy. She was talking about the link between that and the more modern mass invasion of privacy from surveillance capitalism, and the link between that and totalitarianism. A lot of her books have been about totalitarianism and surveillance, but also the individuals who carve out their own private spaces to protect themselves or to resist. I've just started Wifedom and I'm looking forward to getting more into that.
Dr Miah Hammond-Errey: I'm going to go to a section called disconnect. How do you wind down and unplug?
Anna Johnston: I like to get as far away from all screens as possible, so literally I stick my head underwater. I love swimming, whether it's the ocean, the harbour or a swimming pool. It helps me clear my head; I can't see or hear anything else. It's perfect.
Dr Miah Hammond-Errey: The final segment is Need to Know. Is there anything I didn't ask that would have been great to cover?
Anna Johnston: One thing I will mention is a development that hasn't yet been well noticed. For all the years we've been talking about the review and reform of the Privacy Act (the actual official review kicked off in 2019, and we're still waiting for tranche two, which will be the meaningful reforms), late last year the Western Australian government leapfrogged our national government and introduced a new Western Australian privacy law, which grabbed a lot of the proposals out of the federal Privacy Act review and has already implemented them. So Western Australia now has the nation's most modern definition of personal information, and they introduced a fair and reasonable test. We now have a state law which clearly covers information that relates to individuals, not just information about individuals. It also explicitly incorporates online identifiers, pseudonyms, location data, inferred information and generated information; all of that has been explicitly included in the definition of personal information in Western Australia. So having had one state be the first mover, I would expect that to generate further pressure on Canberra to get on with passing the final tranche two reforms.
Dr Miah Hammond-Errey: Yeah, absolutely. Anna Johnston, thank you so much for joining me today.
Anna Johnston: Oh, you're very welcome. It's been a pleasure. Thank you.