Dr Miah Hammond-Errey is joined by Professor Johanna Weaver, founding Director of the Tech Policy Design Centre, to discuss the recent 2023–2030 Australian Cyber Security Strategy, including its funding, implementation and challenges, the relationship between hacktivism and international humanitarian law and the different ways of thinking about AI risk and harms. They also discuss their highlights for 2023, Australia’s important position in driving change in multilateral forums, myGov and digital government services and their hopes for the tech policy conversation in 2024.
Professor Johanna Weaver is the founding Director of the Tech Policy Design Centre at the Australian National University (ANU). Before joining ANU, she was Australia’s independent expert and lead negotiator on cyber issues at the United Nations. Johanna also led the Cyber Affairs branch at the Department of Foreign Affairs and Trade, is on the global advisory board on digital threats during conflict at the International Committee of the Red Cross and is a former commercial litigator. Johanna also hosts the Tech Mirror podcast, which features discussions reflecting on technology and society.
Resources mentioned in the recording:
Making great content requires fabulous teams. Thanks to the great talents of the following:
This podcast was recorded on the lands of the Ngunnawal people, and we pay our respects to their Elders past, present and emerging — here and wherever you’re listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
Please check against delivery
Dr Miah Hammond-Errey: [00:00:02] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr. Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, and we're based at the University of Sydney. My guest today is Johanna Weaver. Thanks for joining me.
Professor Johanna Weaver: [00:00:22] Thanks for having me.
Dr Miah Hammond-Errey: [00:00:24] Johanna is the founding Director of the Tech Policy Design Centre and professor of practice at the Australian National University. Before joining ANU, she was Australia's independent expert and lead negotiator on cyber issues at the United Nations. Johanna also led the Cyber Affairs Branch at the Department of Foreign Affairs and Trade, is on the Global Advisory Board on digital threats during conflict at the International Committee of the Red Cross, and is a former commercial litigator. She also hosts the Tech Mirror podcast. We're thrilled to have you join the podcast, Johanna, and bring one of our conversations into the world.
Dr Miah Hammond-Errey: [00:00:54] We're coming to you today from the lands of the Ngunnawal people. We pay our respects to the elders past, present and emerging, both here and wherever you're listening. We acknowledge their continuing connection to land, sea and community and extend that respect to all Aboriginal and Torres Strait Islander people.
Dr Miah Hammond-Errey: [00:01:09] So 2023 has been a huge year for emerging technologies and technology policy. We'll dive into the detail in a bit, but can you in a nutshell give us your highlights?
Professor Johanna Weaver: [00:01:21] Look, I think for me there's three real trends that come out of 2023. One was the start of the year, really the end of 2022, when there were increasing geopolitical tensions. So we saw the CHIPS Act, the export controls, a real increase in the way that the US was responding to China's rise in this space and increasing decoupling at the high end of the technology space.
Professor Johanna Weaver: [00:01:47] The second trend that really stands out to me is one that I call the changing zeitgeist, just because I love the word zeitgeist. But really, this is you see this across all different types of tech policy. So whether you're looking at cyber security or privacy or increasingly now, the conversation around AI regulation, it's this shift to say it's actually not up to the individual to look after their own privacy or to manage cyber security. We need to place more of a burden on the companies. And we've really seen that the US led that in the cyber security space earlier this year. We've seen that in our own cyber security strategy that just came out. And then if you look at the Bletchley Park Declaration, that's all about, okay, you know, we have to stop letting the tech companies mark their own homework, which is sort of the phrasing of it.
Professor Johanna Weaver: [00:02:34] And then, of course, the third biggest trend is around the emergence of artificial intelligence and the fact that this is a conversation that's happening on the front page of the newspapers is very exciting to me, and one that, you know, is pretty obvious in terms of the key highlights out of 2023.
Dr Miah Hammond-Errey: [00:02:50] I think you'll be shocked to know that we are among a small few that are excited by this being on the front page of the newspaper.
Professor Johanna Weaver: [00:02:55] Well, I think there are lots of people that are terrified by it, and a small number of people that are really excited by it. And what I'd like to see is a little bit more of a nuanced conversation in the newspaper, so we get more people excited about the opportunity of artificial intelligence, because it is an enormous opportunity: the productivity benefits that can come from this, from your personal satisfaction, from, you know, personalised education, these types of things. There's amazing opportunity. And so hopefully we'll get some more balance to the conversation in 2024, whilst also being really realistic about the risks as well.
Dr Miah Hammond-Errey: [00:03:28] You have a passion for imagining how governance can reshape technological futures. And I hear that coming out a little there. Talk me through how you got to this point and why it's so important for you.
Professor Johanna Weaver: [00:03:38] So, you know, I describe myself as a reformed commercial litigator. I bought back my soul by joining the public service and becoming a diplomat, and then actually left the diplomatic service and retrained in what I then called a specialisation in strategic cyber policy. And then, as you mentioned, back in foreign affairs, I got offered a job that I couldn't say no to, establishing Australia's cyber diplomacy practice. And it really emphasised to me how much every single country, every single company and every single individual is genuinely struggling with how to respond to and address this vast step change in the technology that we're using.
Professor Johanna Weaver: [00:04:25] And it is a design challenge. It involves a lot of creativity in being able to do it at speed and scale, and that's why my centre is called the Tech Policy Design Centre, because we really wanted to emphasise that we're not here to admire the problem, we're here to co-design solutions and also that element of creativity. We can't keep applying the same 20th century solutions to 21st century problems.
Dr Miah Hammond-Errey: [00:04:50] So it's really about imagining incredible technology futures, but also agitating for the change to make them happen.
Professor Johanna Weaver: [00:04:57] Exactly. And recognising that, at least for the moment. And we'll see what happens with generative AI. But humans create technology and we can make technology differently. We can make technology that builds in privacy and security, that respects our human rights, but we have to actually demand that, and we have to change the incentive structures for the way that technology is produced, because at the moment, technology rewards the first to market. This is not about stopping innovation. It's actually about enabling good innovation. And, you know, I do get very passionate about that because I do think technology will be core to solving the biggest challenges of our time. But it has to be good technology, not just technology.
Dr Miah Hammond-Errey: [00:05:46] Absolutely. Let's talk about cyber security. The Minister for Home Affairs and Cyber Security, Clare O'Neil, released the 2023–2030 Australian Cyber Security Strategy in November. What are your thoughts?
Professor Johanna Weaver: [00:05:58] Look, overall, I was very impressed with the strategy. There's a lot of concepts and ideas that those of us who've been working in the field have been talking about and calling for, for many years, but a government hasn't been brave enough or resourced enough, or had the demand from the voters, to take action, to actually put it on paper. So, you know, I think the strategy is exceptionally ambitious and I very much welcome that. What I particularly like about it is its breadth. So it is providing solutions for everyone: from personal cyber security, there's commitments for small businesses, there's commitments for critical infrastructure, and there's commitments for large businesses. And also, importantly, that government needs to get its own house in order. Normally, I would say the devil is in the detail. In this case, I would say the devil really is going to be in implementation, because there's a lot to crack on with.
Dr Miah Hammond-Errey: [00:06:50] And on that note, how do you see the balance of responsibility in the strategy across government?
Professor Johanna Weaver: [00:06:54] Look, I looked at that very closely as a former public servant who's been involved in sort of the down and dirty of who leads on what issue. And to be honest, there are times when an issue like cyber security is hot and sexy and everybody wants to own it, and then something bad will happen and nobody wants to own it, and it's everybody else's fault. So I think what is particularly interesting about the strategy is the very clear allocation of leads. If anyone's really nerdy and wants to get into it, I highly recommend you have a look at the action plan, in particular the areas in the action plan where there are two lead agencies. That clearly indicates that there is some bureaucratic jostling still going on.
Professor Johanna Weaver: [00:07:36] So clear delineation of roles is important, but you need to have clear coordination, clear accountability and reporting and tracking of all of these initiatives. And it's actually not just relevant to the initiatives announced in the action plan under the strategy, but also across other areas. For example, the cyber security strategy commits to reviewing data retention requirements. That's also a commitment that the government has agreed to under the Privacy [Act] Review. So we need to make sure that we've got the coordination across portfolios, across initiatives as well.
Dr Miah Hammond-Errey: [00:08:09] In a piece my colleague Tom Barrett and I published, we highlighted the key role of industry in this strategy, which you've mentioned. They do some heavy lifting in several areas. How do you think our coordination between government and industry is placed at the moment to achieve this?
Professor Johanna Weaver: [00:08:23] So we've come a long way in a short period of time. I think that's in no small part due to the standing up of the Critical Infrastructure Centre [now the Cyber and Infrastructure Security Centre] within the Department of Home Affairs. Minister O'Neil has conducted a lot of engagement as part of the drafting of the strategy. The question, though, is can you harness that for impact, and harness that to actually deliver a lot of these initiatives that are in the strategy? There's quite a lot that is foreshadowed that we will do in consultation with industry, things that are going to probably be quite controversial, like the no fault, no liability ransomware reporting requirement that's being introduced. So it will be interesting to see, when the rubber really hits the road on implementation, how well those relationships are able to be maintained.
Dr Miah Hammond-Errey: [00:09:20] Yeah, you're absolutely right. And we've seen the significant public-private partnership announced with Microsoft–ASD's Cyber Shield to help improve threat detection and protection. How important do you think these public-private partnerships will be in improving cyber resilience, and are there any risks we should be attuned to?
Professor Johanna Weaver: [00:09:38] What was really interesting to me was the $600 million that was announced with the strategy. Put that in the context of the REDSPICE package, which is like $9.9 billion or something over a period of ten years, and I actually think a lot of the REDSPICE money is going to be used to develop the strategy, and that's pretty common in government. Right? So, I think they are very much linked in terms of the partnerships.
Professor Johanna Weaver: [00:10:04] I actually think the biggest risk is capture. So, the biggest risk is that you have these large companies who are able to secure these contracts, whether it is to secure critical infrastructure or to provide support to government, and that they become the only voices in the room that are listened to. That's the biggest risk.
Dr Miah Hammond-Errey: [00:10:22] Yeah, absolutely. I mean, as you kind of pointed out at the beginning, and I talk about in my forthcoming book, the geopolitical context of technology is really picking up pace in almost every sector, and the relative power of private enterprise to governments is really shifting. The Tech Policy Design Centre published a policy paper earlier this year on ransomware. Given its headline assessment was to strongly discourage businesses and individuals from paying ransoms, how do you feel about the way the matter was addressed in the cyber security strategy?
Professor Johanna Weaver: [00:10:52] Yeah, look, I think the cyber security strategy has taken a really good approach on this. We do need to be discouraging people from paying those ransoms and providing them with support. So I think the no fault, no liability disclosure scheme, that's in line with one of the recommendations in our report and I would like to see that expanded over other jurisdictions as well. If you can have that expanded, for example, in the quad countries, then you're actually going to start to get some really interesting data and information about the way that the ransomware business model is operating.
Professor Johanna Weaver: [00:11:22] The other thing I really like in the cyber security strategy, which is linked to some of the findings in that ransomware report, is the establishment of support for small and medium sized enterprises. The average cost for a small business recovering from a ransomware attack is about $49,000. Most small businesses don't recover from that, right? So the strategy has put in place support for before, during and after cyber security incidents for small and medium sized businesses. Now, my biggest question around that is, how are we going to get the people in place to be able to provide that support and guidance? And I know this is something that you've written on and are very passionate about as well, Miah. It's all well and good that we make these announcements. Do we actually have the bums on seats to be able to staff them?
Dr Miah Hammond-Errey: [00:12:10] You really raise something there that I think is incredibly admirable in the strategy. And that is, as you said, the breadth. So it looks at small to medium enterprise, which is the largest group of enterprise in Australia, as well as, you know, the big end of town, who actually are often the providers of cyber security services, but also the individuals and I think that's a really key point that we haven't seen everywhere, but does align really well with the US strategy of placing the onus of protection on government and the largest businesses that actually provide services. So I also agree, I think that's incredibly valuable.
Dr Miah Hammond-Errey: [00:12:43] You recently wrote that 'bombs or bytes, missiles or malware, international humanitarian law applies'. Can you take us through what has happened at the International Criminal Court that led you to make this statement?
Professor Johanna Weaver: [00:12:53] This is sort of the culmination of three things happening in relatively quick succession, building off the work that I did when I was Australia's chief negotiator at the United Nations. And so those negotiations in 2021 culminated with recognition that international humanitarian law applies to state conduct in cyberspace. And this was something that we had agreed back in 2013, that international law applied, but this was the first time that we'd specifically called out international humanitarian law as applying. Less than a year later, Russia invades Ukraine, and we see cyber operations being used in the context of armed conflict. And, you know, a lot of the negotiations that we did were very closely in partnership with the Russians and the Chinese. And so when we were in those negotiations, never did I think that we would see cyber operations being used in armed conflict.
Professor Johanna Weaver: [00:13:45] So international humanitarian law, as I'm sure all your listeners know, only applies during armed conflict. And the prosecutor of the International Criminal Court has come out and said that they consider the cyber operations to be within their jurisdiction. That basically is foreshadowing that the next round of prosecutions will include prosecutions for cyber operations conducted in conjunction with kinetic operations, which constitute war crimes.
Professor Johanna Weaver: [00:14:14] At the same time, the International Committee of the Red Cross board that I was on had been meeting for two years, and we've just released a report. This is essentially, again, something that was initiated pre the Russian invasion of Ukraine, includes a Russian expert among 20 experts from all around the world, and includes a number of recommendations addressing, you know, really topical issues: the increasing way that we see civilians participating in armed conflict from other jurisdictions. So, you know, potentially Australian hackers participating in hostilities in Ukraine, but from Australia. What are the implications of that from a legal perspective? We also look at the role of industry. You know, we saw industry on the ground in Ukraine having such a significant role.
Professor Johanna Weaver: [00:14:59] And then finally, the third thing that happened, which prompted that article, was a blog published by Tilman Rodenhäuser, who's a legal adviser at the ICRC, talking about eight rules for hacktivists, or for hackers, during an armed conflict. And that came out of one of the recommendations in the ICRC report. And what was extraordinary about that is the response from those hacktivist groups on both sides of the Ukrainian conflict, coming out and saying, we will abide by these rules. Anonymous came out and said, we will abide by these rules. And it may be that the reality out there in some instances still has the appearance of the Wild West, but it's not because of a lack of rules; it's because of a lack of enforcement of those rules.
Dr Miah Hammond-Errey: [00:15:46] Thank you for such a comprehensive answer. And these are really significant crimes, so it's important to see them investigated.
Professor Johanna Weaver: [00:15:52] I think if I can just add two little bits on that, one is I know people often say, what's the point of these rules if they're broken? But the point of the rules is you've agreed the rules. Russia and China and every country in the world has agreed these rules apply. It means we can now call out when they break the rules. And if you haven't agreed the rules, you can't call it out.
Professor Johanna Weaver: [00:16:09] And the other thing I'd say is the conversation has actually expanded. So it used to be that there was lots of different siloed conversations that happened. So it was the use of ICTs in the context of international security. Then you had a cyber operations conversation. Then you had a laws conversation. Then you had lethal autonomous weapons. And what we're really seeing coming together is looking at these things much more holistically as well, which is really another trend, I would say from 2023.
Dr Miah Hammond-Errey: [00:16:37] Yeah, I'd actually agree with you. I think many players understanding the length and breadth of the technology ecosystem and how it affects everyone. It's been a real joy to watch as someone who's advocated that for a long time.
Dr Miah Hammond-Errey: [00:16:51] Let's move on to AI. 2023 really has been the year of AI. In my 'Tech Wrap' of 2023, I set out some of the incredible tech developments as well as the many approaches to regulatory reform. We saw consumer technologies like ChatGPT diffuse through society at an incredible rate. We've seen discussion about what AI is and isn't, and witnessed immense hyperbole about what it will mean for society, including the future of humanity. What about AI stood out for you this year?
Professor Johanna Weaver: [00:17:19] Well, look, I think all technology is about the future of humanity. And if we're not putting humans at the centre of technology, then we're getting it wrong. The focus largely, though, was on the existential risks, which is one part, an important part, of the conversation around artificial intelligence. But I would like to see more of a conversation not just about the frontier artificial intelligence models and the existential risks that they present, but about some of the more mainstream artificial intelligence that's already out there in our economies and being used, and how that is shaping humanity, not just at the far end of the existential risks.
Professor Johanna Weaver: [00:18:06] And I think we actually need to be addressing both the current risks, in terms of discrimination, bias, etc., whilst also addressing those long-term risks. And I know a lot of people who work in the field of artificial intelligence don't like this phrasing of short term and long term, because, you know, a year ago ChatGPT was a long-term thing and then all of a sudden it's here, so even those timeframes are relatively artificial. But by focusing on the harms that are happening now and putting in place the right and well-designed regulatory frameworks to address those, we're also largely giving ourselves the toolkit to be able to respond to those longer-term risks as well, or to those existential risks, if you like, if we want to avoid that phraseology.
Dr Miah Hammond-Errey: [00:18:55] We've seen various efforts to regulate AI globally, from the Biden administration's Executive Order on Safe, Secure and Trustworthy Development and Use of AI to the EU AI Act, and multilateral agreements such as the UK-hosted global AI Safety Summit, the one you referred to earlier, Bletchley Park, and the G7 Leaders Statement on the Hiroshima AI Process. Where do you think Australia sits and how well calibrated do you think our efforts are?
Professor Johanna Weaver: [00:19:21] Look, I think making an assessment of where Australia sits is actually quite difficult at the moment, because we've had the consultation that came out in the middle of the year asking for submissions on Australia's approach to the regulation of artificial intelligence. There were thousands of responses to that, but we haven't yet seen a lot of feedback. We don't yet have a clear idea of the direction that the government is going to take. After Bletchley Park, Minister Husic put down a number of markers, saying that he thought the Australian legal environment and existing laws were largely sufficient, but that some additional measures did need to be put in place. We're yet to have any real indication of what those are going to be. So I think it's still a bit of a waiting game for us. And, you know, there are pros and cons of each of the different models that are out there.
Dr Miah Hammond-Errey: [00:20:15] Something you mentioned there, the thousands of submissions, highlights a conversation I had on the podcast with Hamish Hansford from the Department of Home Affairs about the role of Australians and companies. They were finding a huge growth in the number of public submissions to the security of critical infrastructure consultations, and it's being replicated across many industries and many reviews now, where we're seeing thousands of submissions for things. On the one hand, I think it's wonderful, I love that people are more engaged in democratic deliberations. But on the other hand, it's actually posing some really challenging questions: how do government agencies actually respond to and analyse those submissions in an effective timeframe? It's a really interesting topic and maybe one technology can help with.
Professor Johanna Weaver: [00:21:00] I just think this is a perfect example of applying 20th century solutions to 21st century problems, right? Well, first of all, I guess I have a question about how many of them are actually even read properly. A lot of work and effort goes into producing those submissions. How can we design a system that allows those inputs to be received more effectively and more efficiently? And there's lots of interesting models, either in Ireland or in Taiwan, of participatory democracy that could be adapted to allow for participatory consultation. So this is not without precedent, and I hope that we start to see an evolution in that model as well. It's something we've proposed at the centre in a report called 'Cultivating Coordination'.
Dr Miah Hammond-Errey: [00:21:48] Do you see any tensions between democracies and regulation of fast-moving and impactful technologies, or do you see that we can continue to regulate the way that we have been, but applying these first principles?
Professor Johanna Weaver: [00:22:01] I think if we apply well-designed regulations, there isn't and shouldn't be a tension. Although, what we have seen in a number of emerging technology fields is that regulation is pushed through that doesn't necessarily have standard democratic protections built in. So this is your ability to appeal a decision, your ability to seek review of a decision, somebody watching the watcher, as we call it in the national security field. And so we as democracies need to make sure that we're building in those democratic protections that are at the core of our societies. We can't take them for granted.
Dr Miah Hammond-Errey: [00:22:50] We're going to go to a segment now about alliances. What is the role of alliance building in technology policy?
Professor Johanna Weaver: [00:22:57] Look, I think alliances have a really important role in terms of Australia being a middle power, a relatively small economy and predominantly a tech adopter. So the building of alliances is important both from a research and design perspective, but also from the perspective of influencing governance and change. Right. Australia as an economy is not going to be able to take really dramatic action in terms of regulating artificial intelligence, because if we do, all of the companies operating in Australia will go offshore to different jurisdictions, right? So to do that effectively, and to ensure that we get the productivity benefits of artificial intelligence, we have to do it in partnership. Alliances are really important in that respect, with the US obviously being one of the primary countries that we are engaging with, but also a number of like-minded partners.
Professor Johanna Weaver: [00:23:51] But I would also posit and challenge your listeners to think about non-traditional partners as well. I have found the ability to make real, meaningful progress at the international level on these issues comes from having partnerships that sometimes are a little uncomfortable, right. So this is partnerships with countries that maybe don't have a hundred per cent alignment on our democratic values and our human rights principles, but will come in behind and support the general concepts and principles.
Professor Johanna Weaver: [00:24:20] Not everybody has the same viewpoint. And so we have to accept the world as it is. And I think, and this is what I mean by Australia being able to punch more above our weight is that we, unlike the US, can come from our very immediate region, engage with our close partners, whether that's India, Indonesia, Singapore. And if Australia, India, Indonesia and Singapore put forward a proposal in a multilateral forum, whether that's in the context of, for example, the ASEAN plus India, or if it's in the UN context, it will get a much better reception than if something is put forward by the US or by China.
Professor Johanna Weaver: [00:24:58] I've lived and breathed it both at the ASEAN level and at the UN level. And until you actually see the power of what it is to be Australia and to be Australia separate from the US, not because we're disowning the US, but recognising that we actually, it is of benefit to the Australia–US Alliance, for Australia to be an independent voice in our region and harnessing that is very powerful.
Dr Miah Hammond-Errey: [00:25:24] More than fifty per cent of the world is holding an election in 2024. Mis and disinformation will obviously continue to be a huge challenge.
Professor Johanna Weaver: [00:25:31] Gosh, that's a terrifying statistic.
Dr Miah Hammond-Errey: [00:25:33] How can we collaboratively combat AI-enabled mis and disinformation in the coming year?
Professor Johanna Weaver: [00:25:38] I would just love for someone to invent a watermarking technology, but I'm reliably told that that's a little way off. In Australia, it's by reforming truth in advertising legislation. In other jurisdictions, it is ensuring that there is a requirement to disclose when artificial intelligence is used, and you can place that obligation upon those who are participating in the elections relatively quickly. And it's education of the population. All of these responses are deeply unsatisfactory to me. And that's why I think, in this instance, we do need to particularly also be focusing on a technology solution to a technology problem, whilst ensuring that it is put in the context of: this is about democratic elections, this is the thing that is at the core of our society, and we must protect it.
Dr Miah Hammond-Errey: [00:26:26] You'll be pleased to know then, that we are continuing the early work we started on AI in the information environment. You wrote on the export controls on semiconductors in October 2022, just after they were announced and flagged that they were a historical turning point or tipping point for US–China decoupling. From your perspective, where is the debate at now a year on with another round of export controls?
Professor Johanna Weaver: [00:26:48] Look, at the time I wrote that article, my editor came back and said to me, do you really want to be that definitive? To say that people will look back on the controls that were announced in October 2022 as the point at which US-China tech decoupled? And I said, yes, I'm comfortable with that, and I remain comfortable. I think it absolutely is. And the trend continues.
Dr Miah Hammond-Errey: [00:27:09] How does Australia fit into this if it impacts other areas like clean technologies?
Professor Johanna Weaver: [00:27:14] I don't think decoupling means you can't continue to cooperate and that we won't continue to cooperate in areas like clean technology with countries like China. I just think it means that we won't be necessarily sharing the technology until it's well developed and commercialised.
Dr Miah Hammond-Errey: [00:27:32] You recently joined the myGov Advisory [Group]. Congratulations. It's led by former New South Wales minister Victor Dominello, who has also been one of our podcast guests this year, and its role is to advise the Minister for Government Services, Bill Shorten, on improving the myGov service. What do you hope to achieve?
Professor Johanna Weaver: [00:27:50] I hope to avoid Robodebt 2.0.
Dr Miah Hammond-Errey: [00:27:53] How can we see technology enable improved engagement between government services and individuals?
Professor Johanna Weaver: [00:27:59] Look, I think both Victor Dominello and Bill Shorten speak about this very passionately. Can you imagine an environment where you could log into myGov and all of the relevant government services knew that you'd changed your address, so you didn't need to lodge it in 50 different places? There are many countries around the world that have really effective digitised government services, so this has been done before. I think this is, again, about reimagining a possible future and then taking incremental steps to see it realised.
Dr Miah Hammond-Errey: [00:28:34] We've seen a recent focus on national digital identification. Can countries manage without a digital ID system in a digital era?
Professor Johanna Weaver: [00:28:43] I mean, I don't think so. I think many of the challenges that we saw out of the Medibank, Latitude Financial or Optus hacks would have been avoided, in terms of the impact on the Australian population, if we had digital identity and you weren't required to hand over your passport, your credit card and your driver's licence every time you wanted to do something. But the reality is we already have all of the elements of digital identity.
Professor Johanna Weaver: [00:29:09] So digital identity for us is perhaps a cultural issue we need to get over. We need to make sure that we have clear public messaging about the use of and access to the information that is collected, and that we've got really strong and clear protections in place around law enforcement access and these types of things. And we need to demonstrate that we are implementing and respecting those protections.
Dr Miah Hammond-Errey: [00:29:34] Absolutely. One of the questions we ask regularly of our guests is, what do you do in your downtime to keep sane? And do you have any specific technologies which bring you the most joy?
Professor Johanna Weaver: [00:29:45] The way that I turn off is by turning off my devices. I love technology, I love the connectivity that it brings, but I think it's also important that we detox and that we have periods of time where we're away from technology. So, most weekends I will put my phones and my devices in the drawer, and I will have at least a 24-hour period where I'm not on the phone. One thing that I really recommend, and maybe your listeners can try, is having a box on your kitchen table that everyone puts their devices in. And I mean mum, dad and the kids. It's not selective. And you sit down and have, you know, dinner times that are digitally free. So I think that's really important.
Professor Johanna Weaver: [00:30:21] In terms of the technology that brings me the most joy, I'm teaching myself to play the guitar and I have this amazing app that I am using, so I get a lot of joy out of that. And I'm also chronically dyslexic, so I use a lot of technology to assist me. Things like Grammarly and technology like that, which doesn't necessarily bring me joy but makes my life a lot easier.
Dr Miah Hammond-Errey: [00:30:43] Coming up is 'Eyes and Ears'. What have you been reading, listening to, or watching lately that might be of interest to our audience?
Professor Johanna Weaver: [00:30:50] Well, given we're going into Christmas, I thought what I might do is flag some of my favourite novels in this space. So, Neal Stephenson's Cryptonomicon: an absolute chunker of a book, but definitely one of the best.
Dr Miah Hammond-Errey: [00:31:04] You're taking advantage of being the final episode for the year, where you assume people have all this time to read.
Professor Johanna Weaver: [00:31:08] Exactly, but it's an awesome, awesome book, and it has some quite complex cryptography in it, which I'm told is actually factually correct as well, though I just skipped over those pages.
Professor Johanna Weaver: [00:31:19] But I would also recommend to listeners the Illuminae Files. This is a three-part series, and it's a graphic fiction and really awesome, set way into the future on a spaceship, really interesting in terms of the way that humans interact with technology. So highly recommend that.
Professor Johanna Weaver: [00:31:38] I would also recommend The Codebreakers by Alli Sinclair, which tells the story of the Garage Girls, who were codebreakers in Brisbane during the Second World War. And one that I'm just finishing reading, and highly recommend, is Queen of Codes, the secret life of Emily Anderson, Britain's greatest female codebreaker. She was really pivotal within British intelligence through the First and Second World Wars and in the Middle East, and is someone who should be a household name but isn't, so I highly recommend that book as well.
Dr Miah Hammond-Errey: [00:32:09] Excellent recommendations. I want to chat to you, Johanna, about technology contest, which you brought up as one of your three themes of 2023. We're seeing the concentration of data and computational capacity as a new form of power, geopolitically as well as within industry. What are you most concerned about?
Professor Johanna Weaver: [00:32:28] I think I'm most concerned about the concentration of power in the hands of people and organisations that are not democratically accountable. Now, that might be within the context of Australia and our alliance partners, our close allies within industry. Or it might be the concentration of this type of power within other governments like China, or, you know, Saudi Arabia has just released an AI model as well. So I think for me, it's the concentration of power.
Professor Johanna Weaver: [00:33:06] The amount of compute that is required, both the chips that underpin it and the power needed to run it, these are things we can control. And here I mean in a traditional arms control context, right? Most technology is actually really difficult to control across borders because it is dual use and multi-use, but these super powerful frontier artificial intelligence models can be identified by their power pattern. And we can also track where these chips are being taken and used.
Dr Miah Hammond-Errey: [00:33:43] In January, OpenAI's ChatGPT reached 100 million monthly active users, just two months after launch. It briefly held the record of the fastest-growing consumer application in history, beating TikTok's nine months. In November, it was announced that ChatGPT is also used by two million developers, including over ninety per cent of Fortune 500 companies. What followed, though, was a chaotic four-day stoush between OpenAI CEO Sam Altman and the board. Johanna, what can we learn from the Sam Altman saga?
Professor Johanna Weaver: [00:34:14] My key lesson out of this comes back to what it always does for me: governance, right? It's about accountability. And it's about asking, if we're developing these types of tools and placing this amount of power ultimately in the boards of private companies, is that how we want the technology that will shape our future to continue to be governed? Certainly for me, the answer is no. We need much more oversight of the way these organisations that will be shaping our future are run. And it's part of the trend that we've really seen over 2023: we need to hold these companies to account much more than we have previously.
Dr Miah Hammond-Errey: [00:34:59] We're going to go to a segment, 'Emerging Tech for Emerging Leaders'. Can you share some new emerging technologies you think up-and-coming leaders should know about?
Professor Johanna Weaver: [00:35:06] Actually, I think, for emerging leaders, the most important thing is to recognise that the technology is going to change so fast. It's not about choosing a technology that you're going to specialise in, because that technology will be out of date before you transition from an emerging leader to a leader. It's about providing yourself with a strategic framework to be able to think about these technologies. And that strategic framework doesn't necessarily have to change depending on what those emerging technologies are. So it's building that ability to critique, to ask questions, to second-guess, to listen to your gut. As an emerging leader in this field, make sure that when you're applying the strategic framework you're building, you actually have an understanding of the impact. And that usually means getting out and using the tech and understanding what it actually does. And not enough people do that.
Dr Miah Hammond-Errey: [00:35:55] This is the final episode for 2023, and with all that's happened in tech this year, including what we've just touched on, there's plenty to consider for the coming year. What are you expecting or looking forward to in 2024?
Professor Johanna Weaver: [00:36:06] I am really looking forward, in 2024, to the continued dominance of the conversation around technology in general public debate. This was a real shift in 2023: the fact that I'm going on mainstream radio and we're having these conversations in podcasts like your own, but also that it's really permeating the public conversation. And that, to me, is really exciting. So how do we move towards implementation of some of these really significant changes that we're seeing both domestically and globally?
Dr Miah Hammond-Errey: [00:36:39] Yeah, I often start my presentations with a recount of how recent many of our consumer technology applications actually are. You know, they're basically teenagers. And for anyone that's got a teenager, you know, thinking about their level of maturity and sophistication, we start to understand why perhaps they need more guidance in board governance.
Dr Miah Hammond-Errey: [00:36:58] In our final segment called 'Need to Know', is there anything I didn't ask you that I should have or would have been great to cover?
Professor Johanna Weaver: [00:37:04] Oh no. You've been very comprehensive. What a great set of questions, thank you. Look, I think the final thing I would leave listeners with is one of the things that we have touched on and tha