In this episode of the Technology & Security podcast, host Dr. Miah Hammond-Errey is joined by Australian Privacy Commissioner Carly Kind. We start by imagining a future privacy landscape where individual privacy is protected and users have real agency and choice, and look at the steps we need to take to get there. We explore the significant impact of advertising and extractive data economies on our daily technological interactions, and emphasise AI's critical dependence on vast amounts of personal data. We discuss the challenges posed by large tech platforms developing the AI models that will shape future technology products.
The episode also examines the misconception that GDPR requires cookie tracking consent notifications, and looks at legislative reform around privacy globally. We discuss the growing need for robust data breach deterrence as the OAIC aims to penalise entities for systemic failures to secure personal information. Lastly, we consider the immense infrastructural power of technology and its role in shaping society, highlighting how big tech companies are not just intermediaries but are actively influencing the world we live in.
Carly Kind became Australia’s Privacy Commissioner in February 2024. Prior to this, she was the inaugural director of the Ada Lovelace Institute. She is a lawyer and leading authority on the intersection of technology, policy and human rights. She has advised industry, government and for-purpose organisations, and has worked with the European Commission, the Council of Europe, UN bodies and a range of civil society organisations.
Resources mentioned in the recording:
· Hard Fork https://www.nytimes.com/column/hard-fork
· Ezra Klein podcast https://www.nytimes.com/column/ezra-klein-podcast
· Exponential View from Azeem Azhar https://www.exponentialview.co
· Miah Hammond-Errey (2024) Big Data, Emerging Technologies and Intelligence: National Security Disrupted, Routledge (30% off code: ADC24)
This podcast was recorded on the lands of the Gadigal people, and we pay our respects to their Elders past, present and emerging. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
Thanks to the talents of those involved. Music by Dr Paul Mac and production by Elliott Brennan.
Dr Miah Hammond-Errey: [00:00:00] My guest today is Carly Kind. Carly Kind is the Australian Privacy Commissioner. Previously, she was the inaugural director of the Ada Lovelace Institute. She is a lawyer and leading authority on the intersection of technology, policy and human rights. She has advised industry, government and for-purpose organisations, and has worked with the European Commission, the Council of Europe, UN bodies and a range of civil society organisations. Thanks so much for joining me, Carly.
Carly Kind : [00:00:25] Thanks for having me, Miah.
Dr Miah Hammond-Errey: [00:00:26] We're coming to you today from the lands of the Gadigal people. We pay our respects to elders past, present and emerging and acknowledge their continuing connection to land, sea and community.
Dr Miah Hammond-Errey: [00:00:36] So, Carly, research shows that Australians do care deeply about data privacy and expect business and government action. So I want to start by asking you to imagine an ideal privacy landscape, kind of far in the future in 2050.
Carly Kind : [00:00:54] Good question. For me, it is always about agency and control and empowerment, and the idea that privacy is a form of power that individuals can exercise against businesses, governments and big entities around them. So an ideal privacy landscape is one where people feel empowered around their information and their personal data. It's not necessarily one that looks any different in terms of the technologies we use or the ways in which people share their information. I think there's a lot of cultural value to the way in which our online communications are evolving, including through the use of social media. But I would hope that the underpinning infrastructures for those kinds of technologies have changed dramatically, whereby, again, users are in control, their personal data is their own, and it is not exploited and monetised in the pursuit of pushing them towards particular consumerist behaviours. So I think the ideal state looks a lot like where we are today, but with all of the underpinning infrastructure being much more protective of individuals' privacy, and where individuals feel like they have actual agency and choice when it comes to the products and technologies they use, and that they don't have to take a cut on the types of innovation they have access to just to protect their personal data.
Dr Miah Hammond-Errey: [00:02:40] Do you think that you have a different view for where we might be in five years?
Carly Kind : [00:02:49] What I described to you is a slightly utopian vision. I don't think we can get there in five years, but that's not the same thing as not being able to get there. There are a lot of really entrenched features of the personal information ecosystem that will require a lot of coordinated effort to peel back, dismantle and overcome. Short of some kind of global, hugely progressive treaty, we have to look at this as a cross-jurisdictional issue that can only be tackled by national jurisdictions, and that means it's a patchwork. It requires collaboration and cooperation between policymakers, regulators, civil society, academics and consumers. There are going to have to be a lot of small steps that eventually amount to big changes. I think a lot about how change happens, and it's not a linear process. Crises play a big role in leapfrogging some hurdles towards change. If we just look at the privacy and data world, things like Cambridge Analytica were a moment in which we leapfrogged forward in terms of our understanding of regulation of the data economy. Those are unknowns, and we may have crises that eventuate that will again propel us forward. All of which is to say, I think we can start to make progress within five years. But I don't think I'm naive enough to think that we can make that wholesale change in that time.
Dr Miah Hammond-Errey: [00:04:18] What do you think the key priorities are for change to narrow that gap?
Carly Kind : [00:04:21] So I think in Australia there is high public awareness and high public concern, and that's actually a really foundational element of real change. People are motivated. They want to see more regulation. They want to see tougher controls. So tick that box, which you don't have in every jurisdiction by far. I think the next step is a political and policy environment that also agrees on the need for change and is well informed by a range of different stakeholder interests, including civil society on the one hand, but also the realities of the technology and the interests of business and corporate Australia. And I think we also have quite a high level of sophistication in the policy environment. We also have the possibility that technology and innovation itself may answer some of those questions and build new privacy-preserving products that can compete in the market, though I'm maybe less convinced that that's going to be an effective route. In the meantime, we have a regulator, the Office of the Australian Information Commissioner, that is highly motivated to advance the application of privacy law in a way that takes forward protections for individual Australians. And we also have a government that seems highly motivated to bring legislative change about. We certainly have some of the components that are heading us in the right direction.
Dr Miah Hammond-Errey: [00:05:38] Most of my listeners will have heard the recent episode with Byron Tau. I was hoping that you could describe how the digital landscape is so deeply connected to surveillance and how it works.
Carly Kind : [00:05:55] Yeah, I suppose for your listeners, and this came through in Byron's interview as well, it's important not to underestimate how much advertising is a primary driver of the technological ecosystem that we interface with every day, because it is the business model behind so many of the free products that we all enjoy and use. And advertising is itself a form of surveillance in many respects; it's a corporate form of surveillance. At least the way in which advertising is currently done, through the use of real-time bidding, relies on a high degree of granular information about individuals in order to inform choices about what advertising to target them with, on the premise that targeted advertising delivers more click-through and follow-through in terms of purchasing, which I think is a questionable premise. There was a famous quote from one of the 'Mad Men' of the 50s and 60s that 50% of the money we spend on advertising is worth it, but we just don't know which 50%. And I think that generally remains true of the current advertising ecosystem. Of course, the way in which that advertising-based model incentivises the collection and analysis of personal information is, as I said, a form of surveillance. And then it can, as Byron helpfully laid out, be exploited by other actors also seeking to benefit from surveillance, including states.
Dr Miah Hammond-Errey: [00:07:16] The data economy we have now creates immense personal risk and social harms, as well as new national security threats, ranging from online bullying to tech-facilitated abuse all the way through to large-scale surveillance, tracking and oppression. Do you see any successful protections in other countries that have prevented this kind of surveillance?
Carly Kind : [00:07:35] I don't think any jurisdiction has been successful in really tackling the heart of those business model issues. If any has attempted it, it's been the European Union, via various data protection regulators. And I think we have seen some interesting national regulator decisions, and then European Court of Justice decisions, around, for example, political micro-targeting. But I don't think we've seen enough yet to really fundamentally upend the business model, which is probably what it's going to take. The application of law is not outside the realm of politics and lobbying and the influence that companies apply as well. So, no, I don't think we've yet seen real impact in that space.
Dr Miah Hammond-Errey: [00:08:29] You've worked heavily in Europe and the EU. For many people, GDPR is associated with the constant cookie tracking requests that we get when we're browsing. But can you take listeners through how GDPR and other provisions in Europe have impacted privacy, and whether it is impacting the extractive data collection business model?
Carly Kind : [00:08:50] Big question, Miah. I would just say on cookies, I think that's a really good example of how the unintended effects of well-intended law can be played out through disingenuous application by corporate actors. Cookie notices are by no means required by European law. Actually, what's required by European law is a presumption that you need consent to do cookie tracking. But the way in which entities have ostensibly complied with that obligation is actually a huge burden on individuals, one that makes them almost resent the legal protection in the first place. We have to be so careful, I think, with the development of new legislation, and the guidance that accompanies how it should be complied with, that we avoid those kinds of unintended consequences. For the most part, the GDPR is risk-based law and it's relatively high level. And the actors who have been best able to comply with it are those that are big, have big legal teams and a high level of understanding of the GDPR, and who are able to move just up to the barrier of compliance and not go any further, to push as far as they can. And those have been, perhaps perversely, the same actors that the European Union may have wanted to fundamentally change their practices. So in many respects, the GDPR has been highly embraced by big tech companies, because it provides a framework that they can largely work within. Now, that's not true across the board, and we've seen recent instances, for example the development of the Meta AI tool, in which Meta sought to roll it out in Europe under a legitimate interest legal basis, and the regulators there said that's not sufficient.
Carly Kind : [00:10:35] They've had to stop the rollout of Meta AI. For the most part, though, the GDPR may have effectively entrenched some of those larger actors who are able to feel comfortable complying with risk-based legislation, whereas smaller entities tend to over-comply because they can't operate with that same level of confidence. So that's at the infrastructure level. But at a day-to-day level, I would say, having lived in the European Union for 15 years and just moved back to Australia, I notice daily the way in which my data is extracted here in a way that it wasn't when I was living in the UK. Really small things: if you want to join a public Wi-Fi network, often you're asked for your date of birth or your full name. That simply would not be sustainable in the EU. When I buy a product online, I'm often subscribed to the newsletter and receive direct marketing for many months afterwards. I notice more scam emails. I notice more unsolicited text messages and that kind of thing. So at a basic consumer level, I definitely notice the data exploitation here. And I think that is because the regime in Europe has a legal-basis approach to the collection and use of personal information that's heavily grounded in consent, which is not the case under Australian law.
Dr Miah Hammond-Errey: [00:11:58] You were appointed to the role of Privacy Commissioner in February this year. This saw the role reinstated to a standalone position. What do you think this says about the ambition of this government? And how can that be reconciled with the long-overdue reform and the delay of the new legislation?
Carly Kind : [00:12:14] I think the ambition of the government is clear, and the Attorney-General has been really committed to this issue for a really long time. I think he, in particular, has a very personal commitment to privacy as an issue. The reinstatement of the Privacy Commissioner as a standalone position is one element that evidences that, as are the recent developments around the Privacy Act review. The question of how that then moves into legislation is, of course, again one of government priorities. It's outside of my remit to speak to that, and the answer is I don't have visibility into how it fits alongside other government priorities. I would note that it's not an insubstantial legislative change. It potentially has effects for businesses across the economy, and I can imagine that that is part of the weighing calculation that's being factored in.
Dr Miah Hammond-Errey: [00:13:05] Obviously privacy reform is long overdue. Do you think it's feasible to expect that this new legislation will move the needle in the Australian context?
Carly Kind : [00:13:08] I think it will depend on what is contained in the legislation. I think if the entire ambit of the Privacy Act review were to be taken forward, it would be a really big step change for the Australian economy, for a few reasons. One is that currently there are around 3 million Australian businesses not covered by the Privacy Act.
Dr Miah Hammond-Errey: [00:13:29] And that's because they're small to medium.
Carly Kind : [00:13:31] Correct. So bringing those within the remit of the act. I mean, most of those organisations are probably, I would hope, embracing good data governance practices simply from a risk mitigation perspective, but undeniably that would instil a culture of privacy across the Australian economy that doesn't currently necessarily exist. Secondly, I think there are some key elements that were raised in the Privacy Act review, which may or may not be taken forward in legislation. One is the fair and reasonable test, which I think is a really novel approach to legitimacy around data handling and collection. It goes in a different direction to the GDPR, so it doesn't say consent is the main legal basis, but in a way it puts the onus back on entities themselves to really ensure that what they're doing with personal information passes the smell test, as it were. I think that could be a game changer, actually, in terms of instilling good privacy practices. And then there's a range of other things in the proposals. One is the Children's Online Privacy Code, which would for the first time enshrine specific privacy protections for children under Australian law, which don't currently exist. And I think that could also be a really significant change.
Dr Miah Hammond-Errey: [00:14:43] We've seen global efforts to regulate data privacy, but none really appear to have got the settings quite right. What do you think are the most important settings here in the Australian context?
Carly Kind: [00:14:54] I think you'll be unsurprised to hear, and it's a feature of everything I have to say, that I think there has to be an ecosystem; there has to be a patchwork of different provisions or standards that come together. One of them is around powers for the regulator. Of course, that's front of mind for me at all times. As a privacy regulator, we have some quite novel powers, including our determination power, but our ability to seek penalties in the court is relatively restricted. We have quite a high bar that we have to meet, which is serious and repeated. So the Privacy Act review proposes a range of tiers of penalties that would enable the Office of the Australian Information Commissioner to bring more enforcement actions, with a wider spread of potential impacts and severity, which we would hope then has flow-on effects in terms of deterrence and education for the regulated community. So that's one aspect. I think the fair and reasonable test is a novel introduction. It's not something that other jurisdictions have tried, and that means it poses potentially quite an interesting opportunity to ask: is this a better route into developing better privacy practices?
Dr Miah Hammond-Errey: [00:16:09] I've heard you say that we can expect a new statutory tort of privacy that would include things like rights to erasure and rights to de-indexing on search engines. I wanted to understand a little bit more about the likelihood that we would see something like that.
Carly Kind: [00:16:25] So the statutory tort is a direct claim that individuals can make to the court about their privacy being infringed. And it will actually relate to things outside of personal data; it will cover privacy more broadly. For example, if your neighbours are spying on you, a statutory tort would be a route through which you could potentially seek remedies. But equally, you could bring a claim against Meta for misuse of your personal information via the same legal avenue. That tort was originally designed by the Australian Law Reform Commission in 2014, when Mark Dreyfus was also the Attorney-General at that time, and it was part of his initiative that led to that. So that is an idea that has been around for more than ten years, and the Privacy Act review agreed that it should go forward. We don't know if it will look like the ALRC version or not. The right to de-indexing and the right to erasure would be changes to the Privacy Act that would potentially come through Privacy Act reform; the way in which they were scoped in the Privacy Act review is that those would be specific provisions modelled on…
Dr Miah Hammond-Errey: [00:17:33] Thank you, good clarity. Privacy does get down in the weeds and a little bit complicated at times. You've previously called for new privacy laws, obviously, and at Senate hearings highlighted that the extractive data collection and online tracking system really requires a systemic response to fix, something we're hearing across the board in many tech-related policy issues. Given the ubiquity of data and the pervasiveness of the business models that rely on it, do you actually think it would be possible to achieve meaningful privacy protection without business model change?
Carly Kind : [00:18:05] When it comes to big tech, it's hard to imagine meaningful changes to individuals' privacy within the confines of the current business model. It depends on the product, of course. Apple, because they have a hardware business that is an alternative to an advertising-driven business model, are able to roll out more privacy-preserving features. So we're seeing quite good steps in some respects from them, in terms of things like the Do Not Track feature on apps. But I do struggle to imagine what an advertising-driven data ecosystem that is also privacy-preserving looks like.
Dr Miah Hammond-Errey: [00:18:45] You've signalled a stronger enforcement role in privacy protections. And in June, the Information Commissioner commenced civil action against Medibank under the Privacy Act for failing to protect the personal information of nearly 10 million Australians. Why is this action so important, and what can we expect going forward?
Carly Kind : [00:19:00] So this is the third civil penalty proceeding commenced by the OAIC in its history. It pertains to a data breach that affected 9 million Australians and had really serious effects for quite a lot of those people, not least because the information that Medibank held included really sensitive health information. So this is an important action because we are really seeking to establish that, although there are cybersecurity risks inherent in our digital economy that we cannot fully mitigate, and cyber criminals are increasingly sophisticated and we cannot prevent all actions by those actors, there is an obligation on entities to take reasonable steps to secure personal information, and that is a legal obligation, not a nice-to-have. We believe that entities need both to invest in reasonable steps and to be able to establish that those steps are reasonable in all the circumstances, including the type of information they hold, the size of the entity and their revenue. So we really want to set a standard, and send a very clear message, that this is not an issue that can be deprioritised or ignored at the board table; it is an area in which companies have to invest. We are not seeking to penalise entities who've been subject to data breaches, but certainly we are seeking to penalise entities who do not take reasonable steps to secure personal information.
Dr Miah Hammond-Errey: [00:20:38] On the 21st of August this year, the OAIC dropped its action against Clearview AI. In a world where digital privacy intrusion is rife, what factors do you prioritise when taking regulatory action?
Carly Kind : [00:20:51] Yeah, so there are a range of factors, and we have to weigh them in every single case. We have a statement of regulatory approach which sets out some of those factors. For example, the types of matters that we will prioritise include those that involve persistent, systemic or egregious violations of privacy; those where there is serious harm to vulnerable people; and those where our action is going to result in changes, including to market practices. That's a really key element: by taking this action, are we actually going to get an effective change for the Australian community, and will the investment of resources by our office be commensurate with the change that we're able to effect through this process? So those are some of the factors we take into account. In the Clearview matter, just to clarify, we undertook an investigation against Clearview. We made a determination with a range of findings that they were unlawfully contravening the Privacy Act and that they needed to take certain steps. That determination still stands. They challenged that determination in the AAT and pulled out before the proceedings finalised. The determination exists and they need to comply with it. The balance we're striking is what lengths we go to to ensure that a company that doesn't have a physical presence in Australia, and that is the subject of many regulatory interventions around the world, is complying with our determination. We have thousands of privacy complaints afoot; to what extent do we pursue that enforcement angle?
Dr Miah Hammond-Errey: [00:22:24] Privacy and security are often presented as polar concepts, creating regulatory and social tensions. My experience and research suggests that isn't actually the case, but in fact, they are interrelated and essential in a democracy. How do you frame the issues?
Carly Kind : [00:22:40] I completely agree that they are essentially interrelated. I think there's actually a really interesting common goal between many in the privacy community and the security community, who certainly agree that vulnerabilities in technical systems and excessive data collection all create risks when it comes to cyber security and physical security as well. And even though privacy and security are often framed as being at loggerheads, particularly in the encryption debate, I think most people in the national security realm would accept that encryption is an essential component of national security, just as many in the privacy sphere think encryption is an essential privacy protection. There is a small area of friction there, which is around access to information for criminal investigations. But for the most part, I think the two are reinforcing.
Dr Miah Hammond-Errey: [00:23:42] In a digital era, the tools of surveillance, tracking, targeting and oppression are actually available much more broadly, and without the oversight that you'd see in the national security agencies in Australia and other democracies. No one is saying the oversight is perfect, but it certainly acts as a protective mechanism for Australians. Are you concerned about the security risks associated with privacy intrusion by non-state and other foreign actors?
Carly Kind : [00:24:10] I think that is a huge problem, and I would connect it to the trade in surveillance technologies worldwide, where you see off-the-shelf tools, from malware right through to apps that can hijack an entire device, being very freely available on the open market, both to corporate actors and to other governments without the same protections in place. For me, that again reinforces the crossover between the privacy and security communities' objectives. All of this leaky, bad technology that's enabling data to go all sorts of places where individuals don't want it to go creates privacy risk, and it creates security vulnerabilities. Those vulnerabilities can be exploited directly by governments, and they can be exploited by third parties that governments are using. So anything that contributes to this economy where our data is everywhere is, I think, both a privacy and a security risk.
Dr Miah Hammond-Errey: [00:25:00] If my LinkedIn feed is anything to go by, regulators across information, privacy, online safety and competition are working more closely together than ever before. Is this an anomaly, or are we seeing increased collaboration?
Carly Kind : [00:25:21] I think we're definitely seeing increased collaboration, and we're not the only jurisdiction doing this. We're certainly inspired by what's happened in the UK around the Digital Regulation Cooperation Forum. They've gone one step further and established essentially a kind of joint staff across the, I think, four regulators there. In Australia, it's still at the level of a meeting of minds of leadership. We also have working groups that our staff jointly run. And there's clearly a lot of crossover, particularly when we think about the big digital platforms and how to ensure that they are properly regulated. It's really hard to do regulatory cooperation, I think, and in particular it's hard to take the higher-level ambitions and bring them right down to the level of enforcement actions, for example. For me, that's the next ambition: how do we do a joint enforcement action in this space? And I think my colleagues at eSafety and the ACMA share that ambition.
Dr Miah Hammond-Errey: [00:26:25] The podcast has a segment in 2024 called Interdependencies and Vulnerabilities. What are some of the interdependencies and vulnerabilities of privacy and technology you wish were better understood?
Carly Kind : [00:26:35] I think the biggest interdependency issue for me is that of vertical integration, and the way in which the acquisition of personal information through one product owned by a company enables them to entrench their position of power and dominance in that market, to channel you through to other products, and to consolidate you within their platform ecosystem. My concern is that I see that emerging in the AI space, where the major tech platforms, by virtue of the fact that they have access to so much personal information, are the ones able to develop the large foundation models which, going forward, will underpin many of the technology products we all start to use. I think that is a big interdependency issue that is really of concern. We were fortunate at the OAIC to have Dr Michael Veale from UCL give us a lecture this week, and he spoke to us about AI supply chains and the way in which responsibility and accountability are diffused amongst so many different actors. Unpicking those supply chains is really, really difficult, certainly from the consumer perspective, but also from a regulatory perspective.
Dr Miah Hammond-Errey: [00:27:53] On to the segment about alliances. You mentioned before that regulatory collaboration is very difficult. What alliances or collaborations do you think will be most important in the privacy and tech space in the next few years?
Carly Kind : [00:28:07] The big alliance that I'm focused on is that between regulators, academia and civil society. I think the nature of privacy harms and privacy issues going forward is going to be so complicated, so obfuscated and so technical that regulators like the OAIC, with fewer than 200 staff across both the freedom of information and privacy jurisdictions, are simply not going to be able to build the internal capabilities to really understand and target the most problematic privacy practices. Therefore, we have to lean on and leverage the work done by others who do have those skills, including technical skills, in the academia and civil society communities. And I think there's lots we can take from other jurisdictions; this has been much more common, for example, in the competition jurisdictions, and we need to really learn how to do that kind of collaboration across sectors.
Dr Miah Hammond-Errey: [00:29:09] On to a segment about emerging technology for emerging leaders. What do you see as the biggest challenges for leaders in the current technology and privacy environment?
Carly Kind: [00:29:19] One of the biggest challenges, I think, is the cross-jurisdictional issues that come from trying to legislate and regulate practices undertaken by companies that aren't based within Australia, given the confines of our legislation and of others' legislation. I think we saw this play out really interestingly recently with the eSafety Commissioner's action against X, around their issuing of takedown notices that weren't complied with by X. eSafety sought to have those notices enforced by means of injunction, and the Federal Court essentially refused to do so, because of the challenges of enforcing injunctions across borders and the risk that doing so might impact on what's called the comity of nations. I think that's a really big bug in the machine when it comes to trying to use what are, in Australia, often really novel or ambitious enforcement powers against companies that aren't domiciled, or aren't solely domiciled, in Australia and whose operations cross borders. Even technical issues, such as the use of website caching to ensure that data is accessible from the most proximate geographical point, create really big barriers to enforcing certain regulatory orders.
Dr Miah Hammond-Errey: [00:30:48] Coming up is a segment called Eyes and Ears. What have you been reading, listening to or watching lately that might be of interest?
Carly Kind: [00:30:54] I am a big podcast person, and I am also a mum of three kids, which means I have no time to do anything other than work and look after my kids. So I get all my information from podcasts while commuting, cleaning the house, cooking dinner, etc. I am also a big politics nerd, so I love all politics podcasts, but in the tech space I very much like Hard Fork from the New York Times to keep up with developments in technology. I also really like the Ezra Klein podcast. And I religiously read the newsletter Exponential View from Azeem Azhar, who is a former economist, journalist and now investor. He is the person who keeps me ahead of the curve on technology developments.
Dr Miah Hammond-Errey: [00:31:41] You've alluded to it, but we'll go to the segment called Disconnect. How do you wind down and disconnect with three kids?
Carly Kind: [00:31:47] I don't think it's possible. If it is, please let me know. I'm in the trenches of parenting at the moment, so the only time in my day I'm able to carve out to try to disconnect is going for a walk at about 5:00 in the morning before the kids wake up, and I'm lucky to live near the beach, so I've been trying to do that. I'm also trying to make myself listen to music and not podcasts all day long.
Dr Miah Hammond-Errey: [00:32:10] What are some of the new technologies you see impacting privacy and security, or are there any things you're concerned about?
Carly Kind: [00:32:17] I'm very preoccupied by generative AI, like everybody else is at the moment, and trying to think not only about the privacy implications, but about what achieving privacy looks like in a world in which gen AI is prevalent. At quite a high level, I've been trying to think about that. I'm pretty obsessed with this notion that we're running out of data to train foundation models, and what that means for the very fundamental issues we've been discussing here today, given that most of that data only exists because of these problematic business models. What might the drive to further AI incentivise in terms of data creation and collection going forward? I certainly have my eye on neurotechnologies as an emerging issue that will create real privacy issues and concerns. I don't feel on very firm ground yet in exploring what that looks like, particularly from the perspective of a regulator, but I have a sense that it's going to become a bigger and bigger issue.
Dr Miah Hammond-Errey: [00:33:20] It's a huge question, but what are some of the implications for global tech companies in the use of their technologies and personal data in war?
Carly Kind: [00:33:27] I don't know. I find this issue really, really tricky. The infrastructural power of technology is huge, and it is not captured adequately by things like privacy law. Privacy, and particularly data collection, is one input into the computational infrastructure and computational power exercised by big tech companies, but this is really in the realm of politics and power. They are clearly shaping society; they are not mere intermediaries, and I think that's a really important thing. I struggle to think about technology as a lever to pull to undermine or prevent really problematic exercises of state force. It feels a bit like going after Al Capone for his taxes when we should be looking at the big issues of contravention of international law and the use of state power.
Dr Miah Hammond-Errey: [00:34:35] Final segment is Need to Know. Was there anything I didn't ask you that would have been great to cover?
Carly Kind: [00:34:40] I think you've covered everything we could possibly talk about.
Dr Miah Hammond-Errey: [00:34:45] Thank you so much for joining me. It's a real pleasure to have you here.
Carly Kind: Thanks for having me.
Dr Miah Hammond-Errey: Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me at Miah_HE or send an email to the address in the show notes [techandsecurity@stratfutures.com]. You can find out more about the work we do on our website, also linked in the show notes. If you liked this episode, please rate, review and share it with your friends.