Miah Hammond-Errey

Full Text of Opinion & Commentary

The data economy is an inherent security flaw

Published in Canberra Times, 5 July 2025

ORIGINAL: Australian political, defence and government leaders are vulnerable to attack

Foreign states, including China, as well as terrorists and criminals, obtain extensive data on Australian defence, security and political leaders with ease and, increasingly, in real time. This poses an unacceptable security risk that will grow as geopolitical tensions rise.


The recent shootings of Minnesota representative Melissa Hortman and her husband are a chilling reminder that the tools of surveillance and targeting are available not just to foreign adversaries but to anyone with malicious intent. Notebook excerpts from the alleged shooter, Vance Boelter, presented in a court filing show he used data brokers and aggregators to find the home addresses of his intended targets. Politico reports that police found the names of 11 registered data brokers in Boelter’s abandoned car after the shootings.


Australians are tracked nearly 500 times a day as just one form of advertising technology – AdTech – broadcasts what a person in Australia is reading or watching, and where they are. This advertising model enables precise tracking and targeting of individuals and sells that information on an open market. The risks of such data use in conflict have been acknowledged for years but remain unaddressed.


AdTech is active on almost all websites and apps. One type, known as real-time bidding (RTB), is an online auction: an invisible, instantaneous and automated process that happens in the microseconds it takes a webpage or app to load. To facilitate the bidding for advertising, it broadcasts personal and sensitive data about Australians – including geolocation data – without security measures to protect that data.
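
To make the broadcast concrete, here is a minimal sketch of the kind of bid request an ad exchange sends to every bidder in an RTB auction. It is loosely modelled on the public OpenRTB specification; all values are invented, but the categories of data shown (page URL, device advertising ID, IP address, latitude and longitude) are the kinds the format carries.

```python
# Hypothetical, simplified RTB bid request, loosely based on the OpenRTB format.
# Every participant in the auction receives this payload, win or lose.
bid_request = {
    "id": "auction-8f3c2a",  # unique ID for this single ad auction
    "site": {
        "page": "https://example.com/article-being-read",  # what the user is reading
    },
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID, persistent across requests
        "ip": "203.0.113.7",
        "geo": {"lat": -35.28, "lon": 149.13},  # device location (illustrative coordinates)
    },
    "user": {
        "id": "exchange-assigned-user-id",  # pseudonymous ID, matchable against broker datasets
    },
}
```

Because the same advertising ID recurs across thousands of such requests, any auction participant can stitch them into a longitudinal record of a person’s movements and reading habits, whether or not it ever buys an ad.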


RTB happens billions of times a day, and the many players in the auction receive the data even if they don’t purchase the advertising. Australians are exposed in this way 3.7 trillion times a year – and that may be only half of it, as figures from Amazon and Meta are not available. It is a profitable business model that siphons and resells the data. The way the data economy functions is an inherent security flaw and a national security problem that simply cannot be overstated.


Possible data harvesting from TikTok, DeepSeek and Chinese EVs has sparked significant concern globally. But while custom-built apps make surveillance easier, foreign powers and malicious actors don’t need them to surveil our political and defence leaders, or to commit the rising threats of espionage and foreign interference ASIO’s Director-General warned of this year. They just need widely available advertising data. We now have evidence of how it will be weaponised as geopolitical tensions and global uncertainty increase.


A recent research paper shows how data about Australian defence personnel and political leaders flows to foreign states. I’ve been warning about this for a long time, detailing the risks in my book, Big Data, Emerging Technologies and Intelligence: National Security Disrupted, published in January last year.


Defence and intelligence personnel – and even politicians – are people too. Just like the rest of us, they binge-watch Netflix, work out and shop online, leaving a digital trail that creates a dossier of information.


Foreign states and non-state actors can use this data to spy on individuals’ movements, financial problems, mental health and intimate details, such as whether they have experienced sexual assault. Even if individuals use secure devices, data about them will still flow from personal devices and from their friends, family and contacts. This exposes individuals and organisations to severe security risks. There is evidence partners and children are already being targeted.


We now know that Australian defence personnel and national leaders are being actively categorised and targeted using RTB data. As the Minnesota shootings show, this represents a genuine threat to legislators. In conflict, it will be more acute.


As global tensions escalate, advertising data will be used to target defence, intelligence and government leaders and institutions for espionage, foreign interference, or to dox individuals and pressure political change. Companies, leaders and supply chains will be vulnerable too, as will spouses and children.


Data protection is not just a matter of personal privacy but of national security and resilience. Disrupting the pervasive data economy would reduce serious security vulnerabilities and uplift privacy protections for all of us. It would also reduce the growing threat of algorithmic influence and foreign interference capacity inherent in our current technology ecosystem.


National Cyber Security Coordinator Michelle McGuinness told me it is not possible to make cyberspace secure without meaningful privacy protections. Australia’s Privacy Commissioner, Carly Kind, concurred: “vulnerabilities in technical systems and excessive data collection create risks to cyber security and physical security as well.”


No single group should have unlimited or unregulated access to our data and how we feel and react, as I have written previously. Currently, data is easily available for anybody willing to pay for it – including foreign states – to aggregate, analyse and exploit, for any purpose.


We need urgent action on tranche 2 of privacy reform to address the issue. There’s a myth that carveouts are technically possible or practically feasible. In reality, we protect all of us or none of us. Yet it is possible to create different kinds of digital infrastructure – to rewire technologies to protect privacy and enable digital advertising that doesn’t “throw us to the wolves”. Indeed, it is necessary to do so.


Without strong privacy protection and a rewiring of technologies to reflect democratic values, our political, defence and government leaders are vulnerable to attack now. This will only worsen as geopolitical instability and the likelihood of conflict increase.

The only privacy we have left is what’s in our heads...

Published in The Age, 1 February 2024

Your brain might be your greatest asset – and your most private. But soon, your brain’s data could be at the centre of a new wave of intellectual interference and disinformation – one that could exceed the fears around generative AI.


Though neurotechnology, which can read and even influence the brain’s patterns and electrical activity, has existed for some time in the medical context of monitoring neurological conditions, consumer devices will soon hit the mainstream. Apple’s Vision Pro, which offers rudimentary neurotechnology through iris-tracking cameras, will be released on February 2, while Meta’s neural smart watch is expected in early 2025.

Recent advances in neuroscience and AI are paving the way for improved consumer applications. This includes complex and invasive BCIs – brain-computer interfaces – like Elon Musk’s Neuralink, which has just announced a successful brain implant that could, for example, let a paralysed person type or play video games.

This new era of consumer neurotechnology will herald unprecedented disinformation, influence and interference.

As the last piece in the jigsaw puzzle of data collection, neurotechnology and the brain data it collects offer those who own and access this information insight into an individual’s thoughts and emotions. It goes beyond the likes, swipes and clicks to capture unconscious reactions, individual bias, feelings, preferences and emotional states.

Disinformation will be supercharged by the introduction of neurotechnology. Generative AI – algorithms such as ChatGPT that draw on vast training data to generate streams of “new” content using prompts – can rapidly create and distribute disinformation tailored to the individual. However, neurotechnology and AI will build on the vast data already in existence to customise and adjust content to cater to each individual’s specific feelings and current emotional state – offering the ability to create a feedback loop fuelled by brain data. A simple example: while scrolling through your news feed, you have a positive emotional response to reading a statement of hate speech. A social media platform could then set its algorithm to feed your brain more of the same content to elicit the positive response again.
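
As a purely hypothetical illustration of that feedback loop, the sketch below shows a toy ranking update in which an invented on-device “affect score” reinforces whichever content category elicited a positive neural response. Every name and signal here is made up for illustration; no real platform or device API is implied.

```python
# Hypothetical sketch of the brain-data feedback loop described above.
# All names and signals are invented; no real platform API is implied.

def update_feed_weights(weights, item_category, affect_score, learning_rate=0.1):
    """Nudge a content category's ranking weight toward whatever elicited
    a positive neural affect response (affect_score in [-1, 1])."""
    weights[item_category] = weights.get(item_category, 0.0) + learning_rate * affect_score
    return weights

# One loop iteration: the user reads an item, the headset reports affect,
# and the ranker reinforces the category that produced the reaction.
weights = {"news": 0.5, "outrage": 0.5}
weights = update_feed_weights(weights, "outrage", affect_score=0.8)
print(weights)  # {'news': 0.5, 'outrage': 0.58} -- more of the same gets served
```

The danger described here is precisely this closed loop: the signal being optimised is no longer a click but an unconscious emotional reaction.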

As Duke University law professor Nita Farahany explains, the commercialisation of this technology poses serious risks. “You have the same tech company that has access to ... brain data that’s [also] driving the environment that the person is operating in, like an immersive or spatial computing environment, where it almost becomes like people are in a matrix that is operated by a foreign adversary.”

Until now, much of the policy conversation around addressing these dangers has focused on areas such as content moderation and dealing directly with the platforms that enable the dissemination of generative AI-produced disinformation. However, with the introduction of neurotechnology, it is the cognitive liberty and resilience of individuals that can be manipulated through changes in algorithms that respond to inputs from brain data.

“I fear that’s an insidious way of really getting at social control in ways that would be very difficult to counteract and to even detect,” says Farahany, a leading scholar on the ethical, legal, and social implications of emerging technologies. “To me, cognitive liberty is a national security requirement for countries to start to embed into products and to say brain data is uniquely sensitive and uniquely valuable … It needs to live on device. It needs to live in the hands of individuals.”

With 2024 set to be the biggest election year in history – more than 50 per cent of the world’s population will be casting their votes – disinformation will be front-of-mind for the protection of democracy. Yet global efforts to date have struggled to tackle this issue. Without urgent action, the risk of supercharged disinformation and interference – powered by brain data and generative AI – will present a grave challenge to the digital landscape and democracy worldwide.

Preparing for the widespread adoption of consumer neurotechnology is critical now. As it becomes commercially available, appropriate protections must be in place. We must recognise the significance of brain data – arguably the most intimate form of data available – with a particular focus on its collection, storage and use. This includes who controls that data and how they employ it.

No group should have unlimited or unregulated access to our data and how we feel and react. The level of influence that this affords – be it to large technology companies that collect much of our data and operate our social media feeds, to governments, which hold many of our other important credentials, or to any other group – is too great.

We could ensure, for example, that brain data remains, and is processed, on-device, and that only the data needed to operate interactions – such as whether to swipe right or left, or click – is accessed, with the rest staying with the user. Updating international human rights frameworks to consider cognitive liberty will also be an important step. Irrespective of how it is achieved, we must ensure rigorous regulation of our brain data and its use – both now and into the future. Not doing so risks significant breaches of privacy, discriminatory conduct, waves of disinformation and the loss of the final frontier: our cognitive liberty.
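
A minimal sketch of that on-device design, with every name hypothetical: raw neural readings are classified locally, and only the resulting discrete interaction ever leaves the device.

```python
# Hypothetical sketch of the on-device design described above.
# Raw neural readings are processed and discarded locally; only the
# discrete interaction they map to (a swipe or click) is transmitted.

from typing import Literal

Action = Literal["swipe_left", "swipe_right", "click", "none"]

def classify_intent_on_device(raw_neural_samples: list[float]) -> Action:
    """Toy stand-in for an on-device intent classifier; a real system
    would run a trained model locally, but a threshold suffices here."""
    signal = sum(raw_neural_samples) / len(raw_neural_samples)
    if signal > 0.6:
        return "swipe_right"
    if signal < -0.6:
        return "swipe_left"
    return "none"

def handle_interaction(raw_neural_samples: list[float], send_to_platform) -> None:
    action = classify_intent_on_device(raw_neural_samples)
    # The raw samples go out of scope here and are never transmitted;
    # the platform sees only the resulting action.
    if action != "none":
        send_to_platform(action)

handle_interaction([0.7, 0.8, 0.75], send_to_platform=print)  # prints: swipe_right
```

The design choice mirrors Farahany’s point that brain data “needs to live on device”: the platform’s interface contract is a discrete action, never the underlying signal.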

You’re being watched: How Big Data is changing our lives

Published in Sydney Morning Herald, 2 February 2022

Your phone lights up with a notification. “COVID-19 case alert. You checked in to X location on Y date around the same time as a COVID-19 case. Unless otherwise advised, you must monitor for symptoms. If you are unwell, get a COVID-19 test.”


You pick up your phone and wonder whether to panic.


Global studies showed that even in 2016, users were interacting with or ‘touching’ their phones thousands of times per day. COVID-19 has exacerbated that trend, with phones now used for everything from check-ins and vaccination records to banking, ordering food, working and socialising. If you’re anything like the average Australian, you’re spending five and a half hours a day on your phone.


We often think about online data from a personal privacy perspective, which is of course profoundly important. However, it is also important to understand how our collective reliance on data infrastructure and our participation in the digital economy are forming the backbone of new economic, political and social power – what I call, in a paper published on Wednesday by the Lowy Institute, “the big data landscape”.

The big data landscape has embedded new structures for unprecedented collection, aggregation and analysis of data about almost every aspect of our lives, as well as new ways to keep us continuously connected. All this data tells a rich story about who we are and what we do. It can provide real-time updates about every aspect of our lives – our physical movements, finances, sexual preferences, mental wellbeing, friends and desires.


Big data democratises capabilities for targeting and surveillance – functions that were previously performed by nation states. The big data digital footprint and the infrastructure used to analyse it are predominantly owned by commercial entities, meaning that the data – and the ability to derive insight from it – largely reside in the private sector.


Much of this collection occurs within companies that monetise their user data, and much of it is available for purchase. The sheer amount of data – often accessible in real time – creates uncertainty over when, where, and by whom aggregation of that data, as well as targeting and surveillance, can occur.

The enormous national security implications of near-complete data coverage of human lives are underappreciated in policy and public commentary. For countries and their governments, it means that data can be used to target and surveil citizens, politicians, journalists, national security workers and the military. This ‘democratisation’ of targeting and surveillance means that these capabilities could be available to anyone who can collect or buy data – private companies, malicious actors, other nation states, or some combination of these.


Around the world, and possibly exacerbated by COVID-19, employer surveillance is on the rise, including more wide-ranging – and oppressive – monitoring. In some places, this takes the form of comprehensive surveillance covering worker location, physiological signs, time on task and performance management – as well as algorithmic payment and firing. High-end spyware such as Pegasus has been used to spy on everyone from journalists to human rights activists, government ministers, diplomats and businesspeople, in both democracies and autocratic regimes.


In some places, there is less distinction between state and non-state actors. Take the extensive surveillance and detention of Uighurs in Xinjiang, enabled by a combination of big data collection and data fusion, among other forms of surveillance. We also see China harvesting masses of data on Western targets.


The big tech companies that have successfully capitalised on data abundance, digital connectivity and ubiquitous technology are the new oligarchies, increasingly controlling the capabilities essential for a functioning society. Big data has concentrated information flows, critical data sets and the technical capabilities essential for functioning democracies in the hands of a small number of commercial entities. Since virtually all information is now controlled by, or passes through, these large companies at some level, this fundamentally changes the security landscape for government and has created new power dynamics between governments, citizens, companies and nation states.


Phones, computers and internet-connected devices (such as Fitbits and ‘Alexa’) are so deeply embedded in our lives and have so much power in society that it is easy to forget many are barely older than teenagers. Google started in 1998. Facebook is 17, YouTube is 16 and the iPhone is merely 15 years old. These powerful entities have created a new information and infrastructure landscape with minimal oversight from national governments.


Technical developments occur much faster than regulation can keep pace with, and very little regulation occurs before real social harms manifest. The challenges presented by big data are whole-of-society ones that now require new ways of thinking and responding.


Many of Australia’s regulatory approaches necessarily look at single aspects of the big data landscape, such as the ACCC’s inquiry into digital ad services. But big data has created a new landscape comprising data abundance, digital connectivity and ubiquitous technology.


There is no doubt big data holds immense promise and possibility for positive change. However, it is also changing society and national security in ways that need consideration and response. It’s time to turn our focus to using innovative and exciting technologies in ways that ensure Australia is secure and we create opportunities for all Australians.

Copyright © 2025 Miah Hammond-Errey - All Rights Reserved.

