Zac Rogers is an academic from Adelaide, South Australia. His research combines a traditional grounding in national security, intelligence, and defence with emerging fields of social cybersecurity, digital anthropology, and democratic resilience, working closely with industry and government partners across multiple projects. Parasitoid is his first book.
Every technology comes to be used to meet the needs of its time. Those needs interact with the often hidden-from-view affordances that reside in the tech to radically skew the intentions with which it might have been conceived and developed. When those affordances are geared to exploit network effects, and lock-in is pursued by monopolists at scale, no one can predict what happens next. Need trumps everything. A 2003 independent report commissioned by the Pentagon [1] was tasked with considering the geopolitical and national security implications of a worst-case climate change scenario. Widely ignored and roundly dismissed at the time as alarmist (it was alarmist by design), the report’s premises, nearly two decades on, now fall easily within the scope of plausible near-term scenarios.
Historical evidence suggests that long periods of slow warming are consistently followed by dramatic falls in global temperature. Scientists believe these sharp falls are caused by the shutdown of the thermohaline conveyor – the current that mixes and moves heat around the world’s oceans. Trapped heat means a hotter, wetter equator and more polar ice, while the intermediary zones – the site of all of the world’s grain production – become windier and drier.
The impact on food production and transit is catastrophic. Before rapid and severe climate change means anything at all, it means a precipitous decline in the earth’s carrying capacity. In other words, mass starvation. Mass starvation means the mass uncontrolled movement of people. The report’s main prediction was conflict and danger spreading like a Californian wildfire.
The report was released when the United States government was readying itself to invade the sovereign state of Iraq. The administration saw two main opportunities. First, control of Iraq’s large oil reserves would be increasingly important for US security-of-supply in the coming years. Second, at the Pentagon under Secretary Rumsfeld, a new way of war was being conceived. Large bases and heavy boot prints were to be replaced with a type of bit-torrent war, enabled and driven by the new era of networked digital telecommunications. Smaller and more agile forces, able to move, assemble, and disassemble rapidly anywhere on the globe, fed situational awareness by a global information grid, formed Rumsfeld’s vision.
9/11 created new needs and accelerated trends already underway. The information age created novel challenges for national security, not the least of which was what to do with all this information. Would it actually be more useful? Or more of a hindrance? Sometime in the first decade of the twenty-first century, human civilisation swept past an inflection point with regard to information and knowledge. The vast tail of history before the Internet harboured an information scarcity problem. In a vertigo-inducing heartbeat, that became an overload problem.
A solution had to be sought that would prevent the generational wealth invested in digital technologies by the United States from becoming a wasting asset. Artificial intelligence, or more accurately, statistical inference software capable of inferring patterns within large digital data sets, became the Emperor’s New Clothes. Iraq and Afghanistan would be its military testing grounds. Blurring with the civilian domain, feedback loops that serve and return information based on past activity are the Internet’s basic sorting mechanisms. Enabling the statistical inference of these loops has been the killer app of the AI era.
Sorting information in such a manner comes at a cost. As Nicholas Carr wrote in 2011 [2], using the world-spanning, globe-connecting Internet has actually made everybody’s world a little smaller. That the Internet was causing changes in the brain was no accident [3]. The human cognitive system, with all of its bugs and vulnerabilities, was central to the largest growth industries of the early twenty-first century [4]. The commercial domain was its centre of innovation and growth, with government providing seed funding and often following on as a customer for its products and services. Unsurprisingly, science mixed with pseudoscience and commercial incentives in irreversible ways [5], producing endless tropes taken as gospel by the sector’s often breathless acolytes, particularly in bureaucracy and finance.
While commercial Big Tech and the national security state typically take the brunt of a growing backlash, less public attention has been directed at their primary sources of intellectual legitimacy. The streams of scientism that came to feed much economic and behavioural theorising, of the type awarded the Bank of Sweden Prize and lauded by no less influential institutions than the World Bank, have their origins in the world’s most prestigious universities. Harvard’s ‘Nudgers’, Stanford’s ‘Captologists’, and MIT’s ‘Social Physics’ cohorts appeared to take the digital information age as a sign that historically devastating critiques of behaviourism and positivism no longer applied or could now be more readily ignored. One reason might simply be that the demand for a datafied episteme was generated as a consequence of the oversupply of data; Goodhart’s Law be damned [6]. Another reason may be that when they surveyed their surrounds for informed political opposition to these historical tropes, they encountered a neoliberal wasteland propelling them forward.
For all the influence of academic theorising, commercial and financial imperatives ruled. A common trope about the virtues of widespread automation is the promise that it will ‘free up’ human beings to pursue more creative and productive ventures. On the contrary, automation-for-the-sake-of-it frees up human attention so that it may consume more distraction product. Why bother with rote tasks and tactile experience when all that surplus attention can be burned in capitalism’s new furnaces? The commercial titans of the digital age peddle distraction, not development. As John Gray wrote in 1998 [7], the chief engine of capitalism in the wake of modernity was the rising demand for divergence.
Bait-and-switch and a fat tail
Iran is now the controlling power in Iraq. The withdrawal of US and allied forces from Afghanistan heralds the end of an experimental period in American strategic culture for which mounting strategic costs are the main outcome. The rapid and severe climate change flagged in 2003 is playing out amidst an attempt by global corporate, financial, monetary, and bureaucratic authorities to shift away from the institutional mediation of resource allocation of the post-war era to an infrastructure of algorithmic mediation. The consequences of the experiment are unstated, yet clear. As climate crises and the impacts on human security play out, the command and control of human behaviour, most importantly human movement, will be pivotal in determining stakeholder advantage.
Every technology is used to meet the needs of its time. The intentions and dreams of its progenitors are irrelevant. And so it appears to go for the rise, fall, and return of behaviourism. The needs of the coming era for mass population control, amidst the declining conditions in human security, will become the defining character of a regime of digital technologies sold by corporate entities to the public-at-large as novel and convenient. Mass surveillance, behavioural prediction and modification, and various forms of cognitive simulation are already the chief manifestations of the digital economy.
Developed and scaled as advertising disruption, and as an expansion of the profiling and scoring industry boosted by the electronic telecommunication boom of the 1970s, digital ICTs curated by AI are uniquely applicable to automated command and control activities [8]. They harbour this affordance. In a stunning episode of bait-and-switch, scholars are now exploring how these technologies’ incursion on every facet of human behaviour and cognition, under the guise of market productivity and consumer want, has achieved scale. They need look no further than the early 2000s literature on how network effects achieve lock-in [9], which much of the industry read and appropriated.
We grab anything when we fall
These were business strategies for getting rich in the digital age before they were the tools and methods of command and control. Which only reinforces the point about technologies meeting the needs of their time, regardless of intent. One of the oldest and most widespread myths about technology is the gnostic belief that it harbours a type of mystery, which the adept society alone can tap and bend to their want. In fact, no such thing exists. The gnostic awe for technology is a dangerous spillage of a latent monotheism; it delivers to its believers exactly the same service: a view of history as having a hidden meaning.
Technology is more like detritus than magic dust. Its effects linger and distort, long after the devout are gone. As George Dyson notes, constant mediation by statistical inference machines is already distorting social, political, and economic relations in both open and closed societies [10]. For all the dystopian and futuristic themes popularly associated with high technology, the chief danger may simply be needless regression. As Jane Jacobs foresaw [11], when the tireless work of sustaining hard-won civilisational gains is subordinated to infantile fantasies of the future, backsliding will be fast and easy.
A darker reality, however, stalks modernity’s wake. The technologies of control which have been scaled and locked in under the yoke of techno-fetish are already the object of a quickening geopolitical contest. Apparently unable to conceive of an alternative, corporate and financial elites, and their often witless shills in government, have ridden late modernity like an express train heading for the edge of a cliff. Able but unwilling to change course or even slow down, they are now fully invested in preparing for themselves a survivable landing when the time comes to disembark.
There is not a single government of any consequence on the face of the earth that is ‘denying’ climate change. Those with any capacity to do so are preparing for the types of scenarios outlined in the Pentagon report. Powerful states such as Germany, Japan, China and the US will strategise to quarantine themselves from the growing disorder, ensure access to supply chains, resources and transit zones, and prepare to defend these advantages by force. Weak states with strong leadership, such as Russia, strategise to profit from a dangerous yet lucrative spoiler role under cover of nuclear arms. States of little or no capacity are left to twist in the wind.
The Emperor has no clothes
Statistical inference software, crawling over huge streams of data and predicting inscrutable futures, is an arresting vision of technological prowess. But it is a vision of absurdity. Data is a recorded digital abstraction of a state of the world past. As much information is missing as is present, perhaps a great deal more. AI, marketed as ready to solve climate change, conflict and scarcity on the same day, is not going to do these things. What it will do is what it is already doing, which is to distort and disable the capacity for any collective political response to the wants of capital.
The techno-fetish is a cul-de-sac both erected and defended by capital, which enables it to feed off its own waste. East and West. The wake of modernity accommodates a techno-political struggle that is little more than a harlequinade of stagnant and duelling monisms.
At least since the birth of monotheism, humankind has been intoxicated by an idea of itself unfolding in history. Such a vision provides unique salve to an intolerable condition to which every human is vulnerable: the possibility of meaningless suffering. The decline of formal religion only drove the intoxicated to the next well. Science and technology are now secular religions that supply the devout with stories of their place in history, and with reasons to regard said history as having order and meaning. Such an order forms an arc of history for the world-fixing, life-improving cohort, the wrong side of which can only be occupied in error. The arc includes and excludes ‘behaviour’ formulated with all the cultural force of pop science.
It is an unmistakably human folly, and for that, it must be forgiven. Cognitive dissonance is the condition of being human, not a behavioural bug that can be vaccinated against. But such folly comes at a grave cost. As John Gray has written, the need to posit meaning in suffering incurs a price in delusion [12]. As gnostic visions of technological oracles, supplying humans with supernatural power, become common tropes, the price steadily rises. Subordinating their cognitive capacities to a machine episteme has only prepared humans to be utterly unprepared.
Nature never locked in
1. Peter Schwartz and Doug Randall, ‘An Abrupt Climate Change Scenario and Its Implications for United States National Security’, October 2003, https://eesc.columbia.edu/courses/v1003/readings/Pentagon.pdf.
2. Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains, Updated Edition (W. W. Norton & Company, 2020).
3. Gary W. Small et al., ‘Brain Health Consequences of Digital Technology Use’, Dialogues in Clinical Neuroscience 22, no. 2 (June 2020): 179–87, https://doi.org/10.31887/DCNS.2020.22.2/gsmall.
4. Howard E. Gardner, The Mind’s New Science: A History of the Cognitive Revolution (Hachette UK, 2008).
5. Philip Mirowski, Science-Mart (Harvard University Press, 2011).
6. Zac Rogers, ‘Goodhart’s Law: Why the Future of Conflict Will Not Be Data-Driven’, Grounded Curiosity (blog), February 13, 2021, https://groundedcuriosity.com/goodharts-law-why-the-future-of-conflict-will-not-be-data-driven/.
7. John Gray, False Dawn: The Delusions of Global Capitalism (Granta Books, 2015).
8. Jeremy Packer and Joshua Reeves, Killer Apps: War, Media, Machine (Duke University Press, 2020).
9. Albert-László Barabási, Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life (Plume, 2003).
10. George Dyson, ‘Childhood’s End’, Edge (blog), January 1, 2019, https://www.edge.org/conversation/george_dyson-childhoods-end.
11. Jane Jacobs, Dark Age Ahead (Knopf Doubleday Publishing Group, 2007).
12. John Gray, Feline Philosophy: Cats and the Meaning of Life (New York: Farrar, Straus and Giroux, 2020).
13. Wallace Stevens, ‘Disillusionment of Ten O’Clock’, The Palm at the End of the Mind: Selected Poems and a Play (Knopf Doubleday Publishing Group, 2011), 11.