
Influence Wars

This article is based on a presentation I delivered as part of the Australian Strategic Policy Institute’s (ASPI) War in 2025 Conference, held in Canberra from 12 to 14 June 2019, where I was part of a panel discussing Information Age Warfare.

Over the course of Day 1 of #WarIn2025 much was discussed about technology – and where that path will lead us as we consider the challenges the wars of the future will present. In presenting on information warfare and information operations on the morning of Day 2, I challenged the audience to consider the human factors that make information such a potent weapon of modern warfare.

While the picture I paint may seem bleak, from the discussions had at the ASPI conference it is clear many organisations, academics and thought leaders are already contemplating the challenges ahead in the context of the human factors and cultural change necessary to improve the information environment.

Conventional warfare is something we’ve become very proficient in. But when it comes to adversaries and enemies that inhabit areas of operation we’re less comfortable in – such as the information terrain and social media, traditionally viewed as civilian spaces – we are in the fight of our generation to defend the democracy, multiculturalism and freedoms our ANZACs, allies and today’s warfighters have fought so hard to protect.

While we’re technologically progressing our capabilities across cyber, artificial intelligence, machine learning, sonic weaponry and genome-based biological targeting – just a few of the topics of discussion here at ASPI yesterday – there is, as several speakers in the afternoon panel sessions pointed out, a large culture shift that needs to occur to meet the challenges of the battlefield of the future.

Part of that culture shift is without doubt technologically led – but more importantly, it must be human-factor centric.

I think we can all agree that the information domain has irrevocably changed over the past decade. Today’s wars are fought to control hearts and minds – driving deep wedges into the existing divisions in our society, at times with horrifically violent outcomes.

By manipulating social cohesion and destabilising our perceptions of safety and security, information has become a powerful commodity – a commodity that is controlled extensively by the private sector and dominated militarily by non-state actors and insurgents.

It’s these actors that are creating new information ecosystems that defy conventional rules of engagement on an exponentially growing number of battlefields that permeate every facet of our existence.

Our area of operation isn’t confined to military theatres of war or even multi-domain operations – we are literally carrying the current and future area of operations around in our pockets on our smartphones. We are in the area of operation 24 hours a day, 7 days a week, 365 days a year.

Given our reliance on technology, it’s hard to mute the war being waged via our newsfeeds and on our social streams. In fact, our reliance on these technologies has cognitively wired[1] us into systems of repeated exposure to the echo chambers[2] that have been engineered around our likes, reactions and shares.

These used to be traditionally civilian spaces.

Al Shabaab and then ISIS changed that – and we can’t put that genie back in its bottle.

What the weaponisation of the online and social environments has shown us is that we are inherently uncomfortable in these spaces. We are unsure how to navigate the digital ethical minefields that bots, trolls and deepfakes present us with.

We muse over what the cost of fighting and winning is – in contrast to fighting and failing.

The cost of fighting in this terrain is fiscally low – but the moral damage and the ethical costs can be extreme and generational in their long-term effects[3].

We can’t legislate our way out of the issues technology presents us with.

Punishing technology giants for the way people choose to use their products is like blaming guns, video games and social media for all manner of societal ills. We’ve been down this path before – banning things doesn’t really work.

While efforts to prevent the rapid dissemination of abhorrent and objectionable material should always be made – blaming ‘technology’ for the actions of people doesn’t address the reality of the underlying situation.

We need to meaningfully grapple with the ‘why’ of how agents of influence achieve success in our region – whether via military, political, economic or diplomatic means – and look with introspection at the core of why these campaigns are effective.

Without the fractures that exist within our society[4], agents of influence would not have such a fertile proving ground for their campaigns.

We can unpack social and cognitive biases – the agenda setting and framing in play –  and the behavioural economics[5] of life inside the monetized echo chamber that is engineered to serve you more of what you like, love, share and save.

Our brains aren’t cognitively wired to critically process the sheer volume of information available – nor are we wired to always think logically and strategically in situations where the information we have is limited or ambiguous.

We jump to conclusions that fit our own worldview – a neurological shortcut our brains take along the path of least cognitive resistance to avoid mental discomfort – and actively disengage from views or ideas that make us question that worldview.

This isn’t new – this is human physiology.

What is new is how these behavioural economic biases have been designed into information campaigns that incentivise one outcome over another.

We are all being nudged back into the comfort of our echo chambers – away from cultural change and, in fact, away from putting up any meaningful resistance to agents of influence at all.

  • Degenerate values are amplified to provoke outrage.

The influence of those espousing degenerate values isn’t always earned – it’s manufactured, often monetised, and framed as views that are representative of mainstream society.

  • News is sold – not told. Content is weaponised.

The decline of journalism and the rise of highly sophisticated clickbait-style reporting with allied advertising reflect a news media that, by and large, doesn’t produce stories that even fit the definition of news.

When actual ‘news’ is reported, the speed with which it is embedded into newsfeeds to achieve SEO wins delivers the content exactly as the agent of influence intended – something we see so often through the lens of terrorism, where terrorist organisations achieve significant propaganda-of-the-deed success by producing their own content.

  • Truth is out of vogue. Trust is in decline.

The inconvenient truth of our time is that fiction reigns supreme. The more outlandish the view, the more influential – and the more viral – its potential.

People want to believe – and will use technology to imagine new outcomes, creating more information ecosystems and potentially more information battlefields.

If it can be built –  someone will figure out how to weaponise it.

  • Social media is the fuel to the fire. Echo chambers amplify discord.

Social media is not the spark that starts the fire – but it does add fuel, because the echo chamber inherent in each platform algorithmically continues to serve up the content you like and engage with, time and time again.

How we think about our technology and its bearing on our future is intertwined with the bias bubbles we live within every time we pick up our smartphones.

We’ve learned a lot about the information and influence environment this decade – a decade that will draw to a close in a little over six months’ time.

We have seen dictators and insurgencies rise and fall on the back of digital uprisings and cyber caliphates.

We’ve borne witness to a live stream of humanity at its very best – and horrifying worst.

The information domain has held up a mirror to our society, our country, our planet – and collectively when we’ve gazed inward, no one really likes what they see.  

We have the knowledge and the power to make the next decade different.

But do we have the will to shape a better, fairer, more transparent information domain for our future?

Do we have the courage to change our culture?

Change – meaningful change – can only eventuate if we address the fundamental challenge of our time: technology may be the conduit but is not the culprit.  

We have a people problem.

And by and large those people have a problem with the reality they perceive.

How do we then change our culture to meet both the technological need for advancement – to obtain or retain multi-domain operations dominance – AND the ethical and moral challenges we will face?

  • Societal issues must be dealt with proactively.

To combat fake news and misinformation we must be proactive in dealing with the societal divisions that confront us.

We need to collectively establish what and when influence is acceptable in our society – and when it is not.

We need to define our left and right of arc in the information environment so that people understand how and when they are being manipulated and to what end.  

  • Leaders must demonstrate values-led decision making.

It is not enough for our leaders and those with prominent voices – including the media – to simply legislate away responsibility or shirk accountability for the societal issues their words and actions cause in the information terrain.

We cannot let indifference and apathy toward our fellow humans become the status quo.

  • Voices on the far-fringes need to be engaged.

We can’t press mute on the extremists amongst us – and we have extremists vying for airtime from all sides – but we can prevent their narratives from being artificially amplified to manufacture outrage and intolerance, and to incite violence.

An egghead on Twitter with 9 followers and an account that’s 6 months old shouldn’t be held up as having a mainstream view any more than someone with a million followers – the false economy of followership online needs to be reassessed.

And we can do better at opening and holding constructive dialogues.

Engagement does not mean acceptance or tolerance – it means disagreeing with respect and compassion, rather than hostility and denigration.  

Engagement gives peaceful outcomes a chance to change views or share narratives based on fact.

  • Fact can’t fight fiction and win. Biases can be systematic and programmatic.

One hundred years ago we were barely six months out of the First World War – and only 20 years away from the start of the Second.

The disregard for facts and the manipulation of information – particularly the dehumanisation of specific groups of people during and leading up to the Second World War to justify hatred, intolerance and racism – must never be allowed to manifest or gain a foothold in our global society again.

Government and policy makers have the opportunity to lead us toward a better future – but we must acknowledge that not all leaders or policy makers have our best interests at heart – and in fact, there are those among that cohort who will continue to gain significant advantage from the current state of play.


We will face internal resistance to cultural change. We must be prepared for that skirmish and push past the obstacles that will be placed in our way.

We must learn to adapt and be dynamic in preparing to fight unconventional information wars wherever we find them.

We are at a crossroads.

Information is our greatest strength and biggest weakness.

Technology is surpassing – or soon will surpass – our ability to moderate or control it.

If we haven’t got a handle on the societal issues we face today by then, how much longer can we realistically tread water in this ocean of discontent?

We must be ready – we must have the capability, the training and the expertise we need to fight on this battlefield – and we must have the sincere will to fight and win on our own terms.

It is often said that the victors write history.

The sobering thought I want to leave you with today is this:

If we don’t succeed, our future will be written by one or more of our adversaries – adversaries whose troll farms will produce and amplify deepfakes bearing our faces, speaking in our name, telling our history and defining our legacy.


[1] Horvath, A. (2015) How does technology affect our brains? ‘Voice’ Volume 11, Number 6 (University of Melbourne).  

[2] Garimella, K., De Francisci Morales, G., Gionis, A., & Mathioudakis, M. (2018) Political Discourse on Social Media: Echo Chambers, Gatekeepers, and the Price of Bipartisanship.

[3] Karim, K.H. Cyber-Utopia and the Myth of Paradise: Using Jacques Ellul’s Work on Propaganda to analyse information society rhetoric. Information, Communication & Society, Volume 4, Issue 1.

[4] Bliuc, A.M., Jakubowicz, A., & Dunn, K. (2019) Racism in a networked world: how groups and individuals spread racist hate online. The Conversation.

[5] Samson, A., & Sunstein, C. (2017) The Behavioural Economics Guide 2017.