Keynote remarks: Tech 4 Democracy Summit, Madrid

To Provost Muniz, and to the organisers at the Instituto de Empresa - buenas tardes, and as we would say in New Zealand, kia ora koutou katoa.

To colleagues from the State Department, from academia, and from civil society groups - to all our distinguished guests, kia ora tatou katoa. It’s a pleasure to be with you today at the Tech 4 Democracy conference.

This summit is squarely focused on the challenges of our modern world. And so it would be remiss of me not to acknowledge one that is front of mind for this region right now.

We are here on the eve of the NATO Leaders’ Summit, looking across at a major land war in Europe - an unjustified, illegal attempt to occupy another nation and, alongside that, to upend the global order by force.  Civilian populations are being subjected to terrible cruelty.  Millions have been displaced.  I want to take a moment at the outset to recognise the resilience of the Ukrainian people in the face of this aggression, and not to lose sight of the appalling suffering inflicted upon them.

The ramifications of this war are wide in breadth and depth. It represents a challenge to our rules-based order. A challenge to the notion of territorial sovereignty. A challenge, I would argue, to basic humanity. It also illustrates that there is nothing “traditional” about warfare in these times.

A central issue here is a sovereign nation’s democratic right to exist and to determine its own future.  This is about democratic will. 

The war in Ukraine is being prosecuted not only by artillery, tanks, and missiles.  It is also a war of information.  And, in that, it carries important lessons for all of us in how we think about the use and abuse of technology.  The challenges, but also opportunities.

The proliferation of publicly available information channels has changed the nature of conflict, just as it has changed the nature of media, business, and democracy.  Our daily lives have a new town square, and also a new battlespace.   

It has been well understood for centuries that carefully crafted manipulation campaigns could be used to upend institutions and civil order.  Europe knows this better than most. But technology has now allowed that to reach a global scale, where asymmetric information campaigns can be carried out across borders and geographical distances at very low cost, with limited traceability. 

Beyond the reach of missiles and artillery shells, it is apparent a well-coordinated effort is underway to erode and, ultimately, to undermine the democracies that enable our citizens to express their collective will.   The will to confront the pressing challenges of the age - from coronavirus to climate change, to collective security.

The vulnerabilities in our social fabric have become the attack surfaces for this effort, from economically and socially disempowered groups to those poorly served by education, health, and social services.  Disinformation campaigns are often carefully designed to engender polarisation and division, and I suspect, if we’re looking for some clear or rational reason why these efforts to disrupt the fabric of our societies are being waged, we’re unlikely to find it. Recent research by Microsoft, for instance, found a sudden and pronounced spike in the consumption of Russian disinformation by New Zealanders, which increased by 30 percent relative to our neighbours in Australia and the United States in the period after December 2021. I cannot yet tell you why this is happening.  But I can tell you that it matters that it is.

As leaders, it is probably natural that we instinctively seek to disrupt activity designed to undermine people’s agency. We welcome debate and dialogue, but when that debate is being skewed not by fact but by fiction, we feel the need to react.  In doing so, however, we risk being framed as paternalistic or, worse still, fulfilling the very worst conspiracy narratives. It’s a vicious cycle.

We are being subjected to a form of reflexive control – whereby governments are invited either to respond in coercive or paternalistic ways or, alternatively, to appear impotent in the face of asymmetric threats and thereby furnish the case for authoritarianism. 

That is part of the reason we are here.  There is no ready-made toolbox to address corrosive disinformation networks.  Traditional doctrines of warfare are probably not the logical starting point in this context.  This isn’t about winning one decisive victory on the information battlefield, nor is it about “full spectrum dominance” of the information space. 

Our real duty is to nurture a thriving and effective democratic system that protects human rights and provides for people’s long-term wellbeing.  It is about realising our shared values, through political engagement and through the positive and constructive use of the technologies that can serve us for good or for ill.  And it’s also about our collective resilience to a problem that is not going away.

There are many – not just elected officials, but people working at all levels and across all sectors of society - building inclusive institutions and economies and contributing to open and transparent government.  Social entrepreneurs and businesspeople, teachers, health workers, volunteers.  Strong democracies and open societies mobilise everyone in the defence of their values. 

Technology has a role to play, not just as a threat vector for information warfare.  It is a vital enabler of our long-term wellbeing, a source of hope to future generations aspiring to a more sustainable future, and an integral part of our social fabric.  We can’t just sit back and hope for the right outcomes to emerge.  Nor can we simply regulate these issues away.  We need values-based multistakeholder responses that draw fully on the knowledge and capabilities of affected communities, of civil society, of the technology sector. 

All these communities have a massive stake in the defence of liberal democracy, open societies, and respect for human rights.  And all these communities have a similarly key role to play in building resilience to targeted efforts to undermine these through disinformation campaigns and abuses of the online environment.

These are all things we have learned in our efforts to prevent terrorism and violent extremism online. 

When technology enables violent extremists to bridge the divide between believing harmful false narratives and committing atrocities, the effects can be devastating.  We saw this play out tragically in Christchurch.  More recently, we saw it in Germany and in the US.  And in those cases, the reverberations continue well beyond the event itself.

The combination of extreme violence and hateful propaganda invites us to find enemies within, to abandon trust in our collective humanity. 

The Christchurch Call was born out of a will to prevent this happening.  It is based upon promoting a free, open, secure internet as a tool for good - a place where human rights are protected and economic and social opportunities are realised - as we work with civil society, industry, and democratic governments to prevent the weaponisation of social media by terrorists and violent extremists.  The Call is New Zealand’s belief in values-based multistakeholder coalitions, put into practice.

Multistakeholderism is a balancing act.  A lot of effort goes into the process: gathering disparate perspectives, agreeing on a shared problem set, and mobilising the different capabilities, insights, and resources of each stakeholder group.  In the end, these efforts are judged by their results - and that means relentless focus and effort are needed.

In the Christchurch Call we have made real progress on measures to limit the impact of terrorist and violent extremist abuses of the internet.  We are now turning our minds to some of the thorniest questions.  What can we do together to better understand the contribution of machine learning-based systems to polarisation and radicalisation?  How can we develop algorithmic systems in a more socially and morally responsible way?  Put another way, how do we best prevent and intervene before a young person finds themselves armed, online, and about to broadcast an attack on their fellow human beings?

As part of that, we are working hard - including with European partners - to unlock better access for independent researchers, and to develop risk-based approaches that could lead to dynamic improvements in those areas.

As with terrorist and violent extremist content, I consider that many of the tools, solutions, and multistakeholder models we develop will prove indispensable in addressing other online challenges, particularly those around disinformation.  Shared work on transparency, surfacing information about state-sponsored online information campaigns, and building an understanding of the processes that drive them - all of these are essential to navigating the challenges we face.  I’m encouraged by the increasing recognition of the importance of inclusive, multistakeholder work, and by the shared efforts of partners in industry, civil society, and governments.

So I’m pleased to be here today at the Tech for Democracy conference, supporting a positive, forward-looking effort to see our values embodied in the way we develop and deploy technology.  It is encouraging and inspiring to see the work you are doing on disinformation, on responsible AI, election integrity, and on building a multistakeholder community to sustain this work.

I am especially pleased to see this forum engaging in the discussion on disinformation as a global challenge.  For those of us who remain optimistic about a pluralistic and open internet, you can count on New Zealand’s support in mobilising a global community for action and helping to do what democracies do best – to give a platform to the brightest ideas, to adapt and flex to the challenges we face, and to work together to overcome them.  

Thanks once again to the organisers.  Best of luck to all the participants.  I look forward to seeing your ideas take shape. No reira, tena koutou katoa.