
Media | March 21, 2019

Taking white nationalists off the internet won’t solve right-wing terrorism

Businessman working on tablet in the dark, close-up

The removal of extremist content alone isn’t going to solve the problem of right-wing terrorism. Instead, we need to harness new technology to find those drawn to extremism early and intervene.

Last week, 50 lives were lost in Christchurch in another act of terrorism by a white nationalist. This follows 11 dead in Pittsburgh. Six dead in Quebec City. Nine dead in Charleston. The list goes on.

In the days that have followed the Christchurch attack, some political leaders have downplayed the scale of the problem. Those who follow this logic are missing, or unwilling to admit, the years of systematic violence that communities have faced at the hands of white nationalists over the past decade. Right-wing extremism has been consistently overlooked by politicians and police globally over that time. Eight years after a right-wing extremist killed 77 people in Norway, we’re still failing to see these attacks for what they are: part of a global movement that threatens the security of democracies in North America, Europe, and Australasia, and is not confined by the boundaries of any one country.

In every attack, what the perpetrators have in common is greater than what sets them apart. They leave the same trails of clues behind them: manifestos littered with the same international reference points and cultural icons, worshipping the same “heroes”. Nor is this the first time a white nationalist has crossed borders to carry out an attack. In 2013, Ukrainian-born terrorist Pavlo Lapshyn arrived in the UK and within five days had murdered an 82-year-old Muslim grandfather. In the weeks that followed, he attempted three bombings of mosques before his arrest. Similarly, the Christchurch attacker arrived in New Zealand from Australia for a temporary stay, and within months carried out his attack on two Christchurch mosques. Reminding us of this symmetry, the attacker wrote Pavlo Lapshyn’s name in Ukrainian on his weapon.

To call this a domestic terror threat is a gross underestimation. The particular brand of white nationalism that echoes across the manifestos and testimonies of such killers in New Zealand, Norway and the United States has been exported globally over the past 50 years, blossoming in a new digital era. A powerful global movement has formed online, constituting a threat with global ramifications.

The Christchurch terrorist was deeply entrenched within this global far-right online community – a community that flourishes on platforms like 8chan and where like-minded individuals were watching the live broadcast of his attack, praising his actions. Platforms like 8chan, 4chan, some corners of Reddit and others are not geographically bound. Instead, they offer borderless communities of like-minded individuals who have developed their own codes of conduct and norms.

It is tempting to place the blame on social media companies – after all, their platforms facilitated the horror of the live streaming of the Christchurch attack. And indeed, over the past five days, political leaders have called on Facebook and other social media companies to remove extremist content faster and more efficiently. Similarly, in 2017, in the days immediately following the “Unite the Right” rally in Charlottesville, where a woman was killed and many others injured by a white nationalist, there was an unprecedented crackdown on violent far-right content. Social media pages were taken down, websites disappeared and forums were banned. But despite these takedowns, we saw no decrease in appetite for this content. On the contrary, it spiked.

A makeshift memorial to those injured, and to Heather Heyer, who was tragically killed during the Charlottesville rally on August 12, 2017 (Photo: BRENDAN SMIALOWSKI/AFP/Getty Images)

In the days that followed the Charlottesville rally, Moonshot CVE recorded a major spike in Google searches for content related to right-wing extremism across the United States. These were not people merely looking for information: they were searching for content related to killing ethnic minorities, or indicating a desire to join or donate to the Ku Klux Klan. Within just a week, we recorded a 400% increase in Google searches indicating a desire to get involved with violent far-right groups, compared with averages recorded in previous weeks. Similarly, following the attack on the Tree of Life Synagogue by a white supremacist in Pittsburgh, we recorded a 182% increase in Google searches indicating interest in killing ethnic minorities, and a 92% overall increase in search traffic indicating support for right-wing extremism.

It is too soon to say what impact the Christchurch attack has had on the global far-right community online. However, removal of content alone after this attack is not going to solve the problem of right-wing terrorism. Behind each piece of content uploaded to the internet is a human being, who won’t disappear when their comment, meme, or website does.

Every corner of the internet where the Christchurch terrorist’s vitriol has been posted and shared is littered with comments by internet users across the globe who support his actions. These users have left us a trail of clues, a digital footprint informing us of the path they are taking. We need to harness new technology to find such individuals early and intervene: offering alternatives, challenging them, and engaging them in social work interventions to try to change their paths before they resort to violence.

Change is possible. But in order to facilitate this change, we need to recognise right-wing terrorism as a global threat and stop seeing the internet as a barrier. Instead, we need to recast it as an opportunity. We will need support from tech companies to do this work. They will need to start creating opportunities for us to use their platforms to reach communities engaging in hate online. The global programmes companies like Google have set up in response to ISIS, such as the Redirect Method, need to be matched by investment in similar methodologies to redirect violent white supremacists online. When governments and tech companies place the global threat of right-wing extremism on the same footing as other forms of extremism, they will open up opportunities for us to deliver the proactive work that is so badly needed to engage these audiences online.

This problem will not be solved by removing extremist right-wing content from the internet piece by piece. To stand a chance of solving it, we need to recognise that this threat is global, take responsibility, and start actively investing in long-term online solutions.

Ludovica Di Giorgi and Vidhya Ramalingam work at Moonshot CVE, a company using technology to disrupt and counter violent extremism globally.
