
Ātea | June 13, 2019

The online exodus of women and minorities

ActionStation’s Laura O’Connell Rapira and Amnesty International’s Meg de Ronde.

Two major studies show that women and minorities in New Zealand are being harassed to the point that they’re leaving online spaces in droves. Leonie Hayden reports on the growing tension between the right to free speech and the right to live without fear.

You don’t know unless you know. This is the only way to describe what it’s like to be a woman or minority receiving death and rape threats online. As Green MP Golriz Ghahraman put it in a recent interview about fellow MP David Seymour calling her “a menace to freedom” for wanting stronger regulation of hate speech: “For some of us this is real life and for everybody else it’s rhetoric and votes.”

This is what’s at the centre of the freedom of speech versus hate speech debate. Should your freedom to say what you want encroach on someone else’s freedom to live without fear?

Unfortunately for the people who think they are simply participating in healthy public debate when they use racial slurs or threaten to kill people online, the data lets us see through the justifications. When you track the patterns of people being threatened, harassed and abused online, the research does not show a ‘diverse selection of citizens disagreeing with one another’. It shows it is overwhelmingly cis white men making threats against women, migrants, LGBTQI+ people and people of colour. Whether or not you accept those terms, it shows the drivers are not civic participation but racism, misogyny, homophobia and xenophobia.

“This isn’t people disagreeing with your opinions, and then them being like, ‘Oh, that’s not very nice!’” says Meg de Ronde, campaigns manager for Amnesty International New Zealand. “This is women that actually got threats of abuse, attacks, and death threats.” Most alarmingly, she says the findings from Amnesty’s 2017 survey revealed a third of New Zealand women had experienced online abuse and harassment.

“Some of the impacts that those women then reported: 75% of them struggled to sleep, and 62% of those who had this experience had panic attacks, anxiety or stress as a result.”

Lovely Meg de Ronde, campaigns manager for Amnesty International, sat down with me to discuss what turds some people are online. Image: On the Rag

As activist Anjum Rahman noted for The Spinoff earlier this week, the research (in the US at least) also shows that free speech arguments are most often used to defend racist speech. In response to the abuse and harassment, women, people of colour and the LGBTQI+ community are simply leaving online spaces.

“So, when you’re looking at a tool that’s meant to provide freedom of expression to people, and it’s meant to level the playing field as far as people’s ability to access information and have their say, we found around 49% of women who had experienced this said they stopped using social media, or they used it differently as a result,” says de Ronde.

The free speech debate is, ironically, silencing huge numbers of people.

In a more recent study by ActionStation, The People’s Report on Online Hate, Harassment and Abuse, the not-for-profit challenged the government to have more say in regulating platforms and content providers, the same as they would our shared public spaces. Economist Shamubeel Eaqub writes in the foreword: “Being online is a misnomer. It’s like walking on footpaths and driving on roads – part of everyday life. Yet we seem to treat online as a separate space rather than an extension of everyday life.”

The report shows one in three Māori (32%), and one in five Asian (22%) and Pacific (21%) people experienced racial abuse and harassment online in 2018.

Currently, the only thing in place offering any protection is the Harmful Digital Communications Act, which ActionStation director Laura O’Connell Rapira (a regular contributor to The Spinoff) tells me was actually drafted predominantly with teenagers in mind.

“As far as I understand, the HDCA was designed to counter cyber-bullying and it got pushed through by the parliamentarians at the time to address that. It hasn’t achieved that goal at all, in my opinion.”

She says an unexpected outcome, however, is that the Act has had some success in intimate partner abuse cases.

“Domestic abuse, between intimate partners, when that abuse carries on into the digital world, emails or Facebook messages that are causing psychological harm, then the HDCA can be used.”

She says that, adding to its limitations, police are under-educated about the Act, as in the case of Sāmoan author Lani Wendt Young.

Wendt Young wrote on Twitter in August last year that she had taken a folder of more than 800 screenshots of online abuse she had received, including rape and death threats, to the police. She wrote that the officer asked her: “What do you expect us to do about it?”

O’Connell Rapira gives an example from her own life:

“We coordinated the petitions around banning semi-automatic weapons and got about 70,000 signatures. We sent out an email to our mailing list, 280,000 people, saying ‘look at what people power did, this is so amazing!’. I got a response from a man, who sent it from his work email, that said ‘I’m the 1% you’re trying to take away from, you should go die in a hole.’ Now, normally I would ignore that but I looked at his work email and he was a firearms importer. I called Netsafe and reported it, but because it wasn’t specific (it didn’t name a time or a place he was going to kill me, and he didn’t name a weapon) it wasn’t considered over the threshold for harm.”

She emphasises that Netsafe do a great job with what power they have, but with its focus on individuals the Act can’t, for instance, protect a refugee woman of colour from the psychological harm of seeing comments from white supremacists that people like her should be lynched.

De Ronde also endorses the work Netsafe do. “Netsafe is the place to go to learn more about this, to start with. They’re tasked by the government to educate people, and they’ve got a lot of amazing resources and some really great people. But, we also think that there needs to be more funding put into that, generally, and for education for the public and for police.”

One of the recommendations in the People’s Report is that the responsibility be put on platforms when it comes to removing specific harmful content. “Germany passed laws that put the onus on Twitter and Facebook to remove content quickly that denied the Holocaust and violated their hate speech laws. Because of that, Twitter created a bunch of moderation jobs in Germany, because they didn’t have enough German speakers,” says O’Connell Rapira.

“These platforms largely employ in the US or outsource moderation to places like the Philippines. One of the senior researchers at Facebook that I met at an online safety conference last year told me about how in India a lot of women were getting threats to ‘get on the bus’. To you or I that sounds harmless, but it was code for gang rape. If you’re outside that cultural context, you wouldn’t know to remove it.”

Legislation is slow-moving business and solutions aren’t coming thick and fast, so ActionStation have started work on a programme that can at least help vulnerable people in the short term when they’re caught in a scary exchange, or need to find information that is potentially stressful.

“The idea is we train groups of people, usually Pākehā and other white tauiwi who aren’t directly affected by online racist hate, with messaging and listening techniques so they can intervene when they see online hate and threats. We do a ten-week training programme, 20 to 30 volunteers come to a hui to build relationships and to interrogate their own journey of race and privilege and then throughout the 10 weeks they meet every week in online webinars.”

The group have been tasked in the past with intervening on news articles about Golriz Ghahraman, and after the March 15 Christchurch massacre they were responsible for compiling a dossier of threatening comments on her behalf that might be needed as evidence if the unimaginable happened.

A charming selection.

O’Connell Rapira encourages people to get in touch with her via the ActionStation page if they’re interested in their services, or in being a volunteer. Meg de Ronde offers some more suggestions for how vulnerable women and minorities can protect themselves without being scared away from online spaces.

“Use the mute, the block, and the report buttons, but also encouraging other people to support you is quite okay. Post screenshots and say, ‘Hey, this is happening to me in a comment thread, could you come and support me?’ I think stepping away from the platform for a bit is okay as well. It’s not a long-term solution, but sometimes the toll on your mental health means that it’s healthier to step back for a little bit, if need be. And if all else fails, find out who the person’s mother is on Facebook or Twitter, report their behaviour to them and let natural justice take its course.”
