BERLIN – Every time Luisa Neubauer appears on German TV, her notifications start filling up with insults and threats. The spokesperson for the German branch of the Fridays for Future movement is a favourite target of the country’s far-right, who have made a hobby out of spreading lies about her, or sending death and rape threats.
“The disgusting thing about online hate is that every time you have your phone in your hand, you feel like you’re holding the hate there too,” says Neubauer, who also needs security to accompany her at protests. “It makes me feel…” she pauses, searching for the right words, “speechless and like I want to scream at the same time.”
Harassment of activists, as well as politicians, journalists and other prominent figures, is not a phenomenon exclusive to Germany. However, the situation is considered such a threat to democracy that specialised services are now available to those who find themselves the target of such attacks. If victims’ mental health is being affected, they can contact a dedicated counselling centre, HateAid. And if they want to pursue lawsuits against their harassers, the organisation will help fund them.
It's one of a raft of civil society and political steps Germany is taking to combat online harassment, and could hold lessons for the rest of the world on whether it’s possible to regulate social media – and what the drawbacks could be.
In Neubauer’s case, HateAid monitors all abuse she receives, and takes action if it thinks comments or threats cross a line into illegality. Recently it won an injunction against a far-right author who wrote sexualised comments about her, meaning he cannot post about her again and must pay her €6,000 compensation (about £5,000). So far it has counselled around 1,700 people, supported 31 cases in court and filed 600 criminal complaints.
“I wouldn’t say I feel enthusiastic that this is happening,” says Neubauer. “I would rather not have to be suing people who I’ve never even met. But I’m glad someone is doing something practical.”
HateAid was founded by Anna-Lena von Hodenberg, a former journalist who wanted to combat the harassment she saw colleagues receiving. Her initial aim was to help fund legal action against the most prominent online abusers – although Germany already had strict laws around what is and isn’t hate speech, it falls under civil rather than criminal law, so complainants must pay for private lawsuits. “But what we soon found people really needed was emotional support,” she says. “This has grown into one of the main services we offer.”
Sebastian Jungblut works as the case manager at HateAid and is the first point of contact for those who call its hotline seeking support. “Some people come to us in the moment it’s happening, when they’re having lots of intense emotions,” says Sebastian, who uses they/them pronouns. Others might just be looking for practical safety or legal advice, “but sometimes they’ll come back weeks or even months later and be like, ‘Actually, that really did something to me.’”
They say online harassment – or “digital violence” as the centre prefers to call it – has, in their opinion, both similarities and differences to physical violence in its psychological impact. “One key difference I see is that digital violence can’t be quantified – it’ll keep popping up,” they explain. “For example, if there are deepfake nudes of you online, you can request to get them removed, but then they pop up elsewhere.” The anonymity, and a sense of loss of control, contribute to feelings of isolation and an “attack on your sense of belonging.”
Another compounding factor is the victim blaming that often accompanies digital harassment – Jungblut says both society and law enforcement don’t always take it seriously, or victims blame themselves for attracting an attack. “One of the most important things [we do] is acknowledge the person, tell them they’ve done the right thing, that what happened to them is real, and that they deserve to be taken seriously,” they explain. The organisation has a team of consultants who offer emotional support, including one licensed therapist, and can help people access longer-term care if they are experiencing serious anxiety, depression or PTSD.
Coordinated online attacks are nothing new – before the advent of social media they were organised on forums such as 4Chan. But von Hodenberg says attention was drawn to them in Germany in 2019, when a far-right extremist murdered politician Walter Lübcke, a member of Angela Merkel’s Christian Democrats (CDU), who had been openly supportive of refugees. “There had been lots of calls for his death online,” she remembers, leading politicians to look more closely at the links between online and offline violence. Von Hodenberg adds that she is upset that it took “an older, white man from the centre-right party, rather than a woman or an immigrant” being killed for the issue to be taken seriously. “Even though we’d had the [extreme right terror cell] NSU murders, the far-right was still being underestimated [until then].”
Politicians in Germany have also brought in new laws to tackle online hate speech. One of the best-known is the network enforcement act (NetzDG), which requires platforms to delete hate speech within 24 hours of it being reported, or incur large fines. Rulings in HateAid’s own lawsuits are also shaping laws. Recently, a court declared Facebook must divulge the personal data of users who spread disinformation about Green Party politician Renate Künast, stating the platform had failed to strike a balance between free speech and human rights. This could have big implications for future cases.
Although HateAid supports victims of all types of online abuse – for example, it regularly helps women who have had nudes leaked online – von Hodenberg says there is a distinct far-right flavour to most harassment it deals with. “[The platforms] are open to everyone – the algorithm doesn’t discriminate by ideology,” she says. “But from what we see, it is the far-right who have really mastered this. Other groups may attack people, but we don’t see the level of professionalism that we do with the far-right.”
She says she has noticed tactics evolving, particularly as people spent more time online during the pandemic. “Many [prominent far-right figures] are now deplatformed, so they’re on Telegram instead,” she says. “Before, they used to say, ‘go attack this person,’ and they’re not even doing that anymore. Now they just post a name, or an article, and their followers know exactly what to do.”
Telegram is not covered by NetzDG as it is classed as a messaging app, and politicians have claimed that the company has been unhelpful when asked to remove illegal content, failing to provide a channel or contact person for raising complaints in Germany. It’s led some to take the drastic step of threatening to ban the app in Germany. The adoption of other, less regulated platforms demonstrates how hard it is for governments to legislate hate speech away. Telegram did not respond to a request for comment by VICE World News.
Von Hodenberg says she has also witnessed an increase in abuse moving offline, with people “digging, digging, digging” to find whatever personal information they can. “We also see the harassment of women becoming even more brutal and sexualised, with deepfake porn being made,” she says. Generally, women receive higher levels of abuse – 62 percent of those who seek HateAid’s services are women, and 78 percent of the content found to be illegal is directed at them.
Jungblut adds that they saw a wider range of figures – such as doctors and virologists – targeted during the coronavirus pandemic, and they also noticed it has taken a heavier psychological toll on victims. “Our lives suddenly moved online, a lot of us were just staring at a screen,” they say. “And when there’s a lot of people on the other end saying violent things, whether they’re directed at us or someone else… We see increased anxiety, fear, and feelings of helplessness and isolation.”
Other countries are weighing up how to regulate online content, particularly after the revelations of Facebook whistleblower Frances Haugen, and Germany has created a blueprint for others to follow. But the legislation has also attracted plenty of criticism.
Jordan Wildon, a digital investigator at LogicallyAI, who tracks extremist groups online, says NetzDG can be weaponised by bad actors. “On Twitter, it’s really easy for a bunch of people in Germany to report something as hate speech, so it has to be taken down, and it could just be that it’s an entirely innocuous piece of content,” he says. Wildon adds that, in his opinion, some regulation of social media is needed, but he is concerned by certain governments’ approaches, such as the UK’s proposal to end online anonymity – which could have grave consequences for activists and journalists.
Tom Jennissen from German digital civil liberties organisation Digitale Gesellschaft believes tasking tech platforms with deciding what is and isn’t hate speech is problematic. “We have very complex laws on what constitutes hate speech and what is freedom of expression, and they are usually decided by a court,” he explains. “[With NetzDG] platforms have to make that decision in just 24 hours.” This concern is echoed in a UN Human Rights Committee report. One study also suggested that authoritarian governments around the world have introduced NetzDG-inspired laws to crack down on dissent.
In addition, Jennissen believes social media regulation may just be papering over the cracks when right-wing extremism is a deeply rooted societal problem. “Before the internet, [neo-Nazis] recruited people in bars,” he says. “Politicians can make a show about how tough they are being online, without looking at the root causes.”
But those advocating for tighter controls believe harassment itself is a threat to freedom of speech. “The effect is that people stop speaking up for things, because they don’t want to be targeted again,” says Neubauer. “And that’s the beginning of the end. It’s not how democracy, in its widest sense, should work.”