Facebook and Twitter could face $53 million in fines for allowing hate speech in Germany

The German Parliament approved a plan Wednesday that will force social media companies like Facebook and Twitter to remove hate speech within 24 hours or face a fine of up to $53 million.

The new law is designed to tackle the rise of online hate speech, which has become a particular problem in Germany in recent years amid the influx of migrants to the country. Germany, which has some of the strongest hate speech laws in the world, is especially sensitive to this uptick, with crucial federal elections scheduled for September.

Critics, however, worry that the new law will further limit freedom of speech in a country that already has strict laws against defamation, public incitement to commit crimes, and threats of violence, and that hands out long prison terms for denying the Holocaust and inciting hatred against minorities.

“There can be just as little space on social networks for criminal acts as on the street,” Justice Minister Heiko Maas said. He added that he plans to present the rules to the EU’s justice ministers in hopes of creating Europe-wide regulations. “We want to continue the process at European level.”

Last year the European Commission presented Twitter, Google, Microsoft, and Facebook with a code of conduct to counter illegal hate speech online, but Maas clearly doesn’t believe it goes far enough.

Under the new law, social media companies will face fines of up to $53 million if they fail to remove “obviously criminal content” within 24 hours of it being flagged by users. Less clear-cut content will have to be removed within a week if it is found to be illegal. On top of the corporate fines, the company’s chief representative in Germany can also face a personal fine of up to $5.3 million.

Companies like Facebook, along with activists, have raised concerns that the new law could significantly curb freedom of speech online. “As experts have pointed out, this legislation would force private companies rather than the courts to become the judges of what is illegal in Germany,” a Facebook spokesperson told VICE News.

Twitter declined to comment on the new legislation. Google didn’t immediately respond to requests for comment.

“Given the short deadlines and the severe penalties, providers will be forced to delete doubtful statements as a precaution. That would have a serious impact on free speech on the internet,” Bernhard Rohleder, chief executive of Bitkom, a German association representing some 2,400 digital companies, said in a statement.

The German government has been trying to force technology companies to do more in this area for years. In 2015 it persuaded Facebook, Google, and Twitter to agree to remove hate speech from their platforms within 24 hours. A recent review of that initiative found that Facebook had removed just 39 percent of the content flagged by users, while Twitter removed only 1 percent.

Facebook contested the results of the study, saying its internal metrics showed it was performing better. Twitter said it has introduced a raft of new measures to improve the situation, most recently announcing plans to “leverage its technology” to automatically spot accounts engaging in abusive behavior and limit their functionality.

The trouble, many believe, is that removing hateful material has little to no impact on the people who post it in the first place. Speaking at the RightsCon conference in Brussels last week, Susan Benesch, director of the Dangerous Speech Project, pointed out that removing content in many cases does not solve the problems hate speech causes.

“Takedown is very unlikely to persuade someone not to post the same content again, or persuade other people not to post the same content again,” Benesch said.

And while the 24-hour deadline may sound strict, Benesch said, “Takedown doesn’t happen fast enough to prevent” the damage, since social media posts usually reach their widest audience within a few hours. “There is pretty good evidence that a lot of the harm [is] caused by much of the bad content in the first hours after it is posted, so even if it is taken down, that happens [too late].”