According to the EU Commission, the fight against hate speech and incitement on the Internet is making progress: online platforms such as Facebook and Twitter now review significantly more reported hate comments than they did four years ago. This emerges from a report published by the EU Commission on Monday.
According to the report, "Countering illegal hate speech online – 5th evaluation of the Code of Conduct", the companies Facebook, Twitter, YouTube, Instagram and Jeuxvideo.com most recently reviewed 90 percent of allegedly illegal hate comments within 24 hours – in 2016 the figure was only 40 percent. An average of 71 percent were deleted, roughly the same level as the previous year. Posts containing threats of violence or death were removed, whereas comments reported for defamatory language were more often tolerated by the companies.
Same standards for everyone
In 2016, the EU Commission agreed on a code of conduct with Facebook, Microsoft, Twitter and YouTube, which other companies have since joined. In signing it, the companies committed themselves to acting more effectively against hate crime on the Internet.
In a comparison of the companies, Jeuxvideo.com came first in 2020: according to the report, the online video-game magazine deleted all reported comments. Facebook followed with almost 88 percent of reported messages deleted. Twitter was the only company to remove fewer comments this year than last, taking down around 36 percent; the report gives no reason for this.
"The code of conduct is (…) a success story," said the responsible European Commissioner, Vera Jourova. However, according to the report, only Facebook so far systematically informs users who have posted or flagged relevant comments; all other companies still need to catch up here. It is time for all companies to follow the same standards and obligations, Jourova said.