The group chat app Discord announced on Monday that more than 2,000 communities were removed for extremist content in the second half of 2020, of which more than 300 focused on the unsubstantiated QAnon conspiracy theory. Picture Alliance via Getty Images
Discord, the group chat app that grew rapidly during the coronavirus pandemic, removed more than 2,000 communities devoted to extremism and other violent content in the second half of last year, the company reported on Monday.
Discord officials said that of the 2,212 extremist and violent communities removed from the platform, about 1,500 were first identified by the company itself. That's nearly double the number banned for extremist content in the first half of 2020.
"We continue to believe that there is no place on Discord for groups that organize around hatred, violence or extremist ideologies," the company said in its latest transparency report.
The enforcement actions come at a critical time for Discord, with tech giant Microsoft reportedly in talks to acquire the social network for $10 billion.
Discord is a social media site made up of group chat rooms where users typically communicate anonymously. Founded in 2015 as a hub for gamers, Discord has more recently branched out into a hangout for things like book clubs, karaoke and Wall Street trading tips.
Among the forums deactivated in the latest round were some devoted to the anti-government boogaloo movement. Discord also said it saw a surge of activity in its communities tied to QAnon, the pro-Trump conspiracy theory. From July through December, Discord deleted 334 QAnon-related communities.
In total, Discord shut down nearly 30,000 communities on the site for various types of abuse. The most commonly cited violations were cybercrime and exploitative content, including revenge porn and sexually explicit content involving minors.
Once an oasis for white nationalists and other hate groups, Discord has been working to evict violent users and dangerous communities from the platform since the deadly Unite the Right rally in Charlottesville, Virginia, in 2017. The platform was heavily used by many who planned that event, which led Discord to strengthen its moderation guidelines.
While rioters who stormed the Capitol in January communicated on a variety of social networks, including Facebook and Twitter as well as smaller sites more receptive to far-right speech such as Parler and Gab, the left-wing group Unicorn Riot has documented 18 Discord servers frequented by some who participated in the siege of the Capitol.
William Partin, a research analyst at the nonprofit Data & Society who studies disinformation online, said Monday's report showed Discord remains concerned about another possible "right-wing infestation" like the one the social network experienced in the lead-up to the deadly Charlottesville rally.
"While reports like this are part of a public relations campaign meant to say, 'Hey, we're taking this seriously,' I think it's also evidence of the significant progress they've made," Partin said.
However, the report offers only a limited snapshot, he said. Most of the moderation on Discord is done by its own users, who act as administrators enforcing rules and norms.
"Of course, to some extent this is just an outsourcing of the highly skilled work of moderation and community management," Partin said, adding that while there is a benefit to having peers within a community hold fellow members accountable, their actions cannot be publicly documented.
Discord does not provide any data on these volunteer moderators, including what kind of toxic content they have tolerated. Rather, Discord's statistics cover actions taken by company officials, often in response to a user report. Most people experience Discord in small, private communities, unlike other social media platforms such as Twitter, where nearly all conversations are public.
"So if I see someone harassing someone on Twitter, I can report them, but on Discord I have to be in the right place at the right time," Partin said.
According to Discord, 15% of its full-time employees are dedicated to trust and safety, a share roughly on par with larger social media companies such as Facebook and Twitter.
In the second half of 2020, this team deleted around 266,000 accounts from Discord, primarily for violating the site's ban on exploitative content, including nonconsensual pornography and sexual content involving minors.
During that time, more than 27,000 communities were banned, mainly for violating the platform's rules against cybercrime.
According to Discord representatives, harassment was the problem users reported most often. Cybercrime, however, saw the largest increase late last year, growing nearly 250% from the first half of 2020.
Spam continues to plague Discord. According to the report, trust and safety officials removed 3.2 million accounts for "spam behavior," a figure the company tallies separately from removals in other categories.
Civil attorneys and prosecutors continued to send legal requests to Discord, where chats are often private and accessible only to invited members, though communications are not encrypted.
In the last six months of 2020, the site received more than 1,100 subpoenas and 900 search warrants, according to the company.