“Is this 8chan discord?” asked an anonymous member of the gaming chat application Discord. The member was searching for the new home of the 8chan online community known for its extremist content, which included alleged postings from the two men who carried out mass shootings in El Paso, Texas, and in New Zealand.
“No, it’s a plush Discord for people who use 4chan and 8chan,” replied a regular Discord user.
“Damn, I am fruitlessly searching for a place where 8chan refugees have flocked to but there isn’t much. Even the .onion thing is down,” chimed in a third user, referring to the “dark web” URL where 8chan retreated after internet providers blocked it from easy access.
Several members of the Discord chat group currently called “4/8Chan Plush Lewd” joined the discussion to explain that, due to high levels of scrutiny, new members had to be vetted first. Some Discord groups frequently change their names, making the discussions difficult for non-members to monitor.
The comments touched off a winding discussion about what constitutes an “authentic” replacement for 8chan, a site where users encouraged others to “embrace infamy” and which has been linked to a number of mass shootings. 8chan owner Jim Watkins has stated he will keep the site offline until he can talk to Congress about the site’s future. The House Committee on Homeland Security has requested he appear to address the proliferation of extremist content on 8chan.
Discord, a social media service that allows users to chat in private groups, launched in 2015 and was developed specifically for video game players. As of May 2019, Discord had over 250 million users, including children and teens, and it is valued at nearly $2 billion.
Unlike apps such as Instagram, Discord lacks a central news feed. Instead, users are clustered into private or semi-private groups that are typically accessible by invitation.
Discord shares many of the same features that attracted users to 8chan, including anonymity and the ability to tuck toxic conversations into private channels. Much of 8chan’s appeal was that the website was easy to access and hosted dozens of unmoderated subgroups. Content often blurred the lines between innocuous and extreme: Users could easily tab between conversations about toys, video games, threats of violence and anti-Semitism.
Discord has similar features that allow users to quickly toggle between discussion groups. Many of these groups, referred to on Discord as “servers,” focus on benign topics like tips for playing Fortnite and Minecraft.
But more disturbing discussions can be found there, too. Once a user is approved by a group moderator, often a simple process requiring only a few clicks and minimal vetting, private rooms open to reveal extensive conversations about white nationalism, anti-Semitism, misogyny and violent pornography, animal abuse, and sympathy for members of the Nazi party. Discussions can occur in groups adjacent to, and often linked from, chatrooms for children about popular video games.
CBS News found dozens of examples. One group, called Kool Kids Klub, was established by members of Stormfront, according to some participants. Stormfront is an openly neo-Nazi and white supremacist hate group whose members also allegedly posted to 8chan frequently.
In one Discord chatroom dedicated to fans of Gab, a social media website described by the Southern Poverty Law Center as “white nationalist-friendly,” a user expressed support for “a few militia cells” ready to “pull off a coup” in the United States. “I have predicted this happening before Trump was even elected so don’t even try to to [sic] say that i’m just anti-Trump or i will literally kill you myself,” wrote the Discord user.
In a similar group, members were encouraged to share the results of a personality test that compared individuals to Nazi party leaders. “I must say that I am proud to be similar to this beast,” said one member after sharing a photo of Nazi architect Albert Speer, who was convicted at the Nuremberg trials. Other users in that group shared their affinity for Nazi minister of propaganda Joseph Goebbels.
After a spate of negative news stories in 2017 and 2018, including research conducted by ProPublica and the Southern Poverty Law Center that linked hate groups to Discord, the company strengthened its policies regarding extremism, illegal activity, harassment, doxxing, and violence on the platform. Discord also banned a number of servers affiliated with white nationalists.
Discord says it investigates and takes action when users report violations, and that it also uses technical tools such as artificial intelligence to identify violations on its own.
But Discord has had a tough time keeping up. This month, it apparently shut down a group that members say is associated with the Kool Kids Klub and that regularly shared Nazi images. The group quickly changed its name to a string of nonsense characters, “asdgrfhtyjujtrgefgrefgrtyuafw,” to mask its origins, and used two bots available to all Discord users to archive its old content. The group reemerged mere hours later, and was still active at the time this article was published.
In a statement provided to CBS News on August 19th, a spokesperson for Discord said:
Discord was built to bring people together around a passion for gaming, which is what the overwhelming majority of Discord’s millions of users are here for. Discord has a Terms of Service and Community Guidelines that all users are required to adhere to. These specifically prohibit harassment, threatening messages, calls to violence or any illegal activity. Discord’s guidelines cover more expansive activities than other platforms’ rules and include activities such as doxxing and sharing private information.

We investigate and take immediate action against any reported violation by a server or user, which can include shutting down offending servers or banning users. The number of these violations make up a tiny percentage of usage on Discord, and the team is committed to improving our policies and process to make it even smaller. Discord’s Trust and Safety team exists to proactively protect the safety of our users – on and off platform – and we have a variety of security methods that help users avoid unwanted or unknown contact. As all conversations are opt-in, we urge users to only chat with or accept invitations from individuals they already know.

We respect the privacy of our users and we don’t read each of the billions of messages sent on Discord one-by-one. Instead, our Trust and Safety methodology – computer intelligence, human intelligence, and community intelligence – surfaces violations of Discord’s Terms of Service (ToS) and Community Guidelines so that our team can effectively work to investigate each one properly. We will continue to be aggressive to ensure that Discord exists for the community we set out to support – gamers.