Discord Disabled More Than 500,000 Accounts Due to Child Safety Concerns

Colin Thierry

Social platform Discord said last week that it disabled 767,363 accounts for policy violations between April and June 2022.

According to the company’s quarterly safety report, 69% of the disabled accounts were removed over Child Safety concerns.

Discord disabled 532,498 accounts and removed 15,163 servers for child safety reasons, most of them flagged for uploading abusive images or videos.

The accounts and servers were also immediately reported to the National Center for Missing & Exploited Children (NCMEC).

“Discord issues warnings with the goal of preventing future violations of our Community Guidelines,” the company explained in its report last week. “For some high-harm issues such as Violent Extremism or Child Sexual Abuse Material (CSAM) – a subcategory of Child Safety – we do not issue warnings but rather immediately disable the account and remove the content.”

“In the second quarter of 2022, we reported 21,529 accounts to NCMEC, a 101% increase in reports made when compared to the first quarter of 2022,” Discord added. “21,425 of those reports were media (images or videos), of which many were flagged through PhotoDNA – a tool that uses a shared industry hash database of known CSAM. 104 high-harm grooming or endangerment reports were also delivered to NCMEC.”
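For context, here is a minimal, hypothetical sketch of the hash-database matching idea Discord describes: an uploaded file’s hash is checked against a shared list of hashes of known abusive material. PhotoDNA itself is proprietary and uses a robust perceptual hash; the plain SHA-256 lookup below is purely illustrative, and every name in it is an assumption made up for this example.

    # Illustrative only: real systems like PhotoDNA use perceptual hashes
    # that survive resizing and re-encoding; a SHA-256 digest matches only
    # byte-identical files.
    import hashlib

    # Hypothetical stand-in for a shared industry database of known hashes.
    KNOWN_HASHES: set[str] = set()

    def file_hash(path: str) -> str:
        """Return the SHA-256 hex digest of the file at `path`."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_match(path: str) -> bool:
        """Flag an upload whose hash appears in the shared database."""
        return file_hash(path) in KNOWN_HASHES

In a pipeline like the one Discord describes, a match would trigger the immediate account disabling and NCMEC report mentioned above rather than a warning.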

The company also noted that it disabled over 27.5 million accounts for spam or spam-related offenses, a 6.5% increase from the previous quarter.

The other policy violations that led to account and server removals include:

  • Sending exploitative and unsolicited content or sharing sexually explicit content of other people without their consent (147,249 accounts disabled and 2,326 servers removed).
  • Harassment and bullying, including sending severe negative comments and suggestive or overt threats (13,779 accounts disabled and 598 servers removed).
  • Malicious impersonation of individuals or organizations (148 accounts disabled and 12 servers removed).
  • Illegal or dangerous activities, including selling or facilitating the sale of illegal goods or services (27,494 accounts disabled and 4,639 servers removed).
  • Self-harm concerns, including accounts that promote suicide or self-harm (2,495 accounts disabled and 620 servers removed).

About the Author

Colin Thierry is a former cybersecurity researcher and journalist for SafetyDetectives who has written a wide variety of content for the web over the past 2 years. In his free time, he enjoys spending time outdoors, traveling, watching sports, and playing video games.