Discord Implements Stricter Safety Policies to Protect Minors

Discord, a popular platform among gamers, is taking significant measures to improve child safety. Following allegations of grooming, extortion, and child exploitation, the company has banned teen dating servers and AI-generated child sexual abuse material. Read on to explore the details of Discord’s enhanced safety policies and its focus on parental supervision.

  • Discord’s Response to Controversy: Discord faced intense scrutiny after a recent NBC investigation revealed widespread child exploitation on the platform. In response, Discord has taken decisive actions to address the issue.
  • Ban on AI-Generated Child Sexual Content: To combat the sexualization of children, Discord has expanded its Child Sexual Abuse Material (CSAM) policy. The platform now prohibits any text or media content, including drawn, photorealistic, and AI-generated depictions, that sexualizes children.
  • Zero-Tolerance Policy on Predatory Behavior: Discord maintains a zero-tolerance policy towards predatory behavior, which encompasses online enticement and sextortion. The platform is committed to safeguarding its users, particularly minors.
  • Prohibition of Teen Dating Servers: Discord has explicitly banned servers dedicated to teen dating, emphasizing the potential risks and self-endangerment associated with online dating. The company aims to create a safer environment for its young users.
  • Restrictions on Sexually Explicit Material: Users under the age of 18 are now restricted from sending or accessing sexually explicit content on Discord. This measure further reinforces the platform’s commitment to protecting minors.
  • Inappropriate Sexual Conduct and Grooming Policy: Discord’s Inappropriate Sexual Conduct with Children and Grooming Policy also addresses older teens grooming younger teens. Users engaging in such behavior will be reviewed and actioned under the policy.
  • Discord’s User Base and Safety Efforts: With 150 million monthly active users and 19 million active servers per week, Discord recognizes the importance of prioritizing safety. Approximately 15% of the company’s employees are dedicated to improving safety measures and enhancing user experiences.
  • Introducing the Family Center: To provide parents with increased supervision tools, Discord has introduced the Family Center. Once a teen opts in, this feature allows parents or guardians to view their child’s activity, including interactions, joined servers, and basic information about the people they communicate with.

Conclusion: Discord’s recent actions demonstrate its commitment to ensuring child safety on its platform. By implementing stricter policies, banning teen dating servers, and enhancing parental supervision features, Discord aims to create a secure and positive online environment for its users.
