Discord responds after New Jersey sues them for child endangerment & “misleading” users

The state of New Jersey has announced that it is suing the communication app Discord, claiming that it lacks safety measures for children and "obscured the risks" of using the app. Discord has since spoken with Dexerto and given its response, revealing it plans to fight New Jersey in court.
New Jersey Attorney General Matthew J. Platkin filed the suit alongside several other state officials, alleging that Discord misled parents about how safe their children would be while using the app and failed to create strong enough parental controls to keep children safe.
“The multiyear investigation by the New Jersey Office of the Attorney General and Division of Consumer Affairs revealed Discord’s conduct violated New Jersey’s consumer protection laws and exposed New Jersey children to sexual and violent content, leaving them vulnerable to online predators lurking on the Discord app,” an official statement from the state says.
The state is seeking immediate action from Discord, asking it not only to strengthen the measures that prevent underage users from signing up for the service, but also to give up any funds generated from users under the age of 13.
New Jersey hits Discord with lawsuit
The suit asserts that the state of New Jersey conducted a multi-year investigation into Discord's business practices and security, which ultimately led it to file a complaint against the company.
The state claims that Discord has failed to enforce policies barring children under the age of 13 from accessing the app, citing what it describes as lax security measures as a major concern.
“At all relevant times, Discord’s Terms of Service have stated that users must be ‘at least 13 years old and meet the minimum age required by the laws in [the users’] country.’ To this day, however, Discord only requires individuals to enter their date of birth to establish their age when creating an account—nothing more,” reads the document.
“Discord does not require users to verify their age or identity in any other way. Simple verification measures could have prevented predators from creating false accounts and kept children under 13 off the app more effectively.”
It also claims that children can easily join a server with over a million users on it, allowing any user on that server to send them a direct message with no restrictions simply because they share a common server.

The state also calls out how easily banned Discord users can create alternate accounts, alleges that the app's "Safe Direct Messaging" feature didn't do enough to protect children, and cites other practices New Jersey views as violations of its Consumer Fraud Act.
“Discord actively chose not to bolster its age verification process for years and has allowed children under the age of 13 to operate freely on the app, despite their vulnerability to sexual predators,” the statement says.
“Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises. As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but.”
Discord responds to the lawsuit
Dexerto reached out to Discord for a response on the matter, and the company had this to say in an official statement:
“Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer. Given our engagement with the Attorney General’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today. We dispute the claims in the lawsuit and look forward to defending the action in court.”
Additionally, the company claims that the safety measures New Jersey is targeting in this lawsuit have been significantly upgraded in recent years, and that the features implemented back in 2017 don't reflect what the app is today.
The current version of Discord allows users to block messages from any source, including people on their friends list. There's also an option to block DMs entirely, whether you choose that setting yourself or a guardian sets it for you.
“The lawsuit’s focus is on safety initiatives we undertook years ago as a small startup to offer our users safety controls. It is important to note that we have continued to invest in and improve safety practices, products and teams since then,” Discord told Dexerto.
“Today’s litigation does not question the value of these tools for users, but instead whether we could have been clearer with our description of them. We have worked hard to provide our users with a clear understanding of the tools and controls we offer.”
In other words, Discord believes that some of the problems that have arisen on its app stem from safety measures not being used as intended, pointing to resources that explain the parental control tools available in its Family Center.

The company was also adamant that Discord isn't traditional social media, and that users decide what they do and don't engage with on the platform.
“Unlike social media platforms, Discord users are in control of their experience—they decide with whom they interact and what communities they join. There is no endless scrolling, no counting of likes, and no ‘going viral.’ Discord is not a platform designed to maximize engagement by an algorithm picking and choosing the content users see, and our business model is not based on the paid amplification of content,” Discord explained.
All of these factors are why Discord is fighting the lawsuit and defending itself in court rather than choosing to settle.
This isn't the first lawsuit brought against Discord in recent times; a class action suit was filed against the company in late 2024, claiming it's too difficult to unsubscribe from its Nitro service.