On February 11, 2026, Discord, one of the largest online communication platforms in the world, announced that it would begin rolling out age-verification requirements at the end of February. The service, originally built for gamers as a more modern replacement for tools such as TeamSpeak 3 and Mumble, has over 200 million monthly active users and over 600 million registered accounts. It was long considered low-bloat and easy to use, so why are users so concerned about age verification that they have begun looking for alternatives?
“The Incident”
In October 2025, Discord confirmed that its age-verification data had been stolen by cyber actors. While Discord claims only 70,000 users had their photo IDs exposed, the hackers claim they exfiltrated over 2.1 million. Few details are known about how exactly the breach occurred, but we do know that:
- Discord’s third-party support vendor, 5CA, was the source of the breach
- The incident started on September 20, 2025, and attackers reportedly gained access for 58 hours
- Scattered Lapsus$ Hunters have claimed responsibility for the attack
From this information, we can infer a likely attack path. Casmer Labs believes that the credentials of one or more of the contracted support agents were compromised. From there, the government ID images, which were stored unencrypted inside support tickets, could be downloaded. This generally aligns with the 58-hour window the attackers reportedly had to execute the breach.
But wait, what? Why were government IDs stored unencrypted? And why was that sensitive information being retained even after those tickets were closed?
The Consequences and What We Can Learn
It’s clear that Discord has lost the trust of its users: storing sensitive information in a ticketing system not designed for PII is a blunder in itself. At the time, age verification was not required, and much of the leaked information was the result of user appeals. Now that age verification is required to access the full suite of Discord functionality, there will undoubtedly be a much higher volume of sensitive information stored by Discord and/or its contractors.
As more industries begin to implement and mandate age verification practices, the amount of sensitive data being held by organizations and their contractors will continue to increase. Especially for organizations that have little experience in handling sensitive information, Casmer Labs anticipates that we will continue to see an increase in similar breaches over the coming months and years.
We know, we know. This is far less technical than our usual articles. But below, we’re going to drop some general, high-level recommendations for organizations handling PII, especially in the cloud:
- Automatically discover all sensitive information as soon as it is ingested. A robust data classification tool is extremely useful for this.
- Whenever possible, label data based on sensitivity (Public, Internal, Confidential, Restricted, etc.)
- Ensure that data retention policies are well-thought-out and consistently enforced.
- Collect only the data that you need; there’s no “just in case” when a breach could cost you millions.
- Ensure that data is never stored for longer than it is needed. In AWS, for example, S3 lifecycle policies can automatically delete or archive PII after a set period.
- Encrypt all data at rest and in transit.
- All databases and block, file, and object stores should have encryption enabled.
- Enforce TLS/SSL whenever possible for data moving between your users and the cloud.
- Enforce strict IAM policies, starting with the principle of least privilege.
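To make the retention point above concrete, here is a minimal sketch of an S3 lifecycle configuration that permanently expires objects after a fixed window. The bucket layout, the `uploads/ids/` prefix, and the 30-day window are all hypothetical values for illustration; nothing here reflects Discord’s or 5CA’s actual setup.

```python
import json

# Hypothetical retention window -- tune to your own retention policy.
RETENTION_DAYS = 30

lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-id-uploads",
            # Scope the rule to the prefix where PII uploads land (hypothetical)
            "Filter": {"Prefix": "uploads/ids/"},
            "Status": "Enabled",
            # Permanently delete current objects after the window
            "Expiration": {"Days": RETENTION_DAYS},
            # Clean up old versions too, if bucket versioning is enabled
            "NoncurrentVersionExpiration": {"NoncurrentDays": RETENTION_DAYS},
            # Remove leftover incomplete multipart uploads
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        }
    ]
}

print(json.dumps(lifecycle_config, indent=2))
```

In practice, this dictionary would be applied with boto3’s `put_bucket_lifecycle_configuration`, or the equivalent Terraform/CloudFormation resource. The key design choice is that deletion is automatic: nobody has to remember to purge closed-ticket attachments.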
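For encryption at rest, S3 supports default bucket encryption, so every new object is encrypted without application changes. Below is a sketch of such a configuration using SSE-KMS; the KMS key ARN is a placeholder you would replace with your own key.

```python
import json

# Placeholder KMS key ARN -- substitute your own customer-managed key.
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/example-key-id"

encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",   # SSE-KMS rather than plain SSE-S3
                "KMSMasterKeyID": KMS_KEY_ARN,
            },
            # S3 Bucket Keys reduce KMS request volume on busy buckets
            "BucketKeyEnabled": True,
        }
    ]
}

print(json.dumps(encryption_config, indent=2))
```

This would be applied via boto3’s `put_bucket_encryption`. Using a customer-managed KMS key (rather than the AWS-managed default) also gives you an audit trail of every decrypt call, which matters when the objects are government IDs.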
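Finally, the least-privilege and TLS points can be combined in a single IAM policy sketch. The bucket name and prefix are hypothetical: the idea is that a support agent can read ticket attachments, and nothing else, and that any request not made over TLS is refused outright.

```python
import json

# Hypothetical bucket: agents may read objects under tickets/,
# but cannot list the bucket, delete objects, or touch other prefixes.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadTicketAttachmentsOnly",
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-support-bucket/tickets/*",
        },
        {
            "Sid": "DenyUnencryptedTransport",
            "Effect": "Deny",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-support-bucket",
                "arn:aws:s3:::example-support-bucket/*",
            ],
            # Deny any request that does not arrive over TLS
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
    ],
}

print(json.dumps(policy, indent=2))
```

An explicit Deny always wins over any Allow in IAM evaluation, so the `aws:SecureTransport` statement enforces encryption in transit even if a broader Allow is attached elsewhere.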