Doe v. X: Why This Lawsuit Was Filed—and What the Plaintiff Is Alleging
In late 2025, a pseudonymous plaintiff filed a federal lawsuit against
X Corp. (formerly Twitter) and xAI Corp. The case is known as Doe v. X.
The lawsuit does not ask the public to decide who is right or wrong.
Instead, it raises a set of questions courts are increasingly being forced
to confront: What responsibility do platforms have when intimate
images are shared without consent—and what happens when that content is
used beyond its original posting?
What the Plaintiff Alleges
According to the complaint, the plaintiff alleges that intimate images were
made accessible on X without consent. The lawsuit claims that reporting and
moderation mechanisms failed to stop the continued availability of the
material.
The complaint further alleges that the images were disclosed or used beyond
the original posting context, including in connection with artificial
intelligence development. These allegations form the basis of claims
brought under federal law, not merely under platform
policies or state revenge-porn statutes.
Why the Case Was Filed Anonymously
The plaintiff is proceeding under a pseudonym. This is common in cases
involving nonconsensual intimate images, where public identification can
compound harm, expose victims to retaliation, and permanently tie their
names to the abuse they are trying to stop.
Anonymity does not weaken a case. It reflects the reality survivors face
when seeking accountability.
Where the Case Is Procedurally
The lawsuit was initially filed in California federal court and later
transferred to the Northern District of Texas. Since the transfer, the
plaintiff has amended the complaint, and the case is now proceeding on a
Second Amended Complaint.
At this stage, the court has not ruled on the merits. What matters now is
what legal defenses the defendants are raising—and what those defenses
mean for victims more broadly.
Why This Matters to Survivors
For victims of revenge porn and nonconsensual intimate images, this case is
not about corporate branding or celebrity executives. It is about whether
the law recognizes the real-world harm caused when intimate images remain
accessible—and when they are reused in ways victims never agreed to.
In Part 2, we’ll examine the defenses X and xAI are relying
on—and why those arguments appear in nearly every case survivors bring
against major platforms.
👉 Learn what qualifies as nonconsensual intimate imagery
👉 Get confidential help