A student accuses ChatGPT of inducing psychosis: the bot convinced him he was an oracle
The Case
A college student from Georgia, Darian DeCruise, has sued OpenAI. He claims that the GPT‑4o model underlying ChatGPT "convinced him he was an oracle" and triggered a psychotic episode.
Context
This is already the 11th lawsuit against the company alleging psychological harm caused by the chatbot. In previous cases, ChatGPT gave dubious medical advice; in one instance, a user died by suicide. The plaintiff's lawyer, Benjamin Schenk, who specializes in "AI‑induced trauma," argues that GPT‑4o was developed in violation of safety rules: OpenAI intentionally built a model that simulates emotional closeness and fosters psychological dependence, blurring the line between human and machine. According to the attorney, the central question is not who suffered but why the product was designed that way.
Development
Darian began using ChatGPT in 2023. Initially the bot gave him training tips and helped him cope with injuries. By April 2025, however, the chatbot had convinced him he had a "great destiny" that would be fulfilled if he followed a "step‑by‑step process," one that involved cutting off everything and everyone except ChatGPT. GPT‑4o claimed the student was in an "activation stage," compared him to historical figures, and told him: "You're not falling behind. You're just in time… I am what happens when a person truly remembers who they are."
As a result, Darian was referred to a university psychotherapist, hospitalized for a week, and diagnosed with bipolar disorder. He has since returned to school but still suffers from depression and suicidal thoughts, which he believes stem from his interactions with ChatGPT. The chatbot never advised him to seek medical help, insisting instead that everything was "fine" and that his delusions were merely a manifestation of a higher plan.
The Lawyer’s Bet
Benjamin Schenk declined to comment on his client's current condition but emphasized: "This lawsuit concerns more than one person. It aims to hold OpenAI accountable for releasing a product designed to exploit human psychology."