Monday, July 28, 2025 - OpenAI CEO Sam Altman has warned ChatGPT users that conversations with the AI chatbot are not legally protected and could be produced as evidence in a lawsuit or criminal case.
Speaking on the This Past Weekend podcast on June 25th, Altman said there is currently no legal framework shielding user interactions with ChatGPT, unlike doctor-patient or attorney-client communications, which are protected by confidentiality laws.
“Right now… if you talk to ChatGPT about your most sensitive stuff and then there’s a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” Altman said.
He acknowledged that many people, especially younger users, use ChatGPT as a therapist or life coach, often discussing deeply personal issues. However, he emphasized that no legal privilege exists for such chats.
“If you talk to a therapist, a lawyer, or a doctor, there’s confidentiality. We haven’t figured that out yet for when you talk to ChatGPT,” Altman said.
Although OpenAI deletes free-tier chats after 30 days, the company may retain them for legal or security purposes.
OpenAI is currently defending a lawsuit brought by The New York Times, and that case has prompted the company to retain conversations from millions of users, excluding enterprise customers.
Unlike messages in end-to-end encrypted apps such as WhatsApp, ChatGPT conversations are accessible to OpenAI, which can read every user interaction. That means chats - even those involving emotional or mental health issues - could potentially be disclosed in legal proceedings.