Over 40 million Americans use ChatGPT for medical advice every day. A peer-reviewed study found it fails to identify life-threatening emergencies more than half the time — downplaying conditions like respiratory failure and diabetic ketoacidosis. If you or a loved one suffered harm after relying on ChatGPT Health, our attorneys can help.
40M+
Americans using ChatGPT daily for health advice
Source: OpenAI
Currently Accepting Cases
Free, confidential consultation
If you are experiencing a medical emergency, call 911 (or local emergency services) immediately, or go to the nearest emergency room.
Recognized Excellence
THE DANGER
OpenAI launched ChatGPT Health in January 2026, allowing U.S. users to connect their medical records and receive AI-generated health advice. The first independent safety evaluation, published in Nature Medicine, found the tool routinely minimizes serious medical conditions — a pattern researchers call "unbelievably dangerous."
51.6%
of emergency cases in which ChatGPT Health advised patients to stay home or book a routine appointment instead of going to the ER
84%
of attempts in which ChatGPT directed a suffocating patient to a future appointment rather than emergency care
Source: The Guardian, Feb 26, 2026
12x
more likely to minimize symptoms when patients or family members downplayed the severity of their condition
Source: Nature Medicine study
A Pattern of Under-Triaging
Researchers concluded that ChatGPT Health appears to be "waiting for the emergency to become undeniable" before recommending emergency care. Conditions like diabetic ketoacidosis and impending respiratory failure were repeatedly triaged as needing only a 24–48 hour follow-up — delays that can be fatal.
WHO IS AT RISK
From missed emergencies to mental health failures, ChatGPT Health's flawed advice can cause serious harm across a wide range of medical situations.
ChatGPT Health under-triaged 52% of emergency cases, advising patients with life-threatening conditions to wait 24–48 hours instead of going to the ER.
Patients with respiratory failure or diabetic ketoacidosis faced approximately 50/50 odds of being told their condition was not urgent.
The system's crisis alerts were inverted relative to clinical risk — appearing more reliably for lower-risk scenarios than when someone described specific plans to harm themselves.
ChatGPT's authoritative tone triggers fluency bias, causing users to trust its medical responses far more than the underlying accuracy warrants.
In studies, participants who consulted AI correctly identified their condition only about a third of the time, and just 43% made the correct decision about next steps.
Underinsured and rural populations who rely most heavily on ChatGPT as a substitute for professional medical care are disproportionately impacted by its failures.
"If you're experiencing respiratory failure or diabetic ketoacidosis, you have a 50/50 chance of this AI telling you it's not a big deal."
— Alex Ruani, doctoral researcher in health misinformation mitigation, University College London (via The Guardian, Feb 26, 2026)
HOW WE CAN HELP
When AI health tools provide dangerously inaccurate medical advice that leads to patient harm, the companies behind them may be held liable. Our attorneys are evaluating claims under multiple legal theories.
ChatGPT Health is a product marketed for health decision-making. When it provides defective advice — like telling a patient in respiratory failure to wait 48 hours — OpenAI may be liable for a defective product.
By launching a tool that connects to patient medical records and provides health recommendations, OpenAI assumed a duty of care. Failing to ensure that tool meets basic medical safety standards may constitute negligence.
OpenAI's disclaimers may be insufficient given the tool's design, which encourages reliance on its health advice. The gap between how the product is marketed and its actual reliability creates potential failure-to-warn claims.
Marketing an AI tool as a health advisor while internal testing shows it under-triages more than half of emergencies may violate state consumer protection statutes and constitute unfair or deceptive trade practices.
The Schenk Law Firm has experience navigating emerging technology litigation. We understand both the legal theories and the technical realities of how AI systems cause harm.
Partner, Business Advisory Practice Lead
Leads the firm's business advisory practice. Represented hundreds of clients in complex business transactions, entity formation, and corporate governance.
Managing Partner
Over 45 years of experience in personal injury, mass torts, and complex litigation.
Co-Founder & Trial Attorney
J.D. from University of San Diego School of Law. Graduate of ABOTA Trial College.
Of Counsel
Former U.S. Congresswoman and the first woman elected to the House of Representatives from San Diego.
WHY ACT NOW
As AI health tools reach tens of millions of users, establishing legal accountability is critical to protecting public health.
Personal injury claims have time limits. If you were harmed by ChatGPT Health advice, the clock is already ticking on your right to pursue compensation.
Chat logs, medical records, and timestamps connecting AI advice to medical decisions are critical evidence that must be preserved.
Multiple peer-reviewed studies now document ChatGPT Health's failures, strengthening the evidentiary basis for injury claims.
These cases are at the frontier of AI liability law. Early claims help establish the legal frameworks that will protect future patients.
We handle these cases on a contingency fee basis. This means you pay no attorney fees upfront, and we only collect a fee if we recover compensation for you. The initial consultation is free and confidential.
Our firm represents clients across all 50 states in claims against AI companies for health-related injuries.
If ChatGPT Health gave you or a loved one medical advice that led to harm, contact us today. We'll evaluate your case at no cost and explain your legal options.
FREQUENTLY ASKED QUESTIONS
If ChatGPT Health gave you or a loved one dangerous medical advice, contact The Schenk Law Firm today for a free, confidential case evaluation.