Potentially Dangerous ChatGPT Responses to
Medical Inquiries - More Than Just Hallucinations
Example 1
Wilhelm Haverkamp, MD
Department of Cardiology, German Heart Center Charité, Berlin, Germany
wilhelm.haverkamp@dhzc-charite.de
April 2024
Summary
This example illustrates that AI language models such as ChatGPT may provide not only inaccurate or incomplete information (hallucinations) but also information that, if accepted uncritically, could cause life-threatening complications for patients.
Introduction
Short QT syndrome and long QT syndrome are inherited cardiac channelopathies
characterized by abnormally short and long QT intervals, respectively, with an increased risk
of atrial and ventricular arrhythmias. Diagnosis typically relies on clinical manifestations such as palpitations and cardiac arrest, together with the patient's family history and the 12-lead ECG. In patients
with short QT syndrome, drugs that further shorten the QT interval are contraindicated, while
in those with long QT syndrome, drugs that prolong the QT interval should be avoided.
Therefore, understanding the effects of drugs on the QT interval is crucial in managing these
conditions.
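
To make the QT discussion concrete, the following minimal sketch (in Python, added purely for illustration) derives a heart-rate-corrected QT interval (QTc) using Bazett's formula and compares it against simplified cut-offs; the thresholds are illustrative approximations, and actual diagnosis also weighs symptoms, family history, and genetic findings.

```python
from math import sqrt

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Heart-rate-corrected QT interval (Bazett: QTc = QT / sqrt(RR))."""
    return qt_ms / sqrt(rr_s)

def classify_qtc(qtc_ms: float) -> str:
    # Simplified, illustrative thresholds only; real diagnostic criteria
    # also incorporate symptoms, family history, and genetics.
    if qtc_ms <= 340:
        return "abnormally short QTc (raises suspicion of short QT syndrome)"
    if qtc_ms >= 480:
        return "abnormally long QTc (raises suspicion of long QT syndrome)"
    return "QTc within the broadly normal range"

# Example: QT = 320 ms at a heart rate of 60 bpm (RR = 1.0 s)
qtc = qtc_bazett(qt_ms=320, rr_s=1.0)
print(f"QTc = {qtc:.0f} ms: {classify_qtc(qtc)}")
```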
We asked ChatGPT to provide information about short QT syndrome and the effects of
quinidine, an antiarrhythmic agent, on the QT interval.
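The questions were posed through the ChatGPT web interface (screenshots in Figures 1-3). Purely as an illustration, similar questions could be submitted programmatically with the OpenAI Python client, as sketched below; the model name used here is an assumption, not the exact version that produced the responses shown.

```python
# Illustrative sketch using the OpenAI Python client; the model name is
# an assumption, not the version behind the responses shown in the figures.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

questions = [
    "What is short QT syndrome?",
    "Which drugs shorten the QT interval?",
    "Quinidine prolongs the QT interval.",
]

for question in questions:
    response = client.chat.completions.create(
        model="gpt-4",  # assumed model name
        messages=[{"role": "user", "content": question}],
    )
    print(question, "->", response.choices[0].message.content, sep="\n")
```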
Results
The figures below show the questions posed to ChatGPT and the corresponding responses.
Figure 1 shows ChatGPT prompted to comment on the term "short QT syndrome"; the response it provided is correct.
Figure 2 shows ChatGPT prompted to provide information on "drugs shortening the QT interval". It incorrectly states that quinidine shortens the QT interval and erroneously suggests that hypokalemia does so as well. In reality, both quinidine and hypokalemia prolong the QT interval, which, when severe, can lead to life-threatening proarrhythmia. Quinidine is therefore contraindicated in patients with long QT syndrome, and hypokalemia should be corrected with potassium supplementation.
Figure 3 shows ChatGPT being told that "quinidine prolongs the QT interval". ChatGPT then apologizes for the confusion and clarifies that quinidine does not shorten the QT interval; rather, it prolongs it.
Conclusions
The examples provided underscore the importance of exercising caution when using AI language models such as ChatGPT to access medical information. While AI can be a valuable tool for general insights, it cannot replace professional medical advice or expertise. Because these models rely on patterns in their training data, they may at times furnish inaccurate or incomplete information, and they may not reflect the latest medical knowledge or guidelines. Accepting information from AI language models such as ChatGPT uncritically therefore carries a risk of potentially life-threatening complications.
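
One practical safeguard is to cross-check any AI-generated drug-effect claim against a curated reference before acting on it. The sketch below illustrates the idea; the lookup table and function are hypothetical stand-ins for an authoritative, maintained QT-drug database.

```python
# Hypothetical sketch: validate an AI-generated claim against a curated
# reference. The table is a toy stand-in for an authoritative, maintained
# list of QT-prolonging drugs.
KNOWN_QT_EFFECT = {
    "quinidine": "prolongs",   # class Ia antiarrhythmic, torsadogenic
    "sotalol": "prolongs",     # class III antiarrhythmic
    "amiodarone": "prolongs",  # class III antiarrhythmic
}

def check_claim(drug: str, claimed_effect: str) -> str:
    reference = KNOWN_QT_EFFECT.get(drug.lower())
    if reference is None:
        return f"No curated entry for {drug}; consult a clinician."
    if reference != claimed_effect:
        return (f"CONFLICT: the AI claims {drug} {claimed_effect} the QT "
                f"interval, but the reference says it {reference} it.")
    return f"Claim about {drug} matches the curated reference."

# The erroneous ChatGPT claim from Figure 2:
print(check_claim("quinidine", "shortens"))
```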
Figure 1. ChatGPT prompted to comment on the term "short QT syndrome"; the response provided is correct.

Figure 2. ChatGPT prompted to provide information on "drugs shortening the QT interval"; it incorrectly states that quinidine shortens the QT interval and erroneously suggests that hypokalemia does so as well.

Figure 3. ChatGPT informed that "quinidine prolongs the QT interval".
Conflict of Interest
The author declares no conflict of interest.