The internet has revolutionized access to information, and healthcare is no exception. Now, a growing number of individuals are turning to artificial intelligence (AI) tools like OpenAI’s ChatGPT, Google Bard, and Microsoft Bing to navigate the complexities of medical information. ChatGPT, boasting over 100 million users and readily available as a mobile app, offers instant, easy-to-understand answers to a wide array of questions. The basic functions are free, making it seem like a convenient and affordable alternative to traditional healthcare resources.
However, while the allure of quick and accessible medical information is strong, experts caution against relying on AI chatbots for medical advice. Compared to consulting a healthcare provider, using ChatGPT might appear simpler, but it’s crucial to understand its limitations and potential risks, especially when it comes to your health.
“No such unregulated device should be used for medical advice, given the potential high stakes of people misunderstanding or applying such information to their health,” warns Jonathan Chen, MD, PhD, an assistant professor of medicine and physician-scientist at Stanford University School of Medicine. He emphasizes the need for both patients and clinicians to be aware of what these tools truly offer and their inherent limitations.
This article explores when it’s safe to use AI tools like ChatGPT for health-related queries, when you shouldn’t, and the potential harms to be aware of.
Understanding ChatGPT for Health: Informational Tool, Not a Diagnostic One
While AI services like ChatGPT are not substitutes for medical professionals, they can be valuable resources for obtaining general health information. Rigved Tadwalkar, MD, a board-certified cardiologist at Providence Saint John’s Health Center, explains that these tools excel at providing information on health conditions, medications, diseases, and various medical topics.
For instance, if you want to learn more about the common cold, ChatGPT can generate responses covering symptoms, causes, risk factors, and treatments. Similarly, if you need information about a specific medication, such as its purpose or potential side effects, AI can be helpful.
Dr. Tadwalkar highlights the tool’s strength in providing readily accessible information, particularly for straightforward inquiries. He notes, “That’s really where something like AI can shine, and where we see that there is some degree of reliability in the responses.”
Dr. Chen adds that ChatGPT can be useful for simplifying complex medical information from reputable sources like medical websites and scientific studies, making it easier to understand for the general public. Furthermore, these tools can assist in preparing for doctor appointments. For example, you can ask ChatGPT what information to bring to your appointment or what questions to ask your healthcare provider.
Preliminary research suggests ChatGPT performs better than older online symptom checkers at suggesting potential diagnoses and recommending appropriate triage. However, Dr. Chen points out that even ChatGPT is incorrect more than 10% of the time.
“It’s more of an informational tool as opposed to being diagnostic and giving definitive advice the way that most people would want it to,” says Dr. Tadwalkar. “This is a good informational resource on a lot of occasions, but it’s not the end-all because it is just not quite at that level.”
The Dangers of Using ChatGPT for Self-Diagnosis
Despite its benefits, it’s crucial to understand that ChatGPT should not be used for self-diagnosis or as a source of definitive medical advice. Dr. Tadwalkar emphasizes that relying on AI for diagnosis can be risky because the information provided may be inaccurate.
“Many either may not be aware of this fact or may be taking it lightly, but ChatGPT can flat out lie sometimes,” he cautions. “This is where it becomes dangerous.”
Dr. Chen further explains that relying on ChatGPT for symptom interpretation can lead to unnecessary anxiety or, conversely, dangerous complacency.
“There are harms from both over- and under-diagnosis,” states Dr. Chen. “Obviously, if a patient is falsely reassured by a chatbot and declines to seek real medical attention, delays or misses in critical diagnoses (e.g., heart attack or cancer) would be devastating.”
Another significant limitation of ChatGPT is its potential to provide outdated information. Medical research is constantly evolving, and AI models may not always be updated with the latest findings. This means that the responses from a chatbot could be factually incorrect or no longer reflect current medical best practices.
Dr. Tadwalkar also points out that ChatGPT lacks the ability to consider individual patient details unless explicitly provided. Factors like family history, medical history, medications, lifestyle, and demographics are crucial for accurate medical assessments. Without this personalized context, the advice generated by ChatGPT may not be appropriate or safe for a specific individual.
“Users are not often inputting that degree of information. Even if they did, I’m not sure if the AI is in a place where it can recognize all of that,” he explains.
Best Practices: How to Use ChatGPT Responsibly for Health Information
If you choose to use ChatGPT for health-related questions, experts recommend several best practices to minimize risks and maximize its utility as an informational tool.
Be Specific and Provide Details
When asking ChatGPT about health concerns, provide detailed information. For example, instead of asking “What could my cough be?”, specify the duration of the cough and any accompanying symptoms like fever or chills.
Including relevant medical history, such as pre-existing conditions like asthma or COPD, can also help the AI generate more contextually relevant information.
Be Careful About Sharing Personal Info
While providing some general health details is helpful, avoid sharing sensitive personal information directly with ChatGPT. Dr. Chen warns that any data entered into these systems, including personal medical details, is collected by the companies operating the AI.
“When you enter (copy-and-paste) any personal medical information into these systems, you are uploading private information to a big tech company for them to do whatever they wish to do with it,” he states.
To mitigate privacy risks, consider paraphrasing your questions using general scenarios. For example, instead of “I am a 65-year-old with a persistent cough…”, you could ask, “What are possible diagnoses and tests for a person over 60 with a chronic dry cough?”
Do Your Research and Check Sources
Always verify any medical information obtained from ChatGPT with reliable sources. Dr. Chen advises cross-referencing AI-generated advice with credible sources like government health websites (.gov) or educational institutions (.edu). Prioritize sources that are up-to-date and reputable to ensure accuracy.
Always Consult a Healthcare Professional
Even when using ChatGPT as an informational resource, both Dr. Tadwalkar and Dr. Chen stress the importance of following up with a qualified healthcare provider. Whether in-person or via telehealth, consulting a human doctor is essential for accurate diagnosis, personalized treatment, and comprehensive care.
“Patients should still see a physician,” emphasizes Dr. Tadwalkar. “I look at these AI chatbots like ChatGPT as being just complementary. It’s just another tool in the toolbox. Just like how when you would Google something, you would still go see your physician to see if there’s some accuracy. I would do the same with these AI chatbots.”
Conclusion
AI tools like ChatGPT offer a convenient way to access general health information. They can be helpful for learning about conditions, medications, and preparing for doctor visits. However, it’s critical to remember that ChatGPT is not designed for medical diagnosis and should not replace professional medical advice.
Using ChatGPT responsibly for health information means understanding its limitations, being cautious about personal data, verifying information with trusted sources, and, most importantly, always consulting with a healthcare professional for any health concerns. Think of ChatGPT as an initial informational starting point, not the final word on your health. Your well-being is best served by the expertise and personalized care of human medical professionals.