Using AI as Your Doctor: What Patients and Doctors Really Think
Written by Karen Selby | Edited by Walter Pacheco
Artificial intelligence is quickly becoming a go-to source for people worried about their health. From checking symptoms to researching treatment options, AI tools like ChatGPT are now playing a bigger role in how Americans approach cancer care and other health concerns.
To better understand this shift, we surveyed 750 U.S. adults and 256 health care professionals about their experiences with AI in medical settings. The results reveal the potential benefits and pitfalls of using AI in early detection, diagnosis and decision-making.
Key Takeaways
- When seeking health information, Americans have most often used AI tools to learn more about symptoms (72%), treatment plans (47%) and next steps (37%).
- More than half of Americans (52%) have used ChatGPT to check their symptoms, with Gen Z leading the way at 66%.
- Nearly 1 in 3 Americans (32%) said they’d delay or skip seeing a doctor if an AI tool indicated their symptoms were low risk.
- 58% of health care professionals said AI is making it harder to treat patients.
- 29% of health care professionals are very concerned about AI spreading misinformation about cancer detection and symptoms.
- 24% of health care professionals have seen patients delay or avoid treatment because of AI’s influence, and 13% believe AI is negatively impacting early cancer detection.
How Americans Use AI Tools for Health Advice
AI tools are changing the way people respond to early warning signs and health concerns, but not always in ways that help. While many Americans are using platforms like ChatGPT to research symptoms or understand treatment options, the growing reliance on AI may lead some to skip important steps in their care.
More than half of Americans (52%) have turned to ChatGPT when experiencing concerning medical symptoms. This number climbs even higher among Gen Z (born 1997–2007), with 66% saying they used the tool for health advice, followed by 54% of millennials. Among those who used ChatGPT to check their symptoms, half reported that it led to a diagnosis.
However, placing too much trust in AI can have serious consequences. Nearly 1 in 3 Americans (32%) said they’d skip or delay seeing a doctor if an AI tool told them their symptoms were low risk. This could be especially risky for people experiencing mental health issues or symptoms that AI may misinterpret. Gen Z was the most likely to forgo care based on AI reassurance, with 39% reporting they would delay or skip the doctor entirely.
AI tools are also being used beyond symptom checks. Nearly half of respondents (47%) used them to learn about treatment plans, 37% to get advice on next steps and 17% to understand medical costs. More than 1 in 10 Americans (12%) reported using AI tools or online symptom checkers specifically for cancer-related symptoms.
Trust, Anxiety and the Role of AI in Health Decisions
Even as AI tools become more popular for health research, most people still put their trust in human doctors, and for good reason. In complex or life-altering situations, personalized medical care can make all the difference.
Four in five Americans (80%) said they trust a doctor the most for medical guidance. But anxiety and convenience often lead people elsewhere. Over one-third (37%) said they would Google a possible cancer symptom before picking up the phone, while just 35% would call a doctor first.
For rare or aggressive conditions, this delay could be critical. Some early cancer symptoms mimic more common illnesses, making it easy to misdiagnose without a specialist’s input. In cases like mesothelioma (a fast-growing cancer), waiting too long to get expert care can severely limit treatment options.
Kevin Hession, a pleural mesothelioma survivor, recalled the fear and misinformation he faced after turning to the internet:
“The worst day in my life was the day that I went to ‘Doctor Google’ to find out all about mesothelioma. I’ve since then learned that that’s not what’s recommended, but what Doctor Google told me was that I had 90 days of life following the diagnosis of mesothelioma. I’m now here after 4 years, and I’m stronger than ever in the post-mesothelioma time period that I’m in right now. So I’ve well exceeded the 90 days that Doctor Google said.”
Every cancer diagnosis is unique, and no online tool can fully understand a person’s individual health history, risk factors or emotional needs. While AI and search engines can offer general information, they lack the nuance to interpret subtle symptoms or tailor advice to each patient’s situation. That’s why it’s so important to seek guidance from a qualified medical professional, especially when facing something as complex and life-altering as cancer.
Looking ahead, many Americans expect AI to play a larger role in detecting serious conditions. While 16% think AI will surpass doctors in identifying early-stage cancer within the next 5 years, 28% believe it will happen within 10 to 20 years.
As these tools become more advanced, people also want stronger safeguards. A total of 62% agreed that AI health tools should be regulated like medical devices, and 46% said they would feel more confident in an AI diagnosis if it were FDA-cleared or doctor-approved.
Health Care Professionals’ Perspectives
Doctors are already seeing the impact of AI tools in their exam rooms, and many are raising red flags. While AI has the potential to support care, it’s also making it harder to guide patients who arrive with misinformation or misplaced confidence in a tool that lacks medical training.
More than half of health care professionals (58%) said AI is making it harder to treat patients. And while 27% of health care professionals said AI is helping with early cancer detection, 13% believed it’s doing more harm than good. Nearly one-third (29%) were very concerned about AI spreading misinformation about cancer detection and symptoms.
Some providers are seeing this play out firsthand. One-third said they’ve had patients arrive at appointments convinced they had a serious medical condition based solely on what an AI tool told them. Another 24% had seen patients delay or avoid treatment because of something they read through AI or a chatbot. These delays can be dangerous, especially when AI doesn’t account for the full complexity of a patient’s condition.
“AI is a powerful tool and a quick resource,” said Dr. Daniel Landau, an oncologist and hematologist who collaborates with The Mesothelioma Center at Asbestos.com. “In the right hands, AI has the power to change the world! And nowhere is this more true than in health care. AI has been shown to be able to read X-rays, interpret pathology specimens and tie together complicated cases to solve complex diagnostic situations. AI platforms like ChatGPT have been shown to be able to pass board exams that doctors have to take to become certified.”
Although AI has shown some promising results, Dr. Landau also warns of important limitations:
“AI models typically lag behind by several years. Much of the data is not up-to-date and can lead to historically accurate answers rather than basing answers on the most recent standards. Additionally, many AI models favor speed of response over accuracy, and this can lead to mistakes. So, while I am a big fan of AI, its use does have to be taken with a grain of salt and should not replace traditional medical opinions.”
Most health care providers (81%) agreed that AI tools should be regulated like medical devices to ensure safety and accuracy, a sentiment shared by most consumers in this study.
How to Use AI Tools Safely for Health Questions
AI can be a helpful starting point when you’re trying to make sense of a new symptom or treatment option, but it’s not a substitute for personalized care. Without the full picture of your health history or risk factors, AI tools can miss critical details or even steer you in the wrong direction. Using them wisely can help you stay informed without compromising your health. If you turn to AI for health advice:
- Use it for general education purposes, not for diagnosis. AI can explain medical terms or point you to possible conditions, but only a medical professional can confirm what’s really going on.
- Check the source. Look for information backed by credible organizations, such as the American Cancer Society, the Centers for Disease Control and Prevention or the Food and Drug Administration.
- Never delay care based on AI reassurance. Even if symptoms seem minor, don’t ignore your instincts. Delaying care could mean missing early signs of something serious. See a doctor if you feel something isn’t right.
- Bring questions to your provider. If AI raises new concerns or unfamiliar terms, consult your doctor. This can lead to better, more informed conversations during your appointment.
- Protect your privacy. Avoid entering personal health information into unverified tools or chatbots, especially those without clear data protections.
Used responsibly, AI can support your health journey, but it works best when it complements real medical expertise rather than replaces it.
Finding the Balance Between Innovation and Care
AI is transforming the way people engage with their health, not just by providing quick answers but also by shaping their decisions about care. It’s already having a real impact on when patients seek help and who they turn to first. As these tools become more common, the real challenge will be finding a healthy balance. Technology can support better care, but it shouldn’t replace the personal connection and professional insight that only a trusted provider can offer.
Methodology
We surveyed 750 U.S. adults and 256 health care professionals to explore how AI tools are influencing health decisions, particularly in the context of early cancer detection. The generations represented among the U.S. adult respondents included Gen Z (17%), millennials (55%), Gen X (21%), and baby boomers (6%).
About Asbestos.com
The Mesothelioma Center at Asbestos.com is the nation’s most trusted mesothelioma resource. We connect patients to top cancer specialists, provide support in navigating treatment and financial aid and educate families about asbestos exposure risks. Our Doctor Match and legal support programs help ensure you get the care and compensation you deserve.
Fair Use Statement
This content may be shared for noncommercial purposes only. When referencing or republishing the findings, please include a link back to Asbestos.com with proper credit.