Hyderabad Psychiatrist flags risks of AI-generated health reports

Patient Turns to ChatGPT for Diagnosis, Hyderabad Doctor Raises Alarm
Hyderabad: Amid a growing trend of patients turning to artificial intelligence for medical guidance, a Hyderabad-based government psychiatrist has warned of the risks of relying on AI-generated reports in place of professional medical consultation.
Dr Raghuveer Raju Boosa, assistant professor of psychiatry at the Institute of Mental Health, said a 30-year-old IT employee arrived for his first consultation carrying a detailed, 10-page report generated using ChatGPT. The patient had entered his symptoms into the AI tool and answered a series of questions, attempting to establish his medical condition based on the generated output.
Speaking to Deccan Chronicle, Dr Boosa explained that the tool “over validates and invalidates a lot of symptoms. For example, if you say you have a headache, it may suggest a brain bleed as one possibility, though not with this severity.” The patient was eventually diagnosed with Obsessive Compulsive Disorder (OCD) with health-related obsessions, accompanied by illness anxiety features. “They constantly worry—‘I believe I have an illness, I have a headache, I have a body ache,’” he said.
In contrast, the AI-generated report had suggested possibilities such as electrolyte imbalance, endocrine disorders, and nutritional deficiencies. The patient has since been placed under appropriate treatment and is reportedly responding well.
Dr Boosa cautioned that this is not an isolated incident. “We often observe patients dropping my prescription into ChatGPT. This is personal information, and we never know what the platform is doing with those details,” he told Deccan Chronicle.
Medical Dialogues had previously reported a shocking incident that angered local residents, in which a 38-year-old woman died after two men posing as doctors allegedly performed a so-called stone-removal surgery on her abdomen while following YouTube tutorial videos.
With a keen interest in storytelling and a dedication to uncovering facts, Rumela De Sarkar joined Medical Dialogues as a Correspondent in 2024. She holds a Bachelor’s degree in English Literature from the University of North Bengal. Rumela covers a wide range of healthcare topics, including medical news, policy updates, and developments related to doctors, hospitals, and medical education.

