07 Mar, 2026

OpenAI has launched ChatGPT Health in the US, allowing users to share medical records and fitness app data for personalised health answers. Experts warn about privacy, data security, and the risks of AI-generated medical advice.

OpenAI Launches ChatGPT Health Feature in the US

OpenAI has introduced a new feature called ChatGPT Health in the United States, designed to analyse users’ medical records and personal health data to provide more tailored responses to health-related questions. The company says the feature aims to help users better understand their health and wellbeing through personalised insights.

To enable this personalised experience, OpenAI is encouraging users to share medical records along with health data from popular apps such as MyFitnessPal, Apple Health and Peloton. This combined data is analysed by the chatbot to offer more relevant answers to individual health queries.

Separate Storage and No AI Training Assurance

OpenAI has clarified that conversations held within ChatGPT Health will be stored separately from other chats. The company also stated that this health-related data will not be used to train its artificial intelligence models. Additionally, OpenAI emphasised that the feature is not intended for medical diagnosis or treatment.

Privacy Safeguards Remain a Key Concern

Privacy advocates have raised concerns about the handling of sensitive health information. Andrew Crawford from the US non-profit organisation Center for Democracy and Technology said it is crucial to maintain airtight safeguards around users’ medical data.

Availability Outside the US Remains Unclear

At present, it is uncertain if or when ChatGPT Health will be introduced in the UK. Crawford highlighted that while AI health tools have the potential to empower patients and improve health outcomes, health data remains among the most sensitive types of personal information and requires strong protection.

Personalisation and Advertising Concerns

Crawford also noted that AI companies are increasingly focusing on personalisation to enhance the value of their services. He warned that as OpenAI explores advertising as a possible business model, strict separation between health data and other ChatGPT memory features is essential.

Massive Volume of Health Queries on ChatGPT

According to OpenAI, more than 230 million people worldwide ask ChatGPT questions related to health and wellbeing every week. In a blog post, the company said ChatGPT Health includes enhanced privacy features specifically designed to protect sensitive data.

Supporting Medical Care, Not Replacing It

OpenAI reiterated that its health feature is designed to support medical care rather than replace healthcare professionals. Users can upload health app data and medical records to receive more context-aware responses to their health-related concerns.

Risks of Misinformation in AI Generated Health Advice

Experts have warned that generative AI tools can sometimes produce false or misleading information while presenting it in a confident and convincing manner. This raises concerns that users may treat chatbot responses as authoritative medical advice.

A Watershed Moment for AI in Healthcare

Max Sinclair, chief executive of AI marketing platform Azoma, described the launch of ChatGPT Health as a watershed moment. He said OpenAI appears to be positioning ChatGPT as a trusted medical adviser, with the potential to reshape patient care and influence consumer health-related purchasing decisions.

Competitive Advantage Amid Rising AI Competition

Sinclair added that the health feature could prove to be a game-changer for OpenAI as competition intensifies from rival AI platforms, particularly Google’s Gemini.

Limited Rollout and Regional Restrictions

OpenAI has stated that ChatGPT Health will initially be available only to a small group of early users, with a waitlist opened for others. The feature has not been launched in the UK, Switzerland, or the European Economic Area, where strict data protection regulations apply.

US Privacy Gaps Raise Further Alarm

Crawford warned that in the US, companies not bound by strong privacy laws may collect, share, and use people’s health data with limited oversight. He cautioned that inadequate data protection policies could put sensitive health information at serious risk.

Source Credit: BBC
Dr. Dheeraj Maheshwari

MBBS, PGDCMF (MNLU), MD (Forensic Medicine)