The weight bias that seeps out of generative AI is sometimes explicit and other times subtle. AI-generated images of obesity often portray sad, sloppy, sedentary people surrounded by food and wearing ill-fitting clothing, said Stuart W. Flint, PhD, associate professor of psychology at the University of Leeds, during a 2025 ObesityWeek presentation.
Health explanations generated by AI may overemphasize the role of energy imbalance in obesity and underemphasize the impact of genetic, environmental, and socioeconomic factors, he said. Advice may deepen stigma and reinforce stereotypes while failing to factor in a user’s age, diagnoses, or overall health. Low-carb, low-calorie diets are commonly recommended and seldom helpful.
“I could be a vulnerable person; I could be a teenage girl,” Flint said during the session titled “AI, Advocacy, and Action.” “ChatGPT doesn’t know who I am, and there is concern about what advice it is offering.”
The stigmatizing and derogatory nature of AI content about obesity may improve with training. However, Joe Nadglowski, president and CEO of the Obesity Action Coalition, said his trained AI model occasionally reverts to condition-first language despite clear instructions to use “person with obesity.”
To measure the size and impact of weight bias in AI, Flint and colleagues conducted a phase 2 systematic methodology study that analyzed obesity content derived from six conversational and five image-based generative AI platforms. Based on their 8,415 total prompts and results, here are some common problems with using AI for weight-related content and advice, and healthier ways to think about weight.
1 of 5
AI can oversimplify obesity’s causes.
When researchers on Flint's team asked six AI platforms about obesity, they noticed the answers overemphasized overeating and physical inactivity — and underplayed the complex genetic, environmental, and socioeconomic factors that influence weight.
This incorrectly frames obesity as a personal failing, reinforcing weight stigma, he said. That’s not helpful; in fact, it’s hurtful. A 2023 study of almost 4,500 women found that when people internalize weight stigma — agreeing with statements like “My weight is a major way I judge my value” and “I don’t feel I deserve to have a fulfilling social life as long as I’m overweight” — their physical and mental health suffers.
We don’t know who needs to hear this, but obesity is not your fault — it’s a disease likely caused by a combination of complex factors. And it’s no reflection on you as a person. If you struggle with weight stigma, here are some research-backed strategies that may help:
- Taking classes in mindful self-compassion
- Writing about what you can do in a larger body
- Watching a docuseries about weight stigma (study participants watched the weight stigma segment in episode four of HBO’s series “The Weight of the Nation”)
- Viewing images of higher-weight women exercising
- Thinking about how anti-fat attitudes contradict your values
- Visualizing interacting with a confident person with obesity
- Imagining what you would tell a teenager who asked you for advice about body image based on your experience
2 of 5
AI might assume you want to lose weight if you ask about it.
When researchers asked AI about weight, it often posed unsolicited follow-ups — such as offering to recommend low-calorie diets, meal plans, or other nutrition advice.
The problem: AI’s tips might not suit you. Individual nutrition needs vary based on body weight, body fat percentage, age, and medical history.
Before you try a new eating plan, consult a health care professional — ideally a registered dietitian. A review of 26 studies found that working with a dietitian helps people eat healthier, lose weight, reduce waist circumference, and lower blood sugar. To find one in your area, visit the Academy of Nutrition and Dietetics "Find a Nutrition Expert" directory.
3 of 5
AI presents stigmatizing images.
What does obesity look like? When researchers asked five image-based AI platforms to depict obesity, the AI produced pictures of people eating junk food, struggling to exercise, wearing too-small clothing, and generally looking sad. For inspiring, positive portrayals of people with obesity, visit the Obesity Action Coalition’s image gallery of non-stigmatizing visuals.
4 of 5
AI excludes people with obesity when prompted with positive words.
If you ask AI to show you someone “wealthy” or “godly,” you probably won’t receive images of larger bodies, suggests a study presented at the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics. Researchers gave these kinds of prompts to DALL-E, an image-based AI platform, and got results that reflected common cultural stereotypes linking thinness to success or moral goodness.
5 of 5
AI perpetuates cultural stereotypes about obesity.
Generative AI scrapes source material made by people, which means that human biases (including anti-fat bias) are often baked in.
Consider the content AI draws on when responding to your prompts. A 2024 study of images published in four top U.S. and U.K. news sites found that 70% of higher-weight people were depicted in a stigmatizing way. Almost half were shown with their head partially or entirely out of view, compared with just a quarter of lower-weight individuals who appeared that way. Weight stigma even exists in peer-reviewed medical literature and medical school curricula used to teach future doctors.
To help shift the narrative, check out organizations committed to breaking down weight stigma, such as the Obesity Action Coalition and the National Association to Advance Fat Acceptance.