Can AI Replace Your Nutritionist? What the Rise of Health AI Means for Women in Perimenopause

Something significant happened in the world of digital health earlier this month. OpenAI officially launched ChatGPT Health in the United States, a dedicated version of its chatbot designed to connect with your personal medical records, wearable data and health apps, allowing it to give you more personalised responses than a standard Google search ever could. According to OpenAI, health is already one of the most common reasons people use ChatGPT, with around one in four regular users submitting a health-related prompt every week. Without much fanfare, artificial intelligence has become the first port of call for millions of people trying to make sense of how they feel.

I want to talk about what this means, because I think the conversation happening around it tends to go one of two ways. Either AI is going to transform healthcare and make professional support redundant, or it is dangerous and shouldn’t be trusted with anything important. The reality, as is usually the case, sits somewhere more nuanced than either of those positions.

As a Registered Nutritional Therapist specialising in perimenopause, gut health and energy, I work with women who are already using AI to try and understand their symptoms. They come to me having Googled, having asked ChatGPT, having read articles and followed accounts and downloaded guides. They’re not short of information; they’re short of clarity about what any of it means for them, specifically, and what to actually do about it.

That distinction matters enormously, and it is what I want to explore here.

What AI genuinely does well

It would be dishonest to dismiss the genuine usefulness of tools like ChatGPT Health. For many people, AI has made health information more accessible than it has ever been. If you’ve ever sat in a GP appointment and nodded along to something you didn’t fully understand, then gone home and tried to piece it together from a confusing NHS webpage, you will know how much it can help to have something that translates medical terminology into plain English, explains the connection between different test results, or helps you formulate better questions before your next appointment.

AI is also impressive at identifying patterns across large amounts of data. It can synthesise information from thousands of studies, pull together what is known about a particular symptom cluster and present it clearly. For someone trying to understand why oestrogen fluctuations affect sleep, or why progesterone decline is connected to anxiety, a well-prompted AI tool can give a coherent, reasonably accurate overview in seconds.

For women in perimenopause who have spent years being dismissed by healthcare professionals, told their blood tests are normal, or informed that they are too young to be experiencing what they are clearly experiencing, that kind of accessible information can feel like a revelation. Knowledge is empowering, and anything that democratises access to it has genuine value.

Where it gets more complicated

The problem is not that AI gives wrong answers, although it sometimes does, and that is worth taking seriously. Independent research has repeatedly found that generative AI tools can produce inaccurate or unsafe health advice, occasionally with a confidence that makes it difficult to spot. The bigger issue is something more subtle than accuracy, and it is the thing I see most clearly in my work with clients.

Health information and health support are not the same thing.

Understanding that you need 25 to 30 grams of protein at breakfast to support blood sugar stability and hormone function in perimenopause is useful knowledge. But knowing that doesn’t tell you whether you can stomach much food first thing in the morning, whether your schedule means you’re out of the door by 6.30 am, whether you have a history with food that makes tracking macros feel triggering, whether your digestion is compromised in a way that means certain protein sources will not suit you, or whether your energy levels are currently so low that adding anything new to your morning feels completely unrealistic. Those details change everything about what support actually looks like in practice, and no AI tool, however sophisticated, can know them without being told, and even then, it cannot weigh them the way a person who has been listening to you for an hour can.

ChatGPT Health allows users to connect medical records, wearable data and apps to give more personalised responses. That is genuinely more sophisticated than a general search. But medical records are often incomplete. Wearable data captures certain things and misses others entirely. And the picture of a person's health is always far more than the sum of their data points. Context, history, lived experience, body language, the things someone mentions almost in passing and the things they do not mention at all because they have normalised them: these are what shape personalised support, and they emerge through relationship and conversation rather than data upload.

The perimenopause problem specifically

Perimenopause is a particularly good example of where the limits of AI support become significant, because it is one of the most individualised health experiences a woman can go through. The symptom picture varies enormously from person to person. The timing varies. The speed of progression varies. The way hormonal changes interact with existing gut health, thyroid function, stress load, sleep history and nutritional status varies. Two women of the same age with what looks like a similar symptom picture might need quite different approaches, and getting that wrong doesn’t just mean slow progress. It can mean making things worse.

I also think there is something important about the emotional dimension of perimenopause that AI simply cannot address. Many of the women I work with arrive feeling dismissed, confused and frankly exhausted by the effort of trying to figure out what is happening to their own bodies. Part of what they need isn’t just a plan but someone who takes their experience seriously, who can say with confidence that what they are describing makes complete physiological sense, who understands why the standard advice has not worked and can explain what to try instead. That quality of being genuinely heard and individually understood is not something you can replicate with a chatbot, however well it is designed.

There is also the question of how recommendations fit into real life. Telling someone to eat more vegetables, reduce alcohol, manage stress and do strength training three times a week is not wrong, but it is not support. Support is working out which of those things matters most for this particular person right now, what the realistic first step looks like given their schedule and their energy levels, what to do when life gets in the way, and how to build something sustainable rather than another regime that lasts three weeks before collapsing under the weight of everything else going on. That requires knowing someone, not knowing about them in general.

A word about data and privacy

It is also worth noting that health data is among the most sensitive information you hold, and uploading it to any platform deserves careful thought rather than a casual click through the permissions screen. OpenAI states that conversations within the Health feature are encrypted, stored separately from other data and not used to train its underlying models, and that users can disconnect linked apps at any time. Privacy advocates cited by the BBC have noted that safeguards must remain watertight, particularly as AI companies continue to develop commercial models. The feature is also currently only available in the United States, and has not launched in the UK or the European Economic Area, where data protection regulations are considerably stricter.

None of that is a reason for alarm, but it is a reason for awareness. Understanding what you are sharing, with whom and under what terms is a reasonable thing to know before you start.

So where does that leave us?

I’m not suggesting that AI has no role in health support. Used thoughtfully, it can be really useful: for preparing questions before a clinical appointment, for understanding terminology in a letter or test result, for getting a broad overview of a topic before you dig deeper, or for generating ideas that you then sense-check with a professional. There is real value in that, and I have no interest in dismissing it.

What I do think is worth being clear about is the difference between information and support, and the difference between knowing what the research says and knowing what to do about it in the context of your specific body, life, history and circumstances. That gap is exactly where personalised nutritional therapy sits, and it is not a gap that AI is going to close anytime soon, not because the technology is not clever enough, but because genuine support is fundamentally relational. It requires curiosity about a particular person, not just competence about a subject.

When I work with a client, I’m drawing on my clinical training and the evidence base, but I’m also listening carefully to the way she describes her symptoms, noticing what she prioritises and what she glosses over, asking about the things that might not seem obviously connected but often are, and building a picture that is specific to her rather than applicable to women in perimenopause in general. I’m also thinking about what she can realistically sustain given her life, her preferences and her current capacity, because a perfect plan that nobody can follow is worth considerably less than a good plan that someone can actually implement and build on over time.

The rise of health AI is not something to fear, and it is not something to uncritically embrace either. It is a tool, and like all tools, its value depends entirely on how it is used and what you expect it to do. For understanding information, it can be genuinely helpful. For the kind of joined-up, personalised, relationship-based support that actually changes how someone feels over time, there is still no substitute for working with a real person who brings both knowledge and genuine attention to your particular situation.

If you are navigating perimenopause and feeling overwhelmed by conflicting information, I would love to help you make sense of it. You can find out more here, or just reply to this post if you have questions.

Catherine Scott is a Registered Nutritional Therapist (mBANT, rCNHC) specialising in perimenopause, gut health and energy. She works with women online and is based in the UK.
