A few years ago, I thought I was going to die. And even though (spoiler alert) I didn’t, my severe health anxiety and my tendency to always assume the worst persisted. But the growing proliferation of smart health-tracking devices, and the new ways AI is trying to make sense of data from our bodies, led me to a decision. For my own peace of mind, AI should stay away from my personal health. And after watching Samsung’s Unpacked event, I’m more convinced of that than ever. Let me explain.
Around 2016, I had severe migraines that persisted for a few weeks. My anxiety rose sharply during this time, and when I finally called the UK NHS helpline and explained my various symptoms, they told me I needed to go to the nearest hospital and be seen within two hours. “Go with someone,” I distinctly remember them saying. “It’ll be quicker than sending you an ambulance.”
This call confirmed my worst fears: that death was imminent.
It turned out that my fears of an untimely demise were unfounded. The cause was actually severe muscle strain from hanging several heavy cameras around my neck for an entire day while photographing a wedding. But the helpline agent was simply working from the limited data I had provided and, as a result, had – probably rightly – taken a “better safe than sorry” approach and urged me to see a doctor immediately.
Samsung’s health tracker provides a lot of data, which may or may not be useful to you.
I’ve spent most of my adult life struggling with health anxiety, and episodes like this have taught me a lot about my ability to jump to the worst conclusions, even when there’s no real evidence to support them. A ringing in my ears? It must be a brain tumor. A pang in my stomach? Well, I’d better get my affairs in order.
I’ve learned to live with it over the years, and although I still have my ups and downs, I have a better understanding of what triggers it. For one thing, I learned never to Google my symptoms. Because whatever my symptom, cancer was always among the possibilities a search could turn up. Medical sites – including the NHS website – offered no comfort and usually just triggered distressing panic attacks.
Unfortunately, I’ve had a similar response to many health trackers. At first I loved my Apple Watch, and its ability to read my heart rate during workouts proved useful. Then I found that I was checking it more and more often throughout the day. Then the doubt set in: “Why is my heart rate high when I’m just sitting? Is this normal? I’ll check again in five minutes.” When, inevitably, it was no different (or it was worse), panic naturally ensued.
I’ve used Apple Watches a few times, but I find heart rate tracking more stressful than useful.
Whether it was heart rate, blood oxygen levels, or even sleep scores, I obsessed over what a “normal” range should be, and every time my data fell outside that range, I immediately assumed it meant I was about to keel over right there and then. The more data these devices provided, the more I felt I had to worry about. I’ve learned to keep my worries at bay and continued to use smartwatches without them being a real problem for my mental health (as long as I don’t actively use heart-related functions like ECGs), but AI-based health tools scare me.
During its Unpacked keynote, Samsung explained how its new Galaxy AI tools – and Google’s Gemini AI – would supposedly help us in our daily lives. Samsung Health’s algorithms will track your heart rate as it fluctuates throughout the day, notifying you of changes. It will offer personalized insights about your diet and exercise to support cardiovascular health, and you can even ask the AI agent questions about your health.
To many, this may sound like a holistic view of your health, but not to me. To me, it looks like yet more data being collected and waved in front of me, forcing me to acknowledge it and creating a never-ending feedback loop of obsession, worry, and, inevitably, panic. But it’s the AI elements that are the biggest red flag for me. AI tools, by their nature, must formulate “optimal” responses, typically based on information publicly available online. Asking an AI a question is really just a quick way of performing a Google search, and as I’ve discovered, searching for health queries on Google doesn’t end well for me.
Samsung demonstrated different ways of using AI in its health app during the Unpacked keynote.
Just like the NHS phone operator who inadvertently made me panic about dying, an AI health assistant will only be able to provide answers based on the limited information it has about me. Asking a question about my heart health could surface all sorts of information, just like checking a health website to find out why I have a headache. But while a headache can technically be a symptom of cancer, it’s also much more likely to be a tight muscle. Or maybe I didn’t drink enough water. Or I need to look away from my screen a little more. Or I shouldn’t have stayed up until 2 a.m. playing Yakuza: Infinite Wealth. Or a hundred other reasons, all of them far more likely than the one I’ve already decided is definitely the culprit.
But will an AI give me the context I need to not worry and obsess? Or will it just present me with every possibility in an attempt to be thorough, fueling that “what if” worry instead? And, like when Google’s AI Overviews told people to put glue on pizza, would an AI health tool simply scour the internet and serve me a patchwork response, complete with inaccurate inferences that could tip my anxiety into full-blown panic attack territory?
Or maybe an AI tool could be like the kind doctor at the hospital that day, who smiled warmly at the sobbing man sitting across from him, a man who had already drafted a farewell note to his family on his phone in the waiting room. Maybe it could look at my data and simply say, “You’re okay, Andy. Stop worrying and go to sleep.”
Maybe one day it will. Perhaps health tracking tools and AI insights can offer me a much-needed dose of logic and reassurance to counter my anxiety, rather than being the cause of it. But until then, it’s not a risk I’m willing to take.