Health Charities: Don't Trust AI for Medical Advice
Google's AI Overviews are reportedly generating dangerously misleading health advice, alarming health organizations. These AI-powered summaries, which condense expert advice published on websites, have provided incorrect and sometimes harmful information in response to queries about critical health topics.
Health charities in the UK have raised significant concerns about the potential for AI-generated search summaries to spread medical misinformation. They worry that this technology could undermine their carefully curated and expert-approved health guidance, which they have spent years developing and making accessible online.
Charities See Traffic Decline
The introduction of AI-powered search results is leading to a "great decoupling" between search engines and content creators, according to one report. Charities are experiencing a drop in website traffic, which impacts their ability to provide services and gather donations. (Source: org.uk)
Organizations that rely on search engine visibility to reach vulnerable people are now facing new challenges. The AI summaries often present information without clear and prominent links back to the original, trusted sources, effectively creating a barrier between users and the charities' detailed guidance.
Inaccurate Medical Guidance
The investigation uncovered several instances of incorrect guidance that could have severe consequences for patients. One of the most alarming examples involved advice for pancreatic cancer patients, where the AI summary incorrectly suggested avoiding high-fat foods. Experts from Pancreatic Cancer UK clarified that this is the opposite of what is typically recommended and could leave patients too unwell to receive potentially life-saving medical procedures.
Further examples of misinformation included incorrect diagnostic information for vaginal cancer and misleading data regarding liver function tests. In one case, a search for vaginal cancer tests wrongly listed a Pap test as a diagnostic tool, which could lead individuals to dismiss genuine symptoms after a clear cervical screening. Similarly, AI-generated "normal" ranges for liver blood tests failed to account for crucial factors such as age and sex, a mistake that could give people with serious liver disease a false sense of security. (Source: theguardian.com)
What's Your Opinion?
Do you trust AI-generated summaries for health-related questions? Should search engines be held responsible for the accuracy of the medical information their AI provides? How might the decline in website traffic affect the ability of charities to offer essential public services?

Comments
Websites are dying
I hate to say it, but I've seen a major drop in search traffic to this website over the last two years, and I would attribute it mostly to AI usage. I currently use Google about a tenth as much as I used to, a habit that started last year. Now I use ChatGPT for most queries, as well as for tech-related work when I need a quick summary or corrected syntax.
Using AI is so much better than using a search engine that the two don't even compare anymore: no ads, no pages to wade through trying to find what I'm looking for, and near-instant results. It's especially useful when I need to program a script, which is what I mostly use it for.
Of course, AI is sometimes wrong, and you need to be careful with what it tells you: it can make things up, and you wouldn't even know unless you challenge its ideas, verify them with a search engine, or explicitly ask it to provide evidence for its claims.
Always changing - except FUD
Ten years ago, the charities didn't use (or need) websites to accomplish their goals. As the technology developed, so did its uses. Technology continues to evolve, and organizations will need to continue to adapt. Don't make it sound like the world is ending; that's just sowing fear, uncertainty, and doubt.
AI-generated search results probably should cite their sources. It may be that some obscure source is actually providing information that a mainstream source doesn't want revealed. (It's been known to happen!)