Headless body found in topless bar: Can AI compete with human ingenuity?

By Steven Littlehale | April 28, 2025
AI tools are becoming increasingly common in healthcare settings, but when one long-term care expert tested them on regulatory compliance and quality measure logic, AI consistently got important details wrong. Without the right human expertise to catch those errors, facilities could be led down the wrong path.

Not long ago, I asked a well-regarded journalist if she ever uses artificial intelligence (AI) in her work. Her answer was firm: Though her paper’s editorial rules technically allow for AI use, she wouldn’t dream of it.

“Could AI come up with a headline like ‘Headless Body Found in Topless Bar?’” she posed. Fair point.

That infamous New York Post headline — equal parts shocking and clever — could only have come from a human brain tuned to both the absurd and the attention-grabbing. And yet AI is everywhere now, becoming part of our daily lives and quietly shaping the way we access information, make decisions and even manage our health. In many ways, it feels inevitable.

From printouts to chatbots

People used to surf the web before medical appointments — often showing up with a handful of printed pages, to the horror of their physician.  

These days, it’s not unusual for consumers to do something a little different: run lab results and symptoms through an AI tool before they see their doctor. And surprisingly, the advice often lines up with what the clinician says during the visit. But AI can also be convincing when it’s wrong, which is dangerous.

AI can be a time-saver. It can simplify complex concepts. But there’s a downside too — it can miss nuance, flatten context, and offer up answers with much more confidence than they deserve.

Trying AI for myself

Wanting to stay open-minded, I decided to try AI for a few work-related tasks. I figured if it could help with speed and clarity, maybe it could free me up for deeper thinking or lunch. Here’s what I tried:

  • Finding regulatory references: I asked AI for the definition of the abuse icon on CMS Care Compare. What I got back was vague, unhelpful and ultimately incorrect: it pointed to the right source material but misinterpreted the definition.
  • Clarifying staffing classifications: I asked AI to confirm whether medication aides/technicians count as CNAs in the CMS Five-Star staffing domain. It said no. I knew better and had the documentation to prove it.

These results, especially the latter one, made me pause. What if I didn’t already know the right answer and just went with what AI told me?

The dealbreaker: Quality Connect

I am working on a new service called Quality Connect — a platform that aims to use analytics to flag, and help prevent, negative quality measure outcomes before a nursing home triggers them. It's a complex blend of clinical logic, regulatory requirements and data drawn from multiple sources.

I tried to use AI here, too. I asked it to help sketch out variable relationships and logic flows. I even fed it the official CMS technical manuals that defined the quality measures. Still, AI kept getting things wrong. Important things. Things that could lead a user down the wrong path if they didn’t know better.

That’s when I realized: For certain work — especially in healthcare and compliance — I trust my team at Zimmet Healthcare Services Group and Z-Connect far more than I trust a chatbot.

Final thoughts

Call me old-fashioned, but I believe parts of our work demand human intelligence, which folds in experience and empathy. AI will keep getting better, and there's definitely a place for it. But when it comes to resident care, regulatory compliance and quality improvement, the risks are too high and the mistakes too costly.

There’s no replacement for real-world experience, critical thinking, and collaborative problem-solving.

Not yet, anyway.

Steven Littlehale is a gerontological clinical nurse specialist and chief innovation officer at Zimmet Healthcare Services Group.

The opinions expressed in McKnight’s Long-Term Care News guest submissions are the author’s and are not necessarily those of McKnight’s Long-Term Care News or its editors.
