AI scaling and sustainability – tips for success from providers, payers and vendors

Photo: HIMSS Media

SAN DIEGO – A common theme at the HIMSS AI in Healthcare Forum this past week was that artificial intelligence represents a paradigm shift for how care is delivered (and paid for) and that, with such a fast-emerging technology, we’re all figuring it out together.

As the two-day event drew to a close on Friday, three leaders from three different facets of the healthcare industry each offered their own perspectives, and experience thus far, on how the promise of AI can be harnessed – and scaled – efficiently and efficaciously.

In a discussion moderated by HIMSS Chief Research Officer Anne Snowdon, IT leaders from Providence, VMware and Humana offered some real-world lessons for implementation and expansion of AI models.

Tariq Dastagir, AVP of medical informatics and clinical trends at Humana, set the stage.

“The margins are thin and the pressure is intense every year for us to ease the cost of care,” he said. “And obviously that has to be done with better outcomes. It has to be done with efficiency. The hope is that we can leverage a lot of these new technologies to do that.”

A central question, he added, is “what is the right use case and business case to deploy those, and where can you get by with simpler solutions? Not everything needs a genAI model or a predictive model. Sometimes it could be just your simple risk score you can use to predict something.”

As more and more healthcare organizations buy into the promise of AI, they have to be willing to grapple with and think critically about those questions.

“Shop for dissent and not for agreement, because that’s where you really learn,” Dastagir advised. “We get excited about a lot of stuff. But the thing is, what really is going to make the real difference and how everyone else thinks about it and are they on board, do they feel the same thing? And if not, how do you bring them around? Or how do you learn how to evolve your use cases to the point where it actually starts making sense for everyone?”

Corey Lyons, senior staff solution engineer at VMware, agreed – and noted that fitting the next-gen capabilities of AI into existing technology processes and workflows is easier to discuss than to deploy.

“We talk about infrastructure, we talk about technical debt, we talk about the intersections of the technologies that will work alongside analytics, these other well-proven business processes,” he said.

“We’re looking to help our customers understand: ‘Look, you’ve run things in this very traditional, well-understood way,’” said Lyons. “We’re in this transition to a more nimble way, where the applications are going to iterate more frequently.

“What’s exciting and challenging for us today is that large language models, and all these other processes, require this massive amount of horsepower to generate … when you try to back that up into, OK, well, how do we do business today and how can our teams be successful? There’s a big departure from: ‘We can really do this repeatedly, successfully, safely, with the [necessary] degree of automation and security.'”

As VMware looks towards the future, he said, “we’re trying to help organizations say no matter if it’s a private cloud you’re hosting, you’re working with a hyperscaler or you’re deploying these solutions to the edge – where I think honestly, a few years from now, that’s the greatest impact – collectively we’re all going to be able to hopefully deploy these things out,” said Lyons. “We’re one of the few organizations that can help everybody do any step in that journey with an eye toward: ‘Here’s how the older applications, older processes meet up with the newer techniques and capabilities.'”

At Seattle-based Providence, a longtime leader in IT innovation, AI-based tools are already deployed across several clinical and operational use cases.

“We have Nuance’s product DAX, with more than 1,500 providers using that,” said Eve Cunningham, chief of virtual care and digital health at Providence. “We also have a digital assistant and clinical content management product called MedPearl that we developed and incubated at Providence that scales and has over 7,000 users. We’re also using generative AI to help with inbox management.”

There have been lessons learned about all three of those applications, she said:

The first thing I would say, and I know you hit on it too, is that we need clinical sponsorship and executive sponsorship. That alignment is absolutely critical.

You need to make sure that you understand the problem you’re trying to solve, and that you can define it and articulate it. And you need to speak the love language of the CFO: measuring ROI and KPIs, and staying really steadfast with how you’re going to measure that as you start to scale things out.

I know some people talk about how they don’t believe in pilots, they just want to go straight to scale. You can’t always do that. You want to do pilots – what you don’t want is to do perpetual pilots. So you have to be able to say, ‘Hey, we’re going to fail.’ And we’ve done that before. We’ve said, ‘Hey, this isn’t working. We’re going to stop using this application, or working with a vendor because things are failing.’ You have to be able to do that.

You also have to make sure that what you’re trying to solve aligns with key strategic priorities for the health system, or for the organization you’re working with.

So, for example, at Providence, in my division, our top three priorities are:

Workforce shortage and burnout

Hospital throughput and capacity, because we don’t have enough hospital beds, so we have to figure out how to digitally and virtually expand hospital capacity, and treat patients in rural hospitals who don’t need to go to big hospitals by virtually enabling those rural hospitals with specialty care

And care fragmentation

So when we’re looking at different solutions that we’re evaluating, we’re thinking in the context of those three big pain points, which I know are not unique to us.

Then there’s the other thing, once you’ve figured out: OK, this does seem to be a problem we’re solving, it’s going to hit a huge pain point, and it’s aligned strategically.

Okay, what is the viability of the solutions that are out there? What is the maturity of the solutions? Is this something that we have to build ourselves? Can we buy it? Can we partner with somebody to co-develop?

And then on top of that, how does it fit into the workflow? And is it possible to integrate it and stitch it together into this workflow? Because it could be the greatest idea in the world, solving the biggest problem in the world.

But if my doctors have to make 17 clicks in order to be able to use it, it’s not going to get adopted. And then what is the demand from the end users? Are you hearing from the people you’re actually going to try to push this solution out to? What is the demand? How does it fit into the workflow? And what is the lift for them to train on it, to adopt it? What is the change management aspect of having to do that?

And then you obviously go into the risk and the bias and the safety. Do you have the right infrastructure?

I’ll give you an example. We have huge demand from radiology for us to bring some of these algorithms into our radiology department, to be able to leverage FDA-approved algorithms that use AI to help speed up and optimize the clinician’s workflow in reading images, but also improve the quality.

But we have 27 different PACS servers. We don’t have an infrastructure middleware that can connect the network from Nuance to our PACS servers. So the technical debt and the infrastructure build required for us to actually connect the AI to the workflow and to the system is a pretty significant lift.

It doesn’t mean that we’re not going to do it, and it doesn’t mean that there isn’t a desire. It’s like, how do we connect the dots?

So those are all the kinds of things that we think through, and we actually set up a governance structure recently. I co-chair a clinical AI work group that describes our guardrails and evaluation process, which we’re still working through, establishing and creating the muscle.

There’s no lack of great ideas, but how do you filter and prioritize? And what are the common-sense things that you think you can make happen?
