At HIMSS22, which was held last week at the Orange County Convention Center in Orlando, and sponsored by the Chicago-based Healthcare Information and Management Systems Society (HIMSS), one of several Special Symposia held on Monday, March 14 was the Machine Learning & AI for Healthcare Forum, held at the Rosen Conference Center across the street from the convention center. Several panels of experts took place during that forum, including “The Name of the Game Is Implementation.”
That panel was moderated by Chad Konchak, assistant vice president of clinical analytics at NorthShore University HealthSystem, based in the Chicago suburb of Evanston. Konchak was joined by Stephen Beck, M.D., vice president, medical informatics, at the Cincinnati-based Bon Secours Mercy Health; Brian W. Patterson, M.D., a practicing emergency physician and the physician informatics director for predictive analytics in the BerbeeWalsh Emergency Department at UW Health, in Madison, Wis.; and Ryan Vega, chief officer, healthcare innovation and learning, at the U.S. Department of Veterans Affairs.
Konchak opened the discussion with the following statement and question: “We’re learning that 60-90 percent of all predictive models never make it to production. Why is that?”
“I think it depends on how you define it,” said Dr. Beck. “In the industry, there are a lot of things that don’t make it to market. If you have a strong team working to implement something—I think the implementation rate is much higher at an organizational level, once it’s been agreed to be adopted.”
“There are so many steps; it’s a weed-out process at each step,” Dr. Patterson said. “But just having an operational model is important. Should a model be implemented? Does it provide an actionable insight? And then, can we get it into the workflow at the right time? Will physicians and nurses accept it? And will it make a difference?”
“The idea that there’s this transformation of data into some kind of actionable intelligence has really escaped us,” Vega said. “We saw some of that during COVID. And we’ve lost some of the meaningfulness on the provider side. And the question becomes, what am I doing with this? And does it provide another box to check?”
“So how do we do this?” Konchak asked. “Is this just vendors selling you something, or data scientists pushing it? How do we get you all at the front?”
“Being a Lean/Six Sigma black belt, I say, let’s identify what our core issues are, and see what the solutions might be to solve those,” Beck said. “So much time has been spent on the financial side. Of course, that’s important. But on the flip side, there are so many aspects of healthcare that we haven’t hit yet with regard to process improvement. And I like to say that the best kind of process improvement is the kind that the doctor never sees. So leaders need to identify the gaps, and if there is an idea that could work, match it up.”
“One of the interesting things is that people are trying to take on the hard problems in medicine, and that makes sense,” Patterson said. “As an emergency physician—if you’re going to give me a model that predicts sepsis, it had better be good. One area where we’ve made some progress is in things like adding preventive care services, things I won’t be distrustful of.”
“There’s no surer way to ruin a project than to lose trust across your clinical providers; and sepsis is really, really hard,” Vega noted. “What providers want is, this patient probably has sepsis—help me figure out the best anti-microbial therapy—but we’re not there yet, we’re still trying to figure out if the patient is at risk. So let’s focus on things that are ancillary, and automate those, and give us more time to be back at the bedside, because you then build trust in the system. It’s going to take time to build trust in AI; we still don’t have genomic data built into the EHR [electronic health record] yet.”
“My wife’s a physician, and she comes home and charts all night; it’s awful, right?” Konchak said. “So—help me by making my job easier. So how do we use predictive models and machine learning to take things off the plates of physicians and nurses?”
“If you look at any given workflow, you can identify what those extra steps are that perhaps you don’t need, and as in any type of care, if we can eliminate extra steps of care, that will help a lot,” Beck emphasized. “I think the key is making sure we’re identifying the right steps, and not getting too focused on one step, or focusing on something as big as sepsis—that’s been a key challenge for so many organizations.”
“The sell has to be, it’s going to make your life easier and help you do better by your patients; it’s going to have to be both,” Patterson emphasized.
Process issues around algorithm adoption
What are some of the process issues that patient care organization leaders have been running into? “There was a journal article on an algorithm predicting pneumonia through the radiological process, and the researchers found that the validity of the algorithm plummeted in real life, because of differing processes. So the testing and replicating in the real world are so very important,” Vega noted.
“You’re talking about process, right? The true end goal is improving outcomes,” Konchak replied. “I think the traditional way people go about it is that the data science team goes off and builds a process, and it’s tested. But I’m hearing from you that that’s not the best way. So what’s worked in your organizations, and what advice do you have for people?”
“Well, it really does require a multifunctional team from day one,” Beck emphasized. “We’ve got to make sure we’ve got physicians there, and also, of course, data scientists. With any clinician-facing program, though, you’ve got to get the clinicians in from day one, and you need trusted individuals to help determine what’s working and not working. We’re now trying to take a black box and pry the lid off it. But at the end of the day, if you can develop that trusted team, you can work your way through it, and can promote acceptance. If you don’t start with that multi-functional team, you’ll fail; you will. Because you’ve got all those silos. So, pulling all those folks together and aligning them, and focusing, are all critical.”
“So often, there’s this notion of building a solution and then pulling physician champions in; but so often, that fails,” Patterson noted. “You really need the physicians in from the beginning. The other thing is a middle piece of design, between the clinicians and data scientists; having people who might come from a hospital improvement background, or an engineering or human-centered design background, will be important, to help figure out how to get a solution into the workflow.”
What areas are most fertile to explore right now? The panelists looked at approaches and clinical areas. “One aspect of this involves questions around human-centered design: knowledge of a problem is not the same thing as understanding a problem,” Vega said. “That middle ground in design is crucial: it’s why Uber works and why Amazon works. We intuitively use those apps. But if you’re having to train someone on something as simple as scheduling, you’ve already lost.”
“And readmissions is a classic one,” Konchak said. “We think of innovation as some sort of digital tool, but really, it’s just doing something in a different way that drives value. So should we just be talking about process improvement? You see the value in AI, but how much skepticism is in your peers? Are we focusing on the wrong problem to solve?” he asked.
“We start with the fact that we have tools that we didn’t even have five years ago,” Beck noted. “And the computing power we have on our phones is as much as what was on supercomputers in the 80s. So we can leverage that technology, but I think we can marry that with process improvement. And in the process, we’ve got to use all the tools available and not just focus on the black box itself. There’s always opportunity to improve processes and eliminate workflow steps.”
“It’s not a crystal ball, it’s a co-pilot,” Patterson said. “If Google Maps tells you to drive off a cliff, you’re not going to do it. And the same is true in clinical care. If people feel it’s not threatening to them—you want to focus on the fact that this is assistive technology that will help you.”
“In other words, like a well-informed backseat driver?” Konchak said. “Exactly,” Patterson responded.
What about governance?
Another important area involves governance of development processes. “There’s this idea that we should use AI to get at areas where it can obviously show value,” Konchak said. In that context, he asked, “What kind of governance model do you need?”
“You need to have a governance structure that straddles the clinical and IT spaces,” Beck said. “You need that for a team with innovation and with a leadership structure that can control the process; and process improvement is key as well.”
In fact, Patterson said, “I think that governance is key to this, not just for its own sake. It helps to create an institutional memory and the ability to build trust. We have a health system-wide group that looks at every model, and the team doing the implementation reports to the group. And we use that governance for other CDS tools and other purposes. That breaks down a barrier. Providers look at AI as this super-exotic, weird thing being bolted onto care. To bring a certain understanding into the governance structure is important.”
“You don’t want to over-bureaucratize the process,” Vega added. “You have to give the field the ability to innovate. You want to create communities of practice, communities that are sharing and learning from one another. You need a layer of governance, and it needs to be transparent.”
“What about competencies?” Konchak asked. “How do we include clinicians in the development of algorithms and models? And how do we get competency without contributing to physician burnout, or hiring expensive data scientists?”
“It’s a balance,” Beck responded. “Obviously, the more you do, the better you do. You’ve got to balance the fact that you’ve got folks within your organization who are very enthusiastic about taking on new technology, with levels of experience. A team like this also needs to understand that not every project’s going to succeed. You have to sometimes let things go. That’s hard, oftentimes, for a team to do. You’ve got to allow something to fail from time to time, because maybe it wasn’t a good idea, or the technology just wasn’t there.”
And, Patterson advised, “Letting things fail actually gains a ton of trust throughout the system, and shows we’re not just slavishly obsessed with this technology and going to push things out. And when you have a small-ish team, we focus on building iteratively and building out over time. Maybe starting with something created in the industry, and then next time, home-building something. Picking the next project not just to do something new, but taking that early win and building on that success.”
“I think competencies will come,” Vega said. “I think the most important thing is building the capacity in the hospital to build these things. That strategy will move the needle.”
And, said Konchak, “I think teams need to learn some form of process improvement methodology. That’s very important.”