Over the last decade, many health systems have found that augmented intelligence (AI) technologies have overpromised and underdelivered. The promise of AI in clinical care was grand: to ease physicians' burdens and deliver the most relevant information at the point of decision making. Instead, more technology has often increased demands on providers and deepened clinicians' doubts about AI's capabilities.
Organizations can still deliver valuable AI-derived patient insight to providers at the front lines of care by taking a collaborative approach to AI that enlists clinicians in three key areas:
1. Development.
2. Implementation.
3. Results.
Care teams are aware of the promised benefits augmented intelligence (AI) can bring to a clinical setting, including rapidly analyzing large data sets, alerting clinicians to out-of-range activity, and surfacing the most relevant information at the point of decision making. Yet many clinicians still resist or disregard AI in healthcare because of a lack of trust and concerns about unintended patient harm.
Over the last decade, many healthcare-specific AI tools and other technologies have overpromised and underdelivered. The digitization of medicine vowed to ease physician burden while improving patient care and experience; instead, increased technology adoption in healthcare has coincided with widespread physician burnout. Even so, AI applied to patient care can unearth valuable insights that would otherwise go overlooked amid the large volume of available data and the limited manual resources to review it.
Using AI technology helps clinicians make healthcare decisions based on holistic, complete patient data rather than partial, fragmented data sets. For example, while providers tend to patients, AI can analyze individual and collective data sets across the healthcare system to uncover valuable insights and present them in the context of patient care. Considering these insights in real time, providers can apply their expertise and critical thinking to determine the best course of action for the patient.
To develop clinicians' trust in AI and help them take advantage of what it can offer at the point of clinical decision making, organizations should invest in transparent AI technology and focus on collaboration between data scientists and care teams.
Data science and care teams can collaborate in three areas (development, implementation, and results) to help clinicians overcome doubts about AI's value and increase adoption at the front lines of care.
Collaboration begins in the development stage. When a health system decides to adopt AI, the data science team should involve key stakeholders (e.g., quality improvement team members and clinicians) from the outset. Too often, clinicians don't join the AI process until after the development stage, so data scientists miss their unique perspective on which AI models and what information would be valuable and practical in a busy clinical setting.
Data scientists and clinicians should work together in the development stage to define their goals for using AI in healthcare and understand how the data can help them achieve those goals. Understanding what data is available, the limitations of that data, and its potential biases helps the AI-adoption team identify opportunities to improve model results through data quality improvement.
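To make that joint review concrete, the minimal sketch below shows the kind of data-availability and bias checks a data scientist and clinician might walk through together before any modeling begins. The inline table and column names are illustrative stand-ins for an EHR extract, not a real schema or real patient data.

```python
# Minimal sketch (illustrative data): a quick data-availability and bias
# review that data scientists and clinicians can discuss together.
import numpy as np
import pandas as pd

# Stand-in for a patient-level EHR extract pulled from the data platform.
patients = pd.DataFrame({
    "age": [67, 54, np.nan, 81, 45, 72],
    "hba1c": [6.8, np.nan, 5.4, 9.1, np.nan, 7.2],
    "sex": ["F", "M", "F", "M", "F", "M"],
    "admitted_90d": [1, 0, 0, 1, 0, 1],
})

# 1. What data is available, and how complete is it?
print(patients.isna().mean().sort_values(ascending=False))

# 2. Do key clinical fields look plausible (limitations of the current data)?
print(patients["hba1c"].describe())

# 3. Does the outcome differ across groups in ways worth discussing (potential bias)?
print(patients.groupby("sex")["admitted_90d"].mean())
```

Walking through output like this together lets clinicians flag implausible values or missing populations before those gaps get baked into a model.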
After the data science team develops the model, it's time for implementation. Involving clinicians is important at every step of the AI process, but especially during implementation because data scientists don't always understand clinical workflows and culture. Without firsthand clinical understanding, developers might implement a model in a way that fits poorly into the workflow and produces suboptimal results.
For example, when one health system implemented a machine learning model to predict 90-day admissions, it didn't include clinical input in the model development process. As a result, the clinical teams didn't trust the model's outputs because they didn't understand how the model derived its insights. To make the model useful for clinicians, the organization redeveloped the predictive model with clinical input, empowering the clinical team to trust and use the tool.
Although the machine learning model has proven successful, the health system has had to double its efforts and create two different models. Had the organization included clinician input from the beginning, it could have saved the time and resources of developing multiple models.
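As an illustration only, and not the health system's actual model, the sketch below shows how a simple, explainable 90-day admission risk model might be prototyped once clinicians have weighed in on which features matter. The synthetic data, feature names, and libraries (pandas, scikit-learn) are assumptions for the example.

```python
# Minimal sketch (hypothetical): a 90-day admission risk model whose
# features were chosen with clinician input. All data is synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 1_000
patients = pd.DataFrame({
    "age": rng.integers(18, 95, n),
    "prior_admissions_12mo": rng.poisson(1.0, n),
    "active_medications": rng.integers(0, 20, n),
    "hba1c": rng.normal(6.5, 1.2, n),
})

# Synthetic outcome: admitted within 90 days (1) or not (0).
risk = 0.03 * patients["age"] + 0.8 * patients["prior_admissions_12mo"] - 6
patients["admitted_90d"] = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

X = patients.drop(columns="admitted_90d")
y = patients["admitted_90d"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A simple, inspectable model so clinicians can see what drives each prediction.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUROC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```

The point of keeping the first version this simple is that clinicians can question every input; a more complex model can follow once the team trusts the pipeline.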
In addition to leveraging clinical input from the ground up, AI-adoption teams can increase the likelihood of a model's success in a clinical setting by identifying early clinical adopters who understand how to use AI in patient care. A clinician champion helps colleagues see how AI in healthcare can benefit their day-to-day work, and that trusted peer relationship keeps an open line of communication for honest feedback and ideas for improvement.
Lastly, data scientists should collaborate with clinicians to interpret the AI model's results. Rather than telling clinicians to simply trust the model outputs, data scientists can team up with them to dissect and understand the AI insights. This teamwork also improves transparency and curbs black-box algorithms by requiring experts to explain complex algorithms in simple terms for users who aren't data scientists.
Clinicians can also offer valuable insight about the best way to deliver AI insights to the front lines of care. By presenting clear, understandable, and verifiable model results in terminology clinicians recognize, data science teams give AI the best chance of success at the point of care.
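One way to make results legible, sketched below, is to translate model coefficients into plain-language factors a clinician can verify at the bedside. The coefficients, factor labels, and patient values here are hypothetical, and the "contribution" is a simplified coefficient-times-value view rather than a full attribution method.

```python
# Sketch (hypothetical values): turn raw model coefficients into a
# clinician-readable "why this patient scores high" summary.
import pandas as pd

# Illustrative coefficients, e.g., from a fitted logistic regression.
coefficients = {
    "Patient age": 0.03,
    "Admissions in the last 12 months": 0.80,
    "Number of active medications": 0.05,
    "Most recent HbA1c": 0.10,
}

def explain_patient(coefficients: dict, patient: dict) -> pd.DataFrame:
    """Rank each factor by its simplified contribution to the risk score."""
    rows = [
        {
            "factor": name,
            "value": patient[name],
            "contribution_to_risk": coef * patient[name],
        }
        for name, coef in coefficients.items()
    ]
    return pd.DataFrame(rows).sort_values("contribution_to_risk", ascending=False)

patient = {
    "Patient age": 78,
    "Admissions in the last 12 months": 3,
    "Number of active medications": 12,
    "Most recent HbA1c": 8.1,
}
print(explain_patient(coefficients, patient).to_string(index=False))
```

A display like this invites clinicians to challenge the drivers they see, which is exactly the feedback loop the collaborative approach depends on.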
In addition to a collaborative approach, health systems should invest in infrastructure that supports data and analytics. A sophisticated data platform (e.g., the Health Catalyst Data Operating System (DOS™)) has the data aggregation capabilities to set the foundation for AI in healthcare.
Once the data platform is in place, with access to all the organization's data sources, health systems are ready to invest in AI tools. However, to overcome providers' lack of trust in behind-the-scenes algorithms, organizations must prioritize AI tools that are open and transparent.
For example, the new Health Catalyst Healthcare.AI™ offering allows clinicians to easily understand and manipulate the criteria for alerts (e.g., why a model flags a patient as high risk for Type 2 diabetes). This increased visibility into predictive models helps providers trust the algorithm, its insights, and AI overall.
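As a generic illustration, not the Healthcare.AI interface, the sketch below shows what clinician-visible, adjustable alert criteria can look like. It uses explicit rules instead of a trained model to keep the example short, and the threshold values are illustrative only, not clinical guidance.

```python
# Generic illustration: alert criteria kept as readable, clinician-adjustable
# thresholds with human-readable reasons, rather than a hidden score.
from dataclasses import dataclass

@dataclass
class DiabetesRiskCriteria:
    min_hba1c: float = 5.7               # flag at or above this HbA1c (%)
    min_bmi: float = 30.0                # flag at or above this BMI
    min_fasting_glucose: float = 100.0   # flag at or above this value (mg/dL)
    rules_required: int = 2              # how many criteria must be met to alert

def should_alert(patient: dict, criteria: DiabetesRiskCriteria) -> tuple[bool, list[str]]:
    """Return whether to alert and the human-readable reasons why."""
    reasons = []
    if patient["hba1c"] >= criteria.min_hba1c:
        reasons.append(f"HbA1c {patient['hba1c']} >= {criteria.min_hba1c}")
    if patient["bmi"] >= criteria.min_bmi:
        reasons.append(f"BMI {patient['bmi']} >= {criteria.min_bmi}")
    if patient["fasting_glucose"] >= criteria.min_fasting_glucose:
        reasons.append(
            f"Fasting glucose {patient['fasting_glucose']} >= {criteria.min_fasting_glucose}"
        )
    return len(reasons) >= criteria.rules_required, reasons

alert, why = should_alert(
    {"hba1c": 6.1, "bmi": 33.4, "fasting_glucose": 112},
    DiabetesRiskCriteria(),
)
print(alert, why)
```

Because the criteria sit in plain view, a clinician can ask to tighten or loosen a threshold and see exactly how the alert behavior changes.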
With this supporting infrastructure (e.g., data platform and AI tools) in place, data science and clinical teams are ready to work together to develop predictive models. With a collaborative approach, clinicians can advise on what information would be most valuable in a busy hospital setting, allowing data scientists to build a model that delivers insight providers will trust and use in decision making.
Co-development between clinicians and data scientists also decreases the likelihood that the data science team will create a model that delivers impractical insight and has to start over, rather than building the most effective model the first time. Health systems can involve clinicians earlier in the AI process by educating them about machine learning and starting with simple projects that help them get to know AI.
As AI grows more common in healthcare, health systems that purchase or build AI technology should strongly consider clinician perspectives. Incorporating clinicians' insight about which data is relevant at the point of decision making and how models fit into practical clinical workflows helps them trust AI and get the most from predictive model insights.
Health systems can help clinicians reap the benefits of AI in healthcare by promoting collaboration in the development, implementation, and results stages of the predictive model process. This team mindset combined with robust data infrastructure allows health systems to build AI models that will optimize provider decision making and lead to better outcomes.