Most healthcare systems have been building, improving, and maintaining proprietary healthcare analytics platforms since the early 2000s and have invested heavily in the people and resources required to do so. As the demands of today’s healthcare environment continue to increase, it’s becoming more difficult for analytic teams to keep up.
This article examines the six biggest problems with maintaining a homegrown healthcare analytics platform today:
1. Inability to keep pace with analytic demands.
2. Difficulty supporting and scaling for the future.
3. Difficulty finding and keeping talent.
4. Use of point solutions to fill gaps.
5. Obligation to support third-party vendors and affiliated groups.
6. Difficulty keeping abreast of rapidly changing regulatory requirements.
As the need for data-driven improvement becomes more urgent, many health systems are finding their data management and analytics solutions are falling behind. Most traditional data warehouses are overwhelmed by analytic requests and lack the ability to support today’s increased and rapidly changing demands.
Some large health systems, such as Northwestern Medicine and Intermountain Healthcare, got an early start on data analytics and business intelligence in the early 2000s. At the time, there were no comprehensive or best-of-breed analytic solutions in the healthcare marketplace. This forced these health systems into creating their own homegrown analytic capabilities, data models, and reports. Some opted to purchase the limited general-purpose tools available at the time and build their own custom analytic solutions on top of those.
Several of these larger organizations have been creating their own solutions for so long that they’ve become highly invested in those solutions. They’ve invested in the technology, the thousands of reports, dashboards, and scorecards they’ve created over the years, as well as the personnel resources they’ve hired and trained to work on their proprietary healthcare analytics platforms. Because of the financial and time investments poured into creating, expanding, and maintaining these proprietary systems, it can be a difficult decision to pivot to a new solution.
Although it can be difficult to pivot, a number of problems come with a homegrown analytics solution that cause organizations to look for more effective alternatives. Below are the top six challenges of perpetuating homegrown analytic solutions.
The inability to keep pace with the analytic demands of the organization is the biggest problem with homegrown solutions, and it’s a multi-pronged, compounding issue. Below are some of the factors that contribute to this being the leading problem.
By the time the analytic team turns around a new request, the requesting department has often moved on and the business need has evolved or changed. The analytic team is constantly playing catch-up, never quite meeting the needs of its users, because the business need for healthcare analytics evolves faster than the team’s ability to turn around a new solution.
Homegrown EDW infrastructures and tools typically result in analysts spending 80 percent of their time prepping, cleansing, and consolidating data instead of performing actual analysis. This translates to needing as many as four analysts to accomplish what one analyst could do when working with a modern enterprise analytics platform with fully integrated tools. A modern platform and data architecture would flip the ratio to analysts spending 80 percent of their time performing analysis vs. getting data prepared for analytic use.
Another factor in analytic operations failing to keep pace with the needs of the organization is that many health systems are increasingly involved in merger and acquisition activity. Whether the health system is buying a community hospital or acquiring a new physician practice, the analytics team must rapidly integrate the acquired data sources to bring the new business into a consolidated analytics realm. Homegrown infrastructures present a challenge for this type of rapid integration because they are not equipped to quickly take on new data sources; analysts must create new, complicated processes in short periods of time. What often happens is that data architects and analysts do just enough detail work to land those source systems’ data in the data warehouse, rather than doing the integration work at an analytic level inside the warehouse.
And the 80 percent of their time that analysts already spend doing non-analyst work only worsens, because they now have multiple new data sources to deal with. Every time they want to perform an analysis or create a report, they have additional data sources to prep, scrub, and consolidate.
The last factor in keeping pace with analytic demand is that each report, dashboard, scorecard, drillable analytics solution, or analytical application must be custom developed by the team. As a result, reuse is a challenge. This is another reason it takes an average of four to six months to satisfy an analytic request.
A homegrown analytic solution typically consists of custom Extract, Transform, and Load (ETL) processes, proprietary data models, limited data management tools, custom-developed reports and dashboards, and non-integrated infrastructure for managing and analyzing big data sets. This makes homegrown solutions very difficult to support, maintain, and enhance, particularly as major new analytic requirements need to be addressed.
The Winchester Mystery House phenomenon illustrates this well. Sarah Winchester, heiress to the Winchester rifle fortune, believed (due to her superstition that the ghosts of those shot by Winchester weapons would come to haunt her) that she had to devote her life and wealth to building and adding onto her mansion. This resulted in mazes of hallways and rooms, doors that opened onto walls or sheer drops, and staircases that led nowhere. This chaos is what happens when there is no initial master plan or blueprint to address future requirements.
In many homegrown analytic solutions, the original vision fits only the existing needs of the business. Maybe the organization starts with needing to analyze financial data, but as they grow, they want to look at clinical outcomes improvement work or consumer trends. Then, they want to start value-based care arrangements and need to slice data differently. In these cases, the data model grows like the Winchester mansion, expanding with each new requirement added with little thought for the future, and creating headaches for data architects and tool developers.
As mentioned above, mergers and acquisitions further compound this issue when analysts need to rapidly integrate new data sources into the existing data model. When data architects encounter a net new data source (or multiple new sources simultaneously, as can be the case in an acquisition), developing custom ETL processes is laborious and time consuming. These new data sources are typically “landed” in source data marts and left to the analysts to integrate into meaningful subject area marts, a pattern sketched below. This inflates the amount of time that analysts must spend cleansing and prepping data before any actual analysis can be performed.
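To make that pattern concrete, here is a minimal SQL sketch of the hand-written consolidation work involved. All table and column names (emr_a_landed.rx_orders, emr_b_landed.medication_list, sam_pharmacy.medications) are hypothetical, chosen only for illustration; the mapping logic is an assumption about what such work typically involves, not a depiction of any particular system:

-- Illustrative sketch only: every newly landed source requires another
-- hand-written mapping like this, repeated for each subject area mart
-- that needs the data.
INSERT INTO sam_pharmacy.medications
       (patient_id, medication_name, start_date, source_system)
SELECT a.pat_id,                   -- source A's patient key
       UPPER(TRIM(a.drug_desc)),  -- source A stores free-text drug names
       a.order_dt,
       'EMR_A'
FROM   emr_a_landed.rx_orders a
UNION ALL
SELECT b.patient_ref,              -- source B uses a different key...
       UPPER(TRIM(b.med_name)),
       CAST(b.start_ts AS DATE),  -- ...and a different date type
       'EMR_B'
FROM   emr_b_landed.medication_list b;

Multiply this kind of mapping by every subject area and every new acquisition, and the 80 percent figure above starts to look conservative.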
The challenge is striking the right balance between a fully defined data model and a late-binding, just-in-time approach. A hybrid modeling approach offers full modeling of the most-used data while applying late binding to lesser-used data, avoiding the overhead expense incurred by a fully defined data model. This Pareto Principle approach (model the 20 percent of the data that serves 80 percent of analytic use cases) allows configuration and customization depending on an organization’s specific needs, as the sketch below illustrates. Getting that balance right is extremely challenging in the homegrown arena.
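A minimal sketch of that hybrid split, again using hypothetical names, looks like this: the high-use 20 percent is persisted into a curated, conformed table up front, while lesser-used data stays in its landed source format and is bound late, through a view, only when an analysis actually calls for it:

-- Hypothetical names throughout; a sketch of the hybrid approach,
-- not any vendor's actual schema.

-- The high-use ~20 percent is modeled and persisted up front.
CREATE TABLE shared.medication (
    patient_id      BIGINT,
    medication_name VARCHAR(255),  -- drug name, standardized at load time
    start_date      DATE,
    source_system   VARCHAR(10)
);

-- Lesser-used data incurs no upfront modeling cost; it is bound late,
-- at query time, only when an analysis needs it.
CREATE VIEW analysis.infusion_pump_events AS
SELECT pump_id, event_ts, event_code
FROM   emr_a_landed.pump_log;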
Every additional way an organization needs to slice and dice its data compounds these scalability problems. The problem worsens further as data is required from multiple, heterogeneous EMRs and claims sources from in-network affiliated providers and payers.
The bottom line on scalability: supporting and enhancing the foundations of a homegrown solution ends up consuming a far larger share of the team’s time than actually analyzing data and adding value to the business.
The analytic and technical talent required to support a homegrown analytic solution is difficult to find and retain. Healthcare organizations hire and train great analysts who then leave for greener pastures because they don’t want to work in a proprietary system that’s frustrating and complicated to learn. A standards-based, modern enterprise analytics platform helps streamline the workflow and cuts down on the high-end resources required to maintain a complex homegrown system.
Because of the problems outlined above, analytics and IT teams are often forced to invest in multiple third-party point solutions to fill gaps in their homegrown system. The team struggles to keep up with demand, and the business turns to specific analytic solutions that the homegrown platform can’t deliver in a timely manner. This results in the worst of both worlds: the organization has built an expensive infrastructure using expensive human resources, and then invests in expensive point solutions to fill the gaps that infrastructure is unable to address.
This only further complicates the issue of resources because now the team must do system and data integration work to fulfill the data ingestion demands of third-party vendors. The result is a hybrid solution where they are both buying and building an expensive solution rather than having one comprehensive solution that meets most analytic and technical needs.
It’s difficult enough for analytic teams to support the needs of their own organization, but in today’s environment, many healthcare organizations are also working as a third-party analytics supplier, whether it’s part of an Accountable Care Organization (ACO), Clinically Integrated Network (CIN), or in a fee-for-value arrangement with a large employer group.
The healthcare system offers products and care options and contracts with other providers and hospitals in order to provide the necessary network coverage. As the curator of the ACO or CIN, the health system is tasked with compiling data from heterogeneous systems such as the EMRs used by affiliated physicians, third-party claims payers, and so on. Analytic teams must then provide services for these third parties on top of the demands of their own organization, which they are already struggling to keep pace with.
Internal analytic teams often struggle to keep abreast of rapidly changing regulatory requirements like Clinical Quality Measures and HCAHPS scores. On top of these ever-changing measures, organizations have their own internal custom measures that must also be updated. This is another time-consuming task for a team of overworked analysts.
The Health Catalyst® Data Operating System (DOS™) is a comprehensive solution for healthcare organizations struggling to meet today’s increasing demands with cobbled-together proprietary systems and overworked analytic teams. Here’s why:
Health Catalyst started out like many healthcare organizations: without a set data model. Developers created data marts based on the analytic solution each customer needed. Analysts, like analysts working in a homegrown solution, then had to combine, cleanse, and prepare all their own data from the customer’s data sources in the data warehouse. This was termed a late-binding data architecture: data from an EMR or claims system could be dropped into the data warehouse and would mostly stay in its source format, and data analysts would combine it whenever they needed to create a report or fulfill an analytic request. In some ways, this perpetuated the same problem that analysts in a homegrown environment struggle with.
With Health Catalyst’s release of DOS, data is kept in its original source format for uses like big data discovery and research, where it’s important to preserve the sanctity of the source data. DOS also curates a layer of shared data called DOS Marts. Health Catalyst discovered that roughly 20 percent of all the data stored on behalf of customers represents roughly 80 percent of the use cases for common reports and clinical quality outcomes improvement. DOS takes that 20 percent of the available data and persists it into a curated data model supporting standardized, self-service analytics and massive reuse of the data by multiple analytic applications and tools. The remaining 80 percent of the data can be utilized by analysts, as needed, in Subject Area Marts, providing the best of both worlds for analysts.
For example, if a healthcare organization has five different EMR systems, instead of storing the data in five different proprietary formats, Health Catalyst takes items like allergies, medication lists, or labs from all five EMR systems and puts them into a single, shared data structure. The analyst can then run a simple query that says “Show me all the medications for this patient,” rather than having to write SQL statements to bring in medications from five disparate sources within the EDW. The data is already highly normalized.
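To make the contrast concrete, here is a hedged sketch reusing the hypothetical schema from the earlier examples (these are illustrative names, not Health Catalyst’s actual structures):

-- Against the shared structure, the request is a single query.
SELECT medication_name, start_date
FROM   shared.medication
WHERE  patient_id = 12345;

-- Without it, the same request means unioning five proprietary layouts,
-- each with its own keys, column names, and date types.
SELECT drug_desc, order_dt
FROM   emr_a_landed.rx_orders
WHERE  pat_id = 12345
UNION ALL
SELECT med_name, CAST(start_ts AS DATE)
FROM   emr_b_landed.medication_list
WHERE  patient_ref = 12345;
-- ...plus three more UNION ALL branches for EMRs C, D, and E.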
Health Catalyst recognizes that it doesn’t make sense to do this level of curated data modeling for 100 percent of the ingested data elements. Instead, Health Catalyst leaves the lion’s share of the data in a late-binding data architecture. The data that is ingested and homogenized in a DOS Mart is the data that provides the basis for most reports. This strategy strikes a balance that keeps analysts highly productive without stifling their ability to do unique things with the data. This curated data model offers immediate benefit to clients.
The second item that brings immediate benefit to clients is that Health Catalyst knows how to integrate data into the data warehouse. With DOS, more than 200 different data sources are already integrated, allowing Health Catalyst to run circles around any homegrown solution in terms of productivity. Regarding the mergers and acquisitions discussed earlier: if a healthcare system needs to integrate new data sources into a homegrown solution, it has to write custom connectors, a time-consuming process. The chances are very high that Health Catalyst has already dealt with a given data source and can quickly ingest the data into its source data marts and then into the curated data model, allowing analysts to take immediate advantage of the data without having to know any of the intricacies of those source systems.
Another huge value-add is that DOS includes analytics accelerators and applications on top of the platform that are immediately available to analysts. Across these 70 applications and accelerators, Health Catalyst covers 80-plus percent of what any given department or CIO of a healthcare system needs; analysts then configure or customize the remaining 20 percent for their unique needs. That’s a big head start over proprietary homegrown solutions. Instead of analytic requests taking four to six months, analyst teams are far more capable of keeping pace with the analytical needs of the organization, in addition to providing true self-service analytic tools to their users. Analysts can provide more value and have a bigger impact on the organization.
Health Catalyst has teams of people who focus on keeping up with the latest regulatory measure changes. Those changes are translated into measure modifications and delivered to clients, so a healthcare organization’s own analysts no longer need to spend time and resources tracking so many different regulatory changes and then customizing and editing every measure update. They can simply validate the output.
Every three to five years, the healthcare organization’s IT team has to reinvest in its infrastructure. This usually involves going to bat with finance and operations leaders to support increases in the budgets allocated for analytics infrastructure and tools. Whether this is upgrading the storage area network or purchasing new servers, these requests don’t add any incremental value to the organization’s customers. Instead, they require downtime or outages of the infrastructure and consume large amounts of capital that might be better spent expanding operations and supporting growth opportunities.
With Health Catalyst, clients get technology access as part of an annual subscription that includes all the technology, software, the DOS platform, and the hosting environment. This tech access model offers a predictable spend rate for the organization’s entire analytics program. And, unlike a homegrown solution, Health Catalyst can actually help clients achieve at least a two to one return on investment. An internal homegrown solution will be hard pressed to offer that level of return. In addition, DOS functions as an enterprise analytics platform with the ability to displace many point solutions the client has separately purchased or developed in the past.
Even though many healthcare systems have invested heavily in their homegrown proprietary systems, the demands of today’s healthcare environment are making it increasingly difficult for analytic teams to keep up. The inability to keep pace with the demands of the organization, scale for the future, and find and keep high-end resources makes it unmanageable to maintain homegrown analytic solutions indefinitely.
Health Catalyst’s DOS solution provides a comprehensive platform for analytic teams that are stretched too thin to meet the demands of the organization. DOS provides access to complete, actionable healthcare data to quickly get analysts caught up and doing complex analyses that will ultimately help the organization save money, optimize operations, and improve care.