
Healthcare Research: When To Shout The Emperor Has No Clothes

Summary

To 'Make It Better' is the universal claim of all healthcare quality and process improvement (QPI) research. Because this research is clothed in a common set of methods (Plan-Do-Study-Act [PDSA]), it is critical to understand when to shout "The Emperor Has No Clothes." This article exposes the risks of centering healthcare research around a 'naked database' that has no ability to interact with the clinical setting. The main risk is the failure to see and incorporate opportunities for workflow control and patient engagement. At best, the 'results' are of little or no value; at worst, they misguide practice and misallocate resources. An example case and tips for avoiding this trap are presented.

 

The Emperor's Clothes

The continuous pursuit of QPI is the emperor within the healthcare industry. There are alternative models (e.g., the Lean Healthcare model), but the emperor's clothes typically come from the Model for Improvement, the core of which is small, incremental changes using iterative Plan-Do-Study-Act (PDSA) cycles.

In short, each cycle involves planning what is to be achieved through some change, testing the change, studying the results, and acting on what was learned. The core components of the method are as follows (a minimal code sketch follows the list):

  • Plan

    • Determine what to change or the quality improvement focus

    • Operationalize how the results will be measured

      • Process variables, health outcomes, variables assessing unintended consequences

  • Do

    • Implement the change (if applicable) and capture the measurements

  • Study

    • Enter the data into a database

    • Analyze the data

  • Act

    • Disseminate results to providers / patients

    • Translate positive findings into actions
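For readers who think in code, here is a minimal sketch of a single PDSA iteration. All of the names (PDSACycle, do, study, act) are illustrative assumptions, not part of any QPI framework or library:

```python
# A minimal, illustrative sketch of one PDSA iteration.
# All names here are hypothetical, not from any specific QPI framework.

from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    aim: str                # Plan: what to change / the QI focus
    measures: list[str]     # Plan: how results will be measured
    data: list[dict] = field(default_factory=list)

    def do(self, observation: dict) -> None:
        """Do: implement the change and capture a measurement."""
        self.data.append(observation)

    def study(self) -> dict:
        """Study: summarize the captured data (here, simple means)."""
        summary = {}
        for m in self.measures:
            values = [obs[m] for obs in self.data if m in obs]
            summary[m] = sum(values) / len(values) if values else None
        return summary

    def act(self, summary: dict) -> str:
        """Act: disseminate results and decide what the next cycle changes."""
        return f"Disseminate {summary} and plan the next cycle."

# One iteration: plan -> do -> study -> act
cycle = PDSACycle(aim="Reduce missed questionnaires",
                  measures=["questionnaire_completed"])
cycle.do({"questionnaire_completed": 1})
cycle.do({"questionnaire_completed": 0})
print(cycle.act(cycle.study()))
```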

Worn Out Clothes

There is a long history of success using PDSA cycles, so a fully clothed emperor has paraded the victories of QPI initiatives many times. It's tempting to lean on inductive reasoning and assume all is well, but clothes wear out. Indeed, advances in information technology have made a key component of the PDSA cycle obsolete, and continued use of that component undermines the utility of the entire QPI design. More specifically, database technology in healthcare settings has progressed well beyond the traditional database role in PDSA cycles.


Historically, a database was a central component of all PDSA cycles. That is, all data moved into the database for storage and out for analysis. This database-centered approach has three major limitations:

  1. Reporting results involves multiple steps, thus potential delay

  2. A database “Hammer” makes everything look like a variable

  3. A database isn't a solution, just data about a QPI issue

First, the traditional methods of moving data in and out of a database meant there were multiple steps involved in reporting results (see image below): data collection and entry, data query and extraction, analysis, interpretation of results, generating a report, and delivering findings to providers (and patients, if applicable). Because of the multiple steps involved, any disruption along the way results in additional delay.
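To make the delay concrete, here is a small sketch in which reporting latency is simply the sum of every step's delay, so a disruption at any one step pushes the entire report out. The step names come from the list above; the day counts and the disruption example are assumptions for illustration:

```python
# Illustrative sketch of the multi-step, database-centered reporting path.
# The per-step delays are hypothetical; the point is that total latency
# is the sum of every step, so any disruption anywhere adds to the total.

REPORTING_STEPS = [
    ("data collection and entry",  3.0),   # days (assumed)
    ("data query and extraction",  1.0),
    ("analysis",                   2.0),
    ("interpretation of results",  1.0),
    ("generating a report",        1.0),
    ("delivering findings",        0.5),
]

def total_latency(steps, disruptions=None):
    """Sum per-step delays plus any disruption at a named step."""
    disruptions = disruptions or {}
    return sum(days + disruptions.get(name, 0.0) for name, days in steps)

print(total_latency(REPORTING_STEPS))                      # 8.5 days
print(total_latency(REPORTING_STEPS, {"analysis": 5.0}))   # 13.5 days
```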

Second, a startling and counterintuitive finding in psychology is that your aims drive perception (i.e., you see what you aim at). This is a massive source of conflict and misunderstanding in culture, but for this discussion it's the investigator's "database aim" that can distort the design of a QPI initiative. The risk is that an investigator who is 'looking' for a database has an increased tendency to 'see' only the variability in the clinical setting that can be measured (i.e., variables), as opposed to, for instance, characteristics that could be controlled. In other words, a mindset develops whereby the database becomes a hammer that turns everything into a variable (i.e., the nail).

Third, there are many database applications available, all of which do the same thing: store data. Historically, this was the one and only function of a database in a PDSA cycle. The database is NOT a "Product" in the sense of being a quality improvement solution, because it has no mechanism to directly affect workflow, decisions, patient engagement, etc. The database is completely disconnected from the clinical setting. As a result, the database CAN'T solve any of the QPI issues in a PDSA cycle; it can only store data about them.

Dangerous Combination

In the worst-case scenario, these three limitations combine to completely undermine the aims of a QPI initiative. The perceptual distortion that looks for clinical variability extends until the QPI aims themselves transform into a 'need' for a database. But a database is NOT a solution, so it can only contain data values about QPI issues.

Because the database has no mechanism to solve any of the issues, the only thing that can be done is to focus on improving the variable values themselves. When this is done, the variables are considered 'healthcare metrics' and the proposed 'solution' is merely an endless loop of capturing and reporting metric performance to providers in hopes the information will spur 'improvement' (see image below). In such cases, resources are misallocated into supporting reporting-loop infrastructure that is disconnected from the clinical setting (e.g., database administrators, server time, etc.), rather than solving issues directly within the clinic. This scenario is the perfect setup for parading around an emperor with no clothes; an example will demonstrate the point.

Disappearing Clothes

This case example substitutes general terms for specifics to facilitate translation to other QPI initiatives. An expert committee derived 14 healthcare metrics related to the management of a chronic disease state. Central to the plan was the creation of a database to capture and store the variable values used to generate the metrics. Multiple sites contributed data. A monthly reporting loop was established whereby the data were queried and exported to a statistical package for analysis, and performance reports were generated and reviewed by providers. The metrics were aggregated in several ways (e.g., by site, disease type, physician provider) to facilitate comparison and encourage improvement.
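As a rough illustration of the aggregation step in such a reporting loop, the sketch below groups hypothetical records by site, disease, and provider. The field names and values are invented, since the actual 14 metrics are not specified here:

```python
# A minimal sketch of the monthly reporting loop's aggregation step.
# Field and metric names are invented for illustration only.

from collections import defaultdict

records = [
    {"site": "A", "disease": "type1", "provider": "dr_x", "metric_met": 1},
    {"site": "A", "disease": "type2", "provider": "dr_y", "metric_met": 0},
    {"site": "B", "disease": "type1", "provider": "dr_z", "metric_met": 1},
]

def aggregate(records, key):
    """Percent of records meeting the metric, grouped by a key (e.g., site)."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        hits[r[key]] += r["metric_met"]
    return {k: 100.0 * hits[k] / totals[k] for k in totals}

# Aggregated several ways to facilitate comparison across groupings
for key in ("site", "disease", "provider"):
    print(key, aggregate(records, key))
```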


Consistent with a database mindset, the data definitions of all but one metric involved 'looking' for variability within the clinical setting to measure, as opposed to seeing opportunities for appropriate QPI-related control. More specifically, the metrics assessed variability in patient and/or staff behavior, such as:

  • Did the patient complete a health-related questionnaire?

  • Did staff assess and document key symptom targets?

Only one metric did NOT follow this pattern and instead assessed a treatment outcome goal (i.e., did the target symptom occur in the past X months?).


All the metrics disappeared when appropriate software-based workflow controls were put in place, the only exception being the treatment-related health outcome metric. In other words, the metrics had little or no value other than pointing out the need for workflow control. Once the emperor is exposed as naked, QPI initiatives can be redirected toward actually solving QPI issues.

This example brings together all the potential PDSA shortcomings:

  • Delayed Reporting

    • None of the data were used for real-time clinical decision making.

    • Generating reports required multiple steps.

  • A Database Mindset

    • Behavioral variability was ‘seen’ in the clinical setting, rather than a need for control.

  • No Solutions, Just A Reporting Loop

    • Nearly all metrics became irrelevant after appropriate workflow controls were in place.

  • Misallocated Resources

    • Funds were allocated to supporting a reporting loop rather than solving issues.

A final note about this case: there is a key distinction between health-process and outcome variables. Health outcome reporting is more likely to be in line with healthcare goals due to its direct relationship with treatment targets. In contrast, process variables are much less likely to stay fixed to healthcare goals (i.e., there is always another process improvement to look for) and much more likely to disappear with the introduction of workflow controls. Unfortunately, once a process variable becomes a 'metric', there is a massive disincentive to change or even examine it (i.e., due to reporting continuity [e.g., comparing historical vs. current results]). As a result, Reporting Loops can go on for years without budging, even when they contain readily solvable issues.

The Magician Behind the Disappearing Act

A nearly endless number of techniques can cause QPI-related issues to 'disappear': devices, workflow changes, imaging, analytical procedures, software tools, etc. In the case example above, nearly all the metrics disappeared by replacing a static database with a dynamic, interactive software tool called a Learning Healthcare System (LHS). In short, the disappearing act is the result of a system that uses data to intervene and inform on the fly, rather than waiting to report on what wasn't done. Indeed, LHS development was, in part, a direct response to the limitations of traditional databases.


Descriptions of the various LHS features are spread across other blog articles; only the two most relevant to the case example above are mentioned here (a small illustrative sketch follows the list):

  1. Real-time Data Utilization

    • Eliminating the report delay opens numerous opportunities for leveraging data to improve clinical decisions (e.g., clinical reports) and to engage and inform patients with personalized information about their care (e.g., medication adherence).

  2. Workflow Control

    • Based on planned workflow controls, the LHS can be configured to ensure staff and patient interactions are directed toward completing intended actions. Depending on the situation, this is done using techniques such as notifications, user-interface layout, continuous monitoring and feedback, data visualization and reporting, analytical algorithms, system triggers and related steps, feature design, etc.
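As a hedged illustration of the workflow-control idea, the sketch below wires a hypothetical check-in trigger to notifications. The event name, handler, and notify() stub are assumptions for illustration, not the API of any actual LHS product:

```python
# Hedged sketch: one way an LHS-style workflow control might be wired.
# The trigger, handler, and notify() stub are all hypothetical.

def notify(recipient: str, message: str) -> None:
    print(f"[notify {recipient}] {message}")   # stand-in for a real channel

def on_patient_checkin(patient: dict) -> None:
    """System trigger: runs in real time at check-in, not in a monthly report."""
    # Workflow control: direct staff and patient toward the intended action now.
    if not patient.get("questionnaire_completed"):
        notify(patient["name"], "Please complete your health questionnaire.")
        notify("front_desk", f"Hold rooming for {patient['name']} until done.")
    if not patient.get("symptoms_documented"):
        notify("nurse", f"Assess and document symptom targets for {patient['name']}.")

on_patient_checkin({"name": "pat_01",
                    "questionnaire_completed": False,
                    "symptoms_documented": True})
```

Notice that the two example metrics from the case (questionnaire completion and symptom documentation) have nothing left to measure once the system itself directs the actions to completion.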

With just these two features, nearly all the original Reporting-Loop metrics became meaningless, exposing the fatal risk of poorly designed QPI initiatives. Moreover, an LHS is a proactive, continually advancing product solution for healthcare delivery and can serve as a foundation that integrates other solutions. These results suggest needed changes to QPI design.

The Future of QPI Design

Certainly, the PDSA method has been a tremendous net benefit to the healthcare industry and will remain a core part of well-designed QPI initiatives. Moreover, there are times when the limitations discussed here may not be relevant. However, information technologies have advanced to the point where they are critical to consider. A couple of recommendations seem well advised.

Flow Against The Mindset

A flowchart is a common method for diagramming a sequence of actions, typically used during the planning stage. To ensure QPI initiatives are not hindered by the limitations of a database-centered approach, the traditional flowchart symbols should be redefined so that workflow control opportunities are spotted and potentially incorporated into the design. The change in definition shifts the focus from seeing "What Is Done" to "How It Is Done." The former emphasizes the end result, leaving little more to do than measure it with a variable. In contrast, the latter focuses on the mechanisms that lead to a result, exposing ways to facilitate and/or control the process.
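The sketch below contrasts the two mindsets on the same workflow step; the function names and the enforcement mechanism are hypothetical:

```python
# Illustrative contrast between the two flowchart mindsets.
# "What Is Done" leaves only a variable to measure after the fact;
# "How It Is Done" exposes a mechanism that can be controlled.

# What Is Done: record the end result as a variable for the database.
def measure_step(chart: dict) -> int:
    return 1 if chart.get("questionnaire_completed") else 0

# How It Is Done: control the mechanism so the result occurs by design.
def controlled_step(chart: dict) -> dict:
    if not chart.get("questionnaire_completed"):
        # e.g., the check-in screen won't advance until the form is done
        chart["questionnaire_completed"] = True   # enforced, not just observed
    return chart

chart = {"patient": "pat_01"}
print(measure_step(chart))      # 0 -> becomes a metric to report on
print(controlled_step(chart))   # the step itself guarantees completion
```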

Letting Go

Defining a QPI initiative around a database is fraught with risks, and no matter how it is dressed up with justifications, the risks remain. Multiple LHS solutions are now available, each with an extensive range of features. Thus, it is well past time to LET GO of QPI initiatives framed strictly in database terms. Indeed, it should be incumbent upon QPI planners to provide a clear rationale if LHS technologies are not included.

Note there are many ways to disguise a database-centered QPI design, and the risks follow it in every disguise. One disguise deserving additional explanation is the claim that EMR data aggregation across clinics is somehow 'different.' In fact, typically the opposite is true. That is, an EMR aggregation initiative remains framed around a database, and it adds a new limitation because the data come from a system with often dramatically different aims (i.e., billing, administration [e.g., scheduling], disease/procedure/treatment documentation). This new limitation introduces a whole host of other potential problems, most notably specificity errors (i.e., not having the variables needed for a complete and accurate analysis). EMR data aggregation can be a fantastically powerful tool for a narrow set of research questions, and it is great for clarifying cost distributions, administrative loads, etc.


However, EMR data aggregation as a QPI tool can, at best, only point a finger toward potential problem areas, and looking at the finger (i.e., a Reporting Loop) shouldn't be mistaken for solving issues. A sure sign of a Reporting Loop is when site participation in a "QPI" initiative consists merely of contributing EXISTING data and the only mechanism for change is more reporting. Moreover, the reports typically do NOT contain information about what causes values to vary, since there is no mechanism for change and doing so would expose the naked emperor. Lastly, the dynamics of a reporting loop incentivize those involved to justify the effort and fiscal support, so the importance of the loop is raised up to the level of emperor. The problems with this scenario are part of the reason why Learning Health Systems emerged.


Instead, QPI initiatives and PDSA cycles should be framed and implemented within an LHS. This approach directly addresses the limitations of a database-centered design through an advancing healthcare solution product that has mechanisms for workflow control, provides clinical staff with on-the-fly utility, pulls data from the EMR as applicable, engages patients with personally relevant disease information and incentives, and provides a platform for integration with other potential solutions (e.g., devices). Other blog articles covering examples are coming soon.







