Improving the value of health care requires identifying and measuring outcomes that matter to patients. To accomplish these steps, providers must overcome operational hurdles, such as the cost of implementing measurement systems and the time required to utilize them. Here we describe our experience with the use of a simple and affordable homegrown tool for pursuing patient-centered outcomes in real time.
Improving the value of health care requires identifying and measuring outcomes that matter to patients. Ideally, such information should be acquired efficiently and in real time in order to enable immediate improvement within clinical microsystems.
We developed an Assessment of Care tool that is simple and easy to use at the point of care in order to facilitate value-creation in real time. The tool’s design was inspired by the IOM’s six dimensions of perfect care: safety, timeliness, effectiveness, efficiency, equity, and patient-centeredness.
The power of the Assessment of Care tool originates in trust. Care teams must commit to responding swiftly and in good faith to the patient feedback or else risk the enterprise falling apart.
Adoption of the Assessment of Care tool by patients may come naturally. For provider adoption to be successful, the tool must be incorporated into daily clinical workflows.
One approach to optimizing value for patients is to practice interpersonal medicine, which has been defined as “a disciplined approach to delivering care that responds to patients’ circumstances, capabilities, and preferences.” Such an approach requires the measurement of patient experience if providers are to learn and improve.
Tools for assessing patient satisfaction such as the Press Ganey Survey and the Hospital Consumer Assessment of Healthcare Providers and Systems Survey (HCAHPS) offer “valid and reliable measures of hospital quality” by identifying common themes in customer feedback. However, these surveys do not deliver such feedback from a specific patient, nor do they do so in real time. Intelligently designed tools for collecting patient-reported outcomes that are unrelated to the experience of care, such as the PROMIS CAT, also have limitations: implementing such systems can be expensive and technologically complex.
Our goal was to drive value creation at the point of care within a clinical microsystem, but we could not find a simple and effective tool for doing so. We found guidance in the Institute of Medicine’s (IOM’s) Crossing the Quality Chasm report and ultimately developed an Assessment of Care (AOC) tool inspired by the IOM’s six dimensions of perfect care. The tool consists of a series of six visual analog scales (one for each dimension): safety, effectiveness, patient-centeredness, timeliness, efficiency, and equity. Each scale is a horizontal line, 100 mm long, anchored at the leftmost end by dissatisfaction and at the rightmost end by perfect satisfaction.
At the beginning of or prior to an encounter, the patient or significant other places a mark on each line to indicate the level of satisfaction for each dimension of the care experience. The measured distance of the mark from the left origin of the line quantifies the patient’s assessment. We arbitrarily define any score of <90 mm on any dimension as “imperfect” and thus an opportunity for learning and improving. We capitalize on these opportunities by immediately “stopping the assembly line” — literally halting our workflow to talk with the patient (and family, if appropriate) about the care experience and how to improve it immediately. Then — and this is the hard part — we commit to addressing the concern before resuming our clinical workflow. In our experience, these conversations and activities are typically brief. Trust is key: If the care team does not respond in good faith to the patient’s feedback, the enterprise falls apart.
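The scoring logic described above is simple enough to sketch in a few lines of code. The dimension names and the 90 mm threshold come from the text; the function and variable names here are illustrative only, not part of the actual tool.

```python
# Six IOM dimensions, each marked on a 0-100 mm visual analog scale.
DIMENSIONS = ["safety", "effectiveness", "patient-centeredness",
              "timeliness", "efficiency", "equity"]

IMPERFECT_THRESHOLD = 90  # any mark under 90 mm is defined as "imperfect"

def improvement_opportunities(scores: dict) -> list:
    """Return the dimensions whose scores signal an opportunity to
    pause the workflow and talk with the patient about the care."""
    return [d for d in DIMENSIONS if scores.get(d, 0) < IMPERFECT_THRESHOLD]

# Example: a patient marks safety at 50 mm and everything else at 95 mm.
marks = {d: 95.0 for d in DIMENSIONS}
marks["safety"] = 50.0
print(improvement_opportunities(marks))  # ['safety']
```

A score of exactly 90 mm or above passes; anything below it, on any dimension, triggers the real-time conversation described above.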
The clinical microsystem team included the service chief, staff providers, the nurse leader, staff nurses, our patients, and their loved ones or significant others.
We first introduced the Assessment of Care tool in a specialty neuropsychiatry clinic and have since utilized it during thousands of encounters at multiple institutions.
One early experience was particularly humbling and illustrates the power of the Assessment of Care tool. A long-time patient (one of those beloved patients who brought our whole team holiday gifts each year) came in for a routine visit for depression. When we invited her to use the tool, her symptoms were in full remission. She had been receiving electroconvulsive therapy, a treatment that she proclaimed had saved her life, and had experienced only minimal side effects. That day, however, she scored her experience of the safety of our care at only 50%. We were shocked — how could that be? At first, she placated us and dismissed our questions, but then she went on to describe how she never felt completely safe being put to sleep while wearing a flimsy hospital gown in a room full of (mostly) men.
This interaction poignantly demonstrated a clear dissociation between her symptom severity and her sense of safety. It also helped us to identify an opportunity for improvement that had not occurred to us but was obvious to her — specifically, to invite her husband to observe and participate in the procedure. Doing so empowered him to observe the care team in action and then ask simple yet radical questions such as “Why not? Why can’t care be delivered that way? Why can’t the health care system do more to make my wife feel safer?” This approach yielded powerful results as he took the lead role in designing the care for his wife. We have since offered all of our patients the opportunity to have loved ones join them in the procedure room and recovery area, an approach that is being spread internationally.
A second lesson that we have learned is that the effectiveness of care can be quite simple to measure from the patient’s perspective. In our experience, patient scores on the single visual analog scale for effectiveness (the second line on the Assessment of Care) have reflected the same degree of clinical improvement as indicated by more detailed patient-reported or clinician-administered outcome measures. Moreover, the single effectiveness score communicates this information more concisely and focuses on what is most important to the patient (e.g., symptom resolution vs. improved functioning), ultimately resulting in more efficient care delivery. For example, the single effectiveness measure of the Assessment of Care tool has replaced six standardized rating scales that were used routinely for outcomes measurement. This substitution saves valuable time, improving the overall experience of care for patients and strengthening the business case for those who worry that adopting the tool will extend visit durations and reduce revenue.
A third lesson, also related to patient-centeredness, is that patients may differ in their definitions of “perfect care” and that such definitions may change over time. For example, although encouraging patients to invite a loved one to participate in their electroconvulsive therapy is beneficial in many cases, not every patient prefers this option. Similarly, some patients prefer more autonomy over the physical environment of care (e.g., “Doctor, will you please silence the beeping alarm and let me hold the oxygen mask?”), whereas others prefer a more paternalistic interaction (e.g., “Doctor, you do what you think is best for me.”).
The Assessment of Care tool easily and rapidly captures these individual differences between patients, especially if it is utilized during each and every patient encounter. This approach creates a cadence of accountability — for both the providers and the patient — as well as ongoing opportunities to refine the shared definition of “perfect care.” For example, even minor decreases in scores that reflect a generally positive assessment (e.g., a change from 100 to 90) can signal an opportunity to tweak, in real time, a process that matters to a particular patient. As a result, the quality bar is raised, rather than the effort regressing to “nibbling around the edges.”
Finally, our patients have made it clear how pleased they are to be invited to use the Assessment of Care tool. Adoption has been easy. Many, although not all, of our patients and their loved ones actively use the tool as a means of codesigning their care. The Assessment of Care tool keeps our focus on the individual patient and the outcomes that matter to them, while simultaneously cultivating our agility to respond promptly to individual circumstances and preferences as they change over time. In the end, the process of using the tool becomes as meaningful as the data that it produces.
An early and important hurdle that we overcame was the tendency to treat our use of the Assessment of Care tool as additional work separate from our clinical workflow; doing so made our efforts to achieve real-time improvements in value feel like add-ons to or delays in our “real work.” We were able to achieve the level of efficiency necessary to hardwire the use of the tool only after we began reconfiguring our processes to incorporate the data gathered from it.
This work was iterative and continual, demonstrating to our team “the relentless hard work of operational redesign.” One important reconfiguration involved replacing time-consuming parts of the visit (e.g., detailed clinician-administered symptom-severity scales) with the Assessment of Care. Another involved moving the tool from paper to a digital format that patients could access remotely before each visit and that allowed providers to trend data visually over time. Given the simplicity of the Assessment of Care tool, internal IT personnel were able to complete the technical build of the tool with minimal up-front cost.
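The trending capability of a digital version can be as simple as one record per encounter. The sketch below shows one way such data might be organized for plotting over time; it is an assumption for illustration, not a description of the team's actual build, and the field names are hypothetical.

```python
from datetime import date

# One record per encounter: visit date plus 0-100 mm scores
# (only two of the six dimensions shown, for brevity).
visits = [
    {"date": date(2023, 3, 21), "safety": 96, "effectiveness": 97},
    {"date": date(2023, 1, 10), "safety": 50, "effectiveness": 92},
    {"date": date(2023, 2, 14), "safety": 88, "effectiveness": 95},
]

def trend(visits, dimension):
    """Return (date, score) pairs for one dimension in
    chronological order, ready to plot or review at a glance."""
    ordered = sorted(visits, key=lambda v: v["date"])
    return [(v["date"], v[dimension]) for v in ordered]

for when, score in trend(visits, "safety"):
    print(when, score)
```

Because each encounter is a self-contained record, a provider can pull up a single dimension's trajectory before the visit, which mirrors the remote, pre-visit access described above.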
Where to Start
The Assessment of Care is easy enough for providers to start using immediately. Nevertheless, it is important to begin the process by engaging the care team in a discussion of patient-centered care in order to develop a shared vision for care quality that focuses on patients and the outcomes that matter most to them.