With only a month left of 2017, practices should be wrapping up their Improvement Activities. MIPS requires at least 90 consecutive days of participation in order for a group or clinician to attest that an Improvement Activity is complete—meaning that the last day to start was October 2. The Improvement Activity portion of MIPS is the only component that is not a direct descendant of a previous program, increasing the challenge of implementation.
Recently, we attended a national conference for those in healthcare practice and administration; one of our goals was to learn more about how practices were adapting to this new requirement. We had the right audience at the right time, shortly after the October 2 start deadline. Everybody should have something to share, we thought. As the convention hall cleared, however, we knew we’d learned something valuable, but not what we had expected:
Improvement Activities are still not understood and, therefore, neither is MIPS—a stunning finding that underscores the need for more and better education and strategic planning.
In advance of the convention, we created a short (10 question) survey for attendees to complete at our exhibit booth. We offered a nominal gift card as an enticement and were able to persuade about 30 attendees to complete the survey. Loyal readers will recognize the value of small group samples. While not enough for a scientific survey, a sample can help spot potential trends and inform strategic planning in advance of a full roll-out.
The shift to performance improvement is not easy, but it’s an essential step for all providers to take in order to prosper under Value-Based Health Care. MIPS Improvement Activities provide a means to begin that transition—with a significant scoring incentive. Our survey uncovered some common misunderstandings and gaps in knowledge; those themes inform these five lessons for implementing successful Improvement Activities now and in the future:
Lesson 1: Understand the Relationship between Improvement Activities and MIPS
Upon compiling responses to the first survey question, we were startled to learn that fewer than a third of respondents recognized the term “Improvement Activity” as it relates to MIPS, even with verbal prompting. When asked to summarize their MIPS Improvement Activity in one sentence, the majority offered answers ranging from professional titles and corporate mission statements to “working on MIPS projects” and one very disheartening “revenue enhancement.”
Improvement Activities are one of four components of MIPS, of which only three are scored. Those who neglect Improvement Activities are ceding at least 15 percent of their MIPS composite score. Why “at least”? Because when used strategically, Improvement Activities can pay dividends in other components of MIPS, improving outcomes (and, therefore, scores) on quality measures while decreasing costs. Even more importantly, Improvement Activities can lead to better care for your patients.
Lesson 2: Establish a Goal, and Work Toward It
We were also surprised that only half of those who could summarize their Improvement Activity could also describe their goal and the actions taken to achieve it. Fewer still could identify how the goal would be measured. In other words, they knew what they wanted to address, but had established neither a plan for getting there nor a way to prove it. One response characterized the problem perfectly: the method for achieving the goal was to “look at quality improvement.” Without defining a goal, establishing a plan for reaching that goal, and measuring whether that plan led to the desired results, improvement is unlikely to occur and impossible to demonstrate.
It may seem daunting to set a goal before knowing whether it’s feasible, but this is where a comprehensive (but flexible) Qualified Clinical Data Registry (QCDR) can help. By testing initiatives with small patient samples, you can preview results and adjust processes as issues surface—and do so before you roll out the program to everyone.
Lesson 3: Ensure Everyone Understands Their Role—and the Goal
Some of our respondents came from the same practices. In theory, as long as they were writing about the same activity, each should provide similar answers, right? Surprisingly, no. Not only was the concept of Improvement Activities misunderstood, but the responses also highlighted a lack of internal communication among players. One individual indicated that her group was not reporting data to any outside entity, but her colleague stated that the group was reporting to two. In another instance, two individuals rated their group’s success differently, even though both indicated that their practice was focusing on surgical procedures.
If different people within the same practice can neither describe the goal nor their responsibility in achieving that goal, there’s no momentum, and the program will grind to a halt. Certain programs, including the Patient-Centered Medical Home model, require that the whole practice understand the goal and their roles. This is not an off-hand comparison—those in accredited PCMHs earn full points in the Improvement Activity category. Clear communication about goals, roles and strategies is essential to achieving real improvement.
Lesson 4: Measure Success Using Quantified Results
On a scale of 1 to 4, with “1” being “Not Successful” and “4” being “Very Successful,” nearly all respondents picked “2” when asked to rate their Improvement Activity efforts. Regardless of whether the respondent could describe an Improvement Activity or responded “in general” to their practice’s ability to improve, the vast majority believed their efforts to be on the wrong side of the curve. No one answered with a “1” and only one respondent selected “4.”
Why such an overrepresentation of the mildly negative response, and the near-complete absence of extreme answers at either end? Look at the previous lessons, and it becomes clear: without a defined goal, an understanding of how to reach it, and a clear sense of responsibility for doing so, no one is certain of the answer. Think back to school—if you didn’t want to be called on, how would you react to your teacher’s question? Would you make eye contact with the teacher, or stare intently at your desk?
To answer with confidence—meaning that you have a defined program in place, with a specific goal, and you can confirm whether it’s been met—you need concrete results. A QCDR designed to track outcomes (clinical, costs, satisfaction and more) over time gives you this ability. Whether the results are positive, negative or neutral, being able to contextualize those results will illuminate the next step on the path to reaching your goal.
Lesson 5: Let Your Interests (and Needs) Guide Your Selections
The most encouraging responses to the survey came at the end, where respondents were asked to select from a set of broad topics of future interest for Improvement Activities (e.g., improving clinical outcomes, retaining patients). With a few exceptions (one person—yes, the same whose goal was “revenue enhancement”—wrote in “none of the above”), respondents indicated interest in at least one of these categories:
- Improving clinical outcomes (selected by two-thirds of the sample);
- Reducing costs for certain episodes of care;
- Identifying high risk patients;
- Retaining patients;
- Improving communication between patients and providers.
There is an inherent desire to improve. We daresay that one of the reasons most of us entered this field, whether as clinicians or to support high-quality care in other ways, is to ensure that patients receive the best care. Inevitably, some of us will become patients, as will our families and friends; improved outcomes and increased efficiency will benefit each of us, too.
Identifying opportunities for improvement is the first step. Using tools that CMS and other health plans have provided, including Quality and Resource Use Reports (QRURs) and Episodic Cost Field Test Reports, can help you isolate those areas. In the same way that a test question that everyone answers correctly is not useful, neither is an Improvement Activity that doesn’t target a meaningful area.
Yes, we are past the October 2 deadline to begin, but in this transition year of MIPS, starting now can still make a difference for next year—and also for your patients.
Founded as ICLOPS in 2002, Roji Health Intelligence guides health care systems, providers and patients on the path to better health through Solutions that help providers improve their value and succeed in Risk. Roji Health Intelligence is a CMS Qualified Clinical Data Registry.
Image Credit: Ryan McGuire