There are 198 measures available for PQRS Registry Reporting in 2016—it seems like there should be enough options to select the most advantageous measures for providers. Unfortunately, that’s not the case. Even though there are a whopping 21 measures that could be skipped for every one reported, and twice as many National Quality Strategy (NQS) Domains as needed, not everyone will be able to report on 9 measures across 3 Domains (including a cross-cutting measure!).

Why is it so difficult to meet basic reporting requirements? Because measures are not distributed evenly across specialties, nor are they distributed evenly within the NQS Domains that must also be satisfied for reporting. These dual reporting requirements leave certain providers with fewer options than others. For some, the smaller pool of available measures puts them between the proverbial rock and hard place: report on a measure where your performance is poor and risk penalties and dubious recognition on Physician Compare, or fail to report and incur penalties. For others, it is simply not possible to meet the full requirement, even if they report comprehensively on every available measure.

How the MAV Process Is Intended as the Equalizer—But Falls Short

To avoid penalizing providers who reported on everything possible, CMS designed a process called Measure Applicability Validation (MAV). In this process, CMS examines data for individuals and Group Practices who reported less than a full submission (9 measures, 3 Domains, 1 cross-cutting measure), then analyzes what was submitted against its own claims data to determine whether there were other measures that could have been reported. If there were, the provider or group fails PQRS and potentially incurs Value-Based Payment Modifier (VBPM or VM) penalties as well. If CMS concludes that the provider or group reported on all feasible measures, then the provider or group is held harmless (neutral payment adjustment for PQRS).
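For illustration only, the decision flow can be sketched roughly as follows. This is a simplified sketch in Python based on the reporting thresholds described above; the function names and data structures are hypothetical and are not CMS’s actual logic.

```python
# Simplified sketch of the MAV decision flow described above; not CMS's actual
# implementation. A "submission" is assumed to be a list of reported measures,
# each tagged with its NQS domain and a cross-cutting flag.

def is_full_submission(measures):
    """Full PQRS submission: at least 9 measures, 3 NQS Domains, 1 cross-cutting measure."""
    domains = {m["domain"] for m in measures}
    has_cross_cutting = any(m["cross_cutting"] for m in measures)
    return len(measures) >= 9 and len(domains) >= 3 and has_cross_cutting

def mav_outcome(measures, other_reportable_measures_found):
    """other_reportable_measures_found stands in for CMS's claims-based review."""
    if is_full_submission(measures):
        return "PQRS requirement met; no MAV review"
    if other_reportable_measures_found:
        return "Fails PQRS; VBPM penalties possible"
    return "Held harmless (neutral PQRS payment adjustment)"
```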

To determine whether there were other measures that could have been reported, CMS performs what is called a Clinical Relation/Domain Test, rather than using the individual measure specifications. If a provider reported on a measure focused on a particular topic (e.g., diabetes), it is assumed that the provider could also have reported on other measures related to diabetes. These groups of related measures are referred to as “clusters.” This makes logical sense, but in practice it is flawed. Why? Because measures may sit in the same cluster yet have different patient eligibility criteria defined in their measure specifications. The result is that providers may be “on the hook” for a measure even though there’s no way to report it.
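In effect, the test expands each reported measure to its whole cluster and asks whether the unreported cluster members look reportable in CMS’s claims data. A hypothetical sketch, with cluster names and contents invented purely for illustration:

```python
# Hypothetical cluster map; the names and contents are invented for illustration.
CLUSTERS = {
    "diabetes_hba1c_control": {"diabetes_hba1c_control", "diabetes_eye_exam", "diabetes_foot_exam"},
    "falls_risk_screening": {"falls_risk_screening", "falls_plan_of_care"},
}

def clinical_relation_test(reported, claims_suggest_reportable):
    """Return unreported cluster members that CMS would expect to see.

    claims_suggest_reportable(measure) stands in for CMS's claims analysis."""
    missed = set()
    for measure in reported:
        for related in CLUSTERS.get(measure, {measure}) - set(reported):
            if claims_suggest_reportable(related):
                missed.add(related)
    return missed  # a non-empty result means the provider fails MAV
```

The flaw sits in that last step: in practice, the cluster relationship, rather than each measure’s own specification, drives the judgment that a related measure “could have been reported.”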

MAV Measure Clusters: Technical Flaws Can Create Bad Surprises for Providers

There are several ways that inconsistent clusters may lead to trouble. Some examples:

The clusters for Falls Risk and Urinary Incontinence each contain two measures: a screening measure and a treatment measure. But the treatment measure applies only to patients who screened positive in the first measure; if screening is negative, the treatment aspect becomes moot. So, a provider may screen patients for urinary incontinence or for future falls risk and find no instances of either. That means zero patients are eligible for the treatment measures. Nevertheless, because of the way the measures are clustered, that provider is at risk of failing MAV.
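A minimal sketch of why this happens, with the screening/treatment logic paraphrased rather than taken from the official measure specifications:

```python
# Paraphrased eligibility logic for a screening/treatment measure pair; not the
# official measure specifications.
def eligible_for_screening(patient):
    return True  # any qualifying visit can be screened

def eligible_for_treatment(patient):
    return patient["screened_positive"]  # only positive screens enter the denominator

patients = [{"screened_positive": False} for _ in range(200)]  # every screen was negative
treatment_denominator = sum(eligible_for_treatment(p) for p in patients)
print(treatment_denominator)  # 0, yet MAV still treats the treatment measure as reportable
```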

In other instances, the measures sound linked, but the denominators are significantly different. There are two measures in the Stroke Care cluster. However, one of them includes emergency department care in the denominator, while the other does not. In this case, an emergency department provider could trigger one of the measures on a daily basis but never trigger the other. Yes, each measure says “Stroke and Stroke Rehabilitation” in its title, but having denominator-eligible patients in one measure does not mean that a provider will have patients eligible for the other.

The Immunization Care cluster is perhaps the most incongruous of them all, even though it comprises only two measures: an influenza immunization measure and a pneumococcal vaccination measure. The first problem is the age criteria. The influenza immunization measure is for patients aged 6 months and older; the pneumococcal vaccination measure is only for patients aged 65 years and over. So, anyone between 6 months and 64 years could never qualify for the pneumococcal vaccination measure. The second problem is timing. Patients are only eligible for the influenza immunization measure during flu season. So, a patient may be 65 years old, but if the patient is seen in June, they won’t trigger the influenza immunization measure, yet they will still be eligible for the pneumococcal vaccination measure.
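Put into code, the two denominators barely overlap. The age cutoffs come from the measure descriptions above; the flu-season window is approximate, and the helper names are invented for this sketch:

```python
from datetime import date

FLU_SEASON_MONTHS = {1, 2, 3, 10, 11, 12}  # approximate flu-season window

def eligible_for_influenza_measure(age_years, visit_date):
    # Influenza immunization: patients aged 6 months and older, seen during flu season
    return age_years >= 0.5 and visit_date.month in FLU_SEASON_MONTHS

def eligible_for_pneumococcal_measure(age_years):
    # Pneumococcal vaccination: patients aged 65 and over, any time of year
    return age_years >= 65

# A 40-year-old seen in January triggers only the influenza measure:
print(eligible_for_influenza_measure(40, date(2016, 1, 15)),
      eligible_for_pneumococcal_measure(40))   # True False

# A 70-year-old seen in June triggers only the pneumococcal measure:
print(eligible_for_influenza_measure(70, date(2016, 6, 15)),
      eligible_for_pneumococcal_measure(70))   # False True
```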

Remember, these problems arise only when providers have not met the full reporting requirements and when no other cluster is also in play. In other words, a provider who reports only on tobacco use and urinary incontinence will fail MAV, but not necessarily because of the Urinary Incontinence cluster: if the provider could report on the tobacco cessation measure, plenty of other measures would also have been available.

Your Recourse if MAV Mistreats You

First, if you have been penalized based on a situation like one of those described here, you should file for an Informal Review. You have 60 days from the release of your prior year performance data and Feedback Reports. This year, the deadline is November 30, 2016. Start here.

If you’re concerned that this may apply to you in the future, you should raise the issue with your specialty society and with CMS. CMS takes stakeholder feedback seriously, as evidenced by the differences between what was proposed and what was finalized for MACRA and the Quality Payment Program in 2017. Even though the Rule has been published, we are still within the 60-day period in which the Department of Health and Human Services (HHS) will accept comments (the rule was published on October 14, 2016). Comments may be submitted electronically here. For a more hands-on experience, CMS has developed the Measures Management System (MMS), which allows users to suggest measures, provide technical input, and offer comments.

Second, you need to be able to predict your likelihood of success under MAV and ensure that you are reporting to your maximum capability. The best way to do this is through Registry Reporting, so that you can use the results. A Registry with consultation capabilities may also be able to help. ICLOPS provides services to its clients through the Registry, evaluating all measures on an ongoing basis to help guide your measure choices.

What Shouldn’t You Do?

Don’t try to dodge reporting requirements through clever use of the Clinical Relation/Domain Test and clusters. With its claims data as a backdrop, CMS has no trouble determining when people are trying to “game the system.” Doing so also casts your partner (Registry, QCDR, EHR) in an unfavorable light with CMS, which has repeatedly stated that the purpose of this program is to improve patient care through quality measurement and evaluation. When reporting results don’t help to fulfill that goal, no one wins.

Founded in 2002, ICLOPS has pioneered data registry solutions for performance improvement in health care. Our industry experts provide comprehensive Solutions that help you both report and improve your performance. ICLOPS is a CMS Qualified Clinical Data Registry.

Image Credit: Markus Spiske