Program Integrity

Meeting #2

Date: October 14, 2016 1:00 PM

Location: School of Public Health, 1 University Place, Rensselaer NY 12144

Attendees
Overview

This was the second meeting of the Value Based Payment (VBP) Program Integrity (PI) workgroup. The purpose of the meeting was to review the discussion on data quality held during meeting #1, to confirm the draft recommendations on that topic, and to dive deeper into policy design issues related to VBP, including creating draft recommendations in that area.

The Agenda for this meeting included:

  1. Welcome and Introductions
  2. Data Quality Refresher
  3. Data Quality Draft Recommendations
  4. Policy Design Deep Dive
  5. Policy Design Options for Consideration

Key Discussion Points (reference the slide deck "PI Workgroup – Policy Design")

1. Welcome and Introductions

The PI workgroup co-chairs and DOH Sponsor opened the meeting by explaining the steps leading up to this meeting, including the creation of draft recommendations that incorporated comments received from workgroup members during the prior meeting.

Co-chair Jeff Gold emphasized the need to fulfill the workgroup's charge of providing policy recommendations to the State, coming as close to consensus as possible. Reaching consensus will mean that the workgroup's voice is appropriately considered before NYS makes final decisions in this area.

The co-chairs also stressed that the workgroup is receptive to any written feedback from constituent organizations following any meeting.

2. Data Quality Refresher

The group received a refresher on data quality, including VBP-specific data issues and encounter data error rates. The discussion quickly turned to a review of the draft recommendations on this topic.

3. Data Quality Draft Recommendations

The workgroup raised the question of whether the draft recommendation related to data completeness was limited to Fraud, Waste & Abuse measurement or also included a program oversight component. The group decided that the final recommendation could be broadened to include both.

Ultimately, the data underlying both areas feeds into multiple agencies, and the recommendation should apply to oversight, enforcement, and policy components.

The workgroup reached consensus that the recommendations should reflect a strengthening of the encounter data validation process. Similarly, members of the group stated that they would like to see a recommendation calling for 100% completeness of encounter data and directing the State to reject encounters when data is missing. The question was raised whether all plans are using the same template to communicate claims requirements to providers. If there is significant variation in the claims data submitted by different providers to plans, the workgroup noted that standardization of these requirements could be included as a recommendation.

Policy Question A: Can the existing reporting and enforcement process be leveraged more effectively in support of VBP?

The workgroup considered recommendation A-I:

"NYS and Health Plans should formalize protocols for Health Plan Special Investigative Units´ (SIU) review of provider–submitted claims specifically for VBP contractors. In support of this effort, certain State oversight authorities should be delegated to the Plans"

The workgroup noted that "delegation" should be strengthened as plans are in fact required to do these tasks and there is a need to maintain the State´s role in the process.

Also in reference to A-I, the workgroup noted that the recommendation should be expanded to include the plans' compliance programs and their obligations; the compliance program requirements allow for communication between the SIUs and other entities. It was noted that adding this element would provide flexibility for the internal oversight process as VBP is implemented.

The workgroup considered recommendation A-1a:

"SIUs should focus their investigative efforts more intensely on VBP contractors due to the possibility of greater challenges associated with the transition to VBP"

The workgroup noted that there needs to be consensus on the "attitude" taken in this process. Fundamental principles need to be clarified and established as parameters. Historically, the approach has been aggressive and punitive; for VBP, it is recommended that the intent be clarified. It was noted that NYS should handle issues differently for providers who have good intentions but may lack the sophistication to comply with all VBP-related data quality policies. In the absence of robust and reliable data reporting, it was noted that the State should consider a softer approach (i.e., corrective action as opposed to sanctions). This approach could be described as a "right to cure" mentality: a period to correct errors, or amnesty for well-intentioned providers. Such an approach would address provider concerns regarding the overall shift to VBP and the associated program integrity initiatives. Another reaction to recommendation A-1a was that VBP does not necessarily require more intensive review; instead, it requires "different" review.

The workgroup considered recommendation A-1b:

"protocols should seek to ensure accuracy and completeness of claims and other data associated with both retrospective and prospective VBP"

The workgroup's discussion focused on the formalization of protocols for the submission of encounter data. The workgroup noted that reporting should be consistent and that plans need objective communication regarding their reporting quality and general expectations.

The workgroup considered recommendation A-IIa:

"Perform an evaluation of the current Encounter Intake System, with a focus on supporting VBP program integrity. Special consideration should be given to data elements and measures that are integral to VBP by adding new edits or adjusting the encounter intake process"

The workgroup determined that "if necessary and based on the situation" should be added as a caveat. The group agreed that medical necessity reviews probably do not need to be added to the recommendation, since they are not specifically related to VBP and likely fall under the responsibility of the Plans rather than an Encounter Intake System (EIS) IT program.

Under the recommendations for part A-II ("current state assessment and future state design of an encounter intake system"), the workgroup noted that a sub-bullet (d) should be added stating that the entity performing the assessment will "announce findings of assessment for stakeholder input and intention of findings." In that case, the workgroup suggested that the State perform the assessment and report out its findings along with potential changes or edits that could be added or modified within the encounter intake process. This feedback process would allow for input from plans and providers before modifications are made.

Policy Question B: Aside from the encounter data, are there other sources of data, or potential enhancements to data sources, that could potentially serve to ensure that NYS is able to collect high quality submissions?

The workgroup considered recommendation B-I:

"The State´s data protocol should compare encounter data against non–encounter data for VBP quality and efficiency–related fields"

The workgroup noted that the phrase "non-encounter data" should be changed to "other relevant data," since there is no clear consensus on what constitutes non-encounter data.

There was discussion around potential inconsistencies between encounter data and other relevant data. It was acknowledged that neither data set can always be assumed to be the source of truth; rather, outliers should be identified and reviewed after comparison against multiple data sets. Similarly, there was hesitation in the room regarding the use of other clinical data (specifically RHIO data) to assess for divergent behavior, because this other data is not necessarily more reliable. If RHIO data were used for punitive action, providers might be discouraged from voluntarily submitting data to the RHIOs. Instead, it was noted that OMIG should establish consistent standards by which it carries out enforcement actions under VBP. OMH suggested using cross-data comparisons to look at outliers and to direct in-depth audits, but not necessarily to enact punitive measures. Ultimately, the workgroup agreed that, initially, the data should be used only for monitoring purposes: to identify outliers and guide audits.

The proposed language outlining the use of secondary data is: "development of the metrics, following up with providers based on outliers identified, and only enact punitive measures when all parties feel comfortable with data."

4. Policy Design Deep Dive

As the group began the discussion of policy design, the co-chairs emphasized the intent of the policy design work and the need to focus on "cure, correct, adjust" as opposed to "game and enforce." During the review of the policy design outline there was a brief discussion that low cost cannot be considered in a vacuum; a better description would be "efficient use of resources." There was also a high-level discussion of value and the nexus between cost and quality. Some workgroup members noted that "quality" should include patient satisfaction. It was acknowledged that under VBP, reimbursement is concerned less with units of service than with the outcomes of attributed patients. Monitoring outliers or clusters of providers on the cost/quality matrix helps identify best and worst practices and allows the State to drive parties toward the delivery of high-value care.

5. Policy Design Options for Consideration

Policy Question A: What framework should be put in place to ensure that the transition to VBP does not create incentives contrary to the spirit of the program?

The discussion began by identifying the potential risk that, under VBP, providers may have a perverse incentive to withhold services that would be beneficial to patients. In monitoring efforts going forward, it was noted that NYS should look for sudden drops in service delivery to consumers, essentially tracking whether people are receiving more or fewer services. Similarly, members of the group advocated that consumer experience be added to the recommendations, because patient experience helps in predicting and driving outcomes.

The group also discussed the issue of providers "cherry-picking" populations. To the extent that there are hard-to-serve and/or non-compliant populations, the workgroup noted that enforcement agencies should be mindful of instances where providers have an incentive to pick "easier" populations in order to achieve greater shared savings.

The workgroup noted that the attribution process, as well as quality measures, should to some extent guard against cherry-picking, but that further investigation may be needed into the ability to appropriately and accurately assess and quantify true patient outcomes. The DOH and OMH complaint units were raised as potential units that could catch "cherry pickers."

Materials distributed during the meeting:
  • VBP PI Issue Brief: This document details the high-level policy questions related to program integrity as well as an overview of the current state of data quality, program design, and risk management efforts.
  • PI Workgroup – Policy Design: A presentation deck of policy questions and options for data quality as it relates to VBP program integrity. Includes a background on VBP concepts and PI-specific VBP considerations.
  • Supplemental Information – SIUs: This document provides additional background on Special Investigative Units (SIUs), including model contract requirements and benchmarking information gathered from other states.
  • PI Workgroup #1 – Draft Recommendations: Draft recommendations related to data quality gathered from workgroup discussions during the first workgroup session on September 6, 2016.
Key Decisions

The WG made decisions on the following key points during meeting #2:

  • KPMG will send recommendations to the group by 11/7/16, with written feedback from workgroup participants due by 11/15/16. A list of final recommendations will be sent to participants two days before the third workgroup meeting on November 16.
Action Items:
  • Rewrite draft language for the recommendations around data quality and circulate it to workgroup participants. Workgroup participants are to respond with written comments prior to the next meeting.
  • Send attribution guidance documents to group in advance of next meeting.
  • Investigate how to retrospectively recoup losses when putting someone at a loss.
Conclusion

The next workgroup meeting will be held in Albany on November 16, 2016 and will include:

  1. A discussion of the revised Data Quality Recommendations, incorporating WG members' comments on Data Quality
  2. A discussion of the Policy Design Recommendations, including modifications resulting from stakeholder feedback received prior to the meeting
  3. An introduction to new WG topics, including:
    1. Risk Management
    2. Fraud, Waste & Abuse