Azara Healthcare and Baltimore Medical System will present the educational panel "Show Me The Money!: Secrets for Performance Improvement with Data-Driven FQHC Practice Transformation" on Aug. 26 at the National Association of Community Health Centers' Community Health Institute & Expo (CHI) in San Diego. Portions of the presentation are highlighted in the following case study.
The credibility of data from a reporting tool, when presented to healthcare providers, is the foundation for successful quality improvement in any organization. Practice leadership should expect that data to be challenged, and be prepared to defend it. Providers are apt to blame the tool before taking the time to examine the factors behind lower-than-expected performance scores. Many quality improvement programs fall apart at this early stage because management cannot back up those scores with well-understood data and a firm but supportive process for addressing providers’ concerns. Only organizations willing to explore and understand the reasons behind performance, good or bad, and to support staff through process change, can achieve and sustain improved scores.
Here’s one example
A diligent, thorough nurse practitioner at Baltimore Medical System (BMS) insisted that the report on her performance managing diabetic patients must be wrong. She prided herself on providing excellent care to her patients, and yet the report revealed only 39 percent of her diabetic patients were considered “controlled,” meaning their A1c score was less than 9.0. In the measure definition, patients who have not received the test within the required time period are also automatically considered to be above 9.0.
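The measure logic described above can be sketched in a few lines of code. This is a hypothetical illustration of the rule as the article states it, not DRVS's actual implementation: a patient counts as "controlled" only if an A1c result exists, falls within the measurement period, and is below 9.0; missing or out-of-period results are counted as uncontrolled by definition. The one-year look-back window is an assumption for the example.

```python
from datetime import date, timedelta

MEASUREMENT_PERIOD = timedelta(days=365)  # assumed look-back window

def is_controlled(a1c_value, test_date, as_of):
    """True only for an in-period A1c result below 9.0."""
    if a1c_value is None or test_date is None:
        return False  # never tested (or not in a structured field) -> uncontrolled
    if as_of - test_date > MEASUREMENT_PERIOD:
        return False  # result too old -> counted as uncontrolled
    return a1c_value < 9.0

def controlled_rate(panel, as_of):
    """Percent of diabetic patients in the panel who are controlled."""
    controlled = sum(
        is_controlled(p.get("a1c"), p.get("test_date"), as_of) for p in panel
    )
    return 100.0 * controlled / len(panel)

# Illustrative panel: only the first patient counts as controlled.
panel = [
    {"a1c": 7.2, "test_date": date(2015, 3, 1)},   # in period, below 9.0
    {"a1c": 8.5, "test_date": date(2013, 1, 1)},   # result too old
    {"a1c": None, "test_date": None},              # result on paper, not in the EHR
]
print(round(controlled_rate(panel, date(2015, 6, 1)), 1))  # 33.3
```

Note how the second and third patients drag the rate down even though one of them may in fact be well controlled: that is exactly the gap the nurse practitioner uncovered in her panel.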
Jessica Boyd, MD, chief medical officer at the federally qualified health center (FQHC), agreed the nurse provided excellent care, but remained firm that the number, produced by the Azara DRVS data analytics and reporting tool, was accurate. She had run the report herself and checked to be certain the patient data reflected in the EHR from the nurse practitioner’s panel matched the report before she showed it to her colleague.
Determined to understand the reason for the low score, the nurse practitioner went into the DRVS patient registry, pulled up all of her diabetic patients and reviewed each one, line by line. Simple reasons explained her score: some patients had not gotten lab tests, while others had results recorded on paper that were never entered into the EHR. Because those results were not captured in a structured field within the EHR, the system grouped these patients with untested patients and therefore counted them as “uncontrolled.”
By fixing the simple things behind her low score, particularly ensuring that all A1c results were recorded in a structured field, the nurse practitioner boosted her score from 39 percent to over 80 percent in a relatively short period. Using this significant increase as a source of inspiration for her peers, she helped implement a similar program to boost scores throughout the center.
She became a DRVS convert and embarked on a mission to teach the rest of the organization.
Dr. Boyd uses this anecdote to demonstrate the kind of improvements BMS has achieved since implementing the DRVS platform in early 2013. The reporting and analytics solution allows the center to make its data “actionable” and provides an opportunity for the hard-working staff members to enjoy a sense of accomplishment by tracking their performance on numerous patient care measures.
“I’ll just say from the medical perspective, we were trying to create a quality program, and we felt as if we didn’t reliably know where we were, then we couldn’t figure out where we needed to be,” said Boyd. “We really found that we wanted to have reliable data to give to our providers so we could move our program forward and make progress; DRVS gave us that.”
Picking up where the EHR leaves off
Founded in 1984, BMS is Maryland’s largest FQHC, serving roughly 45,000 patients across six sites. Although it employs a data programmer, the number of reports the center must produce can be overwhelming, as can the challenge of keeping up with continuously evolving measure standards. And, though the center uses a strong EHR system, Dr. Boyd discovered the reporting provided with the EHR wasn’t robust or granular enough to meet her goals.
Michele Lagana, chief financial and chief information officer at BMS, said the center knew the focus on quality would only increase, and she wanted a reporting tool that could help them move the needle quickly.
“We were very taken when we learned how DRVS pre-packaged reports (about 65 at the time) would allow BMS to conduct reporting at the center, site, and patient level,” she said. “The mapping process from our EHR system to the DRVS system allowed us to capture data from multiple input locations throughout the record, giving every chance to include the data on the array of reports.”
BMS gained success by training small groups and individuals across the organization to use DRVS. “Super-users” could then go out and train new clinicians. DRVS has become so integral to the center’s operation that it is now part of the orientation process for new hires.
“Clinical staff can run reports themselves, and the system is easy enough to use, so anyone can do it. It has really helped us along with our quality initiatives,” said Lagana.
Prior to the DRVS installation, the EHR-provided reports collected data only from certain fields or templates within the EHR. However, Azara drew on its experience working with other health centers that use the same EHR and knew how to extract valuable BMS data that might be “hiding.”
“That’s a big deal because it allows a provider to document data by using his or her preferred workflow,” said Boyd. “Doctors want the information to be right there when they want it, where they want it. We were really pleased to be able to adapt the DRVS system to accommodate our workflow.”
Azara conducted the upfront work to map and collect the necessary data, and then worked with BMS until they were satisfied it had been captured accurately and completely. This meant BMS conducted an exhaustive validation process to begin to build the credibility foundation. Physicians and health IT staff examined all the reports and validated each one before making it available to the medical team.
BMS also uses DRVS to present providers with information about the quality of care they provide; these scorecards are released on a quarterly basis.
Boyd said the center’s quality measurement graphs had plateaued prior to the DRVS implementation, but have since climbed so quickly that targets for the current year have already been surpassed.
Boosting performance, coordination and the “bottom line”
DRVS has also become an integral piece of BMS’ day-to-day operations. The patient registries and visit planning report tool fit in well with the patient centered medical home (PCMH) model. The medical assistants (MAs) use DRVS to pull reports daily or weekly for providers, and they can view all the patients scheduled for visits. The report performs a virtual chart review, saving the MA significant time by identifying any preventative care needs that should be addressed for each patient.
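The "virtual chart review" described above amounts to comparing each scheduled patient's service history against due intervals and flagging gaps. The sketch below is a hypothetical illustration of that idea; the service names and intervals are assumptions for the example, not DRVS's actual rules.

```python
from datetime import date, timedelta

# Assumed due intervals for a few preventive services (illustrative only).
DUE_INTERVALS = {
    "flu_immunization": timedelta(days=365),
    "a1c_test": timedelta(days=180),
    "weight_assessment": timedelta(days=365),
}

def care_gaps(last_done, today):
    """Return the services that are missing or past due for one patient."""
    gaps = []
    for service, interval in DUE_INTERVALS.items():
        last = last_done.get(service)
        if last is None or today - last > interval:
            gaps.append(service)
    return gaps

# One patient from today's schedule: flu shot is current, A1c is past due,
# and no weight assessment is on record.
schedule = {
    "patient_a": {"flu_immunization": date(2015, 1, 10), "a1c_test": date(2014, 6, 1)},
}
for patient, history in schedule.items():
    print(patient, care_gaps(history, date(2015, 6, 2)))
# patient_a ['a1c_test', 'weight_assessment']
```

Running a check like this across the day's schedule is what lets a medical assistant walk into the morning huddle already knowing which patients need immunizations or overdue tests.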
“When we huddle each morning, my MA says, ‘don’t forget this patient needs their immunizations’ or whatever else is due. Our medical assistants help us stay on top of our quality, and we really share our results. When my results on something like weight assessment went from 10-20 percent up to 50-60 percent, my MA was proud,” said Boyd, because the MA knew her efforts had contributed to that result.
The DRVS registry reports are also key to efficient population management. Nurses use them to identify patients whose chronic illnesses are not well managed, make outreach phone calls, schedule follow-up visits, and know which services are needed or overdue.
The DRVS platform also includes a meaningful use dashboard – a crisp visual interface that shows the center’s leadership and providers how well they are meeting specific metrics within the federal incentive program. Providers can “drill down” from the aggregate score into the patient detail level to determine why they are lagging on a particular measure. Lagana says BMS uses DRVS to run meaningful use reports and see how each clinician is performing on the program’s wide range of clinical indicators. The quality indicators reported on the center’s dashboard, presented monthly to its board, come from DRVS reports.
“Our providers understand meaningful use at the big picture level, but they don’t understand the details very well, and DRVS helps clarify them,” said Boyd. “When you open DRVS, the information that you need is right there, so it’s helping us to educate our providers about what we need them to do for us to get our (payment).”
This understanding is critical because it means providers and staff finally make the connection between the data they enter in the EHR and the data that appears on performance reports. From there, they can determine how that reporting data contributes to organizational performance and incentive payments in a variety of programs in which the health center participates. BMS makes this especially relevant for providers by offering monetary incentives for achieving certain quality goals. This practice helps ensure alignment across the organization toward achieving the strategic goals for the year.