
Automated measure testing? Well worth the effort

Over our nearly five-year history, Azara has deployed dozens of database releases that contain hundreds of new or modified clinical quality measures (CQMs). Most people look forward to each new release as an opportunity to add features and fix bugs – neither of which seems to get out the door fast enough! However, to the paranoid database engineer (ahem), each new release harbors the risk of breaking something that already works well. That's called a regression – and it's an unfortunate byproduct of software development.

The only way to avoid regressions during an upgrade is to check that everything still works, but that's a daunting task, especially because Azara pumps out new releases every 6-8 weeks.

The answer? Automated tests. If there's one thing I've learned as a software developer, it's that automated testing is always worth the investment.

Automated testing is vital to software development, and there are several ways to tackle it; the key is balancing speed and quality. Like other fledgling software companies, Azara didn't start out with automated tests: we sweated to release our initial product versions, and the products only made it out the door once we completed a battery of manual tests.

However, after we grew beyond our initial few clients and inadvertently broke some things that already worked, we knew we had a problem to solve. This was especially frustrating with measure logic because the numbers could change without explanation, and we'd find ourselves scrambling to fix a new problem after each release. So in early 2013 we decided to take the plunge into automated testing.

We created some fake patient data, or test data, loaded it into our system, ran the measure calculations, and checked the output of the measure logic. For example, we fashioned a diabetic patient with an A1c of 7.9 percent and verified that their result for the A1c < 8 percent measure was 1/1, or 100 percent. This may seem like a silly test, but it protects us from all sorts of issues: if it passes, we know we can load patient data, diagnosis data, and lab data, and that we're parsing lab results correctly. A lot has to work for this simple test to pass, so its success tells us that many parts of the system are still functioning properly.
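To make that concrete, here is a minimal sketch of what such a check might look like in T-SQL. The table, column, and procedure names (test_patients, diagnoses, lab_results, measure_results, calculate_measures) are hypothetical stand-ins for illustration, not our actual DRVS schema:

    -- Seed one handcrafted patient: a diabetic with an A1c of 7.9 percent.
    -- (All object names here are hypothetical stand-ins for illustration.)
    INSERT INTO test_patients (patient_id, first_name, last_name)
    VALUES (1001, 'Test', 'A1cInNumerator');

    INSERT INTO diagnoses (patient_id, dx_code, onset_date)
    VALUES (1001, '250.00', '2013-01-15');   -- diabetes mellitus

    INSERT INTO lab_results (patient_id, test_name, result_value, result_date)
    VALUES (1001, 'HbA1c', 7.9, '2013-06-01');

    -- Run the measure calculation (a stand-in for our real processing job).
    EXEC calculate_measures @measure_name = 'A1c < 8';

    -- Assert the expected result: the patient belongs in both the numerator
    -- and the denominator of the A1c < 8 measure (1/1 = 100 percent).
    IF NOT EXISTS (
        SELECT 1
        FROM measure_results
        WHERE patient_id = 1001
          AND measure_name = 'A1c < 8'
          AND in_numerator = 1
          AND in_denominator = 1
    )
        RAISERROR ('A1c < 8: expected patient 1001 in numerator and denominator', 16, 1);

The pattern is always the same: seed a handcrafted patient, run the processing, and fail loudly if the measure result doesn't match what we expect.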

We currently have hundreds of test scripts that correspond to well over 1,000 actual test conditions. Our test data includes more than 1,500 test patients that were each lovingly handcrafted to test a specific piece of functionality. My co-worker likes to name her test patients after fictional characters, such as Bugs Bunny; I give mine mundane but descriptive names, such as "patient not in A1c numerator."

We try to have a little fun with the exercise; it certainly makes for interesting conversations. (Let’s just say we’ve manufactured a population of strange names and a laundry list of medical ailments.) I don’t know the exact extent of our automated test coverage, but we test all core data loading procedures – every measure, alert, registry, etc.

You may wonder how our automation works. Without getting too deep into the weeds: we use a great tool called Jenkins, which we combine with batch scripts, a build tool called NAnt, a SQL test framework called tSQLUnit, and a handful of other helpful tools. Each time a developer changes code, Jenkins receives the new code, builds a new instance of DRVS, loads all our test data, runs our processing jobs, and checks the output. If anything fails, the development team receives an alert that summarizes the recent changes and all the tests that failed. Our engineers are required to fix any failing tests immediately, and all tests must pass before we consider releasing a new version of DRVS.
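For a flavor of what Jenkins actually runs, here is roughly what one of those checks looks like wrapped as a tSQLUnit test. tSQLUnit's convention is that test procedures carry a ut_ prefix and report failures through tsu_failure; the specific procedure, table, and measure names below are illustrative, not lifted from our real suite:

    -- An illustrative tSQLUnit-style test (hypothetical names throughout).
    -- tSQLUnit discovers test procedures by their ut_ prefix.
    CREATE PROCEDURE ut_measures_a1c_numerator
    AS
    BEGIN
        IF NOT EXISTS (
            SELECT 1
            FROM measure_results
            WHERE patient_id = 1001
              AND measure_name = 'A1c < 8'
              AND in_numerator = 1
              AND in_denominator = 1
        )
            -- tsu_failure records the failure so the run can report it.
            EXEC tsu_failure 'patient 1001 missing from A1c < 8 numerator';
    END
    GO

    -- After building DRVS, loading the test data, and running the processing
    -- jobs, the Jenkins job kicks off the whole suite:
    EXEC tsu_runTests

Any failure recorded during that run is what ultimately bubbles up into the alert the development team receives.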

As you can see, automated testing is both a major component of Azara's development process and a source of pride for our engineering and quality assurance teams. There's always a trade-off with testing: it slows the development process, which can be frustrating when you're in a rush to push out an important new feature to a client. But it's the only way to ensure that we deliver a stable, high-quality product while we continue to enhance DRVS.

Eric Gunther is an engineer at Azara Healthcare.