Learning from the Path: When a New Screening is More Than Just a New Screening

by Dr. Terese Finitzo

Population Health is a powerful agent for quality in the new Health Ecosystem. As Public Health implements Critical Congenital Heart Disease (CCHD) screening programs, Secretary Sebelius notes that our programs should assure care for our children and simultaneously put in place a rigorous approach to quality improvement. Many of our past policies in early child health screening stem from the era before interoperable health information technology and, thankfully, they need not be replicated today. Was it Mark Twain who said, “If you always do what you always did, you will always get what you always got”? It applies here.

Public Health, like no other agency, organization, or university in this country, has the opportunity to forge a clear pathway for maternal and child health programs with a change in attitude about newborn screening. Information management and meaningful data collection are the prime catalysts for this change.

More than 40 years ago, public health began addressing data collection for newborn bloodspot screening (NBS). More than 20 years ago, public health began discussing universal newborn hearing screening implementation. As I stood in front of pediatricians advocating universal hearing screening at Grand Rounds across this country, their rallying cry in the 1990s was “show us the data!” They wanted proof that it made a difference. And so early on, we began addressing data collection for that program. Some advocated that hearing screening should piggyback on NBS efforts by adding a box for the hearing screening results on the filter paper card. Here’s what a nurse was expected to do:

Obtain the hearing screening results

Write down the hearing results on a piece of paper

Carry those results until she did the heel stick

Then transfer the results (presumably on the correct baby) to the correct filter paper card

Next, here’s what laboratories analyzing newborn bloodspots were expected to do:

Enter the hearing results from the filter paper card (along with the other baby details)

Report the hearing screening results back to public health  

Of course, the laboratories could neither verify that the screening was done nor replicate the hearing screening test, since the procedures the lab had in place for bloodspot quality assessment didn’t work for a test they had not conducted.

Other programs, including the ones I was involved with, saw these limitations and built an improved data management system for newborn hearing. OZ Systems’ first systems sat in isolation in hospitals but provided accurate hospital data collection and tracking information. Our second systems added a way to send encrypted emails with results to public health, but the process was slow and not sustainable, especially for large states. Our first web-based system, implemented in 2002 for the National Health Service in England, provided real-time access to screening results and allowed care to take place in multiple locations: the hospital, the audiology clinic, and the pediatrician’s office. That is the reality of care delivery. It happens in many places.

But this approach made cooperation between bloodspot and hearing screening staff unworkable, since what we had really implemented was a silo – a successful silo, but a silo nonetheless. We understood that. But integrated data systems were just a gleam in the eye of public health, not to mention of the newborn screening system software developers. Over the next dozen years, the value of integrated data systems, where one could access both bloodspot and hearing screening results in one location, became widely accepted but was rarely successfully implemented.

So we have the historical precedent. Now, what are some alternative approaches to newborn screening information management for CCHD?

Here is what not to do. First, don’t enter CCHD results on the filter paper card. How much more information can we ask a busy nurse to write down? The card is already crowded, and modifications require more than just printing new cards; they require laboratory changes as well. One analysis of card information from Minnesota showed that physician contact information is missing 12% of the time. Indiana has noted that hearing screening results are occasionally missing or inaccurate on the filter paper cards (usually reporting the first fail but not going back to report the subsequent pass). The result is that follow-up is more costly and time-consuming.

Second, it’s the era of interoperability and standards-based solutions in health information technology. We should not be entering screening results manually, even into a hospital or public health Electronic Health Record; it is error-prone and time-consuming. With hearing screening, the hospital screener does not make an outcome decision, and yet, when we reviewed log books in Texas nurseries more than a decade ago, we identified that 11% of log book entries differed from the letter given to the parent. There are standards-based HL7 messages, Technical Profiles, and Implementation Guides published by IHE that tell us how to get the data directly from the device! That means the message with the screening results has a pedigree: we know with certainty where it came from. That means there is an audit trail: we know if anything got changed. That means the message didn’t get sent to the Chuck E. Cheese’s fax number instead of the Department of Health. (Seriously, those dang numbers are really close.) And we know it is more secure.
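
To make that concrete, here is a minimal sketch, in Python, of what a device-originated, standards-based result message could look like. The layout loosely follows an HL7 v2 ORU^R01 observation result, but the field values, identifiers, and the build_screening_message helper are hypothetical and not drawn from any specific IHE profile; treat it as an illustration of “data with a pedigree,” not a conformant implementation.

```python
from datetime import datetime, timezone

def build_screening_message(device_id, facility_id, baby_mrn,
                            obs_code, obs_label, value, units):
    """Assemble a simplified HL7 v2-style ORU^R01 message for one screening result.

    The sending device and facility ride in the MSH segment, which is what
    gives the result its "pedigree": the receiver can see exactly which device
    at which hospital produced the value, and when. Segment fields are
    simplified for illustration.
    """
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    segments = [
        # MSH: who sent it, from where, to whom, and when (the pedigree).
        f"MSH|^~\\&|{device_id}|{facility_id}|STATE_NBS|DOH|{timestamp}||ORU^R01|MSG0001|P|2.5.1",
        # PID: which baby the result belongs to (identifiers are illustrative).
        f"PID|1||{baby_mrn}^^^{facility_id}^MR||BABYBOY^DOE",
        # OBR: the screening procedure this observation answers.
        f"OBR|1|||{obs_code}^{obs_label}",
        # OBX: the observation itself; 'F' marks a final result, and the
        # device identifier travels with the value.
        f"OBX|1|NM|{obs_code}^{obs_label}||{value}|{units}|||||F|||{timestamp}||||{device_id}",
    ]
    return "\r".join(segments)

# Example: a post-ductal oxygen saturation captured straight from the pulse
# oximeter. A real message would use the proper standard code for SpO2.
print(build_screening_message(
    device_id="PULSEOX-7",
    facility_id="GENERAL_HOSP",
    baby_mrn="123456",
    obs_code="SPO2",
    obs_label="Oxygen saturation by pulse oximetry",
    value=97,
    units="%",
))
```

The point is simply that the sending device, the sending facility, and the timestamp travel with the result itself, which is what makes verification and auditing possible downstream.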

Third, don’t ask hospital screening staff to examine the pulse oximetry data and interpret the outcome as pass or fail for public health. It’s data acquisition, pure and simple. Here’s a scenario. You are a nurse being asked to apply an algorithm and determine the correct CCHD screening outcome following screening. Please move quickly. You have 20 more babies to screen, and that baby you hear crying in the background is your responsibility. Your cell phone is ringing. Your fourth grader can’t find his shoes for soccer practice. Did you write down the screening result yet? The bottom line is that human beings make errors, and I know your fourth grader shouldn’t be calling you at work. What is the sane, simple alternative? The pulse oximetry data can be captured from the devices electronically and securely, with a pedigree and an audit trail, and the interpretation applied in software. We have just reduced time, stress, and opportunity for human error.
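
As an illustration of the kind of interpretation the software, rather than the nurse, can perform on the captured readings, here is a minimal Python sketch. The thresholds follow the commonly described Kemper et al. logic (pass at 95% or higher with a pre-/post-ductal difference of no more than 3 points, fail below 90%, otherwise retest up to two more times); the function name and data format are hypothetical, and any real deployment would need to be validated against the published algorithm and local protocol.

```python
def interpret_cchd_screen(pre_ductal_spo2, post_ductal_spo2, attempt):
    """Interpret one CCHD pulse-oximetry screen.

    pre_ductal_spo2:  SpO2 (%) from the right hand
    post_ductal_spo2: SpO2 (%) from either foot
    attempt:          1, 2, or 3 (repeats spaced about an hour apart)

    Returns "pass", "fail", or "retest". Thresholds follow the commonly
    described Kemper et al. logic and should be confirmed against the
    published algorithm before any clinical use.
    """
    low = min(pre_ductal_spo2, post_ductal_spo2)
    high = max(pre_ductal_spo2, post_ductal_spo2)
    difference = high - low

    # Immediate fail: either reading below 90%.
    if low < 90:
        return "fail"

    # Pass: at least one reading of 95% or higher, and the two sites
    # within 3 percentage points of each other.
    if high >= 95 and difference <= 3:
        return "pass"

    # Borderline (90-94% in both, or more than a 3-point difference):
    # retest up to two more times; a third borderline result is a fail.
    return "retest" if attempt < 3 else "fail"


# Example: a borderline first attempt that the software flags for retest.
print(interpret_cchd_screen(pre_ductal_spo2=93, post_ductal_spo2=92, attempt=1))
```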

A challenge to quality improvement in newborn hearing screening programs has been to understand how a screening algorithm functions in “real life,” and the same will be true for CCHD. Few screening devices have peer-reviewed research validating their algorithm on large numbers of infants. Right now, many CCHD programs are using the well-described algorithm advocated by Kemper et al. (see below). But Kemper and colleagues have been clear: they do not see their current algorithm as the final solution or the only solution in all cases – at higher elevations, for example. The ability to capture raw data from a pulse oximetry device means that a statistician can analyze and manipulate the data in order to improve the quality of the CCHD algorithm. To leave such a task to a research scientist who can, if lucky, collect data on 15,000 births over the three to five years of a grant, when that same scientist could have aggregated, anonymized data on 1 million births, is a waste of our resources.
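
As a small illustration of what that kind of analysis could look like once raw device data flows to public health, here is a hedged Python sketch that tallies pass, retest, and fail rates from anonymized screening records, split by hospital elevation. The record format, field names, elevation cutoff, and the summarize_outcomes helper are all hypothetical; the point is that a question like “does the retest zone behave differently at altitude?” becomes a query over a million records rather than a multi-year grant.

```python
from collections import Counter

def summarize_outcomes(records, elevation_cutoff_ft=5000):
    """Tally screening outcome rates for low- and high-elevation hospitals.

    Each record is a dict with (hypothetical) keys:
      'outcome'      -- 'pass', 'retest', or 'fail'
      'elevation_ft' -- elevation of the birth hospital in feet
    """
    by_band = {"low_elevation": Counter(), "high_elevation": Counter()}
    for record in records:
        band = ("high_elevation" if record["elevation_ft"] >= elevation_cutoff_ft
                else "low_elevation")
        by_band[band][record["outcome"]] += 1

    # Convert counts to rates so bands of different sizes are comparable.
    rates = {}
    for band, counts in by_band.items():
        total = sum(counts.values())
        rates[band] = ({outcome: count / total for outcome, count in counts.items()}
                       if total else {})
    return rates

# Toy example with made-up records; a real analysis would run over the
# aggregated, anonymized statewide data set.
sample = [
    {"outcome": "pass", "elevation_ft": 600},
    {"outcome": "retest", "elevation_ft": 6200},
    {"outcome": "pass", "elevation_ft": 6200},
    {"outcome": "fail", "elevation_ft": 450},
]
print(summarize_outcomes(sample))
```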

The good news is: we have the capability and capacity to do this right, right now. Using standards-based tools, hospitals can capture pulse oximetry data, apply an interpretive algorithm and submit the outcome and the raw data to public health. It could mean life or death for that baby who is now only a gleam in his father’s eye. We owe the youngest citizens we serve our best.