Data abounds in healthcare, and the volume of health data is set to grow even faster over the next several years: 48 percent annually, according to estimates by research firm IDC.
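Compounded year over year, that rate adds up quickly. A minimal Python sketch of the arithmetic (the 48 percent figure comes from the IDC estimate above; the starting volume and five-year horizon are illustrative assumptions):

```python
# Project health data volume at 48% annual growth (IDC estimate cited above).
# The starting volume and horizon are illustrative assumptions, not IDC figures.
GROWTH_RATE = 0.48

volume = 100.0  # assume 100 units (e.g. exabytes) of health data today
for year in range(1, 6):
    volume *= 1 + GROWTH_RATE
    print(f"Year {year}: {volume:.0f} units")
# After five years: 100 * 1.48**5 ≈ 710 units, a sevenfold increase.
```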

However, given the lack of interconnectivity between systems (apparent in the EHR world as well as within managed care organisations), sloppy data accumulation, and a host of other issues, much of the data-to-information pipeline in the industry is still GIGO ("garbage in, garbage out").

“The ‘garbage in, garbage out’ problem occurs when data comes in at irregular intervals or isn’t adequately linked to other relevant pieces of information,” says Jonathan Weiner, co-developer of the Johns Hopkins ACG System and professor of health policy and management at the Johns Hopkins Bloomberg School of Public Health. “The challenge is to structure miscellaneous data to get a useful picture of individual or community health.”

The ACG team, along with the Johns Hopkins Center for Population Health IT, works directly with organisations on a case-by-case basis to determine the most useful ways to link structured and unstructured medical, geographic and social data with insurance claims, medical administrative records, and other patient data for specific needs, according to Weiner.
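At its simplest, that kind of linkage is a join across data sources on a shared patient identifier. The sketch below, in Python with pandas, is purely illustrative: the field names and the clean patient_id key are hypothetical, and real-world linkage often requires probabilistic matching when no common identifier exists.

```python
import pandas as pd

# Toy illustration of linking claims, EHR and demographic data.
# Field names and the shared patient_id key are hypothetical; real
# linkage often lacks a clean common identifier and needs probabilistic
# matching on name, birth date, address and similar fields.
claims = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "total_paid": [1200.0, 310.0, 5400.0],
})
ehr = pd.DataFrame({
    "patient_id": [1, 2, 4],
    "diagnosis": ["diabetes", "asthma", "copd"],
})
demographics = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "postcode": ["21205", "21218", "21231", "21287"],
})

# Outer joins keep patients seen in only one source, making gaps in
# linkage visible instead of silently dropping records.
linked = (claims.merge(ehr, on="patient_id", how="outer")
                .merge(demographics, on="patient_id", how="outer"))
print(linked)
```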

In the UK, for example, providers are using the ACG System to combine primary and secondary care data with geographic and demographic information in ways that have led to innovative patient care strategies that reduce costs, improve outcomes and enhance the patient experience, he says.

Weiner, who is also a professor of health informatics at the Johns Hopkins School of Medicine, is currently doing research focused on the application of EHRs and health IT to population-based uses such as performance measurement and predictive modelling/analytics. The key question, he points out, is how best to translate all the available patient and population health data into practical solutions that improve standards of care in real-world settings.

"The practice of slicing and dicing data occurs within an industry with incompatible and idiosyncratic data collection. As a result, the overall industry is data-rich but information-poor," Weiner explains. "Our experience shows that a little data can often go a long way. Standardisation is becoming more common and easier. For example, the NQF certified HealthPartners’ Total Cost of Care model provides a common means of standardising costs across disparate data sources."
 
The professor also highlights the need to make EHRs more interoperable, noting that currently only about 10 percent of all EHR records are fully shared. The majority of the information in an EHR is free text, and the challenge now is to find ways to capture meaning from these data, he adds.
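A common first step toward capturing meaning from clinical free text is simple pattern-based concept spotting. The minimal sketch below is illustrative only: the note and the concept dictionary are invented, and production systems rely on full clinical NLP pipelines that handle negation, abbreviations and context.

```python
import re

# Minimal keyword-based concept spotting in a clinical note.
# The note and the concept dictionary are invented; real clinical NLP
# must also handle negation ("denies chest pain"), abbreviations and
# context, typically via dedicated pipelines rather than regexes.
CONCEPTS = {
    "diabetes": ["diabetes", "t2dm", "diabetic"],
    "hypertension": ["hypertension", "htn", "high blood pressure"],
}

note = "Pt is a 67yo male with T2DM and HTN, denies chest pain."

found = set()
for concept, terms in CONCEPTS.items():
    for term in terms:
        if re.search(rf"\b{re.escape(term)}\b", note, re.IGNORECASE):
            found.add(concept)

print(sorted(found))  # ['diabetes', 'hypertension']
```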

In addition to maintaining good data and improving analytics, Weiner says the industry needs to figure out the best ways to use the big data tools now available to improve health and wellbeing, both for individual patients and for patient subgroups or entire communities.

"For example, the data should give providers new ways to identify patients who are likely to require services for chronic or long-term care. Or it can inform key administrative decisions, like determining fair compensation for providers or allocate resources in the most cost-effective ways," he explains.

Source: Managed Healthcare Executive
Image Credit: Pixabay
