August 2017

Data Integration

The first step was to effectively aggregate patient data from a myriad of disparate sources maintained by the plan and its delegated entities. 'Interoperability' is one of the most frequently used, and most misleading, terms in health IT. Because the technology landscape consists of a multitude of legacy systems, and because data standards are neither universally adopted nor identically implemented, the only way to integrate effectively with the plan's existing systems and formats was to create a dedicated health information exchange (HIE). The key issue behind failed HIEs has been vendors' inability or unwillingness to comply with each HIE's data-integration requirements. Based on this insight, the only practical path forward was to ensure our system could both consume and generate data in the preferred format of each of the Blues' other vendors. We built new interfaces, or customized existing ones, for every standard type of counterparty extract, including enrollment/eligibility, EMR, claims, authorizations, practice management, pharmacy, lab, vendor management, and provider network management, and we ensured support for both standards-based and proprietary formats. Our interfaces were built on a modular architecture that followed best-practice object-oriented patterns, allowing us to rapidly adjust interfaces for format variations and for new systems. This approach enabled rapid integration timelines for connecting the plan to state, claims, authorization, and provider management systems, and ensured that the challenging regulatory timetable was met.

Data Normalization

Step two was ensuring that all the disparate data consumed by our system was consistently mapped, indexed, and stored. To achieve this, the data schema had to be comprehensive enough to encompass all major categories of healthcare data while remaining flexible enough to support variability across vendor systems and the plan's needs.
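The modular, object-oriented interface layer described under Data Integration can be sketched roughly as follows. This is an illustrative assumption of how such an adapter pattern might look; the class names, field names, and vendor labels are hypothetical, not details of the actual platform.

```python
from abc import ABC, abstractmethod

class FeedAdapter(ABC):
    """Parses one counterparty extract format into a normalized record."""

    @abstractmethod
    def parse(self, raw: str) -> dict: ...

class PipeDelimitedEligibilityAdapter(FeedAdapter):
    """Hypothetical vendor A: pipe-delimited enrollment/eligibility extract."""
    FIELDS = ("member_id", "last_name", "first_name", "plan_code")

    def parse(self, raw: str) -> dict:
        return dict(zip(self.FIELDS, raw.strip().split("|")))

class CsvEligibilityAdapter(FeedAdapter):
    """Hypothetical vendor B: the same domain data arriving as CSV."""
    FIELDS = ("member_id", "last_name", "first_name", "plan_code")

    def parse(self, raw: str) -> dict:
        return dict(zip(self.FIELDS, raw.strip().split(",")))

# A registry of adapters lets a new vendor format be added by registering
# one more class, without touching downstream consumers.
ADAPTERS: dict[str, FeedAdapter] = {
    "vendor_a_eligibility": PipeDelimitedEligibilityAdapter(),
    "vendor_b_eligibility": CsvEligibilityAdapter(),
}

def ingest(source: str, raw: str) -> dict:
    """Normalize one raw extract line from a named source."""
    return ADAPTERS[source].parse(raw)

record = ingest("vendor_a_eligibility", "M1001|Doe|Jane|GOLD")
```

The point of the registry is that format variation stays isolated in small adapter classes, which is one common way to achieve the quick per-vendor turnaround the text describes.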
We then developed an approach to accurately define data dictionaries, validation criteria, and business rules that could apply across data domains. A substantial amount of critical information is captured in unstructured formats. To ensure that the system could optimally support all data categories, we designed our data repository to combine elements of both a data warehouse and a data lake, a design we termed an 'adaptive data model'.

Data Binding

While aggregating and indexing cross-domain data about a patient is valuable, it is the binding of that data to workflows and analytics that empowers care teams to work more efficiently. In collaboration with the Blues' clinical operations teams, we built data-driven workflows that alerted care teams to the next best action for each case, based on each patient's unique profile and the most current information streaming into the system. Most population health systems focus on calculating quality metrics and care gaps across the entire patient population, but this approach fails to provide actionable feedback to the providers, care managers, and service coordinators on the front lines. By calculating these measures in real time at the individual patient level and tying them to tasks and dashboards used directly by care teams, it is possible to ensure timelier, better-targeted interventions at the point of care.

By adhering to these three components, the Virtual Health platform enabled the plan to succeed as it overcame the complexities of attaining interoperability, scalability, and flexibility while adhering to government regulations and addressing the needs of complex populations. As the transition to value-based care continues to transform healthcare, attaining a 360-degree view of each patient is critical to ensuring quality outcomes, care, and operational processes.