...
The Data Terminology Service (DTS) is assumed to be the source of truth for all terminology, nomenclature, and coded data within the OpenHIE shared health record. It will play a central role in terminology standardization and implementation throughout the HIE. However, it is only one of the several components that comprise the OpenHIE, and there are various ways the other components could interact with the DTS. Here are some areas that the OHIE-Terminology Services community has been discussing.
- Dumb vs. Smart Service
- Is the DTS a dumb service (told what to do) or a smart service (sees the message and knows what to do)?
It is assumed that the interoperability layer (IOL) will orchestrate much of the interaction with the DTS.
Incoming data destined for the SHR will need to be validated against existing terminology standards in the DTS.
Outgoing data destined for external consumers may need to be translated to other coding systems via the DTS.
So the DTS may need to be consulted many times for a single incoming or outgoing message.
For discussion purposes, consider that a typical patient encounter may contain 5 to 20 individual concepts, with each
concept possibly having a coded answer, corresponding units, and a reference range that may need validation.
In the DTS dumb service scenario, the IOL would make a call to DTS for each data item in the incoming message.
The IOL would loop through these items and validate that terminology standardization had been achieved.
In this scenario, the DTS only has to respond to simple queries for terminology and does not need to understand
the complexities of the message being processed.
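The dumb-service loop could be sketched as follows. This is a hypothetical illustration, not an OpenHIE-defined API: the `dts_validate` function stands in for a simple per-code DTS query, and the item layout is an assumption.

```python
# Hypothetical sketch: the IOL validates each coded item in an incoming
# message with one simple DTS call per item.

def dts_validate(system, code):
    """Stand-in for a simple DTS lookup: is this code valid in this system?"""
    known = {("LOINC", "718-7"), ("LOINC", "4544-3")}  # toy terminology store
    return (system, code) in known

def validate_items(items):
    """The loop the IOL would run: one DTS query per coded data item."""
    failures = []
    for item in items:
        if not dts_validate(item["system"], item["code"]):
            failures.append(item)
    return failures

items = [
    {"system": "LOINC", "code": "718-7"},   # hemoglobin
    {"system": "LOINC", "code": "9999-9"},  # unknown code
]
print(validate_items(items))  # → [{'system': 'LOINC', 'code': '9999-9'}]
```

Note that the DTS side of this exchange stays trivial: it answers one code at a time, and all message awareness lives in the IOL.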
In the DTS smart service scenario, the incoming message could be passed off to DTS, which could perform predefined validation protocols on the message.
These predefined protocols would be based on message structure (HL7 v2, HL7 v3, etc.) and message type (ORU, CDA, CCD, etc.).
An example protocol might be: If HL7 v2 ORU message, validate OBX 3, OBX 5, OBX 6, and OBX 7. The validation needed would be the
same as in the dumb service scenario, but the smart service would not rely on the IOL and instead do all the parsing necessary to validate the
terminology in the message.
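The example protocol above could be sketched as follows. This is a simplified, hypothetical illustration: real HL7 v2 parsing must also handle escape sequences, repetitions, and component separators, which are omitted here.

```python
# Hypothetical sketch of the smart-service protocol: given a raw HL7 v2
# ORU message, the DTS itself finds each OBX segment and pulls out the
# fields it must validate (OBX-3 id, OBX-5 value, OBX-6 units, OBX-7 range).

def obx_fields_to_validate(message):
    """Return (OBX-3, OBX-5, OBX-6, OBX-7) tuples for every OBX segment."""
    results = []
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            # fields[0] is the segment name, so OBX-n is fields[n]
            results.append((fields[3], fields[5], fields[6], fields[7]))
    return results

msg = ("MSH|^~\\&|LAB|HOSP|||202301010830||ORU^R01|1|P|2.5\r"
       "OBX|1|NM|718-7^Hemoglobin^LN||13.4|g/dL|12.0-16.0|N|||F")
print(obx_fields_to_validate(msg))
# → [('718-7^Hemoglobin^LN', '13.4', 'g/dL', '12.0-16.0')]
# each tuple would then be checked against the terminology store
```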
Much of the parsing and validating machinery needed for this would duplicate what the IOL already requires.
The question becomes where this terminology parsing and validation machinery should reside.
- Approaches to Scalability
- How to improve message throughput
As larger countries implement OpenHIE, one important metric to consider is message throughput.
Message volumes of greater than 250 messages/minute might be expected. This should influence the
architectural questions that relate to message throughput.
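Combining the 250 messages/minute figure with the earlier estimate of 5 to 20 concepts per encounter gives a rough sense of the lookup volume involved (a back-of-the-envelope sketch, before counting the extra checks per concept for units and reference ranges):

```python
# Rough DTS lookup volume implied by the figures above.
msgs_per_min = 250
concepts_per_msg = (5, 20)          # low and high estimates per encounter
lookups_per_min = tuple(msgs_per_min * c for c in concepts_per_msg)
print(lookups_per_min)              # → (1250, 5000) lookups/minute
print(tuple(round(x / 60) for x in lookups_per_min))  # → (21, 83) per second
```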
The following two approaches are being discussed.
Applications access DTS in real time
In this scenario, the DTS is fully integrated in the operational infrastructure of the HIE, and must be online and available
any time other components are in operation. Applications don't maintain a copy of standard terminology, but query the
DTS in real time when needed. This is easiest in terms of overall terminology standardization and deployment, but adds
some complexity in terms of network connectivity and the need for real-time terminology updates to a live system.
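One common way to keep per-item DTS round trips from limiting throughput in this real-time scenario is a local cache in the calling application. The sketch below is a hypothetical illustration: `dts_lookup` stands in for a network call to the DTS.

```python
# Hypothetical sketch: real-time DTS access with a local in-memory cache,
# so repeated validations of the same code do not repeat the network call.
from functools import lru_cache

@lru_cache(maxsize=10000)
def dts_lookup(system, code):
    """Stand-in for a real-time DTS query (normally a network round trip)."""
    known = {("LOINC", "718-7")}  # toy terminology store
    return (system, code) in known

# 1000 validations of the same code trigger only one "real" lookup.
for _ in range(1000):
    dts_lookup("LOINC", "718-7")
print(dts_lookup.cache_info().hits)  # → 999
```

A cache like this trades throughput for freshness, which sharpens the concern noted above about real-time terminology updates to a live system: cached entries must be expired or invalidated when terminologists publish changes.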
Applications use curated copy of terminology
In this scenario, the DTS is still the source of truth for all terminology, but it is not required to be online and available all the time.
It would be an integral part of disseminating terminology throughout the enterprise by creating curated copies of terminology that
would be consumed and incorporated by applications. The two important applications that would use these curated copies
would be the SHR and the Point-of-Care systems.
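A consuming application's side of this approach could be sketched as follows. The snapshot format shown here is an assumption for illustration; the actual curated-copy format would be defined by the DTS.

```python
# Hypothetical sketch: a Point-of-Care system loads a versioned terminology
# snapshot from the DTS and answers lookups locally, with no live network
# dependency on the central system.
import json

snapshot = json.loads("""
{
  "version": "2023-01-15",
  "concepts": {
    "LOINC|718-7": {"display": "Hemoglobin", "status": "active"}
  }
}
""")

def local_lookup(system, code):
    """Validate against the last snapshot received; works offline."""
    return snapshot["concepts"].get(f"{system}|{code}")

print(snapshot["version"], local_lookup("LOINC", "718-7"))
```

The version field matters here: since applications may be offline for some time, each lookup result is only as current as the last snapshot consumed.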
One advantage of this curated copy approach is that network infrastructure is not as critical to operations, since Point-of-Care systems could
continue without a network connection to the central system. They could continue with the latest update they had received from
the central system. Another advantage of this approach is that updates by terminologists can be staged for deployment with less
interference with the live system.
- Change management
- How to handle the life cycle of terms.
- How to mark terms that are deprecated or superseded.
- Process for continual periodic updates to common coding systems (ICD-9, ICD-10, LOINC, RxNorm, SNOMED CT, etc.).
- Deployment
- How terminology is disseminated throughout the enterprise
Even if applications access DTS in real time, a mechanism is still needed for exporting a current snapshot of terminology.
External systems may need to consume a current snapshot for various purposes. Point-of-Care systems may need it to
keep up to date. External mapping applications may need to consume it to help automate the mapping process.
The following two non-exclusive approaches are being discussed.
Deployment via SFTP file transfer
This approach requires minimal technology and minimal monitoring. SFTP sites would be set up where interested parties could
pull the current terminology snapshot. It is assumed that larger sections of terminology would be deployed this way;
ad-hoc one-off queries would not be supported.
Deployment via API call
This approach requires a higher technological level, but offers more query options.
- SHR's persistence model
- Single code or multiple codes
- Only one cardinal code stored, or one cardinal code plus one equivalence code stored (e.g., the Ministry uses one coding system, an insurance provider another).
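The two persistence options could be sketched as follows. All field names here are illustrative assumptions, not an SHR schema, and the ICD-10/ICD-9 pair is just an example mapping.

```python
# Hypothetical sketch of the two persistence options: store only the
# cardinal code, or the cardinal code plus one pre-computed equivalence
# (e.g. a Ministry coding system alongside an insurer's coding system).
from dataclasses import dataclass
from typing import Optional

@dataclass
class StoredObservation:
    cardinal_system: str                 # e.g. the Ministry's coding system
    cardinal_code: str
    equiv_system: Optional[str] = None   # e.g. the insurer's coding system
    equiv_code: Optional[str] = None     # resolved via the DTS at write time

# Option 1: single code; translation is deferred to the DTS on the way out.
single = StoredObservation("ICD-10", "E11.9")

# Option 2: cardinal plus one equivalence, resolved once at write time.
dual = StoredObservation("ICD-10", "E11.9", "ICD-9", "250.00")
print(single, dual, sep="\n")
```

The trade-off mirrors the scalability discussion above: the single-code option keeps the SHR simple but makes outgoing translation depend on DTS availability, while the dual-code option pays the translation cost once at write time.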