Technical Session 1: Interoperability Challenges; Introduction by Glenn Himes, MITRE Corporation.
Himes presented a model for medical device connectivity, emphasizing the interoperability challenges posed by medical device controller systems. He then pulled back to show the role that device data and system connectivity play in the broader health care delivery system.
The biggest challenge facing interoperability is people rather than technology. Engineering solutions are very doable; a lack of consensus, the absence of a regulatory path, missing requirements, market forces, and other factors represent the real (and current) impediments to change.
Incentives to change include new market opportunities, improved patient safety, lower litigation and insurance costs, and a lower total cost of ownership.
Many of these challenges are not about technology, but about the return on investment for both vendors and their customers, along with political and consensus issues.
Panel discussions follow (the presenter is noted in each summary):
Medical Device Interoperability: Assessing the Environment
Kathy Lesh, Sandy Weininger, Julian Goldman, Bob Wilson, Glenn Himes; The MITRE Corporation, FDA/CDRH, Massachusetts General Hospital
This presentation, by Kathy Lesh, addressed:
- The FDA and the definition of medical device
- The scope of the medical device interoperability issue, including the regulatory aspects,
- The current standards environment,
- Why this is important,
- What needs to be done,
- What is being done and by whom, and
- Future steps
After a review of the usual suspects (regulatory requirements, existing industry standards and standards bodies), the term interoperability was defined with a diagram. In short, interoperability is the communication of data where the data are subject to display, analysis, and modification by integrated systems.
We're just after lunch, and already numerous folks have talked about why medical devices are not interoperable. Here's a good list:
- There is no incentive for device manufacturers to interoperate with other manufacturers' devices.
- There is no loud, organized call from clinicians for interoperable devices.
- There is a scarcity of acceptable medical device interoperability standards and the existing protocols are complex.
- The health care sector, in general, has been lagging behind other industries with respect to computerization and networking.
- Interoperability solutions face complex technical and social problems, including liability and regulatory issues.
The consequences of the lack of interoperability are compromised patient safety (an issue that is getting increased attention) and care delivery documentation that is incomplete and/or inaccurate. What interoperability does exist is built on proprietary systems, resulting in vendor lock-in; these systems are also very expensive and slow to develop and update.
The optimal solution presented would be:
- Easily accessible
- Unencumbered by excessive fees and/or licenses
- Simple as possible, complex as necessary
- Supportive of safety
That sounded like an open source solution to me. In fact, open source is what was endorsed.
Vendors should participate in national EHR/PHR interoperability efforts (e.g., HITSP and CCHIT) and in national and international device terminology standards development. They should also participate in device standards development by speaking out for open standards and interoperability, and by making sure standards are relatively easy to implement. Finally, they should work with the FDA to ensure the least burdensome approach for regulating interoperable devices.
Moving Toward Semantic Interoperability of Medical Devices
John Garguilo, Sandra Martinez, Richard Rivello, Maria Cherkaoui; NIST (National Institute of Standards & Technology)
After a brief intro to NIST, presenter John Garguilo launched into their project to facilitate interoperability. NIST is involved in developing standards, tests, tools and prototypes to advance the use of information technologies in health care systems and achieve an interconnected electronic health information infrastructure.
Garguilo asked: why medical devices? The first reason is that integrating processes makes it possible to reduce medical errors that currently occur. For every connected IT device in the hospital, there are 4 medical devices that are not connected. There are over 1,500 medical device manufacturers and over 3,500 make-model combinations. The typical 200-bed hospital contains thousands of medical devices.
Some of the key issues include:
- Semantic Interoperability (comparability): the ability to respond to clinical context, compare information from different health care facilities, and interrogate systems across enterprises, driving clinical decision support systems with an economic business model.
- Real-Time Availability: the ability to provide data in a time frame appropriate to the physiologic function being measured, displayed, or affected (controlled).
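The real-time availability point can be made concrete with a toy sketch. The parameter names and latency budgets below are my own invented illustrations (not from the talk or any standard): the idea is simply that each measurement carries a freshness budget tied to its physiologic function.

```python
# Hypothetical latency budgets, in seconds, for delivering a measurement.
# These parameters and numbers are illustrative only.
LATENCY_BUDGET_S = {
    "ecg_waveform": 0.1,     # beat-to-beat display must be near-instant
    "heart_rate": 2.0,       # a derived rate can lag a little
    "noninvasive_bp": 60.0,  # an intermittent measurement
}

def is_timely(parameter: str, age_seconds: float) -> bool:
    """Return True if a measurement is still fresh enough to act on."""
    return age_seconds <= LATENCY_BUDGET_S[parameter]

print(is_timely("heart_rate", 1.5))    # a 1.5 s old heart rate is usable
print(is_timely("ecg_waveform", 0.5))  # a 0.5 s old waveform sample is stale
```

The point of the sketch: "real time" is not one number, it is a per-parameter requirement that an interoperable system has to negotiate.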
NIST is supporting the IHE Patient Care Device domain, facilitating the testing and validation of the implementation of standards. Their current standards focus is ISO/IEEE 11073, specifically the nomenclature and information model.
The Implementation Conformance Statement Generation Tool was described. This tool automates the creation of much of the documentation required to demonstrate conformance to a standard (in this case ISO/IEEE 11073). A user guide is available.
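For a flavor of what the 11073 nomenclature involves: each term lives in a numbered partition and has a 16-bit term code, and the two combine into a single 32-bit context-free code. A minimal sketch follows; the specific heart-rate numbers are my recollection of the standard's vital-signs (SCADA) partition and should be treated as an assumption to verify against ISO/IEEE 11073-10101.

```python
def context_free_code(partition: int, term_code: int) -> int:
    """Combine an ISO/IEEE 11073-10101 partition number and a 16-bit
    term code into a single 32-bit context-free nomenclature code."""
    return (partition << 16) | term_code

# Partition 2 is the SCADA (vital signs) partition; 16770 is, as I
# recall, the term code for MDC_ECG_HEART_RATE. Verify against the
# published nomenclature before relying on these values.
print(context_free_code(2, 16770))  # 147842
```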
Clinical Requirements Methodology: Incorporating Clinical Workflows
Invited Speakers: Jennifer Jackson, Brigham & Women's Hospital, Tracy Rausch, DocBox Inc. (formerly at Kaiser Permanente)
Here's the problem, as presented by Tracy Rausch. The literature describes how current product design is hampered by faulty user requirements definitions. Although interoperability is fundamental to patient safety and improved efficiency, currently existing technologies are too complex to implement and maintain with current design methodologies. Introducing clinical workflow descriptions as part of the requirements-gathering process will improve medical device/system design.
Their solution is to adopt a clinical workflow approach to product definition. A current trend is to hire cultural anthropologists to observe workflow, but this approach is costly and inadvertently affects those observed. One reason for the cost is the highly specialized nature of health care delivery; ethnographers with no health care experience take a great amount of time to elicit a full set of use cases or requirements.
At Kaiser, a recent device-EMR integration feasibility study found 34 different device types across 70% of their inventory. They estimated that it would cost $10,500 per patient bed to implement EMR connectivity, not including the labor to manage and maintain the installed connectivity.
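To put that per-bed figure in perspective, here is the arithmetic for a single hospital. The $10,500 figure is from the Kaiser study; the 200-bed size is my assumption, borrowing the "typical 200-bed hospital" mentioned earlier, and remember the figure excludes ongoing support labor.

```python
cost_per_bed = 10_500  # one-time per-bed EMR connectivity cost (Kaiser study)
beds = 200             # assumed hospital size for illustration

total = cost_per_bed * beds
print(f"${total:,}")   # $2,100,000 before any management/maintenance labor
```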
Jennifer Jackson presented a classic use case process for gathering interoperability requirements, organized as a hierarchy starting at the top:
- Clinical Scenario
- Clinical Workflow
- Use Cases
- Logic Map/Key
- State Diagram
- Technical Solution, and
- Clinical Implementation.
To this list I would add Actors at or near the top of the hierarchy.
Clinical Requirements Methodology: Ensuring Sufficient Breadth in Use Case Development. How Should Non-Functional Requirements be Elicited and Represented?
Invited Speaker: Rick Schrenker, Massachusetts General Hospital
In Schrenker's presentation, non-functional requirements include external interfaces and quality attributes. The essence of the question is how, and how well, something is done: reliability, quality, latency, etc. Perhaps, Schrenker suggested, meta-functional is a better term than non-functional. Here's the full list:
There are two distinct but closely related requirements for interoperable medical device systems:
- Data communication capability to support complete and accurate data acquisition by the EHR/EMR from vital signs monitors, infusion pumps, ventilators, portable imaging systems, and other hospital and home-based medical devices. Comprehensive data acquisition will also enable the development of remote monitoring, advanced clinical decision support systems, intelligent alarms, and robust databases for CQI use.
- Medical device control capability to permit the integration of distributed medical devices to produce error-resistant systems with safety interlocks between medical devices to decrease use errors, closed-loop systems to regulate the delivery of medication and fluids, and remote patient management to support health care efficiency and safety (e.g. "e-ICU", management of infected/contaminated casualties).
As we drive toward interoperability, Schrenker raised the questions, "How can the 'ilities' for each of these sub-domains be elicited and validated?" and "As these and other systems interoperate, how can semantic gaps that may arise between their respective 'ility' specifications be recognized and addressed?"
When requirements are wrong, the common outcome is the kind of patient harm documented in L Kohn et al, eds, To Err Is Human: Building a Safer Health System, National Academies Press, pp 55, 58.
Inadequate requirements, especially regarding clinical workflow, have another common outcome; see C Clancy, Evaluating the Potential of New Technologies, in Building a Better Delivery System: A New Engineering/Health Care Partnership, National Academies Press, 2005, pp 84-86.
To highlight the scope of device connectivity issues, Schrenker notes that in 2006, Partners added 498 new models of equipment, 31 new device types, 74 new manufacturers, and they changed the software version (firmware) of 4,282 devices. From 2002 to 2006, the Partners BME (biomedical engineering) bedside medical device inventory grew from 26,000 to almost 38,000 medical devices. Whew.
So, if Schrenker is right, some logical questions follow. Should we invite academic and professional requirements engineering experts to assist (at least) in the capture of accurate and appropriate requirements?
Challenges for the Large Integrated Healthcare Delivery System
Invited Speaker: Michael Robkin, Kaiser Permanente
Robkin presented the business challenges faced by Kaiser. As an entity, Kaiser has unique characteristics that drive unusual requirements, including a management philosophy of complete span of control, influence, and accountability. They do this so they can provide the same quality of care and the same standard care processes across the organization. In some important ways, Kaiser may represent the future of high confidence interoperability, especially in driving down the cost of ownership for interoperability and system scalability.
Interoperability means many things:
- Acquisition: Can different vendors be swapped out? No proprietary data, interfaces, or processes.
- Care Processes: Can different systems collect the data necessary to manage our care plans?
- Clinical capability: Are all 64-slice CTs the same?
- Workflow: Can different systems support the same workflows? Not necessarily the same user interface, but the same features.
- Training: Can different systems support the same user interface? This would be interoperability between users and systems.
- Data Connectivity: Can different systems transfer the same data? (Physiological monitors)
- Physical Connectivity: Can different boxes have the same plugs?
- System Integration: Can different systems be part of the same operational suite? (e.g. PACS and RIS)
- Real-time integration: Can different systems be part of one larger system? (e.g. feedback control loops in anesthesia monitoring)
- Data Integration: Can different systems provide the same data to data warehouses or analysis tools?
- Homogeneous systems, Homogeneous processes. (e.g. Radiologists sharing images PACS to PACS)
- Heterogeneous systems, Homogeneous processes. (e.g. PACS to EMR)
- Homogeneous systems, Heterogeneous process. (e.g. different specialties, same EMR and same PACS)
- Heterogeneous systems, Heterogeneous processes. (Dermatology photos taken in a controlled environment vs. cell phone camera photos collected and sent by email)
The real challenge is not one-dimensional interoperability, but figuring out what KIND of interoperability we want and to what level. At its heart, interoperability is a governance and business problem, not a technical one. They've also had to recognize that the beneficiary of interoperability (the customer, so to speak) may not be the users, the doctor, or the IT professional; it may be the enterprise and patients.
Robkin also suggested lining up the front end users and the back end receivers of the data. Don't stress collecting data nobody uses, and don't look for unnecessary interoperability. Don't pay for unneeded precision, and be very clear about what you want to interoperate and how. Key questions include: Workflow? Data? Systems? Process? Interfaces? Which systems? When? How?
Connectivity is not a fire-and-forget process. Be sure to follow through the entire applicable workflow, functions, and data flow; otherwise you will get partial solutions that are complete failures. Some bad examples:
- Buy the same device, but configure it differently.
- Same device and configurations, but used in different workflows.
- Same workflow and versions and support, but different data collected.
- Same data, but different accuracy, precision, history, meaning and retention rules.
- Exactly the same data, but different analysis.
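The "same data, but different precision" failure above is easy to demonstrate. The devices and readings below are invented: two systems report the "same" temperature, but one rounds to a decimal place and the other truncates to an integer, so a naive cross-system comparison quietly fails.

```python
import math

# Invented example: two devices observe the same underlying temperature.
true_temp_c = 37.46
device_a = round(true_temp_c, 1)    # reports to one decimal place
device_b = math.floor(true_temp_c)  # truncates to a whole degree

print(device_a, device_b)   # 37.5 37
print(device_a == device_b) # False: "exactly the same data", different precision
```

Multiply this by history, meaning, and retention-rule differences and the Robkin warning becomes concrete: identical-looking feeds are not interoperable data.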
Balance must be maintained; it is easy to have too much restriction and standardization. Robkin compared and contrasted proprietary single-vendor solutions (suites) and standards-based multi-vendor systems.
Proprietary single-vendor suites:
- Off-the-shelf interoperability
- No need for internal integration capability
- Vendor lock
- Limited by vendor features
- Slower innovation
- More expensive in the long run
Standards-based multi-vendor systems:
- More work: standards are not interoperability
- Longer leverage
- More flexibility
- Best of breed
- Best potential
Finally, Robkin closed with the suggestion that standards-based versus proprietary systems are not two clear-cut alternatives, but a continuum.
Regarding Robkin's recommendation not to pay for accuracy you don't need, a questioner asked whether the likelihood of unanticipated needs means hospitals should save as much data as possible. Robkin noted that 6 months ago he'd have agreed, but he has since determined that the costs involved are prohibitive.
Another question: $10,500 to integrate medical devices, that's inexpensive; why aren't all devices integrated now? The response was that the expense was considered excessive, and the result insufficiently robust (brittle, and requiring replacement along with the devices).
Various groups have collected a lot of use cases and requirements. What can we do with them to make progress toward interoperability? The major issue seems to be an organizational structure to pull together the resources to make this happen. Follow-up question: we've got the requirements, and the organizations exist, so what's missing? Another route could be open science or open engineering.
Question: does every connected device fit into the client/server model? And is there some better, lower cost technology for serial point-to-point connectivity? The problem is not the connectivity; the problem is all the rest: training, support, and modification of the other system to connect to the medical device. There are some very interesting technologies for connecting, and thousands of people experimenting in their garages on innovative solutions, and that's a good thing.
Question: how do you insert the clinical requirements into the standards process? There was no real answer here, but I was reminded of Robkin's observation, "A good standard collects the wisdom of the best thinkers in the industry, a bad standard collects something else."
It was noted that no one in attendance could speak to any specifics regarding ISO/IEEE 11073. Surprising.
Finally, there was a comment suggesting a hall of mirrors situation in the industry, "You guys (providers) have got to give us (vendors) requirements. How can I gather requirements in an environment where no one has the context or frame of reference to describe requirements?"