
The following is a continuation from the Improving Patient Safety through Medical Device Interoperability and High Confidence Software joint workshop held last week in Boston. I’ve got a bunch more notes that I’ll be tweaking and posting this week. This next bit is from a panel discussion that described the need for high confidence systems and interoperability. The panel was introduced by co-chair Julian Goldman.

The foundation for any quality product is requirements. Inadequate or incorrect requirements mean not just a lousy product, but one that could be unsafe or unreliable. This panel discussion targets the clinical need driving high confidence systems and interoperability.

A market for any kind of product is made up of both producers and buyers, each with their own responsibilities. Device vendors (and to a lesser degree, HIT vendors) have demonstrated neither a good understanding of the workflows that occur around their devices nor a mastery of workflow requirements gathering. To achieve success with medical device interoperability, one must have high quality requirements.

At the same time, buyers are responsible for actively shaping demand and motivating vendors to build the best products possible with the right features. Healthcare providers must demonstrate a market for interoperable systems in order to provide vendors the motivation for (and eventual financial return on) more comprehensive requirements gathering efforts.

The expectation is a phased market implementation, with medical device connectivity first, and then interoperability between devices, and between devices and systems. Such advances must support clinically meaningful use-cases. Standards can be used to mitigate risk and support interoperability, but they have yet to mature sufficiently to make connectivity or interoperability easy: a classic “chicken or the egg” conundrum.

The Clinical Needs panel included:

Jeff Cooper PhD, Anesthesia Patient Safety Foundation
Sandy Weininger PhD, FDA
Jennifer Jackson MBA, CCE, Brigham & Women’s Hospital
Jim Philip MD, Brigham & Women’s Hospital
Steven Dain MD, University of Western Ontario
Jim Fackler MD, Johns Hopkins
Moderator: Julian Goldman MD

Cooper’s introductory remarks went to his motivation for participating in the APSF and his interest in medical device interoperability. He described two experiences: in one, the son of a friend went into respiratory arrest while on a PCA. In another, a friend was in the hospital, and Cooper saw first hand both the best and the worst (disruption in care, interrupted communications) of hospital care.

“Hospitals today are not safe – if you go into the hospital, take someone with you. One of the biggest problems is that technology advancement has outstripped the infrastructure (how the technology is deployed and used) to ensure safety. The technology with perhaps the biggest potential impact on patient safety is the interoperability of medical devices.”

After Jeff Cooper described the need, Sandy Weininger noted some accepted approaches to developing high confidence systems. Looking into the future, he also offered some rhetorical questions on creating safe and effective systems.

“Absolutely safe” – how do you define it? How do you implement it? The FDA receives more than 100,000 reports per year (users report voluntarily; manufacturers are required to report), and the agency estimates that figure represents just 2% of actual incidents. Even the fraction of events the FDA does receive is hard to analyze and understand.
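The scale implied by those two figures is worth making explicit; a quick back-of-the-envelope sketch using only the numbers quoted above:

```python
# Scale check using only the figures from the talk:
# ~100,000 reports per year, estimated to be ~2% of actual incidents.
reports_per_year = 100_000
reporting_rate = 0.02  # FDA estimate: reports capture roughly 2% of incidents

# Implied total incidents per year, if the 2% estimate holds.
estimated_incidents = reports_per_year / reporting_rate
print(f"Implied incidents per year: {estimated_incidents:,.0f}")  # 5,000,000
```

That is roughly five million incidents a year going unanalyzed, which puts the “hard to analyze and understand” comment in perspective.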

Best practices for interoperability and systems integration include:

  • Clinical requirements are necessary to understand what a complex medical device system is intended to do
  • “Interoperability” must be described using a rich set of scenarios/use cases
  • Must address safety, security, effectiveness
  • Look at current clinical challenges and hazards, mitigations, future solutions and new risks

Given that this whole interoperability thing extends way beyond the actual “medical device,” Sandy went on to note the legal definition of a medical device:

An instrument, apparatus, implement, machine, contrivance, implant, in vitro reagent, or other similar or related article, including a component part, or accessory which is intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease, in man or other animals, or intended to affect the structure or any function of the body of man or other animals, and which does not achieve any of its primary intended purposes through chemical action within or on the body and which is not dependent upon being metabolized

Device vendors, systems integrators and hospitals should study this definition.

Weininger also asked some important questions, like, “How do you enable innovators (who lack the resources to create their own vertically integrated systems) to contribute innovations that work safely?” and “How do you validate a system that is greater than the sum of its parts?”

He contrasted how one builds a conventional medical device with the additional factors that should go into the broader scale systems resulting from interoperability and systems integration. The issue is a systems engineering challenge that requires the involvement of numerous specialties. This is not new – aviation is a great example. The best solutions require a multidisciplinary approach, with skills that include control systems design, operations research, safety engineering, reliability engineering, interface design, cognitive systems engineering and human factors, in addition to communication protocols and security engineering.

Risk analysis is essential to designing a safe system. Weininger suggested the IEC 60300-3-9: Risk Analysis of Technological Systems standard as offering a good methodology for risk analysis.

Another important factor is the proper identification and description of requirements. Two common approaches are clinical scenarios and use cases. Clinical scenarios are descriptions of the current clinical situation and related problems identified from clinical stories, adverse event reports, near misses, etc. Use cases are a detailed look at a specific part of the clinical workflow. A workflow may not be required for a use case, but is helpful for examining human interaction.

Weininger wrapped up asking, “So, when is a system validated?” The answer for a stand-alone embedded medical device is unambiguous. When medical devices and information systems are combined, the answer is much less clear.

Jennifer Jackson discussed the recent popularity of ethnographic analysis for capturing medical device requirements. But even with these expensive and supposedly sophisticated requirements elicitation techniques, Jackson noted, providers are still struggling to find good connectivity solutions in the absence of vendor offerings that meet their needs.

The vast majority of connectivity solutions fall considerably short of the safe and reliable systems hospitals need – systems that are easy to use, and easy to maintain and support. One of the key reasons for this is a dearth of good requirements.

Jackson described clinical engineers as the interface between medical device vendors and regulators on one hand, and nurses and physicians on the other.

She noted that medical devices have traditionally had a 7-10 year lifespan. With the adoption of general purpose computer components into medical device systems, this length of time is falling. Vendor software updates (especially those driven by operating system patches or connectivity problems) are most problematic. The need for software updates is very unpredictable, and software releases frequently take 6 to 12 months – way too long. Vendor suggested workarounds place considerable burden on providers as they retrain users to the new operation, and because workarounds usually require some new manual steps that can introduce user error and lower productivity.

Current interoperability options are typically proprietary end-to-end systems. This is good because a single vendor provides quality and design control, and there’s a predictable market for the vendor. For customers, there is limited choice and “best of breed” is reduced to “what we have to offer.”

Jackson also noted a structural weakness that was a theme of the conference: that interoperability is usually a post-planning, post-market thought. Consequently, the solution is usually compromised by:

  • Unreliable performance, slow (and too frequent) software updates
  • Poor vendor ability to support integrated systems
  • Poor design with multiple points of failure that are – what do you know, prone to failure.

Jackson used an example of a ventilator-patient monitor alarm integration project they did in an ICU. The layout of their ICU inhibits the ability to hear ventilator alarms.

The solution from the vendor was “dongle-ware,” an external module that connects to the serial port on the back of the device. This approach works most of the time, but the interface is brittle with many links in the chain of connectivity that can render the interface inoperable.

Jackson described currently unmet connectivity requirements in a slide generously titled: Interoperability Tomorrow. In this scenario, vendors have a real competency in systems integration, providing software updates in a timely fashion, with technical support that understands general purpose computing environments in addition to medical devices and clinical environments.

Even with improved vendor execution, we’re still limited to what the device will output (not everything), and the interface still represents multiple points of failure in the system. This current state of connectivity adds unplanned costs to installations. Interfaces are expensive (plus cabling costs), and the required hardware takes up a lot of space (often unallocated at the design stage).

Kaiser has estimated simple EMR connectivity costs at $10,500 per bed – not including the CE/IT labor to configure, support and maintain the integration.

The perfect solution uses a standardized interface language embedded in the medical devices. No dongles. Systems integration, clinical and technical support tools are incorporated with other utilities. No dongles. And these capabilities are offered as part of the basic connectivity offering, rather than positioned as optional, higher cost features.

The cost to retrofit their hospital is prohibitive. The cost of moving forward when purchasing new technology is also very high.

Jim Philip MD is a clinical anesthesiologist and the director of bioengineering at Brigham & Women’s Hospital. His contribution to the panel discussion was a case study highlighting the potential benefits of high reliability and interoperability.

The case is a laparoscopic cholecystectomy on a middle aged female with no other medical problems.

Preparation included:

18 Gauge IV catheter
Monitors applied and connected
ECG
NIBP (q 1 minute)
Oxygen saturation (Pulse Oximeter)
Airway gas sampling and monitoring for oxygen, CO2, and anesthetic agent

Anesthetization:

Sodium Pentothal for Induction of general anesthesia
Tracheal tube inserted under direct vision
Inhalation anesthesia administered via Anes Delivery System
Moderate-duration muscle relaxant (Vecuronium 4 mg)
18 F Gastric Tube passed via mouth
Stomach emptied of gas and liquid

Abdominal Insufflation (where they inflate the abdomen for visibility):

GI Surgeon, trained in laparoscopic surgery, division director, began surgery
15:28:16, BP 128/66 and pulse 90 / minute,
Minute Ventilation = 5.7 L/min, ET pCO2 = 30
Veress needle placed in the abdomen for insufflation
Trocar with self-retracting incisor
Scope inserted

Monitoring Observations:

15:29:20, the NIBP monitor failed to record a blood pressure
15:29:40, peak inspiratory pressure (PIP) rose as peritoneal pressure was raised with insufflation
15:30:00, minute ventilation (VE) = 5.7 L/min and constant
15:30:40, pulse oximeter failed to record SpO2 or heart rate, but pulse was palpable
15:31:00, end-tidal CO2 fell from 30 mmHg to 18 mmHg, heart rate constant at 90 / minute, peripheral pulse not palpable, carotid pulse weak.

Clinical Communication:

15:31:00
Anes: “Was there bleeding when you inserted the trocar?”
Surgeon: “a tiny bit”
Anes: “If there was bleeding, would you see it?”
Surgeon: “No, not able to visualize”
Anes: “I think you have major bleeding”
Surgeon: “What should I do?”
Anes: “Cut now”
Surgeon to Scrub Tech: “knife”
15:31:10 Surgeon Action: Incision

Action:

15:31:10 Abdominal Exploration Incision
Surgeon Observation: Blood poured out, 2 L in suction
Surgeon Action: Finger on palpable site of bleeding – the aorta
Call for Vascular Surgeon to assist
Surgeon: “How the ‘h’ did you know that?”

Anes: “That’s why we monitor carefully and continually and try to integrate it all”.

Resuscitation:

15:31:20 Anes Action: Call for help (more IV)
3 L Lactated Ringers solution over 10 minutes
Albumin 75 g
15:45:03 NIBP = 44/31, ETCO2 = 24 mmHg
2 U Packed Red Blood Cells, more later
15:47:20: pulse 117, NIBP = 83/47, ETCO2 = 24
15:50:00 ABGs: pO2 = 468, pCO2 = 37, BE = -4, Hct = 13%
16:05:00 NIBP = 100/60
Event declared under control
17:00:00 Emergence with patient awake and alert

Resolution:

PO Day 2 Ileus resolving
PO Day 4 Patient discharged home, alive and well

Philip noted that until they could detect the problem and make the diagnosis, the patient was at considerable risk. Here’s a summary of the timeline:

15:28:16 Event
15:29:20 First sign
15:31:00 Diagnosis
15:31:10 Definitive treatment
00:02:54 Event to Treatment
00:01:50 First sign to Treatment
00:00:10 Diagnosis to Treatment
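The interval arithmetic in that timeline can be checked directly. A small Python sketch using the timestamps from the case:

```python
from datetime import datetime

# Timestamps taken from the case timeline above.
FMT = "%H:%M:%S"
times = {name: datetime.strptime(stamp, FMT) for name, stamp in [
    ("event", "15:28:16"),
    ("first_sign", "15:29:20"),
    ("diagnosis", "15:31:00"),
    ("treatment", "15:31:10"),
]}

# The three intervals reported in the summary.
event_to_treatment = times["treatment"] - times["event"]            # 0:02:54
first_sign_to_treatment = times["treatment"] - times["first_sign"]  # 0:01:50
diagnosis_to_treatment = times["treatment"] - times["diagnosis"]    # 0:00:10
```

Under three minutes from event to definitive treatment, and only ten seconds from diagnosis to treatment, which is the integration payoff the case is meant to illustrate.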

This clinical scenario (which would be a worthy addition to the requirements for an anesthesia system) demonstrated how a large amount of data, used together, achieved a good outcome. Much of this data was integrated in a particular monitor developed at Partners just for this kind of situation. That monitor is no longer made and is not available. Data from multiple sources, integrated and presented coherently, is essential. Hospitals can no longer afford to build their own systems of this type; universal interoperability is required, and desperately needed, to bring these capabilities to the broader market.

Steven Dain MD, Director of Anesthesia Informatics at the University of Western Ontario, provided a historical perspective on interoperability. Back in 1990 he was asked to write a program to collect blood pressure and heart rate from an NIBP monitor every 2.5 minutes, and SpO2 from an oximeter every 10 seconds, and put it all into an Excel spreadsheet. Easy, right? After this character-building experience, he wrote the paper: Anesthesia Monitoring and the Computer Interface: The Need for Standardization of Communications Protocols.

The NIBP monitor had an RS232 interface, but the SpO2 device was a custom interface and required a clinical engineering project to get access to the data link.
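Even the scheduling half of that 1990 project is worth sketching. Here is a minimal, hypothetical Python version of the merge-and-log loop; the reader stubs stand in for the device links that, as Dain found, each required their own custom access work:

```python
import csv
import io

NIBP_INTERVAL = 150   # seconds: blood pressure and heart rate every 2.5 minutes
SPO2_INTERVAL = 10    # seconds: one oximeter reading every 10 seconds

def poll_schedule(duration_s, read_nibp, read_spo2):
    """Walk a shared clock, polling each device at its own interval
    and merging the readings into one time-stamped row stream."""
    rows = []
    for t in range(0, duration_s + 1, SPO2_INTERVAL):
        row = {"t": t, "spo2": read_spo2()}
        if t % NIBP_INTERVAL == 0:
            bp, hr = read_nibp()
            row["bp"], row["hr"] = bp, hr
        rows.append(row)
    return rows

def to_csv(rows):
    """Flatten the merged rows into CSV text, the 1990 'Excel spreadsheet'."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["t", "spo2", "bp", "hr"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Stub readers stand in for the RS-232 link and the custom oximeter interface.
rows = poll_schedule(300, read_nibp=lambda: ("120/80", 72),
                     read_spo2=lambda: 98)
```

The scheduling is the trivial part; the point of Dain’s story is that getting `read_nibp` and `read_spo2` to exist at all was the real project.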

Dain asked, “What’s changed in 14 years?” Not much. We still have proprietary electronic interfaces; it is still difficult to connect medical devices; expensive custom software is needed for each device; and there are still no usable standards for the electrical interface, syntax or semantics.

There have been lots of standards committees: the International Electrotechnical Commission (IEC) committees and working groups, ISO TC Health Informatics, the ISO TCs for medical equipment, IEEE, SNOMED, MSHUG, HL7, IHE, IOTA, HIMSS, WHO, and various national agencies and standards bodies (see Kolodner’s presentation). But these groups often work in isolation, and frequently at cross purposes.

In addition, past attempts at medical device communications standardization have proven unsuccessful. There has also been a lack of a multidisciplinary needs analysis and use scenarios. And even with ethnographers, efforts to completely understand the complex clinical environment in which healthcare providers work have failed.

In addition to the above, Dain suggested that a multidisciplinary approach is needed to design, manufacture, sell and support interoperable systems. Most vendors still lack the core competencies to provide effective connectivity and interoperability solutions.

With both clinical and vendor experience, Jim Fackler MD, an intensivist at Johns Hopkins, offered a different perspective. He noted that, “Dr Kevorkian does not hold a candle to what I can do as a participant in the current health care delivery system.”

Fackler went on to describe the hostile clinical environment in which he and his peers provide care. Patients are surrounded by myriad medical devices where nothing talks to anything, and alarms are simple threshold alarms that give no indication of where they come from (equipment locations vary). He showed photos of his clinical environment, where as many as 350 data elements can come off of each patient – in a unit with a total of 26 kids.

A fundamental problem with patient safety in hospitals is the fact that humans can only handle about 7 things at once. This bit of groundbreaking cognitive psychology research dates from 1956. And one bit of information is the amount we need to make a decision between two equally likely alternatives. It is no wonder patient safety is in the state it is, when the clinical environment so far exceeds known human limitations.
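The information-theoretic point can be made concrete with a back-of-the-envelope sketch. The figures are the ones above; the “overload factor” is my illustration, not a measure from the talk:

```python
import math

def bits_to_decide(n_alternatives):
    """Bits needed to choose among n equally likely alternatives."""
    return math.log2(n_alternatives)

# One bit resolves a single choice between two equally likely alternatives.
one_decision = bits_to_decide(2)  # 1.0

# Figures from the discussion: ~7 items in working memory (1956 research)
# versus up to 350 data elements per patient in Fackler's unit.
WORKING_MEMORY_ITEMS = 7
DATA_ELEMENTS_PER_PATIENT = 350

# Crude ratio of data streams to what one clinician can hold at once.
overload_factor = DATA_ELEMENTS_PER_PATIENT / WORKING_MEMORY_ITEMS  # 50.0
```

Even treating every element as a single binary decision, the clinician faces roughly fifty times more concurrent signals than working memory can hold.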

More recent research into what differentiates chess masters from mere mortals reinforced the 1956 findings. It found that ordinary adults could memorize and then recognize random patterns as well as masters could, but that chess masters recognized actual patterns of chess pieces at a much higher rate. Part of physicians’ training is to turn them into something like a chess master, so they can recognize patterns of symptoms to make a diagnosis.

The ability to correlate and process data in one’s head is best in surgery, where there is a 1:1 anesthesiologist-to-patient ratio. In an NICU like Fackler’s, the ratio rises to 1:27. In private practice the ratio averages 1:5,000. Is it any wonder that more needs to be done to improve the clinical environment?

The first line of care and vigilance in the hospital is the nurse, who receives little or none of the “chess master” type training received by physicians.

Interoperability complexity increases as the scope of patient and care delivery grows. For example, when a weight scale used for home monitoring breaks, the patient can’t just go to Bed Bath and Beyond for a new one.

Questions to the panel:
What is needed to drive the adoption of interoperability standardization? Capitalization versus standards imposed on the market. Providers must demand change when buying new products and systems.

A plea to industry: you will create proprietary systems with artificial intelligence providing decision support. You will be automating exactly what we have now – but physicians don’t have the data they need now. Without putting all data onto an interoperable “bus,” the potential to impact patient safety and outcomes will be severely limited.

One comment used aerospace as an analog to the medical device industry, suggesting that aerospace has a small number of large integrated vendors providing most solutions. That is a false assumption – aerospace has many small subcontractors and contract engineering services companies; all large aerospace systems (like a new plane, or even new major components) are constituted from many subcontractors and contract engineering shops under the management of a general contractor. In fact, unlike medical devices, where subcontractors are hidden from the market, aerospace projects are very open about the many subcontractors that participate.

Someone in the audience who was in aerospace before health care noted that airplanes are very complex, like medical device systems, but unlike medical devices they are not reconfigurable – they are integrated once over 10 or 15 years of development.

It was suggested that the industry is at a tipping point, where manufacturers developing and maintaining large and unwieldy proprietary systems could give way to multi-vendor interoperability with a focus on core technologies. This would allow medical device vendors to compete on what they’re best at, rather than on the general purpose computing systems used to provide connectivity.

Open source software on standard platforms was noted as a potential solution (for both vendors and providers) that had yet to be mentioned.