After missing last year’s meeting in Tampa, I’m back at AAMI – one of the two events that I try to attend every year. The focus on connectivity has increased: this year there is a full track devoted to the topic.
One change this year is that my blog posts from the conference will be appearing on the Medical Electronic Design magazine blog, found here. As usual, I’ll also be taking lots of photos, some of which may also be posted on the MED blog. Eventually, most of the photos (the good ones that aren’t confidential) will be posted on my Flickr account, here.
I was too beat to catch the breakfast symposium. My day started with the session titled…
Designing for the unforeseen: preparing your facility for evolving technologies
Presenters: Barrett Franklin and Sudhakar Nagavalli of KJWW Engineering; Valmik Thakare, Christner; and Dennis Minsent, OHSU.
Major trends that they see:
- Diagnostic imaging – portability
- Clinical information systems
- Video capture
- Transparency (RTLS)
Diagnostic imaging is moving out of conventional settings into surgery, procedure rooms and intensive care. This impacts workflow and, consequently, workflow automation. Imaging is becoming an enterprise application, going beyond distributing images on an enterprise basis to include image acquisition anywhere and any time.
Patient monitoring is transitioning from disparate stand-alone systems to an enterprise system. This creates specific infrastructure requirements, encompassing wired and wireless networks. Rising patient acuity and an increasing trend to spread higher-acuity patients out to their medical services have increased the need for pervasive monitoring capabilities.
Integration was grouped into three different applications. Integration started in the operating room, and this trend is evolving into a unified enterprise system. There is a growing requirement for disparate systems to work in concert, including patient monitoring, ventilators, infusion pumps, defibrillators and information systems. This gives rise to challenges in defining a coherent network infrastructure.
Clinical Engineering Symposium
The theme of this year’s Clinical Engineering Symposium (and also the title) is: Capturing the Heart and Mind of the Clinician — The Art and Science of Human Factors for Medical Systems.
Ed Israelski, the Human Factor Program Manager at Abbott, talked about the application of human factors to the design of medical devices. He presented a basic framework for incorporating human factors engineering (HFE) into the product development cycle.
He noted the importance of developing quantified usability objectives and of testing prototypes (both product models and user interface simulations) to improve usability and safety through design iterations.
The FDA design history file is required to include HFE in the design process, but the FDA does not prescribe specific methodologies for implementation. Consequently, most HFE efforts in medical device product development are very limited. You can read more, including a survey on medical device vendor software engineering methods, in this report (scroll down to the subhead, A Survey of…) from last year’s conference, Improving Patient Safety through Medical Device Interoperability and High Confidence Software.
I was in hog heaven at this year’s AAMI meeting. Connectivity was a major theme, and during every time slot in the program there was at least one presentation dealing with connectivity. Opposite my own presentation Monday afternoon, there was one I really wanted to see that dealt with alarm notification.
Lots of discussion centered around the evolving role of biomeds and clinical engineers and the kinds of training they might need in the future. There were rumblings from some in the ACCE who wanted to hold their annual meeting at HIMSS next year rather than AAMI. There certainly is a life-critical systems role that needs to be filled, and clinical engineers could fill that role. To this observer, it seems that clinical engineers will slowly become marginalized if they do not move in the “systems” direction. Even biomed techs will need IT skills to manage and support increasingly complex and pervasive medical device systems.
During the GE-sponsored breakfast, there was a session on managing RF in your hospital. Reportedly the perennial “WMTS versus ISM” debate reared its tired ugly head. For many reasons mentioned here in the past (just search “WMTS” in the box in the left column), the WMTS bands will never have the bandwidth or (more importantly) the management tools to support more than a small portion of the wireless medical devices in a hospital. Only the usual suspects can even afford to develop the proprietary radios required for WMTS, which is why 802.11 has seen so much uptake with device vendors.
But the inherent limitations of WMTS do not make 802.11 a slam-dunk. In fact, recent experience has highlighted the need for more rigorous RF engineering, wireless LAN design, and ongoing RF and network monitoring to ensure a reliable network. Hospitals are perhaps the most hostile environment for wireless networking. When it comes to networks, hospitals are faced with both selecting a hardware vendor that best meets their needs and a VAR (value added reseller – the indirect reps used by IT vendors to sell their products) who really knows what they’re doing. Only the best VARs can design and install a reliable network that supports all the big apps: data, wireless VoIP, positioning, and medical devices.
In a nod to presidential politics, “It’s the workflow, stupid.” To most, connectivity is about extracting data and moving it some place else. The real objective is to automate workflow – and how connectivity is implemented has a huge impact on what workflows it supports, and ultimately the usability of the system. A fundamental piece of this workflow is patient context, the association between a patient, their medical devices, and the data that comes out of them. Patient context remains a concept that’s poorly understood by most users and vendors. Many still try to fudge patient context by associating the patient to a port number or bed location. Guess what? Patients move, and mobile devices especially must establish patient context in the device itself to be safe and effective. I would love to share some of the fantasy-based risk analysis and mitigation documents for certain connectivity features that I saw this week.
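The difference between device-resident patient context and location-based association can be sketched in a few lines. This is a hypothetical illustration – the class and field names are mine, not any vendor’s – but it shows why a bed-to-patient map fails the moment a patient moves:

```python
# Hypothetical sketch contrasting two ways to establish patient context.
# Names and structures are illustrative, not any vendor's API.

class Device:
    """A mobile medical device that carries its own patient association."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.patient_id = None  # set at the point of care

    def associate(self, patient_id):
        self.patient_id = patient_id

    def reading(self, value):
        # Context travels with the device, so the reading stays
        # correctly attributed even after the patient changes rooms.
        return {"patient": self.patient_id,
                "device": self.device_id,
                "value": value}


def reading_by_bed(bed_map, bed, device_id, value):
    # Location-based association: the bed-to-patient map must be updated
    # on every transfer, or data is charted to the wrong patient.
    return {"patient": bed_map.get(bed), "device": device_id, "value": value}


pump = Device("pump-07")
pump.associate("MRN-12345")
print(pump.reading(72))  # attribution survives a room change

beds = {"ICU-3": "MRN-12345"}
print(reading_by_bed(beds, "ICU-3", "pump-07", 72))
# If the patient moves to ICU-5 and the map isn't updated, the next
# reading is silently attributed to whoever now occupies ICU-3.
```

The second approach is workable in a fixed-bed ICU, which is why vendors get away with it there; it breaks down for ambulatory patients and mobile devices.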
All of this gets to another big change reflected in this week’s conference. Stand-alone embedded products are evolving into real systems that extend functionality well beyond the box itself. This “systemization” of medical devices requires some changes in thinking. No longer can you focus on building safe and effective boxes, plug them together with other stuff after the fact, and be sure the result is still safe and effective. Nor can you manage and support interconnected devices simply by maintaining the device – the entire system must be configured and maintained as a whole.
One of the good things to come from the increased involvement of IT in device connectivity is their insistence on a test system to support the “production” system. They do this with all their software systems. An indicator that connectivity is an afterthought is the total absence of test fixtures for an integration lab. Another symptom is the scarcity of such labs in hospitals and the limited capabilities of most manufacturers’ verification labs. As systems grow and become more complex, hospitals will increasingly demand support for these labs – in the absence of test fixtures, that means customers with clout will insist on indefinite loaners so they can effectively maintain their systems.
During the ACCE Clinical Engineering Symposium Saturday morning, Bridget Moorman referred to medical device connectivity as “brittle.” I know more than one person had an epiphany upon hearing that term. Any change, no matter how small, along the chain from medical device to target computing device renders the device interface inoperable. Device firmware changes, pin-outs, cable connections, terminal server configurations, network configurations, and interface configurations – on either side of the interface – all result in failure. Planning for these interfaces (hopefully by the vendor before product development) must take this brittleness into account. At the very least, customers must be able to monitor their connectivity all the way to the device, not just a server or terminal server.
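One practical consequence of that brittleness is that monitoring has to walk the whole chain, hop by hop, so a failure anywhere along it is localized rather than just reported as “no data.” A minimal sketch, with hop names and check functions as illustrative placeholders:

```python
# Hypothetical sketch: monitor a device-to-server chain hop by hop, so a
# single change anywhere along it (firmware, cabling, terminal server
# config, interface config) is caught and localized quickly.

def check_chain(hops):
    """hops: list of (name, check_fn) pairs, ordered from the device
    outward. Returns the name of the first failing hop, or None if
    every hop passes."""
    for name, check in hops:
        if not check():
            return name
    return None


# The lambdas stand in for real probes (serial heartbeat, TCP connect,
# interface-engine message counts, etc.).
chain = [
    ("device serial link", lambda: True),
    ("terminal server", lambda: True),
    ("interface engine", lambda: False),  # e.g. a changed HL7 config
]
print(check_chain(chain))  # reports the first broken hop
```

The point is not the code but the discipline: each link in the chain gets its own check, all the way to the device, not just to a server.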
Finally we come to FDA regulatory issues. I met an FDA representative in the exhibits. She works on the Issues Management Staff, a tiger team that addresses patient safety related issues that reach a point where they must be dealt with. Can you guess one of the simmering issues that may soon become an Issue? That’s right, medical device connectivity. Much of the current regulatory framework (both vendors’ regulatory strategies and how the FDA manages the process) is based on standalone medical devices, and “oh, by the way, it gets plugged into all this other stuff to do… stuff.” We can expect to see regulatory perspectives shift increasingly to a systems view, especially when multiple vendors are involved.
The contortions many vendors go through to avoid FDA regulation are a symptom of this spreading systemization of medical devices. While the FDA has a responsibility to ensure safety and effectiveness, they are also responsible for accomplishing their mission in a way that doesn’t drive undeserving vendors out of business or stymie the development of innovative solutions that promise even better safety and effectiveness. Don’t expect them to accept the status quo for long. I ask everyone who’s skirting the regs if they are committed to building a quality product, and the answer is inevitably yes. All it usually takes to get a 510(k) is compliance with a basic quality system (the FDA’s Quality System regulation) and 60 days for the FDA to process your 510(k) paperwork. And yet the reticence to be regulated suggests that things like prototype code make it into finished products all too often.
The crew from Lehigh Valley presented their experience creating a telemedicine system called aICU (advanced ICU). John Sokalsky led off, describing how their aICU concept leverages intensivists and critical care nurses in a remote location to serve more ICU patients. The system improves outcomes and reduces costs – always good things. This system integrated their CPOE, meds administration, real-time documentation (charting) and medical device data via a critical care information system, and finally a camera/digital video system. The strategic initiative was to create and implement an off-site “tele-intensivist” program. This program provides round-the-clock intensivist coverage of critical care units throughout their health care system. Results showed improved patient outcomes and reduced overall costs by managing changes in patient conditions quickly and effectively.
The project was led by Stephen Matchett, MD, Chair and Project Sponsor, and included the following team members:
- I/S Applications and Administration
- Clinical Services Administration and leadership
- Respiratory Therapy
- Administrative Planning
- Clinical Engineering
- Others invited on an as-needed basis
The Lehigh Valley system is based on an application from iMDsoft. Legacy devices (or devices that do not include connectivity) use serial device drivers written by iMDsoft. Patient context for devices with serial interfaces was done by bed location. [This works fine for an ICU implementation where patients rarely move – devices connected to lower-acuity patients should establish patient context in the device itself.] Data from devices with built-in network connectivity was received via HL7 from the device vendor’s HL7 server.
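To make the HL7 leg of that architecture concrete, here is a minimal sketch of parsing an HL7 v2 ORU (observation result) message like the ones a device vendor’s server might emit. The sample message and field choices are my own illustration – a real feed requires the vendor’s interface spec and a proper HL7 library, not string splitting:

```python
# Hypothetical sketch: extract device observations from an HL7 v2
# ORU^R01 message. Segments are CR-delimited; fields use "|".

def parse_oru(message):
    results = []
    patient_id = None
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            patient_id = fields[3]  # PID-3: patient identifier
        elif fields[0] == "OBX":
            # OBX-3 = observation identifier, OBX-5 = value, OBX-6 = units
            results.append({"patient": patient_id,
                            "observation": fields[3],
                            "value": fields[5],
                            "units": fields[6]})
    return results


# Illustrative message; facility and identifiers are made up.
msg = ("MSH|^~\\&|MONITOR|ICU|CIS|LVH|202001010800||ORU^R01|1|P|2.3\r"
       "PID|1||MRN-12345\r"
       "OBX|1|NM|HR^Heart Rate||72|bpm\r"
       "OBX|2|NM|SPO2^Oxygen Saturation||98|%")
print(parse_oru(msg))
```

Note that patient identity here comes from the PID segment supplied by the sending system – which is exactly where the patient-context problem resurfaces if the sender associated the device by bed location.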
They use HP OpenView to monitor device connectivity as far as the Lantronix terminal server for serial-based devices. Devices with network connections can be monitored by OpenView up to the medical device vendor’s server. The link between the device and the next step (terminal server or device vendor’s server) is not visible to IT for monitoring. They usually get warning from biomedical engineering when new devices arrive or firmware upgrades are planned, and they test in advance of deployment.
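The terminal-server leg of that monitoring amounts to a periodic reachability probe. A minimal sketch of one, in the spirit of what an OpenView-style TCP service check does – the host names and port numbers below are invented for illustration:

```python
# Hedged sketch: TCP reachability probe for serial terminal servers.
# Host names and ports are made-up examples, not a real deployment.
import socket

def is_reachable(host, port, timeout=2.0):
    """Attempt a TCP connect; True if the port accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Poll each terminal server's raw-serial TCP port on a schedule.
terminal_servers = {"ts-icu-01.example": 3001, "ts-icu-02.example": 3002}
for host, port in terminal_servers.items():
    status = "up" if is_reachable(host, port) else "DOWN"
    print(f"{host}:{port} {status}")
```

Note that a probe like this only proves the terminal server answers; it says nothing about the serial link from the terminal server to the device, which is precisely the blind spot described above.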
An interesting part of their description of the project includes a test environment. During deployment, this environment was a “simulated ICU” that included back-to-back TNICU/MICU beds in test, plus four additional beds at remote ends of the ICU. Beds were added until the first 28 ICU beds were online. This required continuous coordination with Facilities and Bed Management. As the first 28-bed unit prepared to go live, additional units were wired and placed in test. This approach offered the following advantages: facilitation of training by department prior to go-live, and identification and correction of system, device and workflow issues. Once fully deployed, they use spare devices (they’re usually available) to create a test environment as needed.
Surprisingly, they’ve had problems with some vendors getting the data required to develop a serial port device driver.
Christina Roberts, on the IT side, talked about the nursing and clinical engineering relationship. At Lehigh Valley, the IT department facilitates the coordination between nursing and biomedical engineering. The IT department takes calls 24×7 and provides tier 1 support for the aICU (and other clinical information systems). Depending on the problem, they will call biomedical engineering.