Clinical Engineering Symposium
The theme of this year’s Clinical Engineering Symposium (and also the title) is: Capturing the Heart and Mind of the Clinician — The Art and Science of Human Factors for Medical Systems.
Ed Israelski, the Human Factors Program Manager at Abbott, talked about the application of human factors to the design of medical devices. He presented a basic framework for incorporating human factors engineering (HFE) into the product development cycle.
He noted the importance of developing quantified usability objectives, and of testing prototypes (both product models and user interface simulations) to improve usability and safety through design iterations.
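As a rough illustration (not from the talk), a quantified usability objective can be expressed as a pass/fail check against prototype test results; the metric names and thresholds below are hypothetical:

```python
# Hypothetical check of quantified usability objectives against
# prototype usability-test results (illustrative numbers only).

def meets_objectives(results, objectives):
    """Map each metric to True if the measured value meets its
    objective (assumes higher-is-better metrics)."""
    return {metric: results[metric] >= target
            for metric, target in objectives.items()}

# Example: hypothetical objectives for an infusion-pump programming task.
objectives = {"task_success_rate": 0.95,   # fraction completing without error
              "first_try_success": 0.80}   # fraction correct on first attempt
results = {"task_success_rate": 0.97, "first_try_success": 0.75}

print(meets_objectives(results, objectives))
# {'task_success_rate': True, 'first_try_success': False}
```

A failed objective like the one above would feed the next design iteration rather than block the project outright.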
The FDA design history file is required to include HFE in the design process, but the FDA does not prescribe specific methodologies for implementation. Consequently, most HFE efforts in medical device product development are very limited. You can read more, including a survey on medical device vendor software engineering methods, in this report (scroll down to the subhead, A Survey of…) from last year’s conference Improving Patient Safety through Medical Device Interoperability and High Confidence Software.
Dr. Israelski closed with a case study of the new Hospira Symbiq infusion pump (then still part of Abbott).
Frank Painter and Mark Bruley presented next about accident investigations and the impact of human factors design. They showed numerous tragic incidents where poor design resulted in patient injury or death.
Frank noted that it is extremely rare for a stand-alone medical device to be involved in a sentinel event. In almost all cases, the medical device is used in conjunction with accessories, other systems, users, or the patient.
Mark then turned to accident investigations, describing the investigative process, what to look for, and numerous examples.
Izabella Gieras and Brian Vargo, from Beaumont Hospitals, presented the provider perspective on technology and workflow assessment using human factors and clinical engineering tools. They have five clinical engineers and one human factors engineer who are responsible for new equipment evaluation and incident investigation protocols. They’ve even got state-of-the-art simulation facilities.
In her introduction, Izabella noted that they focus on interconnectivity and integration of various medical technologies for HFE and safety. Starting with the patient, environment, and users, the team at Beaumont evaluates:
- Warnings, labels and alarms
- Text and graphics
- Forcing functions (the physical inability to complete an operation when not intended)
The methodologies and tools include:
- Contextual inquiry – interviewing users and direct observation
- Cognitive walkthrough – an overall task broken down into specific steps
- Heuristic analysis – user interface evaluation
- Focus groups and surveys
- Usability testing – formal methodologies that quantitatively measure usability and seek to uncover potential for unanticipated user errors
There is special focus on workflow analysis. Beaumont’s approach here appears to give more consideration to existing technologies and systems integration requirements, rather than use cases and how the products under evaluation actually match or transform existing workflow.
Izabella provided a workflow analysis from their neuro unit, where they use SpectraLink phones for alarm notification. They are digging into the specific integration between the wireless handsets, the monitoring system, and clinical practice on the unit. They are using direct observation and data gathering to quantify alarm volumes and types, and specific use scenarios.
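The kind of alarm-volume tallying described above could be sketched as follows; the log format and alarm types are my assumptions, not details from the presentation:

```python
from collections import Counter

# Hypothetical observation log: (hour_of_day, alarm_type) pairs
# collected during direct observation on the unit.
observed_alarms = [
    (9, "SpO2 low"), (9, "lead off"), (10, "SpO2 low"),
    (10, "asystole"), (11, "lead off"), (11, "SpO2 low"),
]

# Tally alarms by type and by hour to quantify volume and mix.
by_type = Counter(alarm for _, alarm in observed_alarms)
by_hour = Counter(hour for hour, _ in observed_alarms)

print(by_type.most_common())
# [('SpO2 low', 3), ('lead off', 2), ('asystole', 1)]
print(by_hour)
```

Even a simple tally like this makes it possible to compare alarm load across shifts and to see which alarm types dominate notifications to the handsets.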
Brian Vargo presented a case study on epidural pump evaluation. Their traditional approach was to do an on-site evaluation of the actual product. This provides great “real world” experience with the product, but there are many limitations. Feedback is difficult to capture when products are used in daily practice, and frequently requires someone dedicated to capturing it. The time frame for these evaluations is long, typically two weeks. There is also considerable expense, including training, lost staff productivity, and any IT infrastructure or systems integration required.
Brian described Beaumont’s alternative to the conventional evaluation: a structured heuristic evaluation that simulates use scenarios designed to highlight potential usability and safety issues. They create a test plan and set up a test environment (simulating an actual clinical environment, perhaps by using an empty patient room). Executing the evaluation means moving through the plan in an orderly manner, documenting everything along the way. The previous two-week evaluation becomes a concentrated day-and-a-half assessment.
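One way to document such a structured evaluation is to record each finding against the scripted scenario that surfaced it, with a severity rating so issues can be ranked for follow-up; the severity scale and scenario names here are hypothetical:

```python
# Hypothetical findings log from a structured heuristic evaluation:
# each entry notes the scripted scenario, the issue observed, and a
# severity rating (1 = cosmetic ... 4 = potentially catastrophic).
findings = [
    {"scenario": "program bolus dose", "issue": "dose units ambiguous", "severity": 4},
    {"scenario": "replace cassette", "issue": "latch mechanism unclear", "severity": 2},
    {"scenario": "program bolus dose", "issue": "confirmation screen easily skipped", "severity": 3},
]

# Rank findings by severity so the riskiest issues surface first.
ranked = sorted(findings, key=lambda f: f["severity"], reverse=True)
for f in ranked:
    print(f"[sev {f['severity']}] {f['scenario']}: {f['issue']}")
```

Sorting the documentation this way gives the evaluation team a prioritized punch list to take back to the vendor or into the purchase decision.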
Next up, Yadin David of Biomedical Engineering Consultants presented a case study on impacting medical errors in a telehealth program.
During his introductory remarks about the importance of HFE, Yadin noted problems at the point of care that are not being addressed. There is an absence of HFE consideration at the point of care, whether from a user or a systematic perspective. Vendors continue to pursue incompatible point solutions. The proliferation of these multiple different systems – all focused on the same patient – creates considerable human factors challenges that result in well-known problems like alarm fatigue and failure to rescue. Yadin suggested that providers take a more assertive stance by applying HFE to the use of these systems at the point of care and documenting the shortcomings. He implied that providers should then insist on solutions from vendors.
Regarding HFE and telehealth, it would seem to this observer that applying HFE to telehealth is relatively straightforward. The workflow and interaction that are the subject of the HFE study are tightly constrained by the technology used for the telehealth application. Unlike having to observe, model, and evaluate workflow and usability in an actual practice environment, like a nursing unit, the telehealth interaction and workflow are both constrained and mediated by the technologies used to create the virtual interaction.
Telehealth would also be easy to model, since both ends of the telehealth event can be simulated at a single location. The technology that Yadin presented targeted specific clinical tools: diagnostic imaging (of many different kinds), high-resolution still images for microscopy or dermatology applications, and real-time audio and video communications for a history and physical or discussions of symptoms and treatments. HFE entails selecting the proper technology to mediate the desired communications and, through evaluation, ensuring that clinical efficacy and safety requirements are met.
Yadin ended with a video clip of Intuitive Surgical’s da Vinci surgical robotic system. From the 3D visualization and the filtering out of hand tremors, to the robot cart and visualization system, da Vinci seems to take the need for HFE to new levels.
Increased surgeon confidence and superior instrument dexterity – marketing claims highly dependent on effective HFE.
Q: Ray Zambuto asked, with medical devices increasingly used outside the hospital, where the training and clinical background of the users can differ greatly (not to mention changes in the environment), should this trend be recognized and incorporated into the HFE of new medical devices?
A: Ed suggested that these factors are taken into account at the beginning of the HFE process when user profiles and the environment are documented. When devices are used outside their intended use, there are significant risks – some of which are related to HFE.
Q: Question to Brian on Beaumont’s structured prep time for the 1.5-day evaluation. What is the amount of time required from the Beaumont team and the vendor to plan and execute the evaluation?
A: They had 25 employees each from two locations over that day and a half. The rep provided an inservice on the product to the team (not during the evaluation). They had to recruit employees to participate and also develop the scripts for their study. This was a team effort with clinical engineers, a human factors engineer, clinicians (user buyers), and other key stakeholders. They identify what they want to evaluate and how they want to quantify various characteristics and outcomes, and then develop the scenarios to be executed in the study.
Q: Do the evaluation training scenarios include user training?
A: Sometimes, depending on the product. The case study that Brian gave did not include training. The case study that Izabella mentioned, an evaluation of beds, included a 10-minute training session that was identical for both vendors’ beds.
Q: You mention workflow a lot in your HFE evaluations, do you use a formal use case elicitation process?
A: (I didn’t get to ask this one, but I’ll try to catch up with Izabella or Brian later…)
Q: How do you apply your evaluation process to system-based medical devices or solutions, like your alarm notification example, made up of multiple products?
A: (Same for this question.)
The rest of the presentations that I attended Saturday will be added over the next couple days. Be sure to check back!