Research firm Arketi sent me a survey on hospital patient flow. Sponsored by patient flow software vendor StatCom, the survey sought to quantify the patient flow problem (how many ED boarders, hours on divert, room turnover times, etc.) and identify the departments that contribute to, or ameliorate, hospital patient throughput. This will all be good marketing data once the study is compiled.
Lack of Data
I was struck by a couple of things missing from the study. From my experience, the biggest challenge facing hospitals seeking to improve patient throughput is the near total absence of performance data. Unless an effort is made to manually log performance data – with the oversight to ensure it is accurate and complete – hospitals have little data available to them. Such manual data gathering operations are expensive, time-consuming, and hard to sustain on a broad basis or for an extended period of time.
As the canary in the coal mine, the Emergency Department is the most common source for any detailed data, and often the squeaky wheel pushing for patient flow improvements. Consequently, data about hours on diversion, ED patient flow metrics, patient boarding, and reasons for boarding frequently represent the best patient flow data in hospitals. Once you get beyond the ED, available data drops off quickly.
Frequently, the best data available is aggregate ADT data that shows length of stay (LOS), transfers and discharges among individual departments and the hospital as a whole. Have you looked at your ADT data lately? Found all those duplicate transaction codes for decommissioned units (still in use)? Does everyone who enters ADT transactions or uses the data have the same understanding of what all those codes and locations really mean and when to use them? If you've not addressed this issue (cleaning up your tables, training users, auditing data) in the past year or two, or if you don't want to ruin your day (it is Monday after all), let sleeping dogs lie.
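The kind of ADT audit described above doesn't require anything fancy. Here is a minimal sketch, in Python, of flagging transactions that still charge to decommissioned unit codes and computing a patient's LOS from transaction timestamps. The transaction layout, unit codes, and field names are all hypothetical, invented for illustration; a real ADT feed (e.g., HL7 A01/A02/A03 messages) would need parsing first.

```python
from datetime import datetime

# Hypothetical list of decommissioned unit codes that should no longer appear
# in new transactions (illustrative codes, not from any real system).
DECOMMISSIONED_UNITS = {"3W", "OLD-ICU"}

# Hypothetical ADT transactions: (patient_id, unit_code, admit_time, discharge_time)
transactions = [
    ("p1", "ED",   "2007-06-01 08:00", "2007-06-01 14:00"),
    ("p1", "TELE", "2007-06-01 14:00", "2007-06-04 10:00"),
    ("p2", "3W",   "2007-06-02 09:00", "2007-06-03 09:00"),  # stale code still in use
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def audit(txns):
    """Return transactions that still use decommissioned unit codes."""
    return [t for t in txns if t[1] in DECOMMISSIONED_UNITS]

def los_days(txns, patient_id):
    """Total length of stay in days for one patient across all transactions."""
    spans = [(parse(a), parse(d)) for pid, _, a, d in txns if pid == patient_id]
    return sum((d - a).total_seconds() for a, d in spans) / 86400.0

print(audit(transactions))                      # the one "3W" transaction
print(round(los_days(transactions, "p1"), 2))   # 3.08 days
```

Even a toy audit like this surfaces the table-cleanup and user-training problems the moment you run it against real data.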
The paucity of detailed, reliable data to gauge patient throughput is a significant hospital need, and, I think, a key justification for buying a system like StatCom's – yet the survey does not address it. Having some quantitative market research as proof of this need is something I'd want if I were a patient flow software product manager.
Patient Care Methodologies
The other thing that struck me was the lack of any questions about patient care methodologies that create patient flow bottlenecks. Most hospitals are structured and managed based on industrial principles from 30 or 40 years ago. Back in the 1980s before DRGs and prospective reimbursement when hospitals had excess capacity, setting aside a fixed number of beds (and equipment) for specific categories of patients worked pretty well – excess capacity hid a multitude of sins.
As falling reimbursement wrung out excess capacity, the fundamental weakness of allocating fixed resources on the expectation that a consistent number of patients will utilize those resources became evident. In manufacturing, just-in-time management tools were used to lower costs. The fact is, there are no just-in-time patients. Just as manufacturers learned that market demand can't be reliably forecast, and implemented flexible manufacturing concepts, hospitals must move beyond rigid patient care strategies. Besides creating patient flow bottlenecks in units like critical care and telemetry, conventional care methodologies can result in unnecessary patient transfers (something else the survey didn't explore).
The most common example of a patient care methodology analogous to modern manufacturing processes goes by various names: variable acuity units, flexible monitoring, or universal beds. Whatever you call it, it means providing patient care in one on-service unit and not transferring the patient every time their acuity changes. Patient transfers are bad; each transfer adds a day to the patient's LOS and represents an opportunity for adverse events resulting from less-than-perfect patient hand-offs. Manufacturing has developed many techniques to be more flexible, like cellular manufacturing, kaizen, the Toyota Way, Lean and Six Sigma – and many of these tools have been adopted by innovative hospitals.
The best way to minimize critical care and telemetry as patient flow bottlenecks is through variable acuity units, where patients receive the most appropriate level of care in the lowest cost setting. You will never be able to match the number of critical care and telemetry beds to the exact number of patients who need them. You can waste money building too many of these high acuity beds – and don’t forget that about 15% of the patients in your critical care and tele beds right now don’t meet admit criteria and should be in other units. Or you can continue to go on ambulance diversion. A patient flow system like StatCom’s can be a useful tool in implementing variable acuity care. Good acuity scoring tools like ClairVia are also helpful.
Prognostications and Prevarications
Let me close by offering some predictions. Thankfully, most of these predictions are perfectly safe because their outcomes can't be verified. First off, the StatCom survey will show that the two departments representing the biggest patient flow bottlenecks are critical care and telemetry.
I see solid growth and adoption for the patient flow application market. McKesson's acquisition of Awarix is further validation of anticipated growth. The most important justification for a solution like StatCom's is that you can't manage what isn't measured. The market right now is still an early adopter market (that's marketing-speak for those wild-eyed innovators who will try anything). The generation of products coming to market now represents a big improvement over earlier solutions. These systems still require certain enabling technologies to work effectively in hospitals run by anyone but Jack Welch. These applications will provide a plethora of useful data across the enterprise. As the market matures, hospitals will learn to pick the good patient flow applications and get better at wringing the most value from them.
Oh, and don't throw six- or seven-figure consulting contracts at the problem. The 80/20 rule says that you'll get 80% of the value from only 20% of the patient flow issues excruciatingly documented by the horde of fresh MBA graduates who descend on your hapless staff. Empower an internal champion and find someone (maybe even a vendor) who can quickly find the bottlenecks with the biggest impact. Buy a patient flow application, and converge your patient flow findings with your software implementation targeting the "big bang" opportunities. You'll still spend six or seven figures, but most of it will be on a software application that will deliver real value for years – as opposed to the one-time shot of a big consulting gig.
As patient flow problems increase, hospitals will transition from boarding patients awaiting an inpatient room in the ED to placing them up on their on-service unit. There are plenty of rational patient safety and financial reasons to do this, and the outrage and consternation evoked by such a change will eventually give way to reason. And don't get snowed by the "fire codes don't allow us to leave patients in hallways" excuse. They're in hallways in the ED, aren't they?
Finally, the federal government's double-edged strategy for transforming health care (reduced reimbursement to drive change, and increased visibility of provider performance to increase quality) will eventually include public disclosure of ED waiting times, reporting of time on ambulance diversion, and statistics on the number of patients boarded. This will probably occur after vendors "cross the chasm" and the early majority of the hospital market starts to implement patient flow solutions. This visibility will eventually drive the market laggards to adopt.
Pictured above is the hospital illustration from StatCom’s home page – I like the 8-bit retro graphic design.