The 21st Century Cures Act of 2016 (Cures) had many provisions, including defining what kinds of software are medical devices and specifying certain kinds that are not, and which are therefore not subject to FDA regulation. Cures then directed the FDA to examine the excluded software and report on whether it presented a public health problem. The FDA has now released such a report.

Curiously, the report is based only on information published after the passage of Cures through July 31, 2018, so that anything known before passage, or after that date, is not considered. I would not suggest that any student follow this bibliographic method if they wanted their work to have any meaning. Another curiosity of the report, which may be related to its limited scope, is that “cybersecurity” is not mentioned, not even once.

The five expressly excluded types of software are: (1) administrative support of a health care facility; (2) software promoting a healthy lifestyle and unrelated to the diagnosis, cure, mitigation, prevention, or treatment of a disease or condition; (3) electronic patient records, where the software does not interpret or analyze those records; (4) transferring, storing, converting formats, or displaying medical data, but not interpreting that data; and (5) certain types of clinical decision support (CDS), a type of AI. For each of these five areas the report provides a brief overview of potential and reported issues, and a “best practices” discussion for managing the associated risks.

Item 1 isn’t very interesting because admin software rarely attracts much attention in the medical device space. Healthy lifestyle products are in the consumer product space, although Apple, among others, is trying to bridge the gap. Note that the “unrelated to” phrase effectively restates the definition of a medical device. EHRs have, rightly or not, long been excluded from the FDA’s interest, which is not to say that they don’t do anything that falls within the definition of a medical device, or that they can’t be dangerous. Types 4 and 5 are the two that are more in the medical connectivity domain.

Type 4 is essentially Medical Device Data Systems (MDDS), which have received much prior discussion, including by us. In the new report, using the only-since-Cures criterion, the FDA found no MDDS impact on patient safety and no information on benefits and risks. The best practices discussion is limited to noting that the original data can be retained unaltered as the software reconfigures it, and suggesting that industry consensus standards for data be used. We know that data standards are important for interoperability, and that some manufacturers find reason not to make data arising from their products readily available. We might also remember that standards are in general wonderful, which is why there are so many of them. These same issues arise in EHR interoperability, where we still haven’t managed to achieve the uniform formatting of data that would enable all EHRs to process and share it.

CDS in item 5 is limited to the type in which the clinician can understand how the software arrived at its conclusions. This is essentially only algorithmic CDS, in which the clinician could apply the algorithm to the same input data and get the same result. This precludes “machine learning” based systems, in which there is no underlying algorithm, and arguably no basis other than whatever the training data set produced. The FDA did identify some post-Cures reports on the value to be obtained from such systems, as well as one error case report. For the latter, it is hardly surprising that a software error (bug) could cause a CDS to give the wrong answer. The fact that the clinician could in principle second-guess the result is little solace here, but it speaks to the unanswered question: are you supposed to rely on the result, and if not, what good is it? The brief best practices section addresses the timing of alerts. This is but one part of a variety of human factors issues associated with delivering the right result, at the right time, presented in a way that will be noticed and helpful. It is also noted that automated CDS requires accurate input (remember garbage in, garbage out) and that accurate input depends in part on consistent data structure. They note that working around an alert should be possible, but that the reason for doing so should be documented. For this I imagine the kind of list you get when you unsubscribe from some email blasts.
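To make the algorithmic-versus-machine-learning distinction concrete, here is a minimal sketch of what “transparent” algorithmic CDS means in practice. The rule, function name, and thresholds below are all invented for illustration; they come from neither the report nor any clinical guideline. The point is simply that the logic is a fixed, inspectable rule, so a clinician given the same inputs could trace exactly why an alert fired and reproduce the result by hand.

```python
from typing import Optional

def renal_dose_alert(crcl_ml_min: float,
                     ordered_dose_mg: float,
                     impairment_threshold_ml_min: float = 30.0,
                     max_dose_if_impaired_mg: float = 500.0) -> Optional[str]:
    """Hypothetical rule-based CDS check: alert when the ordered dose
    exceeds a (made-up) maximum for a patient whose creatinine clearance
    indicates impaired renal function. Returns an alert string, or None
    when the rule does not fire."""
    if (crcl_ml_min < impairment_threshold_ml_min
            and ordered_dose_mg > max_dose_if_impaired_mg):
        return (f"Ordered dose {ordered_dose_mg} mg exceeds the "
                f"{max_dose_if_impaired_mg} mg limit for CrCl "
                f"{crcl_ml_min} mL/min")
    return None
```

Because the entire decision path is visible in a few lines, a clinician (or auditor) can second-guess the output; a trained ML model offers no comparable step-by-step account of how a given input produced a given recommendation.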

The report offers no action plan to deal with any of the issues it uncovered, other than to publish another report in two years as required by Cures. This lack of other activity may be tied to the software in question not being medical devices, and therefore not within the FDA’s regulatory portfolio, and not something that interests the agency very much. We can imagine an exchange in which the FDA is asked why it did this study, with the answer being “because we had to.”