The U.S. Food and Drug Administration is stuck with an old regulatory structure in a new world of constantly evolving medical devices. It’s just one more symptom of how ill-prepared our regulatory regime is for a world where technology is an invisible component of every aspect of our lives.
The agency has been trying to adapt to the changes that come with app-based health scans, vulnerable software that governs MRI machines and a new breed of medical devices that work with a patient’s smartphone. It has released clarifying guidelines around how software updates affect the certification of an existing device and made sure it has a role in regulating app-based medical devices such as a smartphone-delivered EKG. Now it is going further.
This week it released an action plan calling for a new approach to how it approves medical devices and ensures that they remain safe after those devices have been released into the market. The plan relies on a newly created system called the National Evaluation System for health Technology (NEST) that will track individual devices throughout their lives to determine the role they play in adverse patient events. If NEST surfaces issues with a device, the FDA is creating further rules governing recalls, software updates and more.
The role of NEST is an acknowledgement that digital health items are not static. A device that was “safe” when issued might become unsafe ten years later if the software isn’t supported and vulnerabilities can be exploited. In the medical world, a vulnerability could lead to anything from a device becoming part of a DDoS attack to falling under the control of a hacker. In the latter case, depending on the device, a compromise could call results into question or kill a patient.
I wrote about the overall insecurity and challenges associated with digital health a year ago.
On Thursday a group of hackers demonstrated the basic insecurity of medical devices onstage at the RSA conference in a dramatic simulation that involved a real doctor trying to adapt to a hack and a patient who ultimately “died” onstage.
To address the issue of software updates and cybersecurity, the FDA action plan calls for rules that make a huge amount of sense. For example, it would require that devices be designed with security in mind. This includes building the ability to patch a product into the device and making that a requirement of the submission for FDA approval. As part of the premarket submission, the agency also wants a software “bill of materials” report that shows what the device has installed, so hospital CISOs can track vulnerabilities.
This bill of materials would also help the FDA in its quest to monitor devices after they are sold. It wants to create a vulnerability disclosure program so it can track new exploits and proactively notify hospitals (or empower a new group to do so). The action plan also calls for the creation of this new group, composed of doctors, hospital administrators, networking experts, software developers and hardware engineers.
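To make the bill-of-materials idea concrete, here is a minimal sketch of how a hospital CISO (or the FDA’s proposed monitoring group) might cross-reference a device’s component list against a feed of known-vulnerable versions. The SBOM layout and the vulnerability feed are hypothetical stand-ins, not any format the action plan actually specifies:

```python
# Hypothetical sketch: checking a device's software bill of materials
# (SBOM) against a list of known-vulnerable component versions.

# A device's SBOM: each entry is (component name, version).
# Component names and versions here are illustrative only.
sbom = [
    ("embedded-linux-kernel", "4.4.0"),
    ("openssl", "1.0.1f"),
    ("dicom-transfer-lib", "2.3.1"),
]

# A feed of known-vulnerable versions, e.g. built from CVE advisories.
# (OpenSSL 1.0.1f really was affected by Heartbleed; the rest of the
# feed's contents are whatever the monitoring group maintains.)
known_vulnerable = {
    ("openssl", "1.0.1f"): "CVE-2014-0160 (Heartbleed)",
}

def flag_vulnerabilities(sbom, known_vulnerable):
    """Return SBOM entries that match a known-vulnerable version."""
    return [
        (name, version, known_vulnerable[(name, version)])
        for name, version in sbom
        if (name, version) in known_vulnerable
    ]

for name, version, advisory in flag_vulnerabilities(sbom, known_vulnerable):
    print(f"{name} {version}: {advisory}")
```

The point is less the code than the workflow it enables: once manufacturers disclose what software a device ships with, matching new advisories against installed components becomes a lookup rather than guesswork.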
This group would take on the role of handling vulnerability disclosures, mediating issues between parties involved in medical device safety, and acting as a “go-team” that could be sent to investigate a suspected or confirmed device compromise at a manufacturer’s or the FDA’s request.
This is a lot. Those working in the field of cybersecurity are pleased. Beau Woods, a member of I am the Cavalry, an organization of security experts trying to work with the government and industry to improve the safety of connected devices, called it a good step.
The creation of some sort of digital health security A-Team will be an essential element of this plan. Hospitals don’t have the security personnel of a large financial institution or a tech firm like Google. In smaller hospitals and doctors’ offices, the IT administrator is often the nurse who can figure out how to network the printer.
Other aspects such as the requirement for software updates will frustrate some medical device makers who will find it expensive to support medical equipment over its long life. Many of these machines are developed over a period of years and then sold to hospitals to last for ten or fifteen more years. After that they may be scrapped or sold to secondary buyers. I find it hard to fathom that Philips or GE want to support the software inside an ultrasound machine over 25 years.
What this action plan calls for is a change in the way the agency works today and how it needs to work to ensure a safe digital health system. It’s asking Congress for the funds to make that happen and the regulatory power to force manufacturers to take action if problems are found after a device has been sold. Instead of approving a device just before it goes on the market and taking a piecemeal approach to recalling unsafe products after the fact, the FDA wants to take advantage of technology to stay involved throughout the process. From the action plan:
FDA envisions a future state where the medical device ecosystem is inherently focused on device features and manufacturing practices that have the greatest impact on product quality and patient safety. FDA aims to adopt learnings from other industries, such as aerospace, where companies compete on quality and product users have access to quality metrics that allow them to be informed consumers. Internally, this requires a shift in FDA’s traditional regulatory approach, toward a model that helps manufacturers identify and prevent problems before they occur, adapts to changes in science and technology, and rapidly addresses events that impact safety. Externally, this may require partnerships and shared responsibility among FDA, industry, practitioners, and patients to evaluate and adjust continuously based on experiences across the full life cycle of a device.
When more and more of our products are based on software, this sort of approach makes sense. We swap out our phones every few years. We update our apps every few weeks. When the features you rely on can change with the shipment of a few bits we can’t regulate those bits like we did when a device was a static collection of atoms. The FDA is smart to realize this.