
Hospital administrators are worried about the security of the institutions they run. Fears of a hacked infusion pump killing a patient, of data breaches, and all-too-fresh memories of entire hospital systems being shut down by malware are common. But prescient hospital administrators should also start thinking about how to handle security as we begin connecting medical devices to consumers' homes.
Telemedicine is on the rise, as are efforts to push connected clinical devices into consumers’ hands as a way to help make care more accessible and drive down costs. Concerns about helping people age in place, as well as younger consumers’ comfort with digital technology, are driving these trends. But if we want to avoid the security faux pas that led to Mirai and a veritable parade of hacked products, we should start thinking about medical device security for the home today.
“I think home health and telemedicine are already happening and a growing trend,” says Jonathan Langer, CEO of security startup Medigate. “We’re just beginning to digest how to handle it from an operational perspective.”
Medigate is an Israeli startup that was founded in 2017 to provide network-based security to hospitals. The company has built a database of devices used on hospital networks, cataloged by their proprietary protocols and their behavior. By analyzing network traffic, it can tell hospital IT staff what's on their network and alert them to any behavior that seems problematic.
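To give a rough sense of what that kind of network-based classification involves, here is a minimal Python sketch that matches summarized traffic flows against a toy fingerprint database. Medigate's actual engine is proprietary, so every name, port, and protocol string below is hypothetical and purely illustrative.

```python
# Hypothetical sketch of passive device identification from network traffic.
# All fingerprints, ports, and device names here are made up for illustration.

from dataclasses import dataclass

@dataclass
class Flow:
    """A summarized network flow observed on the hospital network."""
    src_mac: str
    dst_port: int
    protocol: str        # e.g. "DICOM", "HL7", or a vendor-specific pump protocol
    avg_packet_size: int

# Toy fingerprint database: device type -> traffic attributes we expect to see.
DEVICE_FINGERPRINTS = {
    "infusion_pump":   {"protocol": "proprietary-pump", "dst_port": 5000},
    "ct_scanner":      {"protocol": "DICOM",            "dst_port": 104},
    "patient_monitor": {"protocol": "HL7",              "dst_port": 2575},
}

def classify_flow(flow: Flow) -> str:
    """Guess the device type from the application protocol it speaks."""
    for device_type, fingerprint in DEVICE_FINGERPRINTS.items():
        if flow.protocol == fingerprint["protocol"]:
            return device_type
    return "unknown"

def flag_anomalies(flow: Flow, device_type: str) -> list:
    """Flag traffic that falls outside what the fingerprint says is normal."""
    alerts = []
    if device_type == "unknown":
        alerts.append(f"Unrecognized device {flow.src_mac} seen on the network")
    elif flow.dst_port != DEVICE_FINGERPRINTS[device_type]["dst_port"]:
        alerts.append(
            f"{device_type} {flow.src_mac} talking on unexpected port {flow.dst_port}"
        )
    return alerts

if __name__ == "__main__":
    observed = Flow(src_mac="00:1a:2b:3c:4d:5e", dst_port=8080,
                    protocol="DICOM", avg_packet_size=1400)
    device_type = classify_flow(observed)
    print(device_type, flag_anomalies(observed, device_type))
    # -> ct_scanner ['ct_scanner 00:1a:2b:3c:4d:5e talking on unexpected port 8080']
```

The value of this approach is that it requires no software on the devices themselves; everything is inferred from traffic the network already sees.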
But that kind of monitoring can't follow medical devices into the home, and the move to the home introduces several potential threats. The device itself could get hacked, or a hacker could use the device to infiltrate the hospital network. Network traffic could be intercepted, exposing a patient's medical data. A hacker might also get into a consumer's home network and find a way to disrupt the device, creating a potential medical issue.
With health care in the home, you have a generally unsophisticated person running a basic network on one end and, on the other, a budget-constrained and beleaguered IT department straining to perform security basics. This doesn't seem like a recipe for success, especially when you toss in medical devices made by companies whose focus hasn't been on security. Sophistication can vary among all of these parties, but as Langer points out, when it comes to securing remote health care, we don't even know what sort of architecture we need.
Langer says we first need to answer a number of questions. For example, where does device data get encrypted? Does that happen in the cloud, at the hospital, or at a third-party electronic health records provider? He also wonders how consumers will handle security patches, especially on medical equipment that may be in active use.
One practice already being promoted as good for the overall medical device industry could help here: a software bill of materials, which lets hospital IT staff know what software, firmware, and operating systems are running on their devices. When a vulnerability is disclosed, an administrator can quickly tell which machines might be affected.
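To show why that inventory matters, here is a minimal sketch, assuming a far simpler format than real SBOM standards such as CycloneDX or SPDX: given each device's recorded components and a vulnerability advisory, it lists the devices that might be affected. The device IDs, component names, and versions are all made up.

```python
# Hypothetical sketch of using a software bill of materials (SBOM) for
# vulnerability triage. Real SBOM formats (CycloneDX, SPDX) are far richer;
# these structures and values are illustrative only.

# Simplified SBOMs: device ID -> list of (component, version) pairs.
DEVICE_SBOMS = {
    "pump-ward3-01":  [("busybox", "1.27.2"), ("openssl", "1.0.2k"), ("pump-fw", "4.1")],
    "monitor-icu-07": [("linux-kernel", "4.9.1"), ("openssl", "1.1.1d")],
    "scanner-rad-02": [("windows-embedded", "7"), ("openssl", "1.0.2k")],
}

def affected_devices(component: str, vulnerable_versions: set) -> list:
    """Return device IDs whose SBOM lists a vulnerable version of the component."""
    hits = []
    for device_id, components in DEVICE_SBOMS.items():
        for name, version in components:
            if name == component and version in vulnerable_versions:
                hits.append(device_id)
    return hits

if __name__ == "__main__":
    # Example advisory: a hypothetical flaw in OpenSSL 1.0.2k.
    print(affected_devices("openssl", {"1.0.2k"}))
    # -> ['pump-ward3-01', 'scanner-rad-02']
```

Without the per-device inventory, answering "which of our machines run the vulnerable component?" means chasing down vendors one by one.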
Nick Dawson, who has been involved in innovation efforts at a few hospitals including Johns Hopkins, suggests that having a risk assessment team high up in the hospital organization could help. Such a group could determine what devices a hospital can buy based on security assessments. This might ruffle the feathers of a doctor who wants to prescribe a specific system to a patient if that system hasn’t been approved, but it also could stop vulnerable devices from communicating with a hospital.
Other best practices that have evolved on the home health side include using cellular modems to connect devices as opposed to putting them on the patient’s own Wi-Fi network. This is done less as a security measure and more to ensure the device is easy to install and use, but it should be more secure as well.
Perhaps a more subtle trend, and one to worry about, is how hospitals tend to view their role in preventing data loss and hacks. In many cases, when hospitals buy equipment or work with third-party vendors (such as those selling electronic medical records software), the burden of security falls on the vendor. As Dawson points out, hospitals are "terrified" of losing patient data. They should be. Under current federal regulations, hospitals can be fined when patient data is improperly exposed.
Dawson worries that the burden of securing patient data will shift to the patients themselves. If that happens, he suggests, the big tech companies are probably in the best position to secure it, even though they have not always proven to be the most benevolent caretakers of personal data. And it's clear the big tech firms are interested: Amazon has released HIPAA-compliant skills for Alexa, Apple has invested heavily in HealthKit, and Google is trying to get Google Health going again.
Health care is one of the more exciting areas where more data, better insights, and new ways to link doctors and patients could save money and improve lives. But before we get much further, we should start talking about security.
Updated on Aug. 20, 2019 to correct the spelling of Jonathan Langer’s name. It is Langer, not Lander.