Tenable Research investigates compliance standards for EHR applications in the US healthcare industry and discusses possible gaps in the coverage of these standards. Real world examples are provided to demonstrate potential security impact.
Politics and legislation aside, it’s no secret that the US healthcare industry is a mess. Hospital networks and small-time medical practices alike are known to run outdated and vulnerable software. Compounded by the fact that vendors are jumping on the “smart-things” bandwagon like groupies to a Dave Matthews Band show, the world of healthcare technology is a scary place. From pacemaker malware to compromised insulin pumps and from large scale breaches to ransomware attacks, all facets of the healthcare industry have serious issues with clear and direct impact on consumers and end users.
Given how common large scale breaches are, it’s generally accepted that most people’s data, such as login credentials, personal information, and credit information, has already been stolen. Privacy breaches of this nature are simply a fact of life we all must deal with. Perhaps one of the most notable industries affected by this trend is indeed the healthcare industry, but these breaches go beyond usernames and credit card numbers and include personal health information. These compromises are happening at organizations of all sizes, ranging from small clinics in rural suburbs to large hospitals in major metropolitan areas. While these breaches make for spectacular headlines, we don’t often see much more beyond a surface-level analysis of the attack vectors involved. In other words, we rarely get to see all the gory details of how attackers are compromising these health-related applications. That’s what we’re here to talk about today.
Creating a Standard
An Electronic Health Record or Electronic Medical Record (EHR / EMR) is a digital record of a patient’s medical history. Applications managing these records typically serve other purposes as well, such as ordering prescriptions electronically, scheduling appointments, providing interfaces for medical imaging devices, handling payment and insurance information, and a variety of other practice management functions. It’s likely anyone who’s visited a doctor sometime in the last decade has seen one or more of these types of applications.
What most people may not be familiar with, however, are the certification standards governing these applications.* To be brief, in 2009 the US government decided to incentivize medical practitioners to adopt certified health applications in order to achieve interoperable patient records and standardized metrics across the industry, measured against a defined set of Meaningful Use and Clinical Quality measures. These standards are periodically reviewed and re-evaluated, but they exist today in much the same form as they did in 2009. While medical practices still have some freedom of choice, they’re likely to stick to a certified product in order to receive the incentive bonuses (read: money).
As with any compliance standard, such as PCI, a big part of the game is simply figuring out how to get the guy with the clipboard to check each box and give a stamp of approval. While obviously not perfect, these certifications are necessary and at least ensure some base level of compliance. As a security professional with a background in dealing with health-related software and the EHR certification process, I am primarily concerned with the certification standards regarding the security and privacy of patient information. Note that while HIPAA - an overarching policy intended to protect communication of patient information - is obviously involved to some degree, discussion about that particular bit of legislation is best left for another time.
Within the Office of the National Coordinator for Health Information Technology (ONC Health IT) certification standard, there are only a handful of criteria that even mention security or privacy-related matters. Most of these criteria fall under 170.315 (d). In general, the standards listed involve user authentication, user permissions, audit logs, secure transmission to a third party, and other basic security features. While these standards are mostly sane, citing NIST as the primary source for proper hashing algorithms and secure transport protocols, they’re often vague and the testing methodology appears to be flawed.
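To make the NIST-derived hashing requirement concrete, here is a minimal sketch of what standards-aligned credential storage might look like. This is purely illustrative - the function names and the iteration count are our own choices, not anything mandated by the certification criteria or taken from any specific EHR product:

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 600_000) -> tuple:
    """Derive a PBKDF2-HMAC-SHA256 digest with a random per-user salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes,
                    iterations: int = 600_000) -> bool:
    """Re-derive the digest and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(digest, expected)
```

The point of the vague-standards critique is that nothing in the criteria forces a vendor to do even this much for locally stored credentials - a product could pass certification while storing passwords in a far weaker form.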
Take 170.315 (d)(1) for example. This criterion exists to ensure that the EHR allows individual logins with a varied set of permissions between users. For example, front desk staff may only be allowed to schedule appointments and print visit summaries; nurses may have the power to view prescriptions, but not order new ones; and doctors likely have the full gamut of privileges. While this is expected functionality for an application of this nature, none of the other criteria in the certification standard specifies how these users/privileges should be managed or stored. While standards and criteria do exist that govern acceptable hashing algorithms or secure transport protocols, they don’t apply to this criterion. This trend of vague descriptions and misplaced requirements can be seen throughout the certification standard, which allows vendors and developers to make mistakes or take shortcuts in areas where they shouldn’t. This leaves major security gaps and holes in many applications that may otherwise meet all certification criteria.
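The role separation described above can be sketched as a simple deny-by-default permission table. The roles and action names here are hypothetical, chosen only to mirror the front desk/nurse/doctor example - the criterion requires that such a model exist, but says nothing about how it must be implemented or where it must be enforced:

```python
# Hypothetical role-to-permission mapping mirroring the example roles above.
ROLE_PERMISSIONS = {
    "front_desk": {"schedule_appointment", "print_visit_summary"},
    "nurse":      {"schedule_appointment", "print_visit_summary",
                   "view_prescriptions"},
    "doctor":     {"schedule_appointment", "print_visit_summary",
                   "view_prescriptions", "order_prescriptions"},
}

def is_authorized(role: str, action: str) -> bool:
    """Deny by default: unknown roles or actions get no access."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Crucially, a check like this is only meaningful if it runs somewhere the user cannot tamper with it - a point that matters later when we look at applications that bundle all of their logic client-side.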
Following Standards Still Leaves Gaps
To further illustrate this point, take a look at some of the recent vulnerability research done in this area. Back in August, Project Insecurity looked into OpenEMR. Their findings detailed over 20 vulnerabilities in the application - ranging from simple SQL injection to remote code execution - that could have led to the breach of potentially hundreds of thousands of patient records.
Even more recently, Tenable disclosed a number of vulnerabilities in another popular open-source medical application, Open Dental, which is geared towards dental professionals. Open Dental Software advertises the “ONC Certified HIT” badge on its product homepage. The company’s official certification status and testing results can be found on HealthIT.gov. As can be seen under the Certification Criterion section, Open Dental does indeed meet the standards required for certification - including the standards regarding reasonable privacy and security of patient data. It was most recently evaluated on October 31, 2018.
From Tenable’s research advisory, we can see the clear gaps in certification standards as they relate to the security of a given application. Open Dental implements separate authentication and authorization mechanisms with varied permissions between user types, but the application makes no effort to securely transmit this information. There is no use of parameterized queries, potentially allowing an attacker to modify requests once access to the application is granted. While some form of local network access is required for these attacks, the attacks themselves are relatively trivial. Additionally, the fundamental design of the application (which is similar to the design seen in so many other EHR systems) contains both server- and client-side logic all in a single package, meaning most security features could be easily bypassed by a local attacker. And I’m sure we’ve all encountered a situation where a doctor or nurse has left patients alone in the exam room without first locking the computer.
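To show why the absence of parameterized queries matters, here is a self-contained sketch contrasting string-built SQL with a bound parameter. The table and payload are invented for illustration and have nothing to do with Open Dental’s actual schema:

```python
import sqlite3

# Throwaway in-memory database standing in for an EHR backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO patients (name) VALUES ('Alice'), ('Bob')")

user_input = "Alice' OR '1'='1"  # a classic injection payload

# Vulnerable: concatenation lets the payload rewrite the query itself,
# turning a single-patient lookup into "return every row".
unsafe = conn.execute(
    "SELECT name FROM patients WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the driver binds the value, so the payload is treated as plain
# data and matches no patient name.
safe = conn.execute(
    "SELECT name FROM patients WHERE name = ?", (user_input,)
).fetchall()

print(len(unsafe), len(safe))  # prints "2 0"
```

One query shape leaks the entire table to anyone who can influence the input; the other is a one-line fix. That gap is exactly the kind of implementation detail the certification criteria never examine.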
To be clear: Tenable is not aiming to shame vendors, developers, medical professionals, legislators, or anyone else related to the medical field. In fact, we’d like to take a moment to compliment Open Dental on its responsiveness to its community. While the disclosure process was admittedly complicated, a peek at the company’s community forums, blog, and public issue tracker makes it obvious the company cares about its users and community. With so many moving parts, though, it isn’t surprising that these flaws exist. If doctors weren’t incentivized to use a product that adheres to a given standard, they may be inclined to use cheaper options with potentially worse issues. If vendors didn’t have a reason to get their products certified, their developers may implement features or accept risks in areas where they otherwise wouldn’t. If legislators didn’t create policies to emphasize the need for a standard in the first place, medical practices might be stuck using procedure codes or diagnostic codes that weren’t transferable from one practice to another, which leads to a whole different slew of problems.
In short, no single entity is to blame for the issues that have arisen in this industry. In fact, many of these smaller EHR companies are so far removed from the security industry that it’s possible they aren’t even aware these types of flaws are problems at all. Nothing in this article is new, groundbreaking, or innovative. We see these same old problems crop up again and again in the form of data leaks, breaches, ransomware attacks and other serious incidents. The medical industry has always been slow to adopt new technology, and government regulations and legislation lag even further behind. As an outsider who recognizes these issues, it’s difficult to stand by and watch the slow bureaucratic process chug along. In my opinion, the best way a security practitioner can effect change in this field is by researching these applications and coordinating the disclosure of findings with the appropriate vendors. By pointing out these flaws to vendors, we can hopefully bring awareness to these issues and point the industry in a more positive direction.
In the medical industry, the barrier to entry is high for security researchers due to cost and availability of software and devices to test. Trials are restricted to potential customers and open source alternatives are few and far between. Tenable’s Zero Day Research team has an ongoing initiative to periodically review medical-related products, both hardware and software. If you make use of such products and are able to share resources or information, please feel free to reach out to [email protected] to collaborate.
* This article primarily deals with the standards and issues mentioned as they exist in the US. While other countries and jurisdictions may experience similar issues, the regulations may differ.