HIPAA, PHI, and Software

The software that we deliver for several of our clients contains medical information. As soon as we brush up against the words “medical” or “health,” we have to consider building in facilities to comply with the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the Health Information Technology for Economic and Clinical Health (HITECH) provisions in the American Recovery and Reinvestment Act of 2009, and the HIPAA Omnibus Rule. These three pieces of legislation tend to be referred to collectively as HIPAA. The part that gets the most attention is the “Privacy Rule” covering protected health information (PHI), and that’s what we’ll primarily concern ourselves with here.

Much of this legislation was written by lawmakers possessing limited knowledge of how software works, so complying with these regulations can feel more like interpretive dance and less like checking boxes on a solid list of rules. Even so, there are many ways in which a client (or an implementer) can get into trouble by failing to comply, and we like to avoid that kind of trouble. Reading all of the relevant regulations, expert recommendations, and the plethora of disjointed advice that may or may not apply to your situation can be daunting. It can help to focus on a small piece of the puzzle with big implications. In the next article in this series, we’ll focus on caching—more specifically, caching in a key-value store (we’ll be using Redis), deployed into a cloud environment (AWS, in our case).

Before we tackle that, we’ll dive into HIPAA a bit further. If you’re already familiar with HIPAA, you can skip ahead to the tech.


The HIPAA Privacy Rule concerns itself with PHI, or “protected health information.” This consists of anything that connects health-related information to the person to whom it belongs. According to the U.S. Department of Health and Human Services (HHS), this is “all individually identifiable health information held or transmitted by a covered entity or its business associate, in any form or media, whether electronic, paper, or oral.” This includes, but is not limited to, names, email addresses, phone numbers, record numbers, photographs, and imaging of records that include PHI. Anything that can place a person next to their data is subject to the type of risk mitigation required by HIPAA. As software professionals, we tend to care about the Security Rule, which governs electronic PHI (e-PHI).

Obviously, the record of a visit to your physician or specialist would constitute PHI: It contains your name, address, and phone number, as well as the nursing assessment (heart rate, blood pressure, etc.), diagnosis, prescriptions, and so on. Less obviously, a pre-rendered and subsequently cached piece of view logic containing a patient name is also PHI, as is every record of a pivot table in a database that links a patient record identifier to a medical device identifier. The usual approach to dealing with PHI is to secure everything. It tends to be more time-efficient to treat everything in the system as PHI rather than identify what is and isn’t PHI and then build parallel systems to handle each.
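One practical mitigation for cases like that pivot table is to avoid storing the raw identifier at all. As a sketch (the function, key, and identifiers below are illustrative, not drawn from any HIPAA guidance), a keyed hash can stand in for a patient identifier wherever a stable join key is needed but the real ID isn’t:

```python
import hashlib
import hmac

# Illustrative only: in a real system this key would live in a
# key-management service, never in source control.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(patient_id: str) -> str:
    """Return a stable, keyed pseudonym for a patient identifier.

    HMAC-SHA256 keeps the mapping reproducible (so joins still work)
    while making it infeasible to recover the original identifier
    without the key.
    """
    return hmac.new(SECRET_KEY, patient_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The pivot table stores the pseudonym, not the raw record number.
link = {"patient": pseudonymize("MRN-00042"), "device": "pump-7731"}
```

Whether a keyed pseudonym alone takes a record out of PHI territory is a judgment call for your compliance counsel; treating it as PHI anyway is the safer default.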


The HIPAA regulations are written to cover “(1) health plans, (2) health care clearinghouses, and (3) health care providers who electronically transmit any health information in connection with transactions for which HHS has adopted standards”[1].

Large organizations with many concerns, some of which are health-related, can designate themselves as Hybrid Entities[2]. In that case, only the parts of the entity described as health-care components are subject to HIPAA. Entities “conducting certain functions on behalf of a covered entity,” such as a software-as-a-service provider that stores or manages PHI, are covered as Business Associates. A covered entity is allowed to disclose PHI to a business associate, provided a Business Associate Agreement (BAA)[3] exists between the two parties. Under these circumstances, a covered entity can be indemnified against mishandling of the PHI by the Business Associate. The services covered under the Business Associate Agreement—and how they’re covered within that agreement—can help define who is responsible for securing the PHI addressed by the Privacy Rule.

Third-party organizations developing software for use by the public largely qualify as health care clearinghouses or business associates. As an example, AWS has a standard BAA that covers nine services. AWS ensures that HIPAA compliance is possible for those services under the shared responsibility model, though the specifics of the implementation are left to the service consumer.

Short answer: if you handle medical information that is coming from or headed to a health care provider, a health insurance company, or anyone in communication with either, you should seriously consider whether or not HIPAA applies to you.


HIPAA is not quite as broad as “it depends on who you ask,” but it doesn’t mandate specific measures either. While the Privacy Rule is pretty clear on what and who is covered, the Security Rule is somewhat less specific. It defines three classes of safeguards: administrative, physical, and technical. We’ll summarize them, but to give you an example, here is the text of §164.312(a)(2)(iv): “Encryption and decryption (Addressable). Implement a mechanism to encrypt and decrypt electronic protected health information.” This helpful little requirement doesn’t talk about when or where the PHI needs to be encrypted, so most implementers elect to cover their bases: encrypt the disk, encrypt the database record, encrypt the transport channel. However, the motivation behind the requirement may have been to encrypt data leaving the system, and to decrypt data entering the system. So instead of talking about that level of detail, let’s get to the higher-level summaries.
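To make one of those bases concrete—encrypting the transport channel—here is a minimal sketch, using Python’s standard ssl module, of the kind of client-side TLS configuration an implementer might insist on when moving e-PHI between services. The TLS 1.2 floor is our assumption about reasonable practice, not a number HIPAA prescribes:

```python
import ssl

def phi_transport_context() -> ssl.SSLContext:
    """Build a TLS client context suitable for carrying e-PHI.

    Certificate and hostname verification stay on (the library
    defaults), and anything older than TLS 1.2 is refused.
    """
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    # The defaults already verify certificates and hostnames; we
    # assert them so a misconfigured environment fails loudly.
    assert ctx.verify_mode == ssl.CERT_REQUIRED
    assert ctx.check_hostname
    return ctx
```

A context like this would then be handed to whatever socket or HTTP client carries the data; encryption of the disk and the database record are separate concerns handled by other tools.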

The administrative security requirements dictate a minimum level of bureaucracy necessary to document procedures and risk analysis, regularly conduct audits, identify appropriate user roles, and choose a responsible party for ensuring compliance with the Security Rule.

The physical security requirements are primarily organizational. They deal with access to facilities, proper (physical) security on workstations, and disposal of decommissioned hardware. Both physical security and administrative security are almost entirely outside the scope of this post.

The technical security mandates, our primary concern, include:

  1. Access control via unique user identification, authentication, authorization, and automatic system sign-off
  2. Procedures for accessing PHI in an emergency (and presumably disaster recovery) scenario
  3. Encryption “at rest” and “in flight”
  4. Protection from unauthorized modification and destruction
  5. Auditing
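
Of these, auditing is perhaps the easiest to sketch in code. Assuming a hypothetical `get_patient_record` function (nothing here is mandated by the Security Rule; it’s one illustrative shape), a decorator built on Python’s standard logging module can record who touched which record and whether the access succeeded:

```python
import functools
import logging

audit_log = logging.getLogger("phi.audit")

def audited(fn):
    """Log every access to a PHI-bearing function: caller, target, outcome."""
    @functools.wraps(fn)
    def wrapper(user_id, record_id, *args, **kwargs):
        try:
            result = fn(user_id, record_id, *args, **kwargs)
            audit_log.info("user=%s accessed record=%s via %s",
                           user_id, record_id, fn.__name__)
            return result
        except Exception:
            audit_log.warning("user=%s FAILED access to record=%s via %s",
                              user_id, record_id, fn.__name__)
            raise
    return wrapper

@audited
def get_patient_record(user_id: str, record_id: str) -> dict:
    # Hypothetical lookup; a real system would also enforce
    # authorization (mandate 1) before returning anything.
    return {"record_id": record_id, "notes": "..."}
```

In practice the audit trail would go to tamper-evident storage rather than an ordinary log file, and would itself be treated as PHI, since it links people to records.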

There is also the Final Rule[4], 138 pages of fine print that set out clear dates and penalties and clarify the other rules. The National Institute of Standards and Technology has provided several products and documents to help figure out the details with specifics, but the goal of all of these measures is to make it as difficult as possible for unauthorized persons to view protected health information.


Yes, doctors still use faxes, and they spend thousands of dollars per year on fax machines, fax paper, telephone lines, and long distance bills to enable it. Why? Two reasons: First, despite the allocation of $30 billion in the HITECH Act of 2009 to subsidize the digitization of health records, the cost of developing general electronic health record (EHR) software that is compliant with HIPAA’s restrictions is high. That cost gets passed on to the hospitals and doctors’ offices that use the software. Second, efforts to standardize EHR for interoperability are still in the ‘efforts’ stage. That is, even if your general practitioner’s office buys a fancy HIPAA-compliant software system for storing electronic health records, there’s no guarantee that when it sends your record to the hospital on your behalf, the hospital’s software will be capable of understanding it.

In order to create a true Health Information Exchange (HIE), a number of standards have to be in place, including but not limited to:

  • The healthcare vocabulary must be standardized, so that words and acronyms mean the same thing in electronic health records across the board.
  • PHI stored electronically (e.g., EHR) must have a consistent structure, so that it can be understood by different systems.
  • Communication channels between systems must be secured, whether via email or through a third-party exchange system or authenticated drop box.

As you can imagine, building a set of standards robust enough to account for all EHR requirements and formats is complex, and would require the kind of consensus rarely seen in either government or non-governmental organizations (NGOs). Implementing those standards would be expensive, leading to expensive software products.

Faxes? Faxes work everywhere, and they work everywhere right now, so doctors, nurses, insurance companies, and many other organizations which handle PHI still use them.

How can this fundamentally insecure technology be HIPAA-compliant? Procedures surrounding the sending and receiving of faxes stand in for technological solutions, since the technology was in place prior to HIPAA. When developing a new technology or software solution, the full scope of HIPAA applies. Because of this, it’s useful for us as technologists to understand HIPAA as best we can.


With this article we’ve introduced the concepts of HIPAA and PHI, and reviewed the broad brush strokes of compliance. We’ve also hinted at the difficulty inherent in developing software that is HIPAA compliant, especially if it needs to communicate with external systems. In the first follow-up, we’ll speak specifically about what this means for caching data using Redis, and describe how to implement a HIPAA-compliant Redis server.


[1] https://privacyruleandresearch.nih.gov/pr_06.asp
[2] https://privacyruleandresearch.nih.gov/pr_06.asp#6b
[3] http://www.hhs.gov/hipaa/for-professionals/covered-entities/sample-business-associate-agreement-provisions/index.html
[4] https://www.gpo.gov/fdsys/pkg/FR-2013-01-25/pdf/2013-01073.pdf

Ben Vandgrift


Principal Consultant

Ben Vandgrift struggles daily with a compulsion to solve problems, especially the tough ones. His decades-old journey has most recently led him to Levvel, where he solves complex architectural and design problems for a variety of clients. When he’s not working as a software engineer, he stays busy trying not to be mauled by his rescue panther.

Author of Clojure Applied: From Practice to Practitioner.
