Among other things, HIPAA protects patients’ health information.
Anthem, one of the largest US health insurers, had to learn this the hard way.
What started with a simple phishing email led to the biggest healthcare data breach in history. The hackers stole the data of 79 million patients, including their names, social security numbers, and medical IDs.
The enraged patients sued Anthem and won a $115 million settlement. Although the company avoided the regulator’s fines, it had to spend up to $260 million to improve its security.
The HHS Office for Civil Rights (OCR) oversees HIPAA compliance. In 2017 alone, it fined US healthcare providers almost $20 million.
Even if you’re a small organization, neglecting HIPAA requirements can lead to serious problems.
In 2013, Fresenius Medical Care North America had five data breaches. Combined, they exposed the data of just 525 patients. But the company had to pay a monstrous $3.5 million fine because it hadn’t properly analyzed its security risks.
Depending on the degree of negligence, there are four tiers of HIPAA fines:
- Tier 1: the violator was unaware of the violation and couldn’t have reasonably avoided it;
- Tier 2: the violator knew about the violation but couldn’t have avoided it even with reasonable care;
- Tier 3: the violation resulted from willful neglect, but was corrected within 30 days;
- Tier 4: the violation resulted from willful neglect and wasn’t corrected in time.
You should understand these three key terms before you tackle HIPAA requirements.
- Protected health information (PHI) – any data that can be used to identify a patient.
PHI consists of two parts: health information and personal identifiers. The latter include patients’ names, addresses, birthdates, social security numbers, medical records, photos, etc. The mere fact that an individual has received medical services is PHI in itself.
- Covered Entities – organizations and individuals offering healthcare services/operations, or accepting payments for them.
They include all healthcare providers (e.g. hospitals, doctors, dentists, psychologists), health plans (e.g. insurance providers, HMOs, government programs like Medicare and Medicaid) and clearinghouses (the organizations that act as middlemen between the healthcare providers and insurance companies).
- Business Associates – the third parties handling PHI on behalf of covered entities.
This category includes the developers of health care apps, hosting/data storage providers, email services, etc.
According to HIPAA, you must sign a Business Associate Agreement (BAA) with each party that has access to your PHI. Deciding not to sign a BAA doesn’t free you from HIPAA requirements.
Sensitive data can enter your system in many unintended ways.
Take, for example, a service that allows doctors to diagnose skin conditions based on anonymous photos. The app doesn’t handle PHI as you can’t identify its users. But as soon as you add the person’s name or address to the photos, they become PHI.
If your application collects, stores, or transmits PHI to the covered entities, you need to comply with HIPAA.
How to become HIPAA-compliant?
To be HIPAA compliant, you’ll have to make regular technical and non-technical evaluations of your efforts to protect health information and thoroughly document them. The regulator has published a sample audit protocol that can help you assess your HIPAA compliance.
You can hire an independent auditor to do the assessment for you. Many organizations, such as HITRUST, specialize in that sort of thing. Just remember that OCR doesn’t recognize any third-party certifications.
Technical Safeguards. Security measures like login, encryption, emergency access, activity logs, etc. The law doesn’t specify what technologies you should use to protect PHI.
Physical Safeguards are aimed at securing the facilities and devices that store PHI (servers, data centers, PCs, laptops, etc.).
With modern cloud-based solutions, this rule mostly applies to HIPAA compliant hosting.
The law applies to a wide range of medical software. A hospital management system (HMS) differs radically from a remote diagnostics app. But some features are essential to any kind of HIPAA compliant software.
So here’s a minimum list of required features for your health care app:
1. Access control
Any system that stores PHI should limit who can view or modify the sensitive data. According to the HIPAA Privacy Rule, nobody should see more patient information than they need to do their job. The rule also covers de-identification, patients’ rights to view their own data, and their ability to grant or restrict access to their PHI.
One way to accomplish this is to assign each user a unique ID. This would allow you to identify people accessing your system and track their activity.
Next, you’ll have to give each user a list of privileges that would allow them to view or modify certain information. You can regulate access to individual database entities and URLs.
In the simplest form, user-based access control consists of two database tables. One table contains the list of all privileges and their IDs. The second table assigns these privileges to individual users.
In this example, the physician (user ID 1) can create, view, and modify the medical records, while the radiologist (user ID 2) can only update them.
A role-based access control is another way to implement this requirement. With it, you can assign privileges to different groups of users depending on their position (e.g. physicians, lab technicians, administrators).
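As an illustration, here’s a minimal role-based access control sketch in Python. The role and privilege names are hypothetical examples, not prescribed by HIPAA:

```python
# Minimal role-based access control sketch.
# Role and privilege names are illustrative only.
ROLE_PRIVILEGES = {
    "physician": {"create_record", "read_record", "update_record"},
    "radiologist": {"read_record", "update_record"},
    "lab_technician": {"read_record"},
}

def is_authorized(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in ROLE_PRIVILEGES.get(role, set())
```

In a real system, the role-to-privilege mapping would live in the database alongside the user accounts, so privileges can be audited and revoked without redeploying the application.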
2. Person or entity authentication
After you’ve assigned privileges, your system should be able to verify that the person trying to access PHI is who they claim to be. The law offers several general ways to implement this safeguard:
- Biometrics (e.g. a fingerprint, a voice or face ID);
- Physical means of identification (e.g. a key, card, or a token);
- Personal Identification Number (PIN).
A password is one of the simplest authentication methods. Sadly, it’s also one of the easiest to crack. According to Verizon, 63% of data breaches happen due to weak or stolen passwords. Another report states that one-fifth of corporate users have easily compromisable passwords.
On the other hand, a truly secure password:
- Consists of at least 8-12 characters including capital letters, numbers and special characters;
- Excludes the commonly used combinations (e.g. “password”, “123456”, “qwerty”, and for some inexplicable reason “monkey”) and vocabulary words;
- Doesn’t contain variations of the username;
- Isn’t reused across accounts or services.
Alternatively, it could be a string of random words smashed together like a concrete popsicle.
Your application could check these requirements on the signup screen and deny access to users with weak passwords.
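A minimal sketch of such a signup check in Python. The exact length threshold and the blocklist of common passwords are illustrative choices, not HIPAA requirements:

```python
import re

# A tiny illustrative blocklist; a real one would hold thousands of entries.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "monkey"}

def is_strong_password(password: str, username: str = "") -> bool:
    """Return True only if the password passes all the checks listed above."""
    if len(password) < 12:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    # Reject passwords containing the username.
    if username and username.lower() in password.lower():
        return False
    # Require upper case, lower case, a digit, and a special character.
    required = [r"[A-Z]", r"[a-z]", r"\d", r"[^A-Za-z0-9]"]
    return all(re.search(pattern, password) for pattern in required)
```

A production system would also check the password against known breach corpora and store only a salted hash of it, never the plain text.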
There is no such thing as too much security. Source: mailbox.org
Some organizations force their employees to change passwords every 90 or so days. But doing this too often can actually harm your security efforts. When made to change passwords, people often come up with unoriginal combinations (e.g. password ⇒ pa$$word).
Moreover, hackers can crack a bad password within seconds and use it immediately.
That’s why you should consider using two-factor authentication. Such systems combine a secure password with a second method of verification. This can be anything from a biometric scanner to a one-time security code received via SMS.
The idea is simple: even if hackers somehow obtained your password, they’d need to steal your device or fingerprints to access PHI.
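One common way to generate those one-time codes is TOTP (RFC 6238), the scheme behind most authenticator apps. Here’s a minimal standard-library sketch of it in Python; a real deployment would use a vetted library and verify codes with a tolerance window:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, digits: int = 6, step: int = 30) -> str:
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of elapsed time steps.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and the user’s device share the base32 secret once (usually via a QR code); after that, both sides can derive the same short-lived code independently, so nothing secret travels over the network at login time.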
But secure authentication isn’t enough. Some attackers might position themselves between the user’s device and your servers. This way, hackers could access PHI without compromising the account. This is known as session hijacking, a kind of man-in-the-middle attack.
One of the possible ways to hijack a session. Source: Heimdal Security
A digital signature is one of the ways to defend against such attacks. Re-entering the password when signing off on a document would prove the user’s identity.
As the roles in your system get more complex, HIPAA authorization can get in the way of aiding patients. It makes sense to implement emergency access: such procedures allow authorized users to view any data they need when the situation requires it.
A doctor could, for example, access the PHI of any patient in an emergency. At the same time, the system would automatically notify several other people and launch a review procedure.
3. Transmission security
You should protect the PHI you send over the network and between the different tiers of your system.
That’s why you should force HTTPS for all your communications (or at least for the signup screens, all pages containing PHI, and authorization cookies). This secure communication protocol encrypts data with SSL/TLS. Using a special algorithm, it turns PHI into a string of characters that is meaningless without the decryption keys.
A file called SSL certificate ties the key to your digital identity.
When establishing an HTTPS connection with your application, the browser requests your certificate. The client then checks its credibility and initiates the so-called SSL handshake. The result is an encrypted communication channel between the user and your app.
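One way to force HTTPS in a Python web application is a small WSGI middleware that redirects plain-HTTP requests. This is a sketch; in practice, the redirect (plus an HSTS header) is usually configured at the load balancer or web server:

```python
def require_https(app):
    """Wrap a WSGI app so that plain-HTTP requests get a 301 redirect to HTTPS."""
    def middleware(environ, start_response):
        if environ.get("wsgi.url_scheme") != "https":
            host = environ.get("HTTP_HOST", "localhost")
            path = environ.get("PATH_INFO", "/")
            start_response("301 Moved Permanently",
                           [("Location", "https://" + host + path)])
            return [b""]
        return app(environ, start_response)
    return middleware
```

Adding a `Strict-Transport-Security` header to HTTPS responses tells browsers to skip the insecure request entirely on future visits.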
SSL handshake; source: the SSL store
Email isn’t a secure way to send PHI.
Popular services like Gmail don’t provide the necessary protection. If you send emails containing PHI beyond your firewalled server, you should encrypt them. There are many services and browser extensions that can do this for you. Alternatively, you can use a HIPAA compliant email service like Paubox.
You should also implement policies that limit what information can be shared via email.
4. Data encryption
Encryption is the best way to ensure PHI integrity. Even if hackers managed to steal your data, it would look like gibberish without the decryption keys.
Unencrypted laptops and other portable devices are a common source of HIPAA breaches. To be on the safe side, encrypt the hard drives of all the devices that contain PHI. You can do this with free encryption tools like BitLocker for Windows or FileVault for Mac OS.
5. PHI disposal
You should permanently destroy PHI when you no longer need it. As long as a copy remains in one of your backups, the data isn’t considered “disposed of”.
In addition to erasing the data, you should also properly destroy the hardware that contains PHI (e.g. hard drives).
In 2010, Affinity Health Plan returned its photocopiers to the leasing company. It didn’t, however, erase their hard drives. The resulting breach exposed the personal information of more than 344,000 patients.
Affinity had to pay $1.2 million for this incident.
PHI can hide in many unexpected places: photocopiers, scanners, biomedical equipment (e.g. MRI or ultrasound machines), portable devices (e.g. laptops), old floppy disks, USB flash drives, DVDs/CDs, memory cards, flash memory in motherboards and network cards, ROM/RAM memory, etc.
You should properly destroy these media before you throw or give them away. Depending on the situation, you can either erase them magnetically (e.g. using a degausser), overwrite the data using software like DBAN, or destroy the drive physically (e.g. smash it with a hammer).
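The overwrite approach can be sketched in a few lines of Python. This is an illustration, not a certified disposal tool, and as noted below it is unreliable on flash media, which silently remaps blocks:

```python
import os

def shred_file(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes several times, then delete it.

    Caveat: effective mainly on magnetic disks. Flash drives remap worn
    blocks, so use the manufacturer's secure-erase utilities for those.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force the bytes onto the physical disk
    os.remove(path)
```

For regulated disposal, you’d also record what was destroyed, when, how, and by whom, since HIPAA audits expect documentation of the disposal process.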
In flash-based memory drives (e.g. USB sticks), data is spread over the whole medium to prevent wear. Because of this, it’s hard to completely erase sensitive information with regular data destruction software. You can, however, use manufacturer utilities like Samsung Magician Software and other tools (including hammers) to dispose of your flash drives.
6. Data backup and storage
Backups are essential for data integrity. A database corruption or a server crash could easily damage your PHI. So could a fire in a data center or an earthquake.
That’s why it’s important to have multiple copies of your PHI stored in several different locations.
Your PHI backup plan should be based on the probability of data compromise. All high- and medium-risk information should be backed up daily and stored in a secure facility. You should also sign a BAA with your backup providers.
A backup is useless if you can’t restore it.
In August 2016, Martin Medical Practice Concepts fell victim to a ransomware attack. The company paid the hackers to decrypt the PHI, but due to a backup failure, the local hospitals lost the information of 5,000 patients.
Test your system regularly to prevent recovery failures. You should also log the system’s downtime and any failures to back up the PHI.
And remember, the backups themselves should comply with HIPAA security standards.
7. Audit controls
You should monitor what happens to the PHI stored in your system. Record each time a user logs in and out of your system. You should know who accessed, updated, modified, or deleted the sensitive data, as well as when and from where.
The absence of audit controls could lead to higher fines.
The monitoring could be done via software, hardware, or procedural means. A simple solution would be to use a table in a database or a log file to record all the interactions with the patient information.
Such a table could consist of five columns:
- user_id. The unique identifier of the user who interacted with PHI;
- entity_name. The entity the user interacted with (an entity is a representation of some real-world concept in your database, e.g. a health record);
- record_id. The entity’s identifier;
- action_type. The interaction’s nature (create, read, update, or delete);
- action_time. The precise time of interaction.
In this example, a physician (user_id 1) created a patient’s record, a radiologist viewed it, and later the same physician altered the record.
You’ll have to periodically audit the activity logs to discover whether some users abuse their privileges to access PHI.
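A minimal sketch of such an audit table in Python with SQLite, using the five columns described above (table and column names follow the example, not any mandated schema):

```python
import sqlite3
import time

def init_audit_log(conn: sqlite3.Connection) -> None:
    """Create the audit table with the five columns described above."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS audit_log (
            user_id     INTEGER NOT NULL,
            entity_name TEXT    NOT NULL,
            record_id   INTEGER NOT NULL,
            action_type TEXT    NOT NULL,  -- create / read / update / delete
            action_time REAL    NOT NULL   -- Unix timestamp of the action
        )""")

def log_action(conn: sqlite3.Connection, user_id: int, entity_name: str,
               record_id: int, action_type: str) -> None:
    """Record one interaction with patient information."""
    conn.execute(
        "INSERT INTO audit_log VALUES (?, ?, ?, ?, ?)",
        (user_id, entity_name, record_id, action_type, time.time()))
```

Every data-access path in the application would call `log_action`; the log itself should be append-only and protected at least as strictly as the PHI it describes.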
8. Automatic logoff
A system with PHI should automatically terminate the user’s session after a set period of inactivity. To continue, the user would have to re-enter their password or authenticate in some other way.
This protects PHI if somebody loses their device while logged into your app.
The exact period of inactivity triggering the logout should depend on the specifics of your system.
With a secure workstation in a highly protected environment, you can set the timer for 10-15 minutes. For web-based solutions, this period shouldn’t exceed 10 minutes. And for a mobile app, you can set the timeout for 2-3 minutes.
Different programming languages implement automatic logoff in different ways.
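As a language-agnostic illustration, here’s how an inactivity timeout could be tracked server-side in Python. The 10-minute default mirrors the web guideline above; the class and attribute names are illustrative:

```python
import time

WEB_TIMEOUT_SECONDS = 10 * 60  # suggested maximum for web-based solutions

class Session:
    """Track the last activity time and expire idle sessions."""

    def __init__(self, timeout: float = WEB_TIMEOUT_SECONDS):
        self.timeout = timeout
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        """Call on every authenticated request to reset the idle timer."""
        self.last_activity = time.monotonic()

    def is_expired(self, now: float = None) -> bool:
        """Return True once the idle period has exceeded the timeout."""
        now = time.monotonic() if now is None else now
        return now - self.last_activity > self.timeout
```

On each request, the application would check `is_expired()` before serving PHI, and force re-authentication if the session has gone stale.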
9. Extra security for mobile apps
Mobile devices present many additional risks. A smartphone could easily be stolen or lost in a high-traffic area, compromising the sensitive info.
To mitigate these risks, users can enable their devices’ built-in protections, such as screen locks, full-device encryption, and remote wipe.
You can’t force these features on users, but you can encourage people to use them. Include the instructions in your onboarding or send emails describing how to enable them.
Tip: You can store PHI in a secure container separately from the personal data. This way, you can remotely erase the health information without affecting anything else.
Many physicians use personal smartphones to send health information. You can neutralize this threat with secure messaging platforms.
Such applications host the sensitive data in a secure database. To access PHI, users have to download the messenger and log into their accounts.
Another solution is encrypted password-protected health portals where patients can read messages from their doctors. Such portals send notifications without PHI in them (e.g. “Dear User, you’ve got a new message from [redacted]”).
Remember, push notifications aren’t secure by default. They can appear on the screen even if it’s locked. So make sure you don’t send any PHI via push notifications. The same applies to SMS and any automatic messages.
Source: Bridge Patient Portal
Another thing to consider is that the FDA may classify some mHealth apps as medical devices (software that influences the decision-making of healthcare professionals).
So remember to check whether your app has to comply with other healthcare regulations before you start development. You can take this quick test from the Federal Trade Commission to get an answer.
Now you’ve got a minimum list of features for a HIPAA compliant application.
On their own, these features won’t guarantee security. They won’t protect you from phishing or social engineering.
But having these features will help convince the auditor that you’ve done enough to protect your clients’ data.
To make the audit painless, document all your HIPAA compliance efforts. For each release of your app, provide written specifications, security test plans, and the test results. To be on the safe side, gather and store only the bare minimum of sensitive information.
And don’t forget to check if you need to comply with other regulations before you start the development.
- HIPAA compliance tips for health care organizations
- PHI de-identification guidance
- 2018 changes to HIPAA
So what’s your take on HIPAA compliance? Did you find your answers, or were you left wondering? What other aspects of HIPAA compliance would you like us to cover in the next article? Write your suggestions in the comments and hit the subscribe button!