The HIPAA Final Rule and Staying Compliant in the Cloud

As healthcare and patient data move to the cloud, HIPAA compliance issues follow

The HIPAA Omnibus Final Rule went into effect on March 26, 2013. The compliance date for the new rules is September 23, 2013, although companies operating under existing business associate agreements (BAAs) may be allowed an extension until September 23, 2014.

As healthcare and patient data move to the cloud, HIPAA compliance issues follow. With many vendors, consultants, and internal and external IT departments at work, the question of who is responsible for compliance comes up quite often. Not all organizations are equipped or experienced enough to meet the HIPAA compliance rules on their own. Given the nature of the data and patient privacy rules, it is important to secure the data correctly the first time.

HIPAA and the Cloud
Do you have to build your own cloud HIPAA compliance solutions from scratch? The short answer is no. There are solutions and consulting companies available to help move patient data to the cloud and to secure it in accordance with HIPAA compliance rules and best practices.

The following checklist provides a guide to help plan for meeting the new HIPAA compliance rules.

A Cloud HIPAA Compliance Checklist

1. Ensure “Business Associates” are HIPAA compliant

- Data centers and cloud providers that serve the healthcare industry fall into the category of “business associates.”

- A business associate can also be any entity that “…creates, receives, maintains, or transmits protected health information (PHI) on behalf of a covered entity.” This means document storage companies and cloud providers now officially have to follow HIPAA rules as well.

- Subcontractors are also considered business associates if they create, receive, transmit, or maintain PHI on behalf of a business associate.

- As business associates, they must meet the compliance rules for all privacy and security requirements.

What can you do?
Ensure business associates and subcontractors sign a business associate agreement and follow the HIPAA compliance rules for themselves and any of their subcontractors. A sample Business Associate Agreement is available on the HHS.gov website.

What happens if you are in violation?
The Office for Civil Rights (OCR) investigates HIPAA violations and can impose fines of $100 to $50,000 per violation, capped at $1.5 million per year for identical violations. The penalties are harsh to help ensure that data is kept safe and that companies follow the HIPAA rules.

2. Data Backup

- Health care providers, business associates, and subcontractors must have a backup contingency plan.

- Requirements state that the contingency plan has to include a data backup plan, a disaster recovery plan, and an emergency mode operations plan

- The backup vendor needs to encrypt backup images during transit to its off-site data centers so that data cannot be read without the encryption key

- The end user/partner is required to encrypt the source data to meet HIPAA compliance (see the sketch at the end of this section)

What can you do?

If you handle the data backup internally, set a plan to meet HIPAA compliance and execute it.
If you have external backup solution providers, ensure they have a working plan in place.
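
To make the encryption requirements above concrete, here is a minimal sketch, in Python with the open-source `cryptography` package, of encrypting a backup archive on the client side before it ever leaves your environment, so the backup vendor only receives ciphertext. The file names and key handling are illustrative assumptions rather than a prescribed implementation; in practice the key must live in a hardened key-management system, separate from the backups themselves.

```python
# Minimal sketch: client-side encryption of a backup archive before upload.
# Assumes the `cryptography` package is installed; file names are examples.
from cryptography.fernet import Fernet

def encrypt_backup(source_path: str, encrypted_path: str, key: bytes) -> None:
    """Encrypt a backup archive with a symmetric key so only ciphertext
    is transmitted to and stored at the backup vendor."""
    f = Fernet(key)
    with open(source_path, "rb") as src:
        ciphertext = f.encrypt(src.read())
    with open(encrypted_path, "wb") as dst:
        dst.write(ciphertext)

if __name__ == "__main__":
    # In production, fetch the key from a key-management service;
    # never store it alongside the encrypted backups.
    key = Fernet.generate_key()
    encrypt_backup("nightly_backup.tar.gz", "nightly_backup.tar.gz.enc", key)
```

Because the archive is encrypted before it is handed off, the vendor cannot read the data without the key, which addresses the in-transit requirement above; the upload itself should still run over a secure channel such as TLS.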

3. Security Rules

- Physical safeguards, such as facility access controls, need to be implemented to secure the facility

- Develop procedures to address and respond to security breaches

- There are also 18 technical security standards and 36 implementation specifications

What can you do?

Put a plan in place to protect data from internal and external threats and to limit access to only those who require it.

4. Technical Safeguards
Health care providers, business associates, and subcontractors must implement technical safeguards. While many technical safeguards are not required, they do mitigate your risk in the event of a breach. In particular, encryption of sensitive data allows you to claim “safe harbor” in the case of a breach.

- Study encryption and decryption of electronically protected health information

- Use AES encryption for data “at rest” in the cloud (see the sketch at the end of this section)

- Use strong, highly protected encryption key management; this is the most sensitive and difficult piece on this list, so consider using split-key cloud encryption or homomorphic key management

- Transmission of data must be secured: use SSL/TLS or IPsec

- When any data is deleted in the cloud, any mirrored version of the data must be deleted as well

- Limit access to electronically protected health information

- Implement audit controls and procedures that record and analyze activity in information systems that contain electronically protected health information

- Implement technical security measures, such as strong authentication and authorization, to guard against unauthorized access to electronically protected health information transmitted over electronic communication networks

What can you do?

Adopt strong encryption technology and develop a plan to ensure data is transmitted, stored, and deleted securely. Develop a plan to monitor and control access to data.
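
As an illustration of the “at rest” encryption bullet above, the following is a minimal sketch, in Python with the `cryptography` package, of encrypting and decrypting a PHI record with AES-256-GCM. The record structure and key source are assumptions for illustration only; in a production system the key would come from a key-management service (ideally split-key, so no single party ever holds the complete key), and transmission of the ciphertext would still go over SSL/TLS.

```python
# Minimal sketch: AES-256-GCM encryption of a PHI record "at rest".
# Assumes the `cryptography` package; the record fields are illustrative.
import json
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_phi_record(record: dict, key: bytes) -> bytes:
    """Serialize and encrypt a PHI record; returns nonce + ciphertext."""
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)                       # unique nonce per encryption
    plaintext = json.dumps(record).encode("utf-8")
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_phi_record(blob: bytes, key: bytes) -> dict:
    """Decrypt a record; raises an exception if the ciphertext was altered."""
    aesgcm = AESGCM(key)
    nonce, ciphertext = blob[:12], blob[12:]
    return json.loads(aesgcm.decrypt(nonce, ciphertext, None))

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)    # AES-256 key
    blob = encrypt_phi_record({"patient_id": "12345", "diagnosis": "example"}, key)
    print(decrypt_phi_record(blob, key))
```

GCM is an authenticated encryption mode, so any tampering with the stored ciphertext is detected at decryption time, which also supports the integrity and audit goals listed above.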

5. Administrative Safeguards

For organizations to meet HIPAA compliance, they must have HIPAA Administrative Safeguards in place to “prevent, detect, contain and correct security violations.” Policies and procedures are required to deal with risk analysis, risk management, workforce sanctions for non-compliance, and a review of records.

- Assign a privacy officer responsible for developing and implementing HIPAA policies and procedures

  • Ensure that business associates also have a privacy officer, since they are also liable for complying with the Security Rule

- Implement a set of privacy procedures to meet compliance in four areas:

Risk Analysis
“Conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity”

Risk Management
“Implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level to comply with §164.306(a).”

Workforce Sanctions for Non-Compliance
“Apply appropriate sanctions against workforce members who fail to comply with the security policies and procedures of the covered entity.”

Review of Records
“Implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.”

- Provide ongoing administrative employee training on Protected Health Information (PHI)

- Implement a procedure and plan for internal HIPAA compliance audits (see the sketch at the end of this section)

What can you do?
Develop an internal plan to meet HIPAA compliance and appoint a privacy officer to implement the requirements. Ensure that policies and procedures address risk analysis, risk management, policy violations, and sanctions for staff or contractors who violate policy. Develop and maintain documentation of internal policies to meet HIPAA compliance, as it will help define those policies for your organization and could assist during a HIPAA audit.
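
The “review of records” specification above can be partially automated. Below is a minimal sketch, in Python using only the standard library, that scans a PHI access log and flags entries outside business hours for the privacy officer to review. The CSV log format, column names, and the definition of business hours are assumptions made for illustration; real audit logs and review thresholds will differ by organization.

```python
# Minimal sketch: flag after-hours PHI access in an audit log for review.
# The log format (timestamp,user,patient_id,action) and the business-hours
# policy are assumptions; adapt both to your own systems and policies.
import csv
from datetime import datetime

BUSINESS_HOURS = range(7, 19)  # 07:00-18:59, an assumed policy

def flag_after_hours_access(log_path: str) -> list:
    """Return audit-log rows where PHI was accessed outside business hours."""
    flagged = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if ts.hour not in BUSINESS_HOURS:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for row in flag_after_hours_access("phi_access_log.csv"):
        print("REVIEW:", row["user"], "accessed", row["patient_id"], "at", row["timestamp"])
```

A report like this does not replace the required review procedures, but it gives the privacy officer a concrete starting point for the regular review of audit logs, access reports, and security incident tracking.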



More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor, is a pioneer of cloud computing. He has built SaaS clouds for small and medium enterprises at SAP (as CTO, Small Business), contributing to several SAP products and reaching more than 8 million users. More recently he created a consumer cloud at G.ho.st, a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people, and a variety of cloud-based applications. He is now CEO of Porticor, a leader in Virtual Privacy and Cloud Security.
