|By Jon Shende||
|February 7, 2013 03:00 PM EST||
Over the last few weeks I've been hearing a lot of discussion around HIPAA. When we speak about HIPAA, the two components of data security and data privacy invariably arise.
In traditional data centers, database managers and data owners know where their data resides and can implement the processes needed to preserve privacy and audit access.
When we move to the cloud, however, servers, network, and storage are abstracted away. This raises the concern that data owners may not know where their data sets physically reside, and that Cloud Service Provider (CSP) employees may handle confidential patient data or Personally Identifiable Information (PII).
For healthcare organizations looking to leverage the cloud ecosystem, the foremost concerns are compliance with HIPAA and the HITECH Act, along with meaningful use provisions.
So what is meaningful use? According to HealthIT.gov:
"Meaningful use is the set of standards defined by the Centers for Medicare & Medicaid Services (CMS) Incentive Programs that governs the use of electronic health records and allows eligible providers and hospitals to earn incentive payments by meeting specific criteria."
The goal of meaningful use is to promote the spread of Electronic Health Records (EHR) to improve health care in the United States.
Benefits of meaningful use of EHRs include:
- Complete and accurate information.
- Better access to information.
- Patient empowerment.
Healthcare organizations are positioning themselves to attain meaningful use, both to capture the incentives allocated by the Federal Government and to ensure that reimbursements are not jeopardized for providers that fall short of the meaningful use provisions.
As healthcare practitioners and organizations increase their use of technology in delivering clinical care, their IT departments face added pressure to provide on-demand availability and operate data centers approaching 99.999 percent availability. In most cases this is a major challenge, bringing the risk of unscheduled outages and costly remediation.
Assuring high availability for healthcare applications means meeting uptime requirements, and in today's environments that requires access to more than one data center. This can significantly increase the capital investment in data center infrastructure for healthcare organizations.
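To make those uptime targets concrete, the arithmetic behind "five nines" can be sketched as follows (this is standard availability math, not specific to any CSP or to the article):

```python
# Translate availability percentages into allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Minutes of downtime per year permitted at a given availability level."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% availability -> {allowed_downtime_minutes(pct):.2f} min/year of downtime")
```

At 99.999 percent, a data center is allowed only about 5.26 minutes of downtime per year, which illustrates why a single site rarely suffices.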
Looking to the cloud is not only the next step in service delivery; it can also ensure high availability of clinical applications while allowing a healthcare organization to leverage the expertise and financial stability of an established CSP. Another advantage of a cloud ecosystem is rapid provisioning and deployment, with the ability to change compute capacity as demand changes.
Thus, in the event of failure, server instances can be moved seamlessly to alternate hosts, or clustered in advance to provide redundancy.
Some may ask whether it is risky to transfer data from site to cloud. Generally it is not, as most organizations move data over the Internet through encrypted channels. Where concerns arise is at the hand-off of data into the CSP environment.
In a seamless environment, all data is encrypted site to site, up to and including storage. Where we can see some gaps is in the support offered by healthcare application vendors.
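As a minimal sketch of the "encrypted channels" point, a client can insist on certificate validation and a modern TLS version before any data leaves the site, using Python's standard-library `ssl` module (the settings shown are illustrative defaults, not a CSP-specific configuration):

```python
import ssl

# Build a client-side TLS context for connections to a CSP endpoint.
ctx = ssl.create_default_context()

# The defaults already require certificate validation and hostname checking,
# which is what makes the channel trustworthy, not just encrypted.
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname

# Refuse legacy protocol versions for data in transit.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

This context would then be passed to whatever transfer client the organization uses; the key point is that transport encryption is enforced by policy on the sending side, not assumed of the CSP.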
In the cloud, it is a given that a number of people have access to the physical servers and storage, people over whom cloud consumers have no control. For an IT security professional this raises a real tension: complete control is being relinquished, and assurance can only come from prescriptive precautions defined by the CSP.
The cloud computing ecosystem is still evolving, and there remains a lack of industry-wide certifications. As the ecosystem matures, the intent is to drive toward processes, best practices, and certifications that provide legal protection and reduce the complexity of lengthy negotiations and SLA requirements.
Within a regular data center, or even a small IT shop, one of my first expectations as an IT security leader is some form of centralized logging with automation. Carrying that mindset into the cloud ecosystem (clouds are, after all, data centers), healthcare customers' security leaders should expect detailed reporting as a given.
Having worked separately on security strategy and assessment for a few cloud computing projects, I have seen first-hand that access rights were a major focus. In light of this, it is not a complex process to segment solutions for healthcare: any access to servers and storage dedicated to a healthcare customer, by anyone within a CSP organization, can be logged, providing assurance of controls around access.
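The kind of access logging described above can be sketched as a structured audit record emitted to a centralized logger. This is a simplified illustration using Python's standard `logging` and `json` modules; the field names are my own assumptions, not a HIPAA-mandated schema:

```python
import json
import logging
from datetime import datetime, timezone

# Centralized audit logger; in practice this handler would ship records
# to the organization's log-aggregation system rather than stderr.
audit_log = logging.getLogger("csp.audit")
audit_log.setLevel(logging.INFO)

def log_access(actor: str, resource: str, action: str) -> str:
    """Emit one JSON audit record for an access event and return it."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,        # CSP employee or service account
        "resource": resource,  # server/storage dedicated to the customer
        "action": action,      # e.g. "read", "write", "login"
    }
    line = json.dumps(entry)
    audit_log.info(line)
    return line

record = log_access("csp-admin-01", "storage/patient-db", "read")
```

Because every record is machine-readable, reports for a healthcare customer can be generated automatically by filtering on the resources dedicated to them.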
From a legal perspective, and more specifically in contracts, healthcare customers expect provisions for strong financial penalties to indemnify against a breach and to hold the CSP accountable.
Some CSPs are moving to provide a HIPAA Business Associate Agreement (BAA) for their healthcare customers. The BAA demonstrates that the CSP meets the compliance requirements of the HIPAA and HITECH Acts, enabling the required physical, technical, and administrative safeguards.
In closing, I will state that HIPAA compliance and cloud computing do not have to be in conflict. Rather, healthcare entities can leverage the benefits of the cloud, coupled with the necessary due diligence and legal contracts, to meet their needs.