Who’s Responsible for Protecting Data Stored in the Cloud?

Cloud is the natural evolution of the data center - but with it comes responsibility

With cloud comes the notion of liberation. Cloud is the natural evolution of the data center. It’s easy to deploy, infinitely scalable, and highly redundant. It is the shiny new component inside the storage controller, and it is making it possible for an old dog to learn some very impressive new tricks. But with the cloud comes responsibility.

An article recently appeared over at BusinessWeek explaining how many businesses now operate under the assumption that once their data is sent offsite they need not be concerned with protecting it. In a perfect world, this is how it should work. One of the main selling points of outsourcing infrastructure is the idea that there is now one less thing for IT to worry about. However, before any business can trust a third party to protect their invaluable corporate IP, some due diligence must be conducted.

The two areas businesses need to be concerned with are:

  1. Security and Encryption - Is my data protected from a malicious third party?
  2. Durability - What happens if I lose the local copy of my data? What if one or more of the cloud provider’s data centers are destroyed?

Security and Encryption
Not that one is more important than the other, but most of the focus tends to fall on security and encryption. No matter what a vendor promises in its SLA, one of the most critical components of “protecting data” is how the encryption keys are handled. Do you generate your own key? Who controls it? Does data remain encrypted at rest, or is it modified in the cloud (outside of your control)?

The only way to ensure absolute security is to use a system in which the end user (the IT/storage admin) generates and controls the crypto keys. Under no circumstances should the storage vendor hold a copy of your keys. If data is encrypted before it leaves your data center using a trusted form of encryption (e.g., OpenPGP) and the end user is the only party controlling the keys, the data cannot be read by a third-party hacker or by a malicious employee at the cloud provider.
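
To make the idea concrete, here is a minimal sketch in Python of the client-side approach. It substitutes symmetric encryption from the cryptography package for OpenPGP to keep the example short, and the file, key path, and bucket names are hypothetical placeholders; the point is simply that the key is generated and kept by the end user, and only ciphertext ever leaves the building.

    # Sketch: encrypt locally, keep the key, ship only ciphertext offsite.
    # Uses the "cryptography" package (Fernet) in place of OpenPGP for brevity;
    # the file, key path, and bucket names below are hypothetical.
    import boto3
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # generated and held by the end user
    with open("local-escrow.key", "wb") as key_file:
        key_file.write(key)                  # the vendor never receives this file

    with open("quarterly-report.pdf", "rb") as source:
        ciphertext = Fernet(key).encrypt(source.read())

    # Only the encrypted blob is uploaded; the provider stores data it cannot read.
    boto3.client("s3").put_object(
        Bucket="example-corp-archive",
        Key="quarterly-report.pdf.enc",
        Body=ciphertext,
    )

Decryption simply reverses the process on-premises with the same key, which is why key custody (and key backup) stays squarely with the customer.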

Durability
The second component has to do with the actual durability and redundancy of your data. You could be using the best security and encryption in the world, but if your data exists in only two physical data centers and both are wiped out, your data is gone. One of the inherent benefits of the cloud is geographic replication. When a file is stored in Amazon S3 or Microsoft Azure it is saved on multiple servers located in multiple data centers. It is through this geographically dispersed replication that Amazon is able to offer its 99.999999999% (eleven nines) durability. Even if you used site-to-site replication and nightly tape backups, the cost to achieve anywhere near that level of durability would be astronomical. It is for this reason that the large cloud providers will continue to dominate the market as the barriers to entry continue to rise. It’s no small task to build a cloud. Iron Mountain’s recent exit from public cloud storage is one example, and it’s likely we’ll see more consolidation in the future.
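
A back-of-the-envelope calculation shows why replication compounds so quickly. The sketch below assumes each replica is lost independently with a 1-in-1,000 chance per year, which is purely illustrative; real provider durability models are far more involved, but the multiplicative effect is the same.

    # Illustrative only: probability of losing every copy of an object in a year,
    # assuming each replica fails independently with probability p.
    p = 0.001                      # assumed annual loss probability of one replica
    for copies in (1, 2, 3, 4):
        loss = p ** copies         # data is gone only if every replica is lost
        print(f"{copies} copies: durability = {(1 - loss) * 100:.10f}%")

Three or four independent copies already push the odds of loss into one-in-a-billion territory, which is roughly the neighborhood the big providers advertise.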

A Team Effort
So, to answer the question of who is responsible for protecting data stored in the cloud: as you’ve likely guessed, it’s a combination of the vendor and the end user. The vendor needs to do its part to ensure the data is secured by proper use of encryption (the customer holds the keys!) and replicated across multiple data centers in multiple geographies. The end user should have a working knowledge of proper key handling and ask the right questions about data durability. With the right boxes checked, there is no reason data stored in the cloud cannot be as secure as, if not more secure than, data stored on-site.

More Stories By Louis Abate

Louis is a connoisseur of technology, photography and music. As a passionate tastemaker, he is on a lifelong mission to seek out and evangelize best-of-breed products and services.

At Nasuni, he heads up the multimedia content creation and social aspects of the company. With one eye on the constant flow of industry news and the other on Nasuni’s services, you’ll find Louis writing about the evolving storage industry and general musings about high technology in the modern office.
