
Cloud Security: Encryption Is Key

Cloud security should include a blend of traditional security elements combined with new “cloud-adjusted” security technologies

Today, with enterprises migrating to the cloud, the security challenge of protecting data is greater than ever. Keeping data private and secure has always been a business imperative, but for many companies and organizations it has also become a compliance requirement and a condition of staying in business. Regulations and standards including HIPAA, Sarbanes-Oxley, PCI DSS, and the Gramm-Leach-Bliley Act all require organizations to protect their data at rest and to defend against data loss and threats.

Public cloud computing is the delivery of computing as a service rather than as a product, and is usually categorized into three service models: Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS). When it comes to public cloud security, all leading cloud providers invest significant effort and resources in securing and certifying their data centers. However, as cloud computing matures, enterprises are learning that cloud security cannot be delivered by the cloud provider alone. In fact, cloud providers make it clear that security is a shared responsibility: cloud customers remain responsible for data security, protection against unauthorized access, and backup of their own data.

This "shared responsibility" makes sense most of the time. The responsibility of cloud providers offering PaaS and IaaS reasonably extends to the network and the infrastructure they provide. A typical agreement between you and your cloud provider will usually state that "...you acknowledge that you bear sole responsibility for adequate security..." Businesses hosting their applications in the cloud therefore understand that they must share responsibility for securing their data.

As cloud computing becomes increasingly mainstream, it is harder to distinguish the generic security issues an IT manager needs to tackle from those specific to cloud computing. Issues such as roles and responsibilities, secure application development, and least privilege apply just as well in traditional on-premises environments as they do in the cloud.

When an IT application is moved to a public cloud, all of its old security risks still exist, and new risk vectors are added on top of them. Previously, your servers and data were physically protected within your server room. Now the "virtual servers" and "virtual storage devices" are accessible to you, the customer, via a browser, raising the concern that hackers may attempt to access them the same way. Here are some new risk scenarios to consider when migrating to the cloud:

  1. Snapshotting your virtual storage by gaining access to your cloud console.
    A malicious user might gain access to your cloud console by stealing your credentials or by exploiting vulnerabilities in cloud access control. Either way, once inside your account, a "snapshot" of your virtual disks allows the attacker to move a copy of your virtual storage to a location of his or her choosing and abuse the data stored on those disks. This risk is, in our opinion, the most obvious reason to deploy data encryption in the cloud; surprisingly, not all companies are aware of the threat, and many unknowingly expose their cloud-residing data to it (a minimal sketch of the mitigation follows this list).
  2. Gaining access from a different server within the same account.
    An attacker can reach sensitive data from a different virtual server inside the same account by exploiting a vulnerability (such as a misconfiguration) on that other server. Alternatively, one of your other cloud system administrators (a "malicious insider" from a different project in your own organization) can use credentials, or exploit one of many known web application vulnerabilities, to launch an attack on your virtual server. Unencrypted data can be exposed and stolen this way.
  3. The insider threat.
    Though this scenario gets mentioned a lot, it is unlikely that a cloud provider employee will be involved in data theft. The more realistic scenario is an accidental incident involving an insider with physical access to the data center. One well-known example is HealthNet, a major US health insurer, which lost 1.9 million customer records after its IT vendor misplaced nine server drives during a move to a new data center. Under HIPAA's breach rules, disk-level encryption would have rendered the lost data unreadable and largely negated the impact of the incident.
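
To make the mitigation for these scenarios concrete, here is a minimal sketch of client-side encryption at rest, written in Python with the cryptography package. The file name and the key-fetching helper are hypothetical placeholders, not any particular product's API. Because the key never lives on the virtual disk, a stolen snapshot yields only ciphertext.

```python
# pip install cryptography
# Minimal sketch: encrypt data before it reaches virtual storage, so a
# stolen disk snapshot contains only ciphertext. Names are illustrative.
from cryptography.fernet import Fernet

def fetch_key_from_external_manager() -> bytes:
    # Hypothetical placeholder: in practice the key would come from a
    # key-management service outside the cloud account, never from the
    # virtual disk itself.
    return Fernet.generate_key()

key = fetch_key_from_external_manager()
f = Fernet(key)

record = b"patient=Jane Doe; diagnosis=..."  # sensitive data
ciphertext = f.encrypt(record)

# Only ciphertext is written to cloud storage; a snapshot of the disk
# exposes nothing readable without the externally held key.
with open("records.enc", "wb") as out:
    out.write(ciphertext)

# Decryption requires the key, which never resided on the snapshot.
assert Fernet(key).decrypt(ciphertext) == record
```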

The industry consensus is that encryption is an essential first step in achieving cloud computing security. An effective solution needs to meet four critical requirements: high security, convenient management, robust performance, and regulatory compliance. Data at rest no longer sits within the proverbial "four walls" of the enterprise; the data owner manages it with browsers and cloud APIs, and the concern is that a hacker can do the same. As such, cloud encryption is recognized as a basic building block of cloud security, though one difficult question has remained: where to store the encryption keys, since the keys cannot safely be stored in the cloud along with the data.

Protecting Content with Cloud Encryption and Key Management
Encryption technology is only as secure as its encryption keys, so you have to keep your keys in a safe place. You need a cloud key management solution that supports encryption of your data and supplies the encryption keys for files, databases (whether the complete database or the column, table, or tablespace level), or disks. This is the trickiest security question when implementing encryption in the cloud, and it requires thought and expertise. For example, database encryption keys are often kept in a database "wallet," which is typically a file on your virtual disk. The concern is that hackers will attack the virtual disk in the cloud, reach the wallet from there, and through the wallet access the data.
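
One common pattern for keeping keys out of an attacker's reach is envelope encryption. The sketch below, again in Python with the cryptography package, is a hedged illustration rather than a specific vendor's implementation; the fetch_kek helper is a hypothetical stand-in for an external key manager. A data-encryption key (DEK) protects the data, a key-encryption key (KEK) held outside the cloud wraps the DEK, and only the wrapped DEK is stored in the cloud-side "wallet."

```python
# pip install cryptography
# Sketch of envelope encryption: a data-encryption key (DEK) protects the
# data, and a key-encryption key (KEK) held outside the cloud protects the
# DEK. Only the *wrapped* DEK is stored alongside the data.
from cryptography.fernet import Fernet

def fetch_kek() -> bytes:
    # Hypothetical placeholder: the KEK lives in an external key manager
    # and is never written to the cloud disk.
    return Fernet.generate_key()

kek = fetch_kek()

# 1. Generate a fresh DEK and encrypt the data with it.
dek = Fernet.generate_key()
ciphertext = Fernet(dek).encrypt(b"column data: account=12345")

# 2. Wrap the DEK with the KEK; the wrapped DEK is safe to keep in the
#    cloud "wallet" because it is useless without the external KEK.
wrapped_dek = Fernet(kek).encrypt(dek)

# 3. To decrypt later: fetch the KEK, unwrap the DEK, decrypt the data.
recovered_dek = Fernet(kek).decrypt(wrapped_dek)
plaintext = Fernet(recovered_dek).decrypt(ciphertext)
assert plaintext.startswith(b"column data")
```

The design point is that everything an attacker can steal from the virtual disk, the ciphertext and the wrapped DEK, is useless without the KEK, which never resides with the data.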

Conclusion
Encrypting sensitive data in the cloud is an absolute must. Cloud security should include a blend of traditional security elements combined with new "cloud-adjusted" security technologies. Encryption should be a key part of your cloud security strategy, both because of the new cloud threat vectors and because of regulations such as the Patriot Act, and you should pay specific attention to key management.

About the Author

Ariel Dan is co-founder and Executive Vice President at Porticor, a cloud security company. Follow him on Twitter: @ariel_dan
