Cloud Security: Encryption Is Key

Cloud security should include a blend of traditional security elements combined with new “cloud-adjusted” security technologies

Today, with enterprises migrating to the cloud, the security challenge around protecting data is greater than ever before. Keeping data private and secure has always been a business imperative. But for many companies and organizations, it has also become a compliance requirement and a necessity to stay in business. Regulations and standards including HIPAA, Sarbanes-Oxley, PCI DSS and the Gramm-Leach-Bliley Act all require that organizations protect their data at rest and defend against data loss and threats.

Public cloud computing is the delivery of computing as a service rather than as a product, and is usually categorized into three service models: Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS). When it comes to public cloud security, all leading cloud providers are investing significant efforts and resources in securing and certifying their datacenters. However, as cloud computing matures, enterprises are learning that cloud security cannot be delivered by the cloud provider alone. In fact, cloud providers make sure enterprises know that security is a shared responsibility, and that cloud customers do share responsibility for data security, protection from unauthorized access, and backup of their data.

Actually, this "shared responsibility" makes sense most of the time. The responsibility of cloud providers offering Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) reasonably extends to the network and the infrastructure they provide. In fact, a typical agreement between you and your cloud provider will usually state that "...you acknowledge that you bear sole responsibility for adequate security..." So businesses hosting their applications in the cloud understand that they must share responsibility for ensuring the security of their data.

As cloud computing becomes increasingly mainstream, it's harder to distinguish the generic security issues that an IT manager needs to tackle from those that are specific to cloud computing. Issues such as roles and responsibilities, secure application development, least privilege and many more apply just as much in traditional on-premises environments as they do in the cloud.

When an IT application is moved to a public cloud, all of the old security risks associated with it still exist, and new risk vectors are added. Previously, your servers and your data were physically protected within your server room. Now the "virtual servers" and "virtual storage devices" are accessible to you, the customer, via a browser, raising the concern that hackers may attempt to access them the same way. Here are some new risk scenarios to consider when migrating to the cloud:

  1. Snapshotting your virtual storage by gaining access to your cloud console.
    A malicious user might gain access to your cloud console by stealing your credentials or by exploiting vulnerabilities in the cloud's access controls. Either way, once inside your account, a "snapshot" of your virtual disks allows the attacker to copy your virtual storage to a location of his or her choosing and abuse the data stored on those disks. In our opinion this is the most obvious reason to deploy data encryption in the cloud, yet surprisingly, not all companies are aware of the threat, and many unknowingly leave their cloud-residing data exposed to it.
  2. Gaining access from a different server within the same account.
    An attacker can reach sensitive data from a different virtual server inside the same account by exploiting a vulnerability on that other server (such as a misconfiguration). Alternatively, one of your other cloud system administrators (a "malicious insider" from a different project in your own organization) can use legitimate credentials, or exploit one of many known web application vulnerabilities, to launch an attack on your virtual server. Unencrypted data can be exposed and stolen this way.
  3. The insider threat.
    Though this scenario gets mentioned a lot, it's unlikely that a cloud provider employee will be involved in deliberate data theft. The more realistic scenario is an accidental incident involving an insider with physical access to the data center. One well-known example is the HealthNet case, in which 1.9 million customer records of HealthNet, a major US health insurer, were lost after its IT vendor misplaced nine server drives during a move to a new data center. Under HIPAA rules, disk-level encryption would have mitigated the impact of the incident.

The industry consensus is that encryption is an essential first step in achieving cloud computing security. An effective solution needs to meet four critical requirements: high security, convenient management, robust performance and regulatory compliance. Data at rest is no longer within the proverbial "four walls" of the enterprise; the data owner manages their own data through browsers and cloud APIs, and the concern is that a hacker can do the same. As such, cloud encryption is recognized as a basic building block of cloud security, though one difficult question has remained - where to store the encryption keys, since the keys cannot safely be stored in the cloud along with the data.
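
To make the "encrypt before it reaches the cloud" idea concrete, here is a minimal sketch in Python, assuming the widely used cryptography package is available; the file names are hypothetical placeholders, and this illustrates the principle rather than any particular vendor's product. The point is simply that only ciphertext ever lands on the virtual disk or storage bucket, so the snapshot attack in scenario 1 above yields nothing useful without the key:

from cryptography.fernet import Fernet

# Generate a symmetric key. In practice this key must be kept OUTSIDE the
# cloud account that holds the data (e.g., on premises or in a key service).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the file locally; upload only the encrypted copy to cloud storage.
with open("customer_records.csv", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("customer_records.csv.enc", "wb") as f:
    f.write(ciphertext)  # this encrypted copy is what goes to the cloud

# Decryption is only possible where the key is held.
plaintext = cipher.decrypt(ciphertext)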

Protecting Content with Cloud Encryption and Key Management
Encryption technology is only as secure as the encryption keys, so you have to keep your keys in a safe place. You need a cloud key management solution that supports encryption of your data and supplies the encryption keys for files, databases (whether the complete database or the column, table, or tablespace level), or disks. This is actually the trickiest security question when implementing encryption in the cloud, and it requires thought and expertise. For example, database encryption keys are often kept in a database "wallet," which is typically a file on your virtual disk. The concern is that hackers will attack the virtual disk in the cloud, gain access to the wallet from there, and through the wallet reach the data.
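
A common pattern for avoiding the wallet problem is envelope encryption: the data key that actually encrypts the database or file is itself encrypted ("wrapped") by a master key that never resides in the cloud. Below is a minimal sketch of the idea in Python, again assuming the cryptography package; fetch_master_key() is a hypothetical stand-in for retrieving the master key from an on-premises HSM or external key-management service, so only the wrapped key is ever stored next to the data:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def fetch_master_key() -> bytes:
    # Placeholder: a real deployment would fetch this from a key manager
    # or HSM outside the cloud account; here we just generate random bytes.
    return os.urandom(32)

master = AESGCM(fetch_master_key())

# Per-database (or per-file) data encryption key.
data_key = AESGCM.generate_key(bit_length=256)

# Wrap the data key with the master key; only the wrapped form is written
# to the virtual disk alongside the data.
key_nonce = os.urandom(12)
wrapped_key = master.encrypt(key_nonce, data_key, None)

# Encrypt a record with the data key.
rec_nonce = os.urandom(12)
ciphertext = AESGCM(data_key).encrypt(rec_nonce, b"SSN=123-45-6789", None)

# At decryption time the master key is fetched again from the external
# service; the object is reused here only for brevity.
unwrapped_key = master.decrypt(key_nonce, wrapped_key, None)
plaintext = AESGCM(unwrapped_key).decrypt(rec_nonce, ciphertext, None)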

Conclusion
Encrypting sensitive data in the cloud is an absolute must. Cloud security should include a blend of traditional security elements combined with new "cloud-adjusted" security technologies. Encryption should be a key part of your cloud security strategy due to the new cloud threat vectors (but also due to regulations such as the Patriot Act), and you should pay specific attention to key management.

More Stories By Ariel Dan

Ariel Dan is co-founder and Executive Vice President at Porticor cloud security. Follow him on Twitter: @ariel_dan
