Encryption of Data-in-Use to Harness the Power of the Cloud

Enabling cloud adoption for organizations worldwide

Cloud computing has dramatically altered how IT infrastructure is delivered and managed, as well as how IT functionality is consumed. However, security and privacy concerns remain major inhibitors to cloud adoption for risk-conscious organizations - whether for infrastructure as a service, software-as-a-service applications or email as a service.

Cloud service providers, in response, have made strategic decisions about the investments they make in directly addressing these concerns in order to encourage broader adoption of cloud-based services. By implementing controls and processes to further improve security, cloud service providers increasingly aim to deliver more safeguards within the cloud environment than an individual customer could within an on-premises environment. However, a significant consideration for many organizations as they look to best exploit the benefits of the cloud is whether they can retain ownership and control of data processed by third-party services.

Defining Roles, Responsibilities and Data Control Borders
The value proposition delivered by cloud service providers is in managing IT infrastructure in a more flexible, scalable and cost-efficient manner than an organization could do independently. The basic roles and responsibilities of the cloud service provider therefore should focus on the security, resiliency, scalability and manageability of their service. Security encompasses not only physical datacenter security, but also the means to limit administrator access across a multi-tenant environment and customer instances based on the principle of least privilege. However, at best, the cloud service provider can only provide a set of tools and options for customers looking to encrypt data in place.

Maintaining ownership and control of data is distinct from the underlying security and processes implemented by the cloud service provider. Even though the data resides on their infrastructure, cloud service providers consistently maintain that an organization retains responsibility for its own data. The not-for-profit Cloud Security Alliance notes in its most recent Email Security Implementation Guidance that it is critical that the customer - not the cloud service provider - be responsible for the security and encryption protection controls necessary to meet the organization's requirements.

By contrast, an organization's roles and responsibilities with regard to corporate data remain the same regardless of where that data resides or is processed: specifically, maintaining ownership and direct control of that data. When corporate data is moved from on-premises systems to the cloud, compliance and security requirements dictate that the organization cannot relinquish ownership or control of its data. Moreover, the loss of visibility into who has access to that data means it can be subpoenaed and handed over to law enforcement agencies without the organization's knowledge.

Principal Business Challenges of Migrating Data to the Cloud
The principal business challenges that organizations typically face when migrating data to the cloud encompass data security, regulatory compliance, unauthorized data disclosure and access, and international privacy/data residency regulations. These issues need to be resolved to address the requirements of the legal team, as well as security and compliance officers, before moving an organization's data to the cloud.

Data Security and Risk Mitigation
In cloud computing applications, data is frequently stored and processed at the cloud provider in the clear - unless customers themselves encrypt the data-at-rest and data-in-use. This raises numerous data ownership and control concerns for an organization.

From a structural perspective, cloud-based services pose a challenge to traditional methods of securing data. Traditionally, encryption has been used to secure data resident on internal systems, or to protect data moving from one point to another. Ensuring that data remains encrypted in place within a third-party provider's environment and throughout the data lifecycle, yet remains seamlessly available to authorized users, presents a new set of technical challenges.

In order to satisfy the new set of requirements introduced by migration to cloud-based services, cloud data must remain in encrypted ciphertext form. Data should also be encrypted before it leaves the corporate or trusted network in order to meet data residency and privacy requirements. And to maintain control of data that is no longer resident on a trusted network, the encryption keys must remain under the organization's ownership and control.
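As a minimal sketch of this pattern - illustrative only, not any particular vendor's implementation - the following Python snippet uses the Fernet recipe from the open-source cryptography package; the cloud_store upload call is hypothetical and shown only as a comment:

from cryptography.fernet import Fernet

# The key is generated and kept inside the trusted network; in a real
# deployment it would live in an HSM or on-premises key manager.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"Customer: Jane Doe, Account: 12345"
ciphertext = fernet.encrypt(record)       # only ciphertext leaves the network

# cloud_store.put("record-1", ciphertext)  # hypothetical upload call

# An authorized user inside the trusted boundary decrypts with the local key.
assert fernet.decrypt(ciphertext) == record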

Regulatory Compliance Requirements for Safeguards on Sensitive Data
Organizations are subject to a broad array of regulatory requirements, including federal laws such as Sarbanes-Oxley, varying state data protection measures, the USA PATRIOT Act and vertical-specific regulations (HIPAA, HITECH, Basel II, GLBA and PCI DSS), in addition to potential international data privacy and residency requirements such as the EU Data Protection Directive.

Although the specifics vary according to the compliance requirements in question, a common stipulation is that organizations retain control over their data and maintain mechanisms to prevent unauthorized access. For instance, HIPAA's technical safeguards make each covered entity responsible for ensuring that the data within its systems has not been changed or erased in an unauthorized manner. GLBA mandates that financial institutions in the US protect against any anticipated threats or hazards to the security or integrity of customer records and information. Likewise, the PCI Data Security Standard requires that stored cardholder data be protected by strong encryption.
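In practice, "strong encryption" of stored data in the PCI DSS sense is commonly met with an authenticated cipher such as AES-256-GCM. A minimal sketch, again using the Python cryptography package (the test card number and record identifier are illustrative only):

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # held in the org's key manager
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # must be unique per encryption
pan = b"4111111111111111"                   # standard test card number
ciphertext = aesgcm.encrypt(nonce, pan, b"record-42")  # AAD binds record id

# Store (nonce, ciphertext); decryption needs the same key, nonce and AAD.
assert aesgcm.decrypt(nonce, ciphertext, b"record-42") == pan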

Unauthorized Data Disclosure and Access
In the US, personal information is protected by the Fourth Amendment. However, once it is shared, it is no longer protected. Until legal guidelines are established to address the application of the Fourth Amendment to cloud computing, uploaded data is not considered private.

Cloud service providers are compelled by law to comply with subpoenas and other requests by the government to turn over customer data, including data subject to attorney-client privilege and other protected data. Often, cloud providers will only notify customers that data was turned over to the government after the fact, if at all. In some instances, they may even be expressly prohibited from notifying customers. This risk prevents many organizations from migrating sensitive data to the cloud.

International Privacy/Data Residency Regulations
Data protection laws and privacy regulations mandate the direct control of an organization's information and safeguards for moving data outside of defined jurisdictions. These laws are broad, and are continually being implemented in a growing number of countries across the globe -- making it difficult for some organizations to fully realize the promise of cloud computing.

To comply with specific data protection laws and international privacy regulations, organizations often pay cloud providers a premium to add costly infrastructure in each location of interest, resulting in a sharp increase in costs and decrease in efficiency. Furthermore, most providers are unwilling to duplicate infrastructure in all locations, making it difficult for customers to comply with these regulations.

Implementing Best Practices for Cloud Data Control: Data-in-Use Encryption
Encryption of data-in-transit and data-at-rest has long been recognized as a best practice for enforcing the security and privacy of data, regardless of where it resides. However, these two states of encryption are no longer sufficient because they do not protect data while it is being processed in the cloud.

According to the Cloud Security Alliance's Encryption Implementation Guidance, organizations should implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use). To prevent unauthorized access and maintain the state of encryption even when processed in a third-party environment, enterprise IT should retain ownership of the encryption keys. As a result, the cloud provider never has access to customer data in an unencrypted form, and an organization's cloud data remains unreadable if an unauthorized third-party attempts access -- or even if the data is disclosed in response to a government request.

Figure 1: The not-for-profit industry association, the Cloud Security Alliance, recommends that organizations implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use).

Traditionally, if cloud-hosted data was encrypted, basic server-side operations such as indexing, searching and sorting records became impossible. Once ciphertext was put into a SaaS application, some of the application's features no longer worked, and the user experience suffered as a result. Data-in-use encryption supports dynamic operations such as search, sort and index on encrypted data in the cloud. Even as the data is processed by a cloud-based service, the IT department of the organization that owns the data, or a trusted third party, retains control of the encryption keys. As a result, application functionality is preserved and decryption is policy-driven and automated. One common technique for equality search over ciphertext is sketched below.
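The sketch shows a "blind index": a keyed HMAC of the plaintext stored alongside the ciphertext, which lets the server match records for equality without ever seeing plaintext. This is one well-known approach, offered purely as an illustration rather than a description of any specific vendor's scheme; both keys stay with the organization.

import hashlib
import hmac
from cryptography.fernet import Fernet

enc_key = Fernet.generate_key()              # data-encryption key
index_key = b"separate-key-for-blind-index"  # independent index key
fernet = Fernet(enc_key)

def blind_index(value: bytes) -> str:
    # Deterministic, keyed digest: equal plaintexts yield equal tokens,
    # but a token reveals nothing without the index key.
    return hmac.new(index_key, value, hashlib.sha256).hexdigest()

# What the provider stores: an opaque index token mapped to ciphertext.
stored = {blind_index(b"jane@example.com"): fernet.encrypt(b"jane@example.com")}

# Server-side equality search: the client sends only the index token.
assert stored.get(blind_index(b"jane@example.com")) is not None

Sorting and range queries require additional machinery, such as order-preserving or order-revealing encryption, which trades off some security for that extra functionality.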

The Implementation of Data-in-Use Encryption Enables Organizations to Seamlessly Harness the Power of the Cloud
Data-in-use encryption technology directly addresses the material concerns associated with control and ownership of proprietary data residing on third-party cloud-based servers: compliance requirements, separation of data controls through key retention, data residency, and unauthorized disclosure of data in response to a government request.

Data-in-use encryption is of particular value for organizations that want to manage data disclosure requests from law enforcement agencies independently. Equally, cloud service providers are not eager to be in the undesirable position of being compelled to disclose customer data. The cloud provider will still turn over customer data when presented with a subpoena or other government request, because it has no choice but to comply. However, because all of the data was encrypted before it reached the cloud provider, and the organization holds the encryption keys, the provider cannot decrypt that data. Therefore, when complying with an order, the cloud provider can only turn over ciphertext. If the government wants to decrypt the data, it must go directly to the organization that owns the data.
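The consequence is easy to demonstrate: without the organization's key, ciphertext handed over by the provider is unreadable. A toy illustration with the same Fernet recipe used above:

from cryptography.fernet import Fernet, InvalidToken

org_key = Fernet.generate_key()              # held only by the organization
ciphertext = Fernet(org_key).encrypt(b"privileged correspondence")

# A subpoena to the provider can yield only the ciphertext.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # any other key fails
except InvalidToken:
    print("Ciphertext is unreadable without the organization's key.")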

Figure 2: Sample of an authorized/unauthorized view of sensitive data in a hosted Exchange email application.

In geographically distributed environments, smart encryption also creates a paradigm shift: instead of requiring the data itself to remain local, only the encryption keys must remain local. Organizations with multiple data residency requirements can deploy and maintain an instance of the encryption appliance in each jurisdiction. Once the data is encrypted with keys that are maintained in that jurisdiction, the encrypted data can lawfully reside in any location.
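A sketch of that key-selection logic follows; the in-memory key table stands in for per-jurisdiction key appliances, whose real interfaces (e.g., KMIP or vendor APIs) are outside the scope of this illustration:

from cryptography.fernet import Fernet

# Each jurisdiction runs its own key appliance; its key never leaves it.
# An in-memory table stands in for those appliances here.
jurisdiction_keys = {
    "DE": Fernet(Fernet.generate_key()),
    "FR": Fernet(Fernet.generate_key()),
}

def encrypt_for(jurisdiction: str, record: bytes) -> bytes:
    # Encrypt under the key held in the data subject's jurisdiction;
    # the resulting ciphertext can then be stored in any region.
    return jurisdiction_keys[jurisdiction].encrypt(record)

ct = encrypt_for("DE", b"German customer record")  # only the key stays local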

The addition of encryption-in-use empowers the organization to retain full ownership and control during the entire process, including when the data is out of its network and in the cloud, while ensuring maximum security and regulatory compliance.

Industry analysts agree. According to Ahmad Zeffirelli, Industry Analyst at Frost & Sullivan, "This solution with its ability to encrypt data-in-use, data-in-transit, and data-at-rest, would bring immense benefits to a vast majority of organizations concerned about data security while leveraging cloud computing."

Building Commercially Viable Encryption
One of the most difficult technical challenges in developing encryption for commercial applications running in the cloud is striking the right balance between the competing goals of encryption and security on the one hand, and features and performance on the other. In commercial markets, especially in the cloud, introducing additional steps for users to follow in order to address security requirements both undermines the ease-of-use value proposition of cloud-based services and creates the likelihood that users will look for ways to circumvent controls.

The entire process should be transparent to the end-user. Specifically, the security functionality should not require the installation of an application or agent on the end user's client device or mobile phone. Also, there should be no impact to the end-user experience in terms of functionality, performance, or task workflow. Furthermore, commercially viable encryption capabilities should not interfere with standard email security features such as malware and anti-virus protection.

Conclusion
By effectively addressing data control, compliance and security requirements - while preserving application functionality such as search, sort and index, along with a seamless user experience - technology that encrypts data-at-rest, data-in-transit and data-in-use within the cloud environment serves as an enabler of cloud adoption for organizations worldwide.

About the Author

Elad Yoran is the CEO of Vaultive, Inc. He is a recognized expert on information security market and technology trends, with 20 years of experience in the cyber security industry as an executive, consultant, investor, investment banker and several-time successful entrepreneur. He is also a member of a number of technology, security and community boards, including the FBI Information Technology Advisory Council (ITAC); the Department of Homeland Security Advisory Board for Command, Control and Interoperability for Advanced Data Analysis (CCICADA); and the Cloud Security Alliance New York Metro Chapter.
