
Encryption of Data-in-Use to Harness the Power of the Cloud

Enabling cloud adoption for organizations worldwide

Cloud computing has dramatically altered how IT infrastructure is delivered and managed, as well as how IT functionality is consumed. However, security and privacy concerns remain major inhibitors to cloud adoption for risk-conscious organizations - whether for Infrastructure as a Service, Software as a Service applications or email as a service.

In response, cloud service providers have made strategic investments to address these concerns directly and encourage broader adoption of cloud-based services. By implementing controls and processes that further improve security, cloud service providers increasingly aim to deliver more safeguards in the cloud environment than an individual customer could achieve on-premises. However, a significant consideration for many organizations looking to exploit the benefits of the cloud is whether they can retain ownership and control of data processed by third-party services.

Defining Roles, Responsibilities and Data Control Borders
The value proposition delivered by cloud service providers is in managing IT infrastructure in a more flexible, scalable and cost-efficient manner than an organization could do independently. The basic roles and responsibilities of the cloud service provider therefore should focus on the security, resiliency, scalability and manageability of their service. Security encompasses not only physical datacenter security, but also the means to limit administrator access across a multi-tenant environment and customer instances based on the principle of least privilege. However, at best, the cloud service provider can only provide a set of tools and options for customers looking to encrypt data in place.

Maintaining ownership and control of data is distinct from the underlying security controls and processes implemented by the cloud service provider. Even though the data resides on their infrastructure, cloud service providers consistently maintain that an organization retains responsibility for its own data. The not-for-profit Cloud Security Alliance notes in its most recent Email Security Implementation Guidance that it is critical that the customer - not the cloud service provider - be responsible for the security and encryption controls necessary to meet the organization's requirements.

By contrast, the roles and responsibilities of an organization with regard to corporate data remain the same regardless of where that data resides or is processed: specifically, maintaining ownership and direct control of it. When corporate data moves from on-premises systems to the cloud, compliance and security requirements dictate that the organization cannot relinquish ownership or control of its data. Moreover, the loss of visibility into who has access to that data means it can be subpoenaed and handed over to law enforcement agencies without the organization's knowledge.

Principal Business Challenges of Migrating Data to the Cloud
The principal business challenges that organizations typically face when migrating data to the cloud encompass data security, regulatory compliance, unauthorized data disclosure and access, and international privacy/data residency regulations. These issues need to be resolved to the satisfaction of the legal team, as well as security and compliance officers, before an organization's data moves to the cloud.

Data Security and Risk Mitigation
In cloud computing applications, data is frequently stored and processed at the cloud provider in the clear - unless customers themselves encrypt the data at rest and in use. This raises numerous data ownership and control concerns for an organization.

From a structural perspective, cloud-based services pose a challenge to traditional methods of securing data. Traditionally, encryption has been used to secure data resident on internal systems, or to protect data moving from one point to another. Ensuring that data remains encrypted in place within a third-party provider's environment and throughout the data lifecycle, yet stays seamlessly available to authorized users, presents a new set of technical challenges.

To satisfy the new requirements introduced by migration to cloud-based services, cloud data must remain encrypted - that is, held and processed as ciphertext. Data should also be encrypted before it leaves the corporate or trusted network in order to meet data residency and privacy requirements. And to maintain control of data that is no longer resident on a trusted network, the encryption keys must remain under the organization's ownership and control.
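
To make this concrete, the sketch below shows data being encrypted on the trusted network before upload, with the key generated and held locally so that only ciphertext ever reaches the provider. It is a minimal illustration, assuming Python's third-party cryptography package; the function names and the upload step are hypothetical.

    # Minimal sketch: encrypt on the trusted network before data leaves it.
    # Assumes the third-party "cryptography" package; names are illustrative.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # generated and kept on-premises

    def encrypt_before_upload(plaintext: bytes) -> dict:
        """Encrypt a record locally; only the result crosses the network edge."""
        nonce = os.urandom(12)                  # unique per message
        ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
        return {"nonce": nonce, "ciphertext": ciphertext}

    def decrypt_after_download(record: dict) -> bytes:
        """Decrypt back on the trusted network with the locally held key."""
        return AESGCM(key).decrypt(record["nonce"], record["ciphertext"], None)

    # hypothetical usage: upload_to_cloud(encrypt_before_upload(b"sensitive record"))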

Regulatory Compliance Requirements for Safeguards on Sensitive Data
Organizations are subject to a broad array of regulatory requirements, including federal laws such as Sarbanes-Oxley, varying state data protection measures, the USA PATRIOT Act and industry-specific regulations (HIPAA, HITECH, Basel II, GLBA and PCI DSS), in addition to international data privacy and residency requirements such as the EU Data Protection Directive.

Although the specifics vary from one regime to the next, a common stipulation is that organizations retain control over their data and maintain mechanisms to prevent unauthorized access. For instance, HIPAA requires technical safeguards ensuring that each covered entity is responsible for data within its systems not being changed or erased in an unauthorized manner. GLBA mandates that US financial institutions protect against any anticipated threats or hazards to the security or integrity of customer records and information. Likewise, the PCI Data Security Standard requires that stored cardholder data be protected by strong encryption.

Unauthorized Data Disclosure and Access
In the US, personal information is protected by the Fourth Amendment. However, once it is shared with a third party, it is no longer protected. Until legal guidelines are established for how the Fourth Amendment applies to cloud computing, uploaded data is not considered private.

Cloud service providers are compelled by law to comply with subpoenas and other requests by the government to turn over customer data, including data subject to attorney-client privilege and other protected data. Often, cloud providers will only notify customers that data was turned over to the government after the fact, if at all. In some instances, they may even be expressly prohibited from notifying customers. This risk prevents many organizations from migrating sensitive data to the cloud.

International Privacy/Data Residency Regulations
Data protection laws and privacy regulations mandate the direct control of an organization's information and safeguards for moving data outside of defined jurisdictions. These laws are broad, and are continually being implemented in a growing number of countries across the globe -- making it difficult for some organizations to fully realize the promise of cloud computing.

To comply with specific data protection laws and international privacy regulations, organizations often pay cloud providers a premium to add costly infrastructure in each location of interest, resulting in a sharp increase in costs and decrease in efficiency. Furthermore, most providers are unwilling to duplicate infrastructure in all locations, making it difficult for customers to comply with these regulations.

Implementing Best Practices for Cloud Data Control: Data-in-Use Encryption
Encryption of data-in-transit and data-at-rest has long been recognized as a best practice for enforcing the security and privacy of data, regardless of where it resides. However, these two states of encryption are no longer sufficient, as they do not protect data while it is being processed in the cloud.

According to the Cloud Security Alliance's Encryption Implementation Guidance, organizations should implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use). To prevent unauthorized access and maintain the state of encryption even when data is processed in a third-party environment, enterprise IT should retain ownership of the encryption keys. As a result, the cloud provider never has access to customer data in unencrypted form, and an organization's cloud data remains unreadable if an unauthorized third party attempts access -- or even if the data is disclosed in response to a government request.
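
One common pattern for keeping keys out of the provider's hands is envelope encryption: each object is encrypted with its own data key, and that data key is wrapped with a key-encryption key (KEK) that never leaves enterprise IT. The cloud then stores only ciphertext and wrapped keys, neither usable without the KEK. The sketch below is a hedged illustration of that pattern, again assuming Python's cryptography package, not any specific product's API.

    # Envelope encryption sketch: the KEK is held only by enterprise IT.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    KEK = AESGCM.generate_key(bit_length=256)        # never leaves enterprise IT

    def envelope_encrypt(plaintext: bytes) -> dict:
        dek = AESGCM.generate_key(bit_length=256)    # fresh per-object data key
        n1, n2 = os.urandom(12), os.urandom(12)
        blob = AESGCM(dek).encrypt(n1, plaintext, None)   # encrypt the data
        wrapped = AESGCM(KEK).encrypt(n2, dek, None)      # wrap the data key
        # Everything returned here may be stored in the cloud; none of it can
        # be decrypted without the KEK, which the provider never receives.
        return {"blob": blob, "n1": n1, "wrapped_dek": wrapped, "n2": n2}

    def envelope_decrypt(rec: dict) -> bytes:
        dek = AESGCM(KEK).decrypt(rec["n2"], rec["wrapped_dek"], None)
        return AESGCM(dek).decrypt(rec["n1"], rec["blob"], None)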

Figure 1: The not-for-profit industry association, the Cloud Security Alliance, recommends that organizations implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use).

Traditionally, if cloud-hosted data was encrypted, basic server-side operations such as indexing, searching and sorting records became impossible. Once ciphertext was put into a SaaS application, some of the application's features no longer worked, and the user experience suffered as a result. Data-in-use encryption supports dynamic operations such as search, sort and index on encrypted data in the cloud. Even as the data is processed by a cloud-based service, the IT department of the organization that owns the data, or a trusted third party, retains control of the encryption keys. As a result, application functionality is preserved, and decryption is policy-driven and automated.
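
The article does not name the underlying mechanism, but one well-known way to preserve server-side matching over ciphertext is a keyed "blind index": a deterministic HMAC of the searchable value, stored alongside the encrypted field, which the cloud can match on without being able to invert it. The standard-library sketch below illustrates equality search only; sorting and range queries require other schemes (for example, order-preserving encryption), and the key and field names here are invented.

    # Blind-index sketch: equality search over encrypted records.
    import hashlib
    import hmac

    INDEX_KEY = b"held by the organization, never by the cloud"  # illustrative

    def blind_index(value: str) -> str:
        """Deterministic keyed digest: the same input always yields the same
        token, so the cloud can index and match it, but cannot invert it
        without the key."""
        return hmac.new(INDEX_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

    # Stored in the cloud per record (hypothetical field names):
    #   {"subject_ct": <ciphertext>, "subject_ix": blind_index(subject)}
    # Query issued from the trusted network:
    #   find records where subject_ix == blind_index("q3 forecast")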

The Implementation of Data-in-Use Encryption Enables Organizations to Seamlessly Harness the Power of the Cloud
Data-in-use encryption technology directly addresses the material concerns of organizations whose proprietary data resides on third-party cloud servers: compliance requirements, separation of data controls through key retention, data residency, and unauthorized disclosure of data in response to a government request.

Data-in-use encryption is of particular value for organizations that want to manage data disclosure requests from law enforcement agencies independently. Equally, cloud service providers are not eager to be in the undesirable position of being compelled to disclose customer data. A cloud provider will still turn over customer data when presented with a subpoena or other government request, because it has no choice but to comply. However, because all of the data was encrypted before the provider received it, and the organization holds the encryption keys, the provider cannot decrypt that data. When complying with an order, it can turn over only ciphertext. If the government wants to decrypt the data, it must go directly to the organization that owns it.

Figure 2: Sample of an authorized / unauthorized view of sensitive data in a hosted Exchange email application.

In geographically distributed environments, smart encryption also creates a paradigm shift: instead of requiring the data itself to remain local, only the encryption keys must remain local. Organizations with multiple data residency requirements can deploy and maintain an instance of the encryption appliance in each jurisdiction. Once the data is encrypted with keys maintained in that jurisdiction, the encrypted data can lawfully reside in any location.
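
As a hypothetical illustration of that shift, the sketch below binds keys, rather than data, to jurisdictions. Each key is assumed to live inside an appliance deployed in-country (a plain dict stands in for those key stores here), while the resulting ciphertext is free to be stored in any region.

    # Residency sketch: keys stay in-jurisdiction, ciphertext can live anywhere.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # In practice each key would be generated and held by an appliance inside
    # the named jurisdiction; this dict merely stands in for those key stores.
    JURISDICTION_KEYS = {
        "DE": AESGCM.generate_key(bit_length=256),   # never leaves Germany
        "CA": AESGCM.generate_key(bit_length=256),   # never leaves Canada
    }

    def encrypt_for_resident(jurisdiction: str, plaintext: bytes) -> dict:
        key = JURISDICTION_KEYS[jurisdiction]        # resolved in-jurisdiction
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
        # Only the key is bound to the jurisdiction, so residency obligations
        # follow the key while the ciphertext may be stored in any region.
        return {"jurisdiction": jurisdiction, "nonce": nonce, "ciphertext": ciphertext}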

The addition of encryption-in-use empowers the organization to retain full ownership and control during the entire process, including when the data is out of its network and in the cloud, while ensuring maximum security and regulatory compliance.

Industry analysts agree. According to Ahmad Zeffirelli, Industry Analyst at Frost & Sullivan, "This solution with its ability to encrypt data-in-use, data-in-transit, and data-at-rest, would bring immense benefits to a vast majority of organizations concerned about data security while leveraging cloud computing."

Building Commercially Viable Encryption
One of the most difficult technical challenges in developing encryption for commercial cloud applications is striking the right balance between the competing goals of encryption and security on one hand, and features and performance on the other. In commercial markets, and especially in the cloud, introducing additional steps that users must follow to satisfy security requirements both undermines the ease-of-use value proposition of cloud-based services and makes it likely that users will look for ways to circumvent controls.

The entire process should be transparent to the end user. Specifically, the security functionality should not require installing an application or agent on the end user's client device or mobile phone. There should also be no impact on the end-user experience in terms of functionality, performance or task workflow. Furthermore, commercially viable encryption should not interfere with standard email security features such as malware and anti-virus protection.
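
A gateway at the network edge is one way to satisfy the "no client agent" requirement: it rewrites traffic in both directions so encryption stays invisible to users. The sketch below is a simplified, assumed design - the field list, key handling and function names are invented for illustration - in which sensitive payload fields are encrypted before they reach the cloud service and decrypted on the way back.

    # Transparent-gateway sketch: rewrite payloads at the network edge so no
    # software is installed on the client. All names here are hypothetical.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    GATEWAY_KEY = AESGCM.generate_key(bit_length=256)   # lives at the gateway
    SENSITIVE_FIELDS = {"subject", "body"}              # policy-driven choice

    def outbound(payload: dict) -> dict:
        """Applied by the gateway to traffic leaving the trusted network."""
        out = dict(payload)
        for field in SENSITIVE_FIELDS & payload.keys():
            nonce = os.urandom(12)
            ct = AESGCM(GATEWAY_KEY).encrypt(nonce, payload[field].encode(), None)
            out[field] = (nonce + ct).hex()             # the cloud sees only this
        return out

    def inbound(payload: dict) -> dict:
        """Reverses the rewrite so authorized users see plaintext seamlessly."""
        out = dict(payload)
        for field in SENSITIVE_FIELDS & payload.keys():
            raw = bytes.fromhex(payload[field])
            out[field] = AESGCM(GATEWAY_KEY).decrypt(raw[:12], raw[12:], None).decode()
        return out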

By effectively addressing data control, compliance and security requirements - while preserving application functionality, including search, sort and index capabilities, and a seamless user experience - technology that enables encryption of data-at-rest, data-in-transit and data-in-use within the cloud environment acts as an enabler of cloud adoption for organizations worldwide.

More Stories By Elad Yoran

Elad Yoran is the CEO of Vaultive, Inc. He is a recognized expert on the information security market and technology trends. Yoran has 20 years of experience in the cyber security industry as an executive, consultant, investor, investment banker and several-time successful entrepreneur. He is also a member of a number of technology, security and community boards, including the FBI Information Technology Advisory Council (ITAC); the Department of Homeland Security Advisory Board for Command, Control and Interoperability for Advanced Data Analysis (CCICADA); and the Cloud Security Alliance New York Metro Chapter.

