By Elad Yoran
November 19, 2012 08:00 AM EST
Cloud computing has dramatically altered how IT infrastructure is delivered and managed, as well as how IT functionality is consumed. However, security and privacy concerns remain major inhibitors to the adoption of cloud computing by risk-conscious organizations - whether infrastructure as a service, software-as-a-service applications or email as a service.
Cloud service providers, in response, have made strategic decisions about the investments they make to directly address these concerns and encourage broader adoption of cloud-based services. By implementing controls and processes to further improve security, cloud service providers increasingly aim to deliver more safeguards for the cloud environment than an individual customer could achieve within an on-premise environment. However, a significant consideration for many organizations as they look to best exploit the benefits of the cloud is whether they can retain ownership and control of data processed by third-party services.
Defining Roles, Responsibilities and Data Control Borders
The value proposition delivered by cloud service providers is in managing IT infrastructure in a more flexible, scalable and cost-efficient manner than an organization could do independently. The basic roles and responsibilities of the cloud service provider therefore should focus on the security, resiliency, scalability and manageability of their service. Security encompasses not only physical datacenter security, but also the means to limit administrator access across a multi-tenant environment and customer instances based on the principle of least privilege. However, at best, the cloud service provider can only provide a set of tools and options for customers looking to encrypt data in place.
Maintaining ownership and control of data is discrete from the underlying security and processes implemented by the cloud service provider. Even though the data resides on their infrastructure, cloud service providers are compelled to maintain that an organization retains responsibility for its own data. The not-for-profit Cloud Security Alliance notes in its most recent Email Security Implementation Guidance that it is critical that the customer - not the cloud service provider - be responsible for the security and encryption protection controls necessary to meet their organization's requirements.
By contrast, the roles and responsibilities of the organization with regard to corporate data remain the same regardless of where that data resides or is processed: specifically, maintaining ownership and direct control of it. When corporate data is moved from on-premise systems to the cloud, compliance and security requirements dictate that the organization cannot relinquish ownership or control of its data. Moreover, the loss of visibility into who has access to that data means it can be subpoenaed and handed over to law enforcement agencies without the organization's knowledge.
Principal Business Challenges of Migrating Data to the Cloud
The principal business challenges that organizations typically face when migrating data to the cloud encompass data security, regulatory compliance, unauthorized data disclosure and access, and international privacy/data residency regulations. These issues need to be resolved to the satisfaction of the legal team, as well as security and compliance officers, before moving an organization's data to the cloud.
Data Security and Risk Mitigation
In cloud computing applications, data is frequently stored and processed at the cloud provider in the clear - unless customers themselves encrypt the data at rest and in use. This raises numerous data ownership and control responsibilities and concerns for an organization.
From a structural perspective, cloud-based services pose a challenge to traditional methods of securing data. Traditionally, encryption has been used to secure data resident on internal systems, or to protect data moving from one point to another. Ensuring that data remains encrypted in place within a third-party provider's environment and throughout the data lifecycle, yet is seamlessly available to authorized users, presents a new set of technical challenges.
In order to satisfy the new set of requirements introduced by migration to cloud-based services, cloud data must remain encrypted, in ciphertext form, wherever it is stored or processed. Data should also be encrypted before it leaves the corporate or trusted network in order to meet data residency and privacy requirements. And to maintain control of data that is no longer resident on a trusted network, the encryption keys must remain under the organization's ownership and control.
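As a simple illustration of this principle (not part of the original article), the following minimal sketch encrypts a record inside the trusted network before it is handed to any cloud service; it assumes the Python "cryptography" package, and upload_to_cloud() is a hypothetical placeholder for the provider's upload API. The key stays in the organization's own key store.

```python
# Minimal sketch: encrypt data on-premises before it leaves the trusted network.
# Assumes the third-party "cryptography" package; upload_to_cloud() is hypothetical.
from cryptography.fernet import Fernet

# The key is generated and kept inside the trusted network (e.g., an internal KMS);
# it is never transmitted to the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"PII: Jane Doe, account reference 12345"
ciphertext = cipher.encrypt(record)   # data leaves the network only in this form

# upload_to_cloud(ciphertext)         # provider stores ciphertext it cannot read

# Authorized users inside the trusted boundary can still recover the plaintext:
assert cipher.decrypt(ciphertext) == record
```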
Regulatory Compliance Requirements for Safeguards on Sensitive Data
Organizations are subject to a broad array of regulatory requirements including federal laws such as Sarbanes-Oxley, varying state data protection measures, The USA Patriot Act and vertical-specific regulations (HIPAA, HITECH, Basel II, GLBA and PCI DSS), in addition to potential international data privacy and residency requirements such as the EU Data Protection Directive.
Although the specifics vary according to the compliance requirements specified, a common stipulation is that organizations retain control over their data and maintain mechanisms to prevent unauthorized access. For instance, HIPAA regulations require technical safeguards ensuring that each covered entity is responsible for verifying that the data within its systems has not been changed or erased in an unauthorized manner. GLBA mandates that financial institutions in the US protect against any anticipated threats or hazards to the security or integrity of customer records and information. Likewise, the PCI Data Security Standard requires that stored cardholder data be protected by strong encryption.
Unauthorized Data Disclosure and Access
In the US, personal information is protected by the Fourth Amendment. However, once it is shared, it is no longer protected. Until legal guidelines are established to address the application of the Fourth Amendment to cloud computing, uploaded data is not considered private.
Cloud service providers are compelled by law to comply with subpoenas and other requests by the government to turn over customer data, including data subject to attorney-client privilege and other protected data. Often, cloud providers will only notify customers that data was turned over to the government after the fact, if at all. In some instances, they may even be expressly prohibited from notifying customers. This risk prevents many organizations from migrating sensitive data to the cloud.
International Privacy/Data Residency Regulations
Data protection laws and privacy regulations mandate the direct control of an organization's information and safeguards for moving data outside of defined jurisdictions. These laws are broad, and are continually being implemented in a growing number of countries across the globe -- making it difficult for some organizations to fully realize the promise of cloud computing.
To comply with specific data protection laws and international privacy regulations, organizations often pay cloud providers a premium to add costly infrastructure in each location of interest, resulting in a sharp increase in costs and decrease in efficiency. Furthermore, most providers are unwilling to duplicate infrastructure in all locations, making it difficult for customers to comply with these regulations.
Implementing Best Practices for Cloud Data Control: Data-in-Use Encryption
Encryption of data-in-transit and data-at-rest has long been recognized as a best practice for enforcing the security and privacy of data, regardless of where it resides. However, these two states of encryption are no longer sufficient, as they do not protect data while it is being processed in the cloud.
According to the Cloud Security Alliance's Encryption Implementation Guidance, organizations should implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use). To prevent unauthorized access and maintain the state of encryption even when processed in a third-party environment, enterprise IT should retain ownership of the encryption keys. As a result, the cloud provider never has access to customer data in an unencrypted form, and an organization's cloud data remains unreadable if an unauthorized third-party attempts access -- or even if the data is disclosed in response to a government request.
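One common way to keep key ownership with enterprise IT while ciphertext lives in the cloud is envelope encryption; the sketch below is an assumed, illustrative pattern rather than a description of any specific product. A per-object data key encrypts the data, and the enterprise-held key-encryption key wraps that data key, so the provider only ever stores two opaque blobs.

```python
# Illustrative envelope-encryption sketch (an assumed pattern, not the article's product):
# a per-object data key (DEK) encrypts the data; the enterprise-held key-encryption
# key (KEK) wraps the DEK, so the cloud provider never holds usable key material.
from cryptography.fernet import Fernet

kek = Fernet(Fernet.generate_key())   # held by enterprise IT or a trusted third party

def encrypt_for_cloud(plaintext: bytes):
    dek_key = Fernet.generate_key()               # fresh data key for this object
    ciphertext = Fernet(dek_key).encrypt(plaintext)
    wrapped_dek = kek.encrypt(dek_key)            # only the enterprise can unwrap this
    return ciphertext, wrapped_dek                # both blobs may be stored in the cloud

def decrypt_from_cloud(ciphertext: bytes, wrapped_dek: bytes) -> bytes:
    dek_key = kek.decrypt(wrapped_dek)            # policy-driven step that stays on-premises
    return Fernet(dek_key).decrypt(ciphertext)

blob, wrapped = encrypt_for_cloud(b"board minutes, attorney-client privileged")
assert decrypt_from_cloud(blob, wrapped) == b"board minutes, attorney-client privileged"
```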
Figure 1: The not-for-profit industry association, the Cloud Security Alliance, recommends that organizations implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use).
Traditionally, if cloud-hosted data was encrypted, basic server-side operations such as indexing, searching and sorting records became impossible. Once cipher text was put into a SaaS application, some of the features of the program no longer worked, and the user experience suffered as a result. The implementation of data-in-use encryption supports dynamic operations such as search, sort and index of encrypted data in the cloud. Even as the data is processed by a cloud-based service, the IT department of the organization that owns the data or a trusted third party retains control of the encryption keys. As a result, application functionality is preserved and decryption is policy-driven and automated.
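The article does not specify how server-side operations over ciphertext are implemented; one widely used technique for the equality-search case is a keyed-hash "blind index" stored alongside each encrypted record. The sketch below is an assumption offered for illustration, with hypothetical field names, showing how a server can match search tokens without ever seeing plaintext.

```python
# One common technique (an assumption, not the article's specific method) for supporting
# equality search over encrypted records: store a keyed-hash "blind index" next to each
# ciphertext so the server can match tokens without learning the underlying value.
import hashlib
import hmac

INDEX_KEY = b"held-by-the-data-owner"   # hypothetical; kept outside the cloud

def blind_index(value: str) -> str:
    return hmac.new(INDEX_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

# What the cloud stores: the ciphertext plus a searchable token per indexed field.
stored = [
    {"ciphertext": "<encrypted record 1>", "email_idx": blind_index("alice@example.com")},
    {"ciphertext": "<encrypted record 2>", "email_idx": blind_index("bob@example.com")},
]

# Search: the client derives the token locally; the server matches it blindly.
query_token = blind_index("Alice@Example.com")
matches = [row for row in stored if row["email_idx"] == query_token]
print(len(matches))  # 1 -- the server never learned the email address itself
```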
The Implementation of Data-in-Use Encryption Enables Organizations to Seamlessly Harness the Power of the Cloud
Data-in-use encryption technology directly addresses the concerns associated with control and ownership of proprietary data residing on third-party cloud servers: compliance requirements, separation of data controls through key retention, data residency, and unauthorized disclosure of data in response to a government request.
Data-in-use encryption is of particular value for organizations that want to independently manage data disclosure requests from law enforcement agencies. Equally, cloud service providers are not eager to be in the undesirable position of being compelled to disclose customer data. The cloud provider will still turn over customer data when presented with a subpoena or other government request because it has no choice but to comply. However, because all of the data was encrypted before the cloud provider received it, and the organization holds the encryption keys, the provider cannot decrypt that data. Therefore, when complying with an order, the cloud provider can only turn over ciphertext. If the government wants to decrypt the data, it must go directly to the organization that owns it.
Figure 2: Sample of an authorized/unauthorized view of sensitive data in a hosted Exchange email application.
In geographically distributed environments, smart encryption also creates a paradigm shift: instead of requiring the data itself to remain local, only the encryption keys need to remain local. Organizations with multiple data residency requirements can deploy and maintain an instance of the encryption appliance in each jurisdiction. Once the data is encrypted with keys that are maintained in that jurisdiction, the encrypted data can lawfully reside in any location.
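A minimal sketch of that per-jurisdiction key model follows; the jurisdiction names and key map are illustrative assumptions, and in practice each entry would point to a key appliance or HSM physically located in that jurisdiction rather than a local in-memory key.

```python
# Sketch of the per-jurisdiction key model described above (names are illustrative).
# Each jurisdiction holds its own key; only ciphertext travels to global cloud regions.
from cryptography.fernet import Fernet

jurisdiction_keys = {
    "EU": Fernet(Fernet.generate_key()),   # stand-in for a key appliance located in the EU
    "CA": Fernet(Fernet.generate_key()),   # stand-in for a key appliance located in Canada
}

def encrypt_for_residency(record: bytes, jurisdiction: str) -> bytes:
    # The key never leaves the jurisdiction; the returned ciphertext may be stored anywhere.
    return jurisdiction_keys[jurisdiction].encrypt(record)

eu_blob = encrypt_for_residency(b"German customer record", "EU")
# eu_blob can now reside in any cloud region, since decryption requires the EU-held key.
```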
The addition of encryption-in-use empowers the organization to retain full ownership and control during the entire process, including when the data is out of its network and in the cloud, while ensuring maximum security and regulatory compliance.
Industry analysts agree. According to Ahmad Zeffirelli, Industry Analyst at Frost & Sullivan, "This solution with its ability to encrypt data-in-use, data-in-transit, and data-at-rest, would bring immense benefits to a vast majority of organizations concerned about data security while leveraging cloud computing."
Building Commercially Viable Encryption
One of the most difficult technical challenges in developing encryption for commercial applications running in the cloud is establishing the right balance between the competing goals of encryption and security on one hand, and features and performance on the other. In commercial markets, especially in the cloud, introducing additional steps for users to follow in order to address security requirements both undermines the ease-of-use value proposition of cloud-based services and creates the likelihood that users will look for ways to circumvent controls.
The entire process should be transparent to the end-user. Specifically, the security functionality should not require the installation of an application or agent on the end user's client device or mobile phone. Also, there should be no impact to the end-user experience in terms of functionality, performance, or task workflow. Furthermore, commercially viable encryption capabilities should not interfere with standard email security features such as malware and anti-virus protection.
By effectively addressing data control, compliance and security requirements, while preserving application functionality (including search, sort and index capabilities) and a seamless user experience, technology that enables the encryption of data-at-rest, data-in-transit and data-in-use within the cloud environment acts as an enabler of cloud adoption for organizations worldwide.