
Data Control and Data Residency

Consider your options carefully when planning the move to the cloud

The benefits associated with adoption of the cloud are well documented and understood. Organizations cite tremendous cost savings, fast deployment times and streamlined application support and maintenance when compared to traditional on-premises software deployments. So what is holding many companies back from adopting the cloud? A recent Gartner report entitled "Five Cloud Data Residency Issues That Must Not Be Ignored" highlights one key reason for this hesitancy - enterprises' questions and concerns about jurisdictional and regulatory control, arising from a lack of clarity on where cloud data truly resides. The report recommends that enterprises adopt measures that both boost the security of sensitive data and help them satisfy regulatory compliance with data residency laws.

While the report provides some excellent guidance on the implementation of one technique - encryption - to safeguard sensitive information in the cloud, it does not cover a few key points that deserve mention:

  • Tokenization should be given strong consideration as the data security technique that enterprises deploy when data residency is a critical concern.
  • If enterprises do deploy encryption, they should take every measure to ensure they are using the strongest form of encryption possible (e.g., FIPS 140-2 validated modules) to guard against the inherent threats of multi-tenant cloud environments.

Why Tokenization?
Tokenization is a process by which a sensitive data field, such as a "Name" or "National ID Number," is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches to creating tokens exist, they are frequently just randomly generated values that have no mathematical relation to the original data field (PerspecSys' tokenization approach has been evaluated by a third party). This underlies the inherent security of the approach - it is nearly impossible to determine the original value of a sensitive data field by knowing only the surrogate token value. When deployed as a technique within a Cloud Data Protection Gateway, the token "vault" that matches the clear text value with the surrogate token stays on-site within an organization's data center. Because of this, the benefit from a data residency compliance perspective is apparent - the data truly never leaves the enterprise's location.
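
To make the mechanics concrete, here is a minimal, illustrative Python sketch of tokenization with an on-premises vault. It is not PerspecSys' implementation; the class and method names are hypothetical, and a production vault would use durable, access-controlled storage rather than an in-memory dictionary.

  # Illustrative tokenization sketch (not PerspecSys' implementation).
  # Tokens are random surrogates with no mathematical link to the original
  # value; the vault that maps tokens back to clear text stays on-premises.
  import secrets

  class TokenVault:
      """Stand-in for an on-premises token vault (in-memory for illustration)."""

      def __init__(self):
          self._token_to_value = {}

      def tokenize(self, value: str) -> str:
          # A random token reveals nothing about the value it replaces.
          token = "tok_" + secrets.token_hex(16)
          self._token_to_value[token] = value
          return token

      def detokenize(self, token: str) -> str:
          # Only the on-premises vault can redeem a token for its clear text value.
          return self._token_to_value[token]

  vault = TokenVault()
  token = vault.tokenize("Jane Doe")   # surrogate value sent to the cloud application
  name = vault.detokenize(token)       # clear text never leaves the enterprise

The key point is that the mapping exists only in the vault; anyone holding the token alone - in any jurisdiction - has nothing to reverse.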

How Encryption Differs
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform sensitive data's original value to a surrogate value. The surrogate can be transformed back to the original value via the use of a "key," which can be thought of as the means to undo the mathematical lock. While encryption clearly can be used to obfuscate a value, a mathematical link back to its true form still exists. As described, tokenization is unique in that it completely removes the original data from the systems in which the tokens reside (the cloud) and there is no construct of a "key" that can be used to bring it back into the clear in the cloud.
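
By way of contrast, here is a minimal encryption sketch using the third-party Python cryptography package (an assumption for illustration; any well-vetted library could be substituted). Unlike a token, the ciphertext remains mathematically linked to the plaintext: anyone who obtains the key can recover the original value, wherever the ciphertext resides.

  # Minimal encryption sketch; assumes the `cryptography` package is installed.
  from cryptography.fernet import Fernet

  key = Fernet.generate_key()    # the "key" - the means to undo the mathematical lock
  cipher = Fernet(key)

  ciphertext = cipher.encrypt(b"Jane Doe")   # surrogate value sent to the cloud
  plaintext = cipher.decrypt(ciphertext)     # reversible anywhere the key is present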

In our experience with many customers, it is this unique characteristic of tokenization that has made it the preferred approach selected by enterprises when they are explicitly trying to address data residency requirements. In the words of one of our largest customers (who selected tokenization as their data security approach), "encrypted data leaving your premises is still data leaving your premises."

But If Encryption Is Used - Deploy Using Best Practices
If an organization decides to deploy encryption to protect sensitive information going to the cloud, it needs to ensure that industry-standard best practices on the use of encryption are followed. As highlighted in the Cloud Security Alliance's Guidelines as well as numerous Gartner reports, the use of published, well-vetted strong encryption algorithms is a must. In fact, the previously mentioned report "Five Cloud Data Residency Issues That Must Not Be Ignored" notes that enterprises need to ensure that the "strength of the security is not compromised." A good guideline is to look for solutions that support FIPS 140-2 validated algorithms from well-known providers such as McAfee, RSA, SafeNet, Symantec and Voltage Security. A unique and highly valued quality of the PerspecSys gateway is that cloud end users can still enjoy the full capabilities of cloud applications (such as SEARCH) even with data that is strongly encrypted with these industry-accepted, validated algorithms.
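
As an illustration of "published, well-vetted strong encryption," the sketch below uses AES-256-GCM via the Python cryptography package (an assumption; no specific library is prescribed by the report). Note that FIPS 140-2 validation applies to the cryptographic module actually in use, not merely to the choice of algorithm, so confirm your vendor's module is validated.

  # Sketch of using a published, well-vetted algorithm (AES-256-GCM).
  # Assumes the `cryptography` package; validation status depends on the module build.
  import os
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM

  key = AESGCM.generate_key(bit_length=256)
  aesgcm = AESGCM(key)
  nonce = os.urandom(12)                     # must be unique for every message

  ciphertext = aesgcm.encrypt(nonce, b"Sensitive field", None)
  plaintext = aesgcm.decrypt(nonce, ciphertext, None)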

Netting It Out
There is much to gain from using data obfuscation and replacement technologies to satisfy residency requirements and pave the way to cloud adoption. But equally, there is much to lose if the implementation is not well thought through. Do your homework - consider tokenization as an approach, question any encryption techniques that are not well vetted and accepted in the industry and, finally, compare solutions from multiple vendors (a suggestion - refer to our whitepaper as a guide: "Critical Questions to Ask Cloud Protection Gateway Providers"). We know from our experience helping many organizations around the world tackle these challenges via the use of our Cloud Data Protection Gateway that by charting your path carefully at the beginning of your project, you can arrive at a solution that will fully meet the needs of your Security, Legal, and Business Line teams.



PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information please visit http://www.perspecsys.com/ or follow on Twitter @perspecsys.

More Stories By Gerry Grealish

Gerry Grealish is Vice President, Marketing & Products, at PerspecSys. He is responsible for defining and executing PerspecSys’ marketing vision and driving revenue growth through strategic market expansion and new product development. Previously, he ran Product Marketing for the TNS Payments Division, helping create the marketing and product strategy for its cloud-based payment gateway and tokenization/encryption security solutions. He has held senior marketing and leadership roles for venture-backed startups as well as F500 companies, and his industry experience includes enterprise analytical software, payment processing and security services, and marketing and credit risk decisioning platforms.


