Data Control and Data Residency

Consider your options carefully when planning the move to the cloud

The benefits associated with adoption of the cloud are well documented and understood. Organizations cite tremendous cost savings, fast deployment times and streamlined application support and maintenance when compared to traditional on-premises software deployments. So what is holding many companies back from adopting the cloud? A recent Gartner report entitled "Five Cloud Data Residency Issues That Must Not Be Ignored" highlights one key reason for this hesitancy - enterprises' questions and concerns about jurisdictional and regulatory control, arising from a lack of clarity on where cloud data truly resides. The report recommends that enterprises adopt measures that simultaneously boost the security of sensitive data and help satisfy regulatory compliance with data residency laws.

While the report provides excellent guidance on implementing one technique - encryption - to safeguard sensitive information in the cloud, it does not cover a few key points that deserve mention:

  • Tokenization should be given strong consideration as the data security technique that enterprises deploy when data residency is a critical concern.
  • If encryption is deployed by enterprises, they should take every measure to ensure that they are deploying the strongest form of encryption possible (e.g., use FIPS 140-2 validated modules) to guard against the inherent threats associated with multi-tenant cloud environments.

Why Tokenization?
Tokenization is a process by which a sensitive data field, such as a "Name" or "National ID Number," is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches to creating tokens exist, they are frequently just randomly generated values that have no mathematical relation to the original data field (a third-party evaluation of PerspecSys' tokenization approach reviews this in detail). This underlies the inherent security of the approach - it is nearly impossible to determine the original value of a sensitive data field by knowing only the surrogate token value. When deployed as a technique within a Cloud Data Protection Gateway, the token "vault" that matches the clear text value with the surrogate token stays on-site within an organization's data center. The benefit from a data residency compliance perspective is therefore apparent - the data truly never leaves the enterprise's location.
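To make the mechanics concrete, here is a minimal, illustrative sketch of tokenization in Python. It is not PerspecSys' implementation; the class and field values are hypothetical. The point it demonstrates is that the surrogate is a random value with no mathematical relationship to the original, and that the vault mapping tokens back to clear text stays inside the enterprise.

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault. In a real gateway deployment the
    vault would be a hardened datastore kept inside the enterprise data center."""

    def __init__(self):
        self._token_to_value = {}   # token -> original clear-text value
        self._value_to_token = {}   # original value -> token, so repeats map consistently

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random surrogate token."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(16)   # random; no mathematical link to the original
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Redeem a token for its original value - only possible where the vault lives."""
        return self._token_to_value[token]


vault = TokenVault()
surrogate = vault.tokenize("Jane Q. Citizen")   # only this surrogate is sent to the cloud
original = vault.detokenize(surrogate)          # clear text is recoverable only via the local vault
```

Because the cloud only ever sees the surrogate, there is nothing in the cloud environment - no key, no algorithm - that can reconstruct the original value.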

How Encryption Differs
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform sensitive data's original value to a surrogate value. The surrogate can be transformed back to the original value via the use of a "key," which can be thought of as the means to undo the mathematical lock. While encryption clearly can be used to obfuscate a value, a mathematical link back to its true form still exists. As described, tokenization is unique in that it completely removes the original data from the systems in which the tokens reside (the cloud) and there is no construct of a "key" that can be used to bring it back into the clear in the cloud.
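As a hedged illustration of that difference (using Python's cryptography package rather than any particular vendor's gateway, with a made-up field value), the ciphertext stored in the cloud can always be transformed back to clear text by anyone who obtains the key:

```python
from cryptography.fernet import Fernet  # symmetric, authenticated encryption

key = Fernet.generate_key()   # the "key" that undoes the mathematical lock
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"Jane Q. Citizen")   # surrogate value stored in the cloud
plaintext = cipher.decrypt(ciphertext)            # anyone holding the key can reverse it

assert plaintext == b"Jane Q. Citizen"
```

The encrypted value is opaque without the key, but the mathematical link back to the original travels with the data wherever it goes.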

In our experience with many customers, this unique characteristic of tokenization is what has made it the approach enterprises prefer when they are explicitly trying to address data residency requirements. In the words of one of our largest customers (who selected tokenization as their data security approach), "encrypted data leaving your premises is still data leaving your premises."

But If Encryption Is Used - Deploy Using Best Practices
If an organization decides to deploy encryption to protect sensitive information going to the cloud, it needs to ensure that industry-standard best practices for encryption are followed. As highlighted in the Cloud Security Alliance's guidelines as well as numerous Gartner reports, the use of published, well-vetted strong encryption algorithms is a must. In fact, the previously mentioned report, "Five Cloud Data Residency Issues That Must Not Be Ignored," notes that enterprises need to ensure that the "strength of the security is not compromised." A good guideline is to look for solutions that support FIPS 140-2 validated algorithms from well-known providers such as McAfee, RSA, SafeNet, Symantec and Voltage Security. A unique and highly valued quality of the PerspecSys gateway is that cloud end users can still enjoy the full capabilities of cloud applications (such as SEARCH) even when data is strongly encrypted with these industry-accepted, validated algorithms.
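As a sketch only, a published, well-vetted choice for field-level encryption would be AES-256 in GCM mode rather than a home-grown cipher. The snippet below uses Python's cryptography package with a hypothetical field value; FIPS 140-2 validation is a property of the cryptographic module actually performing the operations (for example, a validated OpenSSL build or a vendor HSM), not of this code itself.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # AES-GCM: published, well-vetted AEAD cipher

key = AESGCM.generate_key(bit_length=256)   # 256-bit key
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # unique nonce per encrypted field/record
ciphertext = aesgcm.encrypt(nonce, b"4111 1111 1111 1111", None)   # value sent to the cloud
cleartext = aesgcm.decrypt(nonce, ciphertext, None)                # reversible with the key
```

In practice the requirement is to run algorithms like this inside a validated module from one of the providers named above, and to keep the keys under the enterprise's control.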

Netting It Out
There is much to gain from using data obfuscation and replacement technologies to satisfy residency requirements and pave the way to cloud adoption. But equally, there is much to lose if the implementation is not well thought through. Do your homework: consider tokenization as an approach, question any encryption techniques that are not well vetted and accepted in the industry, and finally, compare solutions from multiple vendors (a suggestion - refer to our whitepaper, "Critical Questions to Ask Cloud Protection Gateway Providers," as a guide). We know from our experience helping many organizations around the world tackle these challenges with our Cloud Data Protection Gateway that by charting your path carefully at the beginning of your project, you can arrive at a solution that will fully meet the needs of your Security, Legal, and Business Line teams.



PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information please visit http://www.perspecsys.com/ or follow on Twitter @perspecsys.

More Stories By Gerry Grealish

Gerry Grealish is Vice President, Marketing & Products, at PerspecSys. He is responsible for defining and executing PerspecSys’ marketing vision and driving revenue growth through strategic market expansion and new product development. Previously, he ran Product Marketing for the TNS Payments Division, helping create the marketing and product strategy for its cloud-based payment gateway and tokenization/encryption security solutions. He has held senior marketing and leadership roles for venture-backed startups as well as F500 companies, and his industry experience includes enterprise analytical software, payment processing and security services, and marketing and credit risk decisioning platforms.
