By Gerry Grealish
February 4, 2013 07:00 AM EST
The benefits of cloud adoption are well documented and well understood. Organizations cite tremendous cost savings, fast deployment times and streamlined application support and maintenance compared to traditional on-premises software deployments. So what is holding many companies back? A recent Gartner report entitled "Five Cloud Data Residency Issues That Must Not Be Ignored" highlights one key reason for this hesitancy - enterprises' questions and concerns about jurisdictional and regulatory control, arising from a lack of clarity on where cloud data truly resides. The report recommends that enterprises adopt measures that simultaneously boost the security of sensitive data and help satisfy data residency laws.
While the report provides excellent guidance on implementing one technique - encryption - to safeguard sensitive information in the cloud, it does not cover a few key points that deserve mention:
- Tokenization should be given strong consideration as the data security technique that enterprises deploy when data residency is a critical concern.
- If enterprises do deploy encryption, they should take every measure to ensure they are using the strongest form of encryption possible (e.g., FIPS 140-2 validated modules) to guard against the inherent threats of multi-tenant cloud environments.
Tokenization is a process by which a sensitive data field, such as a "Name" or "National ID Number," is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches to creating tokens exist, they are frequently just randomly generated values that have no mathematical relationship to the original data field (third-party evaluations of PerspecSys' tokenization approach are available). This is the source of the approach's inherent security - it is nearly impossible to determine the original value of a sensitive data field from the surrogate token alone. When deployed as a technique within a Cloud Data Protection Gateway, the token "vault" that maps each clear-text value to its surrogate token stays on-site within the organization's data center. The benefit from a data residency compliance perspective is apparent - the sensitive data truly never leaves the enterprise's location.
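To make the mechanics concrete, here is a minimal Python sketch of vault-based tokenization. The names used (TokenVault, tokenize, detokenize, the "TOK-" prefix) are illustrative assumptions, not PerspecSys' actual implementation; the point is that the token is purely random and the mapping lives entirely on-premises.

```python
import secrets

class TokenVault:
    """Illustrative on-premises vault: token <-> clear-text mapping never
    leaves this data structure; only tokens travel to the cloud."""

    def __init__(self):
        self._token_to_value = {}   # token -> original clear-text value
        self._value_to_token = {}   # reuse the same token if a value repeats

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random surrogate token."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "TOK-" + secrets.token_hex(8)  # random; no mathematical link
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Redeem a token for its original value (only possible on-site)."""
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("Jane Doe")   # e.g., 'TOK-3f9c1a2b4d5e6f70'
print(token)                         # safe to send to the cloud
print(vault.detokenize(token))       # 'Jane Doe' - recoverable only at the vault
```

Note the design point: because the token is drawn from a random source rather than computed from the input, there is nothing for an attacker in the cloud to reverse.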
How Encryption Differs
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform sensitive data's original value into a surrogate value. The surrogate can be transformed back to the original value using a "key," which can be thought of as the means to undo the mathematical lock. While encryption clearly can be used to obfuscate a value, a mathematical link back to its true form still exists. Tokenization, by contrast, completely removes the original data from the systems in which the tokens reside (the cloud); there is no construct of a "key" that can be used to bring it back into the clear in the cloud.
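For contrast, a short sketch of the "key" relationship described above, using the third-party Python 'cryptography' package (its Fernet recipe, AES-128-CBC with an HMAC) purely as a stand-in for any symmetric cipher:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the "mathematical lock" and its undo
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"Jane Doe")  # surrogate value sent to the cloud
print(ciphertext)                         # opaque, but mathematically linked

# Anyone who obtains the key can recover the original value - which is
# why key custody matters, and why a random token with no key at all
# removes this recovery path entirely.
print(cipher.decrypt(ciphertext))         # b'Jane Doe'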
In our experience with many customers, it is this unique characteristic of tokenization that has made it the preferred approach selected by enterprises when they are explicitly trying to address data residency requirements. In the words of one of our largest customers (who selected tokenization as their data security approach), "encrypted data leaving your premises is still data leaving your premises."
But If Encryption Is Used - Deploy Using Best Practices
If an organization decides to deploy encryption to protect sensitive information going to the cloud, it needs to ensure that industry-standard best practices for encryption are followed. As highlighted in the Cloud Security Alliance's guidelines as well as numerous Gartner reports, the use of published, well-vetted, strong encryption algorithms is a must. In fact, the previously mentioned report, "Five Cloud Data Residency Issues That Must Not Be Ignored," notes that enterprises need to ensure that the "strength of the security is not compromised." A good guideline is to look for solutions that support FIPS 140-2 validated algorithms from well-known providers such as McAfee, RSA, SafeNet, Symantec and Voltage Security. A unique and highly valued quality of the PerspecSys gateway is that cloud end users can still enjoy the full capabilities of cloud applications (such as search) even with data that is strongly encrypted with these industry-accepted, validated algorithms.
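As a rough illustration of what "published, well-vetted" means in practice, the sketch below uses AES-256-GCM from the Python 'cryptography' package. One hedge worth stating plainly: FIPS 140-2 validation attaches to a specific cryptographic module build, not to application code like this, so the example demonstrates only the algorithm choice, not a validated deployment.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # AES-256
aesgcm = AESGCM(key)

nonce = os.urandom(12)                     # GCM requires a unique nonce per message
ciphertext = aesgcm.encrypt(nonce, b"National ID: 123-45-6789", None)

# Authenticated decryption fails loudly if the ciphertext was tampered
# with in transit or at rest - one reason vetted AEAD modes are preferred
# over home-grown schemes.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
print(plaintext)
```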
Netting It Out
There is much to gain from using data obfuscation and replacement technologies to satisfy residency requirements and pave the way to cloud adoption. But equally, there is much to lose if the implementation is not well thought through. Do your homework: consider tokenization as an approach, question any encryption techniques that are not well vetted and accepted in the industry and, finally, compare solutions from multiple vendors (a suggestion - refer to our whitepaper, "Critical Questions to Ask Cloud Protection Gateway Providers," as a guide). We know from our experience helping many organizations around the world tackle these challenges with our Cloud Data Protection Gateway that by charting your path carefully at the beginning of your project, you can arrive at a solution that fully meets the needs of your Security, Legal and Business Line teams.
PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information please visit http://www.perspecsys.com/ or follow on Twitter @perspecsys.