Content / Context / Device Aware Cloud Data Protection

In this two-part blog, I am going to talk about the Intel Cloud Data Protection solution, which helps our customers use their data in both a content-aware and context-aware manner.

This is a newer set of technologies that has hit the market in the last few years. In the past, we used to think that just encrypting the transport layer (such as with TLS/SSL) was good enough. Given the complex nature of service and API composition, we quickly realized that it was not. Then we moved to protecting the messages themselves, most of the time the entire message, or specific sensitive fields at the field level. The problem with any of these approaches was that they were somewhat static in nature: somewhere there was a definition of what “sensitive data” is, along with details on how strictly to protect it. However, when there is a real need to send sensitive data out, and a need to protect it, making sure that only the authenticated party can receive and/or use the message is critical.
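To make the difference between whole-message and field-level protection concrete, here is a rough Python sketch. It assumes the third-party cryptography package is installed, the field names and the "sensitive fields" policy are made up for illustration, and a real deployment would keep the key in an HSM or key manager rather than generating it inline.

```python
# Sketch: whole-message vs. field-level protection of an API payload.
# Assumes "pip install cryptography"; field names are illustrative only.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, the key lives in an HSM/key manager
cipher = Fernet(key)

message = {"customer": "Jane Doe", "ssn": "123-45-6789", "order_id": "A-1001"}

# Option 1: protect the entire message (opaque to every intermediary).
whole_message_ct = cipher.encrypt(json.dumps(message).encode())

# Option 2: protect only the fields a policy marks as sensitive,
# leaving routing/processing fields readable downstream.
SENSITIVE_FIELDS = {"ssn"}
field_level = {
    k: cipher.encrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
    for k, v in message.items()
}
print(field_level["order_id"])     # still usable by intermediaries
```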

Essentially, “Content/Context Aware” data protection is data protection on steroids. Remember in prior years when we used DLP technologies, which identified data leakage or data loss based on certain policies and parameters and blocked the transaction, but did nothing more intelligent about it? The problem with DLP is that it is passive in most cases: it identifies sensitive data based on some context/policy combination and then simply blocks the transaction. While this can work for rigid enterprise policy sets, it may not work for cloud environments, where you need these policies to be flexible. The issue is that when someone who is authorized really needs that data, it is unacceptable to have the transaction stopped.
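To illustrate that block-only pattern, here is a hypothetical sketch of a static DLP check; the regular expressions and the block/allow decision are simplified assumptions, not any particular product's rules.

```python
# Sketch of "passive" DLP: scan the payload against fixed patterns and
# simply block on a match, with no option to protect-and-forward.
import re

DLP_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def dlp_decision(payload: str) -> str:
    """Return 'BLOCK' if any sensitive pattern appears, else 'ALLOW'."""
    for pattern in DLP_PATTERNS.values():
        if pattern.search(payload):
            return "BLOCK"
    return "ALLOW"

print(dlp_decision("ship order 42"))             # ALLOW
print(dlp_decision("SSN on file: 123-45-6789"))  # BLOCK
```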

What if there were a way to provide data protection that is identity aware, location aware, and invocation aware, and yet is policy based, compliance based and, more importantly, very dynamic? In other words, what if you could provide data protection based on content and context awareness? Gone are the days when you could make sure your systems were compliant and be done with it. Read my blog on why getting compliant is not enough anymore (link here). That is because your data is NOT staying within your compliant enterprise Ft. Knox anymore; it is moving around. Getting your systems compliant, risk averse, and secure is just not good enough when your data is moving through other eco-systems, not just yours.
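As a hedged sketch of what such a content/context-aware decision could look like, the example below computes a protection action per request from identity, clearance, destination, and data classification. The attribute names and the actions are illustrative assumptions; the point is that the outcome is decided dynamically by policy rather than hard-coded.

```python
# Sketch: a per-request, context-aware protection decision.
# Attribute names and actions are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class RequestContext:
    caller_id: str
    caller_clearance: str     # e.g. "internal", "partner", "public"
    destination_region: str   # where the message is headed
    data_classification: str  # e.g. "pci", "pii", "public"

def choose_protection(ctx: RequestContext) -> str:
    """Pick a protection action from content (classification) and context."""
    if ctx.data_classification == "public":
        return "pass_through"
    if ctx.destination_region != "home_region":
        return "tokenize"            # nothing sensitive leaves in the clear
    if ctx.caller_clearance == "internal":
        return "encrypt_field"       # protected, but recoverable in-house
    return "block"

print(choose_protection(RequestContext("app-7", "partner", "eu-west", "pii")))
```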

When you move your data through cloud providers (especially public clouds) and add mobile devices (mobility) to the mix, the issue gets even more interesting. Sprinkle data residency issues on top of that to spice it up.

First of all, take a look at your cloud provider contract closely if you haven’t done so already.

  • Are there any guarantees on where the data is stored (in other words, the location of the data residency)?
  • Are there any guarantees on where the data will be processed (or the location of data processing)?
  • Are they willing to share the liability with you if they lose your or your customer’s data?

Yes, some providers are better than others, but I have seen contracts that gave me heart palpitations. No wonder companies are scared to death about protecting their data when moving to the cloud!

The data residency issues are especially big for some of our European customers. This is certainly true for multi-country services, where one has to restrict not only where data resides at rest, but also where it can be processed. Imagine dealing with financial, healthcare, or other sensitive data for a specific country, and being asked not only to store that data within the legal boundaries of that country, but also to process it within data centers located in that country. You are then faced with additional requirements, including the need to sanitize data, route messages to services located in a specific place, desensitize the data for processing, and sanitize it again for storage.
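As a rough illustration of that routing requirement (not a description of any specific product), the sketch below picks an in-country service instance and desensitizes the record before it is processed or stored. The endpoints and helper functions are hypothetical.

```python
# Sketch: residency-aware routing with desensitization before processing.
# Endpoints, field names, and the toy tokenizer are hypothetical.
REGIONAL_ENDPOINTS = {
    "DE": "https://api.example.de/process",   # hypothetical in-country instances
    "FR": "https://api.example.fr/process",
}

def tokenize(value: str) -> str:
    return "TOK-" + str(abs(hash(value)))     # stand-in for a real token vault

def handle(record: dict, residency_country: str) -> dict:
    endpoint = REGIONAL_ENDPOINTS[residency_country]   # stay inside the legal boundary
    safe_record = {**record, "national_id": tokenize(record["national_id"])}
    # ...send safe_record to `endpoint` for processing, then persist the
    # tokenized form so clear-text data never rests outside the country.
    return {"endpoint": endpoint, "stored": safe_record}

print(handle({"name": "K. Muller", "national_id": "X123456"}, "DE"))
```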

Essentially, your solution needs to:

  • Have a strong encryption engine which has all the possible security certifications that you can think of – such as FIPS 140-2 Level 3, DoD PKI, CC EAL 4+, etc.
  • Use very strong encryption standards/algorithms for data, whether in storage or in transit.
  • Protect the encryption keys with your life. There is no point in encrypting the data yet giving away the “Keys to the Kingdom” easily.
  • Have a solution that can sanitize the data very dynamically and very granularly, based either on pre-defined policies (such as XACML) or on DLP-based detection.
  • Make a decision based on the content/context and protect the data based on the need. This means having the flexibility to encrypt the entire message, encrypt only the specific sensitive data in the message, preserve the format of the sensitive data, and/or tokenize the data based on the need.
  • Encrypt the message while preserving the format, so it won’t break the backend systems.
  • Tokenize the PCI and/or PII data for compliance and security reasons (see the tokenization sketch after this list).
  • Scrutinize the message more deeply if it is intended for a non-secure location or endpoint, such as mobile devices, cloud locations, or third-world countries.
  • Comply with data residency requirements by mandating that data is processed and stored in a specific instance of the service, based on where it is located.
  • Have an elaborate access-control mechanism for the data, based on user/application clearance, data classification, and the time and day of the access request.
  • Most importantly, drive all of the above with policies that can be changed dynamically as the need arises.
  • Do all of the above seamlessly (or “automagically”).
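To show what the format-preserving tokenization mentioned above might look like, here is a toy Python sketch in which the surrogate keeps the PAN's length and last four digits so that format-validating backends keep working. The in-memory vault and the helper names are assumptions, and this is in no way a PCI-compliant implementation.

```python
# Toy sketch of format-preserving tokenization for a card number (PAN).
# The in-memory vault and helpers are illustrative; a real vault is an
# HSM-backed, access-controlled store.
import secrets

_vault = {}   # token -> original value

def tokenize_pan(pan: str) -> str:
    digits = [c for c in pan if c.isdigit()]
    # Random digits for the body, keeping the length and the last four digits.
    token = "".join(str(secrets.randbelow(10)) for _ in digits[:-4]) + "".join(digits[-4:])
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    return _vault[token]      # only for callers cleared by policy

original = "4111111111111111"
tok = tokenize_pan(original)
print(tok, len(tok) == len(original), tok[-4:] == original[-4:])
print(detokenize(tok) == original)
```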

In part 2 of my blog, I will discuss how the Intel Cloud data privacy solutions (or the Cloud encryption/tokenization gateway) elegantly solve this problem and should be the only toolkit you will ever need in your arsenal to solve this issue.

In the meantime, you can check out information about our tokenization and cloud data privacy solutions here.

Intel Cloud Data Privacy/ Tokenization Solutions

Intel Cloud/ API resource center

I also encourage you to download the Intel Expressway Tokenization Broker Data Sheet.

Andy Thurai — Chief Architect & Group CTO, Application Security and Identity Products, Intel

Andy Thurai is Chief Architect and Group CTO of Application Security and Identity Products with Intel, where he is responsible for architecting SOA, Cloud, Mobile, Big Data, Governance, Security, and Identity solutions for their major corporate customers. In his role, he is responsible for helping Intel/McAfee field sales, technical teams, and customer executives. Prior to this role, he held technology architecture leadership and executive positions with L-1 Identity Solutions, IBM (DataPower), BMC, CSC, and Nortel. His interests and expertise include Cloud, SOA, identity management, security, governance, and SaaS. He holds a degree in Electrical and Electronics Engineering and has over 25 years of IT experience.

He blogs regularly at www.thurai.net/securityblog on Security, SOA, Identity, Governance and Cloud topics. You can also find him on LinkedIn at http://www.linkedin.com/in/andythurai

