
Content / Context / Device Aware Cloud Data Protection

In this two-part blog, I am going to talk about the Intel Cloud Data Protection solution that helps our customers protect and use their data in both a context- and content-aware manner.

This is a newer set of technologies that has hit the market in the last few years. In the past, we assumed that encrypting the transport layer (with TLS/SSL) was good enough. Given the complex nature of service and API composition, we quickly realized it was not. We then moved to protecting the messages themselves – most of the time the entire message, or specific sensitive fields at the field level. The problem with any of these approaches is that they are static: somewhere there is a fixed definition of what “sensitive data” is, together with rules for strictly protecting it. But when there is a real need to send sensitive data out, protecting it and making sure that only the authenticated party can receive and/or use the message is critical.
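To make the field-level idea concrete, here is a minimal sketch of protecting only the sensitive fields of a message instead of the whole payload. This is purely illustrative: the field names, the `tok_` prefix, and the in-memory token vault are my own assumptions, not any product's API, and a real system would persist the vault securely.

```python
import secrets

# token -> original value; a real system would keep this in a secured store
TOKEN_VAULT = {}

def tokenize_field(value):
    """Replace a sensitive value with a random surrogate and remember the mapping."""
    token = "tok_" + secrets.token_hex(8)
    TOKEN_VAULT[token] = value
    return token

def protect_message(message, sensitive_fields):
    """Return a copy of the message with only the listed fields tokenized."""
    protected = dict(message)
    for field in sensitive_fields:
        if field in protected:
            protected[field] = tokenize_field(protected[field])
    return protected

msg = {"name": "Alice", "ssn": "123-45-6789", "city": "Boston"}
safe = protect_message(msg, ["ssn"])
# safe["ssn"] is now an opaque token; "name" and "city" pass through untouched
```

The point of the sketch is the granularity: the non-sensitive fields travel in the clear, so downstream services keep working, while the sensitive field is replaced by a surrogate that is useless without the vault.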


Essentially, “content/context-aware” data protection is data protection on steroids. Remember how, in prior years, DLP technologies identified data leakage/data loss based on certain policies/parameters and stopped the loss, but did nothing beyond that? The problem with DLP is that it is passive in most cases: it identifies sensitive data based on some context/policy combination and then blocks the transaction. While this can work for rigid enterprise policy sets, it may not work for cloud environments, where you need these policies to be flexible. When someone who is authorized really needs that data, it is unacceptable to have the transaction stopped.

What if there were a way to provide data protection that is identity aware, location aware, and invocation aware – and yet policy based, compliance based, and, more importantly, very dynamic? In other words, what if you could provide data protection based on content and context awareness? Gone are the days when you could make your systems compliant and be done. Read my blog on why getting compliant is not enough anymore. (link here). That is because your data is NOT staying within your compliant enterprise Ft. Knox anymore; it is moving around. Getting your systems compliant, risk averse, and secure is just not good enough when your data moves through other ecosystems, not just yours.
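A context-aware decision of this kind can be sketched in a few lines. This is a toy model under my own assumptions – the attribute names, clearance values, and action strings are invented for illustration and do not come from any specific product. The key difference from static DLP is that the outcome is not a fixed allow/block: a risky context tightens the protection instead of stopping the transaction.

```python
def decide_protection(user_clearance, location, hour, data_classification):
    """Pick a protection action from the request context, not a fixed rule."""
    if data_classification == "public":
        return "allow"                 # nothing sensitive, nothing to protect
    if user_clearance != "authorized":
        return "block"                 # unauthenticated parties never get the data
    # Authorized user, but a risky context still tightens the protection.
    if location == "untrusted" or not (8 <= hour < 18):
        return "tokenize"              # release a surrogate, not the raw value
    return "encrypt-field"             # release, with field-level encryption
```

So instead of the DLP outcome of “block and do nothing about it,” an authorized user on an untrusted network still gets a usable, tokenized response.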

When you move your data through cloud providers (especially public clouds) and add mobile devices to the mix, the issue gets even more interesting. Sprinkle data residency issues on top to spice it up.

First of all, take a look at your cloud provider contract closely if you haven’t done so already.

  • Are there any guarantees on where the data is stored (in other words, the location of the data residency)?
  • Are there any guarantees on where the data will be processed (or the location of data processing)?
  • Are they willing to share the liability with you if they lose your or your customer’s data?

Yes, some providers are better than others, but I have seen contracts that give me heart palpitations. No wonder companies are scared to death about protecting their data when moving to the cloud!

Data residency issues loom especially large for some of our European customers. This is certainly true for multi-country services, where one has to restrict not only where data resides at rest but also where it can be processed. Imagine dealing with financial, healthcare, or other sensitive data for a specific country that asks you not only to store the data within its legal boundaries but also to process it in data centers located in that country. You then face additional requirements: a need to sanitize data, route messages to services located in a specific place, desensitize the data for processing, and sanitize it again for storage.
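The routing step above can be sketched simply: pick a processing endpoint from the country the data is bound to, and refuse to send it anywhere else. The endpoint URLs, country codes, and field name here are made-up assumptions for the example, not anyone's real topology.

```python
# Map each residency country to an in-country service instance (hypothetical URLs).
RESIDENCY_ENDPOINTS = {
    "DE": "https://de.example-processor.internal",
    "FR": "https://fr.example-processor.internal",
}

def route_message(message):
    """Choose a processing endpoint from the message's residency country."""
    country = message.get("residency_country")
    endpoint = RESIDENCY_ENDPOINTS.get(country)
    if endpoint is None:
        # Failing closed matters: better to reject than to process out of country.
        raise ValueError(f"no in-country service instance for {country!r}")
    return endpoint
```

Note the design choice of failing closed: a message with no compliant destination is rejected rather than routed to a default region.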

Essentially, your solution needs to:

  • Have a strong encryption engine that carries all the security certifications you can think of – FIPS 140-2 Level 3, DoD PKI, CC EAL 4+, etc.
  • Use very strong encryption standards/algorithms for data, whether in storage or in transit.
  • Protect the encryption keys with your life. There is no point in encrypting the data if you give away the “Keys to the Kingdom” easily.
  • Sanitize the data dynamically and granularly, based either on pre-defined policies (such as XACML) or on DLP results.
  • Make a decision based on the content/context and protect the data according to the need. This means having the flexibility to encrypt the entire message or only the specific sensitive fields, to preserve the format of the sensitive data, and/or to tokenize the data as needed.
  • Encrypt the message while preserving its format, so it won’t break the backend systems.
  • Tokenize PCI and/or PII data for compliance and security reasons.
  • Scrutinize the message more deeply if it is headed to a less secure location or endpoint – such as a mobile device, a cloud location, or a country with weaker data-protection laws.
  • Comply with data residency mandates by routing the processing and storage of data to a specific instance of the service based on where it is located.
  • Enforce elaborate access control over the data based on user/application clearance, data classification, and the time and day of the access request.
  • Most importantly, drive all of the above with policies that can be changed dynamically as needs change.
  • Do all of the above seamlessly (or “automagically”).
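The format-preserving point in the checklist is worth a small sketch. Below, a card number is replaced by a same-length, digits-only surrogate that keeps the last four digits, so backend systems that validate format or display “****1234” keep working. This is a hedged illustration only: real products use vetted format-preserving encryption algorithms (e.g., NIST FF1), not random surrogates, and the function name is my own.

```python
import secrets

def tokenize_pan(pan):
    """Return a same-length, digits-only surrogate keeping the last 4 digits."""
    if not pan.isdigit() or len(pan) < 8:
        raise ValueError("expected a digits-only PAN of at least 8 characters")
    head_len = len(pan) - 4
    # Random digits stand in for the encrypted portion in this sketch.
    head = "".join(str(secrets.randbelow(10)) for _ in range(head_len))
    return head + pan[-4:]
```

Because the surrogate has the same length and character class as the original, it can flow through schema validation, storage columns, and display logic untouched, which is exactly why format preservation matters for not breaking backend systems.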

In part 2 of my blog, I will discuss how the Intel Cloud data privacy solutions (the Cloud encryption/tokenization gateway) elegantly solve this problem and should be the only toolkit you will ever need in your arsenal for this issue.

In the meantime, you can check out information about our tokenization and cloud data privacy solutions here.

Intel Cloud Data Privacy/ Tokenization Solutions

Intel Cloud/ API resource center

I also encourage you to download the Intel Expressway Tokenization Broker Data Sheet.

Andy Thurai — Chief Architect & Group CTO, Application Security and Identity Products, Intel

Andy Thurai is Chief Architect and Group CTO of Application Security and Identity Products with Intel, where he is responsible for architecting SOA, Cloud, Mobile, Big Data, Governance, Security, and Identity solutions for their major corporate customers. In this role, he supports Intel/McAfee field sales, technical teams, and customer executives. Prior to this role, he held technology architecture leadership and executive positions with L-1 Identity Solutions, IBM (Datapower), BMC, CSC, and Nortel. His interests and expertise include Cloud, SOA, identity management, security, governance, and SaaS. He holds a degree in Electrical and Electronics Engineering and has over 25 years of IT experience.

He blogs regularly at www.thurai.net/securityblog on Security, SOA, Identity, Governance and Cloud topics. You can also find him on LinkedIn at http://www.linkedin.com/in/andythurai

