
The TPM, Trust and the Cloud: Making Trust Real

The journey towards higher cloud security begins with the first step of establishing a foundation

As one of the hottest trends in the information and computing world, how do the requirements for cloud security relate to the security concerns enterprises were already addressing before the uptick in cloud computing? While the cloud is a relatively new area of concern, improved security has been a hot button for organizations of all sizes for well over a decade.

The cloud continues to evolve with various services and models including externally hosted public cloud services such as Amazon, independently hosted cloud services by an enterprise or third party that are either outside the firewall or inside the firewall (private cloud services), and a hybrid of private/internal and public/external models. On the surface, the different models might appear to have different privacy, access control and data protection issues. However, compliance issues and business data leakage out of any of these clouds pose problems for an organization similar to ones that existed prior to working in the cloud.

No matter which cloud service or model is pursued, security improvements should be among the criteria on the "must have" list. Issues to consider include: hardware versus software-based security, hardware activation, known users, known machines, access to both data and application services, data protection and compliance, and protecting the service provider's agreement for controlled user access.

For data leakage and access control, authorization to access information from the cloud, whether it's an application or an application with data, requires a trusted endpoint to ensure strong authentication and knowledge of who is accessing the cloud service and the data it hosts. In fact, a trusted endpoint is part of the solution for addressing all of these issues. One solution involves implementing the Trusted Platform Module (TPM), a widely available security chip that already resides in most business PCs. This standards-based hardware component provides stronger security than software-only approaches, which can be stolen, hacked or impersonated, causing security breaches and business disruption.

Hardware-Based Trust
Developed by the Trusted Computing Group (TCG), the TPM is a standard for providing a hardware-based root of trust for computing and other systems. Unlike other hardware token security tools available as USB, key fob and smart card products, the standards-based TPM is typically an application-specific integrated circuit (ASIC) available from multiple sources, ensuring a highly competitive, readily available component installed in the computer when it is manufactured. TPM capability can also be integrated into chipsets, Ethernet controllers and the like. To date, an estimated 200-300 million TPMs have shipped in enterprise PCs based on these standards.

The open standards-based approach provides a low-cost, flexible solution from multiple sources, with a high adoption rate and extensive support from over 100 technology companies across the globe. Governments around the world have already adopted or plan to adopt the TPM as the standard for authentication. If these governments pursue cloud services, they will not proceed without a similarly strong authentication methodology - and they will not implement proprietary technology.

Although built into the computer, the TPM must be activated by the user before it can improve security. Once the TPM is activated, users can more securely store and manage keys and passwords as well as easily encrypt files, folders and email. The TPM allows multi-factor authentication to be based in hardware rather than on a simple password in software. For multi-factor authentication requirements, the TPM complements fingerprint readers, PKI, certificates and smart cards.
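The core idea - key material stays inside the hardware while only signatures leave it - can be sketched in a few lines of Python. This is purely a conceptual illustration; a real TPM generates and holds keys in silicon and is accessed through platform APIs, not a Python class:

```python
import hmac
import hashlib

class SimulatedTPM:
    """Toy stand-in for a TPM: the key is created inside the module
    and is never exported; callers only ever receive signatures."""

    def __init__(self) -> None:
        # In real hardware this key would be generated on-chip.
        self._key = b"machine-unique-secret"

    def sign(self, challenge: bytes) -> bytes:
        # Only the signature leaves the "chip", never the key itself.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def authenticate(tpm: SimulatedTPM, registered_key: bytes, challenge: bytes) -> bool:
    """The verifier checks the response against the key it recorded
    when the machine was enrolled."""
    expected = hmac.new(registered_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(tpm.sign(challenge), expected)

tpm = SimulatedTPM()
print(authenticate(tpm, b"machine-unique-secret", b"nonce-123"))  # True
```

Because the verifier issues a fresh challenge each time, a captured response cannot be replayed - a property a software-only stored password cannot offer.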

Dual-Level Identification & Authentication
As a hardware item inside the computer, the TPM allows the identification of the machine as well as the user.

This elevates the sign-on process to two-factor, or dual-level, authentication: the first factor is the user, through a password, and the second is the machine itself, through the machine's authentication to a service. The approach provides a significant increase in security, especially compared to single-factor, software-only approaches.

With its secure storage for a credential or other critical piece of information, the TPM also supports multi-user scenarios, in which multiple users employ the machine's TPM for machine authentication and each user authenticates to the machine. A user could have a single sign-on to different services but would still have to authenticate to the machine. Once the user has authenticated to the machine, the machine can release credentials.
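The release-after-local-authentication flow described above can be sketched as follows. The names and data structures are hypothetical, and the "sealed" credential store stands in for what a real TPM would protect in hardware:

```python
import hashlib
from typing import Optional

# Per-machine user records: username -> salted password hash.
# (Illustrative only; a real deployment would use a proper KDF such as PBKDF2.)
_SALT = b"per-machine-salt"
USER_DB = {
    "alice": hashlib.sha256(_SALT + b"alice-pass").hexdigest(),
    "bob": hashlib.sha256(_SALT + b"bob-pass").hexdigest(),
}

# Service credentials "sealed" for each user of this machine.
SEALED_CREDENTIALS = {
    "alice": b"alice-cloud-token",
    "bob": b"bob-cloud-token",
}

def release_credential(user: str, password: str) -> Optional[bytes]:
    """Release a user's service credential only after that user
    authenticates to the machine; otherwise release nothing."""
    digest = hashlib.sha256(_SALT + password.encode()).hexdigest()
    if USER_DB.get(user) == digest:
        return SEALED_CREDENTIALS.get(user)
    return None
```

Each user receives a distinct credential, but all of them are anchored to the single machine identity held in the TPM.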

The layers of authentication, where the machine is known by the cloud and the machine has a relationship with the user, significantly enhance cloud security. This process provides a chain of trust for machine-to-machine connectivity so that only known machines and known users obtain access to applications and data. Implementing this dual-level authentication solves one of the most frequently discussed cloud issues - specifically knowing the users and machines that are connected to the cloud service.

Integrating Solutions
The TPM's hardware-based security can easily integrate with software security identification (ID) standards for federated ID management such as OpenID, Security Assertion Markup Language (SAML), WS-Federation (Web Services Federation) and others. For example, OpenID is an industry-standard protocol for communication or authentication at the service level. Over 20,000 service providers support OpenID as a software token. Figure 1 shows the process for cloud access based on the two-level authentication and an identity service that uses OpenID.

Figure 1: Improved cloud security requires a hardware-based TPM token to ensure machine integrity.
Source: id.wave.com.

A cloud service can bind an OpenID certificate to the TPM for strong machine authentication to its service. This provides not only the identity of the client but can also provide a health measure. The service could measure the health of the client, verify the relationship, and then provide the appropriate access. This also allows multiple users per client, because the machine has a relationship with the service and multiple users have relationships with the machine. The process extends the security approach across different use models.
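One way to picture this binding is a user assertion that the service accepts only when it carries a signature from the machine's registered key. The sketch below is hypothetical - it substitutes an HMAC for the TPM's actual signing operation and omits the real OpenID protocol messages:

```python
import hmac
import hashlib
import json

# Key held by the machine's TPM; the service records it at enrollment.
MACHINE_KEY = b"tpm-held-machine-key"

def make_assertion(user: str, nonce: str) -> dict:
    """Client side: the machine signs the user's assertion."""
    payload = json.dumps({"user": user, "nonce": nonce}).encode()
    sig = hmac.new(MACHINE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "sig": sig}

def service_verify(assertion: dict, registered_key: bytes) -> bool:
    """Service side: accept the user only if the assertion was signed
    by the machine key registered for this client."""
    expected = hmac.new(registered_key, assertion["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(assertion["sig"], expected)
```

Different users on the same machine produce different assertions, but all of them verify against the one machine key the service registered - which is what allows multiple users per client.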

SAML, WS-Federation and other software identification standards can be implemented in a similar manner for establishing a known identity for cloud access. The essential element for sharing an identity is a trustworthy endpoint. In all of these cases, the TPM provides the base level, the foundation of trust.

Summary
The hardware-based security model described here supports private/internal, public/external and hybrid cloud combinations as well as traditional IT-managed networks. This allows reuse of the security technology, including corporate client policies, to minimize costs. The alternative is continuing on the same path of software-only authentication. However, software in the computer can be hacked, impersonated, stolen, or victimized by the very malicious software it is attempting to stop - the same model that is already broken inside the enterprise today.

Once hardware-based access is implemented for machine authentication, it enables the use of other industry standards-based tools such as self-encrypting drives (SEDs) and Trusted Network Connect (TNC). These added tools can address the data leakage that may result from storing data accessed from the cloud on a user's client. The journey toward higher cloud security begins with the first step of establishing a foundation: implementing the hardware-based security of the TPM.

More Stories By Brian Berger

Brian Berger is an executive vice president for Wave Systems Corp. He manages the business, strategy and marketing functions, including product management, marketing and sales direction for the company. He has been involved in security products for a number of years, including work with embedded hardware, client/server applications, PKI and biometrics. He has worked in the computer industry for 20 years and has held several senior-level positions in multinational companies. Berger holds two patents and has two pending patents for security products and commerce transaction capabilities using security technology. He has a bachelor of arts degree from California State University, Northridge and attended Harvard Business School, Executive Education.

Berger is Wave's representative to the TCG board of directors (Wave is a Promoter member) and chairs the Trusted Computing Group's marketing work group, where he leads strategy and implementation of the organization's marketing and communications programs. He has spoken at a number of RSA Conferences, IT Roadmap events, CTIA and other industry events.


