
@CloudExpo: Article

The TPM, Trust and the Cloud: Making Trust Real

The journey towards higher cloud security begins with the first step of establishing a foundation

As cloud computing becomes one of the dominant trends in the information/computing world, how do the requirements for cloud security relate to the security concerns that enterprises were already addressing before the uptick in cloud computing? While the cloud is a relatively new area of concern, improved security has been a hot button for organizations of all sizes for well over a decade.

The cloud continues to evolve with various services and models including externally hosted public cloud services such as Amazon, independently hosted cloud services by an enterprise or third party that are either outside the firewall or inside the firewall (private cloud services), and a hybrid of private/internal and public/external models. On the surface, the different models might appear to have different privacy, access control and data protection issues. However, compliance issues and business data leakage out of any of these clouds pose problems for an organization similar to ones that existed prior to working in the cloud.

No matter which cloud service or model is pursued, security improvements should be among the criteria on the "must have" list. Issues to consider include hardware versus software-based security, hardware activation, known users, known machines, access to both data and application services, data protection and compliance, and enforcement of the service provider's agreement for controlled user access. For data leakage and access control, authorization to access information from the cloud, whether it is an application alone or an application with data, requires a trusted endpoint to ensure strong authentication and knowledge of who is accessing the cloud service and the data it hosts. In fact, a trusted endpoint is part of the solution for all of these issues. One solution involves implementing the Trusted Platform Module (TPM), a widely available security chip that already resides in most business PCs. This standards-based hardware component provides stronger security than software-only approaches, which can be stolen, hacked or impersonated, causing security breaches and business disruption.

Hardware-Based Trust
Developed by the Trusted Computing Group (TCG), the TPM is a standard for providing a hardware-based root of trust for computing and other systems. Unlike other hardware token security tools available as USB, key fob and smart card products, the standards-based TPM is typically an application-specific integrated circuit (ASIC), available from multiple sources to ensure a highly competitive, readily available component that is installed in the computer when it is manufactured. TPM capability can also be integrated into chipsets, Ethernet controllers and the like. To date, it is estimated that 200-300 million TPMs have shipped in enterprise PCs based on these standards.

The open standards-based approach provides a low-cost, flexible solution from multiple sources with a higher adoption rate and extensive support from over 100 technology companies in the industry across the globe. International governments have already adopted or plan to adopt the TPM as the standard for authentication. If these governments pursue cloud services, they will not proceed without a similar strong authentication methodology - and they will not implement proprietary technology.

Although the TPM ships inside the computer, it must be activated by the user before it can improve security. Once the TPM is activated, users can more securely store and manage keys and passwords as well as easily encrypt files, folders and email. The TPM allows multi-factor authentication to be rooted in hardware rather than in a simple software password. For multi-factor authentication requirements, the TPM complements fingerprint readers, PKI, certificates and smart cards.
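The activate-then-use flow described above can be sketched as a toy model. Every name below (ToyTPM, activate, seal, unseal) is hypothetical and the XOR "wrapping" merely stands in for the chip's real cryptography; this is an illustration of the idea that sealed secrets are only usable on the chip that created them, not the actual TPM command set.

```python
import hashlib
import os

class ToyTPM:
    """Illustrative stand-in for a TPM; not the real TPM 1.2/2.0 API."""
    def __init__(self):
        # Storage root key: generated on-chip, never leaves the device.
        self._srk = os.urandom(32)
        self._activated = False

    def activate(self, owner_password: str):
        # Real TPMs require owner authorization (often via BIOS) before use.
        self._owner = hashlib.sha256(owner_password.encode()).digest()
        self._activated = True

    def _pad(self) -> bytes:
        return hashlib.sha256(self._srk + b"seal").digest()

    def seal(self, secret: bytes) -> bytes:
        # Wrap a secret so only this machine's TPM can recover it.
        assert self._activated, "TPM must be activated first"
        return bytes(a ^ b for a, b in zip(secret, self._pad()))

    def unseal(self, blob: bytes) -> bytes:
        assert self._activated, "TPM must be activated first"
        return bytes(a ^ b for a, b in zip(blob, self._pad()))

tpm = ToyTPM()
tpm.activate("owner-passphrase")
blob = tpm.seal(b"disk-encryption-key")
assert tpm.unseal(blob) == b"disk-encryption-key"

# A different chip has a different storage root key, so the
# wrapped blob is useless on any other machine.
other = ToyTPM()
other.activate("owner-passphrase")
assert other.unseal(blob) != b"disk-encryption-key"
```

The point of the sketch is the asymmetry it demonstrates: the sealed blob can travel anywhere, but it only yields the secret on the machine whose TPM wrapped it.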

Dual-Level Identification & Authentication
As a hardware item inside the computer, the TPM allows the identification of the machine as well as the user.

This elevates the sign-on process to two-factor, dual-level authentication: the first level is the user, through a password; the second is the machine itself, through the machine's authentication to a service. The approach provides a significant increase in security, especially when compared to single-factor, software-only approaches.

With its secure storage for credentials and other critical information, the TPM also supports multi-user machines: multiple users employ the machine's TPM for machine authentication, and each user authenticates to the machine. A user could have a single sign-on to different services but would still have to authenticate to the machine. Once the user has authenticated to the machine, the machine can release credentials.

The layers of authentication, where the machine is known by the cloud and the machine has a relationship with the user, significantly enhance cloud security. This process provides a chain of trust for machine-to-machine connectivity so that only known machines and known users obtain access to applications and data. Implementing this dual-level authentication solves one of the most frequently discussed cloud issues - specifically knowing the users and machines that are connected to the cloud service.
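The dual-level flow above might be sketched as follows. All names are hypothetical, and a shared-secret HMAC stands in for the TPM's key operations: the user first authenticates to the machine, and only then does the machine answer the cloud service's challenge with its (here simulated) TPM-held key.

```python
import hashlib
import hmac
import os

# Client side: the machine key would live in the TPM on a real system.
MACHINE_KEY = os.urandom(32)
USERS = {"alice": hashlib.sha256(b"alice-pw").hexdigest()}

# Cloud side: the service's registry of known machines.
KNOWN_MACHINES = {}

def enroll_machine(machine_id: str):
    # One-time registration of a machine with the cloud service.
    KNOWN_MACHINES[machine_id] = MACHINE_KEY

def sign_on(user: str, password: str, challenge: bytes):
    # Level 1: the user authenticates to the machine.
    if USERS.get(user) != hashlib.sha256(password.encode()).hexdigest():
        return None
    # Level 2: the machine proves itself to the cloud service by
    # answering the challenge with its TPM-held key.
    return hmac.new(MACHINE_KEY, challenge, "sha256").hexdigest()

def cloud_verify(machine_id: str, challenge: bytes, response: str) -> bool:
    # The service accepts only known machines with known users.
    key = KNOWN_MACHINES.get(machine_id)
    if key is None or response is None:
        return False
    expected = hmac.new(key, challenge, "sha256").hexdigest()
    return hmac.compare_digest(expected, response)

enroll_machine("laptop-1")
challenge = os.urandom(16)
resp = sign_on("alice", "alice-pw", challenge)
assert cloud_verify("laptop-1", challenge, resp)
assert sign_on("alice", "wrong-pw", challenge) is None
```

A wrong password stops the flow at level 1, so the machine credential is never released; an unenrolled machine fails at level 2 even with a valid user.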

Integrating Solutions
The TPM's hardware-based security integrates easily with software identification (ID) standards for federated ID management such as OpenID, Security Assertion Markup Language (SAML), WS-Federation (Web Services Federation) and others. For example, OpenID is an industry-standard protocol for communication and authentication at the service level, and over 20,000 service providers support OpenID as a software token. Figure 1 shows the process for cloud access based on two-level authentication and an identity service that uses OpenID.

Figure 1: Improved cloud security requires a hardware-based TPM token to ensure machine integrity.
Source: id.wave.com.

A cloud service can bind an OpenID credential to the TPM for strong machine authentication to the service. This provides not only the identity of the client but can also provide a health measurement: the service can measure the health of the client, verify the relationship, and then grant the appropriate access. It also allows multiple users per client, because the machine has a relationship with the service while multiple users have relationships with the machine. The process extends the security approach across different use models.
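A rough sketch of this binding follows, with every name and the PCR-style health measurement purely illustrative. Real TPM attestation uses asymmetric keys and signed PCR quotes rather than the shared-secret HMAC used here for brevity.

```python
import hashlib
import hmac
import json
import os

# Hypothetical machine key; in practice this would be a TPM-resident key.
TPM_KEY = os.urandom(32)
BOOT_COMPONENTS = [b"bios-1.2", b"bootloader-3.1", b"kernel-5.x"]

def measure_boot() -> str:
    # PCR-style extend: fold each component's hash into a running digest,
    # so the final value reflects the entire boot sequence.
    pcr = b"\x00" * 32
    for component in BOOT_COMPONENTS:
        pcr = hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()
    return pcr.hex()

def make_assertion(openid_identity: str):
    # Client: bind the OpenID identity and a health measurement
    # together under the machine's key.
    body = json.dumps({"id": openid_identity, "pcr": measure_boot()})
    sig = hmac.new(TPM_KEY, body.encode(), "sha256").hexdigest()
    return body, sig

def service_verify(body: str, sig: str, known_key: bytes,
                   expected_pcr: str) -> bool:
    # Service: first check the machine binding, then the health measure.
    expected_sig = hmac.new(known_key, body.encode(), "sha256").hexdigest()
    if not hmac.compare_digest(expected_sig, sig):
        return False  # not a known machine
    return json.loads(body)["pcr"] == expected_pcr

body, sig = make_assertion("https://example.openid.user/alice")
assert service_verify(body, sig, TPM_KEY, measure_boot())
```

Because identity and health travel in one machine-bound assertion, the service can refuse access to an unknown machine or to a known machine whose boot measurement has drifted from the expected value.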

SAML, WS-Federation and other software identification standards can be implemented in a similar manner for establishing a known identity for cloud access. The essential element for sharing an identity is a trustworthy endpoint. In all of these cases, the TPM provides the base level, the foundation of trust.

Summary
The hardware-based security model described here supports private/internal, public/external and hybrid clouds as well as traditional IT-managed networks. This allows reuse of the security technology, including corporate client policies, to minimize costs. The alternative is to continue down the path of software-only authentication. However, software in the computer can be hacked, impersonated, stolen or victimized by the very malicious software it is attempting to stop - the same model that is already broken inside the enterprise today.

Once hardware-based access is implemented for machine authentication, it enables the use of other industry standards-based tools for self-encrypting drives (SEDs) and trusted network connect (TNC). These added tools can address the data leakage problem that may result from storing the data accessed from the cloud on a user's client. The journey towards higher cloud security begins with the first step of establishing a foundation by implementing the hardware-based security of the TPM.

More Stories By Brian Berger

Brian Berger is an executive vice president for Wave Systems Corp. He manages the business, strategy and marketing functions, including product management, marketing and sales direction for the company. He has been involved in security products for a number of years, including work with embedded hardware, client/server applications, PKI and biometrics. He has worked in the computer industry for 20 years and has held several senior-level positions in multinational companies. Berger holds two patents and has two pending patents for security products and commerce transaction capabilities using security technology. He has a Bachelor of Arts degree from California State University, Northridge and attended Harvard Business School, Executive Education.

Berger is the representative of Wave, a Promoter member, to the TCG board of directors, and chairs the Trusted Computing Group’s marketing work group, where he leads strategy and implementation of the organization’s marketing and communications programs. He has spoken at a number of RSA conferences, IT Roadmap events, CTIA and other industry events.


