
The TPM, Trust and the Cloud: Making Trust Real

The journey towards higher cloud security begins with the first step of establishing a foundation

Cloud computing is one of the hottest trends in the information and computing world, so how do the requirements for cloud security relate to the security concerns that enterprises were already addressing before the uptick in cloud computing? While the cloud is a relatively new area of concern, improved security has been a hot button for organizations of all sizes for well over a decade.

The cloud continues to evolve with various services and models: externally hosted public cloud services such as Amazon's; cloud services hosted independently by an enterprise or a third party, either outside or inside the firewall (private cloud services); and hybrids of the private/internal and public/external models. On the surface, the different models might appear to have different privacy, access control and data protection issues. However, compliance issues and business data leaking out of any of these clouds pose problems for an organization similar to those that existed before it worked in the cloud.

No matter which cloud service or model is pursued, security improvements should be among the criteria on the "must have" list. Issues to consider include hardware- versus software-based security, hardware activation, known users, known machines, access to both data and application services, data protection and compliance, and enforcement of the service provider's agreement for controlled user access. For data leakage and access control, authorization to access information from the cloud, whether an application alone or an application with data, requires a trusted endpoint to ensure strong authentication and knowledge of who is accessing the cloud service and the data it hosts. In fact, a trusted endpoint is part of the solution to all of these issues. One solution involves the Trusted Platform Module (TPM), a widely available security chip that already resides in most business PCs. This standards-based hardware component provides stronger security than software-only approaches, which can be stolen, hacked, impersonated or worse, causing security breaches and business disruption.

Hardware-Based Trust
Developed by the Trusted Computing Group (TCG), the TPM is a standard for providing a hardware-based root of trust for computing and other systems. Unlike other hardware security tokens available as USB, key fob and smart card products, the standards-based TPM is typically an application-specific integrated circuit (ASIC) available from multiple sources, ensuring a highly competitive, readily available component that is installed in the computer when it is manufactured. TPM capability can also be integrated into chipsets, Ethernet controllers and the like. To date, an estimated 200-300 million TPMs based on these standards have shipped in enterprise PCs.

The open, standards-based approach provides a low-cost, flexible solution from multiple sources, with a high adoption rate and extensive support from more than 100 technology companies across the globe. Several national governments have already adopted, or plan to adopt, the TPM as their standard for authentication. If these governments pursue cloud services, they will not proceed without a similarly strong authentication methodology - and they will not implement proprietary technology.

Although it resides inside the computer, the TPM must be activated by the user before it can improve security. Once the TPM is activated, users can more securely store and manage keys and passwords as well as easily encrypt files, folders and email. The TPM allows multi-factor authentication to be rooted in hardware rather than in a simple software password. For multi-factor authentication requirements, the TPM complements fingerprint readers, PKI, certificates and smart cards.
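
To make that concrete, the following is a minimal sketch of storing a credential under the TPM's protection rather than in a software file. It assumes a Linux machine with an activated TPM 2.0 and the open-source tpm2-tools utilities installed; the file names and the sample secret are illustrative only, not part of any particular product.

# Minimal sketch: sealing a secret to the TPM with tpm2-tools (assumed installed,
# with an activated TPM 2.0). File names and the secret below are illustrative only.
import subprocess

def run(cmd):
    # Run a tpm2-tools command and fail loudly if the TPM rejects it.
    subprocess.run(cmd, check=True)

# The secret to protect (in practice this would never sit on disk in the clear).
with open("secret.txt", "w") as f:
    f.write("cloud-service-credential")

# 1. Create a primary key in the TPM owner hierarchy.
run(["tpm2_createprimary", "-C", "o", "-c", "primary.ctx"])

# 2. Seal the secret under that primary key; the private blob is usable only by this TPM.
run(["tpm2_create", "-C", "primary.ctx", "-i", "secret.txt",
     "-u", "seal.pub", "-r", "seal.priv"])

# 3. Later, load the sealed object and unseal it on the same machine.
run(["tpm2_load", "-C", "primary.ctx", "-u", "seal.pub",
     "-r", "seal.priv", "-c", "seal.ctx"])
recovered = subprocess.run(["tpm2_unseal", "-c", "seal.ctx"],
                           check=True, capture_output=True).stdout
print(recovered.decode())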

Dual-Level Identification & Authentication
As a hardware component inside the computer, the TPM allows identification of the machine as well as the user. This elevates the sign-on process to two-factor, or dual-level, authentication: the first factor is the user, through a password, and the second is the machine itself, through the machine's authentication to the service. The approach provides a significant increase in security, especially when compared to single-factor, software-only approaches.

Because the TPM provides secure storage for a credential or other critical piece of information, it also supports multi-user machines: multiple users employ the machine's TPM for machine authentication, and each user authenticates to the machine. A user could have a single sign-on to different services but would still have to authenticate to the machine. Once the user has authenticated to the machine, the machine can release the appropriate credentials.

The layers of authentication, where the machine is known by the cloud and the machine has a relationship with the user, significantly enhance cloud security. This process provides a chain of trust for machine-to-machine connectivity so that only known machines and known users obtain access to applications and data. Implementing this dual-level authentication solves one of the most frequently discussed cloud issues - specifically knowing the users and machines that are connected to the cloud service.
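
As an illustration of this chain, here is a hypothetical, simplified sketch of the check a cloud service could apply before granting access. The machine registry, the user table and the HMAC used as a stand-in for the signature a TPM-resident key would produce are all assumptions made for the example; a real deployment would verify an asymmetric signature from a key that never leaves the TPM.

# Hypothetical sketch of a dual-level (machine + user) check at the cloud service.
# An HMAC stands in for the signature a TPM-resident key would produce; all names
# and data structures here are illustrative, not taken from any real product.
import hmac, hashlib, secrets

# Registered machines: machine_id -> key material enrolled when the TPM was provisioned.
KNOWN_MACHINES = {"laptop-0042": b"enrolled-machine-key"}
# Users authorized to use each machine.
MACHINE_USERS = {"laptop-0042": {"alice"}}

def issue_challenge():
    # The service sends a fresh nonce so a captured response cannot be replayed.
    return secrets.token_bytes(16)

def machine_response(machine_key, nonce):
    # On the client, the TPM would sign the nonce; HMAC is a stand-in here.
    return hmac.new(machine_key, nonce, hashlib.sha256).digest()

def grant_access(machine_id, nonce, response, user_id, user_authenticated):
    key = KNOWN_MACHINES.get(machine_id)
    if key is None:
        return False                                   # unknown machine
    if not hmac.compare_digest(response, machine_response(key, nonce)):
        return False                                   # machine proof failed
    if not user_authenticated or user_id not in MACHINE_USERS.get(machine_id, set()):
        return False                                   # unknown user on this machine
    return True                                        # both levels verified

nonce = issue_challenge()
resp = machine_response(KNOWN_MACHINES["laptop-0042"], nonce)
print(grant_access("laptop-0042", nonce, resp, "alice", True))    # True
print(grant_access("laptop-0042", nonce, resp, "mallory", True))  # False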

Integrating Solutions
The TPM's hardware-based security can easily integrate with software identity (ID) standards for federated ID management such as OpenID, Security Assertion Markup Language (SAML), WS-Federation (Web Services Federation) and others. For example, OpenID is an industry-standard protocol for communication and authentication at the service level, and more than 20,000 service providers support OpenID as a software token. Figure 1 shows the process for cloud access based on two-level authentication and an identity service that uses OpenID.

Figure 1: Improved cloud security requires a hardware-based TPM token to ensure machine integrity.

A cloud service can use the TPM by binding an OpenID credential to the TPM for strong machine authentication to the service. This provides not only the identity of the client but can also provide a health measure: the service can measure the health of the client, verify the relationship, and then grant the appropriate level of access. The approach also allows multiple users per client, because the machine has a relationship with the service and multiple users have relationships with the machine. The process extends the security approach across different use models.
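
A simplified sketch of that service-side decision might look like the following. The client identifiers, the "known good" measurement and the helper names are assumptions made for illustration; in practice the health value would come from signed TPM measurements and the identity from the TPM-bound OpenID credential described above.

# Hypothetical sketch: a service grants access only when the client's identity is known
# AND its reported health measurement matches a known-good value. Values and names are
# illustrative; real health data would come from signed TPM measurements.
import hashlib

# Reference ("known good") measurement the service expects from compliant clients,
# e.g. a hash over approved boot components. The input string is made up.
KNOWN_GOOD = hashlib.sha256(b"approved-bios+bootloader+os-build-1234").hexdigest()

# Clients whose TPM-bound OpenID identity has been enrolled with the service.
ENROLLED_CLIENTS = {"client-7f3a"}

def access_decision(client_id, reported_measurement):
    if client_id not in ENROLLED_CLIENTS:
        return "deny: unknown client"
    if reported_measurement != KNOWN_GOOD:
        return "deny: client failed health check"
    return "allow: known client in a known-good state"

healthy = hashlib.sha256(b"approved-bios+bootloader+os-build-1234").hexdigest()
tampered = hashlib.sha256(b"approved-bios+rootkit+os-build-1234").hexdigest()
print(access_decision("client-7f3a", healthy))    # allow
print(access_decision("client-7f3a", tampered))   # deny: failed health check
print(access_decision("client-9999", healthy))    # deny: unknown client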

SAML, WS-Federation and other software identification standards can be implemented in a similar manner for establishing a known identity for cloud access. The essential element for sharing an identity is a trustworthy endpoint. In all of these cases, the TPM provides the base level, the foundation of trust.

The hardware-based security model described here supports private/internal, public/external and hybrid cloud combinations as well as traditional IT-managed networks. This allows the security technology, including corporate client policies, to be reused, minimizing costs. The alternative is continuing on the same path of software-only authentication. However, software in the computer can be hacked, impersonated, stolen, or victimized by the very malicious software it is attempting to stop - the same model that is already broken inside the enterprise today.

Once hardware-based access is implemented for machine authentication, it enables the use of other industry standards-based tools such as self-encrypting drives (SEDs) and Trusted Network Connect (TNC). These added tools can address the data leakage that may result from storing data retrieved from the cloud on a user's client. The journey toward higher cloud security begins with the first step of establishing a foundation by implementing the hardware-based security of the TPM.

More Stories By Brian Berger

Brian Berger is an executive vice president for Wave Systems Corp. He manages the business, strategy and marketing functions, including product management, marketing and sales direction, for the company. He has been involved in security products for a number of years, including work with embedded hardware, client/server applications, PKI and biometrics. He has worked in the computer industry for 20 years and has held several senior-level positions in multinational companies. Berger holds two patents and has two patents pending for security products and commerce transaction capabilities using security technology. He holds a Bachelor of Arts degree from California State University, Northridge and attended Harvard Business School, Executive Education.

As the representative of Promoter member Wave Systems, Berger sits on the TCG board of directors and chairs the Trusted Computing Group's marketing work group, where he leads strategy and implementation of the organization's marketing and communications programs. He has spoken at a number of RSA Conferences, IT Roadmap events, CTIA and other industry events.

