
The TPM, Trust and the Cloud: Making Trust Real

The journey towards higher cloud security begins with the first step of establishing a foundation

Cloud computing is one of the hottest trends in the information technology world, so how do the requirements for cloud security relate to the security concerns that enterprises were already addressing before the uptick in cloud computing? While the cloud is a relatively new area of concern, improved security has been a hot button for organizations of all sizes for well over a decade.

The cloud continues to evolve with various services and models, including externally hosted public cloud services such as Amazon's; cloud services hosted independently by an enterprise or a third party, either outside or inside the firewall (private cloud services); and hybrids of the private/internal and public/external models. On the surface, the different models might appear to have different privacy, access control and data protection issues. However, compliance issues and leakage of business data out of any of these clouds pose problems for an organization similar to those that existed before it worked in the cloud.

No matter which cloud service or model is pursued, security improvements should be among the criteria on the "must have" list. Issues to consider include: hardware- versus software-based security, hardware activation, known users, known machines, access to both data and application services, data protection and compliance, and enforcement of the service provider's agreement for controlled user access. For data leakage and access control, authorization to access information from the cloud, whether it is an application alone or an application with data, requires a trusted endpoint to ensure strong authentication and knowledge of who is accessing the cloud service and the data it hosts. In fact, a trusted endpoint is part of the solution for addressing all of these issues. One solution involves implementing the Trusted Platform Module (TPM), a widely available security chip that already resides in most business PCs. This standards-based hardware component provides stronger security than software-only approaches, which can be stolen, hacked or impersonated, causing security breaches and business disruption.

Hardware-Based Trust
Developed by the Trusted Computing Group (TCG), the TPM is a standard for providing a hardware-based root of trust for computing and other systems. Unlike other hardware security tokens such as USB keys, key fobs and smart cards, the standards-based TPM is typically an application-specific integrated circuit (ASIC), available from multiple sources to ensure a highly competitive, readily available component, and installed in the computer at manufacture. TPM capability can also be integrated into chipsets, Ethernet controllers and the like. To date, an estimated 200-300 million TPMs have shipped in enterprise PCs based on these standards.

The open, standards-based approach provides a low-cost, flexible solution from multiple sources, with a high adoption rate and extensive support from more than 100 technology companies across the globe. Governments around the world have already adopted, or plan to adopt, the TPM as their standard for authentication. If these governments pursue cloud services, they will not proceed without a similarly strong authentication methodology - and they will not implement proprietary technology.

Although the TPM resides inside the computer, it must be activated by the user before it can improve security. Once the TPM is activated, users can more securely store and manage keys and passwords, as well as easily encrypt files, folders and email. The TPM allows multi-factor authentication to be rooted in hardware rather than in a simple software password. For multi-factor authentication requirements, the TPM complements fingerprint readers, PKI certificates and smart cards.
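To make the contrast concrete, here is a minimal Python sketch of hardware-rooted multi-factor authentication. The HardwareKey class and its sealed secret are hypothetical stand-ins for a TPM-held, non-exportable key (real code would go through a TPM software stack); only the password handling uses real standard-library calls.

    import hashlib
    import hmac
    import os
    import secrets

    class HardwareKey:
        """Stand-in for a TPM-held key: in real hardware the secret never
        leaves the chip, and software only requests sign/verify operations."""
        def __init__(self):
            self._secret = os.urandom(32)  # hypothetical: sealed inside the TPM

        def sign(self, challenge: bytes) -> bytes:
            return hmac.new(self._secret, challenge, hashlib.sha256).digest()

        def verify(self, challenge: bytes, signature: bytes) -> bool:
            return hmac.compare_digest(self.sign(challenge), signature)

    # Factor 1: something the user knows (a password, hashed with PBKDF2).
    salt = os.urandom(16)
    stored = hashlib.pbkdf2_hmac("sha256", b"user-password", salt, 100_000)

    def check_password(attempt: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", attempt, salt, 100_000)
        return hmac.compare_digest(candidate, stored)

    # Factor 2: something the machine has (the hardware-bound key).
    machine_key = HardwareKey()
    challenge = secrets.token_bytes(32)      # fresh nonce from the service
    signature = machine_key.sign(challenge)  # computed inside the "TPM"

    granted = check_password(b"user-password") and machine_key.verify(challenge, signature)
    print("access granted" if granted else "access denied")

Unlike the password, the hardware-bound key cannot be copied off the machine, which is what lifts the second factor above software-only protection.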

Dual-Level Identification & Authentication
As a hardware item inside the computer, the TPM allows the identification of the machine as well as the user.

This elevates the sign-on process to two-factor, dual-level authentication: the user authenticates with a password, and the machine authenticates itself to the service. The approach provides a significant increase in security, especially when compared to single-factor, software-only approaches.

With its secure storage for credentials and other critical information, the TPM also enables a multi-user model in which several users employ the machine's TPM for machine authentication while each user authenticates to the machine. A user could have a single sign-on to different services but would still have to authenticate to the machine; once the user has done so, the machine can release the stored credentials.
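A minimal sketch of that release flow follows, with hypothetical names throughout: the in-memory vault stands in for credentials a real TPM would seal to the chip and to platform state, and only the password hashing uses real standard-library calls.

    import hashlib
    import hmac
    import os

    class TrustedEndpoint:
        """A multi-user machine that releases a stored service credential
        only after the requesting user authenticates to it."""
        def __init__(self):
            self._users = {}  # username -> (salt, password hash)
            self._vault = {}  # username -> stored service credential

        def enroll(self, user: str, password: bytes, credential: bytes) -> None:
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
            self._users[user] = (salt, digest)
            self._vault[user] = credential  # a real TPM would seal this

        def release_credential(self, user: str, password: bytes) -> bytes:
            salt, digest = self._users[user]
            attempt = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
            if not hmac.compare_digest(attempt, digest):
                raise PermissionError("user failed to authenticate to the machine")
            return self._vault[user]  # machine releases the stored credential

    endpoint = TrustedEndpoint()
    endpoint.enroll("alice", b"correct-horse", b"sso-token-for-alice")
    print(endpoint.release_credential("alice", b"correct-horse"))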

The layers of authentication, where the machine is known by the cloud and the machine has a relationship with the user, significantly enhance cloud security. This process provides a chain of trust for machine-to-machine connectivity so that only known machines and known users obtain access to applications and data. Implementing this dual-level authentication solves one of the most frequently discussed cloud issues - specifically knowing the users and machines that are connected to the cloud service.

Integrating Solutions
The TPM's hardware-based security can easily integrate with software security identification (ID) standards for federated ID management, such as OpenID, Security Assertion Markup Language (SAML), WS-Federation (Web Services Federation) and others. For example, OpenID is an industry-standard protocol for authentication at the service level, and more than 20,000 service providers support OpenID as a software token. Figure 1 shows the process for cloud access based on the two-level authentication and an identity service that uses OpenID.

Figure 1: Improved cloud security requires a hardware-based TPM token to ensure machine integrity. (Source: id.wave.com)

A cloud service can bind an OpenID certificate to the TPM for strong machine authentication to the service. This provides not only the identity of the client but can also provide a health measurement: the service can measure the health of the client, verify the relationship, and then grant the appropriate access. The model also allows multiple users per client, because the machine has a relationship with the service and multiple users have relationships with the machine. The process extends the security approach across different use models.
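The service-side check might look like the following sketch, with all names hypothetical: the shared verification secret stands in for the public half of the TPM-bound credential, and the health value stands in for a platform measurement such as approved PCR values.

    import hashlib
    import hmac
    import os
    import secrets

    # Per enrolled machine, the service keeps a verification secret (standing in
    # for the public half of a TPM-bound OpenID credential) and the expected
    # health measurement (standing in for approved platform/PCR values).
    enrolled = {
        "machine-42": {
            "key": os.urandom(32),
            "expected_health": hashlib.sha256(b"approved-boot-config").digest(),
        }
    }

    def client_response(machine_id: str, nonce: bytes):
        """What the client returns: its measured health plus a MAC over the
        nonce and health. Shown inline for the demo; on a real client the
        TPM would compute this with its non-exportable key."""
        key = enrolled[machine_id]["key"]
        health = hashlib.sha256(b"approved-boot-config").digest()
        tag = hmac.new(key, nonce + health, hashlib.sha256).digest()
        return health, tag

    def service_grants_access(machine_id: str) -> bool:
        record = enrolled.get(machine_id)
        if record is None:
            return False  # unknown machine: no access
        nonce = secrets.token_bytes(32)  # fresh challenge defeats replay
        health, tag = client_response(machine_id, nonce)
        expected = hmac.new(record["key"], nonce + health, hashlib.sha256).digest()
        return (hmac.compare_digest(tag, expected)  # known machine
                and hmac.compare_digest(health, record["expected_health"]))  # healthy

    print(service_grants_access("machine-42"))  # True: enrolled and healthy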

SAML, WS-Federation and other software identification standards can be implemented in a similar manner to establish a known identity for cloud access. The essential element for sharing an identity is a trustworthy endpoint; in all of these cases, the TPM provides the base level, the foundation of trust.

The hardware-based security model described here supports private/internal, public/external and hybrid cloud combinations, as well as traditional IT-managed networks. This allows the security technology, including corporate client policies, to be reused, minimizing costs. The alternative is to continue down the same path of software-only authentication. However, software in the computer can be hacked, impersonated, stolen, or victimized by the very malicious software it is attempting to stop - the same model that is already broken inside the enterprise today.

Once hardware-based machine authentication is implemented, it enables the use of other industry standards-based tools such as self-encrypting drives (SEDs) and Trusted Network Connect (TNC). These added tools can address the data leakage that may result from storing data accessed from the cloud on a user's client. The journey toward higher cloud security begins with the first step: establishing a foundation by implementing the hardware-based security of the TPM.

More Stories By Brian Berger

Brian Berger is an executive vice president for Wave Systems Corp. He manages the business, strategy and marketing functions, including product management, marketing and sales direction for the company. He has been involved in security products for a number of years, including work with embedded hardware, client/server applications, PKI and biometrics. He has worked in the computer industry for 20 years and has held several senior-level positions in multinational companies. Berger holds two patents and has two patents pending for security products and commerce transaction capabilities using security technology. He holds a Bachelor of Arts degree from California State University, Northridge, and attended Harvard Business School, Executive Education.

Berger is Wave's representative to the TCG board of directors (Wave is a Promoter member) and chairs the Trusted Computing Group's marketing work group, where he leads strategy and implementation of the organization's marketing and communications programs. He has spoken at a number of RSA Conferences, the IT Roadmap events, CTIA and a number of other industry events.
