The TPM, Trust and the Cloud: Making Trust Real

The journey towards higher cloud security begins with the first step of establishing a foundation

Cloud computing is one of the hottest trends in the information technology world, so how do the requirements for cloud security relate to the security concerns that enterprises were already addressing before the uptick in cloud computing? While the cloud is a relatively new area of concern, improved security has been a hot button for organizations of all sizes for well over a decade.

The cloud continues to evolve with various services and models, including externally hosted public cloud services such as Amazon's; cloud services hosted by an enterprise or a third party, either outside or inside the firewall (private cloud services); and hybrids of the private/internal and public/external models. On the surface, the different models might appear to have different privacy, access control and data protection issues. However, compliance issues and leakage of business data out of any of these clouds pose problems for an organization similar to the ones that existed before working in the cloud.

No matter which cloud services or model is pursued, security improvements should be among the criteria on the "must have" list. Issues to consider include hardware- versus software-based security, hardware activation, known users, known machines, access to both data and application services, data protection and compliance, and enforcement of the service provider's agreement for controlled user access. For data leakage and access control, authorization to access information from the cloud, whether it is an application or an application with data, requires a trusted endpoint to ensure strong authentication and knowledge of who is accessing the cloud service and the data it hosts. In fact, a trusted endpoint is part of the solution for addressing all of these issues. One solution involves the Trusted Platform Module (TPM), a widely available security chip that already resides in most business PCs. This standards-based hardware component provides stronger security than software-only approaches, which can be stolen, hacked or impersonated, causing security breaches and business disruption.

Hardware-Based Trust
Developed by the Trusted Computing Group (TCG), the TPM is a standard for providing a hardware-based root of trust for computing and other systems. Unlike other hardware token security tools available in USB, key fob and smart card form factors, the standards-based TPM is typically an application-specific integrated circuit (ASIC) available from multiple sources, ensuring a competitive, readily available component that is installed in the computer at the time of manufacture. TPM capability can also be integrated into chipsets, Ethernet controllers and the like. To date, an estimated 200 to 300 million TPMs have shipped in enterprise PCs based on these standards.

The open standards-based approach provides a low-cost, flexible solution from multiple sources, with a high adoption rate and extensive support from over 100 technology companies across the globe. Governments around the world have already adopted or plan to adopt the TPM as the standard for authentication. If these governments pursue cloud services, they will not proceed without a similarly strong authentication methodology - and they will not implement proprietary technology.

Although the TPM ships inside the computer, it must be activated by the user before it can improve security. Once the TPM is activated, users can more securely store and manage keys and passwords as well as easily encrypt files, folders and email. The TPM allows multi-factor authentication to be rooted in hardware rather than in a simple software password. For multi-factor authentication requirements, the TPM complements fingerprint readers, PKI, certificates and smart cards.
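
To make the hardware-rooted, two-factor model concrete, here is a minimal sketch in Python of a challenge-response login that combines something the user knows (a password) with something the machine has (a key). It is illustrative only: the machine key is generated in software here, whereas a real deployment would create and hold it inside the TPM, and all names are hypothetical.

    # Conceptual sketch of TPM-style two-factor authentication.
    # The "machine key" is a software stand-in; on a real platform it
    # would be created inside the TPM and never leave the chip.
    import hashlib
    import hmac
    import secrets

    # Hypothetical machine key; a real TPM would hold this internally.
    MACHINE_KEY = secrets.token_bytes(32)

    def respond_to_challenge(challenge: bytes, user_password: str) -> bytes:
        """Combine both factors: the user's password and the
        machine-resident key, over a server-issued challenge."""
        password_digest = hashlib.sha256(user_password.encode()).digest()
        # A keyed MAC over the challenge proves possession of the machine key.
        mac = hmac.new(MACHINE_KEY, challenge + password_digest, hashlib.sha256)
        return mac.digest()

    # The service issues a fresh challenge per login to prevent replay.
    challenge = secrets.token_bytes(16)
    response = respond_to_challenge(challenge, "correct horse battery staple")
    print(response.hex())

A software-only password could be phished and replayed from any machine; here the response is useless without the key held by this specific machine.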

Dual-Level Identification & Authentication
As a hardware item inside the computer, the TPM allows the identification of the machine as well as the user.

This elevates the sign-on process to two-factor, dual-level authentication: the first factor is the user's password; the second is the machine itself, authenticating to the service. The approach provides a significant increase in security, especially when compared to single-factor, software-only approaches.

With its secure storage for a credential or other critical piece of information, the TPM also supports multi-user operation: multiple users rely on the machine's TPM for machine authentication, and each user authenticates to the machine. A user could have a single sign-on to different services but would still have to authenticate to the machine. Once the user has authenticated to the machine, the machine can release credentials.
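
The following sketch shows that authorize-then-release pattern in Python. The in-memory store, the PBKDF2 parameters and all names are illustrative assumptions; on a real platform the credentials would be sealed by the TPM itself rather than held in application memory.

    # Sketch of per-user credential release, gated on user-to-machine
    # authentication. A dict stands in for TPM-sealed storage.
    import hashlib
    import hmac
    import os

    class MachineCredentialStore:
        def __init__(self):
            self._users = {}        # username -> (salt, password hash)
            self._credentials = {}  # username -> stored service credential

        def enroll(self, user: str, password: str, credential: bytes) -> None:
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            self._users[user] = (salt, digest)
            self._credentials[user] = credential

        def release_credential(self, user: str, password: str) -> bytes:
            """Release the credential only after the user authenticates to
            the machine, mirroring the TPM's authorize-then-release model."""
            salt, stored = self._users[user]
            attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            if not hmac.compare_digest(stored, attempt):
                raise PermissionError("user failed to authenticate to the machine")
            return self._credentials[user]

    store = MachineCredentialStore()
    store.enroll("alice", "hunter2", b"service-api-token")
    print(store.release_credential("alice", "hunter2"))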

The layers of authentication, where the machine is known by the cloud and the machine has a relationship with the user, significantly enhance cloud security. This process provides a chain of trust for machine-to-machine connectivity so that only known machines and known users obtain access to applications and data. Implementing this dual-level authentication solves one of the most frequently discussed cloud issues - specifically knowing the users and machines that are connected to the cloud service.
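
The service-side check is simple to express. The sketch below uses hypothetical enrollment registries and shows only the dual-level access logic; machine attestation and user authentication are presumed to have already happened upstream.

    # Sketch of the service-side "chain of trust" check: access requires
    # both a known machine and a known user on that machine. Plain sets
    # stand in for a real enrollment database.
    KNOWN_MACHINES = {"machine-fingerprint-a1b2"}
    KNOWN_USERS = {("machine-fingerprint-a1b2", "alice")}

    def grant_access(machine_id: str, user: str) -> bool:
        # Both layers must hold: the machine is enrolled, and the user
        # has a registered relationship with that specific machine.
        return machine_id in KNOWN_MACHINES and (machine_id, user) in KNOWN_USERS

    assert grant_access("machine-fingerprint-a1b2", "alice")
    assert not grant_access("machine-fingerprint-a1b2", "mallory")
    assert not grant_access("unknown-machine", "alice")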

Integrating Solutions
The TPM's hardware-based security can easily integrate with software identification (ID) standards for federated ID management such as OpenID, Security Assertion Markup Language (SAML), WS-Federation (Web Services Federation) and others. For example, OpenID is an industry-standard protocol for authentication at the service level. Over 20,000 service providers support OpenID as a software token. Figure 1 shows the process for cloud access based on the two-level authentication and an identity service that uses OpenID.

Figure 1: Improved cloud security requires a hardware-based TPM token to ensure machine integrity.
Source: id.wave.com.

A cloud service can bind an OpenID credential to the TPM for strong machine authentication to the service. This provides not only the identity of the client but can also provide a health measurement: the service can measure the health of the client, verify the relationship, and then grant the appropriate access. This approach also allows multiple users per client, because the machine has a relationship with the service and multiple users have relationships with the machine. The process extends the security approach across different use models.
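
The sketch below illustrates one way such a binding could work, using a shared MAC key as a stand-in for the TPM's asymmetric attestation key. The token format, the health digest and the key handling are all simplifying assumptions, not the actual OpenID or TPM protocol.

    # Sketch of binding an identity token to a machine and attaching a
    # health measurement, loosely modeled on TPM quote-and-verify.
    import hashlib
    import hmac
    import secrets

    MACHINE_KEY = secrets.token_bytes(32)  # would live inside the TPM
    EXPECTED_HEALTH = hashlib.sha256(b"known-good-boot-chain").digest()

    def client_sign_in(identity_token: bytes, health_digest: bytes) -> bytes:
        # Bind the token to this machine by MACing token + health measurement.
        return hmac.new(MACHINE_KEY, identity_token + health_digest,
                        hashlib.sha256).digest()

    def service_verify(identity_token: bytes, health_digest: bytes,
                       proof: bytes) -> bool:
        # Verify the machine binding, then check the reported platform health.
        expected = hmac.new(MACHINE_KEY, identity_token + health_digest,
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, proof) and health_digest == EXPECTED_HEALTH

    token = b"openid-assertion-for-alice"
    proof = client_sign_in(token, EXPECTED_HEALTH)
    print(service_verify(token, EXPECTED_HEALTH, proof))  # True

In a real deployment the service would verify a signature with the machine's public attestation key rather than sharing a secret, but the flow - identity plus health, bound to one machine - is the same.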

SAML, WS-Federation and other software identification standards can be implemented in a similar manner for establishing a known identity for cloud access. The essential element for sharing an identity is a trustworthy endpoint. In all of these cases, the TPM provides the base level, the foundation of trust.

Summary
The hardware-based security model described here supports private/internal, public/external and hybrid cloud combinations as well as traditional IT-managed networks. This allows reuse of the security technology, including corporate client policies, to minimize costs. The alternative is continuing on the same path of software-only authentication. However, software in the computer can be hacked, impersonated, stolen or victimized by the very malicious software it is attempting to stop. This is the same model that is broken inside the enterprise today.

Once hardware-based access is implemented for machine authentication, it enables the use of other industry standards-based tools such as self-encrypting drives (SEDs) and Trusted Network Connect (TNC). These added tools can address the data leakage that may result from storing data accessed from the cloud on a user's client. The journey toward higher cloud security begins with the first step of establishing a foundation by implementing the hardware-based security of the TPM.

More Stories By Brian Berger

Brian Berger is an executive vice president for Wave Systems Corp. He manages the company's business, strategy and marketing functions, including product management, marketing and sales direction. He has been involved in security products for a number of years, including work with embedded hardware, client/server applications, PKI and biometrics. He has worked in the computer industry for 20 years and has held several senior-level positions in multinational companies. Berger holds two patents and has two pending patents for security products and commerce transaction capabilities using security technology. He holds a Bachelor of Arts degree from California State University, Northridge and attended Executive Education at Harvard Business School.

Berger represents Wave, a TCG Promoter member, on the TCG board of directors and chairs the Trusted Computing Group's marketing work group, where he leads strategy and implementation of the organization's marketing and communications programs. He has spoken at a number of RSA Conferences, IT Roadmap events, CTIA and other industry events.
