
The Evolution of Single Sign-on

Replacing mainframes with 21st century identity

By Paul Madsen, senior technical architect

The concept of single sign-on (SSO) is not a new one, and over the years it has successfully bridged the gap between security and productivity for organizations all over the globe.

Allowing users to authenticate once to gain access to enterprise applications improves access security and user productivity by reducing the need for passwords.

In the days of mainframes, SSO was used to help maintain productivity and security from inside the protection of firewalls. As organizations moved to custom-built authentication systems in the 1990s, it became recognized as enterprise SSO (ESSO) and later evolved into browser-plugin and web-proxy methods known as web access management (WAM). IT’s focus was on integrating applications exclusively within the network perimeter.

However, as enterprises shifted toward cloud-based services at the turn of the century and software-as-a-service (SaaS) applications became more prevalent, domain-based SSO mechanisms began to break down. This shift created a new need for secure connections to multiple applications outside the enterprise perimeter and transformed the perception of SSO.

Large-scale Internet providers like Facebook and Google also created a need for consumer-facing SSO, which did not previously exist.

Prior to these social networks, SSO was used only within the enterprise; new technology had to be created to meet the demands of businesses as well as to securely authenticate billions of Internet users.

There are many SSO options available today that fit all types of use cases for the enterprise, business and consumer, and they can be divided into three tiers, with Tier 1 SSO being the strongest and most advanced of the trio. Tier 1 SSO offers maximum security when moving to the cloud, the highest convenience to all parties, the highest reliability as browsers and web applications go through revisions, and generally the lowest total cost of ownership. Tier 2 SSO is the mid-level offering meant for enterprises with a cloud-second strategy. Tier 3 SSO offers the least security and is generally used by small businesses moving to the cloud outside of high-security environments.

The defining aspect of Tier 1 SSO is that authentication is driven by standards-based token exchange, while user directories remain in place within the centrally administered domain rather than being synchronized externally. Standards such as SAML (Security Assertion Markup Language), OpenID Connect and OAuth have allowed this new class of SSO to emerge for the cloud generation. Standards are important because they provide a framework that promotes consistent authentication of identity, one that government agencies can profile and endorse to ensure security.
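To make the token-exchange idea concrete, the sketch below shows a relying party completing an OpenID Connect authorization-code exchange in Python. It is a minimal illustration only, not part of the original article or any particular vendor's API: the endpoint URL, client credentials and redirect URI are hypothetical placeholders, and a production relying party would also validate the ID token's signature, issuer and audience.

```python
# Minimal sketch of an OpenID Connect authorization-code token exchange.
# All endpoints and credentials below are hypothetical placeholders.
import base64
import json

import requests

TOKEN_ENDPOINT = "https://idp.example.com/oauth2/token"   # placeholder IdP endpoint
CLIENT_ID = "example-client-id"                            # placeholder registration
CLIENT_SECRET = "example-client-secret"
REDIRECT_URI = "https://app.example.com/callback"


def exchange_code_for_tokens(authorization_code: str) -> dict:
    """Swap the one-time authorization code for an ID token and access token."""
    response = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": authorization_code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # typically contains id_token, access_token, expires_in


def peek_at_id_token_claims(id_token: str) -> dict:
    """Decode the ID token payload for illustration only.
    A real relying party must verify the JWT signature, issuer and audience."""
    payload_b64 = id_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))


if __name__ == "__main__":
    tokens = exchange_code_for_tokens("code-returned-by-the-idp")
    claims = peek_at_id_token_claims(tokens["id_token"])
    print(claims.get("iss"), claims.get("sub"))  # who asserted the identity, and for whom
```

The point of the sketch is that the application never handles the user's password; it receives only a token asserting whom the identity provider authenticated, which is what lets directories stay in place while access extends beyond the perimeter.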

These standards have become such a staple in the authentication industry that government agencies like the United States Federal CIO Council, NIST (National Institute of Standards and Technology) and Industry Canada have created programs to ensure these standards are viable, robust, reliable, sustainable and interoperable as documented.

The Federal CIO Council has created the Identity, Credential, and Access Management (ICAM) committee to define a process by which the government profiles identity management standards, incorporating the government’s security and privacy requirements to ensure secure and reliable processes.

The committee created the Federal Identity, Credential, and Access Management (FICAM) roadmap to provide agencies with architecture and implementation guidance that addresses security problems, concerns and best practices. Industry Canada’s Authentication Principles Working Group created the Principles for Electronic Authentication, which were designed to serve as benchmarks for the development, provision and use of authentication services in Canada.

As enterprises continue to adopt cloud-based technologies outside of their network perimeter, the need for reliable SSO solutions becomes more vital. Vendors that support these government-issued guidelines offer the strongest and most secure access management available today. Since the establishment of SSO, technological capabilities have greatly advanced and SSO has been forced to evolve over the past few decades. First-generation SSO solutions were not faced with Internet scale or access from outside the network, whereas today’s SSO is up against many more obstacles.

As IT progresses, SSO will have to grow with it and strengthen its security. For instance, while SSO is the expectation for web browser applications, the emergence of native applications (downloaded and installed onto mobile devices) has highlighted the need for a similar SSO experience for this class of applications. To address these new use cases, new standards (or profiles of existing standards) are emerging, and initiatives like the Principles for Electronic Authentication will have to adapt accordingly in order to offer the best guidance possible.

