Switching the Locks: Who Has Copies of Your SSH Keys?

Organizations are constantly leaving themselves open to security breaches and noncompliance with federal regulations

Despite the recent flood of high profile network breaches, hacking attempts are hardly new. In 1995, I was attending school in Helsinki when I discovered a password "sniffer" attack in our university network. In response, I wrote a program called the "secure shell" to safeguard information as it traveled from point to point within the network. This new program shielded all of our data and ensured that these kinds of attacks didn't jeopardize our logins.

This program, SSH, works by generating an encryption key pair - one key installed on the server and its counterpart held on the user's computer - and encrypting the data transferred between the two machines. Currently, almost every major network environment - including those in large enterprises, financial institutions and governments - uses a version of SSH to protect data in transit and let administrators operate systems remotely. Organizations use SSH to encrypt everything from health records to logins, financial data and other personal information.
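To make the key pair concrete: a user's public key is installed on each server as a one-line entry in an authorized_keys file, and every such line is a grant of access that key management must account for. The following is a minimal illustrative sketch, not part of SSH itself, that parses a simple entry (it assumes no option prefixes and uses a synthetic, non-functional key for demonstration):

```python
import base64
import struct

def parse_authorized_key(line):
    """Parse a simple authorized_keys line into (key_type, blob, comment).

    Assumes the entry has no leading options field. The base64 blob begins
    with a length-prefixed copy of the key type, which must match the
    leading text field.
    """
    fields = line.strip().split(None, 2)
    key_type, b64_blob = fields[0], fields[1]
    comment = fields[2] if len(fields) > 2 else ""
    blob = base64.b64decode(b64_blob)
    (type_len,) = struct.unpack(">I", blob[:4])
    embedded_type = blob[4:4 + type_len].decode("ascii")
    if embedded_type != key_type:
        raise ValueError("corrupt or mismatched key entry")
    return key_type, blob, comment

# Build a synthetic entry for demonstration (not a real, usable key).
fake_blob = (struct.pack(">I", 11) + b"ssh-ed25519"
             + struct.pack(">I", 32) + b"\x00" * 32)
entry = "ssh-ed25519 " + base64.b64encode(fake_blob).decode() + " alice@laptop"

key_type, blob, comment = parse_authorized_key(entry)
print(key_type, comment)  # ssh-ed25519 alice@laptop
```

The comment field ("alice@laptop" here) is the only human-readable hint about a key's owner, and nothing enforces its accuracy - one reason inventories drift out of date.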

Management of Keys a Low Priority
Despite the fact that SSH keys safeguard extremely sensitive information, companies have been remarkably casual about managing SSH key generation, access and location throughout their network environments. It's similar to a home security company making numerous copies of a person's house keys, scattering them in the streets and never changing the lock. The only things needed to pick up one of these keys and use it to access encrypted data are interest, time and a little know-how.

Organizations are constantly leaving themselves open to security breaches and noncompliance with federal regulations by not being more diligent about SSH key management. Many are incapable of controlling who creates keys, how many are created, or where they end up in the network after being issued, and those gaps leave them exposed to network-wide attacks.

Swept Under the Rug
The issue has remained concealed in the IT department, guarded by its vastly technical nature and frequent organizational challenges. System administrators may not appreciate or understand the full scope of the problem because they typically only see a small piece of their environment. On the other side of the company, even if executives and business managers recognize that there is an issue, they are usually too busy to evaluate its scope or possible implications.

SSH key mismanagement is as mysterious as it is widespread. Through conversations with prominent governments, financial institutions and enterprises, we have found that companies typically have anywhere from eight to more than 100 SSH keys in their environments granting access to each Unix/Linux server. Some of these keys also permit high-level root access, leaving servers vulnerable to "high-risk" insiders. These "insiders," including anyone who has ever been given server access, can use these mismanaged SSH keys to gain permanent access to production servers.
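The first step toward understanding the scale of the problem is simply counting the keys. A minimal sketch of such an inventory, assuming a hypothetical /home/&lt;user&gt;/.ssh/authorized_keys layout (demonstrated here on a throwaway directory tree):

```python
import os
import tempfile

def inventory_keys(root):
    """Walk a filesystem tree and count authorized_keys entries per file.

    Returns {path_to_authorized_keys: number_of_key_lines}, skipping
    blank lines and comments.
    """
    counts = {}
    for dirpath, _dirs, files in os.walk(root):
        if "authorized_keys" in files:
            path = os.path.join(dirpath, "authorized_keys")
            with open(path) as fh:
                keys = [ln for ln in fh
                        if ln.strip() and not ln.lstrip().startswith("#")]
            counts[path] = len(keys)
    return counts

# Demo on a temporary tree mimicking per-user .ssh directories.
root = tempfile.mkdtemp()
for user, n in [("alice", 2), ("bob", 5)]:
    d = os.path.join(root, user, ".ssh")
    os.makedirs(d)
    with open(os.path.join(d, "authorized_keys"), "w") as fh:
        fh.write("".join("ssh-ed25519 AAAA... key%d\n" % i for i in range(n)))

print(sorted(inventory_keys(root).values()))  # [2, 5]
```

Run across every server, even a crude count like this tends to surface far more keys than anyone expected - which is exactly the point.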

Mismanaged SSH Keys Give Viruses the Advantage
Each day, the probability increases of such a breach occurring. Attacks are becoming more prevalent and sophisticated, and news stories about network breaches are popping up daily. Using SSH keys as an attack vector in a virus is very easy, requiring only a few hundred lines of code. Once a virus secures successful entry, it can use mismanaged SSH keys to spread from server to server throughout the company.

Key-based trust relationships are so densely interconnected that a successful attack is extremely likely to spread through all organizational servers, especially if the virus also uses additional attack vectors to escalate privileges to "root" after breaching a server. With the high number of keys being distributed, it is likely that the virus will infect nearly all servers within minutes, including disaster recovery and backup servers that are typically also managed using such keys.
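The transitive nature of this spread can be illustrated with a short sketch (the servers and trust edges below are purely hypothetical): model each server as a node, each key-based login path as an edge, and compute everything reachable from one compromised machine.

```python
from collections import deque

def reachable(trust, start):
    """Return all servers reachable from `start` by hopping trust edges.

    `trust` maps a server to the set of servers that keys resident on it
    can log in to. Breadth-first search gives the full blast radius of a
    single compromised host.
    """
    seen = {start}
    queue = deque([start])
    while queue:
        host = queue.popleft()
        for nxt in trust.get(host, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Illustrative trust edges: compromising web1 ultimately reaches everything,
# including the backup server.
trust = {
    "web1": {"app1"},
    "app1": {"db1", "backup1"},
    "db1": {"backup1"},
}
print(sorted(reachable(trust, "web1")))  # ['app1', 'backup1', 'db1', 'web1']
```

Note that even though web1 holds no key for the backup server directly, two hops of key-based trust get an attacker there anyway.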

In the worst case scenario, a virus utilizing numerous attack vectors could spread Internet-wide, rapidly and, combined with data-destruction techniques, could corrupt enormous quantities of data.

Industry Regulations Flouted
Organizations lacking proper SSH key management protocols are not only vulnerable to security breaches, they are also out of compliance with mandatory security requirements and laws. SOX, FISMA, PCI DSS and HIPAA all require control over server access as well as the ability to revoke that access. Additionally, companies may also be disregarding internal security practices (in some cases, policies mandated by customers).

The SSH protocol and its most commonly used implementations do not create these risks. The risk stems instead from faulty processes for handling SSH keys, inadequate time and means to research the problem and develop solutions, a lack of understanding of the issue's implications, and the hesitancy of auditors to flag problems they do not have solutions for.

Clearly the issue of SSH keys being improperly managed cannot be glossed over forever. Without auditing, controlling, or terminating SSH key-based access to their IT systems and data properly, most healthcare providers, enterprises and government agencies are easy targets for an attacker.

Steps to Combat the Risks
Before steps can be taken to solve a problem, it must first be recognized as a legitimate issue. A remediation project may span multiple IT teams and will require proper endorsement and support within the company.

There are multiple steps that make up the core of the remediation project:

  • Automating key setups and key removals; eliminating human error and manual work, and reducing the number of administrators involved from hundreds to almost none.
  • Controlling what commands can be executed using each key and where the keys can be used from.
  • Enforcing proper protocols for establishing keys and other key operations.
  • Monitoring the environment in order to determine which keys are actively in use and removing keys that are no longer being used.
  • Rotating keys, i.e., switching out every authorized key (and corresponding identity keys) on a regular basis, so that any compromised (copied) keys stop working.
  • Unearthing all current trust-relationships (who has access to what).
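The monitoring and removal steps above can be sketched in a few lines. This is an illustrative fragment, not a product implementation: it assumes monitoring (for example, from authentication logs) has already produced a last-used timestamp per key fingerprint, and flags keys idle beyond a chosen window for removal.

```python
from datetime import datetime, timedelta

def stale_keys(last_used, now, max_idle_days=90):
    """Return fingerprints of keys not used within `max_idle_days` of `now`.

    `last_used` maps key fingerprint -> datetime of the last observed
    login with that key; None means the key has never been seen in use.
    Both kinds are candidates for removal.
    """
    cutoff = now - timedelta(days=max_idle_days)
    return sorted(fp for fp, ts in last_used.items()
                  if ts is None or ts < cutoff)

# Hypothetical monitoring data.
now = datetime(2013, 6, 1)
last_used = {
    "SHA256:aaa": datetime(2013, 5, 20),  # recently used - keep
    "SHA256:bbb": datetime(2012, 1, 1),   # idle over a year - flag
    "SHA256:ccc": None,                   # never observed - flag
}
print(stale_keys(last_used, now))  # ['SHA256:bbb', 'SHA256:ccc']
```

In practice the idle window is a policy decision; the essential part is that removal is driven by observed use rather than by anyone's memory of why a key was created.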

The Future of Security
SSH continues to be the gold standard for data-in-transit security, but in the current threat landscape organizations must also address how SSH key-based network access is managed.

Nearly all of the Fortune 500 and several prominent government agencies are inadvertently exposing themselves to major security threats from hackers or rogue employees because they continue to operate out of compliance. This problem cannot be solved overnight; it will take several years and thousands of well-trained people to fully address. It must be the entire organization's responsibility, with time allotted and priority given to ensuring that SSH user keys are properly managed.

About the Author

Tatu Ylönen is the CEO and founder of SSH Communications Security. While working as a researcher at Helsinki University of Technology, he began working on a solution to combat a password-sniffing attack that targeted the university’s networks. What resulted was the development of the secure shell (SSH), a security technology that would quickly replace vulnerable rlogin, TELNET and rsh protocols as the gold standard for data-in-transit security.

Tatu has been a key driver in the emergence of security technology, including the SSH and SFTP protocols, and is co-author of globally recognized IETF standards. He has been with SSH since its inception in 1995, holding various roles including CEO, CTO and board member.

In October 2011 Tatu returned as chief executive officer of SSH Communications Security, bringing his experience as a network security innovator to SSH’s product line. He is charting an exciting new course for the future of the space that he invented.

Tatu holds a Master of Science degree from the Helsinki University of Technology.
