Switching the Locks: Who Has Copies of Your SSH Keys?

Organizations are constantly leaving themselves open to security breaches and noncompliance with federal regulations

Despite the recent flood of high-profile network breaches, hacking attempts are hardly new. In 1995, I was attending school in Helsinki when I discovered a password "sniffer" attack on our university network. In response, I wrote a program called the "secure shell" to safeguard information as it traveled from point to point within the network. This new program shielded all of our data and ensured that these kinds of attacks could not jeopardize our logins.

This program, SSH, works by generating an encryption key pair - one key held by the server and the other by the user's computer - and encrypting the data transferred between those two endpoints. Today, almost every major network environment - including those in large enterprises, financial institutions and governments - uses a version of SSH to protect data in transit and let administrators operate systems remotely. Organizations use SSH to encrypt everything from health records to logins, financial data and other personal information.
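For readers who want to see the mechanics, here is a minimal sketch of that key-pair model, driving OpenSSH's standard ssh-keygen tool from Python. The file location and comment string are illustrative choices, not anything prescribed by this article:

```python
import subprocess
from pathlib import Path

# Illustrative location for a demo key; a real deployment would follow
# the organization's key-provisioning policy instead. Assumes the file
# does not already exist (ssh-keygen prompts before overwriting).
key_path = Path.home() / ".ssh" / "demo_ed25519"
key_path.parent.mkdir(mode=0o700, exist_ok=True)

# Generate an Ed25519 key pair. The private half stays on the user's
# machine; the .pub half is what gets installed on servers.
subprocess.run(
    [
        "ssh-keygen",
        "-t", "ed25519",           # key type
        "-f", str(key_path),       # private key file (public key gets .pub)
        "-N", "",                  # empty passphrase -- acceptable for a demo only
        "-C", "demo@example.com",  # comment labeling the key's owner
    ],
    check=True,
)

# Installing this line in a server account's ~/.ssh/authorized_keys is
# exactly the "copy of the key" the rest of this article is about.
print(key_path.with_suffix(".pub").read_text())
```

Every such line that lands in an authorized_keys file is a standing credential; the management problem described below is that almost nobody tracks where those lines end up.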

Management of Keys a Low Priority
Despite the fact that SSH keys safeguard extremely sensitive information, companies have been remarkably casual about managing SSH key generation, access and location throughout their network environments. It's as if a home security company made numerous copies of a person's house keys, scattered them in the street and never changed the lock. The only things needed to pick up one of these keys and use it to access encrypted data are interest, time and a little know-how.

Organizations are constantly leaving themselves open to security breaches and noncompliance with federal regulations by not being more diligent about SSH key management. Many cannot control who creates keys, how many are created, or where the keys end up in the network after they are provisioned, and those gaps can lead to network-wide attacks.

Swept Under the Rug
The issue has remained concealed within the IT department, shielded by its highly technical nature and by organizational barriers. System administrators may not appreciate or understand the full scope of the problem because they typically see only a small piece of their environment. On the other side of the company, even if executives and business managers recognize that there is an issue, they are usually too busy to evaluate its scope or its possible implications.

SSH key mismanagement is as mysterious as it is widespread. Through conversations with prominent governments, financial institutions and enterprises, we have determined that most companies have anywhere from eight to well over 100 SSH keys in their environments granting access to each Unix/Linux server. Some of these keys also permit high-level root access, leaving servers vulnerable to "high-risk" insiders. These insiders, including anyone who has ever been given server access, can use mismanaged SSH keys to gain permanent access to production servers.
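To make those numbers concrete, a first-pass inventory can be as simple as counting authorized_keys entries on one server. The sketch below assumes a conventional /home layout and enough privilege to read every user's files; real environments may also relocate the file with the AuthorizedKeysFile directive in sshd_config:

```python
import glob

# Count how many public keys grant login access to accounts on this host.
patterns = ["/home/*/.ssh/authorized_keys", "/root/.ssh/authorized_keys"]

total = 0
for pattern in patterns:
    for path in glob.glob(pattern):
        user = path.split("/")[2] if path.startswith("/home/") else "root"
        try:
            with open(path) as fh:
                # Every non-blank, non-comment line is one key with login rights.
                keys = [line for line in fh
                        if line.strip() and not line.lstrip().startswith("#")]
        except PermissionError:
            continue  # reading other users' files generally requires root
        print(f"{user}: {len(keys)} authorized key(s)")
        total += len(keys)

print(f"Total keys granting access to this server: {total}")
```

Run across a fleet, a tally like this is usually the first hard evidence of how far key sprawl has gone.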

Mismanaged SSH Keys Give Viruses the Advantage
Each day, the probability of such a breach occurring increases. Attacks are becoming more prevalent and sophisticated, and news stories about network breaches appear daily. Using SSH keys as an attack vector in a virus is very easy, requiring only a few hundred lines of code. Once a virus gains entry, it can use mismanaged SSH keys to spread from server to server throughout the company.

Key-based trust relationships connect servers so densely that a successful attack is extremely likely to travel through all of an organization's servers, especially if the virus also uses additional attack vectors to escalate privileges to "root" after breaching a server. With so many keys in circulation, the virus will likely infect nearly all servers within minutes, including the disaster recovery and backup servers that are typically also managed using such keys.
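The propagation risk is easy to see as a graph problem: if an edge means "a key on server A grants login to server B," then a worm's reach from one breached host is simply graph reachability. The five-server trust graph below is invented for illustration; real edge data would come from an authorized_keys inventory like the one sketched above:

```python
from collections import deque

# Hypothetical trust graph: an edge A -> B means a key on A logs in to B.
trust = {
    "web-1":    ["app-1", "app-2"],
    "app-1":    ["db-1", "backup-1"],
    "app-2":    ["db-1"],
    "db-1":     ["backup-1"],
    "backup-1": [],
}

def reachable(start):
    """Return all servers a worm could reach from `start` via key-based trust."""
    seen, queue = {start}, deque([start])
    while queue:
        host = queue.popleft()
        for nxt in trust.get(host, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

compromised = reachable("web-1")
print(f"One breached host exposes {len(compromised)} of {len(trust)} servers: "
      f"{sorted(compromised)}")
```

In this toy topology a single compromised web server already reaches every machine, backup servers included, which is exactly the scenario described above.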

In the worst-case scenario, a virus utilizing numerous attack vectors could rapidly spread Internet-wide and, combined with data-destruction capabilities, corrupt enormous quantities of data.

Industry Regulations Flouted
Organizations lacking proper SSH key management protocols are not only vulnerable to security breaches, they are also out of compliance with mandatory security requirements and laws. SOX, FISMA, PCI and HIPAA all require control over server access as well as the ability to terminate that access. Additionally, companies may also be disregarding internal security practices (in some cases, policies mandated by customers).
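As a concrete illustration of "the ability to terminate that access": revoking a key-based grant comes down to deleting the matching line from the account's authorized_keys file. The helper below is a hypothetical sketch; matching on the raw key blob rather than on a fingerprint is a deliberate simplification:

```python
def revoke_key(authorized_keys_path, key_blob):
    """Rewrite the file without any line containing key_blob; return lines removed."""
    with open(authorized_keys_path) as fh:
        lines = fh.readlines()
    kept = [line for line in lines if key_blob not in line]
    removed = len(lines) - len(kept)
    if removed:
        with open(authorized_keys_path, "w") as fh:
            fh.writelines(kept)
    return removed

# Usage (path and key material are illustrative):
# n = revoke_key("/home/bob/.ssh/authorized_keys", "AAAAC3NzaC1lZDI1NTE5...")
# print(f"Removed {n} entry(ies)")
```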

The SSH protocol and its most commonly used implementations do not create these risks. Rather, the risks stem from faulty processes around SSH key management, inadequate time and resources to research the problem and develop solutions, a lack of understanding of the issue's implications, and the hesitancy of auditors to flag problems for which they have no solutions.

Clearly, the issue of improperly managed SSH keys cannot be glossed over forever. Without properly auditing, controlling and, when necessary, terminating SSH key-based access to their IT systems and data, most healthcare providers, enterprises and government agencies are easy targets for an attacker.

Steps to Combat the Risks
Before steps can be taken to solve a problem, it must be acknowledged as a legitimate issue. A remediation project may involve multiple IT teams and will require proper endorsement and support within the company.

There are multiple steps that make up the core of the remediation project:

  • Unearthing all current trust relationships (who has access to what).
  • Monitoring the environment to determine which keys are actively in use, and removing keys that are no longer used.
  • Rotating keys, i.e., replacing every authorized key (and the corresponding identity keys) on a regular basis, so that any compromised (copied) keys stop working.
  • Controlling which commands can be executed using each key and where each key can be used from (see the sketch after this list).
  • Enforcing proper processes for key setup and other key operations.
  • Automating key setups and key removals, eliminating manual work and human error and reducing the number of administrators involved from hundreds to almost none.
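As an example of the controlling step, OpenSSH lets an authorized_keys entry carry from= and command= options that pin a key to particular source addresses and a forced command. The sketch below flags entries that lack those restrictions; the file path is hypothetical, and searching the whole line for the option names, rather than fully parsing the quoted options field, is a deliberate simplification:

```python
def check_restrictions(path):
    """Print authorized_keys lines that carry no from= or command= option."""
    with open(path) as fh:
        for lineno, line in enumerate(fh, 1):
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            missing = [opt for opt in ("from=", "command=")
                       if opt not in line]
            if missing:
                print(f"line {lineno}: unrestricted key (no {', '.join(missing)})")

check_restrictions("/home/alice/.ssh/authorized_keys")  # hypothetical path
```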

The Future of Security
SSH continues to be the gold standard for data-in-transit security, but in today's threat landscape organizations must also address how SSH-based network access is managed.

Nearly all of the Fortune 500 and several prominent government agencies are inadvertently exposing themselves to major security threats from hackers or rogue employees because they continue to operate out of compliance. This problem cannot be solved overnight; it will take years and thousands of well-trained people to combat it fully. Addressing the issue must be the entire organization's responsibility. Organizations must allot the time, and make it a priority, to ensure that their SSH user keys are properly managed.

About the Author

Tatu Ylönen is the CEO and founder of SSH Communications Security. While a researcher at Helsinki University of Technology, he began developing a solution to combat a password-sniffing attack that targeted the university's networks. The result was the secure shell (SSH), a security technology that would quickly replace the vulnerable rlogin, TELNET and rsh protocols as the gold standard for data-in-transit security.

Tatu has been a key driver in the emergence of security technology, including the SSH and SFTP protocols, and is a co-author of globally recognized IETF standards. He has been with SSH Communications Security since its inception in 1995, holding various roles including CEO, CTO and board member.

In October 2011 Tatu returned as chief executive officer of SSH Communications Security, bringing his experience as a network security innovator to SSH’s product line. He is charting an exciting new course for the future of the space that he invented.

Tatu holds a Master of Science degree from the Helsinki University of Technology.
