By Tatu Ylönen
April 7, 2013 12:00 PM EDT
Despite the recent flood of high-profile network breaches, hacking attempts are hardly new. In 1995, I was attending school in Helsinki when I discovered a password "sniffer" attack on our university network. In response, I wrote a program called the "secure shell" to safeguard information as it traveled from point to point within the network. This new program shielded all of our data and ensured that attacks of this kind could no longer jeopardize our logins.
This program, SSH, works by generating encryption key pairs - one pair for the server and another for the user's computer - and encrypting the data that travels between those two endpoints. Today, almost every major network environment - including those of large enterprises, financial institutions and governments - uses a version of SSH to protect data in transit and to let administrators operate systems remotely. Organizations use SSH to encrypt everything from health records to logins, financial data and other personal information.
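To make those key-pair mechanics concrete, here is a minimal sketch of generating such a pair. It assumes Python and the third-party cryptography package (the article itself names no tooling), and picks an Ed25519 key purely for illustration; in everyday practice the same result comes from a tool such as ssh-keygen.

```python
# A minimal sketch: generate an SSH key pair using the third-party
# "cryptography" package (an assumption; the article names no tooling).
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# The private (identity) key stays on the user's computer.
private_key = ed25519.Ed25519PrivateKey.generate()
private_openssh = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.OpenSSH,
    encryption_algorithm=serialization.NoEncryption(),  # real keys deserve a passphrase
)

# The public key is what gets installed on servers, one line per
# authorized_keys entry.
public_openssh = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.OpenSSH,
    format=serialization.PublicFormat.OpenSSH,
)

print(public_openssh.decode())  # e.g. "ssh-ed25519 AAAA..."
```

The private half stays with its owner; only the public half is ever copied onto servers - which is exactly why tracking where public keys accumulate matters so much.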
Management of Keys a Low Priority
Despite the fact that SSH keys safeguard extremely sensitive information, companies have been incredibly casual about managing SSH key generation, access and location throughout their network environments. It's as if a home security company made numerous copies of a person's house keys, scattered them in the street and never changed the lock. The only things needed to pick up one of these keys and use it to access encrypted data are interest, time and a little know-how.
By failing to be more diligent about SSH key management, organizations leave themselves open to security breaches and to noncompliance with federal regulations. Many cannot control who creates keys, how many are created, or where the keys end up in the network after being provisioned - and those gaps expose them to network-wide attacks.
Swept Under the Rug
The issue has remained concealed within the IT department, shielded by its highly technical nature and by organizational challenges. System administrators may not appreciate or understand the full scope of the problem because they typically see only a small piece of their environment. On the other side of the company, even if executives and business managers recognize that there is an issue, they are usually too busy to evaluate its scope or possible implications.
SSH key mismanagement is as mysterious as it is widespread. Through conversations with prominent governments, financial institutions and enterprises, we have determined that most companies have, on average, anywhere from eight to over 100 SSH keys in their environments granting access to each Unix/Linux server. Some of these keys also permit high-level root access, leaving servers vulnerable to "high-risk" insiders. These insiders - anyone who has ever been given server access - can use mismanaged SSH keys to gain permanent access to production servers.
Mismanaged SSH Keys Give Viruses the Advantage
The probability of such a breach occurring grows by the day. Attacks are becoming more prevalent and sophisticated, and news stories about network breaches appear daily. Using SSH keys as an attack vector in a virus is very easy, requiring only a few hundred lines of code. Once a virus gains entry, it can use mismanaged SSH keys to spread from server to server throughout the company.
Key-based trust relationships connect servers so densely that a successful attack is extremely likely to reach every server in the organization, especially if the virus also uses additional attack vectors to escalate its privileges to "root" after breaching a server. Given the sheer number of keys in circulation, the virus would likely infect nearly all servers within minutes, including the disaster recovery and backup servers that are typically managed using such keys.
In the worst-case scenario, a virus exploiting numerous attack vectors could spread Internet-wide and rapidly; combined with data-destruction capabilities, it could corrupt enormous quantities of data.
Industry Regulations Flouted
Organizations lacking proper SSH key management protocols are not only vulnerable to security breaches; they are also out of compliance with mandatory security requirements and laws. SOX, FISMA, PCI and HIPAA all require control over who can access servers, along with the ability to terminate that access. Additionally, companies may also be disregarding internal security policies (in some cases, policies mandated by customers).
These risks are not created by the SSH protocol itself or by its most commonly used implementations. Rather, they result from faulty processes for handling SSH keys, inadequate time and resources to research the problem and develop solutions, a lack of understanding of the issue's implications, and the hesitancy of auditors to flag problems for which they have no solutions.
Clearly, the issue of improperly managed SSH keys cannot be glossed over forever. Without properly auditing, controlling and terminating SSH key-based access to their IT systems and data, most healthcare providers, enterprises and government agencies are easy targets for an attacker.
Steps to Combat the Risks
Before steps can be taken to solve a problem, it must be recognized as a legitimate issue. A remediation project may involve multiple IT teams and will require proper endorsement and support within the company.
There are multiple steps that make up the core of the remediation project:
- Automating key setups and key removals, eliminating human error and manual work and reducing the number of administrators involved from hundreds to almost none.
- Controlling what commands can be executed using each key and where the keys can be used from.
- Enforcing proper processes for provisioning keys and for other key operations.
- Monitoring the environment to determine which keys are actively in use, and removing keys that are no longer used.
- Rotating keys, i.e., switching out every authorized key (and corresponding identity keys) on a regular basis so that any compromised (copied) keys stop working; see the rotation sketch after this list.
- Unearthing all current trust relationships (who has access to what); a discovery sketch follows this list.
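As a starting point for the discovery step, the following rough sketch - assuming Python on a typical Linux host with conventional home-directory paths, none of which the article specifies - inventories authorized_keys files and flags entries that carry no restricting options such as from= or command=:

```python
# A minimal discovery sketch: inventory authorized_keys files on one host
# and flag entries carrying no restricting options (from=, command=, ...).
# Paths and the set of key types are illustrative assumptions.
import glob

KEY_TYPES = ("ssh-rsa", "ssh-dss", "ssh-ed25519", "ecdsa-sha2-nistp256",
             "ecdsa-sha2-nistp384", "ecdsa-sha2-nistp521")

def is_unrestricted(entry):
    # An entry that begins directly with a key type has no options prefix,
    # meaning the key grants an unrestricted login.
    return entry.split()[0] in KEY_TYPES

paths = glob.glob("/home/*/.ssh/authorized_keys") + ["/root/.ssh/authorized_keys"]
for path in paths:
    try:
        with open(path) as f:
            entries = [line.strip() for line in f
                       if line.strip() and not line.startswith("#")]
    except OSError:
        continue  # missing or unreadable; a real tool would record this too
    bad = sum(1 for e in entries if is_unrestricted(e))
    print(f"{path}: {len(entries)} authorized keys, "
          f"{bad} without from=/command= restrictions")
```

Run across a whole server estate, a scan like this is what turns the invisible web of key-based trust into something an organization can actually audit.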
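And for the rotation step, a minimal sketch that swaps one authorized public key for its freshly generated successor; the file path and key strings are hypothetical placeholders:

```python
# A minimal rotation sketch: retire one authorized public key and install
# its successor, writing atomically so a crash never leaves the file
# half-written. Path and key values are hypothetical placeholders.
import os

def rotate_key(auth_path, old_pubkey, new_pubkey):
    with open(auth_path) as f:
        entries = f.read().splitlines()
    # Keep every entry except the one being retired, then append the new key.
    kept = [e for e in entries if old_pubkey not in e]
    kept.append(new_pubkey)
    tmp_path = auth_path + ".tmp"
    with open(tmp_path, "w") as f:
        f.write("\n".join(kept) + "\n")
    os.chmod(tmp_path, 0o600)        # StrictModes sshd rejects overly permissive files
    os.replace(tmp_path, auth_path)  # atomic rename on POSIX

# Hypothetical usage; the new public key would come from a key-generation
# step like the sketch earlier in the article:
# rotate_key("/home/alice/.ssh/authorized_keys",
#            old_pubkey="ssh-ed25519 AAAAC3...old",
#            new_pubkey="ssh-ed25519 AAAAC3...new alice@laptop")
```

The point of the atomic rename is that login access is never in an inconsistent state mid-rotation: at any instant the file contains either the old key set or the new one, never a partial mix.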
The Future of Security
SSH continues to be the gold standard for data-in-transit security, but in the current threat landscape organizations must also address how SSH-based network access is managed.
Nearly all of the Fortune 500 and several prominent government agencies are inadvertently exposing themselves to major security threats from hackers or rogue employees because they continue to operate out of compliance. This problem cannot be solved overnight; it will take years and thousands of well-trained people to combat fully. Addressing it must be the entire organization's responsibility: time must be allotted, and proper management of SSH user keys must become a company-wide priority.