Protecting the Network with Proactive Encryption Monitoring

Encryption technology is everywhere: in applications, data centers and other foundational infrastructure

Encryption is a key element of a complete security strategy. The 2013 Global Encryption Trends Study shows a steady increase in the use of encryption solutions over the past nine years. Thirty-five percent of organizations now have an encryption strategy applied consistently across the entire enterprise, up from 29 percent in 2012. The study showed that, for the first time, the main goal for most organizations in deploying encryption is mitigating the effects of data breaches. There is good reason for this shift: the latest Ponemon Institute research puts the average cost of a data breach at $3.5 million, up 15 percent from the previous year.

On the surface, the 35 percent figure seems like good news, until one realizes that 65 percent of organizations do not have an enterprise-wide encryption strategy. In addition, even a consistently applied strategy can lack visibility, management controls or remediation processes. This gives hackers the green light to attack as soon as they spot a vulnerability.

While organizations are moving in the right direction when it comes to encryption, much more needs to be done - and quickly. Encryption has come to be viewed as a commodity: organizations deploy it and assume they've taken the steps needed to maintain security. If breaches occur, it's rarely the fault of the software or the encryption protocol. The fault lies rather in the fact that encryption has been left in the domain of IT system administrators and has never been properly governed with access controls, monitoring or proactive data loss prevention.

Too Many Keys Spoil the Security
While recent high-profile vulnerabilities have exposed the need to manage encrypted networks better, it's important to understand that administrators can cause vulnerabilities as well. In the Secure Shell (SSH) data-in-transit protocol, key-based authentication is one of the more common methods used to gain access to critical information. Keys are easy to create and, at the most basic level, are simple text files that can be easily uploaded to the appropriate system. Each key is associated with an identity - either a person or a machine - and grants access to information assets and the ability to perform specific tasks, such as transferring a file or dropping a database, depending on the assigned authorizations. In the case of Secure Shell keys, those basic text files provide access to some of the most critical information within an organization.
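
To make this concrete, here is a minimal Python sketch of what one of those "simple text files" contains - a single OpenSSH authorized_keys entry - and how its fields map to an identity and its authorizations. The key material is truncated, the entry itself is hypothetical, and the parser is deliberately simplified (it does not handle quoted option values containing spaces).

```python
# A single (hypothetical) authorized_keys entry: forced command and options,
# key type, key material (truncated here), and a comment naming the identity.
entry = (
    'command="/usr/local/bin/backup.sh",no-pty '
    "ssh-rsa AAAAB3NzaC1yc2EAAA...truncated... backup-app@db-host-01"
)

def parse_authorized_key(line: str) -> dict:
    """Split an authorized_keys line into options, key type, key and comment.

    Simplified sketch: breaks on quoted options that contain spaces.
    """
    parts = line.split()
    options = ""
    # Lines may begin with a comma-separated options field (command=, no-pty, ...).
    if not parts[0].startswith(("ssh-", "ecdsa-")):
        options, parts = parts[0], parts[1:]
    key_type, key_blob = parts[0], parts[1]
    comment = " ".join(parts[2:])  # conventionally identifies the owner: user@host
    return {"options": options, "type": key_type, "key": key_blob, "comment": comment}

print(parse_authorized_key(entry))
```

Everything an attacker needs is in that one line of text: whoever holds the matching private key gets whatever this entry authorizes.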

A quick calculation will reveal that the number of keys assigned over the past decade to employees, contractors and applications can run up to a million or more for a single enterprise. In one example, a major bank with around 15,000 hosts had over 1.5 million keys circulating within its network environment. Around 10 percent of those keys - or 150,000 - provided high-level administrator access. This represents an astonishing number of open doors that no one was monitoring.
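
How does an enterprise arrive at such a number? A rough inventory can be sketched in a few lines of Python. The layout assumed below - one directory per host containing collected authorized_keys files, with root's keys under root/.ssh - is an illustrative assumption, not any vendor's tooling.

```python
# Count authorized keys per host from a directory of collected key files,
# flagging those that grant root-level access. Paths are assumptions.
import os
from collections import Counter

def audit_keys(dump_dir: str) -> None:
    """dump_dir holds one subdirectory per host, mirroring /home/*/.ssh and /root/.ssh."""
    totals, root_keys = Counter(), Counter()
    for dirpath, _dirs, files in os.walk(dump_dir):
        if "authorized_keys" not in files:
            continue
        host = os.path.relpath(dirpath, dump_dir).split(os.sep)[0]
        with open(os.path.join(dirpath, "authorized_keys")) as fh:
            keys = [line for line in fh if line.strip() and not line.startswith("#")]
        totals[host] += len(keys)
        if dirpath.endswith(os.path.join("root", ".ssh")):  # key grants root access
            root_keys[host] += len(keys)
    grand = sum(totals.values())
    print(f"{grand} authorized keys across {len(totals)} hosts")
    print(f"{sum(root_keys.values())} grant root access "
          f"({100 * sum(root_keys.values()) / max(grand, 1):.0f}%)")
```

Run against even a mid-sized estate, a count like this is usually the first time anyone sees the true scale of key-based access.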

It may seem impossible that such a security lapse could happen, but consider that encryption is often perceived merely as a tool. Because nothing appeared on the surface to be out of place, no processes were shut down and the problem went undetected.

Safety Hazards
Forgetting to keep track of keys is one problem; failing to remove them is another. System administrators and application developers will often deploy keys in order to readily gain access to systems they are working on. These keys grant a fairly high level of privilege and are often used across multiple systems, creating a one-to-many relationship. In many cases, employees or contractors who are terminated - or even simply reassigned to other tasks that no longer require the same access - continue to carry access via Secure Shell keys; the assumption is that terminating the account is enough. Unfortunately, this is not the case when Secure Shell keys are involved; the keys must also be removed or the access remains in place.
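
A first step toward closing those doors is simply cross-referencing deployed keys against the current user directory. The sketch below assumes keys carry a conventional user@host comment and that the active account list has been exported from a directory service; both are illustrative assumptions.

```python
# Find authorized_keys entries whose owner no longer has an active account.
active_users = {"alice", "appsvc-billing"}   # e.g. exported from LDAP/AD

def find_orphaned_keys(entries: list[str]) -> list[str]:
    """Return key lines whose owner is not in the active account list."""
    orphans = []
    for line in entries:
        comment = line.split()[-1]        # assumes a trailing user@host comment
        owner = comment.split("@")[0]
        if owner not in active_users:
            orphans.append(line)
    return orphans

deployed = [
    "ssh-ed25519 AAAAC3Nza...truncated... alice@laptop",
    "ssh-ed25519 AAAAC3Nza...truncated... bob@laptop",   # bob left last year
]
for line in find_orphaned_keys(deployed):
    print("stale key, access still live:", line)
```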

SSH keys pose another threat as well: subverting privileged access management (PAM) systems. Many PAM solutions use a gateway or jump host that administrators log into to gain access to network assets. PAM solutions connect with user directories to assign privileges, monitor user actions and record which actions have taken place. While this appears to be an airtight way to monitor administrators, it is incredibly easy for an administrator to log into the gateway, deploy a key and then log back in using key authentication, thereby circumventing any PAM safeguards in place.
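
Detecting this kind of bypass is straightforward in principle: any change to key material on the jump host should be treated as a security event. Below is a minimal polling sketch; the watched path and the print statement standing in for a real alerting hook are illustrative assumptions.

```python
# Alert whenever an authorized_keys file changes on the gateway host.
import glob
import hashlib
import time

WATCHED = "/home/*/.ssh/authorized_keys"   # in practice, also watch /root/.ssh

def snapshot() -> dict:
    """Hash every watched key file so any edit is detectable."""
    return {p: hashlib.sha256(open(p, "rb").read()).hexdigest()
            for p in glob.glob(WATCHED)}

baseline = snapshot()
while True:
    time.sleep(60)
    current = snapshot()
    for path, digest in current.items():
        if baseline.get(path) != digest:
            # A real deployment would raise an alert or open a ticket here.
            print(f"ALERT: key material changed on gateway: {path}")
    baseline = current
```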

Too Clever for Their Own Good
Poorly monitored access is just one security hazard in encrypted environments. Conventional PAM solutions, which use gateways and focus on interactive users only, are designed to monitor administrator activities. Unfortunately, as mentioned earlier, they end up being fairly easy to work around. Additionally, encryption blinds security operations and forensics teams the same way it conceals attackers. For this reason, encrypted traffic is rarely monitored and is allowed to flow freely in and out of the network environment. This creates obvious risks and largely negates security intelligence capabilities.

The Internet offers many articles on how to use Secure Shell to bypass corporate firewalls. This is actually a fairly common and clever policy workaround that unfortunately creates a huge security risk. In order to eliminate this risk, the organization must decrypt and inspect the traffic.
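
For illustration, this is the kind of tunnel those articles describe: a single outbound SSH connection - here to a hypothetical personal server - that becomes a general-purpose SOCKS proxy, carrying arbitrary traffic past the firewall inside an encrypted channel that perimeter tools cannot inspect.

```python
# Sketch of the classic firewall bypass using standard OpenSSH flags.
# The hostname is hypothetical; the flags are stock OpenSSH.
import subprocess

subprocess.run([
    "ssh",
    "-N",            # no remote command; the connection exists only to tunnel
    "-D", "1080",    # dynamic port forwarding: local SOCKS proxy on port 1080
    "user@personal-server.example.com",
])
```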

Traffic Safety
Decrypting Secure Shell traffic would require an organization to use an inline proxy with access to the private keys - essentially a friendly man-in-the-middle - to decrypt the traffic without interfering with the network. When successfully deployed, 100 percent of encrypted traffic for both interactive users and machine-to-machine (M2M) identities can be monitored. Also, because this is done at the network level, it's not possible for malicious parties to execute a workaround. With this method, enterprises can proactively detect suspicious or out-of-policy traffic. This approach is called encrypted channel monitoring and represents the next generation in the evolution of PAM.

This kind of monitoring solves the issue of decrypting traffic at the perimeter and helps organizations move away from a gateway approach to PAM. At the same time, it prevents attackers from using the organization's own encryption technology against it. In addition, an organization can use inline access controls and user profiling to control what activities a user can undertake. For example, policy controls can be enforced to forbid file transfers from certain critical systems. With the more advanced solutions, an organization can even block subchannels from running inside the encrypted tunnel - a preferred method for quickly exfiltrating data.
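
As a sketch of what such policy controls might look like - assuming the monitoring proxy can report each request's SSH channel or subsystem type, and using an illustrative rule format rather than any vendor's actual configuration:

```python
# Allowlist of permitted channel types per host group. Names and rules
# are illustrative assumptions, not a real product's policy language.
POLICY = {
    "default":       {"session"},            # interactive shell only
    "build-servers": {"session", "sftp"},    # file transfer allowed here
    "payment-db":    set(),                  # critical system: deny all channels
}

def is_allowed(host_group: str, channel_type: str) -> bool:
    """Permit a channel only if policy for the host group includes its type."""
    allowed = POLICY.get(host_group, POLICY["default"])
    return channel_type in allowed

# An SFTP subchannel aimed at a critical system is refused, as is
# port forwarding (direct-tcpip) anywhere the default policy applies:
print(is_allowed("payment-db", "sftp"))        # False -> block and alert
print(is_allowed("default", "direct-tcpip"))   # False -> tunneling blocked
```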

Encryption technologies are often set up without effective monitoring or proper access controls, which also blinds layered defenses. A single major vulnerability could potentially compromise an entire server, which could in turn expose other areas of the network to subsequent attacks.

A Healthy Respect for Encryption
Encryption technology is everywhere: in applications, data centers and other foundational infrastructure. While it has been widely embraced, it has also often been abused, misused or neglected. Most organizations have not instituted centralized provisioning, encrypted channel monitoring and other best practices, even though the consequences of inadequate security can be severe. IT security staff may think conventional PAM is keeping their organizations safe, when commonly known workarounds are instead putting their data in jeopardy.

No one understands better than IT administrators how critical network security is. This understanding should spur security professionals to do all in their power to make their organizations' data as safe as possible. Given all that can go awry, they should examine their encrypted networks, enable layered defenses and put proactive monitoring in place if they have not yet done so. An all-inclusive encrypted channel monitoring strategy will go a long way toward securing the network.

More Stories By Jason Thompson

Jason Thompson is director of global marketing for SSH Communications Security. He brings more than 12 years of experience launching new, innovative solutions across a number of industry verticals. Prior to joining SSH, he worked at Q1 Labs, where he helped build awareness around security intelligence and holistic approaches to dealing with advanced threat vectors. Mr. Thompson holds a BA from Colorado State University and an MA from the University of North Carolina at Wilmington.
