So That DNS DDoS Thing Happened

Smurfs aren't just for ICMP anymore...

DNS, like any public service, is vulnerable. Not in the sense that it has vulnerabilities but vulnerable in the sense that it must, by its nature and purpose, be publicly available. It can't hide behind access control lists or other traditional security mechanisms because the whole point of DNS is to provide a way to find your corporate presence in all its digital forms.

It should therefore not come as a surprise that eventually it turned up in the news as the primary player in a global and quite disruptive DDoS attack.

The gory details, most of which have already been circulated, are nonetheless fascinating given the low technological investment required. You could duplicate the effort with about 30 friends, each with a 30Mbps connection (that means I'm out, sorry). As those who've been in the security realm for a while know, that's because these types of attacks require very little on the attack side; the desired effect comes from the unbalanced request-response ratio inherent in many protocols, DNS among them.

In the world of security taxonomies these are called "amplification" attacks. They aren't new; "Smurf attacks" (which exploited ICMP) were first seen in the 1990s and effected their disruption by taking advantage of broadcast addresses. DNS amplification works on the same premise, because queries are small but responses tend to be large. Both ICMP and DNS amplification attacks are effective because they use UDP, which does not require a handshake and is entirely uninterested in verifying whether or not the IP address in the request is the one from which the request was actually sent. It's ripe for spoofing with much less work than a connection-oriented protocol such as TCP.

To understand just how unbalanced the request-response ratio was in this attack, consider that the request was: “dig ANY ripe.net @<OpenDNSResolverIP> +edns=0 +bufsize=4096”. That's 36 bytes. The responses were typically around 3,000 bytes, an amplification factor approaching 100x. There were 30,000 open DNS resolvers in the attack, each sending 2.5Mbps of traffic, all directed at the target victim. CloudFlare has a great blog post on the attack; I recommend a read. Another good resource on DNS amplification attacks is this white paper. Also fascinating is that this attack differed from earlier ones in that the target was sent a massive number of DNS responses - rather than queries - that it never solicited in the first place.
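To make those numbers concrete, here's a back-of-the-envelope calculation in Python. It's just arithmetic over the figures reported above (36-byte query, roughly 3,000-byte response, 30,000 resolvers at 2.5Mbps each), not a measurement of mine:

```python
# Back-of-the-envelope math for the DNS amplification attack described above.
# All figures come from the public reports cited in the text.

query_bytes = 36          # the "dig ANY ripe.net ..." request size
response_bytes = 3000     # typical response size for that ANY query

amplification = response_bytes / query_bytes
print(f"Amplification factor: ~{amplification:.0f}x")        # ~83x

resolvers = 30_000        # open resolvers reportedly involved
per_resolver_mbps = 2.5   # traffic each resolver sent at the victim

total_gbps = resolvers * per_resolver_mbps / 1000
print(f"Aggregate attack traffic: ~{total_gbps:.0f} Gbps")   # ~75 Gbps

# The attacker-side cost is tiny: generating ~75 Gbps of responses takes
# only total / amplification of spoofed queries.
attacker_mbps = total_gbps * 1000 / amplification
print(f"Spoofed-query bandwidth needed: ~{attacker_mbps:.0f} Mbps")
```

Note how the output ties the two descriptions together: roughly 900Mbps of spoofed queries - the 30 friends on 30Mbps links - is all it takes to generate ~75Gbps at the victim.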

The problem is that DNS is, well, public. Restricting responses could unintentionally block legitimate client resolvers, causing a kind of self-imposed denial of service. That's not acceptable. Transitioning to TCP to take advantage of handshaking - and thus improve the ability to detect and shut down attempted attacks - would certainly work, but at the price of performance. While F5's BIG-IP DNS solutions are optimized to avoid that penalty, most DNS infrastructure isn't, and that means a general slowdown of a process that's already considered "too slow" by many users, particularly those trying to navigate the Internet via a mobile device.
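If you want to see the TCP penalty for yourself, here's a minimal sketch using the third-party dnspython library; the resolver address is a placeholder (a TEST-NET IP), so point it at a resolver you operate:

```python
# Minimal sketch of the UDP-vs-TCP trade-off using the dnspython library
# (pip install dnspython). The resolver address below is a placeholder;
# point it at a resolver you operate.
import time

import dns.message
import dns.query

RESOLVER = "192.0.2.53"   # placeholder (TEST-NET address), not a real resolver
query = dns.message.make_query("example.com", "A")

start = time.perf_counter()
dns.query.udp(query, RESOLVER, timeout=2)   # one datagram each way
udp_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
dns.query.tcp(query, RESOLVER, timeout=2)   # three-way handshake first
tcp_ms = (time.perf_counter() - start) * 1000

print(f"UDP: {udp_ms:.1f} ms, TCP: {tcp_ms:.1f} ms")
# The handshake is what defeats spoofing (a forged source address never
# completes it), but it costs at least one extra round trip on every query.
```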

So it seems we're stuck with UDP and with being attacked. But that doesn't mean we have to sit back and take it. There are ways in which you can protect against the impact of such an attack as well as others lurking in the shadows.

1. VALIDATE REQUESTS (and CHECK RESPONSES)

It is important to validate that the queries being sent by clients are ones the DNS servers are interested in answering, and able to answer. A DNS firewall or other security product can be used to validate queries and allow only those the DNS server is configured for. When the DNS protocol was designed, many features were built in that are no longer valid given the evolving nature of the Internet, including numerous DNS query types, flags, and other settings. One would be surprised at what parameters can be set on a DNS request and how they can be manipulated. For example, DNS type 17 is RP, the Responsible Person for that record. In addition, there are ways to disrupt DNS communications by putting bad data in many of these fields. A DNS firewall is able to inspect queries and drop requests that do not conform to DNS standards or that use parameters the DNS servers are not configured for.
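As a rough illustration of that kind of query validation - a sketch of the general idea, not F5's implementation - here's what an allow-list check might look like in Python with the dnspython library; the allowed types are made-up examples:

```python
# Minimal sketch of DNS query validation: parse the wire-format message and
# drop anything that isn't a plain question for a record type we actually
# serve. The allow-list is an illustrative example, not F5's rule set.
import dns.flags
import dns.message
import dns.opcode
import dns.rdatatype

# Record types this (hypothetical) server is configured to answer.
ALLOWED_TYPES = {
    dns.rdatatype.A,
    dns.rdatatype.AAAA,
    dns.rdatatype.MX,
    dns.rdatatype.NS,
}

def allow_query(wire: bytes) -> bool:
    """Return True if the raw DNS message is a query we're willing to serve."""
    try:
        msg = dns.message.from_wire(wire)
    except Exception:
        return False                                  # malformed: drop it
    if msg.flags & dns.flags.QR:
        return False                                  # a response, not a query
    if dns.opcode.from_flags(msg.flags) != dns.opcode.QUERY:
        return False                                  # drop NOTIFY, UPDATE, etc.
    if len(msg.question) != 1:
        return False
    # Rejects legacy or rarely legitimate types such as ANY and RP (type 17).
    return msg.question[0].rdtype in ALLOWED_TYPES
```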

But as this attack proved, it's not just queries you have to watch out for - it's also responses. F5 DNS firewall features include stateful inspection of responses, which means any unsolicited DNS responses are immediately dropped. While that won't change the impact on bandwidth, it will keep the server from being overwhelmed by processing unnecessary responses.
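The stateful idea is easy to sketch: remember which queries you actually sent, and drop any response that doesn't match one. A toy version (again with dnspython, and deliberately simplified - a real implementation would also expire entries):

```python
# Toy sketch of stateful response checking: a resolver front end remembers
# the queries it sent and drops any response it never solicited. A real
# implementation would also time entries out; this version omits that.
import dns.message

# In-flight queries, keyed by (message ID, server address, question name).
outstanding: set[tuple[int, str, str]] = set()

def note_sent(query: dns.message.Message, server: str) -> None:
    """Record a query we are about to send upstream."""
    outstanding.add((query.id, server, str(query.question[0].name)))

def accept_response(wire: bytes, source: str) -> bool:
    """Return True only for responses matching a query we actually sent."""
    try:
        resp = dns.message.from_wire(wire)
    except Exception:
        return False                      # malformed: drop it
    if not resp.question:
        return False
    key = (resp.id, source, str(resp.question[0].name))
    if key in outstanding:
        outstanding.discard(key)          # consume: one response per query
        return True
    return False                          # unsolicited - the attack traffic above
```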

F5’s DNS Services includes industry-leading DNS Firewall services

2. ENSURE CAPACITY

DNS query capacity is critical to delivering a resilient, available DNS infrastructure. Most organizations recognize this and put solutions in place to ensure high availability and scale of DNS. Often these solutions are simply caching DNS load balancing solutions, which have their own set of risks, including being vulnerable to attack using random, jabberwocky host names. Caching DNS solutions only cache responses returned from authoritative sources, so when presented with an unknown host name, they must query the origin server. Given a high enough volume of queries, the origin servers can still be overwhelmed, regardless of the capacity of the caching intermediary.
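A common mitigation for the random-name attack is to rate-limit cache misses per client, since legitimate clients rarely generate floods of never-before-seen names. Here's a minimal sketch; the threshold and window are illustrative guesses, not tuning advice:

```python
# Minimal sketch of per-client rate limiting on cache misses, a common
# defense against floods of random ("jabberwocky") host names. The numbers
# below are illustrative guesses, not recommendations.
import time
from collections import defaultdict

MISS_LIMIT = 20           # cache misses allowed per client per window (made up)
WINDOW_SECONDS = 1.0

_miss_times: dict[str, list[float]] = defaultdict(list)

def allow_cache_miss(client_ip: str) -> bool:
    """Permit the upstream (origin) lookup only if the client is under budget."""
    now = time.monotonic()
    # Keep only the misses that fall inside the current window.
    recent = [t for t in _miss_times[client_ip] if now - t < WINDOW_SECONDS]
    if len(recent) >= MISS_LIMIT:
        _miss_times[client_ip] = recent
        return False      # over budget: don't forward to the origin server
    recent.append(now)
    _miss_times[client_ip] = recent
    return True
```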

A high performance in-memory authoritative DNS server such as F5 DNS Express (part of F5 BIG-IP Global Traffic Manager) can shield origin servers from being overwhelmed.

3. PROTECT AGAINST HIJACKING

The vulnerability of DNS to hijacking and poisoning is still very real. In 2008, researcher Evgeniy Polyakov showed that it was possible to poison the cache of a fully patched DNS server running current code within 10 hours. This is simply unacceptable in an Internet-driven world that relies, ultimately, on the validity and integrity of DNS. The best solution to this and other vulnerabilities that compromise the integrity of DNS information is DNSSEC, which was introduced specifically to correct the open, trusting nature of the protocol's original design. DNS records are signed using keys, so a resolver can validate that the answer was not tampered with and that it came from a reliable DNS server.
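From the client side, you can observe DNSSEC at work by requesting signatures and checking whether a validating resolver sets the AD (authenticated data) bit. A small dnspython sketch, with a placeholder resolver address:

```python
# Small sketch: request DNSSEC records and check the AD (authenticated data)
# bit set by a validating resolver. The resolver IP below is a placeholder;
# use a validating resolver you trust.
import dns.flags
import dns.message
import dns.query
import dns.rdatatype

RESOLVER = "192.0.2.53"   # placeholder, not a real resolver

query = dns.message.make_query("ripe.net", dns.rdatatype.A, want_dnssec=True)
response = dns.query.udp(query, RESOLVER, timeout=3)

if response.flags & dns.flags.AD:
    print("Resolver validated the answer's signature chain (AD bit set).")
else:
    print("No validation: the answer could have been tampered with in transit.")

# Any RRSIG records returned are the signatures themselves.
for rrset in response.answer:
    if rrset.rdtype == dns.rdatatype.RRSIG:
        print("Got an RRSIG covering the answer.")
```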

F5 BIG-IP Global Traffic Manager (GTM) not only supports DNSSEC, but does so without breaking global server load balancing techniques.

As a general rule, you should verify that you aren't accidentally running an open resolver. Consider the benefits of implementing DNS with an ICSA-certified and hardened solution that does not function as an open resolver, period. And yes, F5 is a good choice for that.
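A quick self-test: from outside your network, send your server a recursive query for a name you aren't authoritative for. If it answers, you're running an open resolver. A hedged sketch with dnspython (the server IP is a placeholder):

```python
# Quick self-test for an open resolver: from OUTSIDE your network, ask your
# DNS server to recurse for a name you are not authoritative for. An answer
# means it's an open resolver. The server IP below is a placeholder.
import dns.exception
import dns.flags
import dns.message
import dns.query
import dns.rcode

YOUR_SERVER = "192.0.2.53"        # placeholder: your server's public IP

query = dns.message.make_query("example.com", "A")
query.flags |= dns.flags.RD       # request recursion (make_query's default, set for clarity)

try:
    resp = dns.query.udp(query, YOUR_SERVER, timeout=3)
except dns.exception.Timeout:
    print("No response - recursion appears blocked or filtered. Good.")
else:
    if resp.rcode() == dns.rcode.NOERROR and resp.answer:
        print("Answered a recursive query from outside: OPEN RESOLVER.")
    else:
        print(f"Refused or empty ({dns.rcode.to_text(resp.rcode())}). Good.")
```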


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
