Setting the Stage for Cybersecurity with Threat Intelligence

Effective cybersecurity requires an understanding of what assets need to be protected

Ransomware is the latest example of the increasingly sophisticated and damaging inventions of hackers. Individuals and organizations of all sizes are finding that their data has been locked down or encrypted until a ransom is paid. One program, CryptoLocker, infected more than 300,000 computers before the FBI and international law enforcement agencies disabled it. A few days later, CryptoWall showed up to take its place. Companies paid $1.3 billion last year in insurance to help offset the costs of combating data attacks like these.

Other examples include highly customized malware, advanced persistent threats and large-scale Distributed Denial of Service (DDoS) attacks. Security professionals must remain vigilant against both known threats and new ones on the rise. However, with proper visibility into the extended network and robust intelligence, an attack can often be detected and stopped before it causes significant damage. By using the network as a source of intelligence, cyber defenders gain greater visibility into adversary actions and can shut them down quickly.

Since an attack can be broken down into stages, it is helpful to think of a response to an attack in stages as well: before, during and after. This is standard operating procedure for anyone in the security profession. Let's examine each stage:

Before: Cyber defenders are constantly on the lookout for areas of vulnerability. Historically, security has been all about defense. Today, teams are developing more intelligent methods of halting intruders. With total visibility into their environments - including, but not limited to, physical and virtual hosts, operating systems, applications, services, protocols, users, content and network behavior - defenders can take action before an attack has even begun.
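To make the "before" stage concrete, here is a minimal sketch, in Python, of one such visibility check: it compares a hypothetical asset inventory against an approved service baseline and flags unexpected exposure before an attacker can exploit it. The data model, baseline and sample hosts are illustrative assumptions, not any particular product's API.

```python
# Sketch of a "before the attack" visibility check: compare an asset inventory
# against an approved service baseline and flag unexpected exposure.
# Asset, APPROVED_SERVICES and the sample hosts are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Asset:
    hostname: str
    os: str
    services: set[str] = field(default_factory=set)


# Services the organization has explicitly approved, per OS family.
APPROVED_SERVICES = {
    "linux": {"ssh", "https"},
    "windows": {"rdp", "https"},
}


def unexpected_services(inventory: list[Asset]) -> dict[str, set[str]]:
    """Return, per host, any services that are not on the approved baseline."""
    findings = {}
    for asset in inventory:
        allowed = APPROVED_SERVICES.get(asset.os, set())
        extra = asset.services - allowed
        if extra:
            findings[asset.hostname] = extra
    return findings


if __name__ == "__main__":
    inventory = [
        Asset("web01", "linux", {"https", "ssh", "telnet"}),
        Asset("hr-laptop", "windows", {"rdp"}),
    ]
    print(unexpected_services(inventory))  # {'web01': {'telnet'}}
```

In practice the inventory would come from automated discovery rather than hand-entered records, but the comparison step is the same.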

During: The impact of an attack can be minimized if security staff understand what is happening and how to stop it as quickly as possible. They need to be able to address threats continuously, not just at a single point in time. Tools such as content inspection, behavioral anomaly detection, and contextual awareness of users, devices, locations and applications are critical to understanding an attack as it occurs. Security teams need to discover where, what and how users are connected to applications and resources.
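As an illustration of the "during" stage, the sketch below flags hosts whose outbound traffic volume deviates sharply from their own historical baseline - one simple form of behavioral anomaly detection. The data layout and the three-sigma threshold are assumptions made for the example.

```python
# Sketch of behavioral anomaly detection "during" an attack: flag hosts whose
# current outbound byte count sits far above their own historical baseline.
# The data layout and the three-sigma threshold are illustrative assumptions.
from statistics import mean, stdev


def flag_anomalies(history: dict[str, list[int]],
                   current: dict[str, int],
                   sigmas: float = 3.0) -> list[str]:
    """Return hosts whose current outbound bytes exceed mean + sigmas * stdev."""
    flagged = []
    for host, observed in current.items():
        samples = history.get(host, [])
        if len(samples) < 2:  # not enough history to judge this host
            continue
        if observed > mean(samples) + sigmas * stdev(samples):
            flagged.append(host)
    return flagged


if __name__ == "__main__":
    history = {"10.0.0.5": [120_000, 98_000, 110_000, 105_000]}
    current = {"10.0.0.5": 5_400_000}  # sudden large outbound transfer
    print(flag_anomalies(history, current))  # ['10.0.0.5']
```

Real deployments would baseline many more dimensions (ports, destinations, time of day), but the principle of comparing current behavior against learned norms is the same.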

After: Once the attack is over, cyber defenders must understand its nature and how to minimize any damage that may have occurred. Advanced forensics and assessment tools help security teams learn from attacks. Where did the attacker come from? How did they find a vulnerability in the network? Could anything have been done to prevent the breach? More important, retrospective security allows for an infrastructure that can continuously gather and analyze data to create security intelligence. Compromises that would have gone undetected for weeks or months can instead be identified, scoped, contained and remediated in real time or close to it.
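Retrospective security can be thought of as re-asking old questions with new knowledge. The hypothetical sketch below re-checks historical connection logs whenever a new indicator of compromise (IoC) is published, to scope which hosts touched a newly identified bad destination and when; the log format and IoC list are invented for illustration.

```python
# Sketch of retrospective security: when a new indicator of compromise (IoC)
# arrives, re-check historical connection logs to scope which hosts touched a
# destination now known to be bad, and when. Log format and IoCs are invented.
from datetime import datetime

CONNECTION_LOG = [
    # (timestamp, source host, destination IP)
    (datetime(2014, 6, 1, 9, 15), "hr-laptop", "203.0.113.50"),
    (datetime(2014, 6, 3, 22, 40), "web01", "198.51.100.7"),
]


def retrospective_matches(log, new_iocs: set[str]):
    """Yield past connections whose destination appears in the new IoC set."""
    for ts, src, dst in log:
        if dst in new_iocs:
            yield ts, src, dst


if __name__ == "__main__":
    freshly_published_iocs = {"203.0.113.50"}  # e.g., a newly reported C2 address
    for ts, src, dst in retrospective_matches(CONNECTION_LOG, freshly_published_iocs):
        print(f"{src} contacted {dst} on {ts:%Y-%m-%d %H:%M} - scope and contain")
```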

The two most important aspects of a defensive strategy, then, are understanding and intelligence. Cybersecurity teams are constantly trying to learn more about who their enemies are, why they are attacking and how. This is where the extended network provides unexpected value: delivering a depth of intelligence that cannot be attained anywhere else in the computing environment. Much like in counterterrorism, intelligence is key to stopping attacks before they happen.

Cybersecurity, as is sometimes the case in real-world warfare, is an asymmetric contest: relatively small adversaries with limited means can inflict disproportionate damage on much larger opponents. In these unbalanced situations, intelligence is one of the most important assets for addressing threats. But intelligence alone is of little benefit without an approach that optimizes its organizational and operational use.

Security teams can correlate identity and context by using network analysis techniques - such as collecting records of IP traffic as it enters or exits an interface - and then layering threat intelligence and analytics capabilities on top of that telemetry.

This allows security teams to combine what they learn from multiple sources of information to help identify and stop threats. Those sources include what they learn from the web, what they observe happening inside their own networks, and a growing amount of collaborative intelligence gleaned from exchanges with public and private entities.
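A minimal sketch of the correlation just described, under assumed data formats: NetFlow-style flow records are joined with user and identity context and checked against a threat-intelligence list, so an alert carries the "who" and "where" along with the raw traffic details. The field names, identity map and feed contents are hypothetical.

```python
# Sketch of the correlation described above: join NetFlow-style flow records
# with identity context and check destinations against a threat-intelligence
# list. Field names, the identity map and the feed are illustrative assumptions.

FLOWS = [
    # source IP, destination IP, destination port, bytes transferred
    {"src": "10.0.0.5", "dst": "198.51.100.7", "port": 443, "bytes": 2_000_000},
    {"src": "10.0.0.9", "dst": "192.0.2.10", "port": 80, "bytes": 4_500},
]

IDENTITY = {"10.0.0.5": "j.doe (Finance)", "10.0.0.9": "build-server"}
THREAT_FEED = {"198.51.100.7"}  # destinations currently reported as malicious


def enriched_alerts(flows, identity, feed):
    """Attach who/where context to any flow whose destination is on the feed."""
    for flow in flows:
        if flow["dst"] in feed:
            yield {"user": identity.get(flow["src"], "unknown"), **flow}


if __name__ == "__main__":
    for alert in enriched_alerts(FLOWS, IDENTITY, THREAT_FEED):
        print(alert)
```

The value comes from the join: a bare flow record says only that traffic happened, while the enriched alert tells responders whose machine to isolate.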

CryptoWall will eventually be defeated, but other ransomware programs and as-yet-unknown attacks will rise to threaten critical data. Effective cybersecurity requires an understanding of what assets need to be protected and an alignment of organizational priorities and capabilities. Essentially, a framework of this type enables security staff to think like malicious actors and therefore do a better job of securing their environments. The security team's own threat intelligence practice, uniting commercial threat information with native analysis of user behavior, will detect, defend against and remediate security events more rapidly and effectively than once thought possible.

More Stories By Greg Akers

Greg Akers is the Senior Vice President of Advanced Security Initiatives and Chief Technology Officer within the Threat Response, Intelligence and Development (TRIAD) group at Cisco. With more than two decades of executive experience, Akers brings a wide range of technical and security knowledge to his current role.
