By Greg Akers
July 27, 2014 02:00 PM EDT
Ransomware is the latest example of the increasingly sophisticated and damaging inventions of hackers. Individuals and organizations of all sizes are finding their data locked down or encrypted until a ransom is paid. One program, CryptoLocker, infected more than 300,000 computers before the FBI and international law enforcement agencies disabled it. A few days later, CryptoWall showed up to take its place. Companies paid $1.3 billion last year in insurance to help offset the costs of combating data attacks like these.
Other examples include highly customized malware, advanced persistent threats and large-scale Distributed Denial of Service (DDoS) attacks. Security professionals must remain ever vigilant to both known and new threats on the rise. However, with proper visibility into the extended network and robust intelligence, an attack can often be detected and stopped before it causes significant damage. By using the network to gain intelligence, cyber defenders can gain greater visibility of adversary actions and quickly shut them down.
Since an attack can be broken down into stages, it is helpful to think of a response to an attack in stages as well: before, during and after. This is standard operating procedure for anyone in the security profession. Let's examine each stage:
Before: Cyber defenders are constantly on the lookout for areas of vulnerability. Historically, security was all about defense. Today, teams are developing more intelligent methods of halting intruders. With total visibility into their environments - including, but not limited to, physical and virtual hosts, operating systems, applications, services, protocols, users, content and network behavior - defenders can take action before an attack has even begun.
During the attack, impact can be minimized if security staff understand what is happening and how to stop it as quickly as possible. They need to be able to address threats continuously, not just at a single point in time. Tools such as content inspection, behavioral anomaly detection, and contextual awareness of users, devices, locations and applications are critical to understanding an attack as it occurs. Security teams need to discover where, what and how users are connected to applications and resources.
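Behavioral anomaly detection of the kind described above often starts with something very simple: comparing a host's current activity against its own historical baseline. The sketch below is a minimal, hypothetical illustration of that idea - the threshold, the metric (hourly outbound connection counts) and the sample values are all assumptions, not a description of any particular product.

```python
from statistics import mean, stdev

def flag_anomaly(baseline_counts, current_count, threshold=3.0):
    """Flag a host whose current connection count deviates from its
    historical baseline by more than `threshold` standard deviations."""
    mu = mean(baseline_counts)
    sigma = stdev(baseline_counts)
    if sigma == 0:
        return current_count != mu
    return abs(current_count - mu) / sigma > threshold

# A host that normally makes ~100 outbound connections per hour
baseline = [98, 102, 97, 105, 100, 99, 101]
print(flag_anomaly(baseline, 104))   # normal variation -> False
print(flag_anomaly(baseline, 2500))  # sudden spike worth investigating -> True
```

Real systems weigh many signals at once (users, devices, locations, applications), but the principle is the same: context plus a baseline turns raw activity into something a defender can act on mid-attack.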
After the attack, cyber defenders must understand the nature of the attack and how to minimize any damage that may have occurred. Advanced forensics and assessment tools help security teams learn from attacks. Where did the attacker come from? How did they find a vulnerability in the network? Could anything have been done to prevent the breach? More important, retrospective security allows for an infrastructure that can continuously gather and analyze data to create security intelligence. Compromises that would have gone undetected for weeks or months can instead be identified, scoped, contained and remediated in real time or close to it.
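Retrospective security depends on keeping a record of past activity so that indicators learned later can be matched against it. The sketch below is a simplified illustration of that workflow; the log format, host names and IP addresses (drawn from documentation-reserved ranges) are hypothetical.

```python
# Hypothetical connection log: (timestamp, source_host, destination_ip)
connection_log = [
    ("2014-07-01T09:13:00", "workstation-17", "203.0.113.8"),
    ("2014-07-02T11:40:00", "workstation-04", "198.51.100.23"),
    ("2014-07-19T22:05:00", "workstation-17", "192.0.2.77"),
]

# Indicators learned after the fact, e.g. C2 servers from a new report
new_indicators = {"192.0.2.77", "203.0.113.8"}

def retrospective_matches(log, indicators):
    """Return past connections that match indicators learned later."""
    return [rec for rec in log if rec[2] in indicators]

for ts, host, dst in retrospective_matches(connection_log, new_indicators):
    print(f"{host} contacted {dst} at {ts} - scope and contain")
```

A compromise that produced no alert in July becomes visible the moment the new indicator arrives, which is what allows weeks-old intrusions to be identified, scoped and contained quickly.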
The two most important aspects of a defensive strategy, then, are understanding and intelligence. Cybersecurity teams are constantly trying to learn more about who their enemies are, why they are attacking and how. This is where the extended network provides unexpected value: delivering a depth of intelligence that cannot be attained anywhere else in the computing environment. Much like in counterterrorism, intelligence is key to stopping attacks before they happen.
In cybersecurity, as is sometimes the case in real-world warfare, damage is often disproportionate to the resources behind an attack. Relatively small adversaries with limited means can inflict outsized damage on much larger targets. In these unbalanced situations, intelligence is one of the most important assets for addressing threats. But intelligence alone is of little benefit without an approach that optimizes its organizational and operational use.
Security teams can correlate identity and context by using network analysis techniques - such as collecting records of IP traffic as it enters or exits an interface - and then layering threat intelligence and analytics on top. This allows them to combine what they learn from multiple sources of information to identify and stop threats: what they know from the Web, what they observe happening in the network, and a growing body of collaborative intelligence gleaned from exchanges with public and private entities.
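Combining flow records with an external indicator feed can be sketched in a few lines. The record shape, addresses and feed contents below are assumptions made for illustration, not a real exporter format or feed.

```python
from collections import defaultdict

# Simplified flow records as an exporter might emit them:
# (src_ip, dst_ip, bytes_transferred)
flows = [
    ("10.0.0.5", "198.51.100.9", 1200),
    ("10.0.0.5", "198.51.100.9", 340),
    ("10.0.0.8", "93.184.216.34", 560),
    ("10.0.0.5", "203.0.113.50", 20480),
]

# Indicators aggregated from commercial and shared intel feeds
threat_feed = {"203.0.113.50"}

bytes_by_src = defaultdict(int)   # per-host traffic volume (context)
alerts = []                       # flows touching known-bad destinations
for src, dst, nbytes in flows:
    bytes_by_src[src] += nbytes
    if dst in threat_feed:
        alerts.append((src, dst, nbytes))

print(dict(bytes_by_src))
print(alerts)  # internal hosts talking to known-bad destinations
```

The same pass over the data yields both context (who is talking, and how much) and intelligence matches (who is talking to something a feed has flagged), which is the correlation the text describes.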
CryptoWall will eventually be defeated, but other ransomware programs and as-yet-unknown attacks will rise to threaten critical data. Effective cybersecurity requires an understanding of what assets need to be protected and an alignment of organizational priorities and capabilities. Essentially, a framework of this type enables security staff to think like malicious actors and therefore do a better job of securing their environments. The security team's own threat intelligence practice, uniting commercial threat information with native analysis of user behavior, will detect, defend against and remediate security events more rapidly and effectively than once thought possible.