Addressing the Root Cause – A Proactive Approach to Securing Desktops

The computers on your network are protected from malware, right? If you operate an environment based largely on Windows PCs, you likely have some kind of anti-virus installed and centrally managed. If you have purchased a more complete desktop protection suite, you probably even have a host-based IDS/IPS protecting your machine from incoming malicious TCP scans or outbound connections to known malicious sites (like google.com, occasionally). Operating system firewall activated? Yep! AV signatures current? Check! Global threat intelligence updated? Uh, yeah... sure. Then you should be covered against threats targeting your organization, right? Most likely not, and at times these tools actually mask intrusions because they provide a false sense of security and protection.

The Trouble with Reactionary Behavior
The problem with these tools, all of them, is that they are purely reactionary in nature. A reactionary protection tool, at any level, only activates once an event has already occurred on your host computer. That means when you get an antivirus alert on your computer, the malware is ALREADY present on the system. Yes, the AV may have stopped it, deleted it, or quarantined it (all of which are good), but it has only done so because it either has an existing signature in its database or the malware attempted to operate in a suspicious manner, triggering the AV's heuristic detection. What about when brand-new malware, 0-day exploits, or sophisticated targeted malware executes on your host?

Do you imagine your AV will detect and mitigate it? I would suggest that your AV will be none the wiser to the presence of this yet-to-be-detected threat, and only once it has been submitted to an AV vendor for analysis will you be provided with an updated signature. Well, surely if my AV missed it, one of the other layers of protection should stop it, right? Possibly: if the malware uses outbound connections that aren't considered "normal" by your OS firewall or HIDS/HIPS software, it could be detected. But if the malware uses standard outbound connections, port 80 or, more likely, port 443, its traffic appears "normal" to the other layers of your system's host-based defenses.

These tools all require some known characteristics of a particular threat in order to detect its presence and mitigate it. Those characteristics are obtained through analysis of reported and discovered threats of a similar nature, which are used to develop signatures or heuristic models for detecting malware on a host. If a threat has not yet been submitted for analysis and its callback domains have not been reported as malicious, it may be a while before it is "discovered" and signatures are made available. Until that time, your computer, its files, all of your activities, and the other computers on your network are at the mercy of the attacker, unabated.

Being Proactive Is Essentially Free
This is the part that is really frustrating for me as an analyst, and also as an advocate for root-cause solutions. Reactionary defenses cost an unreal amount of money for consumers, businesses, state and local governments, federal agencies, and the military. You would think that with all of this time and money spent on the various products billed as "protecting" you from cyber threats and intrusions, your environment would be better protected, whether it is an enterprise or a single computer. This is not the case. In fact, many studies show computer-related intrusions are on the rise. Nation-state threats, advanced persistent threats (APT), and even less skilled hackers continue to improve their sophistication as tools get cheaper and information is freely exchanged. Why is it, then, that I say proactive defenses are essentially free? And if that is in fact the case, why are they not used more frequently? Proactive defense measures are essentially free, minus the time and effort spent securing the root problems within your network. For this particular blog post, I am focused on host-based proactive defensive measures.

Denying Execution at the Directory Level
The "how" is actually quite simple to explain, and in fact it is not a new protection technique at all, its just not as widely used outside of *nix based systems. All that an operating system provides is a platform for applications to run on, sometimes graphical based, sometimes a simple command line. The applications are typically stored in a common location within the operating system, allowing for dynamic linking as well as simplifying the directory structure. Not all applications require the need for linking to a dynamic library as they contain all of the requirements to run on their own, so they can easily be placed anywhere within the OS and they will execute.

This is extremely convenient when a developer wants to provide software that doesn't need to be officially "installed" and can be easily moved around. Therein lies the issue with these "self-contained" applications: they can execute from anywhere on the host, without restriction. For a demonstration of this, copy "calc.exe" from the "System32" folder on your Windows PC to your desktop. The program "calc.exe" will execute just the same as if it were under "System32", because it is a completely self-contained binary. Almost all malware is designed the same way, and typically executes from a "temp" location or the root of the currently logged-in user's profile directory. The execution of malware needs to be stopped from occurring in the first place. That way, regardless of your current AV signatures or HIDS/HIPS capabilities, the malware cannot run. If the malware is unable to run, the threat is effectively mitigated before it can gain any foothold.
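If you want to reproduce that demonstration programmatically, here is a minimal sketch in Python, assuming a standard Windows install where calc.exe lives under System32 and the current user has a Desktop folder; the paths are illustrative only.

```python
# Illustrative sketch of the calc.exe demonstration above (Windows only).
# Assumes calc.exe exists in System32 and the user profile has a Desktop folder.
import os
import shutil
import subprocess

src = r"C:\Windows\System32\calc.exe"
dst = os.path.join(os.path.expanduser("~"), "Desktop", "calc.exe")

shutil.copy(src, dst)      # copy the self-contained binary to the desktop
subprocess.Popen([dst])    # it launches exactly as it would from System32
```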

So how on earth do you stop malware from executing from within these locations, and do you need some kind of agent-based solution monitoring those particular directories? The approach is simple: deny ALL execution of programs outside of a set of approved directories (e.g., "Program Files" and "System32"), and require every application the host needs, PuTTY for instance, to be placed within one of those approved directories. If you are running a Windows-based environment, locking down execution outside of approved directories can be implemented through both Group Policy (GPO) and Local Policy.
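To make the logic concrete, here is a minimal sketch in Python of the decision such a policy encodes. This is a model of the allow-list check only, not the Windows Group Policy mechanism itself, and the approved directories listed are examples.

```python
# Minimal model of directory-based execution allow-listing (illustrative only).
import os

# Example approved directories -- in practice these would come from GPO/Local Policy.
APPROVED_DIRS = [
    r"C:\Program Files",
    r"C:\Program Files (x86)",
    r"C:\Windows\System32",
]

def execution_allowed(exe_path: str) -> bool:
    """Allow execution only if exe_path resolves to a location under an approved directory."""
    real = os.path.realpath(exe_path)
    for approved in APPROVED_DIRS:
        base = os.path.realpath(approved)
        try:
            if os.path.commonpath([real, base]) == base:
                return True
        except ValueError:
            # Paths on different drives share no common path; treat as not approved.
            continue
    return False

# A binary dropped into a temp directory is denied; a System32 binary is allowed.
print(execution_allowed(r"C:\Users\alice\AppData\Local\Temp\payload.exe"))  # False
print(execution_allowed(r"C:\Windows\System32\calc.exe"))                   # True
```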

By expanding on an existing Windows feature, Software Restriction Policies (which have been around since 2002, by the way), you can define the directories from which applications are allowed to execute. The same technique can be employed on OS X as well: simply remove the execute privilege from locations within the OS that you would like to protect. In fact, I would venture to say it is easiest to implement on any *nix-based system (if it isn't in place already, as is the case on most Unix/Linux flavors).
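On the *nix side, here is a minimal sketch of what "removing the execute privilege" might look like in Python. The target directories are hypothetical examples, and in practice mounting a location such as /tmp with the noexec option is often the more robust choice.

```python
# Illustrative sketch: strip execute bits from files under directories
# that should never host runnable binaries (example paths only).
import os
import stat

PROTECTED_DIRS = ["/tmp", os.path.expanduser("~/Downloads")]
EXEC_BITS = stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH

def strip_execute(root: str) -> None:
    """Walk root and clear user/group/other execute bits on every file."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
                if mode & EXEC_BITS:
                    os.chmod(path, mode & ~EXEC_BITS)
            except OSError:
                continue  # skip files that vanish or are unreadable

if __name__ == "__main__":
    for d in PROTECTED_DIRS:
        if os.path.isdir(d):
            strip_execute(d)
```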

No Silver Bullet
No solution is 100% effective, and this is no exception; there are a number of ways to get past this protection. Having said that, it adds a layer to your defense and will stop the majority of execution-based attacks. If your software is properly patched (0-days excluded) and user privileges are locked down with separate, dedicated accounts, directory protection further steps up the difficulty attackers face in gaining a presence on your network. No single solution will solve all of your problems, no matter how hard a vendor sales engineer tries to convince you otherwise. Holistic, full-spectrum defenses are the future, not "plug & play" protection hardware or software that requires updates, patching, signatures, and "threat intelligence". The other extremely important layer of protection is the infosec professionals supporting you. Spend the money on good, talented, well-rounded security professionals who understand the cyber threat landscape and the ways in which they can help better protect your organization.

To learn more about how your network and its assets can be better protected, please check out CyberSquared for solutions to root-cause issues.

More Stories By Cory Marchand

Cory Marchand is a trusted subject matter expert on cyber security threats, network and host-based assessment, and computer forensics. Mr. Marchand has supported numerous customers over his 10+ years in the field of computer security, including state, federal and military government as well as the private sector. Mr. Marchand holds several industry certifications, including CISSP, EnCE, GSEC, GCIA, GCIH, GREM, GSNA and CEH.
