
Cloud Security: Blog Post

Addressing the Root Cause – A Proactive Approach to Securing Desktops

The computers on your network are protected from malware, right? If you are operating an environment based largely on Windows PCs, you likely have some kind of antivirus installed and centrally managed. If you have purchased a more complete desktop protection suite, you probably even have a host-based IDS/IPS protecting your machines from incoming malicious TCP scans or outbound connections to known malicious sites (google.com occasionally makes that list). Operating system firewall activated? Yep! AV signatures current? Check! Global threat intelligence updated? Uh, yeah... sure. Then you should be covered against threats targeting your organization, right? Most likely not. At times these tools actually mask intrusions, because they provide a false sense of security and protection.

The Trouble with Reactionary Behavior
The problem with all of these tools is that they are purely reactionary in nature. A reactionary protection tool, at any layer, only activates after an event has already occurred on your host. That means when you get an antivirus alert on your computer, the malware is ALREADY present on the system. Yes, the AV may have stopped it, deleted it, or quarantined it (all of which are good), but it has only done so because it either has an existing signature in its database or the malware attempted to operate in a suspicious manner, tripping the AV's heuristic detection. What about when brand-new malware, 0-day exploits, or sophisticated targeted malware executes on your host?

Do you imagine your AV will detect and mitigate it? I would suggest that your AV will be none the wiser to this yet-to-be-detected threat, and only once a sample has been submitted to an AV vendor for analysis will you be provided with an updated signature. Well, surely if my AV missed it, one of the other layers of protection should stop it, right? Possibly. If the malware uses outbound connections that aren't considered "normal" by your OS firewall or HIDS/HIPS software, it could be detected. But if the malware uses standard outbound connections, port 80 or, more likely, port 443, its traffic appears "normal" to the other host-based defenses you have in place.

These tools all require known characteristics of a particular threat in order to detect its presence and mitigate it. Those characteristics are obtained by analyzing previously reported and discovered threats of a similar nature, and they are used to develop the signatures and heuristic models that detect malware on a host. If a threat has not yet been submitted for analysis and its callback domains have not been reported as malicious, it may be a while before it is "discovered" and signatures are made available. Until then, your computer, its files, all of your activities, and the other computers on your network are at the mercy of the attacker, unabated.

Being Proactive Is Essentially Free
This is the part that is really frustrating for me as an analyst, and as an advocate for root-cause solutions. Reactionary defenses cost an unreal amount of money for consumers, businesses, state and local governments, federal agencies, and the military. You would think that with all of this time and money spent on products billed as "protecting" you from cyber threats and intrusions, your environment would be better protected, whether it is an enterprise or a single computer. It is not. In fact, many studies show computer-related intrusions are on the rise. Nation-state threats, advanced persistent threats (APTs), and even less skilled hackers continue to improve their sophistication as tools get cheaper and information is freely exchanged. Why is it, then, that I say proactive defenses are essentially free? And if that is the case, why are they not used more frequently? Proactive defense measures cost nothing beyond the time and effort spent securing the root problems within your network. For this particular blog post, I am focused on host-based proactive defensive measures.

Denying Execution at the Directory Level
The "how" is actually quite simple to explain, and in fact it is not a new protection technique at all; it is just not widely used outside of *nix-based systems. An operating system is, at its core, a platform for applications to run on, sometimes graphical, sometimes a simple command line. Applications are typically stored in a common location within the operating system, which allows for dynamic linking and keeps the directory structure simple. But not all applications need to link against a dynamic library; some contain everything they need to run on their own, so they can be placed anywhere within the OS and they will execute.

This is extremely convenient when a developer wants to provide software that doesn't need an official "install" and can be easily moved around. Therein lies the issue with these "self-contained" applications: they can execute from anywhere on the host, without restriction. For a demonstration, copy "calc.exe" from the "system32" folder on your Windows PC to your desktop. It will run exactly as it does under "system32", because it is a completely self-contained binary. Almost all malware is designed the same way, and it typically executes from a "temp" location or the root of the currently logged-in user's directory. The execution of malware needs to be stopped from occurring in the first place; that way, regardless of your current AV signatures or HIDS/HIPS capabilities, the malware cannot run. If the malware is unable to run, the threat is effectively mitigated before it can gain any foothold.
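The same demonstration works on a *nix host (the file name below is an arbitrary placeholder; paths assume a standard Linux layout): a self-contained binary copied out of its system directory runs just the same.

```shell
# Copy a self-contained system binary somewhere else entirely.
cp /bin/ls /tmp/innocent-looking-file

# It executes from the new location without complaint -- nothing
# ties a self-contained binary to the directory it shipped in.
/tmp/innocent-looking-file /tmp > /dev/null && echo "ran from /tmp"
```

This is exactly the property malware relies on when it drops an executable into a temp or user directory and runs it.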

So how on earth do you stop malware from executing from these locations? Do you need some kind of agent-based solution monitoring those directories? The approach is simple: deny ALL execution of programs outside of a set of approved directories (e.g., "Program Files" and "System32"), and require every application the host needs, PuTTY for instance, to live within one of those approved directories. If you are running a Windows-based environment, locking down execution outside of approved directories can be implemented through both Group Policy (GPO) and Local Policy.
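As a rough sketch of what such a local policy looks like on disk (the GUIDs below are arbitrary placeholders, and in practice you would build this through the Group Policy editor and test it in a lab first), Software Restriction Policies store a default security level plus per-path rules in the registry. Note that a real deployment also needs allow rules for the Windows directory itself, or the OS will not boot cleanly:

```
Windows Registry Editor Version 5.00

; Default rule: Disallowed (0) -- nothing executes unless explicitly allowed.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers]
"DefaultLevel"=dword:00000000
"TransparentEnabled"=dword:00000001
"PolicyScope"=dword:00000000

; Unrestricted (262144 = 0x00040000) path rules for the approved directories.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers\262144\Paths\{11111111-1111-1111-1111-111111111111}]
"ItemData"="C:\\Program Files"
"SaferFlags"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers\262144\Paths\{22222222-2222-2222-2222-222222222222}]
"ItemData"="C:\\Windows"
"SaferFlags"=dword:00000000
```

With this in place, an executable dropped into a temp folder or a user profile simply refuses to launch, no signatures required.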

By building on an existing Windows feature called Software Restriction Policies (which has been around since 2002, by the way), you can define the directories from which applications are allowed to execute. The same technique can be employed on OS X as well: simply remove the execute privilege from the locations within the OS that you would like to protect. In fact, I would venture to say it is easiest to implement on any *nix-based system, if it isn't in place already, as it is on most Unix/Linux flavors.
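On a *nix host the proof of concept is a few shell commands (the script name and location are illustrative): once the execute privilege is gone, the kernel simply refuses to run the file, regardless of what any AV thinks of it.

```shell
# Drop a throwaway "application" into a world-writable location.
printf '#!/bin/sh\necho pwned\n' > /tmp/dropper.sh
chmod +x /tmp/dropper.sh
/tmp/dropper.sh                               # prints "pwned"

# Remove the execute privilege; the same file can no longer run.
chmod -x /tmp/dropper.sh
/tmp/dropper.sh || echo "execution denied"    # prints "execution denied"

# Mounting locations like /tmp with the noexec option enforces the
# same restriction across an entire filesystem.
```

The per-file approach shown here is fragile on its own (malware can chmod its own dropper); mount-level noexec on temp and user-writable filesystems is the more durable form of the same idea.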

No Silver Bullet
No solution is 100% effective, and this is no exception; there are a number of ways to get past this protection. That said, it adds a layer to your defense and will stop the majority of execution-based attacks. If your software is properly patched (0-days excluded) and user privileges are locked down with separate dedicated accounts, directory protection further raises the difficulty attackers face in gaining a presence on your network. No single solution will solve all of your problems, no matter what a vendor sales engineer tries to sell you. Holistic, full-spectrum defenses are the future, not "plug & play" protection hardware or software that requires updates, patching, signatures, and "threat intelligence." The other extremely important layer of protection is the infosec professionals supporting you. Spend the money on good, talented, well-rounded security professionals who understand the cyber threat landscape and the ways they can help better protect your organization.

To learn more about how your network and its assets can be better protected, please check out CyberSquared for solutions to root-cause issues.

More Stories By Cory Marchand

Cory Marchand is a trusted subject matter expert on cyber security threats, network- and host-based assessment, and computer forensics. Over his 10+ years in computer security, Mr. Marchand has supported state, federal, and military government customers as well as the private sector. He holds several industry certifications, including CISSP, EnCE, GSEC, GCIA, GCIH, GREM, GSNA, and CEH.
