By Lori MacVittie
February 22, 2012 09:56 AM EST
A multitude of security-related solutions rely on the ability to extract and examine MIME objects from web content. HTML5 may significantly impair their ability to do so.
The trade-off between security and performance has long been a known issue across IT organizations. One of the first things to go when performance is unacceptable is a security solution. This isn't just an IT phenomenon, either; consider how many of us have disabled endpoint security solutions such as anti-virus scanners to improve performance.
Our refusal to be slowed down by what may seem to some as extraneous security is what eventually led IT security professionals to revise their strategies and enforce such scans on inbound content in the network. Network-attached security scanning solutions have long been a staple of inbound e-mail and have found increasing use as a means to scan inbound web content as well, in an attempt to keep potential malware from gaining access to the corporate network.
A [at the time of publication, July 2011] survey of 487 IT professionals conducted by Crossbeam, a provider of high-performance security gateways, finds that 91 percent of respondents were making trade-offs between security and performance, and that a full 81 percent were actually disabling security features.
HTML – and soon, if we believe the predictions, HTML5 – is the lingua franca of Internet communication. Oh, applications may speak JSON under the covers, but in the end it's just data to be displayed to the user, which means HTML(5).
What does that mean for anti-virus and malware web scanners? Well, if one of the features of HTML5 being leveraged is WebSockets, a lot. Otherwise, not much. At least not yet.
You see, WebSockets inadvertently trades security for performance.
One of the things WebSockets does to dramatically improve performance is eliminate all those pesky HTTP headers. You know, things like CONTENT-TYPE. You know, the header that tells the endpoint what kind of content is being transferred, such as text/html or video/avi. One of the things anti-virus and malware scanning solutions are very good at is detecting anomalies in specific types of content. The problem is that without a MIME type, the ability to correctly identify a given object gets a bit iffy. Bits and bytes are bits and bytes, and while you could certainly infer the type based on format "tells" within the actual data, how would you really know? Sure, the HTTP headers could be lying, but generally speaking the application serving the object doesn't lie about the type of data, and it is a rare vulnerability that attempts to manipulate that value. After all, you want a malicious payload delivered via a specific medium, because that's the cornerstone upon which many exploits are based – execution of a specific operation against a specific manipulated payload. That means you really need the endpoint to believe the content is of the type it thinks it is.
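To see why type inference from raw bytes is "a bit iffy", consider a minimal sketch of magic-byte content sniffing – the kind of guessing a scanner is forced into when no Content-Type header exists. The signature table below is illustrative, not exhaustive; real scanners use far deeper heuristics.

```python
# Illustrative "magic byte" signatures mapping leading bytes to MIME types.
MAGIC_SIGNATURES = [
    (b"\x89PNG\r\n\x1a\n", "image/png"),
    (b"GIF87a", "image/gif"),
    (b"GIF89a", "image/gif"),
    (b"%PDF-", "application/pdf"),
    (b"PK\x03\x04", "application/zip"),   # ambiguous: also docx, xlsx, jar ...
    (b"MZ", "application/x-msdownload"),  # Windows PE executable
]

def sniff_mime(payload: bytes) -> str:
    """Guess a MIME type from leading bytes; fall back to octet-stream."""
    for magic, mime in MAGIC_SIGNATURES:
        if payload.startswith(magic):
            return mime
    return "application/octet-stream"
```

Note that the ZIP signature alone can't distinguish an archive from an Office document or a Java JAR – which is exactly the "how would you really know?" problem.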
But couldn’t you just use the URL? Nope – there is no URL associated with objects via a WebSocket. There is also no standard application information that next-generation firewalls can use to differentiate the content; developers are free to innovate and create their own formats and micro-formats, and undoubtedly will. And trying to prevent its use is nigh-unto impossible because of the way in which the upgrade handshake is performed – it’s all over HTTP, and stays HTTP. One minute the session is talking understandable HTTP, the next they’re whispering in Lakota, a traditionally oral-only language which neatly illustrates the overarching point of this post thus far: there’s no way to confidently know what is being passed over a WebSocket unless you “speak” the language used, which you may or may not have access to.
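The handshake really is plain HTTP right up until the 101 Switching Protocols response, after which the same TCP connection carries opaque WebSocket frames. A small sketch of the RFC 6455 key exchange makes the point – the GUID below is fixed by the spec, and the sample key/accept pair comes from RFC 6455 itself:

```python
# Sketch of the RFC 6455 opening handshake math: the client sends a random
# Sec-WebSocket-Key header; the server proves it understood the upgrade by
# returning SHA-1(key + GUID), base64-encoded, as Sec-WebSocket-Accept.
import base64
import hashlib

WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"  # fixed constant from RFC 6455

def accept_key(client_key: str) -> str:
    """Derive the Sec-WebSocket-Accept value a server must return."""
    digest = hashlib.sha1((client_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Example exchange from RFC 6455:
#   client: Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
#   server: Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

Nothing in this exchange looks anomalous to an HTTP-aware middlebox, which is why blocking WebSockets outright is so awkward.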
The result of all this confusion is that security software designed to scan for specific signatures or anomalies within specific types of content can’t. They can’t extract the object flowing through a WebSocket because there’s no indication of where it begins or ends, or even what it is. The loss of HTTP headers that indicate not only type but length is problematic for any software – or hardware for that matter – that uses the information contained within to extract and process the data.
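A minimal frame-header parse, sketched from the RFC 6455 wire format, shows just how little a middlebox learns: an opcode that says only "text" or "binary" and a per-frame payload length – no MIME type, and no object boundaries when a message is fragmented across frames.

```python
# Sketch of parsing one WebSocket frame header per RFC 6455.
import struct

OPCODES = {0x0: "continuation", 0x1: "text", 0x2: "binary",
           0x8: "close", 0x9: "ping", 0xA: "pong"}

def parse_frame_header(data: bytes):
    """Return (fin, opcode_name, payload_len, header_len) for one frame."""
    b0, b1 = data[0], data[1]
    fin = bool(b0 & 0x80)                  # final fragment of a message?
    opcode = OPCODES.get(b0 & 0x0F, "reserved")
    masked = bool(b1 & 0x80)               # client-to-server frames are masked
    length = b1 & 0x7F
    offset = 2
    if length == 126:                      # 16-bit extended payload length
        (length,) = struct.unpack(">H", data[2:4]); offset = 4
    elif length == 127:                    # 64-bit extended payload length
        (length,) = struct.unpack(">Q", data[2:10]); offset = 10
    if masked:
        offset += 4                        # skip the 4-byte masking key
    return fin, opcode, length, offset
```

That opcode is the sum total of the type information on the wire; everything inside the payload is whatever format the application invented.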
Wedge Networks, whose name you may never have heard even though your content might have been scrubbed by their devices without your knowing it, has a solution to the problem of disaggregating web objects without requiring specific identification by HTTP headers, thus solving this problem and several similar ones where protocols lack the means to definitively identify specific content by type.
The WedgeOS Network Data Processor ("NDP") is the proprietary architecture that allows content inspection at Gigabit speeds without impacting network performance. The WedgeOS NDP architecture revolutionized Web Security Appliances with the introduction of BeSecure. BeSecure is capable of intercepting and actively scanning all internet traffic for malicious content as it enters the network.
What they meant to say was "we do deep content inspection on streaming traffic and are able to accurately identify – and subsequently extract – MIME objects at line rate and then scan them for bad stuff you don't want on your network." Content comes into their device (and it's off-the-shelf hardware, I'm told), MIME objects are disaggregated regardless of transport or application protocol, shoved down a high-speed internal bus into which are plugged a variety of security scanning functions, and then shoved back out the other side, assuming all was well. Policies determine exactly what happens when anomalies or malicious code are discovered.
Wedge Networks has partnered with a number of well-known, industry-leading security scanning solutions and brought them together into a single device. Applying the old "crack the packet only once" doctrine, the device is able to perform its scans as fast as objects can traverse its internal bus.
The device deploys in either proxy or transparent mode, with the latter being most popular simply because it mitigates the disruption that can come with inserting a proxy-based solution into an established network.
Let's assume for a moment that a Wedge Networks device really does accomplish all this – at line rate. I can't know – I don't evaluate products in lab environments any more – so I'll take their word for it. But let's assume it does. That opens a wide variety of possibilities – both inbound and outbound – for protecting web applications and customers alike, and not just for HTML5.
Assuming no degradation of overall performance, the ability to detect and prevent delivery of malware that may have been surgically inserted into your database or CMS via XSS or SQLi would be a boon, if only to let you know it happened much sooner and provide the time necessary to remediate the infection. Nearly every rational organization scans inbound e-mail for potential risks, but very few (if any) scan outbound. We all know why – the belief that performance is more important than security, especially when consumer dollars are on the line. If Wedge Networks can do as it promises and not impede performance while still providing a valuable security service, well, that might be something to think about.