
BYOD and the Death of the DMZ

It's context that counts, not corporate connections

BYOD remains a topic of interest as organizations grapple not only technologically with the trend but politically, as well. There are dire warnings that refusing to support BYOD will result in an inability to attract and retain up and coming technologists, that ignoring the problems associated with BYOD will eventually result in some sort of karmic IT event that will be painful for all involved.

Surveys continue to tell us organizations cannot ignore BYOD. A recent ITIC survey indicated a high level of BYOD across the 550 companies polled worldwide.

51% of workers utilize smart phones as their BYOD devices; another 44% use notebooks and ultrabooks, while 31% of respondents indicated they use tablets (most notably the Apple iPad) and 23% use home-based desktop PCs or Macs.

It's here, it's now, and it's in the data center. The question is no longer "will you allow it" but "how will you secure/manage/support it"? It's that first piece – secure it – that's causing some chaos and confusion. Just as we discovered with cloud computing early on, responsibility for anything shared is muddled. When asked who should bear responsibility for the security of devices in BYOD situations, respondents offered a nearly equal split between company (37%) and end-user (39%), with 21% stating it fell equally on both.

From an IT security perspective, this is not a bad split. Employees should be active participants in organizational security. Knowing is, as G.I. Joe says, half the battle, and if employees bringing their own devices to work are informed and understand the risks, they can actively participate in improving security practices and processes.

But relying on end-users for organizational security would be folly, and thus IT must take responsibility for the technological enforcement of security policies developed in conjunction with the business.

One of the first and most important things we must do to enable better security in a BYOD (and cloudy) world is to kill the DMZ.

[Pause for apoplectic fits]

By "kill the DMZ" I don't mean physically dismantling the underlying network architecture supporting it – I mean logically. The DMZ was developed as a barrier between the scary and dangerous Internet and sensitive corporate data and applications. That barrier must now extend inside the data center, to the LAN, where the assumption has long been that devices and users accessing data center resources are inherently safe.

They are not (probably never have been, really).

Every connection, every request, every attempt to access an application or data within the data center must be treated as suspect, regardless of where it may have originated and without automatically giving certain devices privileges over others. A laptop on the LAN may or may not be BYOD, it may or may not be secure, it may or may not be infected. A laptop on the LAN is no more innately safe than a tablet or a smart phone.


This is where the concept of a strategic point of control comes in handy. If every end-user is funneled through the same logical tier in the data center regardless of network origination, policies can be centrally deployed and enforced to ensure appropriate levels of access based on the security profile of the device and user.


By applying access control consistently across all devices, regardless of who purchased and manages them, policies can be tailored to focus on the application and the data, not solely on the point of origination.

While policies may trigger specific rules or inspections based on device or originating location, ultimately the question is: who can access a given application and its data, and under what circumstances? It's context that counts, not corporate connections.
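
To make the idea concrete, here is a minimal sketch in Python of what such a context-driven decision might look like. It is illustrative only – the names (AccessContext, ALLOWED_ROLES, the posture flag) are hypothetical, not any particular product's API – but it captures the point: the decision turns on the user, the application, and the device's security profile, while the originating network is recorded and never used as a shortcut to trust.

```python
# Illustrative sketch only -- names and fields here are hypothetical,
# not any particular product's API or policy language.
from dataclasses import dataclass

@dataclass
class AccessContext:
    user: str             # authenticated identity, not a source address
    roles: set            # roles resolved from a directory lookup
    device_managed: bool  # corporate-managed vs. BYOD
    posture_ok: bool      # result of a device health / posture check
    source_network: str   # "lan" or "wan" -- recorded, never trusted
    application: str      # the application or data being requested

# Who may reach which application -- defined with the business, per app.
ALLOWED_ROLES = {
    "hr-portal": {"hr", "manager"},
    "finance-reports": {"finance"},
}

def evaluate_policy(ctx: AccessContext) -> str:
    """Central decision point: every request, LAN or WAN, lands here."""
    # First question: who is asking, and for what application and data?
    if not ctx.roles & ALLOWED_ROLES.get(ctx.application, set()):
        return "deny"
    # Circumstances: an unmanaged or unhealthy device can trigger extra
    # steps, but device type never substitutes for the role check above.
    if not ctx.device_managed or not ctx.posture_ok:
        return "step-up-auth"
    # Deliberately no shortcut like: if ctx.source_network == "lan": allow.
    return "allow"

# Example: a BYOD device on the corporate LAN still gets challenged.
print(evaluate_policy(AccessContext(
    user="alice", roles={"hr"}, device_managed=False, posture_ok=True,
    source_network="lan", application="hr-portal")))  # -> step-up-auth
```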

That question must be asked regardless of whether the access attempt begins within the corporate network boundaries or not. Traffic coming from the local LAN should not be treated any differently than traffic entering via the WAN. The notion of "trusted" and "untrusted" network connectivity has simply been obviated by the elimination of wires and the rampant proliferation of malware and other destructive digital infections.

In essence, the DMZ is being – and must be – transformed. It's no longer a zone of inherent distrust between the corporate network and the Internet; it's a zone of inherent distrust between corporate resources and everything else. Its design and deployment as a buffer is still relevant, but only in the sense that it stands between critical assets and access by hook, crook, or tablet.

The DMZ as we have known it is dead.

Trust no one.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
