By Lori MacVittie
November 28, 2012 08:45 AM EST
It is quite difficult to find any environment that is truly homogeneous today. Even home networks, which at one time may have been "all Windows" or "all Linux," are today a mixture of mobile devices and Apple and Windows-based products. As networks grow, the heterogeneity of their infrastructure increases, with a variety of switches, servers, and platforms adding to the mix of end-user computing devices.
Virtualization is no different. As the uses for virtualization technology continue to expand to various application functions, the heterogeneity of the virtual platforms in use is also growing. The 2013 Virtualization Management Survey from InformationWeek indicated that over a third of respondents have more than one hypervisor in use today. Within the next two years, that number is expected to rise to nearly half of all organizations leveraging multiple hypervisor technologies.
The reasons for that divergence are varied, the same survey shows. Some organizations use different virtualization vendors for server virtualization than for desktop virtualization; others cite compatibility with legacy hardware as a reason for using multiple hypervisors, and so on.
Whatever the reason, the trend seems clear: heterogeneous virtualization is increasing, and eventually that divergence will have an impact on network infrastructure.
With multiple hypervisors in use, it is increasingly important that the network infrastructure be equally capable of providing the appropriate services for every hypervisor in use. This is particularly true as we begin to see increased adoption of SDN-related protocols such as NVGRE and VXLAN to virtualize the network. Protocol incompatibilities will become problematic for organizations attempting to leverage such technology in conjunction with multiple hypervisors, because no single protocol enjoys ubiquitous support. Either organizations must standardize across inter-dependent and integrated services to ensure compatibility at the network layer, or they must plan to put protocol transition capabilities into the network itself, much as "gateways" are used in IPv4-to-IPv6 transitional architectures.
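To make the gateway idea concrete: both VXLAN (RFC 7348) and NVGRE (RFC 7637) carry a 24-bit tenant identifier (the VNI and VSID, respectively), so a translation function in the network can, in principle, map one encapsulation header onto the other. The sketch below is illustrative only, not any vendor's implementation; it handles just the overlay headers and assumes a straight one-to-one VNI-to-VSID mapping, ignoring the outer IP/UDP and inner Ethernet framing a real gateway must also rewrite.

```python
import struct

VXLAN_FLAG_VNI_VALID = 0x08   # RFC 7348 "I" flag: VNI field is valid
NVGRE_GRE_FLAGS = 0x2000      # GRE flags word with Key-present (K) bit set
NVGRE_PROTO_TEB = 0x6558      # EtherType: Transparent Ethernet Bridging

def parse_vxlan_header(hdr: bytes) -> int:
    """Extract the 24-bit VNI from an 8-byte VXLAN header."""
    flags, vni_word = struct.unpack("!B3xI", hdr[:8])
    if not flags & VXLAN_FLAG_VNI_VALID:
        raise ValueError("VXLAN header has no valid VNI")
    return vni_word >> 8      # top 24 bits of the last word carry the VNI

def build_nvgre_header(vsid: int, flow_id: int = 0) -> bytes:
    """Build an 8-byte NVGRE header carrying a 24-bit VSID + 8-bit FlowID."""
    return struct.pack("!HHI", NVGRE_GRE_FLAGS, NVGRE_PROTO_TEB,
                       (vsid << 8) | flow_id)

def translate_vxlan_to_nvgre(vxlan_hdr: bytes) -> bytes:
    """Map a VXLAN VNI one-to-one onto an NVGRE VSID (both are 24 bits)."""
    return build_nvgre_header(parse_vxlan_header(vxlan_hdr))
```

Because the tenant-ID fields are the same width, the mapping itself is trivial; the operational burden of such a gateway lies in everything around it: MTU handling, hardware offload, and keeping the mapping state consistent across devices.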
Moreover, virtualization of the network through SDN-related protocols will be confined primarily to the data center. Users accessing those services will not be doing so via these protocols, making the two networks nearly incompatible. A bridge between the traditional and virtual networks must exist, and it must be agnostic to ensure seamless communication.
Virtualized solution architectures such as those associated with VDI that require specific network infrastructure will also pose problems for organizations, as they limit the ability to share the cost of infrastructure across multiple projects and applications. Agnostic infrastructure, capable of providing equal levels of delivery, security, and availability to any IP-based application – including those virtualized – will offer a more cost-effective and operationally consistent approach to managing a heterogeneous virtual infrastructure.
What would be detrimental to the adoption and advancement of virtualization in the data center – whether in the application or network infrastructure – is to create even more complex networks consisting of multiple paths that are dependent on a given hypervisor and SDN-related technology. The disruption of the network and application services should be kept to a minimum and the network architecture kept as simple as possible to avoid the introduction of additional (and unnecessary) points of failure.
Leveraging a strategic point of control within the network – agnostic infrastructure capable of bridging not only traditional and virtual networks but also competing virtual network implementations – will eliminate complexity while offering the operational consistency that reduces both cost and risk.
Fragmentation of virtualization in the application infrastructure should not result in fragmentation of the network.