Virtualization Will Be the Highest-Impact IT Trend Through 2012

Virtualization will transform how IT is managed, what is bought, how it is deployed, how companies plan and how they are charged

Virtualization will be the highest-impact trend changing infrastructure and operations through 2012, according to Gartner. Virtualization will transform how IT is managed, what is bought, how it is deployed, how companies plan and how they are charged. As a result, virtualization is creating a new wave of competition among infrastructure vendors that will result in considerable market disruption and consolidation over the next few years.

"Virtualization is hardly a new concept; storage has already been virtualized — albeit primarily within the scope of individual vendor architectures — and networking is also virtualized," said Philip Dawson, vice president and distinguished analyst at Gartner. "However, as both server and PC virtualization become more pervasive, traditional IT infrastructure orthodoxy is being challenged and is changing the way business works with IT."

According to Gartner, the leading edge of this change is server virtualization, which promises to unlock much of the underutilized capacity of existing server architectures. Server virtualization is already having an impact on the server market; Gartner believes that virtualization reduced the x86 server market by 4 percent in 2006. As hypervisor prices drop and management costs decrease because of increased competition, virtualization will have a larger impact, and Gartner analysts predict that more than 4 million virtual machines will be installed on x86 servers by 2009.
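As a rough illustration of that capacity argument (the utilization figures and helper names below are assumptions for illustration, not Gartner data), a lightly loaded server estate consolidates roughly in proportion to the utilization headroom of the virtualized hosts:

```python
# A back-of-the-envelope sketch of server consolidation (illustrative only).
# Assumption: standalone x86 servers idle at about 10% CPU utilization,
# while a virtualized host can safely be driven to about 60%.

def consolidation_ratio(avg_util_pct: int, target_util_pct: int) -> int:
    """Rough count of lightly loaded servers one virtualized host can absorb."""
    return target_util_pct // avg_util_pct

def hosts_needed(workloads: int, avg_util_pct: int = 10,
                 target_util_pct: int = 60) -> int:
    """Physical hosts required after consolidating the given workloads."""
    ratio = consolidation_ratio(avg_util_pct, target_util_pct)
    return -(-workloads // ratio)  # ceiling division

print(consolidation_ratio(10, 60))  # -> 6 workloads per virtualized host
print(hosts_needed(100))            # -> 17 hosts instead of 100
```

Under these assumed figures, six underutilized servers collapse onto one host, which is the mechanism behind the shrinking x86 server market that Gartner describes.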

The use of PC virtualization is also set to increase. The number of virtualized PCs is expected to grow from less than 5 million in 2007 to 660 million by 2011. On the PC, the decoupling technology that breaks the close ties and dependencies between hardware and software occurs at two levels: between hardware and the operating system (machine virtualization) and between the operating system and applications (application virtualization).

Although application virtualization is gaining considerable interest, Gartner maintains that machine virtualization will have the greater long-term impact, making personal computing more manageable, flexible and secure by allowing multiple individual footprints to be defined on the same device.
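To make the "multiple footprints" idea concrete: on a machine-virtualized PC, each footprint is simply a separate guest defined on the same hardware. The following is a minimal sketch assuming a host running a KVM-style hypervisor with the libvirt Python bindings, none of which the article specifies:

```python
# A minimal sketch, assuming a PC running a KVM-style hypervisor with the
# libvirt Python bindings installed (an assumption; the article names no stack).
# It lists the guest "footprints" defined on one physical device.
import libvirt

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
try:
    for dom in conn.listAllDomains():  # every defined guest, running or not
        state = "running" if dom.isActive() else "shut off"
        print(f"{dom.name():<24} {state}")
finally:
    conn.close()
```

In this decoupled model, a locked-down corporate image and a personal image could coexist as two such guests on a single laptop, each managed and secured independently of the hardware beneath it.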

"Essentially, virtualization creates a fork in the road for operating systems," said Thomas Bittman, vice president and distinguished analyst at Gartner. "Traditionally the operating system has been the center of gravity for client and server computing, but new technologies, new modes of computing, and infrastructure virtualization and automation are changing the architecture and role of the operating system. The days of the monolithic, general-purpose operating system will soon be over."

Infrastructure vendors that have always vied for the largest share of budgets on a best-of-breed basis must alter their approach. In the future, the virtualization and automation of infrastructure will be managed by policies at a business-service level, requiring all parts of the infrastructure to work in harmony. This worries some vendors: a smooth-running, standardized infrastructure threatens to commoditize their component parts, so they are keen to establish a critical "linchpin" status in the market.

"This competition will play itself out in the market and in users' infrastructure, and it will be messy," said Mr. Dawson. "Eventually a few dominant infrastructure control architectures will emerge, and in those architectures, vendors will solidify a span on control in a hierarchy of governance."

Given the uncertainty that will prevail over the market in the short to medium term, Mr. Bittman advised users not to follow a specific vendor's vision, but rather to determine their own vision of architectural control and build toward it with a constantly updated strategic plan. "In the medium term, align your virtualization strategy with the business, avoid vendor hype and beware of software pricing and licensing," he said. "Be prepared to experiment, but make sure that you are the scientist, not the subject."

More Stories By Virtualization News

SYS-CON's Virtualization News Desk trawls the news sources of the world for the latest details of virtualization technologies, products, and market trends, and provides breaking news updates from the Virtualization Conference & Expo.
