The Next Virtualization Waves Are Forming

We are still in the beginning stages of realizing what virtualization can do. Where do we go from here?

Pete Manca's Blog

The virtualization "waves" are just forming. While server virtualization is at full crest, many more waves are taking shape behind it, and quite frankly, they are more significant.

Server virtualization was about saving money. Consolidating multiple applications onto a single server cuts capital and operational expenses, and reducing the number of servers running in the data center reduces carbon emissions as well. But is that it? If so, that's more like a ripple than a wave. Don't get me wrong: reducing power, cooling, and server count while consolidating apps is a good thing, but it's not the whole story. Not by a long shot.
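To put rough numbers on the consolidation argument, here is a back-of-the-envelope sketch in Python. Every figure in it (server counts, utilization, cost per box) is an illustrative assumption, not data from any vendor or study.

# Illustrative consolidation math; all numbers below are assumptions, not real data.
physical_servers = 20          # lightly loaded app servers before consolidation
avg_utilization = 0.10         # each running at roughly 10% CPU
target_utilization = 0.60      # what a virtualized host can comfortably sustain

# The total work stays the same; the number of hosts shrinks as utilization rises.
hosts_needed = max(1, round(physical_servers * avg_utilization / target_utilization))

annual_cost_per_server = 3000  # assumed power, cooling, and maintenance per box
annual_savings = (physical_servers - hosts_needed) * annual_cost_per_server

print(f"{physical_servers} servers consolidate onto {hosts_needed} hosts")
print(f"Estimated annual opex avoided: ${annual_savings:,}")

Even under assumptions this generous, the result is cost avoidance: a ripple, not a wave.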

I don't believe that this is it. In fact, I think we are still in the beginning stages of realizing what virtualization can do. It is really the enabling technology that opens up new ways to solve the problems in today's data center.

As with all great technology movements, a core set of technologies must be established first. Server virtualization is one for sure, but what are the others? I/O virtualization might be the next important cornerstone technology. Until that problem is solved, servers remain static and inflexible: the hypervisor lets us utilize them more fully, but we can't exploit them to their fullest extent without the flexibility to change their I/O bindings dynamically. Other key virtualization technologies include file virtualization, data virtualization, and application virtualization, which are key to making access to applications, data, and resources agile and ubiquitous.
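To make "dynamic I/O bindings" concrete, here is a minimal Python sketch of the idea: a server's network and storage identities (virtual MACs and WWNs) live in a profile that can be reassigned from one physical machine to another, so the workload, its SAN zoning, and its VLAN membership follow the profile rather than the hardware. This is a hypothetical illustration of the concept only; the class and function names are invented for this example and do not describe Egenera's PAN implementation or any vendor's API.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IOProfile:
    # Virtual I/O identity, decoupled from any physical server (hypothetical model).
    name: str
    virtual_macs: list = field(default_factory=list)   # virtual NIC addresses
    virtual_wwns: list = field(default_factory=list)   # virtual HBA addresses

@dataclass
class PhysicalServer:
    serial: str
    profile: Optional[IOProfile] = None   # whatever identity is bound right now

def rebind(profile: IOProfile, old: PhysicalServer, new: PhysicalServer) -> None:
    # Move a workload's I/O identity to a different physical server,
    # e.g. for failover or repurposing, without re-cabling or re-zoning.
    old.profile = None
    new.profile = profile

web_tier = IOProfile("web-tier",
                     virtual_macs=["02:00:00:aa:bb:01"],
                     virtual_wwns=["50:01:43:80:01:02:03:04"])
blade_a, blade_b = PhysicalServer("SN-A"), PhysicalServer("SN-B")
blade_a.profile = web_tier
rebind(web_tier, blade_a, blade_b)   # "web-tier" now runs on blade B
print(blade_b.profile.name)          # -> web-tier

The point of the sketch is the decoupling: once I/O identity is data rather than wiring, redeploying a server becomes a software operation.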

Once the server, I/O, data, and applications are virtualized, the resulting possibilities and opportunities really are endless. These cornerstones open the market for management, security, converged fabrics, and a whole host of technologies that can free up the data center and open new markets.

Expect 2008 to be another banner year for virtualization. The next wave is here.

More Stories By Pete Manca

Pete Manca is CTO and EVP of Engineering at Egenera. He brings over 20 years of experience in enterprise computing to Egenera. His expertise spans a wide range of critical enterprise data center technologies, including virtualization, operating systems, large-scale architectures, and open standards. In particular, his leadership and experience in virtualization technologies have driven the continued progression of Egenera's advanced PAN (Processing Area Network) architecture. Manca leads product planning by working directly with customers to understand their most difficult challenges and guiding Egenera's architecture, hardware, and software engineering teams to translate those requirements into solutions. Prior to Egenera, he served as Vice President of Engineering at Hitachi Computer Products America, with responsibility for operating systems and enterprise middleware products.
