
Cloud Computing & Virtualization: Hot Trends Organizations Can’t Ignore

How can you use virtualization and cloud computing as game changers for your company based on where it’s evolving?

The use of virtualization and cloud computing is growing quickly among companies of all sizes. Currently, 30 percent of servers are virtualized, and surveys show that by 2012, that number will grow to 50 percent.

Virtualization and cloud computing go hand in hand, and virtualizing servers is just the tip of the iceberg. The trend to virtualize everything from servers to processing power to software offerings actually started years ago in the personal sector. In the recent past, it was common for individuals within major organizations to use virtualized services or cloud computing at home, while at work they weren't using those services at all. Why? Because corporate IT didn't trust the cloud's security and wasn't sure it was a hard trend - something that was definitely here to stay. Today, we know better.

In order to fully understand how virtualization and cloud computing will transform the business world, let's first look at the evolution of these capabilities.

Cloud Computing
When talking about virtualization, cloud computing is a natural component. Cloud computing, which refers to companies using remote servers that can store data and allow users to access information from anywhere, takes three different evolutionary forms.

The first is a public cloud. This could be something like Google Docs, where you store your documents, or Flickr, where you store your photos. Basically, you're storing files somewhere other than your hard drive, in a place where you can access them from any device at any time as long as you have an Internet connection.

The second form of cloud computing, the private cloud, is emerging rapidly. A private cloud exists when a company wants the added security of controlling its own cloud, yet still wants its people to have access to large files and databases from any device, anywhere. Since it is private, it's secure and the public does not have access to it. Companies are now beginning to establish private clouds.

The third iteration that is part of the evolution of cloud computing is the private/public cloud - also called a hybrid cloud. In this configuration, you have a private part of your corporate cloud that is secure and only accessible by employees, but you also have a part of the cloud that is public where strategic partners, vendors, and customers can access limited content.
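The access boundaries implied by the hybrid model can be sketched as a simple policy check. This is a hypothetical illustration, not any vendor's API; the zone names, resources, and roles are invented for the example.

```python
# Hypothetical sketch of hybrid-cloud access rules: private-zone
# content is employee-only, while the public zone admits strategic
# partners, vendors, and customers to limited content.

PRIVATE_ZONE = {"payroll.db", "product-roadmap.doc"}   # employees only
PUBLIC_ZONE = {"price-list.pdf", "order-portal"}       # limited shared content

EXTERNAL_ROLES = {"partner", "vendor", "customer"}

def can_access(role: str, resource: str) -> bool:
    """Return True if the given role may read the given resource."""
    if resource in PRIVATE_ZONE:
        return role == "employee"
    if resource in PUBLIC_ZONE:
        return role == "employee" or role in EXTERNAL_ROLES
    return False  # unknown resources are denied by default

print(can_access("employee", "payroll.db"))    # True
print(can_access("partner", "payroll.db"))     # False
print(can_access("customer", "order-portal"))  # True
```

The key design point is the default deny at the end: in a hybrid configuration, anything not explicitly published to the public zone stays private.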

Virtualization
Virtualization can take many forms aside from servers. For example, you can virtualize a desktop, meaning your desktop is stored in the cloud and you can access it from anywhere. You can also virtualize your operating system: you can use a Mac yet run the latest Windows operating system on it, or you can have a PC with three different operating systems running at the same time. That's the power of virtualization.

Another element of virtualization is Software as a Service (SaaS). Decades ago, we started with software that you had to buy, install, maintain, and update. Thanks to SaaS, the software now lives in the cloud, so you no longer buy it; you simply buy time to use it. It's a cost-effective way for companies of all sizes to gain access to enterprise-level software.
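The cost argument can be made concrete with some back-of-the-envelope arithmetic. The figures below are invented purely for illustration; real license and subscription prices vary widely.

```python
# Made-up numbers comparing a traditional perpetual license (plus
# annual maintenance) against a SaaS subscription over three years.

license_upfront = 12_000      # hypothetical one-time license fee
maintenance_per_year = 2_400  # hypothetical annual support/upgrade fee
saas_per_month = 300          # hypothetical subscription price

years = 3
license_total = license_upfront + maintenance_per_year * years
saas_total = saas_per_month * 12 * years

print(license_total)  # 19200
print(saas_total)     # 10800
```

With these assumed numbers the subscription wins, but the more general point is that SaaS converts a large up-front capital outlay into a smaller operating expense paid only while the software is in use.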

Similarly, we're also starting to see virtualized processing power. Think of this as accessing a supercomputer in the cloud and having that supercomputer's processing power available on your smartphone or tablet. In February 2011, the TV game show "Jeopardy" featured IBM's supercomputer Watson against human contestants. Watson handily beat the human champions because it knew which categories it was strong in and focused on those. With virtualized processing power, you're essentially getting a Watson on your phone. That means you and your employees can make informed decisions about many things, very quickly.

One of the ways Watson has been used since "Jeopardy" is looking at MRI scans. When Watson reviews an MRI scan, it can detect anomalies and see things a human doctor can't see. Watson can also analyze many variables in an effort to help the human doctor make a better diagnosis faster. It's about allowing professionals rapid access to vast amounts of information and knowledge that can help them work faster and smarter.

Health care is just one example. Could people who do sales, R&D, purchasing, delivery, sourcing, shipping, accounting, and a host of other functions benefit from a Watson-like supercomputer in the palm of their hand? Yes. Could it make them work smarter, better, and more effectively? Most definitely!

The Game Changer
Part of this evolution of virtualization and cloud computing is that we can now virtualize various components of IT. And in the near future, we'll start seeing IT as a Service (much like how SaaS became popular). This means that much of the IT department will be virtualized and running in the cloud.

The benefits of IT as a Service are immense. Not only will it save money, but it will also increase speed and agility. Since your servers aren't running at 100 percent utilization all the time, their efficiency varies. With IT as a Service, a company will be able to scale in real time as demand dictates, by the nanosecond. As sales increase, the system instantly self-configures; as sales decrease, it does the same. Now you're paying only for what you're using. You'll benefit from dynamic resource allocation, so you're able to maximize what you have and what you're paying for at all times.
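The scaling behavior described above can be sketched with a minimal capacity rule, assuming a simple model where each server handles a fixed request rate. The function and parameter names are illustrative, not any cloud provider's API.

```python
import math

# Illustrative autoscaling rule: the server count tracks demand,
# so you pay for the capacity you use rather than for peak load.

def desired_servers(requests_per_sec: float,
                    capacity_per_server: float = 100.0,
                    min_servers: int = 1) -> int:
    """Return how many servers the current load requires."""
    needed = math.ceil(requests_per_sec / capacity_per_server)
    return max(min_servers, needed)

print(desired_servers(50))   # 1 - light load, minimum footprint
print(desired_servers(750))  # 8 - sales spike, scale out
print(desired_servers(120))  # 2 - demand falls, scale back in
```

A real autoscaler adds smoothing and cooldown periods so the fleet doesn't thrash, but the core idea is the same: allocation follows demand in both directions.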

IT as a Service is a game changer. Because you now have components of the IT department existing in the cloud, you're freeing your in-house IT staff to shift from a maintenance mode to an innovation mode. As such, your IT department can focus on achieving business goals, creating innovative solutions, and driving sales rather than upgrading individual users' computers and firefighting everyday problems. It allows the IT department to really look at the industry trends unfolding so your company can give customers the products and services they'd ask for, if they only knew what was possible.

It's Time to V-Enable the Organization
In terms of implementing virtualization and cloud computing options, organizations are now starting to move quickly. Virtualization received a big push in 2009 and 2010 because of the recession, which prompted many companies to cut their IT budgets. Companies realized that one way to save money was through virtualization. For example, virtual desktops alone lower costs by 15 percent.

Now in 2011, the factors that are increasing an organization's interest in virtualization are speed and agility. Virtualization enables you to do things faster, thus making your company more agile. Instead of delivering a new service in two months, companies are able to do it in two days.

As virtualization and cloud computing become more prevalent, companies will need to form new strategic relationships, because existing partners may not have the core competencies needed to drive the fundamental changes ahead. At this point it would be good to ask yourself whether you have the relationships you need to move forward given this shift. Do your current strategic partners understand the shifts taking place, and are they embracing the changes you know will happen?

Realize, too, that the wrong question to ask is, "What should we buy?" Rather, you have to look at the bigger picture of what you're trying to accomplish in this transformational time. How can you use virtualization and cloud computing as game changers for your company based on where it's evolving? The key is to understand the new capabilities, because in order to know what to buy or what to do, you first have to know what is possible.

2011 is the year most organizations are sticking their toes in the waters of virtualization and cloud computing. It's the year organizations realize this isn't a fad that's going to fade. Virtualization and the cloud are hard trends that provide transformational opportunities and will continue to evolve rapidly. The time to embrace this trend is now.

More Stories By Daniel Burrus

Daniel Burrus is considered one of the world’s leading technology forecasters and business strategists, and is the founder and CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology driven trends to help clients better understand how technological, social and business forces are converging to create enormous, untapped opportunities. He is the author of six books, including the New York Times and Wall Street Journal bestselling Flash Foresight: How To See the Invisible and Do the Impossible, as well as the highly acclaimed Technotrends. For more information, please visit www.burrus.com.
