By Daniel Burrus
August 20, 2011 12:00 PM EDT
The use of virtualization and cloud computing is growing quickly among companies of all sizes. Currently, 30 percent of servers are virtualized, and surveys show that by 2012, that number will grow to 50 percent.
Virtualization and cloud computing go hand in hand, and virtualizing servers is just the tip of the iceberg. The trend toward virtualizing everything from servers to processing power to software offerings actually began years ago in the consumer space. Until recently, it was common for individuals within major organizations to use virtualized services or cloud computing at home but not at work. Why? Because corporate IT didn't trust the cloud's security and wasn't sure it was a hard trend - something definitely here to stay. Today, we know better.
To fully understand how virtualization and cloud computing will transform the business world, let's first look at the evolution of these capabilities.
Cloud computing is a natural companion to virtualization. Cloud computing - companies using remote servers to store data so users can access information from anywhere - has taken three evolutionary forms.
The first is the public cloud. This could be something like Google Docs, where you store your documents, or Flickr, where you store your photos. Basically, you're storing files somewhere other than your hard drive, in a place where you can access them from any device at any time as long as you have an Internet connection.
The second form, the private cloud, is emerging rapidly. A private cloud exists when a company wants the benefits of cloud computing with added security, yet still wants its people to have access to large files and databases from any device, anywhere. Because it is private, it's secure and the public cannot access it. Companies are now beginning to establish private clouds.
The third iteration is the private/public cloud - also called a hybrid cloud. In this configuration, part of your corporate cloud is private, secure, and accessible only by employees, while another part is public, where strategic partners, vendors, and customers can access limited content.
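As a rough illustration of that split, a hybrid cloud's gateway can be thought of as a router that checks who is asking before deciding which zone may serve the request. The zone names, roles, and policy below are hypothetical, invented for this sketch rather than taken from any specific product:

```python
# Hypothetical sketch of hybrid-cloud access routing: employees reach the
# private zone; partners, vendors, and customers see only the public zone.
PRIVATE_ZONE = {"hr_records", "financials", "source_code"}
PUBLIC_ZONE = {"product_catalog", "order_status", "press_kits"}

def route_request(user_role: str, resource: str) -> str:
    """Decide which zone of the hybrid cloud may serve this request."""
    if resource in PRIVATE_ZONE:
        if user_role == "employee":
            return "private"  # secure, employee-only content
        raise PermissionError(f"{user_role} may not access {resource}")
    if resource in PUBLIC_ZONE:
        return "public"       # limited content for partners and customers
    raise KeyError(f"unknown resource: {resource}")
```

Under this sketch, an employee asking for financials is routed to the private zone, while a partner checking order status gets the public zone; a partner asking for financials is refused outright.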
Virtualization can take many forms beyond servers. For example, you can virtualize a desktop, meaning your desktop is stored in the cloud and you can access it from anywhere. You can also virtualize an operating system: you can use a Mac yet run the latest Windows operating system on it, or have a PC with three different operating systems running at the same time. That's the power of virtualization.
Another element of virtualization is Software as a Service (SaaS). Decades ago, we started with software you had to buy, install, maintain, and update. With SaaS, the software lives in the cloud, so you no longer buy it; you simply buy time to use it. It's a cost-effective way for companies of all sizes to gain access to enterprise-level software.
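To make the "buy time, not licenses" idea concrete, here is a minimal sketch of usage-based SaaS billing. The hourly rate and usage figures are invented for illustration and do not reflect any real vendor's pricing:

```python
# Hypothetical metered-billing sketch: instead of a large up-front license,
# a SaaS customer pays only for the hours of software actually consumed.
HOURLY_RATE = 0.50  # assumed price per user-hour (illustrative only)

def monthly_saas_bill(user_hours: list[float]) -> float:
    """Total bill: the sum of each user's metered hours times the rate."""
    return round(sum(user_hours) * HOURLY_RATE, 2)

# A five-person team that used the tool 40, 35, 20, 50, and 10 hours
bill = monthly_saas_bill([40, 35, 20, 50, 10])  # 155 hours at $0.50
```

The contrast with the old model is the point: a seat license costs the same whether a user logs in daily or once a quarter, while metered billing tracks actual consumption.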
Similarly, we're starting to see virtualized processing power. Think of it as accessing a supercomputer in the cloud and having that supercomputer's processing power available on your smartphone or tablet. In February 2011, the TV game show "Jeopardy!" pitted IBM's supercomputer Watson against human champions. Watson won decisively because it knew what it was good at and focused on those categories. With virtualized processing power, you're essentially getting a Watson on your phone, which means you and your employees can make informed decisions about many things, very quickly.
One of the ways Watson has been used since "Jeopardy!" is in reviewing MRI scans. When Watson reviews a scan, it can detect anomalies a human doctor might miss. Watson can also analyze many variables to help the doctor reach a better diagnosis faster. It's about giving professionals rapid access to vast amounts of information and knowledge so they can work faster and smarter.
Health care is just one example. Could people in sales, R&D, purchasing, delivery, sourcing, shipping, accounting, and a host of other functions benefit from a Watson-like supercomputer in the palm of their hand? Yes. Could it make them work smarter, better, and more effectively? Most definitely!
The Game Changer
Part of this evolution of virtualization and cloud computing is that we can now virtualize various components of IT itself. In the near future, we'll start seeing IT as a Service, much as SaaS became popular. This means much of the IT department will be virtualized and running in the cloud.
The benefits of IT as a Service are immense. Not only will it save money, but it will also increase speed and agility. Because your own servers are rarely utilized at 100 percent, their efficiency varies; with IT as a Service, a company can scale in real time as demand dictates. As sales increase, the system instantly self-configures upward; as sales decrease, it scales back down. You pay only for what you use, benefiting from dynamic resource allocation that maximizes what you have and what you're paying for at all times.
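The dynamic resource allocation described above can be sketched in a few lines. The per-instance capacity and minimum below are assumptions for illustration, not figures from any particular platform:

```python
# Hypothetical autoscaling sketch: the number of server instances tracks
# demand, so you pay only for the capacity you are actually using.
REQUESTS_PER_INSTANCE = 1000  # assumed capacity of one instance
MIN_INSTANCES = 1             # keep at least one instance warm

def desired_instances(current_demand: int) -> int:
    """Scale the instance count up or down with demand (ceiling division)."""
    needed = -(-current_demand // REQUESTS_PER_INSTANCE)  # ceil without math
    return max(MIN_INSTANCES, needed)
```

As demand rises from 1,000 to 4,500 requests, the sketch self-configures from one instance to five; when demand falls away, it drops back to the minimum, which is the "pay only for what you use" effect in miniature.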
IT as a Service is a game changer. With components of the IT department in the cloud, your in-house IT staff can shift from maintenance mode to innovation mode. Your IT department can focus on achieving business goals, creating innovative solutions, and driving sales rather than upgrading individual users' computers and firefighting everyday problems. It also frees IT to study the industry trends unfolding so your company can give customers the products and services they'd ask for, if they only knew what was possible.
It's Time to V-Enable the Organization
Organizations are now moving quickly to implement virtualization and cloud computing. Virtualization received a big push in 2009 and 2010 because of the recession, which prompted many companies to cut their IT budgets. Companies realized that virtualization is one way to save money; virtual desktops alone can lower costs by roughly 15 percent.
Now in 2011, the factors that are increasing an organization's interest in virtualization are speed and agility. Virtualization enables you to do things faster, thus making your company more agile. Instead of delivering a new service in two months, companies are able to do it in two days.
As virtualization and cloud computing become more prevalent, companies will need to form new strategic relationships, because existing partners may not have the core competencies required to drive these fundamental changes. Ask yourself whether you have the relationships you need to move forward given this shift. Do your current strategic partners understand the shifts taking place, and are they embracing what you know will happen?
Realize, too, that the wrong question to ask is, "What should we buy?" Rather, you have to look at the bigger picture of what you're trying to accomplish in this transformational time. How can you use virtualization and cloud computing as game changers for your company based on where it's evolving? The key is to understand the new capabilities, because in order to know what to buy or what to do, you first have to know what is possible.
2011 is the year most organizations are dipping their toes into the waters of virtualization and cloud computing. It's the year they realize this isn't a fad that will fade. Virtualization and the cloud are hard trends that provide transformational opportunities and will continue to evolve rapidly. The time to embrace this trend is now.