The Difference Between Public and Private Cloud Computing

If the cloud’s origin is essentially private, what is the “public cloud”?

Cloud computing has essentially been private from the beginning. Google, for one, demonstrates that the world's most successful search engine need not run on an IBM Power, Sun Enterprise, or other massively powerful machine engineered for business by the go-to vendors for computing solutions. More in keeping with its democratic, spare, and ubiquitous engineering ethos, Google instead created a "cloud" of white-box computers: inexpensive, throwaway hardware servers running Linux.

Rather than purchase boxed software or hire a large consulting firm to design and build the infrastructure, Google engineered proprietary software to coordinate search requests and indexing tasks across this "cloud" of white-box hardware. Like many ideas that come to be regarded as strokes of brilliance, using large numbers of identical machines to process enormous amounts of work isn't new at all. In fact, it isn't even an innovation that originated in computing.

If the cloud's origin is essentially private, what is the "public cloud"? One possible definition is that the public cloud is a disruptive channel through which to package and deliver some of the advantages of the cloud to "the public." Why is the public interested in the cloud? The cloud promises advantages that elude even the largest enterprises today. Public cloud computing services are purchased by individual users, small businesses, medium-sized businesses, and some of the world's largest firms.

What problems does the business community imagine the cloud might solve? Traditional technology solutions take a lot of time and money to implement, yet businesses need to move quickly and flexibly to seize new opportunities. The business systems of yesterday seldom adapt easily to new requirements. And the work involved in adding new capabilities to existing systems is almost always significantly greater than the effort to build from scratch. Given this dynamic, the risk of building new systems or modifying existing systems seems high. What makes current practice even less appealing is the tendency for the cost of maintaining existing systems to grow over time. Some studies show that maintaining current systems consumes seventy percent of a firm's technology budget.

Given these dynamics, the (public) cloud computing model seems to offer an approach that mitigates some of the issues faced by businesses of all sizes.

  1. Firms would like to be able to purchase focused, low-cost, customizable, and flexible technology services.
  2. Pay for these services when and how they are consumed. For example, some cloud computing vendors offer a metered-rate model in which the firm or individual pays for just the amount and quality of resources required to meet demand.
  3. Provision these services as they are needed. If a company needs to provide a temporary call center in Asia for three months while consolidating its data centers in the region, the cloud computing model offers the ability to provision, configure, and host the software and desktops to do so. If a 25-person firm decides that a customer relationship management solution seems like a good idea, the firm can provision and use that solution in the cloud.
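The metered-rate model described above can be illustrated with a small sketch. The rates, resource units, and usage figures below are invented for illustration; real cloud vendors price many more dimensions.

```python
# Hypothetical illustration of a metered-rate (pay-per-use) model:
# the firm pays only for the compute hours and storage actually
# consumed each month. All rates and figures are invented.

def metered_cost(hours_used: float, rate_per_hour: float,
                 gb_stored: float, rate_per_gb: float) -> float:
    """Total monthly charge under a simple pay-per-use model."""
    return hours_used * rate_per_hour + gb_stored * rate_per_gb

# A three-month temporary deployment, like the call-center example,
# incurs cost only while it runs and scales back to zero when it ends:
busy_month = metered_cost(hours_used=720, rate_per_hour=0.10,
                          gb_stored=500, rate_per_gb=0.02)
idle_month = metered_cost(hours_used=0, rate_per_hour=0.10,
                          gb_stored=0, rate_per_gb=0.02)

print(busy_month)  # 82.0
print(idle_month)  # 0.0
```

The point of the sketch is the shape of the cost curve, not the numbers: spending tracks consumption, so there is no idle capacity to pay for after the temporary need ends.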

Beyond rapid deployment, the capability to flexibly alter and shape technology services in the cloud infrastructure can help firms design and deploy solutions that fit their immediate needs, yet adapt over time as the business evolves. In other words, compared with traditional technology practices, the financial model of the cloud seems attractive.

CIOs and business owners tend to look at the return on investment for existing and new technology spending. One of the key factors in the ROI model is the length of time, or "payback period," over which the benefits of the expenditure outweigh the costs. It's not an easy decision, because in the traditional model the CIO purchases equipment, software, services, and training up front, and then hopes that the benefits can be clearly demonstrated. Yet most firms have difficulty tracking costs and benefits in a way that makes the outcome clear. If the CIO chooses too little hardware, or implements a solution that the business users later reject, the whole solution can require additional customization or additional hardware. The cloud computing model assists in mitigating these risks by enabling the cost and benefit flows to be aligned. Because the building blocks of a cloud solution are much more scalable, many aspects of the solution can be tuned.
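The payback-period idea above reduces to simple arithmetic: count the months until cumulative benefits exceed cumulative costs. The sketch below uses hypothetical figures to contrast a large up-front purchase with a pay-as-you-go model whose costs track benefits.

```python
# A minimal sketch of the payback-period calculation: the number of
# months until cumulative benefit outweighs cumulative cost.
# All dollar figures are hypothetical.

def payback_period_months(upfront_cost: float, monthly_cost: float,
                          monthly_benefit: float):
    """Months until the investment breaks even, or None if it never does."""
    net_per_month = monthly_benefit - monthly_cost
    if net_per_month <= 0:
        return None  # benefits never outweigh costs
    cumulative = -upfront_cost
    months = 0
    while cumulative < 0:
        cumulative += net_per_month
        months += 1
    return months

# Traditional model: large up-front purchase, long payback.
print(payback_period_months(upfront_cost=120_000,
                            monthly_cost=2_000,
                            monthly_benefit=12_000))  # 12

# Cloud model: little up-front spend, so payback arrives quickly
# even though the monthly fee is higher.
print(payback_period_months(upfront_cost=5_000,
                            monthly_cost=6_000,
                            monthly_benefit=12_000))  # 1
```

The shorter payback period in the second case is the financial alignment the article describes: because cost is incurred roughly as benefit is realized, a rejected or under-sized solution can be abandoned or resized without stranding a large sunk investment.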

Most firms choose not to write their own desktop operating system or desktop applications; firms make choices like these every day. Yet much of today's computing expenditure delivers little competitive advantage while consuming scarce and valuable human and capital resources. For a firm like Google, a private cloud of white boxes orchestrated to index and return search results makes sense. For the majority of businesses, however, the public cloud computing model may better align the cost of computing with business value and make competitive advantage achievable through technology.

More Stories By Brian McCallion

Brian McCallion of Bronze Drum works with executives to develop cloud strategy and Big Data proofs of concept, and trains enterprise teams to rethink process and operations. Focus areas include: Enterprise Cloud Strategy and Project Management; Cloud Data Governance and Compliance; Infrastructure Automation.


