The Difference Between Public and Private Cloud Computing

If the cloud’s origin is essentially private, what is the “public cloud”?

Cloud computing has essentially been private from the beginning. Google, for one, demonstrates that the world's most successful search engine runs not on an IBM Power, Sun Enterprise, or other massively powerful machine engineered for business by the go-to vendors for computing solutions. More in keeping with its democratic, spare, and ubiquitous engineering ethos, Google created a "cloud" of white box computers: throw-away hardware servers running Linux.

Rather than purchase boxed software or hire a large consulting firm to design and build the infrastructure, Google engineered proprietary software to coordinate search requests and indexing tasks across this "cloud" of white box hardware. Like most ideas that come to be regarded as brilliant innovations, using large numbers of identical machines to process enormous amounts of work isn't new at all. In fact, it isn't even an idea that originated in computing.

If the cloud's origin is essentially private, what is the "public cloud"? One possible definition is that the public cloud is a disruptive channel through which to package and deliver some of the advantages of the cloud to "the public." Why is the public interested in the cloud? The cloud promises advantages that elude even the largest enterprises today. Public cloud computing services are purchased by individual users, small businesses, medium-sized businesses, and some of the world's largest firms.

What problems does the business community imagine the cloud might solve? Traditional technology solutions take a lot of time and money to implement, yet businesses need to move quickly and flexibly to seize new opportunities. The business systems of yesterday seldom adapt easily to new requirements, and the work involved in adding new capabilities to existing systems is almost always significantly greater than the effort to build from scratch. Given this dynamic, the risk of building new systems or modifying existing ones seems high. What makes the current practice even less appealing is the tendency for the cost of maintaining existing systems to grow over time. Some studies show that maintaining current systems now consumes seventy percent of a firm's technology budget.

Given these dynamics, the (public) cloud computing model seems to offer an approach that mitigates some of the issues faced by businesses of all sizes. Firms would like to:

  1. Purchase focused, low-cost, customizable, and flexible technology services.
  2. Pay for these services when and how they are consumed. For example, some cloud computing vendors offer a metered rate model in which the firm or individual pays for just the right amount and quality of resources required to meet demand (a rough cost comparison appears in the sketch after this list).
  3. Provision these services as they are needed. If a company needs to provide a temporary call center in Asia for three months while consolidating its data centers in the region, the cloud computing model offers the ability to provision, configure, and host the software and desktops to do so. If a 25-person firm decides that a customer relationship management solution seems like a good idea, it can provision and use that solution in the cloud.
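
As a rough illustration of the metered model in item 2, the following Python sketch compares an assumed up-front hardware purchase with pay-per-use pricing for a temporary three-month deployment like the call-center example. The helper metered_cost and every rate, server count, and cost below are hypothetical assumptions for illustration, not any vendor's actual price list.

# Illustrative comparison of an up-front purchase vs. metered, pay-per-use pricing.
# All figures below are hypothetical assumptions, not real vendor prices.

UPFRONT_HARDWARE_COST = 120_000.00    # assumed capital cost of buying servers outright
METERED_RATE_PER_SERVER_HOUR = 0.45   # assumed pay-per-use rate, dollars per server-hour

def metered_cost(servers: int, hours: int, rate: float = METERED_RATE_PER_SERVER_HOUR) -> float:
    # Cost when paying only for the capacity actually consumed.
    return servers * hours * rate

if __name__ == "__main__":
    servers, hours = 25, 24 * 90      # a temporary 25-server deployment for about three months
    pay_per_use = metered_cost(servers, hours)
    print(f"Metered cost for 3 months: ${pay_per_use:,.2f}")            # $24,300.00
    print(f"Up-front purchase:         ${UPFRONT_HARDWARE_COST:,.2f}")  # $120,000.00

For a short-lived or uncertain workload, the metered model avoids committing capital to hardware that may sit idle once the need passes; for a steady, long-running workload the comparison can tip the other way.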

Beyond rapid deployment, the ability to flexibly alter technology services in the cloud infrastructure can help firms design, deploy, and "shape" solutions that fit their immediate needs yet adapt over time as the business evolves. In other words, compared to traditional technology practices, the financial model of the cloud seems attractive.

CIOs and business owners tend to look at the return on investment for existing and new technology spending. One of the key factors in the ROI model is the length of time, or "payback" period, over which the benefits of the expenditure outweigh the costs. It's not an easy decision, because in the traditional model the CIO purchases equipment, software, services, and training up front, and then hopes that the benefits can be clearly demonstrated. Yet most firms have difficulty tracking costs and benefits in a way that makes the outcome clear. If the CIO buys too little hardware, or implements a solution that the business users later reject, the whole solution can require additional customization or additional hardware. The cloud computing model helps mitigate these risks by allowing the cost and benefit flows to be aligned. Because the building blocks of a cloud solution are much more scalable, many aspects of the solution can be tuned.
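
To make the payback-period point concrete, the sketch below computes how long it takes cumulative benefit to cover cost under two assumed spending profiles: a large up-front purchase versus a low-commitment, pay-as-you-go subscription. The helper payback_period_months and all of the figures are invented purely to show how aligning cost with benefit flows shortens the payback period.

# Rough payback-period comparison; every figure here is a hypothetical assumption.

def payback_period_months(upfront_cost: float, monthly_cost: float, monthly_benefit: float) -> float:
    # Months until cumulative benefit exceeds cumulative cost.
    net_per_month = monthly_benefit - monthly_cost
    if net_per_month <= 0:
        return float("inf")            # the expenditure never pays back
    return upfront_cost / net_per_month

if __name__ == "__main__":
    monthly_benefit = 20_000.00        # assumed business value delivered per month

    # Traditional model: large capital outlay up front, modest ongoing cost.
    traditional = payback_period_months(150_000.00, 5_000.00, monthly_benefit)

    # Cloud model: minimal up-front commitment, higher ongoing cost.
    cloud = payback_period_months(5_000.00, 12_000.00, monthly_benefit)

    print(f"Traditional payback: {traditional:.1f} months")   # 10.0 months
    print(f"Cloud payback:       {cloud:.1f} months")         # 0.6 months

The specific numbers are not the point; the point is that smaller, consumption-aligned outlays shorten the window before benefits outweigh costs, which is exactly the risk the traditional up-front model struggles with.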

Most firms choose not to write their own desktop operating system or desktop applications; firms make these kinds of build-or-buy choices every day. Yet much of today's computing expenditure delivers little competitive advantage while consuming scarce and valuable human and capital resources. For a firm like Google, a private cloud of white boxes orchestrated to index and return search results makes sense. For the majority of businesses, however, the public cloud computing model may enable them to better align the cost of computing with its business value, and to make competitive advantage achievable through technology.

More Stories By Brian McCallion

Brian McCallion of Bronze Drum works with executives to develop cloud strategy and Big Data proofs of concept, and trains enterprise teams to rethink process and operations. Focus areas include: Enterprise Cloud Strategy and Project Management, Cloud Data Governance and Compliance, and Infrastructure Automation.
