By Brian McCallion
June 16, 2011 07:15 AM EDT
Cloud computing has essentially been private from the beginning. Google, for one, demonstrates that the world's most successful search engine runs not on an IBM Power, Sun Enterprise, or any other massively powerful machine engineered for business by the go-to vendors for enterprise computing. More in keeping with its democratic, spare, and ubiquitous engineering ethos, Google created a "cloud" of white-box computers: throw-away commodity servers running Linux.
Rather than purchase boxed software or hire a large consulting firm to design and build the infrastructure, Google engineered proprietary software to coordinate search requests and indexing tasks across this "cloud" of white-box hardware. Like many ideas that somehow become synonymous with brilliance and innovation, using large numbers of identical machines to process enormous amounts of work isn't new at all. In fact, it isn't even an innovation that can be attributed to computing.
If the cloud's origin is essentially private, what is the "public cloud"? One possible definition is that the public cloud is a disruptive channel through which to package and deliver some of the advantages of the cloud to "the public." Why is the public interested in the cloud? The cloud promises advantages that elude even the largest enterprises today. Public cloud computing services are purchased by individual users, small businesses, medium-sized businesses, and some of the world's largest firms.
What problems does the business community imagine the cloud might solve? Traditional technology solutions take a lot of time and money to implement, yet business needs to move quickly and flexibly to seize new opportunities. The business systems of yesterday seldom adapt easily to new requirements. And the work involved in adding new capabilities to existing systems is almost always significantly greater than the effort to build from scratch. Given this dynamic, the risk of building new systems or modifying existing systems seems high. What makes the current practice even less appealing is the tendency over time for the cost of maintaining existing systems to grow. Some studies show that today maintaining current systems consumes seventy percent of a firm's technology budget.
Given these dynamics, the (public) cloud computing model seems to offer an approach that mitigates some of the issues faced in businesses of all sizes.
- Purchase focused, low-cost, customizable, and flexible technology services.
- Pay for these services when and how they are consumed. For example, some cloud computing vendors offer a metered-rate model in which the firm or individual pays for just the amount and quality of resources required to meet demand.
- Provision these services as they are needed. If a company needs to run a temporary call center in Asia for three months while consolidating its data centers in the region, the cloud computing model offers the ability to provision, configure, and host the software and desktops to do so. If a 25-person firm decides that a customer relationship management solution seems like a good idea, it can provision and use that solution in the cloud.
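To make the metered-rate idea concrete, here is a minimal sketch of how a pay-for-what-you-use bill might be computed. The rates, usage figures, and function names are illustrative assumptions, not any vendor's actual pricing.

```python
# Hypothetical metered-rate pricing: pay only for resources actually consumed.
# All rates and usage figures below are illustrative, not real vendor prices.

def metered_cost(instance_hours, rate_per_hour, storage_gb_months, rate_per_gb_month):
    """Total charge for a billing period under a simple metered model."""
    compute = instance_hours * rate_per_hour
    storage = storage_gb_months * rate_per_gb_month
    return compute + storage

# A three-month temporary project: four servers running around the clock,
# plus 500 GB of storage, after which everything is simply switched off.
hours = 4 * 24 * 90  # 8,640 instance-hours over 90 days
cost = metered_cost(hours, 0.10, 500 * 3, 0.05)
print(f"Total for the project: ${cost:,.2f}")
```

The point of the model is visible in the last line: when the project ends, the spend ends with it, rather than leaving purchased hardware idle.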
Beyond rapid deployment, the capability to flexibly alter and shape technology services in the cloud infrastructure can help firms design and deploy solutions that fit their immediate needs, yet adapt over time as the business evolves. In other words, compared to traditional technology practices, the financial model of the cloud seems attractive.
CIOs and business owners tend to look at the return on investment for existing and new technology spending. One of the key factors in the ROI model is the length of time, or "payback" period, over which the benefits of the expenditure outweigh the costs. It's not an easy decision: in the traditional model, the CIO purchases equipment, software, services, and training up front, and then hopes that the benefits can be clearly demonstrated. Yet most firms have difficulty tracking costs and benefits in a way that makes the outcome clear. If the CIO buys too little hardware, or implements a solution that the business users later reject, the whole solution can require additional customization or additional hardware. The cloud computing model mitigates these risks by enabling the cost and benefit flows to be aligned. Because the building blocks of a cloud solution are much more scalable, many aspects of the solution can be tuned.
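The payback-period calculation the CIO faces can be sketched with simple arithmetic. All dollar figures here are hypothetical assumptions chosen only to contrast an up-front purchase with pay-as-you-go cloud spend.

```python
# Hypothetical payback comparison: up-front purchase vs. pay-as-you-go cloud.
# Every dollar figure is an illustrative assumption, not real pricing data.

def payback_months(upfront_cost, monthly_benefit, monthly_running_cost):
    """Months until cumulative net benefit covers the up-front cost.

    Returns None if the monthly benefit never exceeds the running cost.
    """
    net_per_month = monthly_benefit - monthly_running_cost
    if net_per_month <= 0:
        return None
    months = 0
    cumulative = -upfront_cost
    while cumulative < 0:
        cumulative += net_per_month
        months += 1
    return months

# Traditional: $120,000 of hardware/software/training up front, $2,000/month to run.
print(payback_months(120_000, 10_000, 2_000))  # reaches payback after 15 months
# Cloud: no capital outlay, but a higher $6,000 monthly metered bill.
print(payback_months(0, 10_000, 6_000))        # 0: net positive from the start
```

The contrast illustrates the alignment the article describes: with no up-front outlay, the cloud case has no payback period to survive, at the price of a larger ongoing bill.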
Most firms choose not to write their own desktop operating system or desktop applications; firms make these choices every day. Yet much of today's computing expenditure delivers little competitive advantage while consuming scarce and valuable human and capital resources. For a firm like Google, a private cloud of white boxes orchestrated to index and return search results makes sense. For the majority of businesses, however, the public cloud computing model may better align the cost of computing with business value, and make competitive advantage achievable through technology.