Data Centers: The Next Wave of IT Innovation

Industry poised for electrical, mechanical and architectural innovations

Energy has long been one of the largest variable costs for large technology companies, and today it is becoming increasingly material for large enterprises of all kinds.  Last week at the FIRE Conference I heard Ford's CTO talk about how Ford wanted to be seen as a technology company.  As I listened to Paul Mascarenas talk about smart cars, I couldn't help but wonder how many of the world's largest companies have at least discreetly considered evolving toward more technology-centric products.

The data center is the factory of the new cloud economy and a major inflection point for enterprise profitability.  Those who deliver the most apps, services, etc. per kilowatt-hour have a competitive advantage.  And with data centers accounting for close to 1.5% of electricity consumption in the U.S., increasing energy efficiency in the data center is becoming a strategic business and community imperative.

Since before the dotcom era, enterprises have built their own data centers with a keen focus on availability, or uptime.  Many of those data centers have now outlived their usefulness and are substantial burdens on their IT teams.  As new data centers are built, uptime considerations need to be combined with efficiency considerations.  They must be addressed together.

Increasing demands for IT resources, rising rack densities, and growing power and cooling requirements are exposing tired designs. Simply adding more space is a shortsighted approach to what promises to be a longstanding issue: the efficient use of company resources, especially those strategic to the bottom line.

Today's data centers are, on average, 30%+ more efficient than data centers built even five years ago, thanks to rising densities and the electrical and mechanical innovation they have driven.  Well-capitalized tech companies (including Google and Facebook) have invested billions in data center innovation, from sophisticated water cooling to internal rack architectures optimized for efficient airflow.

Many enterprises, however, are suspended between the cost and risk of building innovative data centers and leasing wholesale data centers.  The traditional wholesale data center industry (including Digital Realty Trust [DLR], Dupont Fabros Technology [DFT], and regional player CoreSite [COR]) has been very successful in building standardized designs that address a subset of the enterprise data center market.  Innovation, in a nutshell, has been limited to those with the deep pockets and courage to build their own.

Today wholesale data centers can be classified as innovative (engineering-optimized for specific enterprise goals and for local resource abundance or scarcity) or traditional (from pods to containers, one type of space serves all).

With Vantage Data Centers entering the market (see highlights from our Smart Data Center Revolution event on Earth Day 2011), expect to see some changes in an otherwise transaction-centric industry.

Increasing Reliability and Efficiency

As wholesale data center providers evolve you can expect more campus-scale projects with:

  • dedicated substations and higher voltage distribution from the substation to the data center floor;
  • elimination of PDUs;
  • redundant backup generator power with 2N electrical configurations to the floor;
  • high efficiency UPS units; and
  • pre-provisioning of data centers for additional load (vertical scalability), including skid-mounted generators and UPS units and pre-provisioned switchgear.

Enterprises that continue to operate or lease traditional data center space (where only about half the electricity entering the building actually reaches the IT equipment) put themselves at a competitive disadvantage. They pay significantly more to operate every server.  Increasingly, what is good for business is good for the environment, and vice versa.

The problem was starting to appear as early as five years ago (from Computerworld):

Data centers "are becoming more and more swollen," IDC analyst Vernon Turner said today at the IDC Virtualization Forum here. Most of the servers purchased today cost less than $3,000. And while that may sound inexpensive, the annual power and cooling bill for 100 servers is about $40,000. In total, for every $1 spent on a server, $7 is spent on support, he said.

- Patrick Thibodeau, "Servers Swamp Data Centers as Chip Vendors Move Ahead," Feb. 6, 2006

After the energy consumed directly by the servers, routers and switches within a data center, power distribution and cooling offer the most significant opportunities for energy conservation.  New, high-efficiency data centers from the innovators are bringing utility-level distribution voltage (12 kV to 34.5 kV) closer to the data center floor. Stepping power down to 480 V close to the conditioned loads results in less power lost in distribution.
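The efficiency gain from higher-voltage distribution follows from basic conduction physics: for a fixed power draw, current falls in proportion to voltage, and resistive losses fall with the square of the current. A minimal sketch of that relationship (the feeder resistance and load figures below are illustrative assumptions, not numbers from any specific facility):

```python
def conduction_loss_watts(power_w, voltage_v, line_resistance_ohms):
    """I^2 * R loss in a feeder carrying power_w at voltage_v.

    Simplified single-conductor model; real three-phase AC
    distribution adds power-factor and phase terms.
    """
    current_a = power_w / voltage_v
    return current_a ** 2 * line_resistance_ohms

# Same 1 MW load and feeder resistance, two distribution voltages.
load_w = 1_000_000
feeder_ohms = 0.01  # hypothetical feeder resistance

loss_480v = conduction_loss_watts(load_w, 480, feeder_ohms)
loss_12kv = conduction_loss_watts(load_w, 12_470, feeder_ohms)

# Loss scales with 1/V^2: raising distribution voltage roughly
# 26x cuts the resistive loss in that run by roughly 675x.
print(f"{loss_480v:,.0f} W at 480 V vs {loss_12kv:,.0f} W at 12.47 kV")
```

This is why the final step-down to 480 V happens as close to the load as possible: the high-current, lossy segment of the run is kept as short as it can be.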

Cooling is the other major area where energy savings are being achieved. Where geography and climate permit, data center owners and operators are taking advantage of free cooling via airside and waterside economization. Supplementing this form of cooling with chillers only in the hottest months, and operating the data center at higher overall temperatures, is also reducing energy consumption.

You can therefore expect to see more data center customization based on location, including climate, humidity and air and water quality.  Efficient data centers will be designed for the optimum use of both scarce and plentiful local resources, instead of the "one design fits all" approach common today.   There will always be a robust demand for traditional data centers, but expect more of the tech-centric enterprises to shift to highly-customized solutions engineered for specific needs and locations.

Recent advancements in specialized mechanical architectures will also optimize the flow of air and enable granular visibility and control of cooling with real-time data and power metering.

With these electrical, mechanical and architectural innovations campus-scale wholesale data centers are matching or closely approaching the best Power Usage Effectiveness (PUE) numbers for enterprise-owned data centers by the likes of Facebook and Google.  With the closing of the innovation gap, the decision then becomes one of whether to build or lease.

Here is a recent (April 2011) article in InformationWeek on How to Build a Modern Data Center.

Upgrading, Consolidating or... Leasing

By understanding the critical elements of a high-efficiency data center and the available options, and by looking at metrics such as PUE, Carbon Usage Effectiveness (CUE), which measures the carbon emissions associated with operating a data center (not constructing it), and Water Usage Effectiveness (WUE), which measures how efficiently a data center uses water, enterprises can make better decisions about whether their existing data center(s) can or should be upgraded.
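These three metrics share a common structure: each normalizes a facility-level resource against the energy actually delivered to IT equipment. A minimal sketch of the calculations (the annual input figures below are hypothetical, chosen to depict a legacy facility):

```python
def pue(total_facility_kwh, it_kwh):
    """Power Usage Effectiveness: total energy per IT kWh (ideal = 1.0)."""
    return total_facility_kwh / it_kwh

def cue(total_co2e_kg, it_kwh):
    """Carbon Usage Effectiveness: kg CO2e emitted per IT kWh (operations only)."""
    return total_co2e_kg / it_kwh

def wue(water_liters, it_kwh):
    """Water Usage Effectiveness: liters of water consumed per IT kWh."""
    return water_liters / it_kwh

# Hypothetical annual figures for an older facility.
it_kwh = 10_000_000
facility_kwh = 19_500_000  # nearly half the power never reaches IT gear

print(round(pue(facility_kwh, it_kwh), 2))        # ~1.95
print(round(cue(facility_kwh * 0.4, it_kwh), 2))  # assumed 0.4 kg CO2e/kWh grid
print(round(wue(15_000_000, it_kwh), 2))          # assumed water consumption
```

A PUE near 2.0 matches the observation above that only about half the electricity entering a traditional building reaches the IT load; the innovative campus-scale facilities discussed here operate much closer to the ideal of 1.0.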

Per IDC (2010), the average data center in the U.S. is 12 years old, often too old to be upgraded economically because of inadequate electrical systems and other physical and site limitations. A site's power distribution features, for example, are not something a company can readily go back and replace to save energy.

Every business will need to assess for itself the difference that leasing a more efficient building could make compared with owning an older building that is wasting increasing amounts of power and cooling every year as power demands increase.

If it is not possible to upgrade a data center, the build/lease question should be addressed.

What are the capital expense and risk involved in building or expanding data center capacity, and what are the opportunity costs, in time and potential unrealized return, of a decision to build? As innovation accelerates, how reasonable is it to expect internal teams to keep up?  What will be the ongoing operating expense of the new data center, and what is the TCO over the 10-15 year lifespan of a modern, optimized facility that delivers more IT capacity (more services, applications, etc.) per kW?

What are the costs, advantages and other considerations of leasing data center space?

The ability to quickly access secure space and scale economies with operational service levels as needs evolve has strategic competitive implications, as does being able to reduce OPEX while preserving ownership and control of critical IT assets.

Smart data centers, whether they are owned or leased, offer significant environmental benefits and measurable cost savings.  For example, a 20,000-square-foot space in a smart data center can reduce power and cooling costs by more than $1 million per year.  Data center innovation will become a critical inflection point, especially for technology-centric organizations, in the next 5-10 years.  And the location of those data centers will drive the location of strategic jobs, economic growth and the efficient stewardship of environmental resources.  The innovations being designed into these new catalysts of innovation will similarly drive additional IT efficiencies and innovations in other commercial and even residential construction.
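The $1 million figure can be sanity-checked with a back-of-envelope model comparing a legacy facility to an efficient one. All inputs below (power density, PUE values, electricity rate) are illustrative assumptions, not figures from the article:

```python
# Hypothetical 20,000 sq ft space at 150 W/sq ft of IT load.
it_load_kw = 20_000 * 0.150   # 3,000 kW of IT power
hours_per_year = 8_760
rate_per_kwh = 0.08           # assumed industrial electricity rate, $/kWh

def annual_energy_cost(it_kw, pue):
    """Total facility electricity cost: IT load scaled by PUE overhead."""
    return it_kw * pue * hours_per_year * rate_per_kwh

legacy_cost = annual_energy_cost(it_load_kw, pue=1.9)  # traditional space
smart_cost = annual_energy_cost(it_load_kw, pue=1.3)   # efficient space

savings = legacy_cost - smart_cost
print(f"${savings:,.0f} per year")  # comfortably above $1M with these inputs
```

Even with conservative assumptions, a 0.6-point PUE improvement on a multi-megawatt load clears $1 million in annual power and cooling savings, which is consistent with the claim above.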

More Stories By Greg Ness

Greg Ness is a Silicon Valley marketing veteran with a background in networking, security, virtualization and cloud computing. He is VP Marketing at CloudVelocity. Formerly at Vantage Data Centers, Infoblox, Blue Lane Technologies, Juniper Networks, Redline Networks, McAfee, IntruVert Networks and ShoreTel. He is one of the world's top cloud bloggers.
