Data Centers: The Next Wave of IT Innovation

Industry poised for electrical, mechanical and architectural innovations

Energy has long been one of the largest variable costs for large technology companies, and today it is becoming increasingly material for large enterprises of all kinds. Last week at the FIRE Conference I heard Ford's CTO talk about how Ford wanted to be seen as a technology company. As I listened to Paul Mascarenas talk about smart cars, I couldn't help but wonder how many of the world's largest companies have at least discreetly considered evolving toward more technology-centric products.

The data center is the factory of the new cloud economy and a major inflection point for enterprise profitability. Those who deliver the most apps, services, etc. per kilowatt-hour have a competitive advantage. And with data centers accounting for close to 1.5% of electricity consumption in the U.S., increasing energy efficiency in the data center is becoming a strategic business and community imperative.

Since before the dotcom era, enterprises have built their own data centers with a keen focus on availability, or uptime.  Many of those data centers have now outlived their usefulness and are substantial burdens on their IT teams.  As new data centers are built, uptime considerations need to be combined with efficiency considerations.  They must be addressed together.

Increasing demand for IT resources, rising rack densities, and growing power and cooling requirements are exposing tired designs. Simply adding more space is a shortsighted approach to what promises to be a longstanding issue: the efficient use of company resources, especially those strategic to the bottom line.

Today's data centers are, on average, more than 30% more efficient than data centers built even five years ago, thanks to the electrical and mechanical innovations driven by rising densities. Well-capitalized tech companies (including Google and Facebook) have invested billions in data center innovation, from sophisticated water cooling to internal rack architectures optimized for efficient airflow.

Many enterprises, however, are suspended between the cost and risk of building innovative data centers and leasing wholesale data centers. The traditional wholesale data center industry (including Digital Realty Trust [DLR], Dupont Fabros Technology [DFT] and regional player CoreSite [COR]) has been very successful in building standardized designs that address a subset of the enterprise data center market. Innovation, in a nutshell, has been limited to those with the deep pockets and courage to build their own.

Today wholesale data centers can be classified as innovative (engineering-optimized for specific enterprise goals and for local resource abundance or scarcity) or traditional (from pods to containers, one type of space serves all).

With Vantage Data Centers entering the market (see highlights from our Smart Data Center Revolution event on Earth Day 2011), expect to see some changes in an otherwise transaction-centric industry.

Increasing Reliability and Efficiency

As wholesale data center providers evolve, you can expect more campus-scale projects with:

  • dedicated substations and higher voltage distribution from the substation to the data center floor;
  • elimination of PDUs;
  • redundant backup generator power with 2N electrical configurations to the floor;
  • high efficiency UPS units; and
  • pre-provisioning of data centers for additional load (vertical scalability), including skid-mounted generators and UPS units, and pre-provisioned switchgear.

Enterprises that continue to operate or lease traditional data center space (where only about half the electricity entering the building actually reaches the IT equipment; the rest is consumed by power distribution losses and cooling) put themselves at a competitive disadvantage. They pay significantly more for the operation of every server. Increasingly, what is good for business is good for the environment, and vice versa.

The problem was starting to appear as early as five years ago (from Computerworld):

Data centers "are becoming more and more swollen," IDC analyst Vernon Turner said today at the IDC Virtualization Forum here. Most of the servers purchased today cost less than $3,000. And while that may sound inexpensive, the annual power and cooling bill for 100 servers is about $40,000. In total, for every $1 spent on a server, $7 is spent on support, he said.

- Patrick Thibodeau, "Servers Swamp Data Centers as Chip Vendors Move Ahead," Computerworld, Feb. 6, 2006
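
To put those figures in perspective, here is a quick back-of-the-envelope sketch in Python. It uses only the numbers from the quote above; the four-year service life is an assumed value for illustration.

    # Lifetime cost of a "cheap" server, using the figures quoted above.
    # The four-year service life is an assumption, not a quoted figure.
    server_price = 3_000                    # purchase price (USD)
    power_cooling_per_year = 40_000 / 100   # ~$400/yr per server
    support_per_dollar = 7                  # $7 of support per $1 of server

    years = 4                               # assumed service life
    lifetime_cost = (server_price
                     + power_cooling_per_year * years
                     + server_price * support_per_dollar)
    print(f"{years}-year cost of a ${server_price:,} server: ${lifetime_cost:,.0f}")
    # => $25,600 -- the sticker price is barely an eighth of the total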

After the energy consumed directly by the servers, routers and switches within a data center, power distribution and cooling offer the most significant opportunities for energy conservation. New, high-efficiency data centers from the innovators bring utility power onto the site at distribution voltages of 12 kV to 34.5 kV and step it down to 480 V as close as possible to the conditioned power loads, which reduces distribution losses.
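
The physics behind the higher distribution voltage is straightforward: for a fixed delivered power P over conductors of resistance R, the current is I = P/V, so the resistive loss I²R falls with the square of the voltage. A minimal sketch of that scaling (the power and resistance figures are assumptions, and the single-resistance model ignores three-phase and transformer details):

    # Resistive loss vs. distribution voltage for a fixed delivered power.
    # P and R are illustrative assumptions; real feeds are three-phase AC,
    # but the 1/V^2 scaling of I^2*R losses holds regardless.
    P = 2_000_000   # 2 MW delivered (watts)
    R = 0.05        # assumed conductor resistance of the feed (ohms)

    for volts in (480, 12_000, 34_500):
        amps = P / volts
        loss_kw = amps ** 2 * R / 1_000
        print(f"{volts:>6} V: {amps:>8.1f} A, loss ~ {loss_kw:,.2f} kW")
    # 480 V would burn ~868 kW in this feed; 12 kV ~1.4 kW; 34.5 kV ~0.17 kW.
    # Hence: distribute at kV levels and step down to 480 V only near the load.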

Cooling is the other major area where energy savings are being achieved. Where geography and climate permit, data center owners and operators are taking advantage of free cooling via airside and waterside economization. Supplementing this form of cooling with chillers only in the hottest months, and operating the data center at higher overall temperatures, further reduces energy consumption.

You can therefore expect to see more data center customization based on location, including climate, humidity and air and water quality.  Efficient data centers will be designed for the optimum use of both scarce and plentiful local resources, instead of the "one design fits all" approach common today.   There will always be a robust demand for traditional data centers, but expect more of the tech-centric enterprises to shift to highly-customized solutions engineered for specific needs and locations.

Recent advancements in specialized mechanical architectures will also optimize the flow of air and enable granular visibility and control of cooling with real-time data and power metering.

With these electrical, mechanical and architectural innovations, campus-scale wholesale data centers are matching or closely approaching the best Power Usage Effectiveness (PUE) numbers achieved by enterprise-owned data centers from the likes of Facebook and Google. With the innovation gap closing, the decision then becomes whether to build or lease.
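
For reference, PUE is simply total facility energy divided by the energy that actually reaches the IT equipment; a perfect facility would score 1.0. A minimal sketch, with assumed meter readings:

    # PUE = total facility energy / IT equipment energy (ideal = 1.0).
    # Meter readings below are assumed values for illustration.
    total_facility_kwh = 26_000   # everything entering the building (per day)
    it_equipment_kwh = 20_000     # servers, storage, network gear (per day)

    pue = total_facility_kwh / it_equipment_kwh
    print(f"PUE = {pue:.2f}")     # 1.30: 30% overhead for power and cooling
    # A traditional facility at PUE ~2.0 spends a full watt on overhead for
    # every watt of IT load; best-in-class sites have reported PUEs near 1.1.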

Here is a recent (April 2011) article in InformationWeek on How to Build a Modern Data Center.

Upgrading, Consolidating or... Leasing

By understanding the critical elements of a high-efficiency data center and the available options, enterprises can make better decisions about whether their existing data center(s) can or should be upgraded. The key metrics are PUE; Carbon Usage Effectiveness (CUE), which captures the carbon emissions associated with operating a data center (not constructing it); and Water Usage Effectiveness (WUE), which measures how efficiently a data center uses water.
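
CUE and WUE follow the same pattern as PUE, normalizing annual carbon emissions and water usage by IT energy. A sketch along the lines of The Green Grid's definitions (input values are assumptions, not measurements):

    # CUE (kg CO2e per kWh of IT energy) and WUE (liters per kWh of IT
    # energy). All inputs below are illustrative assumptions.
    it_energy_kwh = 20_000 * 365       # annual IT equipment energy
    total_co2e_kg = 4_500_000          # annual CO2e from all facility energy
    site_water_liters = 12_000_000     # annual site water usage

    cue = total_co2e_kg / it_energy_kwh
    wue = site_water_liters / it_energy_kwh
    print(f"CUE = {cue:.2f} kgCO2e/kWh, WUE = {wue:.2f} L/kWh")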

Per IDC (2010), the average data center in the U.S. is 12 years old, an age at which it typically cannot be upgraded economically because of inadequate electrical systems and other physical and site limitations. A site's power distribution infrastructure, for example, is not something a company can readily go back and replace to save energy.

Every business will need to assess for itself the difference that leasing a more efficient building could make compared with owning an older building that wastes more power and cooling every year as demands increase.

If it is not possible to upgrade a data center, the build/lease question should be addressed.

What are the capital expense and risk involved in building or expanding data center capacity, and what are the opportunity cost in time and the potential unrealized return of a decision to build? As innovation accelerates, how reasonable is it to expect internal teams to keep up? What will be the ongoing operating expense of the new data center, and what is the TCO over the 10-15 year lifespan of a modern, optimized data center that delivers more IT capacity (more services, applications, etc.) per kW?

What are the costs, advantages and other considerations of leasing data center space?

The ability to quickly access secure space and economies of scale, backed by operational service levels that evolve with need, has strategic competitive implications, as does being able to reduce OPEX while preserving ownership and control of critical IT assets.

Smart data centers, whether owned or leased, offer significant environmental benefits and measurable cost savings. For example, a 20,000-square-foot space in a smart data center can reduce power and cooling costs by more than $1 million per year. Data center innovation will become a critical inflection point, especially for technology-centric organizations, over the next 5-10 years. The location of those data centers will drive the location of strategic jobs, economic growth and the efficient stewardship of environmental resources. And the innovations designed into these new facilities will drive additional IT efficiencies and innovations in other commercial, and even residential, construction.
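
That $1 million figure is plausible from the PUE math alone. A sketch, assuming a 100 W/sq ft IT power density, a $0.10/kWh utility rate, and a move from a traditional PUE of 2.0 to a smart-data-center PUE of 1.3 (all assumed values):

    # Annual savings from a PUE improvement on a 20,000 sq ft floor.
    # Density, utility rate and PUE values are assumptions for illustration.
    floor_sqft = 20_000
    it_watts_per_sqft = 100       # assumed IT power density
    price_per_kwh = 0.10          # assumed utility rate (USD)
    hours_per_year = 8_760

    it_load_kw = floor_sqft * it_watts_per_sqft / 1_000   # 2,000 kW IT load

    def annual_cost(pue):
        # Total facility electricity cost at the given PUE.
        return it_load_kw * pue * hours_per_year * price_per_kwh

    savings = annual_cost(2.0) - annual_cost(1.3)
    print(f"Annual savings: ${savings:,.0f}")   # ~$1,226,400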

More Stories By Greg Ness

Greg Ness is a Silicon Valley marketing veteran with a background in networking, security, virtualization and cloud computing. He is VP Marketing at CloudVelocity. He was formerly at Vantage Data Centers, Infoblox, Blue Lane Technologies, Juniper Networks, Redline Networks, McAfee, IntruVert Networks and ShoreTel. He is one of the world's top cloud bloggers.
