
Data Centers: Where Big Data Will Be Exploited

First things first: powering Big Data

An estimated 2.5 quintillion bytes of data are created every day, pushing the volume of data that companies must process and manage to unimaginable levels. Because of the power and low-latency connectivity that such data growth demands, many companies have become more inclined to outsource their Big Data needs to colocation data center facilities. This in turn has created huge demand for colocation space as additional processing ground for Big Data. According to analyst firm Nemertes, colocation providers will not have enough available space to capitalize on approximately $869 million of market demand by 2015. That demand is well founded: colocation data centers offer significant benefits for Big Data, including high-density power, opportunities to reduce latency and a community of like-minded companies with which to cross-connect.

First Things First: Powering Big Data
A lot has been made of Big Data analytics and the tools best suited to cataloging and valuing Big Data. Before analyzing data, however, companies must first ensure they can meet the power demands of such in-depth analysis. As the volume of data to be processed grows, compute speeds must rise to keep pace, and that requires more power. Traditionally, power densities for data computation have been well below 1 kilovolt-ampere (kVA) per square meter; with the rise of Big Data, they are now pushing past 15 kVA per square meter. Without the ability to meet these demands, it hardly matters which analytics tool a company uses, because it won't have the resources to make that tool effective.
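For a rough sense of what that jump means at the rack level, here is a small illustrative calculation; the footprint figure below is an assumption for the sketch, not a number from this article.

```python
# Illustrative only: rack-level power draw at the two area densities cited above.
# The ~2.5 m^2 footprint (a rack plus its share of aisle space) is an assumed value.
FOOTPRINT_M2 = 2.5
LEGACY_DENSITY_KVA_PER_M2 = 1.0     # traditional: around 1 kVA per square meter
BIG_DATA_DENSITY_KVA_PER_M2 = 15.0  # Big Data era: upwards of 15 kVA per square meter

legacy_rack_kva = FOOTPRINT_M2 * LEGACY_DENSITY_KVA_PER_M2
big_data_rack_kva = FOOTPRINT_M2 * BIG_DATA_DENSITY_KVA_PER_M2

print(f"Legacy rack draw:   ~{legacy_rack_kva:.1f} kVA")
print(f"Big Data rack draw: ~{big_data_rack_kva:.1f} kVA")  # roughly a 15x increase
```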

Colocation data centers are invaluable in this regard. Because they purchase power in bulk, they can offer it to customers at a much lower cost than those customers would pay to run a private data center. Data center facilities are also required to maintain backup power supplies, so a power outage or electrical shortage leaves enough reserve capacity to avoid any impact on Big Data value or performance for the companies hosted in the facility. Finally, colocation data centers charge only for the power each customer actually uses, so companies capture real energy savings rather than miscalculating their usage and consistently overpaying for such an expensive resource.

Slow and Steady Doesn't Always Win the Race
Beyond the immense power requirements, Big Data demands faster compute speeds than the historic norm. This is not easily achieved, but without it companies may face significant latency. That may not matter in fields such as scientific research, where quality is prioritized over speed, but in many other industries, including online and mobile advertising, even a few seconds of latency can degrade service or strip value from Big Data.

For example, a mobile advertising company that targets consumers based on their location must process GPS data, consumer demographics and preferences, and advertising platform data - all within the time it takes a consumer to walk past a storefront. Time is clearly of the essence when the window of opportunity for a sale is only a few seconds long.

The interval between the GPS data indicating that a customer is approaching a storefront and the end result of an advertisement appearing on their phone will inevitably have some lag - something I like to call the "virtual hop." Simply put, the virtual hop is the time required to mine and manipulate data to produce an end result. The concept is well established for website impressions, which demand a virtual hop of less than two seconds. Today, the virtual hop in the mobile advertising scenario described above is much longer than that, though it should shrink quickly, much as website response times did. Rather than wait, companies are looking for solutions they can implement today to reduce their virtual hop time - and are finding the answer in colocation data center facilities.
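As a rough sketch of what a virtual hop budget looks like in practice, the snippet below times a location-triggered ad pipeline against a two-second target; the stage names, sleep times and budget are illustrative assumptions, not an implementation described in this article.

```python
import time

# Assumed budget, comparable to the sub-two-second web impression figure above.
VIRTUAL_HOP_BUDGET_S = 2.0

def lookup_location(device_id):
    # Placeholder: resolve the device's GPS position.
    time.sleep(0.05)
    return (51.5074, -0.1278)

def lookup_demographics(device_id):
    # Placeholder: fetch consumer demographics and preferences.
    time.sleep(0.10)
    return {"segment": "frequent-shopper"}

def select_advert(location, profile):
    # Placeholder: match advertising platform inventory to the consumer profile.
    time.sleep(0.08)
    return "storefront-offer-123"

def serve_ad(device_id):
    start = time.monotonic()
    location = lookup_location(device_id)
    profile = lookup_demographics(device_id)
    ad = select_advert(location, profile)
    virtual_hop = time.monotonic() - start      # total mine-and-manipulate time
    within_budget = virtual_hop <= VIRTUAL_HOP_BUDGET_S
    return ad, virtual_hop, within_budget

if __name__ == "__main__":
    ad, hop, ok = serve_ad("device-42")
    print(f"Selected {ad} in {hop:.2f}s (within budget: {ok})")
```

Each stage's latency adds to the virtual hop, which is why shaving network round-trips between the data sources matters as much as raw compute speed.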

Connect With the Server Next Door
Colocating infrastructure in a data center can remove a layer of latency and reduce virtual hop times dramatically by providing common ground where companies that regularly work together can collaborate and exchange information. The ability to connect the servers of two companies housed within the same data center directly, a practice known as cross-connecting, can greatly expedite compute times and eliminate a significant portion of latency.

Cross-connecting is already used in many colocation facilities to deliver near-instantaneous collaboration between data center tenants. Common examples include rapid deployment to a cloud environment and seamless content aggregation across digital media platforms.

The communities of like-minded companies that develop within data centers offer an ideal environment for reduced latency and improved Big Data analysis. These communities typically cater to specific verticals, including cloud, financial services and digital media. Returning to the mobile advertising example, a cross-connect between the mobile provider's database of customer demographics and the advertiser's data would greatly improve the process, because the two servers could work in concert from within the same digital media facility.

Despite their enormous benefits to customers, these communities are not yet a common feature of colocation facilities. A recent study from Infineta Systems found that data center-to-data center connectivity, as opposed to cross-connecting, is a "silent killer" for Big Data deployments, because many data centers do not prioritize building communities within their facilities. It is therefore a distinguishing feature that companies should look for when evaluating data centers to house their Big Data.

Back to Basics
Without the high-density power required to process data, or the ability to improve Big Data analysis through connectivity and cross-connections, companies are essentially collecting data for fun.

More Stories By Ian McVey

As Director of Marketing and Business Development, Ian McVey is responsible for developing and driving Interxion’s go-to-market proposition for the enterprise and systems integrator segments, including all aspects of sales and marketing. He has over 15 years of industry experience in a variety of strategy, sales and marketing roles at Microsoft, Cable & Wireless and LEK Consulting. Before joining Interxion in 2011 he was Director of Sales and Business Development for the Microsoft Practice at CSC. He holds an MBA from London Business School and a Masters in Engineering from the University of Oxford.


