By Ian McVey
August 12, 2012 08:15 PM EDT
Every day, an estimated 2.5 quintillion bytes of data are generated worldwide, pushing the amount of data that must be processed and managed to unimaginable levels. Because such growth demands high-density power and low-latency connections, many companies have become more inclined to outsource their Big Data needs to colocation data center facilities. In turn, this has created a huge demand for colocation space as additional processing grounds for Big Data. According to analyst firm Nemertes, colocation providers will not have the available space to capitalize on approximately $869 million of market demand by 2015. The demand is well founded: colocation data centers offer huge benefits for Big Data, including high-density power, opportunities to decrease latency and a community of like-minded companies with which to cross-connect.
First Things First: Powering Big Data
A lot has been made of Big Data analytics and the tools most capable of cataloging and valuing Big Data. Before analyzing data, however, companies must first ensure that they can meet the power demands of such in-depth analysis. As the volume of data to be processed grows, compute speeds must increase to keep up, and with them the power required. Traditionally, power demands for data computation have been well below 1 kilovolt-ampere (kVA) per square meter; with the rise of Big Data, however, densities now exceed 15 kVA per square meter. Without the ability to meet these power demands, it hardly matters which analytics tool a company uses, as it won't have the resources to make the tool effective.
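To put these densities in concrete terms, here is a back-of-the-envelope sketch of what the jump from 1 to 15 kVA per square meter means per rack. The rack footprint and power factor below are illustrative assumptions, not figures from this article:

```python
# Back-of-the-envelope: rack power draw at a given floor power density.
# Assumptions (illustrative): each rack occupies ~2.5 m^2 of gross floor
# space including aisles, and real power roughly equals apparent power
# (power factor ~1), so kVA is treated as kW.

RACK_FOOTPRINT_M2 = 2.5  # gross floor area per rack, aisles included

def kw_per_rack(kva_per_m2, power_factor=1.0):
    """Approximate real power (kW) available to one rack."""
    return kva_per_m2 * RACK_FOOTPRINT_M2 * power_factor

legacy = kw_per_rack(1.0)     # traditional density
big_data = kw_per_rack(15.0)  # Big Data density

print(f"legacy: {legacy:.1f} kW/rack, big data: {big_data:.1f} kW/rack")
```

Under these assumptions, a rack goes from roughly 2.5 kW to roughly 37.5 kW, which is why cooling and power distribution, not floor space, become the binding constraints.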
Colocation data centers are invaluable in this regard: because they purchase power in bulk, they can offer it to customers at a much lower cost than those customers would pay in a private data center. In addition, data center facilities are required to maintain backup power supplies, so that a power outage or electricity shortage would leave enough power in reserve to avoid any impact on Big Data value or performance for companies hosted in the facility. Finally, colocation data centers charge only for the power each customer actually uses, so companies can realize significant energy savings instead of overestimating their needs and consistently overpaying for such an expensive resource.
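The metered-billing point can be made concrete with a small cost sketch. The rate, hours, and usage figures here are hypothetical placeholders, chosen only to show the shape of the comparison:

```python
# Sketch: metered colocation billing vs. paying for a fixed provisioned
# power block. Prices and usage figures are hypothetical placeholders.

PRICE_PER_KWH = 0.10   # USD, illustrative bulk rate
HOURS_PER_MONTH = 730  # average hours in a month

def metered_cost(avg_draw_kw):
    """Monthly cost when billed only for power actually consumed."""
    return avg_draw_kw * HOURS_PER_MONTH * PRICE_PER_KWH

def provisioned_cost(provisioned_kw):
    """Monthly cost when a fixed power block is paid for regardless of use."""
    return provisioned_kw * HOURS_PER_MONTH * PRICE_PER_KWH

# A company provisions 40 kW "to be safe" but averages only 25 kW:
overpay = provisioned_cost(40) - metered_cost(25)
print(f"monthly overpayment avoided by metering: ${overpay:,.0f}")
```

The gap between provisioned and actual draw is pure waste under flat billing; under metering it simply never appears on the invoice.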
Slow and Steady Doesn't Always Win the Race
Beyond the immense power requirements, Big Data requires faster compute speeds than the historic norm. This is not easily achieved, but without it, companies may face significant latency in their Big Data workloads. The difference may not matter in some fields, such as scientific research, where quality is prioritized over speed, but in many other industries, including online and mobile advertising, even a few seconds of latency can degrade service or erode the value of Big Data.
For example, a mobile advertising company that targets consumers based on their location will need to process GPS data, consumer demographics and preferences, and advertising platform data - all within the time it takes for a consumer to walk by a store front. Clearly, time is of the essence when the window of opportunity for a sale is just a few seconds long.
The time between the moment GPS data registers a customer approaching a storefront and the moment an advertisement appears on that customer's phone will inevitably have a lag - something I like to call the "virtual hop." Simply put, the virtual hop is the time required to mine and manipulate data to produce an end result. The concept is widely understood for website impressions, which require a virtual hop of less than two seconds. Currently, the virtual hop for the mobile advertising scenario described above is much longer than this, though it is expected to shorten considerably, much as website response times did. Rather than wait, though, companies are looking for solutions they can implement today to reduce their virtual hop time - and are finding answers in colocation data center facilities.
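A virtual hop is easiest to reason about as a latency budget. The stage names and millisecond figures below are hypothetical, chosen only to show how individual stages sum against the sub-two-second window cited for web impressions:

```python
# Sketch of a "virtual hop" latency budget for the mobile-advertising
# scenario. All stage durations are illustrative, not measurements.

BUDGET_MS = 2000  # the sub-two-second target cited for web impressions

stages_ms = {
    "ingest GPS fix": 50,
    "look up consumer demographics": 300,
    "match preferences to ad inventory": 400,
    "query advertising platform": 900,
    "deliver and render ad on device": 250,
}

total = sum(stages_ms.values())
print(f"virtual hop: {total} ms (budget: {BUDGET_MS} ms)")
for stage, ms in stages_ms.items():
    print(f"  {stage}: {ms} ms ({100 * ms / total:.0f}% of hop)")
```

Framed this way, shaving the hop is a matter of attacking the largest stages first, and the network-bound stages are exactly the ones colocation can compress.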
Connect With the Server Next Door
Colocating infrastructure in a data center has the potential to remove a layer of latency and reduce virtual hop times dramatically by providing a common ground for companies that often work together to collaborate and exchange information. The ability to directly connect to the servers of two different companies housed within the same data center, a concept known as cross-connecting, has huge potential to expedite compute times and eliminate a significant portion of latency.
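As a rough illustration of why this matters, consider a request pipeline that makes repeated calls between the two companies' systems. The round-trip times and call count below are hypothetical assumptions, not figures from this article:

```python
# Rough comparison of cumulative network latency for a pipeline that makes
# repeated calls between two companies' servers. All figures illustrative:
# a ~40 ms WAN round trip between separate data centers vs. a ~0.5 ms
# round trip over a cross-connect inside one facility.

CALLS_PER_REQUEST = 20  # round trips between the two parties' systems

def network_time_ms(rtt_ms, calls=CALLS_PER_REQUEST):
    """Total time spent on the wire for one end-to-end request."""
    return rtt_ms * calls

wan = network_time_ms(40.0)           # data center to data center over WAN
cross_connect = network_time_ms(0.5)  # direct fiber inside one facility

print(f"WAN: {wan:.0f} ms vs cross-connect: {cross_connect:.0f} ms")
```

Because per-call latency multiplies across every round trip, even a modest per-hop saving compounds into most of the virtual hop disappearing.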
Cross-connecting is already used in many colocation facilities to deliver near-instantaneous collaboration between data center tenants. Common examples include rapid deployment to a cloud environment and seamless content aggregation across digital media platforms.
The communities of like-minded companies that often develop within data centers offer an ideal environment for decreased latency and improved Big Data analysis. These communities typically cater to specific verticals, including cloud, financial services and digital media. Returning to the mobile advertising example, a cross-connect between the mobile provider's database of customer demographics and the advertiser's data would greatly improve the process, as the two servers could work in concert, seamlessly, from within the same digital media facility.
Despite their enormous benefits to customers, these communities are not yet a common feature across colocation facilities. A recent study from Infineta Systems found that data center-to-data center connectivity, as opposed to cross-connecting, is a "silent killer" for Big Data deployments, as many data centers do not prioritize the development of communities within their facilities. This is therefore a distinguishing feature that companies should seek out when evaluating data centers to house their Big Data.
Back to Basics
Without the high-density power required to process data, or the ability to improve Big Data analysis through connectivity and cross-connections, companies are essentially collecting data for fun.