Examining the True Cost of Big Data

As you start on your Big Data journey or project, be sure to ask what exactly the business requires

The good news about the Big Data market is that we generally all agree on the definition of Big Data: data of such volume, velocity and variety that businesses need to collect, store, manage and analyze it in order to derive business value. Together these are often called the "4 V's." The problem with such a broad definition, however, is that it can mean different things to different people once you start to put real values next to those V's.

Let's be honest: volume means different things to different organizations. To some it is anything above 10 terabytes of managed data in their BI environment; to others it is petabyte scale and nothing less. Likewise, velocity can mean multiple billions of daily records coming into the enterprise from various external and internal networks. When it really comes down to it, each business situation will differ, not only in size and speed but, more importantly, in the business use case or requirement. A large bank's Big Data problem can be very different from that of an online retailer or an airline. Compare what a hospital is trying to do in collecting and analyzing patient sensor data with a utilities provider running a smart grid, or with a telecommunications operator: true, all of it could be categorized as machine-generated or raw data, but the exact type of data differs, not to mention the volume and growth rate. Probably the one common denominator across all of these industries is that everyone is keeping the data for longer periods. No one is throwing it away, not even the detailed data.

The Many Cost Factors to Consider
Costs will of course vary depending on the individual IT budget, but regardless, how the company allocates budget dollars to new Big Data initiatives needs consideration. Let's face it: enterprise buyers didn't suddenly come into a pile of newfound IT assets or budget line items, and the current world economic situation certainly doesn't suggest they will. More likely, existing budgets are being re-allocated: instead of spending more on, say, existing traditional data warehouses or appliances, monies are directed to new projects built on open source technologies such as Apache Hadoop, which promises low cost and ease of scale, not to mention an obvious fit for managing and analyzing multi-structured data sets. The difficulty then arises: how do you integrate your Hadoop environment, or have it co-exist, with the established BI or DW environment that the business has grown to love and rely upon?

Leverage What You Already Have
Let's assume you have a data warehouse or data mart in place today, along with various ETL or data-movement tools and BI dashboard, analytics or reporting tools, and you don't want to disrupt business users, which could mean not only degrading performance levels but also retraining them on a new set of tools. In fact, you are likely already beholden to strict SLAs around response times for the various business reports and KPIs. At the same time, the business is demanding access to new data sets in order to glean better insights, either by analyzing that data directly or by co-mingling it with existing customer data. This could take the form of web logs, clickstream data or social media data from the various interactive sites the business is now leveraging and tracking. The promise of improving profit margins and gaining a competitive edge just cannot be ignored.
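To make those new data sets concrete, here is a minimal sketch, not from the article, of turning one raw web-log line into a structured record that could later be co-mingled with customer data. The regex, field names and sample line are illustrative assumptions:

```python
import re

# Apache "combined log format" style line -> structured record.
# Pattern and sample are illustrative assumptions, not a vendor spec.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

sample = '203.0.113.7 - - [19/Sep/2012:10:15:32 -0700] "GET /checkout HTTP/1.1" 200 5120'

match = LOG_PATTERN.match(sample)
if match:
    record = match.groupdict()
    # Downstream, records like this can be joined to existing customer data.
    print(record['ip'], record['path'], record['status'])
```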

As we all know, traditional relational or columnar databases weren't built for unstructured data types, so IT needs to roll out a different solution to satisfy the business demands. Evaluations can take many forms, but they typically start with which Hadoop distribution, which NoSQL or NewSQL database, and which query-access tools to use in addition to MapReduce. It is no easy task: a large number of technology solutions on the market today claim to run on or with Hadoop, providing MapReduce or SQL-like capabilities and satisfying the requirement of managing volumes of unstructured data. Some are more mature than others, some are proven, and not all are low cost.

Open source looks very low cost on the surface, but as soon as you require any level of support - and let's face it, once the environment is live and business-critical, you will - you will need to allocate a budget line item. Nor will the Big Data line item be just one line; it needs to cover every component required to properly roll out a solution that truly satisfies the business demands. Just as in any other IT environment, the obvious pieces include software licensing and support, hardware, skilled dedicated resources, professional services and training, plus the dedicated time of business users to provide input on key requirements, including the types of reports, queries and analysis, which will naturally change and evolve over time.
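The skills cost behind "MapReduce versus SQL-like access" is easy to see side by side. Below is a hedged sketch, simulated locally in plain Python rather than on a real cluster, of the map/reduce pattern a team must hand-code for a question a BI tool would express as one line of SQL (SELECT url, COUNT(*) FROM weblogs GROUP BY url). On a cluster, the map and reduce functions would ship as separate Hadoop Streaming scripts:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Emit (url, 1) per request; field 6 is the path in combined-format logs."""
    for line in lines:
        fields = line.split()
        if len(fields) > 6:
            yield fields[6], 1

def reduce_phase(pairs):
    """Sum counts per URL; sorting stands in for Hadoop's shuffle/sort step."""
    for url, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield url, sum(count for _, count in group)

# Illustrative log lines, matching the parsing sketch earlier in the article.
logs = [
    '203.0.113.7 - - [19/Sep/2012:10:15:32 -0700] "GET /checkout HTTP/1.1" 200 5120',
    '198.51.100.2 - - [19/Sep/2012:10:15:33 -0700] "GET /checkout HTTP/1.1" 200 4987',
    '198.51.100.2 - - [19/Sep/2012:10:16:01 -0700] "GET /home HTTP/1.1" 200 812',
]

for url, hits in reduce_phase(map_phase(logs)):
    print(url, hits)
```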

Big Data Costs Can Quickly Creep Up
In terms of the hardware expenditure required to manage the new Big Data set, you may start out with a Hadoop cluster of, say, 10 nodes, which is certainly manageable. But if your data velocity is significant, you can quickly reach 100+ nodes, and now you face a number of other expenses: additional headcount and skilled resources to manage the environment proactively; tools for managing the cluster, including system management and alerting; and potentially add-on software, which varies by business use case but might cover real-time analytics against streaming data for, say, fraud detection or spotting unusual patterns. You may also need a business tool that provides a front-end GUI dashboard to track specific KPIs, or data visualization tools so business users can quickly understand what is going on. Very quickly the costs become less about storage and hardware and more about the software that extracts the most value from this newly collected data set.
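A back-of-envelope sketch shows how that 10-node cluster creeps past 100 nodes. Every parameter below is an illustrative assumption, not vendor guidance or a figure from the article:

```python
import math

DAILY_INGEST_TB = 2.0       # raw data arriving per day (assumed)
REPLICATION = 3             # HDFS default block replication
COMPRESSION_RATIO = 5.0     # assumed 5:1 savings from database-level compression
USABLE_TB_PER_NODE = 12.0   # disk per node left for data after OS/temp space

def nodes_needed(days_retained):
    """Nodes required to hold the retained, replicated, compressed data."""
    stored_tb = DAILY_INGEST_TB * days_retained * REPLICATION / COMPRESSION_RATIO
    return max(1, math.ceil(stored_tb / USABLE_TB_PER_NODE))

for months in (1, 6, 12, 24, 36):
    print('%2d months of retention -> ~%d nodes' % (months, nodes_needed(30 * months)))
```

With these assumptions the cluster grows from 3 nodes at one month of retention to roughly 108 at three years, and since no one is throwing detailed data away, retention is the term that compounds.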

There is no denying that Big Data presents great new opportunities, but reaching a quantifiable ROI in a short time frame is still a very real challenge. Everyone is talking about Big Data and the innovative technology approaches to tackling it, but it is still difficult to find many business success stories within any one industry sector. The market is still fairly immature, but the good news is that it is moving at a much faster pace than any other IT effort today, and our data warehouse and BI forefathers have provided lessons learned over the past two decades.

Big Data Is Big Business but It Comes with Strict Requirements
If we want to examine the main areas of expenditure for a Big Data project more closely, it is probably best to look through the lens of a specific type of business and use case. Take a large financial institution with a number of existing traditional data warehouse / BI environments. The business doesn't want to throw any data away (let's face it, regulations don't allow that for a number of years), and it realistically wants to retain specific data sets for ongoing trending and analysis. That includes examining questions such as "what constitutes a low-risk client, based on spending behavior patterns over a specific time period cross-referenced with customer demographics?", which will help the institution better target a particular segment of the market.
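To make that analytical question concrete, here is a small sketch of the shape of such an analysis. All table names, column names and the "low-risk" rule are invented placeholders, not a real credit model:

```python
import pandas as pd

# Hypothetical sample data standing in for warehouse tables.
transactions = pd.DataFrame({
    'customer_id': [1, 1, 2, 2, 3],
    'amount':      [120.0, 80.0, 2500.0, 1900.0, 40.0],
    'month':       ['2012-07', '2012-08', '2012-07', '2012-08', '2012-08'],
})
demographics = pd.DataFrame({
    'customer_id': [1, 2, 3],
    'age_band':    ['35-44', '25-34', '55-64'],
    'region':      ['West', 'East', 'West'],
})

# Spending behavior over the period: average monthly spend per customer.
monthly = transactions.groupby(['customer_id', 'month'])['amount'].sum()
avg_spend = monthly.groupby(level='customer_id').mean().rename('avg_monthly_spend')

# Cross-reference with demographics and flag a placeholder "low-risk" segment.
profile = demographics.merge(avg_spend.reset_index(), on='customer_id')
profile['low_risk'] = profile['avg_monthly_spend'] < 500
print(profile)
```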

Given that the IT budget doesn't allow spend to increase in line with data growth rates, the institution needs to seriously reduce costs, and so decides to go the route of a Hadoop-based environment, given its promise of low-cost scale and its ability to provide insights into customer patterns by capturing semi-structured and unstructured data. Front-ending the warehouse with a dedicated Hadoop cluster is the preferred architectural approach, but business users still want access to both the Hadoop environment and the existing traditional data warehouse.

Given that we are talking about a financial institution, security and availability quickly rise to the top of the requirements list. At the same time, if business users want to access that data, SQL query access, including using the current BI tool against the new data set, is also a requirement. If you can avoid moving large chunks of data from one environment to the other on a frequent basis, you will reduce both costs and latency. In an ideal world, being able to leverage the skill sets you already have and avoiding duplication of work is key.
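In practice, "use the current BI tool" usually means standard SQL over an ODBC or JDBC connection. A hedged sketch of what that looks like from Python: it assumes an ODBC driver and a DSN named HadoopSQL have been configured for whatever SQL-on-Hadoop layer is deployed, and the table and column names are illustrative:

```python
import pyodbc

# Assumed: an ODBC DSN "HadoopSQL" pointing at the SQL-on-Hadoop layer.
conn = pyodbc.connect('DSN=HadoopSQL', autocommit=True)
cursor = conn.cursor()

# The same standard SQL a BI tool already issues against the warehouse.
cursor.execute("""
    SELECT region, COUNT(*) AS customers
    FROM customer_events
    WHERE event_date >= '2012-01-01'
    GROUP BY region
""")
for region, customers in cursor.fetchall():
    print(region, customers)
conn.close()
```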

Below is a quick table outlining the main cost factors to consider, with a comment against each area on how costs could be reduced.

Big Data on Hadoop cost factors, with key considerations to drive down cost:

Storage: Look at databases that provide data compression to yield storage savings (better than gzip or LZO); a quick way to benchmark a gzip baseline on your own data is sketched after this table.

Hardware (nodes): Granular data compression at the database level will reduce the number of nodes needed over time.

Data analytics (skilled resources): Examine technology solutions that provide standard SQL or BI tool access in addition to MapReduce (Pig, etc.).

Cluster management (skilled resources): Leverage existing DevOps staff if you deploy a SQL-compliant data environment.

Security: Look for database solutions that provide built-in security permissions and access controls.

Availability / DR: Consider a data management environment that doesn't require additional tools for replication.

Training: Consider solutions where you don't need to retrain or hire all-new resources; leverage what you have (standard SQL-skilled DBAs).
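As promised against the Storage row above, here is a rough sketch for measuring what plain gzip achieves on a sample of your own data, as a baseline to compare against the compression ratio a database vendor quotes. The file path is an illustrative assumption:

```python
import gzip
import os

SAMPLE = 'sample_weblogs.log'  # assumed: a representative sample file

# Compress the sample and compare sizes to estimate the baseline ratio.
raw_size = os.path.getsize(SAMPLE)
with open(SAMPLE, 'rb') as src, gzip.open(SAMPLE + '.gz', 'wb') as dst:
    dst.write(src.read())
compressed_size = os.path.getsize(SAMPLE + '.gz')

print('raw: %d bytes, gzipped: %d bytes, ratio: %.1f:1'
      % (raw_size, compressed_size, raw_size / compressed_size))
```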

Summary: Consider All Factors and Get Business Buy-in Quickly
Big Data is fundamentally a business problem. If you begin with the question "what is the business trying to achieve by collecting, storing and analyzing this new set of data?", you will start down the right path to realizing business gains. Whether you outsource the initiative or bring in external consultants and vendors to manage the project, the same questions will arise, and by leveraging what you already have, both existing IT environments and existing skills, you will be better able to contain costs.

Furthermore, we all love the promise of innovative new technologies such as Hadoop and MapReduce, but without leveraging the tried-and-tested standards we have come to love and respect, they don't make a whole lot of sense from either a technical or an economic standpoint. As you start on your Big Data journey or project, be sure to ask what exactly the business requires and how you can leverage what you already have today. As we all know, getting business-user buy-in is half the battle in a successful rollout.

More Stories By John Bantleman

John Bantleman, CEO of RainStor, has more than 20 years' experience in the management of software companies. Prior to overseeing RainStor, he transformed LBMS into a $45 million business ahead of its successful NASDAQ flotation in 1997; LBMS's technology is now part of CA's product portfolio. The following year John was instrumental in the launch of Evolve, and drove the company through to a successful IPO on NASDAQ.

Returning to the UK in 2003, John spent 12 months working on the advisory boards of venture capital organizations such as Apax Partners. He joined RainStor Inc. as Chairman in 2004, became CEO at the start of 2007, and relocated back to the US to head up worldwide operations in 2009.



Most Recent Comments
Vikas.Deolaliker 09/21/12 06:49:00 PM EDT

Great article. Another data point: IT budgets are up only 4% in 2013 over 2012, so don't expect everyone to rush into Big Data.

The fourth "V" is visualization. If you cannot render the analysis in a intuitive way, there is no value in that analysis. In fact, visualization should be the first step in design of a bigdata system - it helps trim down the architectural bloat into something that is within budget and useful.

Elad Israeli 09/19/12 06:07:00 PM EDT

Fascinating post. Still waiting for someone to crack the nut that is Big Data Analytics.

douglaney 08/29/12 03:36:00 PM EDT

Great piece, John. Excellent detail. Thought you and your readers might be interested in where the "3Vs" of big data originated: in a Gartner piece I authored over 11 years ago. I recently unearthed a copy so folks can refer to it and cite it.

Cheers,
Doug Laney, VP Research, Gartner, @doug_laney
