
Examining the True Cost of Big Data

As you start on your Big Data journey or project, be sure to ask what exactly the business requires

The good news about the Big Data market is that we generally agree on the definition of Big Data: data that has volume, velocity and variety, and that businesses need to collect, store, manage and analyze in order to derive business value, otherwise known as the "4 V's." The problem with such a broad definition, however, is that it can mean different things to different people once you start to put real values next to those V's.

Let's be honest, Volume means different things to different organizations. To some it is anything above 10 terabytes of managed data in their BI environment; to others it is petabyte scale and nothing less. Likewise, velocity can mean multi-billions of daily records coming into the enterprise from various external and internal networks. When it really comes down to it, each business situation is quite different, not only in size and speed but, more important, in the business use case or requirement. A large bank's Big Data problem can be very different from that of an online retailer or an airline. Compare what a hospital is trying to do in collecting and analyzing all its patient sensor data with a utilities provider running a smart grid, or with a telecommunications operator. True, all of it could be categorized as machine-generated or raw data, but the exact type of data can differ, not to mention the volume or growth rate. Probably the one common denominator across all these industries is that everyone is keeping the data for longer periods. No one is throwing it away - not even the detailed data.

The Many Cost Factors to Consider
Costs will of course vary depending on the allocated IT budget, but regardless, how the company allocates IT budget dollars to new Big Data initiatives needs consideration. Let's face it, enterprise buyers didn't suddenly come into a bunch of newfound IT assets or budget line items, and the current world economic situation would certainly not suggest so. More likely, existing budgets are being re-allocated: instead of spending more on, say, existing traditional data warehouses or appliances, monies are being directed to new projects built on open source software, including Apache Hadoop, which promises low cost, ease of scale, and arguably the best approach to managing and analyzing multi-structured data sets. The difficulty then becomes how you integrate the Hadoop environment, or have it co-exist, with the established BI or DW environment that the business has grown to love and rely upon.

Leverage What You Already Have
Let's assume you have a data warehouse or data mart in place today, you already use various ETL or data-movement tools and BI dashboard, analytics or reporting tools, and you don't want to disrupt business users, which would mean not only impacting performance levels but also training them on a new set of tools. In fact, you are already likely beholden to strict SLAs around response times for the various business reports and KPIs. At the same time, the business is demanding access to new data sets in order to glean better insights, either by analyzing this data directly or by co-mingling it with existing customer data. This could take the form of web logs, clickstream data or social media data from the various interactive sites the business is now leveraging and tracking. The promise of improving profit margins and gaining a competitive edge just cannot be ignored.

As we all know, traditional relational or columnar databases can't handle unstructured data types, so IT needs to roll out a different solution to satisfy the business demands. Evaluations can take many forms but will typically start with which Hadoop distribution, which NoSQL or NewSQL database, and what query access tools to use in addition to MapReduce. It is no easy task: there are a large number of technology solutions on the market today that claim to run on or with Hadoop, providing MapReduce or SQL-like capabilities, and all satisfying the requirement of managing volumes of unstructured data. Some are more mature than others, some are proven, and not all are low cost. Open source looks very low cost on the surface, but as soon as you require any level of support - and let's face it, once the environment is live and relied upon as business critical, you will - you will need to allocate a line item on your budget. The Big Data line item won't be just one line, because it will need to include all the components required to properly roll out a solution that truly satisfies the business demands. Just like any other IT environment, the obvious pieces include: software licensing and support, hardware, skilled dedicated resources, professional services and training, and the dedicated time of business users to provide input on key requirements, including the types of reports, queries and analysis, which will naturally change and evolve over time.
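To make the MapReduce-versus-SQL trade-off concrete, below is a minimal, illustrative Hadoop Streaming job written in Python that counts hits per URL in raw, tab-delimited web-log data. The field positions and file names are assumptions invented for this sketch, not a recommendation of any particular product.

```python
#!/usr/bin/env python
# mapper.py - illustrative Hadoop Streaming mapper: emit a count of 1 per URL
# found in a raw, tab-delimited web log (the field layout is an assumption).
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) > 2:           # skip malformed records rather than failing the job
        url = fields[2]           # assumed: the third column holds the requested URL
        print("%s\t1" % url)
```

```python
#!/usr/bin/env python
# reducer.py - illustrative Hadoop Streaming reducer: sum the counts per URL.
# Hadoop sorts mapper output by key, so identical URLs arrive contiguously.
import sys

current_url, count = None, 0
for line in sys.stdin:
    url, value = line.rstrip("\n").split("\t", 1)
    if url == current_url:
        count += int(value)
    else:
        if current_url is not None:
            print("%s\t%d" % (current_url, count))
        current_url, count = url, int(value)
if current_url is not None:
    print("%s\t%d" % (current_url, count))
```

Run through the Hadoop Streaming jar with these two scripts as mapper and reducer (input and output paths adjusted to your cluster), this produces what a SQL-on-Hadoop engine would return with a single GROUP BY query - which is exactly the skills and productivity trade-off the evaluation needs to weigh.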

Big Data Costs Can Quickly Creep Up
In terms of the hardware expenditure required to manage the new Big Data set, you may start out with a Hadoop cluster of, say, 10 nodes, which is certainly manageable. But if your data velocity is significant, you can quickly reach 100+ nodes, and now you face a number of other expenses: additional headcount and skilled resources to manage the environment proactively; tools for managing the cluster, including system management and alerting; and potentially add-on software that varies by business use case but might cover real-time analytics against streaming data for, say, fraud detection or spotting unusual patterns. You may also need a business tool to provide a front-end GUI dashboard to track specific KPIs, or data visualization tools so business users can quickly understand what is going on. Very quickly the costs become less about storage and hardware and more about the software that focuses on getting the most value from this newly collected data set.
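To illustrate how quickly node count - and with it headcount, licensing and support - scales with velocity, here is a rough, back-of-envelope sizing sketch; every figure in it is an assumption chosen for the example rather than a benchmark or vendor number.

```python
# Back-of-envelope Hadoop cluster sizing. All inputs are illustrative assumptions.

daily_ingest_tb    = 2.0    # raw data arriving per day
retention_days     = 365    # detailed data is kept, nothing is thrown away
replication        = 3      # HDFS default replication factor
compression_ratio  = 5.0    # effective compression; a better codec or database-level
                            # compression raises this and directly cuts node count
usable_tb_per_node = 20.0   # disk one data node can realistically dedicate to HDFS

stored_tb = daily_ingest_tb * retention_days * replication / compression_ratio
nodes = int(stored_tb / usable_tb_per_node) + 1

print("Stored (replicated, compressed): %.0f TB -> roughly %d data nodes" % (stored_tb, nodes))
# With these figures: 2 * 365 * 3 / 5 = 438 TB, i.e. about 22 nodes in year one;
# a 10x jump in daily velocity pushes the same arithmetic past 200 nodes.
```

The point is not the specific numbers but the shape of the curve: once velocity or retention grows, the hardware is only the start of the bill.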

There is no denying that Big Data presents great new opportunities, but reaching a quantifiable ROI in a fast time frame is still a very real challenge. Everyone is talking about Big Data and all the innovative technology approaches to tackling it, but it is still difficult to find many business success stories within any one industry sector. The market is still fairly immature, but the good news is that it's moving at a much faster pace than any other IT project today, and our data warehouse and BI forefathers have certainly provided lessons learned over the past two decades.

Big Data Is Big Business but It Comes with Strict Requirements
If we want to examine more closely the main areas of expenditure for a Big Data project, it is probably best to look at it through the lens of a specific type of business and use case. Let's take a large financial institution that has a number of existing traditional data warehouse / BI environments. The business doesn't want to throw any data away (let's face it, regulations don't allow that for a number of years), and it realistically wants to retain specific data sets for ongoing trending and analysis. This includes examining questions such as "what constitutes a low-risk client based on spending behavior patterns over a specific time period, cross-referenced with customer demographics," which will help the institution better target a particular segment of the market.

Given that the IT budget doesn't allow for spend that increases in step with data growth rates, the institution needs to seriously reduce costs, so it decides to go the route of a Hadoop-based environment, given its promise of low-cost scale and the fact that it can provide insights into customer patterns by capturing semi-structured and unstructured data. Front-ending the warehouse with a dedicated Hadoop cluster is the preferred architectural approach, but business users still want access to both the Hadoop environment and the existing traditional data warehouse.

Because we are talking about a financial institution, questions of security and availability quickly rise to the top of the requirements list. At the same time, if business users want to access that data, SQL query access and the use of the current BI tool against the new data set are also requirements. If you can avoid moving large chunks of data on a frequent basis from one environment to the other, you will reduce not only cost but also latency. In an ideal world, being able to leverage the skill sets you already have and avoiding duplication of work is key.
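For instance, if the Hadoop-side data is exposed through a SQL interface such as HiveServer2, existing SQL skills and BI connections largely carry over. The sketch below assumes a Hive endpoint reachable with the open-source PyHive client; the host, database, table and column names are invented for the example.

```python
# Illustrative only: querying Hadoop-resident detail data with plain SQL,
# assuming a HiveServer2 endpoint and the PyHive client. All names are made up.
from pyhive import hive

conn = hive.Connection(host="hadoop-edge01", port=10000, database="retail_bank")
cursor = conn.cursor()

# The same kind of question the business already asks of the warehouse,
# now asked of the longer detailed history kept in Hadoop.
cursor.execute("""
    SELECT c.segment,
           COUNT(DISTINCT t.customer_id) AS low_risk_clients
    FROM   transactions t
    JOIN   customer_demographics c ON c.customer_id = t.customer_id
    WHERE  t.txn_date >= '2011-01-01'
    GROUP  BY c.segment
    HAVING AVG(t.amount) < 500    -- illustrative 'low-risk' spending threshold
""")

for segment, clients in cursor.fetchall():
    print(segment, clients)
```

Because the query runs where the data lives, nothing has to be bulk-copied between the Hadoop cluster and the warehouse, which is precisely the cost and latency saving described above.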

Below is a quick table outlining the main cost factors to consider and, against each, a key consideration that could reduce costs.

 

Big Data on Hadoop cost factors and key considerations to drive down cost:

Storage: Look at databases that provide data compression to yield storage savings (better than GZip or LZO).

Hardware (nodes): Granular data compression at the database level will reduce node count over time.

Data analytics - skilled resources: Examine technology solutions that provide standard SQL or BI tool access in addition to MapReduce (Pig, etc.).

Cluster management - skilled resources: Leverage existing dev-operations staff if you deploy a SQL-compliant data environment.

Security: Look for database solutions that provide built-in security permissions and access controls.

Availability / DR: Consider a data management environment that doesn't require additional tools for replication.

Training: Consider solutions where you don't need to retrain or hire all-new resources; leverage what you have (standard SQL-skilled DBAs).

Summary: Consider All Factors and Get Business Buy-in Quickly
Big Data is fundamentally a business problem. If you begin with the question "what is the business trying to achieve by collecting, storing and analyzing this new set of data?", you will start down the right path to realizing business gains. Whether you outsource the initiative or bring in external consultants and vendors to manage the project, the same questions will arise, and if you leverage what you already have - both existing IT environments and skills - you will be better able to contain costs.

Furthermore, we all love the promise of innovative new technologies, including Hadoop and MapReduce, but without leveraging the tried and tested standards we have come to love and respect, they don't make a whole lot of sense from either a technical or an economic standpoint. As you start on your Big Data journey or project, be sure to ask what exactly the business requires and how you can leverage what you already have today. As we all know, getting business user buy-in is half the battle in a successful rollout.

More Stories By John Bantleman

John Bantleman, CEO of RainStor, has more than 20 years' experience in the management of software companies. Prior to overseeing RainStor, he transformed LBMS into a $45 million business ahead of its successful NASDAQ flotation in 1997; LBMS's technology is now part of CA's product portfolio. The following year John was instrumental in the launch of Evolve, and drove the company through to a successful IPO on NASDAQ.

Returning to the UK in 2003, John spent 12 months working on the advisory boards of venture capital organizations such as Apax Partners. He joined RainStor Inc. as Chairman in 2004, became CEO at the start of 2007, and relocated back to the US in 2009 to head up worldwide operations.



Most Recent Comments
Vikas.Deolaliker 09/21/12 06:49:00 PM EDT

Great article. Another data point: IT budgets are up only 4% in 2013 over 2012, so don't expect everyone to rush into Big Data.

The fourth "V" is visualization. If you cannot render the analysis in a intuitive way, there is no value in that analysis. In fact, visualization should be the first step in design of a bigdata system - it helps trim down the architectural bloat into something that is within budget and useful.

Elad Israeli 09/19/12 06:07:00 PM EDT

Fascinating post. Still waiting for someone to crack the nut that is Big Data Analytics.

douglaney 08/29/12 03:36:00 PM EDT

Great piece John. Excellent detail. Thought you and your readers might be interested in where the "3Vs" of big data originated - in a Gartner piece I authored over 11 years ago. I recently unearthed a copy so folks can refer to it and cite it.

Cheers,
Doug Laney, VP Research, Gartner, @doug_laney
