Putting the Costs of Business Intelligence in Perspective

Why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a few users?

Successful business intelligence (BI) solutions serve as many business users as possible. The more users a solution has, the more value it delivers.

However, if you’ve had any experience with BI, you have probably noticed that as the number of users grows, so do the complexity and the cost of the solution. This is a fundamental reality of traditional business intelligence, although many startups are attempting to change it, each according to its own vision and understanding of the space.

But why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a select group of power users?

Perspective #1: The Cost of Software Licenses

People often assume the answer lies in software costs, but software costs are usually a red herring in the process of costing a business intelligence solution.

Obviously, the more users your solution has, the more the software licenses will cost. You might therefore be tempted to choose a vendor that sells software for 30% less than another vendor, but basing a decision solely on this is a big mistake: license costs have little bearing on the total cost of a BI solution, and hardly any impact on ROI.
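To see why, it helps to put rough numbers on it. The following back-of-the-envelope sketch uses entirely hypothetical figures (none come from this article or any real vendor) to show how little a 30% license discount moves the total cost of ownership:

```python
# Hypothetical TCO comparison of two BI vendors for a 100-user deployment.
# Every figure below is an illustrative assumption, not real pricing.

users = 100
license_a = 1_000  # per-user license, vendor A ($)
license_b = 700    # per-user license, vendor B (30% cheaper)

# Costs that apply regardless of which vendor you pick (assumed):
implementation = 150_000  # data preparation / integration project
it_staff = 120_000        # IT personnel over the first year
hardware = 40_000         # servers and storage

for name, per_user in [("Vendor A", license_a), ("Vendor B", license_b)]:
    licenses = users * per_user
    total = licenses + implementation + it_staff + hardware
    print(f"{name}: total ${total:,}, licenses = {licenses / total:.0%} of TCO")

# Vendor A: total $410,000, licenses = 24% of TCO
# Vendor B: total $380,000, licenses = 18% of TCO
# The 30% license discount shaved only ~7% off the total cost.
```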

Some evidence of this can be found in open source. Open source BI provides (by definition) free software, and there is no shortage of open source BI tools and platforms. Yet none of them are doing as well as the established commercial vendors, even though some have been around since the beginning of the century, and they struggle to acquire customers by comparison. If software costs were a significant inhibitor in the BI space, open source solutions would be far more prominent than they actually are.

Another hint can be found in the commercial (non-open source) world, where BI vendors do charge for licenses but usually offer significant discounts on large-volume purchases. Vendors do this for reasons that go beyond the obvious attempt to motivate buyers to expand their purchase orders: they realize that the total cost of the solution, to the customer, grows significantly with the number of users regardless of license costs (preparation projects, IT personnel assignments, etc.), and they must take this into account when pricing their software.

Tip: Pay attention to software costs, but there are far more important things to consider. Leave the license cost comparison for last.

Perspective #2: The Cost of Hardware


Two things have a great impact on the hardware requirements of a BI solution: the amount of data being queried directly by business users, and the number of business users querying it concurrently. Depending on the technology you use, each additional user can add between 10% and 50% to the hardware resources required (disk, RAM and CPU).
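To get a feel for how quickly this compounds, here is a minimal sketch that applies the 10%-50% per-user range above to an assumed single-user baseline (the 32 GB figure is a made-up illustration, not a benchmark):

```python
# How concurrency inflates hardware requirements, using the 10%-50%
# per-additional-user range cited above. The baseline is an assumption.

BASELINE_RAM_GB = 32  # assumed RAM for a single user's query workload

def required_ram(concurrent_users: int, per_user_overhead: float) -> float:
    """Each additional concurrent user adds a fixed fraction of the baseline."""
    return BASELINE_RAM_GB * (1 + per_user_overhead * (concurrent_users - 1))

for users in (1, 10, 50):
    low = required_ram(users, 0.10)
    high = required_ram(users, 0.50)
    print(f"{users:>3} users: {low:>5.0f} - {high:>5.0f} GB RAM")

#   1 users:    32 -    32 GB RAM
#  10 users:    61 -   176 GB RAM
#  50 users:   189 -   816 GB RAM  <- well past a single commodity box
```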

(For the technology geeks out there, there is an interesting discussion of this topic on Curt Monash’s blog. Check out the comments section, as it will also give you a good idea of what hardware configurations can be used when different technologies are utilized.)

The tipping point, however, comes when your requirements grow beyond what fits inside a single commodity hardware box (read: a cheap off-the-shelf computer). Once you hit this limit, you basically have three options, none of which is practical for most companies:

1. Buy a high-end proprietary server
2. Set up clustering/sharding across multiple machines
3. Build a data warehouse with pre-processed OLAP cubes

Unfortunately, BI technologies designed prior to the 21st century (RDBMS, OLAP, in-memory databases) don’t leave much room for innovation on this particular front, because they were designed for hardware very different from what exists today. While there will always be a limit on what a single hardware box can achieve, with traditional BI technologies that threshold is too low for most modern companies that have large volumes of data and seek extensive usage at reasonable, consistent response times.

The good news is that this is not the case with newer technologies designed specifically to utilize the modern chipsets available in any commodity 64-bit machine, which therefore get orders of magnitude more juice out of a single 64-bit commodity box. Running dozens or hundreds of users on a single box is entirely possible these days, even when data reaches hundreds of gigabytes.

Tip: If you do not wish to spend loads of money on high-end or proprietary servers, and your internal IT department has better things to do than manage a cluster for BI, give preference to technologies that allow you to set up your BI solution on a single commodity box.

Perspective #3: The Cost of Starting Too Big… or Too Small

After talking to business managers, executives and other stakeholders, you’ve determined that the BI solution you’re considering has the potential to serve 100 users. How do you then go about calculating your project costs? This is where things get tricky, and where most BI buyers fail to protect their wallets. Making the wrong decision here is far more significant than any decision you make about software licenses or even hardware.

Even if the development stage of your BI project goes off without a hitch, getting a hundred users to actually use any kind of software, in any company, is a challenge no easier than any technical challenge you will encounter along the way. You could easily spend a fortune developing and deploying a complicated 100-user solution, only to find that just 15 of them actually use it.

So instead of your total cost per user shrinking thanks to volume pricing, you actually paid much more: each of those 15 users absorbs the cost of the 85 others who found the solution useless, too difficult to use, or misaligned with their business objectives. You'd be surprised how often this happens.
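The arithmetic behind this is simple. Using hypothetical numbers (the $500,000 project cost is an assumption for illustration):

```python
# Hypothetical cost-per-active-user when adoption falls short of the plan.
total_project_cost = 500_000  # assumed cost of the 100-user deployment ($)
planned_users = 100
active_users = 15

print(f"Planned cost per user: ${total_project_cost / planned_users:,.0f}")  # $5,000
print(f"Actual cost per user:  ${total_project_cost / active_users:,.0f}")   # $33,333
```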

The obvious way to deal with this common problem is to start small (10-20 users) and expand as usage of the system grows (assuming it does). But with traditional business intelligence solutions there’s a catch: deploying a solution for 10-20 users and deploying one for 100 users are utterly different tasks that require significant changes in solution architecture.

Following this path will save you some money on the software licenses you did not purchase up front. However, if demand for the solution grows inside the business, you will have to redesign it, which will probably end up costing more than building for scale would have in the first place.

Tip: The correct way to deal with this challenge is to seek a solution that scales without being re-architected as usage grows. Buying more software and upgrading hardware when the time comes is relatively easy and inexpensive; rebuilding the entire solution from scratch every year or two costs far more.


The ElastiCube Chronicles - Business Intelligence Blog

More Stories By Elad Israeli

Elad Israeli is co-founder of business intelligence software company SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
