Putting the Costs of Business Intelligence in Perspective

Why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a few users?

Successful business intelligence (BI) solutions serve as many business users as possible. The more users a solution has, the more value it delivers.

However, if you’ve had any experience with BI, you’ve probably noticed that as the number of users grows, so does the complexity (and consequent cost) of the solution. This is a fundamental reality of traditional business intelligence, although many startups are attempting to change it, each according to its own vision and understanding of the space.

But why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a select group of power users?

Perspective #1: The Cost of Software Licenses

People often think the answer lies in software costs, but software costs are usually a red herring in the process of costing a business intelligence solution.

It is obvious that the more users your solution has, the more its software licenses will cost. You might therefore be tempted to choose a vendor that sells software for 30% less than another, but basing a decision solely on this is a big mistake: license costs have little bearing on the total cost of a BI solution, and hardly any impact on ROI.
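To put numbers on this, consider a rough total-cost-of-ownership sketch in Python. Every figure below is an illustrative assumption, not actual vendor pricing; the point is simply that even a 30% license discount barely moves the total:

    # Hypothetical TCO comparison. Every figure is an illustrative
    # assumption, not actual vendor pricing.

    def total_cost(license_per_user, users, implementation, hardware,
                   annual_it, years):
        """Rough total cost of ownership for a BI deployment."""
        return (license_per_user * users + implementation + hardware
                + annual_it * years)

    # Vendor B sells licenses 30% cheaper than vendor A.
    cost_a = total_cost(license_per_user=1000, users=100,
                        implementation=150_000, hardware=50_000,
                        annual_it=60_000, years=3)
    cost_b = total_cost(license_per_user=700, users=100,
                        implementation=150_000, hardware=50_000,
                        annual_it=60_000, years=3)

    print(f"Vendor A total: ${cost_a:,}")  # $480,000
    print(f"Vendor B total: ${cost_b:,}")  # $450,000
    print(f"The 30% license discount cuts the total by only "
          f"{(cost_a - cost_b) / cost_a:.1%}")  # 6.2%

In this made-up scenario, the dramatic-looking license discount shaves barely 6% off the three-year total.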

Some evidence of this can be found in open source. Open source BI provides (by definition) free software, and there is no shortage of open source BI tools and platforms. However, none of them is doing as well as the established commercial vendors, even though some have been around since the beginning of the century; they have trouble acquiring customers compared to their commercial counterparts. If software costs were a significant inhibitor in the BI space, open source solutions would be far more prominent than they actually are.

Another hint can be found in the commercial (non-open source) world, where BI vendors do charge for licenses but usually provide significant discounts on large-volume purchases. Vendors do this for reasons that go beyond the obvious attempt to motivate potential buyers to expand their purchase orders: they realize that the total cost of the solution, to the customer, grows significantly as the number of users grows, regardless of license costs (preparation projects, IT personnel assignments, etc.), and they need to take this into account when pricing their software.

Tip: Pay attention to software costs, but there are far more important things to consider. Leave the license cost comparison for last.

Perspective #2: The Cost of Hardware


Two things have a great impact on the hardware requirements of a BI solution: the amount of data being queried directly by business users, and the number of business users querying it concurrently. Depending on which technology you use, each concurrent user can add between 10% and 50% to the hardware resources required (disk, RAM and CPU).
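To make the arithmetic concrete, here is a back-of-the-envelope sizing sketch in Python. The 16 GB baseline and the 25% per-user factor are illustrative assumptions picked from within the 10%-50% range above, not measurements of any particular product:

    # Toy sizing model: each concurrent user adds a fixed fraction of
    # the baseline resource requirement. All numbers are assumptions.

    def required_ram_gb(baseline_gb, concurrent_users, per_user_factor=0.25):
        """Estimate RAM needed as concurrency grows (linear toy model)."""
        return baseline_gb * (1 + per_user_factor * concurrent_users)

    for users in (1, 10, 50, 100):
        print(f"{users:>3} concurrent users -> "
              f"~{required_ram_gb(16, users):,.0f} GB RAM")

Even with a modest 16 GB baseline, linear per-user growth pushes the requirement past 400 GB at 100 concurrent users, which is why concurrency, not just data volume, dominates hardware planning.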

(For the technology geeks out there, there is an interesting discussion of this topic on Curt Monash’s blog. Check out the comments section, which will also give you a good idea of which hardware configurations are used with different technologies.)

The tipping point, however, is when your requirements grow beyond what fits inside a single commodity hardware box (read: a cheap off-the-shelf computer). Once you hit this limit, you basically have three options, none of which is practical for most companies (a rough sketch of where this tipping point sits follows the list):

1. Buy a high-end proprietary server
2. Clustering / sharding
3. Build a data warehouse / pre-processed OLAP cubes
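As a rough illustration of where that tipping point sits, the toy helper below reuses the linear sizing model from the previous sketch. The 512 GB ceiling is an assumed figure for a cheap off-the-shelf server, not a real product limit:

    # Toy decision helper: does the workload still fit on one commodity
    # box, or are we forced into one of the three options above?
    # The ceiling and the sizing model are illustrative assumptions.

    COMMODITY_BOX_RAM_GB = 512  # assumed ceiling for an off-the-shelf server

    def fits_single_box(data_gb, concurrent_users, per_user_factor=0.25):
        needed_gb = data_gb * (1 + per_user_factor * concurrent_users)
        return needed_gb <= COMMODITY_BOX_RAM_GB

    for data_gb, users in [(50, 10), (100, 50), (200, 100)]:
        verdict = ("single commodity box" if fits_single_box(data_gb, users)
                   else "proprietary server / cluster / data warehouse")
        print(f"{data_gb} GB of data, {users} users -> {verdict}")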

Unfortunately, BI technologies designed prior to the 21st century (RDBMS, OLAP, in-memory databases) don’t leave much room for innovation on this particular front. They were designed for hardware very different from what exists today. While there will always be a limit to what a single hardware box can achieve, with traditional BI technologies that threshold is too low for most modern companies that both hold large volumes of data and want extensive usage at reasonable, consistent response times.

The good news is that this is not the case with newer technologies designed specifically to utilize the modern chipsets available in any commodity 64-bit machine, which therefore get more juice (orders of magnitude more) out of a single 64-bit commodity box. Running dozens or hundreds of users on a single box is more than possible these days, even when data volumes reach hundreds of gigabytes.

Tip: If you do not wish to spend loads of money on high-end or proprietary servers, and your internal IT department has better things to do than manage a cluster for BI, give preference to technologies that allow you to set up your BI solution on a single commodity box.

Perspective #3: The Cost of Starting Too Big… or Too Small

After talking to business managers, executives and other stakeholders, you’ve determined that the BI solution you’re considering has the potential to serve 100 users. How do you then calculate your project costs? This is where things get tricky, and where most BI buyers fail to protect their wallets. Making the wrong decision here is far more significant than any decision you make about software licenses or even hardware.

Even if the development stage of your BI project goes off without a hitch, getting a hundred users to actually use any kind of software, in any company, is a challenge no easier than any technical challenge you will encounter during the project. You could easily find yourself spending a fortune on the development and deployment of a complicated 100-user solution, only to find that just 15 people actually use it.

So instead of your total cost per user shrinking thanks to volume pricing, you end up paying much more, because each of those 15 users absorbs the cost of the 85 others who found the solution useless, too difficult to use, or misaligned with their business objectives. You’d be surprised how often this happens.
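The arithmetic of this trap is worth spelling out. The prices below are made-up assumptions; what matters is the gap between planned and active users:

    # Illustrative adoption-trap math: budgeting for 100 users at a
    # volume discount when only 15 actually use the system.
    # All prices are assumptions.

    planned_users = 100
    active_users = 15
    cost_per_seat = 700        # assumed discounted volume price per seat
    project_costs = 200_000    # assumed development, deployment, IT time

    total = planned_users * cost_per_seat + project_costs
    print(f"Nominal cost per seat:     ${total / planned_users:,.0f}")  # $2,700
    print(f"Real cost per active user: ${total / active_users:,.0f}")   # $18,000

On paper the solution costs $2,700 per seat; in practice each active user costs $18,000.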

The obvious way to deal with this common problem is to start small (10-20 users) and expand as usage of the system grows (assuming it does). But with traditional business intelligence solutions there’s a catch: deploying a solution for 10-20 users and deploying one for 100 users are utterly different tasks, requiring significant changes in solution architecture.

Following this path will save you some money on the software licenses you did not purchase up front. However, if demand for the solution grows inside the business, you will have to re-design it, which will probably end up costing more than building for the larger scale would have initially.

Tip: The right way to deal with this challenge is to seek a solution that scales without being re-architected as usage grows. Buying more software and upgrading hardware when the time comes is relatively easy and inexpensive; rebuilding the entire solution from scratch every year or two costs far more.


The ElastiCube Chronicles - Business Intelligence Blog

More Stories By Elad Israeli

Elad Israeli is co-founder of business intelligence software company, SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own, unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s unique BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
