Putting the Costs of Business Intelligence in Perspective

Why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a few users?

Successful business intelligence (BI) solutions serve as many business users as possible. The more people use a solution, the more value it delivers.

However, if you’ve had any experience with BI, you have probably noticed that as the number of users grows, so does the complexity (and consequent cost) of the solution. This is a fundamental reality of traditional business intelligence, although many startups are attempting to change it, each according to its own vision and understanding of the market.

But why is buying a BI solution for dozens or hundreds of users so much more complicated than buying a solution for a select group of power users?

Perspective #1: The Cost of Software Licenses

People often assume the answer lies in software costs, but license fees are usually a red herring in the process of costing a business intelligence solution.

Obviously, the more users your solution has, the more its software licenses will cost. You might therefore be tempted to choose a vendor that sells software for 30% less than another, but basing a decision solely on price is a big mistake: license costs have little bearing on the total cost of a BI solution, and hardly any impact on ROI.

Some evidence for this can be found in open source. Open source BI provides, by definition, free software, and there is no shortage of open source BI tools and platforms, many of which have been around since the beginning of the century. Yet none of them is doing as well as the established commercial vendors, and they have trouble acquiring customers by comparison. If software costs were a significant inhibitor in the BI space, open source solutions would be far more prominent than they actually are.

Another hint can be found in the commercial (non-open source) world, where BI vendors do charge for licenses but usually offer significant discounts on large-volume purchases. They do this for reasons that go beyond the obvious attempt to motivate buyers to expand their purchase orders: they know that the customer’s total cost of the solution grows significantly with the number of users, regardless of license costs (preparation projects, IT personnel assignments, etc.), and they price their software accordingly.
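
To put rough numbers on this, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption, not vendor data; the point is only that even a generous license discount barely moves the total.

```python
# Back-of-the-envelope BI total cost of ownership (TCO) sketch.
# All figures are hypothetical assumptions for illustration only.

def bi_tco(users,
           license_per_user=500,    # annual license fee per seat (assumed)
           volume_discount=0.30,    # large-order discount (assumed)
           hardware=40_000,         # servers and storage (assumed)
           implementation=120_000,  # data prep, ETL, consulting (assumed)
           it_staff=90_000):        # ongoing IT personnel (assumed)
    """Return (license cost, total cost) for a given number of users."""
    licenses = users * license_per_user * (1 - volume_discount)
    total = licenses + hardware + implementation + it_staff
    return licenses, total

licenses, total = bi_tco(100)
print(f"Licenses: ${licenses:,.0f} ({licenses / total:.0%} of ${total:,.0f} total)")
# -> Licenses: $35,000 (12% of $285,000 total)
# Even with a 30% discount applied, licenses are a small slice of the bill.
```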

Tip: Pay attention to software costs, but there are far more important things to consider. Leave the license cost comparison for last.

Perspective #2: The Cost of Hardware

Two things have the greatest impact on the hardware requirements of a BI solution: the amount of data being queried directly by business users, and the number of business users querying it concurrently. Depending on the technology you use, each user can add between 10% and 50% to the hardware resources required (disk, RAM and CPU).

(For the technology geeks out there, there is an interesting discussion of this topic on Curt Monash’s blog. Check out the comments section, which will also give you a good idea of the hardware configurations used with different technologies.)
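
To see how quickly this compounds, here is a minimal sizing sketch, assuming a simple linear model where each concurrent user adds a fixed fraction of a baseline footprint. The 32 GB baseline and the 25% per-user factor are illustrative assumptions drawn from the 10%-50% range above; real behavior varies by technology.

```python
# Rough hardware sizing sketch: each concurrent user adds a fixed
# fraction of the baseline footprint (a simplifying assumption).

def required_ram_gb(concurrent_users, baseline_gb=32, per_user_factor=0.25):
    """Estimate RAM needed; linear in the number of concurrent users."""
    return baseline_gb * (1 + per_user_factor * concurrent_users)

for users in (1, 10, 25, 50):
    print(f"{users:>3} concurrent users -> ~{required_ram_gb(users):,.0f} GB RAM")
# ->   1 concurrent users -> ~40 GB RAM
# ->  10 concurrent users -> ~112 GB RAM
# ->  25 concurrent users -> ~232 GB RAM
# ->  50 concurrent users -> ~432 GB RAM
```

At a 25% per-user increment, 50 concurrent users already imply several hundred gigabytes of RAM, which sets up the tipping point discussed next.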

The tipping point, however, comes when your requirements grow beyond what fits inside a single commodity hardware box (read: a cheap, off-the-shelf computer). Once you hit this limit, you basically have three options, none of which is practical for most companies (a rough feasibility check is sketched after the list):

1. Buy a high-end proprietary server
2. Cluster or shard across multiple machines
3. Build a data warehouse and/or pre-processed OLAP cubes
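
Here is the feasibility check mentioned above, as a minimal sketch. The commodity-box limits are assumptions for illustration; substitute whatever your own off-the-shelf servers actually offer.

```python
# Tipping-point check: does the workload still fit on one commodity box?
# The limits below are illustrative assumptions, not vendor specs.

COMMODITY_BOX = {"ram_gb": 256, "cores": 32}  # assumed off-the-shelf server

def fits_single_box(ram_needed_gb, cores_needed, box=COMMODITY_BOX):
    """True if the estimated workload fits within one box's limits."""
    return ram_needed_gb <= box["ram_gb"] and cores_needed <= box["cores"]

if fits_single_box(ram_needed_gb=432, cores_needed=48):
    print("A single commodity box is enough.")
else:
    print("Over the threshold: high-end server, cluster/sharding, "
          "or a pre-processed warehouse are the traditional fallbacks.")
```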

Unfortunately, BI technologies designed before the 21st century (RDBMS, OLAP, in-memory databases) don’t leave much room for innovation on this particular front: they were designed for hardware very different from what exists today. There will always be a limit to what a single hardware box can achieve, but with traditional BI technologies that threshold is too low for most modern companies that have large volumes of data and need extensive usage at reasonable, consistent response times.

The good news is that this is not the case with newer technologies designed specifically to utilize the modern chipsets available in any commodity 64-bit machine, which extract orders of magnitude more juice from a single commodity box. Running dozens or hundreds of users on a single box is more than possible these days, even when data runs into the hundreds of gigabytes.

Tip: If you do not wish to spend loads of money on high-end or proprietary servers, and your internal IT department has better things to do than manage a cluster for BI, give preference to technologies that let you set up your BI solution on a single commodity box.

Perspective #3: The Cost of Starting Too Big… or Too Small

After talking to business managers, executives and other stakeholders, you’ve determined that the BI solution you’re considering could serve as many as 100 users. How do you then go about calculating your project costs? This is where things get tricky, and where most BI buyers fail to protect their wallets. Getting this decision wrong costs far more than any decision you make about software licenses or even hardware.

Even if the development stage of your BI project goes off without a hitch, getting a hundred people to use any kind of software, in any company, is a challenge at least as hard as any technical hurdle you will encounter during the project. You could easily spend a fortune developing and deploying a complicated 100-user solution, only to find that just 15 of them actually use it.

So instead of volume pricing reducing your total cost per user, you actually pay much more: each of those 15 users absorbs the cost of the 85 others who find the solution utterly useless, too difficult to use or completely misaligned with their business objectives. You’d be surprised how often this happens.
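
The arithmetic behind this trap is worth making explicit. A minimal sketch, reusing the hypothetical $285,000 total from the earlier TCO example:

```python
# The adoption trap in numbers: a solution budgeted for 100 users
# that only 15 people actually use. Figures are hypothetical.

total_cost = 285_000   # full deployment cost for 100 seats (assumed)
planned_users = 100
active_users = 15

planned_per_user = total_cost / planned_users   # $2,850
actual_per_user = total_cost / active_users     # $19,000

print(f"Planned: ${planned_per_user:,.0f}/user; "
      f"actual: ${actual_per_user:,.0f}/user "
      f"({actual_per_user / planned_per_user:.1f}x the plan)")
# -> Planned: $2,850/user; actual: $19,000/user (6.7x the plan)
```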

The obvious way to deal with this common problem is to start small (10-20 users) and expand as usage of the system grows (assuming it does). But when it comes to traditional business intelligence solutions, there’s a catch: deploying a solution for 10-20 users and deploying one for 100 users are utterly different tasks that require significant changes to the solution architecture.

Following this path saves you some money on the software licenses you did not purchase straight off. However, if demand for the solution grows inside the business, you will have to redesign it, which will probably end up costing more than building for scale would have initially.
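
A minimal cost comparison of the two paths, again with hypothetical figures, shows why the "cautious" route can backfire:

```python
# Start big vs. start small and rebuild. All figures are assumptions.

start_big = 285_000          # build for 100 users on day one (assumed)
start_small = 80_000         # initial 20-user deployment (assumed)
rebuild_for_100 = 250_000    # forced redesign once demand grows (assumed)

small_then_rebuild = start_small + rebuild_for_100

print(f"Build big up front:        ${start_big:,}")
print(f"Start small, then rebuild: ${small_then_rebuild:,}")
# -> Build big up front:        $285,000
# -> Start small, then rebuild: $330,000
```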

Tip: The right way to deal with this challenge is to seek a solution that scales without having to re-architect as usage grows. Buying more software and upgrading hardware when the time comes is relatively easy and inexpensive; rebuilding the entire solution from scratch every year or two costs far more.

The ElastiCube Chronicles - Business Intelligence Blog

More Stories By Elad Israeli

Elad Israeli is co-founder of business intelligence software company, SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own, unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s unique BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
