Case Study: Open Source + Business Intelligence

A marriage made for data-driven businesses

Data-driven businesses face tough challenges in today's rapidly changing information landscape. As decision cycles continue to shrink, companies need to act on information within minutes and hours rather than days and weeks. At the same time, the volume of data that needs to be analyzed is growing exponentially. Business intelligence (BI) approaches that might have made sense a decade or even five years ago may no longer be the best fit for organizations that must quickly and affordably make sense of an incoming stream of terabytes that shows no sign of slowing down.

For my company, MX Force, speedy data analysis is not simply a "nice to have"; it's critical to our business. As a cloud-based provider of email security for organizations of all sizes, we need to identify the origins of spam, viruses and other potential threats for our clients, fast. But as our business has grown, so has the volume of email log data that we must store, filter, search, analyze and report on. Recently, we were challenged to find a database that could reliably enable quick and efficient ad-hoc queries on up to a year's worth of email log data. Our staff uses this data to analyze and report on statistical information, and we also give our clients the ability to query their own logs to diagnose mail delivery issues. It was important to find a database that could deliver the high performance we required, but affordability and ease of administration were also of vital concern. These considerations prompted us to seek an open source solution.
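
To make that concrete, the query below is a rough sketch of the kind of ad-hoc lookup a client might run to diagnose a delivery issue. The mail_log table, its columns and the values shown are hypothetical, used only to illustrate the shape of the workload; they are not our actual schema.

    -- Why was mail to this recipient delayed or blocked last week? (hypothetical schema)
    SELECT received_at, sender, recipient, verdict, delivery_status
    FROM mail_log
    WHERE client_id = 1042
      AND recipient = 'jane.doe@example.com'
      AND received_at >= '2011-06-01'
      AND received_at <  '2011-06-08'
    ORDER BY received_at DESC;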

Open Source Meets Business Intelligence
MX Force uses a number of open source tools within our organization. The low cost of open source is one reason for this, but flexibility is another important driver. Because open source projects are community-driven, users can tweak, customize and tinker with the software as much as they like. This is a big advantage when it comes to business intelligence, as data analysis requirements can change quickly, and you don't want to have to wait weeks or months to get a new query set up or to change the parameters of those that are already running. MX Force was already using MySQL in our business, so we decided to try Infobright's open source analytic database, ICE (Infobright Community Edition). ICE combines a columnar database with innovative compression and self-tuning capabilities that eliminate the need to create indexes, partition data or perform other manual tuning to achieve fast response for queries and reports. The software is built on MySQL, so for us there was a very small implementation and training curve - ICE uses the same familiar MySQL interface. (A brief sketch of what a table definition and data load look like through that interface follows the list below.) The fact that ICE is an open source analytic solution presented us with several key benefits:

  1. Deployment speed: The time from download and installation to first production use was just three weeks.
  2. Affordability: Many of the proprietary commercial BI solutions available today require custom configuration, expensive licensing agreements and equally expensive hardware to support and run them. Not only was ICE free to install, but we could also run the software on inexpensive commodity servers, eliminating the need to invest in high-performance servers and storage arrays. (Our entire workload is supported by a single quad-core server.)
  3. Simplicity and flexibility: Because ICE is open and standards-based, we can quickly make changes as needed without extensive IT assistance. In addition, it's often much simpler to fix or upgrade an open source solution because an entire community contributes its expertise to fixing bugs and making improvements. With proprietary software, users have to wait for issues to be addressed by the vendor, which can take much longer.
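
As promised above, here is a minimal sketch of what defining and loading an ICE table looks like through the standard MySQL interface. The mail_log table and its columns are hypothetical, and the engine name and load options should be checked against the ICE documentation for the release in use; the point is simply that there are no indexes or partitions to manage.

    -- Illustrative only: columns, no indexes, no partitions.
    CREATE TABLE mail_log (
        client_id       INT,
        received_at     DATETIME,
        sender          VARCHAR(255),
        recipient       VARCHAR(255),
        subject         VARCHAR(255),
        verdict         VARCHAR(32),     -- e.g. clean, spam, virus
        delivery_status VARCHAR(32)
    ) ENGINE=BRIGHTHOUSE;                -- ICE's columnar storage engine

    -- Bulk loads use the familiar MySQL statement; ICE compresses data on the way in.
    LOAD DATA INFILE '/var/log/mxforce/mail_log.csv'
    INTO TABLE mail_log
    FIELDS TERMINATED BY ',';

Because the interface is just MySQL, the tools and client libraries we already used continued to work unchanged.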

MX Force is currently using ICE to quickly isolate mail flow problems and trends. In our experience, using a free, open source product has not in any way involved a compromise on performance or capabilities. We are achieving 10:1 data compression, which saves on storage costs and boosts performance. Most statistical queries render results in less than five seconds. Ongoing administration is simple. The net result is that the product delivers the fast query performance and reporting functionality we needed, at an incredibly low cost for hardware and ongoing maintenance.
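
For a sense of what those statistical queries look like, the rollup below (again against the hypothetical mail_log sketch from earlier, not our real schema) is representative: it aggregates a quarter's worth of log data by day and verdict for a single client.

    -- Daily message counts by verdict for one client over a quarter (hypothetical schema).
    SELECT DATE(received_at) AS day,
           verdict,
           COUNT(*) AS messages
    FROM mail_log
    WHERE client_id = 1042
      AND received_at >= '2011-04-01'
      AND received_at <  '2011-07-01'
    GROUP BY DATE(received_at), verdict
    ORDER BY day, verdict;

A query like this touches only the few columns it references, which is where a columnar engine earns its speed.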

Look, then Leap
Interested in giving open source a try for your BI and analytic efforts? There are a number of compelling benefits to doing so, but as with any type of software, it's also important to look before you leap. Evaluation and testing considerations are no different than they would be for licensed software - you want to make sure the solution has the features and capabilities most relevant to your business. Also, there's a difference between open source projects that are at a very early and experimental stage and software that is well established and has a vibrant and involved community behind it, strong vendor support, or both. Investigate the support offered for the solution under consideration. How often are new features added? Are bug fixes made in a timely manner? Is there useful and accurate supporting documentation?

With ICE, we were certainly attracted by the many resources and significant participation of both Infobright and the user community. We also knew there was a commercial version available if we decided we needed the additional functionality it offered or a formal support contract. For companies just jumping into the open source arena, it's best to avoid tools that haven't yet cultivated a strong following. But even if you do make a mistake, the low (and often zero) cost of open source means that there's minimal risk.

The BI requirements of today's data-driven businesses demand speed, simplicity and affordability. As open source solutions continue to mature, it's worth looking at projects that are focused on analytics, BI and other data management activities. The more nimble and flexible approach embodied by open source may just be the best fit for addressing the many information management challenges driven by data growth and complexity.

About the Author

Mike Makowski is CTO of MX Force, a leading provider of email security in the cloud and a member of Infobright's Customer Advisory Council. More information about MX Force can be found at http://www.mxforce.com/
