Business Intelligence: A Cloud Paradox

Simultaneously one of the best use-cases for cloud as well as the worst. What’s IT to do?

David Linthicum, SOA and cloud pundit and all-around interesting technology guy, recently pointed to a short post about business intelligence (BI) vendors joining forces to offer cloud-based BI services.

Four open-source and proprietary vendors on Wednesday announced a new partnership resulting in a cloud-based BI (business intelligence) stack.

Jaspersoft and Talend will respectively lend their open-source BI and data-integration technologies to the integrated offering, which also employs Vertica's analytic database and RightScale's management software for cloud-based application deployments.

Thought #1: That’s a perfect use case for cloud! Business intelligence processing is (a) compute-intensive and (b) interval-based (periodic processing; BI runs and reports are generally scheduled occurrences). It is a rare organization that builds OLAP cubes, runs ETL processes, and generates reports all the time. But when they do, look out, they can bring a server (or three) to their knees.
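
To make that concrete: because the heavy lifting happens in scheduled bursts, capacity only needs to exist during the window in which cubes are built and reports run. Below is a minimal sketch of that acquire-run-release pattern, assuming (purely for illustration) an AWS-style environment driven by boto3; the instance IDs and the pipeline function are placeholders, and none of the vendors named above is implied.

```python
# Minimal sketch: acquire compute only for the scheduled BI window, then release it.
# Assumes an AWS-style environment reachable via boto3; instance IDs are placeholders.
import boto3

ETL_WORKERS = ["i-0123456789abcdef0", "i-0fedcba9876543210"]  # hypothetical worker instances


def run_etl_and_build_cubes():
    """Placeholder for the actual ETL jobs, cube builds, and report generation."""
    pass


def run_nightly_bi_window():
    ec2 = boto3.client("ec2")
    # Spin the workers up just before the scheduled ETL/OLAP run...
    ec2.start_instances(InstanceIds=ETL_WORKERS)
    ec2.get_waiter("instance_running").wait(InstanceIds=ETL_WORKERS)
    try:
        run_etl_and_build_cubes()
    finally:
        # ...and release the capacity as soon as the reports are done.
        ec2.stop_instances(InstanceIds=ETL_WORKERS)
```

The same shape applies whether the resource pool is public or, as discussed below, private; only the API being called changes.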

Thought #2: That’s a horrible use case for cloud! The data, though it may be anonymized to remove personally identifiable information and account data, is still sensitive. It’s not any one field or combination of fields that is sensitive; it’s the whole data set. That combination of data is used to make business decisions and analyze business performance, and it provides a clear view of an organization’s current operating and financial status. That kind of data is not something you want shared outside the organization.


WHY IS THE DATA SO IMPORTANT?


Some of the data used for business intelligence and data-mining operations is not all that interesting. But some of it is the very data from which businesses can and do make critical decisions. Much of what is analyzed consists of sales and demographics: past, present, and future. Some of that data reads as a statement of the financial health of an organization, which, for many companies, is not something to hand off to a third party without fairly stringent assurances that the data is protected from both prying eyes and theft.

The last time I took a hard look at BI solutions, I managed to convince the powers that be to ship me a copy of the subscription database. To a publishing company, the subscription database, and all the data that describes those subscribers, is a goldmine. Even though I worked for the same organization and the testing was conducted in an isolated environment, the data was stripped, masked, and anonymized as much as possible lest it “leak” out into the real world. Keeping that data safe mattered so much to the organization that most of the juicy bits were effectively gone by the time I got my hands on it.
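
That kind of pre-processing is straightforward to express. Here is a minimal sketch, with hypothetical field names, of the strip-and-mask step: direct identifiers are dropped and the subscriber key is replaced with a salted hash, so analyses can still group by subscriber without exposing who the subscriber is.

```python
# Minimal sketch: strip and mask subscriber records before they leave production.
# Field names are hypothetical; the salted hash preserves a join key without exposing identities.
import hashlib

SALT = b"rotate-me-and-keep-me-out-of-source-control"  # assumption: salt managed as a secret
DROP_FIELDS = {"name", "email", "street_address", "phone"}


def anonymize(record: dict) -> dict:
    # Drop direct identifiers outright.
    masked = {k: v for k, v in record.items() if k not in DROP_FIELDS}
    # Replace the natural key with an irreversible token so aggregates still work per subscriber.
    token = hashlib.sha256(SALT + str(record["subscriber_id"]).encode()).hexdigest()
    masked["subscriber_id"] = token
    return masked


print(anonymize({
    "subscriber_id": 42,
    "name": "J. Reader",
    "email": "j@example.com",
    "zip": "54301",
    "renewals": 7,
}))
```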

This is not an uncommon scenario. The data churned through by business intelligence systems is often extremely sensitive and crucial to the day-to-day decisions made about the business. It is not just sales data; it is, effectively, the keys to the kingdom. For a competitor to get their hands on that data would be disastrous. Thus, it is hard to believe any organization would trust a third-party provider with it.

Not sure about that? Remember that BI systems and reports are supposed to enable business decision makers to answer questions like:

  • What product is selling well and at what price point in area X?
  • What are buyers in demographic Y currently spending their dollars on?
  • Who buys products like Z?

Imagine a fierce competitor getting their hands on that data, and what they could do with it. Consider the impact on the organization if competing businesses could answer those questions for both themselves and your organization. “Not good” would be an understatement.
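
Part of what makes the raw data so dangerous in the wrong hands is how little work stands between it and answers like those above. A toy sketch using pandas, with illustrative (assumed) column names, answers the first question in the list:

```python
# Toy sketch: the first BI question above as a simple aggregation.
# Column names (product, region, unit_price, units_sold) are illustrative assumptions.
import pandas as pd

sales = pd.DataFrame({
    "product":    ["A", "A", "B", "B", "A"],
    "region":     ["X", "X", "X", "Y", "Y"],
    "unit_price": [9.99, 12.99, 24.99, 24.99, 9.99],
    "units_sold": [120, 40, 75, 60, 200],
})

# "What product is selling well and at what price point in area X?"
in_region_x = sales[sales["region"] == "X"]
answer = (in_region_x.groupby(["product", "unit_price"])["units_sold"]
          .sum()
          .sort_values(ascending=False))
print(answer)
```

With the full data set in hand, a competitor could run the same few lines against your business.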

But the fact remains that such systems are a perfect use case for cloud, given their periodic, highly compute-intensive demands on the data center.


A SOLUTION: PRIVATE CLOUD


If ever there was a good use case for a private, internal cloud, BI systems are it. The ability to consume additional resources on demand, distributed across the data center, would certainly decrease the overall operating and capital expenses often associated with business intelligence initiatives, and the data – the very important, sensitive data – would remain safely locked up within the confines of a data center over which the organization has complete (one hopes) control.

Now, given the mix of vendors involved in the aforementioned venture that kicked off this quandary, it appears this will be an offering more along the lines of Salesforce.com. Salesforce has, of course, done well at ensuring the security of its sensitive data and providing the isolation required for many customers to trust essentially outsourcing such a business-critical function as sales force automation (SFA). Seriously – when have you heard of a breach in Salesforce security leading to data leakage? Exactly. So it is not beyond imagining that a BI-related venture built on a similar shared-platform/isolated-data (multi-tenant) model might entice some organizations away from investing in expensive hardware and software to support BI efforts internally.

The difference between the data housed in Salesforce and data shoved into BI systems, however, is enough to continue to be cautious about pushing such responsibility off to the cloud. This really isn’t about the security of the cloud, it’s about the value – and risk – of the data to the organization, and whether the potential savings would offset the risk. Many organizations are likely to say “no, no it doesn’t.” Thus it is a good idea to consider the potential benefits of building out an internal cloud instead. With a quarter of IT executives recently surveyed initiating private cloud implementations in 2009 anyway, it makes sense to look to BI as a possible early contender for deployment in an internal cloud-based architecture.

 


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
