CIOs' Top Priority: Analytics and BI

How to Deal with the Data Integration Bottleneck

Whether as a driver for growth, a means to attract and retain customers, or a way to drive innovation and reduce costs, the business value of analytics and business intelligence has never been higher.

Gartner's Amplifying the Enterprise: The 2012 CIO Agenda and IBM's Global CIO Study 2011 both confirm this point, with analytics and BI sitting atop CIOs' technology priorities in each report.

Data Integration Is the Biggest Bottleneck
Providing analytics and BI solutions with the data required has always been difficult, with data integration long considered the biggest bottleneck in any analytics or BI project.

Complex data landscapes, diverse data types, and new sources such as big data and the cloud are but a few of the well-known barriers.

For the past two decades, the default solution has been to first consolidate the data into a data warehouse, and then provide users with tools to analyze and report on this consolidated data.

However, data integration based on these traditional replication and consolidation approaches has numerous moving parts that must be kept synchronized. Doing this right extends lead times.
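The consolidate-first pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the table and column names (orders, fact_orders) are invented for the example. The point is that every source needs its own extract/transform/load step, and each physical copy is one more moving part to keep in sync.

```python
# Sketch of the traditional consolidate-first pattern: copy source data
# into a warehouse schema before anyone can report on it.
# All names here are illustrative, not from any real system.
import sqlite3

# A production source with its own schema (amounts stored in cents).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
source.execute("INSERT INTO orders VALUES (1, 1250), (2, 399)")

# The consolidated warehouse, with a transformed schema (amounts in dollars).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_orders (id INTEGER, amount_usd REAL)")

def etl_sync():
    """One 'moving part': re-extract, transform, and load the source rows.
    Each new source added to the warehouse needs another step like this."""
    warehouse.execute("DELETE FROM fact_orders")  # full refresh, for simplicity
    for oid, cents in source.execute("SELECT id, amount_cents FROM orders"):
        warehouse.execute(
            "INSERT INTO fact_orders VALUES (?, ?)", (oid, cents / 100.0)
        )

etl_sync()
total = warehouse.execute("SELECT SUM(amount_usd) FROM fact_orders").fetchone()[0]
print(total)
```

Until etl_sync runs again, reports against fact_orders are stale relative to the source, which is exactly the synchronization burden the text describes.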

The Data Warehousing Institute confirms this lack of agility. Its recent study found that the average time needed to add a new data source to an existing BI application was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011, and that 33% of organizations needed more than three months to add a new data source.

Data Virtualization Brings Agility to Analytics and BI
According to Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, data virtualization significantly accelerates data integration agility. Key to this success has been data virtualization's ability to provide:

  • A more streamlined data integration approach
  • A more iterative development process
  • A more adaptable change management process

Using data virtualization as a complement to existing data integration approaches, the ten organizations profiled in the book cut analytics and BI project times by half or more.

This agility allowed the same teams to double their number of analytics and BI projects, significantly accelerating business benefits.

For more insights on data virtualization and business agility, check out my earlier articles on this topic.

Simplify to Overcome Historical IT Complexity

Data virtualization's simplified information access and faster time to solution are especially useful as enablers of more agile analytics and BI.

Is Data Virtualization the Fast Path to BI Agility? describes how the architectures of most business intelligence systems are based on a complex chain of data stores: production databases, data staging areas, a data warehouse, dependent data marts, and personal data stores. Simply maintaining this complexity overwhelms IT today.

These classic BI architectures served business well for the last twenty years. However, considering the need for more agility, they have some disadvantages:

  • Duplication of data
  • Non-shared metadata specifications
  • Limited flexibility
  • Decreased data quality
  • Limited support for operational reporting
  • Limited support for reporting on unstructured and external data

From a different point of view, SOA World's Zettabytes of Data and Beyond describes the challenges of force-fitting development methods that were appropriate for earlier times when less data complexity was the norm.

In addition, the proliferation of fit-for-purpose data stores, including data warehouse appliances, Hadoop-based file systems, and a range of NoSQL data stores, is breaking the hegemony of the traditional data warehouse as the "best" solution to the enterprise-level data integration problem. The business and IT impact of these new approaches can be explored in the Virtualization Magazine article NoSQL and Data Virtualization - Soon to Be Best Friends.

Self-Service Analytics and BI are Important Too!
Responding to constantly changing business demands for analytics and BI is a daunting effort.

Mergers and acquisitions and evolving supply chains require new comparisons and aggregations. The explosion of social media drives demand for new customer insights. Mobile computing changes form factors. And self-service BI puts users in the driver's seat.

Business Taking Charge of Analytics and BI

In true Darwinian fashion, the business side of most organizations is now taking greater responsibility for fulfilling its own information needs rather than depending solely on already-burdened IT resources.

For example, in Self-Service Business Intelligence: TDWI Best Practices Report (July 2011), a survey of over 625 business and IT professionals, The Data Warehousing Institute (TDWI) identified the following top five factors driving businesses toward self-service business intelligence:

  • Constantly changing business needs (65%)
  • IT's inability to satisfy new requests in a timely manner (57%)
  • The need to be a more analytics-driven organization (54%)
  • Slow and untimely access to information (47%)
  • Business user dissatisfaction with IT-delivered BI capabilities (34%)

In the same survey report, authors Claudia Imhoff and Colin White suggest that IT's focus shifts toward making it easier for business users "to access the growing number of dispersed data sources that exist in most organizations."

Examples Imhoff and White cite include:

  • Providing friendlier business views of source data
  • Improving on-demand access to data across multiple data sources
  • Enabling data discovery and search functions
  • Supporting access to other types of data, such as unstructured documents

Data Virtualization to the Self-Service Rescue

In the TDWI survey, 60% of respondents rated business views of source data as "very important," and 44% said on-demand access to multiple data sources using data federation technologies was "very important."

According to Imhoff and White, "Data virtualization and associated data federation technologies enable BI/DW builders to build shared business views of multiple data sources so that the users do not have to be concerned about the physical location or structure of the data."

These views are sometimes known as virtual business views because, from an application perspective, the data appears to be consolidated in a single logical data store. In fact, it may be managed in multiple physical data structures on several different servers.
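A virtual business view of the kind just described can be sketched in miniature. This is an illustrative federation sketch, not the API of Composite or any other platform; the source names (crm, erp) and the view function customer_view are invented for the example. Two physical stores with different schemas are presented to consumers as one logical schema, with no data copied anywhere.

```python
# Minimal sketch of a "virtual business view" federating two physical sources.
# All names are illustrative, not from any vendor's product.
import sqlite3

# Physical source 1: a CRM system with its own column names.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE contacts (cust_name TEXT, region TEXT)")
crm.execute("INSERT INTO contacts VALUES ('Acme', 'EMEA'), ('Globex', 'APAC')")

# Physical source 2: an ERP system with a different schema for the same concept.
erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE accounts (name TEXT, territory TEXT)")
erp.execute("INSERT INTO accounts VALUES ('Initech', 'AMER')")

def customer_view():
    """Federated view: one logical schema (name, region) over both sources.
    Consumers see a single result set and never touch the physical tables,
    their locations, or their differing column names."""
    rows = crm.execute("SELECT cust_name, region FROM contacts").fetchall()
    rows += erp.execute("SELECT name, territory FROM accounts").fetchall()
    return rows

print(sorted(customer_view()))
# → [('Acme', 'EMEA'), ('Globex', 'APAC'), ('Initech', 'AMER')]
```

The query runs against the sources at request time, so the view is always current; the trade-off versus a consolidated warehouse is that each request pays the cost of reaching every underlying source.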

Data virtualization platforms such as the Composite Data Virtualization Platform support access to different types of data sources, including relational databases, non-relational systems, application package databases, flat files, Web data feeds, and Web services.

To Achieve Self-Service BI, Consider Using Data Virtualization provides additional insights on how data virtualization enables self-service analytics and BI.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
