CIOs' Top Priority: Analytics and BI

How to Deal with the Data Integration Bottleneck

Whether as a driver for growth, a means to attract and retain customers, or a way to drive innovation and reduce costs, the business value of analytics and business intelligence has never been higher.

Gartner's Amplifying the Enterprise: The 2012 CIO Agenda and IBM's Global CIO Study 2011 both confirm this point, with analytics and BI sitting atop CIOs' technology priorities in each report.

Data Integration Is the Biggest Bottleneck
Providing analytics and BI solutions with the data required has always been difficult, with data integration long considered the biggest bottleneck in any analytics or BI project.

Complex data landscapes, diverse data types, and new sources such as big data and the cloud are but a few of the well-known barriers.

For the past two decades, the default solution has been to first consolidate the data into a data warehouse, and then provide users with tools to analyze and report on this consolidated data.

However, data integration based on these traditional replication and consolidation approaches involves numerous moving parts that must be kept synchronized. Doing this right extends lead times.
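To make the moving parts concrete, here is a minimal sketch of the consolidate-then-analyze pattern. All source names, fields, and functions are invented for illustration; each stage (extract, transform, load) is a separate step that must be revisited whenever a source or schema changes.

```python
def extract(source):
    """Pull raw records from one source system (stubbed here as a list)."""
    return list(source)

def transform(records):
    """Conform records to the warehouse schema (uppercase region codes)."""
    return [{**r, "region": r["region"].upper()} for r in records]

def load(warehouse, records):
    """Append conformed records to the consolidated store."""
    warehouse.extend(records)
    return warehouse

# Two illustrative source systems (names and fields are hypothetical).
crm_orders = [{"customer": "Acme", "region": "west", "amount": 1200}]
erp_orders = [{"customer": "Globex", "region": "east", "amount": 800}]

warehouse = []
for source in (crm_orders, erp_orders):
    load(warehouse, transform(extract(source)))

print(len(warehouse))          # 2 consolidated rows
print(warehouse[0]["region"])  # WEST
```

Even in this toy version, adding a new source means touching extract, transform, and load logic in step, which is why lead times grow with each addition.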

The Data Warehousing Institute confirms this lack of agility. Its recent studies found the average time needed to add a new data source to an existing BI application was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011, and that 33% of organizations needed more than three months to add a new data source.

Data Virtualization Brings Agility to Analytics and BI
According to Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, data virtualization significantly accelerates data integration agility. Key to this success has been data virtualization's ability to provide:

  • A more streamlined data integration approach
  • A more iterative development process
  • A more adaptable change management process

Using data virtualization as a complement to existing data integration approaches, the ten organizations profiled in the book cut analytics and BI project times by half or more.

This agility allowed the same teams to double their number of analytics and BI projects, significantly accelerating business benefits.

For more insights on data virtualization and business agility, check out my earlier articles on this topic.

Simplify to Overcome Historical IT Complexity

Data virtualization's simplified information access and faster time-to-solution are especially useful as enablers of more agile analytics and BI.

Is Data Virtualization the Fast Path to BI Agility? describes how the architectures of most business intelligence systems are based on a complex chain of data stores, starting with production databases and running through data staging areas, a data warehouse, dependent data marts, and personal data stores. Simply maintaining this complexity is overwhelming IT today.

These classic BI architectures served business well for the last twenty years. However, considering the need for more agility, they have some disadvantages:

  • Duplication of data
  • Non-shared metadata specifications
  • Limited flexibility
  • Decreased data quality
  • Limited support for operational reporting
  • Limited support for reporting on unstructured and external data

From a different point of view, SOA World's Zettabytes of Data and Beyond describes the challenges of force-fitting development methods that were appropriate for earlier times when less data complexity was the norm.

In addition, the proliferation of fit-for-purpose data stores, including data warehouse appliances, Hadoop-based file systems, and a range of NoSQL data stores, is breaking the hegemony of the traditional data warehouse as the "best" solution to the enterprise-level data integration problem. The business and IT impact of these new approaches is explored in the Virtualization Magazine article NoSQL and Data Virtualization - Soon to Be Best Friends.

Self-Service Analytics and BI are Important Too!
Responding to constantly changing business demands for analytics and BI is a daunting effort.

Mergers and acquisitions and evolving supply chains require new comparisons and aggregations. The explosion of social media drives demand for new customer insights. Mobile computing changes form factors. And self-service BI puts users in the driver's seat.

Business Taking Charge of Analytics and BI

In true Darwinian fashion, the business side of most organizations is now taking greater responsibility for fulfilling its own information needs rather than depending solely on already-burdened IT resources.

For example, in a 2011 survey of over 625 business and IT professionals, Self-Service Business Intelligence: TDWI Best Practices Report (July 2011), The Data Warehousing Institute (TDWI) identified the following top five factors driving businesses toward self-service business intelligence:

  • Constantly changing business needs (65%)
  • IT's inability to satisfy new requests in a timely manner (57%)
  • The need to be a more analytics-driven organization (54%)
  • Slow and untimely access to information (47%)
  • Business user dissatisfaction with IT-delivered BI capabilities (34%)

In the same survey report, authors Claudia Imhoff and Colin White suggest that IT's focus shifts toward making it easier for business users "to access the growing number of dispersed data sources that exist in most organizations."

Examples Imhoff and White cite include:

  • Providing friendlier business views of source data
  • Improving on-demand access to data across multiple data sources
  • Enabling data discovery and search functions
  • Supporting access to other types of data, such as unstructured documents, and more

Data Virtualization to the Self-Service Rescue

In the TDWI survey, 60% of respondents rated business views of source data as "very important," and 44% said on-demand access to multiple data sources using data federation technologies was "very important."

According to Imhoff and White, "Data virtualization and associated data federation technologies enable BI/DW builders to build shared business views of multiple data sources so that the users do not have to be concerned about the physical location or structure of the data.

These views are sometimes known as virtual business views because, from an application perspective, the data appears to be consolidated in a single logical data store. In fact, it may be managed in multiple physical data structures on several different servers."

Data virtualization platforms such as the Composite Data Virtualization Platform support access to different types of data sources, including relational databases, non-relational systems, application package databases, flat files, Web data feeds, and Web services.
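The idea of a virtual business view can be sketched in a few lines. This is not the Composite platform's API; it is a hypothetical illustration in which an in-memory SQLite table stands in for a relational source, a list of dicts stands in for a web data feed, and the join happens at query time, with no data copied into a warehouse.

```python
import sqlite3

# Source 1: a relational database (in-memory SQLite stands in for it).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (customer TEXT, amount INTEGER)")
db.execute("INSERT INTO sales VALUES ('Acme', 1200), ('Globex', 800)")

# Source 2: a non-relational feed (a list of dicts stands in for a web service).
crm_feed = [{"customer": "Acme", "segment": "Manufacturing"},
            {"customer": "Globex", "segment": "Retail"}]

def customer_view():
    """Join both sources on demand into one logical business view."""
    segments = {r["customer"]: r["segment"] for r in crm_feed}
    for customer, amount in db.execute("SELECT customer, amount FROM sales"):
        yield {"customer": customer, "amount": amount,
               "segment": segments.get(customer)}

for row in customer_view():
    print(row)
```

The consumer of `customer_view` never sees where the data physically lives, which is the point Imhoff and White make: users work against a shared business view, not against source locations and structures.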

To Achieve Self-Service BI, Consider Using Data Virtualization provides additional insights on how data virtualization enables self-service analytics and BI.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
