Zettabytes of Data and Beyond

Is data discovery the answer to the data complexity problem?

Data Data Everywhere!
According to IDC's June 2011 report, Extracting Value from Chaos, the amount of digital information created and replicated in 2011 will surpass 1.8 zettabytes (1.8 trillion gigabytes), having grown by a factor of nine in just five years.

How can anyone deal with the complexity associated with these volumes?

For IT teams, who must first understand this data before they can figure out how to use it in high-impact business applications, this complexity is overwhelming.

As a result, the business waits, opportunities are missed, and IT once again takes the blame.

Too Complex to Model
Today's application development approaches are based on methods that were appropriate for earlier times when less data complexity was the norm.

Take data modeling, for example. Data modeling is the critical step in which someone manually models a logical view of the data an application requires. While many books have been written about the pros and cons of various modeling techniques, all assume the same prerequisite: that the person doing the modeling understands the source data and its relationships.
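
For readers who have never done it, "modeling a logical view" amounts to writing down entities, attributes, and relationships before touching any physical tables. Here is a minimal sketch in Python; the Customer and Order entities are hypothetical, not drawn from any particular source system.

    from dataclasses import dataclass

    @dataclass
    class Customer:
        customer_id: str
        name: str

    @dataclass
    class Order:
        order_id: str
        customer_id: str  # relationship: each Order references one Customer
        amount: float

    # The modeler's real work is deciding that these entities, attributes,
    # and the Customer-Order relationship faithfully reflect the sources.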

That's great if you stay in a single domain or subject area, such as SAP or finance. But what happens when the application you are building needs data from SAP, Oracle, and two different data marts?

  • Does anyone in IT understand all the data models, metadata, syntax, and semantics across these sources?
  • Let alone how these sources might relate to one another?
  • Or whether they relate at all?

This challenge is far bigger than the need to understand five-letter German acronyms in SAP R/3 or flexfields in Oracle E-Business. In this example, each of these sources might have hundreds of table and column names that include some variation of the word "customer." Understanding them all takes a lot of time, and the odds are it will take several iterations before any models derived from these sources are correct.
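
To make the scale of that search concrete, here is a minimal Python sketch that scans source catalogs for table and column names resembling "customer." The catalogs and name variants are hypothetical; real systems expose this metadata through JDBC/ODBC calls or dictionary tables, and a real effort must also cope with abbreviations such as SAP's KUNNR.

    import re

    # Hypothetical catalogs: source -> {table: [columns]}.
    catalogs = {
        "SAP":       {"KNA1": ["KUNNR", "NAME1"], "VBAK": ["KUNNR", "VBELN"]},
        "Oracle":    {"AR_CUSTOMERS": ["CUSTOMER_ID", "CUSTOMER_NAME"]},
        "DataMart1": {"DIM_CUST": ["CUST_KEY", "CUST_NM"]},
        "DataMart2": {"CUSTOMER_DIM": ["CUSTOMER_SK", "CUST_NAME"]},
    }

    # Name fragments that might mean "customer", including SAP's KUNNR/KNA1.
    pattern = re.compile(r"CUST|KUNNR|KNA", re.IGNORECASE)

    matches = [
        (source, table, column)
        for source, tables in catalogs.items()
        for table, columns in tables.items()
        for column in columns
        if pattern.search(table) or pattern.search(column)
    ]

    for source, table, column in matches:
        print(f"{source}.{table}.{column}")

Even this toy scan returns a noisy pile of candidates; in a real enterprise, someone still has to decide which of the hundreds of hits actually mean the same thing.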

What If?
What if we could somehow help data modelers better understand their data?

  • Would they build their data models faster?
  • Would their models be more accurate?
  • Would the developers who use these models to integrate data with tools such as data virtualization be able to build their virtualized views and data services faster?
  • Would the applications that use these views and services be delivered to the business faster?
  • And would the business be better off if the benefits of these new applications could be achieved sooner?

Yes. Yes. Yes. Yes. And Yes!

Discovery Tools Turn "What If" into Reality
There is a new class of data discovery products that can turn these what-ifs into reality. Some are standalone tools derived from the data profiling offerings originally developed for data quality initiatives. Others are embedded alongside downstream tools in integrated suites; Composite Discovery, for example, is fully integrated with the Composite Data Virtualization Platform.

These discovery products use advanced mathematical algorithms and heuristics to reveal data patterns that are difficult for even the most experienced data modelers to uncover.  Automatically crawling the source data and applying these methods, discovery tools reveal data and relationships across multiple source systems scattered throughout an organization.  These products then present the data to the modelers using visualization studios that make it easy for the modelers to examine data, locate key entities and comprehend seemingly hidden connections.
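
The vendors do not publish their algorithms, but one widely used profiling heuristic is easy to illustrate: treat two columns as a candidate join key when their distinct values overlap heavily. The sketch below uses made-up sample data; it is a representative technique, not Composite Discovery's actual method.

    def value_overlap(col_a, col_b):
        """Jaccard similarity of two columns' distinct values, a common
        data-profiling heuristic for proposing join relationships."""
        a, b = set(col_a), set(col_b)
        if not a or not b:
            return 0.0
        return len(a & b) / len(a | b)

    # Hypothetical column samples drawn from two different sources.
    sap_kunnr = ["C100", "C101", "C102", "C103"]
    oracle_customer_id = ["C101", "C102", "C103", "C104"]

    score = value_overlap(sap_kunnr, oracle_customer_id)
    if score > 0.5:  # the threshold is a tuning choice
        print(f"Candidate join key, overlap = {score:.2f}")

Run pairwise across every column in every source, heuristics like this surface relationships no single modeler could find by inspection.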

Modelers can use that knowledge to quickly build the required data models or schemas, then turn them over to data integration teams, who bind the models to the sources using views or data services. These rapidly built views are easy to validate and test with business users. If iteration is required, it can be done quickly. And once the views and data services are firmed up, building out the application's user interface layer is a snap.
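
In a data virtualization platform, "binding" the model typically means defining a view over the physical sources. The Python sketch below shows what such a mapping might look like; the table and column names are hypothetical, and the exact DDL syntax varies by platform.

    # Hypothetical mapping produced by the discovery step:
    # logical "Customer" attributes -> physical columns in two sources.
    mapping = {
        "customer_id":   "sap.KNA1.KUNNR",
        "customer_name": "sap.KNA1.NAME1",
        "credit_limit":  "oracle.AR_CUSTOMERS.CREDIT_LIMIT",
    }

    # Build the DDL for a virtualized view over both sources.
    select_list = ",\n  ".join(f"{src} AS {logical}" for logical, src in mapping.items())
    ddl = (
        "CREATE VIEW customer_v AS\nSELECT\n  "
        + select_list
        + "\nFROM sap.KNA1"
        + "\nJOIN oracle.AR_CUSTOMERS"
        + "\n  ON sap.KNA1.KUNNR = oracle.AR_CUSTOMERS.CUSTOMER_ID"
    )
    print(ddl)  # in practice, submitted through the platform's console or API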

Data Discovery Delivers Faster Time to Solution
By accelerating the process of finding data relationships and building models, discovery tools eliminate much of the time and effort data modelers typically spend uncovering the entities and relationships their data models require. Accelerating these critical early steps in the development lifecycle reduces overall time to solution for new applications. IT looks better, and the business gains the application's benefits sooner.

Data Discovery Delivers Better Quality Applications
Discovery tools help align data's business and technical contexts, facilitating greater collaboration between business and IT professionals. Tools such as Composite Discovery display metadata in an easy-to-read format that lets modelers and end users validate requirements with greater confidence. More accurate validation at the front end of a project reduces corrective actions downstream, which lowers frustration and yields a higher-quality application.

Data Discovery Frees Top Talent
Discovery tools provide ease of use and automation that reduce the need for deep data modeling expertise, freeing top data modelers to redirect their efforts toward other data governance activities.

Go for it!
In the time it took to read this article, your organization likely added a gigabyte of new data.  Old methods and manual techniques cannot keep pace.  Try data discovery.  You'll be glad you did.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
