Zettabytes of Data and Beyond

Is data discovery the answer to the data complexity problem?

Data Data Everywhere!
According to IDC's June 2011 report, Extracting Value from Chaos, the world now stores 1.8 zettabytes of information (1.8 trillion gigabytes), a ninefold increase over the past five years.

How can anyone deal with the complexity associated with these volumes?

For the IT teams who must first understand this data before they can use it to create high-impact business applications, this complexity is overwhelming.

As a result, the business waits, opportunities are missed, and IT once again takes the blame.

Too Complex to Model
Today's application development approaches are built on methods suited to an earlier era, when far less data complexity was the norm.

Take data modeling, for example. Data modeling is the critical step in which someone manually builds a logical view of the data an application requires. While many books have been written about the pros and cons of various modeling techniques, all assume the same prerequisite: that the person doing the modeling understands the source data and its relationships.

That's great if you stay in a single domain or subject area, such as SAP or finance. But what happens when the application you are building needs data from SAP, Oracle, salesforce.com, and two different data marts?

  • Does anyone in IT understand all the data models, metadata, syntax, and semantics across these sources?
  • Let alone how they might relate to one another?
  • Or whether they relate at all?

This challenge goes far beyond deciphering five-letter German acronyms in SAP R/3 or flexfields in Oracle E-Business Suite. In this example, each source might contain hundreds of table and column names that include some variation of the word "customer."  Sorting these out takes a lot of time, and the odds are that several iterations will be needed before any models derived from these sources are correct.
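
To get a feel for the scale, consider a small sketch that scans source metadata for names hinting at "customer." The catalog layout, name variants, and matching logic below are hypothetical; real SAP, Oracle, and salesforce.com dictionaries are exposed quite differently.

```python
import re

# Hypothetical metadata snapshots: {source: [(table, column), ...]}.
# Real SAP, Oracle, and salesforce.com catalogs look quite different.
catalogs = {
    "sap":        [("KNA1", "KUNNR"), ("KNVV", "KUNNR"), ("VBAK", "KUNAG")],
    "oracle_ebs": [("HZ_PARTIES", "PARTY_NAME"), ("AR_CUSTOMERS", "CUSTOMER_ID")],
    "salesforce": [("Account", "AccountId"), ("Contact", "CustomerRef__c")],
}

# A few of the name variants a modeler would have to recognize by hand.
CUSTOMER_HINTS = re.compile(r"cust|kun|account|party", re.IGNORECASE)

def candidate_customer_columns(catalogs):
    """List every (source, table, column) whose name hints at 'customer'."""
    return [
        (source, table, column)
        for source, columns in catalogs.items()
        for table, column in columns
        if CUSTOMER_HINTS.search(table) or CUSTOMER_HINTS.search(column)
    ]

for hit in candidate_customer_columns(catalogs):
    print(hit)
```

Even this toy scan returns hits from every source, and in a real landscape the list runs to hundreds of entries, each still needing human judgment.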

What If?
What if we could somehow help data modelers better understand their data?

  • Would they build their data models faster?
  • Would their models be more accurate?
  • Would the developers who use these models to integrate data with tools such as data virtualization be able to build their virtualized views and data services faster?
  • Would the applications that use these views and services be delivered to the business faster?
  • And would the business be better off if the benefits of these new applications could be achieved sooner?

Yes. Yes. Yes. Yes. And Yes!

Discovery Tools Turn "What If" into Reality
A new class of data discovery products can turn these what-ifs into reality. Some are standalone tools derived from the data profiling offerings originally developed for data quality initiatives. Others are embedded in suites alongside the downstream tools they feed; one example is Composite Discovery, which is fully integrated with the Composite Data Virtualization Platform.

These discovery products use advanced mathematical algorithms and heuristics to reveal data patterns that are difficult for even the most experienced data modelers to uncover. By automatically crawling the source data and applying these methods, discovery tools surface data and relationships across multiple source systems scattered throughout an organization. They then present the results in visualization studios that make it easy for modelers to examine the data, locate key entities, and comprehend seemingly hidden connections.
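
Vendors keep their exact algorithms proprietary, but one common heuristic is value overlap: if the distinct values sampled from two columns in different sources overlap heavily, the columns are candidates for a join relationship. The sketch below is illustrative only, not Composite's method, and the sample data is invented.

```python
def jaccard(a, b):
    """Jaccard similarity between two collections of distinct values."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def candidate_joins(columns, threshold=0.5):
    """Rank cross-source column pairs by value overlap.

    columns maps (source, table, column) -> iterable of sampled values.
    """
    keys = list(columns)
    pairs = []
    for i, left in enumerate(keys):
        for right in keys[i + 1:]:
            if left[0] == right[0]:      # skip pairs within one source
                continue
            score = jaccard(columns[left], columns[right])
            if score >= threshold:
                pairs.append((score, left, right))
    return sorted(pairs, reverse=True)

# Invented samples of distinct key values from two sources.
samples = {
    ("sap", "KNA1", "KUNNR"):            ["C100", "C101", "C102", "C103"],
    ("salesforce", "Account", "Ext_Id"): ["C101", "C102", "C103", "C200"],
}

for score, left, right in candidate_joins(samples):
    print(f"{score:.2f}  {left} <-> {right}")
```

Production tools layer on more signals, such as name similarity, data type and cardinality checks, and inclusion dependencies, but the pattern is the same: the machine proposes relationships and the modeler confirms them.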

Modelers can use that knowledge to quickly build the required data models or schemas, then turn them over to data integration teams, who bind the models to the sources using views or data services. These rapidly built views are easy to validate and test with business users. If iteration is required, it can be done quickly. And once the views and data services are firmed up, building out the application's user interface layer is a snap.
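
To make the binding step concrete, here is a hedged sketch, reusing the hypothetical tables and key columns from the earlier examples, of how a confirmed relationship might translate into view DDL. The actual syntax and tooling vary by data virtualization platform.

```python
def view_ddl(view_name, left, right, keys):
    """Render illustrative DDL that binds a discovered relationship
    to a virtualized view. left and right are (source, table) pairs,
    keys is (left_column, right_column); all names are hypothetical."""
    (l_src, l_tbl), (r_src, r_tbl) = left, right
    l_key, r_key = keys
    return (
        f"CREATE VIEW {view_name} AS\n"
        f"SELECT l.*, r.*\n"
        f"FROM {l_src}.{l_tbl} l\n"
        f"JOIN {r_src}.{r_tbl} r\n"
        f"  ON l.{l_key} = r.{r_key}"
    )

print(view_ddl("unified_customer",
               ("sap", "KNA1"), ("salesforce", "Account"),
               ("KUNNR", "Ext_Id")))
```

The point is not the SQL itself but the workflow: discovery proposes, the modeler confirms, and the binding becomes a short, testable artifact that business users can iterate on.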

Data Discovery Delivers Faster Time to Solution
By accelerating the work of finding data relationships and modeling them, discovery tools eliminate much of the time and effort typically spent uncovering the entities and relationships needed to build data models. Accelerating these critical early steps in the development lifecycle reduces overall time to solution for new applications. IT looks better, and the business gains the application benefits sooner.

Data Discovery Delivers Better Quality Applications
Discovery tools help align data's business and technical contexts, facilitating greater collaboration between business and IT professionals. Tools such as Composite Discovery display metadata in an easy-to-read format that lets modelers and end users validate requirements with greater confidence. More accurate validation at the front end of a project reduces corrective actions in downstream steps, which means less frustration and a higher-quality application.

Data Discovery Frees Top Talent
Discovery tools provide ease of use and automation that reduce the need for deep data modeling expertise, freeing top data modelers to redirect their efforts toward other data governance activities.

Go for it!
In the time it took to read this article, your organization likely added a gigabyte of new data. Old methods and manual techniques cannot keep pace. Try data discovery. You'll be glad you did.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
