Zettabytes of Data and Beyond

Is data discovery the answer to the data complexity problem?

Data, Data Everywhere!
According to IDC's June 2011 report, Extracting Value from Chaos, the amount of information currently stored is 1.8 zettabytes (1.8 trillion gigabytes), a ninefold increase over the past five years.
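
To put that growth in perspective, here is a quick back-of-the-envelope check, a minimal Python sketch using only the figures quoted above:

```python
# Sanity-check the IDC figures quoted above.
ZETTABYTE_IN_GB = 10**12                    # 1 ZB = 10^21 bytes = 10^12 GB

stored_zb = 1.8                             # total stored, per IDC (2011)
print(f"{stored_zb} ZB = {stored_zb * ZETTABYTE_IN_GB:,.0f} GB")

# A ninefold increase over five years implies this compound annual rate:
annual_growth = 9 ** (1 / 5) - 1
print(f"Implied annual growth: {annual_growth:.0%}")   # roughly 55%
```

A factor of nine in five years works out to roughly 55 percent compound annual growth, which is why methods that merely keep pace today fall hopelessly behind tomorrow.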

How can anyone deal with the complexity associated with these volumes?

For IT teams, understanding this data is the essential first step before they can figure out how to use it to build high-impact business applications, and at these volumes the complexity is overwhelming.

As a result, the business waits, opportunities are missed, and IT once again takes the blame.

Too Complex to Model
Today's application development approaches are based on methods that were appropriate for earlier times when less data complexity was the norm.

Take data modeling, for example. Data modeling is the critical step where someone manually models a logical view of the data that an application requires. While many books have been written about the pros and cons of various modeling techniques, all assume the same prerequisite: that the person doing the modeling understands the source data and its relationships.

That's great if you stay in a single domain or subject area, such as SAP or finance. But what happens when the application you are building needs data from SAP, Oracle, salesforce.com, and two different data marts?

  • Does anyone in IT understand all the data models, metadata, syntax, semantics and more across these sources?
  • Let alone how these might relate to one another?
  • Or whether they relate at all?

This challenge is far bigger than the need to understand five-letter German acronyms in SAP R/3 or flexfields in Oracle E-Business Suite. In this example, each of these sources might have hundreds of table and column names that include some variation of the word "customer." Understanding them all takes a lot of time. And the odds are it will take several iterations before any models derived from these sources are correct.
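
To see why, consider a naive first attempt: scanning each source's catalog for column names that look like "customer." The sketch below is illustrative Python with invented catalogs and column names, not any source's real schema:

```python
from difflib import SequenceMatcher

# Invented column inventories, standing in for what you might pull from
# each source's catalog (INFORMATION_SCHEMA, the SAP data dictionary,
# Salesforce describe calls, and so on).
catalogs = {
    "SAP R/3":    ["KUNNR", "NAME1", "CUST_GRP"],
    "Oracle EBS": ["PARTY_NAME", "CUSTOMER_TRX_ID", "CUST_ACCOUNT_ID"],
    "Salesforce": ["Account.Name", "Contact.AccountId", "CustomerPriority__c"],
    "Data mart":  ["DIM_CUSTOMER.CUST_KEY", "FACT_SALES.CUSTOMER_SK"],
}

def looks_like_customer(name: str, threshold: float = 0.6) -> bool:
    """Naive heuristic: substring hit or fuzzy match against 'customer'."""
    token = name.lower()
    if "cust" in token:
        return True
    return any(
        SequenceMatcher(None, part, "customer").ratio() >= threshold
        for part in token.replace("_", ".").split(".")
    )

for source, columns in catalogs.items():
    hits = [c for c in columns if looks_like_customer(c)]
    print(f"{source}: {hits}")
```

Notice what slips through: KUNNR, SAP's customer number field, and Salesforce's Account object never match. Name-based matching alone cannot bridge these naming conventions, which is precisely the modeler's dilemma.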

What If?
What if we could somehow help data modelers better understand their data?

  • Would they build their data models faster?
  • Would their models be more accurate?
  • Would the developers who use these models to integrate data, with tools such as data virtualization, be able to build their virtualized views and data services faster?
  • Would the applications that use these views and services be delivered to the business faster?
  • And would the business be better off if the benefits of these new applications could be achieved sooner?

Yes. Yes. Yes. Yes. And Yes!

Discovery Tools Turn "What If" into Reality
A new class of data discovery products can turn these what-ifs into reality. Some are standalone tools derived from data profiling offerings originally developed for data quality initiatives. Others are built into integrated suites alongside the downstream tools they feed; one example is Composite Discovery, which is fully integrated with the Composite Data Virtualization Platform.

These discovery products use advanced mathematical algorithms and heuristics to reveal data patterns that are difficult for even the most experienced data modelers to uncover.  Automatically crawling the source data and applying these methods, discovery tools reveal data and relationships across multiple source systems scattered throughout an organization.  These products then present the data to the modelers using visualization studios that make it easy for the modelers to examine data, locate key entities and comprehend seemingly hidden connections.
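
As a deliberately simplified illustration of one such heuristic (not Composite Discovery's actual algorithm, and with invented data), consider scoring candidate relationships by the overlap of the values two columns actually contain, rather than by their names:

```python
from itertools import combinations

def overlap_score(values_a: set, values_b: set) -> float:
    """Jaccard similarity of two columns' distinct values."""
    if not values_a or not values_b:
        return 0.0
    return len(values_a & values_b) / len(values_a | values_b)

# Distinct values sampled from three sources (all hypothetical).
columns = {
    ("SAP R/3", "KUNNR"):            {"C001", "C002", "C003", "C004"},
    ("Salesforce", "AccountNumber"): {"C002", "C003", "C004", "C005"},
    ("Data mart", "ORDER_ID"):       {"O-10", "O-11", "O-12"},
}

# Compare every pair of columns drawn from the catalog.
for key_a, key_b in combinations(columns, 2):
    score = overlap_score(columns[key_a], columns[key_b])
    if score > 0.5:
        print(f"Candidate relationship: {'.'.join(key_a)} <-> "
              f"{'.'.join(key_b)} (value overlap {score:.0%})")
```

This toy version links SAP's KUNNR to Salesforce's AccountNumber purely from the data itself, a relationship that no amount of column-name inspection would reveal.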

Modelers can use that knowledge to quickly build the data models or schemas required. They can then turn these over to data integration teams, who bind the models to the sources using views or data services. These rapidly built views are easy to validate and test with business users. If iteration is required, it can be done quickly. And once the views and data services are firmed up, building out the application's user interface layer is a snap.
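
As a rough sketch of that binding step (using pandas and invented source tables; a data virtualization platform would instead define a federated view evaluated at query time, but the mapping logic is the same):

```python
import pandas as pd

# Invented extracts from two sources.
sap_customers = pd.DataFrame(
    {"KUNNR": ["C001", "C002"], "NAME1": ["Acme GmbH", "Globex AG"]}
)
sfdc_accounts = pd.DataFrame(
    {"AccountNumber": ["C002", "C005"], "Name": ["Globex AG", "Initech"]}
)

# Bind each physical source to the logical model by renaming its columns,
# then union into a single "customer" view, de-duplicated on the key.
customer_view = pd.concat(
    [
        sap_customers.rename(
            columns={"KUNNR": "customer_id", "NAME1": "customer_name"}
        ),
        sfdc_accounts.rename(
            columns={"AccountNumber": "customer_id", "Name": "customer_name"}
        ),
    ],
    ignore_index=True,
).drop_duplicates(subset="customer_id")

print(customer_view)
```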

Data Discovery Delivers Faster Time to Solution
By accelerating the relationship-finding and modeling process, discovery tools eliminate much of the time and effort data modelers typically spend uncovering the entities and relationships necessary to build data models. Accelerating these critical early lifecycle steps reduces overall time to solution for new applications. IT looks better. And the business gains the application benefits sooner.

Data Discovery Delivers Better Quality Applications
Discovery tools help align data's business and technical contexts, facilitating greater collaboration between business and IT professionals. Discovery tools such as Composite Discovery display metadata in an easy-to-read format that allows modelers and end users to validate requirements with greater confidence. More accurate validation at the front end of a project reduces corrective actions in downstream steps. This reduces frustration and ensures a higher-quality application.

Data Discovery Frees Top Talent
Discovery tools provide ease of use and automation that reduce the need for scarce data modeling expertise. Top data modelers can redirect their efforts toward other data governance activities.

Go for it!
In the time it took to read this article, your organization likely added a gigabyte of new data.  Old methods and manual techniques cannot keep pace.  Try data discovery.  You'll be glad you did.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
