Zettabytes of Data and Beyond

Is data discovery the answer to the data complexity problem?

Data Data Everywhere!
According to IDC's June 2011 report, Extracting Value from Chaos, the amount of information currently stored is 1.8 zettabytes (1.8 trillion gigabytes). That volume has grown by a factor of nine in the past five years.

How can anyone deal with the complexity associated with these volumes?

For IT teams, who must understand this data before they can figure out how to use it in high-impact business applications, this complexity is overwhelming.

As a result, the business waits, opportunities are missed, and IT once again takes the blame.

Too Complex to Model
Today's application development approaches are based on methods that were appropriate for earlier times when less data complexity was the norm.

Take data modeling, for example. Data modeling is the critical step in which someone manually builds a logical view of the data that an application requires. While many books have been written about the pros and cons of various modeling techniques, all assume the same prerequisite: that the person doing the modeling understands the source data and its relationships.

That's great if you stay in a single domain or subject area, such as SAP or finance. But what happens when the application you are building needs data from SAP, Oracle, salesforce.com, and two different data marts?

  • Does anyone in IT understand all the data models, metadata, syntax, semantics, and more across these sources?
  • Let alone how these might relate to one another?
  • Or whether they relate at all?

This challenge goes far beyond understanding five-letter German acronyms in SAP R/3 or flexfields in Oracle E-Business. In this example, each of these sources might have hundreds of table and column names that include some variation of the word "customer." Understanding them all takes a lot of time. And the odds are that several iterations will be needed before any models derived from these sources are correct.
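To make the scale of the problem concrete, here is a minimal sketch of the kind of metadata scan a modeler would otherwise perform by hand. The source names, column names (including SAP's German-derived KUNNR), and the match pattern are all hypothetical, and real catalogs would be read from each system rather than hardcoded:

```python
import re

# Hypothetical metadata snapshots from three sources; in practice these
# would be read from each system's catalog (e.g. information_schema).
SOURCE_COLUMNS = {
    "SAP":        ["KUNNR", "KNA1.NAME1", "CUST_GRP"],
    "Oracle":     ["CUSTOMER_ID", "CUST_ACCT_NO", "INVOICE_ID"],
    "Salesforce": ["AccountId", "Customer_Ref__c", "OwnerId"],
}

# Naive pattern for "customer" variations -- real discovery tools apply far
# richer heuristics (synonyms, abbreviations, localized terms like KUNNR).
CUSTOMER_PATTERN = re.compile(r"cust|kunnr|account", re.IGNORECASE)

def find_customer_columns(catalog):
    """Return {source: [columns]} whose names look customer-related."""
    hits = {}
    for source, columns in catalog.items():
        matched = [c for c in columns if CUSTOMER_PATTERN.search(c)]
        if matched:
            hits[source] = matched
    return hits

print(find_customer_columns(SOURCE_COLUMNS))
```

Even this toy scan surfaces six candidate columns across three systems; a simple name match cannot say which of them actually refer to the same customer entity, which is exactly the gap discovery tools aim to fill.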

What If?
What if we could somehow help data modelers better understand their data?

  • Would they build their data models faster?
  • Would their models be more accurate?
  • Would the developers who use these models to integrate data with tools such as data virtualization build their virtualized views and data services faster?
  • Would the applications that use these views and services be delivered to the business faster?
  • And would the business be better off if the benefits of these new applications could be achieved sooner?

Yes. Yes. Yes. Yes. And Yes!

Discovery Tools Turn "What If" into Reality
There is a new class of data discovery products that can turn these what-ifs into reality. Some are standalone tools derived from the data profiling offerings originally developed for data quality initiatives. Others are bundled with downstream tools in integrated suites; an example is Composite Discovery, which is fully integrated with the Composite Data Virtualization Platform.

These discovery products use advanced mathematical algorithms and heuristics to reveal data patterns that are difficult for even the most experienced data modelers to uncover. By automatically crawling source data and applying these methods, discovery tools reveal data and relationships across multiple source systems scattered throughout an organization. These products then present the data to modelers in visualization studios that make it easy to examine data, locate key entities, and comprehend seemingly hidden connections.
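One simple heuristic of the kind such tools might employ (the article does not specify Composite Discovery's actual algorithms) is value-overlap scoring: if two columns in different systems share most of their distinct values, they are candidates for the same entity or join key. A minimal sketch, with invented sample data:

```python
def jaccard(a, b):
    """Jaccard similarity of two columns' distinct values -- one simple
    signal that they may hold the same entity (e.g. a join key)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a or b else 0.0

# Hypothetical value samples drawn from two different source systems.
sap_kunnr   = ["C001", "C002", "C003", "C004"]
crm_account = ["C002", "C003", "C004", "C999"]

score = jaccard(sap_kunnr, crm_account)
print(round(score, 2))  # 3 shared values out of 5 distinct overall
```

A score this high (0.6) would flag the pair for a modeler to confirm, whereas a near-zero score would rule the relationship out, sparing the modeler a manual comparison.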

Modelers can use that knowledge to quickly build the required data models or schemas, then turn them over to data integration teams, who bind the models to the sources using views or data services. These rapidly built views are easy to validate and test with business users. If iteration is required, it can be done quickly. And once the views and data services are firmed up, building out the application's user interface layer is a snap.
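The binding step above can be illustrated with a toy virtual view: rather than copying data into a warehouse, the view resolves rows across sources on demand, joining on the key the discovery step surfaced. The two in-memory "sources" and the join key are assumptions for illustration only:

```python
# Two in-memory stand-ins for live sources; a real data virtualization
# platform would query the actual systems at access time.
sap_customers = {"C002": "Acme GmbH", "C003": "Globex"}
crm_orders    = [("C002", 1200.0), ("C003", 450.0), ("C999", 80.0)]

def customer_order_view():
    """Yield (customer_name, amount) rows, resolved on demand by joining
    CRM orders to SAP customer names on the discovered key."""
    for cust_id, amount in crm_orders:
        name = sap_customers.get(cust_id)
        if name is not None:  # inner join: drop orders with no known customer
            yield (name, amount)

print(list(customer_order_view()))
```

Because the view is just a definition over the sources, iterating on it with business users means editing the join logic, not reloading data, which is why such views are quick to validate and revise.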

Data Discovery Delivers Faster Time to Solution
By accelerating the process of finding data relationships and modeling them, discovery tools eliminate much of the time and effort data modelers typically spend uncovering the entities and relationships needed to build data models. Accelerating these critical early steps in the development lifecycle reduces overall time to solution for new applications. IT looks better. And the business gains the application's benefits sooner.

Data Discovery Delivers Better Quality Applications
Discovery tools help align data's business and technical contexts, facilitating greater collaboration between business and IT professionals. Discovery tools such as Composite Discovery display metadata in an easy-to-read format that allows modelers and end users to validate requirements with greater confidence. More accurate validation at the front end of the project reduces corrective actions in downstream steps. This reduces frustration and ensures a higher-quality application.

Data Discovery Frees Top Talent
Discovery tools provide ease of use and automation that reduce the need for data modeling expertise. Top data modelers can redirect their efforts toward other data governance activities.

Go for it!
In the time it took to read this article, your organization likely added a gigabyte of new data.  Old methods and manual techniques cannot keep pace.  Try data discovery.  You'll be glad you did.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
