Naturally Increasing Data Value with Hierarchical Structures

Why hierarchical structures are useful

Hierarchical structures have an inherent ability to increase data value significantly beyond the data actually collected. This article shows how that increase arises from the structures themselves and, even more powerfully, from their natural hierarchical processing capabilities, which offer flexible and efficient ways to increase data value automatically. SQL will be used to perform a wide range of hierarchical processing operations that demonstrate these data value increasing capabilities.

Basic Hierarchical Data Modeling
The SQL view definition in Figure 1 below uses a sequence of standard SQL LEFT Outer Joins to model the hierarchical structure shown. The hierarchical view is modeled at the node level, so it can define basic physical structures such as IMS or XML, or logical hierarchical structures built from flat relational data. The LEFT Outer Join hierarchically preserves the left data argument over the right argument, allowing the left data argument to exist even when there is no matching right data argument. These LEFT Outer Joins can be strung together with the ON clause to define and build full hierarchical multiple path structures. The ON clause, introduced in the ANSI SQL-92 Outer Join, offers more local control than the WHERE clause for specifying join criteria. Notice in Figure 1 below that an ON clause is specified at each join point to control that specific join operation. Together, these capabilities allow the LEFT Outer Join syntax to model hierarchical structures exactly and unambiguously, which was not previously possible using a single WHERE clause for join control. The WHERE clause is still used to specify global data filtering, described later.


CREATE VIEW RDB AS
SELECT * FROM R
LEFT JOIN D ON R.r=D.d    /* Extend path from node R to node D */
LEFT JOIN B ON R.r=B.b    /* Start new path from node R to node B */

Basic Structure

   R
  / \
 D   B

Figure 1: SQL Hierarchical View

The hierarchical semantics corresponding to the LEFT Outer Join data modeling syntax in Figure 1 define exactly how this data structure definition operates hierarchically: it performs hierarchical data preservation when processed directly by the ANSI SQL processor. As shown in Figure 1 above, this hierarchical data definition can be specified in an SQL view. The R, D, and B items in the hierarchical view structure are relational table names that also represent nodes in the hierarchical structure.

The hierarchical structure defined in Figure 1 above follows basic hierarchical principles supplied by the SQL Outer Join semantics. Nodes D and B cannot exist without the higher level R node, but node R can exist without nodes D and B. Node D can exist without node B, and vice versa, because hierarchical pathways are independent of each other.
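As a concrete sketch of these preservation rules (following the article's convention of referencing node columns through the view, and using hypothetical data occurrences with the single columns r, d, and b implied by the join criteria in Figure 1), a query of the RDB view keeps each R occurrence even when one of its pathways has no matching data:

SELECT R.r, D.d, B.b
FROM RDB
/* Hypothetical data: R occurrences r=1 and r=2, one D occurrence d=1, one B occurrence b=2.
   Hierarchically preserved rowset:
   r | d    | b
   1 | 1    | NULL    (R occurrence 1 has a D child; its empty B path is preserved with a null)
   2 | NULL | 2       (R occurrence 2 has a B child; its empty D path is preserved with a null) */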

Hierarchical Structures Contain More Data Value than Data Stored
As a hierarchical structure naturally grows top to bottom and left to right, as in Figure 1 above, its data value increases nonlinearly as new nodes are added. This is because natural data reuse occurs in a hierarchical pyramid fashion. Both linear single path data stacking and the more powerful nonlinear multiple path stacking significantly increase the value of the data.

With linear path data stacking, the multiple data occurrences of a lower level node are shared across their current parent data occurrence. With nonlinear path data stacking, all of the lower level pathways under the current parent data occurrence are included in the data sharing. All of these natural combinations of related data enable hierarchical structures to contain more data value than was initially captured. This automatic accrual of data value as new data is added is a valuable capability that is not being utilized today, and it is a good reason to use hierarchical data structures when possible. SQL hierarchical processing can automatically utilize this goldmine of accrued information to perform the more powerful queries described later.
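A minimal sketch of this data reuse against the RDB view from Figure 1 (the data occurrences are hypothetical): querying both pathways pairs each parent's D occurrences with its B occurrences, so the processed rowset contains more related data combinations than the rows originally stored.

SELECT R.r, D.d, B.b
FROM RDB
/* If one R occurrence has two D children and two B children (four stored
   child rows), the rowset relates them as 2 x 2 = 4 natural combinations
   sharing the same parent, and the number of combinations keeps growing
   as new occurrences and new pathways are added */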

Static Hierarchical Structure Combining
Basic structure views like the one in Figure 1 above can also be joined together hierarchically to create larger, more powerful views that combine hierarchical structures. This is also performed using the LEFT Outer Join and ON clause, as shown in Figure 2 below; the only difference is that view structures are being joined instead of isolated nodes. In this example, two additional basic views, IMS and XYZ, are used. They represent a physical IMS data source and an XML data source modeled in the same way the logical RDB view was defined in Figure 1, using the LEFT Outer Join. In the IMS view, segments I, M, and S represent nodes; in the XML view, elements X, Y, and Z represent nodes. These common nodes allow seamless hierarchical data structure integration. Because they are not relational data, the non-relational views contain additional physical meta information needed for their specific physical processing, and they require an external process for retrieving their non-relational data into rowsets when needed. This process will be covered further later.

Logical Data Structure

CREATE VIEW TestView AS
SELECT * FROM RDB
LEFT JOIN IMS ON B.b=I.i    /* Join the IMS structure under node B */
LEFT JOIN XYZ ON B.b=X.x    /* Join the XML structure under node B */

Full Heterogeneous Structure

        R
      /   \
     D     B
         /   \
        I     X
       / \   / \
      M   S Y   Z

Figure 2: Logical data view

The combined view named TestView in Figure 2 consists of the views RDB, IMS, and XYZ. This makes TestView a heterogeneous view consisting of multiple data structure types, defined seamlessly to ANSI SQL using the LEFT Outer Join data modeling operation described above. When physical view structures are joined together, the result is a logical structure. Logical structures can be joined with other logical or physical structures, and they can be processed directly by the ANSI SQL processor. This is because the logical view defined by TestView in Figure 2 above, with its corresponding relational rowset, hierarchically maps to the combined and expanded LEFT Outer Join views RDB, IMS, and XYZ. This supplies the natural hierarchical data modeling syntax and semantics needed for SQL to automatically operate hierarchically on the now homogeneous flattened rowset.

While the hierarchical data has been flattened in the rowset, it still automatically represents hierarchically preserved, variable length multiple pathways using null filled data. The null data fills out the variable length paths, keeping the data aligned properly in the rowset. This allows hierarchical structures to still be operated on hierarchically by SQL.
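As an illustration of this null filled flattening (the column names i, m, s and x, y, z for the IMS and XML nodes are hypothetical, following the naming already used for R, D, and B), one possible shape of the flattened TestView rowset is sketched below.

SELECT * FROM TestView
/* Possible flattened, null filled rowset:
   r | d    | b | i    | m    | s    | x | y    | z
   1 | 1    | 1 | 1    | 1    | NULL | 1 | NULL | 1
   2 | NULL | 2 | NULL | NULL | NULL | 2 | 2    | NULL
   Nulls pad out the shorter, variable length pathways so each pathway
   stays aligned in the rowset and can still be processed hierarchically */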

Logical View Joining Further Increases Data Value
The joining of structures shown in Figure 2 above continues to significantly increase the value of the data being processed, as described previously. But by joining larger, more meaningful logical structures, the data value increase is raised to a new level. In addition, these views are free to be joined in multiple ways, each creating a different new logical view that contains more data value when processed. These views are materialized only temporarily when processed, so they take up no space when not in use and avoid replicated data problems.

Increasing Data Value with Multipath Query Data Qualification
Specifying multiple paths in a data qualification query requires nonlinear processing logic. These multipath queries, using the SQL WHERE clause, are incredibly powerful and extremely complex to process because they correlate references across pathways that interact with each other. This processing complexity is hidden by SQL's automatic and inherent hierarchical structured data processing capability.

There are two different usages of SQL multipath query data qualification: 1) selecting data from one pathway of the hierarchical data structure based on data from another pathway, or 2) WHERE clause searches comparing data on or across multiple pathways. These multipath queries use the naturally occurring structure semantics that exist between the concurrently processed pathways to solve the query. This dynamically increases the value of the data and significantly increases the number of different queries possible, going beyond the static hierarchical view data value increases described earlier. It enables a new level of powerful hierarchical processing that uncovers new, deeper meaning in the data without requiring additional query syntax or knowledge of the structure from the user. Global view capability magnifies the usefulness of these multipath processing capabilities further by making unlimited combinations of pathways more available and easier to specify, because these large views have no overhead.
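Two hedged sketches of these usages against the TestView structure in Figure 2, again referencing node columns through the view with hypothetical column names and values:

SELECT D.d
FROM TestView
WHERE X.x = 1    /* 1) Select D pathway data qualified by data found on the X pathway */

SELECT R.r
FROM TestView
WHERE M.m = Z.z    /* 2) Compare data on or across the M and Z pathways in the WHERE clause */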

Dynamic Hierarchical Structure Combining
It is important to point out that SQL hierarchical processing can perform all of its hierarchical processing capabilities dynamically. This means that SQL can dynamically and hierarchically join hierarchical structures in an ad hoc or interactive way, instead of in a static view. This is shown in Figure 3 below, where the views RDB and IMS are dynamically joined at runtime to combine structures. Joining views this way also increases data value, but in this example it is performed dynamically through interactive querying, which enables discovery and drill-down operations. The dynamic data value increase makes these interactive operations even more powerful.

SELECT B.b, M.m, S.s
FROM RDB
LEFT JOIN IMS ON B.b=I.i    /* Dynamically join the IMS structure under node B */

Result Structure

   B
  / \
 M   S

Figure 3: Dynamic Structure Joining
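A follow-up drill-down can then be issued interactively against the same ad hoc join, narrowing the result with the WHERE clause's global data filtering (the filter value shown is hypothetical):

SELECT B.b, M.m, S.s
FROM RDB
LEFT JOIN IMS ON B.b=I.i    /* Same dynamic structure join as Figure 3 */
WHERE M.m = 1               /* Interactive drill-down filter on the M pathway */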

Conclusion
This article has described how hierarchical data structures can be powerful and useful both for storing data and for processing it, by dynamically increasing the value of the stored data. Statically, the stored data increases in usefulness and information value as data is added; dynamically, its value increases further when a query is processed, and the query controls how that value is increased in order to answer it. This automatic multilevel data reuse and dynamic combining of hierarchical data structures makes the hierarchical structure a very useful data storage structure that automatically leverages its increasing data value.

About the Author

Michael M. David is founder and CTO of Advanced Data Access Technologies, Inc. He has been a staff scientist and lead XML architect for NCR/Teradata and their representative to the SQLX Group. He has researched, designed and developed commercial query languages for heterogeneous hierarchical and relational databases for over twenty years. He has authored the book "Advanced ANSI SQL Data Modeling and Structure Processing," published by Artech House, and many papers and articles on database topics. His research and findings have shown that hierarchical data processing is a subset of relational processing and how to utilize this advanced inherent capability in ANSI SQL. Additionally, his research has shown that advanced multipath (LCA) processing is also naturally supported and performed automatically in ANSI SQL, and that advanced hierarchical processing operations are also possible. These advanced capabilities can be performed and explained in the ANSI SQL Transparent XML Hierarchical Processor at his site at: www.adatinc.com/demo.html.
