
Microservices Expo: Blog Post

The Advantages of a Model-Based Approach

A Safer, Faster, and More Transparent Method of Managing Schema

The vast majority of schema management today is handled through the generation, review, and execution of SQL scripts. These scripts can be tiny or huge; they can encapsulate the creation and relationships of several objects, or they can describe a one-time alteration to a single object. Once executed, they generally leave no record of their passing other than the presence of the pieces they create, delete, or modify; you can be dependent on hundreds of small scripts, or on one giant script, to build out new environments or evaluate existing ones. You’re left with a schema that is a massive collection of individual parts applied in an order you can’t reproduce.

Assessing how the schema you develop against compares to what’s in production is a time-consuming and error-prone process. When you craft new changes, ferreting out their possible impact on other objects in your schema is not intuitive. The specific business needs associated with the individual parts of your schema get murky over time, making it harder to design for your future instead of accommodating your past. There is no traceable history of who did what, and why, from environment to environment. Application issues caused by database errors become hard to troubleshoot because there is no easily digestible standard to use as a measuring stick when evaluating a malfunctioning environment.

At Datical, we talk a lot about the model-based approach we take to database schema management and how it’s superior to the scenario described above. When you use Datical DB to employ a model of your database schema, everyone in your organization works with a transparent and comprehensive representation of your entire schema. There’s no more wandering into the weeds of individual changes, with no sense of history or relationship, when trying to make sense of your schema. The model enables a faster, safer, and more transparent method of managing your schema. Here are some of the perks:

No Need for SQL
Instead of writing SQL, the model powers a series of graphical, form-based wizards that take the guesswork out of change authoring.
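The article doesn’t describe Datical’s internal change format, but the idea of replacing hand-written SQL with structured, wizard-collected fields can be sketched as follows. The field names and `render_sql` helper here are invented for illustration; they are not Datical’s actual API.

```python
# A hypothetical, illustrative change description: instead of hand-writing
# ALTER statements, the author fills in structured fields (as a wizard would)
# and the tool generates the SQL. Field names are invented for illustration.
change = {
    "type": "add_column",
    "table": "customers",
    "column": {"name": "loyalty_tier", "datatype": "VARCHAR(20)", "nullable": True},
}

def render_sql(change):
    """Translate a structured change description into SQL text."""
    if change["type"] == "add_column":
        col = change["column"]
        null_clause = "" if col["nullable"] else " NOT NULL"
        return (f'ALTER TABLE {change["table"]} '
                f'ADD COLUMN {col["name"]} {col["datatype"]}{null_clause}')
    raise ValueError(f'unsupported change type: {change["type"]}')

print(render_sql(change))
# ALTER TABLE customers ADD COLUMN loyalty_tier VARCHAR(20)
```

Because the change is data rather than text, the same description can be rendered for different database vendors, validated before execution, and stored as part of the model’s history.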

Reliable Change Design Based on What Your Database Is, Not What You Think It Should Be
When you need to update the schema, relationships between objects are clearly mapped out in the model, and their purposes and history are fully documented and easily accessible.
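Because the model records relationships between objects, "what would this change affect?" becomes a lookup rather than a hunt through scripts. A minimal sketch of that idea, with an invented dependency graph (the tables and view names are illustrative only):

```python
# Impact analysis sketch: the model records which objects depend on which
# (here, foreign keys and a reporting view), so finding everything affected
# by a change to one table is a graph walk. Contents are invented.
depends_on = {
    "orders":        ["customers"],          # orders.customer_id -> customers.id
    "order_items":   ["orders", "products"],
    "v_sales_daily": ["order_items"],        # reporting view
}

def impacted_by(obj, graph):
    """Return every object that directly or transitively depends on `obj`."""
    impacted = set()
    frontier = [obj]
    while frontier:
        current = frontier.pop()
        for dependent, deps in graph.items():
            if current in deps and dependent not in impacted:
                impacted.add(dependent)
                frontier.append(dependent)
    return impacted

print(sorted(impacted_by("customers", depends_on)))
# ['order_items', 'orders', 'v_sales_daily']
```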

Forecast: Impact Assessment on a Representative Model
Prior to execution, the model can be used to simulate proposed changes in memory, without touching your database, allowing you to deploy with confidence in sensitive environments.

Traceable History of the Evolution of your Schema
Managing schema change with the model becomes an exercise in incrementally updating a single historical document. Changes are described in a simple, readable format. The application features and business initiatives the changes support are tied to the changes themselves, giving data architects and developers insight into why a change was made. The reliance on tribal knowledge disappears because everything you need to know about the “why” of a database change is tied to the change itself.

Easy Comparison of Models
Every database instance is now an instance of the same model.  The model provides the structure you need to quickly ascertain what’s missing, what’s wrong, and what needs to be done to bring your disparate environments into synchronization.
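When every environment is an instance of the same model, comparing environments reduces to comparing two structured documents. A minimal sketch in which a "model" is just a mapping of table names to column sets (the contents are invented for illustration):

```python
# Comparing two environments as instances of the same model: a minimal
# sketch where a "model" is a mapping of table -> set of column names.
dev  = {"customers": {"id", "name", "loyalty_tier"}, "orders": {"id", "total"}}
prod = {"customers": {"id", "name"}, "orders": {"id", "total"}, "audit": {"id"}}

def diff_models(source, target):
    """Report what must change to bring `target` in line with `source`."""
    report = []
    for table in source.keys() - target.keys():
        report.append(f"create table {table}")
    for table in target.keys() - source.keys():
        report.append(f"unexpected table {table}")
    for table in source.keys() & target.keys():
        for col in source[table] - target[table]:
            report.append(f"add column {table}.{col}")
    return sorted(report)

for line in diff_models(dev, prod):
    print(line)
# add column customers.loyalty_tier
# unexpected table audit
```

A real model carries far more than column names (types, constraints, indexes, history), but the principle is the same: identical structure makes differences mechanical to find.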

Clear View of Completeness
The operations personnel who deploy and monitor your applications can easily establish that the database schema is everything it should be, and can detect drift more easily than they previously could by eyeballing diagrams or reviewing batches of deployment scripts. If reality doesn’t match the model, the path to remediation is clear.
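Drift detection of the kind described above can be sketched by introspecting a live database and comparing what’s there against the model. The model contents and table below are invented; the introspection uses SQLite’s `PRAGMA table_info`:

```python
import sqlite3

# Drift detection sketch: read the live schema via introspection and compare
# it against the model (table -> expected column names). Contents invented.
model = {"orders": ["id", "total", "currency"]}

live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")  # drifted: no currency

def detect_drift(conn, model):
    """Return the columns the model expects but the live database lacks."""
    drift = {}
    for table, expected in model.items():
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
        actual = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
        missing = [col for col in expected if col not in actual]
        if missing:
            drift[table] = missing
    return drift

print(detect_drift(live, model))
# {'orders': ['currency']}
```

The output is exactly the "path to remediation": the list of things the environment is missing relative to the agreed-upon model.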

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
