
Bringing Continuous Integration to the Database

In 2006, Martin Fowler posted his now-famous essay on Continuous Integration. Looking back, what seemed revolutionary, radical or just plain crazy is now common, pedestrian and "just what you do." I love it.

Back then, building and releasing software was a real pain. Integration was something you did at the end, after code complete, and we didn't know how long it would take. Some people may recall how we, as an industry, spent a massive amount of time integrating code from one team with another, or even just between developers that sat next to each other. The arguments against it at the time seemed valid and impactful. Now they seem weak and silly.

Fowler proposed Continuous Integration, an idea that was simple, elegant and had far-reaching repercussions. By producing constant, automated, self-testing builds in development, a huge amount of resource overhead typically incurred down the line was eliminated.

Fowler calls out two arguments against Continuous Integration: "It can't work (here)" and "Doing it won't make much difference." Now that we see that Continuous Integration can work here and that it will make a huge difference, let's make the database a top-tier software citizen and recognize its importance in our software development lifecycle. It is time for the Continuous Integration database.

The database is just as important as the application because it is part of the application. For far too long, we have heard from DBAs about the special importance of the database, or watched software developers dismiss the database as little more than a dumping ground. Both sides are wrong. We need Continuous Integration to include database changes.

Now, I would argue that the data itself is the most important asset here. After all, the data will live far longer than the application or the database that contains it, as the application morphs from one platform to the next or structured databases are exchanged for semi-structured. But the database is not any more or less important than the application. Separately they are worthless; together they become valuable.

Thus, if our database structure and stored logic depend so heavily on the application (and vice versa), it is time that we give the database its due as a software asset. We must include DDL in our single source code repository and in our Continuous Integration process. We must also include, alongside the DDL, mechanisms to gracefully upgrade and downgrade the database schema to align with the application.
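
As a rough sketch of what that can look like (the file name, table and column are invented for illustration, and this is not the Datical DB format), a versioned migration checked into the same repository as the application code might pair each schema change with a matching rollback:

```python
# migrations/v0002_add_customer_email.py -- hypothetical migration file,
# versioned in the same repository as the application code it supports.

UPGRADE = "ALTER TABLE customer ADD COLUMN email VARCHAR(255)"
DOWNGRADE = "ALTER TABLE customer DROP COLUMN email"

def upgrade(conn):
    """Move the schema forward to match the new application release."""
    conn.cursor().execute(UPGRADE)
    conn.commit()

def downgrade(conn):
    """Roll the schema back if the release has to be withdrawn."""
    conn.cursor().execute(DOWNGRADE)
    conn.commit()
```

Because the upgrade and downgrade travel with the application, a given commit always describes both the code and the exact schema it expects.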

If the database is just as important as the application, then the database infrastructure is also just as important as the application infrastructure. The fact that we can fire up a server in the cloud to host an app but must wait weeks to get a database set up is embarrassing. Unacceptable. Bizarre. Negligent.

Once application code and DDL are linked in the source code repository, and each new release can update any older copy of the database schema, it becomes simple to quickly provision an independent development and testing environment.
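
To make that concrete, here is a minimal sketch of such a provisioning step, assuming migrations live as Python modules like the one above and using an in-memory SQLite database as a stand-in for a real test instance; it is an illustration, not the Datical DB workflow:

```python
"""Sketch of a CI job step: stand up a throwaway database and apply every
versioned migration in order before the application's tests run."""
import importlib.util
import pathlib
import sqlite3

MIGRATIONS_DIR = pathlib.Path("migrations")  # hypothetical repo directory

def provision(db_path=":memory:"):
    conn = sqlite3.connect(db_path)
    # Apply each migration's upgrade() in version order so the schema
    # always matches the application code checked out beside it.
    for path in sorted(MIGRATIONS_DIR.glob("v*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        module.upgrade(conn)
    return conn

if __name__ == "__main__":
    provision("ci_test.db")  # run before the application test suite
```

Because the scripts are ordered and repeatable, the same step can rebuild a developer sandbox, stand up a test environment, or bring an older environment up to date.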

Just as Continuous Integration eliminated the dreaded, "never-ending" integration stage of software development, when we adopt Continuous Integration for the database, we can expect mountains of wasted effort to simply evaporate.

No longer do we have to meet to discuss which version of the database works with which version of the application. After all, every bit of friction between DBAs and application developers is a thinly-veiled integration exercise. And in today's Continuous Integration environment, that's just unnecessary.

Find out how Datical DB can automate database changes in your Continuous Integration process.

More Stories By Robert Reeves

Robert Reeves is President and Co-founder of Datical. Previously, as Datical’s Chief Technical Officer, he advocated for customers and provided technical architecture leadership. Prior to co-founding Datical, Robert was a Director at the Austin Technology Incubator (ATI), where he provided real-world entrepreneurial expertise to member companies to aid in market validation, product development and fundraising. Robert co-founded Phurnace Software in 2005 and invented its flagship product, Phurnace Deliver, which provides middleware infrastructure management to multiple Fortune 500 companies. As Chief Technology Officer, he led technical evangelism, product vision and large-account technical sales efforts. After BMC Software acquired Phurnace in 2009, Robert served as Chief Architect and led worldwide technical evangelism.
