
DIY Enterprise DevOps | @DevOpsSummit @Datical #DevOps #Microservices

Insights into the DIY DevOps Dilemma

In Enterprise DevOps, It’s Not Always Better to Roll Your Own

I read an insightful article this morning from Bernard Golden on DZone discussing the DevOps conundrum facing many enterprises today – is it better to build your own DevOps tools or go commercial?  For Golden, the question arose from his observations at a number of DevOps Days events he has attended, where typically the audience is composed of startup professionals:

I have to say, though, that a typical feature of most presentations is a recitation of the various open source products and components and how they integrated them to implement their solution. In a word, how they created their home-grown solution. Given that many of these speakers hail from startups with small teams and a focus on conserving cash, this approach makes sense. Moreover, given that these are typically small teams working at companies following the Lean Startup approach, using open source that allows rapid change as circumstances dictate makes sense as well. And, in any case, startups need to solve problems today because who knows what the future will bring?

That last part is what sparks the question – what does the future hold?  For that startup that begins to scale and grow, what are the future implications of building and, more importantly, trying to maintain a homegrown solution as more teams, products, and use cases proliferate?  “And for enterprises, which must plan for the future,” Golden writes, “an approach that doesn’t have a long-term time horizon is problematic, to say the least.”

The first issue Golden sees in a DIY DevOps approach is the unspoken presumption that the intensity of interaction and collaboration found at a startup can scale to, or even be achieved within, a large enterprise.  Golden writes, “in an enterprise, the kind of ‘he sits two seats away from me, so I can just turn to him and ask a question’ is unachievable,” arguing that, “solutions based on proximity and immediate response to problems is not scalable.”  Large IT organizations need a solution that scales to cover the myriad applications they develop and support, and in Golden’s opinion “Homegrown solutions invariably are written for a limited use case that reflects the situation at the moment and are difficult to modify when new requirements appear associated with a new use case.”

This perspective is interesting to me for the simple fact that I’ve read a great deal about how a number of large enterprises like Macy’s, Nationwide and Highmark, heck, even IBM, are in various stages of tackling this issue right now, and are reporting a great deal of success in their efforts.  The DevOps leaders in these organizations have embraced the idea of a DevOps culture where development and operations collaborate closely together and are working hard to systematize those interactions.  On the flip side, though, these organizations are, to Golden’s point, leveraging commercial DevOps solutions pretty heavily in order to achieve their goals for technical processes like Continuous Delivery.

Another issue Golden sees in the DIY DevOps approach is the potential for promoting the unique snowflake problem to a system-level issue rather than just a one-off application issue.  “It’s fantastic that the application resources themselves are standardized [in DevOps], but a bespoke system invariably falls further and further behind commercial systems, particularly those that take responsibility for selecting, integrating, and supporting one or more open source components,” Golden argues.  In this scenario, the vendor supported open source solution benefits from the wide community of developers working to make it better, increasing the rate of innovation over a homegrown solution.  Additionally, the vendor becomes responsible “to make sure all the components are properly integrated” to the benefit of all customers, particularly those in large organizations.

We’ve seen this scenario play out many times with our customers.  Datical is built on Liquibase, the leading open source solution for versioning and migrating the database, and our task is to ensure the solution is viable for large enterprises in terms of supporting their myriad use cases as well as their requirements for scalability and reliability.  Quite often we’re approached by a team that has invested years in supporting Liquibase within their organization but has reached a point where new requirements dictate reallocating resources to more strategic initiatives, or they simply want out from under the overhead of maintaining a homegrown Liquibase implementation.  Perhaps even more often, a large team investigating Liquibase as a possible solution contacts us because they have realized the investment, in time and money, they would have to make to customize Liquibase to their use cases and environments.
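For readers who haven’t worked with Liquibase, the core idea is a changelog of ordered, uniquely identified changesets that Liquibase applies to a database and records in a tracking table, so every environment can be migrated to the same schema version.  A minimal sketch of an XML changelog follows; the table and column names and the author id are illustrative, not from any real project:

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">

    <!-- Each changeSet is identified by id + author and is applied at most once. -->
    <changeSet id="1" author="jdoe">
        <createTable tableName="customer">
            <column name="id" type="int" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="email" type="varchar(255)"/>
        </createTable>
    </changeSet>

    <!-- Later changes are appended as new changeSets, never edited in place. -->
    <changeSet id="2" author="jdoe">
        <addColumn tableName="customer">
            <column name="created_at" type="timestamp"/>
        </addColumn>
    </changeSet>
</databaseChangeLog>
```

Running `liquibase update` against a target database applies any changesets not yet recorded in the tracking table.  The homegrown effort enterprises take on is everything around this core: wiring it into release pipelines, enforcing review and rollback policies, and supporting it across many teams and database platforms.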

The final issue Golden raises in the DIY DevOps dilemma is that of continuity.  “It’s fantastic that you have a member of your staff who is talented and creative and puts together your DevOps system,” writes Golden, “However, someday he or she will be gone, and someone else will have to maintain the system.”  Going back to Golden’s argument that the enterprise has to plan for long-term time horizons, this is an important point to consider.  IT often complains of the cost of supporting and maintaining legacy systems, and a DIY DevOps solution may well end up being one of those legacy systems.  You could certainly argue that an internal DevOps system, because of its high visibility, will have staff members clamoring to work on it after the original maintainer departs, but it’s still an issue that should be carefully examined before committing to a course of action.

All of these issues lead to Golden’s closing argument, which is salient.  When considering a DIY DevOps approach, what you’re really thinking about is how you’re going to allocate your finite resources towards achieving your goals.  If resources are committed to developing and maintaining a DevOps system or suite of tools, then those resources can’t be used elsewhere.  In companies that were born in the cloud and whose business models rest upon their ability to devise new and innovative technologies, rolling their own DevOps probably makes sense.  For a large commercial bank, however, with core competencies in things like finance and investment, it is probably the better course of action to purchase a commercial DevOps solution instead, freeing up precious resources to focus on serving their customers through innovative financial products and services.

More Stories By Rex Morrow

Rex is the Marketing Director at Datical, a venture-backed software company whose solution, Datical DB, manages and simplifies database schema change management in support of high velocity application releases. Prior to Datical, Rex co-founded Texas Venture Labs, a startup accelerator at the University of Texas, and received his MBA from the McCombs School of Business. Before graduate school, Rex served as a Captain in the U.S. Army, and was awarded two bronze stars during combat deployments in Iraq.
