An End-to-End Solution

Can one company fulfill all of the requirements of an organization?

I was asked by Mr. Peter Hastings (NH DoIT Commissioner) about my understanding and knowledge of "End to End Solutions". I have witnessed these solutions before, but I wanted to find a good definition, so I did some research and found this one: one supplier or company that can provide all of your hardware and software needs in a suite of tools that meets all of the customer's requirements, with no other suppliers involved. I think this is a good definition and it makes sense to me.

I'm sure many other definitions exist and this one is simplistic, but I see it as a baseline. One could certainly branch off of it to encompass a variety of explanations, but let's stick with this one for this article.

This idea has been around for some time now, and over the years it seems to fade away and pop back up, again and again. I have been watching for a long time, have witnessed the implementation of these suites over the years, and have developed my own opinions about all-encompassing tools. So this has come up before in my career.

Commissioner Hastings wanted to understand, at a high level, whether our development teams across the enterprise of State government would benefit from integrating an entire suite of tools to achieve a more seamless Software Development Life Cycle (SDLC). If this were possible, there could be savings from shorter development life cycles: rather than an application taking three years to complete, trim a year and the State saves money.

Theoretically this is a great idea and makes good sense, but in reality the only shops that can take on an idea of this magnitude are shops with a lot of money, a lot of staff, and a lot of in-house expertise. Inherent in the words "End to End Solution" I hear big, large, enormous, all-encompassing, and there is a lot of overhead in this idea right out of the gate.

I prefaced my dialog with the Commissioner by saying that I would be happy to write about this subject, but that not all would agree with my analysis and conclusions; this would only be my perspective, and there are varying experiences that would be a counterpoint to my thoughts. The Commissioner is quite open to varying opinions and really just wanted to understand the subject at a high level. So I can only tell you what I have seen and witnessed over the years and share my thoughts.

The premise is that one company can fulfill all of the requirements of an organization. Can this be true? Can one supplier provide this much and help an organization realize a cost savings? In my opinion, I have not seen it be true to date. It's a big white unicorn, although you have to believe in a unicorn once in a while. Every company has its specialty: the things it does well, and the things it tries to do well and fails at.

We are all microcosms of the wider world, so we are all "End-to-End Solutions" in some way. Do we as humans do it all well? I think not. We all have things we do well and things we don't, and the same goes for companies. So the premise that a company can deliver it all is, to me, not real but fantasy. Let's take it one step further and play devil's advocate for a moment: suppose a company really could deliver an entire "End to End Solution," a suite of tools that delivered it all.

These are the next questions to ask: Can the organization it's being delivered to realistically manage it all? Does it have the staff to support the toolsets in the suite? Does it have the expertise required to implement them? Does it have the time it takes to learn and support them? Can it really manage an entire suite of toolsets successfully?

Consider even one tool in the suite; let's take the SCM tool, since I know that tool well. You have the software configuration administration piece, the release management piece performing compiles, the defect tracking piece managing deficiencies, modifications, and enhancements against the application, and then you have code reviews. Theoretically that could be three to four people with these skills, and that's just one toolset in the suite. Do you already have all of this expertise in one person in house? SCM is not an easy skill to come by to begin with.

Which tools will your organization do well with, and in which ones do you already have the knowledge? Several toolsets in the suite may get worked because there is in-house expertise on them but not on the others, and you may have to hire or train staff on those, providing you have the staff to train. In any case, it usually turns out that you don't have all it takes, and the implementation fails. Once it fails, the buy-in from management fades, because no one wants to be associated with a failing project, and the grassroots folks really working the products on the ground are too stressed from the implementation. The experiment is over, and a lot of money is wasted that could have been targeted in a more productive manner.

The real problem is that when you get a lot of folks in a room, the sell is on, the atmosphere is charged with enthusiasm, and a lot of folks see potential savings not yet realized, they become swept away in the sell. When this happens it usually ends up costing an organization a lot of unintended money, and at some point the question is always asked: what did we do?

I ultimately agreed to write on this subject because I'm bugged about "End to End Solutions". I'm bugged about the sell. I'm bugged about the hype and the promise that I don't see how anyone could deliver. For example, I have recording software that is more automated than anything I have seen before, but I still have to know how to run the tool, record with it, mix, add effects, edit the recording, and master to a stereo track, even though the tool is very automated.

The same would be the case for the "End to End Solution": there is the testing tool, the QA tool, the project management tool, the requirements gathering tool, the SCM tool, the defect tracking tool, and whatever other tool or plug-in you add to the suite. I'm sure the suites can be customized to include only what you need for your organization, which makes more sense.

The bottom line is, that's a lot of expertise, staff, time, money, and resources. The tools have to be learned; you might have to hire additional folks to support them, and others will have to be cross-trained in the event that your expertise goes away. What organization can really afford this level of overhead other than large organizations with a chunk of capital?

I think one of the big mistakes organizations make is that when these sells are occurring, there is often no one from the organization in the room with the expertise to ask the tough questions. What training is the supplier offering? What is the technical support like? Are you using a proprietary database? If the database goes down, who supports it? And so on.

So it's typically managers in the room, who are not even going to do the work on the ground or support these behemoths. I'm also bugged by the suites themselves, which typically do a good job in one area and fall down in others, yet now you have an entire suite of tools to make work. For example, I have seen companies make a great SCM tool but a terrible defect tracking tool, or a great project management tool but a terrible SCM tool.

The idea that one supplier can do it all is, in my opinion, flawed. It's just like when I'm recording music, since that's the analogy I understand best: however automated the mix, even if the tool says click here for rock, jazz, easy listening, and so on, the user still has to do something to get to that place, and if it is not the right place, the user must understand how to get to the right one. Human intervention is required. Truly automated tools are a myth in my opinion.

Now, a lot of folks are going to disagree with me, because they are selling these monsters to people who can't support them, implement them, maintain them, provide technical support on them, or mentor others on them. It really requires an army, depending on how big the organization is, and that brings up my first example.

Every organization wants to save money, and they often find themselves getting sold on this pot of gold: if everything could be integrated across the enterprise of our business, however big or small, we could save so much money. And here's the thing, as I said earlier.

A large organization can take the impact of a completely integrated software environment like this because it has the staff. If it has the staff, it possibly has the expertise, or at least staff that the seller can train initially. However, when the seller leaves, there must be competent staff who can truly maintain the tools and champion them to create the standard that can actually produce the desired result and cost savings.

For most organizations, the additional staff, training, and expertise would in itself more than likely be a great unintended expense. Good for you and your organization if you have the expertise for the suite's toolsets in house, but typically that is not the case.

Often you have to go outside your space to find help: someone who knows QA, which is hard to find these days; SCM, which is even harder; testers; and so on. If you have a suite of five toolsets, you must have a champion for each one of those toolsets in order to introduce them successfully in an organization.

Now, if you try this in State government, you're even more likely to fail, because you never have the staff, expertise, time, or money. I have seen this attempted at least twice in my tenure.

Also, something organizations don't take into consideration when taking on suites like this is that it usually requires a culture that accepts change, having its cheese moved, and the chain of command. It typically requires a culture shift, and most folks don't like change, including me. However, it was once said to me many years ago that change is stability; it's the one constant. So I try to believe in that.

So consider for a moment an organization taking on a tool suite of this nature, the "End to End Solution". Once you have hired staff, have your trained expertise in house, and your staffing is solidly in place, you then need your leader to convincingly direct the staff to implement and accept the new methodology. This industry is always in flux, moves fast, and is never static, but change is still difficult to implement.

When everyone gets on the bus and you have institutionalized a new culture with new toolsets and they are embraced, then savings can be realized; anyone who draws outside the lines changes the integrity of the outcome, and a new outcome emerges.

I've told you what you need, and that larger organizations are more successful at this than smaller ones because they have staffing, expertise, money, and time on their side.

Let me provide one example of what I have seen, and the way I think this can be done successfully even in smaller organizations. Many attempts fail, and the reason is simple: it's too big to eat. These implementations are monstrous. For starters, they take multiple administrators, one for every toolset in the suite. One person should never run the whole suite; each tool should have its own administrator, because one person running everything would be a security risk. You never want one person holding all the keys to the kingdom anyway.

There are multiple layers of understanding and expertise, and it takes a lot of time to get up and running. Now, plenty of folks will disagree with me, and that's fine; they are the ones selling these suites, so they are going to have a different opinion.

As I have learned in life anyway, one-third of the people are going to like you, one-third are going to hate you, and one-third don't care.

So, you purchase the suite anyway, and the first toolset you tackle is the testing suite. Now you need a body who understands the testing tool: how to write the scripts that get populated into the tool to run the tests, scripts that are constantly changing based on new functionality, modifications, deficiencies, and enhancements against the applications under development. If you don't have a good scriptwriter for testing, you are already in trouble.
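
To make that maintenance burden concrete, here is a minimal sketch of the kind of automated test script such a tool might run. This is a generic Python example, not any particular suite's scripting format, and the application, names, and values are all hypothetical:

```python
import unittest

# Hypothetical regression test for an invoicing application. Every new
# enhancement, modification, or deficiency fix against the application
# means scripts like this one must be revised to match.
class TestInvoiceTotals(unittest.TestCase):
    def test_total_includes_tax(self):
        subtotal = 100.00
        tax_rate = 0.05  # assumed rate; a change here ripples through the suite
        self.assertAlmostEqual(subtotal * (1 + tax_rate), 105.00)

if __name__ == "__main__":
    unittest.main()
```

Multiply that by hundreds of scripts across dozens of applications under active development and you can see why the scriptwriter matters.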

Now you are going to tackle the project management tool. Who really knows this skill? I've never met anyone who actually used a tool like this, understood it, and implemented it well. That's just my experience; I'm sure they are out there, but to me this is a rare expertise. Then you move on to the QA tool, the SCM tool, the defect tracking tool, and so on. There could be any combination of toolsets. My point is that you have to eat the tools one at a time. It's extremely hard to standardize any one tool across an organization, never mind a suite of toolsets. However, if you take a small bite, digest that, and then take the next one, and so on, you'll be more successful.

I feel scalability is the answer, and it actually makes more sense. If you can customize the suite to include only what you have expertise in, and build upon that knowledge slowly, I feel you can be more successful. For example, start with the expertise you have in house; let's say you have SCM and defect tracking knowledge.

Introduce those tools and let users in your organization get comfortable with that standardization. Once that is stable, introduce project management, then testing, then requirements gathering, and so on. It clearly doesn't have to be in this order; it can be in whatever order you want, but if you go slowly you won't break the bank, and it will evolve. I always tell folks this is an evolution, not a revolution.

Just look at world events: when there is a revolution and the old government is out of office, typically no one knows how to introduce a new government successfully. Even when they do, they try to do the whole thing at once and it fails, because it usually took years to get into the situation you are in, and it's going to take time to get out of it and change it. So an evolution is always a better approach in my opinion: slow progress over time, gathering steam as one tool is introduced after another until it's all in place.

Should an organization take on this kind of exercise, you have to put competent people on it, because competent people can mentor others, and that makes the implementation more successful at the end of the day. Every person has their strengths and weaknesses, so you're going to want to put folks on the job who have a proven track record of pushing change through. Not an easy thing to do.

You want folks running this who have experience with rejection, and leadership that is 100% unfalteringly driving the process. One exception to the process and the entire suite will fail.

If you were smart, while you were implementing the toolset you have strength in, you would have been learning the next toolset in the suite, ready to move forward with that implementation the moment the first tool was completely in place. That way there is no pause before the next learning curve.

There is a lot of work that goes into these suites, and it's not just staffing a tool like this and finding the expertise to implement the various pieces. It includes upgrading the tools, applying bug fixes, maintaining the OS, and supporting backups. The list really does go on.

One thing to make sure of when examining the SCM toolset in these suites is that it has a generic repository perspective. This really is the answer for the SCM piece of the puzzle. You need repositories, which is a glorified name for folders, that will house any type of asset in the organization. You can have an "End to End Solution" for one development tool and one project, but most development environments contain multiple tools (Java, COBOL, PowerBuilder, etc.). Otherwise you'll have to introduce multiple toolsets, with twice the overhead and no way to measure across the enterprise. That's a waste of money in the final analysis.

You need to make sure these repositories can support any development platform your organization has in house or across the enterprise, depending on how your organization is structured. SCM tools in a suite of this nature that only support one platform or another are limited in scope and not flexible enough to be called standards, and standardizing and centralizing is how you really save money.
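
As a rough sketch of what "generic" means in practice, consider a repository that treats every asset as an opaque, versioned file, so Java, COBOL, and PowerBuilder sources can all live in the same store. The Python below is purely illustrative, assuming made-up paths and names; it is not any particular product's API:

```python
import hashlib
import shutil
from pathlib import Path

# Illustrative sketch of a "generic repository": every asset, whatever
# its language, is stored the same way, keyed by its content hash.
REPO_ROOT = Path("/srv/scm-repo")  # hypothetical location

def check_in(asset: Path) -> Path:
    """Store a copy of any file type; the repository never cares what it is."""
    digest = hashlib.sha256(asset.read_bytes()).hexdigest()
    dest = REPO_ROOT / digest[:2] / digest / asset.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(asset, dest)
    return dest

# Usage: check_in(Path("payroll.cbl")) and check_in(Path("Invoice.java"))
# land in the same store, so reporting and metrics cover both.
```

The point of the sketch is only that nothing in the store depends on the asset's language; that is what lets one repository be the standard.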

I don't know of too many organizations that develop in only one application language and nothing else, and if you're serious about SCM, you have to take into account every type of software asset and intellectual property you have in the enterprise.

I'm not saying that exclusive Java, Visual Studio, or PowerBuilder shops don't exist, because they do. What I am saying is that large organizations, like States for example, typically have a lot of legacy applications developed in many languages, and an SCM tool in your "End to End Solution" suite that only covers one language or another could never be a standard. Where would you house the rest of your source data assets, intellectual property, or mission-critical assets? In a second SCM tool, or a third?

If that is the case, there goes the whole benefit of reporting, managing, requirements gathering, and testing on all of your assets in the enterprise, because you won't have everything in one repository but in multiple repositories. Then the "End to End Solution" idea is a moot point; the benefit is lost. Now you will need more expertise, more money, more staff, more time, and more resources, you're wasting more of everything, and you lose the value of the "End to End Solution" anyway. There is no value in supporting just one type of language; the metrics will be quite limited. And remember, the purpose of these suites to begin with is to standardize, centralize, and realize cost savings across an entire organization, not just one project.

Listen to how this makes no sense: we spent all of the money and bought an "End to End Solution" just for the Java team, or the Visual Studio team, or the PowerBuilder team. What about all the other types of development you are performing? Don't you want to save money in those areas of application development as well? For the cost of these suites I would hope so, and anything else would be a waste of time, money, and resources for any organization.

There is also the disaster recovery aspect: would you secure one set of assets and not another? For example, would I only care about my PowerBuilder source assets and not my Visual Studio assets? If I have a repository that can only handle one type of data asset and there is a catastrophic event, I can recover one set of assets but not another. So when looking at these suites, considering disaster recovery, make sure the SCM toolset has a generic repository that can secure and control any type of source data asset in your organization. It doesn't have to be great at it, but it does have to work.
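
Staying with the earlier sketch, here is a hedged illustration of the disaster recovery point: with one generic repository, a single backup job covers every asset type at once, rather than one job per language-specific store. The paths are again hypothetical:

```python
import tarfile
from datetime import date
from pathlib import Path

# Illustrative only: one snapshot of the whole generic repository means
# PowerBuilder, Visual Studio, and COBOL assets are all recoverable
# from the same archive after a catastrophic event.
REPO_ROOT = Path("/srv/scm-repo")          # hypothetical repository root
BACKUP_DIR = Path("/mnt/offsite-backups")  # hypothetical off-site target

def snapshot() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    archive = BACKUP_DIR / f"scm-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(REPO_ROOT, arcname="scm-repo")
    return archive
```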

Summary
There are a lot of pitfalls to these "End to End Solutions". Make sure you have competent folks in the room who can ask the hard questions. If the managers are embarrassed by the hard questions at the dog and pony show, they are going to be even more embarrassed when they buy the suites and begin to lose money for the organization, the implementation fails, they lose their jobs or have to move on to distance themselves from the failure, and the organization has to absorb the cost.

You have to make sure you have the staff, the time, the money, and the expertise, and that you choose suites that can support every type of source data asset in your enterprise. Make sure leadership convincingly communicates the change. Take small bites, digest those, and make the cultural shifts your organization needs before moving on to the next toolsets in the suite. It's an evolution, not a revolution, so think it through and don't get caught up in the sell. It will only cost you more money later when you buy on impulse.

My background is that I have been working in Software Configuration Management (SCM) for a few years now, and I am presently the SCM administrator at the State of NH. Our SCM product is a process-based tool for managing application source code, intellectual property, and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage 400-plus applications housed in SCM and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11, and PBV12; Visual Studio 2003, 2005, 2008, 2010, and 2012; Eclipse, Juno, RAD, Mule, ColdFusion, Java, COBOL, and VB.

As the Software Configuration Manager, I provide the administration of the source code management tool. This includes the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes, and documentation; defect tracking; disaster recovery; maintaining build machines; and training all the developers on proper source code management using the development tools in our environment.

Risks of "End to End Solutions"

  • Not enough staff to support the toolsets
  • Not enough expertise to understand the toolsets with great clarity (testing, SCM, defect tracking, QA, requirements gathering, project management)
  • Loss of expertise to other employment opportunities
  • No cross-training of expertise or transfer of knowledge
  • No solid implementation team
  • The experts needed to analyze these tools are typically not invited to the dog and pony show
  • Not enough capital when problems arise

About the Author

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role, Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, and keyboard; he sings lead and backup vocals; and he has released 8 CDs.


