The Automation Paradox

This script is not intended for use in the operation of nuclear facilities

A recent article in IEEE Spectrum on "the automation paradox" posed a question from Steven Cherry: "Would we be safer overall if we just accept a few deaths due to software?" The concept strikes me as a little funny, since for many years much of the software I have worked with has carried a EULA stating something like "...IS NOT INTENDED FOR USE IN THE OPERATION OF NUCLEAR FACILITIES, AIRCRAFT NAVIGATION OR COMMUNICATION SYSTEMS, AIR TRAFFIC CONTROL SYSTEMS, LIFE SUPPORT MACHINES OR OTHER EQUIPMENT IN WHICH THE FAILURE OF THE SOFTWARE COULD LEAD TO DEATH, PERSONAL INJURY, OR SEVERE PHYSICAL OR ENVIRONMENTAL DAMAGE."

This got me thinking about how far we have come with software and how many devices these days rely on generic software to do their work. The question, I guess, comes down to the level of rigor applied in the testing and quality assurance processes, and the relevance of the technology to the task at hand. It is becoming increasingly difficult to find technology that doesn't rely to some extent on a homogeneous platform, and in fact the use of a platform brings many benefits, like scale and total cost of ownership. The idea of moving away from discrete things built to perform discrete actions is very appealing.

Consider the smartphone, for example. Irrespective of whether you use one, the underlying technology is pretty standard for a given family of products. Of course this 'standard' has proven to be somewhat of an undoing for some of these platforms, in that you might have a tablet device running the Android operating system with the Froyo release and yet be unable to make phone calls with it, despite the fact that it has a dial pad and a 'make a call' button. So what it appears a given platform can do is not necessarily reflected in what you can actually do - an obvious point that you quickly realize. As previously mentioned, however, it is cost-effective for manufacturers to use a generic image of the software to make the hardware usable, and it is then up to the consumer to determine what he or she wants to use it for.

Another parallel comes from the world of music. Over the holidays, I met with family and we discussed how being a deejay has transformed over the last couple of years with the almost complete elimination of the need for the tunes-minder to drive a minivan brimming with boxes of CDs or vinyl records. The MP3 standard and the digitizing of music into data files has made that industry much more efficient - so much so that you can carry thousands of hours of broadcast-grade audio in your pocket, in the same device you use to make phone calls. All pretty amazing, except that now there is so much of it that it is difficult to manage unless you are an incredibly disciplined or organized person. I have terabytes of recorded media at home: songs, videos, movies, and photos that I have tried to store in a sorted fashion, but whose volume has rapidly outstripped my discipline levels. I almost despair when I try to find things sometimes. So do deejays really carry their entire library of music with them? I assume not. They probably have a standard repertoire and a few extra tracks in the wings for the occasionally requested but less mainstream songs.

Back to the concept of the automation paradox then. The idea here is that automation is the operation of some activity automatically, without continuous input from an operator. The more reliable one considers an automation to be, the more complexity one introduces into it, and ultimately the less the operator can contribute to the resulting success - and therein lies the paradox. I have my audio tracks, so now I don't have to carry hundreds of discs from venue to venue. But because there is so much of it, I only carry some of it now; I have changed my usage model. I've switched from a phone that just makes calls to a phone that makes calls, surfs the web, plays music and takes photographs, but really, what do I mostly use it for? I have over 60 applications installed, but I only use two or three regularly. I am not representative of anyone except myself, but do you see some parallels here with yourself? In theory at least, I no longer need a separate phone, MP3 player, camera and laptop, but which ones have I really shed?

I thought this topic relevant in the realm of creating automations around SAP transactions, because we assume that we can save time and energy by building scripts around all manner of actions in the world of SAP. However, sometimes we simply need to step back from the problem and evaluate whether we really should be automating something just because we find the manual task annoying, or simply because we know we can.

The message has to be that even though you 'can' build a script around a given transaction or process in SAP, is it the right thing to do? The example that comes to mind is building scripts in transaction recording mode using GUI scripting. While this method often works, and works well, it is an area that SAP administrators and SAP auditors have frequently called out as one to be cautious with. The challenge with this method is that it leverages classic screen scraping and doesn't rely to the same extent on the field and screen definitions defined programmatically in the SAP transaction. In a phrase, "you can sometimes end up with unexpected outcomes"; you can only assess success if you review the results. Do you always do that?
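As an illustration of reviewing results rather than assuming them, here is a minimal sketch of driving a transaction through the SAP GUI Scripting API from Python and checking the status bar afterwards. It assumes pywin32 is installed on Windows, an SAP GUI session is already logged on, scripting is enabled on both client and server, and the transaction code and control IDs used (MM03, the ok-code field, the status bar) are purely illustrative.

# Sketch only: attach to a running SAP GUI session via the SAP GUI Scripting API
# (assumes pywin32 on Windows and scripting enabled on client and server).
import win32com.client

sap_gui = win32com.client.GetObject("SAPGUI")       # running SAP GUI instance
application = sap_gui.GetScriptingEngine            # scripting engine object
session = application.Children(0).Children(0)       # first connection, first session

# Start a transaction - MM03 (display material) is used purely as an example.
session.findById("wnd[0]/tbar[0]/okcd").Text = "/nMM03"
session.findById("wnd[0]").sendVKey(0)              # press Enter

# Review the outcome instead of assuming success: the status bar reports the
# message type (S=success, W=warning, E=error, A=abort) and the message text.
status_bar = session.findById("wnd[0]/sbar")
if status_bar.MessageType in ("E", "A"):
    raise RuntimeError("Transaction failed: " + status_bar.Text)
print("Status:", status_bar.Text)

Even this small check only catches what SAP reports in the status bar; any field-level validation still needs its own explicit assertions.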

Again, you can build a degree of checking and control into your recording, but the level of effort may ultimately mean you are better off not building such an automation at all. In such circumstances, I always encourage people to look at alternative ways of achieving the same objective, perhaps using multiple scripts, or considering a BAPI or an SAP API such as a remote-enabled function module. Irrespective of the approach you choose, of course, testing will be key, and playing through a number of use cases and scenarios will be pivotal to determining whether your automation is robust and reliable.
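By way of comparison, here is a minimal sketch of the BAPI route: calling a remote-enabled function module over RFC from Python. It assumes SAP's pyrfc package and the NetWeaver RFC SDK are installed, and the host name, credentials and material number shown are placeholders; BAPI_MATERIAL_GET_DETAIL is a standard read-only BAPI used only as an example.

# Sketch only: call a remote-enabled function module (a BAPI) over RFC,
# assuming the pyrfc package and the SAP NetWeaver RFC SDK are installed.
from pyrfc import Connection

# Placeholder connection parameters - replace with values for your system.
conn = Connection(ashost="sap-host.example.com", sysnr="00",
                  client="100", user="RFC_USER", passwd="secret")

# Read-only example call; the material number is a placeholder.
result = conn.call("BAPI_MATERIAL_GET_DETAIL", MATERIAL="000000000000000023")

# Unlike screen scraping, the outcome comes back as structured data: the RETURN
# structure carries the message type (E=error, A=abort) and the message text.
ret = result.get("RETURN", {})
if ret.get("TYPE") in ("E", "A"):
    raise RuntimeError("BAPI error: " + ret.get("MESSAGE", ""))
print(result.get("MATERIAL_GENERAL_DATA"))

For BAPIs that change data you would also call BAPI_TRANSACTION_COMMIT (or ROLLBACK) explicitly after inspecting RETURN, which is where the testing of use cases and scenarios mentioned above really earns its keep.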

Fortunately, none of this is likely to be life threatening. You're not likely to be using that transaction recording to run a defibrillator, fly an airplane or operate a heart-lung machine. Don't underestimate the script's importance though, especially if you are using it to do interesting things like maintaining bills of materials for the manufacture and assembly of items. Do consider, though, whether you aren't ultimately building something that will make more work for you in the long run if something goes wrong. Remember that in the world of SAP, without a system restore, there is no undo, only the ability to fix forward.

Suggested further reading:

The Benefits of Risk - IEEE Spectrum

What is Froyo - Gizmodo

Radio Deejays versus Radio Automation - Hubpages

More Stories By Clinton Jones

Clinton Jones is a Product Manager at Winshuttle. He is experienced in international technology and business processes, with a focus on integrated business technologies. Clinton also serves as a technical consultant on technology and quality management as it relates to data and process management and governance. Before coming to Winshuttle, Clinton served as a Technical Quality Manager at SAP. Twitter: @winshuttle
