i-Technology Viewpoint: The Future of Software Tools

In a recent press interview I was asked what I thought were some of the important trends for the future of software tools.

It's an interesting question, with many facets, so I was not sure how to respond. After some thought, here are the five areas I chose to highlight from my context of design and construction tool strategy. These are areas that have been occupying much of my thinking and my recent discussions with customers, IBM Research, and the Rational teams, and they are changing both the kinds of software tools we deliver and the features those tools support.

1. Connecting business with IT: Business-driven development. The importance of understanding the business context for IT investment has never been more obvious than it is today. More organizations are recognizing the role of IT as a determining factor in the efficiency of their operations, and a bottleneck in their ability to innovate.

I am spending a lot of time with customers who want to explore business alternatives, drive IT projects more directly from business needs with well established business goals and ROI, choreograph services to realize their business processes, and monitor those services in execution to relate operations to the needs of the business. Support for that flow (in its myriad variations) is essential. As we use the current generation of tools in this context we are seeing the emergence of new roles, usage scenarios, and support needs. The lessons from this work are leading to a complete refactoring of tooling capabilities.


2. Greater transparency in the software development process:
Auditing, traceability, and accountability. Software plays a pivotal role in all our lives. It runs our financial institutions, controls the power and utility infrastructure, is embedded in almost every useful device we use, and so on. With this important role comes a certain responsibility.

Government regulators, lawyers, and auditors are paying increasing attention to the software industry to verify that the software we all rely on has been developed according to some provable quality standards. Sarbanes-Oxley and BASEL2 are just the tip of a very large iceberg. For example, in discussions with those in the auto industry I was overwhelmed by the role software plays in the design, manufacture, control, and management of automobiles, and the kinds of requirements they need fulfilled by the software tools they are using.

Suppose there is a major design flaw in the software managing the anti-lock brakes on a popular model of car that results in injury to a number of people. How does the manufacturer of the braking system prove that it was not negligent in the design and implementation of that software? Were the engineers developing the software certified against some recognized standards? Were the processes used to develop the software audited for quality? How were software designs analyzed and validated before they were put into production? And so on.

This kind of rigour and auditability will become the norm. Tools must permit this level of access and control. I refer to this as transparency: of process, design, realization, and so on. New tooling will emerge that supports and enforces these design principles. Traceability and reporting at all levels will become essential.
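To make the traceability idea concrete, here is a minimal sketch of the kind of check such tooling would automate: linking requirements to the test cases that verify them, and reporting any requirement left unverified. The requirement IDs and artifact names are hypothetical, and a real tool would draw these links from a requirements management system rather than in-memory dictionaries.

```python
# Illustrative sketch of requirement-to-test traceability reporting.
# All requirement IDs and test names below are hypothetical examples.

requirements = {
    "REQ-101": "Brake pressure must be released within 50 ms",
    "REQ-102": "Sensor faults must trigger a driver warning",
}

# Each test case declares which requirement it verifies.
test_links = {
    "test_release_timing": "REQ-101",
}

def untraced(requirements, test_links):
    """Return the IDs of requirements with no linked test case."""
    covered = set(test_links.values())
    return sorted(r for r in requirements if r not in covered)

print(untraced(requirements, test_links))  # → ['REQ-102']
```

An auditor asking "how was REQ-102 validated?" gets an immediate, mechanical answer from a report like this, which is exactly the level of process transparency described above.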


3. Rapid application development (RAD) using new programming models: As Grady Booch likes to say, software drives the world's economies, and in some regards we can consider software development to be the most important job in the world! Yet we all know that the skills and qualities of the best software engineers are in short supply.

It must be possible for a larger community of people to develop sophisticated enterprise solutions and deploy them to heterogeneous runtime environments. We have a long way to go to make this happen. The gap between the way domain-focused users view the problem space and the way in which they must describe systems in the solution space is far too great. In the past, various ways of addressing this gap with CASE tools and 4GLs solved part of the problem, but created their own challenges in return (e.g., proprietary runtime layers, non-standard artifacts, lack of openness to integrate with other systems and services, inflexible high-ceremony design approaches, and so on).

Over the last few years we have seen ways to overcome these limitations with the emergence of robust patterns and frameworks for application development in many technology and business domains. We can raise the abstraction of the programming model to be closer to the end users' mental model of their problem space, and use the patterns and frameworks to transform that into the solution space for today's technologies. Techniques such as generative programming and MDA are a realization of this. We are seeing a lot of innovation in software tools here.
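The generative idea can be sketched in a few lines: a declarative model expressed in the user's terms is mechanically transformed into solution-space code. The "Customer" model and the emitted class below are hypothetical examples, standing in for the far richer models and transformation chains that MDA tooling works with.

```python
# Illustrative sketch of generative programming: a declarative model of an
# entity is transformed into source code by a simple generator. The model
# and the generated "Customer" class are hypothetical examples.

model = {"entity": "Customer", "fields": ["name", "email"]}

def generate_class(model):
    """Emit a Python class definition from the abstract model."""
    lines = [f"class {model['entity']}:"]
    params = ", ".join(model["fields"])
    lines.append(f"    def __init__(self, {params}):")
    for field in model["fields"]:
        lines.append(f"        self.{field} = {field}")
    return "\n".join(lines)

source = generate_class(model)
exec(source)                       # realize the model in the solution space
c = Customer("Ada", "ada@example.com")
print(c.name)  # → Ada
```

The point is the division of labor: the domain-focused user edits only the model, while the generator encodes the patterns and framework conventions once, avoiding the proprietary runtime layers that made CASE tools and 4GLs problematic.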


4. Collaboration among individuals and teams: Much of the inefficiency in software development is a result of the friction between individuals and teams as they work together to share a common understanding of some element of concern, investigate issues from multiple perspectives to make a balanced decision, solve multi-dimensional problems, and so on.

There are many great advances in collaborative tools for interaction and sharing. It's great to be able to start a chat session in a new window while working through a new piece of code, to view a remote desktop to see the problem a customer is experiencing in their environment, or to create a teleconference as needed to resolve a design issue among colleagues. But there is much more to be done to make those kinds of capabilities part of the team's software tools workbench. We'll see those ideas become much better aligned with software development tools so that software engineers can more easily work together on all aspects of the design process, and we'll see design practices evolve to take better advantage of their capabilities.


5. "Pay-per-use" software tools:
New licensing and subscription offerings. There are many pressures on software tool vendors to change the way in which tools are packaged and delivered. Initiatives such as open source software and hosted services via application service providers (ASPs) challenge conventional thinking on software tools.

We've seen some reaction in the marketplace (e.g., open source development tools workbenches such as Eclipse, and on-line testing services from different vendors). Customers are demanding more -- greater flexibility in how software tools are delivered, less overhead in upgrading software tools, more creative pricing based on how the tool is used, when it is used, and how much of it is used.

We are working on different kinds of software tool offerings in response to this by refactoring the products we offer, increasing the ease with which different tool capabilities can be interchanged, and allowing access to software tool capabilities in a variety of modes (various flavors of fat client and thin client). It is safe to say that many of the people building software in the future will not be buying, installing, and using tools in the way they do today.

Alan W. Brown blogged these comments originally at developerWorks.com. Reproduced here with the kind permission of the author.

More Stories By Alan W. Brown

Alan W. Brown is a Distinguished Engineer at IBM Rational software responsible for future product strategy of IBM Rational's Design and Construction products. He defines technical strategy and evangelizes product direction with customers looking to improve software development efficiency through visual modeling, generating code from abstract models, and systematic reuse.

Comments (2)


Most Recent Comments
Nice piece 11/22/04 07:57:23 AM EST

Refreshing to come at this from a tools perspective instead of la-di-da generalizations. The future of technology is better viewed through the lens of tools than the rose-tinted perspectives most CEOs trot out.

Toolsman 11/22/04 07:47:53 AM EST

"Much of the inefficiency in software development is a result of the friction between individuals and teams as they work together to share a common understanding of some element of concern, investigate issues from multiple perspectives to make a balanced decision, solve multi-dimensional problems, and so on."

How true. The human element is the Great Imponderable that not enough people seem to think, let alone write, about. Great article!
