How to Enhance Applications to Support Business Agility (Part 2)

Soaring with the cloud – “no software” doesn’t mean “no integration”

Sustaining applications in the most cost-effective and efficient fashion is the foundation to maximizing a return on data. But it is only the foundation. Organizations have to move beyond sustaining applications to driving innovation, and the first step in that progression is learning the best ways to enhance existing applications and implement new applications that will help modernize business processes and support business agility.

The challenges around enhancing applications are well understood. The top ones include:

  • Insufficient Data Quality - Data quality issues of one degree or another are pervasive in the majority of enterprises, leading users to distrust the data. This is a problem within applications and, even more so, across applications.
  • Right-Time Access to Information - The pace of business continues to accelerate, and users can no longer wait days or weeks for the information they need to perform their jobs. Business users need immediate access to fresh, trusted information in the applications they use every day, and IT must find ways to provide it.
  • SaaS Sprawl - As more and more applications move to the Cloud, IT needs to be proactive in maintaining visibility and control over SaaS applications and their data, including the ability to easily integrate them with on-premise applications. After all, "no software" does not imply "no integration."
  • Successful Data Migration to New Applications - As organizations implement new applications, existing data must be moved quickly and smoothly to the new apps, on time and on budget, so that dependent business processes are not negatively impacted.

Building Trust in the Data with Automated Data Quality
The core reason business users lack trust in their data is that the data resides in silos across multiple systems and, when it is delivered to them, it is all too frequently inconsistent, incorrect and incomplete, not to mention late. This affects both day-to-day and strategic data usage. For example, procurement needs consistent and correct data on vendor price and performance to negotiate favorable contracts, and it needs that data in a timely fashion to drive or block purchase requisitions and payments based on vendors' adherence to their contracts.

Hence, the first step to modernizing business processes is to enhance application quality with trusted, authoritative data that is predictable, valuable and timely, regardless of how many source systems it is being drawn and integrated from.

Integrating data quality processes into the overall enterprise data integration process is a definitive step for any organization looking to build user trust in its data. This can be as simple as introducing automated data cleansing that can be leveraged by all applications across the company. A further step is putting proactive data quality monitoring capabilities into the hands of data owners, so that they participate in improving the quality of their information.
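To make this concrete, here is a minimal sketch of the kind of shared, automated cleansing rule described above, written in Python. The field names, rules, and country mappings are hypothetical illustrations, not any specific product's API.

```python
# Minimal sketch of an automated data-cleansing step. The rules and field
# names (vendor_id, name, country) are hypothetical illustrations only.

import re

REQUIRED_FIELDS = ("vendor_id", "name", "country")

def cleanse_vendor_record(record: dict) -> tuple[dict, list[str]]:
    """Return a standardized copy of the record plus a list of quality issues."""
    issues = []
    cleaned = dict(record)

    # Completeness: flag missing required fields instead of passing them on silently.
    for field in REQUIRED_FIELDS:
        if not cleaned.get(field):
            issues.append(f"missing {field}")

    # Consistency: normalize whitespace and casing so the same vendor
    # matches across source systems.
    if cleaned.get("name"):
        cleaned["name"] = re.sub(r"\s+", " ", cleaned["name"]).strip().title()

    # Correctness: standardize country values to a canonical code.
    country_map = {"usa": "US", "united states": "US", "u.s.": "US"}
    if cleaned.get("country"):
        key = cleaned["country"].strip().lower()
        cleaned["country"] = country_map.get(key, cleaned["country"].strip().upper())

    return cleaned, issues

if __name__ == "__main__":
    raw = {"vendor_id": "V-001", "name": "  acme   corp ", "country": "usa"}
    cleaned, issues = cleanse_vendor_record(raw)
    print(cleaned)   # {'vendor_id': 'V-001', 'name': 'Acme Corp', 'country': 'US'}
    print(issues)    # []
```

Because the rule lives in one place, every application that consumes vendor data gets the same standardized result, which is the point of making cleansing a shared, automated service rather than a per-application chore.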

The payoff comes with users spending less time reconciling data and more time working with it.

Ensuring Right-Time Delivery with Data Services
Ensuring that the right information is delivered at the right time to the right person is another formidable challenge, but one that can be solved: data services provide the solution within the context of a service-oriented architecture (SOA). Traditional SOA approaches lack a data integration layer. Anything that cannot be handled by a simple Web service, such as complex data transformations or data cleansing, requires hand-coding and proprietary interfaces, both of which are best avoided.

Data services, on the other hand, present a discrete set of sophisticated data integration tasks that support the entire data integration life cycle. These data services can be readily consumed as Web services by the various components of a SOA, as well as by composite applications and portals. The complexity of each task is hidden, and the data services can be easily published to SOA registries and repositories.
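As a rough illustration of the pattern, and assuming a plain HTTP/JSON interface rather than any particular vendor's data services product, the sketch below hides the work of merging two source systems behind a single "customer view" endpoint. The data, URL path, and field names are hypothetical.

```python
# Minimal sketch of a data service: one endpoint that hides the join of two
# source systems behind a single consumable interface. All data is hypothetical.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-ins for two separate source systems (e.g., CRM and billing).
CRM = {"C-42": {"name": "Acme Corp", "segment": "Manufacturing"}}
BILLING = {"C-42": {"open_invoices": 3, "balance": 12800.00}}

class CustomerDataService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Path format assumed here: /customers/<customer_id>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "customers" and parts[1] in CRM:
            customer_id = parts[1]
            # The integration step: merge records from both sources into a
            # single customer view before returning it to the consumer.
            payload = {"customer_id": customer_id,
                       **CRM[customer_id],
                       **BILLING.get(customer_id, {})}
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # GET http://localhost:8080/customers/C-42 returns the combined view.
    HTTPServer(("localhost", 8080), CustomerDataService).serve_forever()
```

The consumer of the service never sees how many systems sit behind it or what transformations were applied, which is exactly the complexity-hiding property described above.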

Results? Organizations using data services have reported delivering new data up to five times faster and cutting related costs by up to a factor of three. This means these organizations can respond faster to changing information demands, increase IT project success rates, and even deliver comprehensive single customer views on demand to help drive new revenues and increase customer satisfaction.

Soaring with the Cloud - "No Software" Doesn't Mean "No Integration"
SaaS application spending, as everyone knows, is soaring. As a result, more and more companies need to find ways to support hybrid IT infrastructures that span cloud and on-premise applications and make them work seamlessly together to maximize the return on all enterprise data. And this requires data integration.

Cloud applications have to integrate with other systems in order to provide full value. At the same time, integration needs to happen in a secure fashion lest IT lose control of enterprise data assets. Fortunately, appropriately designed cloud data integration will support hybrid IT environments, essentially by extending unified, enterprise-class data integration services to the cloud.

When evaluating support for hybrid IT, look for a "secure agent" that can create and securely manage all aspects of integration jobs, which can be shared between on-premise and cloud deployments. While the agent can be invoked from a web browser, it establishes a secure connection between data source and data target, and all data integration processing occurs on-premise in the enterprise environment for maximum IT control.
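The sketch below illustrates the general idea under stated assumptions: the job definition comes from the cloud service, but credentials are resolved locally and all extract-transform-load work runs on-premise. The function names, job fields, and environment variables are hypothetical and do not represent any vendor's actual agent.

```python
# Conceptual sketch of the "secure agent" pattern: a cloud-defined job is
# executed entirely inside the enterprise network. Illustrative names only.

import os

def run_integration_job(job: dict) -> int:
    """Execute a cloud-defined job on-premise; return the number of rows moved."""
    # Credentials are resolved from the local environment, never sent to the cloud.
    source_password = os.environ.get("SOURCE_DB_PASSWORD", "")
    target_token = os.environ.get("SAAS_API_TOKEN", "")

    # In a real agent these would be database/API connections opened from
    # inside the firewall; here they are placeholders.
    rows = extract(job["source"], source_password)
    rows = [transform(r, job["mappings"]) for r in rows]
    return load(job["target"], rows, target_token)

def extract(source: str, _secret: str) -> list[dict]:
    return [{"id": 1, "amount": "100"}]           # placeholder source rows

def transform(row: dict, mappings: dict) -> dict:
    return {mappings.get(k, k): v for k, v in row.items()}

def load(target: str, rows: list[dict], _secret: str) -> int:
    return len(rows)                              # placeholder load step

if __name__ == "__main__":
    # The cloud service would hand the agent a job definition like this one.
    job = {
        "source": "onprem-erp",
        "target": "saas-crm",
        "mappings": {"amount": "invoice_total"},
    }
    print(run_integration_job(job), "rows moved on-premise")
```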

Ensuring the Success of New Applications Through Efficient Migration
As organizations modernize their systems and business processes, they find that migrating data to new applications is rarely a slam-dunk proposition. All data migrations are inherently risky, and they often suffer from a lack of suitable tools, skills, knowledge of the data, and a strategy for access, validation, and audit. Moreover, business stakeholders and data users often lack the tools and processes to verify that the data is actually fit for use.

Having the right technology platform and skills goes a long way toward ensuring an on-time and on-budget migration. Knowledge of the data that is being moved is critical to each step of the migration process and ultimately is key to ensuring that the migrated data actually "works" in the new application. Having an infrastructure that supports change during the migration process is mandatory. And active business involvement is the hallmark of every successful migration. Hand-coding, using the wrong migration methodology, and relying strictly on IT are all pathways to migration time and money overruns or outright failure.

In selecting a migration platform, you want to ensure it provides:

  • Connectivity to the broadest range of environments and databases
  • Built-in data quality profiling, cleansing, and transformations (for all data types)
  • Fast, easy development, updating, and reuse of transformations
  • Easy auditability of the data from source to target (a minimal audit-check sketch follows this list)
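As a minimal sketch of the auditability point above, the following compares row counts and per-row checksums between source and target so stakeholders can see at a glance whether anything was dropped or altered. The tables and fields are hypothetical.

```python
# Minimal source-to-target audit check for a migration: compare row counts
# and per-row checksums. All table contents here are hypothetical.

import hashlib

def row_fingerprint(row: dict) -> str:
    """Stable checksum of a row, independent of key order."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def audit_migration(source_rows: list[dict], target_rows: list[dict]) -> dict:
    """Return a simple audit report comparing source and target datasets."""
    source_fps = {row_fingerprint(r) for r in source_rows}
    target_fps = {row_fingerprint(r) for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": len(source_fps - target_fps),
        "unexpected_in_target": len(target_fps - source_fps),
    }

if __name__ == "__main__":
    source = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
    target = [{"id": 1, "name": "Acme"}]
    print(audit_migration(source, target))
    # {'source_count': 2, 'target_count': 1, 'missing_in_target': 1, 'unexpected_in_target': 0}
```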

From Application Enhancement to Business Transformation
The above steps, from building trust in data to ensuring the success of new applications, all speak to enhancing applications to drive business process modernization and business agility. The next leap is to transform applications and, by extension, the business, in order to drive innovation and growth. Much of what can be done to enhance applications can be leveraged when transforming them, but there are also new and highly valuable things to accomplish, as will be seen in Part 3 of this article.

More Stories By Adam Wilson

Adam Wilson is the General Manager for Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
