Microservices Expo: Article

How to Enhance Applications to Support Business Agility (Part 2)

Soaring with the cloud – “no software” doesn’t mean “no integration”

Sustaining applications in the most cost-effective and efficient fashion is the foundation to maximizing a return on data. But it is only the foundation. Organizations have to move beyond sustaining applications to driving innovation, and the first step in that progression is learning the best ways to enhance existing applications and implement new applications that will help modernize business processes and support business agility.

The challenges of enhancing applications are widely agreed upon. The top challenges include:

  • Insufficient Data Quality - Data quality issues of one degree or another are pervasive in the majority of enterprises, leading users to distrust the data. This is a problem within applications and, even more so, across applications.
  • Right-Time Access to Information - The pace of business continues to accelerate, and users can no longer wait weeks or days for information necessary to perform their jobs. If business users need immediate access to fresh and trusted information in the applications they use every day, IT must find ways to provide it.
  • SaaS Sprawl - As more and more applications move to the Cloud, IT needs to be proactive in maintaining visibility and control over SaaS applications and their data, including the ability to easily integrate them with on-premise applications. After all, "no software" does not imply "no integration."
  • Successful Data Migration to New Applications - As organizations implement new applications, existing data must be moved quickly and smoothly to the new apps, on time and on budget, so that dependent business processes are not negatively impacted.

Building Trust in the Data with Automated Data Quality
Business users lack trust in their data because it resides in silos across multiple systems and, when it is delivered to them, it is all too frequently inconsistent, incorrect, and incomplete, not to mention late. This affects both day-to-day and strategic usage. For example, procurement teams need consistent, correct data on vendor price and performance in order to negotiate favorable contracts, as well as timely data in order to drive or block purchase requisitions and payments according to each vendor's adherence to its contract.

Hence, the first step to modernizing business processes is to enhance application quality with trusted, authoritative data that is predictable, valuable and timely, regardless of how many source systems it is being drawn and integrated from.

Integrating data quality processes into the overall enterprise data integration process is a definitive step for any organization looking to build user trust in its data. This can be as simple as introducing automated data cleansing that all applications across the company can leverage. A further step is putting proactive data quality monitoring capabilities into the hands of data owners, so that they participate in improving the quality of their information.
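As a sketch, a shared, automated cleansing step might look like the following. The record fields and rules here are illustrative assumptions, not taken from any particular product:

```python
# Hypothetical shared cleansing rules applied uniformly across applications,
# so every consumer sees the same normalized, validated records.

def clean_vendor_record(record):
    """Apply shared cleansing rules to a raw vendor record."""
    cleaned = dict(record)
    # Normalize whitespace and casing so the same vendor matches across silos.
    cleaned["name"] = " ".join(record.get("name", "").split()).title()
    # Standardize country codes to uppercase strings.
    cleaned["country"] = record.get("country", "").strip().upper()
    # Flag incomplete records instead of silently passing them downstream,
    # giving data owners something concrete to monitor.
    cleaned["complete"] = all(
        cleaned.get(f) for f in ("vendor_id", "name", "country")
    )
    return cleaned

raw = {"vendor_id": "V-100", "name": "  acme   corp ", "country": " us "}
print(clean_vendor_record(raw))
# → {'vendor_id': 'V-100', 'name': 'Acme Corp', 'country': 'US', 'complete': True}
```

Because the rules live in one place, every application that consumes vendor data gets the same cleansed view, which is what makes the data trustworthy across silos.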

The payoff comes with users spending less time reconciling data and more time working with it.

Ensuring Right-Time Delivery with Data Services
Ensuring that the right information is delivered at the right time to the right person is another formidable challenge, but one that can be solved, and data services provide the solution within the context of a service-oriented architecture (SOA). Traditional SOA approaches lack a data integration layer. Anything that cannot be handled by a simple Web service, such as complex data transformations or data cleansing, requires hand-coding and proprietary interfaces, which are things one wants to avoid.

Data services, on the other hand, present a discrete set of sophisticated data integration tasks that support the entire data integration life cycle. These data services can be readily consumed as Web services by the various components of a SOA, and also by composite applications and portals. The complexity of the task is hidden, plus the data services can be easily published to SOA registries and repositories.
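To make the pattern concrete, here is a minimal sketch of a data service: a named, published integration task whose internal complexity (joining sources, shaping the result) is hidden behind a simple call. The registry, service name, and data shapes are hypothetical:

```python
# Minimal sketch of a data-service registry. Consumers (portals, composite
# applications) invoke a service by name; the integration logic is hidden.

SERVICES = {}

def data_service(name):
    """Register a function as a published data service."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@data_service("customer_360")
def customer_360(customer_id, crm, billing):
    # Internally joins two source systems; the caller sees a single call.
    profile = dict(crm[customer_id])
    profile["balance"] = billing.get(customer_id, 0)
    return profile

# A consumer only needs the service name, not the integration details.
crm = {"c1": {"name": "Ada"}}
billing = {"c1": 250}
print(SERVICES["customer_360"]("c1", crm, billing))
# → {'name': 'Ada', 'balance': 250}
```

In a real SOA deployment the registry would be an external service catalog and the call would arrive as a web-service request, but the design idea is the same: publish the task under a stable name and keep the complexity behind it.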

Results? Organizations using data services have reported up to five times faster delivery of new data and up to threefold cost savings. These organizations are able to respond faster to changing information demands, increase IT project success rates, and even deliver comprehensive single customer views on demand to help drive new revenues and increase customer satisfaction.

Soaring with the Cloud - "No Software" Doesn't Mean "No Integration"
SaaS application spending, as everyone knows, is soaring. As a result, more and more companies need to find ways to support hybrid IT infrastructures that span cloud and on-premise applications and make them work seamlessly together to maximize the return on all enterprise data. And this requires data integration.

Cloud applications have to integrate with other systems in order to provide full value. At the same time, integration needs to happen in a secure fashion lest IT lose control of enterprise data assets. Fortunately, appropriately designed cloud data integration will support hybrid IT environments, essentially by extending unified, enterprise-class data integration services to the cloud.

When supporting hybrid IT, look for a "secure agent" that can create and securely manage all aspects of integration jobs shared between on-premise and cloud deployments. While the agent can be invoked via a web browser, it establishes a secure connection between data source and data target, and all data integration processing occurs on-premise in the enterprise environment for maximum IT control.
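The secure-agent pattern can be sketched as follows: the cloud side holds only the job definition, while the agent runs inside the enterprise network and moves the data directly between source and target, so records never transit the cloud. Class and job-field names here are illustrative assumptions:

```python
# Sketch of the secure-agent pattern: job definitions may come from the
# cloud, but all data movement happens on-premise, under IT control.

class SecureAgent:
    def __init__(self, source, target):
        self.source = source  # on-premise source (modeled as a list of records)
        self.target = target  # integration target (modeled as a list)

    def run_job(self, job):
        """Execute an integration job entirely inside the enterprise network."""
        transform = job["transform"]
        for record in self.source:
            self.target.append(transform(record))
        # Only metadata about the run (not the data) needs to leave the agent.
        return {"job": job["name"], "rows_moved": len(self.source)}

source_rows = [{"amt": "10"}, {"amt": "25"}]
target_rows = []
agent = SecureAgent(source_rows, target_rows)
status = agent.run_job({"name": "orders_sync",
                        "transform": lambda r: {"amt": int(r["amt"])}})
print(status)
# → {'job': 'orders_sync', 'rows_moved': 2}
```

The key design point is the separation of concerns: the cloud orchestrates and reports, while the data path stays inside the firewall.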

Ensuring the Success of New Applications Through Efficient Migration
As organizations modernize their systems and business processes, they find that migrating data to their new applications is rarely a slam dunk. All data migrations carry inherent risk, and they frequently lack suitable tools, skills, knowledge of the data, and a strategy for access, validation, and auditing. Moreover, business stakeholders and data users often have no tools or processes to verify that the migrated data is actually fit for use.

Having the right technology platform and skills goes a long way toward ensuring an on-time and on-budget migration. Knowledge of the data that is being moved is critical to each step of the migration process and ultimately is key to ensuring that the migrated data actually "works" in the new application. Having an infrastructure that supports change during the migration process is mandatory. And active business involvement is the hallmark of every successful migration. Hand-coding, using the wrong migration methodology, and relying strictly on IT are all pathways to migration time and money overruns or outright failure.

In selecting a migration platform, you want to ensure it provides:

  • Connectivity to the broadest range of environments and databases
  • Built-in data quality profiling, cleansing, and transformations (for all data types)
  • Fast, easy development, updating, and reuse of transformations
  • Easy auditability of the data from source to target
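The last item, auditability from source to target, can be sketched as a simple post-migration check: compare row counts and an order-independent checksum over a key column, so stakeholders can verify the migrated data matches the source. The hashing scheme and field names are assumptions for illustration:

```python
# Illustrative source-to-target migration audit: row counts plus an
# order-independent checksum over the key column.
import hashlib

def audit_migration(source_rows, target_rows, key):
    """Return a small audit report comparing source and target datasets."""
    def key_digests(rows):
        # Sort the per-row digests so row order does not affect the result.
        return sorted(
            hashlib.sha256(str(r[key]).encode()).hexdigest() for r in rows
        )

    return {
        "counts_match": len(source_rows) == len(target_rows),
        "keys_match": key_digests(source_rows) == key_digests(target_rows),
    }

src = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
tgt = [{"id": 2, "name": "b"}, {"id": 1, "name": "a"}]  # same data, new order
print(audit_migration(src, tgt, "id"))
# → {'counts_match': True, 'keys_match': True}
```

A production platform would audit every column and surface discrepancies row by row, but even this minimal check catches dropped or duplicated records early, before dependent business processes are affected.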

From Application Enhancement to Business Transformation
The above steps, from building trust in data to ensuring the success of new applications, all speak to enhancing applications to drive business process modernization and business agility. The next leap is to transform applications and, by extension, the business itself, to drive innovation and growth. Much of what can be done to enhance applications can be leveraged when transforming them, but there are also new and highly valuable possibilities, as will be seen in Part 3 of this article.

More Stories By Adam Wilson

Adam Wilson is the General Manager for Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson
