How to Enhance Applications to Support Business Agility (Part 2)

Soaring with the cloud – “no software” doesn’t mean “no integration”

Sustaining applications in the most cost-effective and efficient fashion is the foundation for maximizing the return on data. But it is only the foundation. Organizations have to move beyond sustaining applications to driving innovation, and the first step in that progression is learning the best ways to enhance existing applications and implement new ones that will help modernize business processes and support business agility.

The challenges around enhancing applications are widely agreed upon. The top challenges include:

  • Insufficient Data Quality - Data quality issues of one degree or another are pervasive in the majority of enterprises, leading users to distrust the data. This is a problem within applications and, even more so, across applications.
  • Right-Time Access to Information - The pace of business continues to accelerate, and users can no longer wait weeks or days for information necessary to perform their jobs. If business users need immediate access to fresh and trusted information in the applications they use every day, IT must find ways to provide it.
  • SaaS Sprawl - As more and more applications move to the Cloud, IT needs to be proactive in maintaining visibility and control over SaaS applications and their data, including the ability to easily integrate them with on-premise applications. After all, "no software" does not imply "no integration."
  • Successful Data Migration to New Applications - As organizations implement new applications, existing data must be moved quickly and smoothly to the new apps, on time and on budget, so that dependent business processes are not negatively impacted.

Building Trust in the Data with Automated Data Quality
The core reason business users lack trust in their data is that the data resides in silos across multiple systems and, when it is delivered to them, is all too frequently inconsistent, incorrect and incomplete, not to mention late. This affects both day-to-day and strategic data usage. For example, procurement processes need consistent and correct data on vendor price and performance in order to negotiate favorable contracts, as well as timely data to either drive or block purchase requisitions and payments to vendors based on their adherence to those contracts.

Hence, the first step to modernizing business processes is to enhance application quality with trusted, authoritative data that is predictable, valuable and timely, regardless of how many source systems it is being drawn and integrated from.

Integrating data quality processes into the overall enterprise data integration process is a definitive step for any organization looking to build user trust in its data. This can be as simple as introducing automated data cleansing that can be leveraged by all applications across the company. A further step is putting proactive data quality monitoring capabilities into the hands of data owners, so they can participate in improving the quality of their information.

The payoff comes with users spending less time reconciling data and more time working with it.
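As a minimal illustration of the kind of shared, automated cleansing step described above, the Python sketch below standardizes a raw vendor record and flags incomplete fields before the record is passed downstream. The field names and rules are hypothetical examples, not any particular product's API.

```python
# Minimal sketch of a reusable, automated cleansing step that any application
# could call before consuming vendor records. Field names and rules are
# hypothetical examples, not a specific product's API.
import re

def cleanse_vendor_record(record: dict) -> dict:
    """Return a standardized copy of a raw vendor record."""
    cleaned = dict(record)

    # Standardize free-text fields: trim whitespace, normalize case.
    cleaned["vendor_name"] = record.get("vendor_name", "").strip().title()

    # Enforce a consistent identifier format (digits only).
    cleaned["vendor_id"] = re.sub(r"\D", "", str(record.get("vendor_id", "")))

    # Flag incomplete records instead of silently passing them downstream.
    required = ("vendor_id", "vendor_name", "contract_price")
    cleaned["quality_issues"] = [f for f in required if not cleaned.get(f)]

    return cleaned

if __name__ == "__main__":
    raw = {"vendor_id": "A-1042", "vendor_name": "  acme SUPPLY co ", "contract_price": None}
    print(cleanse_vendor_record(raw))
    # -> vendor_id '1042', vendor_name 'Acme Supply Co',
    #    quality_issues ['contract_price']
```

The same routine can sit behind every application that touches vendor data, so the cleansing logic is written once rather than reimplemented system by system.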

Ensuring Right-Time Delivery with Data Services
Ensuring that the right information is delivered at the right time to the right person is another formidable challenge, but one that can be solved, and data services provide the solution within the context of a service-oriented architecture (SOA). Traditional SOA approaches lack a data integration layer: anything that cannot be handled by a simple Web service, such as complex data transformations or data cleansing, requires hand-coding and proprietary interfaces, both of which are best avoided.

Data services, on the other hand, present a discrete set of sophisticated data integration tasks that support the entire data integration life cycle. These data services can be readily consumed as Web services by the various components of a SOA, and also by composite applications and portals. The complexity of the task is hidden, plus the data services can be easily published to SOA registries and repositories.
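To make the idea concrete, here is a minimal sketch of a data service exposed as a simple web endpoint that SOA components, portals, or composite applications could consume. It uses Flask, and the endpoint, source systems, and field names are hypothetical stand-ins for whatever integration task the service actually wraps.

```python
# Minimal sketch of a data service: a data integration task (stubbed here as
# an in-memory lookup and merge across two source systems) hidden behind a
# simple web endpoint. Requires Flask; all names and data are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-ins for two source systems whose records would normally require
# transformation and cleansing before they can be combined.
CRM = {"C-100": {"name": "Acme Supply Co", "segment": "Manufacturing"}}
BILLING = {"C-100": {"open_invoices": 3, "balance_usd": 12850.00}}

@app.route("/customers/<customer_id>")
def single_customer_view(customer_id):
    crm = CRM.get(customer_id)
    billing = BILLING.get(customer_id, {})
    if crm is None:
        return jsonify({"error": "unknown customer"}), 404
    # The consumer sees one merged view; the integration work stays hidden.
    return jsonify({"customer_id": customer_id, **crm, **billing})

if __name__ == "__main__":
    app.run(port=8080)
```

A portal or composite application simply calls GET /customers/C-100 and receives one consolidated record, without knowing how many systems were consulted to produce it.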

Results? Organizations using data services have reported delivering new data up to five times faster and achieving up to threefold cost savings. This means that these organizations are able to respond faster to changing information demands, increase IT project success rates, and even deliver comprehensive single customer views on demand to help drive new revenues and increase customer satisfaction.

Soaring with the Cloud - "No Software" Doesn't Mean "No Integration"
SaaS application spending, as everyone knows, is soaring. As a result, more and more companies need to find ways to support hybrid IT infrastructures that span cloud and on-premise applications and make them work seamlessly together to maximize the return on all enterprise data. And this requires data integration.

Cloud applications have to integrate with other systems in order to provide full value. At the same time, integration needs to happen in a secure fashion lest IT lose control of enterprise data assets. Fortunately, appropriately designed cloud data integration will support hybrid IT environments, essentially by extending unified, enterprise-class data integration services to the cloud.

Things to look for when supporting hybrid IT include a "secure agent" that lets you create and securely manage all aspects of integration jobs, which can be shared between on-premise and cloud deployments. While the agent can be invoked via a web browser, it establishes a secure connection between data source and data target, and all data integration processing occurs on-premise in the enterprise environment for maximum IT control.
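The pattern can be sketched generically. The fragment below is not any vendor's actual secure agent; it only illustrates the division of labor in which the cloud service supplies job metadata while the data itself is read and written entirely inside the enterprise network. The URL, job format, and file paths are hypothetical.

```python
# Generic sketch of the "secure agent" pattern -- not any vendor's actual
# agent. The cloud control plane only supplies the job definition; the data
# itself never leaves the on-premise environment.
import json
import shutil
import urllib.request

CONTROL_PLANE_URL = "https://cloud.example.com/api/jobs/next"  # hypothetical

def fetch_job_definition() -> dict:
    """Pull the next integration job definition over HTTPS (metadata only)."""
    with urllib.request.urlopen(CONTROL_PLANE_URL) as resp:
        return json.load(resp)

def run_job_locally(job: dict) -> None:
    """Move data from the on-premise source to the on-premise target.
    No record ever crosses the enterprise boundary."""
    shutil.copyfile(job["source_path"], job["target_path"])

if __name__ == "__main__":
    job = fetch_job_definition()  # e.g. {"source_path": "...", "target_path": "..."}
    run_job_locally(job)
```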

Ensuring the Success of New Applications Through Efficient Migration
As organizations modernize their systems and business processes, they find that migrating data to their new applications is not usually a slam dunk proposition. All data migrations are inherently risky, and many suffer from a lack of suitable tools, skills, knowledge of the data, and a strategy for access, validation, and auditing. Moreover, there is often a lack of tools and processes to help business stakeholders and data users ensure and verify that the data is actually fit for use.

Having the right technology platform and skills goes a long way toward ensuring an on-time and on-budget migration. Knowledge of the data that is being moved is critical to each step of the migration process and ultimately is key to ensuring that the migrated data actually "works" in the new application. Having an infrastructure that supports change during the migration process is mandatory. And active business involvement is the hallmark of every successful migration. Hand-coding, using the wrong migration methodology, and relying strictly on IT are all pathways to migration time and money overruns or outright failure.

In selecting a migration platform, you want to ensure it provides:

  • Connectivity to the broadest range of environments and databases
  • Built-in data quality profiling, cleansing, and transformations (for all data types)
  • Fast, easy development, updating, and reuse of transformations
  • Easy auditability of the data from source to target (see the sketch below)
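By way of illustration, the last point, source-to-target auditability, might look like this in its simplest form: comparing row counts and a content fingerprint between the legacy source and the new application's database. The sketch uses sqlite3 purely for demonstration, and the table and column names are hypothetical; a real migration platform automates this across all data types.

```python
# Minimal sketch of a source-to-target audit during a migration: verify that
# row counts and a simple content fingerprint match between the legacy source
# and the new application's database. Names and data are hypothetical.
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Return (row_count, checksum) for a table, ordered deterministically."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

def audit_migration(source, target, table):
    src, tgt = table_fingerprint(source, table), table_fingerprint(target, table)
    if src != tgt:
        print(f"{table}: MISMATCH source={src} target={tgt}")
        return False
    print(f"{table}: OK ({src[0]} rows)")
    return True

if __name__ == "__main__":
    # Two in-memory databases stand in for the legacy and new systems.
    legacy, new_app = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (legacy, new_app):
        db.execute("CREATE TABLE vendors (vendor_id INTEGER, name TEXT)")
        db.execute("INSERT INTO vendors VALUES (1042, 'Acme Supply Co')")
    audit_migration(legacy, new_app, "vendors")
```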

From Application Enhancement to Business Transformation
The above steps, from building trust in data to ensuring the success of new applications, all speak to enhancing applications to drive business process modernization and business agility. The next leap is to transform applications and, by extension, the business, to drive innovation and growth. Much of what can be accomplished to enhance applications can be leveraged when transforming them, but there are also new and highly valuable things to accomplish, as will be seen in Part 3 of this article.

More Stories By Adam Wilson

Adam Wilson is the General Manager for Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
