Enterprise Data Integration: Business Boon or Budget Breaker?

Data is king in today's information-driven economy

Data is king in today's information-driven economy, which is why organizations are willing to spend tens or even hundreds of thousands of dollars on data integration frameworks and applications. These organizations understand two critical truths: they have yet to capitalize on the potential business value stored in relational databases, EDI, flat files, and XML-based systems; and they must seamlessly connect with customers, suppliers, and business units - all of which may store and process data in different formats - to remain competitive.

Open standards-based technologies like XML promise to unify enterprise data and enable advanced Web Services and SOA. But while XML may be standards-based, most existing enterprise data isn't, nor is it easily extensible. Complicating matters further is the fact that many large enterprises rely on EDI systems to exchange business information with their partners. EDI is generally not interoperable with other systems.

XML gives organizations the ability to leverage existing systems and increase their usefulness by adding the flexibility required for real-time data exchange. Furthermore, it can facilitate the exchange across departmental and geographical boundaries and through system and programmatic constraints. But XML is not, in and of itself, a cure-all for data integration. Successfully integrating XML with other data formats requires applications that integrate system interfaces and map between data structures.

No Two Data Formats Are Alike
Various formats for storing and exchanging data are in use today, and the fact that no two are alike adds to the challenges of information accessibility and data integration. Let's take a look at the most popular formats and what makes each of them unique.

Relational databases
This is the dominant storage mechanism for structured enterprise data, an efficient means of storing, searching for, and retrieving information from large collections of data. Relational databases specialize in relating individual data records grouped by type in tables. Records can be joined together as needed using SQL and presented to business users as meaningful information.
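
To make the join idea concrete, here is a minimal sketch in Java using the standard JDBC API; the connection URL, credentials, and the customers/orders schema are hypothetical stand-ins, and a real run would also need the matching database driver on the classpath.

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderReport {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; requires a JDBC driver on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/erp", "user", "password");
             // Join two record types into one meaningful result set.
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT c.name, o.order_date, o.total " +
                 "FROM customers c JOIN orders o ON o.customer_id = c.id " +
                 "WHERE o.total > ?")) {
            stmt.setBigDecimal(1, new BigDecimal("1000"));
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s  %s  %s%n",
                            rs.getString("name"),
                            rs.getDate("order_date"),
                            rs.getBigDecimal("total"));
                }
            }
        }
    }
}
```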

The technology is mature, and the sheer volume of information stored in relational databases, together with the hours invested in developing structures and specialized systems, makes them valuable assets. But their flexibility is limited when it comes to integrating with other systems, and the differences between major commercial implementations can make data integration difficult.

Electronic data interchange (EDI)
Long before the Internet made business-to-business electronic trade a standard practice, there was EDI. This set of widely used formats allows for the electronic exchange of information, and was developed to enable independent organizations to reliably exchange various types of data, including purchase orders, invoices, shipping notices, medical and insurance claims, and the like.

EDI has proven valuable for supplanting paper-based business processes. It has also enabled organizations to exchange large amounts of information with partners and other companies quickly in a fairly standardized interaction.
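
As a rough illustration of that standardized interaction, the sketch below splits a truncated ANSI X12 purchase-order fragment into segments and elements. The document content is invented, and a real EDI translator would also handle the ISA/GS envelopes, delimiter declarations, and validation that this sketch omits.

```java
import java.util.Arrays;

public class X12Sketch {
    public static void main(String[] args) {
        // Invented X12 850 (purchase order) fragment. Segments end with '~'
        // and elements are separated by '*' - common defaults, though real
        // interchanges declare their own delimiters in the ISA envelope.
        String interchange =
                "ST*850*0001~" +
                "BEG*00*SA*PO123456**20240101~" +
                "PO1*1*10*EA*9.95**BP*WIDGET-7~" +
                "SE*4*0001~";

        for (String segment : interchange.split("~")) {
            String[] elements = segment.split("\\*", -1);
            System.out.println(elements[0] + " -> " + Arrays.toString(elements));
        }
    }
}
```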

Many larger organizations have substantial investments in EDI technology. But smaller companies have been less likely to invest in it, partly because implementations are infrastructure-, training-, and maintenance-intensive. To partner with larger enterprises, however, these smaller companies must find a way to handle EDI-based business transactions; otherwise, they risk missing lucrative business opportunities.

Flat files
Many legacy enterprise systems and popular applications, including accounting, banking, CRM, and spreadsheet software, support flat-file formats. They are frequently used as an interchange format for transferring information between applications, including databases. However, flat files generally require additional processing to interoperate with common data formats such as EDI or XML and can be cumbersome when dealing with large amounts of information.
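
The sketch below shows that extra processing at its most basic: reading a hypothetical pipe-delimited customer export. Because the file itself carries no field names or types, the record layout has to be agreed on out of band.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class FlatFileReader {
    public static void main(String[] args) throws IOException {
        // customers.txt is a hypothetical export, one record per line, e.g.:
        // 1001|Acme Corp|acme@example.com
        List<String> lines = Files.readAllLines(Path.of("customers.txt"));
        for (String line : lines) {
            // The layout (field order, delimiter) is not described by the
            // file itself - it must be known from external documentation.
            String[] fields = line.split("\\|", -1);
            System.out.printf("id=%s name=%s email=%s%n",
                    fields[0], fields[1], fields[2]);
        }
    }
}
```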

XML
In the world of information interchange, the use of XML has grown steadily, and it now plays a central role in data management, transformation, and exchange. It has gained widespread support from leading software, server, and database vendors, and has become the language of choice for lowering the cost of processing, searching, exchanging, and reusing data and information.

The openness of XML allows it to be exchanged between virtually any hardware, software, or operating system, and allows for information interchange without restriction. XML and XML-based technologies such as XML Schema, XSL, WSDL, and SOAP are all open standards that can be used in conjunction with any programming language or platform. Thus, XML technologies and Web services can be used on and between virtually any combination of database, application runtime, and operating system - a characteristic that's essential for integration with heterogeneous systems.
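
As one concrete example, every standard JVM ships the JAXP transformation API, so an XSLT stylesheet can be applied with no vendor-specific code at all; the file names below are hypothetical placeholders.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XslTransform {
    public static void main(String[] args) throws Exception {
        // Hypothetical inputs: an XML document and the stylesheet that
        // reshapes it for a downstream system.
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("orders-to-partner.xsl"));
        t.transform(new StreamSource("orders.xml"),
                new StreamResult("orders-partner-format.xml"));
    }
}
```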

Developing Data Integration Applications
Data integration applications and frameworks offer the potential to unify business data while capitalizing on the particular strengths of relational databases, EDI, flat files, and XML systems. There are several approaches an organization can take to developing these solutions, each with its own advantages and disadvantages.

Middleware or server-based platforms, for example, tend to be proprietary, closed solutions that are extremely expensive to purchase, implement, and maintain. For some organizations, the more viable option is to build customized data integration applications that are flexible enough to adapt to changes and don't lock the business into a particular system vendor. Yet despite that flexibility, customized data integration applications are often extremely complex, expensive, and time-consuming to develop.

Alternative, More Cost-Effective Approaches
Between complicated ESB or EAI installations, expensive server deployments, and hundreds of hours of programming, consulting, and training, the cost and complexity of an effective data integration application can add up fast. But there are alternatives that simplify the development of data integration applications and meet the needs of individual integration challenges.

When timeliness and budget make byzantine enterprise solutions impractical, data integration development tools like Altova MapForce can be used to build customized applications very quickly and at a fraction of the cost. Such tools are easy to use and sell for under $1,000 - far less than an ESB or EAI solution. Most companies with specific data integration requirements will find using these tools a more manageable and cost-effective investment.

If a data integration tool is the right choice for your organization, there are certain characteristics you should look for to be certain the tool is up to the task.

Visual data integration and mapping
The advantages of a visual data integration tool can't be overstated. Because the tool reads and writes all the native file formats, from relational databases and EDI to flat files and XML, and even established Web services, one well-rounded programmer is all it takes to accomplish what would otherwise require a specialized team of experts. A visual interface lets the developer design a data mapping without having to understand the specific details of how to programmatically access the data formats being integrated.
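
For contrast, here is what one small piece of that plumbing looks like when written by hand: a single flat-file record (hypothetical, as before) mapped to an XML document with the standard DOM and JAXP APIs. A visual mapping tool generates or hides exactly this kind of boilerplate for every field in the design.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class HandCodedMapping {
    public static void main(String[] args) throws Exception {
        // One hypothetical pipe-delimited record, mapped field by field.
        String[] f = "1001|Acme Corp|acme@example.com".split("\\|");

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element customer = doc.createElement("Customer");
        customer.setAttribute("id", f[0]);
        doc.appendChild(customer);

        Element name = doc.createElement("Name");
        name.setTextContent(f[1]);
        customer.appendChild(name);

        Element email = doc.createElement("Email");
        email.setTextContent(f[2]);
        customer.appendChild(email);

        // Serialize the DOM tree to standard output.
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(System.out));
    }
}
```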

Multiple sources and targets
Mapping shouldn't be limited to one-to-one relationships. Look for a tool that lets developers mix multiple sources and multiple targets to map any combination of different data sources and targets in a mixed environment.

A data integration tool should also provide a comprehensive library of advanced data processing functions, and let developers specify mappings based on conditions - Boolean logic, string operations, filters, mathematical computations, and so on. It should also let developers save complex functions for use at other stages of data processing to save time and effort.
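
A minimal sketch of such a conditional mapping, written with Java streams (Java 16+ for the record type) over invented order data: a Boolean filter decides which records reach the target, and a string operation is applied on the way.

```java
import java.util.List;

public class ConditionalMapping {
    // A tiny invented source record type.
    record Order(String id, String country, double total) {}

    public static void main(String[] args) {
        List<Order> source = List.of(
                new Order("A-1", "US", 250.0),
                new Order("A-2", "DE", 90.0),
                new Order("A-3", "US", 40.0));

        // Condition: only US orders over $100 are mapped; ids are
        // normalized to lowercase as a simple string operation.
        List<String> target = source.stream()
                .filter(o -> "US".equals(o.country()) && o.total() > 100)
                .map(o -> o.id().toLowerCase())
                .toList();

        System.out.println(target);  // prints [a-1]
    }
}
```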

Automatic generation of royalty-free code
The ability to auto-generate code in various languages (Java, C#, C++, XSLT, XQuery) means you get reliable code faster and with less effort. Developers can focus on the all-important business logic of the application while leaving the generation of low-level infrastructure code to the tool. Make sure the code the tool generates can be used royalty-free and doesn't require any proprietary deployment adapters.

Furthermore, some tools, such as Altova MapForce, also have the capability to process transformations internally, letting you preview the output of your mapping and ensure accuracy before generating code. This feature is also useful for doing periodic or light-duty integration tasks on the fly.

A Way To Achieve Data Integration Today
The value of an organization's business information is directly proportional to its ability to share the information internally and externally, which is why organizations can no longer afford to let data storage and exchange systems operate in a vacuum.

Fortunately, the arrival of simple cost-effective data integration and development tools means that sharing data with customers, partners, and business units doesn't have to cost companies a small fortune for a largely unnecessary enterprise solution.

About the Author

Tim Hale is director of worldwide marketing for Altova (www.altova.com), creator of XMLSpy and other XML, data management, UML, and Web services tools. In this role, Mr. Hale leads all product marketing, online and channel marketing, marketing communications, events, advertising, and public relations activities for the company.
