The Golden Age of Data Mobility?

Those masters were written on every platform imaginable – from Novell NetWare to Windows to Linux to Solaris

I was working for a mid-sized enterprise as an IT manager, on a project that was on the cutting edge of technology at the time, and because it was on the cutting edge, we were using a whole slew of different embedded applications and their masters to collect data. Those masters were written on every platform imaginable – from Novell NetWare to Windows to Linux to Solaris – and in every language that was common on each of those platforms. Our job was to make sense of it all. The information these systems collected was billing data; they all collected similar datasets, but in different manners, and each stored its data in the database in a different way. And they all used different RDBMSs. We had Oracle, MS SQL Server, Sybase, MySQL, IBM UDB, and a few you’ve likely never heard of. We had our own datacenter, and it was a non-stop flurry of activity just trying to consolidate the data, get it into a consistent format, and centralize it for billing on a single DBMS. We had custom code, Extract, Transform, and Load (ETL) systems, and extraction systems whose output we then loaded into our central database, all just to get the data in one place.

That’s the worst case I’ve ever been involved in, but seriously, every place I’ve worked has had multiple database vendors, because we live in the age of purchased applications. Even when a vendor says “oh yeah, we support X, Y, and Z,” smart IT folks immediately ask which one they develop for primarily, because that’s the one that will get first attention when updates occur, and it is the one most likely to be stable. So while you theoretically could standardize on a single database, and every enterprise I’ve ever worked at has either wanted to or claimed to, purchased applications make it highly unlikely that any ever will.


Still, you need a way to communicate that data back and forth, and when the enterprise shifted to “buy before build,” that’s where the programmers went – to integration duties, to try to straighten out communications. Your purchased (or service) shipping system needs to update inventory, which is a different system on a different database, and so on. We’ve had about a decade of this, and most IT shops have a relatively stable environment that transfers data back and forth as needed, but it is high-maintenance, since every release that changes tables or columns triggers a new round of integration work. And unless you’re terribly lucky, no two purchased packages are on the same update cycle.

It is not my habit to plug specific products in this blog, even F5 products. I like to keep it useful to you and figure that if you find it useful, F5 indirectly gets the name recognition. F5 has thus far allowed me the freedom to do just that, and this blog is not a sign of some major shift. While I am going to plug a specific product, it is not an F5 product. I’m going to tell you how all of the pain caused by the above issues can be alleviated, using Oracle GoldenGate. Oracle is a partner of F5, and our uber-smart Business Development and Product Management Engineering teams have been working with Oracle on the GoldenGate product and how it fits into our partnership. I was brought in to produce some collateral, and after reading up on GoldenGate, I fell in love.

It is not often that I, after more than a decade working in IT and several years as a Technology Editor, get excited about a product, but GoldenGate fits the bill. It solves a problem that other solutions (like ETL engines) could be hacked to solve, but it does so directly and simply.

Oracle acquired GoldenGate in mid-2009, and because it is not my job to pay attention to this stuff, the importance of the announcement flew under my radar. That being the case, I figure it might well have flown under your radar also. The architecture of GoldenGate is, like most technology, simple to understand at the 50,000-foot level, and I’ll direct you to Oracle’s GoldenGate website if you need more info. You purchase two copies of GoldenGate, one to be the source and one to be the destination. On the source, a capture process reads the database’s transaction logs and generates a binary representation of the changes called a trail file. Another process on the source, called the data pump, then sends this data out across the network to the destination. There, a piece of software called the Collector picks up the incoming stream and writes it out to a new trail file, and a final process called Replicat reads this binary trail file and creates transactions from it to submit to the target database.
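
To make that concrete, here is a minimal sketch of what the three pieces might look like as classic GoldenGate parameter files: a capture Extract reading the source logs into a local trail, a data pump shipping that trail to the target, and a Replicat applying it. The process names, directories, host, port, and the hr.invoices table are all hypothetical, for illustration only:

    -- exts.prm: capture process on the source; reads the transaction
    -- logs and writes changed data to a local trail file
    EXTRACT exts
    USERID ogguser, PASSWORD oggpass
    EXTTRAIL ./dirdat/aa
    TABLE hr.invoices;

    -- pumps.prm: data pump on the source; ships the local trail
    -- across the network to the Collector on the target host
    EXTRACT pumps
    RMTHOST targethost, MGRPORT 7809
    RMTTRAIL ./dirdat/bb
    TABLE hr.invoices;

    -- repts.prm: Replicat on the target; turns the remote trail back
    -- into transactions against the target database (ASSUMETARGETDEFS
    -- assumes source and target tables are structured identically)
    REPLICAT repts
    USERID ogguser, PASSWORD oggpass
    ASSUMETARGETDEFS
    MAP hr.invoices, TARGET hr.invoices;

As I understand it, the Collector doesn’t need a parameter file of its own; the Manager process on the target starts it automatically when the data pump connects.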

This sounds like an optimized database replication tool, which in itself would be kind of cool but not really earth-shattering. The reason this tool caught my attention (and garnered enough excitement to warrant a blog) is that the source RDBMS and the target RDBMS do not have to be from the same vendor. Yes indeed, you read that right. Think of it as heterogeneous near-real-time replication. Have a purchased application that runs on SQL Server while your core datacenter RDBMS is UDB? No problem: purchase the SQL Server edition for the source and the UDB edition for the target, configure and tune, and then tell the DBAs where to find the replica of the data. Create a separate tablespace on the target and replicate into it. If nothing else, you then only have to back up the big master database.
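
Table definitions on two different RDBMSs will rarely match exactly, so the heterogeneous case needs a little more than the sketch above. If I’m reading the documentation right, you run GoldenGate’s DEFGEN utility against the source to generate a definitions file, point the Replicat at it with SOURCEDEFS (instead of ASSUMETARGETDEFS), and reconcile any differing columns with COLMAP. Every name below (the definitions file, the billing schema, the column names) is hypothetical:

    -- Sketch of a Replicat parameter file for a heterogeneous
    -- SQL Server (source) to IBM UDB (target) topology
    REPLICAT rephet
    -- Definitions file produced by DEFGEN on the source, describing
    -- the SQL Server table structures to this Replicat
    SOURCEDEFS ./dirdef/mssql.def
    USERID ogguser, PASSWORD oggpass
    -- Map the source table into its own schema/tablespace on the
    -- target, lining up a column whose name differs
    MAP billing.invoices, TARGET repl.invoices,
    COLMAP (USEDEFAULTS, inv_total = invoice_total);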

In the case of serious integration issues, with many systems on many RDBMSs needing to talk, this is a lot cleaner than what most of us are doing, and a lot faster to adapt to changing table/column configurations. If this had been available on that first project I referenced above, perhaps my team wouldn’t have grown so quickly from tiny to huge. We’d still have needed DBAs and Systems Admins and Engineers, but the developer count might have been smaller, since almost all of our developer hours went to database integration. We only developed a few applications; our policy was definitely “purchase if possible.” I know that in the mergers-and-acquisitions space this tool would also be a huge boon. “We need to move data from our new subsidiary into our systems” is perhaps the most dreaded M&A phrase an IT person can hear. Or the second most dreaded, if “and you’re in charge of the integration, be done by Monday” is the first.

I haven’t used GoldenGate, and I know there are a host of ETL solutions that could be hacked to perform this job, but Oracle lists all of the major database vendors on the supported RDBMS list, and Oracle is pretty good about providing solid support before issuing such a statement. And the relative simplicity is striking. Sure, it will take installation on two (or more) systems, and configuration of both the networking component and the trail file component – it has to know what data you want replicated, and where to send that data – but that’s much less work than writing or hacking tools to do the same job.
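
For a sense of how much (or how little) configuration that is: judging from the documentation, wiring up the processes and trails amounts to a handful of commands in GoldenGate’s GGSCI shell, something like the following, using the same hypothetical names as above. One block runs on the source system, the other on the target:

    -- On the source: register the capture process against the
    -- transaction log and attach the local and remote trails
    ADD EXTRACT exts, TRANLOG, BEGIN NOW
    ADD EXTTRAIL ./dirdat/aa, EXTRACT exts
    ADD EXTRACT pumps, EXTTRAILSOURCE ./dirdat/aa
    ADD RMTTRAIL ./dirdat/bb, EXTRACT pumps
    START EXTRACT exts
    START EXTRACT pumps

    -- On the target: register the Replicat against the incoming
    -- trail and start applying transactions
    ADD REPLICAT repts, EXTTRAIL ./dirdat/bb
    START REPLICAT repts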

So it is worth checking out. I know I would if I were still in IT management. Life is complex enough; let me move all of my data to one DBMS and do all of my calculations, reporting, tabulation, etc. there. And since it is essentially a replication tool, I’d also replicate the data off to a secondary copy so that things like reporting weren’t bogging down the primary database.

And yeah, we have tools to make it even better. If you’re thinking of running GoldenGate over the WAN, watch for updates from our BIG-IP WOM team, but I’m sticking with my general rule not to plug products.

It certainly does appear that GoldenGate is going to usher in the golden age of data mobility, which would be good; data integration is one of the sticking points in highly adaptable IT.

More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
