Integrating Globally Distributed Data: A New Approach

The challenges of doing business in today’s “small,” connected world

The era of Big Data is upon us. The volume, variety and velocity of the data now being generated are unprecedented in human history. This poses a challenge for those tasked with data integration: how can we manage all this data, particularly across distributed data centers around the world? Both the complexity and the compliance issues of modern data management must be addressed.

Health care organizations, online subscription services, banks and many other businesses need to provide user-friendly service while maintaining trust by protecting and managing critical data. Personally Identifiable Information (PII), for example, which includes sensitive details such as credit card numbers, names and Social Security numbers, can be extremely challenging to manage effectively. Multiple issues associated with data integration must be considered, especially in a cross-regional context.

The Challenges of Globalization
Back in the good old days, just a short while ago, an organization's data was easy to control and access because it was typically stored in one location. Today, many large enterprises have a global component to daily business transactions, with customers, partners and employees located around the world. Given the distributed nature of an organization's users and the growing number of data-location regulations, the traditional method of storing data on a central server to support worldwide stakeholders no longer meets business needs.

The effects of globalization are many and varied, adding layers of complexity to business operations. One effect is that many national governments are becoming increasingly inflexible about the privacy and security of data originating in-country. While regulations vary by country, there are growing requirements for PII to remain in its country of origin. This means policies must be created and maintained to ensure that data is stored in compliance with these regulations, which can be easier said than done when a company operates across continents.

This places companies between the devil and the deep blue sea, as it were: they must either store data where it is most convenient and thereby risk non-compliance, or set up separate data stores by region. Each option brings its own difficulties:

  • Ignoring local requirements: Organizations face serious legal and regulatory implications if they decide to store data outside the parameters of local regulations.
  • Losing immediate access: Organizations that stay compliant by storing data in separate geographic regions must continually consolidate and synchronize that data. No matter how often the data is consolidated, real-time access is never possible.

In the best of all possible worlds, businesses would be able to both adhere to regulations and have real-time access to their data.

A Best-Case Scenario
Data integration technology now exists that enables organizations to automate data-location compliance while retaining their existing infrastructure, a best-case scenario for protecting and managing data. One such approach is an integrated, policy-driven data management system that eliminates the challenges described above by automatically synchronizing data in real time, providing a holistic view of the data at all times.
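To make "policy-driven" concrete, here is a minimal Python sketch of how such a residency rule might be expressed declaratively. The names (PlacementPolicy and so on) are hypothetical illustrations, not drawn from any particular product:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PlacementPolicy:
        """A declarative residency rule: fields tagged as PII for a given
        region may be stored only on nodes located in that region."""
        region: str
        pii_fields: frozenset

        def allowed_regions(self, field_name):
            # PII must stay in-region; everything else may live anywhere.
            return {self.region} if field_name in self.pii_fields else {"*"}

    # Example: a rule for a country that requires PII to stay in-country.
    germany = PlacementPolicy(region="DE",
                              pii_fields=frozenset({"name", "credit_card", "ssn"}))

    assert germany.allowed_regions("credit_card") == {"DE"}
    assert germany.allowed_regions("purchase_total") == {"*"}

Because the rule is data rather than hand-written procedure, the management layer can evaluate it automatically on every write.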

Organizations that implement a data integration solution that lets them retain their existing infrastructure while addressing data-location compliance issues will substantially reduce costs and administrative time. This new approach takes advantage of a "scale-out" architecture in which capabilities are extended simply by adding identical data management "nodes," enabling easy scaling either within a data center or across multiple locations around the globe. Integrated policy management virtually eliminates the manual labor usually involved in scaling such a system and delivers a more streamlined, automated process.
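A rough sketch of the scale-out idea, under the same hypothetical names: because placement is derived from policy rather than configured by hand, bringing a new region into the fabric amounts to adding a node.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Node:
        node_id: str
        region: str  # e.g., "US", "DE", "JP"

    class Cluster:
        """A geographically distributed fabric of identical nodes."""
        def __init__(self):
            self.nodes = []

        def add_node(self, node):
            # Scaling out is just adding a node; placement follows policy,
            # so there is no manual re-partitioning step.
            self.nodes.append(node)

        def nodes_in(self, region):
            return [n for n in self.nodes if n.region == region]

    cluster = Cluster()
    cluster.add_node(Node("n1", "US"))
    cluster.add_node(Node("n2", "DE"))  # new country: add a node, nothing else changes
    assert [n.node_id for n in cluster.nodes_in("DE")] == ["n2"]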

It's a Small World After All
Adding data management nodes to existing infrastructure, as needed and where needed, enables businesses to run a data integration solution alongside their current data stores. When a data transaction is completed, most of the data is stored as usual, with region-specific data stored only on the node in that region. For example, if a company chooses to do business in a country that requires all PII data be maintained in-country, it can place a node in that country and the PII data will be stored only on that node, rather than deploying a separate instance of the company's existing database. The nodes create a geographically distributed fabric that provides data visibility in real time.

Figure: Nodes can run alongside existing database systems and may also be deployed in remote locations to enable PII data to remain in the country of origin.
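Putting the pieces together, the write path might split each record so that PII fields land only on the in-country node's store while everything else goes to the existing database. Again, this is an illustrative sketch with hypothetical names, not any vendor's actual API:

    def store_record(record, pii_fields, in_region_store, global_store):
        """Split a record: fields named in pii_fields go only to the store
        backed by the in-country node; everything else is stored as usual."""
        for name, value in record.items():
            target = in_region_store if name in pii_fields else global_store
            target[name] = value

    de_node_db = {}   # store on the node deployed in the regulated country
    global_db = {}    # the company's existing data store

    store_record({"name": "A. Muster", "credit_card": "4111-XXXX",
                  "purchase_total": 42.0},
                 {"name", "credit_card", "ssn"}, de_node_db, global_db)

    assert "credit_card" in de_node_db and "credit_card" not in global_db

The sensitive fields never leave the in-country store, while the rest of the record remains available to the global fabric in real time.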

Globalization has changed the way we do business, and some of those changes require organizations to rethink how they manage their cross-regional data. They must find a way to remain compliant with regional regulations while ensuring real-time access to their data. New node-based data management unifies data across different systems and regions, providing real cost savings, real-time data visibility and better response times for remote users. This development in data management helps address the challenges of doing business in today's "small," connected world.

About the Author

Frank Huerta is CEO and co-founder of TransLattice, where he is responsible for the vision and strategic direction of the company. He has been published in numerous trade publications and is a respected leader in the database management industry. He holds an MBA from the Stanford Graduate School of Business and an undergraduate degree in physics, cum laude, from Harvard University.
