Integrating Globally Distributed Data: A New Approach

The challenges of doing business in today’s “small,” connected world

The era of Big Data is upon us. The volume, variety and velocity of data now being generated are unprecedented in human history. This poses a challenge for those tasked with data integration: how can we manage all this data, particularly across distributed data centers around the world? The complexity and compliance issues of modern data management must be addressed.

Health care organizations, online subscription services, banks and many other businesses need to provide user-friendly service while earning trust by protecting and managing critical data. For example, Personally Identifiable Information (PII), which includes sensitive data such as credit card numbers, names and Social Security numbers, can be extremely challenging to manage effectively. Multiple issues are associated with data integration, especially in a cross-regional context, and each must be considered.

The Challenges of Globalization
Back in the good old days, just a short while ago, an organization's data was easy to control and access because it was typically stored in one location. Today, many large enterprises have a global component to daily business transactions, with customers, partners and employees located around the world. Given the distributed nature of an organization's users and increasing data location regulations, the traditional method of storing data on a central server to support worldwide stakeholders no longer meets business needs.

The effects of globalization are many and varied, adding layers of complexity to business operations. One effect is that many foreign governments are becoming increasingly inflexible about data privacy and security for data originating in-country. While regulations vary by country, there are growing requirements for PII data to remain in the country of origin. This means that policies must be created and maintained to ensure that data is stored in compliance with these regulations, which might be easier said than done when a company operates across continents.

This places companies between the devil and the deep blue sea, as it were. They can either store data wherever it is most convenient, and thereby risk non-compliance, or set up separate data stores by region. Each option brings its own difficulties:

  • Ignoring local requirements: Organizations face serious legal and regulatory implications if they decide to store data outside the parameters of local regulations.
  • Losing immediate access: Organizations that remain compliant by storing data in separate geographic regions must continually consolidate and synchronize that data. No matter how frequently the data is consolidated, real-time access is not possible.

In the best of all possible worlds, businesses would be able to both adhere to regulations and have real-time access to their data.

A Best-Case Scenario
Data integration technology now exists that enables organizations to automate data location compliance while retaining their existing infrastructure, a best-case scenario for protecting and managing data. One approach is an integrated, policy-driven data management system that eliminates the challenges described above by automatically synchronizing data in real time, providing a holistic view of the data at all times.

Organizations that implement a data integration solution that lets them retain the infrastructure they have, while addressing data location compliance issues, will substantially reduce costs and administrative time. This new approach takes advantage of a "scale-out" architecture where capabilities are extended by simply adding identical data management "nodes" and enables easy scaling either within a data center or to multiple locations around the globe. Integrated policy management virtually eliminates the manual labor usually involved with scaling such a system and delivers a more streamlined, automated process.
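The "policy-driven" idea above can be made concrete: compliance rules become data that the system consults before placing anything on a node. The following Python sketch illustrates the concept only; the policy table, category names and function names are all hypothetical and do not reflect any specific product's API.

```python
# Hypothetical placement policy: data category -> origin country -> regions
# where that data may be stored. "default" applies when no country-specific
# rule exists. All values here are illustrative.
PLACEMENT_POLICY = {
    "pii": {"de": ["de"], "us": ["us"], "default": []},
    "transaction": {"default": ["us", "de", "sg"]},
}

def allowed_regions(category, origin_country):
    """Regions where data of this category, originating in the given
    country, may be stored under the policy."""
    rules = PLACEMENT_POLICY.get(category, {})
    return rules.get(origin_country, rules.get("default", []))

def is_compliant(category, origin_country, node_region):
    """A write is compliant only if the target node's region is permitted."""
    return node_region in allowed_regions(category, origin_country)
```

With a table like this, adding a node in a new region is a policy update rather than manual re-engineering: for example, `is_compliant("pii", "de", "us")` evaluates to False, so German PII would never be placed on a US node.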

It's a Small World After All
Adding data management nodes to existing infrastructure, as needed and where needed, enables businesses to run a data integration solution alongside their current data stores. When a data transaction is completed, most of the data is stored as usual, with region-specific data stored only on the node in that region. For example, if a company chooses to do business in a country that requires all PII data be maintained in-country, it can place a node in that country and the PII data will be stored only on that node, rather than deploying a separate instance of the company's existing database. The nodes create a geographically distributed fabric that provides data visibility in real time.
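The splitting of a transaction described above, with most data stored as usual and region-specific data kept only on the in-country node, can be sketched as a routing step. This is a minimal illustration under assumed field names (`PII_FIELDS`, `route_record` and the record layout are all hypothetical, not part of any described system):

```python
# Hypothetical set of fields treated as PII for this example.
PII_FIELDS = {"name", "ssn", "card_number"}

def route_record(record, origin_country):
    """Split a record into a globally stored part and a regional part.

    PII fields go only to the node in the country of origin; everything
    else is stored as usual. The record id is kept in both parts so the
    halves can be re-joined at query time.
    """
    regional = {k: v for k, v in record.items()
                if k in PII_FIELDS or k == "id"}
    global_part = {k: v for k, v in record.items()
                   if k not in PII_FIELDS}
    return global_part, regional, origin_country
```

In this sketch the distributed fabric still presents one logical record: a query joins the global part with the regional part on the shared id, so real-time visibility is preserved even though the PII never leaves its node.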

Nodes can run alongside existing database systems and may also be deployed in remote locations to enable PII data to remain in the country of origin.

Globalization has changed the way we do business, and some of those changes require organizations to rethink how they manage their cross-regional data. They must find a way to remain compliant with regional regulations while ensuring real-time access to their data. New node-based data management unifies data across different systems and regions, providing real cost savings, real-time data visibility and better response times for remote users. This development in data management helps address the challenges of doing business in today's "small," connected world.

About the Author

Frank Huerta is CEO and co-founder of TransLattice, where he is responsible for the vision and strategic direction of the company. He has been published in numerous trade publications and is a respected leader in the database management industry. He has an MBA from the Stanford Graduate School of Business and an undergraduate degree in physics, cum laude, from Harvard University.
