@BigDataExpo: Article

Integrating Globally Distributed Data: A New Approach

The challenges of doing business in today’s “small,” connected world

The era of Big Data is upon us. The volume, variety and velocity of data now being generated is unprecedented in human history. This poses a challenge for those tasked with data integration: how can we manage all this data, particularly across distributed data centers around the world? The complexity and compliance issues of modern data management must be addressed.

Health care organizations, online subscription services, banks and many other businesses need to provide user-friendly service while ensuring trust by protecting and managing critical data. For example, Personally Identifiable Information (PII), which includes sensitive details such as credit card numbers, names and Social Security numbers, can be extremely challenging to manage effectively. Data integration raises multiple issues that must be considered, especially in a cross-regional context.

The Challenges of Globalization
Back in the good old days, just a short while ago, an organization's data was easy to control and access because it was typically stored in one location. Today, many large enterprises have a global component to daily business transactions, with customers, partners and employees located around the world. Given the distributed nature of an organization's users and increasing data location regulations, the traditional method of storing data on a central server to support worldwide stakeholders no longer meets business needs.

The effects of globalization are many and varied, adding layers of complexity to business operations. One effect is that many foreign governments are becoming increasingly inflexible about data privacy and security for data originating in-country. While regulations vary by country, there are growing requirements for PII data to remain in the country of origin. This means that policies must be created and maintained to ensure that data is stored in compliance with these regulations, which might be easier said than done when a company operates across continents.
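The kind of data-location policy described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the region names and the `RESIDENCY_POLICIES` mapping are hypothetical placeholders for whatever a real compliance team would maintain.

```python
# Hypothetical sketch: map a record's country of origin to the region
# where its PII is required to reside, and validate a proposed storage
# location before a write is committed.

RESIDENCY_POLICIES = {
    "DE": "eu-central",   # German PII must remain in an EU region
    "CA": "ca-east",      # Canadian PII must remain in Canada
    "US": "us-west",
}

def compliant_region(country_of_origin: str, proposed_region: str) -> bool:
    """Return True if storing the record in proposed_region satisfies policy."""
    required = RESIDENCY_POLICIES.get(country_of_origin)
    # No policy on file: accept the proposed region here for simplicity.
    # A production system would more likely fail closed.
    return required is None or required == proposed_region
```

Because the policy lives in data rather than in application code, adding a new jurisdiction is a configuration change rather than a redeployment.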

This places companies between the devil and the deep blue sea, as it were. They must either store data where it is most convenient, and thereby risk non-compliance, or set up separate data stores by region. Each option brings its own difficulties:

  • Ignoring local requirements: Organizations face serious legal and regulatory implications if they decide to store data outside the parameters of local regulations.
  • Losing immediate access: Organizations that remain compliant by storing data in separate geographic regions must continually consolidate and synchronize that data. No matter how frequently the data is consolidated, real-time access is not possible.

In the best of all possible worlds, businesses would be able to both adhere to regulations and have real-time access to their data.

A Best-Case Scenario
Data integration technology exists that enables organizations to automate data location compliance while retaining their existing infrastructure, a best-case scenario for protecting and managing data. One approach is an integrated, policy-driven data management system that eliminates the challenges described above by automatically synchronizing data in real time, providing a holistic view of the data at all times.

Organizations that implement a data integration solution that lets them retain the infrastructure they have, while addressing data location compliance issues, will substantially reduce costs and administrative time. This new approach takes advantage of a "scale-out" architecture in which capabilities are extended by simply adding identical data management "nodes," enabling easy scaling either within a data center or across multiple locations around the globe. Integrated policy management virtually eliminates the manual labor usually involved in scaling such a system and delivers a more streamlined, automated process.

It's a Small World After All
Adding data management nodes to existing infrastructure, as needed and where needed, enables businesses to run a data integration solution alongside their current data stores. When a data transaction is completed, most of the data is stored as usual, with region-specific data stored only on the node in that region. For example, if a company chooses to do business in a country that requires all PII data be maintained in-country, it can place a node in that country and the PII data will be stored only on that node, rather than deploying a separate instance of the company's existing database. The nodes create a geographically distributed fabric that provides data visibility in real time.
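The routing described above, where most of a transaction is stored as usual while region-specific fields land only on the in-country node, can be sketched as follows. The field names, the `PII_FIELDS` set, and the node-naming convention are all hypothetical; a real system would derive them from its policy configuration.

```python
# Hypothetical sketch: partition a transaction record so that PII fields
# are routed to an in-country node while non-sensitive fields go to the
# shared (global) store. The two parts retain the same record id so the
# distributed fabric can rejoin them for a unified view.

PII_FIELDS = {"name", "ssn", "credit_card"}

def split_record(record: dict, country: str):
    """Return (global_part, regional_part, target_node) for one record."""
    regional_part = {k: v for k, v in record.items() if k in PII_FIELDS}
    global_part = {k: v for k, v in record.items() if k not in PII_FIELDS}
    # Carry the shared key into the regional part for later joins.
    regional_part["id"] = record["id"]
    target_node = f"node-{country.lower()}"
    return global_part, regional_part, target_node
```

For example, a German order containing a customer name would yield a regional part holding only the name (plus the record id), destined for `node-de`, while the order amount and other non-sensitive fields flow to the global store unchanged.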

Nodes can run alongside existing database systems and may also be deployed in remote locations to enable PII data to remain in the country of origin.

Globalization has changed the way we do business, and some of those changes require organizations to rethink how they manage their cross-regional data. They must find a way to remain compliant with regional regulations while ensuring real-time access to their data. New node-based data management unifies data across different systems and regions, providing real cost savings, real-time data visibility and better response times for remote users. This development in data management helps address the challenges of doing business in today's "small," connected world.

More Stories By Frank Huerta

Frank Huerta is CEO and co-founder of TransLattice, where he is responsible for the vision and strategic direction of the company. He has been published in numerous trade publications and is a respected leader in the database management industry. He has an MBA from the Stanford Graduate School of Business and an undergraduate degree in physics, cum laude, from Harvard University.



@MicroservicesExpo Stories
Logs are continuous digital records of events generated by all components of your software stack – and they’re everywhere – your networks, servers, applications, containers and cloud infrastructure just to name a few. The data logs provide are like an X-ray for your IT infrastructure. Without logs, this lack of visibility creates operational challenges for managing modern applications that drive today’s digital businesses.
Financial Technology has become a topic of intense interest throughout the cloud developer and enterprise IT communities. Accordingly, attendees at the upcoming 20th Cloud Expo at the Javits Center in New York, June 6-8, 2017, will find fresh new content in a new track called FinTech.
You have great SaaS business app ideas. You want to turn your idea quickly into a functional and engaging proof of concept. You need to be able to modify it to meet customers' needs, and you need to deliver a complete and secure SaaS application. How could you achieve all the above and yet avoid unforeseen IT requirements that add unnecessary cost and complexity? You also want your app to be responsive in any device at any time. In his session at 19th Cloud Expo, Mark Allen, General Manager of...
@DevOpsSummit taking place June 6-8, 2017 at Javits Center, New York City, is co-located with the 20th International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. @DevOpsSummit at Cloud Expo New York Call for Papers is now open.
The 20th International Cloud Expo has announced that its Call for Papers is open. Cloud Expo, to be held June 6-8, 2017, at the Javits Center in New York City, brings together Cloud Computing, Big Data, Internet of Things, DevOps, Containers, Microservices and WebRTC to one location. With cloud computing driving a higher percentage of enterprise IT budgets every year, it becomes increasingly important to plant your flag in this fast-expanding business opportunity. Submit your speaking proposal ...
"Dice has been around for the last 20 years. We have been helping tech professionals find new jobs and career opportunities," explained Manish Dixit, VP of Product and Engineering at Dice, in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Rapid innovation, changing business landscapes, and new IT demands force businesses to make changes quickly. In the eyes of many, containers are at the brink of becoming a pervasive technology in enterprise IT to accelerate application delivery. In this presentation, attendees learned about the: The transformation of IT to a DevOps, microservices, and container-based architecture What are containers and how DevOps practices can operate in a container-based environment A demonstration of how ...
Cloud Expo, Inc. has announced today that Andi Mann returns to 'DevOps at Cloud Expo 2017' as Conference Chair The @DevOpsSummit at Cloud Expo will take place on June 6-8, 2017, at the Javits Center in New York City, NY. "DevOps is set to be one of the most profound disruptions to hit IT in decades," said Andi Mann. "It is a natural extension of cloud computing, and I have seen both firsthand and in independent research the fantastic results DevOps delivers. So I am excited to help the great t...
Without lifecycle traceability and visibility across the tool chain, stakeholders from Planning-to-Ops have limited insight and answers to who, what, when, why and how across the DevOps lifecycle. This impacts the ability to deliver high quality software at the needed velocity to drive positive business outcomes. In his general session at @DevOpsSummit at 19th Cloud Expo, Phil Hombledal, Solution Architect at CollabNet, discussed how customers are able to achieve a level of transparency that e...
Get deep visibility into the performance of your databases and expert advice for performance optimization and tuning. You can't get application performance without database performance. Give everyone on the team a comprehensive view of how every aspect of the system affects performance across SQL database operations, host server and OS, virtualization resources and storage I/O. Quickly find bottlenecks and troubleshoot complex problems.
SYS-CON Events announced today that Dataloop.IO, an innovator in cloud IT-monitoring whose products help organizations save time and money, has been named “Bronze Sponsor” of SYS-CON's 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. Dataloop.IO is an emerging software company on the cutting edge of major IT-infrastructure trends including cloud computing and microservices. The company, founded in the UK but now based in San Fran...
DevOps is being widely accepted (if not fully adopted) as essential in enterprise IT. But as Enterprise DevOps gains maturity, expands scope, and increases velocity, the need for data-driven decisions across teams becomes more acute. DevOps teams in any modern business must wrangle the ‘digital exhaust’ from the delivery toolchain, "pervasive" and "cognitive" computing, APIs and services, mobile devices and applications, the Internet of Things, and now even blockchain. In this power panel at @...
@DevOpsSummit at Cloud taking place June 6-8, 2017, at Javits Center, New York City, is co-located with the 20th International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time to wait for long developm...
SYS-CON Events has announced today that Roger Strukhoff has been named conference chair of Cloud Expo and @ThingsExpo 2017 New York. The 20th Cloud Expo and 7th @ThingsExpo will take place on June 6-8, 2017, at the Javits Center in New York City, NY. "The Internet of Things brings trillions of dollars of opportunity to developers and enterprise IT, no matter how you measure it," stated Roger Strukhoff. "More importantly, it leverages the power of devices and the Internet to enable us all to im...
Kubernetes is a new and revolutionary open-sourced system for managing containers across multiple hosts in a cluster. Ansible is a simple IT automation tool for just about any requirement for reproducible environments. In his session at @DevOpsSummit at 18th Cloud Expo, Patrick Galbraith, a principal engineer at HPE, discussed how to build a fully functional Kubernetes cluster on a number of virtual machines or bare-metal hosts. Also included will be a brief demonstration of running a Galera MyS...
Keeping pace with advancements in software delivery processes and tooling is taxing even for the most proficient organizations. Point tools, platforms, open source and the increasing adoption of private and public cloud services requires strong engineering rigor – all in the face of developer demands to use the tools of choice. As Agile has settled in as a mainstream practice, now DevOps has emerged as the next wave to improve software delivery speed and output. To make DevOps work, organization...
As we enter the final week before the 19th International Cloud Expo | @ThingsExpo in Santa Clara, CA, it's time for me to reflect on six big topics that will be important during the show. Hybrid Cloud: This general-purpose term seems to provide a comfort zone for many enterprise IT managers. It sounds reassuring to be able to work with one of the major public-cloud providers like AWS or Microsoft Azure while still maintaining an on-site presence.
Between 2005 and 2020, data volumes will grow by a factor of 300 – enough data to stack CDs from the earth to the moon 162 times. This has come to be known as the ‘big data’ phenomenon. Unfortunately, traditional approaches to handling, storing and analyzing data aren’t adequate at this scale: they’re too costly, slow and physically cumbersome to keep up. Fortunately, in response a new breed of technology has emerged that is cheaper, faster and more scalable. Yet, in meeting these new needs they...
In his general session at 19th Cloud Expo, Manish Dixit, VP of Product and Engineering at Dice, discussed how Dice leverages data insights and tools to help both tech professionals and recruiters better understand how skills relate to each other and which skills are in high demand using interactive visualizations and salary indicator tools to maximize earning potential. Manish Dixit is VP of Product and Engineering at Dice. As the leader of the Product, Engineering and Data Sciences team at D...
I’m a huge fan of open source DevOps tools. I’m also a huge fan of scaling open source tools for the enterprise. But having talked with my fair share of companies over the years, one important thing I’ve learned is that you can’t scale your release process using open source tools alone. They simply require too much scripting and maintenance when used that way. Scripting may be fine for smaller organizations, but it’s not ok in an enterprise environment that includes many independent teams and to...