Integrating Globally Distributed Data: A New Approach

The challenges of doing business in today’s “small,” connected world

The era of Big Data is upon us. The volume, variety and velocity of data now being generated are unprecedented in human history. This poses a challenge for those tasked with data integration: how can we manage all this data, particularly across distributed data centers around the world? The complexity and compliance issues of modern data management must be addressed.

Health care organizations, online subscription services, banks and many other businesses need to provide user-friendly service while earning trust by protecting and managing critical data. For example, Personally Identifiable Information (PII), which includes sensitive data such as credit card numbers, names and Social Security numbers, can be extremely challenging to manage effectively. Data integration raises multiple issues that must be considered, especially in a cross-regional context.

The Challenges of Globalization
Back in the good old days, just a short while ago, an organization's data was easy to control and access because it was typically stored in one location. Today, many large enterprises have a global component to daily business transactions, with customers, partners and employees located around the world. Given the distributed nature of an organization's users and increasing data location regulations, the traditional method of storing data on a central server to support worldwide stakeholders no longer meets business needs.

The effects of globalization are many and varied, adding layers of complexity to business operations. One effect is that many foreign governments are becoming increasingly inflexible about data privacy and security for data originating in-country. While regulations vary by country, there are growing requirements for PII data to remain in the country of origin. This means that policies must be created and maintained to ensure that data is stored in compliance with these regulations, which might be easier said than done when a company operates across continents.
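
The residency rules described above can be sketched as a simple policy check. This is a minimal illustration with hypothetical policy data and region names, not any particular vendor's API: each policy maps a country of origin to the set of regions where its PII may be stored.

```python
# Minimal data-residency policy check. The policy table and region names
# are hypothetical, for illustration only.
RESIDENCY_POLICIES = {
    "DE": {"eu-central"},          # German PII must stay in an EU-central region
    "RU": {"ru-west"},             # Russian PII must remain in-country
    "US": {"us-east", "us-west"},  # US PII may be stored in either US region
}

def is_compliant(country_of_origin: str, storage_region: str) -> bool:
    """True if storing data from `country_of_origin` in `storage_region`
    satisfies the residency policy. Countries without an explicit policy
    are treated as unrestricted in this sketch."""
    allowed = RESIDENCY_POLICIES.get(country_of_origin)
    return allowed is None or storage_region in allowed

print(is_compliant("DE", "eu-central"))  # True
print(is_compliant("DE", "us-east"))     # False: would violate residency
```

In practice such policies are maintained by compliance teams and must be evaluated automatically at write time, which is exactly where a manual, per-region process becomes error-prone.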

This places companies between the devil and the deep blue sea, as it were: they must either store data where it is most convenient, and thereby risk non-compliance, or set up separate data stores by region. Each option brings its own difficulties:

  • Ignoring local requirements: Organizations face serious legal and regulatory implications if they decide to store data outside the parameters of local regulations.
  • Losing immediate access: Organizations that remain compliant by storing data in separate geographic regions must continually consolidate and synchronize that data. No matter how frequently the consolidation happens, real-time access to the data is not possible.

In the best of all possible worlds, businesses would be able to both adhere to regulations and have real-time access to their data.

A Best-Case Scenario
Data integration technology exists that enables organizations to automate data location compliance while retaining their existing infrastructure, a best-case scenario for protecting and managing data. One approach is an integrated, policy-driven data management system that eliminates the challenges described above by automatically synchronizing data in real time, providing a holistic view of the data at all times.

Organizations that implement a data integration solution that lets them retain their existing infrastructure, while addressing data location compliance issues, can substantially reduce costs and administrative time. This new approach takes advantage of a "scale-out" architecture in which capability is extended by simply adding identical data management "nodes," making it easy to scale either within a data center or across multiple locations around the globe. Integrated policy management virtually eliminates the manual labor usually involved in scaling such a system and delivers a more streamlined, automated process.
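
The scale-out idea can be illustrated with a toy fabric: capacity is extended by registering one more identical node, with no other reconfiguration. All class and region names here are hypothetical, not the actual product's interface.

```python
# Toy "scale-out" fabric: every node is identical, and extending the system
# is just a matter of adding another node. Names are illustrative only.
class Node:
    def __init__(self, region: str):
        self.region = region
        self.store = {}  # each node holds its own slice of the data

class Fabric:
    def __init__(self):
        self.nodes = {}  # region -> Node

    def add_node(self, region: str) -> Node:
        """Scale out by one node; no other reconfiguration is needed here."""
        node = Node(region)
        self.nodes[region] = node
        return node

fabric = Fabric()
fabric.add_node("us-east")
fabric.add_node("eu-central")  # scaling to a new location = one call
print(sorted(fabric.nodes))    # ['eu-central', 'us-east']
```

The point of the sketch is that growth, whether within a data center or across regions, is additive rather than a re-architecture.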

It's a Small World After All
Adding data management nodes to existing infrastructure, as needed and where needed, enables businesses to run a data integration solution alongside their current data stores. When a data transaction is completed, most of the data is stored as usual, with region-specific data stored only on the node in that region. For example, if a company chooses to do business in a country that requires all PII data be maintained in-country, it can place a node in that country and the PII data will be stored only on that node, rather than deploying a separate instance of the company's existing database. The nodes create a geographically distributed fabric that provides data visibility in real time.

Nodes can run alongside existing database systems and may also be deployed in remote locations to enable PII data to remain in the country of origin.
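
The per-transaction split described above can be sketched as follows, with hypothetical field and region names: region-restricted fields are written only to the node in the record's country of origin, while the remaining data is stored as usual.

```python
# Splitting a transaction: PII fields stay on the in-country node; the
# rest goes to the usual store. All names are hypothetical.
PII_FIELDS = {"name", "ssn", "credit_card"}

regional_nodes = {"DE": {}, "US": {}}  # one dict stands in for each node
central_store = {}                     # the company's existing data store

def store_transaction(record_id: str, record: dict, origin: str) -> None:
    pii = {k: v for k, v in record.items() if k in PII_FIELDS}
    rest = {k: v for k, v in record.items() if k not in PII_FIELDS}
    regional_nodes[origin][record_id] = pii  # PII never leaves the country
    central_store[record_id] = rest          # non-sensitive data stored as usual

store_transaction("tx1", {"name": "A. Kunde", "amount": 42.0}, origin="DE")
print(central_store["tx1"])         # {'amount': 42.0}
print(regional_nodes["DE"]["tx1"])  # {'name': 'A. Kunde'}
```

Because the split happens at write time, no separate consolidation step is needed to keep the in-country node and the central store consistent.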

Globalization has changed the way we do business, and some of those changes require organizations to rethink how they manage their cross-regional data. They must find a way to remain compliant with regional regulations while ensuring real-time access to their data. New node-based data management unifies data across different systems and regions, providing real cost savings, real-time data visibility and better response times for remote users. This development in data management helps address the challenges of doing business in today's "small," connected world.

More Stories By Frank Huerta

Frank Huerta is CEO and co-founder of TransLattice, where he is responsible for the vision and strategic direction of the company. He has been published in numerous trade publications and is a respected leader in the database management industry. He holds an MBA from the Stanford Graduate School of Business and an undergraduate degree in physics, cum laude, from Harvard University.
