Data Mining and Data Virtualization

Extending Data Virtualization Platforms

Data Mining helps organizations discover new insights from existing data so that predictive techniques can be applied to various business needs. The following are typical characteristics of data mining:

  • Extends Business Intelligence beyond Query, Reporting and OLAP (Online Analytical Processing)
  • Data Mining is a cornerstone for assessing customer risk, market segmentation and prediction
  • Data Mining is about performing computationally complex analysis techniques on very large volumes of data
  • It combines the analysis of historical data with modeling techniques to make future predictions, turning operational data into performance insight

The following are the use cases that can benefit from the application of data mining:

  • Manufacturing / Product Development: Modeling defect data and customer complaints to provide insight into customer satisfaction and help enterprises build better products.
  • Consumer Payments: Understanding the payment patterns of consumers to inform market penetration analysis and discount guidelines.
  • Consumer Industry: Customer segmentation to understand the customer base and support targeted advertisements and promotions.
  • Consumer Industry: Campaign effectiveness can be gauged with customer segmentation coupled with predictive marketing models.
  • Retail Industry: Supply chain efficiencies can be achieved by mining supply and demand data.

'In-Database' Data Mining
Data Mining is typically a multi-step process.

  1. Define the Business Issue to Be Addressed, e.g., Customer Attrition, Fraud Detection, Cross Selling.
  2. Identify the Data Model / Define the Data / Source the Data (Data Sources, Data Types, Data Usage, etc.)
  3. Choose the Mining Technique (Discovery Data Mining, Predictive Data Mining, Clustering, Link Analysis, Classification, Value Prediction)
  4. Interpret the Results (Visualization Techniques)
  5. Deploy the Results (CRM Systems).

Initially, Data Mining was implemented with a combination of multiple tools and systems, which resulted in latency and a long cycle before results were realized.

Sensing this issue, major RDBMS vendors have implemented Data Mining as part of their core database offerings. These offerings have the following key features:

  • The Data Mining engine resides inside the traditional database environment, facilitating easier licensing and packaging options
  • Eliminates data extraction and data movement, avoiding a costly ETL process
  • Major Data Mining models are available as pre-built SQL functions which can be easily integrated into the existing database development process.

The following summarizes the data mining features available as part of popular databases.

Built as DB2 data mining functions, the Modeling and Scoring services directly integrate data mining technology into DB2. This leads to faster application performance. Developers want integration and performance, as well as any facility to make their job easier. The model can be used within any SQL statement. This means the scoring function can be invoked with ease from any application that is SQL aware, either in batch, real time, or as a trigger.
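
As a very rough sketch of what that integration can look like, the example below embeds a scoring call in a trigger so that every new row is scored as it arrives. The names used (SCORE_CHURN, CUSTOMERS, CUSTOMER_SCORES and their columns) are hypothetical illustrations, not IBM's actual DB2 mining function names.

    -- Hypothetical sketch: SCORE_CHURN and all table/column names are
    -- illustrative, not actual DB2 data mining identifiers.
    CREATE TRIGGER SCORE_NEW_CUSTOMER
      AFTER INSERT ON CUSTOMERS
      REFERENCING NEW AS N
      FOR EACH ROW
      UPDATE CUSTOMER_SCORES
         SET CHURN_SCORE = SCORE_CHURN(N.AGE, N.TENURE, N.MONTHLY_SPEND)
       WHERE CUST_ID = N.CUST_ID;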

Oracle Data Mining, a component of the Oracle Advanced Analytics Option, delivers a wide range of cutting-edge machine learning algorithms inside the Oracle Database. Since Oracle Data Mining functions reside natively in the Oracle Database kernel, they deliver unparalleled performance, scalability and security. The data and data mining functions never leave the database, delivering a comprehensive in-database processing solution.
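
For example, Oracle exposes scoring through built-in SQL functions such as PREDICTION and PREDICTION_PROBABILITY. In the sketch below the model name (CHURN_MODEL) and the CUSTOMERS table are assumed purely for illustration:

    -- Oracle in-database scoring; CHURN_MODEL and CUSTOMERS are assumed names.
    SELECT cust_id,
           PREDICTION(CHURN_MODEL USING *)             AS predicted_churn,
           PREDICTION_PROBABILITY(CHURN_MODEL USING *) AS churn_probability
    FROM   customers
    WHERE  region = 'WEST';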

Data Virtualization: Data Virtualization is a concept that allows enterprises to access information contained in disparate data sources in a seamless way. As mentioned in my earlier articles, vendors such as Composite Software, Denodo Technologies, IBM, Informatica and Microsoft have developed specialized data virtualization engines. My earlier article details Data Virtualization using Middleware vs. RDBMS.

Data virtualization solutions provide a virtualized data services layer that integrates data from heterogeneous data sources and content in real time, near-real time, or batch as needed to support a wide range of applications and processes. The Forrester Wave: Data Virtualization, Q1 2012 puts data virtualization in the following perspective: "in the past 24 months, we have seen a significant increase in adoption in the healthcare, insurance, retail, manufacturing, eCommerce, and media/entertainment sectors. Regardless of industry, all firms can benefit from data virtualization."

Data Mining Inside Data Virtualization Platforms?
The increase in data sources, especially integration with Big Data and unstructured data, has made the Data Virtualization platform an important part of enterprise data access strategy. Data virtualization provides the following attributes for efficient data access across the enterprise, as sketched after the list below.

  • Abstraction: Provides location-, API-, language- and storage-technology-independent access to data
  • Federation: Converges data from multiple disparate data sources
  • Transformation: Enriches the quality and quantity of data on an as-needed basis
  • On-Demand Delivery: Provides the consuming applications the required information on-demand
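
As a platform-neutral sketch of these attributes (actual syntax varies across Denodo, Composite and the other vendors, and the source, view and column names here are invented), a virtualized data services layer essentially exposes federated views such as:

    -- Illustrative federated view: crm_db and billing_db stand for two
    -- disparate sources surfaced by the virtualization layer; all names are invented.
    CREATE VIEW customer_360 AS
    SELECT c.cust_id,
           c.segment,
           b.total_spend,
           b.last_payment_date
    FROM   crm_db.customers    c
    JOIN   billing_db.invoices b
           ON b.cust_id = c.cust_id;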

With the above benefits of the Data Virtualization Platform in mind, it is evident that enterprises would find Data Virtualization platforms even more useful if they were built with Data Mining models and algorithms, so that effective Data Mining could be performed on top of the Data Virtualization platform.

Because an important part of Data Mining is identifying the correct data sources and the associated events of interest, more effective Data Mining can be built by bringing disparate data sources under the scope of the Data Virtualization Platform than by putting Data Mining inside a single database engine.

The following extended view of the Data Virtualization Platform shows how Data Mining can become part of the Data Virtualization Platform.
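
As a rough SQL-flavored sketch of that extended view, the example below applies an Oracle-style scoring function over a federated view exposed by the virtualization layer. The model (CHURN_MODEL) and the customer_360 view are illustrative, and today's virtualization platforms do not necessarily ship such mining functions natively; this is the capability the article proposes.

    -- Sketch only: scoring applied over a federated view exposed by the
    -- virtualization layer; names are illustrative, not a shipping feature.
    SELECT v.cust_id,
           PREDICTION(CHURN_MODEL USING v.segment, v.total_spend) AS churn_risk
    FROM   customer_360 v
    WHERE  v.segment = 'SMB';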

Summary
Data Virtualization is becoming part of the mainstream enterprise data access strategy, mainly because it abstracts multiple data sources, avoids complex ETL processing, and facilitates a single version of the truth, data quality and the zero-latency enterprise.

If value-adds such as a Data Mining engine can be built on top of existing Data Virtualization platforms, enterprises will benefit further.

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing digital technologies to enable the next-generation enterprise. Believes in enterprise transformation through the Natives (Cloud Native & Mobile Native).
