Me, Myself and Digital Twins | @ThingsExpo #AI #IoT #BigData #DX #DigitalTransformation

It’s hard to get into the world of the Internet of Things (IoT) without eventually talking about Digital Twins. I was first exposed to the concept of Digital Twins when working with GE. Great concept. But are Digital Twins only relevant to physical machines such as wind turbines, jet engines, and locomotives? What can we learn from the Digital Twin concept that we can apply more broadly – to other entities (like contracts and agreements) and even to humans?

What Is a Digital Twin?
A Digital Twin is a digital representation of an industrial asset that enables companies to better understand and predict the performance of their machines, find new revenue streams, and change the way their business operates[1].

GE uses the concept of Digital Twins to create a digital replica of their physical product (e.g., wind turbine, jet engine, locomotive) that captures the asset’s detailed history (from design to build to maintenance to retirement/salvage) that can be mined to provide actionable insights into the product’s operations, maintenance and performance.

The Digital Twin concept seems to work for any entity around which there is ongoing activity or “life”; that is, there is a continuous flow of new information about that entity that can constantly update the condition or “state” of that entity.
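At its core, this is an event-sourcing pattern: an append-only history of observations, folded into a current state. As a minimal sketch (the class and field names here are illustrative assumptions, not GE’s actual implementation):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any

@dataclass
class Event:
    """A single observation about the entity (telemetry, inspection, repair)."""
    timestamp: datetime
    kind: str
    data: dict[str, Any]

@dataclass
class DigitalTwin:
    """Minimal digital twin: an append-only history plus the derived current state."""
    asset_id: str
    history: list[Event] = field(default_factory=list)
    state: dict[str, Any] = field(default_factory=dict)

    def record(self, kind: str, data: dict[str, Any]) -> None:
        self.history.append(Event(datetime.utcnow(), kind, data))
        self.state.update(data)  # fold the new observation into the current state

turbine = DigitalTwin("turbine-001")
turbine.record("telemetry", {"rpm": 14.2, "bearing_temp_c": 61.0})
turbine.record("maintenance", {"last_service": "2017-06-01"})
print(turbine.state["rpm"])   # current state reflects the latest observations
print(len(turbine.history))   # the full history is retained for mining
```

The key property is that the history is never overwritten – the “life” of the entity accumulates, and the current state is just one view derived from it.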

The Digital Twin concept is so powerful that it would be a shame not to apply it beyond physical products. So let’s try to apply the Digital Twin concept to another common type of entity – contracts and agreements.

Applying Digital Twins to Contracts
Many contracts and agreements have a life of their own; they are not just static entities. Many contracts (e.g., warranties, health insurance, automobile insurance, car rental agreements, construction contracts, apartment rental agreements, leasing agreements, personal loans, lines of credit, maintenance contracts), once established, have a stream of ongoing interactions, changes, enhancements, additions, enforcements and filings. In fact, the very process of establishing a contract can involve many interactions and exchanges of information (negotiations) that shape the coverage, agreement, responsibilities and expectations of the contract.

Let’s take a simple home insurance contract. Once the contract is established, there is a steady stream of interactions and enhancements to that contract including:

  • Payments
  • Addendums
  • Claims filings
  • Changes in terms and conditions
  • Changes in coverage
  • Changes in deductibles

Each of these engagements changes the nature of the contract, including its value and potential liabilities. Given the cumulative history of the interactions with and around that contract, the insurance contract looks like a Digital Twin of the physical property that it insures.
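A contract twin could fold those interactions into a running picture of value and exposure. As a hedged sketch (the policy fields, event methods, and liability rule below are simplifying assumptions, not how any real insurer models a policy):

```python
from datetime import date

class ContractTwin:
    """Illustrative home-insurance contract twin: interactions accumulate,
    and value/liability measures are derived from that history."""

    def __init__(self, policy_id: str, coverage: float, deductible: float):
        self.policy_id = policy_id
        self.coverage = coverage      # insured amount, USD
        self.deductible = deductible  # USD
        self.payments = []            # premium payment history
        self.claims = []              # claims filed against the policy

    def pay_premium(self, when: date, amount: float) -> None:
        self.payments.append((when, amount))

    def file_claim(self, when: date, amount: float) -> None:
        self.claims.append((when, amount))

    def change_coverage(self, new_coverage: float) -> None:
        self.coverage = new_coverage

    def open_liability(self) -> float:
        """Insurer's exposure on filed claims, net of the deductible."""
        return sum(max(0.0, amt - self.deductible) for _, amt in self.claims)

policy = ContractTwin("HO-1234", coverage=350_000, deductible=1_000)
policy.pay_premium(date(2017, 1, 1), 120.0)
policy.file_claim(date(2017, 3, 15), 4_500.0)
print(policy.open_liability())   # 3500.0
```

Payments, addendums, claims, and coverage changes all become events against the twin, so the contract’s current value and liability can be recomputed at any point in its life.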

Expanding upon the house insurance contract as a living entity, the insurance company might want to gather other data about the property in order to increase the value of the contract while reducing the contract’s risks, liabilities and obligations. Other data sources that the insurance company might want to integrate with the house insurance contract could include:

  • Changes in the value of the home (as measured by Zillow and others)
  • Changes in the value of the nearby homes
  • Changes in local crime
  • Changes in local traffic
  • Changes in the credentials and quality of local schools
  • Quality of nearby parks
  • Changes in zoning (the value and liabilities of a house could change if someone constructs a mall nearby)
  • Changes in utilities (a house with lots of grass might become less attractive as the price of water increases)
  • New nearby construction permits (residential and commercial)
  • Aerial photos (to check for unauthorized additions or construction)
  • Changes in the location of mass transit
  • Even predicted changes in local climate (see Figure 1)

Figure 1: “As Climate Changes, Southern States Will Suffer More Than Others” New York Times
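One simple way to fold such external signals into the twin is a weighted risk adjustment. The signal names and weights below are purely illustrative assumptions for the sketch, not an actual underwriting model:

```python
# Hypothetical risk adjustment: external signals (home-value change, local
# crime trend, climate projection) are combined into a single risk factor.

def risk_factor(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of external signals; a positive result means risk is rising."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

signals = {
    "home_value_change_pct": -2.0,  # Zillow-style estimate, year over year
    "crime_rate_change_pct": 5.0,   # local crime trend
    "flood_risk_delta": 0.3,        # predicted climate-driven change
}
weights = {
    "home_value_change_pct": -0.2,  # falling value raises under-insurance risk
    "crime_rate_change_pct": 0.5,
    "flood_risk_delta": 10.0,
}
adjustment = risk_factor(signals, weights)
print(round(adjustment, 2))   # 5.9 -> rising risk for this policy
```

In practice the weights would come from a trained model rather than hand-tuning, but the point stands: the contract twin becomes the place where these outside data streams are integrated and acted on.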

If the goal of the insurance company holding the home insurance policy is to 1) maximize the value of that policy while 2) reducing any potential costs, liabilities and obligations, then creating a Digital Twin of the home insurance contract seems like a smart economic move.

Applying Digital Twins to Humans
“Big Data is not about big data; it’s about getting down to the individual!”

Your customers and prospects are leaving their digital fingerprints everywhere on the Internet. Through social media (likes, retweets, posts, shares), mobile apps, website clicks, keyword searches, blog posts, online forum comments, photo uploads, music streams and video views, your customers and your best prospects are sharing their digital DNA, including their interests, passions, associations and affiliations (see Figure 2).

Figure 2: Consumers’ Digital Fingerprints

One of the keys to data monetization is to understand the behaviors and tendencies of each individual including consumers, students, teachers, patients, doctors, nurses, engineers, technicians, agents, brokers, store managers, baristas, clerks, and athletes. In order to better serve your customers, you need to capture and quantify each individual customer’s preferences, behaviors, tendencies, inclinations, interests, passions, associations and affiliations in the form of actionable insights (such as propensity scores). See Figure 3.

Figure 3: Big Data Is About Insights at the Level of the Individual
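A propensity score of this kind is often just a logistic function over a handful of behavioral features. As a sketch (the feature names and coefficients are assumptions for illustration, not a trained model):

```python
import math

def propensity(features: dict[str, float],
               coef: dict[str, float],
               bias: float) -> float:
    """Logistic score over behavioral features: squashes a weighted sum
    into a 0..1 likelihood (e.g., likelihood to buy or to churn)."""
    z = bias + sum(coef.get(name, 0.0) * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical slice of one customer's Analytic Profile
profile = {"visits_per_month": 12, "avg_basket_usd": 42.0, "support_tickets": 1}
coef = {"visits_per_month": 0.15, "avg_basket_usd": 0.02, "support_tickets": -0.4}

score = propensity(profile, coef, bias=-2.0)
print(round(score, 2))   # a 0..1 likelihood, here roughly 0.56
```

Storing scores like this in the profile, rather than raw clickstream data, is what makes the insights reusable across acquisition, retention, and cross-sell use cases.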

These actionable insights can be captured in an Analytic Profile for re-use across a number of use cases, including customer acquisition, retention, cross-sell/up-sell, fraud reduction, anti-money laundering, advocacy development and likelihood to recommend (see Figure 4).

Figure 4: Analytic Profiles as Data Monetization Foundation

The behavioral insights captured in the Analytic Profiles ultimately allow organizations to optimize their entire value chain process – from product design, development, and manufacturing to optimizing marketing activities, prioritizing sales efforts, and optimizing service and support activities (see “Big Data MBA: Course 101A – Unit III” for more details on opportunities for monetizing Michael Porter’s Value Chain Analysis).

Summary
GE uses the concept of Digital Twins to create a digital replica of a physical product that captures the asset’s detailed history (from design to build to maintenance to retirement/salvage) that can be mined to provide actionable insights into the product’s operations, maintenance and performance.

That same Digital Twins concept can be applied to contracts and agreements in order to increase the value of those contracts while minimizing any potential risks and liabilities. And the Digital Twins concept can also be applied to humans, in order to better serve and monetize individual customers across the organization’s value creation process.

[1] https://www.ge.com/digital/industrial-internet/digital-twin

The post Me, Myself and Digital Twins appeared first on InFocus Blog | Dell EMC Services.

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
