Demystifying #DataScience | @CloudExpo #BigData #AI #ArtificialIntelligence

Data science is about identifying those variables and metrics that might be better predictors of performance

[Opening Scene]: Billy Dean is pacing the office. He’s struggling to keep his delivery trucks at full capacity and on the road. Random breakdowns, unexpected employee absences and unscheduled truck maintenance are impacting bookings, revenues and, ultimately, customer satisfaction. He keeps hearing from his business customers how they are leveraging data science to improve their business operations. Billy Dean starts to wonder if data science can help him. As he contemplates what data science can do for him, he slowly drifts off to sleep, and visions of data science start dancing in his head…

[Poof! Suddenly Wizard Wei appears]: Hi, I’m your data science wizard, here to help alleviate your data science concerns. I don’t understand why folks try to make the data science discussion complicated. Let’s start with a simple definition of data science:

Data science is about identifying those variables and metrics that might be better predictors of performance

The key to a successful analytical model is having a robust set of variables to test for their predictive capabilities. And the key to assembling a robust set of variables is to get the business users engaged early in the process.

[A confused Billy Dean]: Okay, but I’m still confused. I mean, how does this really apply to my business?

[A patient Wizard Wei]: Well, let’s say that you are trying to predict which of your routes are likely to have under-capacity loads so that you can combine loads. In order to identify those variables that might be better predictors of under-capacity routes, you might ask your business users:

What data might you want to have in order to predict under-capacity routes?

The business users are likely to come up with a wide variety of variables, including:

- Customer name
- Ship-to location
- Customer industry
- Building permits
- Customer tenure
- Change in customer size
- Customer stock price
- Customer D&B rating
- Types of products hauled
- Time of year
- Seasonality/holidays
- Day of week
- Traffic
- Weather
- Local events
- Distance from distribution center
- Open headcount on Indeed.com
- Tenure of logistics manager

The Data Science team will then gather these variables, perform some data transformations and enrichment, and then look for variables and combinations of variables that yield the best predictive results regarding under-capacity routes (see Figure 1).

Figure 1: Data Science Process

Role of Artificial Intelligence
[A less confused Billy Dean]: Ah, I think I understand, but what about all this talk about artificial intelligence? From some of these commercials on TV, it appears that robots with artificial intelligence will be ruling the world. Can you say Skynet?

[A still patient Wizard Wei]: Ah, that’s just marketing. Artificial intelligence is just one of many different tools in the predictive analytics kit bag of a data scientist. But artificial intelligence – while embracing some very sophisticated mathematical, data enrichment and computing techniques – is really pretty straightforward. All artificial intelligence is trying to do is to find and quantify relationships between variables buried in large data sets (see Figure 2).

Figure 2: Understanding Artificial Intelligence

[An inquisitive Billy Dean]: Okay, I’m starting to get it, but there seem to be so many different analytic and predictive algorithms from which to choose. How does the business user know where to start?

[An increasingly frustrated Wizard Wei]: Ah, that’s the secret to the process. Business users don’t need to know which algorithms to use; they need to be able to identify those variables that might be better predictors of performance. It is up to the data science team to determine which variables are the most appropriate by testing them against the different algorithms.

Data Mining, Machine Learning and Artificial Intelligence (including areas such as cognitive computing, statistics, neural networks, text analytics, video analytics, etc.) are all members of the broader category of data science tools. Our data scientist team has experts in each of these areas, though no one data scientist is an expert at all of them (in spite of what they tell me). The different data science tools are used in different scenarios for different needs. Think of one of your mechanics. They have a large toolbox full of different tools. They determine what tools to use to fix a truck based upon the problem they are trying to solve. That’s exactly what a data scientist is doing, just with a different toolbox of algorithms.

No single algorithm is best over the whole domain, so different algorithms are needed to cover different domains. Often, combinations of algorithms are used to achieve the best results. To be honest, it’s like a giant jigsaw puzzle, with the data science team constantly testing different combinations of metrics, data enrichment and algorithms until they find the combination that yields the best results.

[An enlightened Billy Dean]: I think I’ve finally got it. All of these different algorithms and techniques are just trying to help predict what is likely to happen so that I can make better operational and customer decisions. But what’s the realm of what’s possible with data and analytics? I mean, how effective can my organization become at leveraging data and analytics to power my business?

[A proud Wizard Wei]: Great question, and that’s the heart of the big data and data science conversation. Figure 3 shows how you could use these different data science tools to progress up the Big Data Business Model Maturity Index: transitioning from running your business on descriptive analytics that tell you what happened (Monitoring stage), to predictive analytics that tell you what is likely to happen (Insights stage), to prescriptive analytics that tell you what you should do (Optimization stage).

Figure 3: Leveraging Artificial Intelligence to drive Business Value

In the end, the data and the analytics are only useful if they help you optimize key operational processes, reduce compliance and security risks, uncover new revenue opportunities and create a more compelling, more prescriptive customer engagement. Ultimately, data and analytics are all about your business.

[A satisfied Billy Dean]: That’s great Wizard Wei! Thanks for your help!

Now, what can you do about my taxes…

To learn more about “Demystifying Data Science,” come to my Dell EMC World session: “Demystifying Data Science: A Pragmatic Guide To Building Big Data Use Cases.” See you there!

The post Demystifying Data Science appeared first on InFocus Blog | Dell EMC Services.

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Dell EMC’s Big Data Practice.

As a CTO within Dell EMC’s 2,000+ person consulting organization, he works with organizations to identify where and how to start their big data journeys. He’s written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow where he teaches the “Big Data MBA” course. Bill also just completed a research paper on “Determining The Economic Value of Data”. Onalytica recently ranked Bill as #4 Big Data Influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Master of Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
