
Golden State Warriors Analytics Exercise | @BigDataExpo #BigData #Analytics

Identifying and quantifying variables that might be better predictors of performance

For a recent University of San Francisco MBA class, I wanted to put my students in a challenging situation where they would be forced to make difficult data science trade-offs between gathering data, preparing the data and performing the actual analysis.

The purpose of the exercise was to test their ability to “think like a data scientist” with respect to identifying and quantifying variables that might be better predictors of performance. The exercise would require them to:

  • Set up a basic analytic environment
  • Gather and organize different data sources
  • Explore the data using different visualization techniques
  • Create and test composite metrics by grouping and transforming base metrics
  • Create a score or analytic model that supports their recommendations

I gave them the links to 10 Warriors games (5 regulation wins, 3 overtime losses and 2 regulation losses) as their starting data set.
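To make that starting point concrete, here is a minimal sketch (Python with pandas) of one way to organize the ten games for analysis. Every column name and number below is an illustrative placeholder, not an actual box score; the real “Team Stats” come from the ESPN matchup pages listed in Appendix A.

```python
# Minimal analytic environment: one pandas DataFrame of team stats per game.
# Every row below is a PLACEHOLDER for illustration only -- replace the
# values with the actual "Team Stats" from the ESPN pages in Appendix A.
import pandas as pd

games = pd.DataFrame(
    [
        # opponent,   outcome,  off_reb, total_reb, three_pt_pct,
        #             fast_break_pts, tech_fouls, steals, fg_pct, assists, margin
        ("Rockets",   "win",     10, 46, 0.41, 18, 1,  9, 0.50, 31,  17),
        ("Thunder",   "win",     11, 48, 0.39, 15, 2,  8, 0.48, 29,  26),
        ("Cavaliers", "win",     12, 44, 0.42, 20, 1, 10, 0.49, 33,  35),
        ("Grizzlies", "ot_loss",  5, 39, 0.27,  9, 0,  6, 0.44, 24,  -3),
        ("Kings",     "ot_loss",  6, 40, 0.29, 11, 0,  7, 0.45, 25,  -3),
        ("Lakers",    "loss",     7, 41, 0.30, 10, 0,  5, 0.42, 22, -20),
    ],
    columns=["opponent", "outcome", "off_reb", "total_reb", "three_pt_pct",
             "fast_break_pts", "tech_fouls", "steals", "fg_pct", "assists",
             "margin"],
)

# First sanity check: how do a couple of base metrics differ by outcome?
print(games.groupby("outcome")[["off_reb", "three_pt_pct"]].mean())
```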

I then put them in a time-boxed situation (no more than 5 hours on the exercise) with the following scenario:

You have been hired by the Golden State Warriors coaching staff to review game performance data to identify and quantify metrics that predict a Warriors victory.

Here were the key deliverables for the exercise:

  1. I wanted a single, easy-to-understand slide with in-game and/or player recommendations.
  2. I wanted a breakdown of how they spent their 5 hours across the following categories:
  • Setting up the analytic environment
  • Gathering and organizing the data
  • Visualizing and analyzing the data
  • Creating the analytic models and recommendations
  3. Finally, I wanted back-up information (data, visualizations and analytics) to defend their in-game and/or player recommendations.

Exercise Learnings
Here is what we learned from the exercise:

Lesson #1: It’s hard to avoid spending too much time gathering and cleansing data. On average, the teams spent 50% to 80% of their time gathering and preparing the data, which left only 10% to 20% for the actual analysis. It’s really hard to know when “good enough” is really “good enough” when it comes to gathering and preparing the data.

Lesson #2: Quick and dirty visualizations are critical in understanding what is happening in the data and establishing hypotheses to be tested. For example, the data visualization in Figure 1 quickly highlighted the importance of offensive rebounds and three-point shooting percentage in the Warriors’ overtime losses.

Figure 1: Use Quick Data Visualizations to Establish Hypotheses to Test
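As a concrete illustration of the “quick and dirty” idea, a few lines of matplotlib against the placeholder games DataFrame sketched above are enough to surface this kind of pattern:

```python
# Quick-and-dirty comparison of two candidate metrics across outcomes.
# Uses the placeholder `games` DataFrame from the earlier sketch.
import matplotlib.pyplot as plt

by_outcome = games.groupby("outcome")[["off_reb", "three_pt_pct"]].mean()
by_outcome.plot(kind="bar", subplots=True, layout=(1, 2),
                figsize=(8, 3), legend=False)
plt.suptitle("Offensive rebounds and 3P% by game outcome")
plt.tight_layout()
plt.show()
```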

Lesson #3: Different teams came up with different sets of predictive variables. Team #1 came up with Total Rebounds, Three-Point Shooting %, Fast Break Points and Technical Fouls as the best predictors of performance. They tested a hypothesis that the more “aggressive” the Warriors played (as indicated by rebounding, fast break points and technical fouls), the more likely they were to win (see Figure 2).

Figure 2: Testing Potential Predictive Variables
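One way to turn that “aggressiveness” hypothesis into a testable number is a composite metric: z-score each base metric so they share a scale, then average them per game. This is my own sketch of the approach, not Team #1’s actual code, and it still runs on the placeholder data from above.

```python
# Composite-metric sketch: z-score the "aggressiveness" indicators so they
# share a scale, then average them into one score per game (placeholder data).
agg_cols = ["total_reb", "fast_break_pts", "tech_fouls"]
z = (games[agg_cols] - games[agg_cols].mean()) / games[agg_cols].std()
games["aggressiveness"] = z.mean(axis=1)

# Hypothesis check: do wins show a higher composite score than losses?
print(games.groupby("outcome")["aggressiveness"].mean())
```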

Team #2 came up with the variables of Steals, Field Goal Percentage and Assists as the best predictors of performance (see Figure 3).

Figure 3: ANOVA Table for Team #2
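For readers who want to reproduce something like Figure 3, here is a hedged sketch using statsmodels: regress the margin of victory on Team #2’s three variables and read the coefficients and overall F-test from the summary. The exact model form is my assumption; the post only names the variables.

```python
# Regression sketch for Team #2's variable set, on the placeholder data.
import statsmodels.api as sm

X = sm.add_constant(games[["steals", "fg_pct", "assists"]])
model = sm.OLS(games["margin"], X).fit()
print(model.summary())  # coefficients, t-stats, and the overall F-test
```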

Team #2 then tested their analytic model against two upcoming games: New Orleans and Houston. Their model not only predicted the wins correctly; the actual margins of victory fell within the predicted ranges. For example, in the game against New Orleans, the model predicted a win by 21 to 30 points, and the Warriors won by 22 (see Figure 4).

Figure 4: Predicting Warriors versus New Orleans Winner

And then in the Houston game, their model predicted a win by 0 to 10 points (where 0 indicated an overtime game), and the Warriors actually won that game by 9 points (see Figure 5).

Figure 5: Predicting Warriors versus Houston Winner
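Mechanically, scoring an upcoming game looks something like the sketch below: feed an expected stat line into the fitted model and read off a prediction interval. The inputs are placeholders, and the 95% interval is my assumption; the post doesn’t say how Team #2 derived their ranges.

```python
# Prediction sketch: estimated margin for an upcoming game plus a simple
# 95% prediction interval (inputs are placeholders, not real projections).
import pandas as pd

upcoming = pd.DataFrame(
    {"const": [1.0], "steals": [9], "fg_pct": [0.49], "assists": [30]}
)
pred = model.get_prediction(upcoming).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```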

I think I’m taking Team #2 with me next time I go to Vegas!

By the way, in case you want to run the exercise yourself, Appendix A lists the data sources that the teams used for the exercise. But be sure to operate under the same 5-hour constraint!

A few other learnings came out of the exercise that I think are incredibly valuable for both new and experienced data scientists:

  • Don’t spend too much time trying to set up the perfect analytic environment. Sometimes a simple analytic environment (a spreadsheet) can yield considerable insights with little effort.
  • Start with small data sets (10 to 20GB). That way you’ll spend more time visualizing and analyzing the data and less time gathering and preparing it. You’ll develop and test hypotheses much more quickly with smaller data sets running on your laptop, and you can stress-test the results later against the full data set.
  • Make sure that your data science team collaborates closely with business subject matter experts. The teams that struggled in the exercise were the teams that didn’t have anyone who understood the game of basketball (not sure how that’s even possible, but oh well).

One of the many reasons why I love teaching is the ability to work with students who don’t yet know what they can’t accomplish. In their eyes, everything is possible. Their fresh perspectives can yield all sorts of learnings, and not just for them. And yes, you can teach an old dog like me new tricks!

Appendix A:  Exercise Data Sources
Extract “Team Stats” from the Warriors Game Results website: http://www.espn.com/nba/team/schedule/_/name/gs. Listed below is a cross-section of games that you may want to use to start your analysis.
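If you’d rather script the gathering step than copy stats by hand, pandas can often parse HTML stat tables directly. This is a hypothetical starting point only; ESPN’s page structure changes over time and may not expose the tables to a plain HTTP fetch.

```python
# Hypothetical scraping sketch: pull every HTML table from a matchup page,
# then inspect them to find the team-stats comparison. Requires lxml or
# beautifulsoup4; ESPN may block non-browser requests, so results vary.
import pandas as pd

url = "http://www.espn.com/nba/matchup?gameId=400900067"  # Rockets 1/20/17
tables = pd.read_html(url)
print(f"found {len(tables)} tables")
print(tables[0].head())
```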


Regulation (Wins)

Rockets 1/20/17: http://www.espn.com/nba/matchup?gameId=400900067

Thunder 1/18/17: http://www.espn.com/nba/matchup?gameId=400900055

Cavaliers 1/16/17: http://www.espn.com/nba/matchup?gameId=400900040

Raptors 11/16/16: http://www.espn.com/nba/matchup?gameId=400899615

Trail Blazers 1/2/17:  http://www.espn.com/nba/matchup?gameId=400900139

Overtime (Losses)

Houston 12/1/16: http://www.espn.com/nba/matchup?gameId=400899436

Grizzlies 1/6/17: http://www.espn.com/nba/matchup?gameId=400899971

Sacramento 2/4/17: http://www.espn.com/nba/matchup?gameId=400900169


Regulation (Losses)

Spurs 10/25/16: http://www.espn.com/nba/boxscore?gameId=400899377

Lakers 11/4/16: http://www.espn.com/nba/matchup?gameId=400899528

Cavaliers 12/25/16: http://www.espn.com/nba/matchup?gameId=400899899

Note: You are welcome to gather team and/or individual stats from any other games or websites that you wish.

The post Golden State Warriors Analytics Exercise appeared first on InFocus Blog | Dell EMC Services.


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Dell EMC’s Big Data Practice.

As a CTO within Dell EMC’s 2,000+ person consulting organization, he works with organizations to identify where and how to start their big data journeys. He’s written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow where he teaches the “Big Data MBA” course. Bill also just completed a research paper on “Determining The Economic Value of Data”. Onalytica recently ranked Bill as the #4 Big Data influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Master of Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
