By Michael Bushong
January 28, 2014 11:00 AM EST
We are a few short days away from the biggest spectacle in sports – the Super Bowl. It is impossible to avoid talk this week of Peyton Manning, the Denver Broncos, the Seattle Seahawks, and the NFL in general. But does the NFL have anything to teach tech industries?
The NFL is a massively successful franchise by almost any measure. Despite a rash of recent scandals including a pay-for-injury bounty program and a major lawsuit and settlement tied to concussions, the league continues to grow its fan base – both in the US and abroad – while raking in record numbers of viewers and revenue. At the heart of the NFL’s resilience when it comes to scandal and its seemingly bottomless pit of revenues is an uncanny ability to reinvent itself.
In fact, it is the NFL’s overall position on its own evolution that has secured its place at the top of the entertainment pantheon.
Instant Replay in the NFL
The NFL adopted instant replay in 1986 after a fairly healthy debate. Detractors would point out that part of the history of the NFL was that the game was officiated by humans, complete with their flaws. Games had been decided in the past by a team of officials who had to get it right in the moment, and changing that would somehow alter the NFL’s traditions. But it took only a few high-profile officiating mishaps played back on national television to sway sentiment, and in 1986, by a vote of 23 to 4 (with one abstaining), the NFL ushered instant replay into the league.
But instant replay’s first stint in the NFL lasted only until 1992. In its first incarnation, instant replay ranged from effective to wildly unpopular. The rules for which plays could be reviewed were not always clear. The process was slow and at times awkward, making games take too long. And the original incarnation of instant replay allowed officials to review their own calls, which led to somewhat maddening outcomes.
Instant replay went dark until making its triumphant return in 1999. With a few process tweaks (coaches being able to challenge specific calls) and the advance of technology (HD and more angles), the system is clearly here to stay.
But what is so important about how the NFL rolled out instant replay? And how does this apply to networking?
Instant Replay and Networking
First, it is worth noting that instant replay was not a unanimous choice. There were detractors – members of the Old Guard who thought that the new way of doing business was too big a departure from the past. In networking, we face much of the same. There are countless people who fight change at every step because it is not consistent with the old way of doing things. They cling to their technological religion while the rest of the world moves forward. It’s not that their experiences are not relevant or even not important, but their inability to work alongside the disruptors means that those experiences are kept private, forcing the New Guard to stumble over many of the same obstacles. This is not good for anyone.
Second, we should all realize that instant replay was tried and it failed. But despite the failure, the NFL was able to bring it back to the great benefit of the game. As the SDN revolution rages on, there are people who point to the past. They say clever things like “All that is old is new again” or they refer derisively to past attempts the industry has made to solve some of the same problems being addressed by SDN today.
But if ideas were permanently shelved because of setback or failure, where would we be? Using the past as a compass for the future is helpful; clinging to the past and using it to justify a refusal to move forward is destructive.
And finally, the NFL has shown a remarkable ability to iterate on its ideas. Instant replay was successful in its second run because of the changes the NFL made. New technology will not be invented with perfect foresight. The initial ideas might not even be as important as the iterative adjustments. We need to embrace failure and use it to adapt and overcome. By not being religious about its history, the NFL has successfully evolved. The question for networking specialists everywhere is to what extent our own industry is capable of setting aside its sacred cows.
Rushing, West Coast Offense, Hurry-Up Offense
Football is remarkable in how much it changes over time. Decades ago, offense was all about having a good running back. The passing game was an afterthought, used to lure defenders away from the line of scrimmage. Those days yielded to a more pass-happy time featuring the San Diego Chargers’ Air Coryell offense and the Houston Oilers’ Run and Shoot. Those teams handed the offensive mantle over to Bill Walsh’s West Coast Offense. Then we saw New Orleans’ more vertical passing attack. And now we have the whole hurry-up offense.
It almost doesn’t matter what is different between these systems. That so many systems have been able to thrive is what is amazing. The NFL, despite its traditions, seems most committed to reinventing itself. And for every one of these offensive systems, there are a dozen others that failed to catch on.
Evolution and Networking
The NFL has figured out that it is a league that thrives on new ideas. Whether it’s the NFL as a whole, or individual teams and players, the entire league is committed to trying new things. That commitment has created a hyper-fertile breeding ground for new ideas. It is no surprise that the league has managed to reinvent itself every few years, much to the delight of its legions of fans.
Networking is going through an interesting time. This period of 3-4 years might very well be looked on as a Golden Era for networking. The number of new ideas being tested in the marketplace right now is amazing. SDN, NFV, DevOps, Photonic Switching, Sensor Networking, Network Virtualization… and the list goes on. But these new ideas came on the heels of what really were the Dark Ages. After the dot-com bust, the networking world went dark. Sure, there were new knobs and doodads that were useful for folks, but as an industry, the innovation was pretty incremental.
So when this Golden Era of Networking is over, what kind of networking industry will we have? Will we return to the Dark Ages, or will we end up in another Period of Enlightenment? If the NFL is any indication of what continuous innovation looks like, it would seem the better answer is to embrace the new ideas. But are we culturally prepared to continue embracing disruption? Are we collectively unafraid enough of failure that this type of future suits us? If you ask me, we have to be.
Defense Wins Championships
There is an old saw that goes “Defense wins championships.” At this time of year, it gets trotted out as one of those universal truths. But here’s the reality: evolution wins championships. In the NFL, offenses and defenses win championships at about the same rate (a slight nod to defenses, but only by a hair). It’s a team’s ability to evolve over the years – and even during the game – that dictates success.
Our industry is no different. We have our own Old Guard that talks about past technologies with the kind of reverence that you see when historians put on their smoking jackets and grab their pipes. But our industry is defined by its future more than its past. There is a lot to learn from our history, but if we let those teachings get in the way of our future, we will be no better off than we are now.
So when you are grabbing a beer or diving into that 7-layer dip at whatever Super Bowl party you end up at, talk about the role of innovation and how it reigns supreme over those dusty old defenses.
[Today's fun fact: Clans of long ago that wanted to get rid of their unwanted people without killing them used to burn their houses down, hence the expression "To get fired." I wonder where the term "lay off" came from then?]