|By Michael Bushong||
|January 28, 2014 11:00 AM EST||
We are a few short days away from the biggest spectacle in sports – the Super Bowl. It is impossible to avoid talk this week of Peyton Manning, the Denver Broncos, the Seattle Seahawks, and the NFL in general. But does the NFL have anything to teach the tech industry?
The NFL is a massively successful franchise by almost any measure. Despite a rash of recent scandals, including a pay-for-injury bounty program and a major lawsuit and settlement tied to concussions, the league continues to grow its fan base – both in the US and abroad – while raking in record numbers of viewers and revenue. At the heart of the NFL’s resilience in the face of scandal, and of its seemingly bottomless pit of revenues, is an uncanny ability to reinvent itself.
In fact, it is the NFL’s overall position on its own evolution that has secured its place at the top of the entertainment pantheon.
Instant Replay in the NFL
The NFL adopted instant replay in 1986 after a fairly healthy debate. Detractors would point out that part of the history of the NFL was that the game was officiated by humans, complete with their flaws. Games had been decided in the past by a team of officials who had to get it right in the moment, and changing that would somehow alter the NFL’s traditions. But it took only a few high-profile officiating mishaps played back on national television to sway sentiment, and in 1986, by a vote of 23 to 4 (with one abstaining), the NFL ushered instant replay into the league.
But instant replay’s first stint in the NFL lasted only until 1992. In its first incarnation, instant replay ranged from effective to wildly unpopular. The rules for which plays could be reviewed were not always clear. The process was slow and at times awkward, making games take too long. And the original incarnation of instant replay allowed officials to review their own calls, which led to somewhat maddening outcomes.
Instant replay went dark until making its triumphant return in 1999. With a few process tweaks (coaches being able to challenge specific calls) and the advance of technology (HD and more angles), the system is clearly here to stay.
But what is so important about how the NFL rolled out instant replay? And how does this apply to networking?
Instant Replay and Networking
First, it is worth noting that instant replay was not a unanimous choice. There were detractors – members of the Old Guard who thought that the new way of doing business was too big a departure from the past. In networking, we face much the same. There are countless people who fight change at every step because it is not consistent with the old way of doing things. They cling to their technological religion while the rest of the world moves forward. It’s not that their experiences are irrelevant or unimportant, but their unwillingness to work alongside the disruptors means that those experiences are kept private, forcing the New Guard to stumble over many of the same obstacles. This is not good for anyone.
Second, we should all realize that instant replay was tried and it failed. But despite the failure, the NFL was able to bring it back to the great benefit of the game. As the SDN revolution rages on, there are people who point to the past. They say clever things like “All that is old is new again,” or they refer derisively to past attempts the industry has made to solve some of the same problems being addressed by SDN today.
But if ideas were permanently shelved because of a setback or failure, where would we be? Using the past as a compass for the future is helpful; clinging to the past and using it to justify a refusal to move forward is destructive.
And finally, the NFL has shown a remarkable ability to iterate on its ideas. Instant replay was successful in its second run because of the changes the NFL made. New technology will not be invented with perfect foresight. The initial ideas might not even be as important as the iterative adjustments. We need to embrace failure and use it to adapt and overcome. By not being religious about its history, the NFL has successfully evolved. The question for networking specialists everywhere is to what extent our own industry is capable of setting aside its sacred cows.
Rushing, West Coast Offense, Hurry-Up Offense
Football is remarkable in how much it changes over time. Decades ago, offense was all about having a good running back. The passing game was an afterthought, used to lure defenders away from the line of scrimmage. Those days yielded to a more pass-happy time featuring the San Diego Chargers’ Air Coryell offense and the Houston Oilers’ Run and Shoot. Those teams handed the offensive mantle over to Bill Walsh’s West Coast Offense. Then we saw New Orleans’ more vertical passing attack. And now we have the hurry-up offense.
It almost doesn’t matter what is different between these systems. That so many systems have been able to thrive is what is amazing. The NFL, despite its traditions, seems most committed to reinventing itself. And for every one of these offensive systems, there are a dozen others that failed to catch on.
Evolution and Networking
The NFL has figured out that it is a league that thrives on new ideas. Whether it’s the NFL as a whole, or individual teams and players, the entire league is committed to trying new things. That commitment has created a hyper-fertile breeding ground for new ideas. It is no surprise that the league has managed to reinvent itself every few years, much to the delight of its legions of fans.
Networking is going through an interesting time. This period of three to four years might very well be looked back on as a Golden Era for networking. The number of new ideas being tested in the marketplace right now is amazing. SDN, NFV, DevOps, photonic switching, sensor networking, network virtualization… and the list goes on. But these new ideas came on the heels of what really were the Dark Ages. After the dot-com bust, the networking world went dark. Sure, there were new knobs and doodads that were useful for folks, but as an industry, the innovation was pretty incremental.
So when this Golden Era of Networking is over, which networking industry will we have? Will we return to the Dark Ages, or will we end up in another Period of Enlightenment? If the NFL is any indication of what continuous innovation looks like, it would seem the better answer is to embrace the new ideas. But are we culturally prepared to continue embracing disruption? Are we collectively unafraid of failure enough that this type of future suits us? If you ask me, we have to be.
Defense Wins Championships
There is an old saw that goes “Defense wins championships.” At this time of year, it gets trotted out as one of those universal truths. But here’s the reality: evolution wins championships. In the NFL, offenses and defenses win about the same amount (a slight nod to defenses, but only by a hair). It’s a team’s ability to evolve over the years – and even during the game – that dictates success.
Our industry is no different. We have our own Old Guard that talks about past technologies with the kind of reverence that you see when historians put on their smoking jackets and grab their pipes. But our industry is defined by its future more than its past. There is a lot to learn from our history, but if we let those teachings get in the way of our future, we will be no better off than we are now.
So when you are grabbing a beer or diving into that 7-layer dip at whatever Super Bowl party you end up at, talk about the role of innovation and how it reigns supreme over those dusty old defenses.