Five Reasons Why Web 2.0 Matters

The fact is that the general public is still struggling with blogs and wikis, much less full-blown architectures of participation


I've been spending a lot of time lately with folks around the mid-Atlantic region, talking to them about Web 2.0.  I get the expected full spectrum of responses, ranging from genuine interest and active enthusiasm to outright hostility.  Part of this reflects where the Web 2.0 space still is: an elite niche of technologists, with a wider awareness that's only just beginning to grow.

Most of us know that the technology industry and the Web are often far out ahead of the mainstream.  The fact is that the general public is still struggling with blogs and wikis, much less full-blown architectures of participation and software as a service (to name just two aspects of Web 2.0).  Not sure about this?  Try sampling a few people at random and asking them what a blog is.  You will probably be surprised by the answers.  Nevertheless, I'm extremely sanguine about Web 2.0 and where it's headed (notwithstanding Bubble 2.0-type events like the RSS Fund assembling a massive $100 million war chest and using it with questionable judgment).

While generally exciting and engaging by most accounts, one thing my public presentations on Web 2.0 don't seem to address is the value proposition to the average person or organization.  Why should they spend their valuable time leveraging Web 2.0 ideas, participating in Web 2.0 software, or even creating new Web 2.0 functionality?  How exactly does the effort become worthwhile?  That question doesn't seem to be asked often enough, or answered clearly when it is.  Web 2.0 is exciting enough in its own right to sustain plenty of interest and buzz, but how does that translate into tangible value for the world at large?

To address this, I've thought long and hard and come up with at least a starting point: the most distilled, direct explanation I can manage of the benefits that Web 2.0 best practices provide in using and building engaging, useful software on the Web.


Five Reasons Why Web 2.0 Matters


  1. The Focus of Technology Moves To People With Web 2.0.  One of the lessons the software industry relearns every generation is that it's always a people problem.  It's not that people are the actual problem, of course.  It's that software developers naively use technology to try to solve our problems instead of addressing the underlying issues people actually face.  Then the wrong things inevitably happen; we've all seen technology for its own sake, or views of the world that pay far too little attention to where people fit into the picture.  Put another way, people and their needs have to be at the center of any vision of software, because technology is only here to make our lives and businesses better, easier, faster, or whatever else we require.  Web 2.0 ideas have been successful (at least in part) because they effectively put people back into the technological equation, and even turn it on its head entirely by making the technology about people.  Web 2.0 fundamentally revolves around us and seeks to ensure that we engage ourselves, participate and collaborate together, and mutually trust and enrich each other, even when we are separated geographically by half the world.  And Web 2.0 gives us very specific techniques for doing this, addressing the "people problem" directly.
  2. Web 2.0 Represents Best Practices.  The ideas in the Web 2.0 toolbox were not pulled from thin air.  In fact, they were systematically identified from what actually worked during the first generation of the Web.  Web 2.0 contains proven techniques for building valuable Web-based software and experiences.  The original Design Patterns book was one of the most popular books of its time because it at long last represented distilled knowledge of how to design software, with ideas couched in a form that was reusable and accessible.  So too are the Web 2.0 best practices.  If you want to make software that delivers the very best content and functionality to its users, Web 2.0 is an ideal place to start.
  3. Web 2.0 Has Excellent Feng Shui.  Yes, I'll get in trouble for stating it this way, but I think it fits, so here goes...  I'm a technologist by background and I don't buy into the new-agey vision of Web 2.0 that has sometimes been promulgated.  And I certainly don't believe that Web 2.0 has a "morality," as the famous Tim O'Reilly/Nicholas Carr debate highlighted.  However, as someone who has designed and built lots of software for two decades now, I have plenty of regard for the way the pieces of Web 2.0 fit together snugly and mutually reinforce each other.  Why does this matter?  It has to do with critical mass and synergy, two vital value-creation forces.  Taken individually, Web 2.0 techniques like harnessing collective intelligence, radical decentralization, and the Long Tail are quite powerful (see the small sketch after this list for what the collective-intelligence piece can look like in code), but together they have a potency much greater than their simple sum, because they strongly reinforce each other.  In fact, I'll go as far as to say that "doing" only parts of Web 2.0 can get you into some real trouble.  You need a core set of Web 2.0 techniques in order to be successful, and then the value curve goes geometric.  This is why the ROI of software built this way is so much greater.  Here's an earlier post that provides more detailed examples of why this is.
  4. Quality Is Maximized, Waste Is Minimized.  The software world is going through one of its cyclical crises as development jobs go overseas and older, more bloated ways of building software finish imploding, while the latest software techniques become more agile and lightweight (sometimes called lean).  The guys over at 37signals say it best...  Using Web 2.0 you can build better software with fewer people, less money, fewer abstractions, and less effort, and with this increase in constraints you get cleaner, more satisfying software as a result.  And simpler software is invariably higher quality.
  5. Web 2.0 Has A Ballistic Trajectory.  Never count out the momentum of a rapidly emerging idea.  For example, I'm a huge fan of Eric Evans' Domain-Driven Design, but it's so obscure that it will probably never get off the ground in a big way.  There's no buzz, excitement, or even a general marketplace for it.  This is Web 2.0's time in the sun, deserved or not.  You can use the leviathan forces of attention and enthusiasm that are swirling around Web 2.0 these days as a powerful enabler to make something important and exciting happen in your organization.  Use this opportunity to seize the initiative, ride the wave, and build great software that matters.
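Since "harnessing collective intelligence" can sound abstract, here is a minimal sketch of one common form it takes in practice: ranking community-rated items with a damped average, so that a couple of enthusiastic early votes can't outweigh the accumulated judgment of the crowd.  This is purely illustrative; the names (`Item`, `rankByCommunity`, `priorWeight`) are hypothetical, and nothing in Web 2.0 prescribes this particular formula.

```typescript
// Illustrative sketch of harnessing collective intelligence:
// blend each item's average rating with a neutral prior so that
// items with very few votes don't outrank well-established ones.
// All names here are hypothetical.

interface Item {
  id: string;
  votes: number;     // number of user ratings received
  ratingSum: number; // sum of those ratings on a 1-5 scale
}

// `priorWeight` is how many "phantom votes" at the neutral rating
// (3.0) each item starts with; real community signal dominates
// only once enough actual votes accumulate.
function rankByCommunity(items: Item[], priorWeight = 10): Item[] {
  const PRIOR_MEAN = 3.0;
  const score = (item: Item) =>
    (item.ratingSum + priorWeight * PRIOR_MEAN) / (item.votes + priorWeight);
  return [...items].sort((a, b) => score(b) - score(a));
}

// Two perfect ratings lose to five hundred very good ones:
const ranked = rankByCommunity([
  { id: "two-vote-wonder", votes: 2, ratingSum: 10 },    // 5.0 average
  { id: "crowd-favorite", votes: 500, ratingSum: 2250 }, // 4.5 average
]);
console.log(ranked.map((item) => item.id)); // ["crowd-favorite", "two-vote-wonder"]
```

The point isn't the formula; it's the design property it illustrates: the software genuinely gets better the more people participate, which is exactly what makes these techniques reinforce one another.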
Certainly there are other reasons why Web 2.0 is important, and you're welcome to list them here, but I think this captures the central vision in a way that almost anyone who is Web literate can grasp.

BTW, I will also use this moment to state that Web 2.0 is a terrible name for this new vision of Web-based, people-centric software.  Except, that is, for every other name we have at the moment ("next generation of the Web," for example).  So I will continue to use Web 2.0 until something better comes along.

OK, don't agree?  Please straighten me out.  Why does Web 2.0 matter (or not) to you?

