By Joe Zeto
July 18, 2012 01:51 PM EDT
Consumers continue to adopt multiple connected devices, and video is expected to account for more than 70 percent of global traffic. This growth, and the increased reliance on wireless networks, is straining existing 802.11a/b/g/n networks. As usage climbs, users are likely to experience degraded performance, choppy video and slower load times. At a time when IT managers report that network users now average more than one Wi-Fi-connected device per person, solutions that can handle the rapid growth in devices are at a premium.
The next generation of the 802.11 standard, IEEE 802.11ac, promises to finally break the wireless Ethernet gigabit barrier. The technology will deliver higher bandwidth with a better quality of experience (QoE) for end users, and is expected to be adopted rapidly across all markets: residential, enterprise, carrier and large venue.
Some of the first applications for 802.11ac's faster speeds will be residential video streaming, data syncing between mobile devices, and data backup. Streaming digital media between devices faster and connecting more wireless devices simultaneously will be among the first benefits for consumers and enterprises. Service providers, in turn, will be able to deploy the new technology to offload traffic from congested 3G and 4G-LTE cellular networks, and in dense operator hotspots 802.11ac will deliver better performance to more users.
To date, all 802.11 revisions have focused on increasing transport speeds, which leads to higher traffic delivery rates and ultimately to faster response times as experienced by the end user. The introduction of 802.11n brought the advances of MIMO (multiple-input, multiple-output), delivering traffic over multiple spatial streams, and packet aggregation. MIMO delivered marked improvements in physical transport rates, enabling more bits per second to be transmitted over Wi-Fi than ever before. Packet aggregation delivered equally impressive improvements in transport efficiency, allowing devices to send more data once they had gained access to the wireless medium. The new 802.11ac protocol continues down this path by preserving aggregation techniques, advancing the physical transport rates yet again, and introducing parallel transport to Wi-Fi through a technique known as Multi-User MIMO (MU-MIMO), in which multiple client devices receive packets concurrently.
This is the first time in Wi-Fi history that directed traffic can be delivered to multiple client devices at the same time. This ability has a significant impact on the delivery of content to any location with multiple users, especially where content is revenue-generating or critical.
Achieving Increased Gigabit+ Performance with 802.11ac
To reach its performance targets, 802.11ac combines a variety of advancements, addressing the need for improvement through three primary initiatives:
- Increasing Raw Bandwidth enables the higher speeds associated with 802.11ac. The standard uses a denser encoding scheme known as 256-QAM, which carries 33 percent more data per symbol than the 64-QAM used in 802.11n. Signal-to-noise ratios that sufficed for 802.11n are no longer adequate, because the spacing between detectable signal levels is now significantly smaller.
- Multi-user Support makes 802.11ac a real information superhighway, unlike its predecessors that only allowed one device to transmit at a time. MU-MIMO allows an access point to transmit data to multiple client devices on the same channel at the same time. It works by directing some of the spatial streams to one client and other spatial streams to a second client. MU-MIMO is critical to performance improvements in environments with high client counts.
- Individual Client Channel Optimization is also a major performance booster. The concept behind channel optimization is transmit beamforming (TxBF). The reflections and attenuations common during the transmission of 802.11 signals have a significant impact on overall network performance. With TxBF, the access point communicates with the client devices to determine the types of impairment present in the environment. The access point then "precodes" the transmitted frame with the inverse of the impairment, so that when the next frame is transmitted and transformed by the medium, it is received as a clean frame by the client. Since no two clients are in the same location, TxBF must be applied on a client-by-client basis and constantly updated to reflect the changing environment.
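The raw-bandwidth arithmetic above can be sketched directly. The subcarrier counts and symbol times below are the published 802.11ac (VHT) figures; the helper function itself is illustrative, not a standard API:

```python
# Sketch: 802.11ac (VHT) PHY data-rate arithmetic.
# rate = data subcarriers * bits/subcarrier * coding rate * streams / symbol time

DATA_SUBCARRIERS = {20e6: 52, 40e6: 108, 80e6: 234, 160e6: 468}

def phy_rate_mbps(bandwidth_hz, bits_per_subcarrier, coding_rate,
                  spatial_streams, short_gi=True):
    """PHY data rate in Mbps for one modulation/coding/stream combination."""
    symbol_time = 3.6e-6 if short_gi else 4.0e-6  # OFDM symbol incl. guard interval
    bits_per_symbol = DATA_SUBCARRIERS[bandwidth_hz] * bits_per_subcarrier * coding_rate
    return bits_per_symbol * spatial_streams / symbol_time / 1e6

# 80 MHz channel, one stream: 64-QAM (6 bits/subcarrier) vs 256-QAM (8), rate-5/6 coding
r64 = phy_rate_mbps(80e6, 6, 5/6, 1)   # 11n-style modulation
r256 = phy_rate_mbps(80e6, 8, 5/6, 1)  # 11ac's 256-QAM
print(f"64-QAM: {r64:.1f} Mbps, 256-QAM: {r256:.1f} Mbps "
      f"(+{(r256/r64 - 1)*100:.0f}%)")   # the 33 percent gain from denser encoding

# Three streams at 80 MHz with 256-QAM cross the gigabit barrier:
print(f"3 streams: {phy_rate_mbps(80e6, 8, 5/6, 3):.0f} Mbps")
```

The 433 Mbps single-stream figure is the familiar VHT80 MCS 9 rate; three such streams yield 1300 Mbps, which is where the "gigabit Wi-Fi" headline number comes from.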
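The "precode with the inverse of the impairment" idea behind TxBF and MU-MIMO can be illustrated with a toy zero-forcing precoder. The channel matrix here is random stand-in data, not a measured channel, and real systems use more elaborate sounding and precoding than this sketch:

```python
# Toy zero-forcing precoder: the AP applies the channel's pseudo-inverse
# before transmitting, so each client receives a clean copy of its own symbol.
import numpy as np

rng = np.random.default_rng(0)

# 2 clients, 4 AP antennas: H[i, j] = complex gain from AP antenna j to client i
H = rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4))

# Zero-forcing precoder: right pseudo-inverse of H, chosen so that H @ W = I
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

x = np.array([1 + 1j, -1 - 1j])   # one symbol intended for each client
received = H @ (W @ x)            # what each client hears after the channel

# Each client recovers only its own symbol; cross-talk to the other client is nulled
print(np.allclose(received, x))   # → True
```

This is why TxBF must be per-client and constantly refreshed: the precoder `W` is valid only for the current channel matrix `H`, and `H` changes whenever clients or the environment move.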
Overcoming Technical Challenges
One of the biggest frustrations for developers and users of 802.11 is that each new version must interoperate with previous ones. It can also be extremely difficult to identify the root cause of development problems. For example, when an application performs poorly, it is often hard to determine whether the cause is an environmental, client, or network issue. The various devices in an 802.11 network are highly interdependent, so an issue in one area quickly ripples into many others. Developers have lacked an effective means of assessing the total picture, from the RF layer to the application layer.
IEEE 802.11ac makes this problem significantly more challenging. In addition to being deployed into an existing environment with ten years' worth of previous releases, 802.11ac makes use of advanced technologies that are substantially more complex and demanding than previous versions. This latest generation of 802.11 requires a rethinking of how the technology is developed and tested, taking a much more holistic view throughout the product development life cycle.
Traditionally, the RF section is verified using one set of equipment, and then the upper layer functions are tested using a second set of tools. The overall technical complexity and the introduction of new technologies such as TxBF demand coordination and control between the different layers of the protocol stack. Without this coordination, it would be difficult to utilize these functions and to quickly pinpoint performance issues.
802.11ac brings the promise of moving Wi-Fi into the limelight as a trusted and capable communication protocol, and it will require test equipment and rigor to match. The new generation of test tools must decode every frame in real time, determining each frame's RF characteristics as well as its frame-level performance, and must generate every frame in real time, without limitation, to adequately exercise receiver performance. Previous approaches used digitized data records for both generation and analysis, creating or capturing what are known as I/Q files, with equipment typically adapted from the general-purpose RF domain. This results in equipment that handles only a single spatial stream and can generate or capture only a small fraction of the frames required for testing. To meet the need, test systems must generate and analyze all frames in real time to the limit of the specification, tightly integrate the RF and MAC functionality of 802.11ac, and include integral, real-time channel emulation to address TxBF performance.
Increasing Performance for All Markets
Gigabit+ performance for the residential, enterprise, carrier and large-venue markets is possible with the 802.11ac standard. But to realize its performance and density promise, chip and hardware developers must navigate the significant technical challenges detailed in this article. They must ensure graceful migration from existing deployed solutions by providing backward compatibility, and deliver high-performance RF transmit and receive behavior across a wide variety of signals. They must maintain high performance to multiple clients under the channel conditions of real deployments, while providing the reliability and feature robustness needed for enterprise- and carrier-grade 802.11 adoption. Ultimately, developers must ensure that key application traffic - most notably video - is delivered with quality.