Delivering Breakthrough Performance with 802.11ac

Breaking the wireless Ethernet gigabit barrier

Consumers continue to adopt multiple connected devices, and video is expected to account for more than 70 percent of global traffic. This growth, and the increased reliance on wireless networks that comes with it, is putting stress on existing 802.11a/b/g/n networks. As a result of this high usage, users are likely to experience degraded performance, choppy video and slower load times. At a time when IT managers report that network users now average more than one Wi-Fi connected device per person, solutions that can handle the rapid growth of devices are at a premium.

The next generation of the 802.11 standard, IEEE 802.11ac, promises to finally break the wireless Ethernet gigabit barrier. The technology will deliver higher bandwidth while providing a better quality of experience (QoE) for end users, and is expected to be adopted rapidly across all markets: residential, enterprise, carrier and large venue.

Some of the first applications for 802.11ac's faster speeds will be better residential video streaming, data syncing between mobile devices, and data backup. Streaming digital media between devices faster and connecting more wireless devices simultaneously will be among the first benefits for consumers and enterprises. Service providers will be able to deploy the new technology to offload traffic from congested 3G and 4G LTE cellular networks, and in dense operator hotspots 802.11ac will deliver better performance to more users.

To date, all 802.11 revisions have focused on increasing transport speeds, which leads to higher traffic delivery rates and ultimately to faster response times as experienced by the end user. The introduction of 802.11n brought the advances of MIMO (multiple-input, multiple-output), which delivers traffic over multiple spatial streams, and of packet aggregation. MIMO delivered marked improvements in physical transport rates, enabling more bits per second to be transmitted over Wi-Fi than ever before. Packet aggregation delivered equally impressive improvements in the transport experience, allowing devices to send more data once they had gained access to the wireless medium. The new 802.11ac protocol continues down this path by preserving aggregation techniques, advancing the physical transport rates yet again, and introducing parallel transport into Wi-Fi through a technique known as Multi-User MIMO (MU-MIMO), in which multiple client devices receive packets concurrently.
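To make these rate gains concrete, an 802.11 PHY data rate can be approximated from the number of data subcarriers, the bits each subcarrier carries, the coding rate, the number of spatial streams, and the OFDM symbol duration. The Python sketch below is our own illustration (the function name and parameters are not from any standard library); under these assumptions it reproduces the commonly quoted 450 Mbps peak rate of three-stream 802.11n and the 1.3 Gbps rate of a three-stream, 80 MHz 802.11ac device.

```python
def phy_rate_mbps(data_subcarriers, bits_per_subcarrier, coding_rate,
                  spatial_streams, symbol_us=3.6):
    """Approximate 802.11n/ac PHY rate in Mbit/s.

    Bits per OFDM symbol scale with subcarriers, modulation density,
    coding rate and spatial streams; dividing by the symbol duration
    (3.6 us with the short guard interval) yields bits per microsecond,
    i.e. Mbit/s.
    """
    bits_per_symbol = (data_subcarriers * bits_per_subcarrier *
                       coding_rate * spatial_streams)
    return bits_per_symbol / symbol_us

# 802.11n: 40 MHz channel (108 data subcarriers), 64-QAM (6 bits), rate 5/6
print(phy_rate_mbps(108, 6, 5 / 6, spatial_streams=3))  # 450.0
# 802.11ac: 80 MHz channel (234 data subcarriers), 256-QAM (8 bits), rate 5/6
print(phy_rate_mbps(234, 8, 5 / 6, spatial_streams=3))  # 1300.0
```

The jump from 450 Mbps to 1.3 Gbps comes from doubling the channel width and moving from 64-QAM to 256-QAM; wider channels and more spatial streams push the specification's ceiling higher still.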

This is the first time in Wi-Fi history that directed traffic can be delivered to multiple client devices at the same time. This ability has a significant impact on the delivery of content to any location with multiple users, especially where the content is revenue-generating or critical.

Achieving Gigabit+ Performance with 802.11ac
To reach this level of performance, 802.11ac combines a variety of advancements, addressing the need for improvement through three primary techniques:

  1. Increasing Raw Bandwidth allows for the higher speeds associated with 802.11ac. The standard makes use of a higher-rate encoding scheme known as 256-QAM, which carries 33 percent more data per subcarrier (8 bits versus 6) than the 64-QAM used in 802.11n. Signal-to-noise ratios that worked for 802.11n are no longer sufficient, because the spacing between constellation points, and thus the difference between detectable signal levels, is now significantly smaller.
  2. Multi-user Support makes 802.11ac a real information superhighway, unlike its predecessors that only allowed one device to transmit at a time. MU-MIMO allows an access point to transmit data to multiple client devices on the same channel at the same time. It works by directing some of the spatial streams to one client and other spatial streams to a second client. MU-MIMO is critical to performance improvements in environments with high client counts.
  3. Individual Client Channel Optimization is also a major performance booster. The concept behind channel optimization is transmit beamforming (TxBF). The reflections and attenuations that are common during the transmission of 802.11 signals have a significant impact on overall network performance. With TxBF, the access point communicates with the client devices to determine the types of impairment present in the environment. The access point then "precodes" the transmitted frame with the inverse of the impairment, such that when the next frame is transmitted and transformed by the medium, it is received as a clean frame by the client (a simplified numerical sketch of this precoding idea follows this list). Since no two clients are in the same location, TxBF needs to be applied on a client-by-client basis and constantly updated to reflect the changing environment.
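In the actual standard, the access point learns the channel from sounding frames and compressed feedback from each client; the NumPy sketch below is a deliberately simplified, single-carrier illustration of the core "precode with the inverse of the impairment" idea (zero-forcing precoding; all names here are ours, not the standard's).

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Toy 2x2 channel matrix H: how the medium mixes and attenuates
# two spatial streams on their way to the clients.
H = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))

x = np.array([1 + 1j, -1 - 1j])      # one symbol per spatial stream

# Zero-forcing precoding: transmit H^-1 @ x so the channel's effect cancels.
precoded = np.linalg.inv(H) @ x
received = H @ precoded              # what the clients actually observe

print(np.allclose(received, x))      # True: the frame arrives "clean"
```

Because every client sees a different H, and H drifts as people and objects move, the precoder must be computed per client and refreshed continuously, which is exactly why TxBF adds so much protocol overhead and test complexity.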

Overcoming Technical Challenges
One of the biggest frustrations for developers and users of 802.11 is that each new version needs to work with previous ones. It can also be extremely difficult to identify the root cause of development problems. For example, when an application performs poorly, it is often hard to determine whether the cause is an environmental, client, or network issue. The various devices in an 802.11 network are highly correlated, so an issue in one area quickly ripples through to many others. Developers have lacked an effective means to assess the total picture from the RF layer to the application layer.

IEEE 802.11ac makes this problem significantly more challenging. In addition to being deployed into an existing environment with ten years' worth of previous releases, 802.11ac makes use of advanced technologies that are substantially more complex and demanding than those of previous versions. This latest generation of 802.11 requires a rethinking of how the technology is developed and tested, taking a much more holistic view throughout the product development life cycle.

Traditionally, the RF section is verified using one set of equipment, and the upper-layer functions are then tested using a second set of tools. The overall technical complexity, and the introduction of new technologies such as TxBF, demand coordination and control between the different layers of the protocol stack. Without this coordination, it is difficult to exercise these functions and to quickly pinpoint performance issues.

802.11ac brings the promise of moving Wi-Fi into the limelight as a trusted and capable communication protocol, and it will require equipment and rigor to match. The new generation of test equipment should be able to decode every frame in real time and determine each frame's RF characteristics as well as its frame-level performance, and to generate every frame without limitation in real time to adequately test receiver performance. Previous approaches rely on digitized data records for both generation and analysis, creating or capturing what are known as I/Q files, on equipment typically adapted from the general-purpose RF domain. This results in equipment that is capable of only a single spatial stream and can generate or capture only a small fraction of the frames required for testing. To meet the need, test equipment must generate and analyze all frames in real time to the limit of the specification, tightly integrate RF and MAC functionality in 802.11ac, and include integral, real-time channel emulation to address TxBF performance.
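A back-of-the-envelope calculation shows why record-and-playback I/Q testing struggles at 802.11ac scale. The figures below are our own illustrative assumptions (16-bit I and Q samples at the 80 MHz channel bandwidth, three spatial streams; actual instrument precision varies), but the order of magnitude is the point.

```python
# Raw I/Q data rate for capturing an 80 MHz, 3-stream 802.11ac channel.
sample_rate_hz = 80e6        # roughly one complex sample per Hz of bandwidth
bytes_per_sample = 2 * 2     # 16-bit I + 16-bit Q (assumed precision)
streams = 3

bytes_per_second = sample_rate_hz * bytes_per_sample * streams
print(f"{bytes_per_second / 1e6:.0f} MB/s")                 # 960 MB/s
print(f"{bytes_per_second * 3600 / 1e12:.1f} TB per hour")  # 3.5 TB per hour
```

At nearly a gigabyte of samples per second, capture memory limits recordings to seconds of traffic, so sustained, specification-limit testing has to generate and analyze frames in real time rather than from stored I/Q files.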

Increasing Performance for All Markets
Gigabit+ performance for the residential, enterprise, carrier and large venue markets is possible with the 802.11ac standard. But to realize the performance and density promise, chip and hardware developers must navigate some significant technical challenges, as detailed in this article. They must ensure graceful migration from existing deployed solutions by providing backward compatibility, and they must deliver high transmit and receive RF performance across a wide variety of signals. They must maintain high performance to multiple clients under the channel conditions that will exist in real deployments, while at the same time providing the reliability and feature robustness needed for enterprise- and carrier-grade 802.11 adoption. Ultimately, developers need to ensure that the key application traffic, most notably video, can be delivered with quality.

About the Author

Joe Zeto serves as a technical marketing evangelist within Ixia's marketing organization. He has over 17 years of experience in wireless and IP networking, on both the engineering and marketing sides. He has extensive knowledge and a global perspective of the networking market and the test and measurement industry.

Prior to joining Ixia, Joe was Director of Product Marketing at Spirent Communications, running the Enterprise Switching, Storage Networking, and Wireless Infrastructure product lines. He holds a Juris Doctorate from Loyola Law School in Los Angeles, CA.
