Joe Zeto


Adopting Best Practices for Wi-Fi: A Service Provider's Onus

Improve customer attraction rates and reduce churn by delivering a satisfying customer experience over Wi-Fi

As smartphone users become more sophisticated, they are actively seeking out the service provider they believe offers the best overall network for their smartphone. Providers are learning that users are quick to switch if they are unhappy with their existing service. Customers today expect their smartphone to deliver high-bandwidth applications along with high-quality voice services. Service providers must look to alternatives for "offloading" these bandwidth-intensive applications if they are to keep up with these high bandwidth demands. After years of serving as a nice-to-have hospitality solution, IEEE 802.11 is being thrust into the forefront as a solution.

The risk that providers face when using Wi-Fi for cellular offload is that an unsatisfactory user experience with Wi-Fi now results in the loss of high-margin smartphone users. The reality is that consumers will, in most instances, not realize that their smartphone has moved off a 3G or 4G service onto a Wi-Fi network. At that point, end users will associate a poor Wi-Fi connection with a poor cellular network connection.

A single smartphone subscriber accounts for thousands of dollars in revenue, and hundreds of dollars in profit, over the life of a single contract. Service providers must make every reasonable effort to acquire and retain these premium customers in order to sustain and grow their business.

Previous Wi-Fi Practices and Problems
As a niche hospitality product, Wi-Fi has traditionally been deployed according to the philosophy that radio frequency (RF) coverage is the primary goal. Within the service area, virtually any level of application performance was acceptable. This approach was based, in part, on the assumption that Wi-Fi was a convenience service and any connectivity was better than none. It was also expected that relatively few people would complain about poor service and that high cost customer support would only be offered under the most extreme circumstances. Initially, the primary deployment goals were to minimize installation costs, maximize RF coverage and minimize RF interference.

The corresponding deployment methodologies assumed that the vast majority of the problems that would occur were caused by 802.11 RF issues. In other words, the assumption was that if a device could detect the radio signal from the wireless access point (AP) and if there was relatively little interference, then the network would work fine.

The dual goals of minimizing cost and maximizing coverage resulted in service providers and enterprise IT managers deploying the minimum number of APs required to cover the target service area with each AP's transmit power turned up as high as possible. For example, instead of using 10 APs with a low transmit power, the traditional design might use five APs configured for maximum transmit power.
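
The capacity cost of that trade-off can be sketched with back-of-the-envelope arithmetic. The figures below (200 users, 50 Mbit/s of usable capacity per AP) are hypothetical, assumed only for illustration:

```python
# Hypothetical comparison of a coverage-first design (few high-power
# APs) against a capacity-first design (more low-power APs). Real
# results depend on PHY rates, contention and interference.

def per_client_throughput(total_clients, num_aps, ap_capacity_mbps):
    """Average throughput per client, assuming clients spread evenly."""
    clients_per_ap = total_clients / num_aps
    return ap_capacity_mbps / clients_per_ap

coverage_design = per_client_throughput(200, num_aps=5, ap_capacity_mbps=50)
capacity_design = per_client_throughput(200, num_aps=10, ap_capacity_mbps=50)

print(f"5 high-power APs:  {coverage_design:.2f} Mbit/s per client")  # 1.25
print(f"10 low-power APs:  {capacity_design:.2f} Mbit/s per client")  # 2.50
```

Doubling the AP count doubles each client's share of capacity, which is why offload networks must be engineered for capacity rather than raw coverage.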

The deployment strategy focused on maximizing RF power and minimizing installation costs, and it produced a well-known experience: the laptop or smartphone is turned on; a suitable public-access Wi-Fi network is identified, showing three or four bars to indicate a strong signal; then the user tries to connect the device to the network and is unable to. The network appears to be available, but it clearly cannot be used.

Another common situation is when users are able to get connected and then find the performance of the network to be unacceptable. Even more frustrating, a nearby user with 3G data access may be getting better performance, even though Wi-Fi should be the fastest wireless technology available to a laptop or smartphone.

These scenarios are not surprising when one considers that signal strength is usually the only criterion used to approve a network. There are numerous reasons why signal strength is a poor predictor of performance. One of the most common is that the frames that produce the signal strength indication are sent at the most robust, but lowest, encoding rate the AP supports. The users' application traffic, however, must use higher-rate encodings to attain the performance necessary for a positive user experience.
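
The gap between "beacon visible" and "data usable" can be modeled with a toy rate table. The SNR thresholds below are illustrative assumptions, not values from the 802.11 standard or any product:

```python
# Beacons and other management frames are sent at the lowest, most
# robust rate, so they stay decodable at signal levels where the
# higher-rate encodings used by application traffic already struggle.
# All thresholds here are illustrative assumptions.

BEACON_SNR_DB = 5          # assumed SNR needed to decode a beacon
RATE_TABLE = [             # (min SNR in dB, data rate in Mbit/s)
    (25, 54),
    (18, 36),
    (12, 18),
    (8, 9),
    (5, 6),
]

def usable_data_rate(snr_db):
    """Highest data rate decodable at the given SNR, or 0 if none."""
    for min_snr, rate in RATE_TABLE:
        if snr_db >= min_snr:
            return rate
    return 0

snr = 6  # a client at the cell edge
print("Beacon visible:", snr >= BEACON_SNR_DB)     # True: bars look strong
print("Usable data rate:", usable_data_rate(snr))  # 6 Mbit/s: traffic crawls
```

The device shows a usable network, yet application traffic is stuck at the bottom of the rate table, matching the "looks available, cannot be used" experience described above.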

In an access point with a marginal radio, the management frames will commonly work fine while the higher rate data encodings will be marginal or fail completely. Thus the user is able to see the AP, but is unable to use it for any practical purpose. APs will continue to advertise their presence, even if they cannot accept more users or effectively route user traffic to the Internet due to limited backhaul bandwidth, a broken backhaul connection, or misconfigured equipment.

Clearly, the ability to detect a signal from the AP is necessary in order for users to be able to access the Wi-Fi network. However, the mere fact that an AP is present and advertising itself is not sufficient to ensure that a user will achieve reasonable performance and have a satisfactory experience. A much better criterion is user satisfaction with application performance, even when the network is fully populated.

Cellular Offload - Optimizing for Coverage, Not Capacity
The business and technical requirements of cellular offload are diametrically opposed to those of traditional Wi-Fi. As we mentioned, Wi-Fi access has traditionally been a nice-to-have for providers and the network typically received relatively light use. Under those conditions, it does make some sense to optimize the network for coverage rather than capacity. In modern offload networks, the 802.11 usage scenario is defined by congregations of people carrying Wi-Fi-enabled smartphones. The requirements to serve these customers demand that the network deliver a satisfactory experience to many concurrent users in a high density, heavily used location. The Wi-Fi access network in this scenario is a critical infrastructure element of the cellular network.

In large public venues such as stadiums, concerts, conferences, and fairs, the number of smartphones packed into a given area creates some of the most demanding scenarios for 802.11 deployments. With nearly 50 percent of the population in many markets carrying and using smartphones, and that share still growing, these networks need a lot of capacity. The challenge with a network designed primarily for coverage is that it uses relatively few APs to service large areas and will therefore have a large number of clients attempting to share any given AP. In these environments, the network quickly becomes congested, and users frequently complain that they can see the network fine, but that it just doesn't work.

Another significant concern in large public venue deployments is the capabilities of the wired infrastructure elements. With thousands of local users, multiple controllers, switches and servers must all be configured and working properly in order to deliver a quality experience to all customers. Downloading a web page occasionally from a single client will often work when the network is lightly loaded. When the network is fully loaded and people are simultaneously trying to upload pictures, download web pages, stream video or Skype with friends, the aggregate customer experience under scale is often very different from the user experience of downloading a web page in an unloaded network.

Why Is Deployment Testing Critical?
Each of the abovementioned infrastructure elements is critical to delivering a successful user experience. Generally speaking, these elements are completely or mostly untested during a deployment, even though they behave differently at scale than they would under lightly loaded conditions.

Most testing today assumes that there is a direct relationship between the maximum RF signal strength detected by a test laptop and the performance of a high value client such as a smartphone. This correlation is wishful thinking at best. The reason is that smartphones are optimized for power, space and performance. Their radio designs and software will differ from those found in a test laptop, and their performance will differ accordingly.

One of the most common occurrences is that the smartphone will not connect to the AP one would typically expect. Smartphones, like all 802.11 devices, make their own decisions about which specific access point to use when connecting to a network. They also make their own decisions about when to roam, which is to say when to stop talking to one AP in the network and start talking to another, presumably because better performance will result. Roaming algorithms are completely unspecified, so every client design roams at different times and in different patterns.

Some devices try to minimize the number of roams because there is generally a slight service interruption during a roam that they want to avoid as much as possible. These devices often remain connected to an AP over a severely degraded link even when a substantially better option is available. The device remains connected and avoids the momentary impacts of frequent roaming, but the performance degrades severely as the client device gets further from its associated AP. Even worse, the effect of this one underperforming client device is to reduce the overall capacity of the network because all other client devices on the same channel must wait for the now lengthier conversation to complete before they can transfer their data. To a user, of course, this simply looks like the network is doing a terrible job.
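
Because airtime on a channel is shared, the drag from one sticky client can be quantified with a simple model. The rates and frame size below are hypothetical:

```python
# Toy model of the "sticky client" effect: each client transmits one
# frame per round, and a frame's airtime is inversely proportional to
# the client's data rate, so one slow client starves everyone.

def aggregate_throughput(client_rates_mbps, frame_bytes=1500):
    """Aggregate channel throughput in Mbit/s for round-robin senders."""
    frame_bits = frame_bytes * 8
    airtime = sum(frame_bits / (rate * 1e6) for rate in client_rates_mbps)
    return frame_bits * len(client_rates_mbps) / airtime / 1e6

healthy = aggregate_throughput([54] * 10)          # everyone at 54 Mbit/s
one_sticky = aggregate_throughput([54] * 9 + [1])  # one client stuck at 1 Mbit/s

print(f"All clients healthy: {healthy:.1f} Mbit/s aggregate")    # 54.0
print(f"One sticky client:   {one_sticky:.1f} Mbit/s aggregate") # ~8.6
```

In this sketch, a single client stuck at 1 Mbit/s drags the whole channel from 54 Mbit/s down to under 9 Mbit/s, which is exactly the "terrible network" that every user on that channel then perceives.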

At the other end of the spectrum are devices that instantly roam every time they think there is a slightly better option. These devices attempt to achieve improved connection quality by accepting more frequent interruptions due to roaming. They tend to work well for data services, but the interruptions do have an impact on the quality of real-time services such as voice or video. Once again, since device operation is essentially invisible to users, users naturally assume that any performance degradation that they perceive must be the fault of the network.
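
Since client roaming algorithms are proprietary and unspecified, the following is only a plausible sketch of the two behaviors described above, reduced to a single hysteresis parameter:

```python
# Hypothetical roaming rule: stay on the current AP unless the link is
# badly degraded or a candidate AP is better by a hysteresis margin.
# A large margin models a "sticky" client; a small margin models a
# client that roams aggressively. All values are assumptions.

def should_roam(current_rssi_dbm, candidate_rssi_dbm,
                hysteresis_db=8, floor_dbm=-75):
    """Decide whether to roam from the current AP to a candidate."""
    if current_rssi_dbm < floor_dbm:
        return True  # link is badly degraded; roam regardless of margin
    return candidate_rssi_dbm >= current_rssi_dbm + hysteresis_db

# Sticky client: a large margin ignores a clearly better AP
print(should_roam(-70, -64, hysteresis_db=8))  # False
# Aggressive client: a small margin roams on any slight improvement
print(should_roam(-70, -64, hysteresis_db=2))  # True
```

Both devices see identical radio conditions, yet one stays and one roams; only measurement with the actual client devices reveals which behavior a flagship phone exhibits.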

Understanding how flagship products will behave in a Wi-Fi deployment is critical to tuning the network so it can maximize the high value users' quality of experience. Reading the RF signal strength on a laptop or just looking at a phone's signal level periodically is simply insufficient to facilitate this tuning.

Core Technical Principles of Next Generation Practices
Looking forward, the defining principles of the next generation of best practices include: using application traffic, in addition to signal strength, as a measurement source; reporting customer satisfaction metrics; using the same high-value client devices that the network's users will have; testing network scalability prior to heavy usage at events; and isolating client behavior from network behavior.

In essence, it involves moving beyond the site survey and into the site assessment. A site assessment can be run in roughly the same amount of time as a site survey, but it provides a much more comprehensive view of a network's ability to deliver customer satisfaction, and a more powerful set of metrics to use as the deployment sign-off criteria.

The major principle of site assessment is to use application traffic and measure the customer experience directly rather than infer it from signal strength as is done with site survey. This approach detects a much broader range of deployment issues including misconfigurations of network elements, improperly installed APs with marginal performance, network problems caused by client behaviors, and noise. In short, any issue that degrades customer experience can be detected by measuring the actual customer experience.
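
The shift from survey to assessment can be pictured in a few lines of code: instead of logging RSSI at each test point, time an actual application transfer and record the throughput the customer would experience. The test URL and the 5 Mbit/s pass threshold below are hypothetical placeholders, not values from any particular assessment tool:

```python
# Minimal sketch of an application-level measurement for a site
# assessment: download real content and report experienced throughput.

import time
import urllib.request

def measure_download(url, timeout=10):
    """Download a URL and return (elapsed seconds, throughput in Mbit/s)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        payload = response.read()
    elapsed = time.perf_counter() - start
    return elapsed, len(payload) * 8 / elapsed / 1e6

if __name__ == "__main__":
    elapsed, mbps = measure_download("http://example.com/")  # placeholder URL
    verdict = "PASS" if mbps >= 5.0 else "FAIL"  # assumed sign-off threshold
    print(f"{mbps:.2f} Mbit/s in {elapsed:.2f}s -> {verdict}")
```

Run at each test point under realistic load, a measurement like this passes or fails on what the customer actually experiences, not on how many bars the survey laptop displays.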

Equally important to the measurement technique is the ability to isolate the source of identified issues to the client or to the network. Service providers are able to easily address network-related issues once they are identified using existing practices. Somewhat surprisingly, it is also true that some significant client issues can also be addressed through alternative network configurations. This ability to take action is the reason why it is important to know which client devices matter most to businesses and to understand which specific client behaviors are leading to a degraded user experience. Once known, alternative configurations can be tested to deliver optimized performance to the most valuable users.

New Cellular Offload Practices = Increased Revenues
Cellular offload onto Wi-Fi requires a new approach to deployments that focuses on customer satisfaction with their applications in a realistically loaded environment. Adopting the best practices will allow service providers to dramatically improve the quality of their new deployments, resulting in happier customers. With Wi-Fi now inexorably tied into cellular contracts and the rapid rise in Wi-Fi-enabled devices, service providers should be able to improve customer attraction rates and reduce churn by delivering a satisfying customer experience over Wi-Fi. Ultimately, improved Wi-Fi will ensure revenue protection and decrease infrastructure costs for service providers.

More Stories By Joe Zeto

Joe Zeto serves as a technical marketing evangelist within Ixia’s marketing organization. He has over 17 years of experience in wireless and IP networking, both from the engineering and marketing sides. He has extensive knowledge and a global perspective of the networking market and the test and measurement industry.

Prior to joining Ixia, Joe was Director of Product Marketing at Spirent Communications running Enterprise Switching, Storage Networking, and Wireless Infrastructure product lines. He has a Juris Doctor from Loyola Law School, Los Angeles, CA.
