21 April 2009

802.11n Performance: Radios vs. Streams

Many organizations pride themselves on being at the cutting edge of technological innovation, the first to deploy a vendor’s newest offering. Indeed, more than 50% of organizations surveyed will evaluate other wireless vendors’ products within the next 12-18 months. Being the first to catch the hottest new innovation carries with it the risk of being burned, and a little due diligence can go a long way toward ensuring that a buying decision is prudent.

Take, for example, the matter of 802.11n performance. 802.11n performance depends in part on both the number of radio chains and the number of spatial streams. The two are often confused...at the buyer’s peril. The number of radio chains corresponds to the number of transmitters and receivers, and is typically denoted as “m x n,” where m is the number of transmitters and n the number of receivers. m x n need not be symmetrical, and some 802.11n access points can dynamically adjust the numbers, e.g., a 3x3 radio can operate in 3x3, 2x3, or 1x3 mode depending on configuration, mode, and power profile.

While multiple transmitter and receiver chains can be used to improve signal quality, the big increases in data rates associated with multiple input-multiple output (MIMO) access points depend more on the number of spatial streams. Using one stream, the maximum 802.11n data rate per radio, assuming 40MHz bandwidth, is 150Mbps. Using two streams that number doubles to 300Mbps, and so on. The number of spatial streams is typically denoted by S in “m x n : S.” There are as yet no three-stream access points on the market, though several access points have three receiver and/or transmitter chains.
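The 150Mbps-per-stream figure falls out of the 802.11n OFDM arithmetic: data subcarriers, times bits per subcarrier, times coding rate, times streams, divided by symbol time. The following is a simplified sketch of that calculation (the defaults are the standard's top-rate parameters, not vendor code):

```python
def max_phy_rate_mbps(streams, data_subcarriers=108, bits_per_subcarrier=6,
                      coding_rate=5/6, symbol_time_us=3.6):
    """Approximate 802.11n peak PHY rate in Mbps.

    Defaults reflect a 40 MHz channel (108 data subcarriers), 64-QAM
    (6 bits per subcarrier), rate-5/6 coding, and a 400 ns short guard
    interval (3.6 us per OFDM symbol).
    """
    bits_per_symbol = data_subcarriers * bits_per_subcarrier * coding_rate * streams
    return bits_per_symbol / symbol_time_us

print(max_phy_rate_mbps(1))  # ~150 Mbps with one stream
print(max_phy_rate_mbps(2))  # ~300 Mbps with two streams
```

Doubling the channel width or the stream count each roughly doubles the peak rate, which is why the stream count S, not the chain count m x n, drives the headline numbers.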

By way of example, Aruba's AP-124 and AP-125 Access Points are 3x3:2 devices. In contrast, Cisco's 1140 and 1250 series access points have a dual transmitter, triple receiver design and are 2x3:2 devices. If you're looking for the best performance, Aruba's 3x3:2 access points are your best bet.

18 April 2009

Saving Energy and Money By Extending the Battery Life of Mobile Devices

The battery life of Wi-Fi capable mobile devices can be extended by enabling the Wi-Fi radio to enter a low-power “sleep” mode during periods when the device neither needs to transmit nor receive data. The longer the sleep time, the lower the battery drain. The difficulty is ensuring that sleep mode does not interfere with network performance, i.e., that the device can wake up in a timely manner.

Mobile device drivers and radio firmware employ a variety of pre-set times and trigger events to optimize entry into, and termination of, sleep time. The techniques employed typically vary by device and application. For example, scanners typically have longer pre-set sleep times than laptops because the latter are assumed to have greater access to a recharger. The IEEE 802.11 standard includes a mandatory power save polling (PSP) feature whereby the Wi-Fi access point with which the device is associated must buffer data for that device while it is sleeping. Once the device awakens, the buffered data are delivered.

Following the transaction, the device can return to sleep mode if no additional data are to be sent or received. The PSP mechanism includes additional provisions that enable the access point to override sleep times and force the device to wake up at shorter intervals (the DTIM interval) even if there is no traffic to send or receive.
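The buffer-and-deliver behavior described above can be modeled in a few lines. This is an illustrative toy model, not the actual 802.11 state machine; the class and method names are invented for the sketch:

```python
from collections import deque

class AccessPoint:
    """Toy model of PSP buffering: frames for a sleeping station are
    held until the station wakes at a DTIM beacon."""

    def __init__(self, dtim_period):
        self.dtim_period = dtim_period   # every Nth beacon is a DTIM
        self.buffer = deque()
        self.beacon_count = 0

    def enqueue(self, frame):
        self.buffer.append(frame)        # station asleep: buffer the frame

    def beacon(self):
        """Advance one beacon interval; return frames delivered, if any."""
        self.beacon_count += 1
        if self.beacon_count % self.dtim_period == 0:
            delivered = list(self.buffer)
            self.buffer.clear()
            return delivered             # station woke for the DTIM beacon
        return []                        # no DTIM: station stays asleep

ap = AccessPoint(dtim_period=3)
ap.enqueue("frame-1")
print(ap.beacon(), ap.beacon(), ap.beacon())  # [] [] ['frame-1']
```

The trade-off is visible in the model: a longer DTIM period means fewer wake-ups (better battery life) but more buffering at the access point and higher delivery latency.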

Battery life can be compromised by two primary issues. Network performance problems, such as the failure to respond to ARP requests within the allocated time, or insufficient buffer storage within an access point, can reduce the DTIM interval and cause a mobile device to wake up more often than necessary.

Additionally, broadcast and multicast Wi-Fi traffic chatter can prevent a mobile device from entering sleep mode, keeping it awake to check whether any of the chatter includes packets intended for it. In both scenarios battery life is compromised because the sleep mode cannot be utilized as intended.

To address these issues some vendors have implemented proprietary power-saving solutions that require software clients (Cisco CCX) or firmware hooks (Symbol). There are two fundamental issues with these approaches: they limit the range of available devices by locking customers into using only devices embedded with the proprietary technology, and they require that the customer implement strict revision control over the client software and firmware to avoid incompatibilities or performance differences between revisions.

Aruba has taken a standards-based approach to extending battery life by using infrastructure controls to manage off-the-shelf mobile devices without recourse to proprietary software or firmware. Three standards-based infrastructure controls are leveraged to equal or exceed the battery life achievable with proprietary solutions:

• Proxy-ARP: Mobility Controllers answer all ARP requests for devices with their radios in sleep mode, permitting longer DTIM intervals than could be supported if access points alone managed these requests;

• Long DTIMs: Long DTIM intervals are enabled by a battery boost feature, set by SSID, that permits the conversion of multicast / broadcast frames to unicast frames without having to buffer every DTIM period. Client devices can define their own DTIM periods, thereby extending battery life without negatively affecting network performance;

• Multicast suppression: Mobility Controllers employ real-time packet inspection to identify and block network chatter (multicast traffic) that would negatively affect mobile devices. As a result, mobile devices are able to remain in sleep mode longer and conserve additional power.
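The first of these controls, proxy-ARP, reduces to a simple decision: if the controller knows the answer and the target station is asleep, answer on its behalf rather than waking it. The sketch below is purely hypothetical, with invented names and data structures; it is not Aruba code:

```python
# Hypothetical sketch of the proxy-ARP control: the controller answers
# ARP requests for stations it knows are asleep, so those stations are
# never woken just to answer an ARP.

def proxy_arp_reply(target_ip, arp_table, sleeping_macs):
    """Return the MAC to answer an ARP request with, or None to let the
    request proceed normally (unknown or awake station)."""
    mac = arp_table.get(target_ip)
    if mac is not None and mac in sleeping_macs:
        return mac    # controller answers on the sleeping station's behalf
    return None       # don't intervene; the station can answer itself

arp_table = {"10.0.0.5": "aa:bb:cc:dd:ee:ff"}
asleep = {"aa:bb:cc:dd:ee:ff"}
print(proxy_arp_reply("10.0.0.5", arp_table, asleep))  # aa:bb:cc:dd:ee:ff
print(proxy_arp_reply("10.0.0.9", arp_table, asleep))  # None
```

Because the intervention happens in the infrastructure, the sleeping device needs no special client software or firmware to benefit.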

This three-pronged approach to power saving allows for longer sleep times on mobile devices such as scanners and voice handsets. Longer operating service from a single charge can have significant logistics and cost benefits, requiring fewer mobile devices, battery packs, and/or charging stations. Additionally, battery service life will be extended, since service time is inversely related to the number of charge cycles.

Aruba’s standards-based approach also frees customers to use any Wi-Fi certified mobile device on the market, with the assurance that its battery life will be maximized regardless of make, model, form-factor or application. Eliminating sole-sourced products in favor of a procurement process based on price and/or performance can yield significant cost savings.

09 April 2009

You Get What You Pay For: Meru Pays Novarum For Performance Not Seen By Customers

Novarum recently published a test report claiming that Meru Networks’ 802.11n wireless LAN delivers higher throughput, better power efficiency, and superior airtime fairness than either Aruba or Cisco. The report is available from Novarum's Web site.

At a high level – setting aside all technical details – the report’s findings are at odds with the experience of many prospects and installed-base customers. Meru deployments have been replaced at, or Meru has lost head-to-head technical evaluations (“bake-offs”) at, the following schools, among many others:

• University of Tennessee – replacement and bakeoff
• C-2 Raytown School District - replacement
• Norwood School - replacement
• Francis Xavier Warde School - replacement
• Drexel - bakeoff

The EDUCAUSE Board (http://ised-l.blogspot.com/2009_01_01_archive.html) has been rife with postings about issues with Meru’s 802.11n network. See for example the posting from Jomar McDonald, Director of Technology, The Frances Xavier Warde School.

Recent press articles have explored the reasons why customers are replacing Meru networks with Aruba adaptive 802.11n networks. One such case is Mike Morisy’s Search Networking article, “From Cisco to Meru to Aruba, school finally finds right WLAN” (http://searchnetworking.techtarget.com/news/article/0,289142,sid7_gci1352631,00.html#).

No one can dispute that performance differences exist between different wireless LANs; however, the dichotomy between the findings of the Novarum report and what customers experience in the real world is startling. A little digging into the research methodology employed in the Novarum report casts a bright light on the reasons for this schism. Novarum is a paid consulting firm – a writer for hire, as it were – and given that its findings are completely at odds with what we (along with other vendors) see in actual deployments in the industry, one has to believe that the results it publishes are heavily influenced by the source of the funding. For example, a 2007 Novarum report – also commissioned by Meru – saw Aruba’s AP-70 Access Points tested with their antennas closed and in the wrong planar orientation relative to the clients. Novarum claimed that the network was set up in accordance with Aruba’s guidelines; however, that proved not to be the case with the antenna position and a host of other critical parameters.

Fast forward to the newest Novarum report. The methodology issues are different from those of the 2007 report but just as significant in the way they denigrate the competitors’ performance:

• The tests used just one access point from each vendor – hardly an environment conducive to measuring wireless LAN capacity – and the Meru access point was operated at full power while the other access points were not;

• Commercially available software releases were used for the Aruba and Cisco devices (Aruba 3.3.2.10 and Cisco 5.2.178), but Meru used special test code that is not available to its customers. This unobtainium code was no doubt crafted to perform special tasks, just for the test, that would otherwise be unnatural acts in a commercial deployment;

• Encryption was disabled, despite a mandate by most customers to cipher communications. Encryption has been demonstrated to degrade the performance of Meru wireless LANs;

• Only two client types were used, one being a plug-in adapter, this despite the plethora of clients in real world deployments. The performance of Meru wireless LANs has been previously demonstrated to degrade in the presence of commonly used clients that were excluded from this test;

• Screen shots show major misconfigurations of Aruba’s controller. Aruba utilizes a technology called Adaptive Radio Management (ARM) to optimize wireless LAN performance, and in the test the ARM traffic management profile for fairness was created but not assigned to the Aruba access point under test. The voice traffic DSCP (ToS) tag was also incorrectly set to a value of 56. Other errors abound;

• Meru’s own installation guide states that 3X3 MIMO operation cannot be supported over 802.3af power over Ethernet, and that both radios have to back down to 2X2 MIMO. It is therefore possible that a single radio was used during power measurements of the Meru access point, and the same for the Cisco 1250 access point – demonstrating nothing more than that one radio consumes less power than two.

Occam’s razor – entia non sunt multiplicanda praeter necessitatem – holds that entities should not be multiplied beyond necessity: the explanation of any phenomenon should rest on as few assumptions as possible. The Novarum report is nearly forty pages long, but its most fundamental underlying assumption – that the competing equipment was set up properly, fairly, and in accordance with the manufacturers’ guidelines – was violated. The results – all of the results – were thereby nullified, the paper wasted.

One assumes one gets what one pays for: Meru got a test report in exchange for paying Novarum. Readers, however, got nothing of value. Caveat emptor.

At Aruba we appreciate and encourage head-to-head testing by our customers before they choose a WLAN. It is only in these real-world scenarios, running the applications and equipment that are intended to be used, that one can best evaluate the performance of a network. We also appreciate the value of thorough testing done by industry experts. However, when a test cannot be replicated in the real world – as is the case with the Novarum report – the testing procedure is flawed, skewed, or both.