09 April 2009

You Get What You Pay For: Meru Pays Novarum For Performance Not Seen By Customers

Novarum recently published a test report claiming that Meru Networks’ 802.11n wireless LAN delivers higher throughput, better power efficiency, and superior airtime fairness than either Aruba or Cisco. The report is available from Novarum's Web site.

At a high level – setting aside all technical details – the report’s findings are at odds with the experience of many prospects and installed-base customers. Meru deployments have been removed, or Meru has lost head-to-head technical evaluations (“bake-offs”), at the following schools, among many others:

• University of Tennessee – replacement and bakeoff
• C-2 Raytown School District - replacement
• Norwood School - replacement
• Frances Xavier Warde School - replacement
• Drexel - bakeoff

The EDUCAUSE Board (http://ised-l.blogspot.com/2009_01_01_archive.html) has been rife with postings about issues with Meru’s 802.11n network. See, for example, the posting from Jomar McDonald, Director of Technology, The Frances Xavier Warde School.

Recent press articles have explored the reasons why customers are replacing Meru networks with Aruba adaptive 802.11n networks. One such case is Mike Morisy’s SearchNetworking article, “From Cisco to Meru to Aruba, school finally finds right WLAN” (http://searchnetworking.techtarget.com/news/article/0,289142,sid7_gci1352631,00.html#).

No one disputes that performance differences exist between wireless LANs; however, the gulf between the findings of the Novarum report and what customers experience in the real world is startling. A little digging into the report’s research methodology casts a bright light on the reasons for this schism. Novarum is a paid consulting firm – a writer for hire, as it were – and given that its findings are completely at odds with what we (along with other vendors) see in actual deployments, one has to believe that the results it publishes are heavily influenced by the source of the funding. For example, a 2007 Novarum report – also commissioned by Meru – saw Aruba’s AP-70 access points tested with their antennas closed and in the wrong planar orientation relative to the clients. Novarum claimed that the network was set up in accordance with Aruba’s guidelines; that proved not to be the case for the antenna position and a host of other critical parameters.

Fast forward to the newest Novarum report. The methodology issues differ from those of the 2007 report but are just as damaging to the competing products’ measured performance:

• The tests used a single access point from each vendor – hardly an environment conducive to measuring wireless LAN capacity – and the Meru access point was operated at full power while the other access points were not;

• Commercially available software releases were used for the Aruba and Cisco devices (Aruba and Cisco 5.2.178), but Meru used special test code that is not available to its customers. This unobtainium code was no doubt crafted to perform special tasks, just for the test, that would be unnatural acts in a commercial deployment;

• Encryption was disabled, despite most customers’ mandate to encrypt communications. Encryption has been demonstrated to degrade the performance of Meru wireless LANs;

• Only two client types were used, one of them a plug-in adapter, despite the plethora of clients found in real-world deployments. The performance of Meru wireless LANs has previously been demonstrated to degrade in the presence of commonly used clients that were excluded from this test;

• Screenshots show major misconfigurations of Aruba’s controller. Aruba uses a technology called Adaptive Radio Management (ARM) to optimize wireless LAN performance, yet in the test the ARM traffic management profile for fairness was created but never assigned to the Aruba access point under test. The voice traffic DSCP (ToS) tag was also incorrectly set to 56 rather than the conventional Expedited Forwarding value of 46. Other errors abound;

• Meru’s own installation guide states that 3x3 MIMO operation cannot be supported over 802.3af Power over Ethernet, and that both radios must back down to 2x2 MIMO. It is therefore possible that a single radio was used during power measurements of the Meru access point, and likewise for the Cisco 1250 access point – proving nothing more than that one radio consumes less power than two.
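The power-budget point above reduces to simple arithmetic: IEEE 802.3af guarantees only 12.95 W at the powered device (15.4 W at the power sourcing equipment, less cable loss). A minimal sketch of why a dual-radio 3x3 access point must back down – the per-radio and base-system wattages here are hypothetical figures for illustration only, not measured values for any vendor’s hardware:

```python
# IEEE 802.3af budget: 15.4 W at the PSE, 12.95 W guaranteed at the device.
AF_PD_BUDGET_W = 12.95

# Hypothetical draw figures, for illustration only.
BASE_SYSTEM_W = 5.0   # CPU, memory, Ethernet
RADIO_3X3_W = 5.5     # one 3x3 MIMO radio, all chains active
RADIO_2X2_W = 3.5     # the same radio backed down to 2x2

def fits_af_budget(radios: int, per_radio_w: float) -> bool:
    """True if total draw fits within the 802.3af device budget."""
    return BASE_SYSTEM_W + radios * per_radio_w <= AF_PD_BUDGET_W

print(fits_af_budget(2, RADIO_3X3_W))  # dual 3x3: 16.0 W, over budget
print(fits_af_budget(2, RADIO_2X2_W))  # dual 2x2: 12.0 W, fits
```

Under any plausible figures, two fully active 3x3 radios overshoot the 12.95 W ceiling – which is why a power measurement taken in this state says little about normal dual-radio operation.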
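On the QoS misconfiguration noted above: the DSCP value occupies the upper six bits of the IP ToS byte, and voice traffic is conventionally marked Expedited Forwarding, DSCP 46; the report’s value of 56 corresponds to CS7, a network-control class. A quick sketch of the bit arithmetic, assuming nothing beyond the standard DSCP-to-ToS mapping:

```python
# DSCP occupies the upper 6 bits of the IP ToS/DS byte.
def dscp_to_tos(dscp: int) -> int:
    return dscp << 2

EF = 46            # Expedited Forwarding: conventional voice marking
REPORT_VALUE = 56  # CS7 -- the value configured in the Novarum test

print(f"EF voice marking: DSCP {EF} = ToS 0x{dscp_to_tos(EF):02X}")
print(f"Report marking:   DSCP {REPORT_VALUE} = ToS 0x{dscp_to_tos(REPORT_VALUE):02X}")
```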

Occam’s razor – entia non sunt multiplicanda praeter necessitatem, entities should not be multiplied beyond necessity – holds that the explanation requiring the fewest assumptions should be preferred. The Novarum report is nearly forty pages long, but its most fundamental underlying assumption – that the competing equipment was set up properly, fairly, and in accordance with the manufacturers’ guidelines – was violated. The results – all of the results – were thereby nullified, the paper wasted.

One assumes one gets what one pays for: Meru got a test report in exchange for paying Novarum. Readers, however, got nothing of value. Caveat emptor.

At Aruba we appreciate and encourage head-to-head testing by our customers before they choose a WLAN. Only in these real-world scenarios, running the applications and equipment that will actually be used, can one best evaluate the performance of a network. We also appreciate the value of thorough testing done by industry experts. However, when a test cannot be replicated in the real world – as is the case with the Novarum report – the testing procedure is flawed, skewed, or both.