I really need to write something up about the ins and outs of what we run into on the performance testing front. Vendors in the L4-7 world are all over the map in terms of which tests they run to produce the numbers that customers and prospects see on data sheets and in other marketing claims.

Tests are run with different numbers of packets exchanged per request and different numbers of connections over which those requests travel. Some vendors continue to just send a SYN and report that as a request in their requests/second metrics. That technique certainly generates inflated numbers, but it does nothing to help a customer or prospect size their device, and it's nothing you'd ever see in the real world. To top things off, test equipment from vendors like Ixia and Spirent has TCP stacks that behave differently and offer different operating modes as well.
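
To make the inflation concrete, here's a rough Python sketch (not a real benchmark) contrasting two ways a tester might count a "request": doing nothing beyond the TCP handshake versus completing a full HTTP request/response. A true SYN-only test never even finishes the handshake, which an ordinary socket API can't express, so the handshake-only case below already understates the gap. The host, port, and sample size are placeholders.

```python
# Sketch: how the definition of a "request" changes a requests/sec number.
# HOST, PORT, and N are hypothetical; point this at a server you control.
import socket
import time

HOST, PORT = "example.com", 80
N = 20  # tiny sample; real test gear drives orders of magnitude more

def handshake_only():
    """Open a TCP connection and close it -- roughly what SYN-counting rewards."""
    with socket.create_connection((HOST, PORT), timeout=5):
        pass  # no application data is ever sent

def full_http_transaction():
    """Send an actual HTTP request and drain the entire response."""
    with socket.create_connection((HOST, PORT), timeout=5) as s:
        s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n"
                  b"Connection: close\r\n\r\n")
        while s.recv(4096):  # read until the server closes the connection
            pass

for label, fn in (("handshake only", handshake_only),
                  ("full HTTP transaction", full_http_transaction)):
    start = time.perf_counter()
    for _ in range(N):
        fn()
    rate = N / (time.perf_counter() - start)
    print(f"{label}: {rate:.1f} 'requests'/sec")
```

Run it against any web server you control and the handshake-only rate will typically come out several times higher, even though no request was ever actually served. That's the kind of gap a data-sheet number can hide.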

When you're comparing performance claims for different products, make sure to ask a few questions about what is actually being reported; I can assure you that lots of different methods are in use, and that does nothing to help educate the market. Some of the guys here worked up a chart yesterday that categorizes various vendors' performance claims against the test methodology used to produce the results. We'll get it cleaned up and posted.

On a lighter note, I'm off to do some skiing at Sun Peaks in British Columbia. They've had 20" of new snow in the past three days, so it should be sweet! If the conditions are really good, I'll probably sign up for another trip on their old Tucker SnoCat.