Tuesday, November 20, 2007

Benchmarks everywhere

It seems that my benchmarking "initiative" has had some effect: several vendors/frameworks are now running the test with the same tool.

AIDA/Web vs Seaside:

Seaside on VisualWorks and Squeak vs Ruby on Rails:

My own tests (which will be re-run):

And the initial, and more exhaustive, Seaside on Gemstone benchmark:

Considering that all of us are using the same tool I chose (WAPT), we should define a single, clear test suite, upload it to a public place, and run it against the different implementations. The results published by AIDA and Cincom differ significantly.
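To make the idea of a shared scenario concrete, here is a minimal sketch (in Python, not WAPT itself, which is a GUI tool) of what such a suite essentially measures: N simulated users repeatedly requesting the same page, with throughput and average response time reported at the end. The URL, user count, and request count are placeholders, not values from any of the benchmarks above.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def load_test(url, users=10, requests_per_user=5):
    """Hit `url` with `users` concurrent workers; return simple stats.

    This mimics the core of a load-test scenario: each simulated user
    issues a fixed number of sequential requests, and we record the
    wall-clock time of every request.
    """
    timings = []  # per-request response times (list.append is thread-safe)

    def one_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            with urllib.request.urlopen(url) as resp:
                resp.read()
            timings.append(time.perf_counter() - start)

    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(one_user) for _ in range(users)]
        for f in futures:
            f.result()  # re-raise any request errors
    wall = time.perf_counter() - wall_start

    total = users * requests_per_user
    return {
        "requests": total,
        "pages_per_sec": total / wall,
        "avg_response_sec": sum(timings) / len(timings),
    }
```

If everyone ran the *same* scenario definition (same page mix, same user counts, same ramp-up) against their own implementation, the published numbers would at least be comparable.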

Anyway, it is important that we put "benchmarking" on the table.

See you in Smalltalks 2007.



At 8:36 am, Blogger Janko Mivšek said...


I very much agree that we need to define a common test case and set up the tool in the same way, but before that we also need to understand the tool and what we are actually measuring.

Currently I have a feeling that we just blindly follow some default settings and then interpret results which probably mean something different than we think. So, understanding the test is the first step toward interpreting results correctly.


