The problem is: given a number of programming libraries with similar or equal scope (e.g. XML parsers, regex engines, markup processors, ...), are there tools with which one can run performance tests on these libraries and compare the results (and generate reports), even though the libraries may be written in different programming languages (such as Java, C#, Ruby, Python, Perl, ...)?
I looked at the tools listed at opensourcetesting.org/performance.php, but none of them fit the (somewhat blurry) requirement above.
Are there toolkits or frameworks out there for cross-language cross-platform performance tests?
Thanks.
I wouldn't try to use a single toolkit for multiple languages. That's unlikely to bring out the best (or even the average) performance of each library.
Instead, I would try to come up with a framework design which defines what it's going to test, and has a common data set. Then each language/library can provide its own framework implementation which tests the operations which are appropriate for that library. That way the operations can be "logically equivalent" even if they don't use the exact same syntax/calls. You end up testing the idiomatic code for that library/language, rather than just a lowest common denominator.
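To make that concrete, here's a minimal sketch of what one per-language harness might look like, in Java. Everything here is hypothetical (the `Operation` record, the CSV-ish report format, the "length" stand-in operation); the point is only the shape: named operations run against a shared data set, reporting in a fixed format so results from different languages can be collated.

```java
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of a per-language benchmark harness. Each library
// binding supplies its own Operation implementations for the operations
// it supports; unsupported operations are simply omitted.
public class BenchmarkHarness {
    // One named, "logically equivalent" operation, e.g. "parse" or "serialize".
    record Operation(String name, Consumer<String> action) {}

    // Times each operation over every input in the common data set and
    // reports one line per operation in a fixed, language-neutral format.
    static void run(String library, List<Operation> ops, List<String> dataSet) {
        for (Operation op : ops) {
            long start = System.nanoTime();
            for (String input : dataSet) {
                op.action().accept(input);
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.printf("%s,%s,%d,%dms%n",
                    library, op.name(), dataSet.size(), elapsedMs);
        }
    }

    public static void main(String[] args) {
        // The common data set; in practice this would be loaded from
        // shared files so every language implementation sees identical input.
        List<String> dataSet = List.of("<a/>", "<a><b/></a>");
        // A trivial stand-in for a real library call.
        run("demo-lib", List.of(new Operation("length", s -> s.length())), dataSet);
    }
}
```

A C#, Ruby, or Python harness would mirror the same operation names and read the same data files, so the reports line up even though each implementation calls its library idiomatically.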
This is the approach I've taken for benchmarking Protocol Buffers. So far my very basic framework has implementations in C# and Java, and I'm now writing a richer framework which allows a whole "benchmark script" to be run. One ideal aim is that different implementations within the same platform (e.g. different .NET implementations of Protocol Buffers) should be able to hook into the same core benchmarking code with very little effort.
The important thing, to my mind, is to have a common set of operations (even if they're not all implemented by all libraries) and a common data set. That's the only way the benchmarks can be meaningful.