logging, java-17, microbenchmark, jmh

Where should I configure benchmark outputs in JMH?


Where do I tell JMH (Java Microbenchmark Harness) to output to a log file?

I have a test based on this BenchmarkLoop.java example working against an OrientDB backend server, and it all runs fine in the console.
But I need to save the results alongside the project code itself, for future evaluation and for tracking performance over time.

I can see that target is being generated, but I don't find any log output there or anywhere else in the project. Nor can I get Log4j2 to capture what JMH writes.

I've been trying to search for answers myself:

  1. Internet searching, only to find things like Understanding JMH Output, which isn't my concern
  2. Digging around on SO for [jmh] output, which returned JMH: Buggy output in a benchmark at the time of this post
  3. Sifting through JMH records, docs, and examples
    1. GitHub
    2. JavaDocs (Doc)
    3. OpenJDK (Doc)
    4. Baeldung (Example)
    5. Mkyong (Example)
    6. OpenAI suggestions, which don't work for my needs
      1. the CLI way of using -rf, but I need this project to stay packageable
      2. resultFormat() is closer, but I think its expectation is wrong:
        .resultFormat(ResultFormatFactory.getInstance(ResultFormatType.TEXT, new FileWriter("output-file.txt")))
        
        1. We can see that OptionsBuilder.resultFormat() only takes in a ResultFormatType, not a pre-built ResultFormat
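For what it's worth, my reading of the JavaDocs is that ResultFormatFactory belongs after the run rather than inside the builder: Runner.run() hands back the run results, and a ResultFormat writes them out. A hedged sketch only (I haven't verified this path end-to-end, so check the signatures against your JMH version):

```java
// Assumes org.openjdk.jmh.results.RunResult and
// org.openjdk.jmh.results.format.{ResultFormatFactory, ResultFormatType}
// from jmh-core are on the classpath.
Collection<RunResult> results = new Runner(opt).run();
ResultFormatFactory.getInstance(ResultFormatType.TEXT, "output-file.txt")
                   .writeOut(results);
```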
Here's the tail of the console output I get:

87.039 ms/op

Result "jmh.Test51_1.retrieveASubgraphOf1000Elements":
  97.618 ±(99.9%) 31.200 ms/op [Average]
  (min, avg, max) = (87.039, 97.618, 106.949), stdev = 8.102
  CI (99.9%): [66.418, 128.818] (assumes normal distribution)


# Run complete. Total time: 00:01:42

REMEMBER: The numbers below are just data. To gain reusable insights, you need to follow up on
why the numbers are the way they are. Use profilers (see -prof, -lprof), design factorial
experiments, perform baseline and negative tests that provide experimental control, make sure
the benchmarking environment is safe on JVM/OS/HW level, ask for reviews from the domain experts.
Do not assume the numbers tell you what you want them to tell.

Benchmark                                                     (orientdbUrl)  Mode  Cnt   Score    Error  Units
Test51_1.retrieveASubgraphOf1000Elements  remote:localhost:2424/sparse_data  avgt    5  97.618 ± 31.200  ms/op

Process finished with exit code 0

package jmh;

import org.apache.commons.configuration2.BaseConfiguration;
import org.apache.commons.configuration2.Configuration;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.tinkerpop.gremlin.orientdb.OrientGraph;
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversal;
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;
import org.apache.tinkerpop.gremlin.structure.Vertex;
import org.openjdk.jmh.annotations.*;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

import java.util.concurrent.TimeUnit;

@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
@State(Scope.Benchmark)
@Fork(value = 2, jvmArgs = {"-Xms2G", "-Xmx2G"})
public class Test {

    private static final Logger logger = LogManager.getLogger(Test.class);
    @Param("remote:localhost:2424/sparse_data")
    String orientdbUrl;
    Configuration configuration;
    OrientGraph orientGraph;
    static GraphTraversalSource g;

    public static void main(String[] args) throws RunnerException {
        Options opt = new OptionsBuilder()
                .include(Test.class.getSimpleName())
                .forks(1)
                .build();
        new Runner(opt).run();
    }

    @Setup
    public void setup() {
        orientdbUrl = "remote:localhost:2424/sparse_data";
        logger.info("{\torientdb.database.url:\t" + orientdbUrl + "\t}");
        configuration = new BaseConfiguration();
        configuration.setProperty(OrientGraph.CONFIG_URL, orientdbUrl);
        configuration.setProperty(OrientGraph.CONFIG_USER, "root");
        configuration.setProperty(OrientGraph.CONFIG_PASS, "rootpwd");
        configuration.setProperty(OrientGraph.CONFIG_TRANSACTIONAL, true);
        orientGraph = OrientGraph.open(configuration);
        g = orientGraph.traversal();
    }
    @TearDown
    public void teardown() {
        orientGraph.close();
    }

    @Benchmark
    public static void retrieveASubgraphOf1000Elements() {
        GraphTraversal<Vertex, Vertex> returnValue = g.V().limit(1000);
        while (returnValue.hasNext()) {
            logger.info("returnValue.next():\t" + returnValue.next());
        }
    }
}

Solution

  • Based on JMH: Returning the benchmark results as a json object, where the asker wanted the results returned as a value. This snippet of theirs was what I needed:

                .resultFormat(ResultFormatType.JSON)
                .result("benchmark-result/" + System.currentTimeMillis() + ".json")
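    One caveat from my own run (an assumption worth verifying on your setup): .result() appears to just open the file, so a missing benchmark-result/ directory fails the run. A small plain-Java helper can create the directory and build the timestamped path first. ResultPath and timestampedResultFile below are hypothetical names of mine, not JMH API:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ResultPath {

    /** Ensures dir exists and returns dir/<epoch-millis>.json for JMH's .result(...). */
    static String timestampedResultFile(String dir) throws IOException {
        // createDirectories is a no-op if the directory already exists
        Path parent = Files.createDirectories(Path.of(dir));
        return parent.resolve(System.currentTimeMillis() + ".json").toString();
    }

    public static void main(String[] args) throws IOException {
        // Would be passed to the builder as: .result(timestampedResultFile("benchmark-result"))
        System.out.println(timestampedResultFile("benchmark-result"));
    }
}
```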
    

    Here's the output:

    [
        {
            "jmhVersion" : "1.36",
            "benchmark" : "",
            "mode" : "avgt",
            "threads" : 1,
            "forks" : 1,
            "jvm" : "",
            "jvmArgs" : [
                "-Xms2G",
                "-Xmx2G"
            ],
            "jdkVersion" : "",
            "vmName" : "OpenJDK 64-Bit Server VM",
            "vmVersion" : "",
            "warmupIterations" : 5,
            "warmupTime" : "10 s",
            "warmupBatchSize" : 1,
            "measurementIterations" : 5,
            "measurementTime" : "10 s",
            "measurementBatchSize" : 1,
            "params" : {
                "orientdbUrl" : "remote:localhost:2424/database"
            },
            "primaryMetric" : {
                "score" : 86.41678545945095,
                "scoreError" : 9.76599990211842,
                "scoreConfidence" : [
                    76.65078555733253,
                    96.18278536156937
                ],
                "scorePercentiles" : {
                    "0.0" : 82.68515289256199,
                    "50.0" : 87.28553391304348,
                    "90.0" : 89.14723716814159,
                    "95.0" : 89.14723716814159,
                    "99.0" : 89.14723716814159,
                    "99.9" : 89.14723716814159,
                    "99.99" : 89.14723716814159,
                    "99.999" : 89.14723716814159,
                    "99.9999" : 89.14723716814159,
                    "100.0" : 89.14723716814159
                },
                "scoreUnit" : "ms/op",
                "rawData" : [
                    [
                        82.68515289256199,
                        89.14723716814159,
                        87.82189739130435,
                        85.14410593220339,
                        87.28553391304348
                    ]
                ]
            },
            "secondaryMetrics" : {
            }
        }
    ]
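
Putting it together with the main() from my class above, the options chain ends up as below. The .output(...) call for saving the human-readable run log is my reading of the ChainedOptionsBuilder JavaDocs, so double-check it against your JMH version; ResultFormatType comes from org.openjdk.jmh.results.format:

```java
Options opt = new OptionsBuilder()
        .include(Test.class.getSimpleName())
        .forks(1)
        // machine-readable results, kept with the project for evaluation over time
        .resultFormat(ResultFormatType.JSON)
        .result("benchmark-result/" + System.currentTimeMillis() + ".json")
        // human-readable run log (what normally goes to the console)
        .output("benchmark-result/" + System.currentTimeMillis() + ".log")
        .build();
new Runner(opt).run();
```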