This is a facilitator for the MLPerf Inference Benchmark Suite that leverages the axs2mlperf Docker container for building and testing, currently supporting CPU inference benchmarks of the ResNet-50 reference model. See reference information at https://github.com/krai/axs2mlperf/blob/master/demo/README.md
OpenBenchmarking.org metrics for this test profile configuration are based on 29 public results since 12 May 2023, with the latest data as of 19 May 2023.
Below is an overview of the generalized performance for components where there is sufficient statistically significant data based upon user-uploaded results. Keep in mind that, particularly in the Linux/open-source space, OS configurations can vary widely, so this overview is intended to offer only general guidance on performance expectations.