OpenBenchmarking.org
Mobile Neural Network 2.1.0
pts/mnn-2.1.0
- 31 August 2022 -
Update against MNN 2.1 upstream.
downloads.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <Downloads>
    <Package>
      <URL>https://github.com/alibaba/MNN/archive/refs/tags/2.1.0.tar.gz</URL>
      <MD5>6a5e963d9f7b5a70d2c9959b138e167c</MD5>
      <SHA256>2e4c337208c1ed4c60be23fd699dce5c23dade6ac6f509e35ef8ab5b4a81b2fa</SHA256>
      <FileName>MNN-2.1.0.tar.gz</FileName>
      <FileSize>18970119</FileSize>
    </Package>
  </Downloads>
</PhoronixTestSuite>
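The Phoronix Test Suite verifies the fetched archive against the MD5/SHA256 values above before installing. As a minimal sketch (not PTS's actual download code), the same check can be done by hand with `sha256sum`; the `verify_download` helper name is hypothetical:

```shell
# Hypothetical helper mirroring the checksum verification PTS performs
# on downloaded packages. Compares a file's SHA256 against an expected value.
verify_download() {
    file="$1"
    expected_sha256="$2"
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected_sha256" ]; then
        echo "OK: $file"
    else
        echo "MISMATCH: $file" >&2
        return 1
    fi
}

# Usage with the values from downloads.xml:
# verify_download MNN-2.1.0.tar.gz \
#     2e4c337208c1ed4c60be23fd699dce5c23dade6ac6f509e35ef8ab5b4a81b2fa
```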
install.sh
#!/bin/bash
rm -rf MNN-2.1.0
tar -xf MNN-2.1.0.tar.gz
cd MNN-2.1.0
cd schema
./generate.sh
cd ..
mkdir build
cd build
EXTRA_CMAKE_FLAGS=""
if [ $OS_TYPE = "Linux" ]
then
	if grep avx512 /proc/cpuinfo > /dev/null
	then
		EXTRA_CMAKE_FLAGS="$EXTRA_CMAKE_FLAGS -DMNN_AVX512=ON"
	fi
	if grep avx512_vnni /proc/cpuinfo > /dev/null
	then
		EXTRA_CMAKE_FLAGS="$EXTRA_CMAKE_FLAGS -DMNN_AVX512_VNNI=ON"
	fi
fi
cmake .. -DMNN_BUILD_BENCHMARK=true -DCMAKE_BUILD_TYPE=Release -DMNN_OPENMP=ON $EXTRA_CMAKE_FLAGS
make -j $NUM_CPU_CORES
echo $? > ~/install-exit-status
cd ~/
cat>mnn<<EOT
#!/bin/sh
cd MNN-2.1.0/build
./benchmark.out ../benchmark/models/ 2000 100 0 \$NUM_CPU_CORES > \$LOG_FILE
echo \$? > ~/test-exit-status
EOT
chmod +x mnn
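The install script enables `-DMNN_AVX512=ON` / `-DMNN_AVX512_VNNI=ON` only when the matching flags appear in `/proc/cpuinfo` on Linux. A small standalone sketch of that probe (the `report_cpu_flags` name is hypothetical; on non-Linux systems without `/proc/cpuinfo`, every flag reads absent):

```shell
# Sketch: report which of the CPU extensions probed by install.sh
# are present on this machine, using the same grep-on-/proc/cpuinfo test.
report_cpu_flags() {
    for flag in avx512 avx512_vnni; do
        if grep -q "$flag" /proc/cpuinfo 2>/dev/null; then
            echo "$flag: present"
        else
            echo "$flag: absent"
        fi
    done
}

report_cpu_flags
```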
results-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <ResultsParser>
    <OutputTemplate>[ - ] nasnet.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>nasnet.mnn</LineHint>
    <ArgumentsDescription>Model: nasnet</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] mobilenetV3.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>mobilenetV3.mnn</LineHint>
    <ArgumentsDescription>Model: mobilenetV3</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] squeezenetv1.1.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>squeezenetv1.1.mnn</LineHint>
    <ArgumentsDescription>Model: squeezenetv1.1</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] resnet-v2-50.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>resnet-v2-50.mnn</LineHint>
    <ArgumentsDescription>Model: resnet-v2-50</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] SqueezeNetV1.0.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>SqueezeNetV1.0.mnn</LineHint>
    <ArgumentsDescription>Model: SqueezeNetV1.0</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] MobileNetV2_224.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>MobileNetV2_224.mnn</LineHint>
    <ArgumentsDescription>Model: MobileNetV2_224</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] mobilenet-v1-1.0.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>mobilenet-v1-1.0.mnn</LineHint>
    <ArgumentsDescription>Model: mobilenet-v1-1.0</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] inception-v3.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>inception-v3.mnn</LineHint>
    <ArgumentsDescription>Model: inception-v3</ArgumentsDescription>
  </ResultsParser>
</PhoronixTestSuite>
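Each parser matches a line of `benchmark.out` output identified by `LineHint` and extracts the values at the `#_MAX_RESULT_#` / `#_MIN_RESULT_#` / `#_RESULT_#` placeholders, with the `avg` value reported as the result. A toy re-implementation of that extraction (not PTS's actual parser; the timing values in the sample line are made up for illustration):

```shell
# Sample benchmark.out line in the format the OutputTemplate describes
# (the numbers here are illustrative, not real results).
line='[ - ] nasnet.mnn    max =   31.242 ms  min =   30.916 ms  avg =   31.057 ms'

# Extract the avg field, i.e. the #_RESULT_# placeholder position.
avg=$(echo "$line" | sed -n 's/.*avg = *\([0-9.]*\) ms.*/\1/p')
echo "$avg"
```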
test-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <TestInformation>
    <Title>Mobile Neural Network</Title>
    <AppVersion>2.1</AppVersion>
    <Description>MNN is the Mobile Neural Network, a highly efficient and lightweight deep learning framework developed by Alibaba. This MNN test profile builds the OpenMP / CPU-threaded version for processor benchmarking, not a GPU-accelerated build. MNN can make use of AVX-512 extensions.</Description>
    <ResultScale>ms</ResultScale>
    <Proportion>LIB</Proportion>
    <TimesToRun>3</TimesToRun>
  </TestInformation>
  <TestProfile>
    <Version>2.1.0</Version>
    <SupportedPlatforms>Linux, MacOSX</SupportedPlatforms>
    <SoftwareType>Scientific</SoftwareType>
    <TestType>System</TestType>
    <License>Free</License>
    <Status>Verified</Status>
    <ExternalDependencies>cmake, build-utilities</ExternalDependencies>
    <EnvironmentSize>2700</EnvironmentSize>
    <ProjectURL>https://www.mnn.zone/</ProjectURL>
    <RepositoryURL>https://github.com/alibaba/MNN/</RepositoryURL>
    <Maintainer>Michael Larabel</Maintainer>
  </TestProfile>
</PhoronixTestSuite>