Mobile Neural Network 1.3.0
pts/mnn-1.3.0
- 18 June 2021 -
Update against new upstream MNN 1.2.0 release.
downloads.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.4.0-->
<PhoronixTestSuite>
  <Downloads>
    <Package>
      <URL>https://github.com/alibaba/MNN/archive/refs/tags/1.2.0.tar.gz</URL>
      <MD5>6406b88b959a09148b79ffb2bc4ab7d8</MD5>
      <SHA256>b0e32d28e1b1c64904d6c1f810a48238430cf4ef5e8fdcd2ea4600e52c8a82ef</SHA256>
      <FileName>MNN-1.2.0.tar.gz</FileName>
      <FileSize>103892968</FileSize>
    </Package>
  </Downloads>
</PhoronixTestSuite>
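The Phoronix Test Suite checks each downloaded file against the `<MD5>`/`<SHA256>` values above before installing. A minimal sketch of that kind of check, using a locally created stand-in file rather than the real 100 MB tarball (the file name `sample.tar.gz` is illustrative, not part of the profile):

```shell
#!/bin/sh
# Stand-in for the real download; the profile would check MNN-1.2.0.tar.gz
# against the <SHA256> value recorded in downloads.xml.
printf 'dummy tarball contents\n' > sample.tar.gz

# Record the file's SHA256, then verify the file against the recorded hash.
sha256sum sample.tar.gz > sample.sha256
if sha256sum -c sample.sha256 >/dev/null 2>&1; then
    echo "checksum OK"
else
    echo "checksum MISMATCH" >&2
    exit 1
fi
```

A mismatched hash makes `sha256sum -c` exit non-zero, which is how a corrupted or tampered download gets rejected before the build ever starts.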
install.sh
#!/bin/sh
rm -rf MNN-1.2.0
tar -xf MNN-1.2.0.tar.gz
cd MNN-1.2.0
cd schema
./generate.sh
cd ..
mkdir build
cd build
cmake .. -DMNN_BUILD_BENCHMARK=true -DCMAKE_BUILD_TYPE=Release
make -j $NUM_CPU_CORES
echo $? > ~/install-exit-status
cd ~/

cat>mnn<<EOT
#!/bin/sh
cd MNN-1.2.0/build
./benchmark.out ../benchmark/models/ 1000 100 0 \$NUM_CPU_CORES > \$LOG_FILE
echo \$? > ~/test-exit-status
EOT
chmod +x mnn
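Note how the heredoc escapes `\$NUM_CPU_CORES`, `\$LOG_FILE`, and `\$?` so they are written literally into the generated `mnn` wrapper and expanded only when the wrapper runs, not when install.sh writes it. A self-contained sketch of that pattern (the wrapper name `hello-wrapper` and the `MSG` variable are made up for illustration):

```shell
#!/bin/sh
# Generate a wrapper script. \$MSG is escaped, so it stays a literal
# variable reference inside the wrapper and expands at run time.
cat > hello-wrapper <<EOT
#!/bin/sh
echo "message: \$MSG"
EOT
chmod +x hello-wrapper

# The environment at invocation time supplies the value.
MSG="from-runtime" ./hello-wrapper
```

An unescaped `$MSG` would instead be expanded (to an empty string here) at generation time, baking a stale value into the wrapper — the same bug that would freeze `$NUM_CPU_CORES` in the `mnn` script above.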
results-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.4.0-->
<PhoronixTestSuite>
  <ResultsParser>
    <OutputTemplate>[ - ] nasnet.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>nasnet.mnn</LineHint>
    <ArgumentsDescription>Model: nasnet</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] mobilenetV3.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>mobilenetV3.mnn</LineHint>
    <ArgumentsDescription>Model: mobilenetV3</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] squeezenetv1.1.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>squeezenetv1.1.mnn</LineHint>
    <ArgumentsDescription>Model: squeezenetv1.1</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] resnet-v2-50.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>resnet-v2-50.mnn</LineHint>
    <ArgumentsDescription>Model: resnet-v2-50</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] SqueezeNetV1.0.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>SqueezeNetV1.0.mnn</LineHint>
    <ArgumentsDescription>Model: SqueezeNetV1.0</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] MobileNetV2_224.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>MobileNetV2_224.mnn</LineHint>
    <ArgumentsDescription>Model: MobileNetV2_224</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] mobilenet-v1-1.0.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>mobilenet-v1-1.0.mnn</LineHint>
    <ArgumentsDescription>Model: mobilenet-v1-1.0</ArgumentsDescription>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>[ - ] inception-v3.mnn max = #_MAX_RESULT_# ms min = #_MIN_RESULT_# ms avg = #_RESULT_# ms</OutputTemplate>
    <LineHint>inception-v3.mnn</LineHint>
    <ArgumentsDescription>Model: inception-v3</ArgumentsDescription>
  </ResultsParser>
</PhoronixTestSuite>
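Each `<OutputTemplate>` tells the results parser where the `#_RESULT_#` placeholder sits on the matching output line. A rough sketch of an equivalent extraction in awk, fed a fabricated sample line in the same format (the timing values are invented for illustration):

```shell
#!/bin/sh
# Sample line in the format benchmark.out prints per model.
line='[ - ] nasnet.mnn max = 12.345 ms min = 11.111 ms avg = 11.789 ms'

# Pull out the field after "avg =" -- the value recorded as #_RESULT_#.
avg=$(printf '%s\n' "$line" | awk '{for (i = 1; i < NF; i++) if ($i == "avg") print $(i + 2)}')
echo "avg: $avg ms"    # prints: avg: 11.789 ms
```

The `<LineHint>` plays the role of the grep that selects which output line each parser applies to, so eight parsers can share one log file.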
test-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.4.0-->
<PhoronixTestSuite>
  <TestInformation>
    <Title>Mobile Neural Network</Title>
    <AppVersion>1.2</AppVersion>
    <Description>MNN is the Mobile Neural Network, a highly efficient and lightweight deep learning framework developed by Alibaba.</Description>
    <ResultScale>ms</ResultScale>
    <Proportion>LIB</Proportion>
    <TimesToRun>3</TimesToRun>
  </TestInformation>
  <TestProfile>
    <Version>1.3.0</Version>
    <SupportedPlatforms>Linux, MacOSX</SupportedPlatforms>
    <SoftwareType>Scientific</SoftwareType>
    <TestType>System</TestType>
    <License>Free</License>
    <Status>Verified</Status>
    <ExternalDependencies>cmake, build-utilities</ExternalDependencies>
    <EnvironmentSize>2700</EnvironmentSize>
    <ProjectURL>https://www.mnn.zone/</ProjectURL>
    <RepositoryURL>https://github.com/alibaba/MNN/</RepositoryURL>
    <Maintainer>Michael Larabel</Maintainer>
  </TestProfile>
</PhoronixTestSuite>