NAS Parallel Benchmarks 1.4.0
pts/npb-1.4.0
- 28 August 2019 -
Update against upstream NPB 3.4, add new test cases.
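This profile is driven through the phoronix-test-suite client rather than run by hand. A minimal sketch of the usual workflow, assuming the client and the MPI/Fortran dependencies listed in test-definition.xml below are already installed:

    # Sketch only: standard Phoronix Test Suite invocation for this profile
    phoronix-test-suite install pts/npb-1.4.0     # fetches NPB3.4.tar.gz and runs install.sh below
    phoronix-test-suite benchmark pts/npb-1.4.0   # prompts for Test / Class, then runs the selected test

The install step executes install.sh as shown below; the benchmark step invokes the generated npb launcher once per selected Test / Class.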
downloads.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v9.0.0m2-->
<PhoronixTestSuite>
  <Downloads>
    <Package>
      <URL>https://www.nas.nasa.gov/assets/npb/NPB3.4.tar.gz</URL>
      <MD5>b1b963cc86803b185f0907b24d11c840</MD5>
      <SHA256>991bb15ee34f1cff434d22d8228e3d8cb34ea9fea8c2920dda6582233897cc18</SHA256>
      <FileName>NPB3.4.tar.gz</FileName>
      <FileSize>420400</FileSize>
    </Package>
  </Downloads>
</PhoronixTestSuite>
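A downloaded copy of the tarball can be verified against the SHA256 listed above before installation; a hedged one-liner assuming GNU coreutils' sha256sum is available:

    echo "991bb15ee34f1cff434d22d8228e3d8cb34ea9fea8c2920dda6582233897cc18  NPB3.4.tar.gz" | sha256sum -c -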
install.sh
#!/bin/sh
tar -zxf NPB3.4.tar.gz

MPI_CC=mpicc
if [ ! "X$MPI_PATH" = "X" ] && [ -d $MPI_PATH ] && [ -d $MPI_INCLUDE ] && [ -x $MPI_CC ] && [ -e $MPI_LIBS ]
then
	# PRE-SET MPI
	echo "Using pre-set environment variables."
elif [ -d /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi ]
then
	# OpenMPI
	MPI_PATH=/usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi/
	MPI_INCLUDE=/usr/include/openmpi/
	MPI_LIBS=/usr/lib/x86_64-linux-gnu/libmpi.so
	MPI_CC=/usr/bin/mpicc.openmpi
	MPI_VERSION=`$MPI_CC -showme:version 2>&1 | grep MPI | cut -d "(" -f1 | cut -d ":" -f2`
elif [ -d /usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi3 ]
then
	# OpenMPI
	MPI_PATH=/usr/lib/x86_64-linux-gnu/openmpi/lib/openmpi3/
	MPI_INCLUDE=/usr/include/
	MPI_LIBS=/usr/lib/x86_64-linux-gnu/libmpi.so
	MPI_CC=/usr/bin/mpicc.openmpi
	MPI_VERSION=`$MPI_CC -showme:version 2>&1 | grep MPI | cut -d "(" -f1 | cut -d ":" -f2`
elif [ -d /usr/lib/openmpi/include ]
then
	# OpenMPI
	MPI_PATH=/usr/lib/openmpi
	MPI_INCLUDE=/usr/lib/openmpi/include
	MPI_LIBS=/usr/lib/openmpi/lib/libmpi.so
	MPI_CC=/usr/bin/mpicc.openmpi
	MPI_VERSION=`$MPI_CC -showme:version 2>&1 | grep MPI | cut -d "(" -f1 | cut -d ":" -f2`
elif [ -d /usr/lib/mpich/include ]
then
	# MPICH
	MPI_PATH=/usr/lib/mpich
	MPI_INCLUDE=/usr/lib/mpich/include
	MPI_LIBS=/usr/lib/mpich/lib/libmpich.so.1.0
	MPI_CC=/usr/bin/mpicc.mpich
	MPI_VERSION=`$MPI_CC -v 2>&1 | grep "MPICH version"`
elif [ -d /usr/include/mpich2 ]
then
	# MPICH2
	MPI_PATH=/usr/include/mpich2
	MPI_INCLUDE=/usr/include/mpich2
	MPI_LIBS=/usr/lib/mpich2/lib/libmpich.so
	MPI_CC=/usr/bin/mpicc.mpich2
	MPI_VERSION=`$MPI_CC -v 2>&1 | grep "MPICH2 version"`
fi

if [ ! "X$MPI_VERSION" = "X" ]
then
	echo $MPI_VERSION > ~/install-footnote
fi

if [ "X$CFLAGS_OVERRIDE" = "X" ]
then
	CFLAGS="$CFLAGS -O3 -march=native"
else
	CFLAGS="$CFLAGS_OVERRIDE"
fi

# Should have all the necessary variables for both OpenMP and MPI tests
echo "F77 = gfortran
MPIFC = mpif90
MPIF77 = mpif77
FLINK = \$(MPIF77)
FMPI_LIB = -L$MPI_LIBS
FMPI_INC = -I$MPI_INCLUDE
FFLAGS = $CFLAGS
FLINKFLAGS = \$(FFLAGS)
MPICC = $MPI_CC
CLINK = $MPI_CC
CMPI_LIB = -L$MPI_LIBS
CMPI_INC = -I$MPI_INCLUDE
CFLAGS = $CFLAGS
CLINKFLAGS = \$(CFLAGS)
CC = cc -g
BINDIR = ../bin
RAND = randi8
C_LIB = -lm
WTIME = wtime.c
" > NPB3.4/NPB3.4-MPI/config/make.def

# Copy over OpenMP make for when using that...
cp NPB3.4/NPB3.4-MPI/config/make.def NPB3.4/NPB3.4-OMP/config/make.def

cd ~/NPB3.4/NPB3.4-MPI/
make bt CLASS=A
make bt CLASS=C
make ep CLASS=C
make ep CLASS=D
make ft CLASS=A
make ft CLASS=B
make ft CLASS=C
make lu CLASS=A
make lu CLASS=C
make sp CLASS=A
make sp CLASS=B
make is CLASS=D
make mg CLASS=C
make cg CLASS=C
echo $? > ~/install-exit-status

cd ~
echo "#!/bin/sh
cd NPB3.4/NPB3.4-MPI/

if [ \"X\$NUM_CPU_PHYSICAL_CORES\" = \"X\" ]
then
	NUM_THREADS=\$NUM_CPU_CORES
else
	NUM_THREADS=\$NUM_CPU_PHYSICAL_CORES
fi

if [ ! \"X\$HOSTFILE\" = \"X\" ] && [ -f \$HOSTFILE ]
then
	HOSTFILE=\"--hostfile \$HOSTFILE\"
elif [ -f /etc/hostfile ]
then
	HOSTFILE=\"--hostfile /etc/hostfile\"
else
	HOSTFILE=\"\"
fi

mpiexec -np \$NUM_THREADS \$HOSTFILE ./bin/\$@.x > \$LOG_FILE 2>&1
echo \$? > ~/test-exit-status" > npb
chmod +x npb
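The generated npb launcher takes the test/class identifier selected in test-definition.xml and maps it onto the matching binary built above; illustrative usage (NUM_CPU_CORES, NUM_CPU_PHYSICAL_CORES, HOSTFILE and LOG_FILE are environment variables supplied by the Phoronix Test Suite at run time):

    ./npb bt.C    # roughly: mpiexec -np $NUM_THREADS ./bin/bt.C.x > $LOG_FILE 2>&1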
results-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v9.0.0m2-->
<PhoronixTestSuite>
  <ResultsParser>
    <OutputTemplate> Mop/s total = #_RESULT_#</OutputTemplate>
    <LineHint>Mop/s total</LineHint>
  </ResultsParser>
</PhoronixTestSuite>
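The parser keys on the "Mop/s total" line that each NPB kernel prints in its result summary; the figure substituted for #_RESULT_# is taken from a line of roughly this shape (value illustrative, not a real result):

    Mop/s total     =     12345.67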
test-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v9.0.0m2-->
<PhoronixTestSuite>
  <TestInformation>
    <Title>NAS Parallel Benchmarks</Title>
    <AppVersion>3.4</AppVersion>
    <Description>NPB, NAS Parallel Benchmarks, is a benchmark developed by NASA for high-end computer systems. This test profile currently uses the MPI version of NPB. This test profile offers selecting the different NPB tests/problems and varying problem sizes.</Description>
    <ResultScale>Total Mop/s</ResultScale>
    <Proportion>HIB</Proportion>
    <TimesToRun>3</TimesToRun>
  </TestInformation>
  <TestProfile>
    <Version>1.4.0</Version>
    <SupportedPlatforms>Linux, Solaris, BSD</SupportedPlatforms>
    <SoftwareType>Benchmark</SoftwareType>
    <TestType>Processor</TestType>
    <License>Free</License>
    <Status>Verified</Status>
    <ExternalDependencies>build-utilities, fortran-compiler, openmpi-development</ExternalDependencies>
    <EnvironmentSize>8</EnvironmentSize>
    <ProjectURL>http://www.nas.nasa.gov/Resources/Software/npb.html</ProjectURL>
    <InternalTags>SMP, MPI</InternalTags>
    <Maintainer>Michael Larabel</Maintainer>
  </TestProfile>
  <TestSettings>
    <Option>
      <DisplayName>Test / Class</DisplayName>
      <Identifier>run-test</Identifier>
      <Menu>
        <Entry>
          <Name>BT.C</Name>
          <Value>bt.C</Value>
          <Message>Block Tri-diagonal solver</Message>
        </Entry>
        <Entry>
          <Name>EP.C</Name>
          <Value>ep.C</Value>
          <Message>Embarrassingly Parallel</Message>
        </Entry>
        <Entry>
          <Name>EP.D</Name>
          <Value>ep.D</Value>
          <Message>Embarrassingly Parallel</Message>
        </Entry>
        <Entry>
          <Name>FT.C</Name>
          <Value>ft.C</Value>
          <Message>discrete 3D fast Fourier Transform</Message>
        </Entry>
        <Entry>
          <Name>LU.C</Name>
          <Value>lu.C</Value>
          <Message>Lower-Upper Gauss-Seidel solver</Message>
        </Entry>
        <Entry>
          <Name>SP.B</Name>
          <Value>sp.B</Value>
          <Message>Scalar Penta-diagonal solver</Message>
        </Entry>
        <Entry>
          <Name>IS.D</Name>
          <Value>is.D</Value>
          <Message>Integer Sort</Message>
        </Entry>
        <Entry>
          <Name>MG.C</Name>
          <Value>mg.C</Value>
          <Message>Multi-Grid on a sequence of meshes</Message>
        </Entry>
        <Entry>
          <Name>CG.C</Name>
          <Value>cg.C</Value>
          <Message>Conjugate Gradient</Message>
        </Entry>
      </Menu>
    </Option>
  </TestSettings>
</PhoronixTestSuite>
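For unattended runs, the run-test option above can be pre-seeded instead of answered interactively; a hedged example using the PRESET_OPTIONS environment variable that the Phoronix Test Suite supports for batch mode (test-name.option-identifier=value form):

    PRESET_OPTIONS="npb.run-test=ep.C" phoronix-test-suite batch-benchmark pts/npb-1.4.0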