2 x Intel Xeon Gold 6244 testing with a Dell 060K5C (2.4.1 BIOS) and NVIDIA Quadro GV100 32GB on Ubuntu 20.04.6 LTS via the Phoronix Test Suite.
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate powersave (EPP: balance_performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance (EPP: performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
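The two configurations above differ only in the CPU frequency scaling policy (intel_pstate powersave with EPP balance_performance versus performance with EPP performance). As a minimal sketch, assuming an intel_pstate system that exposes these knobs through sysfs, the settings recorded in these notes can be read back as follows; the file paths are standard Linux sysfs locations, not Phoronix Test Suite output.

```python
from pathlib import Path

# Sysfs locations for the settings recorded in the notes above.
SETTINGS = {
    "scaling_governor": "/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor",
    "energy_performance_preference": "/sys/devices/system/cpu/cpu0/cpufreq/energy_performance_preference",
    "transparent_hugepage": "/sys/kernel/mm/transparent_hugepage/enabled",
}

for name, path in SETTINGS.items():
    try:
        value = Path(path).read_text().strip()
    except OSError:
        value = "unavailable"  # not every kernel/driver exposes every file
    print(f"{name}: {value}")
```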
Processor: 2 x Intel Xeon Gold 6244 @ 4.40GHz (16 Cores / 32 Threads), Motherboard: Dell 060K5C (2.4.1 BIOS), Memory: 128GB, Disk: PM981a NVMe SAMSUNG 2048GB + 4 x 8002GB TOSHIBA MG06ACA8, Graphics: NVIDIA Quadro GV100 32GB
OS: Ubuntu 20.04.6 LTS, Kernel: 3.10.0-1160.95.1.el7.x86_64 (x86_64), Display Driver: NVIDIA, Vulkan: 1.1.182, Compiler: GCC 9.4.0 + CUDA 12.0, File-System: xfs, Screen Resolution: 800x600
This test profile uses the PlaidML deep learning framework, developed by Intel, to offer up various benchmarks. Learn more via the OpenBenchmarking.org test page.
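For context, the CPU inference cases listed below roughly correspond to running a Keras model through the PlaidML backend and timing forward passes. The following is a minimal sketch, assuming the plaidml-keras package and a Keras 2.2-era install; the batch size and iteration count are illustrative choices, not the test profile's actual parameters.

```python
import time
import numpy as np

import plaidml.keras
plaidml.keras.install_backend()   # route Keras calls through PlaidML

from keras.applications.vgg16 import VGG16

model = VGG16(weights=None)       # random weights, avoids any download
batch = np.random.rand(1, 224, 224, 3).astype("float32")

model.predict(batch)              # first call compiles the PlaidML program
start = time.time()
for _ in range(10):
    model.predict(batch)
elapsed = time.time() - start
print("examples/sec: %.2f" % (10 / elapsed))
```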
FP16: No - Mode: Training - Network: VGG16 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: VGG19 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: VGG16 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: VGG19 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: VGG16 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: VGG19 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: VGG16 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: VGG19 - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: IMDB LSTM - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: Mobilenet - Device: CPU
2024-02-05 13:58: The test run did not produce a result.
2024-02-05 18:36: The test run did not produce a result.
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: IMDB LSTM - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: Mobilenet - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: IMDB LSTM - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: Mobilenet - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: ResNet 50 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: IMDB LSTM - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: Mobilenet - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: ResNet 50 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: DenseNet 201 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: Inception V3 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Training - Network: NASNet Large - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: DenseNet 201 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: Inception V3 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: No - Mode: Inference - Network: NASNet Large - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: DenseNet 201 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: Inception V3 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Training - Network: NASNet Large - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: DenseNet 201 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: Inception V3 - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
FP16: Yes - Mode: Inference - Network: NASNet Large - Device: CPU
2024-02-06 08:56: The test run did not produce a result.
NCNN is a high-performance neural network inference framework developed by Tencent and optimized for mobile and other platforms. Learn more via the OpenBenchmarking.org test page.
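As a minimal sketch of what NCNN inference looks like through its Python bindings (shown for illustration only), the following assumes the ncnn pip package and a model already converted to NCNN's .param/.bin format; the file names, input size, and blob names are placeholders rather than values taken from this benchmark.

```python
import numpy as np
import ncnn

net = ncnn.Net()
net.opt.use_vulkan_compute = False    # CPU-only, matching the runs above

# Placeholder model files: any network converted to NCNN's format.
net.load_param("model.param")
net.load_model("model.bin")

# Dummy 227x227 BGR image; blob names "data" and "prob" are placeholders.
img = np.zeros((227, 227, 3), dtype=np.uint8)
mat_in = ncnn.Mat.from_pixels_resize(
    img, ncnn.Mat.PixelType.PIXEL_BGR, img.shape[1], img.shape[0], 227, 227
)

ex = net.create_extractor()
ex.input("data", mat_in)
ret, mat_out = ex.extract("prob")
print(ret, np.array(mat_out).shape)
```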
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate powersave (EPP: balance_performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Testing initiated at 5 February 2024 13:53 by user melkor.
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate powersave (EPP: balance_performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Testing initiated at 5 February 2024 13:58 by user melkor.
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance (EPP: performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Testing initiated at 5 February 2024 18:03 by user melkor.
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance (EPP: performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Testing initiated at 5 February 2024 18:36 by user melkor.
Processor: 2 x Intel Xeon Gold 6244 @ 4.40GHz (16 Cores / 32 Threads), Motherboard: Dell 060K5C (2.4.1 BIOS), Memory: 128GB, Disk: PM981a NVMe SAMSUNG 2048GB + 4 x 8002GB TOSHIBA MG06ACA8, Graphics: NVIDIA Quadro GV100 32GB
OS: Ubuntu 20.04.6 LTS, Kernel: 3.10.0-1160.95.1.el7.x86_64 (x86_64), Display Driver: NVIDIA, Vulkan: 1.1.182, Compiler: GCC 9.4.0 + CUDA 12.0, File-System: xfs, Screen Resolution: 800x600
Kernel Notes: Transparent Huge Pages: always
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++,gm2 --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none=/build/gcc-9-9QDOt0/gcc-9-9.4.0/debian/tmp-nvptx/usr,hsa --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance (EPP: performance) - CPU Microcode: 0x5003604
Python Notes: Python 3.8.10
Testing initiated at 6 February 2024 08:56 by user melkor.