Intel + Gigabyte NVIDIA GeForce GTX 1070 8GB + Supermicro X10DAL-i v1.02 (2.0b BIOS)
Various open-source benchmarks run via the Phoronix Test Suite v9.6.0 (Nittedal).
Gigabyte NVIDIA GeForce GTX 1070
Processor: Intel @ 2.80GHz (12 Cores / 24 Threads), Motherboard: Supermicro X10DAL-i v1.02 (2.0b BIOS), Chipset: Intel Xeon E7 v3/Xeon, Memory: 64GB, Disk: 500GB Samsung SSD 860 + 2 x 4001GB Western Digital WD40EFRX-68N + 2000GB Western Digital WD20EZRZ-00Z + 3001GB Seagate ST3000DM007-1WY1, Graphics: Gigabyte NVIDIA GeForce GTX 1070 8GB (1556/4006MHz), Audio: NVIDIA GP104 HD Audio, Monitor: PHL 246V5, Network: 2 x Intel I210 + Intel 7260
OS: Ubuntu 18.04, Kernel: 4.15.0-99-generic (x86_64) 20200423, Desktop: GNOME Shell 3.28.4, Display Server: X Server 1.19.6, Display Driver: NVIDIA 440.82, OpenGL: 4.6.0, Compiler: GCC 7.5.0 + CUDA 9.1, File-System: ext4, Screen Resolution: 3840x1080
Processor Notes: Scaling Governor: intel_pstate powersave - CPU Microcode: 0x14
Java Notes: OpenJDK Runtime Environment (build 1.8.0_252-8u252-b09-1~18.04-b09)
Python Notes: Python 2.7.17 + Python 3.6.9
Security Notes: itlb_multihit: KVM: Mitigation of Split huge pages + l1tf: Mitigation of PTE Inversion; VMX: conditional cache flushes SMT vulnerable + mds: Vulnerable: Clear buffers attempted no microcode; SMT vulnerable + meltdown: Mitigation of PTI + spec_store_bypass: Vulnerable + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Full generic retpoline STIBP: disabled RSB filling + tsx_async_abort: Vulnerable: Clear buffers attempted no microcode; SMT vulnerable
LuxCoreRender
LuxCoreRender is an open-source physically based renderer. This test profile is focused on running LuxCoreRender on the CPU as opposed to the OpenCL version. Learn more via the OpenBenchmarking.org test page.
Apache Cassandra
This is a benchmark of the Apache Cassandra NoSQL database management system using its bundled cassandra-stress tool. Learn more via the OpenBenchmarking.org test page.
DeepSpeech
Mozilla DeepSpeech is a speech-to-text engine powered by TensorFlow for machine learning and derived from Baidu's Deep Speech research paper. This test profile times the speech-to-text process for a roughly three-minute audio recording. Learn more via the OpenBenchmarking.org test page.
Testing initiated at 21 May 2020 02:08 by user steve.