Ubuntu 18.04 Performance
Various open-source benchmarks by the Phoronix Test Suite v8.6.1 (Spydeberg).
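The results below come from the Phoronix Test Suite; a run like this can be reproduced with roughly the following steps (a sketch, assuming the stock Ubuntu 18.04 archive carries the suite; package availability may vary):

```shell
# Install the Phoronix Test Suite from the Ubuntu archive.
sudo apt-get update
sudo apt-get install -y phoronix-test-suite

# Confirm the installed version; this report was generated with
# v8.6.1 "Spydeberg".
phoronix-test-suite version
```

Individual test profiles are then fetched and built on demand from OpenBenchmarking.org when a benchmark is requested.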
NVIDIA GeForce RTX 2080
Processor: Intel Core i7-9700K @ 4.90GHz (8 Cores), Motherboard: Notebook P7xxTM1 (1.07.20 BIOS), Chipset: Intel Device 3e30, Memory: 2 x 16384 MB DDR4-2667MT/s CM4X16GE2666C18S4, Disk: 1000GB Samsung SSD 970 EVO Plus 1TB, Graphics: NVIDIA GeForce RTX 2080 8GB (300/405MHz), Audio: Realtek ALC898, Network: Qualcomm Atheros Killer E2500 + Intel-AC 9260
OS: Ubuntu 18.04, Kernel: 4.18.0-17-generic (x86_64), Desktop: GNOME Shell 3.28.3, Display Server: X Server 1.20.1, Display Driver: NVIDIA 418.40.04, Compiler: GCC 7.3.0 + CUDA 10.1, File-System: ext4, Screen Resolution: 3840x2160
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance
Security Notes: __user pointer sanitization + Full generic retpoline IBPB IBRS_FW + SSB disabled via prctl and seccomp
NVIDIA GeForce GTX 1050 Ti
Processor: Intel Core i5-8400 @ 3.80GHz (6 Cores), Motherboard: MSI B360M GAMING PLUS (MS-7B19) v1.0 (1.10 BIOS), Chipset: Intel Cannon Lake PCH Shared SRAM, Memory: 16384MB, Disk: 14GB INTEL MEMPEK1J016GAH + 1000GB Seagate ST1000DM003-1SB1 + 250GB Crucial_CT250MX2 + 500GB Western Digital WDBNCE5000PN, Graphics: MSI NVIDIA GeForce GTX 1050 Ti 4GB (1354/3504MHz), Audio: Realtek ALC887-VD, Monitor: DELL SE2416H, Network: Intel I219-V
OS: Ubuntu 18.04, Kernel: 4.15.0-47-generic (x86_64), Desktop: MATE 1.20.1, Display Server: X Server 1.19.6, Display Driver: NVIDIA 418.56, OpenGL: 4.6.0, Compiler: GCC 7.3.0, File-System: ext4, Screen Resolution: 3840x1080
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance
Security Notes: KPTI + __user pointer sanitization + Full generic retpoline IBPB IBRS_FW + SSB disabled via prctl and seccomp + PTE Inversion; VMX: conditional cache flushes SMT disabled
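The "Processor Notes" and "Security Notes" lines above are read from sysfs; on any Linux system they can be inspected directly (a sketch; the exact vulnerability entries listed depend on the kernel version):

```shell
# CPU frequency scaling governor, as recorded in "Processor Notes"
# (both systems here were pinned to "performance" under intel_pstate).
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Kernel mitigation status for each known CPU vulnerability; this is
# the source of the "Security Notes" summary (KPTI, retpoline, SSB, etc.).
grep -r . /sys/devices/system/cpu/vulnerabilities/
```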
SciMark
This test runs the ANSI C version of SciMark 2.0, a benchmark for scientific and numerical computing developed at the National Institute of Standards and Technology. It is made up of Fast Fourier Transform, Jacobi Successive Over-Relaxation, Monte Carlo, Sparse Matrix Multiply, and dense LU matrix factorization kernels. Learn more via the OpenBenchmarking.org test page.
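With the Phoronix Test Suite installed, the SciMark test can be run on its own (a sketch; the profile name on OpenBenchmarking.org is assumed to be pts/scimark2):

```shell
# Download, build, and run the SciMark 2.0 profile; the suite prompts
# for which sub-test (FFT, SOR, Monte Carlo, etc.) or composite to run.
phoronix-test-suite benchmark scimark2
```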
Radiance Benchmark
This is a benchmark of NREL Radiance, an open-source synthetic imaging system developed by the Lawrence Berkeley National Laboratory in California. Learn more via the OpenBenchmarking.org test page.
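The Radiance benchmark can likewise be invoked standalone through the suite (a sketch; the profile name is assumed to be pts/radiance):

```shell
# Download, build, and run the NREL Radiance rendering benchmark.
phoronix-test-suite benchmark radiance
```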
NVIDIA GeForce RTX 2080
Testing initiated at 4 April 2019 23:30 by user root.
NVIDIA GeForce GTX 1050 Ti
Testing initiated at 6 April 2019 08:10 by user rozene.