Intel Core i7-8086K testing with an ASUS PRIME Z370-A (1002 BIOS) and NVIDIA GeForce RTX 2080 Ti / GTX 1080 Ti graphics on Ubuntu 18.04 via the Phoronix Test Suite.
Processor: Intel Core i7-8086K @ 5.00GHz (6 Cores / 12 Threads), Motherboard: ASUS PRIME Z370-A (1002 BIOS), Chipset: Intel Device 3ec2, Memory: 16384MB, Disk: 525GB SABRENT + 118GB INTEL SSDPEK1W120GA, Graphics: NVIDIA GeForce RTX 2080 Ti 11264MB (1350/7000MHz), Audio: Realtek ALC1220, Monitor: Acer B286HK, Network: Intel Connection
OS: Ubuntu 18.04, Kernel: 4.18.0-041800-generic (x86_64), Desktop: GNOME Shell 3.28.3, Display Server: X Server 1.19.6, Display Driver: NVIDIA 410.57, OpenGL: 4.6.0, Compiler: GCC 7.3.0, File-System: ext4, Screen Resolution: 3840x2160
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-as=/usr/bin/x86_64-linux-gnu-as --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-ld=/usr/bin/x86_64-linux-gnu-ld --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance
Python Notes: Python 2.7.15rc1 + Python 3.6.5
Security Notes: KPTI + __user pointer sanitization + Full generic retpoline IBPB IBRS_FW + SSB disabled via prctl and seccomp
For the second test configuration, the Chipset changed to Intel 8th Gen Core and the Graphics to NVIDIA GeForce GTX 1080 Ti 11264MB (1480/5508MHz).
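
The Processor Notes and Security Notes above can be checked on any recent Linux system directly from sysfs. The following is a minimal sketch, assuming the standard cpufreq and vulnerabilities sysfs paths; it is an illustration, not something taken from this result file.

#!/usr/bin/env python3
# Sketch: print the CPU scaling governor and the kernel's reported mitigation
# status, i.e. the information summarized in the Processor Notes and Security
# Notes fields above. Paths are the standard Linux sysfs locations.
from pathlib import Path

def read(path):
    try:
        return Path(path).read_text().strip()
    except OSError:
        return "unavailable"

print("Scaling Governor:", read("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor"))

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        print(entry.name + ":", read(entry))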
GpuTest is a cross-platform OpenGL benchmark developed at Geeks3D.com that offers tech demos such as FurMark, TessMark, and other workloads to stress various areas of GPUs and drivers. Learn more via the OpenBenchmarking.org test page.
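
As a rough illustration of how such a run could be reproduced, the GpuTest profile can be installed and executed through the Phoronix Test Suite command line. The short profile name "gputest" and a phoronix-test-suite binary on the PATH are assumptions here, not details taken from the results above.

#!/usr/bin/env python3
# Sketch: install and run the GpuTest profile via the Phoronix Test Suite CLI.
# Assumes phoronix-test-suite is on PATH and that "gputest" is the profile name.
import subprocess

# "install" downloads the test profile and its dependencies.
subprocess.run(["phoronix-test-suite", "install", "gputest"], check=True)

# "benchmark" runs the test, prompting interactively for options such as the
# GpuTest sub-test (FurMark, TessMark, ...) and the resolution to use.
subprocess.run(["phoronix-test-suite", "benchmark", "gputest"], check=True)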
This test calculates the average frame-rate within the Heaven demo for the Unigine engine. This engine is extremely demanding on the system's graphics card. Learn more via the OpenBenchmarking.org test page.
This test calculates the average frame-rate within the Superposition demo for the Unigine engine, released in 2017. This engine is extremely demanding on the system's graphics card. Learn more via the OpenBenchmarking.org test page.
This test calculates the average frame-rate within the Valley demo for the Unigine engine, released in February 2013. This engine is extremely demanding on the system's graphics card. Unigine Valley relies upon an OpenGL 3 core profile context. Learn more via the OpenBenchmarking.org test page.
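
Similarly, the three Unigine tests above could be queued back to back in the suite's batch mode. The profile names (unigine-heaven, unigine-super, unigine-valley) and the default results location are assumptions rather than details from this result file.

#!/usr/bin/env python3
# Sketch: run the Heaven, Superposition, and Valley profiles in batch mode and
# list any saved result files. Profile names and the default results directory
# are assumptions, not taken from the results above.
import subprocess
from pathlib import Path

# batch-setup records answers to the usual prompts once, so batch-benchmark
# can then run unattended.
subprocess.run(["phoronix-test-suite", "batch-setup"], check=True)
subprocess.run(
    ["phoronix-test-suite", "batch-benchmark",
     "unigine-heaven", "unigine-super", "unigine-valley"],
    check=True,
)

# Saved results usually land under ~/.phoronix-test-suite/test-results/<name>/composite.xml.
results_dir = Path.home() / ".phoronix-test-suite" / "test-results"
for xml in sorted(results_dir.glob("*/composite.xml")):
    print(xml)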
Processor: Intel Core i7-8086K @ 5.00GHz (6 Cores / 12 Threads), Motherboard: ASUS PRIME Z370-A (1002 BIOS), Chipset: Intel Device 3ec2, Memory: 16384MB, Disk: 525GB SABRENT + 118GB INTEL SSDPEK1W120GA, Graphics: NVIDIA GeForce RTX 2080 Ti 11264MB (1350/7000MHz), Audio: Realtek ALC1220, Monitor: Acer B286HK, Network: Intel Connection
OS: Ubuntu 18.04, Kernel: 4.18.0-041800-generic (x86_64), Desktop: GNOME Shell 3.28.3, Display Server: X Server 1.19.6, Display Driver: NVIDIA 410.57, OpenGL: 4.6.0, Compiler: GCC 7.3.0, File-System: ext4, Screen Resolution: 3840x2160
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-as=/usr/bin/x86_64-linux-gnu-as --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-ld=/usr/bin/x86_64-linux-gnu-ld --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate performance
Python Notes: Python 2.7.15rc1 + Python 3.6.5
Security Notes: KPTI + __user pointer sanitization + Full generic retpoline IBPB IBRS_FW + SSB disabled via prctl and seccomp
Testing initiated at 19 September 2018 18:26 by user phoronix.
Processor: Intel Core i7-8086K @ 5.00GHz (6 Cores / 12 Threads), Motherboard: ASUS PRIME Z370-A (1002 BIOS), Chipset: Intel 8th Gen Core, Memory: 16384MB, Disk: 525GB SABRENT + 118GB INTEL SSDPEK1W120GA, Graphics: NVIDIA GeForce GTX 1080 Ti 11264MB (1480/5508MHz), Audio: Realtek ALC1220, Monitor: Acer B286HK, Network: Intel Connection
OS: Ubuntu 18.04, Kernel: 4.18.0-041800-generic (x86_64), Desktop: GNOME Shell 3.28.3, Display Server: X Server 1.19.6, Display Driver: NVIDIA 410.57, OpenGL: 4.6.0, Compiler: GCC 7.3.0, File-System: ext4, Screen Resolution: 3840x2160
Processor Notes: Scaling Governor: intel_pstate performance
Python Notes: Python 2.7.15rc1 + Python 3.6.5
Security Notes: KPTI + __user pointer sanitization + Full generic retpoline IBPB IBRS_FW + SSB disabled via prctl and seccomp
Testing initiated at 19 September 2018 19:36 by user phoronix.