hidei9
Intel Core i9-9900KS testing with a Gigabyte Z390 AORUS PRO-CF (F12d BIOS) and Gigabyte NVIDIA GeForce RTX 2080 SUPER 8GB on LinuxMint 19.3 via the Phoronix Test Suite.
Intel Core i9-9900KS
Compiler Notes: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,brig,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-targets=nvptx-none --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic --without-cuda-driver -v
Processor Notes: Scaling Governor: intel_pstate powersave - CPU Microcode: 0xca
Security Notes: l1tf: Not affected + mds: Not affected + meltdown: Not affected + spec_store_bypass: Mitigation of SSB disabled via prctl and seccomp + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Enhanced IBRS IBPB: conditional RSB filling
Gigabyte NVIDIA GeForce RTX 2080 SUPER
Processor: Intel Core i9-9900KS @ 5.00GHz (8 Cores / 16 Threads), Motherboard: Gigabyte Z390 AORUS PRO-CF (F12d BIOS), Chipset: Intel Cannon Lake PCH, Memory: 16384MB, Disk: 250GB Western Digital WDS250G2B0B + SSD 128GB + 2000GB External HDD, Graphics: Gigabyte NVIDIA GeForce RTX 2080 SUPER 8GB (1650/7750MHz), Audio: Realtek ALC1220, Monitor: UHD HDMI1, Network: Intel I219-V
OS: LinuxMint 19.3, Kernel: 5.0.0-32-generic (x86_64), Desktop: Cinnamon 4.4.8, Display Server: X Server 1.20.4, Display Driver: NVIDIA 435.21, OpenGL: 4.6.0, Compiler: GCC 7.4.0, File-System: ext4, Screen Resolution: 1920x1200
Processor Notes: Scaling Governor: intel_pstate powersave - CPU Microcode: 0xca
Python Notes: Python 2.7.17 + Python 3.6.9
Security Notes: l1tf: Not affected + mds: Not affected + meltdown: Not affected + spec_store_bypass: Mitigation of SSB disabled via prctl and seccomp + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Enhanced IBRS IBPB: conditional RSB filling
C-Ray
This is a test of C-Ray, a simple raytracer designed to test floating-point CPU performance. This test is multi-threaded (16 threads per core), shoots 8 rays per pixel for anti-aliasing, and generates a 1600 x 1200 image. Learn more via the OpenBenchmarking.org test page.
Crafty
This is a performance test of Crafty, an advanced open-source chess engine. Learn more via the OpenBenchmarking.org test page.
GLmark2
This is a test of Linaro's glmark2 port, currently using the X11 OpenGL 2.0 target. GLmark2 is a basic OpenGL benchmark. Learn more via the OpenBenchmarking.org test page.
Unigine Heaven
This test calculates the average frame-rate within the Heaven demo for the Unigine engine. This engine is extremely demanding on the system's graphics card. Learn more via the OpenBenchmarking.org test page.
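The four tests above can be reproduced through the Phoronix Test Suite. A minimal sketch, assuming PTS is installed and that the OpenBenchmarking.org profile identifiers below (pts/c-ray, pts/crafty, pts/glmark2, pts/unigine-heaven) still match the tests in this report:

```shell
#!/bin/sh
# Sketch: print the phoronix-test-suite command for each profile in this
# comparison. The profile names are assumptions based on the test titles;
# drop the "echo" inside pts_command to actually run a benchmark.
pts_command() {
    # Build (and here, only print) the benchmark invocation for one profile.
    echo "phoronix-test-suite benchmark $1"
}

for profile in pts/c-ray pts/crafty pts/glmark2 pts/unigine-heaven; do
    pts_command "$profile"
done
```

Running a profile interactively prompts for test options (e.g. glmark2 resolution) and for saving the result file; batch runs can instead use `phoronix-test-suite batch-benchmark` after configuring batch mode.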
Intel Core i9-9900KS
Testing initiated at 15 January 2020 09:43 by user hideaki.
Gigabyte NVIDIA GeForce RTX 2080 SUPER
Testing initiated at 15 January 2020 10:06 by user hideaki.