sotr-RTX2080-nvidia-drm-495
EVGA GeForce RTX 2080 FTW3 Ultra 8GB GPU (90.04.0b.80.79 VBIOS) tested with an NZXT N7 Z490 motherboard (DDR4-3200, 16-18-18-36) and an Intel Core i5-11600K on Ubuntu 21.10 via the Phoronix Test Suite.
RTX 2080
Processor: Intel Core i5-11600K @ 4.90GHz (6 Cores / 12 Threads), Motherboard: NZXT N7 Z490 (P1.80 BIOS), Chipset: Intel Comet Lake PCH, Memory: 16GB, Disk: 1000GB Western Digital WDS100T3XHC-00SJG0 + 512GB INTEL SSDPEKKW512G8 + 4001GB Seagate ST4000VN008-2DR1, Graphics: NVIDIA GeForce RTX 2080 8GB, Audio: Realtek ALC1220, Monitor: LG TV SSCR2, Network: Realtek RTL8125 2.5GbE + Intel Wi-Fi 6 AX200
OS: Ubuntu 21.10, Kernel: 5.13.0-22-generic (x86_64), Desktop: GNOME Shell 40.5, Display Server: X Server 1.20.13, Display Driver: NVIDIA 495.44, OpenGL: 4.6.0, OpenCL: OpenCL 3.0 CUDA 11.5.100, Vulkan: 1.2.186, Compiler: GCC 10.3.0 + Clang 13.0.0-2, File-System: ext4, Screen Resolution: 3840x2160
Kernel Notes: Transparent Huge Pages: madvise
Processor Notes: Scaling Governor: intel_pstate powersave - CPU Microcode: 0x40 - Thermald 2.4.6
Graphics Notes: BAR1 / Visible vRAM Size: 256 MiB
Security Notes: itlb_multihit: Not affected + l1tf: Not affected + mds: Not affected + meltdown: Not affected + spec_store_bypass: Mitigation of SSB disabled via prctl and seccomp + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Enhanced IBRS IBPB: conditional RSB filling + srbds: Not affected + tsx_async_abort: Not affected
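The kernel and processor notes above come from standard Linux interfaces and can be re-checked by hand. A minimal sketch, assuming the usual sysfs/procfs paths (values will of course differ on other systems):

```shell
# Reproduce the "Kernel Notes" / "Processor Notes" readings above.
# These are standard sysfs/procfs locations on Linux; fall back to a
# message if an interface is not exposed on this machine.
cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null \
  || echo "THP interface unavailable"                 # e.g. always [madvise] never
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor 2>/dev/null \
  || echo "cpufreq interface unavailable"             # e.g. powersave
grep -m1 microcode /proc/cpuinfo 2>/dev/null \
  || echo "microcode info unavailable"                # e.g. microcode : 0x40
grep -r . /sys/devices/system/cpu/vulnerabilities/ 2>/dev/null \
  || echo "vulnerability reporting unavailable"       # the "Security Notes" data
```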
Shadow of the Tomb Raider
Shadow of the Tomb Raider on Steam. The test profile assumes you have a Steam account, have Steam installed on the system, and own a copy of this game. It automates launching the game and running its built-in benchmark mode. Existing preferences (in ~/.local/share/feral-interactive/) are backed up for the run. Learn more via the OpenBenchmarking.org test page.
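The automated run described above is driven by the Phoronix Test Suite CLI. A hedged sketch of the invocation — the exact test-profile name is an assumption here and should be confirmed with `phoronix-test-suite list-available-tests`:

```shell
# Run the Shadow of the Tomb Raider test profile via PTS.
# Profile name "pts/shadow-of-the-tomb-raider" is an assumption;
# verify it against the OpenBenchmarking.org test page first.
if command -v phoronix-test-suite >/dev/null 2>&1; then
  phoronix-test-suite benchmark pts/shadow-of-the-tomb-raider
else
  echo "phoronix-test-suite not installed"
fi
```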
CPU Peak Freq (Highest CPU Core Frequency) Monitor
CPU Temperature Monitor
CPU Usage (Summary) Monitor
GPU Fan Speed Monitor
GPU Frequency Monitor
GPU Memory Usage Monitor
GPU Power Consumption Monitor
GPU Temperature Monitor
Memory Usage Monitor
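The sensors listed above can be recorded alongside a benchmark via the Phoronix Test Suite MONITOR environment variable. The sensor identifiers below are assumptions following PTS naming conventions; the authoritative list on a given system comes from `phoronix-test-suite list-sensors`:

```shell
# Attach the system monitors above to a benchmark run (sketch).
# Sensor names (cpu.temp, gpu.temp, etc.) are assumed PTS identifiers;
# confirm with `phoronix-test-suite list-sensors` before relying on them.
if command -v phoronix-test-suite >/dev/null 2>&1; then
  MONITOR=cpu.peak-freq,cpu.temp,cpu.usage,gpu.fan-speed,gpu.freq,gpu.memory-usage,gpu.power,gpu.temp,memory.usage \
    phoronix-test-suite benchmark pts/shadow-of-the-tomb-raider
else
  echo "phoronix-test-suite not installed; sensors would be logged with the run"
fi
```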
Testing initiated at 30 November 2021 07:50 by user tad.