OIDN + LCR

Intel Core i9-7980XE testing with an ASUS PRIME X299-A (1704 BIOS) and NVIDIA GeForce GTX TITAN X 12GB on Ubuntu 19.04 via the Phoronix Test Suite.

Intel Core i9-7980XE:

  Processor: Intel Core i9-7980XE @ 4.20GHz (18 Cores / 36 Threads), Motherboard: ASUS PRIME X299-A (1704 BIOS), Chipset: Intel Sky Lake-E DMI3 Registers, Memory: 16384MB, Disk: Samsung SSD 970 EVO 500GB, Graphics: NVIDIA GeForce GTX TITAN X 12GB (1001/3505MHz), Audio: Realtek ALC1220, Monitor: ASUS PB278, Network: Intel I219-V

  OS: Ubuntu 19.04, Kernel: 5.0.0-29-generic (x86_64), Desktop: GNOME Shell 3.32.0, Display Server: X Server 1.20.4, Display Driver: NVIDIA 418.56, OpenGL: 4.6.0, OpenCL: OpenCL 1.2 CUDA 10.1.133, Compiler: GCC 8.3.0, File-System: ext4, Screen Resolution: 2560x1440

Intel Open Image Denoise 1.0.0
Scene: Memorial
Images / Sec > Higher Is Better
Intel Core i9-7980XE . 23.51 |=================================================

LuxCoreRender 2.2
Scene: DLSC
M samples/sec > Higher Is Better
Intel Core i9-7980XE . 2.84 |==================================================

LuxCoreRender 2.2
Scene: Rainbow Colors and Prism
M samples/sec > Higher Is Better
Intel Core i9-7980XE . 2.65 |==================================================