onednn 3.0 raptor lake
Intel Core i9-13900K testing with an ASUS PRIME Z790-P WIFI (0602 BIOS) and eVGA NVIDIA GeForce RTX 3060 12GB on Ubuntu 22.10 via the Phoronix Test Suite.

a:
  Processor: Intel Core i9-13900K @ 4.00GHz (24 Cores / 32 Threads), Motherboard: ASUS PRIME Z790-P WIFI (0602 BIOS), Chipset: Intel Device 7a27, Memory: 32GB, Disk: 1000GB Western Digital WDS100T1X0E-00AFY0, Graphics: eVGA NVIDIA GeForce RTX 3060 12GB, Audio: Realtek ALC897, Monitor: ASUS VP28U, Network: Realtek RTL8125 2.5GbE + Intel Device 7a70
  OS: Ubuntu 22.10, Kernel: 5.19.0-26-generic (x86_64), Desktop: GNOME Shell 43.1, Display Server: X Server 1.21.1.4, Display Driver: NVIDIA 525.60.11, OpenGL: 4.6.0, OpenCL: OpenCL 3.0 CUDA 12.0.89, Vulkan: 1.3.224, Compiler: GCC 12.2.0, File-System: ext4, Screen Resolution: 2560x1600

b:
  Processor: Intel Core i9-13900K @ 4.00GHz (24 Cores / 32 Threads), Motherboard: ASUS PRIME Z790-P WIFI (0602 BIOS), Chipset: Intel Device 7a27, Memory: 32GB, Disk: 1000GB Western Digital WDS100T1X0E-00AFY0, Graphics: eVGA NVIDIA GeForce RTX 3060 12GB, Audio: Realtek ALC897, Monitor: ASUS VP28U, Network: Realtek RTL8125 2.5GbE + Intel Device 7a70
  OS: Ubuntu 22.10, Kernel: 5.19.0-26-generic (x86_64), Desktop: GNOME Shell 43.1, Display Server: X Server 1.21.1.4, Display Driver: NVIDIA 525.60.11, OpenGL: 4.6.0, OpenCL: OpenCL 3.0 CUDA 12.0.89, Vulkan: 1.3.224, Compiler: GCC 12.2.0, File-System: ext4, Screen Resolution: 2560x1600

c:
  Processor: Intel Core i9-13900K @ 4.00GHz (24 Cores / 32 Threads), Motherboard: ASUS PRIME Z790-P WIFI (0602 BIOS), Chipset: Intel Device 7a27, Memory: 32GB, Disk: 1000GB Western Digital WDS100T1X0E-00AFY0, Graphics: eVGA NVIDIA GeForce RTX 3060 12GB, Audio: Realtek ALC897, Monitor: ASUS VP28U, Network: Realtek RTL8125 2.5GbE + Intel Device 7a70
  OS: Ubuntu 22.10, Kernel: 5.19.0-26-generic (x86_64), Desktop: GNOME Shell 43.1, Display Server: X Server 1.21.1.4, Display Driver: NVIDIA 525.60.11, OpenGL: 4.6.0, OpenCL: OpenCL 3.0 CUDA 12.0.89, Vulkan: 1.3.224, Compiler: GCC 12.2.0, File-System: ext4, Screen Resolution: 2560x1600

oneDNN 3.0
Harness: IP Shapes 1D - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 1.71588 |============================================================
b . 1.88491 |==================================================================
c . 1.86077 |=================================================================

oneDNN 3.0
Harness: IP Shapes 3D - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 4.15008 |==================================================================
b . 3.93573 |===============================================================
c . 3.84879 |=============================================================

oneDNN 3.0
Harness: IP Shapes 1D - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 0.786709 |=========================================================
b . 0.863089 |===============================================================
c . 0.891727 |=================================================================

oneDNN 3.0
Harness: IP Shapes 3D - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 0.629303 |=================================================================
b . 0.621966 |================================================================
c . 0.599391 |==============================================================

oneDNN 3.0
Harness: Convolution Batch Shapes Auto - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 5.76384 |==================================================================
b . 5.75816 |==================================================================
c . 5.75127 |==================================================================

oneDNN 3.0
Harness: Deconvolution Batch shapes_1d - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 7.47730 |===============================================================
b . 7.36674 |==============================================================
c . 7.86792 |==================================================================

oneDNN 3.0
Harness: Deconvolution Batch shapes_3d - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 3.42736 |==================================================================
b . 3.42391 |==================================================================
c . 3.42519 |==================================================================

oneDNN 3.0
Harness: Convolution Batch Shapes Auto - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 5.91006 |==================================================================
b . 5.89829 |==================================================================
c . 5.86806 |==================================================================

oneDNN 3.0
Harness: Deconvolution Batch shapes_1d - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 0.955404 |================================================================
b . 0.972727 |=================================================================
c . 0.937470 |===============================================================

oneDNN 3.0
Harness: Deconvolution Batch shapes_3d - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 1.44911 |==================================================================
b . 1.44938 |==================================================================
c . 1.44885 |==================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Training - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 2119.68 |==================================================================
b . 2122.24 |==================================================================
c . 2089.67 |=================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Inference - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 1087.51 |=================================================================
b . 1073.62 |================================================================
c . 1104.46 |==================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Training - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 2126.29 |=================================================================
b . 2143.27 |==================================================================
c . 2097.67 |=================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Inference - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 1098.42 |==================================================================
b . 1099.64 |==================================================================
c . 1095.20 |==================================================================

oneDNN 3.0
Harness: Matrix Multiply Batch Shapes Transformer - Data Type: f32 - Engine: CPU
ms < Lower Is Better
a . 1.269326 |============================================================
b . 1.179855 |=======================================================
c . 1.382935 |=================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Training - Data Type: bf16bf16bf16 - Engine: CPU
ms < Lower Is Better
a . 2117.94 |=================================================================
b . 2148.96 |==================================================================
c . 2113.69 |=================================================================

oneDNN 3.0
Harness: Recurrent Neural Network Inference - Data Type: bf16bf16bf16 - Engine: CPU
ms < Lower Is Better
a . 1084.75 |=================================================================
b . 1095.58 |==================================================================
c . 1083.23 |=================================================================

oneDNN 3.0
Harness: Matrix Multiply Batch Shapes Transformer - Data Type: u8s8f32 - Engine: CPU
ms < Lower Is Better
a . 0.783081 |=================================================================
b . 0.786177 |=================================================================
c . 0.759611 |===============================================================
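For readers unfamiliar with what an "IP Shapes" harness exercises, the sketch below is a minimal, hypothetical example of creating and executing a single f32 inner-product (IP) primitive on the oneDNN 3.0 CPU engine. The tensor shapes, the forward-inference propagation kind, and the plain nc/oi memory formats are illustrative assumptions for this sketch only; they are not the shapes or configuration the benchmark harness itself uses.

// Minimal sketch, not the benchmark harness: one f32 inner-product primitive
// on the oneDNN 3.0 CPU engine ("Engine: CPU" in the results above).
#include <dnnl.hpp>
#include <cstring>
#include <vector>

int main() {
    using namespace dnnl;

    engine eng(engine::kind::cpu, 0);   // CPU engine
    stream strm(eng);

    // Hypothetical shapes: batch 2, 16 input channels, 8 output channels.
    memory::dims src_dims = {2, 16};    // N x IC
    memory::dims wei_dims = {8, 16};    // OC x IC
    memory::dims dst_dims = {2, 8};     // N x OC

    auto src_md = memory::desc(src_dims, memory::data_type::f32, memory::format_tag::nc);
    auto wei_md = memory::desc(wei_dims, memory::data_type::f32, memory::format_tag::oi);
    auto dst_md = memory::desc(dst_dims, memory::data_type::f32, memory::format_tag::nc);

    // oneDNN 3.0 style: the primitive descriptor is built directly from the engine.
    auto ip_pd = inner_product_forward::primitive_desc(
            eng, prop_kind::forward_inference, src_md, wei_md, dst_md);

    memory src_mem(src_md, eng), wei_mem(wei_md, eng), dst_mem(dst_md, eng);

    // Fill the inputs with ones so the execution runs on defined data.
    std::vector<float> src_data(2 * 16, 1.0f), wei_data(8 * 16, 1.0f);
    std::memcpy(src_mem.get_data_handle(), src_data.data(), src_data.size() * sizeof(float));
    std::memcpy(wei_mem.get_data_handle(), wei_data.data(), wei_data.size() * sizeof(float));

    // Execute once and wait; the benchmark times many such executions
    // and reports milliseconds (lower is better), as in the charts above.
    inner_product_forward(ip_pd).execute(strm, {
            {DNNL_ARG_SRC, src_mem},
            {DNNL_ARG_WEIGHTS, wei_mem},
            {DNNL_ARG_DST, dst_mem}});
    strm.wait();
    return 0;
}

Building this sketch typically means compiling against the oneDNN headers and linking the dnnl library; the convolution, deconvolution, RNN, and matrix-multiply harnesses above exercise the corresponding oneDNN primitives in the same way.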