NVIDIA RTX 40 Series Compute

Benchmarks by Michael Larabel for a future article.
Compare your own system(s) to this result file with the
Phoronix Test Suite by running the command:

    phoronix-test-suite benchmark 2401305-NE-NVIDIACOM43
HTML result view exported from: https://openbenchmarking.org/result/2401305-NE-NVIDIACOM43&grs
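The "&grs" suffix on the exported URL requests the geometric-ranking summary view, which ranks the GPUs by geometric mean across the normalized test results. As an illustration only (not part of this result file), here is a minimal Python sketch of such a ranking, using a hypothetical three-test subset of the scores in this file:

```python
from math import prod

# Hypothetical subset of "More Is Better" scores taken from the tables
# in this result file, used only to sketch the ranking calculation.
scores = {
    "RTX 4090": {"vray_cuda": 4333, "fp64_tflops": 1.389, "fluidx3d_fp32": 5726},
    "RTX 3090": {"vray_cuda": 2108, "fp64_tflops": 0.643, "fluidx3d_fp32": 5328},
    "RTX 2060 SUPER": {"vray_cuda": 511, "fp64_tflops": 0.255, "fluidx3d_fp32": 2437},
}

def geomean(values):
    # Geometric mean: nth root of the product of n values.
    return prod(values) ** (1.0 / len(values))

def rank(scores):
    # Normalize each test to the best result, then take the geometric
    # mean per GPU so no single test dominates the summary.
    tests = next(iter(scores.values())).keys()
    best = {t: max(s[t] for s in scores.values()) for t in tests}
    return sorted(
        ((gpu, geomean([s[t] / best[t] for t in tests])) for gpu, s in scores.items()),
        key=lambda kv: kv[1],
        reverse=True,
    )

for gpu, g in rank(scores):
    print(f"{gpu}: {g:.3f}")
```

A card that leads every test scores exactly 1.0; the others fall below it in proportion to their normalized deficits.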
System Configuration

Common to all runs:
  Processor: AMD Ryzen 9 7950X 16-Core @ 5.88GHz (16 Cores / 32 Threads)
  Motherboard: ASUS ROG STRIX X670E-E GAMING WIFI (1416 BIOS)
  Chipset: AMD Device 14d8
  Memory: 2 x 16 GB DRAM-6000MT/s G Skill F5-6000J3038F16G
  Disk: 2000GB Samsung SSD 980 PRO 2TB + 4001GB Western Digital WD_BLACK SN850X 4000GB (drive enumeration order varied between runs)
  Monitor: DELL U2723QE
  Network: Intel I225-V + Intel Wi-Fi 6 AX210/AX211/AX411
  OS: Ubuntu 23.10
  Kernel: 6.7.0-060700-generic (x86_64)
  Desktop: GNOME Shell 45.2
  Display Server: X Server 1.21.1.7
  OpenGL: 4.6.0
  Compiler: GCC 13.2.0 + LLVM 16.0.6
  File-System: ext4
  Screen Resolution: 3840x2160

Graphics card per run (onboard HD audio device in parentheses):
  RTX 2060 SUPER: NVIDIA GeForce RTX 2060 SUPER 8GB (NVIDIA TU106 HD Audio)
  RTX 2070: ASUS NVIDIA GeForce RTX 2070 8GB (NVIDIA TU106 HD Audio)
  RTX 2070 SUPER: NVIDIA GeForce RTX 2070 SUPER 8GB (NVIDIA TU104 HD Audio)
  RTX 2080: Zotac NVIDIA GeForce RTX 2080 8GB (NVIDIA TU104 HD Audio)
  RTX 2080 SUPER: NVIDIA GeForce RTX 2080 SUPER 8GB (NVIDIA TU104 HD Audio)
  TITAN RTX: NVIDIA TITAN RTX 24GB (NVIDIA TU102 HD Audio)
  RTX 3060: eVGA NVIDIA GeForce RTX 3060 12GB (NVIDIA GA106 HD Audio)
  RTX 3060 Ti: NVIDIA GeForce RTX 3060 Ti 8GB (NVIDIA GA104 HD Audio)
  RTX 3070: NVIDIA GeForce RTX 3070 8GB (NVIDIA GA104 HD Audio)
  RTX 3070 Ti: NVIDIA GeForce RTX 3070 Ti 8GB (NVIDIA GA104 HD Audio)
  RTX 3080: NVIDIA GeForce RTX 3080 10GB (NVIDIA GA102 HD Audio)
  RTX 3080 Ti: NVIDIA GeForce RTX 3080 Ti 12GB (NVIDIA GA102 HD Audio)
  RTX 3090: NVIDIA GeForce RTX 3090 24GB (NVIDIA GA102 HD Audio)
  RTX 4060: MSI NVIDIA GeForce RTX 4060 8GB (NVIDIA Device 22be)
  RTX 4070: NVIDIA GeForce RTX 4070 12GB (NVIDIA Device 22bc)
  RTX 4070 SUPER: NVIDIA GeForce RTX 4070 SUPER 12GB (NVIDIA Device 22bc)
  RTX 4070 Ti SUPER: ASUS NVIDIA GeForce RTX 4070 Ti SUPER 16GB (NVIDIA Device 22bb)
  RTX 4080: NVIDIA GeForce RTX 4080 16GB (NVIDIA Device 22bb)
  RTX 4080 SUPER: NVIDIA GeForce RTX 4080 SUPER 16GB (NVIDIA Device 22bb)
  RTX 4090: NVIDIA GeForce RTX 4090 24GB (NVIDIA AD102 HD Audio)

Display driver and OpenCL runtime:
  NVIDIA 550.40.07 with OpenCL 3.0 CUDA 12.4.74: RTX 4070 SUPER, RTX 4070 Ti SUPER, RTX 4080 SUPER
  NVIDIA 545.29.06 with OpenCL 3.0 CUDA 12.3.99: all other cards

Kernel Details: Transparent Huge Pages: madvise

Compiler Details: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-bootstrap --enable-cet --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,m2 --enable-libphobos-checking=release --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-link-serialization=2 --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-defaulted --enable-offload-targets=nvptx-none=/build/gcc-13-XYspKM/gcc-13-13.2.0/debian/tmp-nvptx/usr,amdgcn-amdhsa=/build/gcc-13-XYspKM/gcc-13-13.2.0/debian/tmp-gcn/usr --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-build-config=bootstrap-lto-lean --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v

Processor Details: Scaling Governor: amd-pstate-epp performance (EPP: performance); CPU Microcode: 0xa601203

Graphics / OpenCL Details (BAR1 / visible vRAM size, vBIOS version, GPU compute cores):
  RTX 2060 SUPER: 256 MiB, vBIOS 90.06.44.00.01, 2176 cores
  RTX 2070: 256 MiB, vBIOS 90.06.0b.40.83, 2304 cores
  RTX 2070 SUPER: 256 MiB, vBIOS 90.04.76.00.01, 2560 cores
  RTX 2080: 256 MiB, vBIOS 90.04.0d.00.1e, 2944 cores
  RTX 2080 SUPER: 256 MiB, vBIOS 90.04.79.00.01, 3072 cores
  TITAN RTX: 256 MiB, vBIOS 90.02.23.00.01, 4608 cores
  RTX 3060: 16384 MiB, vBIOS 94.06.14.40.46, 3584 cores
  RTX 3060 Ti: 8192 MiB, vBIOS 94.04.25.00.2c, 4864 cores
  RTX 3070: 8192 MiB, vBIOS 94.04.25.00.2b, 5888 cores
  RTX 3070 Ti: 8192 MiB, vBIOS 94.04.5b.00.02, 6144 cores
  RTX 3080: 16384 MiB, vBIOS 94.02.20.00.07, 8704 cores
  RTX 3080 Ti: 16384 MiB, vBIOS 94.02.71.00.01, 10240 cores
  RTX 3090: 32768 MiB, vBIOS 94.02.27.00.02, 10496 cores
  RTX 4060: 8192 MiB, vBIOS 95.07.31.00.e3, 3072 cores
  RTX 4070: 16384 MiB, vBIOS 95.04.49.00.03, 5888 cores
  RTX 4070 SUPER: 16384 MiB, vBIOS 95.04.69.00.01, 7168 cores
  RTX 4070 Ti SUPER: 16384 MiB, vBIOS 95.03.45.00.9c, 8448 cores
  RTX 4080: 16384 MiB, vBIOS 95.03.0e.00.04, 9728 cores
  RTX 4080 SUPER: 16384 MiB, vBIOS 95.03.44.00.01, 10240 cores
  RTX 4090: 32768 MiB, vBIOS 95.02.20.00.01, 16384 cores

Python Details: Python 3.11.6

Security Details:
  gather_data_sampling: Not affected
  itlb_multihit: Not affected
  l1tf: Not affected
  mds: Not affected
  meltdown: Not affected
  mmio_stale_data: Not affected
  retbleed: Not affected
  spec_rstack_overflow: Vulnerable: Safe RET, no microcode
  spec_store_bypass: Mitigation of SSB disabled via prctl
  spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization
  spectre_v2: Mitigation of Enhanced / Automatic IBRS; IBPB: conditional; STIBP: always-on; RSB filling; PBRSB-eIBRS: Not affected
  srbds: Not affected
  tsx_async_abort: Not affected
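The security details above mirror what the Linux kernel exports under sysfs. A quick way to reproduce that listing on any Linux system (a sketch, assuming a kernel that provides the vulnerabilities directory):

```shell
#!/bin/sh
# List the kernel's reported mitigation status for each known CPU
# vulnerability; these are the same strings summarized above.
for f in /sys/devices/system/cpu/vulnerabilities/*; do
    printf '%s: %s\n' "$(basename "$f")" "$(cat "$f")"
done
```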
Result Overview

Benchmarks run across all twenty GPUs: Chaos Group V-RAY 5.02, LuxCoreRender 2.6, ProjectPhysX OpenCL-Benchmark 1.2, GpuOwl 7.2.1, Blender 4.0, IndigoBench 4.4, FluidX3D 2.9, and PyTorch. Detailed per-test listings appear below; the four result sets without a detailed listing are reproduced here.

ProjectPhysX OpenCL-Benchmark 1.2 - Operation: INT64 Compute
TIOPs/s, More Is Better
RTX 2060 SUPER: 2.185 | RTX 2070: 2.126 | RTX 2070 SUPER: 3.067 | RTX 2080: 3.310 | RTX 2080 SUPER: 3.630 | TITAN RTX: 3.576 | RTX 3060: 1.925 | RTX 3060 Ti: 3.090 | RTX 3070: 3.270 | RTX 3070 Ti: 3.185 | RTX 3080: 3.266 | RTX 3080 Ti: 3.155 | RTX 3090: 3.170 | RTX 4060: 2.091 | RTX 4070: 3.428 | RTX 4070 SUPER: 3.989 | RTX 4070 Ti SUPER: 4.304 | RTX 4080: 4.521 | RTX 4080 SUPER: 4.246 | RTX 4090: 4.351

ProjectPhysX OpenCL-Benchmark 1.2 - Operation: FP32 Compute
TFLOPs/s, More Is Better
RTX 2060 SUPER: 8.272 | RTX 2070: 8.484 | RTX 2070 SUPER: 9.785 | RTX 2080: 10.898 | RTX 2080 SUPER: 11.891 | TITAN RTX: 17.177 | RTX 3060: 13.080 | RTX 3060 Ti: 18.693 | RTX 3070: 22.272 | RTX 3070 Ti: 22.553 | RTX 3080: 32.924 | RTX 3080 Ti: 37.298 | RTX 3090: 39.661 | RTX 4060: 16.502 | RTX 4070: 31.648 | RTX 4070 SUPER: 38.175 | RTX 4070 Ti SUPER: 45.031 | RTX 4080: 51.569 | RTX 4080 SUPER: 53.642 | RTX 4090: 85.843

PyTorch - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-50
More Is Better
RTX 2060 SUPER: 191.53 | RTX 2070: 177.73 | RTX 2070 SUPER: 200.51 | RTX 2080: 197.62 | RTX 2080 SUPER: 208.68 | TITAN RTX: 263.23 | RTX 3060: 204.41 | RTX 3060 Ti: 267.12 | RTX 3070: 287.04 | RTX 3070 Ti: 305.64 | RTX 3080: 388.94 | RTX 3080 Ti: 387.25 | RTX 3090: 389.05 | RTX 4060: 288.08 | RTX 4070: 369.73 | RTX 4070 SUPER: 386.65 | RTX 4070 Ti SUPER: 399.92 | RTX 4080: 387.42 | RTX 4080 SUPER: 399.58 | RTX 4090: 401.36

PyTorch - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-152
More Is Better
RTX 2060 SUPER: 80.27 | RTX 2070: 73.52 | RTX 2070 SUPER: 84.38 | RTX 2080: 82.86 | RTX 2080 SUPER: 87.60 | TITAN RTX: 106.96 | RTX 3060: 87.19 | RTX 3060 Ti: 117.40 | RTX 3070: 126.55 | RTX 3070 Ti: 130.83 | RTX 3080: 143.64 | RTX 3080 Ti: 145.36 | RTX 3090: 146.22 | RTX 4060: 117.11 | RTX 4070: 140.04 | RTX 4070 SUPER: 141.97 | RTX 4070 Ti SUPER: 146.29 | RTX 4080: 147.44 | RTX 4080 SUPER: 147.59 | RTX 4090: 147.31
Chaos Group V-RAY 5.02 - Mode: NVIDIA CUDA GPU
vpaths, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 511 | RTX 2070: 450 | RTX 2070 SUPER: 794 | RTX 2080: 782 | RTX 2080 SUPER: 804 | TITAN RTX: 970 | RTX 3060: 811 | RTX 3060 Ti: 1250 | RTX 3070: 1416 | RTX 3070 Ti: 1487 | RTX 3080: 1777 | RTX 3080 Ti: 2045 | RTX 3090: 2108 | RTX 4060: 1239 | RTX 4070: 1720 | RTX 4070 SUPER: 2250 | RTX 4070 Ti SUPER: 2697 | RTX 4080: 3070 | RTX 4080 SUPER: 3073 | RTX 4090: 4333
Chaos Group V-RAY 5.02 - Mode: NVIDIA RTX GPU
vrays, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 839 | RTX 2070: 736 | RTX 2070 SUPER: 1068 | RTX 2080: 998 | RTX 2080 SUPER: 1013 | TITAN RTX: 1452 | RTX 3060: 1147 | RTX 3060 Ti: 1685 | RTX 3070: 1874 | RTX 3070 Ti: 2031 | RTX 3080: 2449 | RTX 3080 Ti: 2918 | RTX 3090: 2995 | RTX 4060: 1860 | RTX 4070: 2505 | RTX 4070 SUPER: 3091 | RTX 4070 Ti SUPER: 3681 | RTX 4080: 4182 | RTX 4080 SUPER: 4113 | RTX 4090: 5645
LuxCoreRender 2.6 - Scene: DLSC - Acceleration: GPU
M samples/sec, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 3.27 | RTX 2070: 2.93 | RTX 2070 SUPER: 4.11 | RTX 2080: 4.22 | RTX 2080 SUPER: 4.44 | TITAN RTX: 5.91 | RTX 3060: 4.25 | RTX 3060 Ti: 5.95 | RTX 3070: 6.93 | RTX 3070 Ti: 7.18 | RTX 3080: 9.84 | RTX 3080 Ti: 11.31 | RTX 3090: 11.87 | RTX 4060: 5.37 | RTX 4070: 9.25 | RTX 4070 SUPER: 11.59 | RTX 4070 Ti SUPER: 13.76 | RTX 4080: 15.29 | RTX 4080 SUPER: 15.82 | RTX 4090: 22.27
LuxCoreRender 2.6 - Scene: Danish Mood - Acceleration: GPU
M samples/sec, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 2.73 | RTX 2070: 2.49 | RTX 2070 SUPER: 3.55 | RTX 2080: 3.65 | RTX 2080 SUPER: 3.81 | TITAN RTX: 5.10 | RTX 3060: 3.34 | RTX 3060 Ti: 4.78 | RTX 3070: 5.57 | RTX 3070 Ti: 5.78 | RTX 3080: 7.89 | RTX 3080 Ti: 9.13 | RTX 3090: 9.69 | RTX 4060: 4.56 | RTX 4070: 7.54 | RTX 4070 SUPER: 9.65 | RTX 4070 Ti SUPER: 11.24 | RTX 4080: 12.81 | RTX 4080 SUPER: 13.24 | RTX 4090: 18.52
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: FP64 Compute
TFLOPs/s, More Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 0.255 | RTX 2070: 0.256 | RTX 2070 SUPER: 0.301 | RTX 2080: 0.335 | RTX 2080 SUPER: 0.365 | TITAN RTX: 0.530 | RTX 3060: 0.201 | RTX 3060 Ti: 0.299 | RTX 3070: 0.356 | RTX 3070 Ti: 0.361 | RTX 3080: 0.530 | RTX 3080 Ti: 0.606 | RTX 3090: 0.643 | RTX 4060: 0.264 | RTX 4070: 0.508 | RTX 4070 SUPER: 0.614 | RTX 4070 Ti SUPER: 0.725 | RTX 4080: 0.830 | RTX 4080 SUPER: 0.864 | RTX 4090: 1.389
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
GpuOwl 7.2.1 - Exponent: 332220523
Iterations / Second, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 56.17 | RTX 2070: 51.78 | RTX 2070 SUPER: 66.04 | RTX 2080: 73.30 | RTX 2080 SUPER: 80.43 | TITAN RTX: 115.73 | RTX 3060: 44.28 | RTX 3060 Ti: 65.93 | RTX 3070: 78.55 | RTX 3070 Ti: 79.25 | RTX 3080: 116.07 | RTX 3080 Ti: 131.09 | RTX 3090: 139.80 | RTX 4060: 58.24 | RTX 4070: 103.41 | RTX 4070 SUPER: 133.96 | RTX 4070 Ti SUPER: 159.72 | RTX 4080: 182.69 | RTX 4080 SUPER: 188.70 | RTX 4090: 302.79
GpuOwl 7.2.1 - Exponent: 77936867
Iterations / Second, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 263.99 | RTX 2070: 244.74 | RTX 2070 SUPER: 310.72 | RTX 2080: 344.12 | RTX 2080 SUPER: 377.03 | TITAN RTX: 531.73 | RTX 3060: 209.13 | RTX 3060 Ti: 313.15 | RTX 3070: 369.14 | RTX 3070 Ti: 372.02 | RTX 3080: 533.43 | RTX 3080 Ti: 615.01 | RTX 3090: 656.60 | RTX 4060: 275.74 | RTX 4070: 486.70 | RTX 4070 SUPER: 631.18 | RTX 4070 Ti SUPER: 744.60 | RTX 4080: 861.82 | RTX 4080 SUPER: 889.94 | RTX 4090: 1426.53
GpuOwl 7.2.1 - Exponent: 57885161
Iterations / Second, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 355.87 | RTX 2070: 330.62 | RTX 2070 SUPER: 419.64 | RTX 2080: 463.18 | RTX 2080 SUPER: 508.82 | TITAN RTX: 711.07 | RTX 3060: 284.50 | RTX 3060 Ti: 421.23 | RTX 3070: 497.02 | RTX 3070 Ti: 503.95 | RTX 3080: 725.69 | RTX 3080 Ti: 817.44 | RTX 3090: 882.36 | RTX 4060: 374.53 | RTX 4070: 656.75 | RTX 4070 SUPER: 850.83 | RTX 4070 Ti SUPER: 1003.01 | RTX 4080: 1138.95 | RTX 4080 SUPER: 1187.65 | RTX 4090: 1909.61
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: INT32 Compute
TIOPs/s, More Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 8.214 | RTX 2070: 8.377 | RTX 2070 SUPER: 9.707 | RTX 2080: 10.807 | RTX 2080 SUPER: 11.831 | TITAN RTX: 17.081 | RTX 3060: 6.794 | RTX 3060 Ti: 9.658 | RTX 3070: 11.452 | RTX 3070 Ti: 11.594 | RTX 3080: 16.956 | RTX 3080 Ti: 19.329 | RTX 3090: 20.417 | RTX 4060: 8.491 | RTX 4070: 16.312 | RTX 4070 SUPER: 19.672 | RTX 4070 Ti SUPER: 23.201 | RTX 4080: 26.623 | RTX 4080 SUPER: 27.654 | RTX 4090: 44.403
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: INT8 Compute
TIOPs/s, More Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 5.457 | RTX 2070: 5.657 | RTX 2070 SUPER: 6.498 | RTX 2080: 7.098 | RTX 2080 SUPER: 7.897 | TITAN RTX: 10.943 | RTX 3060: 5.117 | RTX 3060 Ti: 7.187 | RTX 3070: 8.454 | RTX 3070 Ti: 8.730 | RTX 3080: 12.162 | RTX 3080 Ti: 13.584 | RTX 3090: 14.333 | RTX 4060: 6.250 | RTX 4070: 12.354 | RTX 4070 SUPER: 14.363 | RTX 4070 Ti SUPER: 17.227 | RTX 4080: 20.150 | RTX 4080 SUPER: 20.916 | RTX 4090: 33.325
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: INT16 Compute
TIOPs/s, More Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 7.049 | RTX 2070: 7.260 | RTX 2070 SUPER: 8.407 | RTX 2080: 9.264 | RTX 2080 SUPER: 10.403 | TITAN RTX: 13.899 | RTX 3060: 5.917 | RTX 3060 Ti: 8.401 | RTX 3070: 9.987 | RTX 3070 Ti: 10.115 | RTX 3080: 14.526 | RTX 3080 Ti: 16.368 | RTX 3090: 17.260 | RTX 4060: 7.391 | RTX 4070: 14.166 | RTX 4070 SUPER: 16.870 | RTX 4070 Ti SUPER: 20.201 | RTX 4080: 23.081 | RTX 4080 SUPER: 24.024 | RTX 4090: 38.439
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
LuxCoreRender 2.6 - Scene: LuxCore Benchmark - Acceleration: GPU
M samples/sec, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 3.59 | RTX 2070: 3.24 | RTX 2070 SUPER: 4.21 | RTX 2080: 4.33 | RTX 2080 SUPER: 4.49 | TITAN RTX: 6.37 | RTX 3060: 4.10 | RTX 3060 Ti: 5.70 | RTX 3070: 6.66 | RTX 3070 Ti: 6.96 | RTX 3080: 9.63 | RTX 3080 Ti: 11.14 | RTX 3090: 11.60 | RTX 4060: 5.22 | RTX 4070: 8.91 | RTX 4070 SUPER: 11.34 | RTX 4070 Ti SUPER: 13.08 | RTX 4080: 14.52 | RTX 4080 SUPER: 14.84 | RTX 4090: 19.25
Blender 4.0 - Blend File: Fishy Cat - Compute: NVIDIA OptiX
Seconds, Fewer Is Better (mean of 3-15 runs per GPU)
RTX 2060 SUPER: 27.68 | RTX 2070: 29.78 | RTX 2070 SUPER: 24.81 | RTX 2080: 23.29 | RTX 2080 SUPER: 21.82 | TITAN RTX: 16.25 | RTX 3060: 28.37 | RTX 3060 Ti: 20.18 | RTX 3070: 17.63 | RTX 3070 Ti: 16.70 | RTX 3080: 12.49 | RTX 3080 Ti: 11.16 | RTX 3090: 10.67 | RTX 4060: 19.03 | RTX 4070: 12.21 | RTX 4070 SUPER: 9.92 | RTX 4070 Ti SUPER: 8.73 | RTX 4080: 7.76 | RTX 4080 SUPER: 7.57 | RTX 4090: 5.59
Blender 4.0 - Blend File: Pabellon Barcelona - Compute: NVIDIA OptiX
Seconds, Fewer Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 43.41 | RTX 2070: 47.35 | RTX 2070 SUPER: 41.82 | RTX 2080: 41.65 | RTX 2080 SUPER: 40.56 | TITAN RTX: 28.29 | RTX 3060: 42.74 | RTX 3060 Ti: 31.35 | RTX 3070: 28.19 | RTX 3070 Ti: 26.23 | RTX 3080: 21.01 | RTX 3080 Ti: 17.85 | RTX 3090: 17.54 | RTX 4060: 26.22 | RTX 4070: 17.98 | RTX 4070 SUPER: 14.68 | RTX 4070 Ti SUPER: 12.88 | RTX 4080: 11.27 | RTX 4080 SUPER: 11.09 | RTX 4090: 8.90
LuxCoreRender 2.6 - Scene: Orange Juice - Acceleration: GPU
M samples/sec, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 3.66 | RTX 2070: 3.36 | RTX 2070 SUPER: 4.47 | RTX 2080: 4.58 | RTX 2080 SUPER: 4.73 | TITAN RTX: 6.05 | RTX 3060: 4.82 | RTX 3060 Ti: 6.18 | RTX 3070: 7.04 | RTX 3070 Ti: 7.18 | RTX 3080: 9.08 | RTX 3080 Ti: 10.08 | RTX 3090: 10.49 | RTX 4060: 5.48 | RTX 4070: 8.36 | RTX 4070 SUPER: 9.78 | RTX 4070 Ti SUPER: 11.54 | RTX 4080: 12.48 | RTX 4080 SUPER: 12.81 | RTX 4090: 17.50
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Bedroom
M samples/s, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 7.569 | RTX 2070: 6.946 | RTX 2070 SUPER: 7.985 | RTX 2080: 8.062 | RTX 2080 SUPER: 8.313 | TITAN RTX: 12.061 | RTX 3060: 8.285 | RTX 3060 Ti: 11.681 | RTX 3070: 13.174 | RTX 3070 Ti: 13.867 | RTX 3080: 17.765 | RTX 3080 Ti: 20.385 | RTX 3090: 21.048 | RTX 4060: 9.686 | RTX 4070: 16.773 | RTX 4070 SUPER: 19.462 | RTX 4070 Ti SUPER: 24.194 | RTX 4080: 25.679 | RTX 4080 SUPER: 26.194 | RTX 4090: 35.622
Blender 4.0 - Blend File: Classroom - Compute: NVIDIA OptiX
Seconds, Fewer Is Better (mean of 3-6 runs per GPU)
RTX 2060 SUPER: 35.98 | RTX 2070: 39.50 | RTX 2070 SUPER: 34.63 | RTX 2080: 34.88 | RTX 2080 SUPER: 33.57 | TITAN RTX: 24.45 | RTX 3060: 38.45 | RTX 3060 Ti: 28.14 | RTX 3070: 25.36 | RTX 3070 Ti: 23.48 | RTX 3080: 18.38 | RTX 3080 Ti: 15.97 | RTX 3090: 15.56 | RTX 4060: 23.95 | RTX 4070: 16.45 | RTX 4070 SUPER: 13.19 | RTX 4070 Ti SUPER: 11.70 | RTX 4080: 10.10 | RTX 4080 SUPER: 10.12 | RTX 4090: 7.96
Blender 4.0 - Blend File: Barbershop - Compute: NVIDIA OptiX
Seconds, Fewer Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 141.74 | RTX 2070: 155.70 | RTX 2070 SUPER: 131.40 | RTX 2080: 130.98 | RTX 2080 SUPER: 126.60 | TITAN RTX: 92.68 | RTX 3060: 135.48 | RTX 3060 Ti: 98.84 | RTX 3070: 88.75 | RTX 3070 Ti: 83.71 | RTX 3080: 65.48 | RTX 3080 Ti: 57.27 | RTX 3090: 55.51 | RTX 4060: 98.26 | RTX 4070: 65.09 | RTX 4070 SUPER: 53.45 | RTX 4070 Ti SUPER: 46.44 | RTX 4080: 41.12 | RTX 4080 SUPER: 40.64 | RTX 4090: 32.54
Blender 4.0 - Blend File: BMW27 - Compute: NVIDIA OptiX
Seconds, Fewer Is Better (mean of 4-15 runs per GPU)
RTX 2060 SUPER: 13.28 | RTX 2070: 14.40 | RTX 2070 SUPER: 12.85 | RTX 2080: 12.74 | RTX 2080 SUPER: 12.35 | TITAN RTX: 9.29 | RTX 3060: 14.48 | RTX 3060 Ti: 11.00 | RTX 3070: 9.98 | RTX 3070 Ti: 9.22 | RTX 3080: 7.52 | RTX 3080 Ti: 6.64 | RTX 3090: 6.45 | RTX 4060: 9.22 | RTX 4070: 6.85 | RTX 4070 SUPER: 5.80 | RTX 4070 Ti SUPER: 5.25 | RTX 4080: 4.68 | RTX 4080 SUPER: 4.63 | RTX 4090: 3.80
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Supercar
M samples/s, More Is Better (mean of 3 runs per GPU)
RTX 2060 SUPER: 23.19 | RTX 2070: 21.20 | RTX 2070 SUPER: 24.48 | RTX 2080: 25.02 | RTX 2080 SUPER: 25.89 | TITAN RTX: 34.92 | RTX 3060: 24.38 | RTX 3060 Ti: 33.39 | RTX 3070: 36.89 | RTX 3070 Ti: 38.51 | RTX 3080: 45.61 | RTX 3080 Ti: 51.02 | RTX 3090: 52.80 | RTX 4060: 28.69 | RTX 4070: 44.70 | RTX 4070 SUPER: 52.17 | RTX 4070 Ti SUPER: 60.45 | RTX 4080: 64.82 | RTX 4080 SUPER: 64.93 | RTX 4090: 79.20
FluidX3D 2.9 - Test: FP32-FP16C
MLUPs/s, More Is Better (mean of 3-4 runs per GPU)
RTX 2060 SUPER: 4442 | RTX 2070: 4223 | RTX 2070 SUPER: 4955 | RTX 2080: 4966 | RTX 2080 SUPER: 5351 | TITAN RTX: 7157 | RTX 3060: 3386 | RTX 3060 Ti: 4698 | RTX 3070: 4884 | RTX 3070 Ti: 5913 | RTX 3080: 7999 | RTX 3080 Ti: 9144 | RTX 3090: 9610 | RTX 4060: 3112 | RTX 4070: 5252 | RTX 4070 SUPER: 5670 | RTX 4070 Ti SUPER: 7285 | RTX 4080: 7807 | RTX 4080 SUPER: 8106 | RTX 4090: 11542
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: Memory Bandwidth Coalesced Read
GB/s, More Is Better (mean of 3-5 runs per GPU)
RTX 2060 SUPER: 388.00 | RTX 2070: 388.49 | RTX 2070 SUPER: 388.30 | RTX 2080: 387.25 | RTX 2080 SUPER: 426.01 | TITAN RTX: 556.60 | RTX 3060: 335.75 | RTX 3060 Ti: 414.45 | RTX 3070: 414.43 | RTX 3070 Ti: 562.78 | RTX 3080: 702.77 | RTX 3080 Ti: 842.88 | RTX 3090: 864.49 | RTX 4060: 253.03 | RTX 4070: 465.36 | RTX 4070 SUPER: 464.76 | RTX 4070 Ti SUPER: 619.58 | RTX 4080: 653.28 | RTX 4080 SUPER: 681.11 | RTX 4090: 927.88
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
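Measured coalesced-read bandwidth can be put in context against each board's theoretical peak. The peak figures in this sketch are assumed approximate board specifications (384-bit GDDR6X at rated speeds), not part of this result file:

```python
# Approximate theoretical peak memory bandwidth in GB/s. These are
# assumed board specs for illustration, not data from the result file.
theoretical = {"RTX 4090": 1008.0, "RTX 3090": 936.2}

# Measured coalesced-read bandwidth from the listing above.
measured = {"RTX 4090": 927.88, "RTX 3090": 864.49}

for gpu in theoretical:
    # Fraction of the assumed theoretical peak actually achieved.
    efficiency = measured[gpu] / theoretical[gpu]
    print(f"{gpu}: {efficiency:.1%} of assumed theoretical peak")
```

Both flagship cards land above 90% of their assumed peaks, which is typical for a well-coalesced streaming access pattern.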
LuxCoreRender 2.6 - Scene: Rainbow Colors and Prism - Acceleration: GPU
M samples/sec, More Is Better (mean of 4-9 runs per GPU)
RTX 2060 SUPER: 12.35 | RTX 2070: 11.37 | RTX 2070 SUPER: 12.14 | RTX 2080: 11.80 | RTX 2080 SUPER: 11.73 | TITAN RTX: 16.37 | RTX 3060: 13.57 | RTX 3060 Ti: 19.08 | RTX 3070: 21.52 | RTX 3070 Ti: 22.20 | RTX 3080: 27.22 | RTX 3080 Ti: 32.00 | RTX 3090: 32.52 | RTX 4060: 13.15 | RTX 4070: 20.21 | RTX 4070 SUPER: 24.78 | RTX 4070 Ti SUPER: 28.28 | RTX 4080: 32.42 | RTX 4080 SUPER: 32.53 | RTX 4090: 41.03
FluidX3D 2.9 - Test: FP32-FP32 (MLUPs/s, more is better)
RTX 2060 SUPER: 2437 (SE +/- 0.33, N = 3)
RTX 2070: 2270 (SE +/- 2.08, N = 3)
RTX 2070 SUPER: 2321 (SE +/- 0.33, N = 3)
RTX 2080: 2322 (SE +/- 0.58, N = 3)
RTX 2080 SUPER: 2504 (SE +/- 0.33, N = 3)
TITAN RTX: 3343 (SE +/- 3.18, N = 3)
RTX 3060: 2039 (SE +/- 1.00, N = 3)
RTX 3060 Ti: 2637 (SE +/- 0.00, N = 3)
RTX 3070: 2597 (SE +/- 0.00, N = 3)
RTX 3070 Ti: 3507 (SE +/- 0.00, N = 3)
RTX 3080: 4338 (SE +/- 0.33, N = 3)
RTX 3080 Ti: 5199 (SE +/- 0.00, N = 3)
RTX 3090: 5328 (SE +/- 0.33, N = 3)
RTX 4060: 1608 (SE +/- 0.00, N = 3)
RTX 4070: 2689 (SE +/- 3.51, N = 3)
RTX 4070 SUPER: 2778 (SE +/- 0.33, N = 3)
RTX 4070 Ti SUPER: 3830 (SE +/- 0.33, N = 3)
RTX 4080: 3820 (SE +/- 0.88, N = 3)
RTX 4080 SUPER: 3967 (SE +/- 4.04, N = 3)
RTX 4090: 5726 (SE +/- 0.33, N = 3)
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: Memory Bandwidth Coalesced Write (GB/s, more is better)
RTX 2060 SUPER: 418.57 (SE +/- 0.15, N = 3)
RTX 2070: 421.16 (SE +/- 0.04, N = 3)
RTX 2070 SUPER: 409.50 (SE +/- 0.64, N = 3)
RTX 2080: 417.55 (SE +/- 0.64, N = 3)
RTX 2080 SUPER: 452.00 (SE +/- 0.51, N = 3)
TITAN RTX: 603.99 (SE +/- 0.30, N = 3)
RTX 3060: 339.67 (SE +/- 0.02, N = 3)
RTX 3060 Ti: 422.00 (SE +/- 0.00, N = 3)
RTX 3070: 422.01 (SE +/- 0.01, N = 3)
RTX 3070 Ti: 578.09 (SE +/- 0.01, N = 3)
RTX 3080: 721.77 (SE +/- 0.02, N = 4)
RTX 3080 Ti: 866.26 (SE +/- 0.01, N = 4)
RTX 3090: 885.87 (SE +/- 0.04, N = 4)
RTX 4060: 258.23 (SE +/- 0.01, N = 3)
RTX 4070: 459.27 (SE +/- 0.18, N = 4)
RTX 4070 SUPER: 455.27 (SE +/- 0.23, N = 4)
RTX 4070 Ti SUPER: 612.02 (SE +/- 0.23, N = 4)
RTX 4080: 611.31 (SE +/- 0.28, N = 4)
RTX 4080 SUPER: 627.33 (SE +/- 0.43, N = 5)
RTX 4090: 905.50 (SE +/- 0.29, N = 5)
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
FluidX3D 2.9 - Test: FP32-FP16S (MLUPs/s, more is better)
RTX 2060 SUPER: 4982 (SE +/- 0.00, N = 3)
RTX 2070: 4589 (SE +/- 0.88, N = 3)
RTX 2070 SUPER: 4948 (SE +/- 0.33, N = 3)
RTX 2080: 4957 (SE +/- 0.00, N = 3)
RTX 2080 SUPER: 5423 (SE +/- 0.33, N = 3)
TITAN RTX: 7108 (SE +/- 0.33, N = 3)
RTX 3060: 3932 (SE +/- 0.00, N = 3)
RTX 3060 Ti: 5136 (SE +/- 0.33, N = 3)
RTX 3070: 5079 (SE +/- 0.00, N = 3)
RTX 3070 Ti: 6915 (SE +/- 0.00, N = 3)
RTX 3080: 8460 (SE +/- 0.58, N = 3)
RTX 3080 Ti: 10122 (SE +/- 0.58, N = 3)
RTX 3090: 10491 (SE +/- 0.33, N = 3)
RTX 4060: 3046 (SE +/- 0.00, N = 3)
RTX 4070: 4679 (SE +/- 2.73, N = 3)
RTX 4070 SUPER: 5547 (SE +/- 0.58, N = 3)
RTX 4070 Ti SUPER: 6447 (SE +/- 0.88, N = 3)
RTX 4080: 7683 (SE +/- 0.88, N = 3)
RTX 4080 SUPER: 7778 (SE +/- 0.88, N = 3)
RTX 4090: 10237 (SE +/- 4.36, N = 3)
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: INT64 Compute (TIOPs/s, more is better)
RTX 2060 SUPER: 2.185 (SE +/- 0.000, N = 3)
RTX 2070: 2.126 (SE +/- 0.003, N = 3)
RTX 2070 SUPER: 3.067 (SE +/- 0.025, N = 3)
RTX 2080: 3.310 (SE +/- 0.010, N = 3)
RTX 2080 SUPER: 3.630 (SE +/- 0.005, N = 3)
TITAN RTX: 3.576 (SE +/- 0.010, N = 3)
RTX 3060: 1.925 (SE +/- 0.008, N = 3)
RTX 3060 Ti: 3.090 (SE +/- 0.009, N = 3)
RTX 3070: 3.270 (SE +/- 0.000, N = 3)
RTX 3070 Ti: 3.185 (SE +/- 0.002, N = 3)
RTX 3080: 3.266 (SE +/- 0.020, N = 4)
RTX 3080 Ti: 3.155 (SE +/- 0.014, N = 4)
RTX 3090: 3.170 (SE +/- 0.054, N = 4)
RTX 4060: 2.091 (SE +/- 0.003, N = 3)
RTX 4070: 3.428 (SE +/- 0.001, N = 4)
RTX 4070 SUPER: 3.989 (SE +/- 0.006, N = 4)
RTX 4070 Ti SUPER: 4.304 (SE +/- 0.002, N = 4)
RTX 4080: 4.521 (SE +/- 0.007, N = 4)
RTX 4080 SUPER: 4.246 (SE +/- 0.003, N = 5)
RTX 4090: 4.351 (SE +/- 0.007, N = 5)
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
PyTorch 2.1 - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-50 (batches/sec, more is better)
RTX 2060 SUPER: 191.53 (SE +/- 0.16, N = 4)
RTX 2070: 177.73 (SE +/- 0.43, N = 4)
RTX 2070 SUPER: 200.51 (SE +/- 0.10, N = 4)
RTX 2080: 197.62 (SE +/- 0.22, N = 4)
RTX 2080 SUPER: 208.68 (SE +/- 0.39, N = 4)
TITAN RTX: 263.23 (SE +/- 0.38, N = 5)
RTX 3060: 204.41 (SE +/- 0.37, N = 4)
RTX 3060 Ti: 267.12 (SE +/- 0.60, N = 5)
RTX 3070: 287.04 (SE +/- 0.67, N = 5)
RTX 3070 Ti: 305.64 (SE +/- 0.48, N = 5)
RTX 3080: 388.94 (SE +/- 1.06, N = 6)
RTX 3080 Ti: 387.25 (SE +/- 1.61, N = 6)
RTX 3090: 389.05 (SE +/- 1.32, N = 6)
RTX 4060: 288.08 (SE +/- 0.56, N = 5)
RTX 4070: 369.73 (SE +/- 1.00, N = 5)
RTX 4070 SUPER: 386.65 (SE +/- 0.86, N = 6)
RTX 4070 Ti SUPER: 399.92 (SE +/- 1.81, N = 6)
RTX 4080: 387.42 (SE +/- 1.24, N = 6)
RTX 4080 SUPER: 399.58 (SE +/- 1.09, N = 6)
RTX 4090: 401.36 (SE +/- 2.01, N = 6)
PyTorch 2.1 - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-152 (batches/sec, more is better)
RTX 2060 SUPER: 80.27 (SE +/- 0.15, N = 3)
RTX 2070: 73.52 (SE +/- 0.16, N = 3)
RTX 2070 SUPER: 84.38 (SE +/- 0.10, N = 3)
RTX 2080: 82.86 (SE +/- 0.11, N = 3)
RTX 2080 SUPER: 87.60 (SE +/- 0.08, N = 3)
TITAN RTX: 106.96 (SE +/- 0.17, N = 3)
RTX 3060: 87.19 (SE +/- 0.18, N = 3)
RTX 3060 Ti: 117.40 (SE +/- 0.19, N = 3)
RTX 3070: 126.55 (SE +/- 0.09, N = 3)
RTX 3070 Ti: 130.83 (SE +/- 0.24, N = 3)
RTX 3080: 143.64 (SE +/- 1.03, N = 3)
RTX 3080 Ti: 145.36 (SE +/- 0.46, N = 3)
RTX 3090: 146.22 (SE +/- 0.33, N = 3)
RTX 4060: 117.11 (SE +/- 0.04, N = 3)
RTX 4070: 140.04 (SE +/- 0.31, N = 3)
RTX 4070 SUPER: 141.97 (SE +/- 0.98, N = 3)
RTX 4070 Ti SUPER: 146.29 (SE +/- 0.72, N = 3)
RTX 4080: 147.44 (SE +/- 1.10, N = 3)
RTX 4080 SUPER: 147.59 (SE +/- 0.91, N = 3)
RTX 4090: 147.31 (SE +/- 0.83, N = 3)
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: FP32 Compute (TFLOPs/s, more is better)
RTX 2060 SUPER: 8.272 (SE +/- 0.018, N = 3)
RTX 2070: 8.484 (SE +/- 0.023, N = 3)
RTX 2070 SUPER: 9.785 (SE +/- 0.000, N = 3)
RTX 2080: 10.898 (SE +/- 0.000, N = 3)
RTX 2080 SUPER: 11.891 (SE +/- 0.030, N = 3)
TITAN RTX: 17.177 (SE +/- 0.000, N = 3)
RTX 3060: 13.080 (SE +/- 0.049, N = 3)
RTX 3060 Ti: 18.693 (SE +/- 0.001, N = 3)
RTX 3070: 22.272 (SE +/- 0.000, N = 3)
RTX 3070 Ti: 22.553 (SE +/- 0.002, N = 3)
RTX 3080: 32.924 (SE +/- 0.026, N = 4)
RTX 3080 Ti: 37.298 (SE +/- 0.019, N = 4)
RTX 3090: 39.661 (SE +/- 0.001, N = 4)
RTX 4060: 16.502 (SE +/- 0.000, N = 3)
RTX 4070: 31.648 (SE +/- 0.025, N = 4)
RTX 4070 SUPER: 38.175 (SE +/- 0.001, N = 4)
RTX 4070 Ti SUPER: 45.031 (SE +/- 0.012, N = 4)
RTX 4080: 51.569 (SE +/- 0.032, N = 4)
RTX 4080 SUPER: 53.642 (SE +/- 0.007, N = 5)
RTX 4090: 85.843 (SE +/- 0.014, N = 5)
1. (CXX) g++ options: -std=c++17 -pthread -lOpenCL
GPU Temperature Monitor - Phoronix Test Suite System Monitoring (Celsius)
RTX 2060 SUPER: Min 32 / Avg 63.7 / Max 74
RTX 2070: Min 33 / Avg 63.8 / Max 76
RTX 2070 SUPER: Min 28 / Avg 59.08 / Max 71
RTX 2080: Min 51 / Avg 73.87 / Max 83
RTX 2080 SUPER: Min 29 / Avg 60.19 / Max 73
TITAN RTX: Min 40 / Avg 71.04 / Max 78
RTX 3060: Min 43 / Avg 54.51 / Max 64
RTX 3060 Ti: Min 40 / Avg 61.62 / Max 72
RTX 3070: Min 35 / Avg 58.6 / Max 73
RTX 3070 Ti: Min 48 / Avg 68.94 / Max 80
RTX 3080: Min 49 / Avg 70.32 / Max 78
RTX 3080 Ti: Min 36 / Avg 62.58 / Max 78
RTX 3090: Min 45 / Avg 64.33 / Max 71
RTX 4060: Min 34 / Avg 52.86 / Max 65
RTX 4070: Min 33 / Avg 48.29 / Max 62
RTX 4070 SUPER: Min 34 / Avg 53.31 / Max 68
RTX 4070 Ti SUPER: Min 36 / Avg 49.56 / Max 60
RTX 4080: Min 32 / Avg 45.66 / Max 60
RTX 4080 SUPER: Min 33 / Avg 45.64 / Max 57
RTX 4090: Min 33 / Avg 42.46 / Max 59
GPU Power Consumption Monitor - Phoronix Test Suite System Monitoring (Watts)
RTX 2060 SUPER: Min 17.39 / Avg 114.08 / Max 198.52
RTX 2070: Min 6.75 / Avg 109.77 / Max 188.99
RTX 2070 SUPER: Min 14.9 / Avg 124.93 / Max 220.48
RTX 2080: Min 14.33 / Avg 141.11 / Max 223.41
RTX 2080 SUPER: Min 10.27 / Avg 129.96 / Max 255.32
TITAN RTX: Min 9.65 / Avg 194.86 / Max 286.86
RTX 3060: Min 19.36 / Avg 87.09 / Max 151.4
RTX 3060 Ti: Min 19.64 / Avg 119.17 / Max 199.96
RTX 3070: Min 14.88 / Avg 107.51 / Max 219.28
RTX 3070 Ti: Min 23.19 / Avg 167.28 / Max 289.92
RTX 3080: Min 16.53 / Avg 220.58 / Max 319.93
RTX 3080 Ti: Min 20.08 / Avg 179.98 / Max 349.71
RTX 3090: Min 16.24 / Avg 253.87 / Max 348.11
RTX 4070: Min 7.95 / Avg 93.57 / Max 184.25
RTX 4070 SUPER: Min 3.4 / Avg 106.14 / Max 213.55
RTX 4070 Ti SUPER: Min 8.14 / Avg 141.6 / Max 272.56
RTX 4080: Min 9.6 / Avg 131.42 / Max 259.62
RTX 4080 SUPER: Min 9.23 / Avg 123.83 / Max 251.41
RTX 4090: Min 13.54 / Avg 34.16 / Max 322.66
PyTorch 2.1 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 52 / Avg 64.28 / Max 71
RTX 2070: Min 52 / Avg 63.66 / Max 70
RTX 2070 SUPER: Min 48 / Avg 58.07 / Max 64
RTX 2080: Min 59 / Avg 71.59 / Max 79
RTX 2080 SUPER: Min 49 / Avg 57.92 / Max 63
TITAN RTX: Min 56 / Avg 65.97 / Max 72
RTX 3060: Min 43 / Avg 54.57 / Max 59
RTX 3060 Ti: Min 48 / Avg 57.74 / Max 63
RTX 3070: Min 47 / Avg 56.77 / Max 63
RTX 3070 Ti: Min 54 / Avg 64.23 / Max 70
RTX 3080: Min 56 / Avg 63.45 / Max 69
RTX 3080 Ti: Min 57 / Avg 63.25 / Max 68
RTX 3090: Min 52 / Avg 58.4 / Max 63
RTX 4060: Min 48 / Avg 54.71 / Max 59
RTX 4070: Min 36 / Avg 42.19 / Max 47
RTX 4070 SUPER: Min 37 / Avg 44.61 / Max 50
RTX 4070 Ti SUPER: Min 44 / Avg 50.82 / Max 57
RTX 4080: Min 33 / Avg 38.05 / Max 42
RTX 4080 SUPER: Min 33 / Avg 37.77 / Max 41
RTX 4090: Min 33 / Avg 37.11 / Max 40
PyTorch 2.1 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.19 / Avg 140.07 / Max 176.71
RTX 2070: Min 7.32 / Avg 135.23 / Max 175.56
RTX 2070 SUPER: Min 15.73 / Avg 144.41 / Max 191.95
RTX 2080: Min 16.4 / Avg 153.3 / Max 207.77
RTX 2080 SUPER: Min 11.05 / Avg 146.2 / Max 198
TITAN RTX: Min 13.08 / Avg 192 / Max 266.53
RTX 3060: Min 19.78 / Avg 99.34 / Max 126.81
RTX 3060 Ti: Min 21.3 / Avg 117.24 / Max 165.3
RTX 3070: Min 15.39 / Avg 119.06 / Max 174.49
RTX 3070 Ti: Min 24.95 / Avg 152.75 / Max 214.91
RTX 3080: Min 23.88 / Avg 180.17 / Max 260.46
RTX 3080 Ti: Min 34.58 / Avg 196.67 / Max 285.09
RTX 3090: Min 29.13 / Avg 192.12 / Max 277.93
RTX 4070: Min 10.72 / Avg 72.72 / Max 109.21
RTX 4070 SUPER: Min 5.25 / Avg 75.06 / Max 115.71
RTX 4070 Ti SUPER: Min 12.2 / Avg 102.07 / Max 150.66
RTX 4080: Min 12.93 / Avg 82.78 / Max 123.07
RTX 4080 SUPER: Min 9.36 / Avg 72.53 / Max 111.75
RTX 4090: Min 14.45 / Avg 93.44 / Max 136.85
PyTorch 2.1 - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-152 (batches/sec per Watt, more is better)
RTX 2060 SUPER: 0.573
RTX 2070: 0.544
RTX 2070 SUPER: 0.584
RTX 2080: 0.540
RTX 2080 SUPER: 0.599
TITAN RTX: 0.557
RTX 3060: 0.878
RTX 3060 Ti: 1.001
RTX 3070: 1.063
RTX 3070 Ti: 0.857
RTX 3080: 0.797
RTX 3080 Ti: 0.739
RTX 3090: 0.761
RTX 4070: 1.926
RTX 4070 SUPER: 1.891
RTX 4070 Ti SUPER: 1.433
RTX 4080: 1.781
RTX 4080 SUPER: 2.035
RTX 4090: 1.577
PyTorch 2.1 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 50 / Avg 59.05 / Max 66
RTX 2070: Min 49 / Avg 57.58 / Max 64
RTX 2070 SUPER: Min 47 / Avg 54.25 / Max 59
RTX 2080: Min 57 / Avg 65.34 / Max 72
RTX 2080 SUPER: Min 49 / Avg 54.55 / Max 59
TITAN RTX: Min 55 / Avg 62.48 / Max 68
RTX 3060: Min 46 / Avg 52.97 / Max 59
RTX 3060 Ti: Min 48 / Avg 55.15 / Max 61
RTX 3070: Min 46 / Avg 53.05 / Max 59
RTX 3070 Ti: Min 53 / Avg 61.65 / Max 68
RTX 3080: Min 56 / Avg 61.38 / Max 67
RTX 3080 Ti: Min 56 / Avg 61.16 / Max 66
RTX 3090: Min 52 / Avg 56.62 / Max 61
RTX 4060: Min 41 / Avg 51.47 / Max 64
RTX 4070: Min 37 / Avg 40.97 / Max 46
RTX 4070 SUPER: Min 39 / Avg 43.8 / Max 50
RTX 4070 Ti SUPER: Min 36 / Avg 44.81 / Max 54
RTX 4080: Min 34 / Avg 37.48 / Max 42
RTX 4080 SUPER: Min 34 / Avg 37.41 / Max 42
RTX 4090: Min 33 / Avg 36.19 / Max 40
PyTorch 2.1 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.14 / Avg 113.84 / Max 178.83
RTX 2070: Min 7.23 / Avg 111.02 / Max 174.89
RTX 2070 SUPER: Min 15.26 / Avg 116.07 / Max 183.27
RTX 2080: Min 16.11 / Avg 119.71 / Max 186.87
RTX 2080 SUPER: Min 10.84 / Avg 115.26 / Max 186.08
TITAN RTX: Min 13.28 / Avg 147.91 / Max 260.61
RTX 3060: Min 19.8 / Avg 83.88 / Max 127.87
RTX 3060 Ti: Min 21.14 / Avg 97.59 / Max 161.48
RTX 3070: Min 15.28 / Avg 96.19 / Max 168.76
RTX 3070 Ti: Min 24.8 / Avg 130.93 / Max 215.45
RTX 3080: Min 27.32 / Avg 151.29 / Max 283.75
RTX 3080 Ti: Min 34.29 / Avg 167.98 / Max 299.65
RTX 3090: Min 24.42 / Avg 163.76 / Max 295.11
RTX 4070: Min 9.44 / Avg 58.67 / Max 119.07
RTX 4070 SUPER: Min 8.07 / Avg 60.79 / Max 127.19
RTX 4070 Ti SUPER: Min 13.86 / Avg 81.23 / Max 160.33
RTX 4080: Min 13.48 / Avg 67.39 / Max 128.21
RTX 4080 SUPER: Min 9.35 / Avg 57.44 / Max 119.32
RTX 4090: Min 14.17 / Avg 78.02 / Max 143.63
PyTorch 2.1 - Device: NVIDIA CUDA GPU - Batch Size: 512 - Model: ResNet-50 (batches/sec per Watt, more is better)
RTX 2060 SUPER: 1.682
RTX 2070: 1.601
RTX 2070 SUPER: 1.728
RTX 2080: 1.651
RTX 2080 SUPER: 1.811
TITAN RTX: 1.780
RTX 3060: 2.437
RTX 3060 Ti: 2.737
RTX 3070: 2.984
RTX 3070 Ti: 2.334
RTX 3080: 2.571
RTX 3080 Ti: 2.305
RTX 3090: 2.376
RTX 4070: 6.302
RTX 4070 SUPER: 6.361
RTX 4070 Ti SUPER: 4.923
RTX 4080: 5.749
RTX 4080 SUPER: 6.957
RTX 4090: 5.144
ProjectPhysX OpenCL-Benchmark 1.2 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 54 / Avg 59.09 / Max 66
RTX 2070: Min 53 / Avg 57.89 / Max 63
RTX 2070 SUPER: Min 51 / Avg 54.89 / Max 61
RTX 2080: Min 62 / Avg 66.16 / Max 71
RTX 2080 SUPER: Min 52 / Avg 56.64 / Max 63
TITAN RTX: Min 59 / Avg 65.18 / Max 71
RTX 3060: Min 44 / Avg 50.87 / Max 61
RTX 3060 Ti: Min 50 / Avg 55.88 / Max 62
RTX 3070: Min 52 / Avg 56.86 / Max 64
RTX 3070 Ti: Min 56 / Avg 62.71 / Max 72
RTX 3080: Min 60 / Avg 63.77 / Max 68
RTX 3080 Ti: Min 61 / Avg 64.16 / Max 67
RTX 3090: Min 55 / Avg 58.84 / Max 63
RTX 4060: Min 42 / Avg 46.95 / Max 60
RTX 4070: Min 39 / Avg 43.7 / Max 60
RTX 4070 SUPER: Min 42 / Avg 47.8 / Max 68
RTX 4070 Ti SUPER: Min 38 / Avg 42.95 / Max 59
RTX 4080: Min 36 / Avg 39.99 / Max 60
RTX 4080 SUPER: Min 36 / Avg 40.36 / Max 57
RTX 4090: Min 40 / Avg 44.21 / Max 57
ProjectPhysX OpenCL-Benchmark 1.2 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.36 / Avg 88.41 / Max 198.52
RTX 2070: Min 7.27 / Avg 81.61 / Max 176.52
RTX 2070 SUPER: Min 15.34 / Avg 91.41 / Max 220.48
RTX 2080: Min 16.31 / Avg 99.96 / Max 223.41
RTX 2080 SUPER: Min 11.61 / Avg 95.67 / Max 255.32
TITAN RTX: Min 14.66 / Avg 127.58 / Max 277.04
RTX 3060: Min 20 / Avg 69.09 / Max 150.55
RTX 3060 Ti: Min 21.27 / Avg 88.74 / Max 199.22
RTX 3070: Min 15.85 / Avg 87.78 / Max 218.33
RTX 3070 Ti: Min 25.9 / Avg 128.24 / Max 281.57
RTX 3080: Min 18.77 / Avg 154.23 / Max 305.16
RTX 3080 Ti: Min 36.38 / Avg 170.64 / Max 335.25
RTX 3090: Min 26.33 / Avg 171.22 / Max 339.38
RTX 4070: Min 10 / Avg 63.42 / Max 184.25
RTX 4070 SUPER: Min 8.22 / Avg 69.69 / Max 213.55
RTX 4070 Ti SUPER: Min 14.36 / Avg 88.49 / Max 272.56
RTX 4080: Min 14.57 / Avg 74.95 / Max 259.62
RTX 4080 SUPER: Min 9.41 / Avg 72.14 / Max 251.41
RTX 4090: Min 14.58 / Avg 95.78 / Max 309.66
ProjectPhysX OpenCL-Benchmark 1.2 - Operation: Memory Bandwidth Coalesced Write (GB/s per Watt, more is better)
RTX 2060 SUPER: 4.735
RTX 2070: 5.161
RTX 2070 SUPER: 4.480
RTX 2080: 4.177
RTX 2080 SUPER: 4.724
TITAN RTX: 4.734
RTX 3060: 4.916
RTX 3060 Ti: 4.755
RTX 3070: 4.808
RTX 3070 Ti: 4.508
RTX 3080: 4.680
RTX 3080 Ti: 5.076
RTX 3090: 5.174
RTX 4070: 7.242
RTX 4070 SUPER: 6.532
RTX 4070 Ti SUPER: 6.916
RTX 4080: 8.156
RTX 4080 SUPER: 8.696
RTX 4090: 9.454
GpuOwl 7.2.1 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 54 / Avg 65.53 / Max 66
RTX 2070: Min 54 / Avg 64.51 / Max 65
RTX 2070 SUPER: Min 52 / Avg 60.54 / Max 61
RTX 2080: Min 63 / Avg 76.08 / Max 77
RTX 2080 SUPER: Min 54 / Avg 63 / Max 64
TITAN RTX: Min 62 / Avg 75.18 / Max 77
RTX 3060: Min 46 / Avg 54.07 / Max 57
RTX 3060 Ti: Min 51 / Avg 60.98 / Max 62
RTX 3070: Min 52 / Avg 62.8 / Max 64
RTX 3070 Ti: Min 56 / Avg 68.33 / Max 69
RTX 3080: Min 60 / Avg 71.21 / Max 72
RTX 3080 Ti: Min 61 / Avg 72.05 / Max 74
RTX 3090: Min 56 / Avg 65.43 / Max 67
RTX 4060: Min 43 / Avg 51.79 / Max 52
RTX 4070: Min 40 / Avg 48.27 / Max 49
RTX 4070 SUPER: Min 42 / Avg 55.07 / Max 56
RTX 4070 Ti SUPER: Min 39 / Avg 50.01 / Max 51
RTX 4080: Min 36 / Avg 46.71 / Max 48
RTX 4080 SUPER: Min 36 / Avg 46.1 / Max 47
RTX 4090: Min 41 / Avg 52.17 / Max 54
GpuOwl 7.2.1 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.32 / Avg 117.55 / Max 120.12
RTX 2070: Min 7.08 / Avg 110.07 / Max 118.83
RTX 2070 SUPER: Min 15.29 / Avg 126.87 / Max 129.47
RTX 2080: Min 16.29 / Avg 150.73 / Max 154.46
RTX 2080 SUPER: Min 11.53 / Avg 138.71 / Max 142.19
TITAN RTX: Min 15.03 / Avg 223.86 / Max 231.35
RTX 3060: Min 19.83 / Avg 78.37 / Max 80.37
RTX 3060 Ti: Min 21.39 / Avg 110.81 / Max 112.84
RTX 3070: Min 16.21 / Avg 121.39 / Max 124.14
RTX 3070 Ti: Min 25.82 / Avg 158.95 / Max 162.16
RTX 3080: Min 30.25 / Avg 223.26 / Max 229.4
RTX 3080 Ti: Min 36.52 / Avg 254.16 / Max 263.62
RTX 3090: Min 22.11 / Avg 258.35 / Max 266.52
RTX 4070: Min 11.38 / Avg 93.28 / Max 96.08
RTX 4070 SUPER: Min 8.18 / Avg 114.12 / Max 118.26
RTX 4070 Ti SUPER: Min 14.36 / Avg 150.25 / Max 155.88
RTX 4080: Min 14.95 / Avg 140.36 / Max 147.06
RTX 4080 SUPER: Min 9.61 / Avg 130.62 / Max 136.79
RTX 4090: Min 14.36 / Avg 202.43 / Max 217.33
GpuOwl 7.2.1 - Exponent: 332220523 (Iterations/Second per Watt, more is better)
RTX 2060 SUPER: 0.478
RTX 2070: 0.470
RTX 2070 SUPER: 0.521
RTX 2080: 0.486
RTX 2080 SUPER: 0.580
TITAN RTX: 0.517
RTX 3060: 0.565
RTX 3060 Ti: 0.595
RTX 3070: 0.647
RTX 3070 Ti: 0.499
RTX 3080: 0.520
RTX 3080 Ti: 0.516
RTX 3090: 0.541
RTX 4070: 1.109
RTX 4070 SUPER: 1.174
RTX 4070 Ti SUPER: 1.063
RTX 4080: 1.302
RTX 4080 SUPER: 1.445
RTX 4090: 1.496
GpuOwl 7.2.1 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 55 / Avg 65.59 / Max 67
RTX 2070: Min 55 / Avg 64.49 / Max 65
RTX 2070 SUPER: Min 52 / Avg 60.46 / Max 61
RTX 2080: Min 63 / Avg 75.99 / Max 77
RTX 2080 SUPER: Min 54 / Avg 62.76 / Max 64
TITAN RTX: Min 62 / Avg 74.97 / Max 76
RTX 3060: Min 46 / Avg 54.09 / Max 57
RTX 3060 Ti: Min 51 / Avg 61.05 / Max 62
RTX 3070: Min 52 / Avg 62.99 / Max 64
RTX 3070 Ti: Min 56 / Avg 68.38 / Max 69
RTX 3080: Min 60 / Avg 70.98 / Max 72
RTX 3080 Ti: Min 61 / Avg 71.82 / Max 74
RTX 3090: Min 56 / Avg 65.39 / Max 66
RTX 4060: Min 44 / Avg 52.57 / Max 53
RTX 4070: Min 39 / Avg 47.83 / Max 49
RTX 4070 SUPER: Min 42 / Avg 54.21 / Max 55
RTX 4070 Ti SUPER: Min 39 / Avg 49.15 / Max 50
RTX 4080: Min 36 / Avg 45.67 / Max 47
RTX 4080 SUPER: Min 36 / Avg 45.19 / Max 46
RTX 4090: Min 41 / Avg 50.71 / Max 52
GpuOwl 7.2.1 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.46 / Avg 117.3 / Max 119.6
RTX 2070: Min 7.32 / Avg 109.7 / Max 118.46
RTX 2070 SUPER: Min 15.34 / Avg 126.71 / Max 128.96
RTX 2080: Min 16.38 / Avg 151.08 / Max 155.52
RTX 2080 SUPER: Min 11.43 / Avg 138.4 / Max 141.4
TITAN RTX: Min 15.06 / Avg 221.05 / Max 227.33
RTX 3060: Min 20.05 / Avg 79.51 / Max 81.79
RTX 3060 Ti: Min 21.29 / Avg 112.02 / Max 114.06
RTX 3070: Min 16.25 / Avg 123.09 / Max 125.15
RTX 3070 Ti: Min 25.92 / Avg 160.17 / Max 163.16
RTX 3080: Min 30.2 / Avg 221.82 / Max 227.89
RTX 3080 Ti: Min 33.24 / Avg 255.79 / Max 264.47
RTX 3090: Min 31.39 / Avg 259.24 / Max 267.22
RTX 4070: Min 10.63 / Avg 91.38 / Max 94.07
RTX 4070 SUPER: Min 8.33 / Avg 109.12 / Max 113.03
RTX 4070 Ti SUPER: Min 14.5 / Avg 144.14 / Max 149.1
RTX 4080: Min 14.66 / Avg 129.49 / Max 134.98
RTX 4080 SUPER: Min 9.41 / Avg 118.23 / Max 123.91
RTX 4090: Min 14.28 / Avg 180.21 / Max 192.65
GpuOwl 7.2.1 - Exponent: 77936867 (Iterations/Second per Watt, more is better)
RTX 2060 SUPER: 2.251
RTX 2070: 2.231
RTX 2070 SUPER: 2.452
RTX 2080: 2.278
RTX 2080 SUPER: 2.724
TITAN RTX: 2.405
RTX 3060: 2.630
RTX 3060 Ti: 2.795
RTX 3070: 2.999
RTX 3070 Ti: 2.323
RTX 3080: 2.405
RTX 3080 Ti: 2.404
RTX 3090: 2.533
RTX 4070: 5.326
RTX 4070 SUPER: 5.784
RTX 4070 Ti SUPER: 5.166
RTX 4080: 6.656
RTX 4080 SUPER: 7.527
RTX 4090: 7.916
GpuOwl 7.2.1 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 58 / Avg 65.04 / Max 66
RTX 2070: Min 59 / Avg 64.32 / Max 65
RTX 2070 SUPER: Min 55 / Avg 60.91 / Max 62
RTX 2080: Min 66 / Avg 75.59 / Max 77
RTX 2080 SUPER: Min 58 / Avg 62.88 / Max 64
TITAN RTX: Min 62 / Avg 74.29 / Max 76
RTX 3060: Min 45 / Avg 53.94 / Max 57
RTX 3060 Ti: Min 54 / Avg 60.81 / Max 61
RTX 3070: Min 55 / Avg 62.67 / Max 63
RTX 3070 Ti: Min 58 / Avg 68.5 / Max 69
RTX 3080: Min 62 / Avg 70.53 / Max 72
RTX 3080 Ti: Min 62 / Avg 71.65 / Max 74
RTX 3090: Min 56 / Avg 65.26 / Max 66
RTX 4060: Min 46 / Avg 52.16 / Max 53
RTX 4070: Min 42 / Avg 48.58 / Max 51
RTX 4070 SUPER: Min 45 / Avg 54.15 / Max 55
RTX 4070 Ti SUPER: Min 41 / Avg 48.68 / Max 50
RTX 4080: Min 38 / Avg 45.97 / Max 47
RTX 4080 SUPER: Min 38 / Avg 45.55 / Max 46
RTX 4090: Min 43 / Avg 50.49 / Max 52
GpuOwl 7.2.1 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.91 / Avg 116.34 / Max 119.26
RTX 2070: Min 7.3 / Avg 109.85 / Max 118.56
RTX 2070 SUPER: Min 16.16 / Avg 126.33 / Max 129.09
RTX 2080: Min 16.97 / Avg 149.07 / Max 154.42
RTX 2080 SUPER: Min 12.2 / Avg 138 / Max 141.24
TITAN RTX: Min 14.54 / Avg 216.13 / Max 226.05
RTX 3060: Min 20.66 / Avg 79.46 / Max 81.03
RTX 3060 Ti: Min 21.96 / Avg 110.4 / Max 112.43
RTX 3070: Min 17.33 / Avg 121.63 / Max 124.35
RTX 3070 Ti: Min 26.84 / Avg 159.92 / Max 163.23
RTX 3080: Min 25.87 / Avg 221.07 / Max 227.53
RTX 3080 Ti: Min 36.97 / Avg 252.2 / Max 262.6
RTX 3090: Min 27.54 / Avg 256.81 / Max 266.77
RTX 4070: Min 11.2 / Avg 88.95 / Max 92.68
RTX 4070 SUPER: Min 8.59 / Avg 103.33 / Max 107.8
RTX 4070 Ti SUPER: Min 14.56 / Avg 136.39 / Max 142.42
RTX 4080: Min 15.16 / Avg 124.57 / Max 130.88
RTX 4080 SUPER: Min 9.58 / Avg 113.87 / Max 120.51
RTX 4090: Min 14.28 / Avg 174.08 / Max 189.3
GpuOwl 7.2.1 - Exponent: 57885161 (Iterations/Second per Watt, more is better)
RTX 2060 SUPER: 3.059
RTX 2070: 3.010
RTX 2070 SUPER: 3.322
RTX 2080: 3.107
RTX 2080 SUPER: 3.687
TITAN RTX: 3.290
RTX 3060: 3.580
RTX 3060 Ti: 3.815
RTX 3070: 4.086
RTX 3070 Ti: 3.151
RTX 3080: 3.283
RTX 3080 Ti: 3.241
RTX 3090: 3.436
RTX 4070: 7.384
RTX 4070 SUPER: 8.234
RTX 4070 Ti SUPER: 7.354
RTX 4080: 9.143
RTX 4080 SUPER: 10.430
RTX 4090: 10.970
FluidX3D 2.9 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 57 / Avg 70.58 / Max 74
RTX 2070: Min 59 / Avg 71.55 / Max 76
RTX 2070 SUPER: Min 54 / Avg 66.89 / Max 71
RTX 2080: Min 64 / Avg 79.11 / Max 83
RTX 2080 SUPER: Min 54 / Avg 67.06 / Max 73
TITAN RTX: Min 61 / Avg 73.32 / Max 78
RTX 3060: Min 44 / Avg 61.22 / Max 64
RTX 3060 Ti: Min 53 / Avg 68.81 / Max 72
RTX 3070: Min 54 / Avg 69.39 / Max 73
RTX 3070 Ti: Min 58 / Avg 75.85 / Max 80
RTX 3080: Min 62 / Avg 72.56 / Max 77
RTX 3080 Ti: Min 62 / Avg 71.7 / Max 77
RTX 3090: Min 56 / Avg 64.05 / Max 67
RTX 4060: Min 47 / Avg 62.3 / Max 65
RTX 4070: Min 42 / Avg 56.97 / Max 62
RTX 4070 SUPER: Min 45 / Avg 62.56 / Max 67
RTX 4070 Ti SUPER: Min 40 / Avg 55.62 / Max 60
RTX 4080: Min 38 / Avg 51.31 / Max 56
RTX 4080 SUPER: Min 37 / Avg 50.58 / Max 55
RTX 4090: Min 42 / Avg 54.04 / Max 59
FluidX3D 2.9 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.78 / Avg 157.72 / Max 178.06
RTX 2070: Min 7.23 / Avg 156.36 / Max 188.99
RTX 2070 SUPER: Min 15.52 / Avg 191.81 / Max 219.74
RTX 2080: Min 16.77 / Avg 188.58 / Max 220.64
RTX 2080 SUPER: Min 11.87 / Avg 202.48 / Max 239.66
TITAN RTX: Min 14.8 / Avg 238.69 / Max 286.86
RTX 3060: Min 20.53 / Avg 139.35 / Max 151.4
RTX 3060 Ti: Min 22.15 / Avg 180.22 / Max 199.96
RTX 3070: Min 17.43 / Avg 194.62 / Max 219.28
RTX 3070 Ti: Min 26.68 / Avg 253.79 / Max 289.92
RTX 3080: Min 30.71 / Avg 272.35 / Max 319.93
RTX 3080 Ti: Min 30.58 / Avg 291.88 / Max 349.71
RTX 3090: Min 25.18 / Avg 285.54 / Max 345.49
RTX 4070: Min 10.49 / Avg 147.37 / Max 171.53
RTX 4070 SUPER: Min 8.38 / Avg 166.45 / Max 193.2
RTX 4070 Ti SUPER: Min 14.5 / Avg 210.45 / Max 250.56
RTX 4080: Min 15.1 / Avg 191.97 / Max 233.36
RTX 4080 SUPER: Min 9.5 / Avg 188.56 / Max 233.56
RTX 4090: Min 14.02 / Avg 247.6 / Max 322.66
FluidX3D 2.9 - Test: FP32-FP16C (MLUPs/s per Watt, more is better)
RTX 2060 SUPER: 28.16
RTX 2070: 27.01
RTX 2070 SUPER: 25.83
RTX 2080: 26.33
RTX 2080 SUPER: 26.43
TITAN RTX: 29.98
RTX 3060: 24.30
RTX 3060 Ti: 26.07
RTX 3070: 25.10
RTX 3070 Ti: 23.30
RTX 3080: 29.37
RTX 3080 Ti: 31.33
RTX 3090: 33.66
RTX 4070: 35.64
RTX 4070 SUPER: 34.07
RTX 4070 Ti SUPER: 34.62
RTX 4080: 40.67
RTX 4080 SUPER: 42.99
RTX 4090: 46.62
FluidX3D 2.9 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 57 / Avg 69.23 / Max 73
RTX 2070: Min 57 / Avg 67.7 / Max 72
RTX 2070 SUPER: Min 52 / Avg 61.32 / Max 65
RTX 2080: Min 63 / Avg 74.82 / Max 79
RTX 2080 SUPER: Min 52 / Avg 60.78 / Max 64
TITAN RTX: Min 60 / Avg 71.27 / Max 75
RTX 3060: Min 44 / Avg 57.46 / Max 59
RTX 3060 Ti: Min 53 / Avg 64.97 / Max 68
RTX 3070: Min 53 / Avg 65.24 / Max 69
RTX 3070 Ti: Min 58 / Avg 73.77 / Max 78
RTX 3080: Min 62 / Avg 72.43 / Max 77
RTX 3080 Ti: Min 62 / Avg 70.25 / Max 75
RTX 3090: Min 57 / Avg 64.44 / Max 68
RTX 4060: Min 45 / Avg 55.97 / Max 59
RTX 4070: Min 41 / Avg 50.45 / Max 53
RTX 4070 SUPER: Min 44 / Avg 55.8 / Max 59
RTX 4070 Ti SUPER: Min 40 / Avg 49.79 / Max 53
RTX 4080: Min 38 / Avg 46.56 / Max 49
RTX 4080 SUPER: Min 37 / Avg 46 / Max 49
RTX 4090: Min 42 / Avg 50.03 / Max 53
FluidX3D 2.9 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.6 / Avg 150.83 / Max 171.56
RTX 2070: Min 7.26 / Avg 141.53 / Max 161
RTX 2070 SUPER: Min 15.28 / Avg 149.4 / Max 169.75
RTX 2080: Min 15.82 / Avg 158.8 / Max 180.55
RTX 2080 SUPER: Min 11.67 / Avg 146.39 / Max 168.8
TITAN RTX: Min 14.76 / Avg 214.71 / Max 256.31
RTX 3060: Min 21.06 / Avg 114.84 / Max 125.25
RTX 3060 Ti: Min 21.96 / Avg 148.76 / Max 166.88
RTX 3070: Min 17.27 / Avg 152.53 / Max 173
RTX 3070 Ti: Min 27.15 / Avg 221.87 / Max 258.06
RTX 3080: Min 26.04 / Avg 268.85 / Max 319.28
RTX 3080 Ti: Min 26.31 / Avg 274.7 / Max 336.09
RTX 3090: Min 25.99 / Avg 281.59 / Max 346.92
RTX 4070: Min 9.91 / Avg 113.05 / Max 127.42
RTX 4070 SUPER: Min 8.48 / Avg 131.13 / Max 151.27
RTX 4070 Ti SUPER: Min 14.19 / Avg 162.47 / Max 190.04
RTX 4080: Min 14.84 / Avg 156.36 / Max 187.51
RTX 4080 SUPER: Min 9.47 / Avg 147.54 / Max 179.32
RTX 4090: Min 14.26 / Avg 197.5 / Max 249.26
FluidX3D 2.9 - Test: FP32-FP16S (MLUPs/s per Watt, more is better)
RTX 2060 SUPER: 33.03
RTX 2070: 32.43
RTX 2070 SUPER: 33.12
RTX 2080: 31.22
RTX 2080 SUPER: 37.05
TITAN RTX: 33.11
RTX 3060: 34.24
RTX 3060 Ti: 34.52
RTX 3070: 33.30
RTX 3070 Ti: 31.17
RTX 3080: 31.47
RTX 3080 Ti: 36.85
RTX 3090: 37.26
RTX 4070: 41.39
RTX 4070 SUPER: 42.30
RTX 4070 Ti SUPER: 39.68
RTX 4080: 49.14
RTX 4080 SUPER: 52.72
RTX 4090: 51.83
FluidX3D 2.9 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 53 / Avg 66.13 / Max 69
RTX 2070: Min 54 / Avg 64.86 / Max 68
RTX 2070 SUPER: Min 50 / Avg 58.87 / Max 61
RTX 2080: Min 62 / Avg 72.09 / Max 75
RTX 2080 SUPER: Min 52 / Avg 59.22 / Max 61
TITAN RTX: Min 60 / Avg 70.87 / Max 73
RTX 3060: Min 44 / Avg 55.12 / Max 58
RTX 3060 Ti: Min 52 / Avg 63.31 / Max 65
RTX 3070: Min 43 / Avg 60.73 / Max 66
RTX 3070 Ti: Min 57 / Avg 71.96 / Max 75
RTX 3080: Min 60 / Avg 72.11 / Max 76
RTX 3080 Ti: Min 51 / Avg 67.81 / Max 75
RTX 3090: Min 56 / Avg 66.14 / Max 69
RTX 4060: Min 49 / Avg 56.64 / Max 63
RTX 4070: Min 40 / Avg 48.8 / Max 51
RTX 4070 SUPER: Min 44 / Avg 53.63 / Max 56
RTX 4070 Ti SUPER: Min 40 / Avg 48.94 / Max 51
RTX 4080: Min 38 / Avg 45.33 / Max 47
RTX 4080 SUPER: Min 38 / Avg 44.87 / Max 46
RTX 4090: Min 42 / Avg 50.54 / Max 52
FluidX3D 2.9 GPU Power Consumption Monitor (Watts, lower is better)
RTX 2060 SUPER: Min 18.26 / Avg 133.72 / Max 142.79
RTX 2070: Min 7.78 / Avg 121.58 / Max 142.23
RTX 2070 SUPER: Min 15.38 / Avg 129.75 / Max 138.01
RTX 2080: Min 16.44 / Avg 136.21 / Max 149.22
RTX 2080 SUPER: Min 11.6 / Avg 128.2 / Max 136.2
TITAN RTX: Min 14.58 / Avg 197.07 / Max 214.44
RTX 3060: Min 20.47 / Avg 100.65 / Max 105.72
RTX 3060 Ti: Min 21.96 / Avg 131.94 / Max 141.18
RTX 3070: Min 16.54 / Avg 132.86 / Max 142.53
RTX 3070 Ti: Min 26.1 / Avg 201.43 / Max 218
RTX 3080: Min 30.46 / Avg 257.29 / Max 284.73
RTX 3080 Ti: Min 26.51 / Avg 286.34 / Max 322.56
RTX 3090: Min 31.29 / Avg 305.62 / Max 345.04
RTX 4070: Min 10.99 / Avg 106.31 / Max 114.9
RTX 4070 SUPER: Min 8.48 / Avg 117.26 / Max 127.32
RTX 4070 Ti SUPER: Min 14.48 / Avg 157.81 / Max 172.82
RTX 4080: Min 14.97 / Avg 138.85 / Max 153.34
RTX 4080 SUPER: Min 9.59 / Avg 132.56 / Max 146.68
RTX 4090: Min 14.37 / Avg 192.05 / Max 220.55
FluidX3D 2.9 - Test: FP32-FP32 (MLUPs/s per Watt, more is better)
RTX 2060 SUPER: 18.22
RTX 2070: 18.67
RTX 2070 SUPER: 17.89
RTX 2080: 17.05
RTX 2080 SUPER: 19.53
TITAN RTX: 16.96
RTX 3060: 20.26
RTX 3060 Ti: 19.99
RTX 3070: 19.55
RTX 3070 Ti: 17.41
RTX 3080: 16.86
RTX 3080 Ti: 18.16
RTX 3090: 17.43
RTX 4070: 25.30
RTX 4070 SUPER: 23.69
RTX 4070 Ti SUPER: 24.27
RTX 4080: 27.51
RTX 4080 SUPER: 29.93
RTX 4090: 29.82
LuxCoreRender 2.6 GPU Temperature Monitor (Celsius, lower is better)
RTX 2060 SUPER: Min 54 / Avg 62.19 / Max 65
RTX 2070: Min 55 / Avg 60.99 / Max 64
RTX 2070 SUPER: Min 52 / Avg 58.11 / Max 60
RTX 2080: Min 63 / Avg 71.49 / Max 76
RTX 2080 SUPER: Min 53 / Avg 59.11 / Max 61
TITAN RTX: Min 61 / Avg 70.96 / Max 74
RTX 3060: Min 44 / Avg 53.91 / Max 58
RTX 3060 Ti: Min 52 / Avg 62.04 / Max 65
RTX 3070: Min 53 / Avg 63.21 / Max 66
RTX 3070 Ti: Min 57 / Avg 68.81 / Max 72
RTX 3080: Min 61 / Avg 70.62 / Max 74
RTX 3080 Ti: Min 61 / Avg 72.11 / Max 76
RTX 3090: Min 56 / Avg 65.03 / Max 68
RTX 4060: Min 47 / Avg 54.53 / Max 57
RTX 4070: Min 41 / Avg 48.61 / Max 51
RTX 4070 SUPER: Min 45 / Avg 55.76 / Max 59
RTX 4070 Ti SUPER: Min 41 / Avg 50.09 / Max 53
RTX 4080: Min 39 / Avg 47.59 / Max 50
RTX 4080 SUPER: Min 38 / Avg 46.78 / Max 49
RTX 4090: Min 42 / Avg 51.04 / Max 54
LuxCoreRender GPU Power Consumption Monitor OpenBenchmarking.org Watts, Fewer Is Better LuxCoreRender 2.6 GPU Power Consumption Monitor RTX 2060 SUPER RTX 2070 RTX 2070 SUPER RTX 2080 RTX 2080 SUPER TITAN RTX RTX 3060 RTX 3060 Ti RTX 3070 RTX 3070 Ti RTX 3080 RTX 3080 Ti RTX 3090 RTX 4070 RTX 4070 SUPER RTX 4070 Ti SUPER RTX 4080 RTX 4080 SUPER RTX 4090 50 100 150 200 250 Min: 18.47 / Avg: 108.64 / Max: 125.39 Min: 7.2 / Avg: 96.2 / Max: 121.83 Min: 15.54 / Avg: 120.72 / Max: 139.9 Min: 16.33 / Avg: 133.17 / Max: 157.95 Min: 11.52 / Avg: 125.33 / Max: 144.25 Min: 14.27 / Avg: 195.03 / Max: 229.11 Min: 19.75 / Avg: 89.44 / Max: 102.56 Min: 21.69 / Avg: 122.67 / Max: 139.71 Min: 17.27 / Avg: 131.7 / Max: 151.97 Min: 26.52 / Avg: 167.17 / Max: 188.26 Min: 25.83 / Avg: 225.73 / Max: 259.66 Min: 36.19 / Avg: 262.35 / Max: 299.21 Min: 23.66 / Avg: 264.59 / Max: 304.53 Min: 9.24 / Avg: 96.07 / Max: 113.06 Min: 8.48 / Avg: 119.12 / Max: 140.91 Min: 14.55 / Avg: 150.15 / Max: 174.49 Min: 15 / Avg: 146.93 / Max: 172.17 Min: 9.75 / Avg: 136.37 / Max: 160.77 Min: 14.11 / Avg: 197.62 / Max: 231.59
LuxCoreRender Scene: Danish Mood - Acceleration: GPU OpenBenchmarking.org M samples/sec Per Watt, More Is Better LuxCoreRender 2.6 Scene: Danish Mood - Acceleration: GPU RTX 2060 SUPER RTX 2070 RTX 2070 SUPER RTX 2080 RTX 2080 SUPER TITAN RTX RTX 3060 RTX 3060 Ti RTX 3070 RTX 3070 Ti RTX 3080 RTX 3080 Ti RTX 3090 RTX 4070 RTX 4070 SUPER RTX 4070 Ti SUPER RTX 4080 RTX 4080 SUPER RTX 4090 0.0218 0.0436 0.0654 0.0872 0.109 0.025 0.026 0.029 0.027 0.030 0.026 0.037 0.039 0.042 0.035 0.035 0.035 0.037 0.078 0.081 0.075 0.087 0.097 0.094
LuxCoreRender 2.6 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        52 / 63.53 / 67
  RTX 2070              54 / 63.06 / 66
  RTX 2070 SUPER        51 / 59.59 / 63
  RTX 2080              62 / 73.84 / 78
  RTX 2080 SUPER        52 / 60.53 / 63
  TITAN RTX             60 / 72.47 / 76
  RTX 3060              44 / 55.44 / 58
  RTX 3060 Ti           52 / 64.31 / 67
  RTX 3070              52 / 65.53 / 69
  RTX 3070 Ti           57 / 71.41 / 75
  RTX 3080              61 / 73.07 / 76
  RTX 3080 Ti           61 / 73.99 / 77
  RTX 3090              56 / 67.03 / 70
  RTX 4060              46 / 57.03 / 60
  RTX 4070              41 / 50.63 / 54
  RTX 4070 SUPER        43 / 57.65 / 62
  RTX 4070 Ti SUPER     40 / 52.57 / 56
  RTX 4080              38 / 49.14 / 52
  RTX 4080 SUPER        37 / 48.15 / 51
  RTX 4090              41 / 52.44 / 56

LuxCoreRender 2.6 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.22 / 118.64 / 135.14
  RTX 2070              7.33 / 110.98 / 133
  RTX 2070 SUPER        15.45 / 131.88 / 151.74
  RTX 2080              16.35 / 147.08 / 171.26
  RTX 2080 SUPER        11.41 / 136.44 / 155.62
  TITAN RTX             13.99 / 209.16 / 240.48
  RTX 3060              19.96 / 101.23 / 112.74
  RTX 3060 Ti           21.88 / 140.16 / 157.22
  RTX 3070              16.86 / 153.28 / 173.05
  RTX 3070 Ti           26.3 / 188.24 / 209.31
  RTX 3080              25.6 / 255.09 / 286.12
  RTX 3080 Ti           36.24 / 295.04 / 331.43
  RTX 3090              31.53 / 299.83 / 337.93
  RTX 4070              10.64 / 109.15 / 124.14
  RTX 4070 SUPER        8.54 / 133.01 / 151.93
  RTX 4070 Ti SUPER     14.32 / 170.09 / 192.33
  RTX 4080              14.87 / 163.46 / 186.71
  RTX 4080 SUPER        9.61 / 151.66 / 175.37
  RTX 4090              13.68 / 218.59 / 249.41

LuxCoreRender 2.6 - Scene: Orange Juice - Acceleration: GPU (M samples/sec per Watt; more is better)

  RTX 2060 SUPER        0.031
  RTX 2070              0.030
  RTX 2070 SUPER        0.034
  RTX 2080              0.031
  RTX 2080 SUPER        0.035
  TITAN RTX             0.029
  RTX 3060              0.048
  RTX 3060 Ti           0.044
  RTX 3070              0.046
  RTX 3070 Ti           0.038
  RTX 3080              0.036
  RTX 3080 Ti           0.034
  RTX 3090              0.035
  RTX 4070              0.077
  RTX 4070 SUPER        0.074
  RTX 4070 Ti SUPER     0.068
  RTX 4080              0.076
  RTX 4080 SUPER        0.084
  RTX 4090              0.080
LuxCoreRender 2.6 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        43 / 55.31 / 65
  RTX 2070              53 / 62.09 / 66
  RTX 2070 SUPER        50 / 58.25 / 61
  RTX 2080              59 / 71.61 / 76
  RTX 2080 SUPER        51 / 58.82 / 62
  TITAN RTX             57 / 70.96 / 75
  RTX 3060              43 / 54.22 / 58
  RTX 3060 Ti           51 / 62.32 / 65
  RTX 3070              41 / 55.7 / 66
  RTX 3070 Ti           56 / 69.31 / 73
  RTX 3080              59 / 70.87 / 75
  RTX 3080 Ti           59 / 72.35 / 76
  RTX 3090              53 / 65.27 / 69
  RTX 4060              40 / 52.16 / 63
  RTX 4070              39 / 48.67 / 52
  RTX 4070 SUPER        34 / 47.53 / 59
  RTX 4070 Ti SUPER     37 / 50.39 / 54
  RTX 4080              37 / 46.66 / 50
  RTX 4080 SUPER        36 / 45.97 / 49
  RTX 4090              39 / 48.39 / 52

LuxCoreRender 2.6 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        17.52 / 85.99 / 133.78
  RTX 2070              7.26 / 108.77 / 129.93
  RTX 2070 SUPER        15.44 / 125.38 / 142.93
  RTX 2080              15.63 / 137.83 / 162.54
  RTX 2080 SUPER        11.51 / 128.93 / 147.45
  TITAN RTX             14.23 / 206.34 / 239.5
  RTX 3060              19.74 / 95.03 / 106.95
  RTX 3060 Ti           21.56 / 127.24 / 145.09
  RTX 3070              16.14 / 99.96 / 158.07
  RTX 3070 Ti           26.38 / 174.33 / 195.65
  RTX 3080              17.59 / 239.04 / 273.35
  RTX 3080 Ti           35.23 / 278.51 / 315.67
  RTX 3090              27.54 / 279.81 / 322.78
  RTX 4070              11.1 / 100.63 / 117.14
  RTX 4070 SUPER        3.85 / 87.74 / 143.96
  RTX 4070 Ti SUPER     9.56 / 155.59 / 180.51
  RTX 4080              15.26 / 148.84 / 176.9
  RTX 4080 SUPER        9.55 / 137.77 / 166.35
  RTX 4090              13.87 / 185.92 / 233.78

LuxCoreRender 2.6 - Scene: LuxCore Benchmark - Acceleration: GPU (M samples/sec per Watt; more is better)

  RTX 2060 SUPER        0.042
  RTX 2070              0.030
  RTX 2070 SUPER        0.034
  RTX 2080              0.031
  RTX 2080 SUPER        0.035
  TITAN RTX             0.031
  RTX 3060              0.043
  RTX 3060 Ti           0.045
  RTX 3070              0.067
  RTX 3070 Ti           0.040
  RTX 3080              0.040
  RTX 3080 Ti           0.040
  RTX 3090              0.041
  RTX 4070              0.089
  RTX 4070 SUPER        0.129
  RTX 4070 Ti SUPER     0.084
  RTX 4080              0.098
  RTX 4080 SUPER        0.108
  RTX 4090              0.104
LuxCoreRender 2.6 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        54 / 61.58 / 65
  RTX 2070              55 / 60.89 / 63
  RTX 2070 SUPER        53 / 57.83 / 60
  RTX 2080              65 / 68.96 / 71
  RTX 2080 SUPER        54 / 58.28 / 60
  TITAN RTX             62 / 67.68 / 71
  RTX 3060              43 / 53.67 / 58
  RTX 3060 Ti           53 / 59.56 / 64
  RTX 3070              53 / 60.17 / 64
  RTX 3070 Ti           58 / 66.78 / 71
  RTX 3080              62 / 66.73 / 70
  RTX 3080 Ti           63 / 67.38 / 71
  RTX 3090              57 / 61.29 / 65
  RTX 4060              48 / 52.83 / 56
  RTX 4070              42 / 46.4 / 50
  RTX 4070 SUPER        44 / 49.94 / 55
  RTX 4070 Ti SUPER     40 / 45.25 / 49
  RTX 4080              38 / 43.3 / 48
  RTX 4080 SUPER        38 / 42.82 / 47
  RTX 4090              41 / 45.65 / 50

LuxCoreRender 2.6 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.32 / 108.92 / 143.79
  RTX 2070              7.32 / 106.03 / 145.21
  RTX 2070 SUPER        15.35 / 111.08 / 142.77
  RTX 2080              16.58 / 117.8 / 148.39
  RTX 2080 SUPER        11.47 / 108.11 / 141.33
  TITAN RTX             15.48 / 161.95 / 228
  RTX 3060              19.98 / 89.38 / 114.93
  RTX 3060 Ti           21.72 / 110.52 / 157.34
  RTX 3070              17.35 / 115.96 / 169.73
  RTX 3070 Ti           26.81 / 155.29 / 212.62
  RTX 3080              30.54 / 187.86 / 283.44
  RTX 3080 Ti           33.54 / 217.42 / 330.81
  RTX 3090              24.7 / 210.07 / 336.9
  RTX 4070              11.11 / 75.52 / 111.52
  RTX 4070 SUPER        8.54 / 83.79 / 135.51
  RTX 4070 Ti SUPER     11.1 / 107.79 / 169.83
  RTX 4080              15.54 / 100.6 / 169.5
  RTX 4080 SUPER        9.58 / 89.44 / 158.04
  RTX 4090              13.97 / 118.95 / 218.79

LuxCoreRender 2.6 - Scene: Rainbow Colors and Prism - Acceleration: GPU (M samples/sec per Watt; more is better)

  RTX 2060 SUPER        0.113
  RTX 2070              0.107
  RTX 2070 SUPER        0.109
  RTX 2080              0.100
  RTX 2080 SUPER        0.108
  TITAN RTX             0.101
  RTX 3060              0.152
  RTX 3060 Ti           0.173
  RTX 3070              0.186
  RTX 3070 Ti           0.143
  RTX 3080              0.145
  RTX 3080 Ti           0.147
  RTX 3090              0.155
  RTX 4070              0.268
  RTX 4070 SUPER        0.296
  RTX 4070 Ti SUPER     0.262
  RTX 4080              0.322
  RTX 4080 SUPER        0.364
  RTX 4090              0.345
LuxCoreRender 2.6 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        45 / 58.53 / 67
  RTX 2070              58 / 65 / 67
  RTX 2070 SUPER        53 / 61.85 / 64
  RTX 2080              64 / 76.43 / 80
  RTX 2080 SUPER        54 / 62.67 / 65
  TITAN RTX             63 / 74.59 / 78
  RTX 3060              44 / 56.08 / 58
  RTX 3060 Ti           53 / 65.33 / 67
  RTX 3070              43 / 58.71 / 69
  RTX 3070 Ti           58 / 72.71 / 75
  RTX 3080              63 / 75.07 / 78
  RTX 3080 Ti           63 / 75.4 / 78
  RTX 3090              57 / 68.33 / 71
  RTX 4060              40 / 54.23 / 63
  RTX 4070              43 / 53.36 / 56
  RTX 4070 SUPER        35 / 51.77 / 64
  RTX 4070 Ti SUPER     42 / 54.93 / 58
  RTX 4080              39 / 51.35 / 54
  RTX 4080 SUPER        39 / 50.61 / 53
  RTX 4090              43 / 54.93 / 57

LuxCoreRender 2.6 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        17.66 / 92.27 / 136.5
  RTX 2070              7.3 / 114.56 / 129.52
  RTX 2070 SUPER        15.63 / 140.35 / 153.47
  RTX 2080              16.39 / 159.06 / 176.54
  RTX 2080 SUPER        11.62 / 144.39 / 157.51
  TITAN RTX             15.69 / 225.88 / 249.56
  RTX 3060              20.51 / 103.92 / 111.55
  RTX 3060 Ti           22.05 / 143.63 / 154.99
  RTX 3070              16.21 / 113.67 / 170.86
  RTX 3070 Ti           26.86 / 194.14 / 208.92
  RTX 3080              20.79 / 270.78 / 294.18
  RTX 3080 Ti           37.28 / 310.03 / 336.65
  RTX 3090              31.06 / 318.13 / 346.47
  RTX 4070              10.93 / 117.19 / 128.29
  RTX 4070 SUPER        3.4 / 101.04 / 157.96
  RTX 4070 Ti SUPER     14.77 / 183.37 / 198.75
  RTX 4080              15.45 / 176.3 / 192.49
  RTX 4080 SUPER        9.57 / 167.39 / 183.93
  RTX 4090              14.31 / 236.25 / 260.36

LuxCoreRender 2.6 - Scene: DLSC - Acceleration: GPU (M samples/sec per Watt; more is better)

  RTX 2060 SUPER        0.035
  RTX 2070              0.026
  RTX 2070 SUPER        0.029
  RTX 2080              0.027
  RTX 2080 SUPER        0.031
  TITAN RTX             0.026
  RTX 3060              0.041
  RTX 3060 Ti           0.041
  RTX 3070              0.061
  RTX 3070 Ti           0.037
  RTX 3080              0.036
  RTX 3080 Ti           0.036
  RTX 3090              0.037
  RTX 4070              0.079
  RTX 4070 SUPER        0.115
  RTX 4070 Ti SUPER     0.075
  RTX 4080              0.087
  RTX 4080 SUPER        0.095
  RTX 4090              0.094
IndigoBench 4.4 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        55 / 67.13 / 71
  RTX 2070              57 / 66.81 / 70
  RTX 2070 SUPER        52 / 60.8 / 64
  RTX 2080              64 / 75.54 / 80
  RTX 2080 SUPER        53 / 61.48 / 64
  TITAN RTX             62 / 74.15 / 77
  RTX 3060              43 / 57.1 / 59
  RTX 3060 Ti           53 / 65.8 / 69
  RTX 3070              53 / 65.94 / 70
  RTX 3070 Ti           58 / 73.22 / 77
  RTX 3080              62 / 74.43 / 78
  RTX 3080 Ti           62 / 74.83 / 78
  RTX 3090              57 / 67.5 / 70
  RTX 4060              47 / 57.78 / 63
  RTX 4070              42 / 53.04 / 58
  RTX 4070 SUPER        44 / 59.06 / 66
  RTX 4070 Ti SUPER     41 / 54.19 / 59
  RTX 4080              38 / 49.98 / 54
  RTX 4080 SUPER        38 / 49.32 / 54
  RTX 4090              41 / 52.91 / 57

IndigoBench 4.4 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.7 / 133.52 / 163.39
  RTX 2070              7.29 / 129.14 / 159.96
  RTX 2070 SUPER        15.68 / 139.95 / 169.66
  RTX 2080              17.01 / 156.59 / 195.73
  RTX 2080 SUPER        11.59 / 140.82 / 176.14
  TITAN RTX             14.71 / 227.01 / 267.55
  RTX 3060              20.09 / 110.97 / 124.28
  RTX 3060 Ti           22.03 / 150.79 / 169.01
  RTX 3070              17.17 / 156.78 / 184.28
  RTX 3070 Ti           26.9 / 202.39 / 224.09
  RTX 3080              30.92 / 270.89 / 302.82
  RTX 3080 Ti           37.14 / 308.47 / 343.37
  RTX 3090              31.68 / 308.41 / 341.1
  RTX 4070              11.04 / 121.82 / 142.03
  RTX 4070 SUPER        8.67 / 138.15 / 168.54
  RTX 4070 Ti SUPER     14.69 / 185.57 / 213.34
  RTX 4080              15.22 / 171.26 / 197.37
  RTX 4080 SUPER        9.48 / 162.42 / 188.22
  RTX 4090              14.63 / 223.43 / 254.7

IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Bedroom (M samples/s per Watt; more is better)

  RTX 2060 SUPER        0.057
  RTX 2070              0.054
  RTX 2070 SUPER        0.057
  RTX 2080              0.051
  RTX 2080 SUPER        0.059
  TITAN RTX             0.053
  RTX 3060              0.075
  RTX 3060 Ti           0.077
  RTX 3070              0.084
  RTX 3070 Ti           0.069
  RTX 3080              0.066
  RTX 3080 Ti           0.066
  RTX 3090              0.068
  RTX 4070              0.138
  RTX 4070 SUPER        0.141
  RTX 4070 Ti SUPER     0.130
  RTX 4080              0.150
  RTX 4080 SUPER        0.161
  RTX 4090              0.159
IndigoBench 4.4 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        53 / 65.27 / 70
  RTX 2070              54 / 65.28 / 69
  RTX 2070 SUPER        49 / 58.89 / 62
  RTX 2080              62 / 74.72 / 79
  RTX 2080 SUPER        51 / 59.95 / 63
  TITAN RTX             59 / 72.33 / 76
  RTX 3060              44 / 56.06 / 59
  RTX 3060 Ti           53 / 64.64 / 68
  RTX 3070              53 / 64.66 / 69
  RTX 3070 Ti           58 / 71.92 / 75
  RTX 3080              62 / 72.69 / 76
  RTX 3080 Ti           63 / 73.05 / 76
  RTX 3090              57 / 66.16 / 69
  RTX 4060              47 / 56.44 / 61
  RTX 4070              41 / 50.63 / 54
  RTX 4070 SUPER        44 / 56.47 / 62
  RTX 4070 Ti SUPER     41 / 51.46 / 55
  RTX 4080              38 / 47.72 / 51
  RTX 4080 SUPER        38 / 47.11 / 50
  RTX 4090              42 / 50.03 / 53

IndigoBench 4.4 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.44 / 126.88 / 158.36
  RTX 2070              7.3 / 122.51 / 146.51
  RTX 2070 SUPER        15.67 / 133.84 / 157.27
  RTX 2080              17.01 / 152.15 / 184.44
  RTX 2080 SUPER        11.26 / 137.44 / 165.15
  TITAN RTX             14.61 / 214.59 / 259.31
  RTX 3060              20.34 / 104.12 / 119.91
  RTX 3060 Ti           22.17 / 141.77 / 161.46
  RTX 3070              17.25 / 144.9 / 173.29
  RTX 3070 Ti           27.38 / 189.37 / 211.81
  RTX 3080              31.19 / 247.79 / 278.12
  RTX 3080 Ti           37.34 / 281.35 / 312.73
  RTX 3090              31.36 / 284.21 / 313.03
  RTX 4070              11.27 / 109.72 / 125.01
  RTX 4070 SUPER        8.67 / 123.18 / 145.96
  RTX 4070 Ti SUPER     14.77 / 160.45 / 181.7
  RTX 4080              15.3 / 149.36 / 169.13
  RTX 4080 SUPER        9.85 / 138.57 / 158.29
  RTX 4090              14.44 / 183.51 / 204.92

IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Supercar (M samples/s per Watt; more is better)

  RTX 2060 SUPER        0.183
  RTX 2070              0.173
  RTX 2070 SUPER        0.183
  RTX 2080              0.164
  RTX 2080 SUPER        0.188
  TITAN RTX             0.163
  RTX 3060              0.234
  RTX 3060 Ti           0.236
  RTX 3070              0.255
  RTX 3070 Ti           0.203
  RTX 3080              0.184
  RTX 3080 Ti           0.181
  RTX 3090              0.186
  RTX 4070              0.407
  RTX 4070 SUPER        0.424
  RTX 4070 Ti SUPER     0.377
  RTX 4080              0.434
  RTX 4080 SUPER        0.469
  RTX 4090              0.432
Chaos Group V-RAY 5.02 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        50 / 59.77 / 66
  RTX 2070              52 / 59.9 / 64
  RTX 2070 SUPER        46 / 54.61 / 59
  RTX 2080              60 / 69.79 / 74
  RTX 2080 SUPER        51 / 56.95 / 60
  TITAN RTX             54 / 66.33 / 73
  RTX 3060              43 / 53.91 / 59
  RTX 3060 Ti           51 / 62.32 / 67
  RTX 3070              49 / 61.71 / 68
  RTX 3070 Ti           56 / 70.37 / 76
  RTX 3080              60 / 70.88 / 76
  RTX 3080 Ti           57 / 70.14 / 77
  RTX 3090              54 / 65.16 / 70
  RTX 4060              43 / 54.11 / 60
  RTX 4070              38 / 46.74 / 52
  RTX 4070 SUPER        41 / 53.17 / 60
  RTX 4070 Ti SUPER     40 / 50.09 / 55
  RTX 4080              36 / 45.14 / 50
  RTX 4080 SUPER        38 / 45.94 / 49
  RTX 4090              39 / 48.65 / 54

Chaos Group V-RAY 5.02 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        17.93 / 98.52 / 136.66
  RTX 2070              7.23 / 95.04 / 129.87
  RTX 2070 SUPER        15.18 / 108.55 / 145.59
  RTX 2080              16.51 / 122.45 / 150.28
  RTX 2080 SUPER        10.88 / 113.21 / 140.03
  TITAN RTX             13.71 / 163.18 / 219.73
  RTX 3060              19.95 / 87.95 / 113.42
  RTX 3060 Ti           21.99 / 127.04 / 157.14
  RTX 3070              16.52 / 124.46 / 166.93
  RTX 3070 Ti           26.25 / 180.93 / 224.16
  RTX 3080              23.16 / 231.65 / 291.84
  RTX 3080 Ti           22.42 / 253.46 / 334.61
  RTX 3090              23.87 / 278.11 / 348.11
  RTX 4070              8.67 / 91.33 / 125.74
  RTX 4070 SUPER        6.95 / 110.91 / 153.37
  RTX 4070 Ti SUPER     11.78 / 147.19 / 202.6
  RTX 4080              11.75 / 136.11 / 188.96
  RTX 4080 SUPER        9.51 / 139.49 / 182.51
  RTX 4090              13.91 / 183.5 / 266.24

Chaos Group V-RAY 5.02 - Mode: NVIDIA RTX GPU (vrays per Watt; more is better)

  RTX 2060 SUPER        8.516
  RTX 2070              7.744
  RTX 2070 SUPER        9.838
  RTX 2080              8.150
  RTX 2080 SUPER        8.948
  TITAN RTX             8.898
  RTX 3060              13.042
  RTX 3060 Ti           13.263
  RTX 3070              15.058
  RTX 3070 Ti           11.225
  RTX 3080              10.572
  RTX 3080 Ti           11.513
  RTX 3090              10.769
  RTX 4070              27.427
  RTX 4070 SUPER        27.868
  RTX 4070 Ti SUPER     25.009
  RTX 4080              30.725
  RTX 4080 SUPER        29.486
  RTX 4090              30.763
Chaos Group V-RAY 5.02 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        54 / 62.4 / 67
  RTX 2070              54 / 60.68 / 64
  RTX 2070 SUPER        50 / 58.04 / 62
  RTX 2080              61 / 71.77 / 78
  RTX 2080 SUPER        52 / 59.18 / 62
  TITAN RTX             59 / 69.27 / 74
  RTX 3060              43 / 54.94 / 59
  RTX 3060 Ti           51 / 63.42 / 69
  RTX 3070              51 / 64.02 / 70
  RTX 3070 Ti           56 / 71.16 / 77
  RTX 3080              60 / 71.61 / 77
  RTX 3080 Ti           59 / 72.13 / 78
  RTX 3090              53 / 64.82 / 69
  RTX 4060              45 / 56.33 / 63
  RTX 4070              40 / 49.34 / 54
  RTX 4070 SUPER        42 / 55.74 / 64
  RTX 4070 Ti SUPER     39 / 51.68 / 57
  RTX 4080              37 / 47.86 / 53
  RTX 4080 SUPER        36 / 47.17 / 53
  RTX 4090              39 / 51.1 / 57

Chaos Group V-RAY 5.02 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.4 / 103.18 / 143.19
  RTX 2070              7.19 / 92.41 / 130.29
  RTX 2070 SUPER        15.5 / 124.36 / 161.89
  RTX 2080              16.52 / 132.44 / 176.43
  RTX 2080 SUPER        10.9 / 124.68 / 159.85
  TITAN RTX             14.69 / 178.52 / 231.97
  RTX 3060              19.82 / 96.85 / 118.86
  RTX 3060 Ti           21.79 / 135.73 / 169.63
  RTX 3070              16.91 / 143.19 / 184.3
  RTX 3070 Ti           26.48 / 189.21 / 237.34
  RTX 3080              29.01 / 242.54 / 308.12
  RTX 3080 Ti           32.33 / 276.77 / 344.93
  RTX 3090              17.71 / 275.2 / 345.54
  RTX 4070              8.63 / 102.33 / 131.61
  RTX 4070 SUPER        5.32 / 121.78 / 163.76
  RTX 4070 Ti SUPER     11.58 / 163.03 / 210.13
  RTX 4080              11.32 / 158.52 / 203.25
  RTX 4080 SUPER        9.67 / 149.9 / 192.37
  RTX 4090              13.99 / 219.47 / 288.12

Chaos Group V-RAY 5.02 - Mode: NVIDIA CUDA GPU (vpaths per Watt; more is better)

  RTX 2060 SUPER        4.953
  RTX 2070              4.869
  RTX 2070 SUPER        6.385
  RTX 2080              5.904
  RTX 2080 SUPER        6.449
  TITAN RTX             5.433
  RTX 3060              8.374
  RTX 3060 Ti           9.209
  RTX 3070              9.889
  RTX 3070 Ti           7.859
  RTX 3080              7.327
  RTX 3080 Ti           7.389
  RTX 3090              7.660
  RTX 4070              16.808
  RTX 4070 SUPER        18.476
  RTX 4070 Ti SUPER     16.543
  RTX 4080              19.367
  RTX 4080 SUPER        20.501
  RTX 4090              19.743
Blender 4.0 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        54 / 68.68 / 72
  RTX 2070              56 / 67.94 / 71
  RTX 2070 SUPER        43 / 58.22 / 63
  RTX 2080              62 / 76.19 / 81
  RTX 2080 SUPER        50 / 61.04 / 65
  TITAN RTX             59 / 73.14 / 77
  RTX 3060              44 / 56.97 / 59
  RTX 3060 Ti           51 / 64.5 / 68
  RTX 3070              52 / 65.35 / 69
  RTX 3070 Ti           55 / 71.66 / 76
  RTX 3080              59 / 71.9 / 77
  RTX 3080 Ti           60 / 72.78 / 77
  RTX 3090              54 / 65.3 / 69
  RTX 4060              45 / 58.13 / 62
  RTX 4070              39 / 50.16 / 55
  RTX 4070 SUPER        42 / 56.48 / 63
  RTX 4070 Ti SUPER     38 / 50.51 / 55
  RTX 4080              36 / 46.47 / 51
  RTX 4080 SUPER        36 / 46.12 / 51
  RTX 4090              37 / 47.07 / 53

Blender 4.0 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.32 / 141.12 / 157.32
  RTX 2070              7.33 / 131.11 / 147.26
  RTX 2070 SUPER        15.55 / 140.62 / 159.41
  RTX 2080              17.17 / 156.61 / 183.5
  RTX 2080 SUPER        10.84 / 141.82 / 163.71
  TITAN RTX             14.29 / 219.53 / 257.62
  RTX 3060              20.52 / 107.19 / 115.77
  RTX 3060 Ti           21.65 / 141.98 / 158.65
  RTX 3070              16.75 / 151.56 / 171.2
  RTX 3070 Ti           26.58 / 194.47 / 218.65
  RTX 3080              26.72 / 252.1 / 298.72
  RTX 3080 Ti           36.08 / 291.63 / 342.82
  RTX 3090              26.72 / 289.51 / 343.44
  RTX 4070              8.49 / 110.68 / 131.7
  RTX 4070 SUPER        8.48 / 130.5 / 162.42
  RTX 4070 Ti SUPER     12.23 / 159.79 / 201.74
  RTX 4080              13.03 / 151.75 / 196.34
  RTX 4080 SUPER        9.36 / 143.25 / 188.53
  RTX 4090              13.68 / 186.8 / 260.44
Blender 4.0 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        53 / 64.74 / 69
  RTX 2070              56 / 64.98 / 69
  RTX 2070 SUPER        28 / 45.14 / 53
  RTX 2080              63 / 73.27 / 77
  RTX 2080 SUPER        48 / 56.67 / 60
  TITAN RTX             59 / 69.78 / 74
  RTX 3060              44 / 56.93 / 60
  RTX 3060 Ti           51 / 62.6 / 66
  RTX 3070              51 / 63.09 / 67
  RTX 3070 Ti           55 / 69.06 / 74
  RTX 3080              59 / 68.62 / 73
  RTX 3080 Ti           61 / 70.02 / 75
  RTX 3090              54 / 62.63 / 67
  RTX 4060              45 / 55.73 / 60
  RTX 4070              39 / 48.08 / 52
  RTX 4070 SUPER        42 / 53.71 / 60
  RTX 4070 Ti SUPER     37 / 47.15 / 52
  RTX 4080              35 / 44.4 / 50
  RTX 4080 SUPER        36 / 44.58 / 50
  RTX 4090              37 / 44.69 / 50

Blender 4.0 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.49 / 134.54 / 157.43
  RTX 2070              7.12 / 127.93 / 160.28
  RTX 2070 SUPER        14.9 / 131.99 / 157.65
  RTX 2080              16.39 / 146.85 / 173.43
  RTX 2080 SUPER        11.6 / 134.84 / 159.83
  TITAN RTX             13.65 / 203.54 / 241.89
  RTX 3060              20 / 106.45 / 121.73
  RTX 3060 Ti           21.39 / 138.19 / 162.16
  RTX 3070              16.96 / 144.97 / 175.19
  RTX 3070 Ti           26.76 / 190.02 / 226.65
  RTX 3080              24.56 / 237.74 / 296.96
  RTX 3080 Ti           30.21 / 270.83 / 341.27
  RTX 3090              21.92 / 270.17 / 344.9
  RTX 4070              11.52 / 109.03 / 142.71
  RTX 4070 SUPER        8.39 / 126.14 / 178.28
  RTX 4070 Ti SUPER     13.99 / 153.83 / 217.23
  RTX 4080              12.88 / 143.19 / 213.17
  RTX 4080 SUPER        9.7 / 134.93 / 206.29
  RTX 4090              13.5 / 170.58 / 279.86
Blender 4.0 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        52 / 62.18 / 67
  RTX 2070              55 / 64.08 / 68
  RTX 2070 SUPER        47 / 55.45 / 59
  RTX 2080              63 / 72.47 / 77
  RTX 2080 SUPER        46 / 54.2 / 58
  TITAN RTX             59 / 68.54 / 74
  RTX 3060              44 / 55.12 / 59
  RTX 3060 Ti           50 / 60.64 / 65
  RTX 3070              51 / 61.19 / 66
  RTX 3070 Ti           55 / 66.57 / 71
  RTX 3080              59 / 66.66 / 71
  RTX 3080 Ti           60 / 68.91 / 74
  RTX 3090              54 / 61.21 / 65
  RTX 4060              46 / 55.51 / 60
  RTX 4070              39 / 47.68 / 54
  RTX 4070 SUPER        41 / 53.49 / 62
  RTX 4070 Ti SUPER     38 / 47.47 / 55
  RTX 4080              35 / 44.16 / 52
  RTX 4080 SUPER        38 / 45.63 / 53
  RTX 4090              38 / 44.06 / 52

Blender 4.0 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        18.42 / 123.63 / 150.78
  RTX 2070              7.85 / 122.74 / 148.07
  RTX 2070 SUPER        17.21 / 132.12 / 162.53
  RTX 2080              16.69 / 149.41 / 182.76
  RTX 2080 SUPER        11.09 / 139.65 / 178.33
  TITAN RTX             13.94 / 193.84 / 265.61
  RTX 3060              20.55 / 96.54 / 112.82
  RTX 3060 Ti           21.57 / 124.66 / 156.89
  RTX 3070              17.14 / 132.7 / 174.2
  RTX 3070 Ti           25.97 / 167.57 / 213.76
  RTX 3080              24.54 / 211.76 / 294.58
  RTX 3080 Ti           30 / 245.8 / 344.36
  RTX 3090              29.56 / 240.22 / 342.6
  RTX 4070              10.85 / 92.23 / 133.69
  RTX 4070 SUPER        8.32 / 107.56 / 168.28
  RTX 4070 Ti SUPER     12.24 / 128.42 / 205.2
  RTX 4080              14.19 / 117.89 / 202.49
  RTX 4080 SUPER        9.63 / 110.01 / 197.47
  RTX 4090              13.59 / 138.9 / 287.14
Blender 4.0 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        45 / 59.18 / 65
  RTX 2070              46 / 60.48 / 67
  RTX 2070 SUPER        42 / 53.06 / 58
  RTX 2080              59 / 71.93 / 77
  RTX 2080 SUPER        38 / 49.8 / 56
  TITAN RTX             55 / 67.12 / 72
  RTX 3060              43 / 56.34 / 59
  RTX 3060 Ti           46 / 59.22 / 64
  RTX 3070              45 / 59 / 65
  RTX 3070 Ti           53 / 66.74 / 72
  RTX 3080              56 / 66.19 / 72
  RTX 3080 Ti           57 / 67.95 / 74
  RTX 3090              49 / 60.32 / 66
  RTX 4060              44 / 55.75 / 61
  RTX 4070              37 / 46.6 / 52
  RTX 4070 SUPER        39 / 51.86 / 60
  RTX 4070 Ti SUPER     37 / 47.8 / 54
  RTX 4080              34 / 43.5 / 50
  RTX 4080 SUPER        41 / 49.47 / 56
  RTX 4090              36 / 44.39 / 51

Blender 4.0 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        17.77 / 132.68 / 152.06
  RTX 2070              7.12 / 127.09 / 147.26
  RTX 2070 SUPER        15.09 / 137.82 / 161.36
  RTX 2080              15.78 / 148.17 / 175.02
  RTX 2080 SUPER        10.59 / 136.92 / 162.63
  TITAN RTX             16.8 / 201.35 / 246.24
  RTX 3060              19.97 / 103.94 / 116.46
  RTX 3060 Ti           20.66 / 134.94 / 157.96
  RTX 3070              17.04 / 142.73 / 170.22
  RTX 3070 Ti           24.65 / 185.74 / 220.79
  RTX 3080              23.16 / 233.44 / 293.62
  RTX 3080 Ti           27.96 / 273.57 / 347.32
  RTX 3090              29.21 / 268.38 / 345.81
  RTX 4070              10.49 / 106.06 / 141.13
  RTX 4070 SUPER        8.31 / 124.68 / 176.14
  RTX 4070 Ti SUPER     9.01 / 149.02 / 215.18
  RTX 4080              12.36 / 139.12 / 210.84
  RTX 4080 SUPER        9.7 / 134.08 / 203.79
  RTX 4090              13.82 / 168.56 / 279.1
Blender 4.0 - GPU Temperature Monitor (Celsius; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        33 / 48.16 / 57
  RTX 2070              33 / 45.58 / 54
  RTX 2070 SUPER        29 / 43.62 / 51
  RTX 2080              51 / 62.02 / 69
  RTX 2080 SUPER        29 / 39.52 / 45
  TITAN RTX             40 / 57.94 / 67
  RTX 3060              46 / 54.99 / 64
  RTX 3060 Ti           40 / 51.57 / 57
  RTX 3070              39 / 49.91 / 56
  RTX 3070 Ti           48 / 60.5 / 66
  RTX 3080              49 / 58.25 / 65
  RTX 3080 Ti           42 / 58.78 / 68
  RTX 3090              45 / 53.18 / 59
  RTX 4060              34 / 52.84 / 63
  RTX 4070              33 / 41.81 / 47
  RTX 4070 SUPER        38 / 47.52 / 54
  RTX 4070 Ti SUPER     40 / 46.24 / 54
  RTX 4080              32 / 39.7 / 46
  RTX 4080 SUPER        38 / 44.42 / 51
  RTX 4090              38 / 43 / 49

Blender 4.0 - GPU Power Consumption Monitor (Watts; fewer is better)

  GPU                   Min / Avg / Max
  RTX 2060 SUPER        17.4 / 108.37 / 143.96
  RTX 2070              6.75 / 99.62 / 142.52
  RTX 2070 SUPER        16.03 / 110.79 / 147.01
  RTX 2080              14.44 / 118.67 / 158.14
  RTX 2080 SUPER        10.27 / 108.81 / 145.89
  TITAN RTX             9.65 / 157.16 / 228.79
  RTX 3060              20.48 / 88.92 / 113.34
  RTX 3060 Ti           19.76 / 105.23 / 145.26
  RTX 3070              16.76 / 108.79 / 154.85
  RTX 3070 Ti           23.45 / 148.21 / 205.54
  RTX 3080              22.94 / 176.52 / 269.6
  RTX 3080 Ti           22.77 / 205.93 / 318.74
  RTX 3090              27.58 / 197.7 / 314.94
  RTX 4070              10.12 / 76.38 / 126
  RTX 4070 SUPER        7.8 / 85.16 / 151.78
  RTX 4070 Ti SUPER     11.52 / 105.77 / 190.05
  RTX 4080              12.07 / 93.39 / 182.7
  RTX 4080 SUPER        9.52 / 84.64 / 174.52
  RTX 4090              13.62 / 111.42 / 240.45
Phoronix Test Suite v10.8.4