TR7970X_Friday

AMD Ryzen Threadripper 7970X 32-Cores testing with an ASRock TRX50 WS (10.01 BIOS) and ASUS NVIDIA GeForce RTX 4090 24GB on Debian 12 via the Phoronix Test Suite.

Compare your own system(s) to this result file with the Phoronix Test Suite by running the command: phoronix-test-suite benchmark 2407142-SHAM-TR7970X61
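
A minimal command sketch of that comparison workflow, assuming the Phoronix Test Suite client is already installed on the system being compared (the result ID is the one quoted above):

    # Confirm the phoronix-test-suite client is available
    phoronix-test-suite version

    # Fetch this result file from OpenBenchmarking.org, run the same tests locally,
    # and compare your numbers against this result
    phoronix-test-suite benchmark 2407142-SHAM-TR7970X61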

Result Identifier: Friday
Date Run: July 13
Test Duration: 1 Day, 5 Hours, 58 Minutes


TR7970X_Friday Suite 1.0.0
System Test suite extracted from TR7970X_Friday.

pts/tscp-1.2.2 AI Chess Performance
pts/nettle-1.1.0 aes256 Test: aes256
pts/nettle-1.1.0 chacha Test: chacha
pts/nettle-1.1.0 sha512 Test: sha512
pts/nettle-1.1.0 poly1305-aes Test: poly1305-aes
pts/crafty-1.4.5 Elapsed Time
pts/ramspeed-1.4.3 ADD -b 3 Type: Add - Benchmark: Integer
pts/ramspeed-1.4.3 COPY -b 3 Type: Copy - Benchmark: Integer
pts/ramspeed-1.4.3 SCALE -b 3 Type: Scale - Benchmark: Integer
pts/ramspeed-1.4.3 TRIAD -b 3 Type: Triad - Benchmark: Integer
pts/ramspeed-1.4.3 AVERAGE -b 3 Type: Average - Benchmark: Integer
pts/ramspeed-1.4.3 ADD -b 6 Type: Add - Benchmark: Floating Point
pts/ramspeed-1.4.3 COPY -b 6 Type: Copy - Benchmark: Floating Point
pts/ramspeed-1.4.3 SCALE -b 6 Type: Scale - Benchmark: Floating Point
pts/ramspeed-1.4.3 TRIAD -b 6 Type: Triad - Benchmark: Floating Point
pts/ramspeed-1.4.3 AVERAGE -b 6 Type: Average - Benchmark: Floating Point
pts/stream-1.3.4 Copy Type: Copy
pts/stream-1.3.4 Scale Type: Scale
pts/stream-1.3.4 Triad Type: Triad
pts/stream-1.3.4 Add Type: Add
pts/cython-bench-1.1.0 NQUEENS Test: N-Queens
pts/botan-1.6.0 AES-256 Test: AES-256
pts/botan-1.6.0 AES-256 Test: AES-256 - Decrypt
pts/botan-1.6.0 Blowfish Test: Blowfish
pts/botan-1.6.0 Blowfish Test: Blowfish - Decrypt
pts/botan-1.6.0 ChaCha20Poly1305 Test: ChaCha20Poly1305
pts/botan-1.6.0 ChaCha20Poly1305 Test: ChaCha20Poly1305 - Decrypt
pts/cachebench-1.2.0 -r Test: Read
pts/cachebench-1.2.0 -w Test: Write
pts/cachebench-1.2.0 -b Test: Read / Modify / Write
pts/glmark2-1.4.0 -s 800x600 Resolution: 800 x 600
pts/glmark2-1.4.0 -s 1920x1080 Resolution: 1920 x 1080
pts/unigine-heaven-1.6.5 -video_width 800 -video_height 600 -video_fullscreen 0 -video_app opengl Resolution: 800 x 600 - Mode: Windowed - Renderer: OpenGL
pts/unigine-heaven-1.6.5 -video_width 1920 -video_height 1080 -video_fullscreen 0 -video_app opengl Resolution: 1920 x 1080 - Mode: Windowed - Renderer: OpenGL
pts/unigine-valley-1.1.9 -video_width 800 -video_height 600 -video_fullscreen 0 -video_app opengl Resolution: 800 x 600 - Mode: Windowed - Renderer: OpenGL
pts/unigine-valley-1.1.9 -video_width 1920 -video_height 1080 -video_fullscreen 0 -video_app opengl Resolution: 1920 x 1080 - Mode: Windowed - Renderer: OpenGL
pts/fio-2.1.0 randread posixaio 1 1m 1 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 4k 1 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 8m 1 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 16k 1 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 1m 16 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 1m 64 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 4k 16 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 4k 64 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 64k 1 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 8m 16 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 8m 64 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 1m 1 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 4k 1 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 8m 1 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 16k 16 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 16k 64 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 64k 16 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randread posixaio 1 64k 64 Type: Random Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 16k 1 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 1m 16 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 1m 64 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 4k 16 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 4k 64 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 64k 1 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 8m 16 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 8m 64 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 16k 16 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 16k 64 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 64k 16 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 randwrite posixaio 1 64k 64 Type: Random Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 1m 1 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 4k 1 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 8m 1 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 16k 1 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 1m 16 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 1m 64 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 4k 16 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 4k 64 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 64k 1 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 8m 16 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 8m 64 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 1m 1 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 4k 1 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 8m 1 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 16k 16 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 16k 64 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 64k 16 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 read posixaio 1 64k 64 Type: Sequential Read - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 16k 1 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 1m 16 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 1m 64 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 1MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 4k 16 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 4k 64 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 4KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 64k 1 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 1 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 8m 16 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 8m 64 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 8MB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 16k 16 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 16k 64 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 16KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 64k 16 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 16 - Disk Target: Default Test Directory
pts/fio-2.1.0 write posixaio 1 64k 64 Type: Sequential Write - Engine: POSIX AIO - Direct: Yes - Block Size: 64KB - Job Count: 64 - Disk Target: Default Test Directory
pts/fs-mark-1.0.3 -L 20 -s 1048576 -n 1000 Test: 1000 Files, 1MB Size
pts/fs-mark-1.0.3 -L 5 -s 1048576 -n 5000 -t 4 Test: 5000 Files, 1MB Size, 4 Threads
pts/fs-mark-1.0.3 -L 10 -s 1048576 -n 4000 -D 32 Test: 4000 Files, 32 Sub Dirs, 1MB Size
pts/fs-mark-1.0.3 -L 100 -s 1048576 -n 1000 -S 0 Test: 1000 Files, 1MB Size, No Sync/FSync
pts/lczero-1.7.0 -b blas Backend: BLAS
pts/numpy-1.2.1
pts/namd-1.3.2 ../f1atpase/f1atpase.namd Input: ATPase with 327,506 Atoms
pts/namd-1.3.2 ../stmv/stmv.namd Input: STMV with 1,066,628 Atoms
pts/pmbench-1.0.2 -j 1 -r 50 Concurrent Worker Threads: 1 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 2 -r 50 Concurrent Worker Threads: 2 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 4 -r 50 Concurrent Worker Threads: 4 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 8 -r 50 Concurrent Worker Threads: 8 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 16 -r 50 Concurrent Worker Threads: 16 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 32 -r 50 Concurrent Worker Threads: 32 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 64 -r 50 Concurrent Worker Threads: 64 - Read-Write Ratio: 50%
pts/pmbench-1.0.2 -j 1 -r 100 Concurrent Worker Threads: 1 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 2 -r 100 Concurrent Worker Threads: 2 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 4 -r 100 Concurrent Worker Threads: 4 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 8 -r 100 Concurrent Worker Threads: 8 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 1 -r 0 Concurrent Worker Threads: 1 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 16 -r 100 Concurrent Worker Threads: 16 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 2 -r 0 Concurrent Worker Threads: 2 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 32 -r 100 Concurrent Worker Threads: 32 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 4 -r 0 Concurrent Worker Threads: 4 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 64 -r 100 Concurrent Worker Threads: 64 - Read-Write Ratio: 100% Reads
pts/pmbench-1.0.2 -j 8 -r 0 Concurrent Worker Threads: 8 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 16 -r 0 Concurrent Worker Threads: 16 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 32 -r 0 Concurrent Worker Threads: 32 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 64 -r 0 Concurrent Worker Threads: 64 - Read-Write Ratio: 100% Writes
pts/pmbench-1.0.2 -j 1 -r 80 Concurrent Worker Threads: 1 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 2 -r 80 Concurrent Worker Threads: 2 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 4 -r 80 Concurrent Worker Threads: 4 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 8 -r 80 Concurrent Worker Threads: 8 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 16 -r 80 Concurrent Worker Threads: 16 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 32 -r 80 Concurrent Worker Threads: 32 - Read-Write Ratio: 80% Reads 20% Writes
pts/pmbench-1.0.2 -j 64 -r 80 Concurrent Worker Threads: 64 - Read-Write Ratio: 80% Reads 20% Writes
pts/n-queens-1.2.1 Elapsed Time
pts/aircrack-ng-1.3.0
pts/build-ffmpeg-7.0.0 Time To Compile
pts/build-gdb-1.1.0 Time To Compile
pts/build-imagemagick-1.7.2 Time To Compile
pts/build-mplayer-1.5.0 Time To Compile
pts/build-apache-1.6.1 Time To Compile
pts/stockfish-1.5.0 Chess Benchmark
pts/compress-7zip-1.11.0 Test: Compression Rating
pts/compress-7zip-1.11.0 Test: Decompression Rating
pts/john-the-ripper-1.8.0 --format=bcrypt Test: bcrypt
pts/john-the-ripper-1.8.0 --format=wpapsk Test: WPA PSK
pts/john-the-ripper-1.8.0 --format=bcrypt Test: Blowfish
pts/john-the-ripper-1.8.0 --format=HMAC-SHA512 Test: HMAC-SHA512
pts/john-the-ripper-1.8.0 --format=md5crypt Test: MD5
pts/build-llvm-1.5.0 Ninja Build System: Ninja
pts/build-llvm-1.5.0 Build System: Unix Makefiles
pts/build-php-1.7.0 Time To Compile
pts/compress-zstd-1.6.0 -b3 Compression Level: 3 - Compression Speed
pts/compress-zstd-1.6.0 -b19 Compression Level: 19 - Compression Speed
pts/compress-zstd-1.6.0 -b19 Compression Level: 19 - Decompression Speed
pts/compress-zstd-1.6.0 -b3 --long Compression Level: 3, Long Mode - Compression Speed
pts/compress-zstd-1.6.0 -b3 --long Compression Level: 3, Long Mode - Decompression Speed
pts/compress-zstd-1.6.0 -b19 --long Compression Level: 19, Long Mode - Compression Speed
pts/compress-zstd-1.6.0 -b19 --long Compression Level: 19, Long Mode - Decompression Speed
pts/asmfish-1.1.2 1024 Hash Memory, 26 Depth
pts/m-queens-1.1.0 Time To Solve
pts/build-gcc-1.4.0 Time To Compile
pts/build-linux-kernel-1.16.0 defconfig Build: defconfig
pts/build-linux-kernel-1.16.0 allmodconfig Build: allmodconfig
pts/sysbench-1.1.0 memory run Test: RAM / Memory
pts/sysbench-1.1.0 cpu run Test: CPU
pts/x264-2.7.0 Bosphorus_3840x2160.y4m Video Input: Bosphorus 4K
pts/x264-2.7.0 Bosphorus_1920x1080_120fps_420_8bit_YUV.y4m Video Input: Bosphorus 1080p
pts/blender-4.1.0 -b ../bmw27_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: BMW27 - Compute: CPU-Only
pts/blender-4.1.0 -b ../bmw27_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: BMW27 - Compute: NVIDIA CUDA
pts/blender-4.1.0 -b ../junkshop.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: Junkshop - Compute: CPU-Only
pts/blender-4.1.0 -b ../classroom_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: Classroom - Compute: CPU-Only
pts/blender-4.1.0 -b ../fishy_cat_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: Fishy Cat - Compute: CPU-Only
pts/blender-4.1.0 -b ../barbershop_interior_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: Barbershop - Compute: CPU-Only
pts/blender-4.1.0 -b ../junkshop.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: Junkshop - Compute: NVIDIA CUDA
pts/blender-4.1.0 -b ../classroom_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: Classroom - Compute: NVIDIA CUDA
pts/blender-4.1.0 -b ../fishy_cat_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: Fishy Cat - Compute: NVIDIA CUDA
pts/blender-4.1.0 -b ../barbershop_interior_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: Barbershop - Compute: NVIDIA CUDA
pts/blender-4.1.0 -b ../pavillon_barcelone_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CPU Blend File: Pabellon Barcelona - Compute: CPU-Only
pts/blender-4.1.0 -b ../pavillon_barcelone_gpu.blend -o output.test -x 1 -F JPEG -f 1 -- --cycles-device CUDA Blend File: Pabellon Barcelona - Compute: NVIDIA CUDA
pts/povray-1.2.1 Trace Time
pts/build-godot-4.0.0 Time To Compile
pts/build2-1.2.0 Time To Compile
pts/build-python-1.0.0 Build Configuration: Default
pts/build-python-1.0.0 --enable-optimizations --with-lto Build Configuration: Released Build, PGO + LTO Optimized
pts/build-eigen-1.1.0 Time To Compile
pts/build-erlang-1.2.0 Time To Compile
pts/build-gem5-1.1.0 Time To Compile
pts/build-mesa-1.1.0 Time To Compile
pts/build-nodejs-1.4.0 Time To Compile
pts/build-wasmer-1.2.0 Time To Compile
pts/hashcat-1.1.1 -m 0 Benchmark: MD5
pts/hashcat-1.1.1 -m 100 Benchmark: SHA1
pts/hashcat-1.1.1 -m 11600 Benchmark: 7-Zip
pts/hashcat-1.1.1 -m 1700 Benchmark: SHA-512
pts/pyperformance-1.1.0 go Benchmark: go
pts/pyperformance-1.1.0 chaos Benchmark: chaos
pts/pyperformance-1.1.0 float Benchmark: float
pts/pyperformance-1.1.0 nbody Benchmark: nbody
pts/pyperformance-1.1.0 pathlib Benchmark: pathlib
pts/pyperformance-1.1.0 raytrace Benchmark: raytrace
pts/pyperformance-1.1.0 xml_etree Benchmark: xml_etree
pts/pyperformance-1.1.0 gc_collect Benchmark: gc_collect
pts/pyperformance-1.1.0 json_loads Benchmark: json_loads
pts/pyperformance-1.1.0 crypto_pyaes Benchmark: crypto_pyaes
pts/pyperformance-1.1.0 async_tree_io Benchmark: async_tree_io
pts/pyperformance-1.1.0 regex_compile Benchmark: regex_compile
pts/pyperformance-1.1.0 python_startup Benchmark: python_startup
pts/pyperformance-1.1.0 asyncio_tcp_ssl Benchmark: asyncio_tcp_ssl
pts/pyperformance-1.1.0 django_template Benchmark: django_template
pts/pyperformance-1.1.0 asyncio_websockets Benchmark: asyncio_websockets
pts/pyperformance-1.1.0 pickle_pure_python Benchmark: pickle_pure_python
pts/nginx-3.0.1 -c 1 Connections: 1
pts/nginx-3.0.1 -c 100 Connections: 100
pts/nginx-3.0.1 -c 1000 Connections: 1000
pts/nginx-3.0.1 -c 4000 Connections: 4000
pts/openssl-3.3.0 sha256 Algorithm: SHA256
pts/openssl-3.3.0 sha512 Algorithm: SHA512
pts/openssl-3.3.0 rsa4096 Algorithm: RSA4096
pts/openssl-3.3.0 -evp chacha20 Algorithm: ChaCha20
pts/openssl-3.3.0 -evp aes-128-gcm Algorithm: AES-128-GCM
pts/openssl-3.3.0 -evp aes-256-gcm Algorithm: AES-256-GCM
pts/node-express-loadtest-1.0.1
pts/phpbench-1.1.6 PHP Benchmark Suite
pts/hadoop-1.0.0 -op open -threads 20 -files 100000 Operation: Open - Threads: 20 - Files: 100000
pts/hadoop-1.0.0 -op open -threads 100 -files 100000 Operation: Open - Threads: 100 - Files: 100000
pts/hadoop-1.0.0 -op create -threads 20 -files 100000 Operation: Create - Threads: 20 - Files: 100000
pts/hadoop-1.0.0 -op delete -threads 20 -files 100000 Operation: Delete - Threads: 20 - Files: 100000
pts/hadoop-1.0.0 -op open -threads 1000 -files 100000 Operation: Open - Threads: 1000 - Files: 100000
pts/hadoop-1.0.0 -op open -threads 20 -files 10000000 Operation: Open - Threads: 20 - Files: 10000000
pts/hadoop-1.0.0 -op rename -threads 20 -files 100000 Operation: Rename - Threads: 20 - Files: 100000
pts/hadoop-1.0.0 -op create -threads 100 -files 100000 Operation: Create - Threads: 100 - Files: 100000
pts/hadoop-1.0.0 -op delete -threads 100 -files 100000 Operation: Delete - Threads: 100 - Files: 100000
pts/hadoop-1.0.0 -op open -threads 100 -files 10000000 Operation: Open - Threads: 100 - Files: 10000000
pts/hadoop-1.0.0 -op rename -threads 100 -files 100000 Operation: Rename - Threads: 100 - Files: 100000
pts/hadoop-1.0.0 -op create -threads 1000 -files 100000 Operation: Create - Threads: 1000 - Files: 100000
pts/hadoop-1.0.0 -op create -threads 20 -files 10000000 Operation: Create - Threads: 20 - Files: 10000000
pts/hadoop-1.0.0 -op delete -threads 1000 -files 100000 Operation: Delete - Threads: 1000 - Files: 100000
pts/hadoop-1.0.0 -op delete -threads 20 -files 10000000 Operation: Delete - Threads: 20 - Files: 10000000
pts/hadoop-1.0.0 -op open -threads 1000 -files 10000000 Operation: Open - Threads: 1000 - Files: 10000000
pts/hadoop-1.0.0 -op rename -threads 1000 -files 100000 Operation: Rename - Threads: 1000 - Files: 100000
pts/hadoop-1.0.0 -op rename -threads 20 -files 10000000 Operation: Rename - Threads: 20 - Files: 10000000
pts/hadoop-1.0.0 -op create -threads 100 -files 10000000 Operation: Create - Threads: 100 - Files: 10000000
pts/hadoop-1.0.0 -op delete -threads 100 -files 10000000 Operation: Delete - Threads: 100 - Files: 10000000
pts/hadoop-1.0.0 -op rename -threads 100 -files 10000000 Operation: Rename - Threads: 100 - Files: 10000000
pts/hadoop-1.0.0 -op create -threads 1000 -files 10000000 Operation: Create - Threads: 1000 - Files: 10000000
pts/hadoop-1.0.0 -op delete -threads 1000 -files 10000000 Operation: Delete - Threads: 1000 - Files: 10000000
pts/hadoop-1.0.0 -op rename -threads 1000 -files 10000000 Operation: Rename - Threads: 1000 - Files: 10000000
pts/hadoop-1.0.0 -op fileStatus -threads 20 -files 100000 Operation: File Status - Threads: 20 - Files: 100000
pts/hadoop-1.0.0 -op fileStatus -threads 100 -files 100000 Operation: File Status - Threads: 100 - Files: 100000
pts/hadoop-1.0.0 -op fileStatus -threads 1000 -files 100000 Operation: File Status - Threads: 1000 - Files: 100000
pts/hadoop-1.0.0 -op fileStatus -threads 20 -files 10000000 Operation: File Status - Threads: 20 - Files: 10000000
pts/hadoop-1.0.0 -op fileStatus -threads 100 -files 10000000 Operation: File Status - Threads: 100 - Files: 10000000
pts/hadoop-1.0.0 -op fileStatus -threads 1000 -files 10000000 Operation: File Status - Threads: 1000 - Files: 10000000
pts/apache-iotdb-1.2.0 800 100 800 400 Device Count: 800 - Batch Size Per Write: 100 - Sensor Count: 800 - Client Number: 400
pts/clickhouse-1.2.0 100M Rows Hits Dataset, First Run / Cold Cache
pts/clickhouse-1.2.0 100M Rows Hits Dataset, Second Run
pts/clickhouse-1.2.0 100M Rows Hits Dataset, Third Run
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 10 --concurrency 128 Workload: KV, 10% Reads - Concurrency: 128
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 50 --concurrency 128 Workload: KV, 50% Reads - Concurrency: 128
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 95 --concurrency 128 Workload: KV, 95% Reads - Concurrency: 128
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 10 --concurrency 1024 Workload: KV, 10% Reads - Concurrency: 1024
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 50 --concurrency 1024 Workload: KV, 50% Reads - Concurrency: 1024
pts/cockroach-1.0.2 kv --ramp 10s --read-percent 95 --concurrency 1024 Workload: KV, 95% Reads - Concurrency: 1024
pts/redis-1.4.0 -t get -c 50 Test: GET - Parallel Connections: 50
pts/redis-1.4.0 -t set -c 50 Test: SET - Parallel Connections: 50
pts/redis-1.4.0 -t get -c 500 Test: GET - Parallel Connections: 500
pts/redis-1.4.0 -t lpop -c 50 Test: LPOP - Parallel Connections: 50
pts/redis-1.4.0 -t set -c 500 Test: SET - Parallel Connections: 500
pts/redis-1.4.0 -t get -c 1000 Test: GET - Parallel Connections: 1000
pts/redis-1.4.0 -t lpop -c 500 Test: LPOP - Parallel Connections: 500
pts/redis-1.4.0 -t lpush -c 50 Test: LPUSH - Parallel Connections: 50
pts/redis-1.4.0 -t set -c 1000 Test: SET - Parallel Connections: 1000
pts/redis-1.4.0 -t lpop -c 1000 Test: LPOP - Parallel Connections: 1000
pts/redis-1.4.0 -t lpush -c 500 Test: LPUSH - Parallel Connections: 500
pts/redis-1.4.0 -t lpush -c 1000 Test: LPUSH - Parallel Connections: 1000
pts/sqlite-2.2.0 1 Threads / Copies: 1
pts/sqlite-2.2.0 16 Threads / Copies: 16
pts/sqlite-2.2.0 64 Threads / Copies: 64
pts/leveldb-1.1.0 --benchmarks=readhot --num=1000000 Benchmark: Hot Read
pts/leveldb-1.1.0 --benchmarks=fillsync --num=1000000 Benchmark: Fill Sync
pts/leveldb-1.1.0 --benchmarks=overwrite --num=100000 Benchmark: Overwrite
pts/leveldb-1.1.0 --benchmarks=fillrandom --num=100000 Benchmark: Random Fill
pts/leveldb-1.1.0 --benchmarks=readrandom --num=1000000 Benchmark: Random Read
pts/leveldb-1.1.0 --benchmarks=seekrandom --num=1000000 Benchmark: Seek Random
pts/leveldb-1.1.0 --benchmarks=deleterandom --num=500000 Benchmark: Random Delete
pts/leveldb-1.1.0 --benchmarks=fillseq --num=500000 Benchmark: Sequential Fill
pts/sqlite-speedtest-1.0.1 Timed Time - Size 1,000
pts/cassandra-1.2.0 WRITE Test: Writes
pts/cassandra-1.2.0 MIXED_1_1 Test: Mixed 1:1
pts/cassandra-1.2.0 MIXED_1_3 Test: Mixed 1:3
pts/pgbench-1.14.0 -s 1 -c 1 -S Scaling Factor: 1 - Clients: 1 - Mode: Read Only
pts/pgbench-1.14.0 -s 1 -c 1 Scaling Factor: 1 - Clients: 1 - Mode: Read Write
pts/pgbench-1.14.0 -s 1 -c 100 -S Scaling Factor: 1 - Clients: 100 - Mode: Read Only
pts/pgbench-1.14.0 -s 100 -c 1 -S Scaling Factor: 100 - Clients: 1 - Mode: Read Only
pts/pgbench-1.14.0 -s 1 -c 100 Scaling Factor: 1 - Clients: 100 - Mode: Read Write
pts/pgbench-1.14.0 -s 1 -c 1000 -S Scaling Factor: 1 - Clients: 1000 - Mode: Read Only
pts/pgbench-1.14.0 -s 100 -c 1 Scaling Factor: 100 - Clients: 1 - Mode: Read Write
pts/pgbench-1.14.0 -s 1000 -c 1 -S Scaling Factor: 1000 - Clients: 1 - Mode: Read Only
pts/pgbench-1.14.0 -s 1 -c 1000 Scaling Factor: 1 - Clients: 1000 - Mode: Read Write
pts/pgbench-1.14.0 -s 100 -c 100 -S Scaling Factor: 100 - Clients: 100 - Mode: Read Only
pts/pgbench-1.14.0 -s 1000 -c 1 Scaling Factor: 1000 - Clients: 1 - Mode: Read Write
pts/pgbench-1.14.0 -s 10000 -c 1 -S Scaling Factor: 10000 - Clients: 1 - Mode: Read Only
pts/pgbench-1.14.0 -s 100 -c 100 Scaling Factor: 100 - Clients: 100 - Mode: Read Write
pts/pgbench-1.14.0 -s 100 -c 1000 -S Scaling Factor: 100 - Clients: 1000 - Mode: Read Only
pts/pgbench-1.14.0 -s 1000 -c 100 -S Scaling Factor: 1000 - Clients: 100 - Mode: Read Only
pts/pgbench-1.14.0 -s 10000 -c 1 Scaling Factor: 10000 - Clients: 1 - Mode: Read Write
pts/pgbench-1.14.0 -s 100 -c 1000 Scaling Factor: 100 - Clients: 1000 - Mode: Read Write
pts/pgbench-1.14.0 -s 1000 -c 100 Scaling Factor: 1000 - Clients: 100 - Mode: Read Write
pts/pgbench-1.14.0 -s 1000 -c 1000 -S Scaling Factor: 1000 - Clients: 1000 - Mode: Read Only
pts/pgbench-1.14.0 -s 10000 -c 100 -S Scaling Factor: 10000 - Clients: 100 - Mode: Read Only
pts/pgbench-1.14.0 -s 1000 -c 1000 Scaling Factor: 1000 - Clients: 1000 - Mode: Read Write
pts/pgbench-1.14.0 -s 10000 -c 100 Scaling Factor: 10000 - Clients: 100 - Mode: Read Write
pts/pgbench-1.14.0 -s 10000 -c 1000 -S Scaling Factor: 10000 - Clients: 1000 - Mode: Read Only
pts/pgbench-1.14.0 -s 10000 -c 1000 Scaling Factor: 10000 - Clients: 1000 - Mode: Read Write
pts/node-web-tooling-1.0.1
pts/simdjson-2.0.1 kostya Throughput Test: Kostya
pts/simdjson-2.0.1 top_tweet Throughput Test: TopTweet
pts/simdjson-2.0.1 large_random Throughput Test: LargeRandom
pts/simdjson-2.0.1 partial_tweets Throughput Test: PartialTweets
pts/simdjson-2.0.1 distinct_user_id Throughput Test: DistinctUserID
pts/node-octane-1.0.1
pts/compress-gzip-1.2.0 Linux Source Tree Archiving To .tar.gz
pts/git-1.1.0 Time To Complete Common Git Commands
pts/pybench-1.1.3 Total For Average Test Times
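
The individual test profiles listed above can also be installed and run on their own with the same client, independent of this suite. A minimal sketch, assuming a working Phoronix Test Suite install; pts/stream and pts/compress-zstd are simply examples picked from the list:

    # Run a single test profile from the suite contents (installs it first if needed)
    phoronix-test-suite benchmark pts/stream

    # Or install and run as separate steps
    phoronix-test-suite install pts/compress-zstd
    phoronix-test-suite run pts/compress-zstd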