ocdcephbenchmarks

KVM QEMU testing on CentOS Linux 7 via the Phoronix Test Suite.

Compare your own system(s) to this result file with the Phoronix Test Suite by running:

    phoronix-test-suite benchmark 1805300-FO-OCDCEPHBE02
Result identifiers and run dates:

  local filesystem (May 27 2018)
  CEPH Jewel 3 OSDs (May 27 2018)
  Direct SSD io=native cache=none (May 27 2018)
  CEPH Jewel 1 OSD w/ external Journal (May 28 2018)
  CEPH Jewel 1 OSD (May 29 2018)
  CEPH jewel 3 OSDs replica 3 (May 29 2018)
  CEPH luminous bluestore 3 OSDs replica 3 (May 30 2018)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none (May 30 2018)
  CEPH luminous bluestore 3 OSDs replica 1 (May 30 2018)

HTML result view exported from: https://openbenchmarking.org/result/1805300-FO-OCDCEPHBE02&grs&rdt&rro.

System details (common to all result identifiers unless noted):

  Processor:         8 x QEMU Virtual 2.5+ @ 2.19GHz (8 Cores)
  Motherboard:       Red Hat KVM (1.11.0-2.el7 BIOS)
  Memory:            2 x 16384 MB RAM
  Disk:              28GB (some runs used 1024GB or 1788GB disks)
  Graphics:          cirrusdrmfb
  OS:                CentOS Linux 7
  Kernel:            3.10.0-862.3.2.el7.x86_64 (x86_64)
  Compiler:          GCC 4.8.5 20150623
  File-System:       xfs
  Screen Resolution: 1024x768
  System Layer:      KVM QEMU

Compiler Details: --build=x86_64-redhat-linux --disable-libgcj --disable-libunwind-exceptions --enable-__cxa_atexit --enable-bootstrap --enable-checking=release --enable-gnu-indirect-function --enable-gnu-unique-object --enable-initfini-array --enable-languages=c,c++,objc,obj-c++,java,fortran,ada,go,lto --enable-plugin --enable-shared --enable-threads=posix --mandir=/usr/share/man --with-arch_32=x86-64 --with-linker-hash-style=gnu --with-tune=generic
Disk Mount Options: attr2,inode64,noquota,relatime,rw,seclabel
Python Details: Python 2.7.5
Security Details: SELinux + KPTI + Load fences + Retpoline without IBPB Protection

Tests in this comparison (detailed results below):

  tiobench: 64MB Rand Write - 32 Threads
  sqlite: Timed SQLite Insertions
  dbench: 1, 12, 48, and 128 Clients
  fs-mark: 1000 Files, 1MB Size
  apache: Static Web Page Serving
  postmark: Disk Transaction Performance
  unpack-linux: linux-4.15.tar.xz
  compilebench: Read Compiled Tree, Initial Create, Compile
  pgbench: On-Disk - Normal Load - Read Write
  compress-gzip: Linux Source Tree Archiving To .tar.gz
  tiobench: 64MB Rand Read - 32 Threads
  aio-stress: Rand Write

Threaded I/O Tester

64MB Random Write - 32 Threads

MB/s, more is better (Threaded I/O Tester 20170503):

  CEPH luminous bluestore 3 OSDs replica 1:   255.32  (SE +/- 3.91, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   151.00  (SE +/- 9.35, N = 6)
  CEPH jewel 3 OSDs replica 3:                214.01  (SE +/- 2.37, N = 3)
  CEPH Jewel 1 OSD:                           299.60  (SE +/- 5.80, N = 6)
  CEPH Jewel 1 OSD w/ external Journal:       300.23  (SE +/- 1.00, N = 3)
  Direct SSD io=native cache=none:            555.54  (SE +/- 10.53, N = 3)
  CEPH Jewel 3 OSDs:                          337.00  (SE +/- 5.79, N = 3)
  local filesystem:                           958.96  (SE +/- 27.51, N = 6)

(CC) gcc options: -O2

SQLite

Timed SQLite Insertions

Seconds, fewer is better (SQLite 3.22):

  CEPH luminous bluestore 3 OSDs replica 1:                69.95   (SE +/- 1.07, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 107.78  (SE +/- 1.84, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:                109.48  (SE +/- 0.93, N = 3)
  CEPH jewel 3 OSDs replica 3:                             98.30   (SE +/- 0.38, N = 3)
  CEPH Jewel 1 OSD:                                        46.21   (SE +/- 0.10, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:                    45.10   (SE +/- 0.34, N = 3)
  Direct SSD io=native cache=none:                         17.29   (SE +/- 0.28, N = 6)
  CEPH Jewel 3 OSDs:                                       52.75   (SE +/- 0.77, N = 4)
  local filesystem:                                        20.61   (SE +/- 0.06, N = 3)

(CC) gcc options: -O2 -ldl -lpthread

Dbench

12 Clients

MB/s, more is better (Dbench 4.0):

  CEPH luminous bluestore 3 OSDs replica 1:                480.21   (SE +/- 1.35, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 351.46   (SE +/- 1.81, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:                344.71   (SE +/- 3.55, N = 3)
  CEPH jewel 3 OSDs replica 3:                             417.51   (SE +/- 0.66, N = 3)
  CEPH Jewel 1 OSD:                                        691.87   (SE +/- 2.75, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:                    773.20   (SE +/- 2.72, N = 3)
  Direct SSD io=native cache=none:                         800.77   (SE +/- 7.08, N = 3)
  CEPH Jewel 3 OSDs:                                       683.49   (SE +/- 1.93, N = 3)
  local filesystem:                                        1285.75  (SE +/- 4.49, N = 3)

(CC) gcc options: -lpopt -O2

Dbench

1 Client

MB/s, more is better (Dbench 4.0):

  CEPH luminous bluestore 3 OSDs replica 1:   67.09   (SE +/- 0.30, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   54.67   (SE +/- 0.39, N = 3)
  CEPH jewel 3 OSDs replica 3:                56.05   (SE +/- 2.01, N = 6)
  CEPH Jewel 1 OSD:                           98.67   (SE +/- 1.69, N = 4)
  CEPH Jewel 1 OSD w/ external Journal:       101.27  (SE +/- 1.69, N = 3)
  Direct SSD io=native cache=none:            197.93  (SE +/- 1.04, N = 3)
  CEPH Jewel 3 OSDs:                          82.85   (SE +/- 0.76, N = 3)
  local filesystem:                           179.02  (SE +/- 2.91, N = 3)

(CC) gcc options: -lpopt -O2

FS-Mark

1000 Files, 1MB Size

Files/s, more is better (FS-Mark 3.3):

  CEPH luminous bluestore 3 OSDs replica 1:                82.60   (SE +/- 1.25, N = 4)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 65.27   (SE +/- 0.83, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:                66.07   (SE +/- 0.64, N = 3)
  CEPH jewel 3 OSDs replica 3:                             61.93   (SE +/- 0.29, N = 3)
  CEPH Jewel 1 OSD:                                        83.53   (SE +/- 0.80, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:                    95.50   (SE +/- 1.35, N = 6)
  Direct SSD io=native cache=none:                         159.03  (SE +/- 1.29, N = 3)
  CEPH Jewel 3 OSDs:                                       87.98   (SE +/- 1.36, N = 5)
  local filesystem:                                        152.13  (SE +/- 4.84, N = 6)

(CC) gcc options: -static

Dbench

48 Clients

MB/s, more is better (Dbench 4.0):

  CEPH luminous bluestore 3 OSDs replica 1:   768.75   (SE +/- 3.97, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   679.26   (SE +/- 2.02, N = 3)
  CEPH jewel 3 OSDs replica 3:                712.22   (SE +/- 1.19, N = 3)
  CEPH Jewel 1 OSD:                           938.32   (SE +/- 8.61, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       1055.65  (SE +/- 2.21, N = 3)
  Direct SSD io=native cache=none:            1220.01  (SE +/- 2.82, N = 3)
  CEPH Jewel 3 OSDs:                          968.60   (SE +/- 7.36, N = 3)
  local filesystem:                           812.43   (SE +/- 98.18, N = 6)

(CC) gcc options: -lpopt -O2

Dbench

128 Clients

MB/s, more is better (Dbench 4.0):

  CEPH luminous bluestore 3 OSDs replica 1:   754.90   (SE +/- 6.81, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   771.70   (SE +/- 3.58, N = 3)
  CEPH jewel 3 OSDs replica 3:                779.86   (SE +/- 5.18, N = 3)
  CEPH Jewel 1 OSD:                           970.31   (SE +/- 2.85, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       1055.64  (SE +/- 11.99, N = 3)
  Direct SSD io=native cache=none:            1336.95  (SE +/- 6.02, N = 3)
  CEPH Jewel 3 OSDs:                          965.17   (SE +/- 11.71, N = 3)
  local filesystem:                           959.00   (SE +/- 10.43, N = 3)

(CC) gcc options: -lpopt -O2

Apache Benchmark

Static Web Page Serving

Requests per second, more is better (Apache Benchmark 2.4.29):

  CEPH luminous bluestore 3 OSDs replica 1:   7729.99  (SE +/- 88.00, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   6755.53  (SE +/- 80.76, N = 3)
  CEPH jewel 3 OSDs replica 3:                7961.19  (SE +/- 128.93, N = 6)
  CEPH Jewel 1 OSD:                           8550.11  (SE +/- 49.14, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       7162.87  (SE +/- 197.05, N = 6)
  Direct SSD io=native cache=none:            7272.34  (SE +/- 125.92, N = 4)
  CEPH Jewel 3 OSDs:                          7336.67  (SE +/- 99.52, N = 6)
  local filesystem:                           7307.72  (SE +/- 37.64, N = 3)

(CC) gcc options: -shared -fPIC -O2 -pthread

PostMark

Disk Transaction Performance

TPS, more is better (PostMark 1.51):

  CEPH luminous bluestore 3 OSDs replica 1:   2434
  CEPH luminous bluestore 3 OSDs replica 3:   2066
  CEPH jewel 3 OSDs replica 3:                2273
  CEPH Jewel 1 OSD:                           2443
  CEPH Jewel 1 OSD w/ external Journal:       2206
  Direct SSD io=native cache=none:            2299
  CEPH Jewel 3 OSDs:                          2149
  local filesystem:                           2409

Note: the export lists only seven standard errors for these eight runs, and the per-run pairing was not preserved: SE +/- 15.67 (N = 3), 31.10 (N = 3), 21.11 (N = 3), 34.53 (N = 3), 35.31 (N = 5), 16.19 (N = 3), 53.62 (N = 6).

(CC) gcc options: -O3

Unpacking The Linux Kernel

linux-4.15.tar.xz

Seconds, fewer is better (Unpacking The Linux Kernel):

  CEPH luminous bluestore 3 OSDs replica 1:   15.33  (SE +/- 0.33, N = 8)
  CEPH luminous bluestore 3 OSDs replica 3:   16.30  (SE +/- 0.27, N = 4)
  CEPH jewel 3 OSDs replica 3:                15.68  (SE +/- 0.24, N = 5)
  CEPH Jewel 1 OSD:                           14.53  (SE +/- 0.14, N = 4)
  CEPH Jewel 1 OSD w/ external Journal:       14.77  (SE +/- 0.42, N = 8)
  Direct SSD io=native cache=none:            14.71  (SE +/- 0.19, N = 7)
  CEPH Jewel 3 OSDs:                          15.41  (SE +/- 0.19, N = 8)
  local filesystem:                           14.45  (SE +/- 0.07, N = 4)

Compile Bench

Test: Read Compiled Tree

MB/s, more is better (Compile Bench 0.6):

  CEPH luminous bluestore 3 OSDs replica 1:   239.00  (SE +/- 1.50, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   236.52  (SE +/- 5.63, N = 3)
  CEPH jewel 3 OSDs replica 3:                250.96  (SE +/- 5.76, N = 3)
  CEPH Jewel 1 OSD:                           260.08  (SE +/- 0.73, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       260.32  (SE +/- 2.94, N = 3)

Compile Bench

Test: Initial Create

MB/s, more is better (Compile Bench 0.6):

  CEPH luminous bluestore 3 OSDs replica 1:   139.52  (SE +/- 1.96, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   134.45  (SE +/- 1.69, N = 3)
  CEPH jewel 3 OSDs replica 3:                136.01  (SE +/- 1.49, N = 3)
  CEPH Jewel 1 OSD:                           144.53  (SE +/- 2.37, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       135.49  (SE +/- 4.01, N = 3)

Compile Bench

Test: Compile

MB/s, more is better (Compile Bench 0.6):

  CEPH luminous bluestore 3 OSDs replica 1:   916.83   (SE +/- 28.08, N = 6)
  CEPH luminous bluestore 3 OSDs replica 3:   1025.83  (SE +/- 19.45, N = 6)
  CEPH jewel 3 OSDs replica 3:                1112.43  (SE +/- 4.80, N = 3)
  CEPH Jewel 1 OSD:                           1148.88  (SE +/- 15.77, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:       1028.88  (SE +/- 22.78, N = 6)

PostgreSQL pgbench

Scaling: On-Disk - Test: Normal Load - Mode: Read Write

TPS, more is better (PostgreSQL pgbench 10.3):

  CEPH Jewel 1 OSD w/ external Journal:       1824.70  (SE +/- 63.93, N = 3)
  Direct SSD io=native cache=none:            3642.91  (SE +/- 14.08, N = 3)

(CC) gcc options: -fno-strict-aliasing -fwrapv -O2 -lpgcommon -lpgport -lpq -lpthread -lrt -lcrypt -ldl -lm

Gzip Compression

Linux Source Tree Archiving To .tar.gz

Seconds, fewer is better (Gzip Compression):

  CEPH luminous bluestore 3 OSDs replica 1:   66.58  (SE +/- 0.70, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:   74.62  (SE +/- 1.47, N = 3)
  CEPH jewel 3 OSDs replica 3:                70.37  (SE +/- 2.62, N = 6)
  CEPH Jewel 1 OSD:                           71.74  (SE +/- 2.16, N = 6)
  CEPH Jewel 1 OSD w/ external Journal:       67.57  (SE +/- 1.35, N = 3)
  Direct SSD io=native cache=none:            69.39  (SE +/- 2.46, N = 6)
  CEPH Jewel 3 OSDs:                          73.37  (SE +/- 1.77, N = 6)
  local filesystem:                           71.33  (SE +/- 2.64, N = 6)

Threaded I/O Tester

64MB Random Read - 32 Threads

MB/s, more is better (Threaded I/O Tester 20170503):

  CEPH luminous bluestore 3 OSDs replica 1:   108942.81  (SE +/- 7596.75, N = 6)
  CEPH luminous bluestore 3 OSDs replica 3:   100973.58  (SE +/- 2403.25, N = 6)
  CEPH jewel 3 OSDs replica 3:                84936.34   (SE +/- 9550.32, N = 6)
  CEPH Jewel 1 OSD:                           102558.87  (SE +/- 2213.07, N = 6)
  CEPH Jewel 1 OSD w/ external Journal:       100449.37  (SE +/- 2822.07, N = 6)
  Direct SSD io=native cache=none:            115753.71  (SE +/- 1990.78, N = 6)
  CEPH Jewel 3 OSDs:                          107041.83  (SE +/- 1303.43, N = 3)
  local filesystem:                           60691.53   (SE +/- 3323.01, N = 6)

(CC) gcc options: -O2

AIO-Stress

Random Write

MB/s, more is better (AIO-Stress 0.21):

  CEPH luminous bluestore 3 OSDs replica 1:                1754.61  (SE +/- 96.24, N = 6)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 1599.59  (SE +/- 22.97, N = 3)
  CEPH luminous bluestore 3 OSDs replica 3:                1690.54  (SE +/- 25.18, N = 6)
  CEPH jewel 3 OSDs replica 3:                             1818.64  (SE +/- 24.90, N = 3)
  CEPH Jewel 1 OSD:                                        1340.55  (SE +/- 13.54, N = 3)
  CEPH Jewel 1 OSD w/ external Journal:                    1822.87  (SE +/- 109.84, N = 6)
  Direct SSD io=native cache=none:                         1802.66  (SE +/- 55.02, N = 6)
  CEPH Jewel 3 OSDs:                                       1721.82  (SE +/- 25.85, N = 3)
  local filesystem:                                        1478.38  (SE +/- 73.02, N = 6)

(CC) gcc options: -pthread -laio


Phoronix Test Suite v10.8.4