ocdcephbenchmarks

KVM QEMU testing on CentOS Linux 7 via the Phoronix Test Suite.

Compare your own system(s) to this result file with the Phoronix Test Suite by running the command: phoronix-test-suite benchmark 1805300-FO-OCDCEPHBE02
Test Runs

  local filesystem (May 27 2018)
  CEPH Jewel 3 OSDs (May 27 2018)
  Direct SSD io=native cache=none (May 27 2018)
  CEPH Jewel 1 OSD w/ external Journal (May 28 2018)
  CEPH Jewel 1 OSD (May 29 2018)
  CEPH jewel 3 OSDs replica 3 (May 29 2018)
  CEPH luminous bluestore 3 OSDs replica 3 (May 30 2018)
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none (May 30 2018)
  CEPH luminous bluestore 3 OSDs replica 1 (May 30 2018)
Results

FS-Mark 3.3 - 1000 Files, 1MB Size - Higher Results Are Better
  local filesystem: 128.2, 154.1, 155.9, 158.9, 157.7, 158
  CEPH Jewel 3 OSDs: 83, 89.5, 88.5, 91.1, 87.8
  Direct SSD io=native cache=none: 158, 157.5, 161.6
  CEPH Jewel 1 OSD w/ external Journal: 89.6, 95.4, 98.3, 98.9, 94.9, 95.9
  CEPH Jewel 1 OSD: 83.9, 84.7, 82
  CEPH jewel 3 OSDs replica 3: 61.4, 62.4, 62
  CEPH luminous bluestore 3 OSDs replica 3: 64.8, 66.9, 66.5
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 66.1, 63.6, 66.1
  CEPH luminous bluestore 3 OSDs replica 1: 79.7, 85.7, 83.1, 81.9

AIO-Stress 0.21 - Random Write - Higher Results Are Better
  local filesystem: 1165.71, 1438.46, 1665.95, 1493.25, 1636.85, 1470.05
  CEPH Jewel 3 OSDs: 1670.31, 1743.68, 1751.47
  Direct SSD io=native cache=none: 1612.88, 1893.65, 2005.14, 1741.7, 1759.97, 1802.64
  CEPH Jewel 1 OSD w/ external Journal: 1607.74, 2359.8, 1755.51, 1733.28, 1715.52, 1765.36
  CEPH Jewel 1 OSD: 1331.4, 1367.19, 1323.05
  CEPH jewel 3 OSDs replica 3: 1868.39, 1795.59, 1791.93
  CEPH luminous bluestore 3 OSDs replica 3: 1641.99, 1733.19, 1784.09, 1677.83, 1613.54, 1692.58
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 1645.52, 1577.3, 1575.96
  CEPH luminous bluestore 3 OSDs replica 1: 1623.72, 1624.18, 2069.11, 1510.09, 2032.95, 1667.58

Dbench 4.0 - 12 Clients - Higher Results Are Better
  local filesystem: 1294.31, 1279.14, 1283.79
  CEPH Jewel 3 OSDs: 685.618, 685.21, 679.635
  Direct SSD io=native cache=none: 786.697, 809.115, 806.487
  CEPH Jewel 1 OSD w/ external Journal: 768.034, 777.249, 774.306
  CEPH Jewel 1 OSD: 689.001, 689.236, 697.366
  CEPH jewel 3 OSDs replica 3: 418.582, 416.293, 417.649
  CEPH luminous bluestore 3 OSDs replica 3: 337.625, 347.938, 348.557
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 350.617, 354.923, 348.838
  CEPH luminous bluestore 3 OSDs replica 1: 477.532, 481.865, 481.245

Dbench 4.0 - 48 Clients - Higher Results Are Better
  local filesystem: 879.982, 323.514, 902.149, 942.177, 900.687, 926.062
  CEPH Jewel 3 OSDs: 980.761, 969.688, 955.347
  Direct SSD io=native cache=none: 1223.55, 1214.44, 1222.03
  CEPH Jewel 1 OSD w/ external Journal: 1055.61, 1051.83, 1059.5
  CEPH Jewel 1 OSD: 930.268, 955.517, 929.165
  CEPH jewel 3 OSDs replica 3: 714.091, 710.001, 712.56
  CEPH luminous bluestore 3 OSDs replica 3: 675.242, 680.891, 681.643
  CEPH luminous bluestore 3 OSDs replica 1: 776.558, 766.107, 763.576

Dbench 4.0 - 128 Clients - Higher Results Are Better
  local filesystem: 978.523, 955.618, 942.871
  CEPH Jewel 3 OSDs: 987.287, 960.747, 947.465
  Direct SSD io=native cache=none: 1348.35, 1327.9, 1334.6
  CEPH Jewel 1 OSD w/ external Journal: 1033.51, 1074.69, 1058.73
  CEPH Jewel 1 OSD: 967.215, 976.004, 967.709
  CEPH jewel 3 OSDs replica 3: 789.808, 772.408, 777.351
  CEPH luminous bluestore 3 OSDs replica 3: 767.418, 768.881, 778.81
  CEPH luminous bluestore 3 OSDs replica 1: 756.07, 742.561, 766.065

Dbench 4.0 - 1 Client - Higher Results Are Better
  local filesystem: 174.781, 177.694, 184.598
  CEPH Jewel 3 OSDs: 81.8696, 82.3243, 84.3512
  Direct SSD io=native cache=none: 198.097, 199.632, 196.055
  CEPH Jewel 1 OSD w/ external Journal: 100.337, 104.56, 98.917
  CEPH Jewel 1 OSD: 103.563, 96.8607, 98.2182, 96.0252
  CEPH jewel 3 OSDs replica 3: 46.014, 58.1758, 57.4725, 57.9952, 57.8471, 58.7785
  CEPH luminous bluestore 3 OSDs replica 3: 54.1861, 54.3753, 55.448
  CEPH luminous bluestore 3 OSDs replica 1: 66.5083, 67.2193, 67.531

Threaded I/O Tester 20170503 - 64MB Random Read - 32 Threads - Higher Results Are Better
  local filesystem: 54472.431, 48647.236, 59500.291, 66158.418, 64652.587, 70718.232
  CEPH Jewel 3 OSDs: 109565.59, 105214.488, 106345.415
  Direct SSD io=native cache=none: 118848.654, 109460.182, 117350.447, 109806.445, 118244.804, 120811.704
  CEPH Jewel 1 OSD w/ external Journal: 103785.537, 112391.615, 97995.119, 97849.976, 92165.069, 98508.899
  CEPH Jewel 1 OSD: 95451.156, 103392.569, 111195.57, 101567.149, 98774.959, 104971.809
  CEPH jewel 3 OSDs replica 3: 94277.954, 86728.212, 38765.853, 102589.791, 98694.039, 88562.162
  CEPH luminous bluestore 3 OSDs replica 3: 94718.342, 99061.623, 109900.724, 96195.397, 99829.393, 106135.987
  CEPH luminous bluestore 3 OSDs replica 1: 120004.688, 85611.571, 111613.712, 136406.021, 108486.068, 91534.817

Threaded I/O Tester 20170503 - 64MB Random Write - 32 Threads - Higher Results Are Better
  local filesystem: 892.775, 1024.996, 975.087, 885.977, 929.368, 1045.532
  CEPH Jewel 3 OSDs: 334.28, 348.097, 328.615
  Direct SSD io=native cache=none: 576.418, 547.539, 542.661
  CEPH Jewel 1 OSD w/ external Journal: 301.965, 300.216, 298.505
  CEPH Jewel 1 OSD: 325.007, 299.554, 298.114, 299.444, 281.425, 294.069
  CEPH jewel 3 OSDs replica 3: 218.577, 210.624, 212.82
  CEPH luminous bluestore 3 OSDs replica 3: 167.38, 119.932, 168.019, 166.294, 123.328, 161.076
  CEPH luminous bluestore 3 OSDs replica 1: 247.789, 260.936, 257.229

Compile Bench 0.6 - Test: Compile - Higher Results Are Better
  CEPH Jewel 1 OSD w/ external Journal: 1121.91, 986.25, 988.89, 1000.53, 1002.29, 1073.42
  CEPH Jewel 1 OSD: 1127.52, 1179.65, 1139.48
  CEPH jewel 3 OSDs replica 3: 1105.74, 1109.81, 1121.73
  CEPH luminous bluestore 3 OSDs replica 3: 955.81, 1034.01, 1058.26, 1089.1, 1028.36, 989.46
  CEPH luminous bluestore 3 OSDs replica 1: 998.87, 966.18, 914.19, 835.54, 951.16, 835.03

Compile Bench 0.6 - Test: Initial Create - Higher Results Are Better
  CEPH Jewel 1 OSD w/ external Journal: 142.62, 128.74, 135.1
  CEPH Jewel 1 OSD: 146.95, 139.8, 146.84
  CEPH jewel 3 OSDs replica 3: 134.81, 134.26, 138.97
  CEPH luminous bluestore 3 OSDs replica 3: 137.79, 133.25, 132.3
  CEPH luminous bluestore 3 OSDs replica 1: 136.52, 138.83, 143.21

Compile Bench 0.6 - Test: Read Compiled Tree - Higher Results Are Better
  CEPH Jewel 1 OSD w/ external Journal: 263.98, 262.48, 254.5
  CEPH Jewel 1 OSD: 258.62, 260.88, 260.74
  CEPH jewel 3 OSDs replica 3: 239.46, 257.37, 256.06
  CEPH luminous bluestore 3 OSDs replica 3: 244.07, 225.52, 239.97
  CEPH luminous bluestore 3 OSDs replica 1: 236.08, 239.92, 241.01

Apache Benchmark 2.4.29 - Static Web Page Serving - Higher Results Are Better
  local filesystem: 7377.6, 7248.53, 7297.02
  CEPH Jewel 3 OSDs: 7241.69, 7029.38, 7735.56, 7402.73, 7186.45, 7424.21
  Direct SSD io=native cache=none: 7031.72, 7078.47, 7503.78, 7475.38
  CEPH Jewel 1 OSD w/ external Journal: 8056.21, 7371.77, 6794.3, 6975.15, 6929.19, 6850.57
  CEPH Jewel 1 OSD: 8471.78, 8640.68, 8537.87
  CEPH jewel 3 OSDs replica 3: 7793.15, 7950.82, 8361.04, 7446.95, 8147.48, 8067.69
  CEPH luminous bluestore 3 OSDs replica 3: 6916.97, 6670.55, 6679.08
  CEPH luminous bluestore 3 OSDs replica 1: 7711.89, 7587.43, 7890.66

PostMark 1.51 - Disk Transaction Performance - Higher Results Are Better
  local filesystem: 2336, 2427, 2551, 2577, 2272, 2293
  CEPH Jewel 3 OSDs: 2173, 2118, 2155
  Direct SSD io=native cache=none: 2173, 2272, 2336, 2358, 2358
  CEPH Jewel 1 OSD w/ external Journal: 2272, 2192, 2155
  CEPH Jewel 1 OSD: 2475, 2403, 2450
  CEPH jewel 3 OSDs replica 3: 2212, 2293, 2314
  CEPH luminous bluestore 3 OSDs replica 3: (no result recorded)
  CEPH luminous bluestore 3 OSDs replica 1: 2403, 2450, 2450

PostgreSQL pgbench 10.3 - Scaling: On-Disk - Test: Normal Load - Mode: Read Write - Higher Results Are Better
  Direct SSD io=native cache=none: 3668.185622, 3619.527295, 3641.016249
  CEPH Jewel 1 OSD w/ external Journal: 1696.930945, 1884.288442, 1892.887513

SQLite 3.22 - Timed SQLite Insertions - Lower Results Are Better
  local filesystem: 20.516628026962, 20.601418018341, 20.707045793533
  CEPH Jewel 3 OSDs: 53.091369867325, 50.6812479496, 54.41764998436, 52.79353094101
  Direct SSD io=native cache=none: 18.391449213028, 16.430815935135, 16.827563047409, 17.120757102966, 17.560329914093, 17.432175159454
  CEPH Jewel 1 OSD w/ external Journal: 45.749105215073, 44.947448968887, 44.604034900665
  CEPH Jewel 1 OSD: 46.045742034912, 46.205090999603, 46.385432004929
  CEPH jewel 3 OSDs replica 3: 97.54148888588, 98.795964002609, 98.554126977921
  CEPH luminous bluestore 3 OSDs replica 3: 111.29176878929, 108.18467092514, 108.97330284119
  CEPH luminous bluestore 3 OSDs replica 3 csum_type=none: 106.13357496262, 105.75309896469, 111.44476795197
  CEPH luminous bluestore 3 OSDs replica 1: 71.866236925125, 69.795584201813, 68.184175014496

Unpacking The Linux Kernel - linux-4.15.tar.xz - Lower Results Are Better
  local filesystem: 14.671772956848, 14.416718006134, 14.376811027527, 14.349663972855
  CEPH Jewel 3 OSDs: 15.970369100571, 15.256604194641, 16.261918067932, 14.82867193222, 15.782586812973, 14.868692874908, 14.977872848511, 15.33172917366
  Direct SSD io=native cache=none: 14.380490064621, 14.77250289917, 15.500410079956, 13.874188899994, 14.556468009949, 14.962131023407, 14.897182226181
  CEPH Jewel 1 OSD w/ external Journal: 15.0353910923, 15.394495010376, 16.55308008194, 13.525293827057, 13.532243967056, 14.660176992416, 13.512840986252, 15.926465988159
  CEPH Jewel 1 OSD: 14.864154815674, 14.208387136459, 14.623382091522, 14.415481090546
  CEPH jewel 3 OSDs replica 3: 16.62650179863, 15.420037031174, 15.289583921432, 15.605319023132, 15.452120065689
  CEPH luminous bluestore 3 OSDs replica 3: 16.452185869217, 16.983808040619, 15.994841814041, 15.787308931351
  CEPH luminous bluestore 3 OSDs replica 1: 15.107428073883, 15.554498195648, 16.221236944199, 16.500249862671, 15.585499048233, 13.949112892151, 15.708993196487, 14.028234004974

Gzip Compression - Linux Source Tree Archiving To .tar.gz - Lower Results Are Better
  local filesystem: 79.054538965225, 78.979356050491, 71.47647690773, 68.688354969025, 65.143775939941, 64.616640090942
  CEPH Jewel 3 OSDs: 80.724157094955, 75.840415000916, 71.116141080856, 72.596638917923, 68.437533855438, 71.507607936859
  Direct SSD io=native cache=none: 78.013841152191, 74.701321125031, 70.314050912857, 66.173755884171, 63.182187080383, 63.970546007156
  CEPH Jewel 1 OSD w/ external Journal: 70.20988202095, 65.798621177673, 66.694798946381
  CEPH Jewel 1 OSD: 65.764527082443, 74.972877025604, 74.965191841125, 74.963376998901, 75.621740102768, 64.132951021194
  CEPH jewel 3 OSDs replica 3: 78.897522211075, 75.804579973221, 72.435704946518, 68.039735078812, 63.065625905991, 63.979481935501
  CEPH luminous bluestore 3 OSDs replica 3: 77.530955076218, 72.814587831497, 73.515851020813
  CEPH luminous bluestore 3 OSDs replica 1: 65.221828937531, 66.969459056854, 67.563499927521