Running pts/mandelgpu-1.3.1 via the Phoronix Test Suite.
gts-1050-ti
Processor: Intel Xeon E3-1220 v3 @ 3.50GHz (4 Cores), Motherboard: LENOVO SHARKBAY, Chipset: Intel Xeon E3-1200 v3 DRAM, Memory: 16384MB, Disk: 1000GB Samsung SSD 850, Graphics: NVIDIA GeForce GTX 1050 Ti 4096MB (384/405MHz), Audio: Realtek ALC662 rev3, Network: Intel Connection I217-LM
OS: Ubuntu 16.10, Kernel: 4.8.0-46-generic (x86_64), Desktop: Unity 7.5.0, Display Server: X Server 1.18.4, Display Driver: NVIDIA 375.39, OpenGL: 4.5.0, Compiler: GCC 6.2.0 20161005, File-System: ext4 (ecryptfs), Screen Resolution: 5120x1440
Processor Notes: Scaling Governor: intel_pstate powersave
Intel Xeon E3-1220 v3 - NVIDIA GeForce GTX 1050 Ti
Processor: Intel Xeon E3-1220 v3 @ 3.50GHz (4 Cores), Motherboard: LENOVO SHARKBAY, Chipset: Intel Xeon E3-1200 v3 DRAM, Memory: 16384MB, Disk: 1000GB Samsung SSD 850, Graphics: NVIDIA GeForce GTX 1050 Ti 4096MB (1266/3504MHz), Audio: Realtek ALC662 rev3, Network: Intel Connection I217-LM
OS: Ubuntu 17.04, Kernel: 4.10.0-19-generic (x86_64), Desktop: Unity 7.5.0, Display Server: X Server 1.19.3, Display Driver: NVIDIA 381.09, OpenGL: 4.5.0, Compiler: GCC 6.3.0 20170406, File-System: ext4 (ecryptfs), Screen Resolution: 5120x1440
Compiler Notes: --build=x86_64-linux-gnu --disable-browser-plugin --disable-vtable-verify --disable-werror --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-gtk-cairo --enable-java-awt=gtk --enable-java-home --enable-languages=c,ada,c++,java,go,d,fortran,objc,obj-c++ --enable-libmpx --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-arch-directory=amd64 --with-default-libstdcxx-abi=new --with-multilib-list=m32,m64,mx32 --with-target-system-zlib --with-tune=generic -v
Processor Notes: Scaling Governor: intel_pstate powersave
OpenCL Notes: GPU Compute Cores: 768
OpenArena
OpenArena is an open-source first-person shooter built on the ioquake3 fork of the Quake 3 Arena engine (id Tech 3). Learn more via the OpenBenchmarking.org test page.
Tesseract
Tesseract is a fork of Cube 2 Sauerbraten with numerous graphics and gameplay improvements. It has been in development since 2012, with its first release in May 2014. Learn more via the OpenBenchmarking.org test page.
Unigine Valley
This test calculates the average frame-rate within the Valley demo for the Unigine engine, released in February 2013. The engine is extremely demanding on the system's graphics card. Unigine Valley relies upon an OpenGL 3 Core Profile context. Learn more via the OpenBenchmarking.org test page.
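The metric such a test reports is straightforward: frames rendered divided by wall-clock seconds elapsed. A minimal sketch of that calculation (the `frame_times` values below are made up for illustration, not taken from any run in this report):

```python
# Average frame-rate from per-frame render times: total frames
# rendered divided by total elapsed time, as an FPS benchmark
# like Unigine Valley reports it. Illustrative sketch only.

def average_fps(frame_times_s):
    """Return frames rendered per second of elapsed time."""
    return len(frame_times_s) / sum(frame_times_s)

# Hypothetical per-frame render times in seconds.
frame_times = [0.016, 0.020, 0.014]
print(round(average_fps(frame_times), 1))  # 3 frames / 0.05 s -> 60.0
```

Note that averaging per-frame times first and then inverting would give a different (harmonic-vs-arithmetic) result; FPS benchmarks conventionally use the total-frames-over-total-time form shown here.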
Xonotic
This is a benchmark of Xonotic, a fork of the DarkPlaces-based Nexuiz game. Development of Xonotic began in March 2010. Learn more via the OpenBenchmarking.org test page.
GpuTest
GpuTest is a cross-platform OpenGL benchmark developed at Geeks3D.com that offers tech demos such as FurMark, TessMark, and other workloads to stress various areas of GPUs and drivers. Learn more via the OpenBenchmarking.org test page.
MandelGPU
MandelGPU is an OpenCL benchmark; this test runs the float4 OpenCL rendering kernel with a maximum of 4096 iterations. Learn more via the OpenBenchmarking.org test page.
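Per pixel, the kernel performs the standard Mandelbrot escape-time iteration: repeatedly square a complex number and add the pixel's coordinate, stopping when the value escapes radius 2 or the 4096-iteration cap is hit. A pure-Python sketch of that per-pixel computation (not the actual float4 OpenCL kernel, which processes four values at once on the GPU):

```python
# Escape-time Mandelbrot iteration, illustrating what MandelGPU
# computes per pixel. MAX_ITER matches the 4096-iteration cap of
# this test profile; the real kernel runs this in OpenCL on the GPU.

MAX_ITER = 4096

def mandelbrot_iters(cx, cy, max_iter=MAX_ITER):
    """Return the iteration count at which z = z^2 + c escapes |z| > 2."""
    zx = zy = 0.0
    for i in range(max_iter):
        if zx * zx + zy * zy > 4.0:  # escaped the radius-2 disc
            return i
        # z = z^2 + c, expanded into real and imaginary parts
        zx, zy = zx * zx - zy * zy + cx, 2.0 * zx * zy + cy
    return max_iter  # never escaped: treated as inside the set

print(mandelbrot_iters(0.0, 0.0))  # interior point: hits the 4096 cap
print(mandelbrot_iters(2.0, 2.0))  # escapes after one iteration
```

The iteration cap is what makes this workload GPU-bound: interior points cost the full 4096 iterations, so higher caps trade frame-rate for edge detail.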
gts-1050-ti
Testing initiated at 15 April 2017 16:30 by user mb.
Intel Xeon E3-1220 v3 - NVIDIA GeForce GTX 1050 Ti
Testing initiated at 15 April 2017 22:23 by user mb.