x264 @ 4.6GHz/1.4V
Intel Core i5-6600K testing with an ASRock Z170 Extreme4 and an ASUS NVIDIA GeForce GTX 970 4096MB on Gentoo 2.2 via the Phoronix Test Suite.
Intel Core i5-6600K
Processor: Intel Core i5-6600K @ 4.60GHz (4 Cores), Motherboard: ASRock Z170 Extreme4, Chipset: Intel Sky Lake, Memory: 16384MB, Disk: 240GB OCZ ARC100 + 750GB Western Digital WD7500BPVT-2, Graphics: ASUS NVIDIA GeForce GTX 970 4096MB (1050/3505MHz), Audio: Realtek ALC1150, Monitor: BenQ XL2411Z, Network: Intel Connection
OS: Gentoo 2.2, Kernel: 4.6.0-rc7-linus-master+ (x86_64), Display Server: X Server 1.18.99.1, Display Driver: NVIDIA 364.19, OpenGL: 4.5.0, Vulkan: 1.0.8, Compiler: GCC 4.9.3 + Clang 3.9.0 + LLVM 3.9.0svn + CUDA 7.5, File-System: f2fs, Screen Resolution: 3840x1200
Environment Notes: __GL_THREADED_OPTIMIZATIONS=1 __GL_SHADER_DISK_CACHE=1 __GL_SHADER_DISK_CACHE_PATH=/home/tomas/.csnv
Compiler Notes: --bindir=/usr/x86_64-pc-linux-gnu/gcc-bin/4.9.3 --build=x86_64-pc-linux-gnu --datadir=/usr/share/gcc-data/x86_64-pc-linux-gnu/4.9.3 --disable-altivec --disable-fixed-point --disable-isl-version-check --disable-libcilkrts --disable-libgcj --disable-libmudflap --disable-libsanitizer --disable-libssp --disable-nls --disable-werror --enable-__cxa_atexit --enable-checking=release --enable-clocale=gnu --enable-languages=c,c++,fortran --enable-libgomp --enable-libstdcxx-time --enable-libvtv --enable-lto --enable-multilib --enable-obsolete --enable-secureplt --enable-shared --enable-silent-rules --enable-targets=all --enable-threads=posix --enable-vtable-verify --host=x86_64-pc-linux-gnu --includedir=/usr/lib/gcc/x86_64-pc-linux-gnu/4.9.3/include --mandir=/usr/share/gcc-data/x86_64-pc-linux-gnu/4.9.3/man --with-cloog --with-multilib-list=m32,m64 --with-python-dir=/share/gcc-data/x86_64-pc-linux-gnu/4.9.3/python
Processor Notes: Scaling Governor: intel_pstate performance
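The environment and processor notes above can be reproduced before launching a run. A minimal sketch, assuming the Phoronix Test Suite is installed and the `pts/x264` test profile is available (both are assumptions, not confirmed by this result file):

```shell
# Hypothetical reproduction sketch -- mirrors the Environment Notes above.
# Shader-cache path is user-specific; substitute your own.
export __GL_THREADED_OPTIMIZATIONS=1
export __GL_SHADER_DISK_CACHE=1
export __GL_SHADER_DISK_CACHE_PATH="$HOME/.csnv"

# Match the Processor Notes: intel_pstate set to the performance governor.
# Requires root; path layout assumes the intel_pstate driver is active.
for cpu in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo performance | sudo tee "$cpu" > /dev/null
done

# Run the x264 test profile via the Phoronix Test Suite.
phoronix-test-suite benchmark pts/x264
```

This is environment setup only; CPU overclocking to 4.6GHz at 1.4V, as in the title, is done in firmware and is not scriptable here.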
x264
Testing initiated at 12 May 2016 00:28 by user tomas.