TestXonotic180421
Intel Core i7-3630QM testing with an Intel VG10F (1.80 BIOS) and Intel HD 4000 2GB on Arch Linux via the Phoronix Test Suite.
Intel Core i7 3rd Gen with NVIDIA GeForce GT 740M
Processor: Intel Core i7-3630QM @ 3.40GHz (4 Cores / 8 Threads), Motherboard: Intel VG10F (1.80 BIOS), Chipset: Intel 3rd Gen Core DRAM, Memory: 8GB, Disk: 500GB Samsung SSD 870 + 0GB Card Reader, Graphics: Intel HD 4000 2GB (1150MHz), Audio: IDT 92HD99BXX, Network: Qualcomm Atheros QCA8171 + Intel 7260
OS: Arch Linux, Kernel: 5.10.31-1-lts (x86_64), Desktop: KDE Plasma 5.21.4, Display Server: X Server 1.20.11, Display Driver: NVIDIA, OpenGL: 4.6.0, Compiler: GCC 10.2.0, File-System: ext4, Screen Resolution: 1366x768
Kernel Notes: Transparent Huge Pages: madvise
Environment Notes: __GLX_VENDOR_LIBRARY_NAME=nvidia
Processor Notes: Scaling Governor: intel_cpufreq performance - CPU Microcode: 0x17
Security Notes: itlb_multihit: KVM: Mitigation of VMX disabled + l1tf: Mitigation of PTE Inversion; VMX: conditional cache flushes SMT vulnerable + mds: Vulnerable: Clear buffers attempted no microcode; SMT vulnerable + meltdown: Mitigation of PTI + spec_store_bypass: Vulnerable + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Full generic retpoline STIBP: disabled RSB filling + srbds: Vulnerable: No microcode + tsx_async_abort: Not affected
Xonotic
This is a benchmark of Xonotic, a fork of the DarkPlaces-engine-based game Nexuiz. Development of Xonotic began in March 2010. Learn more via the OpenBenchmarking.org test page.
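A run like this one can be reproduced with the Phoronix Test Suite command-line client. A minimal sketch, assuming the `pts/xonotic` test profile identifier (the exact profile version used here is not recorded in this file); resolution and effects-quality options are chosen interactively at run time:

```shell
# Download and install the Xonotic test profile and its dependencies.
phoronix-test-suite install pts/xonotic

# Run the benchmark; PTS prompts for resolution/effects and a result name
# (this file's saved result name appears to be "TestXonotic180421").
phoronix-test-suite benchmark pts/xonotic
```

Saved results can later be reviewed or compared from the same client without re-running the test.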
Testing initiated at 18 April 2021 17:12 by user julesfb.