nvnews.net: Back in August, NVIDIA announced the GeForce 6600 series of graphics processing units. The GeForce 6600 series is targeted at the mainstream market segment and comprises the GeForce 6600 GT and GeForce 6600, which will carry suggested retail prices of $199 and $149 respectively.
pcstats.com: The GeForce 6600/6600GT GPU, like its ATI counterpart the X700/X700 PRO, offers considerably more for the money than the mainstream cards of previous generations did. PCstats is testing out nine GeForce 6600 and GeForce 6600 GT based videocards, in both AGP and PCI Express versions. The features of each videocard will be compared, along with the standard run of gaming benchmarks. SLI benchmarks will be analyzed wherever possible, with the overall aim of deciding which of the mainstream graphics card solutions in this roundup provides the best value for money, whether you are looking for a single mid-range gaming card or a dual-videocard SLI powerhouse.
lostcircuits.com: The resurrection of SLI made possible by PCI Express opens new dimensions for 3D processing - in both performance and price - yet the upper midrange of the price spectrum still faces a painful void. That particular niche is where the GeForce 6600GT is aiming, especially in SLI configuration. From an SLI standpoint, the 6600GT is especially interesting since its single-card performance - even though excellent for the price - leaves enough headroom for improvement by doubling the resources.
3dgameman.com: This particular product comes standard with 128MB of 2ns GDDR3 memory. The default core speed is 500MHz and the memory speed is 1GHz. While these are high default speeds, the core and memory can be overclocked even higher for added performance. Also, an included adapter enables easy hookup of S-Video, Composite and Component video out. Watch the Video to find out more...
legionhardware.com: After recently reviewing the Gigabyte 3D1, a dual-core graphics card designed for the PCI Express bus, it is easy to forget about AGP. However, if you were to forget about AGP, you would be forgetting about the majority of the computers in the world that still use this interface. Clearly AGP will not be exiting the market anytime soon, and most users are not willing to sacrifice their entire system for a graphics card upgrade. That said, manufacturers such as ATi and NVIDIA are still releasing the odd AGP card.
hexus.net: Enthusiasts looking to build new midrange systems now need to make a few hard choices. The first is which CPU company to opt for, Intel or AMD. Once that choice is made, the next thing to consider is which CPU form factor is best for them. Leading on from this is a choice of motherboard chipsets. Here's where it gets interesting, especially in terms of graphics cards. Our recent look at NVIDIA and ATI's midrange line-ups highlighted both companies' use of the new-fangled PCI-Express interface.
overclockers.co.nz: The results speak for themselves. MSI's turbo-charged 6600 Diamond Edition is only around 13% slower than the standard 6600GT AGP, thanks to its higher-than-reference GPU and memory clocks.
legitreviews.com: Overall, the GeForce GTX 280 graphics card was a winner in our book, and it made a difference while gaming, which is the most important thing. The game where we noticed the performance gains the most was Age of Conan, when we cranked up the image quality at a resolution of 1920x1200. Age of Conan: Hyborian Adventures passed the astounding 'One Million Copies Shipped' milestone less than three weeks after the game's launch, so that is a huge potential market in the months to come...
hothardware.com: The high-end GTX 280 card is powered by NVIDIA's 1.4 billion transistor GT200 GPU, produced on TSMC's 65nm process node - the largest, most complex chip TSMC has ever manufactured. Beyond just a larger number of stream processing units, the GT200 also supports three times the number of threads in flight as NVIDIA's previous G80 at any given time. It also has a new scheduler design that is up to 20% more efficient, along with a wider 512-bit memory interface with improved z-cull and compression technology.
techreport.com: If the GPU world were a wildlife special on the National Geographic channel, the G80 processor that powers GeForce 8800 GTX graphics cards would be a stunningly successful apex predator. In the nearly two years that have passed since its introduction, no other single-chip graphics solution has surpassed it. Newer GPUs have come close, shrinking similar capabilities into smaller, cooler chips, but that's about it. The G80 is still the biggest, baddest beast of its kind of chip, as we said at the time, with "the approximate surface area of Rosie O'Donnell." After it dispatched its would-be rival, the Radeon HD 2900 XT, in an epic mismatch, AMD gave up on building high-end GPUs altogether, preferring instead to go the multi-GPU route.
bit-tech.net: That said, making a compute-heavy ASIC with 1.4 billion transistors is an amazing feat, but to be honest I had hoped to see more performance from such a big chip. The GT200's performance is often higher than the 9800 GX2's in the targeted tests we've run, but in many real-world cases it isn't, as we'll show you very soon in our GeForce GTX 280 and 260 gaming performance article. I can't help but feel this is a strange position to be in with the release of a completely new architecture because, generally speaking, a new generation of hardware completely outclasses everything that's gone before.
bit-tech.net: We've only had a short time with Nvidia's GeForce GTX 295, but it looks set to take the performance crown from the Radeon HD 4870 X2. However, that performance crown won't be held in a dominant manner like we witnessed back in August when AMD launched its dual-GPU monster. There are some early driver issues we've seen here and, from the conversations we've had with Nvidia, there is still some work to be done on the driver side before they're ready for the January 8th launch. Our biggest disappointment in the testing we've done so far, though, is how the card performs at higher resolutions with AA enabled - there are scenarios like the one we've shown you here in Fallout 3 where the card just runs out of steam at 2,560 x 1,600 with 8xAA. This doesn't happen with the Radeon HD 4870 X2, and we believe a lot of this is down to the decision to launch with a lower than expected memory speed, but it does leave room for partners to release cards with memory speeds above 2,300MHz in the future.
techarp.com: When NVIDIA launched their GT200 GPU and the first two graphics cards based on it - the GeForce GTX 280 and the GeForce GTX 260, they thought they had ATI licked for good. In fact, they originally pegged the GeForce 9800 GTX+ as the direct competitor to ATI's forthcoming (at that time) Radeon HD 4870 graphics card. Unfortunately, NVIDIA grossly underestimated the ATI Radeon HD 4870, which not only roundly trounced both the GeForce 9800 GTX+ and the GeForce GTX 260, but was also more than a match for the GeForce GTX 280.
bit-tech.net: From Nvidia's side of the fence, the GeForce GTX 280 is the obvious competition, and while the GTX 285 is faster, you have to ask if it's worth the additional outlay over the few GTX 280s that are still available. Obviously, we've tested the standard clocked version here, but there's an XFX GeForce GTX 280 clocked at 640MHz core (not that much lower than the GTX 285's reference clock) on sale at Scan for £275 including VAT, which makes us wonder whether it's worth spending the extra £20 or so on a 285. Of course, there are power consumption benefits to the 285, but they're small.
tweaktown.com: Unlike the GTX 295, we can install three GTX 285 cards into a compatible system, and what you essentially end up with is three GPU cores. And while this is fewer cores than a Quad-SLI GTX 295 setup - which uses only two cards but offers a grand total of four GPUs - the performance of a single GTX 285 core is superior to a single core on the GTX 295. Today we'll simply be looking at the performance of adding the extra cards across our Vista benchmarks. With everything said and done, let's get stuck into the benchmarks and see if Tri-SLI is working a bit better than it has in the past.
hothardware.com: NVIDIA has gotten a lot of mileage out of its G92 GPU architecture, starting with the GeForce 8800 GT, which featured a 65nm variant of the G92 GPU, on up through the GeForce 9800 GTX+, which used an updated version manufactured on a more advanced 55nm process. The G92 GPU has been featured on no fewer than seven different GeForce-branded desktop graphics cards, not to mention the slew of mobile GeForces based on the G92 that are also in production. Although it has been around for quite some time now, NVIDIA is launching yet another graphics card based on the G92 today: the GeForce GTS 250.
techreport.com: The GeForce GTS 250 supplants the GeForce 9800 GTX+ with a smaller card, lower power draw, double the video memory, and a markedly lower price. But is it good enough to fend off the Radeon HD 4800-series competition?
legitreviews.com: This article included a performance analysis of the video games Velvet Assassin and Wall-E, along with the OpenGL benchmark FurMark, on the GeForce GTX 275 and the Radeon HD 4890. I feel it's always good to switch things up and go out and buy a couple of games every now and again to see how video card performance is on games that the driver team really hasn't spent much time optimizing. The results of the three new benchmarks showed that the...