Introduction
Sorry G80, your time is up.
There's no arguing that NVIDIA's flagship D3D10 GPU has reigned over 3D graphics without ever truly being usurped, even by G92 and a dubiously named GeForce 9-series range. The high-end launch product based on G80, GeForce 8800 GTX, is still within spitting distance of anything that's come out since in terms of raw single-chip performance. It flaunts its 8 clusters, 384-bit memory bus and 24 ROPs in the face of G92, meaning that products like 9800 GTX have never really felt like true upgrades to owners of G80-based products.
That I type this text on my own PC powered by a GeForce 8800 GTX, one that I bought -- which is largely unheard of in the world of tech journalism; as a herd, we never usually buy PC components -- with my own hard-earned, and on launch day no less, speaks volumes for the chip's longevity. I'll miss you, old girl; your 20-month spell at the top of the pile is finally up. So what chip the usurper, and how far has it moved the game on?
Rumours about GT200 have swirled for some time, and recently the rumour mill has mostly got it right. The basic architecture is pretty much a known quantity at this point, and it shares a lot of common ground with the one powering the chip we've just eulogised. Why mess too much with what's worked so well? "Correctamundo", says the Fonz, and the Fonz is always right.
It's all about the detail now, so we'll try to reveal as much as possible to see where GT200 deviates. We'll delve into the architecture first, before taking a look at the first two products it powers, looking back to previous NVIDIA D3D10 hardware as necessary to paint the picture.