G80 Appreciation thread

Seriously, never in my life have I been so impressed by a GPU architecture. Over the years I've owned the Voodoo 1/2, the 4200 Ti, the ATI 9600/9700 etc., and none have come close. OK, the Voodoo 1 wowed me the most in terms of the jump in graphics, but nothing has come close to the G80 in terms of longevity IMO. Even today a G80 will play most games @ 1920x1200, it's unbelievable seriously.

Anyone else have an infatuation with the G80? :o

If not what's your fav GPU of all time and what's your reasoning?
 
My 4870X2 was my favourite GPU. I got it really cheap and it was the fastest card out for 5-6 months. It was probably the best value card I've owned, and it still gives really decent performance. Amazing for a card made in 2008! I'd still be using it if I only used it for gaming :) (need OpenCL).
 
Had my 8800GTX for 3 years and I loved it. Sadly it died last week, but I suppose 3 years on one card is good going.
 
NV20 (GeForce 3), the first 'properly' programmable GPU. Although it was quite restrictive and quite basic, it paved the way for all the nice things we have today.
 
Eh, the 8800GTX was no more significant a boost over the previous gen than any other new generation. In terms of architecture it was a brute-force design and nothing particularly fancy. Saying it can still game at 1920x1200 is like saying a GF3 could still play at 1920x1200 three years after its release; of course it could, because if you played a three-year-old game three years later it offered the same or even better performance. Try to play a "tough" brand-new game on a GF3 three years post-release, or on a G80 three years post-release, and they are not good at high res.

Honestly, the R600/2900XT was one of the most interesting GPUs in terms of tech features, just a tiny bit ahead of its time, and it didn't follow AMD's current approach of designing around the manufacturing quality actually available. If TSMC hadn't had problems and the 2900XT had been released on 65nm, it WOULD have been faster than an 8800GTX. The ring bus was a huge, but ultimately too costly, step forward that isn't quite required yet (but will likely make a comeback when we get to much higher shader counts).

It was also ATI/AMD who started the push towards shaders rather than pipelines with the X1800/X1900 series, and they also had the first unified shader core (Xenos, in the Xbox 360).

Nvidia's latest core, the unreleased Fermi, has a few unexpected but very interesting features; however, we'll have to see if we get told around release exactly what they are. So far we know they've moved a lot of the rendering pipeline into the shader clusters and split it up, but for a lot of things, like tessellation, we don't know whether they've made specific hardware or not.

The GF3 was a very nice little card though, and the changes between their first cards and the GeForce 256 (IIRC) were massive and interesting.

The G80 went towards unified programmable shaders; the iterations since then have mostly been "more is better" rather than anything fancy. Even in Fermi the primary change is more shaders, with very small differences otherwise. It's the rendering pipeline, as opposed to the shaders, which has changed most, and with those changes we're not sure what's changed other than location, as Nvidia haven't told us anything except what's in the unit, not how it's done.

The 2900XT, from a manufacturing standpoint, was maybe the most amazing card made for years. Fermi is struggling to be built on the process it was intended for, a process that's out, has been used for a year, and sucks, but is there. AMD took the 2900XT, designed exclusively for the 65nm process (a process delayed by almost a year and not available at all due to big problems with it), moved the design UP a process to 80nm, and still managed to get it out; in games like Bioshock it even beat an 8800GTX in DX10 mode. That is an amazing feat. Imagine Fermi sucking so badly on 40nm that Nvidia decided to push it back to 55nm, a 3 billion transistor core; that's the equivalent. Remember the 2900XT was a humongous core on 80nm, bigger than a G80 and far more complex; moving it up a process within 6 months is something that, AFAIK, is unmatched in terms of manufacturing feats in CPUs and GPUs.


To really appreciate a core you have to understand the architecture, and realistically the best/biggest changes and advances came with the GeForce 256, 9700 Pro, X1900 and 2900XT.
 
I have to agree; the 9700/9800 were the first cards that allowed users to actually use AA with a minimal performance hit.
 
I liked the G80. During that gen I went between a 2900XT, 3870, 8800GT (G92) and an 8800GTS (G92). My fave was the GTS, a really nice performer!

Last gen I went 4850, 4850 CF, 4870X2 and GTX 260. The CF was amazing VFM (at the time I got both for £200), and the X2 was also amazing. Didn't need the power so went to a 260. That card has served me very well and I like it.

I was going to get the ATi X2 this gen, but I'm going to wait for NV to see what they bring out and the effect on prices. I prefer to spend my money on bits of carbon fibre for my bikes at the moment.

I have no allegiance to either firm, but I think NV are less consumer friendly.
 
<3 my 8800GTX; bought as a temp card but I ended up carrying on using it until it died recently. It played nearly all my games maxed out, including new ones such as DiRT 2.

Will miss it :(
 
Wow, weird! I was just visiting this sub-forum to make a similar thread, but about G92s :)

Still using my 8800GT; I've had it since a week or so after release. It still plays pretty much everything at max details, with adequate AA and AF, perfectly... I run at 1440x900 though; I would probably upgrade if I got a larger monitor :o

Definitely my best graphics card purchase ever :)
 
G80s are still quite good today, and they will still play a large number of demanding modern games with a large amount of eye candy @ 1920x1200. But the only G80s that still perform today are the 8800GTX (and Ultra), as they had the most shaders and the best memory subsystem. The 320/640MB GTSs were quite a bit slower.

I wouldn't say the G80s were no significant boost over the previous generation; the 8800GTX will blow a 7900GTX out of the water, and if you try running a current game at 1920x1200 on a 7900GTX, it's miles behind the G80s. Plus the G80s brought CUDA to the table, and of course any PhysX game will benefit from having CUDA available :P
 