G80 Appreciation thread

The only reason I changed to my current GTX 280 was boredom; I noticed no real performance increase, although I'm no mega gamer.

Well this is what I'm talking about... different people have different ideas about what "playable" means. Personally, I run 1920x1200 with settings maxed and 16x AF plus whatever AA I can get while maintaining 30fps, or 60 if possible. I also use Fraps to check framerates in high-action / demanding scenes.

So the G80 GPUs will not do it for me in the vast majority of today's games, because I don't just load up a game, leave the settings on "auto" and use my eyes to guess at framerates.

It's inconceivable to me that anyone could be happy any other way. :D
 
8800GT
It could clock past the GTX & was only £130 at launch.
It still plays modern games at almost top settings.
Now you can grab one for £35 ish, amazing bang for buck card!!!
 
My 8800 GTS 320 was great until I upgraded to a 1680x1050 monitor, which was a big upgrade from my 1280x960 CRT. :p
 
Apart from Crysis, there have been no major advancements or bar-raising lately; even Bad Company 2 runs nicely on my 4850 512MB @ 1680x1050.

My fave was the 4200.

You said it. Microsoft announced that they foresee the Xbox 360 lasting another 5 years as their current-gen console. :rolleyes:

Still pretty fond of this card; 3 years on and it's still running anything I chuck at it, albeit on a 19" LCD at 1440x900. I've been using it constantly for the last 3 weeks as my own system is in bits, and tbh I haven't really noticed much difference between this and my 280...

Typical Nvidia rebranding at work. Is the GTX 280 just an 8800 series card in disguise?
 
Let's face it, let's call a spade a spade.

The G80/8800GTX was the 9800 Pro of this generation.

Anybody who thinks the HD 2900 XT was even in the same league is kidding themselves. Just take a look at any modern comparison of the 2 cards. To be honest, ATI did well to make the POS R600 go as fast as it did.

It took ATi a MAJOR revision of the architecture to make the R600 arch something worthwhile (they ditched the bloated ringbus, and managed to almost double transistor density).

The HD 4850 is what the HD 2900 XT should have been. Look at reviews of the HD 4850 and you'll see the 8800GTX and the 2900 XT's doppelganger, the HD 3870, waaay in the background.

At the very least, the R600 should have resembled the RV740 (HD 4770) to compete with the G80:
(HD 4770: 826M transistors, 640 shader cores, 32 texture units, 750MHz clock speed)
(HD 2900 XT: 720M transistors, 320 shader cores, 16 texture units)

Every design decision employed by the G80 was a touch of genius. The odd 384-bit memory interface coupled with the 768MB of RAM meant it had great balance as an architecture and was somewhat future-proof (due to ever-increasing memory requirements).
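
A quick back-of-envelope, assuming the stock memory clocks I remember are right (~900MHz GDDR3, 1800 MT/s effective, on the 8800GTX vs ~800MHz / 1600 MT/s on the 7900GTX):

8800GTX: 384 bits x 1800 MT/s / 8 = ~86.4 GB/s
7900GTX: 256 bits x 1600 MT/s / 8 = ~51.2 GB/s

The 768MB also falls straight out of that bus width: six 64-bit channels with 128MB hanging off each.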

It had great AA performance (something Nvidia wasn't previously known for) and can still play most high-end titles at 1920x1200 without AA enabled, as well as delivering a massive improvement in AF quality. No mean feat.

Massively increasing the clock speed of a core part of the chip (the shader core) was a bold move, one that I think was a first in the graphics world (correct me if I'm wrong), and effective utilisation of that shader core was also greatly improved.
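
If memory serves, on the 8800GTX that meant a shader clock of roughly 1350MHz against a 575MHz core clock, so the shaders ran at well over twice the speed of the rest of the chip.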

Let's face it, Nvidia hit the ball out of the park with the G80. The only disappointment was the price.
 
My current PC has an 8500GT. The CPU is an Intel Celeron D 3.33 GHz, with 3GB of DDR2 RAM. This is why I am upgrading, hehe (see signature). It said on the box of my PNY 8500GT that the card could handle DirectX 10 games. I could play Crysis, but it ran VERY badly on minimum settings, with a lot of choppiness and sound bugs along the way, and I gave up on Crysis because of all of these issues until I finish my new PC.

My Celeron and 8500GT could handle the games I liked, though, like Quake 4 on high settings, and the Orange Box. Modern Warfare will run, but only at the lowest resolution (I think it is 640x480) with shadows off. Very playable though. Far Cry 2 in DirectX 9 plays quite well. Rainbow Six Vegas 1 & 2 were very playable, but at 800x600, DirectX 9, shadows on, same as Unreal Tournament 3. Medal of Honor: Airborne ran, but only on the lowest settings, and even then it chugged. I played through Assassin's Creed 1 on my 8500GT at the lowest possible resolution and settings, shadows off. Assassin's Creed was a great game, even on the lowest settings.

Street Fighter 4 runs VERY well, but ONLY if I close ALL other applications that might run in the background. Probably because it is a fighting game, so there is less graphical power needed. H.A.W.X. runs well on max in DirectX 9, and in DirectX 10 it runs OK. Turok (the new PC game) is almost playable, but the framerate is so bad that it hurts my eyes. As for Prototype and Quantum of Solace, I could not even get past the intro movie because the framerate was so low that I pressed CTRL+ALT+DEL to open Task Manager and clicked "End Program." Ghost Recon Advanced Warfighter 2, my personal favourite game for about 2 years straight, runs at about 25 to 35 (varying) FPS. Playable, but not smooth.

My Celeron D 3.33 GHz, 8500GT PC will continue to serve as a media PC rather than a gaming PC. Videos, DVD movies, maybe writing an essay. But obviously it is not fast enough to be a good gaming PC.
 
I have loads of respect for the G80 tbh.

You could still happily run an 8800GTX/Ultra now and max a lot out.
 
Biggest load of tripe I've read on this forum in years. The G80 gave a massive boost in terms of performance, and hardly any other GPU compares in terms of longevity.

Don't worry about it; everyone knows that drunkenmaster's function here is to take 500 words (repeating himself 3 times) to regurgitate some fantasised negative rubbish about nvidia. He's like a troll that doesn't realise he's trolling any more.


Regarding the G80 - yes, it was a very impressive architecture. I don't think that anything will ever come close to the voodoo1 in terms of the visual leap forward, but it's not really fair to compare the first real mainstream 3D accelerator (a 'revolution') with an architecture like the G80 (an 'evolution', albeit a large one).

As for longevity, I think a lot of it is related to the effect of the consoles. The need for games to be released on multiple platforms means that PC games are often able to run quite well on console-level hardware, i.e. G80 / R600-era cards.

In my mind, the three most dramatic jumps forward have been the Voodoo1, the 9700 Pro, and the 8800GTX. Hopefully one day we'll see another card to add to this list :)
 
I bought the 8800GTX as soon as they came out, for £440. Most expensive card I have ever bought, but it was the best card I have owned in terms of performance and, ultimately, value for money, as it lasted 2 solid years pretty much unmatched. Even after I RMA'ed this 4870X2 and shoved the 8800GTX back in, it handled pretty much everything at max apart from Crysis. Just sold it recently to a mate for £45 :( Almost a 1/10th of what I originally paid for it. My workmate got an awesomely cheap upgrade from his 6800LE.
 
In my mind, the three most dramatic jumps forward have been the Voodoo1, the 9700 Pro, and the 8800GTX. Hopefully one day we'll see another card to add to this list :)

Voodoo 1->GeForce 1->GeForce 3->8800GTX

ATI had some ground-breaking performance cards along the way, like the high-end 9x00 models and the X1950 XTX, but none of them moved technology on quite like the nVidia cards.

Voodoo 1->Performance, texture filtering quality, colored lighting.

GeForce 1->T&L, which didn't get used that much but had the knock-on effect of fuelling the progress of things like bump mapping*, more advanced multi-texturing, more complex geometry, etc.

GeForce 3->Shader Model 1.x: specular lighting, cubic and other more advanced environment mapping, normal/bump maps, and even stencil-buffer features like realtime shadows. It was also a catalyst for a massive increase in scene poly counts.

8800GTX->Made it feasible, performance-wise, to actually use many new shader effects/DX9 features.

* OK, this one really belongs to Matrox with Expendable :D
 
Currently running a GTS 320MB on a 1280x1024 monitor (the 24-incher is broken and away on RMA) and it's running stuff very well indeed. Not sure how it will cope when the 24-incher comes back though.

Wish i still had my 8800GTX's in SLI.

 
I'm not going to go as far as to agree with DM, but I don't see what G80 did that was particularly special?

It doubled the performance of the previous top-end card. There you go. That's the same as what's been happening for what... ten years now?

It certainly didn't break any records for power efficiency, or size.

G92 was far more impressive. :)
 
What's so special about the G92... It's a die-shrunk, slightly more power-efficient G80, with a 256-bit memory bus instead of 384-bit.

My partner has a 9800GTX+ (which I bought for her to play World of Warcraft @ 1920x1200)... I have an 8800GTX (not the Ultra), and in most games @ 1920x1200 the 8800GTX is still inching ahead of the 9800GTX+ due to having 768MB of RAM instead of 512MB, and a 384-bit memory bus.
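
Rough bandwidth numbers, assuming the stock memory clocks I remember (1800 MT/s effective on the 8800GTX, 2200 MT/s on the 9800GTX+):

8800GTX: 384 bits x 1800 MT/s / 8 = ~86.4 GB/s
9800GTX+: 256 bits x 2200 MT/s / 8 = ~70.4 GB/s

So at 1920x1200 with AA, the older card has noticeably more bandwidth (and more memory) to play with.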

Anyway, the G92 is just a minor evolution of the G80. Mind you, both are still capable of handling the majority of new games at 60FPS with pretty high settings. Perhaps not maxed out 16x AA/AF, but I have no problem with most games at 8x AA/AF on the 8800GTX (@1920x1200).
 
both are still capable of handling the majority of new games at 60FPS with pretty high settings. Perhaps not maxed out 16x AA/AF, but I have no problem with most games at 8x AA/AF on the 8800GTX (@1920x1200)

What new games (released in the last 12 months) are you getting 60fps in @ 1920x1200 8xAA???
 
Anyway, the G92 is just a minor evolution of the G80. Mind you, both are still capable of handling the majority of new games at 60FPS with pretty high settings. Perhaps not maxed out 16x AA/AF, but I have no problem with most games at 8x AA/AF on the 8800GTX (@1920x1200)

:D:D:D

Always cracks me up when I read drivel like this. At least he has the balls to stick to his guns way after the masses have conceded, though. I've read so many people saying the same kind of thing who nonetheless spend hundreds on a new graphics upgrade as soon as they can afford it... :rolleyes:
 