***Nvidia GTX480 & 470 reviews & discussion***

Ch3m1c4L, good on you :) You're a bit of a loon, but I'd like to see some benchies with 3 of these beasts.
The fact is, at the moment the 480 is the fastest single GPU you can buy, don't let the ATI fanboys tell you otherwise... I'm not sure for how long, though ;)

(Before anyone starts to whine, I've owned ATI cards for the last two years.)
 
People keep saying that, but ATi fanboys or not, it's simply untrue. Any 5870 will clock to speeds that give better performance than a GTX480. The GTX480's limited overclocking headroom means it's NOT the world's fastest "GPU", because it can't beat another GPU that's perfectly capable of running well above the speeds it ships at.

Look at it this way: the chips used on 5850s are rated for 1.3 GHz. They don't run at that, but it doesn't mean they're not 1.3 GHz chips.

If a graphics card came out with 1.25 GHz RAM, but that was its maximum speed, you wouldn't say that card had the world's fastest graphics RAM now, would you?

Think about cars that are speed-limited by the engine computer: you CAN bypass the lock. The car leaves the factory at lower performance, but its engine is capable of far more if you remove the limiter.
 
Only nancy boy socialists get puzzlement and discomfort from other people's money.

So? Whether it bothers someone or not, I would still consider someone doing something that petty deliberately to provoke a reaction, just for kicks, to be really quite petty themselves.

And no, it wouldn't really bother me; I just can't see the point myself. But that's just my own viewpoint amongst many varied ones.
 
Only buying 3 for a few reasons, my mobo only supporting 3 cards being the main one. I also want to keep a sound card, which is why I originally got into water cooling (single-slot blocks), and I have a SATA controller card as well, so even if I could support 4 I doubt I'd have the slot space. When I was ordering it never crossed my mind. Scaling from 1->2 GPUs is good, 2->3 is OK, 3->4 sucks. While I view it as useful on 295s, that's only because it goes from 2 GPUs to 4. For the moment (on current drivers) I would not go quad SLI again. If enough benchies had been out when I bought my cards, at 2560 res I would have gone tri-SLI 285s or something.

I might buy one just to use as a heat shield to defend against the flaming from ATI fanboys. I mean, if it can survive under air cooling, then surely if I power the fan it can cool all the flaming in the world, right?

@kylew: You try running CrossFire in my system :P

I'm an nVidia fanboi, ffs. I only finally admitted it to myself the other day when I realised I had never purchased an ATI card in my life (at least not for myself). I'm now going to get a 5770 for my new media centre so I can get used to the drivers, and then maybe make a more sensible purchase in the future when I upgrade my mobo to something that supports xfire/SLI.
 
No, the GTX480 is not the fastest card. What about when the HD5870 beats it at higher res? And it does this quite often, particularly with the 10.3s and at stock clock speeds. Or is that a scam? Also, the HD5870 manages to keep up with it while using less power and running cooler. I wouldn't be surprised if it still used a lot less power even overclocked, and ran far cooler (need someone with an HD5870 that's clocking over 1 GHz on the core).

If the HD5870 were overclocked, a single GTX480 wouldn't stand a chance, let alone a pair of GTX480s against a pair of HD5870s. Tbh I don't think the GTX480s would be much faster than overclocked CF HD5870s... if anything the system will just reboot because it draws too much power :p So it's not really very friendly, is it? You'd probably need a 1.5 kW PSU, or two of them, to run it.

Also, to the point: you're going to pay £1200 for 3 ovens? Even watercooled, it will have to be quite a loop to keep everything cool. Just because you can have 3 GPUs on the mobo doesn't mean you should, and if you do, be prepared for your GF/wife to walk right out ;p

The point is this: Fermi doesn't really have a place in the gaming market, as there are already two cards that are not only cheaper but as good (if not better) in certain situations. If it were 20-30% faster it would be a different story, but the fact is it's not. I'm not saying it's a bad card or anything; it has its place in the world, and that place isn't really gaming. It's just a waste of money for gaming. You might as well spend £300 on an HD5870, and if you want more, overclock the nuts off it, and a single GTX480 (at a guess) wouldn't even touch it.

I'm not saying that's how it will be forever, as Nvidia will have to work on it and make it better as the years go by.
 
These forums are sinking to new lows on a daily basis; I used to respect this place.

Guys, guys, guys, calm down :eek:

Let's take a step back here and look at the larger political picture; then it may become slightly clearer to certain individuals...

There are 3 players in the CPU/GPU market: Intel, AMD, and Nvidia.

Nvidia are not allowed to make CPUs because it would make Intel and AMD cry... so they are re-inventing them on a massively parallel basis, something Intel tried and failed hard at. Like it or not, Fermi & CUDA is what Intel wanted Larrabee to be. Or Laughabee, or whatever they called it.

My point here is simple: the 480 is a very high-end gaming card, there's no two ways about that. But alongside that, Nvidia are blurring the lines between CPU and GPU like never before, not only because they want to, but because they have to just to stay alive. And they seem to be doing a pretty damn good job when you consider AMD and Intel want them dead, with their heads on poles outside the castle walls as a warning to any other tech companies with bright ideas.

More power to them, and long live Fermi!


To quote my previous post... 'putting their money where their mouth is by heavily investing their R&D in CUDA/GPGPU tech, which could either be very wise or disastrous depending on how the CPU market plays out, and on whether software developers start tapping into GPU development.

It's no secret that Nvidia want to produce CPUs, but the fact is Intel and (now especially) AMD will never allow them an x86 licence (or indeed a means of using one); that was proved when they tried to buy VIA a few years back.'
 
You say flaming from ATI fanboys, but I actually think the majority of it is people flaming because it's just not sensible at all to put three of those in a computer when you consider all their shortcomings and what is available elsewhere. It is a good card, but this first generation of the new architecture just isn't quite there yet.

Fanboy flaming from both camps can and does occur anyplace, anytime, though!
 
This.

He seems to think there's a huge argument going on when mostly it's just debate.

Not to say there aren't a few taking things too seriously, but personally I'm finding the debate interesting.

Ch3m1c4L isn't taking it personally that people are calling him mad for buying three, and nor should he, because I don't mean any harm by it, and I don't mean to insult him either; it's mostly all good fun, to be honest.
 
You don't happen to work for nVidia, do you? That sounds seriously like PR marketing blurb.

You may be telling yourself that nVidia are trying to reinvent the CPU, but it's not quite that simple. They NEED a new avenue to go down, and GPGPU is a natural progression considering they already make GPUs. I think that's simply all it is.

Performance-wise, ATI's GPUs actually appear to be faster than Fermi for GPGPU anyway, which isn't good at all for nVidia considering the market they've tried to hit with Fermi.

The only saving grace is CUDA and its market adoption.
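
For anyone who hasn't looked at CUDA and wonders why it gets the developer mindshare, here's a rough idea of what it looks like. This is just a minimal, made-up vector-add sketch (not from NVIDIA's SDK or anyone's real code), but it shows how little ceremony is involved in getting work onto the GPU:

[CODE]
// Minimal illustrative CUDA sketch: add two big arrays on the GPU.
// The point is the programming model, not the maths.
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// Each thread handles one element of the output.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // 1M floats
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers and copies.
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);           // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
[/CODE]

It's basically C with a couple of extensions, which is exactly why so many developers have picked it up; that adoption is the moat, not the hardware.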
 
And ATi's GPGPU performance is not that far behind (if at all); they just need something like CUDA to make it more usable day to day.
 
Haha, no, I don't work for nVidia, I'm just applying a bit of common sense to the situation. This is virgin, undeveloped tech, don't forget, and it's quite obvious the competition is rattled (see the last 20 pages).
 
Stream? But the support for it is a little limited at this moment in time, which is a shame.

And not really; AMD know they won this round, as Fermi failed to be any faster. It's just the same, plus the high power and high temps. It may be different next time around as Nvidia improves it.

Think about it a bit: are you going to tell your mates to get a GTX480 for gaming when it's just not much faster and it's about £150 MORE? When they find that out, they won't be happy about it.
 
The irony is that Intel/AMD pretty much made Nvidia do this by trying to squeeze them out.
 
They have Stream, but it's pointless because it's locked to their own hardware; OpenCL/DirectCompute all the way.
 
The whole flaming thing has gone full circle, and some of you have short memories. Do you remember when the HD5000 cards came out and the Nv fanboys were giving it:

'The HD5870 isn't always faster than the GTX295'

'Bah, the HD5770 is being bottlenecked by its 128-bit memory bus'

Give it a month and all the excitement about Fermi, or rather the anticlimax, will have gone away, just like it did with the Radeons back in October/November time.
 
Yea, I'm not taking it personally, since it is freaking stupid tbh. I'm pretty lucky that I can do it. Being single = lots of monies for myself, which tbh I'm loving atm: had a holiday at the beginning of March, got one at the end of April, bought 3 new cards, and I'm buying an HTPC. Now, those are not my normal spending habits, and I had saved for a good 6 months (I save at least £300 a month normally, more if I make any extra from overtime or trading). IF I were to go out and get myself a new GF, I doubt I would be able to afford any of this, lol, and WIFE... I'm only 24!

Now, someone go and post in my media center build thread to advise me on my FIRST EVER ATI card!
 