
Interesting 470 overclocking article on TweakTown

The clock speed limit of Cypress GPUs is in excess of 1GHz; that's the limit of the GPU itself. The clock speed of GF100 seems to top out at not much over 750MHz, which isn't a big enough boost for it to beat a Cypress GPU.

All of the GF100 overclocks the reviewers have reached have been on stock volts, stock cooling and running on barely-working beta drivers.

Compare that to the myriad of voltage tweak/driver revision/cooling options now available for the 5800s, and your "conclusion" looks absolutely asinine.

Tame the VRM temps and smack up the juice, stick the chip under water, and I think we'll be seeing circa 900MHz on the core before too long.
 
All of the GF100 overclocks the reviewers have reached have been on stock volts, stock cooling and running on barely-working beta drivers.

Compare that to the myriad of voltage tweak/driver revision/cooling options now available for the 5800s, and your "conclusion" looks absolutely asinine.

Tame the VRM temps and smack up the juice, stick the chip under water, and I think we'll be seeing circa 900MHz on the core before too long.

There is very little overclocking potential with them.

They run at 95°C at STOCK; their throttling temperature is 105°C.

They have OVERHEATED on LN2 cooling.

Seriously, they are broken GPUs.

"Barely working drivers" you're grasping at everything you can to make it look a bit better.

Barely working drivers aren't suddenly going to make the card overclock to far higher numbers the moment they ascend from "barely working" to "work a bit".

You're living in a fantasy land if you think 900Mhz on core is something remotely achievable with simple watercooling.

In addition to that, please take note that I am still comparing what is, for all intents and purposes, still a REFERENCE 5870 to a REFERENCE GTX480. I'm not talking about cherry-picked special OC versions that aren't going to fix what is currently a broken GPU.

Look at the price of a GTX480 now; how much do you think an OC version with a waterblock slapped on will go for? £500-550 easily, and £600 is just as likely. How does that help your argument? There have been 5870s for under £280; you could have got two for what will likely be the same price.

Stop talking nonsense.
 
I have to agree with kylew on this one. Only hardcore nVidia fans could fail to see how poorly the new GeForces come across next to the older Radeons.

PS nVidia drivers aren't "barely-working betas".
 
I love how watercooling is now "tweaking" and how everybody has the ability and desire to use it to get a few extra MHz from the Fermi furnace.

Also, I thought the drivers for Fermi were almost as mature as Catalyst drivers, unless the green team has been doing nothing at all to help themselves over the 6 month delay.

Surely the fact you are all talking about watercooling proves that Fermi is a failure since you have to make such efforts just to compete.
 
It didn't overheat on LN2; it was showing artifacts typical of overheating. In fact it was probably power leakage. IMO, dropping the voltage down a notch would probably have given just as good, if not better, clocks without the leakage issue.
 
I love how watercooling is now "tweaking" and how everybody has the ability and desire to use it to get a few extra MHz from the Fermi furnace.

Also, I thought the drivers for Fermi were almost as mature as Catalyst drivers, unless the green team has been doing nothing at all to help themselves over the 6 month delay.

Surely the fact you are all talking about watercooling proves that Fermi is a failure since you have to make such efforts just to compete.

Fernice. :D

He's grasping at straws, I've seen the nVidia hardcores going on about that since its release.

"Wait for the optimised drivers, these are just immature buggy ones".

When you ask them "what do you think nVidia have been doing for the last 8 months while they've been tweaking the core layout to improve the yields?" they don't say anything.

I'm not going to claim just HOW mature their drivers are, as I don't know how long they've been working on them compared to AMD, but they ARE NOT immature, buggy, barely-working betas at all, and saying so shows you have a different agenda.
 
It didn't overheat on LN2; it was showing artifacts typical of overheating

That is akin to your "nvidia didn't lock ATi hardware OUT of Batman AA, they locked nVidia hardware IN". VERY little difference unless you want to argue the semantics until you're breathless.


In fact it was probably power leakage. IMO, dropping the voltage down a notch would probably have given just as good, if not better, clocks without the leakage issue.
In your opinion, yes; however, there's nothing wrong with calling it what it APPEARS to be until it's proven otherwise.

As for dropping the voltage, do you honestly mean to say that these hardcore overclockers literally ham-fistedly set the numbers on the cards really high and just benched it?

Come off it Rroff, you should know better than that.

They will have been tweaking the settings and messing about with different combinations of settings.

Or do you think you've got some pool of knowledge that these hardcore overclockers don't?

If you think they should have eased off the voltage and gone for an all-out clock speed, why didn't they know that, assuming it's correct?

In addition to that, what's to say that they didn't only up the v-core to allow a higher clockspeed?

That's generally how it works, no? Try the highest frequency on a specific voltage until it doesn't work properly?

How do you propose that, when they've run out of headroom at a particular voltage, they then LOWER the voltage to get a higher clock speed?
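
For what it's worth, the "try the highest frequency at a given voltage, then add a notch and repeat" approach being described is easy to sketch. The Python below is purely illustrative: set_voltage(), set_core_clock() and is_stable() are invented stand-ins for whatever tool and stress test an overclocker actually uses, and the fake silicon model is only there so the loop runs end to end.

```python
# Illustrative sketch of the stepping loop described above.  The helpers are
# hypothetical stand-ins for a real overclocking tool + stress test; the
# "silicon" here is faked so the example runs on its own.

def set_voltage(mv):
    print(f"vcore set to {mv} mV")

def set_core_clock(mhz):
    print(f"core clock set to {mhz} MHz")

def is_stable(mhz, mv):
    # Fake chip: roughly 0.5 MHz of extra headroom per extra mV over 1000 mV.
    return mhz <= 700 + 0.5 * (mv - 1000)

def find_max_clock(start_mhz=700, start_mv=1000, step_mhz=10, step_mv=25,
                   max_mv=1150, max_mhz=1000):
    clock, volts = start_mhz, start_mv
    best = (clock, volts)
    while volts <= max_mv:
        set_voltage(volts)
        # Push the clock up at this voltage until the stress test fails.
        while clock + step_mhz <= max_mhz and is_stable(clock + step_mhz, volts):
            clock += step_mhz
            set_core_clock(clock)
            best = (clock, volts)
        volts += step_mv   # out of headroom at this voltage: add some juice
    return best

print("best stable combination:", find_max_clock())
```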
 
That is akin to your "nvidia didn't lock ATi hardware OUT of Batman AA, they locked nVidia hardware IN". VERY little difference unless you want to argue the semantics until you're breathless.

There is a world of difference between enabling a vendor-specific feature and intentionally locking out the competition.

Let's take Mass Effect 2 as an illustration of the point. Like Batman, it's built on a branch of the same engine and out of the box lacks anti-aliasing on both ATI and nVidia hardware. Like Batman, there are issues with implementing AA alongside the deferred shading and HDR paths that are most effectively worked around with the cooperation of the respective hardware developers. As it happens, the game shipped without working AA, which was later patched in via driver hacks by both ATI and nVidia.

But let's say the developer had approached ATI and nVidia asking them to help implement AA for their respective hardware, and by launch ATI had provided a working implementation while nVidia had dithered and had no working code. The developer then has a choice: ship with AA enabled only on ATI hardware, ship with no AA support at all, or ship with the ATI implementation also applied to nVidia hardware. A quick look at this thread will show the potential problems with forcing AA on unsupported configurations under those circumstances: at first glance it appears to work OK, but in some scenarios you get random slowdowns, visual glitches or even crashes.

So maybe the developer then ships with the option for AA enabled on ATI cards, but not shown on other hardware - seems quite reasonable... until all the nVidia "fanboys" start moaning about how ATI has locked out AA on nVidia cards in ME2...
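
To make the illustration concrete, the "expose AA only on the hardware it was validated against" choice amounts to nothing more than a vendor check when the options menu is built. The Python below is a purely hypothetical sketch, not anything from ME2 or Batman, and detect_gpu_vendor() is an invented stand-in for however an engine actually identifies the installed card.

```python
# Hypothetical sketch of the vendor-gated AA option described above.
# Not real ME2/Batman code; detect_gpu_vendor() is an invented stand-in.

VALIDATED_AA_VENDORS = {"ATI"}   # vendors the AA path was actually tested on

def detect_gpu_vendor():
    # Stand-in: a real engine would read this from the driver / device ID.
    return "NVIDIA"

def build_graphics_options():
    options = {"resolution": True, "texture_quality": True}
    # The AA path only exists for the hardware it was developed against, so
    # it simply isn't exposed elsewhere rather than being shipped broken --
    # the choice the ME2 illustration describes.
    options["in_game_aa"] = detect_gpu_vendor() in VALIDATED_AA_VENDORS
    return options

print(build_graphics_options())   # in_game_aa comes out False on this "card"
```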

In your opinion, yes; however, there's nothing wrong with calling it what it APPEARS to be until it's proven otherwise.

As for dropping the voltage, do you honestly mean to say that these hardcore overclockers literally ham-fistedly set the numbers on the cards really high and just benched it?

Come off it Rroff, you should know better than that.

They will have been tweaking the settings and messing about with different combinations of settings.

Or do you think you've got some pool of knowledge that these hardcore overclockers don't?

If you think they should have eased off the voltage and gone for an all-out clock speed, why didn't they know that, assuming it's correct?

In addition to that, what's to say that they didn't only up the v-core to allow a higher clockspeed?

That's generally how it works, no? Try the highest frequency on a specific voltage until it doesn't work properly?

How do you propose that, when they've run out of headroom at a particular voltage, they then LOWER the voltage to get a higher clock speed?

Shamino said they saw artifacts that normally happen with overheating; he did not say the card was actually overheating. In fact, the opposite: he was surprised to see those artifacts at the temperatures they were running. It is therefore fairly reasonable to conclude that these artifacts occurred due to power leakage, which has been a plague on extreme nVidia shader overclocking.

I'm not saying they didn't try it - they might have done - but the methodology is counter-intuitive, and he would be far from the first experienced overclocker to miss it.
 
SNARFF!!!

Nonsense, Rroff, didn't you read that BSN article? It points out EXACTLY what happened: nVidia supplied a generic piece of code to Rocksteady and then claimed it was proprietary, Eidos claimed it was nVidia's property, and then neither wanted to claim ownership of anything and they tried to pass the blame onto each other.

They've been exposed as locking AA out on ATi cards; there is NO REASON for you to keep arguing over it. They locked out AA on ATi hardware maliciously, and it's been proven.

And just in case you "missed" when the article was posted the first time, here, have a second link to it, you could really do with reading it.

Batmangate
 
Please read my ME2 illustration and then tell me... would that be ATI locking out nVidia or not?

The code wasn't 100% generic anyhow. The final resolve part was optimised for and tested against nVidia hardware; all the code up to that point was in fact running on other hardware, and all that was missing was the final resolve path, which for obvious reasons would be better optimised for the target hardware. That link just rehashes all the information and brings it together, but doesn't really provide any insight beyond face-value interpretations and very little in the way of real analysis.
 
I doubt that was down to an electromigration problem TBH.

I would think something was getting too hot, maybe the air pocket under the heat spreader. The core could have become voltage-starved as the temps increased, or just plain crapped out. If they ran it again under liquid helium we would soon know what happened.

IMO the problem was some type of heat issue, probably because the side of the die becomes incredibly hot under the load.
 
With that 40nm process, extreme cooling, increased voltage and clock speed, you're going to hit the wall with transistor leakage long before heat becomes a problem.

It was a problem with the Pentium 4 and RAM based on the D9GMH chips too: no matter how much cooling you used, at higher voltages and switching speeds you suffered leakage and premature levels of hardware-degrading electromigration that would kill them very, very quickly, even with extreme cooling.
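
To put rough numbers on why piling voltage and clock speed onto a 40nm part gets ugly so quickly: dynamic power scales roughly with V² x f, and leakage (which also climbs with voltage and temperature) comes on top of that. The figures in the quick calculation below are illustrative round numbers, not measured GTX480 values:

```python
# Back-of-the-envelope scaling: dynamic power is roughly proportional to
# V^2 * f.  The voltages and clocks are illustrative round numbers, not
# measured GTX480 figures; leakage rises on top of this.

stock_v, stock_f = 1.00, 700e6     # volts, Hz
oc_v,    oc_f    = 1.15, 900e6

scale = (oc_v / stock_v) ** 2 * (oc_f / stock_f)
print(f"dynamic power scales by roughly {scale:.2f}x")   # ~1.70x
```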


EDIT: I'm not ruling out overheating - it's not impossible - it's just that my experience with nVidia overclocking would tend to err on the side of problems other than heat.
 
Only on this forum could a thread on GTX470 overclocking result in a tit-for-tat argument about the six-month-old Batman AA fiasco.

I doubt that was down to an electromigration problem TBH.

Have you guys listened to yourselves? It's a graphics card, lol!
 
As I said in a previous post:


All it takes is for AMD to release a normal 5870 with a new BIOS at 1000 core / 1300 memory and appropriate voltages, and it's without a doubt the fastest GPU. They don't need to make a new GPU; they can just release new cards with the same hardware and a new BIOS. This is what makes it the fastest GPU: if the GTX480 were the faster GPU, it would be able to overclock high enough to top an overclocked 5870.

Why does it take AMD to officially release a card with a modded bios for you to consider it to be the fastest GPU?


According to my above logic, the 5870 is STILL the fastest GPU, regardless of the clocks it's currently sold at, if we assume that it's essentially underclocked, as they're pretty much all capable of 1000MHz on the core.

A stupid argument.

All Nvidia would have to do is select the few 512 SP parts, or the cores that clock to 750MHz, and Nvidia would have the fastest GPU again.

If ATI then release an 1100MHz core that burns hotter than the sun, Nvidia can find the one single 480 with 512 SPs that clocks to 850MHz and has the fastest GPU in the world, which would destroy a 5870 at any realistic clock speed.


Or we can be more realistic and accept that, among currently mass-produced cards, the 480 is the single fastest GPU at stock settings.
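
For whatever the "fastest GPU" back-and-forth is worth, the percentage headroom implied by the clocks being thrown around in this thread is easy to work out. The snippet below just does that arithmetic on the thread's own claims (850MHz reference / ~1000MHz claimed for the 5870, 700MHz reference / ~750MHz claimed for the GTX480); the numbers are the thread's claims, not measurements:

```python
# Percentage overclocking headroom implied by the clocks quoted in this
# thread (taking the thread's claims at face value, not measured data).

cards = {
    "HD 5870": (850, 1000),   # reference MHz, claimed achievable MHz
    "GTX 480": (700, 750),
}

for name, (stock, claimed) in cards.items():
    gain = 100 * (claimed - stock) / stock
    print(f"{name}: {stock} -> {claimed} MHz = +{gain:.1f}% headroom")
# HD 5870: +17.6%, GTX 480: +7.1% -- the gap the whole argument is about
```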
 
All of the GF100 overclocks the reviewers have reached have been on stock volts, stock cooling and running on barely-working beta drivers.

Compare that to the myriad of voltage tweak/driver revision/cooling options now available for the 5800s, and your "conclusion" looks absolutely asinine.

Tame the VRM temps and smack up the juice, stick the chip under water, and I think we'll be seeing circa 900MHz on the core before too long.

Indeed. I've read nothing in the overclocking reviews to suggest the 480 won't clock well if cooled.
 