
NVIDIA GeForce GTX 960 Launch Date Revealed

So in Tomb Raider, the 980s are 23% faster than the 290Xs whilst using only 63% of the power that the 290Xs use.

That is a massive difference.

Cannot really see what Humbug has a problem with. Why does it matter that the 980s aren't drawing the most power that they can? They are still significantly faster than the 290Xs anyway. :confused:
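Those two quoted figures can be sanity-checked with quick arithmetic; this is just the 23%/63% claim from above turned into a performance-per-watt ratio, nothing more:

```python
# Sanity-check of the efficiency claim using the figures quoted above
# (23% faster at 63% of the power); illustrative arithmetic only.
relative_performance = 1.23  # GTX 980 framerate relative to the R9 290X
relative_power = 0.63        # GTX 980 power draw as a fraction of the 290X's

perf_per_watt = relative_performance / relative_power
print(f"GTX 980 perf-per-watt advantage: {perf_per_watt:.2f}x")
```

So on those numbers the 980 delivers roughly twice the work per watt in that one game.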
 
What has made me laugh about what Tom's did is the way they went about testing.

All a PC user is interested in is how much power it uses in an hour, day, month or year. This can easily be tested with very basic equipment that measures how many watts are used over the relevant time period. Perhaps Tom's should pop down to their local elec board for some basic tips on how to do this lol.
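The hour/month accounting described above is back-of-the-envelope arithmetic; the wattage, hours and tariff below are made-up example numbers, not measurements:

```python
# Energy cost from average draw: watts -> kWh -> cost.
# All input figures here are assumptions for illustration.
avg_watts = 250        # assumed average system draw while gaming
hours_per_day = 4      # assumed daily gaming time
price_per_kwh = 0.15   # assumed tariff in GBP per kWh

kwh_per_month = avg_watts / 1000 * hours_per_day * 30
monthly_cost = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.0f} kWh/month, about £{monthly_cost:.2f}")
```

Note this only captures the average draw over time, which is exactly what the dynamic-measurement argument later in the thread is about.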
 
What has made me laugh about what Tom's did is the way they went about testing.

All a PC user is interested in is how much power it uses in an hour, day, month or year. This can easily be tested with very basic equipment that measures how many watts are used over the relevant time period. Perhaps Tom's should pop down to their local elec board for some basic tips on how to do this lol.

Home power testers are not that accurate - this is why websites like TPU actually bother to spend £1000+ on a decent multimeter for testing.

They also showed the GTX970 and GTX980 being 35% to 50% more efficient than a GTX780, and remember TH uses a suite of games to measure power consumption. They do the same for CPU power measurements, which use a test suite and not one application.

You seem to ignore what Tom's was trying to show, which has always been consistent with what Nvidia has said about how it works, especially when you consider why it's important for the Tegra line of SoCs, since such quick changes are important for balancing the power and TDP between the CPU and GPU sections of the chip. Nvidia has sunk billions into Tegra development.

Intel has been doing similar things for years:

http://images.anandtech.com/doci/8595/Power Management.png

The burst mode enables quick bursts to higher clockspeeds during peaky workloads, improving performance. However, it happens so quickly that the average TDP is not actually exceeded.
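That burst behaviour can be illustrated with a toy duty-cycle calculation; every number below is invented for illustration, not taken from any real card:

```python
# Toy duty-cycle model of burst clocks: brief spells above TDP, but the
# time-weighted average stays under it. All figures are assumptions.
tdp = 165.0            # assumed rated TDP in watts
burst_power = 240.0    # assumed instantaneous draw during a burst
steady_power = 120.0   # assumed draw between bursts
burst_fraction = 0.25  # assumed fraction of time spent bursting

avg_power = burst_fraction * burst_power + (1 - burst_fraction) * steady_power
print(f"peak {burst_power:.0f} W, average {avg_power:.0f} W, TDP {tdp:.0f} W")
```

The point is that a slow averaging meter would only ever see the 150 W figure, even though the silicon regularly exceeds its TDP for milliseconds at a time.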

AMD is behind Nvidia and Intel in this regard.

It's the first website to measure dynamic power consumption properly, and we had the same resistance from enthusiasts when frame latency tests were done by TR. As time progresses we are going to see more and more websites start doing it.

Now, is Maxwell more efficient at a hardware level than Kepler? It is, just as Kepler was when compared to Fermi. Kepler did things like moving to software scheduling over Fermi, and Maxwell reduced the ratio of texture units relative to Kepler, and so on.

But then people are also comparing the GTX970 and GTX980 to the GTX780,which is a bit of an unfair comparison IMHO.
 
Home power testers are not that accurate - this is why websites like TPU actually bother to spend £1000+ on a decent multimeter for testing.
You seem to ignore what Tom's was trying to show, which has always been consistent with what Nvidia has said about how it works, especially when you consider why it's important for the Tegra line of SoCs, since such quick changes are important for balancing the power and TDP between the CPU and GPU sections of the chip. Nvidia has sunk billions into Tegra development.
It's the first website to measure dynamic power consumption properly, and we had the same resistance from enthusiasts when frame latency tests were done by TR. As time progresses we are going to see more and more websites start doing it.

You pay for your electricity based on the total amount of power used, not some dynamic peak found in a dodgy test.

Yes, I do think that what Tom's did was very dodgy, and it can be tested quite easily.

Put 4 x 980s + a 5960X on a Corsair 1200i and it will run.

Put 4 x 290Xs + a 5960X on a Corsair 1200i and all you will get is pain.

I think even though it is crude, having one setup run on a PSU and one setup fail on a PSU is a way better guide to dynamic peaks than anything Tom's did. :D
 
What has made me laugh about what Tom's did is the way they went about testing.

All a PC user is interested in is how much power it uses in an hour, day, month or year. This can easily be tested with very basic equipment that measures how many watts are used over the relevant time period. Perhaps Tom's should pop down to their local elec board for some basic tips on how to do this lol.

Put simply, the reason they do not use a basic hardware-store kW meter is the same reason Nvidia built the FCAT system: just as FRAPS falls short for testing frame latency, a kW meter is useless for measuring power draw on Maxwell GPUs.
 
You pay for your electricity based on the total amount of power used, not some dynamic peak found in a dodgy test.

Yes, I do think that what Tom's did was very dodgy, and it can be tested quite easily.

Put 4 x 980s + a 5960X on a Corsair 1200i and it will run.

Put 4 x 290Xs + a 5960X on a Corsair 1200i and all you will get is pain.

I think even though it is crude, having one setup run on a PSU and one setup fail on a PSU is a way better guide to dynamic peaks than anything Tom's did. :D

You mean like their average power tests, which paint an R9 290X as consuming around 25% more power while being slower than a GTX980?

I think you have dug yourself into a hole about hating them.

I added more info to my last answer. There is nothing dodgy about their test, since it agrees with what we know about Maxwell, and Intel does not do anything differently with their load management. It is actually an intelligent use of the available TDP and power budgets. There is enough information to indicate AMD is going to do the same, but Nvidia has a 9+ month lead over them with such tech.

You don't seem to get that even the Integra multimeters which TPU use cannot capture very quick dynamic power swings.
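A quick simulation shows why a slow meter under-reads fast transients; the spike height, base draw and timings here are toy numbers chosen purely to make the point:

```python
# Simulate one second of power draw with brief 5 ms spikes, then "read" it
# with a slow meter that only reports the one-second average. Toy numbers.
spike_w, base_w = 300.0, 150.0
trace = [spike_w if i % 100 < 5 else base_w for i in range(1000)]  # 1 ms steps

true_peak = max(trace)
slow_meter = sum(trace) / len(trace)  # a slow meter averages the spikes away
print(f"true peak {true_peak:.0f} W, slow meter reads {slow_meter:.1f} W")
```

The averaging meter reports a figure barely above the base draw and completely misses the 300 W transients, which is exactly what an oscilloscope-style measurement is needed for.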

In fact, what TH has used is not unique.

It's been done before in other areas of engineering to measure dynamic power:

https://books.google.co.uk/books?id...ynamic power consumption oscilloscope&f=false

http://www.cmicrotek.com/uPower_Analyzer_whitepaper.pdf

http://literature.cdn.keysight.com/litweb/pdf/5991-4268EN.pdf

You might notice it is far more commonly used for measuring the power consumption of low-power devices due to its sensitivity (one chap I knew at university was doing similar things, since they were working in that area).

That's just the first few hits off Google.

We will have to agree to disagree, but what TH has done is not special in the scheme of things - they are just the first computer site to do it.
 
He does seem to be stuck in a cycle of defending Nvidia at all costs these days, including rubbishing arguably the best hardware reviewer on the net ^^^^
 
So in Tomb Raider, the 980s are 23% faster than the 290Xs whilst using only 63% of the power that the 290Xs use.

That is a massive difference.

Cannot really see what Humbug has a problem with. Why does it matter that the 980s aren't drawing the most power that they can? They are still significantly faster than the 290Xs anyway. :confused:

Because finding an extreme edge case and then constructing an entire ethos around it is all he has to cling to. That, and personal attacks on someone who spends thousands of pounds on AMD hardware. I'm sure Kaap does that just so he can rubbish them on forums. :rolleyes:

At the end of the day, Kaap has done some testing people asked him to do. Thanks to Kaap. Of course people are free to draw whatever conclusions they want from it, based on whether or not it fits their bias.
 
He does seem to be stuck in a cycle of defending Nvidia at all costs these days, including rubbishing arguably the best hardware reviewer on the net ^^^^

Give it a rest, Humbug. You have thrown personal insults and attacks at Kaap, and he only did what you asked him to (bench a game). You didn't like the results for whatever reason (they made AMD look bad?), so instead of saying thanks for the info, you thought it better to have a go at him because it didn't paint your chosen brand in the light you would put them in.

I use a MM and it does a good enough job for me and many other users on the forum. Keep up the good work Kaap :)
 
So in Tomb Raider, the 980s are 23% faster than the 290Xs whilst using only 63% of the power that the 290Xs use.

That is a massive difference.

Cannot really see what Humbug has a problem with. Why does it matter that the 980s aren't drawing the most power that they can? They are still significantly faster than the 290Xs anyway. :confused:

Be aware they are only significantly faster at 1080p, where the consumption vs framerate is undeniably brilliant. However, consumption increases for both the 980s and 290Xs at 1440p and 4K; the 980s still excel in power consumption, but Nvidia's significant framerate advantage is eradicated.
 
Because finding an extreme edge case and then constructing an entire ethos around it is all he has to cling to. That, and personal attacks on someone who spends thousands of pounds on AMD hardware. I'm sure Kaap does that just so he can rubbish them on forums. :rolleyes:

At the end of the day, Kaap has done some testing people asked him to do. Thanks to Kaap. Of course people are free to draw whatever conclusions they want from it, based on whether or not it fits their bias.

It's a proven methodology for testing the power consumption of electrical components that he is disagreeing with.

Would you have us agree with him when we think what Tom's Hardware did is sound?
 
It's a proven methodology for testing the power consumption of electrical components that he is disagreeing with.

Would you have us agree with him when we think what Tom's Hardware did is sound?

He isn't disagreeing with the method, but the conclusion.
Go back and check the Tom's article:
[EDIT] We originally posted Power Consumption Torture (GPGPU) results that showed a simulated GeForce GTX 970 reference card pulling over 240 Watts. This does not represent Nvidia's reference GeForce GTX 970 board because our data point was simulated with a Gigabyte GTX 970 card that has a non-reference ~250 Watt power target, unlike the reference board's ~150 W power target.

And


[EDIT] Our original Nvidia GeForce GTX 980 reference sample suffered from a BIOS issue that caused a higher-than-expected power draw. We flashed the card with the reference BIOS and have updated the charts below with the new results. [/EDIT]


So Tom's have in fact already highlighted that their testing method was faulty, hence the conclusions are also faulty; their results are now closer to what Kaap is showing.
 
He isn't disagreeing with the method, but the conclusion.
Go back and check the Tom's article:
[EDIT] We originally posted Power Consumption Torture (GPGPU) results that showed a simulated GeForce GTX 970 reference card pulling over 240 Watts. This does not represent Nvidia's reference GeForce GTX 970 board because our data point was simulated with a Gigabyte GTX 970 card that has a non-reference ~250 Watt power target, unlike the reference board's ~150 W power target.

And


[EDIT] Our original Nvidia GeForce GTX 980 reference sample suffered from a BIOS issue that caused a higher-than-expected power draw. We flashed the card with the reference BIOS and have updated the charts below with the new results. [/EDIT]


So Tom's have in fact already highlighted that their testing method was faulty, hence the conclusions are also faulty; their results are now closer to what Kaap is showing.

You missed a bit.

If the load is held constant, then the lower power consumption measurements vanish immediately. There’s nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980’s power consumption to an overclocked GeForce GTX Titan Black, there really aren’t any differences between them. This is further evidence supporting our assertion that the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching.
 
It's valid if you are loading the card up, and this has been your argument throughout, but which games will actually do that? That is the argument from the other side.

My argument has been that, overall, Maxwell is more efficient than Hawaii and its predecessor Kepler.

But it's not as if it will never pull those sorts of power loads; it can and will, just not all the time.
 
It won't sustain those levels of current draw long term unless the dynamic load was hacked or set to 100%, i.e. a GPU compute torture test. I don't think any game will cause this to happen.

As it's more dynamic and more in control of stepping p-states up and down, it'll always be finding a happy medium. AMD are behind on this, but the issue wouldn't be so bad if AMD had released their new card or still held a performance advantage.
 
It won't sustain those levels of current draw long term unless the dynamic load was hacked or set to 100%, i.e. a GPU compute torture test. I don't think any game will cause this to happen.

As it's more dynamic and more in control of stepping p-states up and down, it'll always be finding a happy medium. AMD are behind on this, but the issue wouldn't be so bad if AMD had released their new card or still held a performance advantage.

I don't think any game will cause this to happen either, not at a constant, but DirectCompute can put a heavy load on GPUs, and in that case it could load up the GPU.

It depends on which game, even which part of it; it's no longer good enough to test one thing and use that as an across-the-board power consumption template. Maxwell simply doesn't work like that.

I also don't think long-term loads like that will harm the card, as it is in fact designed to tolerate them: they have 450 Watt power phases, 300 Watt inlets and 300 Watt cooling (which, incidentally, in a lot of reviews heats up just as much as Kepler and Hawaii).

What's more, compute and rendering tasks will put them under those loads, and at a constant.
 