
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Try reading around ... some haven't stuck to the PR line quite as closely as Tomshardware have (they never deviate around NVIDIA launches or big PR campaigns).

It's not a 180W card.

Not sure, you could be right, but I read a lot of reviews and didn't see any problems, although admittedly I was looking at performance etc. a lot more than TDP. But it's going to be a lot better than a 980 Ti for power usage anyway. It should be at 180W at default settings, then somewhere around 225W or less if you raise the power limit.

On the "overclockersclub" overclocking review... it says it draws slightly less power than a GTX 970 (when both are overclocked)... Sounds good to me.

http://www.overclockersclub.com/reviews/nvidia_geforcegtx_1080_overclocking/7.htm

980 Ti 80W higher....

980 Ti 250W TDP, 1080 180W TDP...
 
Looking around, all I'm seeing is between 178W and 184W at stock at full tilt. Once you OC the thing it's going to increase, but there is a limit, and that's the 8-pin limit (150W) plus the PCIe slot (75W), if I'm correct. But again, I'm no ATX/GPU engineer.
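For anyone who wants the back-of-the-envelope version, here's a rough sketch of that ceiling using the standard PCIe power-delivery figures (75W from the slot, 150W from one 8-pin); Python and the variable names are just for illustration:

```python
# Rough sketch of the in-spec power ceiling for a single-8-pin card,
# using the standard PCIe power-delivery figures. Illustrative only.

PCIE_SLOT_W = 75    # PCIe x16 slot, per spec
EIGHT_PIN_W = 150   # one 8-pin PEG connector, per spec
TDP_W = 180         # GTX 1080 reference TDP

ceiling_w = PCIE_SLOT_W + EIGHT_PIN_W   # 225W in-spec maximum
print(f"Ceiling: {ceiling_w}W ({ceiling_w / TDP_W:.0%} of the 180W TDP)")
# -> Ceiling: 225W (125% of the 180W TDP)
```

Which squares with the "somewhere around 225W if you raise the power limit" figure mentioned earlier in the thread.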
So yet again pmc25 is just spreading anti-Nvidia FUD.

In other news, the sun is reported to rise tomorrow morning. Do not be alarmed.
 
Also, a decent 8-pin can usually push more than 150W, the same way a quality 6-pin can push more than 75W, and has been known to do so.

The card really does need an 8+6-pin though. Even my Tis maxed out hit 80-90% power draw out of the 121% the limit allows. A 1080 with a mild overclock requiring 120% kind of proves this.
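Rough numbers on that, assuming the 180W reference TDP and the 120% power-limit figure from the post above, plus the in-spec connector ratings (a sketch, not measured data):

```python
# Sketch of the headroom argument: how close a 120% power limit on a
# 180W TDP comes to the single-8-pin ceiling, and what an extra 6-pin
# (75W per spec) would buy. Figures taken from the posts above.

TDP_W = 180
oc_draw_w = TDP_W * 1.20           # "mild overclock requiring 120%" -> 216W

caps = {
    "8-pin only": 75 + 150,        # 225W: a 216W OC leaves ~9W of headroom
    "8+6-pin":    75 + 150 + 75,   # 300W: roomier for real overclocking
}
for label, cap_w in caps.items():
    print(f"{label}: {cap_w}W cap, {cap_w / TDP_W:.0%} of TDP")
```

216W of a 225W cap is cutting it fine, which is presumably why partner cards add the extra connector.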
 
So yet again pmc25 is just spreading anti-Nvidia FUD.

In other news, the sun is reported to rise tomorrow morning. Do not be alarmed.

Stop making yourself look silly.

Is the Fury X a 453 watt card?

http://www.tomshardware.co.uk/amd-radeon-r9-fury-x,review-33235-7.html

No, so please stop spreading stupid misinformation around as it helps no one.

Sadly that is all he does: makes up rubbish and tries to play it off as fact. He isn't even looking for negatives, he's just fantasising them up :(
 
If I wanted to say bad things about Nvidia cards, I think TDP would be about the worst thing to pick. You could pick loads of things, marketing, etc., but TDP compared to AMD is not a good one. Have you seen a 290X's power consumption? And the 390/390X (mid-range cards) are similar to a 290X. The performance per watt on those is not good at all.
 

Doom at 2GHz.

I don't know why people get so hung up about how many watts a card uses as long as it works.


Because it's always a source of whinging among the rabid fanboys: "omg your card needs 5 more watts to run at the same speed as mine, your card sux". It's a number, therefore it's a peen-measuring utility for arguments among the Nvidia and AMD nut swingers.
 
http://www.pcadvisor.co.uk/new-prod...tx-1070-release-date-price-specs-new-3639751/

"Given that both GPUs are $50 more than their predecessor's launch price, with the GTX 980 sitting at £429 and 970 at £259, we expect the UK price for the latest Pascal GPUs to be around £449 for the GTX 1080 and £289 for the GTX 1070 and £339 for the GTX 1070 Founder Edition - making them affordable GPUs given their prospective speed gains - read on to find out about their performance."
 
Because it's always a source of whinging among the rabid fanboys: "omg your card needs 5 more watts to run at the same speed as mine, your card sux". It's a number, therefore it's a peen-measuring utility for arguments among the Nvidia and AMD nut swingers.

I can understand why people would moan about it from a performance per watt perspective.

The perf/W of the 1080 is actually pretty bad if we're being honest (compared to what it theoretically 'should' be). It's only ~1.56x the GTX 980 (which was the highest perf/W 28nm card). The GTX 1080 should be ~25-30% faster for the amount of power it draws. This is because 16nm FF+ is supposed to be 2x the perf/W.

Source: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/27.html
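For the arithmetic behind the 25-30% figure (the ~1.56x is TechPowerUp's measurement linked above; the 2x node expectation is the assumption being tested), a minimal check:

```python
# Sanity check of the perf/W shortfall: measured ~1.56x the GTX 980
# (TechPowerUp link above) vs a rough "2x perf/W" node expectation.

measured = 1.56   # GTX 1080 vs GTX 980 perf/W
expected = 2.0    # assumed 16nm FF+ expectation

shortfall = expected / measured - 1
print(f"Shortfall: ~{shortfall:.0%}")   # ~28%, i.e. the card "should"
# be roughly 25-30% faster for the power it draws.
```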
 
http://www.pcadvisor.co.uk/new-prod...tx-1070-release-date-price-specs-new-3639751/

"Given that both GPUs are $50 more than their predecessor's launch price, with the GTX 980 sitting at £429 and 970 at £259, we expect the UK price for the latest Pascal GPUs to be around £449 for the GTX 1080 and £289 for the GTX 1070 and £339 for the GTX 1070 Founder Edition - making them affordable GPUs given their prospective speed gains - read on to find out about their performance."

God I hope the 1080 is near £449... I'd snap it up asap.
 
I can understand why people would moan about it from a performance per watt perspective.

The perf/W of the 1080 is actually pretty bad if we're being honest (compared to what it theoretically 'should' be). It's only ~1.56x the GTX 980 (which was the highest perf/W 28nm card). The GTX 1080 should be ~25-30% faster for the amount of power it draws. This is because 16nm FF+ is supposed to be 2x the perf/W.

Source: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/27.html



This is nonsense. TSMC claim their 16nm FF+ process offers a reduction in power of 70% or 65% faster clock speeds, not both. Increasing clock speed will reduce that power saving.

http://www.tsmc.com/english/dedicatedFoundry/technology/16nm.htm
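The reason it's one or the other is the usual dynamic-power relation, P ∝ C·V²·f: spend the node gain on clocks and you hand the power saving back. A toy sketch, with made-up scaling factors (not TSMC data) picked only so the iso-clock case lands near the claimed 70% reduction:

```python
# Toy model of the trade-off using the dynamic-power relation
# P ~ C * V^2 * f. All scaling factors below are invented for the
# example, chosen so the iso-clock case lands near a 70% reduction.

def rel_power(c, v, f):
    """Dynamic power relative to a 28nm baseline with C = V = f = 1."""
    return c * v * v * f

C16, V16 = 0.5, 0.775                 # assumed capacitance/voltage scaling

print(rel_power(C16, V16, 1.00))      # ~0.30: the "70% less power" case
print(rel_power(C16, V16, 1.65))      # ~0.50: clocking 65% higher hands
# half the saving straight back, before the extra voltage those clocks
# actually need pushes power back toward the baseline.
```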
 
This is nonsense. TSMC claim their 16nm FF+ process offers a reduction in power of 70% or 65% faster clock speeds, not both. Increasing clock speed will reduce that power saving.

http://www.tsmc.com/english/dedicatedFoundry/technology/16nm.htm

Well, seeing as reducing the power by 70% would be a 3.33x perf/W increase (1 / 0.30 ≈ 3.33), ending up at 1.56x is still disappointing, even taking into account high clock speeds reducing that improvement.
 
Reinforces my thoughts about Pascal. An async compute poster child, and Nvidia somehow decided not to get async sorted on their latest and greatest. Still think they have hardware support?

Stop trolling; it is getting very tiresome dealing with this ignorant rubbish.

Nvidia are not Oxide developers.
 