
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Surely if the 1080 is 24% faster, that means you will get 24% faster FPS in games. Which is the only relevant number.

But the 1070 is only 19.4% slower, so you will only get 19.4% less FPS, which is the only relevant number.


No one was ever arguing how much faster a 1080 is, only how much slower a 1070. And the simple answer based on those benchmarks is the 1070 is around 19% slower.
We are comparing the performance difference of the 1070 to the 1080.
 
Surely if the 1080 is 24% faster, that means you will get 24% faster FPS in games. Which is the only relevant number.
It's very important to keep in mind this is an artificial benchmark.

I really, really hate that they've become a mainstay of benchmarking for gaming GPUs. It is totally ridiculous that a non-game is being used to measure gaming performance. I know they have mostly correlated with actual gaming performance, but I just don't get why we shouldn't exclusively use gaming benchmarks, as they are the ONLY thing that matters in the end. Who gives a **** how something runs in an artificial benchmark? Whoopty ****ing do, have fun playing 3DMARK! No, you want to play an actual game. So measure how it does in that game.
 
But I mean the 24% would be the relevant number for game performance? As opposed to the 19% slower? Or you could average it at about 22%, I don't know.

You really don't know. There is no discussion, it is basic maths.

If you want to know how much slower the 1070 is, you simply divide its performance by the 1080's. So 100/124 = 80.6% of the speed, thus 19.4% slower.

Just like when you compare prices: the 1070 is $379 and the 1080 is $599. Thus the 1070 is 63.3% of the cost, or 36.7% cheaper.
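A minimal sketch of that maths in Python, using the numbers above (purely illustrative, the figures are just the ones quoted in this thread):

```python
# How much lower one number is relative to a baseline, as a percentage.
def percent_lower(base: float, value: float) -> float:
    return (1 - value / base) * 100

# Performance: 1080 indexed at 124, 1070 at 100 (the benchmark graph above).
print(percent_lower(124, 100))  # ~19.4 -> the 1070 is 19.4% slower

# Price: 1080 at $599, 1070 at $379.
print(percent_lower(599, 379))  # ~36.7 -> the 1070 is 36.7% cheaper
```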
 
The reviews I showed did list the actual clock speeds attained and the fan speed % during their overclocking.

Guru3D: at 60% fan speed, core clocks sat around 1950-2050MHz (up from the normal 1733MHz boost clock). ~15.4% overclock (not actual performance boost).

HardOCP: 100% fan speed gave a max OC clock speed of 2062MHz versus a 1770MHz average clock speed at default operation. ~16.5% overclock (not actual performance boost).
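A quick sanity check of those two uplift figures (Python; taking 2000MHz as a rough midpoint of the range Guru3D reported):

```python
# Clock uplift as a percentage of the stock clock (MHz figures from the reviews).
def oc_percent(stock: float, oc: float) -> float:
    return (oc / stock - 1) * 100

print(oc_percent(1733, 2000))  # Guru3D: ~15.4% over the 1733MHz boost clock
print(oc_percent(1770, 2062))  # HardOCP: ~16.5% over the 1770MHz average clock
```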

I do believe the custom AIB versions with better power delivery and cooling will OC better, but that doesn't change the fact that the FE 1080 is a bad overclocker.

Well overall, from the reviews I have read, you will be getting about 15-30% more performance depending on the game, usually around 22%, comparing a 980 Ti at 1450/7900 to a 1080 at 2000-2100/11000 or thereabouts. So overall it is not bad.
 
Wow, hmmm, I think you should email WhyCry at the link below to tell him that he calculated it wrong, and then he can revise and correct it.

http://videocardz.com/contact

The 1080 is 24% faster than the 1070, but taken the other way, the 1070 is 20% slower. So both numbers are correct, but either can be used to make the gap sound smaller or bigger depending on the agenda.

Example (not trying to be condescending here)

Going from 100 and dropping to 80 is a 20% drop (20 is one fifth of 100)
Though starting at 80 and increasing to 100 is a 25% increase (20 is one quarter of 80).

Same numbers, different percentages.
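The same asymmetry in a few lines of Python, for anyone who wants to check it themselves:

```python
# Same pair of numbers, two different percentages depending on the baseline.
drop = (100 - 80) / 100 * 100      # baseline 100: a 20.0% drop
increase = (100 - 80) / 80 * 100   # baseline 80: a 25.0% increase
print(drop, increase)              # 20.0 25.0
```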
 
Yeh, I said that, but which is the relevant % for game performance? A game at 60fps on the 1080 would be 48fps on the 1070?
 
Surely if the 1080 is 24% faster, that means you will get 24% faster FPS in games. I admit maths is not my strong point.

The Division will show good gains as it is GPU limited. Gregster will be pleased. CPU-bound games less so (Total War, almost anything written by Blizzard, etc.).

The ability to maintain boost clocks is also paramount; the reference cooler kicked ass on the 980, but the 980 Ti, Titan X and now the 1080 don't look like they cope as well. We may see bigger variances in actual performance this time than in the past, as the clocks are just getting huge.
 
It's very important to keep in mind this is an artificial benchmark.

All benchmarks are artificial to some degree though. When was the last time you saw a benchmark more than a few minutes long, for example? Heat soak could even make a difference, possibly highlighting the higher-quality components versus the lower. Even if a chip is running at 80°C constantly, other parts could also be warming up nicely and underperforming.

I'm fairly sure Nvidia and AMD could tweak things if they wanted to, too. I.e., detect the EXE that's running and then apply some settings to speed things up a bit. Benchmarks are predictable too (they play through a set scenario), so performance could probably be optimised there as well.

In my opinion, no benchmark necessarily shows how well a game plays.

Benchmarks should also show highs and lows (many don't), and the deviation from the average FPS should be plotted more frequently.
 
But then a game at 48fps on the 1070 would be 60 on a 1080, which would make the 1080 25% faster... but yes, overall that is actually right: the amount you would see in games would be 20% less than whatever the 1080 is getting. But the 970 vs 980 is only about a 12 or 13% difference, I think.
 
It says 24% on the graph.

It says the 1080 is 24% faster, which means the 1070 is 19.4% slower.

Look, I hate to be that pedantic, but this is basic primary-school maths. The good thing about maths is there is either a right or a wrong answer. There is nothing ambiguous here.

The 1070 is 19.4% slower in that benchmark. You can either accept that fact and move on or continue to show you don't understand how percentages work.


Here is another example:
Bob weighs 124lbs, Jane weighs 100lbs. How much lighter is Jane? Jane is 100 - (100/124 * 100) = 19.4% lighter than Bob.
 
Yeh, I said that, but which is the relevant % for game performance? A game at 60fps on the 1080 would be 48fps on the 1070?

If you are comparing the 1070 to the 1080 there is only one relevant number: the 1070 is 19.4% slower. Anything else is wrong.
 
It's very important to keep in mind this is an artificial benchmark.

All benchmarks are artificial to some degree though. When was the last time you saw a benchmark more than a few minutes long, for example? Heat soak could even make a difference, possibly highlighting the higher-quality components versus the lower. Even if a chip is running at 80°C constantly, other parts could also be warming up nicely and underperforming.

I'm fairly sure Nvidia and AMD could tweak things if they wanted to, too. I.e., detect the EXE that's running and then apply some settings to speed things up a bit. Benchmarks are predictable too (they play through a set scenario), so performance could probably be optimised there as well.

In my opinion, no benchmark necessarily shows how well a game plays.

Benchmarks should also show highs and lows (many don't), and the deviation from the average FPS should be plotted more frequently.



In Hitman, Nvidia cards get a significant boost if you test anywhere apart from the defined benchmark. I wonder why that is. ;)
 
It says the 1080 is 24% faster, which means the 1070 is 19.4% slower.

Look, I hate to be that pedantic, but this is basic primary-school maths. The good thing about maths is there is either a right or a wrong answer. There is nothing ambiguous here.

The 1070 is 19.4% slower in that benchmark. You can either accept that fact and move on or continue to show you don't understand how percentages work.


Here is another example:
Bob weighs 124lbs, Jane weighs 100lbs. How much lighter is Jane? Jane is 100 - (100/124 * 100) = 19.4% lighter than Bob.

Yes, I just said that in my post above...

The 1080 is 24% faster

The 1070 is 20% slower.

The actual number you see in games will be 20% lower than the 1080's.
 
All benchmarks are artificial to some degree though. When was the last time you saw a benchmark more than a few minutes long, for example? Heat soak could even make a difference, possibly highlighting the higher-quality components versus the lower. Even if a chip is running at 80°C constantly, other parts could also be warming up nicely and underperforming.

I'm fairly sure Nvidia and AMD could tweak things if they wanted to, too. I.e., detect the EXE that's running and then apply some settings to speed things up a bit. Benchmarks are predictable too (they play through a set scenario), so performance could probably be optimised there as well.

In my opinion, no benchmark necessarily shows how well a game plays.

Benchmarks should also show highs and lows (many don't), and the deviation from the average FPS should be plotted more frequently.
A gaming benchmark is never 'artificial'; that's pretty much true by definition.

I do get what you're saying, though: they aren't always 100% representative, which is true. Especially so since many people will use different settings or CPUs or RAM that can affect performance disparities between cards somewhat.

But they are really the best we have. Artificial benchmarks just seem so completely needless when there's hundreds of games we can actually test with.

Also, many benches do show highs and lows, which I personally consider not that useful. Lows, maybe. Highs, not really. What *really* helps is actual frametimes. That shows how consistent a framerate is. You can have a game that averages 70fps but feels like a stuttery mess, something an average/high/low bench won't show at all.
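To illustrate with a made-up frametime trace (Python; the numbers are invented for the example, not measured data):

```python
# Ten hypothetical frametimes in ms: nine smooth frames, then one big stutter.
frametimes_ms = [10, 10, 10, 10, 10, 10, 10, 10, 10, 50]

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
worst_fps = 1000 / max(frametimes_ms)

print(f"average: {avg_fps:.0f} fps")        # ~71 fps, looks fine on a bar chart
print(f"worst frame: {worst_fps:.0f} fps")  # 20 fps spike, feels like a stutter
```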

And there's no reason to suspect that anybody is 'speeding up' gaming benches. That's ridiculous. For one, this would be noticeable immediately with 3rd party testing. Two, the benches that most people look at are 3rd party. This simply is not an issue.
 
In Hitman, Nvidia cards get a significant boost if you test anywhere apart from the defined benchmark. I wonder why that is. ;)
Specific in-game benchmarking options are notoriously inconsistent and unrepresentative.

When I say 'gaming benchmarks', I mean the actual running of the game.
 