Why Benchmarking CPUs with the GTX 1080 Ti is NOT STUPID!

This is a literally stupid reason. Back in '07 my Q6600 faced many single-threaded and, if I was lucky, dual-threaded games, and most single- or dual-core CPUs with high clocks decimated it in this low-res max-FPS rubbish. For future-proofing it's almost always better to look at utilisation rather than max FPS. All these tests show is how we'll do assuming there is COMPLETE stagnation in CPU implementation while GPUs get more powerful.

I predict that in three years the 7700K will fall a LONG way behind the 1600X, because we now code for more cores, the same way a game from today or from 2014 performs much better on a Q6600 than on a super-fast Intel Pentium dual-core.
These max-FPS tests with a super GPU at 720p are about as useful as PassMark for judging longevity, because all they do is bulldoze the current trend forward without considering that a CPU utilised 100% now has no room to improve and may fall behind in the future. At least a CPU at 60% won't get worse; at worst the gap stays the same.

This is an extrapolation of the current situation without even considering previous trends. More cores have always gained performance historically; maybe in the future we'll see some interesting development where higher RAM frequency matters more than CPU power, who knows, but for sure it's not going to be 720p gaming. That died around 2010, I hope, so why benchmark it now? It's about as real-world as testing the future of modern car technology by insisting on testing only on cobbled streets because they really show the difference in handling and comfort. Well, maybe no one drives on them anymore and it's not representative of anything at all.
 
I predict that in three years the 7700K will fall a LONG way behind the 1600X, because we now code for more cores, the same way a game from today or from 2014 performs much better on a Q6600 than on a super-fast Intel Pentium dual-core.

It OUGHT to happen, because developers should be able to see that clock speeds and IPC are going nowhere fast, and the only way to get a huge amount more CPU power is to use more threads than ever. The hardware is there to use if they are capable of coding for it.

How much and how soon is the boring part we have to wait to find out.
 
Disagree, 4k benchmarks on a 1070 are much better at showing how close AMD are to Intel now.
I'm not too sure of the logic here, as the majority of people I know who can afford to game at 4K can also afford a 1080 Ti.
 
There are two completely different approaches with different goals, as far as I'm concerned. First is to perform a synthetic benchmark to see how capable the CPU is, where you'd use a GTX 1080 Ti for example. The theory is this helps you know how future-proof the CPU is when paired with newer/faster GPUs, although this is still a crystal ball affair ultimately. You can test current games but you can't test future ones.

The second is to test various combinations of CPU and GPU that people would actually buy, which IMO is the generally more useful kind.

Remember a lot of the current thinking stems from a decade of CPU stagnation and GPU advancement - you could keep your base system and get a GPU that was 3 generations newer quite easily. These days it's more the other way around: the CPU space has been shaken up and the GPU space has all but dried up. AMD can barely compete right now, nVidia are delaying their new GPUs for as long as they can to milk more profits since they have no fear of competition. Pascal has been out for 2 years already!
 
I have never thought using a high-end GPU to test games at 720p or 1080p is worthwhile; not only is it never going to be done in real life, but trying to ascertain the future-proofness (er!) of a processor by doing so seems to be not much better than a guessing game.
I have no doubt the 7700K will continue to fare badly up against chips with more cores, and the 8400 will struggle more than its hyperthreaded chums down the road as well.
 
Of course: a Ryzen 7 2700X or some kind of Ryzen Threadripper should go with a Radeon Vega 64 or GTX 1080 Ti,
while something much more modest like a Ryzen 3 or an 8400 should be tested with something like a Radeon RX 560 or GTX 1050.

It OUGHT to happen, because developers should be able to see that clock speeds and IPC are going nowhere fast, and the only way to get a huge amount more CPU power is to use more threads than ever. The hardware is there to use if they are capable of coding for it.

How much and how soon is the boring part we have to wait to find out.

They are capable, and it actually doesn't look like a very tough task, if you ask me. Just don't spread a single task among all the threads; rather, give individual threads their own tasks. Make the games more realistic: one core to process the characters' hair, another core to process the artificial intelligence, a third core to process another effect, a fourth core to process the draw calls to the graphics card. But not a setup where the CPU just commands the GPU and doesn't calculate anything else, like in 4K benchmarking, where the GPU is loaded to 99% while the CPU stays cool at 19-20%, lol.
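As a very rough sketch of that "one task per thread" idea, here's a toy Python frame loop where each subsystem gets its own thread. The subsystem names and their "work" are made up purely for illustration; a real engine would be running physics, AI, audio, etc. in a compiled language:

```python
import threading

# Hypothetical per-frame subsystems; each just marks itself done here,
# standing in for real work like AI or physics updates.
def update_ai(state):
    state["ai"] = "updated"

def update_physics(state):
    state["physics"] = "updated"

def update_audio(state):
    state["audio"] = "updated"

def run_frame(state):
    # One thread per subsystem, all joined at the end of the frame --
    # the "each thread gets its own task" model described above.
    threads = [threading.Thread(target=fn, args=(state,))
               for fn in (update_ai, update_physics, update_audio)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return state

frame_state = run_frame({})
```

Of course, this glosses over synchronisation between subsystems that share state, which is exactly where the hard part of threading a game lives.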
 
I have never thought using a high-end GPU to test games at 720p or 1080p is worthwhile; not only is it never going to be done in real life, but trying to ascertain the future-proofness (er!) of a processor by doing so seems to be not much better than a guessing game.
I have no doubt the 7700K will continue to fare badly up against chips with more cores, and the 8400 will struggle more than its hyperthreaded chums down the road as well.

It's done on a regular basis, and by arguably some of the best gamers around, who play at high Hz and competitively. Although small in number compared to the mainstream, they're still relevant. I can guarantee you anyone remotely competitive at gunplay games will take 1080p high-Hz over 4K 60. You could argue that 4K benches in a variety of games are just as useless right now, given that with the graphics dialled up you can't even maintain 60 FPS, which to some, including myself, makes for a bad gameplay experience. £700+ for a GPU that will be used for less-than-smooth gameplay? No thanks.
 
It OUGHT to happen, because developers should be able to see that clock speeds and IPC are going nowhere fast, and the only way to get a huge amount more CPU power is to use more threads than ever. The hardware is there to use if they are capable of coding for it.

How much and how soon is the boring part we have to wait to find out.

Problem is some tasks are inherently serial in nature and extremely difficult to multi-thread in any effective way. On the other hand games should be progressively making more use of things like physics, AI, more advanced sound simulations, etc. where you can offload those features onto additional threads effectively.

I don't think we will ever see the core main loop of games threaded in the way some people talk about as doing so basically breaks the laws of physics :s
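That "laws of physics" limit is essentially Amdahl's law: if some fraction of the frame is inherently serial, it caps the overall speedup no matter how many cores you add. A quick illustrative calculation in Python (the 40% serial fraction is an invented number for the example, not a measurement from any real game):

```python
def amdahl_speedup(serial_fraction, cores):
    """Maximum overall speedup when `serial_fraction` of the work
    cannot be parallelised (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With 40% of frame time stuck in a serial main loop:
print(amdahl_speedup(0.4, 4))      # 4 cores: ~1.8x
print(amdahl_speedup(0.4, 8))      # 8 cores: ~2.1x
print(amdahl_speedup(0.4, 10**9))  # unlimited cores: approaches 1/0.4 = 2.5x
```

So past a point, shrinking the serial fraction matters far more than adding cores, which is why offloading physics, AI and sound helps, while threading the core loop itself can't.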
 
Problem is some tasks are inherently serial in nature and extremely difficult to multi-thread in any effective way. On the other hand games should be progressively making more use of things like physics, AI, more advanced sound simulations, etc. where you can offload those features onto additional threads effectively.

I don't think we will ever see the core main loop of games threaded in the way some people talk about as doing so basically breaks the laws of physics :s
I agree partially in sentiment, but I can guarantee you: look at 7700K vs Ryzen, we said the same there, or dual vs quad cores, and look how it turned out. More cores while keeping a high clock speed won out every time. An 8700K at 4.6 GHz vs a 7700K at 5.2 GHz: we'd all prefer the 8700K, especially with multitasking in mind.

We won't see 22-core optimisation for a long while, possibly ever, I won't guess, but come on: 8-core optimisation has progressed massively in the past 2-3 years, up from the old four-cores-only days. Think back to your own mindset in the Pentium 3 days: if someone had said we'd have 8-core optimisation for games in the future, you'd have called them mental.
 
Think back to your own mindset in the Pentium 3 days: if someone had said we'd have 8-core optimisation for games in the future, you'd have called them mental.

Back when I had a Pentium 3 I was working on a DX7 game engine, was an active Quake 2/3 modder, and was an enthusiast for things like more advanced physics simulations even back then - I certainly wouldn't have called them mental.
 
Does anyone here fundamentally disagree with the video?

This should silence a few people around here. He gets my point across a lot better than I do myself.

I'm just wondering about the high-minded soapbox mentality of that ^^^^
 
Does anyone here fundamentally disagree with the video?


I'm just wondering about the high-minded soapbox mentality of that ^^^^

Because for the past 12 months I've had people come at me with comments like "nobody uses a 1080 Ti to play at 1080p".
Completely missing the point.
 
Because for the past 12 months I've had people come at me with comments like "nobody uses a 1080 Ti to play at 1080p".
Completely missing the point.
I don't think people misunderstand the reasons given for testing with a more powerful GPU, they just disagree with the methodology's validity. :)
 
I agree testing 1080p with a 1080 Ti is fine, but testing future-proofing by literally using the outdated parameters of yesterday is ridiculous. They are testing backwards compatibility and performance - the opposite of their claims!
 
Disagree, 4k benchmarks on a 1070 are much better at showing how close AMD are to Intel now.

What about the sector where most gamers actually play, like 1080p? Many of my friends and mates run 1080s/1080 Tis at 1080p for high FPS on modern monitors. This is where Intel often smashes AMD CPU-wise; in some games the differences are massive as well. Then the response is often "at 4K you won't notice the difference". Well, what if they're at 1080p, like many people are? Because to get high FPS on a 1080p 144Hz+ panel, that's exactly what you need, and AMD lag far behind at doing so.
 