Idiotic CPU reviewers rant thread........

You probably would still have been fine TBH - you could probably have run all-core Turbo on it!! :p

I have mates with Core i7 4770K CPUs still running at stock, perfectly chugging along!!

I had it on turbo but it wasn't enough. It was a nightmare in PS2. This became apparent when I couldn't hit 100% usage on a 1070, so I switched. Thing is, on Ryzen it was even worse lol. It's OK now though. I am now GPU limited, which is where everyone should be.
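(As an aside, for anyone wanting to check the same thing: a rough way to spot a CPU bottleneck is to log GPU utilisation while you play - if the GPU sits well below ~95-100% in heavy scenes, the CPU (or the game engine) is probably the limit. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH and Python installed:)

```python
# Rough CPU-bottleneck check: poll GPU utilisation via nvidia-smi while a game is running.
# Sketch only - assumes an NVIDIA card and nvidia-smi on the PATH; not a proper benchmark.
import subprocess
import time

samples = []
for _ in range(60):  # sample once a second for a minute of gameplay
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]
    )
    samples.append(int(out.decode().strip().splitlines()[0]))
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"Average GPU utilisation: {avg:.0f}%")
if avg < 90:
    print("GPU is waiting around - likely CPU (or engine) limited in this scene.")
else:
    print("GPU is close to fully loaded - likely GPU limited.")
```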
 
I hate it when reviewers make idiotic claims like that.


Feel free to comment.

This tends to be the norm from DIY tech reviewers.
Recently a young British so-called tech-head who runs a small review channel (I think they're called Tech GB) did a monitor review where he called a 3440x1440 monitor a 4K ultrawide monitor. I tried to politely correct him, but he told me I was wrong, so I gave him links to the resolution stats and naming pages of a couple of more established, well-known tech sites, but he still wouldn't accept that he had got it wrong.
Unfortunately a high percentage of emerging tech reviewers don't actually know what they're talking about, so it's no surprise that they're wrong so often.
 
I had it on turbo but it wasn't enough. It was a nightmare in PS2. This became apparent when I couldn't hit 100% usage on a 1070, so I switched. Thing is, on Ryzen it was even worse lol. It's OK now though. I am now GPU limited, which is where everyone should be.

The problem with PS2 is that even with newer hardware, the servers can also crap out during the 200+ player battles! :p
 
1) You can test Crysis 3 in a simpler level where Intel pulls ahead (sometimes seriously), or you can test it in an open environment (the big areas with grass) where even the FX CPUs look good when they have proper work to do.

2) You can test The Witcher 3 in a relatively basic scene with few AI characters, where even an i3 is enough and beats an FX CPU, or you can test it in Novigrad with lots of AI, where you can see the power of multi-core and faster CPUs.

You can use the same settings yet get two completely different results, and you'd surely find similar situations in more games if someone looked into it.

Of course, it's the same for GPUs, more so when we're talking about settings other than the standard Ultra (and sometimes High, Medium, etc.).

Then you have tests/reviews that make anyone who knows a bit about HW and SW ask a big "WHAT?"; for instance, testing for the gains from a lower-level API, but in a scenario where this doesn't really show (high-end/OC CPUs + maxed graphics details). Remember BF4 and Mantle?

Most of the reviews are extremely superficial, and you can hide behind such an approach to prove/promote someone while being able to land on your feet if you're called out for it - "hey, that was happening in my testing, under that scenario!". Right...
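To put some numbers on the scene point above (entirely made-up figures, purely to illustrate the logic, not real benchmark data): the same two CPUs at the same settings can trade places depending on where you benchmark, so a review that only shows one scene hides half the picture.

```python
# Hypothetical FPS figures only - illustrating how the choice of test scene can flip
# the ranking between two CPUs at identical settings. Not real benchmark data.
results = {
    "CPU A (higher per-core speed)": {"corridor": 120, "open grass": 55},
    "CPU B (more cores)":            {"corridor": 100, "open grass": 75},
}

for cpu, scenes in results.items():
    avg = sum(scenes.values()) / len(scenes)
    print(f"{cpu}: corridor {scenes['corridor']} fps, "
          f"open grass {scenes['open grass']} fps, average {avg:.0f} fps")

# A corridor-only test crowns CPU A; the open, physics/AI-heavy scene crowns CPU B.
# Same game, same settings, opposite conclusion - which is exactly the point above.
```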
 
It's fine now tbh, I only ever see it GPU limited, not CPU. Which is an excuse for a GPU upgrade when Nvidia pull their finger out lol

I dunno, at least a while back I can still remember people with Haswell and Skylake 6000-series CPUs moaning in comms that performance was crap! :p

I might need to fire it up - I have not played it for like a year. Also, I remember when benchmarking it during my GTX 1080 review, one or two settings would tank performance on my GTX 1080, which was kind of ridiculous considering how the game looks, and it was indicating a GPU bottleneck at QHD.
 
1) You can test Crysis 3 in a simpler level where Intel pulls ahead (sometimes seriously), or you can test it in an open environment (the big areas with grass) where even the FX CPUs look good when they have proper work to do.

2) You can test The Witcher 3 in a relatively basic scene with few AI characters, where even an i3 is enough and beats an FX CPU, or you can test it in Novigrad with lots of AI, where you can see the power of multi-core and faster CPUs.

You can use the same settings yet get two completely different results, and you'd surely find similar situations in more games if someone looked into it.

Of course, it's the same for GPUs, more so when we're talking about settings other than the standard Ultra (and sometimes High, Medium, etc.).

Then you have tests/reviews that make anyone who knows a bit about HW and SW ask a big "WHAT?"; for instance, testing for the gains from a lower-level API, but in a scenario where this doesn't really show (high-end/OC CPUs + maxed graphics details). Remember BF4 and Mantle?

Most of the reviews are extremely superficial, and you can hide behind such an approach to prove/promote someone while being able to land on your feet if you're called out for it - "hey, that was happening in my testing, under that scenario!". Right...

Quoted for truth.

I don't mind so much using lower resolution to test CPU performance in games, that's fine.

A lot of truth in that ^^^. With that said, what I don't like is using lower graphics settings to test CPU performance - that is completely flawed. By turning down image quality settings you're reducing draw distances, streamed shadows and lighting, object streaming... all of these things depend on CPU performance, and you reduce or turn them off by lowering graphics settings. What a reviewer who does that is actually doing is offloading the work from the CPU onto the GPU. It's no different from looking up at the sky to test your CPU performance instead of looking at the jungle: in the sky there is nothing for the CPU to do, and on a landscape devoid of post-processing the CPU has little to do. In that way you can make a dual-core Pentium look just as good for gaming as an i3 or an i5 or even an i7.
 
Quoted for truth.

I don't mind so much using lower resolution to test CPU performance in games, that's fine.

A lot of truth in that ^^^. With that said, what I don't like is using lower graphics settings to test CPU performance - that is completely flawed. By turning down image quality settings you're reducing draw distances, streamed shadows and lighting, object streaming... all of these things depend on CPU performance, and you reduce or turn them off by lowering graphics settings. What a reviewer who does that is actually doing is offloading the work from the CPU onto the GPU. It's no different from looking up at the sky to test your CPU performance instead of looking at the jungle: in the sky there is nothing for the CPU to do, and on a landscape devoid of post-processing the CPU has little to do. In that way you can make a dual-core Pentium look just as good for gaming as an i3 or an i5 or even an i7.

You should have seen The Tech Report do it with Crysis 3 - they had a Pentium dual-core matching higher-end CPUs, which struck me as odd as I have played the game. They tested this:

https://www.youtube.com/watch?v=k5pV69ELx5g

LOL.
 
1) You can test Crysis 3 in a simpler level where Intel pulls ahead (sometimes seriously), or you can test it in an open environment (the big areas with grass) where even the FX CPUs look good when they have proper work to do.

2) You can test The Witcher 3 in a relatively basic scene with few AI characters, where even an i3 is enough and beats an FX CPU, or you can test it in Novigrad with lots of AI, where you can see the power of multi-core and faster CPUs.

You can use the same settings yet get two completely different results, and you'd surely find similar situations in more games if someone looked into it.

Of course, it's the same for GPUs, more so when we're talking about settings other than the standard Ultra (and sometimes High, Medium, etc.).

Then you have tests/reviews that make anyone who knows a bit about HW and SW ask a big "WHAT?"; for instance, testing for the gains from a lower-level API, but in a scenario where this doesn't really show (high-end/OC CPUs + maxed graphics details). Remember BF4 and Mantle?

Most of the reviews are extremely superficial, and you can hide behind such an approach to prove/promote someone while being able to land on your feet if you're called out for it - "hey, that was happening in my testing, under that scenario!". Right...

Witcher 3 and Crysis are both optimised games that will use all cores. They're both high-budget titles and fall into that exception bracket. I can understand why you're struggling to think of a title that's not multi-core though, because the reviewers usually don't touch them :)
 
I had a 3.8GHz 8-core/16-thread CPU and more than half of it sat idle whilst gaming. Give me a fast 5GHz 4- or 6-core over that any day.

But the operating system has other things to do whilst gaming - it has updates to do, other apps to maintain, etc., etc. Why ignore the fact that these extra cores, even if they're idling at some points in time, can be extremely beneficial at other points in time?
 
You should have seen The Tech Report do it with Crysis 3 - they had a Pentium dual-core matching higher-end CPUs, which struck me as odd as I have played the game. They tested this:

https://www.youtube.com/watch?v=k5pV69ELx5g

LOL.


Lol, F###! :rolleyes: No draw distance, very little in the way of streamed shading... a perfect scene for low-performance dual cores to shine.

Another one reviewers liked when benchmarking Crysis 3 was the tunnels - again, an indoor scene with little to nothing for the CPU to do.

The scene Digital Foundry used was Welcome to the Jungle, a proper test of CPU performance in games.

Look at the 6-core Sandy Bridge-E run away with it; even the FX-8350 is beating out the Ivy Bridge i7 - 8 cores vs 4.

[Attached image: Crysis 3 "Welcome to the Jungle" CPU benchmark chart]
 
Quoted for truth.

I don't mind so much using lower resolution to test CPU performance in games, that's fine.

A lot of truth in that ^^^. With that said, what I don't like is using lower graphics settings to test CPU performance - that is completely flawed. By turning down image quality settings you're reducing draw distances, streamed shadows and lighting, object streaming... all of these things depend on CPU performance, and you reduce or turn them off by lowering graphics settings. What a reviewer who does that is actually doing is offloading the work from the CPU onto the GPU. It's no different from looking up at the sky to test your CPU performance instead of looking at the jungle: in the sky there is nothing for the CPU to do, and on a landscape devoid of post-processing the CPU has little to do. In that way you can make a dual-core Pentium look just as good for gaming as an i3 or an i5 or even an i7.

I sort of agree with you.

The idea behind reducing graphics settings is to try to stop the GPU being the bottleneck and make the CPU the bottleneck instead, but it's flawed.

All they need to do is find games that are naturally CPU bottlenecked - they're not that hard to find - but the problem is the reviewers are stuck testing a narrow range of games and, it seems, only want to keep using those same games.

They need to use resolutions that a player would be expected to use, such as 1080p, 1440p, maybe 720p and also 4K - so those four resolutions. But if they're testing 720p then it should also be tested on the expected hardware, e.g. a GTX 1050 and an i3 CPU. And on top of that, if they're testing an i3, they should be using a motherboard fitting for the CPU and make sure the TDP limit is not overridden etc. for a non-OC SKU.

So games like

Witcher 3
Crysis 3
Doom

are typically GPU limited and will be optimised for max cores (although that doesn't mean max cores will always be faster, just that they will be utilised better).

Games like

Lightning Returns
Tales of Zestiria
many non-AAA titles

will usually only utilise 1 or 2 cores, and on top of that are not optimised well, so they are often CPU bottlenecked rather than GPU bottlenecked.

Sometimes there are games that use all cores "and" can be CPU bottlenecked, such as FF15; in those cases CPUs like Threadripper can be really good. Although on FF15 my 6-core 4.8GHz Coffee Lake outperforms a 3.6GHz 8-core Ryzen as well as a 4.3GHz 4-core 7700K (the extra 4 logical threads are useless). However, a 16-core Threadripper I know of is pretty sweet for FF15, likewise a 16-core Xeon. My 4670K was stuttering in FF15; my 8600K doesn't. My 8600K stutters with 6 cores at stock clocks, but is stutter free at 4.8GHz. So I know from that test that even with 50% more cores than my Haswell at 700MHz lower clocks, it is still bottlenecked.
 
But the operating system has other things to do whilst gaming - it has updates to do, other apps to maintain, etc., etc. Why ignore the fact that these extra cores, even if they're idling at some points in time, can be extremely beneficial at other points in time?

It's not like a server; things like an NTP update are what, a 10ms bit of CPU usage?
Or checking for Windows Update once every 6 hours - maybe a few seconds of low load on one core.

Windows stuff in Task Scheduler is designed to only run when the system is idle, so a lot of maintenance tasks, including Windows Defender, won't run while you're in a game, and many AVs have gamer modes so they behave similarly. If you have a browser open, that can affect gaming, although IE, Chrome and Firefox have recently added features that throttle their CPU usage when idle. Just a bit of common sense though: don't leave a heavy-duty website loaded in the browser when launching a game.
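(For anyone who'd rather measure than argue: you can total up how much CPU time everything other than the game actually uses over a stretch of play. A rough sketch using Python with the psutil package - "game.exe" is just a placeholder process name, not a real one:)

```python
# Rough measure of background CPU usage (everything except the game) during play.
# Requires psutil (pip install psutil); "game.exe" is a placeholder - substitute your game's process name.
import time
import psutil

GAME_NAME = "game.exe"

def background_cpu_seconds(exclude_name):
    """Sum user+system CPU time of every process except the game."""
    total = 0.0
    for p in psutil.process_iter(["name", "cpu_times"]):
        try:
            name = (p.info["name"] or "").lower()
            times = p.info["cpu_times"]
            if name != exclude_name and times:
                total += times.user + times.system
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    return total

start = background_cpu_seconds(GAME_NAME)
time.sleep(300)  # go play for five minutes
used = background_cpu_seconds(GAME_NAME) - start

threads = psutil.cpu_count(logical=True)
print(f"Background processes used {used:.1f} CPU-seconds over 300 s "
      f"(~{100 * used / (300 * threads):.1f}% of total capacity across {threads} threads).")
```

It's only a rough figure (processes that start or exit mid-run aren't accounted for cleanly), but on a quiet desktop the background total typically works out to a small fraction of one core, which is the point being made above.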
 
Yes.

Testing at 1080p with a GTX 1080 Ti is perfectly valid. Back in 2012 they used to test at 720p - notice the Crysis 3 CPU test I put up is 720p, but also notice it's with Very High "VH" settings, because those guys understand that if you reduce the settings you reduce the work the CPU needs to do.

Today, with cards as powerful as the 1080 Ti, 1080p is the new 720p, but it doesn't really matter - 720p if you like, just as long as the graphics settings are as high as they will go.

Let me say something here.

In the Digital Foundry CPU review the 7600K maxed out at 270 FPS and the Ryzen 1600X at 240 FPS, and this was while looking at the sky. There are too many who would take an empty scene like that and call it the true performance difference between those two CPUs.
I call that completely idiotic. When looking at the jungle, with all the soft-body physics, streamed shading/lighting and long draw distances... the 7600K dropped to 70 FPS and the Ryzen 1600X to 130 FPS - almost double the performance. That, to me, is the actual difference in performance between these two CPUs. Not because the Ryzen CPU is faster 'which to some reviewers would be a very controversial claim to make', but because in that scene the 7600K is working for all it's got and is out of CPU; the Ryzen 1600X very probably is too, but it is able to perform at almost twice the 7600K's limit, and that makes it that much faster.
It's an accurate representation of the comparison.
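(Worth converting those FPS figures into frame times, since FPS tends to hide what's going on - a quick sketch using only the numbers quoted above:)

```python
# Convert the FPS figures quoted above (Digital Foundry sky vs jungle scenes) into ms per frame.
fps = {
    "7600K": {"sky": 270, "jungle": 70},
    "1600X": {"sky": 240, "jungle": 130},
}

for cpu, scenes in fps.items():
    sky_ms = 1000 / scenes["sky"]
    jungle_ms = 1000 / scenes["jungle"]
    print(f"{cpu}: sky {sky_ms:.1f} ms/frame, jungle {jungle_ms:.1f} ms/frame "
          f"({jungle_ms / sky_ms:.1f}x longer per frame)")

# Sky: ~3.7 ms vs ~4.2 ms per frame - almost nothing to do, so the gap looks tiny.
# Jungle: ~14.3 ms vs ~7.7 ms per frame - the heavy scene is where the real difference shows.
```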

This does translate to real-life gaming - that is a real scene in the game. My 1070, which as you know from owning one yourself is a very high-performance GPU, was utterly strangled by my 4.5GHz Haswell in almost everything, often to the point of stuttering, with the GPU downclocking into power-saving states it was that strangled. Yet the Ryzen CPU I have now, despite the 500MHz clock deficit, allows my GPU to stretch its legs and gaming is now buttery smooth.

That for me was the proof in the pudding.
 
Many people use high-end cards at 1080p. Most of my mates run two 1070s/1080s/1080 Tis at 1080p for modern games and high Hz. You can see in benchmarks that Ryzen will often be 20-30 fps behind at that res in modern MP games that people play in big numbers. It does matter.
 
It's not like a server; things like an NTP update are what, a 10ms bit of CPU usage?
Or checking for Windows Update once every 6 hours - maybe a few seconds of low load on one core.

Windows stuff in Task Scheduler is designed to only run when the system is idle, so a lot of maintenance tasks, including Windows Defender, won't run while you're in a game, and many AVs have gamer modes so they behave similarly. If you have a browser open, that can affect gaming, although IE, Chrome and Firefox have recently added features that throttle their CPU usage when idle. Just a bit of common sense though: don't leave a heavy-duty website loaded in the browser when launching a game.

Does this mean that the antivirus doesn't scan all the game files which are loaded whilst gaming?
The compromises you describe just prove the point - they exist because of a lack of sufficient resources, in this case cores. Give people more cores and let them use their software at all times.
 
The compromises you describe just prove the point - they exist because of a lack of sufficient resources, in this case cores. Give people more cores and let them use their software at all times.
Because it's more efficient? Why would you want to run code in the background (also using other system resources) just so you can make some of your extra cores light up?
 
Does this mean that the antivirus doesn't scan all the game files which are loaded whilst gaming?
The compromises you describe just prove the point - they exist because of a lack of sufficient resources, in this case cores. Give people more cores and let them use their software at all times.

Pretty sure every modern antivirus has a game mode now which disables scanning and updates when a game is detected.
 
Many people use high-end cards at 1080p. Most of my mates run two 1070s/1080s/1080 Tis at 1080p for modern games and high Hz. You can see in benchmarks that Ryzen will often be 20-30 fps behind at that res in modern MP games that people play in big numbers. It does matter.
Um disagree. "Your mates" are not a great indicator to be honest. According to the Steam Survey (which is of course not all-knowing but a far better sample size), under 5% of Steam gamers use the GTX 1070/1080/1080Ti, and that's with +1.5% market share in this month alone, which tells you the margin of error here. Some of that 5% will be at resolutions greater than 1080p also.
 