PC Games: Intel performance testing fake

There are no warnings when you switch Game Mode on; it just reboots and disables cores, that's it.

My system after enabling Game Mode becomes a nice 4c/8t.

[Attached image: gamemode.jpg]
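If anyone wants to sanity-check what the OS actually sees after the Game Mode reboot, here's a minimal sketch using Python's psutil package (an assumption on my part that you have Python and psutil installed; Task Manager shows the same thing):

```python
# Minimal sketch: confirm how many cores/threads the OS sees
# after a Ryzen Master Game Mode reboot. Assumes the psutil
# package is installed (pip install psutil).
import psutil

physical = psutil.cpu_count(logical=False)  # physical cores
logical = psutil.cpu_count(logical=True)    # hardware threads
print(f"{physical}c/{logical}t visible to the OS")
# e.g. an 8c/16t Ryzen should report 4c/8t once Game Mode is on.
```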

Wasn't a point raised earlier that doing this does something different for TR and Ryzen CPUs?
 
Well, Ryzen looked as good as anything else, +/- 10%, but the reason for TR's performance is obvious just from looking at the configuration of the chip and platform. I'm sure being a new, from-the-ground-up platform added a little more complexity when analysing what AMD had to offer initially, and what AMD are offering is something Intel are fully aware of.

No, the Ryzen numbers didn't look normal. Total War: Warhammer II was an obvious red flag.
70fps on the Intel benchmark against 103fps with the same GPU and on higher settings at TechSpot... that's not just 10%, nor 20%; 103/70 is roughly a 47% gap, about as much as Intel were boasting they were better than AMD.
 
No, that's not incompetence, it's clever feigned ignorance. The fact they knew exactly how to gimp AMD to achieve that level of results shows these guys have half a clue how to set up a machine, and a lot of what they did was utter nonsense. They knew full well that Ryzen (not Threadripper) doesn't like all four DIMM slots populated and will run lower-MHz RAM to be stable; if they had used 3200MHz C14 RAM the results would have been a lot better. Same with Game Mode: they can say "well, AMD name it Game Mode, so it's for gaming", and that's a get-out-of-jail for them. Same with the cooler: "well, the Ryzen cooler would cost $40, so we can use an aftermarket cooler on the Intel rig as it doesn't come with one", etc.

That is not incompetence; that is a cleverly thought out and planned hatchet job on your rival.

Maybe I'm still erring a little on the side of incompetence, plus their own hubris. As for that comment about how he's been testing for longer than Steve's been alive, and then stating 16 years: well then, sir, I call you a noob! I've been overclocking and the like for 25 years :P I'm not excusing them at all, but I have a feeling it's more than just a hatchet job on their part. It will be interesting to see whether the follow-up tests ever happen, whether the results get made public, and whether they recant on the results Intel are using in their press releases now.
 
Wasn't a point raised earlier that doing this does something different for TR and Ryzen CPUs?

It's doing the same thing, just that Ryzen has fewer cores than Threadripper (unless you have the runt of the litter :D). It does make one additional change for TR, which is to switch the memory mode between NUMA (local) and UMA (distributed); Game Mode sets local, which is a good thing.
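For what it's worth, you can see the NUMA/UMA switch from software. A minimal sketch for a Linux box (assuming sysfs is available; on Windows, Task Manager's NUMA view or Sysinternals Coreinfo shows the same information):

```python
# Minimal sketch (Linux): count the NUMA nodes the OS exposes.
# In UMA/"distributed" mode a Threadripper appears as one node;
# in NUMA/"local" mode (what Game Mode sets) it appears as two.
import glob

nodes = glob.glob("/sys/devices/system/node/node[0-9]*")
print(f"{len(nodes)} NUMA node(s) exposed")
```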
 
It's doing the same thing, just that Ryzen has fewer cores than Threadripper (unless you have the runt of the litter :D). It does make one additional change for TR, which is to switch the memory mode between NUMA (local) and UMA (distributed); Game Mode sets local, which is a good thing.

I believe I heard there's a warning that pops up when you try it on a Ryzen instead of a TR, since it does nothing useful for the Ryzen.
 
I don't even think the whole 'exposing the CPU bottleneck' at 1080p argument really works, as the results at 4K show a closing of the gap, and the CPU is working harder at 4K to push more data.

CPU working harder at 4K? Where do you get that nonsense from?

People who complain about low-resolution tests for high-end CPUs are completely missing the point. The CPU is doing next to nothing at 4K, so what kind of test is that? It would be like testing a 2080 Ti at 720p/60Hz.
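A crude way to picture the argument: the FPS you actually get is roughly whichever is lower, what the CPU can prepare per second or what the GPU can render at that resolution. A toy model in Python, with every number invented purely for illustration:

```python
# Toy bottleneck model: delivered FPS is capped by the slower of
# the CPU and the GPU. All figures are invented for illustration.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_fps = 150  # frames/sec the CPU can prepare (roughly resolution-independent)
for res, gpu_fps in [("1080p", 300), ("1440p", 180), ("4K", 80)]:
    print(f"{res}: {delivered_fps(cpu_fps, gpu_fps)} fps")
# At 1080p the CPU (150) is the limit, so CPU differences show up;
# at 4K the GPU (80) is the limit, so every fast CPU looks the same.
```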
 
I think the main issue isn't that they are fake; it's that there is no information on the test rigs or benchmark settings, and the results were released many days before the NDA date, meaning it is advertising/propaganda dressed up as a review and should be treated as advertising.
How many people who publish benchmarks tell you their RAM timings? Not a lot.
 
CPU working harder at 4K? Where do you get that nonsense from?

People who complain about low-resolution tests for high-end CPUs are completely missing the point. The CPU is doing next to nothing at 4K, so what kind of test is that? It would be like testing a 2080 Ti at 720p/60Hz.

I see your point and did a bit of googling; it seems FPS relates to CPU load quite closely. I also see what you're saying about 1080p being good for testing CPU load, but I don't see what it achieves. You can get 400fps on an 8800K vs 300fps on a 2700X (made-up numbers); neither number has any point, as it's not a setting anyone with that sort of hardware would use. Average frame times, and performance at the intended quality/resolution, make more sense; if you can't differentiate the CPUs, that's the fault of the GPU/game not being demanding enough, and using artificial settings to make the CPUs seem different doesn't, to me, have any value.
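On the average-frame-times point, here's a minimal sketch of why per-frame data tells you more than a headline FPS number (the frame times are made up, like the FPS figures above):

```python
# Minimal sketch: a headline FPS average can hide stutter that
# per-frame times make obvious. Frame times below are made up.
import statistics

frame_times_ms = [6.8, 7.1, 6.9, 7.4, 22.5, 7.0, 7.2, 6.9, 19.8, 7.1]

avg_fps = 1000 / statistics.mean(frame_times_ms)
print(f"average: {avg_fps:.0f} fps, worst frame: {max(frame_times_ms)} ms")
# "101 fps average" sounds smooth, but the two ~20ms frames are
# exactly the stutter you feel in play.
```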
 
You can get 400fps on an 8800K vs 300fps on a 2700X (made-up numbers); neither number has any point, as it's not a setting anyone with that sort of hardware would use.

It tells you which CPU can push a GPU to the highest FPS. Whether that is relevant to a particular use case is for the individual to work out, but if we're going to gauge gaming performance then we need to separate the CPUs somehow. It's like comparing a 1080 Ti vs a 2080 Ti at 1440p: the 1080 Ti might hit 120fps and the 2080 Ti might hit 150fps. What does this mean to the end user who might have a 1440p/60Hz monitor and play with Vsync on?
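To put rough numbers on that (same made-up style as above), frame times plus a Vsync cap in a couple of lines:

```python
# Minimal sketch: FPS to frame time, plus a Vsync cap. With a
# 60Hz monitor and Vsync on, both cards end up at the same 60fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def vsync_fps(fps: float, refresh_hz: float) -> float:
    return min(fps, refresh_hz)

for card, fps in [("1080 Ti", 120), ("2080 Ti", 150)]:
    print(f"{card}: {frame_time_ms(fps):.1f} ms/frame, "
          f"{vsync_fps(fps, 60):.0f} fps with 60Hz Vsync")
```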
 
The cooler and memory parts of the interview are just difficult to watch; it's really shoddy testing.
As for the memory timings, they were either incompetent or they knew exactly what they were doing.
 
Really shocking shenanigans from Intel. We, as consumers, need to come down hard on crap like this, which seems increasingly to be the norm.

I hope you saw the Hardware Unboxed video posted today about the new Intel HEDT lineup?

It clearly displays the arrogance of this company.
 
It tells you which CPU can push a GPU to the highest FPS. Whether that is relevant to a particular use case is for the individual to work out, but if we're going to gauge gaming performance then we need to separate the CPUs somehow. It's like comparing a 1080 Ti vs a 2080 Ti at 1440p: the 1080 Ti might hit 120fps and the 2080 Ti might hit 150fps. What does this mean to the end user who might have a 1440p/60Hz monitor and play with Vsync on?

At the end of the day, the CPU that can hold the highest single-thread frequency is going to dish out the highest max FPS stat.

But yeah, game smoothness is something else, and latency is NEVER TESTED.

Ryzen has its USB ports, or some of them, controlled directly by the CPU.

Nobody benchmarks this, and it's key for VR and competitive online shooters.

With Meltdown and Spectre affecting caches and storage, from what I've been told game loading times have increased on the Intel side, BIG TIME.

On older Intel systems the cracked version of F1 2018 can take up to 2 minutes to load.

On my PC it takes around 20-30 seconds from launch to menu on SSDs.

This isn't the end of it. I knew as soon as I saw last year that CPU cache memory had to be remapped that this was going to be a huge performance penalty, and as time goes on it will hit harder.
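On the loading-time claim: I can't vouch for the F1 2018 numbers, but the mechanism is at least plausible, since the Meltdown (KPTI) patches add overhead to every kernel entry, and loading a game is very syscall-heavy. A rough, hypothetical sketch of the shape of such a test ("scratch.bin" is a made-up name; results vary wildly per system, so this proves nothing on its own):

```python
# Rough sketch: time lots of tiny unbuffered reads, i.e. lots of
# syscalls, the kind of work the Meltdown/KPTI patches slow down.
# "scratch.bin" is a made-up name; numbers vary wildly per system.
import os
import time

path = "scratch.bin"
with open(path, "wb") as f:
    f.write(os.urandom(1 << 20))          # 1 MiB of junk to read back

start = time.perf_counter()
with open(path, "rb", buffering=0) as f:  # unbuffered: one syscall per read
    while f.read(512):                    # 2048 reads of 512 bytes
        pass
print(f"{time.perf_counter() - start:.4f}s for 2048 small reads")
os.remove(path)
```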
 
It tells you which CPU can push a GPU to the highest FPS. Whether that is relevant to a particular use case is for the individual to work out, but if we're going to gauge gaming performance then we need to separate the CPUs somehow. It's like comparing a 1080 Ti vs a 2080 Ti at 1440p: the 1080 Ti might hit 120fps and the 2080 Ti might hit 150fps. What does this mean to the end user who might have a 1440p/60Hz monitor and play with Vsync on?

That's the thing, bro.
Everyone is complaining that RTX only hits 1080p/60 max, and that it's not enough, when most people only run 1080p/60 screens.

We can turn RTX off; just know that ray tracing is going to be the main selling point of the next generation of consoles.
 
This thread clearly demonstrates how some people get away with murder; I'd love some of these people sitting on a jury if I were a criminal. One error can be excused, but they pretty well did everything possible that could hobble AMD's performance. Accidental? Hahahaha :)

Edit: Just watching the GN interview. The owner had to clarify that it would have been fairer if they had used the same cooler on all CPUs. So much for the level playing field he was trying to achieve!
 
This thread clearly demonstrates how some people get away with murder; I'd love some of these people sitting on a jury if I were a criminal. One error can be excused, but they pretty well did everything possible that could hobble AMD's performance. Accidental? Hahahaha :)

Edit: Just watching the GN interview. The owner had to clarify that it would have been fairer if they had used the same cooler on all CPUs. So much for the level playing field he was trying to achieve!

Also, what the owner thinks ultimately doesn't mean the people actually doing the testing (the owner doesn't sound like he's that involved day to day) didn't deliberately hobble AMD. After all, Intel is basically paying for it all.
 
Also, what the owner thinks ultimately doesn't mean the people actually doing the testing (the owner doesn't sound like he's that involved day to day) didn't deliberately hobble AMD. After all, Intel is basically paying for it all.

Plausible deniability. First Intel use a third party, then the owner of said third party doesn't know what goes on. He states they are aiming for usage like the average gamer, then uses 64GB of RAM!

My betting is some very clever people knowing exactly how to deflect, obscure and confuse the whole issue whilst letting the dodgy results stand.

That's how you get rich ;)
 
Plausible deniability. First Intel use a third party, then the owner of said third party doesn't know what goes on. He states they are aiming for usage like the average gamer, then uses 64GB of RAM!

My betting is some very clever people knowing exactly how to deflect, obscure and confuse the whole issue whilst letting the dodgy results stand.

That's how you get rich ;)

To me it sounded like he's realised his business is finished. No one will use them any more. I reckon he got screwed by his employees.
 
But yeah, game smoothness is something else, and latency is NEVER TESTED.

Ryzen has its USB ports, or some of them, controlled directly by the CPU.

Nobody benchmarks this, and it's key for VR and competitive online shooters.
How do you know it's key if no-one benchmarks it?
 