ConLake Returns - Golden Sample 8700K's.

What he does is use someone else's information, mainly from the very reviewers we apparently shouldn't be taking any notice of, to make his own point look valid. I don't see any detail of his own, just someone else's work, which is itself suspect. If we are going to be skeptical about reviewers' work, well, he is using their numbers.

I don't have a favourite colour but I guess that's all people see when you criticise something they like.
If reviewers did something wrong in a review, why are you not allowed to point that out? He is doing it, and shows proof, analysis, comparisons from various sources etc. Like sideways said above, just because he is doing something different doesn't mean it's wrong. The people who are against Adored never show any arguments in the thread to prove him wrong, never. They just say his ego this, his ego that, he doesn't do reviews, but where is the analysis showing that he is doing it wrong? HU tried and failed.
 
"Did something wrong" implies absolute correctness on this topic. I don't think using faster RAM is necessarily "wrong", in fact I think he'd be up in arms if they reviewed Ryzen and didn't overclock the RAM, for example. Including older games is irrelevant IMO, to be representative of what people play you have to choose popular games, not "modern" ones, and that would likely include games like GTA V and WoW. The gripe about settings is more valid, if only because they don't explain why they used different settings for some games compared with settings they'd used in previous reviews. What settings should be used when benchmarking games though? There is no right answer. I suppose "typical" settings that'd get you 60 FPS at 1080p with a mid-range GPU might be most representative but you're never gonna please everyone.

As he says at the start of the video, you can prove anything with the right testing parameters.
 
If reviewers did something wrong in a review, why are you not allowed to point that out? He is doing it, and shows proof, analysis, comparisons from various sources etc. Like sideways said above, just because he is doing something different doesn't mean it's wrong. The people who are against Adored never show any arguments in the thread to prove him wrong, never. They just say his ego this, his ego that, he doesn't do reviews, but where is the analysis showing that he is doing it wrong? HU tried and failed.

I'm not saying he can't point it out. Why would I do that? He is using data that is potentially also flawed, so I see no proof. His argument is based on the same flawed data, unless we are picking which reviewers' data we like. I imagine we could do something similar for any reviewer and find similar inconsistencies.
 
He's not a fanboy, that's for sure. Yeah, alright, he references himself a bit too much for my liking, and I've said that before, but one thing he certainly is not is a fanboy. You can't have watched many of his videos to come to that conclusion. People who do, I'm sure, are just repeating what they've heard online about him from disgruntled fanboys of other members of the tech press.

I used to be a subscriber of AdoredTV and just watched a couple of the videos he mentions when he states that he is not a fanboy. In both, the titles are less clickbait and he is much, much kinder to AMD.
 
I'm not saying he can't point it out. Why would I do that? He is using data that is potentially also flawed, so I see no proof. His argument is based on the same flawed data, unless we are picking which reviewers' data we like. I imagine we could do something similar for any reviewer and find similar inconsistencies.
So it's OK to make reviews with flawed data, but it's not OK to analyze those reviews?
In science there's this thing called peer review, whose purpose is to catch both errors/mistakes and purposely fraudulent data.
And it works precisely by others analyzing each other's results.

A major part of Intel's advantage over Ryzen comes from those very aggressive boost clocks.
And it's exactly those high clock speeds, shown in reviews using high-end motherboards and coolers, that average consumers look at and repeat.
So there could be a lot more scrutiny of how well those results correspond to the PCs of average consumers.

There are certainly lots of big-brand "supermarket" PCs with Intel CPUs selling, no doubt because of those "Intel is always better" results.
But those same PCs also skimp on everything possible.
Starting with cheap (stock) coolers, which surely aren't enough to handle the heat output of those high boost clocks (with the non-soldered heatspreader making temperatures worse).
Those cheap motherboards also certainly don't have strong enough VRMs to keep CPU clocks maxed.
So the "TDP" is likely capped in many of them in the first place.

Intel even specifies that it is the base clock which is used to define TDP.
So the 8700K in a standard off-the-shelf PC is unlikely to spend much time at those high boost clocks.
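
To put rough numbers on that (a back-of-envelope sketch with voltages I've assumed myself; only the clocks and the 95W TDP are Intel's figures), dynamic power scales roughly with frequency times voltage squared, so the draw at all-core turbo can sit well above the figure the TDP was rated at:

```python
# Back-of-envelope only -- the voltages below are assumptions, not measured data.
base_clock_ghz = 3.7    # 8700K base clock; TDP is specified at this operating point
boost_clock_ghz = 4.3   # 8700K all-core turbo
base_voltage = 1.10     # assumed core voltage at base clock
boost_voltage = 1.25    # assumed core voltage at all-core turbo
tdp_watts = 95.0        # Intel's rated TDP

# Dynamic power scales roughly with frequency * voltage^2.
scale = (boost_clock_ghz / base_clock_ghz) * (boost_voltage / base_voltage) ** 2
boost_power = tdp_watts * scale
print(f"Estimated all-core turbo draw: {boost_power:.0f} W vs a {tdp_watts:.0f} W TDP")
```

Even with fairly tame assumptions the estimate lands around half as much again over the TDP that the stock cooler and budget VRMs were sized for, so the chip has to fall back towards base clock once the limits kick in.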
 
Excellent video.

Glad he called it out, but the rest of the industry will be silent and play dumb over it.

What makes it even worse is that those statistical figures from Silicon Lottery come with the following as well:

short stress tests
an AVX offset of -2
very high voltage (yes, I consider 1.4v or higher very high, and not a good idea for 24/7 use)

My 8600K's stock voltage out of the box in the BIOS was around 1.28v. At around 1.31v I can get 4.8GHz (which is what I run at now) with no AVX offset.
To reach 5GHz (and 4.9GHz, which needed the same vcore), I need 1.38v.

I also wonder why the press and some other people have so many temp issues; my Haswell, not delidded, has way better temps than my i5 750. My 8600K, even at close to 1.4v in my testing, never went over 75C (not in a case, but that's the same as the reviewers, since most of their testing is on a test bench). At 1.28v under a non-AVX gaming load it doesn't even hit 50C inside the case with mid-range air cooling, and at 1.31v under an AVX stress load in the case it's under 65C with mid-range air cooling. Idle temps are under 25C inside the case. Granted, I did this testing during Feb/March.

If I were prepared to run at those horrible 1.42v-plus voltages, what clock would I reach? I think maybe 5.1 or 5.2GHz, but I needed a 0.07v bump in the BIOS setting to go from 4.8 to 4.9/5.0GHz, so if the same again were needed to reach 5.1GHz it would be equal to a 5.0GHz Silicon Lottery sample, which is apparently below average.
 
What a massive ego this AdoredTV has. All he does is reference himself and his videos and goes on as though people really give a toss about what he has to say. Just a massive troll. He also never does any testing of his own but trawls the internet for figures that suit his purpose.

"You forced me to do this Steve and I am going to forgive you". What a massive ****.

To be honest it's better than everyone patting each other on the back and being scared to raise criticism.
 
Anyone who thinks he is an AMD fanboy needs to watch this right until the end.

Indeed, and he could actually have taken another dig at the retail test. I just started watching the Hardware Unboxed 10-CPU test video and found out the 10 retail CPUs were "donated". Oh dear.

So still not anonymously purchased from a retailer.
 
To be honest it's better than everyone patting each other on the back and being scared to raise criticism.
That's the thing with 90% of the tech review channels on YT: everyone is pals with each other and they don't want to upset the big manufacturers, namely Intel, Nvidia and Asus, who all pay them or give them hardware to be used in custom build videos (go and watch any LTT video that involves building a brand new PC of some sort and it's based around hardware from those three companies). It drives me up the wall, and I find a lot of the content put out by LTT, Paul's Hardware, Bitwit (or whatever his name is), Jayz2cents etc. is a rehash of content they've done previously, and it's dull and predictable, so it's good someone is out there looking at what these guys are doing and holding them to account.

Clearly there are people in this thread who don't like him and think he's egotistical, but if it brings about change so that consumers get better and more accurate information and opinions on the hardware that will end up in their hands, then that can only be a good thing.
 
To be honest it's better than everyone patting each other on the back and being scared to raise criticism.

Agreed. This is why (at least to me), when I watch one video review or read one review of a particular product, it feels like I have watched/read them all. There is really no diversity in the reviews or different approaches in testing anymore.
 
The problem with reviewers using high-end cooling and high-end boards for reviews still seems to be lost on folk, especially the reviewers themselves.
The vast majority of the chips that go out never see a high-end cooler or an overclocking tool, yet the inflated results from using them get taken as gospel and truth, when in reality you have to be delving deep into a PC to do this.

I had to laugh at the bit where he called out the use of the Intel overclocking tool to deal with the 65-watt issue. I mean, come on... how can any reviewer say that with a straight face? Yes, I want to know the numbers a chip can put out when pushed, but bugger me, that's not stock.

Far too much of the tech press are yes men; it's bloody obvious and you must be really naive if you don't see it.
 
That's the thing with 90% of the tech review channels on YT: everyone is pals with each other and they don't want to upset the big manufacturers, namely Intel, Nvidia and Asus, who all pay them or give them hardware to be used in custom build videos.
Yep, there's pretty much always some expensive Asus ROG stuff there...
while at the same time Asus has apparently been cutting down their non-ROG motherboards worse and worse.

And no doubt the cost of this current RGB fad is also taken out of the actually important parts.
Yet you don't see reviewers checking that, only advertising those fancy lights.
 
The problem is that both AMD and Intel usually send review kits with higher-end motherboards and cooling as standard, which is good for the published numbers but less indicative of normal performance.

What I think reviewers should do is test the CPUs with the high-end parts, since this is what is usually sampled by AMD and Intel at launch, and then do a follow-up review with chips using stock cooling or lower-end cooling in a cheaper motherboard.
 
The problem with reviewers using high-end cooling and high-end boards for reviews still seems to be lost on folk, especially the reviewers themselves.
The vast majority of the chips that go out never see a high-end cooler or an overclocking tool, yet the inflated results from using them get taken as gospel and truth, when in reality you have to be delving deep into a PC to do this.

I had to laugh at the bit where he called out the use of the Intel overclocking tool to deal with the 65-watt issue. I mean, come on... how can any reviewer say that with a straight face? Yes, I want to know the numbers a chip can put out when pushed, but bugger me, that's not stock.

Far too much of the tech press are yes men; it's bloody obvious and you must be really naive if you don't see it.

Also, with these graphs featuring 8400 chips, I wonder how many of them had the TDP limit raised in the BIOS to stop TDP throttling, and as such produced unrealistic results.

The point Adored was making is that the 8400 chips, which have a pretty low base clock mixed with a beefy turbo clock, cannot sustain those turbo clocks for long periods because doing so pushes the chip out of spec, so it throttles down. On the non-overclocking boards these chips will often be paired with, the end user cannot adjust the TDP limit in the BIOS and so is bound by those limits; the only way around it is via that Intel XTU app.
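
Here's a minimal sketch of how that limit scheme plays out on paper (my own simplified model of the PL1/PL2/tau turbo budget, with assumed numbers for a 65W part, not measurements from any board): the chip may draw above TDP only for a short window, then has to fall back to the long-term limit.

```python
# Simplified model of the PL1/PL2/tau turbo budget -- assumed values for a
# 65 W part, purely to illustrate the throttling, not figures from any board.
pl1 = 65.0          # long-term power limit (= TDP) on a locked board
pl2 = 81.0          # short-term limit (often around 1.25x PL1)
tau_seconds = 28    # assumed turbo window before falling back to PL1
turbo_power = 78.0  # assumed package draw at full all-core turbo

for t in range(0, 60, 5):
    limit = pl2 if t < tau_seconds else pl1   # budget runs out after tau
    power = min(turbo_power, limit)           # draw is clamped to the active limit
    state = "turbo sustained" if power >= turbo_power else "throttled back"
    print(f"t={t:2d}s  limit={limit:.0f} W  draw={power:.0f} W  {state}")
```

Enthusiast Z-boards with MCE on typically just raise or ignore those limits, which is a big part of why review numbers can look nothing like a locked prebuilt.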

So many things, it's unreal:

MCE being left on for so-called stock tests.
Running 8400 chips in ROG motherboards.
Doing GPU benchmarking on rigs with Threadrippers or Xeons.
Doing CPU tests on well-optimised games that don't reflect the bulk of the games on the PC market. Even games that give Intel a modest, say, 10-20% lead are actually favouring heavy-core chips, as the true picture for badly coded games is roughly a 30-40% advantage to highly clocked Intel chips. They can test The Witcher 3, Doom and the like, but they should mix in low-budget JRPGs, so end users see a more realistic picture. There is more to gaming than high-budget Western FPS games.
Only testing hardware that's been supplied by the vendors. These also come with "review guidelines", where reviewers are asked to push specific things in the reviews, and the vendor can even advise on what hardware to benchmark against. I remember the Nvidia Maxwell reviews where almost every reviewer only compared against certain cards. Also, the Vega reviews omitted the 1080 Ti; it was revealed that this was because AMD requested it.
Staying quiet on flaws, unless there is "overwhelming" bad PR from them, e.g. the 970 VRAM issue.

In regards to the games being tested, on that 10 Intel retail chips video from Hardware Unboxed, he made a comment along the lines of reviews not being accurate as they are testing the wrong games, so he recognised the problem, although he does the same thing himself. Another reviewer, I forget who it was, also addressed it once; he said the reason they pick highly CPU-optimised games is that they feel an unoptimised game doesn't allow a multi-core CPU to show its full potential. Well yes, that's true, but that's simply the state of the market. If you're picking games to allow multi-core CPUs to show their full potential then you're effectively being biased towards heavy-core CPUs.
 
Plenty of websites test popular games but I would like more of them to be covered:

http://store.steampowered.com/stats/

Some others include Fortnite, etc. That is what they really need to be looking at, as well as some of the more advanced ones. The Witcher 3 is a very valid choice though, as it is still popular, and that doesn't even include the DRM-free version played through GOG, which CDPR owns.

I would also like tests with typical gaming GPUs, i.e. a GTX 1060 as well as a Titan X, since too many CPU reviews tend to make games look more CPU-limited than they are, when for the most part people will be GPU-limited. From my own experience, if a game is truly CPU-limited you see it even with slower graphics cards: in the games where I am CPU-limited I saw the same CPU-limited dips with a GTX 960, RX 470 and GTX 1080.
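
A crude way to see why is to treat frame rate as the lower of a CPU cap and a GPU cap (a common simplification; the numbers below are made up for illustration, not taken from any review):

```python
# Frame rate as min(CPU cap, GPU cap) -- illustrative numbers only.
scenes = {
    "typical scene":   {"cpu_cap": 140, "gpu_caps": {"GTX 1060": 70, "Titan X": 160}},
    "CPU-heavy scene": {"cpu_cap": 65,  "gpu_caps": {"GTX 1060": 70, "Titan X": 160}},
}

for scene, data in scenes.items():
    for gpu, gpu_cap in data["gpu_caps"].items():
        fps = min(data["cpu_cap"], gpu_cap)
        bound = "CPU-bound" if data["cpu_cap"] <= gpu_cap else "GPU-bound"
        print(f"{scene:15s} {gpu:9s} ~{fps:3d} fps ({bound})")
```

With the Titan-class card even the typical scene ends up CPU-bound, which is how big-GPU CPU reviews can overstate how CPU-limited a game feels on a GTX 1060, while the genuinely CPU-heavy scene dips to the same figure on both cards.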

I remember a thread here where someone asked about a new CPU for Far Cry 5 and I pointed out they were more GPU-limited than CPU-limited, and everyone else was just arguing about whether Ryzen or CFL was better, instead of looking at what their setup had.

On top of this, reviews need to try to test longer sequences, and at least two to three areas in games, otherwise it might paint a wrong picture of a game being more CPU-limited, or even less CPU-limited, than it is in most real-world gameplay. I saw that with the FO4 testing, where only HardOCP realised that large user-built settlements were where the biggest CPU limitations were, not wandering through most of the world. Also, a whole load of websites artificially unlocked framerates, which was pointless as FPS and physics are linked in that game.
 
But as PC enthusiasts that's a given, right? It's the same as the last video. He is pointing out to us that benchmarks done at different times will give different results. Well, of course. Even running them on the same day can throw up odd inconsistencies.

You know that video AdoredTV made to prove that low-resolution testing to show future gaming performance on a given CPU was flawed? That the once-slower FX 8350 is now faster than the 2500K in modern games?

Well, Steve at Hardware Unboxed made a counter video to that, in which he concluded the 2500K was still faster. There are a number of discrepancies with Steve's video: still using 2500K-era games, using different settings in different games almost as if looking for ways to make the evidence fit the agenda, and worse, having very much lower results for the FX-8350 in this test than he had for the same games at the same settings in the past. Jim at AdoredTV put it down to mistakes; others not so kind might say Steve was making the numbers up so he could prove his point.

My trust in him has just gone out of the window.
 
Well, Steve at Hardware Unboxed made a counter video to that, in which he concluded the 2500K was still faster. There are a number of discrepancies with Steve's video: still using 2500K-era games, using different settings in different games almost as if looking for ways to make the evidence fit the agenda, and worse, having very much lower results for the FX-8350 in this test than he had for the same games at the same settings in the past. Jim at AdoredTV put it down to mistakes; others not so kind might say Steve was making the numbers up so he could prove his point.

My trust in him has just gone out of the window.

The biggest difference was that Steve was using high-speed/overclocked memory (2133MHz, I believe), which gave the 2500K a large boost over previous tests with RAM that was around 1333 or 1600MHz.
 
You know that video AdoredTV made to prove that low-resolution testing to show future gaming performance on a given CPU was flawed? That the once-slower FX 8350 is now faster than the 2500K in modern games?

Well, Steve at Hardware Unboxed made a counter video to that, in which he concluded the 2500K was still faster. There are a number of discrepancies with Steve's video: still using 2500K-era games, using different settings in different games almost as if looking for ways to make the evidence fit the agenda, and worse, having very much lower results for the FX-8350 in this test than he had for the same games at the same settings in the past. Jim at AdoredTV put it down to mistakes; others not so kind might say Steve was making the numbers up so he could prove his point.

My trust in him has just gone out of the window.

Watching through the video:

The ComputerBase scores show no improvement whatsoever until the last benchmark. Even in the same year, before moving to a Pascal card, the FX lost. That proves nothing other than that if you were willing to wait with your FX until Pascal came along you might be onto a winner. The other so-called improvements for the FX were within margin of error and so can be discounted.

The scores he found while trying to prove the difference between a 2500K and an FX were not comparing those actual CPUs. He also mentions the change from 2500K to 2600K for every score, but fails to mention the change from 8350 to 8370. I imagine all of these systems were very different each time as well. What were the specs of the rest of the machines? Not very reliable.

As for the HU discrepancies, it's the change of settings. If HU wanted to purposely show the FX in a bad light, why keep a video up showing it off at its best? Why not remove the video, or use the best settings for the 2500K in both videos?

He then removes the games he doesn't like from his list and corrects the scores for the three games where the discrepancies were (even though his list is still showing the original HU scores), and it still shows the FX losing.

He then tries to blame memory speeds as though that is HU's fault. Why wouldn't you use the fastest memory to get the most from your CPU?

So the only way he shows anything is with ComputerBase (one source), blaming the fact that the 2500K does well with faster memory (like Ryzen, so we keep hearing), or changing and removing games to suit his purpose.

He has not done what you suggest at all.
 