AMD THREADRIPPER VS INTEL SKYLAKE X

> I even ran BF1 64-man to compare. Pffff, I lost 5fps... (while all the above were ON as well). In The Division I played the DZ and saw zero difference in FPS.

BF4 and BF1 respond well to overclocks - the difference between my 4820K at stock and at 4.6GHz is the difference between ~50fps and holding 60fps, or, more typically with the settings I play at, going from 80 to almost 100fps, making better use of my 144Hz monitor. (This is at 1440p.)

Granted, not all games respond as well, especially if you up the res to 4K, but you are massively downplaying it, even on cutting-edge CPUs.

I can only imagine there is either something wrong with your setup or you are comparing against a poor overclock.

EDIT: Just got someone with a 6800K to run a BF1 benchmark - fair enough, the difference in average FPS stock vs overclock was barely over 5%, but the difference in minimum FPS was almost 20%, which isn't to be sniffed at.
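
If anyone wants to sanity-check that kind of split themselves, here's a rough Python sketch of how average vs minimum FPS falls out of a frame-time log - the numbers and the 1%-low definition are just for illustration, not from the benchmark above:

[CODE]
# Rough sketch: average vs "1% low" FPS from a frame-time log (ms),
# e.g. captured with FRAPS or PresentMon. Numbers invented for illustration.
frame_times_ms = [11.2, 11.5, 10.9, 12.0, 25.3, 11.1, 11.4, 30.1, 11.0, 11.3]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
n = max(1, len(worst) // 100)                  # the worst 1% of frames
low_fps = 1000.0 / (sum(worst[:n]) / n)

print(f"average: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")
# An overclock that mainly removes the occasional slow frame moves the
# minimum a lot (the ~20%) while barely shifting the average (the ~5%).
[/CODE]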
 
I never bothered overclocking my 5930K or my 1800X or any GPU, and performance has always been good. Reviews make it sound like overclocking is needed to play Tetris at 640x480.

Yeah, apparently while we are moving off 2560x1440 to 3440x1440, 4K or VR, 720p and 1080p have somehow become mandatory resolutions for testing. Hell, not even my TV can do 1080p atm. It's 4K HDR :P and the PS4 Pro handles it fine. Ah, I forgot - maybe we need to ask Sony to downgrade the device and chop off 4 of its cores to bring it in line with "the mainstream PC" :P

Exactly this! I posted over in the 7700K binned thread a while back. Even when overclocked to 5.2GHz, the increase over stock was minimal - like within-margin-of-error stuff. The money would be better spent on the GPU. I'm not saying there isn't a place for it, but once you get over a certain limit you hit very diminishing returns. And it seems that's what Intel have done here: that extra speed equates to almost nothing over the previous gen, but has brought a load of heat and power consumption with it.
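
That's the bottleneck argument in a nutshell: fps is capped by whichever of the CPU or GPU is slower, so past a point extra clock buys nothing. A toy Python model, all numbers made up:

[CODE]
# Toy model of diminishing returns, all numbers invented for illustration:
# fps is limited by whichever of the CPU or GPU is the slower side.
def fps(cpu_ghz, cpu_fps_per_ghz=25.0, gpu_fps_cap=120.0):
    return min(cpu_ghz * cpu_fps_per_ghz, gpu_fps_cap)

for clock in (4.2, 4.5, 5.0, 5.2):
    print(f"{clock} GHz -> {fps(clock):.0f} fps")
# 4.2 -> 105, 4.5 -> 112, then the GPU cap: 5.0 and 5.2 both give 120.
[/CODE]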

Back on topic: Threadripper has got this in the bag.


I agree on the diminishing returns. [email protected] vs 5GHz made no difference in perf, just generated heat; some benchmarks were better, but real gaming wasn't. What improved my gaming experience was the jump to 6+ core CPUs.


> BF4 and BF1 respond well to overclocks - the difference between my 4820K at stock and at 4.6GHz is the difference between ~50fps and holding 60fps, or, more typically with the settings I play at, going from 80 to almost 100fps, making better use of my 144Hz monitor. (This is at 1440p.)
>
> Granted, not all games respond as well, especially if you up the res to 4K, but you are massively downplaying it, even on cutting-edge CPUs.
>
> I can only imagine there is either something wrong with your setup or you are comparing against a poor overclock.

I had a 4820K. Yes, the stock perf wasn't good enough, even at the 4.9GHz I was running 24/7.
Same applied to the [email protected]. And don't forget its RAM bandwidth is far inferior compared to DDR4.

Where I saw a big jump in perf was when I switched to the [email protected]. Yes, my benchmarks went down compared to the 4930K, but gaming was miles better.
In WOT alone, I went straight from 70fps to 90. (2560x1440.)

However, after buying a 1700X (which Asus burned, and with no mobos available I switched) and a 6800K later, even though both are/were merely overclocked to 4GHz, the gaming performance is far superior to all the previous CPUs, and fps jumped to 110+ where before it was 90. (Same card & res.)

And I will give a very simple example: World of Tanks & World of Warships use the same single-threaded game engine (with some sound processing offloaded to a different thread from time to time).

On the [email protected] it was running on core 2, and that core sat at 90-100% all the time.
On the Ryzen & 6800K it was/is running on core 4 and core 6 respectively. With both systems OC'd @ 4GHz, the core usage never exceeded 60% (only when I run the 6800K at stock does it hit 72%).
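
If anyone wants to watch this themselves, here's a minimal Python sketch (assuming the psutil package is installed; which core the engine lands on is up to the OS scheduler):

[CODE]
# Minimal sketch: sample per-core CPU usage to spot which core a
# single-threaded game engine is sitting on. Needs: pip install psutil
import psutil

for _ in range(10):  # ten one-second samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busiest = max(range(len(per_core)), key=lambda i: per_core[i])
    print(f"busiest core {busiest}: {per_core[busiest]:.0f}%  all: {per_core}")
[/CODE]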


------------------------

In the meantime, on that system I always have TS running (because of the clans), internet broadcasting (Addictive 90s), and a few tabs open in a second browser for quick alt-tabbing, which is SMOOTH and without delay compared to the [email protected]. Most of the time Steam & Uplay are running in the background as well.

Some could complain that I run too many apps in the background, but isn't that what PCs are for? Why restrict myself to 4-core CPUs, where all processes are packed onto 4 cores (and whatever is left over onto their threads), rather than spread them across 6, 8 or even 16 cores, now that they are becoming affordable? (The 16c TR should go for around £850, given that the server part is around $700!!)

Hell, with the Unity 5 work I now have in process, I will even be able to render faster, on a machine that won't break the bank and that I can use for things other than just rendering. (I tried a Xeon and handed it back; I couldn't justify the cost and its uselessness for general usage.)

And yes, I can live with 110-112fps at 2560x1440 in World of Tanks & Warships with FreeSync (or, atm, without it, while playing on a GTX 1080 Ti). I do not want the 200 a 5GHz 7700K might give me, because I will have tearing :P (muahaha)
 
> Same applied to the [email protected]. And don't forget its RAM bandwidth is far inferior compared to DDR4.

The 4820K with its quad channel is pretty much never memory bandwidth starved - in fact, going for nuts 3000+MHz RAM can actually compromise its performance in some situations; the best results came from a good balance of MHz and tight timings. DDR4 seems a bit less susceptible to that in more common scenarios.
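
The back-of-the-envelope numbers agree - peak bandwidth is roughly transfer rate x 8 bytes per 64-bit channel x channel count, ignoring timings and real-world efficiency entirely:

[CODE]
# Back-of-the-envelope peak bandwidth: MT/s x 8 bytes (64-bit channel)
# x number of channels. Ignores timings and real-world efficiency.
def peak_gb_s(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000.0

print(f"4820K quad-channel DDR3-1866: {peak_gb_s(1866, 4):.1f} GB/s")
print(f"dual-channel DDR4-3200:       {peak_gb_s(3200, 2):.1f} GB/s")
# ~59.7 vs ~51.2 GB/s - the quad-channel DDR3 platform has headroom,
# which is why tighter timings tend to matter more than raw MHz there.
[/CODE]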

Personally though, I'm an old-school FPS junkie to a degree - for any kind of fast-paced online game I will drop settings to hold around 100+ FPS, and you really feel the difference in CPUs and an overclock in that scenario, which might bias my perception somewhat. I usually play BF4, for instance, with "low pro" settings, where even in 64-player games I'm holding 144 FPS with ease.

> I do not want the 200 a 5GHz 7700K might give me, because I will have tearing :p (muahaha)

Not if you use an nVidia GPU with FastSync :p
 
> The 4820K with its quad channel is pretty much never memory bandwidth starved - in fact, going for nuts 3000+MHz RAM can actually compromise its performance in some situations; the best results came from a good balance of MHz and tight timings. DDR4 seems a bit less susceptible to that in more common scenarios.
>
> Personally though, I'm an old-school FPS junkie to a degree - for any kind of fast-paced online game I will drop settings to hold around 100+ FPS, and you really feel the difference in CPUs and an overclock in that scenario, which might bias my perception somewhat. I usually play BF4, for instance, with "low pro" settings, where even in 64-player games I'm holding 144 FPS with ease.



> Not if you use an nVidia GPU with FastSync :p


Fast Sync is a blessing for competitive FPS games; I just can't play Rainbow Six Siege or CS:GO without it anymore. Even Overwatch feels unresponsive under 300fps, due to the game suffering from horrific input lag below that.

Hopefully the new 200Hz ultrawides announced by Acer/Asus will mean vsync shouldn't have any noticeable input lag. Even with my Titan Xp I play Siege at minimum settings (except distance and shadows) just to stay above 200fps when there are lots of effects going on etc.

Don't get me wrong, I enjoy slower-paced games like The Witcher 3 on my TV at 60fps too, but high refresh rates just feel so buttery smooth that it's hard to go back.
 
> Overwatch feels unresponsive under 300 fps :p
You provide so much entertainment. I feel like I should pay you or something.

It's not as silly as it sounds - Overwatch has some funky built-in frame buffering; if you are used to games like CS:GO it can be noticeable. Rendering very fast can brute-force around the problem to an extent.

I think it's related to the way the game uses a lot of client-side tricks for "netcode" which aren't used by your average competitive shooter, due to the different nature of the gameplay.
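
Rough numbers for why brute-forcing the framerate helps: every queued frame costs one frame time, so the same queue depth hurts far more at low fps. (The depth of 3 below is my guess for illustration, not Overwatch's actual figure.)

[CODE]
# Rough illustration: delay added by a render queue is about
# (frames buffered) x (frame time). Depth of 3 is an assumption,
# not Overwatch's actual internal buffering.
BUFFERED_FRAMES = 3

for fps in (60, 144, 300):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps: {frame_ms:4.1f} ms/frame, "
          f"~{BUFFERED_FRAMES * frame_ms:4.1f} ms of queue delay")
# ~50 ms at 60 fps vs ~10 ms at 300 fps - rendering very fast shrinks
# whatever the game insists on buffering.
[/CODE]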
 
> It's not as silly as it sounds - Overwatch has some funky built-in frame buffering; if you are used to games like CS:GO it can be noticeable. Rendering very fast can brute-force around the problem to an extent.

I still remember the best SC2 player I ever met was running it on a Phenom II X4 and an old AMD card, and was in one of the higher-ranked leagues. He still thrashed people at a LAN I was at who were running Core i5s etc. and probably getting much higher framerates, and they were kind of shocked.
 
> It's not as silly as it sounds - Overwatch has some funky built-in frame buffering; if you are used to games like CS:GO it can be noticeable. Rendering very fast can brute-force around the problem to an extent.
>
> I think it's related to the way the game uses a lot of client-side tricks for "netcode" which aren't used by your average competitive shooter, due to the different nature of the gameplay.

Sat here playing at 144Hz and I've never had any feeling of input lag. Framerate is capped at 154 and it feels great. The pros are playing at 144Hz too :/
 
> I still remember the best SC2 player I ever met was running it on a Phenom II X4 and an old AMD card, and was in one of the higher-ranked leagues. He still thrashed people at a LAN I was at who were running Core i5s etc. and probably getting much higher framerates, and they were kind of shocked.

Framerate != skill :p but it's not unlikely he'd do even better on a higher-spec setup.

> Sat here playing at 144Hz and I've never had any feeling of input lag. Framerate is capped at 154 and it feels great. The pros are playing at 144Hz too :/

Some people notice stuff like that more than others - personally, I'm pretty sensitive to it.
 
> Framerate != skill :p but it's not unlikely he'd do even better on a higher-spec setup.

> Some people notice stuff like that more than others - personally, I'm pretty sensitive to it.

I dunno, he was pretty highly ranked. Regarding Overwatch, I tend to get around 100 to 120FPS on my setup at qHD. I would say I am more limited by skill and tactics than by my framerates TBH!! :p

Also, go and try Planetside 2 - you won't be getting 100s of FPS during the largest battles there! :p
 
> Framerate != skill :p but it's not unlikely he'd do even better on a higher-spec setup.

> Some people notice stuff like that more than others - personally, I'm pretty sensitive to it.

I am very sensitive to it. I cannot play console games because of it, hence the 144Hz monitor.
Regarding Planetside, it doesn't matter what settings I change; the game runs and feels like a piece of crap :p
 
> I dunno, he was pretty highly ranked. Regarding Overwatch, I tend to get around 100 to 120FPS on my setup at qHD. I would say I am more limited by skill and tactics than by my framerates TBH!! :p
>
> Also, go and try Planetside 2 - you won't be getting 100s of FPS during the largest battles there! :p

It also depends on your playing style - personally, I'm someone who plays very much on instinct and reactions, and framerate and responsiveness can massively impact how well my skill (such as it is) translates into the game. Someone with, say, a more deliberate gaming style might have different requirements and be more or less bothered by different things than I am.
 
> I am very sensitive to it. I cannot play console games because of it, hence the 144Hz monitor.
> Regarding Planetside, it doesn't matter what settings I change; the game runs and feels like a piece of crap :p

Some of the shadow settings can push even a GTX 1080 at qHD, FFS!!

> It also depends on your playing style - personally, I'm someone who plays very much on instinct and reactions, and framerate and responsiveness can massively impact how well my skill (such as it is) translates into the game. Someone with, say, a more deliberate gaming style might have different requirements and be more or less bothered by different things than I am.

Don't worry, with PS2 it makes no difference - once you get to the major battles with 100s of people on one point, there is only one way your framerates are going.......DOWN............DOWN................GONE!! :p
 
^^ I don't tend to bother with games where the game mechanics or performance get in the way of reasonably responsive playability - if half your potential skill is being handicapped in that way, it's just frustrating.
 
> ^^ I don't tend to bother with games where the game mechanics or performance get in the way of reasonably responsive playability - if half your potential skill is being handicapped in that way, it's just frustrating.

I'm with you there. I don't play PS2 because of the god-awful performance. Same goes for ARK.
If I can't run a game comfortably then I can't enjoy it. I have the annoying ability to detect the slightest framerate drops.
 
Too many years of playing Quake 3 at a capped 125fps has killed my enjoyment of anything less, heh - once you become conditioned to that, there is no going back.
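
(As I recall, the old Q3 caps like 125 come from the engine timing frames in whole milliseconds, so the only reachable framerates are 1000/n - a quick sketch:)

[CODE]
# Why Quake 3 caps sat at values like 125: the engine timed frames in
# whole milliseconds (as I recall), so reachable framerates are 1000/n.
caps = [1000 // n for n in range(3, 11)]  # frame times of 3..10 ms
print(caps)  # [333, 250, 200, 166, 142, 125, 111, 100]
[/CODE]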
 
> ^^ I don't tend to bother with games where the game mechanics or performance get in the way of reasonably responsive playability - if half your potential skill is being handicapped in that way, it's just frustrating.

It's still fun - in the end, very few games offer the experience I got from playing it, especially how frantic it gets when you have a few hundred people contesting a point, and it is still a challenging game even in less intensive areas. It's got decent-sized maps too.
 