
AMD THREADRIPPER VS INTEL SKYLAKE X

I'm with you there. I don't play PS2 because of the god-awful performance. Same goes for ARK.
If I can't run it comfortably then I cannot enjoy it. I have the annoying ability to detect the slightest frame rate drops.

ARK is just a pile of fail - I can at least understand PS2 having performance drops, as the base game was released a few years ago and there's just so much going on.

ARK is just a poorly optimised game, full stop - apparently, according to some posts here, you can't even run the game at 60fps at 1080p on almost-max settings.
 

Last time I tried it on my 4770 with a 1070, it was struggling to hit 40fps on high settings just in the starting area! I don't think I'll be trying it again.
 
It's still fun - in the end, very few games offer the experience I got from playing it, especially how frantic it gets when you have a few hundred people contesting a point, and it's still a challenging game even in less intensive areas. It's got decent-sized maps too.

I know they're very different generations of engine and game style (though there was an update that gave it relatively decent graphics) - I used to play City of Heroes during events where you could get hundreds of players zoned in, and it was pretty playable on what is now fairly ancient hardware.
 
It's not as silly as it sounds - Overwatch has some funky built-in frame buffering; if you're used to games like CS:GO it can be noticeable. Rendering very fast can brute-force around the problem to an extent.

I think it's related to the way the game uses a lot of client-side tricks for "netcode" which aren't used by your average competitive shooter, due to the different nature of the gameplay.

Basically ^that's why high-level/pro teams play at 300fps - because of how Overwatch handles things.

Please remember, guys, I'm talking about high-level ranked play here - it's perfectly playable at whatever fps you desire, but in a competitive scene you need to reduce that input lag down to the bare minimum.

I mean, Rainbow Six Siege is the perfect example: most gunfights are under half a second long, and if you're good and know how to peek corners you often kill people with a prefire before you've even appeared on their screen. That's why even 5ms here and there makes such a huge difference when trying to play competitively.

It's something that I guess has stuck with me, as my main game has been Counter-Strike for the last 15-odd years.
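The fps-vs-input-lag point above comes down to simple frame-time arithmetic. A minimal sketch (illustrative only - real input lag also depends on the engine's buffering, the display, and the netcode):

```python
def frame_time_ms(fps: float) -> float:
    """Time one rendered frame occupies, in milliseconds."""
    return 1000.0 / fps

# Each buffered frame costs roughly one frame time of latency,
# so rendering faster shrinks that cost.
for fps in (60, 144, 300):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

At 300fps a buffered frame costs about 3.3ms versus about 16.7ms at 60fps, which is where "even 5ms here and there" starts to matter.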

Worth mentioning that Titanfall 2 is pretty epic as a fun shooter. It feels nice too, as it's Source engine based, albeit Respawn's heavily modified version.
 
Single-socket 16-core is $750, to be precise.



I'm sorry, I'm really, really old - does that say "Starting at $650" under the 16-core chip?

Because if it does, that's... well, it's pretty aggressive. That's a price so low that Intel's shills are going to have an impossible time convincing anyone they're not worth it over the Intel equivalents. That's the sort of pricing where you get the impression nothing will stop their success, nothing.

That's exactly the sort of viciously proactive ***** AMD need right now - awesome. Do your worst, Intel; at that price it ain't going to matter.
 
Yes it does, but I think that's the price people will pay when ordering a dual-CPU server.
 
I am still using my 2600K, holding off for the 16-core AMD.

Intel have milked everyone for too long, imo. Even if the price is similar and the performance similar, I'd still go AMD.
 
I'm sorry but if AMD's EPYC 16c/32t part is $750 (suggesting the Threadripper version might be $600-700), while Intel are still charging $600 for their new 8c/16t part, Coffee Lake is the only chance Intel has of taking my money. And considering the i7-8700K will have to sit alongside the i5-7600K and i7-7700K for now at least, there's no way their 6c/12t will be under $350 unless they do some price slashing. Can't see it happening, unfortunately. Plus I'm already on a 6c/12t part from 2009 so even with a big jump in IPC, it'd still feel a bit like a sidegrade.
 
How does your Xeon compare to the current 6-core processors, the 5820K or 6800K? I have a mobo lying around and I wanted to see if it's worth a try.
 
Intel may consider their sales volume high enough, even with the decline, that they don't need to shift prices until their 10nm CPUs.
 
Theoretically, a 6800K would have around 40-50% better IPC and would likely clock higher than my chip (4.2 GHz multi-thread, 4.4 GHz single-thread). It'd also have more instruction sets, like AVX.
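A rough way to sketch that comparison, assuming single-thread performance scales roughly as IPC × clock (the numbers below are just the ballpark figures from the post, not benchmark results):

```python
def relative_single_thread(ipc_ratio: float,
                           clock_new_ghz: float,
                           clock_old_ghz: float) -> float:
    """Crude estimate: single-thread performance ~ IPC x clock frequency."""
    return ipc_ratio * (clock_new_ghz / clock_old_ghz)

# ~45% better IPC at the same 4.4 GHz single-thread clock (hypothetical figures)
est = relative_single_thread(1.45, 4.4, 4.4)
print(f"~{est:.2f}x the old chip's single-thread performance")
```

In other words, at matched clocks the IPC gap alone would put the 6800K around 1.4-1.5x ahead per thread, before any clock-speed advantage.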
 
I know from reading a few posts that some don't think it's worth it, but is anyone thinking of getting either platform primarily for gaming? I do Twitch and YouTube and also some video encoding, but mostly game. I've had the trigger finger on a Ryzen build, but I guess it's that need for even newer and shinier things, and I do like the HEDT platforms generally.
 