
AMD 3300X vs Intel i9-10900K... evidence that gamers can save a lot of cash on CPUs

Hi guys. I just watched a very interesting performance comparison video between an AMD 3300X and an Intel i9-10900K that reinforced that, beyond a surprisingly low price point, CPUs don't make much difference to most gamers. This may not be new or revolutionary information to many people, but with the CPU wars REALLY heating up in a post-COVID economy I think it's an ideal time to discuss it and give it some spotlight.


The CPU difference becomes FAR less significant at 1080p Ultra or higher graphical settings, with the AMD 3300X and the 10900K performing on a par, or at least within 10% of each other, despite a massive 430% difference in price. :eek:
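To put that price/performance claim into a rough worked example (a minimal sketch; the prices and the roughly 140-150 FPS averages below are my own illustrative assumptions, not figures from the video):

```python
# Rough cost per average frame (illustrative numbers only; assumed prices
# and assumed 1080p Ultra averages, not measurements from the video).
cpus = {
    "AMD 3300X":       {"price_gbp": 120, "avg_fps": 140},
    "Intel i9-10900K": {"price_gbp": 530, "avg_fps": 150},
}

for name, spec in cpus.items():
    cost_per_fps = spec["price_gbp"] / spec["avg_fps"]
    print(f"{name}: £{spec['price_gbp']} for ~{spec['avg_fps']} FPS "
          f"-> £{cost_per_fps:.2f} per average FPS")
```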

Of course the CPU can matter more in calculation-intensive games such as RTS, 4X etc., but even there it looks like the AMD 3300X will do the job adequately in most cases, and stepping up to a 3600X/3700X will give you the power you need to really be happy.

So what does this mean?

On average, you don't realistically need to buy anything above an AMD 3700X (8c/16t) if you are a high-end gamer who also wants ample power for most creative work for the foreseeable future. An AMD 3600X (6c/12t) at 30% less represents the ideal "budget" option for high-end gaming with breathing room for creative work.

The money you save on the CPU can be pumped into a GPU for FAR more perceptible bang for your buck than pumping extra hundreds into the CPU/MOBO/RAM. Of course, if you are a performance enthusiast or overclocker then this is a moot point: you will buy what makes you happy regardless of cost, and that is fine. But for budget- and value-conscious gamers who simply want the best performance for the money without fiddling around, it makes sense to pair a modest CPU with a high-end GPU.

I really do not see anything offering better overall value than an AMD 3700X-based platform with a bog-standard PCIe 4.0-enabled motherboard and average branded RAM, with the money saved then going into an Nvidia Ampere 30xx on release. The release of the AMD 4000 CPUs in Q3/Q4 will push the AMD 3000 series down to even more ridiculous levels of value. :)
 
I'm guessing that conclusion will be different when the Nvidia 3000 series and Big Navi launch, as there will be less of a GPU bottleneck at the higher resolutions and we'll see performance differences between the different tiers of CPU. It's true that the CPU isn't a massive issue right now.

Also, when the PS5 and Xbox Series X are well embedded after a couple of years, more devs will utilise the 8C/16T in games, and that should introduce more performance differentiation.
 
I don't see the logic of how the NV 3000 series will make a significant difference to CPU limitations, as the benchmarks show that low-end CPUs already provide more than enough power for current GPUs and are not at some maximum capacity. CPUs such as the 3600X and higher will therefore very likely provide more than enough power for even a 3080 Ti when you offload the processing to the GPU. The more you offload to the GPU (more detail and higher resolutions), the better a given CPU will cope and the less the frame rate will depend on it.
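A simple way to picture that argument is to treat delivered FPS as capped by whichever of the CPU or GPU runs out of headroom first. The sketch below uses invented numbers purely to show the shape of the effect:

```python
# Toy bottleneck model: delivered FPS is roughly min(CPU ceiling, GPU ceiling).
# The CPU ceiling barely moves with resolution; the GPU ceiling falls as
# resolution and detail rise. All figures below are invented for illustration.
cpu_ceiling = {"3300X": 150, "10900K": 190}            # CPU-limited FPS
gpu_ceiling = {"1080p": 220, "1440p": 140, "4K": 70}   # GPU-limited FPS

for res, gpu_fps in gpu_ceiling.items():
    delivered = {cpu: min(cpu_fps, gpu_fps) for cpu, cpu_fps in cpu_ceiling.items()}
    print(res, delivered)
# At 1080p the CPUs are the limit, so they separate (150 vs 190).
# At 1440p and 4K both hit the GPU ceiling and deliver identical FPS.
```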
 
I'm pretty much of this mindset.

My current plan is to get an A520 board, the cheapest 8-core CPU (3700X, or 4700X if it is significantly better and a similar price to the 3700X), and dump the rest of the budget into the best GPU I can get late this year / early 2021.
 
Will A520 definitely have PCIe 4.0 lanes? I read conflicting reports when I Googled it.
 
I see lots of posts similar to this, and have one concern

Intel still beats out AMD when it comes to 1080p - it is commonly agreed that this is because the GPU isn’t being bottlenecked here. As a result, when the GPU is being taxed heavily, like at higher resolutions, the CPU becomes less of a worry.

Do you think that with the new GPUs from NVidia, and to a lesser extent AMD, we could see the gap between Intel and AMD open up more, if the GPU is no longer the bottleneck at 1440p?

How much would this really matter do we think?
 
Did you read the above posts thoroughly? Maybe give them another once over. :)
 

I have, and still not sure

I guess it depends on how many FPS you need, vs want.

If we see 1440p become the new 1080p, where differences in IPC are exaggerated, with up to 5-10% in performance delta, it could be the difference between 144 FPS and 158. Not that much really - would it even be noticed?

And, as you say, if the GPUs are that much more powerful, the FPS might be so high that the differences are irrelevant, because they’re all maintaining 165 FPS at 1080p
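For what it's worth, the arithmetic in that example is self-consistent; a quick check:

```python
base = 144                     # FPS on the slower CPU
for delta in (0.05, 0.10):     # 5% and 10% CPU advantage
    print(f"{delta:.0%} faster -> {base * (1 + delta):.0f} FPS")
# 5% faster -> 151 FPS
# 10% faster -> 158 FPS
```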
 
The difference in performance from next-gen GPUs would have to be astronomical in order for a 3700X to become the bottleneck at 1440p / 2160p, given how unstressed CPUs of that class currently are in games.
 
I would have liked to have seen minimum frame rates, frametimes and something other than 1080p considering the 10900k is part of the test.
They also show 4K Ultra? I mean it's not the most exhaustive set of tests as far as min/max figures go, but to me it still confirms the situation, reinforces what we already knew and applies it to the latest CPU generations.

I have, and still not sure

I guess it depends on how many FPS you need, vs want.

If we see 1440p become the new 1080p, where differences in IPC are exaggerated, with up to 5-10% in performance delta, it could be the difference between 144 FPS and 158. Not that much really - would it even be noticed?

And, as you say, if the GPUs are that much more powerful, the FPS might be so high that the differences are irrelevant, because they’re all maintaining 165 FPS at 1080p
I'm still not sure what you really mean, but it seems you may be overcomplicating things in your mind a little...
 
I see lots of posts similar to this, and have one concern

Intel still beats out AMD when it comes to 1080p - it is commonly agreed that this is because the GPU isn’t being bottlenecked here. As a result, when the GPU is being taxed heavily, like at higher resolutions, the CPU becomes less of a worry.

Do you think that with the new GPUs from NVidia, and to a lesser extent AMD, we could see the gap between Intel and AMD open up more, if the GPU is no longer the bottleneck at 1440p?

How much would this really matter do we think?

You answered your own question...

Intel CPUs, in games at least, are about 10% faster due to higher clocks. It's true, Intel make "the fastest chip for gaming".

However... right now you need a 2080 Ti at 1080p to see those extra FPS, because that is where the CPU, not the GPU, is the bottleneck. How many people have a 2080 Ti and play at 1080p? The vast majority use GPUs around the 2070 Super / 5700 XT level, where even at 1080p a Ryzen 3600 is more than enough. It's £100 cheaper than Intel's 10600K; moreover, B450 motherboards are cheaper, £110 for a good one vs £150 for Intel, and the 3600 will run comfortably on a £20 cooler, whereas you would need an entry-level AIO or a better air cooler to stop the 10600K from boiling itself.

A 3600 platform is half the price of a 10600K platform, and with anything from a 2080 down the performance difference is nothing. Hell, if your games aren't too multithreaded you could even get away with a 3300X: a £120 CPU vs a £280 CPU. The cheaper non-K CPUs from Intel are still much more expensive and not as fast, because they don't have the clocks.
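Putting rough totals on that platform comparison, using the CPU and motherboard prices quoted above (the Intel cooler price is my own assumption):

```python
# Approximate platform totals in GBP, using the prices mentioned above.
# The Intel cooler price is an assumption (entry-level AIO or better air
# cooler); the AMD builds assume the £20 cooler mentioned above.
platforms = {
    "Ryzen 3600 + B450":       {"cpu": 180, "board": 110, "cooler": 20},
    "Ryzen 3300X + B450":      {"cpu": 120, "board": 110, "cooler": 20},
    "i5-10600K + Intel board": {"cpu": 280, "board": 150, "cooler": 60},  # cooler assumed
}

for name, parts in platforms.items():
    print(f"{name}: £{sum(parts.values())}")
```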

If you're the sort of guy who always has the best GPU then you're going to have a 10900K to get the most out of it, and that's completely sensible.

For most people, even those on still relatively powerful GPUs (my 5700 XT is no slouch), a Ryzen 3600 is perfect, and it has headroom for a faster GPU upgrade because I don't game at 1080p. And if you're on a B450 or higher board you don't even need to change that when Ryzen 4000 hits, as thanks to AMD's change of mind they will now drop into B450 boards.

PS: a Ryzen 3300X (overclocked) vs a 9900KS.

[Benchmark chart: v8EgPlk.png (overclocked Ryzen 3300X vs 9900KS)]
 
I agree. Frametime consistency is better with the higher end CPUs:

[Chart: tomb-raider_1080p.png (1080p frametime comparison)]


This will be a consideration for some but the 3300X is far from unplayable and the difference is likely unnoticeable unless specifically being monitored.
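For anyone who hasn't worked with frametime data, here is a minimal sketch of how charts like that are usually summarised. The frametime values are synthetic, and "1% low" definitions vary between reviewers:

```python
import random
import statistics

# Synthetic frametimes in milliseconds: mostly ~7 ms (about 140+ FPS) with a
# few slow frames mixed in to mimic stutter. Purely illustrative data.
random.seed(0)
frametimes_ms = [random.gauss(7.0, 0.5) for _ in range(990)]
frametimes_ms += [random.uniform(15.0, 25.0) for _ in range(10)]

avg_fps = 1000 / statistics.mean(frametimes_ms)

# One common "1% low" definition: average FPS over the slowest 1% of frames.
worst_1_percent = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
one_percent_low_fps = 1000 / statistics.mean(worst_1_percent)

print(f"average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {one_percent_low_fps:.1f}")
# A CPU that keeps the 1% low close to the average produces the flatter,
# more consistent frametime line; a big gap shows up as visible stutter.
```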

That Fortnite result is mighty impressive.
 

Very interesting stuff

Seems like the moral of the “story” is that the money is better spent upgrading the GPU than the CPU, once you get to 3600 territory.
 
Absolutely ^^^

I agree. Frametime consistency is better with the higher end CPUs:

[Chart: tomb-raider_1080p.png (1080p frametime comparison)]


This will be a consideration for some but the 3300X is far from unplayable and the difference is likely unnoticeable unless specifically being monitored.

That Fortnite result is mighty impressive.

Fortnite is a lower-threaded game; the picture does look a little different in more heavily threaded games, like SOTR... and it's overclocked in that slide too; it's just to show what it can do once clocked.

On your slide the 2080 Ti is 40% faster than the 5700 XT, while the performance difference here between a stock 3300X and a 5.1GHz OC 10600K or a stock 10900K is 30%.

The same slide with a 5700 XT at medium 1080p settings... seriously, who plays this game like that on high-end GPUs? There would be no difference between the 10900K and the 3300X.
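To make that concrete with the same toy min(CPU, GPU) picture as earlier, with invented FPS ceilings chosen only so the 40% and 30% ratios match the figures above:

```python
# Invented FPS ceilings chosen so the ratios match the discussion above:
# the 2080 Ti's GPU ceiling is 40% above the 5700 XT's, and the 10900K's
# CPU ceiling is 30% above the stock 3300X's.
cpu_ceiling = {"3300X (stock)": 100, "10900K / 10600K OC": 130}
gpu_ceiling = {"5700 XT": 100, "2080 Ti": 140}

for gpu, gpu_fps in gpu_ceiling.items():
    delivered = {cpu: min(cpu_fps, gpu_fps) for cpu, cpu_fps in cpu_ceiling.items()}
    print(gpu, delivered)
# With the 2080 Ti the GPU ceiling sits above both CPUs, so the 30% CPU gap
# is visible. With the 5700 XT both CPUs hit the same 100 FPS GPU ceiling.
```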
 
Also, when the PS5 and Xbox Series X are well embedded after a couple of years, more devs will utilise the 8C/16T in games, and that should introduce more performance differentiation.

I'd agree with that, but if you're building a PC here and now for gaming it's pretty hard to justify spending more, IMO. Might as well just get the 3300X and save the extra cash to upgrade in a few years' time, when/if games do actually benefit from a faster CPU.
 
I will likely be buying a 3700X (8c/16t) in Q3 as I think this is the sweet spot for heavy gaming, creativity and productivity use for the foreseeable future. There is nothing it won't handle for the next couple of years at least. :)
 
Fortnite is a lower-threaded game; the picture does look a little different in more heavily threaded games, like SOTR... and it's overclocked in that slide too; it's just to show what it can do once clocked.

Even so, I'm surprised it's able to match a 9900K in any scenario!
 