
What's the score with AMD CPUs these days?

Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
This is very typical of my own experience moving from an FX 83## to the 4690K

Intel is better when fewer cores are needed; AMD does better when more cores are needed - heavy AI and physics calculations, etc.

It does not make one better than the other; it just makes them different. If the i5 can't keep pace with the FX83## in such a situation, the i3 certainly can't.

image.png


image.png
 
Last edited:
Associate
Joined
1 Nov 2013
Posts
713
Location
Ireland
Hold on, you can't accuse me of cherry picking (I picked that clip completely at random) and then cherry pick Fallout 4 in response. :p

I cherry-picked that example because the i5 has nothing to prove in any game - it's a rock-solid performer and will not be found wanting in anything versus competitors. I can show you loads of benchmarks showing the FX flip-flopping from excellent, to good, to poor performance across various games when compared to the i5, which is consistently reliable. Can you show me benchmarks showing the FX beating the i5 in a variety of games?

That is why simply showing one or two benchmarks where it does fine, like you've done, paints a totally inaccurate picture of the processor.

It does not make one better than the other; it just makes them different. If the i5 can't keep pace with the FX83## in such a situation, the i3 certainly can't.

In some games a Haswell or Skylake i3 can beat the FX. It doesn't make the i3 a better all-round processor, but it does show the inherent weakness of the FX in single-threaded performance, which a lot of games depend on.

As for Fallout 4, the FX is way slower than the i5 in that game. Set everything to ultra including CPU options and see the FX struggle badly.

It's also pointless to use screen caps from videos. Try running the opening 30 minutes of Fallout 4 on both CPUs, where there aren't any major variances in what occurs, and then examine your average and minimum frames for the real picture.
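If you want to do that properly, log the frame times (FRAPS, PresentMon, whatever) and crunch them afterwards. Something like this rough Python sketch would do it - the file name is just a placeholder for your own log:

# Rough sketch: turn a frame-time log (one value per line, in milliseconds
# per frame) into average, minimum and 1% low FPS. The file name below is
# just a placeholder for whatever your capture tool spits out.
def summarise_run(path):
    with open(path) as f:
        frame_times_ms = [float(line) for line in f if line.strip()]

    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)

    # 1% low: average FPS over the slowest 1% of frames, so one freak
    # spike doesn't decide the whole result.
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[:max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)

    return avg_fps, min_fps, low_1pct_fps

avg, mn, low = summarise_run("fallout4_fx8350_frametimes.txt")
print(f"avg {avg:.1f} / min {mn:.1f} / 1% low {low:.1f} fps")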

Also, I know from experience - I build PCs. I've run Fallout 4 on about 10-15 different processors including the i3-6100, FX-6300, FX8350, i5-2400, i7-2600, i5-3570, i7-6700, etc. To maintain a steady 60fps on an FX processor, whether the 6300 or the monster 9570, you have to dial back the CPU-intensive options.

That's not cherry-picking. It is COMMON for the FX-8350 to be fine for gaming. The bottleneck for the huge majority of people is their GPU. They were just supporting their point with an example of a modern and popular game that ran fine with an FX-8350. Nearly all modern popular games will.

Yes, it's common for it to be fine in games, just as it is common for it to be a considerable bottleneck in scenarios where the i5 is not, and there are a few games where it does comparatively very badly. Most games are GPU dependent, but there are also games that are more CPU dependent.

The FX was fine when it came out, but in 2016, you'd want to be actually mad to choose it over any Intel platform for gaming.

When I said it was 'terrible for gaming', I didn't mean it won't play new games. I meant that, compared to the i5, it's relatively poor and supports high-end graphics cards badly.

Also, people are always obsessed with average framerate, and while the FX trails the i5 by anywhere from small to huge margins, minimum frames also tell a huge story, as the benches below show.

Fallout 4, everything set to ultra at 1440p, including all CPU-dependent options.

fallout-4-cpu-benchmark-1440-u.png
 
Last edited:
Associate
Joined
1 Nov 2013
Posts
713
Location
Ireland
So timed benchmark runs from reputable tech websites are bogus, but random YouTube videos using random segments of the game are fine?

Here's one from Techspot at 1080p rather than 1440p, but it shows the same thing - the FX trailing way behind.

But I'm sure you think Techspot are all fools as well?

CPU_01.png
 
Soldato
Joined
5 Dec 2010
Posts
3,163
Location
Solihull
It is funny that Intel's SMT (Hyper-Threading) is rubbish according to some AMD fans. What's the betting that next year, when AMD has its own SMT system rather than their current CMT, it will be the best thing ever. :D:p:D

I haven't heard anyone on here say that.

CMT itself isn't a bad thing. Real cores should beat virtual cores even when they share an FPU.

The problem is that more cores can't make up for a large IPC delta when most programs can't use all of them.
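To put some made-up numbers on it (purely illustrative, not benchmarks), here's a quick Amdahl's-law-style sketch in Python comparing eight slower cores against four faster ones when only part of the work can spread across cores:

# Back-of-the-envelope model with invented numbers: 8 slower cores vs 4 cores
# with ~40% higher per-core throughput, on a workload where only half the
# work can run in parallel (Amdahl's law).
def relative_throughput(per_core_speed, cores, parallel_fraction):
    serial_time = (1.0 - parallel_fraction) / per_core_speed
    parallel_time = parallel_fraction / (per_core_speed * cores)
    return 1.0 / (serial_time + parallel_time)

eight_slow = relative_throughput(per_core_speed=1.0, cores=8, parallel_fraction=0.5)
four_fast = relative_throughput(per_core_speed=1.4, cores=4, parallel_fraction=0.5)
print(f"8 slow cores: {eight_slow:.2f}  vs  4 fast cores: {four_fast:.2f}")
# Prints roughly 1.78 vs 2.24: the per-core advantage wins. Push
# parallel_fraction towards 1.0 and the extra cores start to claw it back.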
 
Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
So timed benchmark runs from reputable tech websites are bogus, but random YouTube videos using random segments of the game are fine?

Here's one from Techspot at 1080p rather than 1440p, but it shows the same thing - the FX trailing way behind.

But I'm sure you think Techspot are all fools as well?

CPU_01.png

What none of these charts show is how the performance differs between the CPUs in different situations in-game...

For example, same game, Fallout 4.

image.png


image.png


I could take any one of those parts of the game and make a chart out of it.
So one chart could say:

i5 4460: 60 FPS - FX-8350: 49 FPS

Intel fanboys all over the internet would use that to say "look, Intel is better".

Or I could make another chart:

i5 4460: 38 FPS - FX-8350: 48 FPS

Now anyone can use that to prove AMD are better.

The fact of the matter is that sometimes the i5 is better and other times the AMD is better, and you can only see that in gameplay videos. With charts you can only put down the numbers as you test them, and given AMD and Intel behave very differently, you have to choose which one comes out on top in your charts - so the author of the chart determines who wins.
Charts are completely meaningless and useless.

The way around it for AMD is to make everything the same as Intel.
 
Last edited:
Associate
Joined
1 Nov 2013
Posts
713
Location
Ireland
The vital point you're missing is that it doesn't matter what magic formula you use: if you run the exact same scenario on both CPUs for a decent length of time to account for variables, the Intel CPU will always beat the FX. This is what the pro benchmarks show, and it's what I've personally experienced testing the game with various processors - the FX processors are substandard in Fallout 4 and in CPU-intensive games generally.

Fallout 4 is one of the most CPU-heavy games available; it makes use of multiple cores/threads and also benefits from faster RAM.

Intel 'fanboi' doesn't come into it. Right now, at this exact moment, I have the following PCs and I have run Fallout 4 on all of them - FX6300, FX9350, i5-2400, i5-3570, i7-6700. (I've also owned an FX8350 previously and run it on that, as well as other CPUs I just can't remember right now.)

Playing Fallout 4 for about an hour on each and traveling around the map, the FX line is by far the slowest. It shows massively in areas like Boston, where the FX tanks compared to the i5 (not as much against the 2400, but still slower) and i7.

I've always run the game at 1440p Ultra with a GTX 1070 until now; I haven't used the GTX 1080 with the FX yet, but the FX bottlenecks the GTX 1070 badly.

These are simply facts, to be honest; it's not as simple as 'in some parts of the game the AMD is better and in other parts the Intel is better'. Overall, the FX is simply far inferior to the i5 in the CPU-intensive world of Fallout 4.

Posting random screengrabs of one or two sections where the AMD posts a higher FPS for that brief moment doesn't change that.

Charts and graphs are absolutely not meaningless if the benchmarks are carried out effectively (which they are, on most pro websites) - they're certainly far more effective than random, 120-second YouTube videos. For Fallout 4, for example, many benches run the first 30-40+ minutes of the game, where it's far easier to replicate the same scenarios - leaving the vault and initially exploring the wasteland - to gain their figures.
 
Last edited:
Caporegime
Joined
18 Sep 2009
Posts
30,118
Location
Dormanstown.
Frankly, the only fair comparisons come from FPS charts that are polled every second for an hour across the same segments (or as close as reasonably possible) on the two CPUs.

Stills don't really prove anything when the image is never the same; I don't know why this is even a thing.
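Something like this is all I mean - a rough Python sketch (file names are just placeholders) that takes one FPS sample per second from the same segment on each CPU and compares the whole run rather than a single still:

# Sketch: two per-second FPS logs, one value per line, captured over the same
# in-game segment on each CPU. File names are just placeholders.
def load_fps(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

i5_fps = load_fps("i5_run_fps.txt")
fx_fps = load_fps("fx_run_fps.txt")

paired = list(zip(i5_fps, fx_fps))   # trims to the shorter of the two runs
i5_ahead = sum(1 for a, b in paired if a > b)
avg_gap = sum(a - b for a, b in paired) / len(paired)

print(f"{len(paired)} seconds compared")
print(f"i5 ahead for {100 * i5_ahead / len(paired):.0f}% of the run")
print(f"average gap (i5 minus FX): {avg_gap:+.1f} fps")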
 
Associate
Joined
1 Nov 2013
Posts
713
Location
Ireland
Frankly, the only fair comparisons come from FPS charts that are polled every second for an hour across the same segments (or as close as reasonably possible) on the two CPUs.

Stills don't really prove anything when the image is never the same; I don't know why this is even a thing.

In a nutshell, this is what I was getting at. :D
 
Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
The vital point you're missing is that it doesn't matter what magic formula you use: if you run the exact same scenario on both CPUs for a decent length of time to account for variables, the Intel CPU will always beat the FX. This is what the pro benchmarks show, and it's what I've personally experienced testing the game with various processors - the FX processors are substandard in Fallout 4 and in CPU-intensive games generally.

Fallout 4 is one of the most CPU-heavy games available; it makes use of multiple cores/threads and also benefits from faster RAM.

Intel 'fanboi' doesn't come into it. Right now, at this exact moment, I have the following PCs and I have run Fallout 4 on all of them - FX6300, FX9350, i5-2400, i5-3570, i7-6700. (I've also owned an FX8350 previously and run it on that, as well as other CPUs I just can't remember right now.)

Playing Fallout 4 for about an hour on each and traveling around the map, the FX line is by far the slowest. It shows massively in areas like Boston, where the FX tanks compared to the i5 (not as much against the 2400, but still slower) and i7.

I've always run the game at 1440p Ultra with a GTX 1070 until now; I haven't used the GTX 1080 with the FX yet, but the FX bottlenecks the GTX 1070 badly.

These are simply facts, to be honest; it's not as simple as 'in some parts of the game the AMD is better and in other parts the Intel is better'. Overall, the FX is simply far inferior to the i5 in the CPU-intensive world of Fallout 4.

Posting random screengrabs of one or two sections where the AMD posts a higher FPS for that brief moment doesn't change that.

Charts and graphs are absolutely not meaningless if the benchmarks are carried out effectively (which they are, on most pro websites) - they're certainly far more effective than random, 120-second YouTube videos. For Fallout 4, for example, many benches run the first 30-40+ minutes of the game, where it's far easier to replicate the same scenarios - leaving the vault and initially exploring the wasteland - to gain their figures.

This post is a lot like those charts: its reader is expected to trust it without references.

With videos you can see with your own eyes what's going on. I trust my own eyes more than I do someone with numbers on a graph telling me to ignore everything and believe him instead - just him, in this case you.

I was not born yesterday; I have had CPUs of all sorts.
 
Soldato
Joined
2 Dec 2005
Posts
5,515
Location
Herts
Charts like that ^^^^ are for fools; anyone can put anything they like on them.

The only way to know the performance is to see it in action - actual videos like joeyjojo posted.

I wouldn't go that far; both are useful. A chart is a lot better than comparing one or two random screengrabs, for example, as in your post two up from this one.

My point in sharing YouTube vids is that charts can also be misleading by making the difference seem larger than it is. If product A scores 50 FPS and product B scores 100 FPS, it's a big difference, but is product A useless? No.



Terrorfirmer, you've completely missed my point. Read my first post again. I said:
We don't need this kind of hyperbolic, partisan language in this subforum, thanks.

You can find any number of real-world measurements and recordings that show that most games are GPU-limited and the CPU makes little difference. Here's the very first one I found with a YouTube search, for example. Is the AMD being 'completely battered'? Of course not.

Would I recommend someone buy a new system around one in 2016? No, but that's a different question.

I'm well aware AMD FX CPUs are slower than Intel's offerings; that's been the case for over 4 years. :p

What I'm saying is, describing them as 'so incredibly weak' or 'completely battered' (as one poster did earlier) is inaccurate and hyperbolic.

In real-world systems, like the kind people stick up on YouTube where they're using a sensible resolution (1080p or 1440p) with a mid-range GPU, it tends not to matter much which CPU you use. You'll be GPU-limited and the CPU won't come into it much. If it's ticking along at 40 FPS but you could have 50 FPS with a different CPU, the end user probably won't be too bothered.
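A crude way to picture it (with invented numbers, not measurements): each frame takes roughly the longer of the CPU's work and the GPU's work, so while the GPU is the slower of the two, swapping the CPU barely moves the frame rate. A quick Python sketch:

# Crude bottleneck model with made-up per-frame times (milliseconds).
def fps(cpu_ms, gpu_ms):
    # A frame can't finish faster than the slower of the two stages.
    return 1000.0 / max(cpu_ms, gpu_ms)

midrange_gpu_ms = 22.0   # mid-range card at 1080p/1440p ultra (illustrative)
highend_gpu_ms = 10.0    # GTX 1080-class card (illustrative)

for name, cpu_ms in [("slower CPU", 16.0), ("faster CPU", 10.0)]:
    print(f"{name}: {fps(cpu_ms, midrange_gpu_ms):.0f} fps on the mid-range GPU, "
          f"{fps(cpu_ms, highend_gpu_ms):.0f} fps on the high-end GPU")
# Both CPUs give ~45 fps on the mid-range card (GPU limited); only with the
# high-end card does the gap open up (~63 fps vs ~100 fps).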

I've always run the game at 1440p Ultra with a GTX 1070 until now; I haven't used the GTX 1080 with the FX yet, but the FX bottlenecks the GTX 1070 badly.

I think this explains it. You're talking about £400+ GPUs at frankly not a very high screen res. Under these conditions it becomes a much bigger test of the CPU, particularly how efficiently it can feed the GPU (i.e. the graphics API). In those conditions it might well be that AMD CPUs are 'terrible'.

But according to the latest Steam survey, these really powerful GPUs - the GTX 980 Ti/1070/1080, say - are only found in 3.6% of machines. Not very typical. Using those kinds of GPUs, your needs are different (until the lighter APIs are available, that is).

Charts and graphs are absolutely not meaningless if the benchmarks are carried out effectively (which they are, on most pro websites) - they're certainly far more effective than random, 120-second YouTube videos. For Fallout 4, for example, many benches run the first 30-40+ minutes of the game, where it's far easier to replicate the same scenarios - leaving the vault and initially exploring the wasteland - to gain their figures.

Agreed.
 
Last edited:
Soldato
Joined
2 Dec 2005
Posts
5,515
Location
Herts
This post is a lot like those charts: its reader is expected to trust it without references.

With videos you can see with your own eyes what's going on. I trust my own eyes more than I do someone with numbers on a graph telling me to ignore everything and believe him instead - just him, in this case you.

I was not born yesterday; I have had CPUs of all sorts.

This is absolutely true, but it must be a good recording. Obviously a 30 FPS clip isn't going to cut it if the FPS is higher than that.

I stress again, the point of posting those YouTube vids was just to show that in real recordings with mid-range GPUs, FX CPUs are usually totally adequate, not 'rubbish' as some people often say without evidence.

And I stress again, I wouldn't advise anyone to just buy something adequate; there are many other factors!
 
Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
I wouldn't go that far; both are useful. A chart is a lot better than comparing one or two random screengrabs, for example, as in your post two up from this one.

Which one?

You misunderstood both.

The first one I used to demonstrate a difference between CPUs at different times; actually, that's exactly what I said in it.

I don't know how else to explain it to make that any clearer.

I used the same images again to demonstrate how a reviewer might get his results from one part of a game when another part of the game might turn out completely differently.
 
Last edited:
Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
This is absolutely true, but it must be a good recording. Obviously a 30 FPS clip isn't going to cut it if the FPS is higher than that.

I stress again, the point of posting those YouTube vids was just to show that in real recordings with mid-range GPUs, FX CPUs are usually totally adequate, not 'rubbish' as some people often say without evidence.

And I stress again, I wouldn't advise anyone to just buy something adequate; there are many other factors!

Right, if you're running a GTX 1080 then an i7 of some sort is best advised.

If it's a £200 GPU, it's not going to be powerful enough for those single-threaded draw calls to become the limit.
 
Caporegime
Joined
18 Sep 2009
Posts
30,118
Location
Dormanstown.
Which one?

You misunderstood both.

The first one I used to demonstrate a difference between CPUs at different times; actually, that's exactly what I said in it.

I don't know how else to explain it to make that any clearer.

I used the same images again to demonstrate how a reviewer might get his results from one part of a game when another part of the game might turn out completely differently.

But this is meaningless.

In those Fallout 3 images I'm seeing two different scenes on two different CPUs; the figures aren't comparable with each other.

Videos are good for the overall experience, but you absolutely cannot take two stills and compare the frame rate, because they're not the same image.
 
Caporegime
Joined
18 Sep 2009
Posts
30,118
Location
Dormanstown.
Right, if you're running a GTX 1080 then an i7 of some sort is best advised.

If it's a £200 GPU, it's not going to be powerful enough for those single-threaded draw calls to become the limit.

Well, given there's been no AMD CPU progress at the higher end for years, yet a £200 GPU today isn't the same as a £200 GPU when the FX 83s were released, it's not that black and white.

A £200 GPU for me is an RX 480, which is roughly a 290X, and both are a tier higher than you'd ideally want to pair with an FX 83. (At least in older games; in DX12/Vulkan titles - all five or so of them - then sure, there's probably little difference between an FX 83 and an i5, but there are still tons of games where there is.)
 
Caporegime
Joined
17 Mar 2012
Posts
47,753
Location
ARC-L1, Stanton System
For the most part one served my GTX 970 just as well as the 4690K ^^^^

But this is meaningless.

In those Fallout 3 images I'm seeing two different scenes on two different CPUs; the figures aren't comparable with each other.

Videos are good for the overall experience, but you absolutely cannot take two stills and compare the frame rate, because they're not the same image.

Fallout 4

Well, they are not identical - that's impossible - but being a few pixel lines different does not account for a 20% difference in performance.
 
Last edited:
Caporegime
Joined
18 Sep 2009
Posts
30,118
Location
Dormanstown.
Fallout 4

Well, they are not identical - that's impossible - but being a few pixel lines different does not account for a 20% difference in performance.

I know, it's just that Fallout 4 looks pretty crappy IQ-wise :p.

And where the FX 83 is "ahead", there are tons of differences in what we're seeing.
The stills are meaningless.

At the end of the day, for me: you're one of the biggest AMD fans on these boards, and even you went and changed to an i5 4690K. If you really felt it wasn't worth it, you'd have sold it on and gone back to an FX 83.
 