
8350 Vs ......

Associate · Joined: 7 Mar 2013 · Posts: 1,639 · Location: North East
While looking for monitor reviews I came across this:


and it reminded me of this forum :-) But it is an interesting watch...
 
The trouble is that this video is not reflected by pretty much every other review ever done, and there are lots of anomalies just comparing the Intel results.

Take his Crysis Warhead results as an example:

8350: 35.64
3570: 26.84 (seems low but anyway...)
3770: 38.44 (Hyperthreading giving a 50% improvement? on which planet?)
3820: 26.84 (Virtually the same CPU as 3770 but on a different socket)

Just comparing the 3570 v 3770 v 3820 is enough to induce head scratching.

Stick to the reputable review sites rather than a YouTube video with some bizarre numbers seemingly plucked out of nowhere.
 

The 3820, a 4c/8t Sandy Bridge-E, is odd. That it matches the Ivy i5 exactly suggests a mistake, which is just sloppy and doesn't inspire confidence in the review.

The performance boost is because the testing was done with concurrent streaming of the gameplay, so the extra cores/HT provide a boost by giving the game engine more full cores/core time. So perhaps a slightly sneaky intent, but not an unrealistic slant to the results.
 

Weren't those his streaming results?

While streaming you should expect very different results with CPUs that have a different number of cores/threads.
 

I believe it was, which was the point of that video to begin with: to prove that if you are streaming, the FX-8350 is worth a look.
 
Seen the FX 8320 down as low as £100; surely that's pretty good value for anyone building a PC?

Interested to see AMD's next desktop CPUs, could be a decent option.
 
But the thing is that an 8-core CPU still won't make much difference in pretty much all games over a 6-core CPU. Strictly speaking in terms of gaming performance, I think the new AMD CPU range with more L3 cache, such as the FX6350, is probably a better gaming CPU than the FX8320.
 
If only the FX6350 was a true 6-core...

I don't think we should be stuck in a current-gen mindset. Sure, not that many games are properly multithreaded, but that doesn't mean future games won't be. Considering the specs of the PS4, where we will get a lot of our ports from, and the cheapness of a 6- or 8-core these days (I know, 3-4 modules), I wouldn't be surprised to see developers make use of this. Besides, most gamers are on a budget, so future-proofing yourself is not a bad way to go.
 

While I agree with you, I doubt that games will start using 8 cores efficiently any time soon. It's a lot of cores, and with a quad still being a very popular configuration amongst gamers I'm sure they will continue to develop with that in mind, especially since Intel has not released any affordable 6-cores. AMD may have a 6 and an 8, but they are not very popular models and would not be worth the effort of coding for 8 threads.

Of course this may change over the next few years, but only when at least a 6-core is more popular.
 
IMO the 3820 has been fine for streaming at 720p so far for games such as Planetside 2 (30-60 fps off the top of my head), L4D2, Civilization V, GTA IV modded, etc. Skyrim, on the other hand, uses so much VRAM (1.5 GB easily) on my GTX 580 that it is hard to gauge, as I am already slowing it to a crawl in the forest areas, so I don't bother with that one. However, I cannot tell you about the Crysis games, but I always imagined that with a good enough GPU (and upload speed) I should be able to stream games at higher than 25 fps.
 

That's not what he's talking about: the new Xbox and PS4 game consoles have low-power AMD 8-core CPUs in them. To get the most out of games, that's what developers will have to code for.
 
I know. I was/am in a rush to write, so it may have seemed a little off.

If games are ported over, it's likely they will be edited to run better on quad cores (as well as hex if it becomes cheaper). Not many PCs run AMD systems (see the Steam hardware survey), and a much lower percentage of those will be the new 6/8-cores, so it may be better for them to modify to suit a popular configuration rather than the few thousand running an 8-core. This is not suddenly going to change in a few years.

If games are ported right over with very little optimisation, well, yes, 8-core AMDs might be future-proof for longer than a 3570K, for example. However, it's very likely that most releases will be optimised (even if only slightly) to run on 4 cores, which is the most popular configuration for games and will be for several more years. If the quad does start to be "poor" then a 6-core may start to become "mainstream", but even then.
 

It doesn't work like that: if a game supports 8 cores, you don't need to recode it to support 4. It will support anything up to 8 cores; to undo 8-core support makes no sense and is a waste of time.

Here... Crysis 3, Welcome to the Jungle map.

 

I did not directly say that. Optimising and recoding are slightly different, really. Some games can support multiple cores but are only really optimised for a few. I do not know exactly which games would be a good example, but they do exist.

Then, when you factor in that a computer processor is very likely to be more powerful than those found in consoles, is there any need for games to be optimised for 8 cores on PC over the next few years? I'm aware that games can be much more efficient on consoles, but even so.
 
The way I see the 8350 is that it isn't a bad processor for gaming; it just isn't as good as the 3570K in typical conditions. It's cheaper though, which could be a deciding factor for some people.
 
I did not directly say that. Optomising and recoding are slightly different, really. Some games can support multiple cores but are only really optomised for a few. I do not know exactly which games would be a good example but they do exist.

Then when you factor in that a computer processor is very likely to be more powerful than those found in consoles, is there any need for games to be optomised for 8 cores on PC over the next few years? I'm aware that games can be much more efficient on consoles but even so.

If they don't want to optimise for the game consoles then no, but how often do you think that's likely to happen?
 
BF3 scales well with cores, so does Crysis 3.

Generally the programmers will make the program modular, and if they're serious about it they will run things on another thread when they can.

You get tangible benefits even with single-core CPUs, since all modern CPUs have multiple ALUs/FPUs per core, so programmers should get used to it. I try to stick to it when I can; I recently wrote a few-thousand-line program and it was threaded as much as possible, including things such as concurrent for loops.
 

That's what I thought. It doesn't make sense to me that you wouldn't make your program modular, or somehow able to utilise the cores available by building your program using as many threads as possible and spreading them out across the CPU. But then again, I don't know much about programming.
 