As above these are poor chips
No, they are not, as I know loads of people who use them in builds. The FX6300, at around £80 to £85, has been one of the best budget CPUs under £100 for a while.
In games like WoW and SC2, which are lightly threaded, it will destroy the older AMD CPUs, even at the same clock speed. Both GameGPU and Tom's Hardware tested this.
As above, these are poor chips. If you can get an 8320 and have a decent motherboard, I would do that first and then clock it to at least 4.5 GHz; then you should not have as big a bottleneck. And please save yourself some money and get something like a 7950/7970, as at their current prices they are a steal and outperform the NVIDIA cards at a higher price point.
Just because a lot of people use them, and they are good value for money compared to other new chips, doesn't stop the architecture being poor. A chip launched at the end of 2012 should not have lower clock-for-clock performance per core than ones from 2007; it's that simple.
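The clock-for-clock point can be made concrete with a quick calculation: divide a single-threaded score by the clock speed. The scores below are made-up placeholders for illustration, not figures from either review:

```python
# Hypothetical single-threaded benchmark scores (arbitrary units) and
# stock clocks in GHz -- illustrative numbers, NOT taken from THG or Anand.
chips = {
    "FX-6300 (2012, Piledriver)": {"score": 100, "clock": 3.5},
    "Phenom II X4 970 (2010)":    {"score": 99,  "clock": 3.5},
    "Core 2 Duo E6850 (2007)":    {"score": 92,  "clock": 3.0},
}

def per_clock(chip):
    # Score per GHz: a rough proxy for clock-for-clock (per-core) performance.
    return chip["score"] / chip["clock"]

# Rank chips by per-clock performance, best first.
for name, chip in sorted(chips.items(), key=lambda kv: per_clock(kv[1]), reverse=True):
    print(f"{name}: {per_clock(chip):.1f} points/GHz")
```

With numbers like these, the 2007 part comes out ahead per clock even though the newer chip posts the highest raw score, which is exactly the complaint being made.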
In WoW the FX6300 gets less than 1 FPS more than the equally clocked AMD Phenom II X4 970 BE, less than 2 FPS more than the lower-clocked AMD Phenom II X6 1100T BE, and lower FPS than the AMD Phenom II X4 980 BE. In addition, it gets beaten by the Intel Pentium G620.
You can ignore the THG testing on WoW; it's worthless. Anand do a much more competent job.
All reviewers now test BD and PD CPUs under Win8, or Win7 with the patches. Anandtech tested the FX8150 with the patch, which increased performance in a number of games, then forgot to update the bench tables.
Just to clarify, I did NOT say the FX6300 is a poor chip. The FX6300 is a great chip for its asking price of £80-ish, but if people are running them at stock clocks, they are doing it wrong, as these chips really should all be overclocked to reach their full potential.
Points I agree with. It's getting tiring, not so much from the usual suspects, but from these random jibes from people with no weight behind their points, who are just like sheep following the biased factoids they absorb.
Yesterday a stereotypical Intel user posted on the Cinebench thread, comparing a £500 CPU to the £150 FX8 without engaging their brain. Yeah, the BD/PD era didn't quite perform as expected, but they are still good-performing components for the price.

The shining star in today’s comparison is AMD’s FX-6350, which delivers solid performance in games, while besting Intel's Core i5 in a number of our other benchmark workloads. The cheaper FX-6300 is an even more attractive bargain, so long as you're willing to overclock it.
Which are the only important metrics.
Not relevant at all, since you need to compare same-priced CPUs.
That would be the Core i3 for the £80 to £85 FX6300 and the Core i3 for the £110 to £115 FX8320.
Also, what is hilarious is that many of you would take a lower-clocked Core i5 instead of a higher-clocked Core i3, even if the latter had better single-core performance. None of you would disable half the cores on a Core i5 to overclock it 10% higher, would you? Nope?
But then none of you would ever use anything under a Core i5 anyway, so it's a moot point when you are talking about cheaper CPUs.
So, in the end, it's just pathetic e-peen and a case of people trying to feel better. Anything good said about any AMD CPU needs to be burned with fire!!
It also comes from people who have never actually used a single one of these systems in real life. I have used recent Core i3, Core i5, Core i7 and FX63**, FX83** and A6, A8 and A10 based systems in the real world too.
BTW, I have a Xeon E3 and a GTX 660, so you cannot try that other angle either.
Of course you ignore the THG results, because they do not fit your biased opinion.
WoW is dead; it's coming up on 10 years old. These types of games benefit far more from a decent internet connection and fewer people in the game area.
I think you've missed the point. I wasn't saying the FX chips are bad, or that the FX-4/6 are not the best chips currently on the market at their price points; I was taking exception to your statement that the FXs are not poor chips. Just because something is cheap and able to compete well with opponents that don't make an effort doesn't stop it being poor. The fact that Piledriver cores are slower per clock than Core 2 cores, and that AMD is forced to mount bunches of them onto a CPU and raise the clocks just to compete, shows how poor the architecture is. If current i3s were overclockable, they would walk over the FX-4s like the i3 5xx's did, and in many things (like WoW) the FX-6 too.
FX-6300 CPU benchmarks (attached to a GTX 680, essentially a slightly downclocked GTX 770):
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_6.html#sect0
At 1080p, there is no bottleneck. In fact, only the Pentium shows any genuine bottleneck at 1080p.
Fairly old games, to be fair.
Actually, those benches clearly show the GTX 680 being bottlenecked by the stock-clocked FX6300. It would only not be a bottleneck if faster CPUs such as an i5/i7 (or even an FX8350) gave no extra frame rate over the FX-6300.
As I said, if people are running the FX6300 at stock clocks and not overclocking it, they are doing it WRONG.
I WAS referring to 1080p. Big or small, a bottleneck is a bottleneck when the graphics card's GPU is not being utilized at 99/100%. That's why I said people should all overclock their FX6300 by default, rather than running them at stock clocks.

You know the bit where I said "at 1080p"? Well, that should have given you a clue that I was referring to the 1080p figures.
And the very small advantage the Intel CPUs hold in some of the 1080p benchmarks is not a CPU bottleneck, since you can see from the low-res scores that the AMD CPU has more to give (as do the others). It's a platform limitation, probably memory, but not a bottleneck.
And if a GPU is running at 100% then you have a GPU bottleneck as much as you would with a CPU at 100%. This near obsession with having the GPU running at 100% regardless of whether it makes any difference to games is strange to say the least.
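The diagnosis being argued over here, comparing low-resolution scores against native-resolution scores, can be sketched as a simple heuristic. The FPS numbers and the 5% tolerance below are assumptions for illustration, not figures from the xbitlabs data:

```python
def diagnose(fps_low_res, fps_native_res, tolerance=0.05):
    """Classify the limiting component from two benchmark runs of one game.

    If the frame rate barely changes between a low resolution and the
    native resolution, the GPU isn't the limit: something on the CPU or
    platform side caps the frame rate at both settings. If the frame
    rate drops a lot at native res, the GPU is the limit there.
    """
    if fps_native_res >= fps_low_res * (1 - tolerance):
        return "CPU/platform-limited"  # GPU still has headroom at native res
    return "GPU-limited at native res"

# Hypothetical runs (made-up numbers):
print(diagnose(fps_low_res=140, fps_native_res=138))  # CPU/platform-limited
print(diagnose(fps_low_res=140, fps_native_res=95))   # GPU-limited at native res
```

This is just the logic of the argument above: a small 1080p gap between CPUs, with low-res headroom left over, points at the platform rather than a GPU starved by the CPU.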