
Piledriver - Official AMD FX4300, FX6300 and FX8350 review thread

I would like to see a Skyrim test because there is a big difference in performance between the release version and the latest patches. I believe after patch 1.4 or something the game takes better advantage of multi-threading and more advanced instructions (originally Skyrim was compiled using legacy x87 code according to some modders).
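As an aside on what "legacy x87 code" versus newer instructions means in practice, here's a minimal, hypothetical C sketch (nothing to do with Skyrim's actual source): the very same scalar float loop can be compiled to either x87 or SSE instructions in a 32-bit build just by changing compiler flags.

[code]
/* floats.c - illustrative only, not from any game.
 *
 *   32-bit x87 build:  gcc -m32 -O2 -mfpmath=387 floats.c -o floats_x87
 *   32-bit SSE build:  gcc -m32 -O2 -msse2 -mfpmath=sse floats.c -o floats_sse
 *
 * Compare the generated code with:  objdump -d floats_x87 | grep -E 'fld|fmul'
 */
#include <stdio.h>

/* Simple dot product; with -mfpmath=387 this becomes x87 stack code
 * (fld/fmul/faddp), with -mfpmath=sse it becomes scalar SSE (mulss/addss). */
static float dot(const float *a, const float *b, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {5.0f, 6.0f, 7.0f, 8.0f};
    printf("dot = %f\n", dot(a, b, 4));
    return 0;
}
[/code]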
 
I would like to see a Skyrim test because there is a big difference in performance between the release version and the latest patches. I believe after patch 1.4 or something the game takes better advantage of multi-threading and more advanced instructions (originally Skyrim was compiled using legacy x87 code according to some modders).

The new Skyrim performance results for the Piledriver series look impressive, very much overtaking the i3 3220 from what I recall.

Makes the FX6 a price/performance monster.
 
Benchmarking games is just measuring real-world performance.

Not that you're going to notice an extra 40 FPS on top of the 200 FPS you're already getting in some games :p

But that is wrong; in some games it's a very big deal, especially if you play them a lot.

For example, I play COD4 a lot and strafing is vital at a higher level; 10 FPS can mean you don't make certain jumps, so 40 would be a big difference.

I see what you're saying about other games where it doesn't affect things, but in an FPS you want the frame rate as high as possible and it does matter.
 
But that is wrong; in some games it's a very big deal, especially if you play them a lot.

For example, I play COD4 a lot and strafing is vital at a higher level; 10 FPS can mean you don't make certain jumps, so 40 would be a big difference.

I see what you're saying about other games where it doesn't affect things, but in an FPS you want the frame rate as high as possible and it does matter.

It's not wrong; no game will give a noticeable difference between 200 FPS and 240 FPS, it's just not going to happen. In fact, there are quite a few games that will actually see negative performance impacts from frame rates that high, as quite a few are tuned for 60 FPS, so going above 60 FPS results in visual anomalies and problems.
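To make the "tuned for 60 FPS" point concrete, here's a rough sketch of a frame cap (my own illustration, not code from any real engine): a loop like this never runs faster than its target, and engines that derive physics or movement from the frame rate can misbehave once such a cap is lifted.

[code]
/* framecap.c - minimal sketch of a 60 FPS frame cap, assuming POSIX
 * clock_gettime/nanosleep.  Build with: gcc -O2 framecap.c -o framecap
 */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define TARGET_FPS 60
#define FRAME_NS   (1000000000LL / TARGET_FPS)   /* ~16.67 ms frame budget */

/* Monotonic clock in nanoseconds. */
static int64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

int main(void)
{
    for (int frame = 0; frame < 300; frame++) {   /* ~5 seconds at 60 FPS */
        int64_t start = now_ns();

        /* ... update game state and render a frame here ... */

        int64_t elapsed = now_ns() - start;
        if (elapsed < FRAME_NS) {
            /* Sleep off the rest of the frame budget so we never exceed the cap. */
            struct timespec pause = { 0, (long)(FRAME_NS - elapsed) };
            nanosleep(&pause, NULL);
        }
    }
    puts("done");
    return 0;
}
[/code]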
 
The i7 920 is nearly 4 years old now; it's not very flattering to be comparing AMD's latest 32nm processors to an old 45nm one.
 
My only issue with Piledriver, or any other recent AMD CPU, is the power draw. My i7 920 is already sucking my power socket dry, and Piledriver is even worse.

When HardOCP tested the FX8150, their sample actually consumed less power at load and idle when overclocked than a Core i7 920:

http://www.hardocp.com/images/articles/1318034683VZqVQLiVuL_9_1.png

http://www.hardocp.com/images/articles/1318034683VZqVQLiVuL_9_2.png

Still terrible though.

Not sure how the FX8350 compares to the Phenom II X6 when overclocked with regard to power consumption, although TBH only the FX6300 interests me.

The i7 920 is nearly 4 years old now; it's not very flattering to be comparing AMD's latest 32nm processors to an old 45nm one.

Maybe, if you'd bothered to look at what I was answering first? Complain at him instead. I even stated it was terrible, but hey, ignore that.

I am on socket 1155 myself, so I don't care massively about the FX8300 series as I am not going to get one; a Xeon E3 is my upgrade path. But he still stated that Piledriver consumes more power than a Core i7 920. That is not true, since even Bulldozer appears not to, and the worst model in that range was an overvolted FX8120. I somehow doubt an FX4300 on a 4-phase 970 motherboard consumes more power than a Core i7 920 on an X58 motherboard, even a mATX one. The FX6300 interests me since I know people on AM3+ motherboards (Athlon II) for whom it looks like a decent upgrade.

Maybe you want to spread disinformation, but I don't.
 
It's not wrong; no game will give a noticeable difference between 200 FPS and 240 FPS, it's just not going to happen. In fact, there are quite a few games that will actually see negative performance impacts from frame rates that high, as quite a few are tuned for 60 FPS, so going above 60 FPS results in visual anomalies and problems.

True, but I'm sure it means higher minimums.
Most people will run FPS capped, so that's what will matter.

It's hard to recommend AMD for a gaming system unless it's low budget, and even then, why wouldn't you go with a dual-core Intel so you have a better upgrade path?

I don't think AMD have carved out any other niche with this chip either? >.<
 
I think we're going to have to rethink the overclocking of Bulldozer architecture chips.

Its main problem is the single-threaded performance, right? Well, I think overclocking just the turbo clock is the best solution to the power problem. If you're running a heavily multithreaded load you are using EIGHT processor cores, which is a lot, and we've seen that the performance in these situations is excellent, so overclocking all cores isn't needed.

If you could run one module at 4.8GHz and the others at 3.5GHz whilst gaming, that might be a solution to the power problem.
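That sort of asymmetric setup can at least be approximated in software. Here's a rough sketch (my own illustration, nothing from the reviews) using the Linux cpufreq sysfs interface to cap every core except one module to a lower maximum frequency. It assumes 8 logical CPUs with cpu0/cpu1 as the module left fast, a driver that exposes scaling_max_freq per core, and root access; it can only choose among the frequencies the firmware already offers, so pushing a module past its stock turbo ceiling would still be a BIOS/multiplier job.

[code]
/* percore_freq.c - rough sketch: leave one module at its turbo ceiling and
 * clamp the rest to a lower maximum via the Linux cpufreq sysfs interface.
 * Values are illustrative; the kernel snaps them to available P-states.
 * Run as root.  Build with: gcc -O2 percore_freq.c -o percore_freq
 */
#include <stdio.h>

static int set_max_khz(int cpu, long khz)
{
    char path[128];
    snprintf(path, sizeof path,
             "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_max_freq", cpu);
    FILE *f = fopen(path, "w");
    if (!f) {
        perror(path);
        return -1;
    }
    fprintf(f, "%ld\n", khz);
    fclose(f);
    return 0;
}

int main(void)
{
    const long fast_khz = 4200000;  /* module 0 (cpu0/cpu1): full turbo ceiling */
    const long slow_khz = 3500000;  /* modules 1-3: clamped to save power */

    for (int cpu = 0; cpu < 8; cpu++)
        set_max_khz(cpu, (cpu < 2) ? fast_khz : slow_khz);

    return 0;
}
[/code]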
 
It's not wrong; no game will give a noticeable difference between 200 FPS and 240 FPS, it's just not going to happen. In fact, there are quite a few games that will actually see negative performance impacts from frame rates that high, as quite a few are tuned for 60 FPS, so going above 60 FPS results in visual anomalies and problems.

Yes it will; I already explained which: COD4, which I play daily. You aim for a 250 FPS minimum if you play at a decent standard, because of strafe jumps in the engine and other things, so it does matter if the games you mainly play are affected by the FPS drop.

For example, on certain maps, say someone smokes a point and I try to jump a wall: with an FPS drop to, say, 200 I might not make the jump and die; with 250 I make it and win the match.

It is vital in certain games.
 
I think the 8350 is a really nice option; however, I won't be getting one.

I'll be seriously surprised if it'll beat my OC'd 960T (@ 6 cores / 4.1GHz / 2800ish CPU-NB) for most gaming. BF3 multiplayer is "my game" and I haven't seen any benchmarks, or even approximate benches, for this chip in BF3 multiplayer.

If I needed to buy a new AMD chip for my board, the 8350 is the one I'd go for. I just can't spend the money on something that is very likely to be a worse performer in most games :(
 
I think the 8350 is a really nice option; however, I won't be getting one.

I'll be seriously surprised if it'll beat my OC'd 960T (@ 6 cores / 4.1GHz / 2800ish CPU-NB) for most gaming. BF3 multiplayer is "my game" and I haven't seen any benchmarks, or even approximate benches, for this chip in BF3 multiplayer.

If I needed to buy a new AMD chip for my board, the 8350 is the one I'd go for. I just can't spend the money on something that is very likely to be a worse performer in most games :(


It's not worse. There are few gaming benchmarks against the X6, but there is one here, and it's consistently faster than the X6 1100T: http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/7

There are a few others kicking around as well that I have seen, and the story is the same.

It's definitely a faster chip than the Phenom II X6.
 
Maybe, if you'd bothered to look at what I was answering first? Complain at him instead. I even stated it was terrible, but hey, ignore that.

I am on socket 1155 myself, so I don't care massively about the FX8300 series as I am not going to get one; a Xeon E3 is my upgrade path. But he still stated that Piledriver consumes more power than a Core i7 920. That is not true, since even Bulldozer appears not to, and the worst model in that range was an overvolted FX8120. I somehow doubt an FX4300 on a 4-phase 970 motherboard consumes more power than a Core i7 920 on an X58 motherboard, even a mATX one. The FX6300 interests me since I know people on AM3+ motherboards (Athlon II) for whom it looks like a decent upgrade.

Maybe you want to spread disinformation, but I don't.

I could take it a step further and say complain at this instead: http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/6 It shows increased power draw over a 920 under load. My opinion was based on that.

Also, most of the X58 boards that I have worked with overvolt the CPU like crazy. For example, my Gigabyte X58-UD4P had the idea of running my 920 C1 at stock clocks at 1.31V. The Sabertooth I'm on now wants to give it 1.3V. The lowest stable voltage I've managed so far has been 1.05V. My point is that I don't think this is something reviewers take into consideration when they power-bench the systems; they most likely just plug the beast in, measure, and move on to the next CPU on their charts.

And even if, when the dust settles, it shows that Piledriver is better on power consumption, it's not by much; it's actually quite unimpressive considering that it's up against (as you pointed out, 32nm vs 45nm) a 4-year-old chip.

Don't get me wrong here, I would love for the AMD CPUs to be better; I wouldn't even mind using one myself with the current performance if the power draw was a little better.

The i7 920 is nearly 4 years old now; it's not very flattering to be comparing AMD's latest 32nm processors to an old 45nm one.

A lot of us are still sitting on an i7 920, so it's perfectly valid for us to compare, both power draw and clock-for-clock speed.
 