AMD FX 8320 analysis

How is "Not being similar" the same as "completely useless" (As you've tried to make my post sound)

And way to ignore the point about the power usage of the original i7s compared to their performance at the time (and that was half a decade ago).

But of course, I'm obviously raising the biased points, because I'm a blatant fanboy.

How on earth are they not similar? They're both CPUs that you put in your PC and use for gaming, streaming, and going on the internet. You're talking as if they're not comparable, which is just daft.

My question was: name me a game that the 4670K could run but the 6300 could not manage. I think you would find that the answer is zero.

What I'm doing here is pointing out just how much of CPU and/or GPU choice is pure snobbery.

You know? Being real.
 
But your logic is flawed beyond belief.
You could play every single game out there on a Phenom II X4.

Forget it, I'll continue ignoring the posts you make, because you're completely unbiased and objective. /sarcasm
 
It depends what you mean by competitive. At the high end? No, AMD have nothing to offer. However, everything up to the 4770K (i.e. the 4670K) is toppled by AMD's pricing. In gaming (as an overall average sort of summary) the 6300 can do pretty much anything the 4670K can, and you can get a 6300 with a board and RAM for the same price.

The power argument? We've done it before, and I've posted plenty of figures showing it works out to literally £15 a year or something daft like that. But hey, let's not use the power argument, because it's one that can be used by both sides (inasmuch as the Intel lot didn't care when the i7 9x0 happily ate 200W when overclocked).
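
As a rough sketch of how a figure in that ballpark falls out, here is the arithmetic; the wattage gap, daily hours and unit price below are my own illustrative assumptions rather than numbers from the thread.

```python
# Rough annual running-cost difference between two CPUs (illustrative numbers only).
extra_watts = 60        # assumed extra load draw of the FX system, in watts
hours_per_day = 4       # assumed hours of heavy use per day
price_per_kwh = 0.14    # assumed UK electricity price in £/kWh (circa 2013)

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh extra per year, roughly £{extra_cost:.0f}")
```

With those assumptions it comes out at roughly £12 a year, the same order of magnitude as the £15 quoted above.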

The Intel guys have only started using that as an argument since Intel dropped their power use. But the irony is, of course, that they did not do it so the enthusiast could save money on fuel bills. They did it for their own agenda (laptops, tablets, etc.), so if anything it wasn't done for the enthusiast at all and has only served to screw the enthusiast (tiny dies can't be soldered because they'll crack, for example, which ruins the overclocking side of things unless you want to void your warranty).

The bottom line (even in la la land, where a lot of enthusiasts live) is that 99% of us are going to be hindered by the old green paper (money), and thus common sense needs to be factored in, which all points to AMD, as that's where the cheap stuff is.

Fair points.

It will be game-specific, like the chap above mentions, so the i7 probably can't maintain such a huge lead in clock-for-clock performance in all games.

But specifically in BF4 I can still see benchmarks showing that it takes a 5.0GHz AMD to match a 3.4GHz i7, which in percentage terms isn't that far from the 8350 needing 4.5GHz to match the i7 at 2.5GHz in the Sleeping Dogs example above.

How you measure worth or performance is totally user specific.

It's really down to money at the end of the day, because Intel are much better overall but cost up to £100 more.

It's just shocking that AMD have let Intel off the performance hook. In the old days the K6 and the 2500M were legends because they overclocked, and they were faster and better value than Intel (I owned both). Today AMD can't say that; they can only say they are cheaper, and they're cheaper for quite a few reasons, including lack of performance.

I really can't wait for AMD to catch up, and I hope they do (if they can), because I'll have my wallet out, ready to buy from them (I don't like Intel's marketing and business ethics).
 
No one cared, just like no one cares about the 290X using ridiculous power. They only care when it happens to be something they can use in an argument.

Actually the 290X uses 15-20W more than a 780, so it's circa 10% more power than its Nvidia rival.

The 8350 in the example we mentioned, clocked at 2.5GHz, draws 30-50W, versus an 8350 clocked at 4.5GHz, which is between 4 and 6 times the energy burn.

But even so, getting back to the BF4 example: 5.0GHz to match a 3.4GHz Intel is basically 77W versus 250W.
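
As a back-of-the-envelope check on why going from 2.5GHz to 4.5GHz can multiply the power draw several times over: dynamic CPU power scales roughly with frequency times voltage squared. The voltages below are illustrative guesses, not measured figures.

```python
# Rough dynamic-power scaling: P is proportional to f * V^2, so the ratio is easy to estimate.
# Voltages are illustrative guesses for an FX-8350 at those clocks, not measurements.
f_low, v_low = 2.5, 1.00     # GHz and volts at the low, undervolted clock (assumed)
f_high, v_high = 4.5, 1.50   # GHz and volts at the overclocked setting (assumed)

ratio = (f_high * v_high**2) / (f_low * v_low**2)
print(f"Estimated power ratio: ~{ratio:.1f}x")  # ~4x from dynamic power alone
```

Leakage and higher temperatures push the real-world figure further up, which is how you end up in the 4-6x range quoted above.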

That is totally different to a 290X using 15W more power than a Titan/780, with a rubbish fan making it run hot because of the reference design.
 
The power use and heat output of the 290 aren't at all bad. Performance has increased more over the 7970 than power use has, hence it's more efficient. The reference cooler's just a bit rubbish, as ever.
 
The power use and heat output of the 290 aren't at all bad. Performance has increased more over the 7970 than power use has, hence it's more efficient. The reference cooler's just a bit rubbish, as ever.

The heat output is bad; I actually use it to warm myself at night. I just leave Heaven looping while I'm on my Surface watching, say, Person of Interest.

And then you've got people who don't have a clue saying "it's meant to run at 95°C" when it blatantly isn't, hence the throttling!

I should invite them round so they can pep-talk my R9 290 into running at 95°C (which itself is funny, as by default it throttles at 94°C to avoid hitting 95°C).
 
Any gaming rig is going to pull a kilowatt-hour every three or so hours at the wall socket. There is not much room for argument there. In general usage, my 8350 pulls about 135W total system power, and doubles that on eight workers of Prime95 at 4.6-4.7GHz.

I have not seen many comparisons of system power with a 4770K at 4.5GHz with HT on Prime95, so I cannot say whether this is good, bad or indifferent.
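
To put those wall-socket figures side by side, here's a quick sketch; it only rearranges the numbers already quoted in the post above.

```python
# Converting the wall-socket claims above into comparable numbers.
avg_gaming_watts = 1000 / 3            # 1 kWh every 3 hours -> ~333W average draw

general_use_watts = 135                # "general usage" total system power from the post
prime95_watts = 2 * general_use_watts  # "doubles that" under eight Prime95 workers

print(f"1 kWh per 3 h   -> ~{avg_gaming_watts:.0f} W average at the wall")
print(f"Prime95 load    -> {prime95_watts} W, i.e. 1 kWh every {1000 / prime95_watts:.1f} h")
```

The Prime95 figure is CPU plus platform only; once a loaded GPU is added on top, the kilowatt-hour-every-three-hours figure is easy to hit.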
 
The power argument? We've done it before, and I've posted plenty of figures showing it works out to literally £15 a year or something daft like that. But hey, let's not use the power argument, because it's one that can be used by both sides (inasmuch as the Intel lot didn't care when the i7 9x0 happily ate 200W when overclocked).

The Intel guys have only started using that as an argument since Intel dropped their power use.

Electricity bills are really only half of it; power is also an issue due to cooling requirements. The i7 9x0 never had any trouble being kept at a safe temperature under the heaviest of loads, whereas AMD's FX line runs so hot that you can't run Prime95 etc., so you're running a CPU with either shaky stability or throttling of some kind.

Cost is brought into it because it's just another tick in Intel's favour.

Intel
+ Can run extreme loads
+ Lower electricity bills

AMD
- Can't run extreme loads (without dangerously high temperatures or throttling)
- Higher electricity bills

You only have to look at the FX-9370 with its 220W TDP at 4.4GHz to see how out of control things have gotten on AMD's side; up until the previous generation (Phenom), a 140W TDP was considered extreme.
 
The heat output is bad; I actually use it to warm myself at night. I just leave Heaven looping while I'm on my Surface watching, say, Person of Interest.

And then you've got people who don't have a clue saying "it's meant to run at 95°C" when it blatantly isn't, hence the throttling!

Unless AMD has managed to overcome fundamental physical laws and create energy, it doesn't emit much more heat than a 7970 :) The 7970 at 1GHz uses about 220W, the 290 is about 240W.
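
Since practically all of a GPU's board power ends up as heat in the case and the room, the relative difference in heat output is just the ratio of the two power figures quoted above.

```python
# Heat output tracks board power almost exactly: electrical watts in, thermal watts out.
hd7970_watts = 220   # approximate 7970 board power at 1GHz, as quoted above
r9_290_watts = 240   # approximate 290 board power, as quoted above

extra = r9_290_watts - hd7970_watts
print(f"The 290 dumps ~{extra} W more heat, about {100 * extra / hd7970_watts:.0f}% more than the 7970.")
```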
 
The heat output is bad.

And then you've got people who don't have a clue saying "it's meant to run at 95°C" when it blatantly isn't, hence the throttling!

But that was AMD's design/marketing decision: nerf the 290X cooler just to get Nvidia to show their "hand", which they did. Nvidia dropped their prices and released the 780 Ti. AMD will release partner boards with GHz+ GPUs, faster VRAM modules and better cooling. It's just a game between AMD and Nvidia. The only thing is that AMD GPUs also compete with Nvidia in terms of frames per pound and watts per frame.

The AMD 290/290X has a lot more to give, I'd say 15-20%, which knocks right on the 780 Ti's door.
 
You only have to look at the FX-9370 with its 220W TDP at 4.4GHz to see how out of control things have gotten on AMD's side; up until the previous generation (Phenom), a 140W TDP was considered extreme.

Yeah, back in my AMD days I remember when the Phenom II 965 launched; man, there were complaints about the 140W TDP back then.
 
I guess AMD has broken the law of conservation of energy then :p

It's a bigger chip, has more VRAM modules working harder due to higher bandwidth, and the reference fan is dog poo. lol

AMD CPU users should just say they didn't want to spend the extra money on Intel, or that the AMD CPU fits their budget and Intel did not. The other reasons don't work for me.
 
People don't complain about the i7's power draw because it has the performance to back it up.

I personally think the power draw of the FX chips is pretty decent when you consider they have 8 'cores'.

If you take most performance Intel quad-cores and double the core count, they'll suck up just as much juice as an 8-'core' FX chip does.

AMD just have rubbish performance per watt, which needs sorting. If they offered i7-like performance in terms of IPC at the expense of power draw then no one would complain, but instead they have a CPU with the same IPC as a 6-7 year old CPU and much higher power draw.

AMD released the FX series too soon; it's a multi-threaded chip living in a single-threaded world.
 
An Ivy Bridge or Haswell i7 has a TDP of 77-84W, which is low. It's not about power per core, it's about work done per core per clock. Intel kills AMD on this.
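
As a toy illustration of the "work done per core per clock" point: throughput is roughly IPC × clock × cores, and you can set that against TDP. The IPC values below are invented round numbers purely to show the shape of the comparison, not benchmark results.

```python
# Toy "work per watt" comparison: throughput ~ IPC * clock (GHz) * cores, divided by TDP.
# IPC values are invented round numbers for illustration, not measurements.
chips = {
    "Haswell i7 (assumed)": {"ipc": 2.0, "ghz": 3.5, "cores": 4, "tdp_w": 84},
    "FX-8350 (assumed)":    {"ipc": 1.0, "ghz": 4.0, "cores": 8, "tdp_w": 125},
}

for name, c in chips.items():
    throughput = c["ipc"] * c["ghz"] * c["cores"]
    print(f"{name}: ~{throughput:.0f} work units, {throughput / c['tdp_w']:.2f} per watt")
```

On these made-up numbers the extra cores give the FX comparable total throughput, but the per-watt figure is where the i7 pulls ahead.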


In reality, if I could get my Intel chip to reach 5.0GHz, 5.5GHz or 6.0GHz at a 300W TDP and cool it effectively, I would.
 