
Advice on i5 or an AMD Piledriver

> A steaming load of old gonads, filled with bait.

Is as is. You put in your snide little comments about how Intel are close, or how older-gen CPUs can come close in performance, yet at the same time you don't realise what it is you're actually showing.

i.e. AMD CPUs (most notably the 8350 and 6300) make fine gaming CPUs and, even at their tested stock speeds (things change radically once you start ramping up the clocks and the single-threaded performance rises), are more than good enough to run any game.

And it's far more important to spend as much as you can on the GPU, as that alone will be the deciding factor in whether or not you can turn up settings.

And it's as simple as that.
 
> No use buying a high end GPU if it sits at 70% utilisation for 90% of its life ;)

How do they say it? Cool story, brah!

Oddly, you're completely wrong there. Here is Metro Last Light being benched about five minutes ago on my 8320 rig, and both GPUs are at 80% or higher.

http://www.youtube.com/watch?v=XR70SUonOhc&feature=youtu.be

The game uses 8 cores too. I'd say that you'd easily max out a 280X, considering I'm running a 7990 overclocked.

As you can see, Metro only loads those 8 cores to around 50%, but it does use all 8. As do all of the other games I listed.
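As an aside, the utilisation argument both posters are gesturing at can be written down as a rough rule of thumb. This is a minimal illustrative sketch only, assuming you have logged per-sample GPU and per-core CPU utilisation; the function name, data shapes and thresholds are all invented for the example and come from neither the thread nor any real monitoring tool:

```python
# Illustrative only - this helper and its thresholds are invented for
# the example, not taken from the thread or any real tool.

def looks_cpu_bound(gpu_util_samples, per_core_util_samples,
                    gpu_threshold=80.0, core_threshold=95.0):
    """Crude heuristic: flag a likely CPU bottleneck when the GPU sits
    below gpu_threshold on average while at least one CPU core averages
    at or above core_threshold."""
    avg_gpu = sum(gpu_util_samples) / len(gpu_util_samples)
    n_samples = len(per_core_util_samples)
    n_cores = len(per_core_util_samples[0])
    # Average each core's utilisation across all samples.
    core_avgs = [sum(s[c] for s in per_core_util_samples) / n_samples
                 for c in range(n_cores)]
    return avg_gpu < gpu_threshold and max(core_avgs) >= core_threshold

# The "GPU at 70%" scenario: one core pegged, GPU starved -> CPU-bound.
print(looks_cpu_bound([70, 72, 68],
                      [[99, 40, 35, 30], [98, 42, 33, 31], [97, 41, 36, 29]]))
# The Metro claim above: GPUs at 80%+, eight cores near 50% -> not CPU-bound.
print(looks_cpu_bound([85, 88, 90],
                      [[55, 50, 48, 52, 47, 49, 51, 50]] * 3))
```

The 80%/95% cut-offs are arbitrary; the only point is that a starved GPU alongside a pegged core suggests the CPU is the limit, which is what both sides here are arguing about.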
 
> How do they say it? Cool story brah !
>
> Oddly you're completely wrong there. Here is Metro Last Light being benched about five minutes ago on my 8320 rig and both GPUs are at 80% or higher.
>
> http://www.youtube.com/watch?v=XR70SUonOhc&feature=youtu.be
>
> The game uses 8 cores too. I'd say that you'd easily max out a 280x considering I'm running a 7990 overclocked.
>
> As you can see, Metro only uses those 8 cores to around 50%, but it does use 8 cores. As do all of the other games I listed.

Oh look... one game...

Care to do the same with, say... GTA4?

And I proved that not all the games you listed use 8 cores... Maybe you forgot to read that part?
 
> Oh look... one game...

No, look: every game. Face it - you talk with absolutely and utterly no experience of an AMD CPU whatsoever, and thus pull all of your 'facts' out of either thin air (see the Arma III CPU results) or stock benchmarks from the internet, run on the wrong operating system.

The reality, as usual, is completely different. You should see what Crysis 3 does to my rig: 100% CPU usage over 8 cores and 98% GPU usage over both 7970s on my 7990.

Sooner or later you're going to have to face the fact that AMD CPUs do a bloody great job of gaming. When overclocked they more than justify themselves.
 
> Oh look... one game...
>
> Care to do the same with say... GTA4?
>
> And I proved that not all the games you listed use 8 cores..... Maybe you forgot to read that part?

He's still showing a bottleneck, though, even in one of those perfect scenarios.

And 20nm is almost on us, so we're going to have setups with over twice the performance of the 7990.

In the house currently I've got two R9 290s; if an FX is bottlenecking a 7990 in a perfect scenario, what would it do to my R9 290s?

EDIT: Just read Andy's post (don't know why I bothered).

I forgot, Almighty - didn't you have one of the fastest Thuban chips in the UK?
 
> No look, every game. Face it - you talk with absolutely and utterly no experience of an AMD CPU whatsoever, and thus pull all of you 'facts' out of either thin air (see Arma III CPU results) or, stock benchmarks ran on the wrong operating system from the internet.
>
> The reality, as usual, is completely different. You should see what Crysis 3 does to my rig. 100% CPU usage over 8 cores and 98% GPU usage over both 7970s on my 7990.
>
> Sooner or later you're going to have to face the fact that AMD CPUs do a bloody great job of gaming. When overclocked they more than justify themselves.

Oh Andy, this is why I love your posts...

You're silly and blind; there are thousands of forum posts and articles all over the internet that prove AMD can't hack it in every game.

And don't go calling me out on my AMD CPU history, you won't like what you'll find... In fact, if you do call me out on it I'll just take great pleasure in making you look stupid in front of everyone who's reading this thread...

The fact that you claim an FX can keep a GPU at 99% in every game is a joke and you need mental help.
 
GTA4? Bloody hell, how old do you want to go :rolleyes:

I'd say that when you're using GTA4 to sell a CPU you need to have a word with yourself.

People buy upgrades, or complete rigs, based on modern games - not on games from ten years ago that were terribly coded in the first place.

GTA4 was the worst console port I have ever seen. It took until I had GTX 480 SLI, had hacked the game to work with SLI, and threw an overclocked i7 950 at it for it to finally behave - and it still looked dated and crap.
 
> Oh Andy, this is why I love your posts....
>
> You're silly and blind, there's thousands of forum posts and article all over the internet that prove that AMD can't hack it in every game.
>
> And don't go calling me out on my AMD CPU history, you won't like what you'll find....
>
> The fact that you claim an FX can keep a GPU at 99% is every game is a joke and you need mental help.

Don't actually recall him saying that, like :p

But Crysis 3 maxes out his whole rig per his post, so he'd bottleneck an R9 290 Crossfire with an FX83.
 
> I forgot Almighty, didn't you have one of the fastest Thuban chips in the UK?

Mate, I've run all the decent AMD chips under phase, including some FX chips.

I've played with them at clock speeds most of the tools in here can only dream of running 24/7.

The motherboard that I used alone cost more than Andy's CPU and mobo combined...
 
> GTA4? bloody hell how old do you want to go :rolleyes:
>
> I'd say that when you're using GTA4 to sell a CPU you need to have a word with yourself.
>
> People buy upgrades, or complete rigs based on modern games. Not on games from ten years ago that were absolutely terribly coded in the first place.
>
> GTA4 was the worst console port I have ever seen. It took until I had GTX 480 SLI, and hacked the game to work with SLI, and throw an overclocked I7 950 at it for it to finally behave, and it still looked dated and crap.

You're the one that said... and I quote:

> No look, every game. Face it

My 2600k has no problems with GTA4... despite it being a bad port.

Surely your super-powered 8-'core' CPU should eat it? After all, you did say...

> No look, every game. Face it
 
> Oh Andy, this is why I love your posts....
>
> You're silly and blind, there's thousands of forum posts and article all over the internet that prove that AMD can't hack it in every game.

Yet when I run the games myself on my rig the results are usually the complete opposite. You posted a set of results from Metro Last Light, yet when it came down to it and Teppic and I both ran the benchmark, I beat his i7 3770K.

So why the conflicting info? Well, it's easy, really. The 8320 and 8350 come with a rubbish stock clock speed. There's a good reason for that - you can overclock them, and miles beyond the 3.7GHz or 3.9GHz they ship at. And when you do, the results completely change.

Now I wouldn't mind your complete ignorance if you were completely ignorant. But you're not, because you can pull data from the internet and be selective with it to try and show that AMD CPUs are rubbish. Then you lie, and then you come up with comments such as the one you just made about a GPU sitting at 70% because it's paired with an AMD CPU.

Yet clearly you didn't take any time whatsoever to back up your claim, and once again you got called out on it and made to look like a liar.

You only provide facts AFTER you've made your outrageous comments. And you can only supply those facts by using a warped perspective.

> And don't go calling me out on my AMD CPU history, you won't like what you'll find....

Oh noes, please don't, you're scaring me :rolleyes:

God, you talk some crap.

> The fact that you claim an FX can keep a GPU at 99% is every game is a joke and you need mental help.

Where did I say that an FX can keep EVERY game at 99% GPU usage? Don't try and twist my words to suit your lies. What I actually said was that Crysis 3 does that to my rig. OK? Is that clear enough? As for the last line and the insult you have chucked in because you are clearly losing this argument:

Do it again and I'll report you (if a mod doesn't see it first) and you'll end up with a ban.
 
> He's still showing a bottleneck though, even in one of those perfect scenario's.
>
> And 20NM's almost on us, so we're going to have set ups with over twice the performance of the 7990.
>
> In the house currently I've got 2 R9 290's, if we're bottlenecking a 7990 in a perfect scenario, what would they do to my R9 290's?
>
> EDIT : Just read Andy's post (Don't know why I bothered)
>
> I forgot Almighty, didn't you have one of the fastest Thuban chips in the UK?

Martin, you are talking about top-end rigs here. The guy is deciding between an i5 and an FX; with an £800 budget and current pricing he'd be lucky to sneak a standard 290 into that build.
 
> Yet when I run the games myself on my rig the results are usually the complete opposite. You posted a set of results from Metro Last Light, yet when it came down to it and Teppic and I both ran the benchmark I beat his I7 3770k.
>
> So why the conflicting info? well it's easy, really. The 8320 and 8350 come with a rubbish stock clock speed. There's a good reason for that - you can overclock them. And, you can overclock them miles beyond a stock speed of 3.7ghz or 3.9ghz as they come stock. And when you do the results completely change.
>
> Now I wouldn't mind your complete ignorance if you were completely ignorant. But you're not, because you can pull data from the internet and be selective with it to try and show that AMD cpus are rubbish. Then you lie, and then you come up with comments such as the one you just made about a GPU sitting at 70% because it's with an AMD CPU.
>
> Yet clearly you didn't have any time whatsoever to back up your claim, and once again you got called out on it and made to look like a liar.
>
> You only provide facts AFTER you've made your outrageous comments. And, you can only supply those facts by using a warped perspective.
>
> Oh noes, please don't you're scaring me :rolleyes:
>
> God, you talk some crap.
>
> Where did I say that an FX can keep EVERY game at 99% GPU usage? don't try and twist my words to suit your lies. What I actually said was Crysis 3 does that to my rig. OK? that clear enough? as for the last line and the insult you have chucked in because you are clearly losing this argument?
>
> Do it again and I'll report you (if a mod doesn't see it) and you'll end up on a ban.

One game... whoopee.

And I'm not losing anything... Yay for losing an argument on the internet... Not.

I have a 2600k, you have a slower FX, and my CPU was the same price as yours.

So who's lost, really? :rolleyes:
 
> You're the one that said.... and I quote

Now not only are you taking what I said out of context to try and bolster your argument, you're also trying to use it, out of context, to bolster your argument. Silly.


> My 2600k has no problems with GTA4.... despite it being a bad port.
>
> Surely your super powered 8 'core' CPU should eat it? After all you did say..

Good for you. You managed to run the worst game ever ported to the PC on hardware that didn't even exist when it was launched.

Have a cookie.
 
> Martin you are talking about top end rigs here. The guy is investing in an i5/FX, with an £800 budget would mean current pricing lucky to sneak in a standard 290 into that build.

And the R9 290 and 780 are about the same performance; Phix switched from an FX83 to an Intel because the FX83 bottlenecked his 780.

So.....
 