
Homeworld: Remastered CPU Benchmarks

Yeah, the shared floating-point unit does make a difference.

I noticed it when comparing my Athlon X2 270 to a dual-core Kaveri in Cinebench (floating-point dependent, much like games). Single-core, the Kaveri gave my Athlon a beating; with all (two) cores firing, they were near identical.
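A back-of-envelope way to see that point, with hypothetical Cinebench-style scores (illustrative numbers only, not real results): a chip whose two cores each have their own FPU scales close to 2x from one thread to two, while a module sharing one FP unit scales noticeably worse, so a faster single core can still end up level overall.

```c
/* Hypothetical Cinebench-style scores (made up for illustration), comparing
 * two full FPUs against a module that shares one FP unit between its cores. */
#include <stdio.h>

static void report(const char *name, double one_thread, double two_threads)
{
    printf("%-10s 1T %.2f   2T %.2f   scaling %.2fx\n",
           name, one_thread, two_threads, two_threads / one_thread);
}

int main(void)
{
    report("Athlon X2", 0.80, 1.55);   /* two full FPUs: near 2x scaling      */
    report("Kaveri 2C", 1.00, 1.55);   /* shared FPU: ~1.55x, so they tie MT  */
    return 0;
}
```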

Presumably, a lot of games aren't optimised to favour cores from separate modules?

They're meant to be; it's down to thread scheduling priority. Like with HT, in a two-threaded game the game wouldn't use core 1 and the second logical core on it, it'd use two separate cores.
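For what it's worth, here's a minimal Linux/pthreads sketch of what "use two separate cores" means in practice - my own illustration with assumed core numbering (on many boxes logical siblings are 0/1, 2/3, ...), not anything taken from an actual game engine:

```c
/* Pin each of a game's two worker threads to a different physical core so
 * they don't land on one core plus its logical sibling (or, on FX, on the
 * two cores of a single module). Core numbers below are an assumption. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *worker(void *arg)
{
    /* stand-in for real game work: just report where we ended up */
    printf("thread %ld running on CPU %d\n", (long)arg, sched_getcpu());
    return NULL;
}

static void spawn_pinned(pthread_t *t, long id, int cpu)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);                 /* restrict this thread to one core */

    pthread_attr_t attr;
    pthread_attr_init(&attr);
    pthread_attr_setaffinity_np(&attr, sizeof(set), &set);
    pthread_create(t, &attr, worker, (void *)id);
    pthread_attr_destroy(&attr);
}

int main(void)
{
    pthread_t t1, t2;
    spawn_pinned(&t1, 1, 0);            /* first thread on core 0           */
    spawn_pinned(&t2, 2, 2);            /* second thread on core 2, not 1   */
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return 0;
}
```

In practice the OS scheduler is meant to do this spreading automatically; the snippet just makes the idea explicit.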
 

Agreed, and if that's the case then fair enough, but some other review sites actually show how the CPU is loaded up in a game.
 
[benchmark graph: CiS924g.jpg]


This game is not that taxing on a CPU though. If even in a large battle you're not going under 40 FPS, and the rest of the game it's over 100 FPS, then it's going to run fine for most people.

Some of you ought to play Sins of a Solar Empire - performance craters even on Intel setups during large firefights - I would wish for 40FPS then lol.
 

This is the benchmark run for the graphs you linked:

https://www.youtube.com/watch?v=AV8iQICQ3VI

Completely worthless as there is nothing going on.

The graph in the OP is from a skirmish, which highlights the advantage Intel CPUs have in this game, as well as in other CPU-intensive games.
 

Quoting from your own link:

Clocked at 4.5GHz the FX-9590 wasn't quite able to max out the GTX 980 like it did at its default 4.7GHz operating frequency with a 5.0GHz turbo boost. The FX-9590 shows how the FX series needs to hustle for the kind of performance you can expect from a much lower clocked Core i5 processor for example.

Pretty much sums it up. You have to clock the FXs into silly 220W+ TDP territory for them to compete with the i5s at stock for gaming, and that's in one of the best-performing games for the FX CPUs. That kind of heat output during summer? No thanks!

Once the 65W Broadwell-K CPUs launch in 2-3 months, the gap will only get more embarrassing in terms of performance, TDP and electricity consumption.
 
I don't see the purpose of having a hard-on for Intel, which is all this thread seems to focus on. I think there's enough here now for the general consensus to see that some of us are terrible at debating.

As martini put it, all the graphs do is cherry-pick the strengths. As for two-core games, that is so two-thousand-and-late!

You've drawn the wrong conclusion. All the CPUs in that chart manage 102 +/- 8 FPS. It's not that the i3s have shot up because Intel made huge leaps and bounds in the last year or two; it's that the test is not a good discriminator of CPU power, so they all manage about the same.

The difference between this BF4 chart and the earlier one is probably a combination of testing methodology (SP v MP?) and game patches, not incredible performance improvements in the Intel i3.
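Quick sanity check of that, using made-up FPS numbers in the 102 +/- 8 range (not the actual chart data): the spread works out to only a few percent of the mean, which is why the chart can't separate the CPUs.

```c
/* Made-up FPS results in the 102 +/- 8 band; the spread relative to the
 * mean is tiny, so the test isn't discriminating CPU power. */
#include <stdio.h>

int main(void)
{
    double fps[] = { 110.0, 107.0, 104.0, 101.0, 98.0, 95.0 };  /* hypothetical */
    int n = sizeof fps / sizeof fps[0];

    double min = fps[0], max = fps[0], sum = 0.0;
    for (int i = 0; i < n; i++) {
        if (fps[i] < min) min = fps[i];
        if (fps[i] > max) max = fps[i];
        sum += fps[i];
    }

    double mean = sum / n;
    printf("mean %.1f FPS, spread +/- %.1f%%\n",
           mean, 100.0 * (max - min) / (2.0 * mean));
    return 0;
}
```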

:)
 

There's nothing to debate.

It's a well-established fact that Intel's i5/i7/i7-E range is worlds apart from AMD's 2011-era CPUs and chipsets in terms of performance, feature set, heat output and electricity consumption.

Or are you suggesting otherwise? :)

The fact that Broadwell-K and Skylake are going to arrive this year will only widen this huge difference.

Oh, and before you mention it: yes, they are more expensive, but you get what you pay for in this life, after all.
 

One thing you're not looking at is that the 220W TDP you're quoting for an FX 8-core is a worst-case scenario, dependent on all 8 cores being loaded to around 90-100% utilisation; the main factor to take into account is the VID of the CPU, 1.5-1.55V.
In games like the one above that use two threads, the heat output won't be anywhere near its maximum potential. In games that are more MT-optimised, and in apps that load the CPU up, temperatures will scale upwards; compared to an Intel chip that can be noticeable in something like HandBrake conversions. But if a game were to max out all threads at 90-100%, it would be time to upgrade anyway.
Whilst relevant but probably disregarded: some of these FX-9590s can be undervolted, and in particular the same refined 2013 FX-83xx(E)s can hit around 4.5-4.6GHz at an average of 1.33-1.43V.
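As a rough illustration of why the VID matters so much (my own numbers and a simplified dynamic-power model, scaling with voltage squared at fixed clocks and ignoring static leakage - not measured figures for any specific chip):

```c
/* Simplified estimate: dynamic power scales roughly with V^2 at a fixed
 * clock, so an undervolt from ~1.50V to ~1.38V trims the worst-case figure
 * noticeably. Numbers are assumptions for illustration only. */
#include <stdio.h>

int main(void)
{
    double worst_case_w = 220.0;   /* quoted FX-9590 worst-case TDP        */
    double v_stock      = 1.50;    /* stock VID mentioned above            */
    double v_undervolt  = 1.38;    /* middle of the 1.33-1.43V range above */

    double scale = (v_undervolt * v_undervolt) / (v_stock * v_stock);
    printf("V^2 scale %.2f -> roughly %.0f W worst case after undervolt\n",
           scale, worst_case_w * scale);
    return 0;
}
```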


Worlds apart? There are gains, but only so-so ones, and I'm still waiting for the adoption of CPU-GPU compute. The untapped HSA in Kaveri could provide a nice gain/overhead of resources for an optimised game engine.
Why am I still on a 2600K from 2011? I'll answer that for you: the 2600K still provides great performance, and the incremental Intel offerings since have been lacklustre in my opinion.

AMD's AM3+ is outdated, but not irrelevant. The mistake is that GlobalFoundries' 28nm came late and didn't help the FX architecture, which doesn't scale well to high clock speeds and is better suited to density. AMD have unsold/rebranded FX stock on the shelf, whilst the more refined FM2+ platform, with its improved feature set, is limited by pin count/core count and motherboard power regulation, and can't make use of the denser libraries (for a 6-core APU). I remember back in the old days AMD managed a 6-core Thuban on the same 45nm process as the 4-core Deneb. That was impressive.

If I were buying new today then, depending on what I was using it for, it would probably be Intel for me; however, AMD is still an option, and I would still recommend an AMD system where it suits the person and their needs.
 
One thing you're not looking at is that the 220W TDP you're quoting for an FX 8-core is a worst-case scenario, dependent on all 8 cores being loaded to around 90-100% utilisation; the main factor to take into account is the VID of the CPU, 1.5-1.55V.

Of course it is, but it's still utterly pants when their competitor has sub-100W TDP parts under those same scenarios, parts which additionally have integrated GPUs included in that TDP.
 
There's nothing to debate.

It's a well-established fact that Intel's i5/i7/i7-E range is worlds apart from AMD's 2011-era CPUs and chipsets in terms of performance, feature set, heat output and electricity consumption.
...
Oh, and before you mention it: yes, they are more expensive, but you get what you pay for in this life, after all.

It's not just the fact that they're more expensive, Dave2150; they also have a gigantic budget that dwarfs AMD's R&D.

Alongside your rather biased benchmarks, I stumbled on this story by accident through some old-fashioned 'surfing' (circa 2011, much like the dated AMD technology), and it caught my eye rather aptly:

http://www.engadget.com/2014/06/12/intel-loses-eu-antitrust-appeal/?ncid=rss_truncated

Again cementing the reason why their competitors stand little chance, but blinkers on, eh Dave? :cool:
 

BRB dude just got to answer the door..

Awww, Donkeyballs...

 
It's not just the fact that they're more expensive, Dave2150; they also have a gigantic budget that dwarfs AMD's R&D.
...
Again cementing the reason why their competitors stand little chance, but blinkers on, eh Dave? :cool:

I'd buy an AMD CPU in a heartbeat if it were competitive in performance, TDP, power consumption and feature set across all programs, single-threaded and multi-threaded.

I used to be an all-AMD guy - I've owned so many AMD chips, the first being the 1.4GHz 'Thunderbird' Athlon many years ago (I foolishly bought a 60mm Delta Focused Flow fan to cool it; I was 16 at the time, so I can be forgiven for that crazy fan choice :D), followed by the Athlon XP, 'Barton', and the original dual-core FX.

Sadly, I've been with Intel since Conroe was released.

I'm still using AMD GPUs though; I've never owned an NVIDIA card, for example.

I really want AMD to do well - it's just inescapable that Intel offer far superior products in the mid-to-high-end bracket at the moment.

As I've said many times, I really hope Zen is a fantastic architecture and that it's worlds apart from Intel. I'll buy it in a heartbeat.
 
Even an i3 is faster than the best AMD chip in that graph! What the hell?

Why are people so surprised by this? It's been happening for years now. Are people in some sort of strange denial?

  • AMD = budget provider for those on a highly limited budget
  • Intel = performance for those with a less limited budget

This is not going to change until AMD see a dramatic reversal in their financial fortunes and have the billions to pump into coming up with some kind of miracle architecture. That will likely never happen now, given how far behind the company has fallen in the CPU market.

It's not a case of bias, it's just fact.
 
Why are people so surprised by this? It's been happening for years now. Are people in some sort of strange denial?

The Intel Witnesses are in strange denial.

Only, and I repeat only, when you skew the benchmarks and cherry-pick does the i3 win anything. And the reason behind it is quite clear - bad game support.

It's no wonder who is in charge of our country when there are so many deluded people out there who don't seem to understand that if a CPU isn't supported correctly then, guess what, it won't work properly!

FFS, how can you take a game that supports two cores and ignores the other six, and even so much as think that's even partially fair?

If I took three of the wheels off your car and it refused to budge, would you blame the engine? Or would you logically and intelligently deduce that with the wheels missing there is nothing the engine can do about it?

Look. The internet has been around for a very, very long time now - 1/8 of your natural life. If you haven't figured out by now that people on there abuse statistics and bend and warp the truth, then I feel awfully, horridly sorry for you.

I really want AMD to do well

pmsl I might have that one for my sig. You really want them to do well yet you're on here, day after day, ****ging off their processors when you have no actual grasp of how a processor works, nor how software can totally and utterly make or break it, regardless of how bad or good the CPU is.

You're a fanboy and fanboys will stop at absolutely nothing to drive their point home, even if it's as bent as a nine bob note.
 
The Intel Witnesses are in strange denial.

Look. The internet has been around for a very, very long time now - 1/8 of your natural life. If you haven't figured out by now that people on there abuse statistics and bend and warp the truth, then I feel awfully, horridly sorry for you.

Congrats on playing up to the stereotypical fanboy spiel. Personally, I prefer objectivity and the fastest performance for my money, which is why I have no bias. :)
 

Yeah mate, aww poo, I'm sorry you caught me out. Total AMD fanboy, me - how did you ever guess?

Plus an 8-core Xeon Hackamac and an Intel Pentium Anniversary ITX rig.

Do you want me to carry on? How about my ROG i7 laptop?
 