Homeworld: Remastered CPU Benchmarks

Ha, I wish you didn't have me quoted with the age thing (as I think he's being ludicrous with that "point"), and mins do count a hell of a lot.
But anyway: a 3930K with 7970 Crossfire at 1080p 144Hz versus an FX-83 with the same setup. As far as the performance statistics go, there will be a fair difference (mileage obviously depends on the games). Whether you'll notice it depends on the game (can't give you a blanket yes/no); you could be playing Soldat for all I know.

You can only give it a try and report back.
 
I used to be an all-AMD guy - I've owned so many AMD chips - the first being the 1.4GHz 'Thunderbird' Athlon many years ago (I foolishly bought a 60mm Delta focused-flow fan to cool it; I was 16 at the time, so I can be forgiven for that crazy fan choice :D), followed by an Athlon XP, a 'Barton', and the original dual-core FX.

Sadly I've been with Intel since Conroe was released.

I switched to my first AMD when the 1GHz chip came out; before that I was using a PIII 450 (later a PIII 500), and before that a P100 - circa '93, I reckon.

I have had numerous Nvidia cards, more than the ATi ones I reckon. Just because I own an FX doesn't mean I don't recognise Intel. I switched from a C2D E8500 to this system; I just didn't have a King Kong budget to spend. :D
 
I switched to my first AMD when the 1GHz chip came out; before that I was using a PIII 450 (later a PIII 500), and before that a P100 - circa '93, I reckon.

I have had numerous Nvidia cards, more than the ATi ones I reckon. Just because I own an FX doesn't mean I don't recognise Intel. I switched from a C2D E8500 to this system; I just didn't have a King Kong budget to spend. :D

LOL at King Kong budget :D

My first PC ever was a 'Tiny' make, with a Pentium 233MHz with MMX! Technology! :D

Still remember it had a crappy little green heatsink :p
 
Erm.

I just looked at ALX Andy's COD benchmark and had to lol.
The i3-4340 has superior average and minimum frame rates to the FX-83.

They're obviously GPU-bottlenecked anyway, so there's nothing really to separate them (the Intel i3/i5/i7 results are practically identical).

Just thought it was funny.

Graph wars.
 
Just for reference, the article from HardwarePal (my website) was done towards the end of 2013. I corrected some spelling, and the article now shows a published date of 2014.

Anyway, those CPUs and GPUs were top of the range back then. Let me see if Milos (the reviewer) is able to update the article with newer games and CPUs/GPUs.
 
And I think you are missing the point a lot of people are trying to make, which is why they're getting slightly annoyed. It's not that Dave is completely wrong; it's that he cherry-picks, exaggerates, and disregards the significant budget difference. Apparently, according to him, increasing someone's budget by 40%+ is a minor increase.

Andy's posts are straight to the point, but from an experienced opinion: "This is how it is, accept it or don't, but I don't care."
He provides great insight into the games played by Intel and AMD; while most people probably don't agree with his out-of-the-box thinking, I personally think he is spot on most of the time (e.g. his Intel server hand-me-downs theory).

Same case with DaveXXXX when he's on an Anti-AMD rant with incorrect info.

Woah I thought I was on my own a week ago deflecting the babble from poor Dave! :D

Going back on topic, yeah the game is not a good indicator of a fair comparison between processors. /thread
 
Why isn't it a "fair" comparison? Is it only games which play to the strengths of the CPU that are "fair"?

Every game comparison is "fair", regardless of result (Or even relevance)
 
Your first comments normally include (slight paraphrasing) "AMD have horrid performance", "all games are poor on AMD" and "I would advise against anyone purchasing an AMD MB+CPU ATM"... followed by your cherry-picked results. You have even ventured into the AMD overclocking thread to tell them how bad their CPUs are.

Only after you have been called out for exaggerating and cherry-picking do you slightly back down and admit AMD are not completely horrid, but not once have I seen you actually say AMD are valid for the budget builder. In fact, it's one of the things I have had to keep emphasising to you after your initial outbursts.

This is a carbon copy of another poster on here who changes stance and throws in the odd 'AMD are actually great for the price' comment after ripping the back out of them for ages.

Apart from geographical differences they are of the same school of thought! :D
 
Why isn't it a "fair" comparison? Is it only games which play to the strengths of the CPU that are "fair"?

Every game comparison is "fair", regardless of result (Or even relevance)

By "fair", martini, I just have to quote an earlier post that sums it up for you (someone else's words, not mine).

Your first comments normally include (slight paraphrasing) "AMD have horrid performance", "all games are poor on AMD" and "I would advise against anyone purchasing an AMD MB+CPU ATM"... followed by your cherry-picked results.

Yup, we are. A thread about gaming where the game is loaded to work really well on Intel and make the AMD look bad.
...
99% of the time in your loaded, cherry-picked benchmarks.

We have danced over the fact that if you dissect the game engine and find it revs the heck out of two cores no matter what (courtesy of the other Dave), then to me that is an unfair comparison - no?
 
That doesn't make it unfair, it's just a situation that happens (more rarely these days, albeit!).

You're basically saying a comparison is only fair when it doesn't highlight the weakness of current AMD CPUs. That's the opposite of a fair comparison.

If we took 100 random games from 2013-2015 and half of them used 3 cores or fewer, does that make the comparison unfair? Nope. My main criticism of the AMD FX-83s is inconsistency in performance, and a broad range of testing would show that.
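Just to put a shape on what "a broad range of testing" showing inconsistency would look like, here's a minimal sketch (Python, and every FPS figure in it is a made-up placeholder, not real benchmark data): compare the FX against the i5 per game, then look at the spread of those ratios rather than any single graph.

```python
# Minimal sketch: summarising a broad batch of game benchmarks to show how
# consistent one CPU is relative to another. All figures below are
# hypothetical placeholders, not real benchmark results.
from statistics import mean, stdev

# (game, fx83_min_fps, fx83_avg_fps, i5_min_fps, i5_avg_fps) - placeholders
results = [
    ("Game A", 45, 78, 52, 81),
    ("Game B", 28, 55, 47, 72),
    ("Game C", 60, 110, 58, 105),
    ("Game D", 22, 49, 44, 70),
]

# Relative performance per game (FX minimum FPS as a fraction of the i5's).
ratios = [fx_min / i5_min for _, fx_min, _, i5_min, _ in results]

print(f"Mean relative min FPS: {mean(ratios):.2f}")
# A wide spread across titles is exactly the "inconsistency" being described:
# close or ahead in well-threaded games, well behind in lightly threaded ones.
print(f"Spread (std dev) across titles: {stdev(ratios):.2f}")
```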
 
I see your angle, but I mean if you are using it in a context where the other product gets beaten by lower market segments (i.e. the graphs are highlighting an issue where the high-end products suffer due to poor software design), then to me it is flawed as a comparison.

The i3 at a 3.4GHz clockspeed should never beat the FX-9590 at 4.7GHz unless there are sleight-of-hand tactics or optimisations favouring the Intel instruction sets that have not yet been tweaked for AMD processors. If the FX-9590 was around 4GHz or lower I could understand it being beaten - just not at 4.7GHz, no way.
 
Sorry, but all I do is see people (constantly) excusing performance in X or Y scenario because of X or Y excuse.

And then there are people highlighting that X and Y performance far too much.

Guess we'll agree to disagree.

EDIT: You're just underestimating how far behind AMD are, clock for clock and core for core, in FPU performance.
 
I am not excusing AMD in the main, even though you may think I do. The difference here is that the game is being used as a vehicle to make a point - albeit from Dave and his notorious elbow digs to try and big up the 'newer processors from Intel after 2011 that are so much better than the FX relics'.

We get that; most enthusiasts on here understand. Rubbing it in with poor information and very weighted graph wars is laughable.

The game looks good though! :)
 
[attached image: CiS924g.jpg]


This game is not that taxing on a CPU though. If even in a large battle you are not going under 40FPS, and for the rest of the game it's over 100FPS, then it's going to run fine for most people.

Some of you ought to play Sins of a Solar Empire - performance craters even on Intel setups during large firefights - I would wish for 40FPS then lol.

This is the benchmark run for the graphs you linked:

https://www.youtube.com/watch?v=AV8iQICQ3VI

Completely worthless as there is nothing going on.

The graph in the OP is a skirmish - which highlights the advantage Intel CPUs have in this game, as well as in other CPU-intensive games.

Even with an unoverclocked AMD CPU you are hitting 40FPS in a fight scenario, and in most other situations it's going to be much higher.

Have you even played something like Sins of a Solar Empire??

During fight scenes, playing 12+ hour matches at LANs with mates, I would wish to see 40FPS, and that's with a newish Core i7 myself.

I can still remember SupCom and its time-dilation effect from even a few years ago.

Homeworld: Remastered does not appear as taxing on CPUs as TW or some other RTS games.

It probably will run fine on most newish CPUs.

I also wonder how things will look if you end up with a more midrange graphics card, especially since going from a £130 Core i5 to a £300+ Core i7 barely yields a 15% to 20% improvement, but going from a £130 card to a £250 to £300 one yields a 50% increase in framerates.

I suspect a modern £100 CPU and a £150 card will be fine for the game though.


Edit!!

"Homeworld Remastered was tested in a similar manner to StarCraft II. We played a skirmish with seven AI-controlled players in a 4v4 match. The resources were set to their maximum value and we went with a 15-minute build time so players could reach critical mass. Just moments before the first major battle was set to take place we made a save which could be loaded repeatedly.

As the test starts over 200 of my own ships engage the enemy and all hell breaks loose for two minutes. With the GTX 980 frame rates were initially around 140fps at 2560x1600 but that dipped down to 50fps as the ships began to engage each other.

So if we were to benchmark the game in the building stage, gathering resources and what not, then frame rates would be around three times higher than what we are going to show during our massive battle scene, so keep that in mind."

So basically it is a worst-case scenario - if you are running with fewer AIs enabled, or simply with more actual people, it's going to be far less taxing.

The single player campaign is less than 40 hours long.

Playing against other people is going to be what most heavy players of the game will be doing and putting the hundreds of hours into, and that is going to be less punishing on CPUs than having to do all the AI calculations at the same time on one system.
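For anyone wondering how graphs like the ones above actually get produced, here's a minimal sketch of the number-crunching side (Python; the frametimes.csv filename and the one-frame-time-in-milliseconds-per-line format are my assumptions for illustration, not anything from the quoted review): capture per-frame times over the battle, then boil them down to average, minimum and 1% low figures.

```python
# Minimal sketch: turning a per-frame time log into the average/minimum FPS
# figures shown in these graphs. Assumes "frametimes.csv" holds one frame
# time in milliseconds per line (filename and format are assumptions, roughly
# what FRAPS-style capture tools produce).
def load_frametimes_ms(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def fps_summary(frametimes_ms):
    # Instantaneous FPS for each frame, then average and worst case.
    fps = [1000.0 / ft for ft in frametimes_ms if ft > 0]
    worst_count = max(1, len(fps) // 100)
    return {
        "avg_fps": sum(fps) / len(fps),
        "min_fps": min(fps),
        # 1% low: average of the worst 1% of frames - a steadier "minimum"
        # than a single spike, and closer to how a dip actually feels.
        "1pct_low": sum(sorted(fps)[:worst_count]) / worst_count,
    }

if __name__ == "__main__":
    print(fps_summary(load_frametimes_ms("frametimes.csv")))
```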
 
EDIT: You're just underestimating how far behind AMD are, clock for clock and core for core, in FPU performance.

I'm not. I know quite clearly how far ahead Intel are.

No one here is arguing that or pretending things are any different, we're just able to see all of the facts of which there are many.

So for an Intel fanboy that may take a while, but I imagined even the most hardened Intel fan could see the wood for the trees eventually (i.e. though AMD's core speed is a bit pants, there are eight of them after all).

But sadly it seems not, and I've actually had to do something I really hate doing (ignoring people). In fact, in all of my time on OCUK I've never ignored anyone before, but this guy is simply too far gone.

Shame really. There are many around here I don't particularly like and I can live with that. I also pride myself on never reporting posts to drop people in the crap, live and let live and all.

But I've never come across someone so brainwashed, so deluded.
 
The days of chasing ever-higher clockspeeds are long gone. Why bother when you can just multi-thread? Consoles are proof that you can have weak processing power on a few cores; you just need the right developers to get the most out of the hardware. Pretty much the polar opposite of dave12345's game showcase here, where they have not really bothered.

Going beyond 5GHz on CPUs is not the target, since there is little to gain for all that heat. Intel and AMD have focused on the mobile markets with low power consumption. For me the last 5 years seem to have stagnated for CPU progression; perhaps that is due to reaching certain limits, and the fact that GPUs can muscle in.
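To make the "just multi-thread" point concrete, here's a conceptual sketch (Python for brevity, with an invented Ship type and update rule; a real RTS engine would do this in native code with a proper job system) of farming independent per-unit updates out across all cores instead of revving two of them:

```python
# Conceptual sketch of spreading independent per-ship updates across all
# available cores instead of pinning the work to one or two threads.
# Ship and the update rule are invented for illustration only.
from concurrent.futures import ProcessPoolExecutor
from dataclasses import dataclass
import os

@dataclass
class Ship:
    x: float
    y: float
    vx: float
    vy: float

def update_chunk(chunk_and_dt):
    # Each worker advances its own slice of the fleet; no shared state,
    # so chunks can run fully in parallel.
    chunk, dt = chunk_and_dt
    return [Ship(s.x + s.vx * dt, s.y + s.vy * dt, s.vx, s.vy) for s in chunk]

def update_fleet(ships, dt, workers=os.cpu_count() or 2):
    size = max(1, len(ships) // workers)
    chunks = [(ships[i:i + size], dt) for i in range(0, len(ships), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        updated = []
        for part in pool.map(update_chunk, chunks):
            updated.extend(part)
    return updated

if __name__ == "__main__":
    fleet = [Ship(float(i), 0.0, 1.0, 0.5) for i in range(200)]
    fleet = update_fleet(fleet, dt=0.016)
    print(len(fleet), fleet[0])
```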
 
This is a (poorly) dual-threaded game... what do you expect? For all intents and purposes it might as well be single-threaded. Shameful.
 
This is a (poorly) dual-threaded game... what do you expect? For all intents and purposes it might as well be single-threaded. Shameful.

Hey, don't blame the crappy software, man. It simply must be the hardware!

Sarcasm aside - totally. A dual-threaded game in this day and age. Poor, poor show.
 