Far Cry 4 System Requirements

Hmm I guess.

The ultimate solution in that case is to release a pre-release version for public playtesting, use a Steam-hardware-survey-like system to track the performance and hardware of those playing it, and make the data available on the game's product page. Yes, it's a longer process, but the results would give the most accurate representation of what gamers will actually get, and it would stop developers from being lazy.
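
The idea above, aggregating opt-in playtest telemetry into a per-GPU survey, could be sketched roughly like this. All field names and numbers here are made up for illustration; a real system would obviously collect far more detail.

```python
from collections import Counter

# Hypothetical opt-in telemetry a playtest build might report back.
# GPU names and FPS figures are invented for the example.
reports = [
    {"gpu": "GTX 680", "resolution": "1920x1080", "avg_fps": 48},
    {"gpu": "R9 290X", "resolution": "1920x1080", "avg_fps": 71},
    {"gpu": "R9 290X", "resolution": "3840x2160", "avg_fps": 33},
    {"gpu": "GTX 680", "resolution": "1920x1080", "avg_fps": 52},
]

def survey_summary(reports):
    """Aggregate per-GPU share and average FPS, Steam-survey style."""
    gpu_counts = Counter(r["gpu"] for r in reports)
    total = len(reports)
    summary = {}
    for gpu, count in gpu_counts.items():
        fps = [r["avg_fps"] for r in reports if r["gpu"] == gpu]
        summary[gpu] = {
            "share": count / total,               # fraction of playtesters
            "avg_fps": sum(fps) / len(fps),       # mean FPS on that GPU
        }
    return summary

print(survey_summary(reports))
```

Published on a product page, a table like that would let buyers look up the card closest to their own instead of trusting a publisher's requirements box.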

Or even just release a demo which is reflective of the full game, allowing people to test it on their own systems...

Of course we all know this isn't going to happen, as all of the big publishers are too intent on getting people to pre-order without any real verified information about the game (as this means it doesn't really matter what the game is actually like...) :(
 
The funny thing is I have 2 GTX680s and 4 R9-290s.

One R9 290, let alone an R9 290X, smokes them both.

I was running 5760x1080 and now 4K, but the difference shows even at lower resolutions.
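
For context on why those resolutions are so demanding, here is the back-of-envelope pixel arithmetic for the setups mentioned in this thread (the MP figures in the comments are rounded):

```python
# Rough pixel counts for the resolutions mentioned in this thread.
resolutions = {
    "1920x1080": 1920 * 1080,  # standard 1080p, ~2.1 MP
    "5760x1080": 5760 * 1080,  # triple-1080p surround, ~6.2 MP
    "3840x2160": 3840 * 2160,  # 4K, ~8.3 MP
}

baseline = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / baseline:.1f}x 1080p)")
```

So triple-screen surround is pushing 3x the pixels of 1080p and 4K is pushing 4x, which is why a card that's fine at 1080p can fall over at those resolutions.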
 
3 was infinitely superior to 2 in every way possible (I didn't play more than 2-3 hours of 2, whereas 3 was the first game I've ever bothered to complete with all achievements :p)

I bought 1, 2 & 3 on Steam after having played and loved Far Cry 2 on PS3, and I was thinking how nice it'd be to run at 5040x1200 on a 290X; it's an old game, it should be no bother.

Boy, was I wrong. Even at minimum settings it was horrific and unplayable, when the same setup pushed BF3, BF4, heavily modded Skyrim, Thief, AC2, 3 & 4 etc. all at max (without AA for BF4 and AC4, admittedly) at that res with no bother.

It very much put me off 3; I'll maybe try it again sometime when my confidence is restored.
 
I've been playing FC3 using a PowerColor Devil R9 270X, overclocked, with pretty much everything on v. high/ultra except for shadows. This is at 1920x1080.

I'd say on average I get around 50FPS. There will be the odd dip to around 35-40FPS when there's a lot going on, but I can handle that. It looks very, very nice. I think a 290X should be fine IMO; give it a try :)
 
Generally they overestimate the requirements so people can just brute-force their poor optimisation.

Yet in the console vs PC thread, or whatever it was called… this is why PC gaming is better? http://forums.overclockers.co.uk/showthread.php?t=18633089

I used to love PC gaming. I truly did, but there's so much screwing/messing about. Oh, you need this; oh, you need that. Oh, they are looking into the bugs; no, it's the graphics companies' problem. Need to wait on new drivers two weeks after the game has been released. Now need another patch because neither side properly communicates with the other to sort it out. New patch breaks the driver, or vice versa; rinse and repeat the merry-go-round. It's all right at our end. Blame each other. Silence and hide, hoping it disappears.

Hello Crysis 2 and Battlefield 3 threads years ago here…

It's OK, right, PC gaming is out of this world for console ports? PC owners die defending their highly priced quad-SLI setups for poor, bug-ridden console ports that they're so proud of.
 
Basing this on a company well known for its poor PC support is a bit silly… Nothing beats PC for anything when on a level playing field, and this is not an even playing field with Ubi. Games that are broken on PC are generally still broken in some way on consoles; I seem to recall reading the loading bug affected consoles too, people's save data got corrupted, and there was just as much fury from the console guys.

As for SLI & CrossFire users… we may usually be on the back foot in terms of waiting for profile releases and fixes, but when all is said and done I'd rather wait 2 weeks or 2 months for a profile and play even a port with fidelity that outshines anything the closed platforms have to offer.

Where poor performance is concerned, it is nearly always down to the developer and not the drivers (obviously there can be driver issues, but they really are few and far between). Watch Dogs was proof enough.
 
FC3 was quite good IMO; it got quite repetitive, but it executed the storyline well, and anything with free roam is always pretty good.

I would be quite interested in FC4 if the new CoD hadn't just come out, but I think these must be the highest PC recommended specs I've ever seen: a 290X?
I have one, but I'm sure it would run fine on many GPUs lower than that!

And 8GB of RAM seems to be the standard figure thrown at all games now.

I will say some of the Far Cry 4 videos that have been released look very good!
Cutscenes and whatnot seem to be something they focus on well.
 
Looking at all the other recommended specs given to games:

It won't use 8GB of system RAM, probably 3-4GB on max settings.
It will probably only need a 2GB 680/280X for maximum graphics @ 1080p
Intel i5 or AMD 6300 (Clocked up).

This is of course, if the game is 'optimised' to actually work properly without memory leaks and whatever, maybe a couple patches & driver updates.

These specifications are always stupid and, reading the above posts, if CoD:AW is using 6GB of RAM, something isn't right at all...
 
That's a little optimistic, considering what FC3 was like. I don't doubt that it will need a 780/290 to max it, but I figure the 680/280 will be fine if you turn down AA etc.
 
I guess, but you can never trust recommended specs at face value really, they always go over the top and lately it's because of optimisation issues or a bad port, which is getting more and more worrying.

Comes to something when they'll be recommending 16GB of RAM & 980's/390X's soon just to 'power' through the issues. :(
 
Yes, I agree. Inflating recommended specifications to account for lazy coding is bad. I just think that Ubisoft are really trying to push this next-gen thing. It could be all talk, but looking at some of the graphical features in Assassin's Creed Unity and Far Cry 4, I can't help but think the hardware required to max these games will certainly be a hike over their predecessors.
 
Guys, there may be some method to their mayhem.

Firstly, they say the game requires a D3D11 card with 2GB of VRAM; the GTX 670/680 were the first decent Nvidia cards to meet both those requirements (the reference GTX 580 only had 1.5GB; 3GB was third party). So it makes sense that a GTX 670/680 would be the Nvidia card (though I would have expected the 670).

Secondly, the AMD equivalent of a GTX 680 in a game that favours Nvidia (e.g. WoW, HAWX, etc.) is an R9 290, so the 290X kinda makes sense too (again, the 290 would have worked better IMO).
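
The logic in that first point, filtering the card lineup by the stated minimums, can be sketched like this. The card list and specs below are approximate and only for illustration:

```python
# Illustrative card data; specs are approximate and for the example only.
cards = [
    {"name": "GTX 580", "dx": 11, "vram_gb": 1.5},  # reference card, 1.5GB
    {"name": "GTX 670", "dx": 11, "vram_gb": 2.0},
    {"name": "GTX 680", "dx": 11, "vram_gb": 2.0},
    {"name": "R9 290X", "dx": 11, "vram_gb": 4.0},
]

def meets_requirements(card, min_dx=11, min_vram_gb=2.0):
    """Apply the stated minimum: a D3D11 card with at least 2GB of VRAM."""
    return card["dx"] >= min_dx and card["vram_gb"] >= min_vram_gb

qualifying = [c["name"] for c in cards if meets_requirements(c)]
print(qualifying)
```

Run against that list, the reference GTX 580's 1.5GB rules it out, which matches the post's reasoning about why the 670/680 end up as the listed Nvidia cards.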
 
I'm only interested in FC4 for its map editor/creator, to make my own online multiplayer maps; I find that fun.

I found the single player in FC3 to be very boring; too much travelling, also.
 
After the complete and utter let down of watch dogs:

- the fact that it ran crap on both GPU brands [funnily, the game ran better for me on my 7850 than on my 290 :rolleyes:] (using those mods provided by theworse, lunah etc. made the game run so much better)
- false advertising with the graphics, which were then later "turned on" by a modder (they have done this with previous games before including far cry 3)

+ all the recent ****: "can't see more than 30 fps", "we want an equal experience on all platforms" and the other usual BS, oh, and not forgetting the state that Far Cry 3 was released in as well...


It is safe to say that Ubisoft isn't getting any more money from me!

Just a shame that people will still happily throw money towards these sorts of game developers/publishers...
 