
XCOM 2 memory shock

** Removed deleted quote **

Can you give it a rest? The "joke" has been done to death, which makes this look like a blatant attempt to derail the thread.
 
I just can't believe this game needs all of this power, tbh. It's a turn-based game, FFS!

Imagine if it was a third-person shooter... no one would be able to run it. :eek:

What is it with game devs recently?
 
I just can't believe this game needs all of this power, tbh. It's a turn-based game, FFS!

Imagine if it was a third-person shooter... no one would be able to run it. :eek:

What is it with game devs recently?

The game is actually very good looking for this genre, and it's not really that demanding once you turn those overkill settings down to realistic levels.
 
Cheers, and I might well give this a look at the weekend :)

The game is well worth a look. I will be playing it a lot for the next few months.

It is also very playable on max settings @2160p with a single Titan X.

All credit to NVIDIA for having the foresight to equip the TX with 12GB of VRAM. Games like XCOM 2 will become commonplace in the next 12 months, and people need to accept that's the way things are going.

The previous XCOM game got good SLI support about a month after launch and hopefully the same will happen with XCOM 2.

Anyone thinking about buying a next gen card should hold off until they are available with at least 12GB or 16GB of VRAM. Anyone buying next gen cards with only 8GB will be making a big mistake.
 
I just can't believe this game needs all of this power, tbh. It's a turn-based game, FFS!

Imagine if it was a third-person shooter... no one would be able to run it. :eek:

What is it with game devs recently?

A turn-based game doesn't mean 2D crap graphics nowadays... :rolleyes:

The XCOM 2 graphics are better than two-thirds of the games that came out last year :rolleyes:

Not the best screenshots (they're heavily compressed by Photobucket, but I can email you the uncompressed 2560x1440 originals, which are 5-6MB each), but believe me, when it comes to the graphics the game looks stunning at maxed-out settings with no AA.


268500_2016-02-09_00012_zpsq10ap1ts.png


see the light of the guns

268500_2016-02-09_00013_zpsu9k4py8h.png


or the detail
268500_2016-02-09_00014_zpswvckuzps.png


from another battle

268500_2016-02-07_00022_zpsbsanphte.png


268500_2016-02-07_00025_zpsc0iy3lcb.png


Consider the detail in this top-down view, with multiple units and lighting...

268500_2016-02-09_00008_zpsuqeh7iqi.png


268500_2016-02-08_00005_zpsuzjywomt.png

As for the zero AA: it's because I lose 20fps, the game drops from a comfortable 59-62fps down to 40, and FXAA makes the screen look blurry and crap.


FYI, it consumes a maximum of 3927MB of GPU VRAM according to MSI Afterburner.
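To put that drop into perspective: fps differences understate the real cost, which is clearer in frame time. A quick sketch using the figures quoted above (plain arithmetic, nothing game-specific):

```python
# Convert fps into per-frame render time to see what the MSAA drop costs.
def frame_time_ms(fps):
    """Milliseconds the GPU spends on one frame at a given fps."""
    return 1000.0 / fps

no_aa = frame_time_ms(60)  # ~16.7 ms/frame at the "comfortable" ~60fps
msaa  = frame_time_ms(40)  # 25.0 ms/frame once MSAA is enabled
extra = msaa - no_aa       # ~8.3 ms of extra GPU work per frame, ~50% more

print(f"No AA: {no_aa:.1f} ms  MSAA: {msaa:.1f} ms  AA cost: {extra:.1f} ms")
```

So a "20fps drop" here actually means the GPU is doing roughly half as much work again per frame, which is why the setting feels so expensive.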
 
Anyone thinking about buying a next gen card should hold off until they are available with at least 12GB or 16GB of VRAM. Anyone buying next gen cards with only 8GB will be making a big mistake.

Kaap, I agree.
The next top-of-the-range GPUs should have 12GB+. Any company that fails to do so is going to lose my money at least, regardless of whether it's AMD or NV.
 
Anyone thinking about buying a next gen card should hold off until they are available with at least 12GB or 16GB of VRAM. Anyone buying next gen cards with only 8GB will be making a big mistake.

Although I tend to agree with you, I wonder if we're not getting too hung up on this single game in this respect?
 
Although I tend to agree with you, I wonder if we're not getting too hung up on this single game in this respect?

There are a number of games out now that can breach 8GB @2160p; Watch Dogs and GTA V are a couple that spring to mind.

Another thing that often gets said is that you cannot see the effect of 8xMSAA @2160p, but the truth is that in some games you can.
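For a rough sense of why 8xMSAA at 2160p is so heavy, here's a back-of-envelope sketch of raw render-target sizes. MSAA stores every sample per pixel, so the buffers scale linearly with sample count; this ignores textures, compression, and engine-specific buffers, so treat the numbers as illustrative only:

```python
# Size of one MSAA render target: width x height x samples x bytes per pixel.
def msaa_target_mib(width, height, samples, bytes_per_pixel=4):
    """Raw size in MiB of a single render target at the given sample count."""
    return width * height * samples * bytes_per_pixel / (1024 ** 2)

w, h = 3840, 2160                   # 2160p ("4K")
colour = msaa_target_mib(w, h, 8)   # 8xMSAA RGBA8 colour buffer, ~253 MiB
depth  = msaa_target_mib(w, h, 8)   # matching 32-bit depth/stencil buffer

print(f"Colour: {colour:.0f} MiB, depth: {depth:.0f} MiB, "
      f"total: {colour + depth:.0f} MiB before a single texture is loaded")
```

At 1440p the same buffers shrink to under half that, which is one reason the "@2160p" qualifier matters so much in these VRAM discussions.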
 
All credit to NVIDIA for having the foresight to equip the TX with 12GB of VRAM. Games like XCOM 2 will become commonplace in the next 12 months, and people need to accept that's the way things are going.

Agreed.

NVIDIA did very well with their 6GB 980 Ti and 12GB Titan X cards, though their 970s and 980s are indeed struggling with max details at 1080p/1440p in the latest crop of games. I feel sorry for those who bought a 970/980 expecting it to play 1080p/1440p at max settings (without extreme AA) but are no longer able to do so.

Good foresight from AMD to equip the 290X with an 8GB variant, and the bargain-priced 390/390X with 8GB as well.

Though the Fury range was a complete letdown in almost every area: the cooler (pump issue), the VRAM (only 4GB on a top-tier card), and poor overclocking.
 
There are a number of games out now that can breach 8gb @2160p, Watch Dogs and GTA V are a couple that spring to mind.

Another thing that often gets said is you can not see the effect of 8XMSAA @2160p but the truth is in some games you can.

At 4K I can imagine it, but at the more widely adopted 1440p...? You're probably right that that's where we're headed, though. It'll be a steep curve this year, VRAM-wise.
 
Anyone thinking about buying a next gen card should hold off until they are available with at least 12gb or 16gb of VRAM. Anyone buying next gen cards with only 8gb will be making a big mistake.

It wasn't that long ago that people on this forum were laughing at me for saying my 780 didn't have enough RAM.
 
Though the Fury range was a complete letdown in almost every area: the cooler (pump issue), the VRAM (only 4GB on a top-tier card), and poor overclocking.

Completely right. My Fury is an awesome card with the best Fury cooling solution available (Sapphire Tri-X), but it is being held back in ways it shouldn't be.
 
There are a number of games out now that can breach 8GB @2160p; Watch Dogs and GTA V are a couple that spring to mind.

Another thing that often gets said is that you cannot see the effect of 8xMSAA @2160p, but the truth is that in some games you can.

I personally don't dispute the image quality improvement of 8xMSAA over FXAA at 4K, but I'd argue it's not realistic to use fps-wise, as it really kills performance. Also remember that games will only get heavier and heavier. We haven't even reached a stable 60fps with max details at 1080p (and won't in the near future). People always put that down to bad optimisation (and I don't claim these games are properly optimised), but the fact is we are nowhere near maxing out even 1080p rendering quality, yet people talk about 60fps at max details on 4K (dream on, people).

As for memory, the more the better :)
 
Turn based game doesn't mean 2D crap graphics nowadays... :rolleyes:

[screenshots snipped; quoted in full above]

FYI it consumes max 3927MB of the GPU VRAM according to MSI AB.



I consume less than 4GB at 1440p with FXAA; no real need for MSAA in this game.

However, XCOM 2, although a great game, is definitely poorly optimised at the moment.
 
So to recap: the developers can't optimise to save themselves, and we should all blame our cards for not having enough VRAM and buy more.

Ooookay then.
 