

So we're going from 2GB to 12GB minimum in three years? Give it a rest.

[Image: jfRWrTt.jpg]


You were saying.....

XCOM 2 on a single Titan X, maxed out at 2160p.

A GTX 980 Ti or Fury X won't run the game at those settings.
 
Kaapstad, since you have access to both a Titan X and a Fury X, could you test 4K with 8x MSAA and compare the VRAM usage between the two systems? No need to finish the benches, just monitor them during the tests.
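For the NVIDIA side, something like this would do for logging peak VRAM during a run. A rough sketch, assuming nvidia-smi is on the PATH (NVIDIA only; the Fury X would need a different tool, e.g. whatever AMD's driver exposes):

```python
# Rough sketch: poll nvidia-smi and record peak VRAM usage per GPU.
# NVIDIA-only; assumes nvidia-smi is on PATH.
import subprocess
import time

def parse_memory_mib(csv_line):
    """Parse one line of nvidia-smi's csv,noheader,nounits output (MiB)."""
    return int(csv_line.strip())

def sample_vram_used():
    """Return current memory.used (MiB) for each GPU in the system."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return [parse_memory_mib(line) for line in out.splitlines() if line.strip()]

def monitor(seconds=60, interval=1.0):
    """Poll for `seconds`, returning the peak VRAM usage seen per GPU."""
    peaks = []
    end = time.time() + seconds
    while time.time() < end:
        for i, used in enumerate(sample_vram_used()):
            if i >= len(peaks):
                peaks.append(used)
            else:
                peaks[i] = max(peaks[i], used)
        time.sleep(interval)
    return peaks
```

Run `monitor()` in the background while the benchmark loops and you get the peak allocation without having to watch an overlay.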
 
Bear in mind that Kaap's idea of max settings means running 8X MSAA even at 4K res. It has always been the case that we could max out VRAM in games at settings nobody would realistically use on a single GPU.

Even a Titan X at 4K in XCOM 2 is getting ~15 FPS with max settings (8X MSAA).
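For a rough sense of why 8X MSAA at 4K chews through memory, here's a back-of-envelope sketch. The 8 bytes per sample is an assumption (RGBA8 color plus D24S8 depth/stencil); real renderers layer G-buffers, shadow maps and textures on top of this, so the true cost is much higher:

```python
# Back-of-envelope MSAA render-target cost at a given resolution.
# Assumes 8 bytes/sample: RGBA8 color + D24S8 depth/stencil.
def msaa_framebuffer_mib(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / 2**20

print(round(msaa_framebuffer_mib(3840, 2160, 8)))  # ~506 MiB for a single target
print(round(msaa_framebuffer_mib(3840, 2160, 1)))  # ~63 MiB with no MSAA
```

So just one 8X MSAA target at 4K costs half a gigabyte, eight times the no-AA cost, and a renderer holds several such surfaces at once.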
 

If I use all the TXs I get a lot more than that.

I prefer to play on a single card though.
 

Yes, but that's just the current generation of games; memory requirements are bound to climb year by year, so his VRAM recommendations for the future aren't that far off. Take VR, for example: you want absolute minimum latency, so you want as much data as possible already loaded in video memory to ensure the best possible experience.
 

Exactly, how many PC gamers will play at these settings? Just Kaap so far xD
But seriously, it's overkill. The justification for 12GB seems to be that you can use that much even while getting unplayable FPS and barely any visual IQ increase.

You would probably need at least a 3-way setup to get anything like playable FPS running these games at max settings, and even then you'd best hope for SLI support and good scaling lol.

12GB atm is pointless. 4-6GB is more than enough for today. Maybe "tomorrow" 12GB will be more useful.
 

I have never disputed that VRAM requirements will increase year by year, and I generally advocate future-proofing when it comes to VRAM. But Kaap isn't talking about future VRAM needs; he is stating that some games need 12GB now, and his reasoning is based on using 8X MSAA at 4K, which is not a realistic yardstick for measuring current requirements.
 
According to HardOCP, Rise of the Tomb Raider uses 8 GB of VRAM at 4K on a Titan X with no AA, and 9.5 GB with 4x AA:

VRAM Usage

VRAM utilization was one of the more interesting aspects of Rise of the Tomb Raider. It appears to be very different depending on what GPU you are running. There is no question this game can consume large amounts of VRAM at the right settings. With the GeForce GTX 980 Ti we maxed out its 6GB of VRAM just at 1440p with No AA and maximum settings. The TITAN X revealed that it was actually using up to 7GB of VRAM for that setting. In fact, when we pushed the TITAN X up to 4K with SSAA it used up to almost 10GB of VRAM.

However, for the AMD Radeon R9 390X utilization was a bit odd when you first see it, never exceeding 4.2GB and remaining "flat" with "Very High" textures and SSAA. We did see the proper decrease in VRAM using lower settings, but the behavior was odd indeed. This didn't seem to negatively impact the video card however. The VRAM is simply managed differently with the Radeon R9 300 series.

The AMD Radeon R9 Fury X kind of backs that statement up since it was able to allocate dynamic VRAM for extra VRAM past its 4GB of dedicated VRAM capacity. We saw up to a 4GB utilization of dynamic VRAM. That allowed the Fury X to keep its 4GB of dedicated VRAM maxed out and then use system RAM for extra storage. In our testing, this did not appear to negatively impact performance. At least we didn't notice anything in terms of choppy framerates or "micro-stutter." The Fury X seems to be using the dynamic VRAM as a cache rather than a direct pool of instant VRAM. This would make sense since it did not cause a performance drain and obviously system RAM is a lot slower than local HBM on the Fury X. If you remember a good while ago that AMD was making claims to this effect, but this is the first time we have actually been able to show results in real world gaming. It is awesome to see some actual validation of these statements a year later. This is what AMD said about this in June of 2015.

Note that HBM and GDDR5 memory sized can’t be directly compared. Think of it like comparing an SSD’s capacity to a mechanical hard drive’s capacity. As long as both capacities are sufficient to hold local data sets, much higher performance can be achieved with HBM, and AMD is hand tuning games to ensure that 4GB will not hold back Fiji’s performance. Note that the graphics driver controls memory allocation, so its incorrect to assume that Game X needs Memory Y. Memory compression, buffer allocations, and caching architectures all impact a game’s memory footprint, and we are tuning to ensure 4GB will always be sufficient for 4K gaming. Main point being that HBM can be thought of as a giant embedded cache, and is not directly comparable to GDDR5 sizes.

Now specifically that statement backs up "4K gaming" and we will give AMD (and NVIDIA for that matter) a pass at this moment as neither produce "4K gaming" GPUs. The important statement here is this, "AMD is hand tuning games to ensure that 4GB will not hold back Fiji’s performance." We have said over and over again that this statement by AMD did not ring true in terms of needed HBM capacity, and this is actually the first time we have seen AMD's statement make sense to us in real world gaming with Triple A titles. So kudos to AMD in being able to show us that this statement has come to fruition finally. The downside we see to this statement is AMD will have to "hand tune" games in order to make this work. What games will get hand tuned in the future? That said, AMD seems to have done an excellent job hand tuning Rise of the Tomb Raider for its HBM architecture.
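The "dynamic VRAM as cache" behavior HardOCP describes can be pictured as a two-tier pool: a small fast tier (dedicated HBM) backed by a slower overflow tier (system RAM), with least-recently-used resources demoted when the fast tier fills. A toy model, purely illustrative and nothing like the actual driver implementation:

```python
# Toy two-tier memory model: fast pool (dedicated VRAM) backed by a
# slower overflow pool (system RAM), with LRU demotion on overflow.
from collections import OrderedDict

class TwoTierPool:
    def __init__(self, fast_capacity_mib):
        self.fast_capacity = fast_capacity_mib
        self.fast = OrderedDict()   # name -> size (MiB), LRU order
        self.slow = {}              # overflow ("dynamic VRAM")

    def touch(self, name, size_mib):
        # Promote the resource to the fast pool (most-recently-used),
        # demoting least-recently-used entries until it fits.
        self.slow.pop(name, None)
        self.fast.pop(name, None)
        self.fast[name] = size_mib
        while sum(self.fast.values()) > self.fast_capacity:
            victim, vsize = self.fast.popitem(last=False)
            self.slow[victim] = vsize

pool = TwoTierPool(fast_capacity_mib=4096)
pool.touch("textures_hi", 3000)
pool.touch("shadow_maps", 1500)   # "textures_hi" is demoted to system RAM
```

As long as the working set at any instant fits in the fast tier, the demotions are invisible to frame rate, which matches what HardOCP observed on the Fury X.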
 
With 'X' times more people running the 970's "hand tuned" 4GB memory allocation, I bet a lot of them are soiling themselves more than Fiji users are when it comes to future tuning! :p
 
Just had a read of that, and it is indeed interesting: VSync is on and frames are dropped, and there are some interesting points about the Windows Store showing the same behavior. Will read it again tomorrow when I haven't just finished a late shift :)

Yeah, I personally don't really like how Microsoft is sticking its nose into everything. The only good thing about this new "hybrid" VSync mode is that it mixes the benefits of VSync on and off: you don't get the tearing you'd normally get with VSync off, and you don't get the mouse lag you get with VSync on. But according to PCPer you get some "judder" (is that a word?). I personally don't know what they're seeing, as I don't see it, or perhaps I don't know what to look for.

What about FreeSync and G-Sync? Will this new mode break their functionality?
 