
VRAM - AMD/Nvidia, why does it differ?

@Rusty I've got no idea what you're going on about. You're just arguing for the sake of it.

Didn't see it as arguing, in truth, but it sounds like your back's been up this whole thread for some reason. Chill out. It's been a good thread.

Nice try pulling the "I dunno what you're talking about" when it's been proved I was wrong :p.

Everything you say is in complete contrast to what Nvidia say on their website.

No it isn't - my image and remarks are based on their website. Your interpretation of what is being said is based on an incorrect understanding of what SLI is. I've already shown via a nice .png file that nVidia distinguish between 2-way and 3-way SLI.

I go back to the original point that I was making. If Nvidia don't think 4K is important...

Where have you got the not important part from? I said there's no market for it and I said they were setting the foundations for 4K adoption because it's in GPU manufacturers interests for there to be a mass 4K adoption.

.... or that 780 SLI is capable of providing playable fps with settings turned up whilst not running out of VRAM, then don't start a PR campaign and plaster it all over your website. Not interested in how you perceive things when it's there in black and white on their own website.

It's good that you say it's there in black and white and you're right that perception doesn't come into it. The fact is nVidia explicitly distinguish between 2-way, 3-way and Quad-SLI. SLI doesn't mean two GPUs. It means more than one. I've provided links to this.

Therefore their statement, which you kindly quoted on the previous page, is indeed correct. It might be a bit vague - probably purposefully so - as it isn't exactly a good idea saying you're going to need 3x £450 cards to play 4K. It just looks a million miles away and will put people off, and as I've explained it's in both AMD's and nVidia's interests that 4K is adopted. The GPU sales as a result help them both.

You OK now buddy? :)
 
Far Cry 3, Hitman Absolution, Battlefield 4, Total War: Rome II, Crysis 3. There may be more, but that's a few I can think of just quickly.

So less than 1% then? And Hitman is borderline.

The biggest thing that sucks up VRAM is running anti-aliasing, especially since nearly all game engines these days are deferred engines that use multiple buffers.

Simply increasing the resolution by a factor of 4x does not increase VRAM use by a factor of 4x.

And it's the same for art assets: if a texture consumes 20MB of VRAM at 1080p, then that texture will still only consume 20MB of memory at 4K.

Now at 4K you simply will not need to run 4-8xAA to get rid of the jaggies; heck, post AA will do an awesome job at that resolution, as post AA takes its samples from surrounding pixels, and the more pixels it has to sample from, the better the results are. It will have 4x the amount of pixels at 4K to generate its data from compared to 1080p, so it'll look much better.
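That scaling claim can be sanity-checked with some back-of-the-envelope arithmetic. A rough sketch (the buffer count, pixel format and MSAA factors are illustrative assumptions, not any particular engine's numbers): render-target memory grows with resolution and MSAA, while texture memory doesn't depend on output resolution at all.

```python
# Rough VRAM estimate for render targets in a deferred renderer.
# Buffer count and format are illustrative assumptions, not a real engine's.
BYTES_PER_PIXEL = 4  # e.g. an RGBA8 target

def render_target_mb(width, height, buffers=5, msaa=1):
    """Approximate render-target memory in MB.
    `buffers` ~ G-buffer layers + depth; MSAA multiplies the storage."""
    return width * height * BYTES_PER_PIXEL * buffers * msaa / (1024 ** 2)

for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for msaa in (1, 4):
        print(f"{label} {msaa}xMSAA: ~{render_target_mb(w, h, msaa=msaa):.0f} MB")

# Texture memory is independent of output resolution: a 2048x2048
# RGBA8 texture is ~16 MB whether you render at 1080p or at 4K.
```

Under these assumptions, render targets go from roughly 40MB at 1080p to roughly 160MB at 4K (a clean 4x), and it's the MSAA multiplier on top of that which really eats VRAM - the textures themselves don't grow.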

And in this review the 780 really seems to be struggling with 4K gaming: http://udteam.tistory.com/585
 
Yeah, Vega said his 7970 3GB cards were pushed for memory back then on his five-monitor setup, at least I think it was five in portrait. Whatever it was, it was way higher than 4K.

Yeah, he had some insane set-ups on the go. Ridiculous, almost. Fair play to him. I'd do something similar if I had the time and money to invest and test things out.
 

It doesn't really matter how many GPUs. You can debate whether they meant two or three GPUs; it doesn't really matter. From looking at their website, when they mean three GPUs they specifically state 3-way SLI, same as every other review site out there. No point going round in circles about it. Never heard any site or person refer to more than two GPUs as anything other than 3-way or tri-SLI, but whatever you say. Doesn't change the 3GB issue, or the fact that you agree with their assumption that 3GB is enough for 4K with settings turned up when reviews have clearly shown that this is not always enough. You agreed, but said you don't think that they consider it important, yet their PR campaign and website say different. Going round in circles here...

So less than 1% then? And Hitman is borderline.

The biggest thing that sucks up VRAM is running anti-aliasing, especially since nearly all game engines these days are deferred engines that use multiple buffers.

Simply increasing the resolution by a factor of 4x does not increase VRAM use by a factor of 4x.

And it's the same for art assets: if a texture consumes 20MB of VRAM at 1080p, then that texture will still only consume 20MB of memory at 4K.

Now at 4K you simply will not need to run 4-8xAA to get rid of the jaggies; heck, post AA will do an awesome job at that resolution, as post AA takes its samples from surrounding pixels, and the more pixels it has to sample from, the better the results are. It will have 4x the amount of pixels at 4K to generate its data from compared to 1080p, so it'll look much better.

And in this review the 780 really seems to be struggling with 4K gaming: http://udteam.tistory.com/585

Doesn't matter how few titles it is; those are all big games that would be good to play at 4K with settings up. Obviously you'd need more than one GPU to do this, but it doesn't matter how many you have if you don't have enough VRAM. But as before, my point comes back to the whole PR smear campaign, yet now the shoe is on the other foot regarding it.
 
Where have you got the not important part from? I said there's no market for it and I said they were setting the foundations for 4K adoption because it's in GPU manufacturers interests for there to be a mass 4K adoption.


I don't think Nvidia would have even bothered marketing 4K if AMD hadn't jumped the gun.


Although, as far as counter-marketing goes, G-Sync is potentially far more beneficial.

The real argument isn't "what is enough VRAM". It is "how long is this going to be enough VRAM".

I regularly hit over 2.5GB in most current titles at 1440p. I've hit my VRAM limit in Battlefield 4. Just the once, with ShadowPlay running, but the fact remains games are bouncing off it already with maximum settings.


I'd put my house on it: come Q2 2014, 3GB will be insufficient across the board (for maximum settings).
 
It doesn't really matter how many GPUs. You can debate whether they meant two or three GPUs; it doesn't really matter. From looking at their website, when they mean three GPUs they specifically state 3-way SLI, same as every other review site out there. No point going round in circles about it. Never heard any site or person refer to more than two GPUs as anything other than 3-way or tri-SLI, but whatever you say.

It does matter, because I've already shown you that nVidia distinguish between 2-way, 3-way and Quad-SLI. And from a technical POV, SLI refers simply to more than one GPU.

Doesn't change the 3GB issue, or the fact that you agree with their assumption that 3GB is enough for 4K with settings turned up when reviews have clearly shown that this is not always enough.

As already explained: 'settings turned up' is not the same thing as maximum settings. I haven't really formed an opinion on it as I haven't tested it, and 4K is still too young in my view to judge.

I said I agree with nVidia's statement that you'll need a multi-GPU 780+ setup to play games at 4K with settings up. To max games, I don't know what you'll need, but I kind of guesstimated at three, and Greg confirmed this has been tested.

You agreed, but said you don't think that they consider it important, yet their PR campaign and website say different. Going round in circles here...

This is agonising. I never said that they don't consider it important. Pack it up. I didn't even allude to it... :). I was talking in terms of the market in today's terms. Nothing more, nothing less :).
 
Doesn't matter how few titles it is; those are all big games that would be good to play at 4K with settings up. Obviously you'd need more than one GPU to do this, but it doesn't matter how many you have if you don't have enough VRAM.

Obviously you couldn't even be bothered to read the review...

Crysis 3 at 4K with 4xAA... [benchmark graph]

Far Cry 3 at 4K with 4xAA... [benchmark graph]

Hitman at 4K with 4xAA... [benchmark graph]

They seem to be doing OK to my eyes so maybe you can point out where the performance issues are? :rolleyes:
 
I think this is true for both companies, but I do think Nvidia are much more aggressive than AMD in that regard.

Yes.


AMD would do the same if they could. NV gets away with it because they are the current market leader.
 
As already explained: 'settings turned up' is not the same thing as maximum settings. I haven't really formed an opinion on it as I haven't tested it, and 4K is still too young in my view to judge.

I said I agree with nVidia's statement that you'll need a multi-GPU 780+ setup to play games at 4K with settings up. To max games, I don't know what you'll need, but I kind of guesstimated at three, and Greg confirmed this has been tested.

Doesn't matter if it's two- or three-way SLI if they don't have the VRAM, so it's not important arguing about what you believe they meant.

Settings turned up indicates maximum settings, or close to. Regardless, there's not enough VRAM for all titles at either setting. So again we come back to the original point with the PR campaign and the marketing all over Geforce.co.uk. If it's not important to them, or they don't believe there is a market for it, then why the PR campaign/marketing all over their website saying the opposite? Why the PCPer sponsored review featuring SLI? Guess how many cards they used, Rusty? Two. Guess what website that was linked on? You guessed it: Geforce.co.uk. Strange how they didn't use three cards.
 
Obviously you couldn't even be bothered to read the review...

They seem to be doing OK to my eyes so maybe you can point out where the performance issues are? :rolleyes:

I guess you don't understand how it works. None of those graphs show the minimums, so unless you read it via a translator you wouldn't know if the limit was exceeded or not. Have a read. ;)

http://i.imgur.com/ZmQaxcP.png


http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/6

The AMD Radeon R9 290X doesn’t have the best average clock speed, but it should be noted that the card never fully stalled when running with these aggressive image quality settings. The EVGA GeForce GTX 780 Ti SC ran the benchmark great except for one spot near the start of the benchmark run where more than 3GB of frame buffer was being used and the card fell on its face and the game froze for a full two seconds.
 
I guess you don't understand how it works. None of those graphs show the minimums, so unless you read it via a translator you wouldn't know if the limit was exceeded or not. Have a read. ;)

http://i.imgur.com/ZmQaxcP.png


http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/6

You obviously don't understand that things like that can be sorted by drivers... Modern game engines have a feature you might not have heard of: STREAMING...

And see how much that extra memory is helping the 290? The card has run out of puff, and even 4GB of VRAM can't help it.
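For what streaming means in practice, here's a toy sketch (purely illustrative, not any real engine's streamer): instead of failing outright when VRAM fills up, a streamer keeps resident data under a fixed budget and evicts the least-recently-used textures to make room for what the current frame needs.

```python
from collections import OrderedDict

# Toy texture streamer: keeps resident textures under a VRAM budget by
# evicting the least-recently-used entries. Purely illustrative.
class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb

streamer = TextureStreamer(budget_mb=64)
for tex in ["rock", "grass", "sky", "rock", "character"]:
    streamer.request(tex, size_mb=20)
# A 64MB budget holds at most three of these 20MB textures; "grass" was
# the least recently used when "character" arrived, so it got evicted.
```

Real streamers work per mip level and prefetch ahead of the camera, but the budget-plus-eviction idea is the same: VRAM use stays bounded, at the cost of streaming hitches when data has to come back in.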
 
Doesn't matter if it's two- or three-way SLI if they don't have the VRAM, so it's not important arguing about what you believe they meant.

Settings turned up indicates maximum settings, or close to. Regardless, there's not enough VRAM for all titles at either setting. So again we come back to the original point with the PR campaign and the marketing all over Geforce.co.uk. If it's not important to them, or they don't believe there is a market for it, then why the PR campaign/marketing all over their website saying the opposite? Why the PCPer sponsored review featuring SLI? Guess how many cards they used, Rusty? Two. Guess what website that was linked on? You guessed it: Geforce.co.uk. Strange how they didn't use three cards.

You've hit the nail on the head. 'Settings turned up' doesn't mean anything specific. It could mean anything. I've said all along that the quote you're hanging on to like a dog with a bone (which is actually out of context) is purposefully vague.

I've already answered this three times but I'll quote again for your convenience as comprehension doesn't seem to be your strongest skill tonight :p:

Rusty0611 said:
Where have you got the not important part from? I said there's no market for it and I said they were setting the foundations for 4K adoption because it's in GPU manufacturers interests for there to be a mass 4K adoption.

There you go old boy :).

It's clear you don't have any experience in planning, but it's not like one day they go into work and say "right then, 4K...". It's all part of a longer-term strategy, and the advertising of 4K falls into this. And they do this for the reason in my quote: it's in their interests. But this does not mean there is currently a market for it today.

How many people on this forum have 4K screens? How many people even have 2-way SLI setups? We're dealing with a tiny potential market here. In the future it should expand.
 
2 seconds? And only once for the entire benchmark? Didn't they test it a second time? :D

When I ran out of VRAM on my 680s at 5760x1080, it froze just like that, but I was told by certain people that isn't how it happens and that I was wrong... Now it can be quoted for truth???
 
I guess you don't understand how it works. None of those graphs show the minimums, so unless you read it via a translator you wouldn't know if the limit was exceeded or not. Have a read. ;)

http://i.imgur.com/ZmQaxcP.png

http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/6

Thought you were saying that running out of VRAM doesn't result in 0 FPS periodic stutters?
 
Found the quote:

If you read the Sweclockers review, it sounds like the performance loss is something that occurs the longer you play. Picking it up to play for a few minutes might not be enough. Plus the fact he's using four GPUs might skew the results somehow. This is why they cannot be taken as gospel. You either match it up like for like or don't bother. It's pretty easy to run the same test with the same cards, if you want. Doesn't matter how you think performance will show itself when VRAM runs out. Multiple sites are saying it shows itself as a performance drop (much lower minimums than 3GB+ cards; Sweclockers + HardOCP) and others are saying it presents itself as stuttering (HardOCP + PCPer). Only you or Greg are saying the only way it shows itself is dropping down to single-digit fps. As neither of you are able to test a 2GB card now, maybe things have changed somewhat regarding the way it works, thanks to driver/OS improvements, faster DRAM and faster SSD pagefiles. If it was one site saying it, fair enough, but too many decent sites are singing the same tune now.
 