
Dedicated VRAM thread?

Discussion in 'Graphics Cards' started by ColdAsIce, 5 Nov 2020.

  1. ColdAsIce

    Soldato

    Joined: 26 May 2006

    Posts: 5,314

    Location: Edinburgh

    In all seriousness, there has been some really good discussion across at least three different threads (3070/3080 and 6900 XT) about VRAM usage and how important it is. Any way for the mods to consolidate the discussion into a single topic?
     
  2. Kaapstad

    Man of Honour

    Joined: 21 May 2012

    Posts: 31,184

    Location: Dalek flagship

    Providing they close all the other VRAM threads.

    I am also concerned that all these VRAM threads have more to do with trying to bash the rival brand and hardly anything to do with games running out of memory.
     
  3. tommybhoy

    Capodecina

    Joined: 30 Mar 2010

    Posts: 11,837

    Location: Under The Stairs!

    It's nearly always select Nvidia supporters, saddled with the more expensive card with less VRAM at comparable performance, trying to convince the world it's not needed.

    There was next to no argument to the Fury though...


    Although I've never seen a thread title along the lines of:

    Can't Amd Cards use less vram to make them even more cheaper than Nvidia?

    :p
     
  4. Chuk_Chuk

    Mobster

    Joined: 12 May 2014

    Posts: 2,857

    The best option is for a mod to merge it all together and rename it "The great VRAM debate".
     
  5. Blackjack Davy

    Soldato

    Joined: 16 Aug 2009

    Posts: 6,034

    I'm waiting for the "AMD's cards don't use as much power as Nvidia's therefore they must be inferior!" argument to appear. Give it time...
     
  6. PrincessFrosty

    Wise Guy

    Joined: 1 Oct 2009

    Posts: 1,033

    Location: Norwich, UK

    I actually have unironically made this argument, however. Way too many people seem to have this weird notion that the price points of the cards are somehow fixed, and that adding more RAM would somehow cost Nvidia/AMD, or that less RAM would let them pocket greater profits. That's not how businesses work: if the product costs more to make, you sell it for more; if it costs less to make, you sell it for less. What sensible people should want is enough RAM on the card that it's not a bottleneck, but no more.

    Perfectly analogous to system RAM: people will avoid putting in 32GB of RAM when 16GB will do. Why? Because 32GB costs twice as much as 16GB.
     
  7. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,232

    There's been a lot of worrying about not having enough VRAM, and not enough analysis of what happens when you exceed it slightly, or a lot in games.

    For example, if a game exceeds VRAM capacity, does the frequency of system RAM (used as a substitute for VRAM) make a difference?

    With cards like the RTX 3080, you may easily stay over 60 fps in many titles, even when exceeding the VRAM capacity.

    More GDDR6X is expensive and uses more power than GDDR6. Nvidia is already drawing at least 320 W for the RTX 3080; is it a good idea to increase the TDP further? That would just push up the heat and power supply requirements even more.
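    The system-RAM-frequency question above can be put in rough perspective: anything that spills past VRAM has to travel over the PCIe link, which is a far narrower pipe than the on-card memory bus, so the link tends to dominate long before RAM speed does. A back-of-envelope sketch (the GDDR6X figures are the RTX 3080's published specs; the PCIe number is an approximation of usable 4.0 x16 bandwidth):

```python
# Rough numbers, not benchmarks: why spilling past VRAM into
# system RAM hurts far more than any difference in RAM frequency.

GDDR6X_GBPS_PER_PIN = 19   # RTX 3080 memory speed, Gb/s per pin
BUS_WIDTH_BITS = 320       # RTX 3080 memory bus width

vram_bandwidth = GDDR6X_GBPS_PER_PIN * BUS_WIDTH_BITS / 8  # GB/s
pcie4_x16 = 31.5           # approx. usable PCIe 4.0 x16 bandwidth, GB/s

print(f"On-card VRAM bandwidth: {vram_bandwidth:.0f} GB/s")
print(f"PCIe 4.0 x16 link:      {pcie4_x16:.1f} GB/s")
print(f"Ratio: {vram_bandwidth / pcie4_x16:.0f}x")
```

    Roughly a 24x gap, which is why a slightly faster system RAM kit barely moves the needle once a game is actively streaming over the bus.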
     
    Last edited: 5 Nov 2020
  8. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,232

    If they make a 7nm version of the RTX 3080, the power requirements should be lower, so they should be able to give it more VRAM at about the same power, or maybe lower.

    There's no reason why they can't make an RTX 3070 with more VRAM though, except that it would annoy RTX 3080 owners.
     
  9. Flake87

    Wise Guy

    Joined: 27 Jul 2015

    Posts: 1,250

    Well, that really is the production-led view taken by very many tech companies, and it's the wrong way around for most of them.

    Marketing works differently: find out what people want, how many want it, how much they're prepared to pay for it, how much it costs to make, and how many you can produce, then adjust the price to match production output. Had Nvidia done this correctly they would have been charging a heck of a lot more, lowering demand as a result.
     
  10. no_1_dave

    Soldato

    Joined: 7 Jul 2004

    Posts: 7,074

    Location: Gloucestershire

    But is the VRAM being exceeded by assets that are actually in use and therefore needed, or would the game just cache more if more VRAM were available? We have seen many instances of VRAM allocation way above what the game is actually using.

    As for the power, GDDR6X uses 15% less power per bit than GDDR6; power consumption is only higher because GDDR6X runs faster. If they were running at the same speed/bandwidth, GDDR6X would use less power than GDDR6.
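    That power claim can be sanity-checked with simple arithmetic: take the quoted 15% lower energy per bit and scale by per-pin speeds. A rough sketch, assuming launch-spec speeds (GDDR6 at 14 Gb/s, GDDR6X at 19 Gb/s) and ignoring controller/I/O overheads:

```python
# Normalised power comparison from the "15% less power per bit" figure.

GDDR6_SPEED = 14.0    # Gb/s per pin (launch-spec GDDR6)
GDDR6X_SPEED = 19.0   # Gb/s per pin (RTX 3080's GDDR6X)
ENERGY_RATIO = 0.85   # GDDR6X energy per bit relative to GDDR6

# Power scales with (bits moved per second) x (energy per bit).
same_speed = ENERGY_RATIO                           # both clocked at 14 Gb/s
full_speed = ENERGY_RATIO * GDDR6X_SPEED / GDDR6_SPEED

print(f"GDDR6X vs GDDR6, matched speed:     {same_speed:.2f}x power")
print(f"GDDR6X at 19 vs GDDR6 at 14 Gb/s:   {full_speed:.2f}x power")
```

    So at matched bandwidth GDDR6X comes out ahead, while at full speed it moves ~36% more bits and ends up drawing around 15% more total power, which squares with both sides of the argument above.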
     
  11. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,232

    Isn't GDDR6X more expensive too? Otherwise, AMD probably would've used it for their RDNA 2 graphics cards.
     
  12. RanxZy

    Wise Guy

    Joined: 15 Jun 2009

    Posts: 2,182

    Location: South London

    Wow, there seems to be something severely wrong with the 3070 when it can't maintain performance on par with the 11GB 2080 Ti in Watch Dogs Legion at 4K Ultra with DLSS Quality. The 2080 Ti can stay over 30 fps, but the 3070 is completely unplayable in the 20s.

    This is going by early YouTube tests though; I wonder if some recent 3070 owners could chime in with their performance at 4K Ultra.

    The old saying that the core always runs out of juice before the VRAM could be completely dispelled, at least as regards the 3070 vs 2080 Ti battle.

    So glad I picked up a 2080 Ti Sea Hawk waterblock on the cheap, at less than half the RRP, when the 3070 was announced.
     
  13. RanxZy

    Wise Guy

    Joined: 15 Jun 2009

    Posts: 2,182

    Location: South London

    3070 could be the new GTX 970 LOL
     
  14. stooeh

    Wise Guy

    Joined: 19 Sep 2009

    Posts: 2,184

    Location: Riedquat system

    30 fps though? :p
     
  15. RanxZy

    Wise Guy

    Joined: 15 Jun 2009

    Posts: 2,182

    Location: South London

    I thought you knew, 4K30 is the new 4K60 mate :p
     
  16. no_1_dave

    Soldato

    Joined: 7 Jul 2004

    Posts: 7,074

    Location: Gloucestershire

    The 3070 is only a 4K card when playing previous-gen games. You gotta remember its performance is two years old :)
     
  17. RanxZy

    Wise Guy

    Joined: 15 Jun 2009

    Posts: 2,182

    Location: South London

    In all seriousness now, Nvidia are doing their own customers an injustice when they allocate it less VRAM than its performance equivalent launched with two years ago.

    It wouldn't be so bad if this weren't the exact year a new generation of graphics fidelity launches.
     
  18. gpuerrilla

    Capodecina

    Joined: 21 Jul 2005

    Posts: 15,038

    Location: N.Ireland


    PMSL :p

    I think that was the observation most picked up on. When we dig deeper it gets more smoke and mirrors. The best way is for all users to target specific games and settings, then share performance results using a proper tool.


    Lots of sensible discussion. Plenty of nonsense and tittle-tattle. Condensing it down to raw facts would be awesome.
     
  19. stooeh

    Wise Guy

    Joined: 19 Sep 2009

    Posts: 2,184

    Location: Riedquat system

    Indeed, and soon enough AMD will have their latest GPUs out, which will be powerful enough to compare against and expose any VRAM bottlenecks.
     
  20. gpuerrilla

    Capodecina

    Joined: 21 Jul 2005

    Posts: 15,038

    Location: N.Ireland

    I have yet to invest the time to melt my head over it, but the Infinity Cache could be interesting in certain scenarios. It may cause stutter in poorly developed games/apps, but could also show vast improvements at the other end of the scale.

    16GB at least doubles what you were getting a couple of generations back. Anyone want to run Windows 10 on 4GB of RAM?