Is 16GB of VRAM the Standard for High-End Graphics Cards?

Status
Not open for further replies.
People who buy high-end cards tend to upgrade more often, so VRAM is less of a concern, whereas those who buy mid-range and low-end cards usually hang on to them longer, so VRAM can be more of a factor for them.
 
I'd much rather forgo another 2GB of VRAM and 10% performance than pay 60%+ more money. If the 3080 isn't fast enough for 4K in the majority of games, then neither the 3080 Ti nor the 3090 will be good enough either, regardless of the extra VRAM.



AMD also said 4GB of VRAM wasn't enough for 1080p when they released the 8GB 5500 XT, yet 20 months later its successor launched with 4GB, so I wouldn't pay too much attention; they just say whatever fits the narrative.

https://community.amd.com/t5/gaming/game-beyond-4gb/ba-p/414776

Stop bringing logic into this thread!!!!

:p

I think 16 GB will definitely be the new standard for anything mid-range or above. On the one hand the need for >8 GB is clear, and on the other hand AMD/NV won't want to make cards with overly wide memory buses, so for 256-bit-bus cards 16 GB will be the obvious option.
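The arithmetic behind that: GDDR6/6X chips each sit on a 32-bit slice of the bus, so capacity comes in steps of (bus width / 32) × per-chip density, with clamshell mode doubling the chip count. A quick sketch (the function name and the 1 GB / 2 GB chip densities are my illustration, not anything official):

```python
# VRAM capacities possible for a given memory bus width.
# Each GDDR6/6X chip occupies a 32-bit slice of the bus;
# clamshell mode puts two chips on each 32-bit channel.

def vram_options(bus_width_bits, chip_sizes_gb=(1, 2), clamshell=False):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    if clamshell:
        chips *= 2                        # two chips share each channel
    return [chips * size for size in chip_sizes_gb]

print(vram_options(256))                  # 256-bit bus -> [8, 16] GB
print(vram_options(320))                  # 320-bit bus (3080) -> [10, 20] GB
print(vram_options(384, (1,), clamshell=True))  # 384-bit clamshell -> [24] GB
```

That's why a 256-bit card jumps straight from 8 GB to 16 GB with nothing sensible in between.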

What people don't get is that even right now a lot of games that need a lot of VRAM are still gimped in how much streaming and loading of higher-res textures and models they allow. RT, as it gets more complex, also hits VRAM quite a bit. So all the hacks and band-aids currently done driver-side to accommodate the <12 GB cards won't necessarily be there next gen, with the next crop of games.

I know personally I definitely won't touch a card without a minimum of 12 GB, but realistically I'll want 16+. I've had instances where even my 16 GB wasn't enough and I could have used more (FC6 downsampling shenanigans), and I'm sure I'll run into that more going forward. Niche scenarios but hey, that's the fun of PC gaming in the first place! Why else spend all this money on high-end gear otherwise? :cool:
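To put rough numbers on why texture resolution eats VRAM so fast: an uncompressed RGBA8 texture costs width × height × 4 bytes, and a full mipmap chain adds about a third on top. Real games use compressed formats, so treat this as a back-of-envelope illustration only (the function is mine):

```python
# Approximate VRAM cost of one uncompressed RGBA8 texture.
def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel
    if mipmaps:
        base = base * 4 // 3          # full mip chain adds ~1/3
    return base / (1024 * 1024)

print(f"2K texture: {texture_mib(2048, 2048):.1f} MiB")
print(f"4K texture: {texture_mib(4096, 4096):.1f} MiB")
print(f"8K texture: {texture_mib(8192, 8192):.1f} MiB")
```

Each doubling of texture resolution quadruples the cost, which is why higher-res texture packs blow through 8-10 GB budgets so quickly.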

In theory, DirectStorage will change the game quite a bit when it comes to VRAM usage:

https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/

Essentially we're in a transition period where AMD, Microsoft, Nvidia and Intel are all looking for more efficient ways to solve or improve things rather than just brute-forcing by throwing more hardware at the product.

Problem is we're still waiting to see this really take off... Only one game has been announced so far, and it's not a triple-A title and looks meh, but hopefully it's well implemented so it can give us some idea of what to expect.

https://www.tomshardware.com/news/forspoken-game-to-support-directstorage
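The reason DirectStorage can cut VRAM usage is residency: instead of preloading everything, the game streams assets from NVMe on demand and keeps only a working set in VRAM. Here's a toy model of that idea (a plain LRU cache; this is my illustration, not the actual DirectStorage API, which is a C++ interface for queuing NVMe-to-VRAM transfers with GPU decompression):

```python
from collections import OrderedDict

# Toy model of on-demand asset streaming: keep a fixed VRAM
# budget resident and evict the least-recently-used assets.
class ResidencyCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()      # asset name -> size in MB
        self.used = 0

    def request(self, asset, size_mb):
        if asset in self.resident:
            self.resident.move_to_end(asset)   # mark recently used
            return "hit"
        # Evict LRU assets until the new one fits in the budget.
        while self.used + size_mb > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[asset] = size_mb
        self.used += size_mb
        return "streamed"                      # loaded from SSD

cache = ResidencyCache(budget_mb=8192)         # pretend 8 GB card
print(cache.request("castle_albedo_4k", 85))   # streamed
print(cache.request("castle_albedo_4k", 85))   # hit
```

The faster the SSD-to-VRAM path, the smaller the working set you can get away with, which is exactly the bet being made on the lower-VRAM cards.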
 


Well, we'll find out soon enough; Square Enix is launching two games this year that use DirectStorage: Forspoken and the next Final Fantasy.

Let's see if DirectStorage can save the low-VRAM Nvidia cards.
 
By then we'll be on next-gen cards that have 16GB anyway. It's like I said when the "is 10GB enough" thread started: by the time 10GB becomes an issue in more than a handful of games, we'll be on next-gen cards with a lot more VRAM :D
 
The game is out tomorrow; I didn't know we had new GPUs yet.
 
Oh look another "vram" thread...

*grabs popcorn*

:cry:

Depends entirely on resolution, settings, and rasterisation vs ray tracing scenarios...

Grunt and VRAM go hand in hand, as has been evidenced throughout several threads now... Essentially there's no point having 16GB of VRAM if the card doesn't have the grunt in the first place, as we've seen in several ray tracing titles, and likewise no point having the grunt but too little VRAM, as we've seen with the 8GB 3070 in a couple of games/scenarios at 4K (but as many have said before, the 3070 was never really a proper 4K card...)

Given the 40xx series is right around the corner and a 4060/4070 is likely going to match or beat a 3090 for a considerably cheaper price, I certainly wouldn't be buying any RDNA 2 or Ampere GPUs now; just look at what happened with the 3070 upon release and the 2080 Ti situation ;) :cry:

In an ideal world I agree, but plenty of people sold their 20-series GPUs intending to pick up a great-value 30-series card, then really struggled to get one or overpaid. Hopefully the 40-series launch is different, but I'm not holding my breath.

That said, there's something far wrong if a 4060 can't be had for under £1,400, even via scalpers!
 
The early rumour is the 4060 will be slightly stronger than a 3090 in ray tracing, and just under it in rasterisation.

If true, do you think when that trickles in (some four months after the flagship) it's going to list at a £399 price point?

Well, did people think a 3070 was going to match/beat a £1,000+ 2080 Ti for £470?
 
The 3070 was slightly behind a Titan RTX in raster with slightly better RT, and it released for £469. That's generally what happens every time new cards release, and why it doesn't really pay to go for the top model, as it takes the biggest hit on resale value.
 
People who buy high-end cards tend to upgrade more often, so VRAM is less of a concern, whereas those who buy mid-range and low-end cards usually hang on to them longer, so VRAM can be more of a factor for them.

I don't think that's true at all; the people willing to spend £2K to have the latest and greatest will pay the same or more two years later. Enthusiasts spend the most and buy most often.
 
I'm sure, in your world, adding more VRAM to a 1080p/1440p mid-range card and then charging £649 for it would have made sense :p

Indeed... People seem to forget that at the time, Nvidia didn't have much choice in what VRAM they could use. In the case of the 3080's 320-bit bus, it was 10GB or 20GB, i.e. if they'd gone with 20GB, there goes that sweet £650 price tag...

I would rather forgo an extra 2+GB of VRAM if it meant adding another £100+, or as we've seen, £300-400+... Essentially you'd be better off saving that money for next-gen cards, which are going to be far better all round than previous/current gen.

VRAM seems to be very important and worth having for you, Tommy; why didn't you just go for the 3090 or 3080 Ti/3080 12GB? :p ;)
 
Course it did; I eventually managed to purchase that exact budget-conscious mid-range 3080 for £649.

That's what I was getting at. You should have had a 3080 for 1440p/4K to begin with. The 3070 was never going to compete with the 3080 due to less GPU grunt, hence it required less VRAM.
 
The 3070 was slightly behind a Titan RTX in raster with slightly better RT, and it released for £469. That's generally what happens every time new cards release, and why it doesn't really pay to go for the top model, as it takes the biggest hit on resale value.

That's not what I asked. :cry: If you also want to be pedantic, (as mentioned in the post) the 3060 was released some months after the 3080/90, and it was also not available as an FE. They typically sold for a lot more than they should have.

So when a 4060 is available and can give you 3090 performance, only 28 months later, it's going to be a minimum of £450! A bargain, though, if you didn't get a 30-series card so far.
 
The standard 4060 won't have 3090 perf; my guess is it will be more like 3070 Ti + 5-10% for around £350-400. The 4060 Ti may give almost 3090 levels of perf, but that will likely cost £500.
 
That's what I was getting at. You should have had a 3080 for 1440p/4K to begin with. The 3070 was never going to compete with the 3080 due to less GPU grunt, hence it required less VRAM.

The 80 is not that much faster than the 70; it's literally one or two notches higher in game settings, or the same reduced settings in FC6 but at higher fps.

Considering it took 433 days to get a £649 3080 (there were no stock drops for about three months before I got it, and only dribs and drabs of drops before that; others still can't get a sniff of them), I did get a launch-price AIB 3070 to tide me over, which was the right call...

Meanwhile, despite Nvidia adding another zero to the transaction to get into the 12GB comfort zone (which tells its own story), mid-range 3080 owners are 100% adamant their budget GPUs are bulletproof. Hint: that's why there are three, and nearly a fourth, better GPUs than the 10GB 3080.

The 70/80 are damn fine cards at MSRP; that still doesn't detract from the fact they can run out of VRAM.
 