> Nvidia might gaslight you into thinking that, but it's just not true.

Would it not be better for them to include, say, 24GB of GDDR6 if that was as expensive as 16GB of GDDR7? There must be some solid reasons for going GDDR7.
> Nvidia might gaslight you into thinking that, but it's just not true.

The manufacturers have GDDR7 listed as faster and lower latency than GDDR6X. Not sure what info you have.
I read somewhere that the bandwidth of GDDR7 is massively higher than the memory used in the 40-series, so you can't make a direct comparison GB for GB.
… not sure how true that is!
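For a rough sense of the numbers: peak memory bandwidth is the per-pin data rate times the bus width. A quick sketch, where the per-pin rates are assumptions based on commonly quoted spec figures, not anything stated in this thread:

```python
# Rough bandwidth arithmetic: bandwidth (GB/s) = data rate (Gbps/pin) * bus width (bits) / 8.
# 22.4 Gbps (GDDR6X) and 30 Gbps (GDDR7) are commonly quoted figures,
# used here purely as assumptions for illustration.

def bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x = bandwidth_gbps(22.4, 256)  # 716.8 GB/s on a 256-bit bus
gddr7 = bandwidth_gbps(30.0, 256)   # 960.0 GB/s on the same bus
print(gddr6x, gddr7, gddr7 / gddr6x)
```

So on the same bus width the faster memory moves data roughly a third quicker, but, as the posts below point out, that changes how fast data moves, not how much fits.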
> Faster memory is always welcome, but if an application demands 20GB, for example, then it won't care if you have 16GB of fast memory; it'll still need 20GB.

Is there a difference between a game allocating a large amount of VRAM because you have it, rather than it strictly needing it? Or am I thinking of normal RAM?
> Is there a difference between a game allocating a large amount of VRAM because you have it, rather than it strictly needing it? Or am I thinking of normal RAM?

It should work in a similar way to a software program: request memory for assets, drop the allocation when done. However, you'd have to consider how those assets are used in the game world and whether it's faster to keep them in memory, because they'll be used again soon, or drop them. This is also where developers will get lazy.
Maybe faster memory changes that, but it's not like developers will make that effort.
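The keep-or-drop trade-off described above is essentially a cache-eviction policy. A minimal sketch of the idea, with hypothetical names and Python standing in for what an engine would do natively, using least-recently-used eviction against a VRAM-style budget:

```python
from collections import OrderedDict

class AssetCache:
    """Toy asset cache: keep recently used assets resident,
    evict the least recently used ones when the budget is exceeded."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self._assets = OrderedDict()  # asset name -> size in MB

    def request(self, name: str, size_mb: int) -> None:
        if name in self._assets:
            self._assets.move_to_end(name)  # used again soon: keep it hot
            return
        # Drop least recently used assets until the new one fits.
        while self._assets and self.used_mb + size_mb > self.budget_mb:
            _, freed = self._assets.popitem(last=False)
            self.used_mb -= freed
        self._assets[name] = size_mb
        self.used_mb += size_mb

cache = AssetCache(budget_mb=16)
cache.request("terrain", 8)
cache.request("buildings", 6)
cache.request("terrain", 8)      # touched again: terrain becomes most recent
cache.request("characters", 6)   # evicts "buildings", keeps "terrain"
print(list(cache._assets))       # ['terrain', 'characters']
```

The point of the sketch: with a smaller budget the cache just evicts more often, which costs re-upload bandwidth rather than failing outright, until a single working set genuinely exceeds the budget.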
> I read somewhere that the bandwidth of GDDR7 is massively higher than the memory used in the 40-series, so you can't make a direct comparison GB for GB.
> … not sure how true that is!

My 2TB SSD has massively higher bandwidth than my 4TB HDD, but I still can't shove 4TB into it.
> So in a way it does sound like what was debated above is correct, in that GDDR7 would allow for less VRAM overall because it's more proficient (faster) at pulling and flushing the assets it needs. Theoretically, anyway.

It opens up options. You'd hope there's a reason they're going for faster, more expensive VRAM on the flagships, as that's not exactly a selling point for consumers.
Kind of reminds me of when AMD had 6GB on the Fury X and said that because it's faster than normal GDDR, it's actually equivalent to nearly double the capacity.
It was 4GB of HBM, and it wasn't that the memory was faster; it was to do with the memory bus being 4096-bit.
> Good point, if you had 4GB of VRAM on a flagship today, and it was like 10TB/s bandwidth, would you worry about the 4GB?

Yes.
> I think I'd rather a 5080 and £700 in my pocket if it comes to it lol.

Nope, RTX 3070 called and said: "Cyberpu....." darn, it ran out of VRAM again.
Nvidia always seem to give people less than they want, but so far they have always been on the money in terms of VRAM allocation.
Isn't allocated-but-not-used VRAM a thing as well? Also, what kind of performance penalties are there when it's maxed out?
You'd think Nvidia would have done loads of tests on the amount, and I doubt they'd be afraid to pass any extra cost on to the consumer if it was needed!