Bend over for those 3090s then, 'VRAM not enough' buyers!
They need a higher VRAM card to compete with the 6800XT/6900XT, so common sense dictates that the Ti or a Super variant will appear later in the year.

You missed the last page or so, 3080 Ti's have been postponed indefinitely.
Even Steve from GN says the 3080 VRAM argument is a total non issue. 10GB will be plenty enough until the GPU needs replacing anyway.
Let's agree to disagree with the opinion you are trying to present as fact and just wait and see what happens. There's nothing more for me to discuss here, and I will revisit the topic in the coming 12-24 months when we have actual evidence to support it either way.

When 10GB Vram is a limitation it'll be a moot point for the 3080 because it'll have run out of grunt to operate at the ultra settings that require those large amounts of Vram, so the end user will be turning down the settings anyway... Or, upgrading to the 4000 series.
Not sure if serious.

There is a huge vram thread which I don’t wish to replicate, but we never hear it the other way around: when will the 3080 run out of GPU horsepower?
I meant: are you serious with what you posted about "never seeing arguments about horsepower running out before VRAM"? That is exactly what Dirk posted, and exactly what has been posted for months in the VRAM discussion threads as one of the core arguments of people who believe the cards do have enough VRAM. So for you to say you haven't seen it posted is just really bizarre.

Well as serious as one should be worried about vram running out!
His opinion (expert as it is, based on his widespread experience) back in September 2020 was to 'stop worrying about it'. I don't believe he is correct, as objectively speaking, some current games are already using 8+GB VRAM at 4k. It is not and can never be a 'total non-issue' to say that 10GB VRAM may be breached within the next 2 years. We can do our best to make an educated guess to predict VRAM usage over the next 2 years, but again there is, objectively speaking, a realistic possibility that some games will come along that will push this boundary at Ultra/Extreme settings, especially with mods like 4k textures (I also take modding potential into account as I love modding games so it always factors into my forecasting). Can you always scale those settings down to reduce VRAM usage? Yes, of course, but it doesn't change the fact that you would otherwise have hit a VRAM limitation by simply doing what you want to do.
In the end, no-one can say for sure, definitively, that 10GB VRAM will not be a limitation within the next 2 years. At this point, it's a known potential risk. AMD have removed this concern from the equation for this generation by giving 16GB VRAM and they are to be applauded for that. Anyway, I will leave it there as the discussion has been done to absolute death by this point in numerous threads, so all that remains is to wait and see and gather more concrete information as 2021/2022 progresses.
Throwing this one in there too: when people start using resizeable bar, this will only make it a bigger headache, no? The whole point is to work with the system by utilising spare VRAM, but the 3080 has such a lack of it that I can see a lot of shuffling, or better still, don't bother enabling it for some games.
To be honest I know very little about the inner workings of BAR as I haven't got around to doing any reading up on it yet. However if it does use VRAM in the way that you say then it could indeed have a further impact if it is simultaneously using VRAM that a game would otherwise want to use.
It just reinforces your earlier point: any card, even the flagship or the best-value buy, can suddenly be exposed as a weak link due to poor engineering choices.
Remember, when the lack of VRAM was first the topic there was no SAM/resizeable BAR. Now that things have progressed and nvidia are finally releasing it, you have to wonder even more: is this going to be the Achilles' heel that exposes the lack of VRAM?
Hah. I think it's certainly safe to say that, at the very least, 3090 users are never going to run out of VRAM.

For those slating the 3090 users though..
The GPU will reserve what it needs for the game and will not allow the CPU to access more than it allows, which preserves game performance. In its current state, this functionality only exposes a 256MB window to the CPU, where the instruction handshake can happen. The rest of the data that needs to be exchanged is stored in main RAM, and the CPU fetches it and provides it to the GPU on demand.
With resizable BAR, even though the CPU technically has full access to the GPU's VRAM (say 10GB), the GPU will still reserve what it needs as a priority and give the CPU the remaining VRAM for storing instructions and data. This means that when the GPU needs additional information, it does not have to ask the CPU and can fetch it from VRAM itself, making performance better.
However, the less total VRAM there is, the less "extra" benefit there will be, though still more than what we see today (never less). It will also depend on the game or software and the amount of data involved. A very demanding game with lots of data should reap higher benefits on a 3090 with 24GB of VRAM than on a 3080 with 10GB, but it will still be better than what it is today on a 3080.
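The aperture idea above can be sketched with some napkin maths. This is a toy model, not real driver behaviour: the function names, the 8GB asset figure, and the one-mapping-if-it-fits assumption are all illustrative, but it shows why a bigger BAR window means fewer CPU-mediated round trips.

```python
# Toy model (NOT real driver internals): compare how many window-sized
# copies are needed to move a game's assets into VRAM through the legacy
# 256MB BAR aperture versus a resizable BAR spanning all of VRAM.
# All numbers are illustrative assumptions, not measured values.

LEGACY_BAR_MB = 256  # classic PCI BAR aperture size

def chunked_uploads(asset_mb: int, window_mb: int) -> int:
    """Number of window-sized copies needed to stream asset_mb of data."""
    return -(-asset_mb // window_mb)  # ceiling division

def uploads_with_resizable_bar(asset_mb: int, vram_mb: int) -> int:
    """With resizable BAR the aperture covers the whole VRAM, so anything
    that fits can be written through a single mapping."""
    return chunked_uploads(asset_mb, vram_mb)

assets_mb = 8_000  # hypothetical 8GB of game assets

print(chunked_uploads(assets_mb, LEGACY_BAR_MB))      # 32 copies via the 256MB window
print(uploads_with_resizable_bar(assets_mb, 10_240))  # 1 mapping on a 10GB card
print(uploads_with_resizable_bar(assets_mb, 24_576))  # 1 mapping on a 24GB card
```

The 3080 vs 3090 difference the post describes shows up when the working set no longer fits: the smaller the VRAM, the sooner you fall back to shuffling through main RAM.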
That argument has never made sense, because most games are very tweakable and you don't have to just pick a preset and go. If you're running a card that's starting to struggle in modern titles and it doesn't have much VRAM either, it's down to low/medium settings across the board for you. If you have a card that's feeling the heat but has a ton of VRAM, you can turn all the core-heavy settings down to low/medium but keep textures on Ultra Mega Nightmare Xtreme, which will result in a better-looking game. Red Dead Redemption 2 is a good example of a game that benefits greatly in such a scenario: turning the texture setting down has a bigger impact on visual quality than all the other settings combined. It looks like a PS2 game on the lowest texture setting.

When 10GB Vram is a limitation it'll be a moot point for the 3080 because it'll have run out of grunt to operate at the ultra settings that require those large amounts of Vram, so the end user will be turning down the settings anyway... Or, upgrading to the 4000 series.
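The textures-cost-VRAM-not-horsepower point can be made concrete with a rough estimate. This is back-of-envelope only: the 500-texture scene is made up, the sizes assume uncompressed RGBA (real games use compressed formats that shrink this a lot), and the 4/3 factor is the standard full-mip-chain overhead.

```python
# Back-of-envelope texture VRAM cost, illustrating why the texture setting
# dominates VRAM usage while barely touching GPU core load. Uncompressed
# RGBA is assumed; the asset counts are hypothetical, not from a real game.

MIP_CHAIN_FACTOR = 4 / 3  # a full mipmap pyramid adds ~33% on top of level 0

def texture_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Approximate size of one uncompressed RGBA texture with mipmaps."""
    return int(width * height * bytes_per_pixel * MIP_CHAIN_FACTOR)

# Hypothetical scene: the same 500 textures at 2K versus 4K resolution.
count = 500
at_2k = count * texture_bytes(2048, 2048)
at_4k = count * texture_bytes(4096, 4096)

print(f"2K pack: {at_2k / 2**30:.1f} GiB")  # ~10.4 GiB
print(f"4K pack: {at_4k / 2**30:.1f} GiB")  # ~41.7 GiB, 4x the 2K cost
```

Doubling texture resolution quadruples the memory footprint but costs the shader cores almost nothing, which is exactly why a weak card with lots of VRAM can still run high textures.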
NVIDIA GeForce RTX 3080 Ti 'Rumored' Graphics Card Specifications
NVIDIA's GeForce RTX 3080 Ti FE (Founders Edition) graphics card is expected to feature the PG133-SKU15 PCB design and the GA102-250-KD-A1 graphics core. The GA102-250 GPU has also changed since the last time we saw it, and is now configured exactly the same as the GeForce RTX 3090, at 10496 FP32 CUDA cores.