10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
I agree. For the record @james.miller , I never said princessfrosty or the others from that moment were wrong. I just came to my own conclusion that Nvidia were being stingy and should have offered the next leg up (which would have eradicated this discussion, the end).

Yeah, I think Nvidia engineered themselves into a corner. Expensive GDDR6X on a 320-bit bus doesn't give them many options - 10GB or 20GB and nothing in between. Only a bigger bus would have allowed something in between, and that would have been a chunk more expensive and... basically a 3090, unless Nvidia went with some trick memory system, and we know how popular that made Nvidia the last time they tried it, lol. I do think the majority of people will be fine. But that could change on a dime; we just don't know. More VRAM would have made that less of a concern, or no concern, but here we are. Nvidia clearly sacrificed to hit that price point and they didn't quite nail it. It's still a great card IMO.
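The "10GB or 20GB and nothing in between" point is really just bus arithmetic. A rough sketch (assuming one 32-bit GDDR6X chip per channel and the 1GB/2GB chip densities that were shipping at the time; clamshell or mixed-density configs aside):

```python
# A 320-bit bus means ten 32-bit memory channels, one GDDR6X chip each.
# Total capacity is simply chip count x chip density.
BUS_WIDTH_BITS = 320
BITS_PER_CHIP = 32

chips = BUS_WIDTH_BITS // BITS_PER_CHIP  # 10 chips

options = [chips * density_gb for density_gb in (1, 2)]
print(options)  # [10, 20] - nothing in between without a different bus
```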
 
It's certainly not easy to double the VRAM to 20GB while remaining competitive. Remember, Nvidia is using GDDR6X, as opposed to GDDR6. It's not just Nvidia that can play dirty.
We are not talking about being competitive, we are talking about AMD changing game specs to screw over Nvidia based on a rumour (according to you).
Of all the specs on a GPU (that we "care" about), VRAM is the easiest to change, and one that can be changed at the very last minute (a few months before release). During the rumour phase, Nvidia could have swapped the 1GB modules for 2GB modules a few months before release, rendering AMD's plan pointless. It would be stupid to invest money in screwing over a competitor when that competitor could accidentally stumble upon the solution, or come up with one, within a few months.
Godfall needing 12GB was nothing more than a coincidence that AMD tried to play up.
 
We are not talking about being competitive,

Seriously?

it would be **** easy for Nvidia to put 20GB on that cards if they wanted.

£999 for a 20GB 3080 vs £600 for a 6800XT would have just been handing sales to AMD.

we are talking about AMD changing game specs to screw over Nvidia based on a rumour. (according to you)

The only area where the AMD cards come out on top against the 3080 is the amount of VRAM. Are you telling me that AMD wouldn't leverage that advantage?

Of all the specs on a GPU (that we "care" about), VRAM is the easiest to change, and one that can be changed at the very last minute (a few months before release). During the rumour phase, Nvidia could have swapped the 1GB modules for 2GB modules a few months before release, rendering AMD's plan pointless. It would be stupid to invest money in screwing over a competitor when that competitor could accidentally stumble upon the solution, or come up with one, within a few months.
Godfall needing 12GB was nothing more than a coincidence that AMD tried to play up.

This is heading back to why we don't need more than 10GB on a 3080, other than some AMD-sponsored title that doesn't support DLSS and requires 12GB of VRAM when other comparable games only need 6-7GB.
 
Yeah, I think Nvidia engineered themselves into a corner. Expensive GDDR6X on a 320-bit bus doesn't give them many options - 10GB or 20GB and nothing in between. Only a bigger bus would have allowed something in between, and that would have been a chunk more expensive and... basically a 3090, unless Nvidia went with some trick memory system, and we know how popular that made Nvidia the last time they tried it, lol. I do think the majority of people will be fine. But that could change on a dime; we just don't know. More VRAM would have made that less of a concern, or no concern, but here we are. Nvidia clearly sacrificed to hit that price point and they didn't quite nail it. It's still a great card IMO.

It's still a great card, and like another guy posted (can't be bothered to reference), if they were available I would have got myself one. The 6800XT launch was shocking. I counted on it being a sensible alternative to not having a 3080 for choice; then seeing it was more expensive than the 3080 FE (the unicorn £599 units were, what, 100pcs for the whole of the UK?) and in even shorter supply made me slightly depressed.
 
Seriously?



£999 for a 20GB 3080 vs £600 for a 6800XT would have just been handing sales to AMD.
Yes, seriously. I thought we were discussing whether or not AMD would try to screw over Nvidia by raising the VRAM requirements of a video game? Or have you decided to no longer discuss that and pivot to the cost of Nvidia increasing the amount of VRAM?

Citation on prices.


Let me get this straight: you read this statement
we are talking about AMD changing game specs to screw over Nvidia based on a rumour. (according to you).
and this was your takeaway
The only area where the AMD cards come out on top against the 3080 is the amount of VRAM. Are you telling me that AMD wouldn't leverage that advantage?

For your sake I will be explicit. I do not think that AMD would spend money trying to screw Nvidia over based on a rumour (from some random YouTube video/Twitter account). Especially a rumour about a specification that could easily be changed. It would be a stupid business decision, more so for a company as poor as AMD (poor compared to their contemporaries).
With regards to Godfall, I have already made my position clear.
Godfall needing 12GB was nothing more than a coincidence that AMD tried to play up.




This is heading back to why we don't need more than 10GB on a 3080, other than some AMD-sponsored title that doesn't support DLSS and requires 12GB of VRAM when other comparable games only need 6-7GB.
This has nothing to do with the main topic at hand. If you want to discuss if Godfall needs 12GB or what optimisations they could do to drop the VRAM requirements then feel free to start a thread.

Also, screen resolution is a small part of VRAM requirements (assuming no settings change in the background), so DLSS won't have a big effect.
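To put rough numbers on why resolution alone is a small part of it: even a generous pile of full-screen render targets is tiny next to a 10GB card. A back-of-the-envelope sketch (the ten-buffer count and 4 bytes per pixel are illustrative assumptions, not measurements from any real engine - textures, not buffers, dominate real VRAM use):

```python
# Size of one full-screen buffer in MB at a given resolution.
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

# Assume ~10 full-screen buffers (colour, depth, G-buffer, etc.)
total_4k = 10 * buffer_mb(3840, 2160)
total_1440p = 10 * buffer_mb(2560, 1440)

print(round(total_4k))     # 316 (MB) - ~3% of a 10GB card
print(round(total_1440p))  # 141 (MB)
```

So dropping the internal render resolution (which is what DLSS does) saves a couple of hundred MB at most here, while the texture pool stays the same size.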
 
Yes seriously. I thought we were discussing whether or not AMD would try and screw over Nvidia by raising the VRAM requirements of a video game? Or have you decided to no longer discuss this and pivot to the cost of Nvidia increasing the amount of VRAM?

We were discussing AMD before you decided that Nvidia could easily double the VRAM on a 3080, while ignoring competitiveness.

Citation on prices.

No need, that's just common sense. There is no way Nvidia would hand out more costly chips without charging for them. Nvidia is much happier using such chips on the 3090.

Let me get this straight you read this statement

and this was your take away

Yes. This is what companies do to sell hardware. This is what Nvidia did with Gameworks for years.

For your sake I will be explicit. I do not think that AMD would spend money trying to screw Nvidia over based on a rumour (from some random YouTube video/Twitter account). Especially a rumour about a specification that could easily be changed. It would be a stupid business decision, more so for a company as poor as AMD (poor compared to their contemporaries).

You have your head in the sand if you think AMD didn't know how much VRAM the 3080 was launching with at least 4 months before launch. Likewise, Nvidia would also have known how much VRAM its main competitor, the 6800XT, would launch with.
 
I saw my Titan use nearly the full 12GB a few years ago playing Final Fantasy 15. Does that mean it ran poorly, or at lower detail, on a 1080 or 1080 Ti? Nope. Allocated memory and used memory are different. Most people haven't heard about this yet, so they get confused thinking they need loads of VRAM for such games. You need the latest version of MSI Afterburner, and I believe there is a way of showing allocated versus what is actually being used.

Guess which card Final Fantasy 15 runs better on: a 3080 with 10GB or a Titan with 12GB. Hell, even a Titan RTX with 24GB. I am sure the Titan RTX will show it allocates more than 12GB, but it would still get smashed by a 3080, all while the 3080 costs about a quarter of the price of a Titan RTX. Hence why I think 10GB is fine for now. By the time it ain't, we will be on next-gen cards anyway.

In the case of FF15 it is using the VRAM. Usage goes up gradually as it loads more textures in, and it will even overflow into normal RAM.
 
We were discussing AMD before you decided that Nvidia could easily double the VRAM on a 3080, while ignoring competitiveness.



No need, that's just common sense. There is no way Nvidia would hand out more costly chips without charging for them. Nvidia is much happier using such chips on the 3090.

The only common sense is that Nvidia would sell them at a price that stopped AMD gaining a significant price advantage, without incurring unsustainable short term losses.

We don't know how much those memory chips cost Nvidia - they might be a lot cheaper than we all think.
People seem to assume that Nvidia would be making a loss if they kept the prices the same - we don't have enough information to assume this; they may still be able to turn a profit for all we know.
Or people assume that Nvidia would never reduce their profit margins - Nvidia would if they had to (this should be common sense ;))

If you have somehow managed to get cost figures for the cards, feel free to share them with the rest of us.


Yes. This is what companies do to sell hardware. This is what Nvidia did with Gameworks for years.

And this folks is a good example of someone reading what they want and not what you wrote.

You have your head in the sand if you think AMD didn't know how much VRAM the 3080 was launching with, at least 4 months before launch. Likewise Nvidia would also have known how much VRAM it's main competitor, the 6800XT, would launch with.

April 2020
Lisa Su: "Hey Jensen, what are the specs for your upcoming 3000 series cards?"
Jensen: "Give me a minute, I'll email them to you now. While I'm at it, I'll send you the technical presentation as well. Don't worry, these specs are fixed and they will not be changing in the future. Just a heads up, we are going to have pretty **** stock at launch."
Lisa Su: "Thanks Jensen, looks like you've made some nice gains with this generation. Don't worry, our launch stock is going to be **** as well; gamers are in for a surprise this winter. Just a heads up, we have a screamer of a card. We are right on your tail."
Jensen: "Thanks for letting me know, I'll adjust our prices."

I know that companies hear things through the grapevine about what their competitors are doing, but guess what, those are still rumours. Unless they engage in really shady practices or someone breaks their NDA, they won't have concrete information. Those plans could easily change, especially for something like VRAM, which, compared to the rest of a GPU, is an easy hardware change.
VRAM chips are so easy to change that some random guy on YouTube changed the VRAM on their graphics card at home. This isn't a GPU die feature that would take literally years to change. We waited all of about 3 weeks before hearing rumours of 3080 versions with more VRAM; that is how simple a hardware change it is. AMD's master plan to thwart Nvidia on VRAM could have been undone in the space of 6 months, and that ignores that the 3090 exists, since according to you they would also have known it exists and exactly how much VRAM it has.

So your proposal is that AMD spent thousands, maybe tens of thousands, of dollars changing VRAM requirements and testing those changes in a video game (let's not forget the game was probably in its crunch period at the time) because they heard a rumour about a GPU spec. A spec that Nvidia could have changed, and were planning to change, within the next few months. While also ignoring that a 24GB 3090 exists, which would not be affected by the limitation.

That is a moronic plan.


Edit: The reason Gameworks works is because AMD has to spend a decent amount of time and money remedying the issue, and they are not even guaranteed a complete/proper solution using drivers alone. Any remedy they do find isn't going to bring the benchmarks back to the status quo; they will still be lagging.

That is how you properly sabotage a competitor, not your half-arsed scheme.
 
So your proposal is that AMD spent thousands, maybe tens of thousands, of dollars changing VRAM requirements and testing those changes in a video game (let's not forget the game was probably in its crunch period at the time) because they heard a rumour about a GPU spec. A spec that Nvidia could have changed, and were planning to change, within the next few months. While also ignoring that a 24GB 3090 exists, which would not be affected by the limitation.

That is a moronic plan.

We are discussing the mid-to-high-end card, the 3080. Are you really putting forward that AMD would be wasting their time leveraging their only benefit, the amount of VRAM, because people would simply buy a card that costs twice as much, the 3090?

I'm sure I already asked what comparable games also use ~12GB?

Just out of interest, what sort of figure would you put on sponsorship? Do you think tens of thousands is a lot when promoting new hardware such as RDNA2?

Edit: The reason Gameworks works is because AMD has to spend a decent amount of time and money remedying the issue, and they are not even guaranteed a complete/proper solution using drivers alone. Any remedy they do find isn't going to bring the benchmarks back to the status quo; they will still be lagging.

That is how you properly sabotage a competitor,
not your half-arsed scheme.

And how do you see Nvidia remedying the issue of Godfall using 12GB? :rolleyes:

Maybe stop with the moronic, half-arsed notion that AMD is beyond dirty tricks :D
 
Maybe you should learn how to read. It would stop you asking questions that have already been answered and coming up with statements like this.

161 pages so far. It's evident that I missed the answer to which game comparable to Godfall requires 12GB of VRAM. Please provide the link :D
 
 
@LtMatt you've got ACV; as it's also an AMD title, does it use 12GB at 4K?
I've only played the benchmark so far Lol. :p

I can give it a go though and see what the video memory usage is. Is there any particular level or area to test, or can I just start from the beginning?

I've never played these games before so have no idea what to do.
 
That awkward moment when the 3060 has more VRAM than the 3080.

It seems the lack of VRAM on the 3080 was so it could hit a price point, which is stupid given current pricing, but when it comes to the 3060, 6GB of VRAM isn't enough but 12GB is. :confused:

I don't know why, but years ago you had a choice of VRAM on the same model; that choice was taken away from the consumer.
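The 3060's 6GB/12GB split falls out of the same bus arithmetic as the 3080's 10GB/20GB one, assuming one 32-bit GDDR6 chip per channel and the 1GB/2GB densities available at the time:

```python
# A 192-bit bus is six 32-bit channels, one GDDR6 chip each,
# so with 1GB or 2GB chip densities the only options are 6GB or 12GB.
chips = 192 // 32  # 6 chips
print([chips * density_gb for density_gb in (1, 2)])  # [6, 12]
```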
 