Is 16GB of VRAM The Standard For High End Graphics Cards?

Yes, reading comprehension. Keep up.
Remember, my friend: context.
You keep pushing this nonsense without realising you just come off as a complete troll, as you don't articulate anything.

I am assuming you don't mean this, but it comes across that way.

Subjective use case, etc. There is no truth to what you are saying based on this.

Have you seen the TPU database? They have VRAM amounts for the RTX 4000 series.

I will be moving to the 4000 series because the 3070's GPU core at 170Hz is pretty lackluster.

VRAM is larger on the 4000 series cards. This is before you start saying we all jumped because of "VRAM" too.
 
I don't need to say anything about your posts; as the saying goes, give someone enough rope..

If you digested what I posted (which you don't), I said many enthusiasts will move on before it is a thing. You are just confirming it, so if anyone is 'trolling' it would be you.

Ironically, some of these folk then sell their flagship product to pocket the cash. Some of these folk then buy a card with more VRAM (see willhub above), or they get a different one which won't teeter on the issue (see TNA). I guess it doesn't really matter as most enthusiasts on here will be shuffling soon, but to stick to the OP topic: to some it doesn't matter, and others have either avoided it by not buying, or they have sold on the regular 3080, so it won't be a problem to worry about!

Luckily for some, the 40 series is nearly here.
 
So why are you posting memes, which undercuts your original plan?

Apparently you know it is not an issue, yet you create memes as if it is an issue.

Confused?
 
Any card with an MSRP over £600 should have 12GB of memory as a bare minimum. I have a 3080 Ti, play at 3840x1600 ultrawide, and have seen some games bumping up against 11GB, and that's with games out right now.
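If anyone wants to see for themselves how close a game gets to the card's limit, here is a minimal sketch that polls VRAM allocation through NVIDIA's NVML bindings (the pynvml package; it assumes a single NVIDIA GPU at index 0 and a recent driver). Bear in mind it reports allocation, which isn't necessarily what the game strictly needs.

# Minimal VRAM monitor sketch using NVIDIA's NVML bindings (pynvml).
# Assumes one NVIDIA GPU at index 0; run it alongside a game to watch
# how close allocation gets to the card's limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = info.used / 1024**3
        total_gb = info.total / 1024**3
        print(f"VRAM in use: {used_gb:.1f} / {total_gb:.1f} GB")
        time.sleep(5)  # poll every 5 seconds
except KeyboardInterrupt:
    pynvml.nvmlShutdown()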
 
The discussion has always been about pushing the card to see how good it is and whether there are any weaknesses; you seem to have a lot to say, yet fail to pick up on the simplest of points. I can't recall anyone saying the 3080 was a terrible card, in particular at 1440p. That is where DLSS comes in to bail out most cards. You could easily argue that at that resolution the 3070 and 3070 Ti are the target SKUs marketed by Nvidia. So what we have, even by your own admission, is dGPUs needing a bail-out for fps when you're cranking up ray tracing settings, from what you term horsepower. However, there are occasions, ray tracing aside, where high textures are used and horsepower is not the focal point but the lack of VRAM becomes the issue. Whether this is deliberate, as you put it, or simply an oversight from the engineers remains to be seen, but we do have other cards now available with more VRAM which likely do not suffer from this (3080 Ti, 3080 12GB - funny how these came to be released if there was no VRAM issue as you and others claim - maybe just gullible punters then!).

So the statement could also read "I know my 3080 could do with some more VRAM when using these large textures", but that would be too much of an admission, so we have the ever-decreasing circles of this denial.

We had a 25% increase in VRAM for Nvidia's original 3080, meaning it launched with 10GB. We also saw Sony and Microsoft launch consoles with a TOTAL of 16GB. Microsoft made the decision to separate their system 10/6GB, meaning 10GB for VRAM, leaving 6GB for OS, housekeeping and games. It's true Sony decided to keep the full 16GB unified, but they don't have any secret sauce to use less memory for OS, housekeeping and games, as it is still the same x86 architecture.

Now we have the plebs, who have barely managed to operate a plug, declaring these three behemoths don't know what they are talking about?

The name gpullible is still available for your next name change.
 

:cry:
 
Microsoft made the decision to separate their system 10/6GB, meaning 10GB for VRAM, leaving 6GB for OS
The bolded part is wrong; the OS has 2.5GB to play with.
P.S. The 10GB is referred to as GPU-optimal. That doesn't mean the GPU is confined to that 10GB.
P.P.S. If GPUs run out of grunt before they run out of VRAM, does that mean the 3080 is only as powerful as an Xbox Series X?
[troll face image]
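Putting the numbers from these two posts together, a quick back-of-the-envelope sketch of the Series X split as described above (10GB GPU-optimal, 2.5GB for the OS per the correction, the remainder as standard memory). The figures are the thread's, not official documentation.

# Sanity check of the Series X memory split as described in the posts above:
# 16GB total, 10GB "GPU optimal", ~2.5GB reserved for the OS, the rest as
# standard memory for game data. Numbers are from the thread, not official docs.
TOTAL_GB = 16.0
GPU_OPTIMAL_GB = 10.0   # faster pool, intended (but not strictly reserved) for GPU use
OS_RESERVED_GB = 2.5    # per the correction above, not 6GB

game_standard_gb = TOTAL_GB - GPU_OPTIMAL_GB - OS_RESERVED_GB
game_total_gb = TOTAL_GB - OS_RESERVED_GB

print(f"Standard memory left for game data: {game_standard_gb:.1f} GB")  # 3.5 GB
print(f"Total memory a game can touch:      {game_total_gb:.1f} GB")     # 13.5 GB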
 

Keeping to the spirit of your post, anyone putting more than 6GB of RAM in their PC should be given a tube of lube and handed over to our resident knitting circle :D
 
I always felt the 3080 should have launched with at least 11GB to match the 2080Ti. Basically the current 12GB model should have been the launch model.

As someone who did the EVGA step-up from a 2080Ti to a 3080 10GB and had to wait a year and a half, I'm not complaining. My 3080 performs better than my 2080Ti, and that is what counts.

The difference in some games between high and ultra is barely noticeable and, let's face it, if in five years' time you have to turn down some graphics settings on something like a 3080, it's had a good long life. Most of this arguing is purely academic at 10GB.

Below 10GB I think there is maybe slightly more debate for the next year or two, but again, just turn down some settings if you have to.
 
I'm sure I read a similar discussion regarding 1GB > 2GB, and 2GB > 4GB ;) I'd look at it in reverse: what resolution and settings do you want to run at? Then pick a card to suit. If that's too expensive, turn a few settings down. Maybe we need a flow chart pinned as a sticky? 10GB isn't bad though when you're on a 5830 with 2GB :cry:
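Half-jokingly, that flow chart could be a few lines of code. A toy sketch of the "pick resolution and settings first, then the card" idea; the VRAM-per-resolution numbers are made up purely for illustration.

# Toy version of the "flow chart": decide resolution/settings first, then see
# what fits the budget. The VRAM guidance per resolution is illustrative only.
def suggest(resolution: str, budget_gbp: int) -> str:
    rough_vram_need = {"1080p": 6, "1440p": 8, "3440x1440": 10, "4K": 10}  # made-up guide
    need = rough_vram_need.get(resolution, 8)
    if budget_gbp < 400:
        return f"Aim for a card with ~{need}GB and plan to turn a few settings down."
    return f"Aim for a card with at least {need}GB at your budget of £{budget_gbp}."

print(suggest("1440p", 350))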
 
The only triggered people seem to be the ones continually squealing about a card they've never owned. I can't understand why some are so invested in the hate for it; the bitterness is legendary. I wonder why it matters so much?

The only reason folk must be so bitter is that they had to pay 100% more for 10-15% more performance over a 10GB 3080 in 99% of games because they couldn't get a 3080. In fact, on release in Sept 2020, when many made their decisions and purchases, the graphs showed only 9% more. https://www.guru3d.com/articles_pages/geforce_rtx_3090_founder_review,24.html
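As a rough check on that value argument, a small sketch of cost per unit of performance, using the 9-15% gap quoted above and assuming UK Founders Edition launch prices of about £649 (3080) and £1,399 (3090); the prices are my assumption, not figures taken from the linked review.

# Rough value comparison at launch, using the ~9-15% performance gap quoted above
# and assumed UK FE launch prices (roughly £649 vs £1,399; my figures, not the thread's).
PRICE_3080, PRICE_3090 = 649, 1399
for gap in (0.09, 0.15):  # 3090 faster by 9% or 15%
    perf_3080, perf_3090 = 1.0, 1.0 + gap
    extra_cost = (PRICE_3090 / PRICE_3080 - 1) * 100
    cost_per_perf_3080 = PRICE_3080 / perf_3080
    cost_per_perf_3090 = PRICE_3090 / perf_3090
    print(f"+{gap:.0%} perf for +{extra_cost:.0f}% cost: "
          f"£{cost_per_perf_3080:.0f} vs £{cost_per_perf_3090:.0f} per unit of performance")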

I guess some have to find some way of trying to knock the 10GB 3080 off its perch as the best bang-for-buck card this gen, out of jealousy towards those that got in early at launch or managed to snag an FE. Those that started off thinking £100-200 over the odds was a con then got FOMO, ending up conditioning themselves into paying way more for a bit of extra performance (at the time). FC6 came out a year later than the 3080 & 3090, and plenty of people bought in that time. But the silliness goes on over the fact that one game, a year later, has issues with a card. One game - and guess what, a newly released game. AAA titles always come out polished :rolleyes:

I think it's safe to say that no 3080 owner who paid a sane amount in the fall of 2020 will be annoyed at how much they have left in their pocket for next gen, or at the amount of time they have enjoyed gaming on it. On release I considered the 3090 and waited for the 6900XT reviews to arrive, and sense dictated that 10% more performance isn't worth 100+% more cost.

Dunno why some have got so upset about cards they don't own that sit a tier or so down (a small tier at that). The only reason must be some need to justify spending double or more, due to unavailability/shortage/scalping of the 3080, on cards that offer little extra in 95% of games, or 99% when you consider all resolutions.

Sounds like many have already conditioned themselves to £1k+ cards now being high end. A lose-lose for all of us, as putting the 3080 down so hard and saying it's no good is just giving the green light to AMD and Nvidia to reset the tiers and associated costs. Though Nvidia lists high end, then enthusiast above that.

Interesting proposal on VRAM being the decider of high end. What about cost? What about performance? Seeing as the 10GB 3080 was 90% of the 3090 on release, is it still mid-range? Is high end reserved for the three cards in the last 10%? Or enthusiast for the final 3%?

Either way, prepare for graphics cards being north of £1k permanently. The 3080 FE should have been the benchmark for where pricing is set. We've all lost out next gen.
 