10GB vram enough for the 3080? Discuss..

That was my post so I'll reply:

I'm running a Reverb G1. So a little over 4k @90FPS.
I see, so your situation with high-fps VR gaming is likely more demanding than most people's, hence why you need to reduce settings. I think in terms of GPU horsepower, 4k60fps or higher is doable in the vast majority of games fully maxed out on a 3080/3090.
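
A rough back-of-envelope makes the point - a quick sketch, assuming the commonly quoted Reverb G1 panel resolution of 2160x2160 per eye:

```python
# Pixel throughput: Reverb G1 VR at 90fps vs a 4k monitor at 60fps.
# Panel figures are the commonly quoted specs, not measured values.
reverb_px = 2160 * 2160 * 2    # both eyes: ~9.3M pixels per frame
uhd_px = 3840 * 2160           # 4k monitor: ~8.3M pixels per frame

reverb_rate = reverb_px * 90   # pixels shaded per second at 90fps
uhd_rate = uhd_px * 60         # pixels shaded per second at 60fps

print(f"Reverb G1 @ 90fps: {reverb_rate / 1e6:.0f} Mpx/s")  # ~840 Mpx/s
print(f"4k monitor @ 60fps: {uhd_rate / 1e6:.0f} Mpx/s")    # ~498 Mpx/s
print(f"ratio: {reverb_rate / uhd_rate:.2f}x")              # ~1.69x
```

So that target is roughly 1.7x the shading work of 4k60 before any supersampling, which VR headsets usually want on top.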
 
I see, so your situation with high-fps VR gaming is likely more demanding than most people's, hence why you need to reduce settings. I think in terms of GPU horsepower, 4k60fps or higher is doable in the vast majority of games fully maxed out on a 3080/3090.

Maybe that's the case now. But my situation illustrates that it's possible to burden the GPU itself to the point that settings must be reduced. The vram-preppers seem to imply that *only* vram buffers will be overburdened in future games.

There are a lot of people who don't need more vram or GPU horsepower *right now*. However, future games will require more of *both* I'm sure.
 
However, future games will require more of *both* I'm sure.

Why?

The 3080 and 6800XT are "budget" cards when it comes to 4K so you should expect a performance knock. If you want max textures and decent FPS at 4K, get the 3090 or 6900XT. I also get what people are drawing from the Godfall stats, but we should maybe focus on games that people will actually play :D
 
Maybe that's the case now. But my situation illustrates that it's possible to burden the GPU itself to the point that settings must be reduced. The vram-preppers seem to imply that *only* vram buffers will be overburdened in future games.

There are a lot of people who don't need more vram or GPU horsepower *right now*. However, future games will require more of *both* I'm sure.
Is that what people said or is that your interpretation of what people said?
 
Is that what people said or is that your interpretation of what people said?

Well I said "imply", and there is an entire thread focused on running out of vram and needing to *gasp* reduce settings.

I have not seen many people in this thread talking about the need to turn down settings for a lack of GPU horsepower, much less an entire thread dedicated to the issue. So, yes, the *implication* is that vram is the reason people will need to reduce settings in future games.
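
For anyone who would rather measure than argue: here's a minimal monitoring sketch using the pynvml bindings (from the nvidia-ml-py package) - assuming an NVIDIA card; run it while you play. If GPU load sits near 100% while used vram is well under the total, you are out of horsepower, not vram - and vice versa.

```python
# Minimal vram/GPU-load monitor using NVIDIA's NVML bindings
# (pip install nvidia-ml-py). Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percent
        print(f"GPU {util.gpu:3d}% | VRAM {mem.used / 2**30:5.2f} / "
              f"{mem.total / 2**30:5.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

One caveat: NVML reports memory that has been allocated across all processes, which is not the same as memory a game strictly needs - engines will happily cache assets into whatever buffer they are given.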
 
Why?

The 3080 and 6800XT are "budget" cards when it comes to 4K so you should expect a performance knock. If you want max textures and decent FPS at 4K, get the 3090 or 6900XT. I also get what people are drawing from the Godfall stats, but we should maybe focus on games that people will actually play :D

*Every* card will lack what it takes to max out games at some point. It doesn't matter what hallowed "tier" you buy into.
 
*Every* card will lack what it takes to max out games at some point. It doesn't matter what hallowed "tier" you buy into.

Too many people want a Ferrari but are only willing to pay VW Polo money (assuming MSRP). Cards are built to fulfil the needs of specific market segments, so I expect them to struggle as you push the card outside those limits.

You also can't assume a linear trend as there are several software-level improvements in the pipeline that'll have a direct impact on VRAM usage and general horsepower. How this affects current cards is pretty massive in terms of longevity - Nvidia will likely benefit the most as they have a more mature platform.
 
Why?

The 3080 and 6800XT are "budget" cards when it comes to 4K so you should expect a performance knock. If you want max textures and decent FPS at 4K, get the 3090 or 6900XT. I also get what people are drawing from the Godfall stats, but we should maybe focus on games that people will actually play :D

The 3080 is perfectly capable at 4k; paying extra for 6-14GB of extra VRAM is not worth it.
 
Too many people want a Ferrari but are only willing to pay VW Polo money (assuming MSRP). Cards are built to fulfil the needs of specific market segments, so I expect them to struggle as you push the card outside those limits.

You also can't assume a linear trend as there are several software-level improvements in the pipeline that'll have a direct impact on VRAM usage and general horsepower. How this affects current cards is pretty massive in terms of longevity - Nvidia will likely benefit the most as they have a more mature platform.

Your comparison uses two products with a far more meaningful/noticeable performance gap between them than a 3080 and a 3090.

A more accurate comparison would be a Ferrari and a Porsche.
 
Well I said "imply", and there is an entire thread focused on running out of vram and needing to *gasp* reduce settings.

I have not seen many people in this thread talking about the need to turn down settings for a lack of GPU horsepower, much less an entire thread dedicated to the issue. So, yes, the *implication* is that vram is the reason people will need to reduce settings in future games.
That is a lot of words just to say it was your interpretation of what others have said.

PS: maybe you should start your own thread about gpu horsepower since you know this thread is about vram and all.
 
The 3080 is perfectly capable at 4k; paying extra for 6-14GB of extra VRAM is not worth it.

I agree, but it's also the minimum if you don't want to drop settings. If I was running a 4K monitor instead of 1440p I probably still would have gone with the 3080, but that's largely because I don't see the value in paying over 1k for a card when it'll probably be dethroned in a year or two by a cheaper card - obviously if you have the money, spend it.

It'll likely only get better as DLSS improves.
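
For a feel of why DLSS changes the horsepower equation - a quick sketch using the per-axis scale factors commonly published for each DLSS mode (treat the exact figures as approximate):

```python
# Internal render resolution per DLSS mode at 4k output, using the
# commonly published per-axis scale factors (approximate).
modes = {
    "Quality": 0.667,
    "Balanced": 0.580,
    "Performance": 0.500,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160
native_px = out_w * out_h

for mode, scale in modes.items():
    w, h = int(out_w * scale), int(out_h * scale)
    print(f"{mode:18s} {w}x{h}  ({w * h / native_px:.0%} of native pixels)")
```

Shading a quarter to half of the native pixels is why a DLSS-capable card punches above its raw horsepower at 4k, and the smaller internal render targets can trim vram use a little too.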

Your comparison uses two products with a far more meaningful/noticeable performance gap between them than a 3080 and a 3090.

A more accurate comparison would be a Ferrari and a Porsche.

I guess, if one Ferrari/Porsche is 20-30% faster than another Ferrari/Porsche and the price difference is 115%. Regardless, a lot of these arguments and opinions are stupid because someone sees a bigger number and suddenly becomes a game engineer.
 
That is a lot of words just to say it was your interpretation of what others have said.

This will be the second time I direct your attention to my use of the word "implied". (Or at least the second time I have tried to do so.)

im·plied
/imˈplīd/

adjective
  1. suggested but not directly expressed; implicit.
 
Maybe that's the case now. But my situation illustrates that it's possible to burden the GPU itself to the point that settings must be reduced.
If you are shooting for 90fps at 4k in a demanding game then yes, of course, your situation illustrates that you are setting yourself a high bar for which any top-end GPU will struggle. By that logic I could say I also 'ran out of GPU horsepower' because I was shooting for 120fps at 4k in X game. You are applying a very subjective view of what constitutes reasonable GPU horsepower and then using it as the basis for an argument about why VRAM isn't the only factor... when no-one said it was. In fact, people have mentioned GPU horsepower continually throughout the thread.

The fact is all game engines are different and people have different definitions of what is reasonable based on how and what they play. I personally think that 4k60fps with all details maxed is a reasonable benchmark of GPU horsepower for any current-generation GPU, and it is definitely possible that games from this generation will exceed 10GB while providing that benchmark of performance.
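
To put some rough numbers behind that - a sketch of where the memory actually goes at 4k, with all figures illustrative rather than taken from any particular engine:

```python
# Rough illustration of where vram goes at 4k (figures are illustrative,
# not taken from any real engine). Screen-sized buffers are small;
# the texture/asset pool is what balloons with 'max' quality settings.
MiB = 2**20
px_4k = 3840 * 2160

backbuffer = px_4k * 4 / MiB   # RGBA8, 4 bytes/pixel -> ~32 MiB
depth = px_4k * 4 / MiB        # 32-bit depth buffer  -> ~32 MiB
gbuffer = 5 * px_4k * 8 / MiB  # five FP16 targets    -> ~316 MiB

print(f"Screen-sized buffers: ~{backbuffer + depth + gbuffer:.0f} MiB")
# Even a fat deferred pipeline's per-pixel buffers total a few hundred MiB.
# A 'max textures' asset pool, by contrast, is routinely measured in GiB,
# and that part scales with the quality slider, not the output resolution.
```

Which is why the "10GB enough?" question really hangs on how large this generation's asset pools get, not on 4k output itself.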
 
This will be the second time I direct your attention to my use of the word "implied". (Or at least the second time I have tried to do so.)

im·plied
/imˈplīd/

adjective
  1. suggested but not directly expressed; implicit.
In your OPINION people implied something.

My statement still stands: it is your interpretation of what people wrote.


In my opinion, in reference to the first post I quoted and bolded, nobody has implied that VRAM is the only limiting factor in games. You're just reading what you want and selectively remembering what has happened.

As a side note I am also pretty sure that the argument as to why people are not focused on GPU horsepower being a limiting factor in future has already been addressed somewhere in this thread. Go grab some popcorn and get digging.

I will give you a clue to the conclusion of that discussion. It has something to do with bottlenecks. I'll leave you to work out the rest.
 
If you are shooting for 90fps at 4k in a demanding game then yes, of course, your situation illustrates that you are setting yourself a high bar for which any top-end GPU will struggle. By that logic I could say I also 'ran out of GPU horsepower' because I was shooting for 120fps at 4k in X game. You are applying a very subjective view of what constitutes reasonable GPU horsepower and then using it as the basis for an argument about why VRAM isn't the only factor... when no-one said it was. In fact, people have mentioned GPU horsepower continually throughout the thread.

The fact is all game engines are different and people have different definitions of what is reasonable based on how and what they play. I personally think that 4k60fps with all details maxed is a reasonable benchmark of GPU horsepower for any current-generation GPU, and it is definitely possible that games from this generation will exceed 10GB while providing that benchmark of performance.

I'm sure a game can be coded in such a way that it blows out a vram buffer. But then games can also be coded in a way that overburdens the GPU itself.

And, if past history is any indication, they will. You may think my use case is extreme now, but you can't argue that 4k 90fps will still be seen as "extreme" in the (near) future.

Software, hardware, and expectations increase with time.

The GPU horsepower I need to max out my sims in VR doesn't exist yet. It will someday, but right now it's just not there. Vram isn't limiting me, the GPU is. My "expectation" from manufacturers is to give me more performance at a given price point from one generation to the next. Ampere has done that well. I'm happy with my card.

I think expecting hardware to stay ahead of software is particularly unreasonable. There's a reason Crysis became a meme. (I guess MSFS is taking its place nowadays.)

I don't think expecting *anything* to max out MSFS at 4k 60fps is reasonable right now.
 
Nobody has implied that VRAM is the only limiting factor in games. You're just reading what you want and selectively remembering what has happened.

Perhaps not, but VRAM limitations are certainly at the forefront of the discussions when it comes to limiting factors on these GPUs. We've had people questioning the logic of dropping 700 quid on a GPU when the RAM might limit your ability to run everything maxed out, when there are games out there now where the 3080 already can't maintain 4k60. The latter seems to be continually brushed aside like it's fine to be slow.... just not slow because of the VRAM buffer :o
 
Perhaps not, but VRAM limitations are certainly at the forefront of the discussions when it comes to limiting factors on these GPUs. We've had people questioning the logic of dropping 700 quid on a GPU when the RAM might limit your ability to run everything maxed out, when there are games out there now where the 3080 already can't maintain 4k60. The latter seems to be continually brushed aside like it's fine to be slow.... just not slow because of the VRAM buffer :o

I would rather dump £650 on a 10 gig GPU than £800 on a 20 gig one.

However, if the 3080 Ti appears for, say, £700, it will be a hmmm moment.
 
I also get what people are drawing from the Godfall stats, but we should maybe focus on games that people will actually play :D

Perhaps not, but VRAM limitations are certainly at the forefront of the discussions when it comes to limiting factors on these GPUs. We've had people questioning the logic of dropping 700 quid on a GPU when the RAM might limit your ability to run everything maxed out, when there are games out there now where the 3080 already can't maintain 4k60. The latter seems to be continually brushed aside like it's fine to be slow.... just not slow because of the VRAM buffer :o

+1
 
I'm sure a game can be coded in such a way that it blows out a vram buffer. But then games can also be coded in a way that overburdens the GPU itself.

This is just stating the obvious. Games and engines are coded differently, and have been for decades.

And, if past history is any indication, they will. You may think my use case is extreme now, but you can't argue that 4k 90fps will still be seen as "extreme" in the (near) future.

Software, hardware, and expectations increase with time.

The GPU horsepower I need to max out my sims in VR doesn't exist yet. It will someday, but right now it's just not there. Vram isn't limiting me, the GPU is. My "expectation" from manufacturers is to give me more performance at a given price point from one generation to the next. Ampere has done that well. I'm happy with my card.

I think expecting hardware to stay ahead of software is particularly unreasonable. There's a reason Crysis became a meme. (I guess MSFS is taking its place nowadays.)

I don't think expecting *anything* to max out MSFS at 4k 60fps is reasonable right now.
My GPU easily maxes out everything I have tried so far at 4k60 (my 4k monitor is limited to this)... it is perfectly reasonable to expect this in the majority of games on both a 3080 and 3090, at least without full-quality RT enabled (where DLSS isn't an option).
 