10GB VRAM enough for the 3080? Discuss..

Crysis Remastered minimum system requirements
OS: Windows 10 64-bit
Processor: Intel Core i5-3450 / AMD Ryzen 3
Memory: 8GB
Storage: 20GB
DirectX: DX11
GPU: Nvidia GeForce GTX 1050 Ti / AMD Radeon 470
GPU memory: 4GB in 1080p


Crysis Remastered recommended system requirements
OS: Windows 10 64-bit
Processor: Intel Core i5-7600k or higher / AMD Ryzen 5 or higher
Memory: 12GB
Storage: 20GB
DirectX: DX11
GPU: Nvidia GeForce GTX 1660 Ti / AMD Radeon Vega 56
GPU memory: 8GB in 1080p

 
16GB cards would mean a 512-bit bus and a chunk of extra expense. It can be done, of course, but it's rare and the last time Nvidia went that wide was 12 years ago with the GTX 280... 384-bit has been the limit since then for traditional VRAM. So, I don't know, but going by past releases a 512-bit card from Nvidia is unlikely, so we might be stuck with 10/20GB and 12/24GB cards for a while. Well, unless Micron decide to release some oddball-sized modules.
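Quick sketch of the arithmetic behind that, assuming one GDDR6X chip per 32-bit slice of the bus and the 1GB/2GB chip densities on offer (a back-of-envelope illustration, nothing official):

```python
# Back-of-envelope: possible VRAM capacities for a given bus width,
# assuming each GDDR6X chip occupies a 32-bit slice of the bus and
# comes in 1GB or 2GB densities. Clamshell mounting, which doubles
# the chip count on the same bus, is ignored here.

def capacity_options_gb(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit slice
    return [chips * d for d in densities_gb]

for bus in (320, 384, 512):
    print(f"{bus}-bit bus -> {capacity_options_gb(bus)} GB")
# 320-bit -> [10, 20]  (3080-style)
# 384-bit -> [12, 24]
# 512-bit -> [16, 32]  (hence 16GB pointing at a 512-bit bus)
```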
 
It does nothing of the sort; all it shows is different frame rates with and without DLSS.

Get back to me when you can actually show the VRAM usage figures in those scenes. Until then you're just blowing hot air.
Haaaa haaa!
I can tell based on the tone of your response that you know there is a VRAM bottleneck in that Fortnite press doc.

But it's not about you though. It's a community post. But don't worry, DLSS to the rescue. :D

But why would I buy a new flagship card if it's hamstrung without DLSS :eek:

But wait, that press doc release shows that it's still slower than a console doing it at 4K 30 frames per second. Didn't you mock consoles for that? Yeah, yeah you did. Oh the irony!!
 
There are 2 scenarios, and in both you should buy an RTX 3080 with 10GB VRAM:

1) 10GB is enough for 4K for the next 4 years, so you buy the RTX 3080 and keep it.

2) 10GB is not enough in 2 years, so you buy the RTX 3080 now and then the RTX 4080 in 2 years. The total cost of £1,400 will be no more than the 3090, which is certain to be worse than the future 4080. If 10GB really is not enough, the 4080 will come with more VRAM when it launches (unless 10GB is enough, in which case see option 1). :cool:

The only real question is whether you buy now or wait for prices to drop once Big Navi launches. Personally I am waiting to see the best card for Cyberpunk with ray tracing on High.
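Back-of-envelope on option 2, taking the 3080's £649 UK launch price and an assumed (entirely made-up) ~£750 for a future 4080:

```python
# Back-of-envelope for option 2. The 3080 and 3090 figures are UK
# launch prices; the 4080 figure is an assumption for illustration.
rtx_3080_now   = 649   # GBP, launch price
rtx_4080_later = 750   # GBP, ASSUMED future price
rtx_3090_now   = 1399  # GBP, launch price

two_step = rtx_3080_now + rtx_4080_later
print(f"3080 now + 4080 later: £{two_step}")      # £1399
print(f"3090 now:              £{rtx_3090_now}")  # £1399
# Roughly the same outlay, but the two-step path ends on a
# newer-generation card (with more VRAM, if 10GB turns out short).
```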
 
There are 2 scenarios, and in both you should buy an RTX 3080 with 10GB VRAM:

1) 10GB is enough for 4K for the next 4 years, so you buy the RTX 3080 and keep it.

2) 10GB is not enough in 2 years, so you buy the RTX 3080 now and then the RTX 4080 in 2 years. The total cost of £1,400 will be no more than the 3090, which is certain to be worse than the future 4080. If 10GB really is not enough, the 4080 will come with more VRAM when it launches (unless 10GB is enough, in which case see option 1). :cool:

The only real question is whether you buy now or wait for prices to drop once Big Navi launches. Personally I am waiting to see the best card for Cyberpunk with ray tracing on High.
You forgot about Option 3

3) You don't feel like 10GB is enough, so you wait and see if they release a 20GB 3080.
 
Those are wrong.

Go to the Epic store and look at the requirements.

Graphics Memory
8GB Graphics Memory in 4K

That's the issue with the internet. A lot of misinformation. Always go to the source for your info.

They changed them in the meantime, that's what it was. :P

Granted, these are still the figures for the non-souped-up version (i.e. before RTX, the 8K texture pack, etc.), so we'll see what happens. It comes out just a day or two after the 3080 as well.
 
Haaaa haaa!
I can tell based on the tone of your response that you know there is a VRAM bottleneck in that Fortnite press doc.

But it's not about you though. It's a community post. But don't worry, DLSS to the rescue. :D

But why would I buy a new flagship card if it's hamstrung without DLSS :eek:

But wait, that press doc release shows that it's still slower than a console doing it at 4K 30 frames per second. Didn't you mock consoles for that? Yeah, yeah you did. Oh the irony!!

Nope, I've not mocked consoles for anything. Once again, you're spouting off without any facts or evidence on your side, just a screenshot and a whole lot of hot air.

I'm not suggesting you buy a new card. I'm making no claims about DLSS and whether it's good or bad. I'm just stating the obvious - a screenshot with a couple of different framerates on it tells us precisely nothing at all about VRAM.
 
I am using a 240Hz 1080p G-Sync monitor. My frames don't hit 144 in a lot of games with my GTX 1070 and older processor.

I will be purchasing a 3080 Ti if they are released. A 3090 is stupid for 1080p, but I don't want to have to upgrade for 7 years. I have had my GTX 1070 for quite a few years.
 
Nope, I've not mocked consoles for anything. Once again, you're spouting off without any facts or evidence on your side, just a screenshot and a whole lot of hot air.

I'm not suggesting you buy a new card. I'm making no claims about DLSS and whether it's good or bad. I'm just stating the obvious - a screenshot with a couple of different framerates on it tells us precisely nothing at all about VRAM.

Feigning ignorance only works if the person you reply to isn't aware of your tactic. In your whole scheme no details about the games are presented. However, that doesn't deprive one of critical thinking.

Which is nothing more than:
the objective analysis and evaluation of an issue in order to form a judgment.
We know that the 3080 is hamstrung in Fortnite without DLSS. Which is why you avoided saying anything when I brought it up to you.

We also know that a reduction in resolution does reduce the VRAM footprint required to run a game. Again you avoid this. We also know that this flagship card has less memory than the 2080 Ti and 1080 Ti. Your omission just tells me you concede, so thank you for that. :D
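To put rough numbers on the resolution point, here's a sketch of how basic render-target memory scales with output resolution (the buffer formats are assumptions for illustration; texture pools, which usually dominate VRAM, don't scale with resolution):

```python
# How render-target memory scales with output resolution. Buffer
# formats are illustrative assumptions; texture pools, which usually
# dominate VRAM, are resolution-independent and so total usage falls
# more slowly than these numbers suggest.

def target_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    hdr   = target_mb(w, h, 8)   # assumed RGBA16F colour target
    depth = target_mb(w, h, 4)   # assumed 32-bit depth buffer
    gbuf  = target_mb(w, h, 16)  # assumed ~4 RGBA8 G-buffer planes
    print(f"{name}: ~{hdr + depth + gbuf:.0f} MB of render targets")
# 1080p: ~55 MB, 1440p: ~98 MB, 4K: ~221 MB
```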

Either way, I am sure you will double down. That's the only real rebuttal you have, aside from avoiding the information presented to you.

So let's recap, shall we:
Asus used a lower-tier SKU brand (TUF) with a 30MHz or so overclock for Ampere. Which optically looks bad for Ampere.

A user posted a video showing that an overclocked 2080 Ti is about 10% slower than a 3080, which I assume was boosting. However, we don't know how well it will overclock, as it's suggested that it hovers at 80°C.

Nvidia's own press doc shows a staggering 28fps in Fortnite, which is only alleviated by DLSS at roughly 73fps. Due, in part, to a VRAM bottleneck at 4K in an open-world game using ray tracing. That is a far cry from claiming superiority over next-gen consoles at 4K/30fps, and it throws doubt on how strong this flagship card is if it's hamstrung without DLSS. But I'm willing to believe that its replacement Ti will do much better. ;)
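For context on why DLSS moves the needle that much: it renders internally at a lower resolution and upscales. Using the commonly cited per-axis scale factors for DLSS 2's quality modes (take these as approximations):

```python
# Internal render resolutions at 4K output for DLSS 2's modes, using
# the commonly cited per-axis scale factors (approximate). Fewer
# shaded pixels means less compute and smaller render targets, which
# is how it can mask a squeeze at native 4K.
OUT_W, OUT_H = 3840, 2160
modes = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

for mode, s in modes.items():
    w, h = int(OUT_W * s), int(OUT_H * s)
    print(f"{mode:>17}: {w}x{h} internal ({s * s:.0%} of native pixels)")
# Performance mode, for instance, shades 1920x1080 internally --
# a quarter of the pixels of native 4K.
```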

And we still don't know where RDNA 2 stands. But if you actually have any facts to provide regarding "that picture" that aren't laced with emotion, in replies that have become a circular argument revealing only a hackneyed, stilted bias, by all means let us know.

Until then, arrivederci!!
:D
 
Feigning ignorance only works if the person you reply to isn't aware of your tactic. In your whole scheme no details about the games are presented. However, that doesn't deprive one of critical thinking.

I don't have a tactic. I don't care about DLSS. I'm not even saying I think the 3080 has definitely got enough VRAM.

I'm saying that screenshot is not evidence of lack of VRAM, and all your hot air and bluster is just hot air and bluster. Once again, come back when you have some actual figures about VRAM usage. Until then nobody has any reason to believe any of your weird claims.
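If anyone does want real figures, they're easy enough to capture. Here's a minimal sketch that logs VRAM usage once a second via NVML (the nvidia-ml-py bindings); caveat: this reports memory allocated, which engines often over-reserve, not the minimum a game actually needs:

```python
# Log VRAM usage once a second via NVML (pip install nvidia-ml-py).
# Caveat: NVML reports memory *allocated*; many engines cache and
# over-reserve, so treat this as an upper bound, not a requirement.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / "
              f"{mem.total / 1024**3:.2f} GB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```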
 