
10GB vram enough for the 3080? Discuss..

Memory allocation on the left, actual usage on the right.


That's super handy, I'm just installing the beta now to give this a shot.

Memory allocation on the left, actual usage on the right.


I'm not getting this with the beta install from the MSI page. Is there a newer beta hosted somewhere else, or is this a hidden option somewhere?
 
8 FPS, damn, that card is clearly leagues ahead with its mighty 10GB of VRAM


Two things worth considering IMO:

1. This thread is filled with current or future 3080 10GB owners, who want to defend their purchase, and believe the card will age as well as the 1080ti etc. Completely in denial and not worth getting into debates/arguments with them.
2. This will only be a short-term consideration, as I fully expect Nvidia to withdraw the current 8nm 10GB 3080 in favour of the upcoming 7nm 20GB 3080 (which will probably be renamed 3080 Super, or 3080 Ti, who knows).
3. The 7nm 3080 20GB will be more expensive. I think Nvidia see that at £650 too many people can afford it and the demand is too great; it gets them worse PR when no cards are available than when the card is more expensive but there's ample stock. I doubt we'll see such a price point for the flagship going forward. Perhaps in 2 years, when we're deeper into the current depression, the free helicopter money has stopped and more people are unemployed and relying on savings, this will be different, we'll see.

Firstly, that's three things you have listed. Important since we are talking about numbers :p

1. This thread is filled with current or future 3080 10GB owners, who want to defend their purchase, and believe the card will age as well as the 1080ti etc. Completely in denial and not worth getting into debates/arguments with them.

I'm in the on-order group. I am purchasing RT performance and trying to be realistic about Turing, Ampere and RDNA2. RT performance is improving at a decent pace. Turing had terrible RT performance, which is why many, including myself, skipped it. Ampere has borderline performance, managing Quake 2's path tracing at 1440p/60. RDNA2 is AMD's first gen and although they have an elegant solution, the performance will never be there without costly dedicated HW. Therefore I can't see anyone who wants RT sticking with a 3080 beyond the release of Hopper/RDNA3.

Now the big question is do you really believe developers will target greater than 10GB of VRAM when most PC gamers will be spending less than £400 on a graphics card and the next gen consoles realistically have at best 10GB of VRAM to play with?
 
I'm not getting this with the beta install from the MSI page, is there another newer beta hosted somewhere else or is this a hidden option somewhere?

  1. Enter the MSI Afterburner settings/properties menu
  2. Click the Monitoring tab
  3. Near the top, next to "Active Hardware Monitoring Graphs", click the "..."
  4. Click the checkmark next to "GPU.dll" and hit OK
  5. Scroll down the list until you see "GPU Dedicated Memory Usage", "GPU Shared Memory Usage", "GPU Dedicated Memory Usage \ Process", "GPU Shared Memory Usage \ Process"
  6. Pick and choose what you want tracked using the checkmarks next to them. "GPU Dedicated Memory Usage \ Process" is the number that most closely reflects the one we see in the FS2020 Developer Overlay and Special K (DXGI_Budget, except Unwinder uses the D3DKMT API)
  7. Click "Show in On-Screen Display" and customize as desired.

https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/
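
For anyone curious where the DXGI_Budget figure mentioned in step 6 comes from, here's a minimal sketch of reading it directly through DXGI (IDXGIAdapter3::QueryVideoMemoryInfo). Going by the thread, this is the per-process counter Special K reports, while Afterburner itself goes via the D3DKMT route instead; error handling is trimmed for brevity:

```cpp
// Minimal sketch: query the per-process video memory budget/usage for the
// primary adapter's local (VRAM) segment via DXGI. Windows-only, MSVC-style.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))   // first (primary) adapter
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))                  // budget query needs IDXGIAdapter3
        return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget = what the OS will let this process commit in VRAM right now;
    // CurrentUsage = what the process has actually committed.
    printf("Local VRAM budget : %llu MB\n", (unsigned long long)(info.Budget / (1024ull * 1024ull)));
    printf("Local VRAM usage  : %llu MB\n", (unsigned long long)(info.CurrentUsage / (1024ull * 1024ull)));
    return 0;
}
```

The gap between Budget and CurrentUsage is essentially the allocated-vs-used distinction being discussed in this thread.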
 
I'm in the on-order group. I am purchasing RT performance and trying to be realistic about Turing, Ampere and RDNA2........................

Now the big question is do you really believe developers will target high amounts of ray tracing when most PC gamers will be spending less than £400 on a graphics card and the next gen consoles realistically have a hybrid AMD solution?

fixed .....:D
 
Now the big question is do you really believe developers will target greater than 10GB of VRAM when most PC gamers will be spending less than £400 on a graphics card and the next gen consoles realistically have at best 10GB of VRAM to play with?

As someone with some experience in the field, you want to have the minimum VRAM target as low as possible to have a wider potential audience, but allow memory usage to scale based on the user's hardware/settings; it's why non-official memory allocation counters are very misleading.
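
To illustrate that scaling point, here is a hypothetical sketch of how an engine might size its texture streaming pool from whatever VRAM budget the OS reports, rather than targeting a fixed number. The function name and constants are made up for illustration, not taken from any real engine:

```cpp
// Hypothetical budget-driven scaling: take most of the reported budget for the
// streaming pool, keep some headroom, and never drop below a usable floor.
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Reported per-process video memory budget (e.g. the DXGI local-segment Budget).
struct VideoMemoryBudget {
    uint64_t budgetBytes;
};

uint64_t ChooseTexturePoolBytes(const VideoMemoryBudget& mem)
{
    const uint64_t floorBytes = 1024ull * 1024ull * 1024ull;      // 1 GB minimum target
    const uint64_t scaled     = (mem.budgetBytes * 8ull) / 10ull; // 80% of budget, 20% headroom
    return std::max(floorBytes, scaled);
}

int main()
{
    // Pretend the OS reports a 10 GB budget for this process.
    VideoMemoryBudget mem{10ull * 1024ull * 1024ull * 1024ull};
    printf("Texture pool: %llu MB\n",
           (unsigned long long)(ChooseTexturePoolBytes(mem) / (1024ull * 1024ull)));
    return 0;
}
```

Under a policy like this, the same game can show very different "usage" numbers on different cards simply because each GPU offers a different budget to fill, which is exactly why raw allocation counters mislead.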
 
Yes, that's what I'd have done as well. I don't know about T&L, but I am not too hung up on RT. The problem scales exponentially, and there is a slight chance that RT will be discarded as a failed experiment in real-time graphics. I wouldn't be using RT on a 3080 if I had one.

RT is to graphics now as Voodoo was back in the day. Real-time RT is the holy grail of graphics. It doesn't mean everything has to look realistic, as it can still be very stylized, but what it does do is provide far better quality of lighting and shadow with far less developer time.

I suppose it is down to the type of games you enjoy. I'd love an RT version of Thief, where you have to sneak around in the shadows, as it would add greatly to the atmosphere. BF5 for example is more about the FPS, as you really don't get the time to appreciate what is rendered due to the game being fast paced.
 
  1. Enter the MSI Afterburner settings/properties menu
  2. Click the Monitoring tab
  3. Near the top, next to "Active Hardware Monitoring Graphs", click the "..."
  4. Click the checkmark next to "GPU.dll" and hit OK
  5. Scroll down the list until you see "GPU Dedicated Memory Usage", "GPU Shared Memory Usage", "GPU Dedicated Memory Usage \ Process", "GPU Shared Memory Usage \ Process"
  6. Pick and choose what you want tracked using the checkmarks next to them. "GPU Dedicated Memory Usage \ Process" is the number that most closely reflects the one we see in the FS2020 Developer Overlay and Special K (DXGI_Budget, except Unwinder uses the D3DKMT API)
  7. Click "Show in On-Screen Display" and customize as desired.
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/

Thanks, I literally just found the same thread. I've done that and followed the instructions and I get the extra "GPU Dedicated Memory Usage" and "GPU Shared Memory Usage", but I don't get the "GPU Dedicated Memory Usage \ Process" and "GPU Shared Memory Usage \ Process" options. I wonder if I somehow have an outdated GPU.dll or something. The dedicated memory usage is more or less the same as what it would already show for memory usage, so not that helpful.

Any ideas?

I didn't use that link, I used MSI's site. The version number is the same in the MSI interface but it doesn't show the build number, so maybe it's that? I'll try reinstalling again.

*edit*

Yes, it was that. For anyone else testing this, make sure to use the Guru3D link and don't get the beta version from MSI's own site; the version number is the same but it's a different build.

Wolfenstein-The-New-Colossus-Screenshot-2020-10-17-13-40-05-34.png


This is Wolfenstein II The New Colossus basically maxed out at 4K with only motion blur turned off (as preference), showing 6,568MB allocated but only 5,454MB used, so a delta of 1,114MB.
 
As someone with some experience in the field, you want to have the minimum VRAM target as low as possible to have a wider potential audience, but allow memory usage to scale based on the user's hardware/settings; it's why non-official memory allocation counters are very misleading.

Very true.

10GB of VRAM is HUGE! 1 x 8 x 1024 x 1024 x 1024 x 10 = 85,899,345,920 bits of information. In comparison, the world's population sits at ~7,800,000,000. That is a difference of 78,099,345,920. It's not easy to fill that sort of storage, and it's why we don't see games using 11GB of VRAM on desirable cards such as the 1080Ti. Of course using 8K with numpty levels of AA can eat VRAM, but you won't have playable frame rates when doing so.

I think historic versions of DirectX are to blame, as I'm sure I read that they didn't allow direct control of memory. So devs never knew exactly how much VRAM would be used when storing data, so they had to overcompensate. DX12 is supposed to address this by giving more direct access to VRAM.

Feel free to correct me here as I've skipped sleep last night :)
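
On the DX12 point, here's a minimal sketch of what that more direct control looks like in practice: the application explicitly states which heap a resource lives in when it creates it, rather than the driver guessing. This is just an illustrative snippet (device creation and error handling omitted), not taken from any shipping engine:

```cpp
// Sketch: creating a buffer explicitly in GPU-local memory with D3D12.
#include <d3d12.h>
#include <wrl/client.h>

Microsoft::WRL::ComPtr<ID3D12Resource>
CreateVramBuffer(ID3D12Device* device, UINT64 sizeBytes)
{
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;   // DEFAULT = GPU-local memory (VRAM on a discrete card)

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = sizeBytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;   // required layout for buffers

    Microsoft::WRL::ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));
    return buffer;
}
```

Swap D3D12_HEAP_TYPE_DEFAULT for UPLOAD or READBACK and the same call places the resource in CPU-visible system memory instead, which is the kind of explicit say over placement that older DirectX versions didn't give developers.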
 
fixed .....:D

No. Not when you realise that RT saves dev time while providing far more quality and that rasterization is there to fall back on when hardware can't keep up. It's more likely that devs will spend less time on rasterization (less quality) since you now have the choice.

Remember Nvidia has moved to RTX, meaning its cards support hardware RT, while AMD now has a hybrid approach with RDNA2 so that even consoles support some form of RT. Even Intel's planned GPUs will support hardware RT at some level. Basically, even cheap cards will now give you some form of hardware RT.
 
MooresLawIsDead; feel free to take it with a pinch of salt, but whether you choose to believe him or not, prices have reflected it.

MILD just runs with any and every story to get hits along with an active imagination - basically a YouTube wccftech.

Dunno why people still take him seriously - other than he seems to say a lot of what certain people want to hear, even though he also contradicts it 1-2 videos later half the time. Any correct information is purely by chance.
 
MILD just runs with any and every story to get hits along with an active imagination - basically a YouTube wccftech.

Dunno why people still take him seriously - other than he seems to say a lot of what certain people want to hear, even though he also contradicts it 1-2 videos later half the time. Any correct information is purely by chance.

He has a tendency of saying things before they happen, which is why people listen to him.
 
5 years? Nvidia GPUs are not a good long-term choice; Kepler & Maxwell are good examples of why they aren't, and even the 1080ti is already starting to fall behind its performance replacement, the RTX 2080. I praise Zotac for having a 5 year warranty on their cards, but I'd never plan on keeping one for that long. My jump-ship time would be when it's no longer the last-gen series, so today I would have been selling my GTX 1080 Pascal card if I still had it.
I know it's a long shot, but I feel it will easily last that long for me. I will be upgrading from integrated graphics. I game at 800*600, sometimes at 30fps. I have low standards.
So I think I will upgrade once my machine can no longer do 60+ fps at medium/high settings at 1440p in the majority of games.

Also I am just 17, I don't think I will have the money to afford a $700+ GPU again after 2 years :p
 
No. Not when you realise that RT saves dev time while providing far more quality and that rasterization is there to fall back on when hardware can't keep up. It's more likely that devs will spend less time on rasterization (less quality) since you now have the choice.

I don't think this can be emphasised enough. Dev time spent faking scenes into looking more real past the limitations of basic rasterization takes a lot of manual in-game tweaking, and that's a very big burden on developers; if we lift that burden, everyone wins.

The other thing that's worth noting: what rasterization cannot naturally produce it has to fake, which often leads to bad approximations; it cannot do real, genuine global illumination, for example. The bottom line is that while rasterization is a lot quicker than ray tracing at the basic rendering, the fakery rasterization has to employ to achieve the same image quality as ray tracing is progressively getting more expensive. You can fake approximate global illumination, for example, but it is very expensive in terms of performance and it's not a great approximation. Now you could argue back and forth about which is the better trade-off between image quality and performance, and whether the "real thing" is worth it. But the more important point is that these hacks to improve upon rasterization have become progressively more expensive over time and will continue to do so into the future. Eventually rasterization + fakery will hit such bad diminishing returns that it will be beaten out by ray tracing, which "just werks (tm)".

And I think accurate reflections that improve past the limitations of screen-space reflections were probably the start of the end for rasterization. You could imagine Nvidia engineers sitting around thinking about ways to get SSR to reflect more accurately and break through current limitations, but I bet whatever the next best hack was, it was so expensive in performance terms that they just bit the bullet and said it's now going to be easier to ray trace it accurately to start with, rather than building hacks on top of hacks to get the rasterized approximation closer to reality. And even if that argument didn't stick for reflections, it would eventually stick for something more advanced; this ever-growing complexity of faking stuff for rasterization being a diminishing-returns curve is the real point.
 
He has a tendency of saying things before they happen, which is why people listen to him.

In most cases he has pushed 3-4 different versions of a story - people seem to latch on to the one that was right and somehow forget the others - his "exclusives" have almost all been so utterly wrong there is no chance they came from even a remotely legit source in the first place and almost anything he has got right has been reproducing information after the fact from people like kopite7kimi.

I don't have a problem with him (even if that sounds like it) but I do have a problem with people giving his information the level of validity some do - it is pure chance if any of it is right - this is like people taking wccftech seriously.

Even his "great play" is fitting circumstantial evidence to a narrative he has concocted which may or may not be right and most of it isn't anything earth shattering as shortages and price manipulation (generally driven by retailer gouging) has been a thing of the last few nVidia launches - you or I could have come up with the same thing with a bit of imagination.
 
On my 5120x1440 ultrawide monitor I have seen 7-8GB of VRAM used quite often on my GTX 1080ti. I think a 4K monitor uses even more memory, so yeah, having only 10GB will probably end up being a problem.

I want to upgrade to a 30 series, but yeah, I reckon I'll hold out for a 20GB version, or maybe even a 16GB Big Navi if they are a decent upgrade from my 1080ti.
 
In most cases he has pushed 3-4 different versions of a story - people seem to latch on to the one that was right and somehow forget the others - his "exclusives" have almost all been so utterly wrong there is no chance they came from even a remotely legit source in the first place and almost anything he has got right has been reproducing information after the fact from people like kopite7kimi.

Can you give any examples?
 