
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
I might be missing something, but I don't see how direct I/O will reduce or eliminate the need for mipmaps.
I don't believe it will eliminate mipmaps either. As for the "up to 33%" claim, that's not verifiable yet.
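If the "up to 33%" figure refers to the memory a full mip chain adds, that part at least is verifiable from geometry: each mip level has half the width and half the height of the one above it, so the chain converges to one third extra on top of the base texture. A quick sketch in plain Python (not tied to any engine or API):

```python
def mip_chain_overhead(base_pixels: int) -> float:
    """Extra memory of a full mip chain, as a fraction of the base level.

    Each successive mip level holds a quarter of the previous level's
    pixels, down to 1x1, so the extras sum toward 1/3 of the base.
    """
    total_extra = 0
    level = base_pixels
    while level > 1:
        level //= 4  # quarter the pixel count per level
        total_extra += level
    return total_extra / base_pixels

# A 4096x4096 texture: the mips add roughly a third on top
print(mip_chain_overhead(4096 * 4096))  # ~0.333
```

So "up to 33%" is simply the worst case of storing every level of every texture; smarter streaming reduces how much of that chain ever has to sit in VRAM at once.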

This RTX I/O is just another feature for the "The Way It's Meant To Be Played" program. IMO, worst case, it does nothing more than hamper AMD cards, i.e. increase asset loading/streaming so that RTX I/O looks normal while slowing down AMD builds considerably (like tessellation), for example.
 
Just like RTX and DLSS are still not exactly mainstays 2 years on, I don't expect RTX IO to become a standard for a while yet.

I suspect that's largely because many developers don't want to implement features which won't work on non-Nvidia GPUs; they'd rather wait and introduce the open/cross-compatible equivalents:

DirectX Raytracing (DXR) = RTX
DirectStorage = RTX IO
DirectML = DLSS

RDNA2 is expected to support all of them and have its own 'tensor'-like cores.
 
I suspect that's largely because many developers don't want to implement features which won't work on non-Nvidia GPUs; they'd rather wait and introduce the open/cross-compatible equivalents:

DirectX Raytracing (DXR) = RTX
DirectStorage = RTX IO
DirectML = DLSS

RDNA2 is expected to support all of them and have its own 'tensor'-like cores.

I know, but Nvidia have talked up RTX like it is already a roaring success, and very little has developed out of it. So we will see where RTX IO goes before there are better cards out with no potential memory limitation.
 
I feel like vram usage can be arbitrarily ballooned by any developer.

No matter how much vram you have it will never be enough for a game that's coded to use more than what you have.

It's usually scaled to offer better image quality, and some games show you VRAM usage in the settings menu so you can decide for yourself.
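Part of why those in-menu estimates climb with resolution is simply that render-target memory grows linearly with pixel count. A rough illustrative calculation — the 4 bytes per pixel and three full-resolution targets here are assumed round numbers, not any particular game's budget:

```python
def framebuffer_bytes(width: int, height: int,
                      bytes_per_pixel: int = 4, targets: int = 3) -> int:
    """Rough render-target footprint: pixels x bytes-per-pixel x targets.

    Assumes a handful of full-resolution buffers (colour, depth, one
    intermediate), ignoring compression and engine-specific extras.
    """
    return width * height * bytes_per_pixel * targets

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    mb = framebuffer_bytes(w, h) / 2**20
    print(f"{name}: ~{mb:.0f} MB of render targets")
```

Render targets alone stay small even at 4K; it's the texture pools that settings sliders let balloon into multiple gigabytes, which is why the number is so developer-dependent.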
 
10GB of VRAM is not a worry for me, as I play mostly at 1440p 170 Hz and in VR. I occasionally play at 4K, but found the TV better suited for consoles, which is why I will be getting one.

I believe that if you want the new cards, you should get them at the start of the generation to get maximum usage time out of them. The 3080 is priced to compete with the consoles imo and it won't happen again for a long time.

I ain't waiting for a Super or Ti edition with more VRAM only to pay more for it if this card does the job.

Still going to wait for reviews though just in case this card is great at 4K but not so much at lower resolutions. If AMD come up with something before I get my hands on a 3080 then I will wait but if there's no news then so be it.

Don't care if they have something better later cause they were late to the party.
 
I know, but Nvidia have talked up RTX like it is already a roaring success, and very little has developed out of it. So we will see where RTX IO goes before there are better cards out with no potential memory limitation.

Nvidia seems to look at upcoming technologies and make their own closed/proprietary version sooner in order to try and consolidate market share.
 
It's usually scaled to offer better image quality, and some games show you VRAM usage in the settings menu so you can decide for yourself.

I did some testing on the sims I run now and research on some other titles I may be interested in later. 10GB is more than enough for my current titles/interests, and my future VRAM needs don't seem to be "knowable" in an actionable way beyond "more VRAM is better".
 
Everyone is still going on about 10GB of GDDR6X on the 3080 like it's 10GB of GDDR6 on a 2080.

It's not.
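For what it's worth, the "faster" part is concrete: peak memory bandwidth is just the per-pin data rate times the bus width. Using Nvidia's published figures (19 Gbps over a 320-bit bus for the 3080's GDDR6X, 14 Gbps over 256 bits for the 2080's GDDR6), a quick sketch:

```python
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8."""
    return data_rate_gbps * bus_width_bits / 8

# Published specs: RTX 3080 (GDDR6X, 19 Gbps, 320-bit)
#              vs  RTX 2080 (GDDR6, 14 Gbps, 256-bit)
print(bandwidth_gbs(19, 320))  # 760.0 GB/s
print(bandwidth_gbs(14, 256))  # 448.0 GB/s
```

That's roughly 70% more data moved per second — bandwidth, though, not capacity.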

10GB is 10GB no matter how fast it is.

News just in: billion-pound tech multinational got it wrong; a handful of OcUK forum nerds were right!

Nvidia is not your friend, it doesn't have your best interests at heart. It wants to extract the most money it possibly can out of your pockets, period. The last thing they want is to give their customers another 1080 Ti - a card that has amazing value and keeps its relevancy for years and years. Planned obsolescence is much more lucrative.
 
It's a bit faster... and?

Well, if you know anything about cars, you will know that there are 6L V6 truck engines and 6L V6 sports car engines. The latter is "just a bit faster". But if you drag race a 6L truck against a 6L sports coupe, the takeaway is not going to be "shoulda put a bigger engine in the coupe", is it?
 
You'd think that if AMD had an upcoming 3080 challenger, they would release some info on it before the 17th, to gain sales.

I really hope they do, and there's plenty of time still, but if they say nothing I think their silence would be telling.
 
You'd think that if AMD had an upcoming 3080 challenger, they would release some info on it before the 17th, to gain sales.

I really hope they do, and there's plenty of time still, but if they say nothing I think their silence would be telling.
Something's afoot; we will hear about it soon ;)
 
Well, if you know anything about cars, you will know that there are 6L V6 truck engines and 6L V6 sports car engines. The latter is "just a bit faster". But if you drag race a 6L truck against a 6L sports coupe, the takeaway is not going to be "shoulda put a bigger engine in the coupe", is it?
I know nothing about cars, I'm afraid :p

I don't think the analogy works, though. If you have a sink that only holds 3L, you can't make it hold 4L by filling it faster.
 
Well, if you know anything about cars, you will know that there are 6L V6 truck engines and 6L V6 sports car engines. The latter is "just a bit faster". But if you drag race a 6L truck against a 6L sports coupe, the takeaway is not going to be "shoulda put a bigger engine in the coupe", is it?

The Dodge Viper would like a word :)
(sports car with a V10 truck engine)
 