
NVIDIA ‘Ampere’ 8nm Graphics Cards

The highest VRAM usage I ever saw was Shadow of the Tomb Raider peaking at 7.2GB at 3440x1440, if I recall correctly. All other games have sat a fair amount below that, even the mighty system-crusher RDR2.

My 2080S is poor value but a powerful card; still, I can throw more power at my 5MP 120Hz monitor, and that is what I will be doing. It's inevitable that in the next two years I will need more horsepower regardless, hence selling my 2080S early. I'm going to get at least 70% back on what I bought it for on Black Friday last year; not a bad 'rental' price for the second-top GPU.

I'm actually OK buying a 3080 with 10GB. Sure, I'd take more for the same price, why wouldn't I? But going by past cards and games, it'll give you at least three years without reliably bumping into the buffer limit, more like four years perhaps.

Refreshes happen in some form whatever the climate, whether Supers or node tweaks. Of course I'd rather buy the best, finished product halfway into the lifespan, but that's exactly that: halfway into things. Life's short and I'm an enthusiast; I accept that and work hard for my hobby.

Having said that, the GTX 1080 at launch was really expensive, then the Ti came out and the price fell massively. I don't see the same situation happening with a product like the 3090 out of the gate at launch. Ultimately we can all save money at some point, but you've got to wait, not game, and be obsessed with reading and watching YT endlessly.

I'll let the dust settle for a month or two, but as winter kicks in the Intel iGPU on my 8600K will suck for real games. Currently I'm happy playing Red Alert Remastered :)
 
The highest VRAM usage I ever saw was Shadow of the Tomb Raider peaking at 7.2GB at 3440x1440, if I recall correctly. All other games have sat a fair amount below that, even the mighty system-crusher RDR2.

My 2080S is poor value but a powerful card; still, I can throw more power at my 5MP 120Hz monitor, and that is what I will be doing. It's inevitable that in the next two years I will need more horsepower regardless, hence selling my 2080S early. I'm going to get at least 70% back on what I bought it for on Black Friday last year; not a bad 'rental' price for the second-top GPU.

I'm actually OK buying a 3080 with 10GB. Sure, I'd take more for the same price, why wouldn't I? But going by past cards and games, it'll give you at least three years without reliably bumping into the buffer limit, more like four years perhaps.
By what numbskull logic are you saying that the next 3-4 years of gaming at 4K will be fine with 10GB of VRAM... because your older games don't use 10GB of VRAM? The mind boggles.

We already know games like FS2020 use 12GB+ of VRAM at 4K, and the new generations of games will only increase in graphical fidelity and complexity, so it is logical and inevitable that VRAM requirements will increase... especially at 4K.
 
3090 availability will be poor according to Moore's Law, so yeah, I don't doubt enthusiast FOMO will play a part for those without the patience to wait. However, it's unwise not to wait and see what AMD bring, especially for those who have made 1,000 posts complaining about Nvidia and their pricing.
That is true regarding the complaints :D
I don't pre-order much, but I did pre-order a 2080. In fact, I think it's the only thing I've ever pre-ordered. And if I want to pick one up soon after release this time, I'd just do the same.

Hahahaha:D

It got a signal from HQ ;)
Didn't think of that :D. I'm sure I noticed it recently too, but then yesterday I actually remember walking past and seeing it bright, then it went dim as I was noticing it.
I wondered if it had a meaning. Looking from the side it looks dusty. Might take it out later and put the hoover end around it.
Must be under warranty, but I'm not returning it for that :). Won't be able to sell it either though, so maybe it goes into a spare case.
 
By what numbskull logic are you saying that the next 3-4 years of gaming at 4K will be fine with 10GB of VRAM... because your older games don't use 10GB of VRAM? The mind boggles.
Sorry, what older games? Is RDR2 old by your metric?
I did not use logic but experience. Consider trying that to guide you in life rather than sensationalism.

Where did I say 4K, the fool's resolution? I'm at 5MP, not 8MP; I'll leave you to calculate how much less rendering power that is.
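
For anyone who doesn't fancy the mental arithmetic, here's a rough sketch in Python; it assumes rendering cost scales roughly linearly with pixel count, which is a simplification (shading, RT and bandwidth don't scale perfectly):

```python
# Pixel-count comparison: 3440x1440 ("5MP" ultrawide) vs 3840x2160 (4K, "8MP").
# Simplifying assumption: rendering cost scales linearly with pixels drawn.
ultrawide = 3440 * 1440  # ~4.95 million pixels
uhd_4k = 3840 * 2160     # ~8.29 million pixels

print(f"3440x1440: {ultrawide / 1e6:.2f} MP")
print(f"3840x2160: {uhd_4k / 1e6:.2f} MP")
print(f"4K pushes {uhd_4k / ultrawide:.2f}x the pixels, "
      f"i.e. the ultrawide draws ~{(1 - ultrawide / uhd_4k) * 100:.0f}% fewer")
```

That works out to 4K pushing roughly 1.67x the pixels, or the ultrawide rendering about 40% fewer.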
 
Sorry, what older games? Is RDR2 old by your metric?
I did not use logic but experience. Consider trying that to guide you in life rather than sensationalism.

Where did I say 4K, the fool's resolution? I'm at 5MP, not 8MP; I'll leave you to calculate how much less rendering power that is.

Sorry, I misread 3440x1440... but the principle remains that using the last three years to judge the next four, especially when you already saw 7.2GB of VRAM usage in a 2018 game, makes no logical sense... but as you admit you aren't using logic, I guess it's no surprise. :p VRAM requirements at higher resolutions will rise as fidelity and detail increase, especially with heavy use of RT.
 
By what numbskull logic are you saying that the next 3-4 years of gaming at 4K will be fine with 10GB of VRAM... because your older games don't use 10GB of VRAM? The mind boggles.

We already know games like FS2020 use 12GB+ of VRAM at 4K, and the new generations of games will only increase in graphical fidelity and complexity, so it is logical and inevitable that VRAM requirements will increase... especially at 4K.
I had a few games from the time I had a Titan Xp that used over 11GB, and that was a few years ago...

Granted, they probably only used it because it was there rather than actually needing it, but still, with the new-gen consoles coming out I do think 10GB is not enough. One can make do with it, but it won't be optimal. 16GB is what the 3070 and upwards should have, imo.


Didn't think of that :D. I'm sure I noticed it recently too, but then yesterday I actually remember walking past and seeing it bright, then it went dim as I was noticing it.
I wondered if it had a meaning. Looking from the side it looks dusty. Might take it out later and put the hoover end around it.
Must be under warranty, but I'm not returning it for that :). Won't be able to sell it either though, so maybe it goes into a spare case.

:D
 
Where did I say 4K, the fool's resolution
Oi!!! :p

I prefer higher image quality to higher FPS; what is foolish about that? People who want 144fps at your res need a lot more grunt than I do for my 40-60fps requirement at 4K ;)
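
A similar back-of-the-envelope, in pixels per second this time (same simplifying assumption that cost scales with pixels drawn; real cost depends on settings and engine):

```python
# Pixels per second as a rough proxy for required GPU "grunt".
# 50fps is the midpoint of the 40-60fps target; ballpark only.
ultrawide_144 = 3440 * 1440 * 144  # ultrawide at 144fps
uhd_50 = 3840 * 2160 * 50          # 4K at ~50fps

print(f"3440x1440 @ 144fps: {ultrawide_144 / 1e6:.0f} Mpx/s")
print(f"3840x2160 @ 50fps:  {uhd_50 / 1e6:.0f} Mpx/s")
print(f"ratio: {ultrawide_144 / uhd_50:.2f}x")
```

On that crude measure, 144fps at 3440x1440 wants around 1.7x the throughput of 40-60fps at 4K.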
 
Sorry, what older games? Is RDR2 old by your metric?
I did not use logic but experience. Consider trying that to guide you in life rather than sensationalism.
Why would games built around this generation of consoles and their limitations be an accurate way of gauging the requirements of games built around the next generation of consoles?
 
Just saw this. LOL!!!

Ahahahahahaha :D

 
OK, fair enough. But we don't know for sure if it needs all that VRAM. We know that some games use all available VRAM regardless, like COD if I remember correctly.

Also, this is an example of a game that uses extreme levels of RAM and VRAM, so a lot of games might not peak like this one does.

HBCC confirms it's usage, not just allocation:
https://www.youtube.com/watch?v=3_iU9Dq8O-M

I've tried to warn people about VRAM before, but it's the same story every time. Now I just let them learn their lesson. This one actually goes past 10GB sometimes. It's funny to me that people think this is the limit of what games will push for the next 3-4 years.
https://www.overclockers.co.uk/forums/threads/rip-8-gb-vram.18864852/
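
If anyone wants to watch this on their own card, here's a minimal sketch that polls nvidia-smi (assumes an NVIDIA GPU with nvidia-smi on the PATH). Caveat: nvidia-smi reports allocated VRAM, not pages the game actually touches, so it overstates "real" usage in the HBCC sense:

```python
# Minimal VRAM poller. Note: nvidia-smi reports *allocated* memory,
# not what the game actually touches, so treat peaks as an upper bound.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

peak = 0
while True:
    used = vram_used_mib()
    peak = max(peak, used)
    print(f"used: {used} MiB   peak: {peak} MiB")
    time.sleep(2)
```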
 
Why not? Does anyone think it's going to be rubbish? We also have no idea on availability, i.e. whether they can supply enough, and prices often seem to creep up soon after launch.
While I may not pre-order this time, I don't think it's bad that people do. If AMD were launching with reviews out around the same time, then I could understand it a bit more, waiting for the reviews.

The risk is that you don't pre-order and, after the reviews, find you can't actually buy one or the price has crept up. Great if folks are fine with that.

Just found the GEFORCE RTX logo on my 2080 has gone dim. Think I just found a justification for an upgrade, again :D

I remember people returning their 2080 Ti for a refund because they were disappointed with the performance.

Pre-ordering without seeing a review is stupid advice, in my opinion. Refunds shouldn't have been allowed, to teach them a lesson.
 
Yep, bugger-all RTX games, just Control, Minecraft and a 40-year-old Quake thing.
Yeah, if AMD had done that we would have had a lot more memes, I reckon :p

Pre-ordering without seeing a review is stupid advice, in my opinion. Refunds shouldn't have been allowed, to teach them a lesson.

Why? You can just return it unopened; at most you lose postage. Reviews are released before the card is delivered, so you can just cancel the pre-order and not accept delivery when it comes.
 
Since moving to a 4K display, all my GPUs have been hit really hard on memory usage, to the point I need to drop settings to free up VRAM and maintain a decent frame rate. Destiny 2 and Death Stranding are two of the biggest GPU-memory games at 4K; my 2080 Ti could just about handle both at max settings at around 110fps, but almost all of the 11GB frame buffer was used. Now I have a 5700 XT stop-gap card and I have to drop settings to high or medium to stay under 8GB, and even then the frame rate is around 75fps.

I hoped the 3080 would have more RAM than the 2080 Ti, but it looks like a 3090 will be my next purchase; it's kinda needed at 4K max settings, imho. Anything below 4K and you should be fine with 10GB or less.
 