NVIDIA ‘Ampere’ 8nm Graphics Cards

Even with RT disabled the 1080 Ti is not going to be strong at 4K, or even at 2K/3K, in newer games. If people want to keep it and maximise their investment then fine, but let's not pretend it's still a top-tier card, and it lacks support for the latest features.


Unless there are SEVERE Ampere stock shortages that make them impossible to get hold of for weeks, the 2000 series will only drop in value as time goes on, though clearly the cheaper the card was in the first place, the lower the impact. Anyone with half a brain sold theirs weeks ago.

I feel like the 1080 Ti will last fine. In newer, VERY taxing games which are poorly optimised (*looks at RDR2*) it might struggle, sure, but otherwise I don't envisage the 1080 Ti or the 2080 becoming obsolete. I think the 3080 and 3090 are clearly for those who want to game at 4K/120fps, or at least get over the 60 mark.

I can optimise nearly any game to 4K/60 on my 2080, and that's about the same as a 1080 Ti.

I think it's important to keep everything in perspective. If you want ray tracing and ultra super-duper settings, sure, the 1080 Ti might not cut the mustard for new titles in the next 24-36 months, but I don't envisage it becoming obsolete at all in the next 12-24 months, as long as you're happy tweaking a few settings.

Now if we get widespread adoption of DLSS 2.0 and developers start REALLY pushing things, then it gets crazy, but as it's proprietary software and consoles are the biggest focus for developers, it's not going to happen across the board. Look at RDR2: my 2080 would have been fine playing it at 4K if it had some form of DLSS, but it's nowhere to be seen. I think DLSS is amazing, but if it's not widespread and mass adopted, it's just a perk for the few games that use it, which is sad.




TLDR: I think smart people who see the value in the products they buy can happily wait 12-24 months on their fully functioning 1080 Ti/2080 and see what NVIDIA offers next (*cough* Super, *cough* 3080 Ti), or at least wait for AMD's offerings to drive prices down a little and for evidence of what next-gen games actually demand in VRAM.

However, for the idiots out there who shill for NVIDIA over any new GPU they release and get excited over a marble demo... then yes, we'll probably be buying the 3080/3090 and justifying it as the best thing since sliced bread :D (I'm probably one of those idiots, but at least I'm aware I'm one of those idiots :D :D)
 
[Attached chart: performance-3840-2160.png — 3840x2160 benchmark results]
I think HUB have this wrong TBH. Their Doom Eternal 2080 testing is an outlier.

Something is amiss, yes.
 
https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

Speaking of 4K, we did notice some VRAM limitations when using the game’s HD Texture Pack. On Ultra Settings/Ultra Textures, our performance went downhill in the following scene. As you can see, our RTX2080Ti was used to its fullest and pushed 17fps.
However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM.
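A quick way to read those numbers is to look at the nominal headroom on each card; a minimal Python sketch, where the capacities are the cards' spec-sheet VRAM, the usage figures are the ones quoted from the article, and reported usage isn't necessarily the same as what the game strictly requires:

```python
# Crude VRAM headroom check using the figures quoted above.
cards_gb = {"RTX 2080 Ti": 11.0, "RTX 3080": 10.0}
texture_usage_gb = {"Ultra textures": 10.5, "High textures": 8.0}

for card, capacity in cards_gb.items():
    for setting, used in texture_usage_gb.items():
        headroom = capacity - used
        print(f"{card} ({capacity:.0f} GB), {setting} ({used} GB): "
              f"{headroom:+.1f} GB headroom")
```

The article saw the frame rate collapse on the 2080 Ti even with a nominal +0.5 GB to spare (which it attributes to VRAM), so on paper the 3080's -0.5 GB at Ultra textures looks worse still, which is exactly the question the article raises.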
 

This is something that RTX IO might eventually fix, once games incorporate it.

We just don't know yet how this is going to pan out.

As I pointed out in the other thread, even on the consoles they need system RAM for the OS and for the game before you even start talking about VRAM.

So let's say this game uses 10.5GB of VRAM on a console too. That leaves 5.5GB of RAM for the OS and for the game itself. How do you think it will run? Sounds like a rubbish experience to me. These days Windows alone, with nothing open, uses 4GB of RAM.

So yes, PC-to-PC comparisons of VRAM are valid, but console-to-PC comparisons are not. It's not apples to apples.
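For what it's worth, the arithmetic behind that point is simple; here's a tiny Python sketch of the assumption above, where nothing is a measured console figure and the 10.5GB is just carried over from the Avengers example:

```python
# Back-of-the-envelope console memory budget for the argument above.
total_unified_gb = 16.0      # next-gen console unified memory pool
assumed_gpu_use_gb = 10.5    # hypothetical: same footprint as Ultra textures on PC
left_for_os_and_game_gb = total_unified_gb - assumed_gpu_use_gb

print(f"Left for OS + game logic: {left_for_os_and_game_gb:.1f} GB")  # -> 5.5 GB
# Compare that with the claim above that idle Windows alone sits around 4GB;
# hence the argument that console-to-PC VRAM comparisons aren't apples to apples.
```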
 

Maybe, if you're missing the requirements by a little, but if it's more like 2 GB or so then probably not. Also, RTX IO needs a per-game implementation, so there's no universal boost that older games will receive.

Lead Programmer on Doom Eternal
 

Yes, I agree. Older games most likely will not be upgraded to support this feature, but next-gen games that arrive on the new consoles in 2021 most likely will start to use it.

The thing is, for the life of the next-gen console generation that total system memory of 16GB is fixed, so developers will optimise their games to fit into that memory envelope.

If they go over it, the games will just not deliver a good experience on the consoles.
 
I don't class RDR2 as poorly optimised? It's the only game in years I've played that had a wow, next-gen moment for me.


Maybe consider that newer cards are just better than your 1080 Ti? A 2080S ran RDR2 at mostly high settings and got 60-75 FPS in-game at a 5MP resolution.

I heard the 1080Ti was down at 40fps or so.

It's just ageing, is all, and that's OK. It's a legend of a card, sure, but it has its limits.
 
This is something that RTX IO might eventually fix, once games incorporate it.

We just don't know yet how this is going to pan out.

As I pointed out in the other thread, even on the consoles they need system RAM for the OS and for the game before you even start talking about VRAM.

So let's say this game uses 10.5GB of VRAM on a console too. That leaves 5.5GB of RAM for the OS and for the game itself. How do you think it will run? Sounds like a rubbish experience to me. These days Windows alone, with nothing open, uses 4GB of RAM.

So yes, PC-to-PC comparisons of VRAM are valid, but console-to-PC comparisons are not. It's not apples to apples.

This, I believe, is an important point. I reckon around 8GB of that 16GB will be reserved for the OS and for game RAM on the console at a minimum, otherwise the games won't run very well at all.

I don't think it all loads into VRAM and runs...?
 
Can anyone confirm how it works, do they arrange pick-up and delivery of the malfunctioning card?
Just been through the RMA process with them: pickup from the house by FedEx, turned around, and a replacement card back at my door within 5 working days. All sorted via webchat with Nvidia.
 
This, I believe, is an important point. I reckon around 8GB of that 16GB will be reserved for the OS and for game RAM on the console at a minimum, otherwise the games won't run very well at all.

I don't think it all loads into VRAM and runs...?

I really don't like comparing PC & console on hypothetical points, because there are vast efficiency advantages for consoles, so it doesn't translate to PC well at all. Once we can test games we'll know better.

Speaking strictly about the consoles though, it's not 8 GB for OS+misc, it's much less. Microsoft says 2.5 GB for OS.
"Developers will be using the overall 16GB of memory in two ways: there’s 10GB for fast GPU optimal memory, 3.5GB for standard memory, and 2.5GB reserved by the OS."

So for pure graphics work XSX will have a full fat 10 gigs to work with. That's the new minimum for 4K next-gen games. For PC you'll probably need more to make up for inefficiencies unless it's an RTX IO game. Then again, we are on PC so we can always reduce other settings or textures if we want.
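Laying those numbers out side by side (a small Python sketch; the 10/3.5/2.5 split is Microsoft's published figure quoted above, while the "PC needs a bit more" factor is purely an illustrative assumption, not a measurement):

```python
# Xbox Series X memory split as quoted from Microsoft, vs the 10 GB RTX 3080.
xsx_split_gb = {"GPU-optimal": 10.0, "standard": 3.5, "OS reserved": 2.5}
assert abs(sum(xsx_split_gb.values()) - 16.0) < 1e-9  # sanity check: adds up to 16 GB

rtx3080_vram_gb = 10.0
pc_overhead_factor = 1.1   # assumed ~10% extra needed on PC for the same assets (illustrative)

pc_equivalent_gb = xsx_split_gb["GPU-optimal"] * pc_overhead_factor
print(f"Console GPU-optimal budget: {xsx_split_gb['GPU-optimal']:.1f} GB")
print(f"Rough PC equivalent need:   {pc_equivalent_gb:.1f} GB vs {rtx3080_vram_gb:.1f} GB on the 3080")
```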
 
Aren’t we being a bit harsh on 2080Ti owners with all these comments about “...shoulda sold your card weeks ago” etc.

Was there really any guarantee that the 3000-series would seemingly offer such an improvement? Not really, especially after the damp squib of the 2000-series launch.
 

Indeed. If people had sold and the 3000 series had turned out to be another damp squib, everyone would have been laughing at the ones who sold.

There were people who sold their 1080 Ti last time, regretted it, and bought one back.
 
Aren’t we being a bit harsh on 2080Ti owners with all these comments about “...shoulda sold your card weeks ago” etc.

Was there really any guarantee that the 3000-series would seemingly offer such an improvement? Not really, especially after the damp squib of the 2000-series launch.


None of us *knew* what was coming. But I did think it was odd that so many Turing owners dismissed every generation prior to Turing and assumed that Turing's "progress" was the new normal going forward.

Turing's price/performance uplift over the previous generation was noticeably worse than every generation before it. I didn't think it was *impossible* that things would continue to stagnate, but it was going to take more than a single generation to get me to buy in to the idea.

Many of that crowd still seem to consider Ampere to be some sort of miracle, but if you look at all the generations other than Turing and extrapolate what Turing should have offered, Ampere is merely a "good" generational improvement.

Launch the 2080Ti at $700 and the progress from Fermi to Ampere looks more or less steady.
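To make that last point concrete, here's a minimal Python sketch; the launch prices are the widely reported figures (1080 Ti at $699, 2080 Ti Founders Edition at $1,199), but the performance-gain number is a rough ballpark assumption for illustration, not a benchmark result:

```python
# Generational "value" = relative performance gain divided by relative price change.
# A result of 1.0 means the new card is faster only in proportion to its extra cost;
# above 1.0 means genuinely better price/performance than the card it replaced.
def value_uplift(perf_gain, old_price, new_price):
    return (1.0 + perf_gain) / (new_price / old_price)

# Assumed ~30% gen-on-gen gain at 4K -- an illustrative figure, not a measurement.
print(f"2080 Ti as launched ($1,199):  {value_uplift(0.30, 699, 1199):.2f}")  # ~0.76
print(f"Hypothetical $700 2080 Ti:     {value_uplift(0.30, 699, 699):.2f}")   # 1.30
```

On those assumptions the card as launched was worse value per dollar than the one it replaced, whereas at $700 it would have looked like an ordinary generational step, which is the point being made above.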
 
I really don't like comparing PC & console on hypothetical points, because there are vast efficiency advantages for consoles, so it doesn't translate to PC well at all. Once we can test games we'll know better.

Speaking strictly about the consoles though, it's not 8 GB for OS+misc, it's much less. Microsoft says 2.5 GB for OS.
"Developers will be using the overall 16GB of memory in two ways: there’s 10GB for fast GPU optimal memory, 3.5GB for standard memory, and 2.5GB reserved by the OS."

So for pure graphics work XSX will have a full fat 10 gigs to work with. That's the new minimum for 4K next-gen games. For PC you'll probably need more to make up for inefficiencies unless it's an RTX IO game. Then again, we are on PC so we can always reduce other settings or textures if we want.

So suddenly the 10GB on the 3080 seems reasonable... in line with the next-gen console generation.
 
See my comment above.

It'll be fine most of the time, but if you get a scammer who claims your item turned up damaged or non-functional or whatever, they'll raise a dispute and the bay will instantly side with them without even listening to your side of the story. They'll even claw the money back from you via PayPal and leave you with neither the money nor the product.

This is particularly dangerous with graphics cards right now because buyers may suddenly realise they've paid over the odds and find a reason to return the card, even deliberately breaking it.

It's just not worth the risk.

Just need to insure it to the correct amount.
 
Why have they skimped on the VRAM? It makes me uncertain about committing. I haven't upgraded my GPU in years, and now, right at the perfect time, this silly skimping on the VRAM causes doubts.
 