3080 Ti launching this year at $999 with 20GB VRAM

That's interesting, where are the consoles going to get all that extra memory from?

Nowhere. That's the minimum. Console players will just live with whatever they're given, same as always. Nothing stopping anyone on PC from dropping texture settings either, if they're OK with low/medium.

I think the larger point is that people who spend $700 on a GPU alone have higher expectations than people paying $500 for a whole console.
 
This is also worth a read: https://devblogs.microsoft.com/directx/directstorage-is-coming-to-pc/

Some choice quotes that basically say what I've been saying for the last few days in this discussion.

"we unveiled the Xbox Velocity Architecture, a key part of how the Xbox Series X will deliver next generation gaming experiences"

Which links this article https://news.xbox.com/en-us/2020/07/14/a-closer-look-at-xbox-velocity-architecture/

Quote "Xbox Series X includes the highest memory bandwidth of any next generation console with 16GB of GDDR6 memory, including 10GB of GPU optimized memory at 560 GB/s to keep the processor fed with no bottlenecks."

Back to the original article, quote: "Game workloads have also evolved. Modern games load in much more data than older ones and are smarter about how they load this data. These data loading optimizations are necessary for this larger amount of data to fit into shared memory/GPU accessible memory. Instead of loading large chunks at a time with very few IO requests, games now break assets like textures down into smaller pieces, only loading in the pieces that are needed for the current scene being rendered. This approach is much more memory efficient and can deliver better looking scenes, though it does generate many more IO requests."

The bottom line is that developers have been moving away from stuffing loads of assets into VRAM in case they're needed eventually, towards streaming assets into VRAM just in time for rendering. That has allowed game worlds to be bigger than VRAM limits by an order of magnitude, and they want to keep leveraging this principle with faster NVMe drives. You're not going to need 20GB of VRAM; all you'd be doing is buying 10GB of very expensive high-speed GDDR6 memory for no reason.
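To illustrate the principle (a rough sketch only, not how DirectStorage or any real engine actually implements it; the `TileCache` class and its byte budget are made up for illustration), a just-in-time streamer keeps only the texture tiles the current scene needs resident, and evicts the least-recently-used tiles when a fixed VRAM budget would be exceeded:

```python
from collections import OrderedDict

class TileCache:
    """Hypothetical just-in-time asset streamer: only tiles needed for the
    current scene stay resident, with least-recently-used tiles evicted
    when an assumed VRAM budget is exceeded."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.tiles = OrderedDict()  # tile_id -> size_bytes, oldest first

    def request(self, tile_id, size_bytes):
        if tile_id in self.tiles:
            self.tiles.move_to_end(tile_id)  # mark as recently used
            return "hit"
        # Evict LRU tiles until the new tile fits within the budget.
        while self.used + size_bytes > self.budget and self.tiles:
            _, evicted_size = self.tiles.popitem(last=False)
            self.used -= evicted_size
        self.tiles[tile_id] = size_bytes
        self.used += size_bytes
        return "miss"  # a miss is where the IO request to the NVMe drive would go
```

The trade-off matches the quote above: many small IO requests instead of a few big loads, in exchange for resident memory that never grows past the budget.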
 
There's a lot of AI stuff being developed though. I wonder if that will need much RAM? Also the idea that some of the RT work will be pre-calculated to improve the in-game efficiency. All got to be stored somewhere I suppose.
 
Basically 95% of all PC gamers right now have less than 8GB, and for years and years the best those 95% will upgrade to is 10GB, as most will buy 3080s or below, and the consoles won't have access to more than 8-10GB either for the next 6 years. So what's this fabled new id game going to run on then?
Why don't you ask them? You can tell them they're wrong and you know best :p
 
They will definitely release Ti and/or Super versions. I'm surprised that people have such short memories and are actually in denial that there might be a Ti coming.


Indeed, many (of the few) Titan owners were a bit miffed when the 1080 Ti was released, giving most of the performance of the Titan.

It is a clever way of getting those 3090s sold, testing those (like me) who maybe just can't wait. A bit annoying for the consumer that Nvidia can't map out the release of the cards to fill that massive gap between the 3080 and 3090, even just considering the price doubling. They need summat left in the pot to release when AMD bring theirs out, I guess.

There have to be Tis and Supers, like with the 10 and 20 series.

If the 3090 does what it says, it could be the last card I need for a while as a 4K 120Hz gamer, seeing as Jensen touted it to do 8K @ 60fps!
 
If the 3090 does what it says, it could be the last card I need for a while as a 4K 120Hz gamer, seeing as Jensen touted it to do 8K @ 60fps!
No chance, marketing crap.

Did you see the fps charts of the games they ran? Control was down to 8fps at the lowest; it was only fairly light-workload games that ran well.
 
This rumour does sort of remind me of the Maxwell era.

Nvidia released the Titan X around March 2015 at £900-ish, and then by June had released a £300 cheaper 980 Ti, which was essentially a Titan X with half the VRAM... but at the time 6GB was plenty anyway.
 
No chance, marketing crap.

Did you see the fps charts of the games they ran? Control was down to 8fps at the lowest; it was only fairly light-workload games that ran well.

Oh, not seen that. Are there some proper reviews out now then? I do have to say that the whole DLSS thing confuses me a bit.
 
Oh, not seen that. Are there some proper reviews out now then? I do have to say that the whole DLSS thing confuses me a bit.
[Image: RTX-3090-8-K-gaming-benchmarks.png]


Seems like they only show minimums for some of the games, no clue why?
 
[Image: RTX-3090-8-K-gaming-benchmarks.png]


Seems like they only show minimums for some of the games, no clue why?

No, they are showing the average frame rate in the darker green for standard gameplay, and the average frame rate with DLSS 2.0 on in the lighter green. Only 4 titles from those benchmarks support DLSS.

Basically Control was averaging 8fps at 8K with ray tracing enabled; that wasn't the minimum!

Where's that slide from? Note the games that did quite well are all relatively old.

Rocket League and Rainbow Six Siege are 5 years old.
Destiny 2 is 3 years old

etc
 
No, they are showing the average frame rate in the darker green for standard gameplay, and the average frame rate with DLSS 2.0 on in the lighter green. Only 4 titles from those benchmarks support DLSS.

Basically Control was averaging 8fps at 8K with ray tracing enabled; that wasn't the minimum!

Where's that slide from? Note the games that did quite well are all relatively old.

Rocket League and Rainbow Six Siege are 5 years old.
Destiny 2 is 3 years old

etc
Oh right, got it. I'm used to seeing averages/minimums laid out like that in charts and didn't notice.

https://www.dsogaming.com/news/nvidia-shares-first-8k-gaming-benchmarks-for-its-ampere-rtx-3090-gpu/
 
Oh right, got it. I'm used to seeing averages/minimums laid out like that in charts and didn't notice.

https://www.dsogaming.com/news/nvidia-shares-first-8k-gaming-benchmarks-for-its-ampere-rtx-3090-gpu/


Still, at 8K those results are awesome, and plenty of room for 4K 120Hz should be possible with a 3090. As you say, all we've seen so far is marketing stuff from Nvidia; third-party professional reviews will sort the wheat from the chaff. When is the NDA lifted for third-party reviews? Do you know, or is it rumoured?
 
Indeed, many (of the few) Titan owners were a bit miffed when the 1080 Ti was released, giving most of the performance of the Titan.

It is a clever way of getting those 3090s sold, testing those (like me) who maybe just can't wait. A bit annoying for the consumer that Nvidia can't map out the release of the cards to fill that massive gap between the 3080 and 3090, even just considering the price doubling. They need summat left in the pot to release when AMD bring theirs out, I guess.

There have to be Tis and Supers, like with the 10 and 20 series.

If the 3090 does what it says, it could be the last card I need for a while as a 4K 120Hz gamer, seeing as Jensen touted it to do 8K @ 60fps!

Yep, that was me. Bought the Titan, and then 2 months later the £300 cheaper 1080 Ti came out with 96% of the performance.
 
8K gaming is so pointless that it makes me wonder why they didn't showcase 4K/120Hz instead.

How many people have 8K screens around the world right now? A few thousand?

Anyone who bought an LG OLED in the last 2 years has a 4k/120hz native panel sitting there already.
 
780Ti. 980Ti. 1080Ti. 2080Ti.

Why do people doubt there will be a 3080 Ti? It's just a question of how soon.
Probably because there wasn't a 790, 990, 1090 or 2090. When there was a 590 and 690, there wasn't a 580 Ti or 680 Ti.

So the pattern has changed.

I agree there is a gap for another card. I just don't believe we can make any assumptions about its specs at the moment (except that it'll be faster than the 3080 and slower than the 3090).
 
Probably because there wasn't a 790, 990, 1090 or 2090. When there was a 590 and 690, there wasn't a 580 Ti or 680 Ti.

So the pattern has changed.

I agree there is a gap for another card. I just don't believe we can make any assumptions about its specs at the moment (except that it'll be faster than the 3080 and slower than the 3090).

There was a Titan X (Maxwell), a Titan X (Pascal), a Titan Xp and an RTX Titan though.

The '3090' seems to have taken their place.
 