3080TI launching this year $999 with 20GB VRAM

Associate
Joined
31 Jul 2019
Posts
515
The new I/O technology is designed to load stuff into VRAM faster, but it's still using the PCIe bus, which is far, far slower than the VRAM itself. There's still going to be a huge performance penalty for running out of VRAM and needing to shuffle stuff in and out, regardless of whether it's coming from system memory or direct from an SSD.
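To put rough numbers on that gap, here is a back-of-envelope sketch. The bandwidth figures are ballpark public numbers I'm assuming (PCIe 4.0 x16 peak around 32 GB/s; GDDR6X on a 3080 around 760 GB/s), not measurements:

```python
# Back-of-envelope comparison of PCIe bus vs VRAM bandwidth.
# Both figures are assumed ballpark peaks, not measured values.
PCIE4_X16_GBPS = 32.0     # PCIe 4.0 x16, approximate peak
GDDR6X_3080_GBPS = 760.0  # RTX 3080 GDDR6X, approximate peak

def transfer_time_ms(size_gb: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move size_gb gigabytes at the given GB/s rate."""
    return size_gb / bandwidth_gbps * 1000.0

# Shuffling 2 GB of assets over the bus vs reading them from VRAM:
over_pcie = transfer_time_ms(2.0, PCIE4_X16_GBPS)    # ~62.5 ms
in_vram = transfer_time_ms(2.0, GDDR6X_3080_GBPS)    # ~2.6 ms
print(f"PCIe: {over_pcie:.1f} ms, VRAM: {in_vram:.1f} ms")
```

With a 60fps frame budget of about 16.7 ms, a 2 GB shuffle over the bus eats several whole frames, which is why running out of VRAM stutters no matter how fast the source disk is.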

Thanks. Moving from a 4GB 670 to a new card will obviously be a huge performance leap whatever, but don't want to spend a load of cash and then find I'm short on VRAM. Just need to wait for the benchmarks, and the AMD launch, and the console launches...
 
Associate
Joined
12 Jul 2020
Posts
288
Nay chance. Why do you think the gap between the 3080 and the 3090 is so big? So they can slap a card in at around the £1k mark.
I guess. I'm not prepared to pay anything north of that, and even then I had to convince myself that £700 for a single component is worth it.

The RTX 3070 does appear to be the sweet spot for performance/wattage/price, though.
 
Associate
OP
Joined
16 Jan 2010
Posts
1,423
Location
Earth
Is the extra 10GB VRAM likely to be a worthwhile upgrade for a £250-300 price increase? [Edit: noted that it will have extra CUDA cores too]

Also, do we know enough about this touted new gizmo where the GPU accesses data on SSD drives in a more effective way? Does that perhaps reduce the VRAM requirements?
I think the gap between 3080 and 3080 TI is potentially bigger than 2080TI to 2080S and 1080ti vs 1080. Exciting times.
 
Associate
Joined
31 Jul 2019
Posts
515
I think the gap between 3080 and 3080 TI is potentially bigger than 2080TI to 2080S and 1080ti vs 1080. Exciting times.

It is genuinely exciting to be a nerd right now. I mean I'm 38, have a respectable job in the city, wife and kids, and here I am getting all over-enthusiastic about computer games!
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Is the extra 10GB VRAM likely to be a worthwhile upgrade for a £250-300 price increase? [Edit: noted that it will have extra CUDA cores too]

Also, do we know enough about this touted new gizmo where the GPU accesses data on SSD drives in a more effective way? Does that perhaps reduce the VRAM requirements?

10GB of extra VRAM will be a waste for gaming. The few games we have right now that can throw enough assets into VRAM to exceed 10GB simply won't run fast enough to be playable, as evidenced by FS2020 at 4K Ultra. Any extra assets in VRAM that are actually being used by the GPU, rather than just mis-allocated, also put more demand on the GPU itself.

Games have grown from 10GB installs to 100GB+, yet VRAM usage hasn't really changed much: the vast majority of AAA games in the last few years use somewhere in the region of 5-6GB when maxed out. Of the few exceptions that go above 8GB, we know most of the time that's not real usage; it's allocated-but-unused memory, as 8GB cards suffer no impact from their "limited" amount of VRAM. Resident Evil is a great example of this. The simple fact is that engines are now good enough to predict what will be needed and dynamically stream assets in and out of VRAM from disk, so the only thing we need to keep in VRAM is the set of assets used in actual rendering. We don't need some fancy cache; if we did, we'd need 100GB+ VRAM cards.

The RTX IO thing uses the new Microsoft tech to basically let the GPU bypass the CPU and fetch assets direct from disk, for much faster and more streamlined fetching without putting too much strain on the CPU for decompression of assets. The CPU ferrying that data back and forth is starting to become a bottleneck as game installs reach 200GB and beyond. This has been a primary consideration for the consoles and their move to not only SSDs as primary game storage but very fast ones; the PS5 will have an SSD rated at 5.5GB/sec specifically to let developers move towards more aggressive streaming into VRAM.
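A quick sketch of why the loading path matters. All the rates below are illustrative assumptions (the 5.5 GB/s figure is the quoted PS5 raw number; the SATA-plus-CPU and NVMe-plus-GPU-decompression figures are my own rough guesses, including an assumed ~2:1 compression ratio):

```python
# Effective-throughput sketch for different asset-loading paths.
# All rates are illustrative assumptions, not vendor specs.
def load_time_s(install_gb: float, effective_gbps: float) -> float:
    """Seconds to stream install_gb of assets at the given effective rate."""
    return install_gb / effective_gbps

SATA_CPU_PATH = 0.5        # assumed: SATA SSD + CPU-side decompression
PS5_SSD = 5.5              # raw figure quoted in the thread
NVME_GPU_PATH = 7.0 * 2.0  # assumed: 7 GB/s NVMe, ~2:1 GPU decompression

for name, rate in [("SATA+CPU", SATA_CPU_PATH), ("PS5 SSD", PS5_SSD),
                   ("NVMe+GPU decompress", NVME_GPU_PATH)]:
    print(f"{name}: {load_time_s(20.0, rate):.1f} s to load 20 GB")
```

Under these assumptions, streaming 20 GB of assets drops from ~40 seconds on the old path to a couple of seconds, which is what makes aggressive on-demand streaming viable at all.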

We had this discussion over about 30 pages in the Ampere rumour thread leading up to the Nvidia reveal, where I basically suggested that whereas in the past we'd load all the assets a "level" needs into VRAM and treat it as a fast cache, modern engines simply stream in whatever they need, or think they'll need, in the near future. It means VRAM stopped growing with the size of games, and we're now in a paradigm where most of the VRAM usage you'll see is only the assets the GPU needs right now or expects to need in a few seconds. So the amount of VRAM you need really is proportional to what you can fit on screen at once, and how much you can fit on screen at once is basically limited by your GPU's ability to number-crunch and provide you with frames. Pile too many assets on screen at once and your GPU will be the bottleneck.
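The "stream what's needed soon" idea above can be sketched as a fixed VRAM budget with least-recently-used eviction. This is a toy illustration, not any real engine's residency manager, and all the asset names and sizes are made up:

```python
# Toy sketch of streaming asset residency: keep only assets the
# renderer will touch soon inside a fixed VRAM budget, evicting
# the least recently used. Names and sizes are hypothetical.
from collections import OrderedDict

class VramStreamer:
    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def used_mb(self) -> int:
        return sum(self.resident.values())

    def request(self, asset: str, size_mb: int) -> None:
        """Called when the engine predicts `asset` is needed soon."""
        if asset in self.resident:
            self.resident.move_to_end(asset)  # mark as recently used
            return
        # Evict least-recently-used assets until the new one fits.
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[asset] = size_mb  # "upload" to VRAM

streamer = VramStreamer(budget_mb=1000)
streamer.request("terrain", 600)
streamer.request("buildings", 300)
streamer.request("npc_textures", 400)  # evicts "terrain" to make room
print(list(streamer.resident))  # ['buildings', 'npc_textures']
```

The point of the sketch: the budget only needs to cover what's on screen soon, not the whole install, which is why install sizes grew 10x while VRAM usage didn't.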

The 24GB of the 3090 is insane; that's basically targeted at CAD and professional design people who are doing non-real-time renders and that kind of thing. It's not at all necessary for gaming, but it's causing people to guesstimate that what the 3080 really needs is 16GB of VRAM and that 10GB won't be enough. It's also worth keeping in mind that PC games are no longer PC games: they're multiplatform games, and developers target the lowest common denominator, which is the consoles. Next gen is going 16GB total, minus 2GB (or likely more) for the OS and all the "apps", then minus 4GB or more for the game engine itself, leaving games realistically with no more than 8GB for graphics, 10 at a push.
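The console budget in that last paragraph is just subtraction; spelled out (the OS and engine reservations are the poster's estimates, not official figures):

```python
# Next-gen console memory budget from the post, as arithmetic.
# The OS and engine splits are estimates, not official figures.
total_gb = 16    # next-gen console unified memory
os_gb = 2        # estimated OS/apps reservation
engine_gb = 4    # estimated game code and simulation state

graphics_budget_gb = total_gb - os_gb - engine_gb
print(f"~{graphics_budget_gb} GB left for graphics assets")  # ~10 GB
```

That ~10 GB ceiling is the upper bound ("10 at a push") behind the claim that multiplatform games won't demand much more than a 10GB card.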
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
10Gb of extra vRAM will be a waste for gaming.
A programmer at iD software just said 8GB is the bare minimum for upcoming titles. Flat out said anything less than 8GB is not worth buying. Poor 3060 :p

I'm sure you'll keep insisting that 12/16/20 is a "waste", but if AMD release cards with more VRAM - and if nV release refreshes with more VRAM - I hope to see you back here saying you were wrong :p
 
Caporegime
Joined
18 Oct 2002
Posts
39,454
Location
Ireland
A programmer at iD software just said 8GB is the bare minimum for upcoming titles. Flat out said anything less than 8GB is not worth buying. Poor 3060 :p


He can only really speak for their engine and what they have coming up, not as if he's in the know about what competing engines are doing in terms of vram usage and texture sizes.
 
Soldato
Joined
11 Mar 2013
Posts
5,451
If I was buying a 3080 card I would want more vram. I would wait for more vram on the 3080 cards. I also hope they redesign the shroud. I don't like it at all. It's tacky. The 10 and 20 series shrouds were much better
 

smr

Soldato
Joined
6 Mar 2008
Posts
8,753
Location
Leicestershire
If I was buying a 3080 card I would want more vram. I would wait for more vram on the 3080 cards. I also hope they redesign the shroud. I don't like it at all. It's tacky. The 10 and 20 series shrouds were much better

Not everyone needs more than 10GB of VRAM, which is why they designed the 3080 as is. I'm sure they have a pretty experienced, well-qualified marketing team at Nvidia :)

I have a 2GB graphics card right now, so 10GB is very nice indeed, especially for £650.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
A programmer at iD software just said 8GB is the bare minimum for upcoming titles. Flat out said anything less than 8GB is not worth buying. Poor 3060 :p

I'm sure you'll keep insisting that 12/16/20 is a "waste", but if AMD release cards with more VRAM - and if nV release refreshes with more VRAM - I hope to see you back here saying you were wrong :p

That's interesting; where are the consoles going to get all that extra memory from? There's no way their GPUs will have access to more than 10GB of VRAM, and that's the case for the next 6 years.

Having more VRAM than you need is a waste, just like having more system RAM than you need is a waste: any VRAM beyond what the job requires adds no value. The RTX Titan can't break 30fps with 12GB of assets in VRAM; you think you're going to fill 20GB? At what, 3fps? No one is going to play games at 25fps or lower.

They will release cards with more VRAM if the GPU can warrant the increase; otherwise they're just adding cost with no benefit. I have no problem admitting if I'm wrong, but all the actual evidence so far shows that games basically don't use more than 8GB, on average 6GB or less, and new cards will run into GPU bottlenecks before they run into VRAM ones. At least my speculation is based on data.

Basically 95% of all PC gamers right now have less than 8GB, and for years to come the best most of them will upgrade to is 10GB, as most will buy 3080s or below; the consoles won't have access to more than 8-10GB either for the next 6 years. So what's this fabled new id game going to run on?
 