NVIDIA ‘Ampere’ 8nm Graphics Cards

I honestly don't think my 1080 Ti can actually *use* all 11GB of VRAM. I just don't think the GPU and bus can move that much data quickly enough.
Try some modded games with ENBs, and run them at 4K with 4K and higher-resolution assets.


At what FPS? Would it even be double digits?

My 1080 Ti barely runs my HP Reverb at 90fps whilst using ~6GB of VRAM. I don't want to see what would happen to my frame rates if it were using 11GB.
 
Because I'm an idiot and I'm on the brink of getting a new PC... do you think the current-gen Intel chips/mobos that are only PCIe 3.0 could hold something like the 3090 back, or do you think it's not really something to worry about?

As others have said, we don't know for sure, but I'd bet large amounts of money that it'll make no practical difference at all.

If you're planning on going Intel now, then get a decent Z490 board; you can always drop a Rocket Lake CPU in later this year or early next, which will give you PCIe 4.0 for the GPU anyway.
 
At what FPS? Would it even be double digits?

I have seen what running out of VRAM does: it craters FPS. It's one of the symptoms to look for when neither your GPU nor your CPU is at 100% utilisation.

This is what happened with my GTX 1080 when there was a bug in the DX12 path in ROTR in the Village of the Damned. FPS just plummeted because VRAM utilisation was right on the edge. It was fixed if I used DX11 or dropped settings to medium.
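
If you want to spot that pattern on your own machine, here's a minimal sketch (assuming a single NVIDIA card with nvidia-smi on the PATH; the one-second poll is just a convenient choice) that logs GPU load against VRAM use, so you can see whether FPS drops line up with memory filling rather than the GPU maxing out:

```python
# Minimal VRAM-vs-GPU-load logger. Assumes one NVIDIA GPU and that
# nvidia-smi is on the PATH; run it in a terminal while you play.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    util, used, total = subprocess.check_output(QUERY, text=True).strip().split(", ")
    # Low GPU % with near-full VRAM is the tell-tale spill-over pattern.
    print(f"GPU {util}% | VRAM {used}/{total} MiB")
    time.sleep(1)
```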

Yeah, it would be a good match for a 3080/3090; others may have opinions on other manufacturers. I have an SF750 SFX Platinum unit, which is on the low side, but it should do. If not, I'll have to opt for a bigger unit.

Is this in an SFF PC?? Why are you trying to shoehorn a 300~350W GPU into a system like that? It's going to cause your GPU to start throttling in a small case: blowers at least push air out of the system, but here the whole case is going to heat up, including the CPU.

Surely get an RTX 3070, or see if Big Navi is more efficient.
 
Why wouldn't you sell your 2080 Ti? You know the value of the card is going to insta-drop when they announce the 3080 for £200-400 less than a 2080 Ti: a card that's faster, has the most up-to-date support and smashes Turing at its only trump card, RTX.

Sell the 2080 Ti now at £700... or wait a day for that brand-new-car level of depreciation.

This. I sold mine for £900 about a month ago. That's how quickly it goes down.
 
I have seen what running out of VRAM does: it craters FPS. It's one of the symptoms to look for when neither your GPU nor your CPU is at 100% utilisation.

This is what happened with my GTX 1080 when there was a bug in the DX12 path in ROTR in the Village of the Damned. FPS just plummeted because VRAM utilisation was right on the edge. It was fixed if I used DX11 or dropped settings to medium.


Let me put it another way. Do you think my old GTX 560 would have any use for 11GB of VRAM?

I do not think it would.
 
Crysis was a graphics demo masquerading as a game. It may have been amazing to look at, but it wasn't representative of gaming in general at the time, and thus its benchmarks were merely a curiosity (i.e. the infamous 'can it run Crysis?') rather than actually useful to any degree.
I enjoyed it as a game AND it looked amazing. Gaming always needs ambitious projects to push the envelope. Is FS 2020 a graphics demo masquerading as a game? 'Representative of gaming now' means what, exactly? Plants vs Zombies, Skyrim with all possible mods, CS:GO, RDR2, MS FS 2020? Tell me which are representative, oh wise one...?
 
Next-gen consoles have access to 14GB of memory: 16GB total, with 2GB reserved for the OS, leaving 14GB for the game. On current-gen consoles, games have access to 7GB of memory.

Having experienced several console generations as a PC gamer, I can tell you that if any spec on your PC is lower than a console's, you're going to have a bad time.

That's the total for the system, the game engine and the video RAM. So out of the remaining 14GB you have to fit the game engine as well, which on the PC goes straight into system RAM and can itself easily be 4-6GB, leaving somewhere in the neighbourhood of 8-10GB for use as vRAM. And that's precisely what you'd expect, because the GPUs in the next-gen consoles are something in the ballpark of a 1080, and that's the vRAM those cards had access to. The next-gen consoles are NOT (*edit: sorry, this was wrong) going to blow past 8GB of vRAM usage, because you'll be at 10fps loading that many high-quality assets into your game engine.
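
To make the arithmetic explicit, here's that budget as a quick sketch; all the figures are the ballpark numbers above, not official specs:

```python
# Rough next-gen console memory budget, per the reasoning above.
# All figures are ballpark numbers from this thread, not official specs.
total_memory = 16                 # GB of unified memory
os_reserved = 2                   # GB held back for the OS
engine_low, engine_high = 4, 6    # GB the game engine itself might need

game_usable = total_memory - os_reserved    # 14 GB for the game
vram_high = game_usable - engine_low        # ~10 GB left for "vRAM"
vram_low = game_usable - engine_high        # ~8 GB left for "vRAM"

print(f"Game-usable memory: {game_usable} GB")
print(f"Effective vRAM budget: {vram_low}-{vram_high} GB")
```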

There seems to be this pervasive misunderstanding that vRAM is chosen to future-proof the card against what upcoming games will demand, and that's wrong. The purpose of vRAM is to feed the GPU data to do work on; you can't just keep loading data into the game world without an impact on your performance. If there's a floor to GPU performance (a minimum acceptable frame rate), then there's an equivalent ceiling to useful vRAM. The amount of vRAM to put on a card is directly proportional to the amount of work the GPU can do; that's how you make vRAM decisions. The exceptions are probably cards aimed more at CAD or other non-gaming work like professional rendering.
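
One back-of-envelope way to picture that ceiling (the bandwidth figure below is roughly a GTX 1080's, used purely as an illustration, and this is my framing of the argument rather than a precise model):

```python
# Back-of-envelope "useful vRAM ceiling": the GPU can only touch so
# much data per frame, so past some point extra vRAM can't be consumed
# at an acceptable frame rate. Bandwidth figure is roughly a GTX 1080's,
# purely illustrative.
bandwidth_gb_per_s = 320   # GB/s of memory bandwidth
min_fps = 60               # the "floor" on acceptable performance

max_data_per_frame = bandwidth_gb_per_s / min_fps   # ~5.3 GB/frame
print(f"Upper bound on data touched per frame: {max_data_per_frame:.1f} GB")
# Real frames read many bytes more than once (multiple passes, overdraw),
# so the practically useful working set is smaller still.
```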
 
We will need to see how AMD and Nvidia price this generation. It might be worth looking at AMD and a new monitor if they do price competitively (and that monitor would work with Nvidia GPUs too).

Grabbing a new monitor that I don't really want will cost me more in the long run. I'd only change this monitor for another ultrawide with full-array local dimming and an HDR1000 rating, for approx £1,000. No such monitor exists; only the Samsung G9 is remotely close, but it's too wide for me and a bit more expensive than that.
 
Let me put it another way. Do you think my old GTX 560 would have any use for 11GB of VRAM?

I do not think it would.

My GTX 1080 had more than enough grunt to run the game. But when it ran out of VRAM, the FPS tanked.

[Image: gjlWVWz.jpg — FPS/VRAM usage chart]


The VRAM was teetering on the edge of 8GB under DX12. The moment it went slightly over 8GB of actual usage, the FPS literally halved and the game was also more stuttery when I panned.

I think it was fixed later on, but a lot of what people think is the GPU running out of grunt can be VRAM limitations.

Your GTX 1080 Ti can use all 11GB, as texture mods are not reliant on your GPU. I run modded Fallout 4, which is capped at 60FPS. With some of the 4K and 8K etc. texture mods, once usage goes over 8GB, performance craters in the same way.

GPU utilisation was nowhere near 100% in those scenarios with a GTX 1080. The GPU wasn't the problem; the amount of VRAM was.

If I had a GTX 1080 Ti, I would see more consistent performance at QHD thanks to the extra VRAM.
 
Anyone following this thread for a few days will have seen lots of examples of games using 8GB+ of VRAM. Not just MSFS. Plenty of games.

It's why I am much more careful with modding now, but I had just got the new shiny, so I decided to see what I could get away with!

:P


It was the 3080 I was looking at; that would mean looking at the 3070 instead, if it doesn't require over 650W... I'm probably going to need to order a new PSU.

Well, I run a GTX 1080 FE off a Corsair SF450, so I suspect an RTX 3070 would be absolutely fine on your PSU. Total board power is 220W IIRC, which is around 40W higher than a GTX 1080 FE's. A mate ran an FX CPU and an R9 390 off a 500W PSU for years. I think the GA104-based GPUs will be fine; it looks like the GA102-based ones might be the thirsty ones. OTOH, we need to wait and see how performance stacks up; I would still expect the RTX 3080 to offer better performance/watt than an RTX 2080 Ti.
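
As a rough sanity check on the headroom (the GPU figure is the rumoured 220W board power; the CPU and rest-of-system numbers are just generic estimates, not measurements):

```python
# Quick-and-dirty PSU headroom check using ballpark figures.
# 220 W is the rumoured RTX 3070 board power; the other numbers are
# generic estimates for a typical system, not measurements.
gpu_board_power = 220    # W, rumoured total board power
cpu_power = 150          # W, a toasty mainstream CPU under load
rest_of_system = 50      # W, motherboard, RAM, drives, fans

load = gpu_board_power + cpu_power + rest_of_system
for psu_watts in (450, 650, 750):
    print(f"{psu_watts} W PSU: {load} W load ({100 * load / psu_watts:.0f}% of capacity)")
```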

I think perhaps the RTX 3080 could be doable, but the issue is going to be what the power peaks look like. However, the Gainward RTX 3080 did appear to use dual 8-pin PCI-E power connectors, so maybe its 320W rated board power is closer to a peak figure than the RTX 3090's?? TBH, I wonder if AMD will be more efficient this time around. It will be interesting to see how this all pans out.
 
It was the 3080 I was looking at; that would mean looking at the 3070 instead, if it doesn't require over 650W... I'm probably going to need to order a new PSU.
Just realised my Antec/Seasonic TP-650 is 10 years old in a couple of months.

"Supposed" to replace them after a few years anyhow, or so I thought.
 
Just thought: all of the rumoured prices for the cards are in $. Are we expecting it to be a simple £/$ swap, i.e. $1,400 becomes £1,400?

If they are, then there is a simple answer: don't buy them at that price?? :p

Just realised my Antec/Seasonic TP-650 is 10 years old in a couple of months.

"Supposed" to replace them after a few years anyhow, or so I thought.

My mate's TP550 has lasted nearly a decade too, and for five of those years it was running an overclocked FX-6350 and an R9 390.
 
Anyone following this thread for a few days will have seen lots of examples of games using 8GB+ of VRAM. Not just MSFS. Plenty of games.

The game will use whatever is there; my 2080 Ti used over 10GB in The Division 2, while my 5700 XT uses just over 7GB with the same settings and resolution (LG 38GL950G-B). No stutters etc.; both completely smooth (the Ti hits higher FPS, obviously).
 
My GTX 1080 had more than enough grunt to run the game. But when it ran out of VRAM, the FPS tanked.

[Image: gjlWVWz.jpg — FPS/VRAM usage chart]


The VRAM was teetering on the edge of 8GB under DX12. The moment it went slightly over 8GB of actual usage, the FPS literally halved and the game was also more stuttery when I panned.

I think it was fixed later on, but a lot of what people think is the GPU running out of grunt can be VRAM limitations.

Your GTX 1080 Ti can use all 11GB, as texture mods are not reliant on your GPU. I run modded Fallout 4, which is capped at 60FPS. With some of the 4K and 8K etc. texture mods, once usage goes over 8GB, performance craters in the same way.

GPU utilisation was nowhere near 100% in those scenarios with a GTX 1080. The GPU wasn't the problem; the amount of VRAM was.

If I had a GTX 1080 Ti, I would see more consistent performance at QHD thanks to the extra VRAM.

Your chart has no explanation for the different colour traces.

And you never answered my question: do you think my old GTX 560 could use 11GB of VRAM?
 