
NVIDIA ‘Ampere’ 8nm Graphics Cards

That was a few years ago. You didn't really have a choice then - there was no cross support. If you had an AMD card you naturally gravitated towards a FreeSync monitor, and with Nvidia, G-Sync. It's much better now with Nvidia supporting both. I'd imagine there are loads of us in a similar situation.
And that was exactly the plan. Locked in.
 
I'm on a £700 G-Sync monitor for the time being that I'm not ready to replace, so what AMD do next won't make too much of a difference to me, sadly. Next time, once I've managed to change my monitor, I'd love to have a choice. I'll support AMD with a console buy :D

Buy both consoles... The more you buy, the more you save. That quote applies to everything. It's the wisdom of Jensen. It just works :P
 
Now that reveals one big issue IMO with reviewers, TechTubers and the overall review mentality: it's pretty much release reviews and then forget about the product. Not many go back after 6-12 months and retest, and even fewer actually dive deep into calibrating/tuning the products (that's a separate pet peeve of mine).

Not one of them (except maybe Gamers Nexus) will show anything other than FPS numbers. Nobody will look into memory usage or discuss future-proofing in the reviews, mark my words. It will just be FPS on five-year-old games, as per usual.
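Anyone who doesn't want to wait for reviewers can log this themselves. Here's a minimal sketch in Python, assuming an NVIDIA card with the nvidia-smi utility on the PATH; the output file name and polling interval are just placeholder choices. Bear in mind nvidia-smi reports allocated memory, not strictly what the game needs, which is the "needed vs cached" argument that comes up later in the thread.

# Minimal VRAM logger: polls nvidia-smi once per second and appends
# the used/total memory (in MiB) to a CSV file. Assumes an NVIDIA GPU
# with the nvidia-smi utility available on the PATH.
import subprocess
import time

LOG_FILE = "vram_log.csv"   # hypothetical output path
INTERVAL_SECONDS = 1

def query_vram():
    """Return (used_mib, total_mib) for GPU 0 as reported by nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ], text=True)
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    with open(LOG_FILE, "a") as log:
        while True:
            used, total = query_vram()
            log.write(f"{time.time():.0f},{used},{total}\n")
            log.flush()
            time.sleep(INTERVAL_SECONDS)

Run it in the background while playing and graph the CSV afterwards to see how close a game actually gets to the card's limit.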
 
Now that reveals one big issue IMO with reviewers, TechTubers and the overall review mentality: it's pretty much release reviews and then forget about the product. Not many go back after 6-12 months and retest, and even fewer actually dive deep into calibrating/tuning the products (that's a separate pet peeve of mine).

I think it was Hardware Unboxed that said they'd be using MSFS in their benchmarking from now on?
 
What do people think is a reasonable future-proof amount of VRAM for those playing at 1440p with no plans to move to 4k?

Next-gen consoles give games access to 14GB of memory - 16GB total, with 2GB reserved for the OS. On current-gen consoles, games have access to 7GB of memory.

Having experienced several console generations as a PC gamer, I can tell you that if any spec of your PC is lower than the console's, you're gonna have a bad time.
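To put some rough numbers on that, here's a back-of-the-envelope sketch using the figures quoted above; the split of the console's unified pool between CPU-side game data and GPU-side assets is a pure assumption for illustration, not a published figure.

# Back-of-the-envelope memory budgets, using the figures quoted above.
# The assumed GPU share of the console's unified pool is illustrative,
# not a published number.
CONSOLE_TOTAL_GB = 16
CONSOLE_OS_RESERVED_GB = 2
console_game_budget = CONSOLE_TOTAL_GB - CONSOLE_OS_RESERVED_GB  # 14 GB unified

# Hypothetical split of that unified pool between CPU-side game data
# and GPU-side assets (textures, buffers, render targets).
assumed_gpu_share = 0.7
console_gpu_data = console_game_budget * assumed_gpu_share  # ~9.8 GB

PC_VRAM_GB = 10  # e.g. the 3080 discussed in this thread
headroom = PC_VRAM_GB - console_gpu_data
print(f"Console GPU-side budget (assumed): {console_gpu_data:.1f} GB")
print(f"Headroom on a {PC_VRAM_GB} GB card: {headroom:.1f} GB")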
 
Is the amount of memory really gonna be an issue though?

I've been running a 2080 at 1440p for the last two years and never had any issues with the 8GB it has.

It is & it isn't. It depends on how much you care about texture quality & streaming distance/details.

E.g. look at this comparison and pay attention to pop-in & LOD transitions: https://youtu.be/bGc7Hr68X9g?t=31
If what you see there for consoles doesn't bother you, then VRAM shouldn't concern you; essentially that's the difference we're talking about.

And here's an HD textures comparison too:
https://youtu.be/WlXzMT6QjQE
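To give a feel for why texture quality and streaming distance are what eat the VRAM, here's a rough estimate of what a set of 4K textures costs. The compression format, mip overhead and resident texture count are all assumptions picked for round numbers, not figures from any particular game.

# Rough VRAM cost of a streamed set of 4K textures.
# BC7 is 1 byte per texel; a full mip chain adds roughly 1/3 on top.
# Texture count and sizes are made-up round numbers for illustration.
BYTES_PER_TEXEL_BC7 = 1
MIP_OVERHEAD = 4 / 3

def texture_mib(size_px: int) -> float:
    """Approximate size in MiB of a square texture with its mip chain."""
    return size_px * size_px * BYTES_PER_TEXEL_BC7 * MIP_OVERHEAD / (1024 ** 2)

per_4k_texture = texture_mib(4096)   # ~21.3 MiB each
resident_textures = 300              # assumed streaming pool
pool_gib = per_4k_texture * resident_textures / 1024
print(f"{per_4k_texture:.1f} MiB per 4K texture, "
      f"~{pool_gib:.1f} GiB for {resident_textures} resident textures")

And that's before geometry, render targets and frame buffers, which is where the pop-in/LOD trade-off in those videos comes from when the budget is tight.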
 
The whole 3080 10GB thing is actually really clever by Nvidia, because it makes the card much cheaper for them to make, but also because of how the reviews will go. When they all come out after release, the card will no doubt perform great, because the reviewers will be testing GTA V, The Witcher, CS:GO, etc. - the usual suspects - and the VRAM won't all be used up. But when you find a title that does use it all (MSFS), or more of them in 12 months' time, the card will be seriously gimped. By then you'll have already bought it based on the release reviews.
Basically the reviews won't show it gimped by memory saturation, but that is a real threat down the line.

Personally I think it's a subtle nudge by Nvidia to push people towards the more expensive 3090, but they need to be careful.
If they don't come out and confirm a 20GB 3080 or 16GB 3070 soon, it would be an invitation for AMD to come in and fill that VRAM void.

This is why I'm holding off, I want to see the whole line up from both sides.
 
I think that people shelling out £800 for a card marketed at 4K GAMING don't want to get their ego dented when id Software release their next game with "Nightmare" settings that aren't compatible with 10GB of VRAM.

This could also be relevant for an upcoming game that NVIDIA have their fingerprints on. I'm sure CDPR have made them aware of exactly what kind of specs are needed to run that game at ultra. It would be pretty shameful to have your shiny new 10GB card not meet the recommended specs of a next-gen game after two months.

All this arguing back and forth about whether it's "needed" or just cached is pointless. The bottom line is: would the consumer be better off with more VRAM for £800? Yes. NVIDIA will give you just enough for now though, until they want you to buy their next card.
 
Next-gen consoles give games access to 14GB of memory - 16GB total, with 2GB reserved for the OS. On current-gen consoles, games have access to 7GB of memory.

Having experienced several console generations as a PC gamer, I can tell you that if any spec of your PC is lower than the console's, you're gonna have a bad time.
Only 2GB for the OS? I've seen 3.5GB quoted.

Also, won't some game data as well as graphics data be loaded into the VRAM, whereas on the PC version the game data is loaded into RAM?
 
I think that people shelling out £800 for a card marketed at 4K GAMING don't want to get their ego dented when id Software release their next game with "Nightmare" settings that aren't compatible with 10GB of VRAM.

This could also be relevant for an upcoming game that NVIDIA have their fingerprints on. I'm sure CDPR have made them aware of exactly what kind of specs are needed to run that game at ultra. It would be pretty shameful to have your shiny new 10GB card not meet the recommended specs of a next-gen game after two months.

All this arguing back and forth about whether it's "needed" or just cached is pointless. The bottom line is: would the consumer be better off with more VRAM for £800? Yes. NVIDIA will give you just enough for now though, until they want you to buy their next card.


If the specs are true, then the 3080 10GB will have less VRAM than the 1080 Ti, which was released in March 2017. I think nothing more needs to be said.

A scandalous ripoff.
 
Only 2GB for the OS? I've seen 3.5GB quoted.

Also, won't some game data as well as graphics data be loaded into the VRAM, whereas on the PC version the game data is loaded into RAM?

You also need to appreciate it's GDDR6 system memory plus a very fast PCIe 4.0 SSD, so things can be loaded and unloaded from system memory very quickly, unlike on a PC. Consoles also use a more cut-down OS.
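As a rough illustration of why fast storage changes the memory-pressure picture, here's a quick calculation of how long it takes to stream a given amount of asset data from different drives; the throughput figures are ballpark marketing numbers, not benchmarks.

# How long it takes to (re)fill a given amount of memory from storage,
# which is why fast streaming lets a console get away with a smaller
# resident working set. Throughputs are ballpark figures, not benchmarks.
POOL_GB = 10  # amount of asset data to stream in

drives_gb_per_s = {
    "SATA SSD (~0.5 GB/s)": 0.5,
    "PCIe 3.0 NVMe (~3.5 GB/s)": 3.5,
    "PCIe 4.0 NVMe (~5 GB/s)": 5.0,
}

for name, speed in drives_gb_per_s.items():
    print(f"{name}: {POOL_GB / speed:.1f} s to stream {POOL_GB} GB")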
 
I think that people shelling out £800 for a card marketed at 4K GAMING don't want to get their ego dented when id Software release their next game with "Nightmare" settings that aren't compatible with 10GB of VRAM.

Ha, that's one of the examples always on my mind. I really couldn't believe it, but my Vega 64 would choke on Doom Eternal and start stuttering heavily until I reduced textures/streaming, after which it ran freely and smoothly. And that's not even an open-world game - it has very limited, restricted levels, and the textures themselves aren't quite fully up to a 4K standard. Definitely don't expect the next Doom game to choke any less with 10 GB.
 
What do people think is a reasonable future-proof amount of VRAM for those playing at 1440p with no plans to move to 4k?
I think 12 GB should be the absolute minimum. 3070 12 GB and 3080 16 GB would have been ideal.

I don't think that's unreasonable. The 3070 perf is supposed to match the 2080 Ti - an 11 GB card.

If the performance of the 2080 Ti justified 11 GB, then why should we suddenly believe that a 3070 only needs 8 GB for the same performance? It doesn't add up.
 