"The card will be seriously gimped."
I'm sure reviewers will try to find where it will be hampered. I'm also sure it will be really tough to find such scenarios on PCIe 4.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"Was a few years ago. You didn't really have a choice then - there was no cross support. If you had an AMD card you would've naturally gravitated towards a FreeSync monitor and, with Nvidia, G-Sync. Much better now with Nvidia supporting both. I'd imagine there's loads of us in a similar situation."
And that was exactly the plan. Locked in.
I'm on a £700 G-Sync monitor for the time being that I'm not ready to replace, so what AMD do next won't make too much of a difference to me, sadly. Next time, once I've managed to change my monitor, I'd love to have a choice. I'll support AMD with a console buy!
I'm going to run a 3080 on a 9 year old Corsair HX520W. Come at me.
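For anyone tempted to try the same, a rough power-budget sketch. The 320W figure is Nvidia's official total board power for the 3080, but the CPU and rest-of-system numbers below are illustrative assumptions, not measurements:

```python
# Rough PSU headroom estimate for a 3080 build (illustrative numbers).
gpu_board_power_w = 320   # RTX 3080 official total board power
cpu_power_w = 150         # assumed CPU package power under gaming load
rest_of_system_w = 50     # assumed fans, drives, RAM, motherboard
psu_capacity_w = 520      # Corsair HX520W rating

total_draw_w = gpu_board_power_w + cpu_power_w + rest_of_system_w
headroom_w = psu_capacity_w - total_draw_w

print(f"Estimated draw: {total_draw_w} W of {psu_capacity_w} W "
      f"({headroom_w} W headroom)")
# Estimated draw: 520 W of 520 W (0 W headroom)
# That leaves no margin for transient spikes, which on Ampere cards
# can briefly exceed the rated board power.
```

Under those assumptions the build lands at exactly the PSU's rating, with zero margin, so "come at me" is about right.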
Now that is revealing one big issue, IMO, with reviewers, TechTubers, and review mentality overall: it's pretty much review at release and forget about the product. Not many go back after 6-12 months and retest, and even fewer actually dive deep into calibrating/tuning the products (that's a separate pet peeve of mine).
Not one of them (except maybe Gamers Nexus) will show anything other than FPS numbers. Nobody will look into memory usage or discuss future-proofing in the reviews, mark my words. It will just be FPS on five-year-old games, as per usual.
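If anyone wants to log this themselves rather than wait for reviewers, here's a minimal sketch using the pynvml bindings to Nvidia's NVML (assuming an Nvidia driver and `pip install pynvml`; the one-second polling interval and GPU index 0 are arbitrary choices):

```python
# Minimal VRAM usage logger via NVML (run alongside a game or benchmark).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: total/used/free
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.2f} / {total_gb:.2f} GiB")
        time.sleep(1.0)  # arbitrary 1 s polling interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

One caveat: NVML reports allocated VRAM, not what the game strictly needs. Engines will happily cache into whatever is free, which is exactly the "needed vs. cached" argument that comes up later in this thread.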
What do people think is a reasonable future-proof amount of VRAM for those playing at 1440p with no plans to move to 4k?
Is the amount of memory really gonna be an issue though?
I've been running a 2080 at 1440p for the last two years and never had any issues with the 8GB it has.
The whole 3080 10GB thing is actually really clever by Nvidia. It makes the card much cheaper for them to make, but also: when the reviews all come out at release, the card will no doubt perform great, because the reviewers will be testing GTA V, The Witcher, CS:GO, etc. - the usual suspects - and the VRAM won't all be used up. But when you find a title that does use it all (MSFS), or more in 12 months' time, the card will be seriously gimped. By then you've already bought it based on the release reviews.
Basically, the reviews won't show it gimped by memory saturation, but that is a real threat down the line.
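To put rough numbers on why higher-res assets can blow past 10GB, a back-of-envelope sketch. The texture count and sizes are made-up illustrative assumptions, and real games use compressed formats (BCn) that shrink these figures a lot:

```python
# Back-of-envelope VRAM cost of uncompressed RGBA8 textures with mipmaps.
def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / 1024**2

# Hypothetical asset budget: 500 resident textures at 2K vs. at 4K.
count = 500
size_2k = count * texture_mib(2048, 2048) / 1024  # in GiB
size_4k = count * texture_mib(4096, 4096) / 1024

print(f"500 x 2K textures: ~{size_2k:.1f} GiB")   # ~10.4 GiB
print(f"500 x 4K textures: ~{size_4k:.1f} GiB")   # ~41.7 GiB
```

Even with heavy compression knocking those numbers down several times over, the jump from one texture tier to the next is large enough that a 10GB budget that looks roomy today can get eaten quickly.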
"Next-gen consoles have access to 14GB of memory - 16GB total with 2GB reserved for the OS, so the game can use 14GB. On current-gen consoles, games have access to 7GB."
Only 2GB for the OS? I've seen 3.5GB quoted. Also, won't some game data as well as graphics data be loaded into VRAM, whereas on the PC version the game data is loaded into RAM?
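For concreteness, here's the budget being described, next to an illustrative PC split (console figures as quoted above; the PC numbers are assumptions for comparison):

```python
# Unified console memory pool vs. a split PC setup, figures as quoted above.
console_total_gb = 16
console_os_gb = 2                                   # OS reservation as quoted
console_game_gb = console_total_gb - console_os_gb  # 14 GB for the game,
                                                    # shared across CPU + GPU

# On PC the pools are separate: game data sits in system RAM,
# graphics data in VRAM, with copies moved across PCIe.
pc_vram_gb = 10   # e.g. a 3080 (assumption for comparison)
pc_ram_gb = 16    # illustrative system RAM

print(f"Console: {console_game_gb} GB unified, split however the game likes")
print(f"PC: {pc_vram_gb} GB VRAM + {pc_ram_gb} GB RAM, separate pools")
```

The unified pool is why console figures don't map cleanly onto a PC VRAM number: a console game can shift its 14GB between "RAM" and "VRAM" uses as it pleases.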
Having experienced several console generations as a PC gamer, I can tell you that if any spec in your PC is lower than the console's, you're gonna have a bad time.
I think that people shelling out £800 for a card marketed at 4K GAMING don't want to get their ego dented when id release their next game with "Nightmare" settings that won't fit in 10GB of VRAM.
This could also be relevant for an upcoming game that NVIDIA have their fingerprints on. I'm sure CDPR have made them aware of exactly what specs are needed to run that game at ultra. It would be pretty shameful for your shiny new 10GB card not to meet the recommended specs of a next-gen game after two months.
All this arguing back and forth about whether it's "needed" or just cached is pointless. The bottom line is: would the consumer be better off with more VRAM for £800? Yes. NVIDIA will give you just enough for now, though - until they want you to buy their next card.
"What do people think is a reasonable future-proof amount of VRAM for those playing at 1440p with no plans to move to 4k?"
I think 12GB should be the absolute minimum. A 12GB 3070 and a 16GB 3080 would have been ideal.
What time is the Nvidia event in BST?