So the latest 5090 rumour is that it will have a proper 512-bit bus and 32GB of DDR7 VRAM.
Definitely holding out for one now.
Hope you mean GDDR7
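For anyone wondering why those two numbers travel together: GDDR7 packages expose a 32-bit interface, so a 512-bit bus implies sixteen chips, and sixteen 2GB chips gives 32GB. A rough back-of-envelope sketch of the arithmetic, with the per-pin speed purely assumed (rumours range from 28 to 36 Gbps):

```python
# Rough back-of-envelope for the rumoured memory config; the per-pin data
# rate below is an assumption, not a confirmed spec.
BUS_WIDTH_BITS = 512
BITS_PER_CHIP = 32        # GDDR7 packages expose a 32-bit interface
GB_PER_CHIP = 2           # 16 Gbit (2 GB) chip densities
DATA_RATE_GBPS = 28       # assumed per-pin speed; rumours range 28-36

chips = BUS_WIDTH_BITS // BITS_PER_CHIP               # 16 packages
capacity_gb = chips * GB_PER_CHIP                     # 32 GB total
bandwidth_gbs = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8   # GB/s of raw bandwidth

print(f"{chips} chips, {capacity_gb} GB, ~{bandwidth_gbs:.0f} GB/s")
# -> 16 chips, 32 GB, ~1792 GB/s
```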
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I think the one thing we will see, next generation, is more vram.
All the 3080/3080ti/4070ti/4070 owners will inevitably be forced to upgrade again when a broken game gets ported across to PC. All part of Jensen’s plan.
See below.
This. The top cards will have enough but Nvidia will continue to starve the lower cards to force more frequent upgrades or to get people to buy top cards.
I watched that video and there is performance to be had, but at huge power consumption.
Intel need 250W to match AMD at 100W with their CPUs, but no one seems to care. People just accept that type of power usage even when you can get very similar results at far lower wattage. Double standards are applied all across the tech industry; sometimes it's a nothing burger and sometimes it's outrageous and the company should be castigated.
There's quite a difference between CPU and GPU power consumption when it comes specifically to gaming though, which is what this would be aimed at. Yes, I can push my 13900K to 250 watts if I want to in benchmarks that hammer the CPU, but in games I've yet to see it go over 120 watts; most of the time it hovers around 80 watts.
GPUs, on the other hand, are a different kettle of fish: with modern games becoming very demanding you want all the performance you can get your hands on, but not at the heat/power cost of a 600 watt card. That would be ludicrous.
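On the "what does it actually draw in games" point: if anyone wants real numbers from their own card rather than the box figure, polling nvidia-smi from a small script does the job. A minimal sketch, assuming a single Nvidia card with drivers recent enough to report power.draw:

```python
# Logs GPU power draw and VRAM use once a second; stop with Ctrl-C.
# Assumes a single Nvidia card and that nvidia-smi is on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=power.draw,memory.used",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    power_w, vram_mib = (float(x) for x in out.split(","))
    print(f"{power_w:6.1f} W   {vram_mib:8.0f} MiB VRAM")
    time.sleep(1)
```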
AMD already ahead of ya, dude.
Is it because you wanna play Remnant in ultra/decent FPS by any chance? I have been playing it recently with a mate. Fantastic game, loved the first one, but number 2 dials it to 11...
There needs to be a:
5090 24GB
5080 16GB (no, we don't need 2 variants, just 1 will do thanks)
Everything else should retain at least 12-16GB, and 8GB should be a thing of the past unless it's some crappy 5050 card that no one will use for gaming.
No issues with DLSS and ultra in Remnant II with a 3070. 3440x1440 ultrawide.
It was only two years ago they said that 8GB was fine. Now it's 12GB. At that rate, by the time the 5000 series is released (2025) it will be a minimum of 16GB? Who's going to buy a 5080 with the minimum RAM? Strikes me that AMD got it right, and NVIDIA are way out.
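The "at that rate" part is just a straight-line extrapolation from the thread's own figures (8GB being called fine around 2021, 12GB now), so roughly +2GB a year. A quick sketch of that arithmetic, treating those years as the post's assumptions rather than anything official:

```python
# Straight-line extrapolation of the VRAM figures quoted in the thread:
# 8 GB called "fine" around 2021, 12 GB now (2023) -> +2 GB per year.
points = {2021: 8, 2023: 12}
gb_per_year = (points[2023] - points[2021]) / (2023 - 2021)

for year in (2025, 2027):
    projected = points[2023] + gb_per_year * (year - 2023)
    print(f"{year}: ~{projected:.0f} GB")
# -> 2025: ~16 GB, 2027: ~20 GB
```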
Caved and bought a 4090; devs don't gaf about optimizing.
Nice sig @Jay85
I've had it for months! How have you only just noticed?
It wouldn't surprise me if Nvidia decided to do 32GB on the 5090.
At Nvidia prices that's $400 on VRAM alone.
£800 if they use Apple memory and follow their example! ;-)
In Jensen's law, prices double every 2 years.
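For what it's worth, the $400 figure above implies somewhere around $12-13 per GB for 32GB. That per-GB price is purely an assumed number (GDDR7 contract pricing isn't public), but the arithmetic looks like this, with the doubling for the "Apple memory" quip:

```python
# Back-of-envelope on the "$400 on VRAM alone" estimate.
# The per-GB price is an assumed figure, not published pricing.
capacity_gb = 32
assumed_usd_per_gb = 12.5

vram_cost = capacity_gb * assumed_usd_per_gb   # ~$400
apple_style = vram_cost * 2                    # the "Apple memory" doubling quip

print(f"32 GB at ${assumed_usd_per_gb}/GB -> ${vram_cost:.0f}")
print(f"Doubled, 'Apple style'        -> ${apple_style:.0f}")
```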