I’m only running at 3440x1440 and I’m using more than 10GB of VRAM as I write this, so I definitely won’t be buying a card with less VRAM than I have.
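For a rough sense of where that VRAM goes, here is a back-of-envelope sketch in Python. The bytes-per-pixel figure and the number of full-resolution buffers are ballpark assumptions, not measurements from any particular game:

width, height = 3440, 1440
bytes_per_pixel = 4   # assumed RGBA8-style colour target
buffers = 6           # assumed: colour, depth, plus a few G-buffer/post-process targets
framebuffer_mb = width * height * bytes_per_pixel * buffers / 1024**2
print(f"Render targets: ~{framebuffer_mb:.0f} MB")   # roughly 113 MB

The render targets themselves are only a hundred-odd megabytes; the gigabytes on top are textures, geometry and driver allocations, and many engines will cache more when more VRAM is available, so the usage an overlay reports is allocation rather than a hard requirement.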
From recent benchmarks shown on the Videocardz website, the 3080 can sustain 60fps+ at 4K. It would be nice to see what the VRAM usage is during the review process. Before the announcements I saw on WCCFTech and Moore's Law Is Dead that there is a rumoured 3080 SKU with 20GB. I suspect that when AMD announces Big Navi with more VRAM than the 3080's, Nvidia will launch this card. Hence I am waiting for AMD's cards to launch before I pull the trigger.
Thing is, they will, but they will also want more money for it. If I knew waiting meant I would get the 20GB version just before Cyberpunk 2077 is out for £649, I would wait. But that almost certainly won't happen.
Yeah, a 20GB version would be around £800, I reckon.
That is just too much. And all for what? The possibility that I may need to use the Nightmare instead of the Ultra Nightmare texture setting? Yeah, I think not, especially when I will only upgrade again once the next generation is out, so VRAM won't be an issue by then. And in the majority of cases you need a magnifying glass on a still shot to see the difference between the two texture settings.
Each to their own though; for some people, maximum settings are a must whether or not they make any difference to image quality. I don't suffer from that problem. I enjoy tinkering and am happy to drop one setting lower if I can't see a difference in-game between the two.
The only thing to consider is that the rumours say the new AMD cards are indeed 16GB cards.
Isn’t that why we’re all here though?
You can use whatever justification you like, but I'm not moving down to a 10GB card after having 11GB for over 3 years.
What if they are using GDDR6 and not GDDR6X? What difference would that make? Or is the faster stuff more of a demo, like RT was with Turing?
Not sure what you mean, elaborate please.
I game at 4K, which already looks much better than what the 90% of gamers on lower resolutions see. That makes a much bigger difference to image quality, in my opinion, than choosing Nightmare instead of Ultra Nightmare textures. If you don't believe me and you have Doom, go try it and come back and show us the difference between the two; you will struggle, I reckon.
Not my concern, I have two G-Sync panels here and want to go Nvidia on this occasion. I doubt Nvidia will lower prices much below £649 unless AMD really try to go competitive and offer 3080 performance with 16GB for, say, £499 or something, which I can't see them doing. It is possible, but AMD these days don't go down that route with their GPUs.
I will order the FE on release and take my chances.
That's your call, no one is trying to convince you to do otherwise. I recommend buying a 3090 with its 24GB!
Just ask Kaapstad, who has had 24GB Titan RTXs for around 2 years, how useful that has been.
They wouldn't downgrade the RAM on the 3080 just to bump up the amount; you'd lose memory bandwidth.
I meant AMD could be using standard GDDR6, but more of it, versus Nvidia's faster GDDR6X.
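For anyone wondering what that bandwidth difference actually looks like, here is a quick sketch: peak bandwidth is just the per-pin data rate times the bus width divided by eight, and the data rates and bus widths below are the commonly quoted figures rather than confirmed specs.

# Illustrative only: peak bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(19, 320))   # 3080: GDDR6X at 19 Gbps on a 320-bit bus -> 760 GB/s
print(bandwidth_gbs(16, 256))   # hypothetical 16GB GDDR6 card at 16 Gbps on 256-bit -> 512 GB/s
print(bandwidth_gbs(14, 352))   # 2080 Ti for reference: GDDR6 at 14 Gbps on 352-bit -> 616 GB/s

So a 16GB GDDR6 card on a narrower bus could have more memory yet noticeably less raw bandwidth than the 10GB GDDR6X card; whether that matters in games is another question.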