The only reason 10GB isn't enough is because AMD's putting 16GB on theirs.
Not exactly correct now, is it, my friend? What's the next argument gonna be? That games using over 10GB are bought off by AMD?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> The only reason 10GB isn't enough is because AMD's putting 16GB on theirs.
Well, when you want the GPU to last you a gen or two, 10GB is not enough.
> Tbf, it's not like nVidia have dropped their buyers in it before, 1060 3GB / 2060 6GB for a start! 10GB should be fine for a couple of years at least at 4K though; NV will make sure of it (they pretty much control the market and games tend to get built for them).
No, not really.
> Couldn't disagree more, pretty much all titles will be console first, which means AMD gfx cards should be in a great place.
Yup, exactly.
> Most likely 1-2 settings that halve the framerate going from high to ultra, just like Odyssey.
Yeah, ultra is just a waste sometimes.
Consoles will be 4K/60/high, I'm guessing; ultra is not worth the small IQ difference and a drop to 30fps.
> Couldn't disagree more, pretty much all titles will be console first, which means AMD gfx cards should be in a great place.
I think that means RT usage is likely to stay in line with what RDNA2 can comfortably manage.
> I'd prefer ultra settings in a game I've paid for, so for me it's a major thing. I've been playing Cold War at 1440p and the difference between that and 4K is pronounced, but so is the performance dip, with terrible memory leakage IMHO on the 5700 XT at the minute. It's a beta, but I want the best visuals, seeing as a PC costs thousands of pounds.
That's fair enough, but sometimes the effects are hard to notice or just subjective. Like shadows can be softer or harder; it's partly preference.
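For a sense of why that 1440p-to-4K dip is so pronounced, here's a quick back-of-the-envelope pixel count (illustrative only; real scaling also depends on CPU limits, settings and memory bandwidth):

```python
# Rough illustration of the 1440p-to-4K jump: pure pixel counts.
# Real scaling is rarely perfectly pixel-bound, so treat this as the
# ceiling of the resolution cost, not a prediction for any given game.

def pixels(width: int, height: int) -> int:
    return width * height

qhd = pixels(2560, 1440)   # 3,686,400 pixels
uhd = pixels(3840, 2160)   # 8,294,400 pixels

ratio = uhd / qhd
print(f"4K renders {ratio:.2f}x the pixels of 1440p")   # 2.25x

# If a game were perfectly pixel-limited, 100 fps at 1440p would become:
print(f"~{100 / ratio:.0f} fps at 4K")                  # ~44 fps
```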
128 watts in game!? What?!
That seems crazy low to me... granted it was playing RDR2, but despite being "last gen" I'd expect that game to be stressing the console pretty well, unless it's locked to the old framerates?
The tweet is referencing figures measured here, btw: https://tweakers.net/reviews/8252/3/xbox-series-x-onze-eerste-indrukken-en-tests-meten-is-weten.html
> I feel if RT is the next big thing (let's buy that argument for a while), these VRAM discussions won't matter cuz the hardware will be obsolete every 2nd year at least for the next 4 generations... so buying a 3080 at launch is a better decision than buying a 3080 Super 6 to 12 months down, or spending $200 on more VRAM.
Well, RT increases VRAM usage, so even that scenario would cause VRAM to be a factor earlier on.
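To put that in perspective, here's a rough VRAM budget sketch; every number in it is an assumption for the sake of argument, not a measurement from any actual game:

```python
# Illustrative VRAM budget: all figures below are assumptions made up for
# the sake of argument, not measurements from any particular title.

budget_gb = {
    "4K textures / geometry / render targets": 8.0,   # assumed baseline at 4K ultra
    "RT acceleration structures (BVH)":        1.0,   # assumed
    "RT output / denoiser buffers":            0.5,   # assumed
}

card_vram_gb = 10.0          # e.g. a 10GB card
headroom = 0.9               # keep ~10% free for OS/overlays (assumed)

total = sum(budget_gb.values())
for item, gb in budget_gb.items():
    print(f"{item:<42s} {gb:>4.1f} GB")
print(f"{'Total':<42s} {total:>4.1f} GB  (card: {card_vram_gb:.0f} GB)")

if total > card_vram_gb * headroom:
    print("RT pushes the budget right up against a 10GB card.")
```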
> It's obviously going to go higher than that in other games - it has to, or the PSU choice in the machine makes no sense. At 128W that PSU in the Series X is running under its optimal range, so it's pulling excess wattage from the wall - a smaller PSU would be more efficient, produce less heat and be cheaper to manufacture. So why put in a PSU that's less efficient, runs hotter at low load and makes the machine more expensive? The only reason is that 128W is not representative of next-gen games that push the hardware.
That's a fair point, but I'd also not take a last-gen game as indicative of software that's designed to maximise the new hardware.
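Putting some rough numbers on that PSU point (the efficiency values below are illustrative assumptions, not specs for the actual Series X supply):

```python
# Sketch of the PSU-sizing argument: a supply is least efficient well below
# its rating, so an oversized unit at ~130 W DC pulls more from the wall than
# a right-sized one would. Efficiency figures are assumed for illustration.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power drawn from the wall for a given DC load and conversion efficiency."""
    return dc_load_w / efficiency

dc_load = 128.0   # W, the in-game figure being discussed

big_psu_eff   = 0.82   # assumed: ~300 W-class unit running far below its sweet spot
small_psu_eff = 0.88   # assumed: smaller unit running nearer 50% load

print(f"Larger PSU:  ~{wall_draw(dc_load, big_psu_eff):.0f} W from the wall")
print(f"Smaller PSU: ~{wall_draw(dc_load, small_psu_eff):.0f} W from the wall")
# The difference (~10 W here) is burned off as extra heat inside the console.
```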
> Well, RT increases VRAM usage, so even that scenario would cause VRAM to be a factor earlier on.
And you can't even easily buy a 3080, as they only made 10 of them at launch.
> And you can't even easily buy a 3080, as they only made 10 of them at launch.
True. If it becomes available in mid-2021, that's a different argument for lifespan.
> Well, RT increases VRAM usage, so even that scenario would cause VRAM to be a factor earlier on.
No, not really. Multi-plat titles will be built on Big Navi.