I would take a 7900 XTX at £900 if it were 30% faster than a 4090 in raster.
If that were the case, AMD would be charging more for their GPU.
Is FSR to blame for the often soft-as-heck image in Alan Wake 2, even at a supposed 100% native resolution?
Must admit it becomes tempting to scale below 100% resolution for the extra smoothness, as the original image looks bad enough to start with that you don't notice any change. I think I heard the game uses FSR as its AA method even at 100% resolution?
Then there are games like A Plague Tale: Requiem, which last I played had only its own in-engine resolution scaler (supposedly not FSR or DLSS). The game looked mint at native, and any drop in image scale was unnoticeable down to about 85-90%, which is still good going.
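For anyone wondering what those percentages actually mean in pixels, here's a minimal sketch. The 85-90% figures are from the post above; the 4K output resolution is an assumption for illustration. Most in-engine scalers apply the percentage per axis, so the shaded-pixel saving is bigger than the slider suggests:

```python
# Minimal sketch: what an in-engine resolution scale means in raw pixels.
# Assumption: a 4K output target. The 85-90% figures come from the
# A Plague Tale: Requiem observation above.

NATIVE = (3840, 2160)  # assumed 4K output

def internal_resolution(scale: float, native=NATIVE) -> tuple[int, int]:
    """Per-axis render scale, as most in-engine scalers apply it."""
    w, h = native
    return round(w * scale), round(h * scale)

for scale in (1.00, 0.90, 0.85):
    w, h = internal_resolution(scale)
    pixels = w * h
    saving = 1 - pixels / (NATIVE[0] * NATIVE[1])
    print(f"{scale:.0%} scale -> {w}x{h} "
          f"({pixels / 1e6:.1f} MP, {saving:.0%} fewer pixels shaded)")
```

On those numbers, a 90% slider already shades about 19% fewer pixels and 85% shades about 28% fewer, which is why a seemingly small drop buys noticeable smoothness.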
Apologies if any of the above is misinformation; these are just my casual observations about some non-RT scenarios where upscaling becomes viable or tempting when usually it wouldn't be.
It would then have aligned with my direct observations in actual games, so I'd have mostly agreed, yes. Why would I say something contrary to my direct evidence and experience of the technology right from its infancy?
I have seen and shown the evidence countless times, and professional outlets are saying the same thing as well as demonstrating it to their millions of subscribers. Why would we not stick to our views when the evidence shows we are right?
We already know online polls are no different from plain opinions: many people vote purely on their liking or hatred for something, regardless of whether they've actually used it in any broad scope and regardless of what the poll question specifically asks. It's typically people going "upscaling bad, native good".
I'm sticking with my view as it's based on actual evidence and direct experience. Nothing will change that.
They only charge based on what Nvidia do, but the point I'm making is that RT + DLSS technologies are now costing the consumer around £700 extra on a GPU like the 4090. So if you could get a card like the 4090 without those features, but with the die space instead dedicated to faster raster, making it a further 30% faster than a 4090 for around £900, would you take that card over, say, the current 4070 Ti?
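Purely for illustration, the raster-per-pound maths behind that question works out roughly like this. A minimal sketch with assumed numbers: a 4090 at ~£1,600, a 4070 Ti at ~£800 and roughly 65% of a 4090's raster, and the post's hypothetical £900 card at +30% over a 4090:

```python
# Rough price-per-raster-frame comparison for the hypothetical card in the
# post above. All numbers are assumptions: the 4090 at ~£1,600, the 4070 Ti
# at ~£800 and ~65% of a 4090's raster, and a fictional raster-only card at
# £900 that is 30% faster than a 4090. Performance indexed to 4090 = 100.

cards = {
    "RTX 4090":            {"price": 1600, "raster_index": 100},
    "RTX 4070 Ti":         {"price":  800, "raster_index":  65},  # assumed
    "Hypothetical raster": {"price":  900, "raster_index": 130},  # per the post
}

for name, c in cards.items():
    pounds_per_point = c["price"] / c["raster_index"]
    print(f"{name:>20}: £{c['price']:>4} / {c['raster_index']} index "
          f"= £{pounds_per_point:.2f} per raster point")
```

On those assumed numbers the hypothetical card lands near £6.92 per raster point versus about £12.31 for the 4070 Ti and £16.00 for the 4090, which is the gap the question is driving at.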
So the answer is there's no evidence. That's good to know.
Just had a look at the poll and dayum, the amount of people not using DLSS is unreal.
Switch on DLDSR and DLSS, fellas. At least give it a try over a few games. Don't listen to what people say about this or that; trust your own eyes. That's what I did ages ago, way before anyone here was talking about DLDSR, and now I rarely game without it.
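For anyone who hasn't tried the combo, here's a quick sketch of the resolution chain being recommended. The DLDSR factors (1.78x/2.25x pixel count) and DLSS input ratios are NVIDIA's published values; the 1440p monitor is an assumption:

```python
# Sketch of the DLDSR + DLSS resolution chain recommended in the post above.
# Assumption: a 1440p native display. DLDSR's 1.78x/2.25x factors and the
# per-axis DLSS input ratios are NVIDIA's publicly documented values.

MONITOR = (2560, 1440)          # assumed native display
DLDSR_FACTOR = 2.25             # pixel-count multiplier (1.78x or 2.25x)
DLSS_RATIO = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

# DLDSR scales the *pixel count*, so each axis grows by sqrt(factor).
axis = DLDSR_FACTOR ** 0.5
target = (round(MONITOR[0] * axis), round(MONITOR[1] * axis))
print(f"DLDSR target: {target[0]}x{target[1]}")

for mode, r in DLSS_RATIO.items():
    internal = (round(target[0] * r), round(target[1] * r))
    print(f"DLSS {mode:>11}: renders {internal[0]}x{internal[1]}, "
          f"reconstructs to {target[0]}x{target[1]}, "
          f"downsampled to {MONITOR[0]}x{MONITOR[1]}")
```

The appeal is that with 2.25x DLDSR on a 1440p screen, DLSS Quality ends up rendering at roughly native 2560x1440 anyway, so you pay close to native shading cost but still get the reconstruction plus the DLDSR downsample on top.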
That's what the poll data is showing, mate.
The poll does not mention DLDSR.