So, this is a genuine question, as I don't know the answer and am interested.
I see a lot of conversation about the relative performance of GPUs based on rasterization alone, without DLSS/FSR/whatever other software enhancements are on offer. I understand that manufacturers are being somewhat disingenuous when they claim, for example, 4090 performance from a 5070, if one expects that to apply to raw frame rendering. But, in the real world, does anyone actually switch all the enhancements off in their games? And given that reviews provide no real metric for image quality, how can one actually work out the best GPU for a given setup?
My preference is single-player games with as much eye candy as possible on a 49" super-ultrawide monitor. That monitor will "only" manage a 144 Hz refresh, so anything over 144 fps is essentially pointless. But I want the game to be as smooth as possible, with all the fancy lighting and so on. If DLSS 4 with tons of "fake frames" provides an objectively better experience than DLSS 3 or no DLSS at all, why should I care about raw rasterization performance? Am I missing something?
Alternatively, I can completely understand that if you have a 1080p monitor running at 540 Hz for competitive FPS games, and you care not one bit about image quality, your priorities will be different. But again, if performance is better, why would you care whether the frames are rendered by the game engine or interpolated by the GPU's frame generation?
What am I missing? Is there a reason I should particularly care about performance without frame generation and the like? I appreciate this is probably a somewhat contentious question, but I'm hoping the responses stay friendly. The question is a genuine one.
TIA