That's EXACTLY what I'm saying, man. I'm agreeing with you: you're basically saying a 3090 was literally worth double the money (or whatever the percentage is) of a 6900 XT. But you don't hear that said often around here. The usual line is "Nvidia greedy".
> I think the prob with DLSS in MW2 isn't the blur from DLSS itself but that it blocks the CAS sharpening option in the game. Maybe post a screenshot or two as I'm curious how it looks, as I play at 4K 116fps (reflex cap) myself and find the game looks rather soft without CAS.

I eventually got around to taking a few screenshots in MW2. DLSS doesn't look as sharp as FidelityFX CAS, but it still looks good and shouldn't affect visibility for spotting players. These three screenshots are upscaling/sharpening off, DLSS Quality, and FidelityFX CAS at 75 sharpness. Can you tell which one is which?
> 3rd screenshot is very soft, but I was more talking about Warzone 2, since the long view distance makes the lack of sharpening really apparent.

Screenshot 3 is upscaling/sharpening off, not DLSS.
> Screenshot 3 is upscaling/sharpening off, not DLSS.

So no one can tell which is which, and someone actually called the native one worse in a blind test.
> Good on Nvidia/CDPR for pushing the boundaries of visuals. Not a chance anyone will be able to enjoy this except 4090 users, but as they said themselves, this is a tech demo for the future of gaming. I'd rather have that than have companies being lazy, keeping us in the outdated ways.

Yeah, I've just seen a few posts recently saying DLSS looks crap because it's just an upscaled image from some low-quality source, when to most people it looks better.
It's going to be the new "Crysis", and that's OK. In fact it's better, because Crysis was a fundamentally awfully optimised game and had no reason to be as demanding as it was.
Maybe the 5080 will get 60 fps with DLSS Quality.
It's pretty much been the case for a good 1-2 years now. Every time a blind test has been posted, whether static screenshots or videos, the naysayers always get it wrong.
I always use DLSS whenever possible, as it is simply better "overall".
Given that pretty much all the major reviewers also state this, with evidence to back up such claims, I still don't know why this is even a talking point any more.
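For anyone who wants to run one of these blind tests themselves, here's a minimal Python sketch of one way to set it up: it copies captures to anonymous names and stashes the answer key until everyone has guessed. The filenames are placeholders for whatever you actually captured.

```python
# Minimal sketch of a blind A/B/C screenshot test, as discussed above.
# Filenames are hypothetical placeholders; point SHOTS at your own captures.
import json
import random
import shutil

SHOTS = ["off.png", "dlss_quality.png", "cas_75.png"]  # assumed capture names

def make_blind_set(shots):
    """Copy screenshots to anonymous names and save the answer key."""
    order = random.sample(shots, len(shots))  # shuffled presentation order
    key = {}
    for i, src in enumerate(order, start=1):
        dst = f"shot_{i}.png"
        shutil.copyfile(src, dst)             # viewers only ever see shot_N.png
        key[dst] = src
    with open("answer_key.json", "w") as f:   # reveal after people have guessed
        json.dump(key, f, indent=2)

if __name__ == "__main__":
    make_blind_set(SHOTS)
```

Post the shot_N.png files, collect guesses, then open answer_key.json.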
> So no one can tell which is which, and someone actually called the native one worse in a blind test.

Well, I didn't call it worse, I just know that without sharpening the game image looks too soft for me. CAS also works at native resolution, which is how I normally play, and going by the other screenshots DLSS does a good job, in multiplayer at least.
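Since CAS keeps coming up: the rough idea behind contrast-adaptive sharpening is to sharpen flat areas more and already-contrasty edges less, so you don't get halos. Below is a toy numpy sketch of that idea only; it is not AMD's actual FidelityFX CAS shader, which uses a more careful kernel, per-channel limits, and so on.

```python
# Toy illustration of contrast-adaptive sharpening: the sharpening amount
# backs off where local contrast is already high. NOT AMD's real CAS shader.
import numpy as np

def adaptive_sharpen(img, strength=0.75):
    """img: float32 greyscale array with values in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    # 4-neighbour min/max as a cheap local-contrast estimate
    neigh = np.stack([p[:-2, 1:-1], p[2:, 1:-1], p[1:-1, :-2], p[1:-1, 2:]])
    lo, hi = neigh.min(axis=0), neigh.max(axis=0)
    contrast = hi - lo
    amount = strength * (1.0 - contrast)   # less sharpening on strong edges
    blur = neigh.mean(axis=0)
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```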
> Setting aside AMD vs Nvidia and the silly pricing, real-time RT and path tracing would have represented a renaissance in game graphics under normal circumstances. It's sad to see that instead of appreciating these steps forward, most of the talk is about trivialising the moment, the cards, and the amazing tech behind it.

But we are not at that point yet, especially for true path tracing, which requires massive computational power. Maybe in 10-15 years we will be there, but with consoles still a thing even then, I don't think it's a sure thing. Visuals aside, physics has been stale for more than 10 years now; no one innovates there any more.
Personally I don't think we'll ever switch over to full path tracing; the computation requirements, and by extension the amount of silicon needed, far exceed what can reasonably be considered a cost-effective die.
The 4090 has a ~600 mm² die already, and it's only capable of path tracing in limited amounts, albeit less limited than lesser cards, even with upscaling assisting it.
/Hot take
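To put some rough numbers on "massive computational power": even a bare-bones ray budget at 4K60 lands around two billion rays per second, before any shading or denoising. The per-pixel figures below are illustrative assumptions, not measurements from any real game.

```python
# Back-of-envelope arithmetic for why path tracing is so demanding.
# All workload numbers are illustrative assumptions, not measured figures.
width, height = 3840, 2160   # 4K output
samples_per_pixel = 1        # typical real-time budget before denoising
rays_per_sample = 4          # assume a couple of bounces plus shadow rays
fps = 60

pixels = width * height      # ~8.3 million
rays_per_frame = pixels * samples_per_pixel * rays_per_sample
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame / 1e6:.0f} Mrays per frame")    # ~33 Mrays/frame
print(f"{rays_per_second / 1e9:.1f} Grays per second")  # ~2.0 Grays/s at 60 fps
```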
> My hot take is we will get separate GPUs dedicated to RT. One for rasta powa and one for RT. It might even be all on one card.

Yeah, I'd considered whether dedicated RT silicon should be a thing, but then I don't know how well that would work, what with the latency hit from transferring data between chips.
Either that or they will need to improve software denoising further or something.
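On the software denoising point: real-time denoisers are basically edge-aware filters that smooth the noisy one-sample-per-pixel image without blurring across geometry. Here's a toy sketch of that core idea, using surface normals as the edge-stopping guide; production denoisers like SVGF or Nvidia's NRD add temporal accumulation, variance estimation, and much more.

```python
# Toy edge-aware denoiser: average nearby pixels, but weight them down
# across normal discontinuities. Real-time denoisers build on this idea.
import numpy as np

def denoise(radiance, normals, radius=2, sigma_n=0.2):
    """radiance, normals: (h, w, 3) float arrays for one noisy frame."""
    h, w, _ = radiance.shape
    out = np.zeros_like(radiance)
    for y in range(h):
        for x in range(w):
            acc = np.zeros(3)
            wsum = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny = min(max(y + dy, 0), h - 1)
                    nx = min(max(x + dx, 0), w - 1)
                    # weight falls off as neighbouring normals diverge
                    d = normals[y, x] - normals[ny, nx]
                    wgt = np.exp(-np.dot(d, d) / (2 * sigma_n ** 2))
                    acc += wgt * radiance[ny, nx]
                    wsum += wgt
            out[y, x] = acc / wsum
    return out
```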
> adding excessively pointless extras to games to bring your OP GPU to its knees so you have to buy a new one next release

It makes PC gaming worthwhile again.
> But we are not at that point yet, especially for true path tracing, which requires massive computational power. Maybe in 10-15 years we will be there, but with consoles still a thing even then, I don't think it's a sure thing.

For large-scale adoption, yes, but it also has to do with the high price of admission.
> Yeah, I'd considered whether dedicated RT silicon should be a thing, but then I don't know how well that would work, what with the latency hit from transferring data between chips.

Yeah, because I think the shaders have to be able to communicate with them as the scene is being computed.
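That's the crux of it: traversal and shading alternate every bounce, so each is waiting on the other's result. Here's a toy loop (the trace/shade functions are fake stand-ins, nothing real) just to show why a chip-to-chip hop would sit on the critical path of every single bounce.

```python
# Toy loop showing why ray traversal (RT hardware) and shading (shader cores)
# ping-pong every bounce: each trace depends on the previously shaded ray.
# A dedicated RT chip would put a chip-to-chip hop inside this loop.
import random

def trace(ray):
    """Stand-in for BVH traversal on RT units: did the ray hit anything?"""
    return random.random() < 0.7  # fake 70% hit rate

def shade(ray):
    """Stand-in for material shading on shader cores: pick the bounce ray."""
    return ray + 1                # fake "new ray" derived from the old one

def path_trace_pixel(max_bounces=3):
    ray, bounces = 0, 0
    while bounces < max_bounces and trace(ray):  # traversal needs the shaded ray...
        ray = shade(ray)                         # ...and shading needs the hit result
        bounces += 1
    return bounces

print(path_trace_pixel())
```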