
NVIDIA 4000 Series

That is EXACTLY what I'm saying, man. I'm agreeing with you: you're basically saying a 3090 was literally worth double the money (or whatever the % is) of a 6900 XT. But you don't hear that said often around here. The usual word going around is "nvidia greedy".

Do you remember what happened with GPU pricing back then? Also no, I don't think 3090s were worth double AMD cards for gaming; they were only worth it compared to Titan and A-series (Quadro) cards, for work use.

My main setup here..

Also yes, Nvidia is greedy, because they are selling "gaming cards" like the top-end 4090 for £2k... and that's now, without the shortages, covid lockdowns, mining and the rest of the world mess. Come on, the 4090 should normally have been the 4080/Ti, and the 4090 Ti should have been the Titan with double the VRAM.

It has all become a mess, and a 90-class card should have dual GPUs on one card... so really the 3090/Ti and 4090/Ti are 80/Titan class, not 90. Then you have Nvidia playing the "well, they're creator cards" line, blah blah, and then removing NVLink on the 4090.. why? That was one of the biggest selling points for creators, or people who used them for work and could justify the price. The 4090 is a gaming card now, whereas the 3090/Ti really were gaming and work cards for people who could make use of NVLink.
 
I think the problem with DLSS in MW2 isn't blur from DLSS itself, but that it blocks the CAS sharpening option in the game. Maybe post a screenshot or two, as I'm curious how it looks; I play at 4K 116 fps (Reflex cap) myself and find the game looks rather soft without CAS :)
I eventually got around to taking a few screenshots in MW2. DLSS doesn't look as sharp as FidelityFX CAS, but it still looks good and shouldn't affect visibility for spotting players. These 3 screenshots are upscaling/sharpening off, DLSS Quality and FidelityFX CAS at 75 sharpness; can you tell which one is which?

MW2-1.png
MW2-2.png
MW2-3.png
 

The 3rd screenshot is very soft, but I was talking more about Warzone 2, since the long view distances make the lack of sharpening really apparent.
 
Good on Nvidia/CDPR for pushing the boundaries of visuals. There's not a chance anyone except 4090 users will be able to enjoy this, but as they said themselves, this is a tech demo for the future of gaming. I'd rather have that than have companies lazily keeping us stuck in outdated ways.

It's going to be the new "Crysis" and that's OK; in fact it's better, because Crysis was a fundamentally badly optimised game and had no reason to be as demanding as it was.

Maybe the 5080 will get 60 fps with DLSS Quality :D

So no one can tell which is which, and the native one actually got called worse, in a blind test

That's pretty much been the case for a good 1-2 years now. Every time a blind test has been posted, both static screenshots and videos, the naysayers get it wrong :p

I always use dlss whenever possible as it is simply better "overall".

Given that pretty much all the major reviewers also say this, with evidence to back up the claims, I still don't know why it's even a talking point any more.
 
Yeah, I've just seen a few posts recently saying DLSS looks crap because it's just an upscaled image from a lower-quality source, when to most people it looks better.
 
So no one can tell which is which, and the native one actually got called worse, in a blind test
Well, I didn't call it worse; I just know that without sharpening the image looks too soft for me. :p CAS also works at native resolution, which is how I normally play, and going by the other screenshots DLSS does a good job, in multiplayer at least.
 
Setting aside AMD vs NVIDIA and the silly pricing, real-time RT and path tracing would have represented a renaissance in game graphics under normal circumstances. It's sad to see that, instead of appreciating these steps forward, most of the talk is about trivialising the moment, the cards and the amazing tech behind them. :(
 
But we are not at that point yet, especially for true path tracing, which requires massive computational power. Maybe in 10-15 years we will be, but with consoles still a thing even then, I don't think it's a sure thing. Visuals aside, physics has been stale for more than 10 years now; no one innovates there any more.
 
Path tracing is awesome and is 100% the future of lighting. We won't see the full benefits of it* until consoles can do path tracing, and at the most optimistic that will be the PS6 generation, but more likely the PS7 generation.

The PS5 is about as good at RT as a 6600 XT, give or take a bit. At 1440p a 4090 is 20x faster than a 6600 XT in Hogwarts Legacy with RT, for example, and that is not even path traced. I don't think a PS6 can deliver a 20x RT increase over the PS5 to get us into the ballpark where 1440p + path tracing + upscaling gets you to a 30 fps presentation. Let's just say RDNA 4 is 3x faster in RT than RDNA 3, which itself is a claimed 2.7x faster than RDNA 2. If Sony doubles the CU/RT core count in the PS6 then you are looking at roughly 13x faster at equal clocks (rough sketch of the maths below). Maybe an RDNA 5 based PS6 might manage it, or maybe RDNA 4 has a greater focus on RT, or maybe Sony decide to more than double the CU/RT count for the PS6. So there are some pathways to getting there, making it not impossible, but I am doubtful.

So that means for path tracing to actually be used, devs will need to build a standard lighting model as they do now and also include a path-traced option. I can see some bigger games doing this as a technical showcase, but I don't see it being done all that often. I think it will remain a niche feature in some AAA games for a good 5, maybe 10, years.

*The benefit being that it is easier and quicker to set up than the faked lighting games currently use, so the time saved there can be spent elsewhere.
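
As a very rough sketch of the back-of-envelope maths above: every multiplier here is either the post's speculation or a claimed figure, not measured data, and the 96/80 CU adjustment is an assumption added purely to show one way the ~13x number can come out.

```python
# Rough sketch of the scaling argument above -- nothing here is measured data.
RDNA2_TO_RDNA3_CLAIM = 2.7    # claimed overall RT uplift, RDNA 2 -> RDNA 3
RDNA3_CU_GROWTH = 96 / 80     # assumed CU increase baked into that claim (hypothetical)
RDNA3_TO_RDNA4_GUESS = 3.0    # the post's "let's just say" RDNA 3 -> RDNA 4 uplift
PS6_CU_FACTOR = 2.0           # post assumes Sony doubles the CU/RT core count
UPLIFT_NEEDED = 20.0          # rough 4090-vs-6600XT RT gap cited above

per_cu_rdna3 = RDNA2_TO_RDNA3_CLAIM / RDNA3_CU_GROWTH              # ~2.25x per CU
projected = per_cu_rdna3 * RDNA3_TO_RDNA4_GUESS * PS6_CU_FACTOR    # ~13.5x at equal clocks

print(f"Projected PS6-over-PS5 RT uplift: ~{projected:.1f}x "
      f"(vs the ~{UPLIFT_NEEDED:.0f}x the post suggests is needed)")
```

With those assumed figures the projection lands around 13-14x, short of the ~20x gap the post describes, which is the point being made.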
 
Personally I don't think we'll ever switch over to full path tracing; the computation requirements, and by extension the amount of silicon needed, far exceed what can reasonably be considered a cost-effective die.

The 4090 already has a 600 mm² die and it's only capable of path tracing in limited amounts, albeit less limited than lesser cards, even when assisted by upscaling.

/Hot take :D
 
Maybe it was a YouTube thing, but I had a look at the Cyberpunk 2077 full path tracing video, and irrespective of native or frame "generation", what I noticed once again is that the NPC bystanders looked quite low-poly. Well, that, and the place looks far too clean and the buildings have far too smooth lines.

I know, big open-world city, but it does look like a tech demo in some ways. Then again, I never did finish TW3, as playing as Geralt didn't appeal...
 
Personally I don't think we'll ever switch over to full path tracing; the computation requirements, and by extension the amount of silicon needed, far exceed what can reasonably be considered a cost-effective die.

The 4090 already has a 600 mm² die and it's only capable of path tracing in limited amounts, albeit less limited than lesser cards, even when assisted by upscaling.

/Hot take :D

My hot take is we will get separate GPUs dedicated to RT. One for rasta powa and one for RT. It might even all be on one card.

Either that or they will need to improve software denoising further or something.
 
Yeah, I'd considered whether dedicated RT silicon should be a thing, but I don't know how well that would work, what with the latency hit from transferring data between chips.
 
adding excessively pointless extras to games to bring your OP gpu to its knees so you have to buy a new one next release
It makes PC gaming worthwhile again. :p

Like back in the Crysis days, where you’d look forward to the next GPU release to see how it would perform.

Many people that buy high end hardware want games and applications to push it to the max!

I don’t want PS4-quality games with high-res textures. I want Unreal Engine 5 with balls-out ray tracing and DirectStorage.

Most high end parts aren’t pushed in gaming. Look at NVMe drives as an example. Gen 5 is coming but gen 3 still gives similar performance in games!

It’s not all rosy though; the prices are obscene. I remember when top-end cards cost about 600 quid (Titan excluded). It’s funny how it’s now just become acceptable to spend close to 2 grand on a graphics card.

I justify it by selling the hardware to reduce the cost….
 
But we are not at that point yet, especially for true path tracing, which requires massive computational power. Maybe in 10-15 years we will be, but with consoles still a thing even then, I don't think it's a sure thing. Visuals aside, physics has been stale for more than 10 years now; no one innovates there any more.
For large-scale adoption, yes, but it also has to do with the high price of admission.

It's possible now, since it's in the games, perhaps even more so in simpler ones; it just needs a lot of expensive gear.

Hopefully RT will pick up and replace rasterisation for the most part.
 
Yeah, I'd considered whether dedicated RT silicon should be a thing, but I don't know how well that would work, what with the latency hit from transferring data between chips.
Yeah, because I think the shaders have to be able to communicate with them as the scene is being computed.

My guess is they will basically want to break it down along the existing architecture block diagrams (I know we're in a green team thread, but bear with me). For AMD at least, it would be similar to how the MCDs were created from the periphery of those diagrams between RDNA 2 and RDNA 3 to hold the memory controllers and Infinity Cache.

The next step probably means splitting out whole shader engines (which include groups of shaders, TMUs, RAs, etc.), or groups of them, into their own separate dies, as these are supposed to operate relatively independently of each other. Then the GCD would become a much smaller coordinator with the geometry engine or whatever.
 