So can any of the DLSS and RT expert explain what is going on in the 4K + RT + DLSS Spiderman benches from PurePC:
Marvel's Spider-Man Remastered PC - graphics card and CPU performance test. What are the hardware requirements? (page 19)
www.purepc.pl
Just for that setting, the plain 3060 suddenly jumps ahead of the 3070 Ti and so on; in other words, ahead of all the 8GB cards.
However, in the other 4K benches (they tested 4K Very High with three combinations of RTX On/Off and DLSS On/Off), the ordering wasn't like that.
For instance here's 4K Very High, RTX On, DLSS Off:
So it looks to me like DLSS gains performance but also needs more VRAM.
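Purely as a sanity check on my own reading, here's a rough back-of-envelope sketch of the kind of extra allocations DLSS at a 4K output could plausibly add on top of the render-resolution targets. All of the buffer choices and sizes below are my own assumptions for illustration, not numbers from PurePC or NVIDIA:

```python
# Back-of-envelope estimate of extra VRAM DLSS Quality might need at a 4K output.
# Every buffer and size here is an assumption for illustration only.

BYTES_PER_PIXEL_FP16_RGBA = 8          # assumed 16-bit-per-channel RGBA targets

def buffer_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL_FP16_RGBA):
    """Size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native_4k = (3840, 2160)
dlss_quality_internal = (2560, 1440)   # Quality mode renders at ~67% per axis

# Hypothetical extra allocations when DLSS is enabled on top of native rendering
extra = {
    "output color (4K)":               buffer_mb(*native_4k),
    "history/accumulation (4K)":       buffer_mb(*native_4k),
    "motion vectors (1440p, RG16F)":   buffer_mb(*dlss_quality_internal, 4),
    "misc feedback/exposure (guess)":  32.0,
    "DLSS network + workspace (guess)": 128.0,
}

total = sum(extra.values())
for name, mb in extra.items():
    print(f"{name:34s} ~{mb:7.1f} MiB")
print(f"{'total extra':34s} ~{total:7.1f} MiB")
```

The point being that even a few hundred extra MiB could matter if the 8GB cards are already sitting right at their limit with RT enabled at 4K.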
(PurePC didn't test FSR or IGTI; then again, they're a bit odd in that they label RT as RTX in all their charts.)
Is this a Spider-Man thing, or do other DLSS implementations show similar behaviour?
Couldn't find which version they were testing, but the test is from Wednesday the 10th.