Release dates for the super cards

Your best bet at that price range is a used 3080 with NVENC.

I already have NVENC. From the comparison footage I've seen so far, AV1 is substantially better.

Huh? With the exception of RT, your 1080 Ti is still at least on par with the PS5 GPU. If anything it just shows what great value PC really is. If you got it at release you had top-tier performance (far above the PS4/XB1) for 3 years before the new consoles, then another 7-8 years (assuming it doesn't die) of on-par performance with the new consoles. To say nothing of the money that could've been made mining during those periods (even if only while you sleep and aren't using it).

Going to a console you're going to give up a LOT in features/performance/utility, and then be nickel-and-dimed to way above what you'd pay for a PC upgrade anyway. It just makes no sense.

PC games and optimization aren't getting any better. I don't want to have to brute-force performance with an expensive GPU because developers are too lazy/incompetent to make their games run correctly. DLSS/FSR seems like a crutch for when a game runs like crap; I'm not particularly interested.
 
PC games and optimization aren't getting any better. I don't want to have to brute-force performance with an expensive GPU because developers are too lazy/incompetent to make their games run correctly. DLSS/FSR seems like a crutch for when a game runs like crap; I'm not particularly interested.
If you're not into FSR/DLSS then moving to console isn't going to help. FSR and other forms of upscaling like TAAU, along with dynamic resolution, are rampant there too. Very few games actually run at native 4K; many don't even consistently hit 1440p. I was watching a Digital Foundry video about Cyberpunk 2077 a couple of days ago, and the PS5 version drops as low as 1080p with its dynamic resolution system, which is then upscaled using FSR. And of course, on console you don't get the choice of turning any of it off.
 
With the key games it's not that devs are lazy; at the end of it all they've proven they're not. Look at how Cyberpunk has redeemed itself: it's now one of the most highly optimised engines out there, to the point that review outlets use it as a baseline for comparing CPU and GPU utilisation in new games.

What all this highlights, and always has, is how expensive ray tracing (and now path tracing) is in terms of framerate. RTGI/PTGI transforms a game's visual quality, but it comes at roughly a 60% fps hit on average from RT alone, and more again with path tracing. So unless you have a GPU capable of running ray tracing at a hardware level, you're in for a poorer experience instead of 60+ fps at 4K or 100+ at 1440p with everything cranked up. That does mean using DLSS/frame generation too, but that's fine nowadays; the tech has matured to the point where it can just do its thing and we can enjoy the games.
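To put that ~60% figure in concrete terms, here's a quick sketch (the 100 fps baseline is a made-up number for illustration, not from any benchmark):

```python
# Illustrative arithmetic for the ~60% average RT framerate hit mentioned above.
baseline_fps = 100   # hypothetical native fps with RT off (assumed for the example)
rt_hit = 0.60        # claim from the post: RT alone costs ~60% of your framerate

rt_fps = baseline_fps * (1 - rt_hit)
print(f"With RT on: {rt_fps:.0f} fps")  # 100 fps drops to 40 fps

# To get back to the original framerate, upscaling/frame gen would need to
# deliver a 2.5x uplift, which is why DLSS/FSR end up effectively mandatory.
required_uplift = baseline_fps / rt_fps
print(f"Uplift needed to recover: {required_uplift:.1f}x")
```

Swap in your own baseline fps and the shape of the conclusion stays the same: a 60% hit means needing 2.5x back from somewhere.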
 
PC games and optimization aren't getting any better.
This is incorrect. Furthermore, every issue you see in the PC version exists in the console version as well - except you don't even have the choice to brute force it away (if possible) or fix it through user fixes/mods etc. If you're hoping moving to a console is going to give you some kind of polished optimised-for-console game, then I have some bad news - those haven't existed since consoles moved to x86 (i.e. 2013).

I don't want to have to brute-force performance with an expensive GPU because developers are too lazy/incompetent to make their games run correctly. DLSS/FSR seems like a crutch for when a game runs like crap; I'm not particularly interested.
The issues with games at the moment are API/engine related (e.g. Unreal Engine 4 & 5 traversal stutter) and can only be mitigated, hardware-wise, through stronger CPUs and faster, lower-latency memory. Really it's on the games themselves; there's only so much stronger hardware can do against software bottlenecks. Spoiler alert: this is also the case on consoles, because it's an engine issue. The only thing different on PC is that you can also get shader compilation stutters (in games that don't compile shaders at startup), but those can at least be "fixed" by playing through the game: once a shader has been compiled, that stutter doesn't happen again.

So it has nothing to do with the GPU, and least of all with DLSS/FSR 2, which by the way you should wish to be so lucky as to have as a choice on consoles. Just look at the latest FF game: it renders at a low resolution with weak AA and upscales through FSR 1 (!). To top it off, you have no control over the settings and it has permanent dynamic resolution. Because performance mode hammers the CPU so hard and the devs weren't smart about it (which is actually very common and applies to many console games), the game drops the resolution to its lowest even when the GPU would have the resources to go higher. So you get far worse image quality than necessary, and with no setting to change you're forced to endure it like that (or enjoy the 30 fps mode), when on PC it would be a simple change for the user to make. And there are many examples of issues like this that people don't know about.

Today a console is just a very locked-down gaming PC at somewhat of a discount (and in the long run more expensive). But hey, don't let me stop you from making a bad choice. :P
 
Nvidia CES address. The first half of the video covers the use of AI in game development and some upcoming titles that will use RTX and DLSS 3; the RTX Super stuff starts at 18:09.


RTX 4080 Super claimed to be 40% faster than the RTX 3080 Ti (without frame generation). That would make it about 9-10% faster than a stock 4080, which still leaves plenty of room for a 4080 Ti to come in and sandwich itself nicely between the 4080 Super and the 4090.
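As a sanity check on that 9-10% estimate, here's the arithmetic. The ~28% gap between a stock 4080 and a 3080 Ti is my own assumption based on typical review averages, not a figure from the video:

```python
# Rough relative-performance arithmetic; 3080 Ti normalised to 1.00.
super_vs_3080ti = 1.40        # claim: 4080 Super is 40% faster than a 3080 Ti
stock_4080_vs_3080ti = 1.28   # assumption: stock 4080 ~28% faster than a 3080 Ti

# Dividing the two ratios gives the Super's uplift over the stock 4080.
super_vs_stock_4080 = super_vs_3080ti / stock_4080_vs_3080ti
print(f"4080 Super vs stock 4080: +{(super_vs_stock_4080 - 1) * 100:.1f}%")
```

With a ~28% baseline gap this lands at roughly +9%, consistent with the 9-10% estimate; a smaller assumed gap would push the Super's uplift a little higher.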
 