I guess you're talking about this video.
Nah, actually all of them - each shows things relevant to the conversation, like the total lack of optimisation in many games these days (optimisations that would often actually be easy to do), the complete butchering of TAA settings, the lack of real transparency in games, the bunch of cheap effects masked by TAA making the image blurry, etc.
I'm no game developer, but I'd love to see some of them sitting around a round table talking about what works and what doesn't for them, especially across different engines.
That would be great, and way more interesting than fellow non-devs like DF and the like. It's fine to look at the final product and say "I like this or that better", compare screenshots etc., but I don't understand why some people treat them as experts in the field, when in reality they seem to be at best on the level of average enthusiast gamers.
With that said, if I get it right, he's proposing a 1080p image (leaving aside that he's talking about performance scaling with resolution and frame rate purely on TFLOPs, which don't really scale linearly), with what he calls "good" AA (but doesn't show it)
He does show the good AA in other videos, with live examples in motion. It's actually just tweaked TAA, which has always had bad press caused mostly by the really poor implementation in UE4, not by the technique itself. That said, DLSS and the like are really also TAA with "AI" slapped on top (more on that later). Also, he clearly put "AT LEAST 1080p, better quality AA etc." on the graph as the base resolution to aim at, instead of upscaling from 840p+ like one often gets with DLSS these days - that was the point, not to limit visuals to 1080p. You'd also get better upscaling from a good 1080p image than from a poor 864p one. In other words, the opposite of what you thought he proposed.
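To illustrate why "tweaked TAA" and the DLSS family share the same skeleton, here's a minimal sketch of what a plain TAA resolve boils down to: reproject the history buffer, clamp it against the current frame's neighbourhood, blend. The blend factor and 3x3 clamp here are arbitrary assumptions for illustration, not values from his videos or from any particular engine:

```python
import numpy as np

def taa_resolve(history, current, alpha=0.1):
    """One TAA step: blend the (already reprojected) history buffer with the
    current jittered frame, after clamping history to the current frame's
    local neighbourhood to reject stale/ghosting samples.
    history, current: HxWx3 float arrays in [0, 1]."""
    # 3x3 neighbourhood min/max of the current frame, used to clamp history.
    pad = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode="edge")
    stack = np.stack([pad[dy:dy + current.shape[0], dx:dx + current.shape[1]]
                      for dy in range(3) for dx in range(3)])
    nb_min, nb_max = stack.min(axis=0), stack.max(axis=0)
    clamped = np.clip(history, nb_min, nb_max)
    # Exponential blend: most of the pixel comes from accumulated history,
    # which is where both the extra detail and the blur/ghosting come from.
    return (1.0 - alpha) * clamped + alpha * current

# Toy usage: accumulate a few noisy "jittered" frames of a flat scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4, 3), 0.5)
history = scene + rng.normal(0, 0.1, scene.shape)
for _ in range(8):
    history = taa_resolve(history, scene + rng.normal(0, 0.1, scene.shape))
print(history.std())  # lower than the per-frame noise of 0.1 - samples have accumulated
```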
and let the monitor/TV use its own upscaler (probably not the greatest) instead of something like DLSS which can be very good at upscaling from 1080p to 4k...
I'm not sure where you got that idea from; this isn't what he said at all, and it wasn't even his point. He was just arguing that native 4k isn't necessary for playing on a TV, for various reasons (same as with 4k movies: most people can't see any difference because their TV isn't big enough, or their sofa close enough, to actually resolve it), since you can get a very similar level of detail from 1080p with proper AA, upscaled to 4k as needed. Again, this is in comparison to upscaling from 864p, like a lot of games (especially on consoles) do these days.
Of course, he's advocating for consoles, where normally you'll get FSR, but still, talking down DLSS, AI, and hardware upgrades (PS5 Pro) - as in, you don't need them - while leaving aside the Series S and ignoring stronger hardware is a bit amusing to me. But hey, his game, his target - that doesn't have to be the same for everyone!
Again, not what he was talking about, as I described above. When you have proper FPS at 1080p, with good AA and a good amount of detail, then you can do whatever you want with the image - leave it as is, upscale it with various methods, etc. But the goal should be to make the source image as good and fluid as possible first, and then worry about the fluff later.
Dismissing "stronger" cards like the 4070ti simply because he's focusing on consoles and GPUs around that power doesn't say much.
Consoles and xx60-series cards make up the huge majority of the market by far. The 4070Ti and above are a tiny fraction of the market in comparison, simply because of the pricing. That said, they test everything on xx60-series cards as that's roughly console speed, and if you can make a game work well for this largest chunk of the market, higher-end GPUs will just give you even more FPS and higher resolution. Sounds logical to me. You can later add more fluff on top for higher-end users, but that's an extra. However, when you aim at the 4090 as your main target and the game then runs really badly on xx60 cards, you've failed as a dev to target the majority of the market. And that seems to be the point here. Also, don't forget what happened to the 3080 10GB: it already suffers in games where a weaker GPU with more VRAM can get better results at the same settings - which wouldn't be a problem if games were better optimised.
By the time their game sees the light of day (if it ever does), the 4070ti will probably fall into x60-series performance of that "current" gen or below... Sure, it will probably be fine for old console hw, but do we really care around here? And even console players seem to be "fine" by default with 30fps and whatever resolution the box can output...
I do play on a console at times, and my wife mostly plays on the console. Neither of us would touch 30FPS with a 10-foot pole... It's just a horrible experience, and it makes me physically sick. That publishers seem to be pushing 30FPS as the new meta (a total regression) doesn't mean it's good for players or that they like it. Games usually give the option of more bling at 30FPS or a proper 60FPS with worse lighting - all the stats I've seen show that the huge majority of players go for 60FPS and ignore the bling.
Moving on, he talks about some techniques in a few games, where you can see in Quantum Break the AO artefacts around the character's head. One solution would be/was HBAO+... a closed-source technique, aka NVIDIA's tech if I'm not mistaken

MXAO, on the other hand, seems like a nice option, and Bayer in general for Lumen.
AMD came up with very good AO methods, along with a bunch of other things - all open tech, available for years now. There's a reason many indie devs go for AMD tech like SSR, AO, upscaling etc. - it's just easily available for them to tweak and implement as desired, instead of NVIDIA's black-box approach.
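For anyone wondering what there even is to "tweak" in an open screen-space AO pass, here's a deliberately naive sketch of the core idea: count how many nearby depth samples occlude each pixel. This is not AMD's (or anyone else's) actual implementation; the radius, sample count and bias are made-up values purely for illustration:

```python
import numpy as np

def ssao_naive(depth, radius=4, samples=16, bias=0.02, rng_seed=1):
    """Toy screen-space AO: for each pixel, count how many random nearby depth
    samples are closer to the camera than the pixel itself (i.e. occluders).
    depth: HxW array of linear depth values (smaller = closer)."""
    rng = np.random.default_rng(rng_seed)
    h, w = depth.shape
    occlusion = np.zeros_like(depth)
    ys, xs = np.mgrid[0:h, 0:w]
    for _ in range(samples):
        dy, dx = rng.integers(-radius, radius + 1, size=2)
        sy = np.clip(ys + dy, 0, h - 1)
        sx = np.clip(xs + dx, 0, w - 1)
        # A sample occludes the pixel if it is noticeably closer to the camera.
        occlusion += (depth[sy, sx] < depth - bias).astype(float)
    return 1.0 - occlusion / samples  # 1 = fully open, 0 = fully occluded

# Toy usage: a flat floor (depth 1.0) with a closer box (depth 0.5) in the middle.
depth = np.ones((32, 32))
depth[12:20, 12:20] = 0.5
ao = ssao_naive(depth)
print(ao[20, 16], ao[0, 0])  # lower (darker) next to the box than out in the open
```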
The Lumen talk... well, software Lumen won't be great/ideal for the scene he's highlighting (or, it seems only logical, it would have been done already by others), and he also doesn't test it in HW.
Again, performance matters - hardware Lumen on a 3060 would be completely unplayable, so it might as well not exist at all.
Playing Immortals of Aveum (which apparently has it), I'm failing to see all the issues he's talking about, outside some minor noise on a very specific material and level. Ah, and let's not forget he's disabling other effects (TAA?), which makes it look that bad

He's showing that TAA is everywhere in games, applied to all kinds of effects to mask their artefacts. You can't disable it in settings, and it makes the whole image very blurry - and that's before one even adds upscaling (DLSS and others) and/or AA. Before his videos I wasn't even aware TAA was so widespread in games, just not where we'd expect it to be (as in, doing actual AA). It also explained to me why everything is so blurry in games these days - I thought it was DLAA or DLSS, but turning them off changed nothing; now I know why. And it's not just my old eyes.
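A toy example of what "cheap effects masked by TAA" means in practice: transparency done as a stochastic/dithered pattern that only resembles real alpha blending once frames get averaged, which is exactly why it falls apart when the temporal filter is off. The colours, opacity and blend factor here are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
bg, glass, alpha_target = 0.2, 0.9, 0.5  # background, "glass" colour, intended opacity

def dithered_frame():
    """One frame of stochastic transparency: each pixel shows either the glass
    or the background, picked at random with probability = intended opacity.
    Without any temporal filter this is pure noise, not a translucent surface."""
    mask = rng.random((8, 8)) < alpha_target
    return np.where(mask, glass, bg)

# A TAA-style accumulation turns the noise into something close to real blending.
resolved = dithered_frame()
for _ in range(32):
    resolved = 0.9 * resolved + 0.1 * dithered_frame()

print(round(dithered_frame().std(), 2))                     # one raw frame: heavy pixel noise
print(round(resolved.std(), 2), round(resolved.mean(), 2))  # accumulated: far less noise, average near the intended 0.55 blend
```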
Pointing at The Division as an example isn't ideal either, I'd say, since it looks decent if you look at it broadly, but it still has the same issues: missing shadows where they should be, poor reflections where the reflected image actually changes a second or two after you look at it, and it's not even wildly accurate...
Shadows would be rendered using a different method; this is just about GI. The lag isn't any different from (and is actually often better than) the GI in CP2077 with PT - that one can lag like hell without AI, and even with AI it's far from perfect. Very disturbing to look at, as my brain knows it's just wrong.
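The lag itself falls straight out of the maths of temporal accumulation: if the GI keeps most of a cached result and only blends in a little fresh lighting each frame, any lighting change takes many frames to show up. A minimal sketch, assuming an arbitrary 5% blend per frame:

```python
import numpy as np

def accumulate_gi(cached, fresh, alpha=0.05):
    """Temporal accumulation: keep most of the cached irradiance and blend in
    a small amount of the freshly computed (noisy/sparse) lighting each frame."""
    return (1.0 - alpha) * cached + alpha * fresh

# Toy usage: the light switches from off (0.0) to on (1.0) at frame 0.
cached = 0.0
for frame in range(60):
    cached = accumulate_gi(cached, fresh=1.0)
print(round(cached, 3))  # ~0.95 after 60 frames: a full second of visible lag at 60 fps
```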
1st and 2nd pic showing the shortcomings of the GI
postimg.cc
Another fail from the GI, where it illuminates the scene way, way too much...
https://postimg.cc/G8dWtwNq
It's also a relatively old game (8 years and counting), but the whole point is that it was very cheap performance-wise and still looked great for the hardware of that time. That doesn't mean it can't be improved or tweaked further; hardware has certainly got faster since then.
Radiance Cascades seems possibly fine, but since he says it was usually done in 2D and only recently have some 3D advancements been made, of course it hasn't been talked about before.
And likely won't be in the AAA world, as that would require them to do something other than just enabling Lumen - that costs money, so it won't be done. Unless something finally gives in that world, which doesn't seem far off considering their games are flopping left and right financially these days.
Then in closing, he shows Stalker 2 (via a Reddit post) running and looking pretty bad without any AA (TAA, DLSS, DLAA, FSR). Well, that's on AMD's side to improve its FSR. Looks great with DLSS!
I don't believe this was his point at all.

Also, this whole vid was more of a set of quick bullet points, without many details. The comments underneath seem to be full of devs, by the way; quite a few said they'd just learned something new about methods they'd never heard of before. The AAA dev world is just like any other IT world I work with daily: people simply don't research and don't realise there's more than one way of doing things. I see it daily, and it makes me sad at times how closed-minded many devs and other IT people are. It usually takes someone unusual, a genius, to push things forward - like J. Carmack back in the day, building the Doom engine in very clever ways using old maths that most people didn't even know about.
"These problems won't be focused on or fixed by meanstream industry, because there is no market incetinve"... I mean, Sony launched PS5 Pro with AI upscaling...
This whole "AI upscaling" is very misleading though, isn't it? It's really just TAA with an AI convolution network used for better frame matching and for suppressing temporal artefacts (which it still fails at now and then, with ghosting and other artefacts visible in places even on the newest version). People imagine the AI is filling in blanks and adding missing detail - but that's not what it's doing there at all, as per NVIDIA's own papers. Which is why it barely uses the tensor cores even on a 3060: it simply has very little to do in the whole process. Ergo, it's marketing more than anything of real value.
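To make that claim concrete, here's a toy sketch of the idea - not NVIDIA's actual network or pipeline - where the learned part only decides, per pixel, how much of the reprojected history to trust, and the output is still just a weighted mix of real rendered samples rather than invented detail. predict_history_weight is a hypothetical stand-in for the network:

```python
import numpy as np

def predict_history_weight(history, current):
    """Hypothetical stand-in for the learned part: decide, per pixel, how much
    of the reprojected history can be trusted. A real upscaler would run a
    small convolutional network here; this toy version just distrusts history
    where it disagrees strongly with the current low-res sample."""
    disagreement = np.abs(history - current)
    return np.clip(1.0 - 4.0 * disagreement, 0.0, 0.9)

def ai_taa_step(history, current):
    """One 'AI TAA' step: the output is still only a weighted mix of real
    samples (history + current frame); nothing is invented from scratch."""
    w = predict_history_weight(history, current)
    return w * history + (1.0 - w) * current

# Toy usage on a 1D "scanline": history is stale where the scene just changed.
current = np.array([0.2, 0.2, 0.8, 0.8])
history = np.array([0.2, 0.2, 0.2, 0.2])   # last two pixels changed this frame
print(ai_taa_step(history, current))
# Where history disagrees, the learned weight drops and the current sample wins.
```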