
NVIDIA 4000 Series

Did consoles have the same quality as DLSS? I guess not. I did not like DLSS in version 1.x as it was crap. Considering how it is now, to me it is better to run with DLSS on than to lower details or resolution, if the resulting image is better. And it normally is. That's the difference: when you're forced to either reduce details and resolution to get to a certain FPS or use DLSS, if the resulting image is better with DLSS, then I'll take that over some purist view of only playing native - in which case I'll be worse off. I don't notice the inserted frames on TVs, so I'm not affected. Haven't tried FG yet, as I don't own a 4xxx card, but if it works the same it could be OK for me.

Imagine if dGPUs had 4X the RT/rasterised processing power we have now. Would we need DLSS/FSR for RT games? Probably not. Would we need to use the junk AA methods (which are based on technology from a decade ago, which is why they are useless), or have to use "AI" improved methods like DLAA? Nope.

Consoles used tricks to improve performance for a while too, because they are designed to a cost. The console tech was hardware based and better than what a lot of average PCs could do, since before DLSS/FSR came along they had to use software-based resolution scaling or drop other settings. Your RTX2080 is still more powerful than an RTX3060 or RTX2060, which are the Nvidia mainstream RT dGPUs. Faster than a ton of AMD dGPUs too. It made sense since consoles are cheap systems, even cheaper than a crap gaming PC. But PCMR purists spun it and mocked consoles.

The reality is that, just like console users, PCMR is being sold a dream, but the hardware isn't powerful enough. Yet they promise you it can do XYZ (only if you fiddle the image quality with our latest DLSS/FSR).

So, like with consoles, tricks are being done to improve performance. This is being used to upsell the abilities. The big difference is consoles are "cheap", but Nvidia/AMD/Intel are using this to upsell lower end hardware. People think AMD and Intel won't do the same? Of course they will. It makes more money for them, because you can sell less for more.

Why do you think the RTX4070/RTX4070TI are built on what is basically an RTX3060 successor? Or the RTX3050 replacement being sold as the RTX4060TI? Or AMD making Navi33 a 6nm part? Just whack on DLSS/FSR, FG, etc. and there you go. Get enough people on board, and then they can sell the next 60 class dGPU at £1000+ and see the margins rise even more. Many of us jokingly predicted we would see a 50 series dGPU being sold for 70 series money. We are nearing that reality.

If engineers had their way, we would be getting an RTX4070TI for probably £460 (it's an RTX3060 replacement after all), with a 50% generational improvement over the RTX3070 in rasterised performance.

We would be getting an RTX4080 at RTX3080 pricing and an RX7900XTX at RX6800XT pricing, and also getting a 50% generational uplift. With RT, the RTX4070TI is 60% faster than an RTX3060, the RTX4080 is 47% faster than an RTX3080 and the RX7900XTX is 56% faster than an RX6800XT. Then DLSS/FSR would be on top of this.

However, the cynic in me saw what the end result of all this was - more profit.

Phones are not a good example, because the companies that make them are like mushrooms after the rain - plenty of them, and most importantly, they do compete amongst themselves!
Apple and Samsung for a few years were not really competing with each other and were making minor refreshes. Samsung was making their lower end models relatively worse and worse. Even camera tech stagnated, and as I am interested in imaging tech I was wondering why they were not including XYZ tech.

You know what I saw in Asia? The lower end markets were far more vibrant because loads of Chinese companies started selling their phones there. Huawei, Oppo, Xiaomi, etc. They started competing with each other more and more aggressively. It was Chinese companies who put more effort into integrating multi-focal-length camera systems into phones (I think Huawei was the first one to try to include folded-optics-based telephoto lenses). Then when these companies entered Europe/US it really made Apple/Samsung have to up their game.

Why would I buy a top end iPhone? What would it get me beyond the roughly 360 euros that I've paid for a very good phone with 8GB RAM, 256GB storage, 67W fast charging and decent cameras? Nothing that I would notice in day-to-day life other than bragging rights, of which I'm not a fan. So buying the less expensive phone, which offers virtually the same experience, is not affecting me, the buyer.


The GPU side, on the other hand, is different. You can feel the difference between a midrange card and a top end card, and basically you have only 2 worthwhile players atm that don't really compete amongst themselves! Maybe 3 if Intel hangs on and delivers quality products. The answer to the situation is easy: we should stop defending these practices and finding excuses like "OMG, inflation!", or basically "if they don't make a 50-60% profit margin overall and at least 1 billion in profit it's not worth it for them, they will go bankrupt!"... plus other silly arguments. Some fanboys of these companies see them as "my company good, no matter what, THAT company BAAAAAAAAD", so they end up defending the bad practices.

Actually you can to some degree. The higher end chipsets do have noticeably improved performance because of the faster CPU cores, and the GPU performance can go up a lot. So for mobile gamers it's noticeable. The cameras also get noticeably better as you go upwards, especially if you want multi-focal-length systems (with decent tele lenses and low light capabilities) and decent image processing. But when I can basically buy a sub-£400 phone which does 70% of what a high end phone does, and also get a decent dedicated camera too, it's not really worth the premium for me.
 
 
This resonated with me in a scary way.

We lease cars, we lease motorbikes, you can get GeForce Now. It wouldn't surprise me if a company tries a leasing model.

Phones are probably an example to look at. The top end has skyrocketed in price; someone on an average wage still wants the top-of-the-line iPhone, but a month's pay is out of reach, so most high end phones are bought on contract.

Please don’t give them ideas. Zero chance I would ever lease one or pay for it on a phone-style contract. I have not even had a phone contract in probably a decade or so now. I just do SIM-only contracts for less than a tenner a month.

I would probably end up going used and not upgrading as often if they did that. They would simply **** off way too many people by going down such a route.

It's heading that way. Maybe not even leasing, but enforced obsolescence, because of continual revenue models. Just look at games now. You used to have one release, and then an expansion pack. Now everything is a thousand different editions, bits sold as DLC, season passes, etc.

So that person keeping a GTX1080TI or RX480 for six years is not giving them any more revenue, plus companies have to maintain drivers for X years. Plus community-modded drivers exist (they exist for older AMD cards). Imagine if they only supported drivers for three years - there would be a massive backlash. But they could essentially do that now, couldn't they?

Basically companies are trying to base generational performance leaps less on pure hardware performance and more on software-based "features". It's easier to enforce various degrees of obsolescence that way. Too many here on tech forums keep giving these companies the benefit of the doubt. But you need to think as an accountant of sorts - how do you make more money?

For Nvidia/AMD/Intel it means:
1.) Sell crappier hardware for more money, so the hardware needs "assists" to stand on its own two feet.
2.) Use excuses to say why X software trick will only work "properly" on Y hardware.
3.) Make sure newer games essentially have to use X software trick to run "properly".

So, what happens? Gamers will feel compelled to upgrade quicker. If people don't think this happens, listen to what is happening in other industries (Louis Rossmann on YouTube covers some of the examples).

This will be done slowly so people won't notice it. Let's see how DLSS4 and FSR4 pan out.
 

Yeah. Let's see. All I know is if Nvidia think they can keep demanding prices like Ada or even up them further, then they will get less and less of my money.

I will just buy used or go AMD once they sort themselves out. All they need to do is sort out FSR and their pricing and I am happy. But at times it seems like that might be too much to ask.
 
I agree with you because I feel the same and it applies to all these companies.

However, my main concern is younger people, or the less well informed, who are newer entrants into this market. They will only know the current market. This is why I think it's important for older PCMR/gamers to point these things out instead of being apathetic, which is not how PCMR became PCMR all those decades ago. Because in a free market system, if the consumer does not actually show a bit of gumption, then it will just break in favour of the companies.
 
System/display?
5800X3D + 4090 on an LG C2 42"

Here are some images to illustrate what's going on (uncompressed PNG crops taken directly from 4K screengrabs):

FG-NoMB.png

This is DLSS Quality with Frame Generation turned on and no motion blur.

No-FG-MB-02.png

This one is DLSS Performance with Frame Generation turned *off* and motion blur set to 'high' (the default). Ignore the bloom - that seems to be handled differently with FG turned on as well. Look at the 'C' on the side of the console there.

FG-MB.png

Now this is DLSS Quality with Frame Generation and motion blur at its default (high) - look how exaggerated the motion blur is on the 'C' and also on the neon Kiroshi sign on the left.

I *think* this is happening because motion blur isn't a static thing in Cyberpunk (and probably, lots of other games) - even with frame generation on, Cyberpunk *thinks* the game is running at sub-60fps and is applying heavier motion blur than it should given the actual perceived framerate.

Note that if you just left DLSS on 'Auto' then on a 4090 you're actually using DLSS Performance so the motion blur issue I'm seeing might not be as obvious since the game is already running at 70+ fps and any blur applied will be more subtle (though still 'wrong' for the perceived fps).
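To put my theory another way, here's a rough sketch of how I imagine a framerate-scaled blur pass works - purely my own guess at the logic, none of these function or variable names are CDPR's:

```python
# Rough guess (not CDPR's actual code) at framerate-scaled motion blur,
# and why frame generation could break it.

def blur_strength(frame_time_ms, shutter_fraction=0.5):
    """Blur trail length is usually tied to how long a frame is on screen:
    longer frame time -> longer trails."""
    return shutter_fraction * frame_time_ms

# Without FG: the game renders and presents at ~70 fps.
print(blur_strength(1000 / 70))          # ~7 ms of blur - subtle

# With FG: the game still *renders* at ~35 fps; FG doubles the presented rate.
rendered_frame_time = 1000 / 35          # ~28.6 ms
presented_frame_time = rendered_frame_time / 2   # ~14.3 ms the player actually sees

# If the blur pass keys off the rendered frame time instead of the presented one,
# every frame (real and generated) carries 30-40 fps worth of blur:
print(blur_strength(rendered_frame_time))    # ~14 ms - far too heavy for ~70 fps motion
print(blur_strength(presented_frame_time))   # ~7 ms - what it arguably should be using
```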
 
It should be input lag of about 120fps in that game and movement fluidity of 240fps.
Like I've said, maybe I'm fine even with 60fps with FG (real fps around 30, maybe 30+), as long as I can V-Sync it to that 60fps in something like Hogwarts, or in gameplay that doesn't involve 180 no-scope type of things. Of course, that's as long as I don't notice the inserted frames during gameplay. If I need more precision, like when I play some competitive game, or the input lag is too big, then yeah, disable FG and adjust settings to get the FPS up.
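The rough maths as I understand it (ballpark only - this ignores Reflex and whatever queuing overhead FG itself adds):

```python
# Ballpark frame generation maths (ignores Reflex and FG's own queuing overhead).

real_fps = 30                       # what the GPU actually renders
presented_fps = real_fps * 2        # FG inserts one generated frame per real frame

real_frame_time = 1000 / real_fps            # ~33 ms between new input samples
presented_frame_time = 1000 / presented_fps  # ~17 ms between displayed frames

print(f"Fluidity looks like ~{presented_fps} fps "
      f"({presented_frame_time:.1f} ms per displayed frame)")
print(f"Responsiveness still feels like ~{real_fps} fps ({real_frame_time:.1f} ms or more "
      f"between input updates, since interpolation has to wait for the next real frame)")
```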

In which case no clever machine learning is needed. Sounds like if nvidia and AMD just render every other frame with much worse quality settings (lower resolution perhaps), you probably wouldn't notice and you wouldn't care.
 
Motion blur in Cyberpunk in general is horribly done. It just blurs the whole screen when even moving slightly. It gives me nausea even at 100 plus FPS. FG likely won't play nice with motion blur due to its very nature.
 
In which case no clever machine learning is needed. Sounds like if nvidia and AMD just render every other frame with much worse quality settings (lower resolution perhaps), you probably wouldn't notice and you wouldn't care.
Maybe, but that would put extra load on both the GPU and CPU, which frame generation does not (well, maybe it does slightly on the GPU, but the CPU is completely unaffected by frame gen).
It helps even when you are CPU limited, which I experienced first hand in Cyberpunk, as my previous 5800X was never great in that game.
 
In which case no clever machine learning is needed. Sounds like if nvidia and AMD just render every other frame with much worse quality settings (lower resolution perhaps), you probably wouldn't notice and you wouldn't care.
If it means being able to play with path tracing, why not? At the end of the day, the goal is to improve the end result. If enabling path tracing means I have to insert 'fake' frames to get it playable, it's a good compromise, as the end image looks better than it would with FG disabled and path tracing off. I don't think native is a priority for either AMD or Nvidia if they can come up with these 'tricks' to massively improve image quality at lower overhead. Cyberpunk's path tracing, for instance, is just using the 'medium' setting in path tracing terms, with 2 bounces and 2 rays. There is a mod which can increase that to 4 bounces and 6 rays, and it's another huge leap forward from 2 bounces, but even a 5090 will need FG to be playable with 4/6.
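As a very crude back-of-the-envelope (my own toy cost model - it just assumes the path tracing work scales roughly with rays per pixel times bounces, and ignores denoising, ReSTIR, Russian roulette and the rest):

```python
# Toy cost model: path tracing work ~ rays per pixel * bounces per ray.
# Real renderers use Russian roulette, ReSTIR, denoisers etc., so treat this
# as an order-of-magnitude illustration only.

def relative_cost(rays_per_pixel, bounces):
    return rays_per_pixel * bounces

default = relative_cost(rays_per_pixel=2, bounces=2)   # shipping RT Overdrive setting
modded = relative_cost(rays_per_pixel=6, bounces=4)    # the 4-bounce / 6-ray mod

print(modded / default)   # ~6x the path tracing work - hence even a 5090 leaning on FG
```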
 
Motion blur in Cyberpunk in general is horribly done. It just blurs the whole screen when even moving slightly. It gives me nausea even at 100 plus FPS. FG likely won't play nice with motion blur due to its very nature.
Generally (when it's working properly) I don't mind it (though other people's mileage may vary) - those shots above were taken with me spinning like a top with the analogue stick pushed around half-way and the non FG screengrab (running at an actual 70-ish fps) looks OK to me (it certainly doesn't cause me any discomfort whilst I'm playing). The Frame Generated one is really bad - not necessarily because of FG, but because Cyberpunk is applying 30-40fps motion blur when it should be tailoring it to the Frame Generated rate - I can't play the game like that.
 
Do yourself a favour and disable motion blur to fully appreciate the instantaneous pixel response time of your display. ;)
 
As much as I'm happy that even my 4080 can maintain 75~90 FPS outside in Cyberpunk with all settings on max/RT Overdrive and frame generation (@ 3440x1440), something has just always looked off about this game to me. Sometimes it looks pretty damn good, but most of the time it just looks "off".

It begins with that absolutely ***** HUD blur/double vision effect, but you can mod that out on PC: https://www.nexusmods.com/cyberpunk2077/mods/2648

But above and beyond the HUD, as someone said above, I'm guessing it's the motion blur being crap, some dodgy texture quality, and so on.
 
4x the performance is impossible, so upscalers are still required. DLAA I think is basically free; I haven't looked at the numbers in Cyberpunk.
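That rough "4x" lines up with the pixel counts the upscalers work with - quick sketch below, using the commonly quoted ~67/58/50% internal render scales for the DLSS modes (the real frame-time saving is smaller, since not everything scales with resolution):

```python
# Why upscaling stands in for a big chunk of "4x the power": DLSS renders
# internally at a fraction of the output resolution. Scales below are the
# commonly quoted per-axis render scales; actual fps gains are lower because
# geometry, BVH builds, CPU work etc. don't all scale with resolution.

output_pixels = 3840 * 2160                   # 4K output: 8,294,400 pixels

modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
for name, axis_scale in modes.items():
    internal_pixels = int(3840 * axis_scale) * int(2160 * axis_scale)
    print(f"{name}: shades ~{output_pixels / internal_pixels:.1f}x fewer pixels")

# Performance mode shades ~1/4 of the pixels - roughly the "4x" the hardware
# can't deliver natively.
```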

Agree on the price of cards - they should be a lot lower, priced and placed as per previous-gen tiers.

Mobile gaming, maybe yes, performance would be higher, but for other stuff it's debatable. Even photos, since the average Joe is just posting stuff on social media - images that are viewed on small screens, so defects aren't that obvious.
In which case no clever machine learning is needed. Sounds like if nvidia and AMD just render every other frame with much worse quality settings (lower resolution perhaps), you probably wouldn't notice and you wouldn't care.
Then it should be easy for AMD to come out with FSR 3 offering the same quality.
 
A lot of this is subjective though (for example - I don't mind the HUD) - what I was pointing out is that motion blur with Frame Generation is actually *broken* in Cyberpunk at the moment resulting in objectively worse image quality. Hopefully CDPR can fix this.
 
I would be much more comfortable if we got decent native performance increases, PLUS DLSS/FSR on top, so the lifespan of the cards would be extended. This is where I always thought enhanced image upscaling really is a good idea. But I just don't like that they are trying to use it as the majority of the performance increase from day one on expensive pieces of hardware.

WRT smartphones, that is why I tend not to get a more expensive one. I realised that my cheaper phone plus a dedicated camera could be had for the same price as a high end phone, and give me the best of both worlds.
 
A lot of this is subjective though (for example - I don't mind the HUD) - what I was pointing out is that motion blur with Frame Generation is actually *broken* in Cyberpunk at the moment resulting in objectively worse image quality. Hopefully CDPR can fix this.

That could go some way to explaining my recent experience then. I completed CP shortly after release on my 2080Ti, so I just reinstalled the other day to test RT Overdrive.

And yeah the HUD is subjective, just something that bugged me, but in no way related to RT/FG/motion blur/texture work.
 
Looks like modders have entered the fray:

Thanks to the Ray Tracing Overdrive Optimisations mod by Erok and Scorn, Cyberpunk 2077's experimental RT Overdrive mode is now playable on mainstream (RT compatible) graphics cards. With this mod, performance gains of 25%-100% are promised to gamers, with the largest performance gains coming from AMD's Radeon RX 6000 series GPUs.

What does this mod do? It downgrades RT Overdrive's ray tracing by reducing its ray bounces from 2 to 1. This reduces the impact of bounce lighting, and reduces the level of detail that many in-game reflections have. That said, most gamers would not notice a drop in visual quality outside of side-by-side comparisons unless they are only looking at Cyberpunk 2077's reflections.

In the video below, Digital Foundry has showcased this mod with an RTX 3050 at 1080p 30 FPS with DLSS set to performance mode. While 1080p 30 FPS does not sound like a great experience for most PC gamers, it shows that path traced visuals can be delivered in complex games like Cyberpunk 2077 with relatively low-end hardware.

So with everyone on board with DLSS/FSR and FG, it looks like if you are willing to slightly decrease reflection quality there is a good performance jump. But yes, I know it's not "native" path tracing anymore! :cry:
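The 25%-100% spread makes sense to me if you think of it Amdahl-style - the gain depends on how much of the frame time is actually spent path tracing. A rough sketch with made-up fractions:

```python
# Rough Amdahl-style estimate of the mod's gains. The rt_fraction values are
# made up for illustration; the halving assumes cost scales ~linearly with
# bounce count (2 bounces -> 1 bounce).

def speedup(rt_fraction, rt_cost_scale=0.5):
    """Overall frame-time speedup if only the path tracing portion gets cheaper."""
    new_frame_time = (1 - rt_fraction) + rt_fraction * rt_cost_scale
    return 1 / new_frame_time

print(speedup(0.5))   # ~1.33x: a card only half RT-bound gains ~33%
print(speedup(0.9))   # ~1.82x: a heavily RT-bound card (e.g. RX 6000) nears the quoted 100%
```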
 