
NVIDIA 4000 Series

But this is starting to apply to QHD and 4K with relatively expensive graphics cards, and we all know these new technologies are junk at lower resolutions. We have had QHD monitors for well over a decade, and 4K monitors for a decade. El-cheapo TVs have had 4K for years now. Phones increasingly have QHD and 4K equivalent screens.

Yet PC monitor technology has stagnated for years, and desktop dGPUs still seem to be well behind the curve, especially at under £600. People keep moaning that consoles are the reason PC games are not moving forward in graphics, etc. PCMR laughed at consoles using upscaling, saying PC was all about maximum fidelity. Now PC dGPUs are essentially using all the tricks consoles were mocked for, and we are using fake inserted frame tech, of the kind TVs use to smooth motion, but marketing it as something revolutionary when it isn't.
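As an aside, the "inserted frames" idea is easy to picture with a toy sketch. This assumes a naive linear blend between two rendered frames; real frame generation (DLSS 3 or a TV's motion smoothing) uses motion estimation rather than a plain average, so this only illustrates that the in-between frame is synthesised rather than rendered by the game engine:

```python
import numpy as np

def insert_frame(prev_frame, next_frame, t=0.5):
    """Return a synthetic in-between frame (0 < t < 1) by naive blending."""
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two toy 1080p RGB frames standing in for consecutive rendered frames.
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_b = np.full((1080, 1920, 3), 255, dtype=np.uint8)

generated = insert_frame(frame_a, frame_b)
# Displayed sequence becomes: rendered, generated, rendered, ...
# doubling the presented frame rate without the engine simulating extra frames.
print(generated.mean())  # ~127 for this toy example
```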

This wouldn't matter if we still had good generational improvements top to bottom, but we don't anymore. At best we seem to have one good generation, then one or two useless ones.

But years ago, when we started to have 3D accelerators with PCs and the PC finally started being better at 3D games than consoles, this didn't seem to be a problem (we had games like Unreal, Far Cry, HL2, Crysis, etc during console eras). What seems to be the bigger problem is that the under £600 market has started to stagnate, and with Intel pushing only quad cores for the mainstream for a decade, so did core count. Now we are starting to stagnate at 6~8 cores too, which is no better than a console.

This means most normal gamers don't have the hardware for devs to really push stuff forward. That makes consoles far closer to an average gaming PC than in, say, the 2000~2010 period. This stagnation is not all down to consoles; it's mostly down to Nvidia/AMD being more worried about what their accountants are saying, unlike earlier eras when it was more about engineers genuinely wanting to outdo their competition. Game devs want to maximise sales, so they target the lowest common denominator.

Didn't you hear, Moore's Law is dead. I mean, the guy even literally died too.

Yeah things are not ideal. But apart from voting with our wallets and maybe slating them online, not much else can be done unfortunately.

I am happy with my 3080 Ti which I got for £575 last year, so in no rush to pay silly prices at all. Roll on next gen of cards I say :D
 
The big issue with my usage habits is that the non-gaming stuff I run will now run fine on a laptop. After all, I have had nothing but SFF systems since 2005, so I already prioritise lower power and more compact parts anyway. Once we started getting 8 and 16 core CPUs in laptops, with high single core boosts, these would easily last me 4~5 years for non-gaming usage.

So most of my PC upgrades are driven by gaming, but if the GPUs are an issue then I have to wonder why I am still keeping a reasonably up to date desktop.

Also, the difference in cost between console and PC versions of AAA games isn't as big now. So if I am spending £500 on a console for three years, instead of a few times that on a PC, the difference in price for the few AAA games I am buying won't offset the fact the PC costs more. Plus no doubt a laptop will run all the older PC games I still play fine, and I already have a laptop, so instead of upgrading two machines, I just get one.
Comically, my laptop is actually more powerful than my desktop in terms of GPU: it's a 60W 3060 versus the 1060 currently in my desktop. The CPU is a 12700H so it's not going to be slow on the CPU front, but I could do with upgrading the RAM from 16GB of DDR5 to 32GB or even better 64GB (not sure that's available/supported though)... I have finally got on top of the crappy battery life though, after some updates to the firmware/BIOS and registry...

With smartphones, Apple and Samsung attempted to pull an Nvidia/AMD move. But Chinese companies started making better and better smartphones. One of the areas they innovated in was bringing OIS and telephoto lenses to cheaper smartphones.

This meant companies like Samsung had to respond as they started to come under huge pressure in markets such as India.



The problem is that with Chinese companies pushing OIS, etc into cheaper smartphones, even that reason isn't as big a deal now. Loads of sub £500 smartphones have OIS and use pixel binning. Many have telephoto lenses. My Samsung S20FE cost me well under £400, and I have a stabilised main camera, a stabilised 75mm-equivalent lens and a UW lens. Yes, a £1000+ S23 Ultra is better, but it's not like the S20FE isn't sufficient for most average users who want to take snapshots. And my S20FE plus a brand new Fuji XT20 with a few lenses cost me less than a £1000+ smartphone. I have the best of both worlds now.

But TBF, I still consider a £1000 smartphone a better value item than a £1000 dGPU. It's a multi-functional device, which can probably serve as a primary computing device and photography device for a lot of people for a few years.



But considering the cost of the Samsung A52s, it's still decent as a snapshot camera.
I think it's more a balancing act with phones. I wouldn't go for a super cheap phone, and the biggest issue I have with some Chinese brands is they seem to shove even more adverts/tracking all over the place with no easy option to turn it off. For my usage the A52s is fine: it's 64MP and it has the option to bin pixels for lower res photos... it's also got OIS, best I go look for that in the settings then lol... I think a mobile is better value than a GPU to a point (I don't think a £1000+ phone is good value). I do question the value of a tablet versus a phone when they're essentially using the same hardware and it's just the screen and battery that are really different (Apple/Samsung especially).
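As a side note on the binning mentioned above, the arithmetic of 2x2 pixel binning is simple: four photosites are combined into one output pixel, so a 64MP readout becomes a 16MP image. A rough sketch, assuming a plain average (real sensors do this in hardware/ISP and it's more involved):

```python
import numpy as np

def bin_2x2(readout):
    """Average each 2x2 block of a (H, W) mono sensor readout."""
    h, w = readout.shape
    return readout.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy readout scaled down so it runs instantly; a real 64MP frame is around 9000x7000 pixels.
readout = np.random.randint(0, 1024, size=(8, 12)).astype(np.float32)
binned = bin_2x2(readout)
print(readout.shape, "->", binned.shape)  # (8, 12) -> (4, 6): a quarter of the pixels
```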

Mind you, I also wouldn't go for the top of the line camera either; there's a sweet spot with just about everything, you've just got to find it... not always easy with Nvidia though lol
 
Oh yeah, from a business perspective utilising 'fake frames' saves them a huge amount of cash; you can see how Nvidia is trying to 'normalise' DLSS etc... it's completely wrong, but unless users push back (and let's be honest, outside of people on forums like this they won't know better) they'll just get away with it.
But even on forums with enthusiasts like this one, far too many have bought into the DLSS 3.0 fake frames marketing narrative, and will call out anyone calling it Fake Frames.

It's almost like they like being cheated!
 
Didn't you hear, Moore's Law is dead. I mean, the guy even literally died too.

Yeah things are not ideal. But apart from voting with our wallets and maybe slating them online, not much else can be done unfortunately.

I am happy with my 3080 Ti which I got for £575 last year, so in no rush to pay silly prices at all. Roll on next gen of cards I say :D

That is a good price. Yeah, it needs enough gamers to actually push back at these companies.
Comically, my laptop is actually more powerful than my desktop in terms of GPU: it's a 60W 3060 versus the 1060 currently in my desktop. The CPU is a 12700H so it's not going to be slow on the CPU front, but I could do with upgrading the RAM from 16GB of DDR5 to 32GB or even better 64GB (not sure that's available/supported though)... I have finally got on top of the crappy battery life though, after some updates to the firmware/BIOS and registry...


I think it's more a balancing act with phones. I wouldn't go for a super cheap phone, and the biggest issue I have with some Chinese brands is they seem to shove even more adverts/tracking all over the place with no easy option to turn it off. For my usage the A52s is fine: it's 64MP and it has the option to bin pixels for lower res photos... it's also got OIS, best I go look for that in the settings then lol... I think a mobile is better value than a GPU to a point (I don't think a £1000+ phone is good value). I do question the value of a tablet versus a phone when they're essentially using the same hardware and it's just the screen and battery that are really different (Apple/Samsung especially).

Mind you, I also wouldn't go for the top of the line camera either; there's a sweet spot with just about everything, you've just got to find it... not always easy with Nvidia though lol

As I prefer using SFF systems, a laptop isn't such a leap for me. Looking at what companies like Framework are trying to do, if it means we go back to having more upgradeable laptops, that would make the argument for me even stronger.

I agree WRT phones. I was just trying to show that Apple and Samsung tried to do an Nvidia and AMD with smartphones, but the Chinese companies spoiled it for them. One advantage of Samsung now is they are finally giving a few years of OS support.

I was looking at the A52/A52S myself, but I got the S20FE for a decent price and the cameras do the job well for snapshots. Tablets seem to have stagnated a bit, but I do find they are useful for some older folk (due to the larger screen), and can probably be useful as a lightweight laptop replacement when going on holiday.

With dedicated cameras, I would say investing in lenses is the wiser bit - as long as the camera body itself is "good enough" it should cover most of your needs.
But even on forums with enthusiasts like this one, far too many have bought into the DLSS 3.0 fake frames marketing narrative, and will call out anyone calling it Fake Frames.

It's almost like they like being cheated!

Better than native, don't you know? :p

My Samsung TV is so far ahead of Nvidia - it had fake frames years ago, and it was cheaper than an RTX4090. I am so ahead of the curve! :cry:
 
Interesting article about the lower-end RTX 4000s being used in gaming laptops and the TDP limits not really making a difference above 100 watts.


Also a YouTuber review giving his views on what's happening.


Well, look at the review posted a bit earlier:
The comedy show continues


It gets even worse when you realise the laptop RTX3060 only has 6GB of VRAM, while the RTX4060 has 8GB of VRAM and is in a laptop with a faster CPU.

So how much of the improvement is down to more VRAM and a faster CPU?

It's clear AMD might have had an inkling of what Nvidia was trying to do beforehand. The RX7600 series is made on TSMC 6nm, and it looks like it might compete fine and cost less to make:

So technically AMD just rejigged the RX6600 series on a denser and slightly more efficient TSMC 6nm node, and added RDNA3 features. They are not even trying now, because Nvidia have totally given up on entry level and mainstream dGPUs. All they want is for people to spend £800+ on dGPUs.
 
So how much of the improvement is down to more VRAM and a faster CPU?
You've also got to factor in TDP improvements due to the node and efficiency gains... that review showed the 60W 3060 and the 30W (I think) 4060 basically matching one another in terms of scores... so you could argue the 4060 is basically an overclocked 3060 but on a more efficient node.
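To put rough numbers on that efficiency point: if the scores really do match at half the power, performance per watt roughly doubles at that operating point. The figures below are placeholders standing in for the review's scores, not measured data:

```python
# Back-of-envelope perf-per-watt comparison using placeholder scores:
# "basically matching" at half the power draw.
score_3060, watts_3060 = 100.0, 60.0
score_4060, watts_4060 = 100.0, 30.0

ppw_3060 = score_3060 / watts_3060
ppw_4060 = score_4060 / watts_4060
print(f"3060: {ppw_3060:.2f} pts/W, 4060: {ppw_4060:.2f} pts/W, "
      f"gain ~{ppw_4060 / ppw_3060:.1f}x at that power point")
```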

The later benchmarks showing significant outliers in things like Hogwarts could easily be put down to more VRAM, faster GPU clocks and CPU upgrades IMO.

I would have liked to see some tests using CUDA though, because it might even turn out the 3060 is faster when doing non-gaming tasks...

I think we might be in the GPU version of the CPU performance freeze we had when AMD couldn't really compete with Intel lol
 
We shouldn't be needing FSR and DLSS to run titles on higher end dGPUs. It additionally highlights how poor a lot of these dGPUs are, and the reality of how much performance stagnation we have had over the last 5 years, along with VRAM stagnation. I can understand consoles needing them because they are built to a cost and are relatively affordable. But a lot of these dGPUs will soon fall off a cliff, especially with the stingy amount of VRAM, which is even worse in the under £600 market. By now any over £200 card should have at least 12GB~16GB of VRAM. We went from 2GB to 4GB to 8GB under £400 between 2012 and 2016! AMD has been a bit better in this regard, but it's quite clear they are both in on the act.


This performance stagnation malarkey, I think it's more to do with people chasing higher resolutions and higher refresh rates. 10 years ago no one was on 4K, 1440p was the sweet spot and anyone on 4K was stupid. Most folk are still on 1080p apparently.

It's only an issue on enthusiast forums - but many of those moaning have never used DLSS or FG. It's more about the comfort it gives. Would you turn off DLSS if a game ran more comfortably with it? It says more about folk thinking that today's games are released in a playable and performance-optimised state, which they aren't. Once games are optimised then you can turn DLSS off. DLSS is just a comfort option for those that feel the need to give money to devs, many of whom are AAA, and who like paying MSRP for games in beta.

If people think that the games are fine and it's the graphics card manufacturers' fault that they run like dribble, then some people need their head read.

You make the games for the hardware, not the other way around. If your game runs poorly on modern hardware, and it doesn't have any wow factor in graphics over games that do run well, then you've not optimised the game. Nearly ALL games run well six months after release. Anyone paying MSRP and buying newly released games has no place whining about graphics cards, especially when said games are £50+.

Two oldish games that genuinely require high end GPUs are CP2077, which maybe warrants it as probably the best RT showcase game, and MSFS, which was designed to run from one CPU core, and the draw distance in a flight sim requires lots more assets (that you still can't see) eating up performance.

If you want to pony up for MSRP games and play 4K @ 120Hz, you're going to need the top end cards. If not, cut your cloth accordingly with lower resolutions. Not everyone is entitled as a PC gamer to run 4K.

I remember CustomPC mag stating 30fps was the playable framerate. :cry:
 
You've also got to factor in TDP improvements due to the node and efficiency gains... that review showed the 60W 3060 and the 30W (I think) 4060 basically matching one another in terms of scores... so you could argue the 4060 is basically an overclocked 3060 but on a more efficient node.

The later benchmarks showing significant outliers in things like Hogwarts could easily be put down to more VRAM, faster GPU clocks and CPU upgrades IMO.

I would have liked to see some tests using CUDA though, because it might even turn out the 3060 is faster when doing non-gaming tasks...

I think we might be in the GPU version of the CPU performance freeze we had when AMD couldn't really compete with Intel lol

But the big issue is that the RX7600S (which is not even fully enabled) is also consuming less power than the preceding RX6600 series, whilst being faster too. AMD is basically using a slightly denser and more efficient RDNA3 version of the RX6600 series, with an even smaller die on a cheaper node. Just look at the RX7600S running at 85W~95W:

10% faster than a 140W laptop RTX3060, with a power rating under 100W. Now look at a 140W RTX4060 and a 140W RTX3060:

The RTX4060 is about 10% faster. So there is the problem - most laptops will have the dGPU running at 75W~95W anyway. Desktop variants of both cards will run at between 100W~200W. The AD107 is about 150mm² on TSMC 4nm, and Navi 33 is about 200mm² on 6nm, so I am not sure costs are even in favour of Nvidia.
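A very rough dies-per-wafer sketch for the die sizes quoted above. The wafer prices are illustrative assumptions (actual TSMC pricing isn't public), and yield, scribe lines and edge loss are ignored, so treat the output as shape-of-the-argument only:

```python
import math

WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300mm wafer, edge/scribe loss ignored

def rough_cost_per_die(die_mm2, wafer_price_usd):
    dies_per_wafer = WAFER_AREA_MM2 // die_mm2  # crude upper bound, yield ignored
    return wafer_price_usd / dies_per_wafer

ad107 = rough_cost_per_die(150, 17000)   # assumed 4nm-class wafer price
navi33 = rough_cost_per_die(200, 10000)  # assumed 6nm-class wafer price
print(f"AD107 ~${ad107:.0f} per die vs Navi 33 ~${navi33:.0f} per die (illustrative only)")
```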

They seem to have given up on the entry level and mainstream. Navi 33 is actually more like an RX5500XT replacement:
This performance stagnation malarkey, I think it's more to do with people chasing higher resolutions and higher refresh rates. 10 years ago no one was on 4K, 1440p was the sweet spot and anyone on 4K was stupid. Most folk are still on 1080p apparently.

It's only an issue on enthusiast forums - but many of those moaning have never used DLSS or FG. It's more about the comfort it gives. Would you turn off DLSS if a game ran more comfortably with it? It says more about folk thinking that today's games are released in a playable and performance-optimised state, which they aren't. Once games are optimised then you can turn DLSS off. DLSS is just a comfort option for those that feel the need to give money to devs, many of whom are AAA, and who like paying MSRP for games in beta.

If people think that the games are fine and it's the graphics card manufacturers' fault that they run like dribble, then some people need their head read.

You make the games for the hardware, not the other way around. If your game runs poorly on modern hardware, and it doesn't have any wow factor in graphics over games that do run well, then you've not optimised the game. Nearly ALL games run well six months after release. Anyone paying MSRP and buying newly released games has no place whining about graphics cards, especially when said games are £50+.

Two oldish games that genuinely require high end GPUs are CP2077, which maybe warrants it as probably the best RT showcase game, and MSFS, which was designed to run from one CPU core, and the draw distance in a flight sim requires lots more assets (that you still can't see) eating up performance.

If you want to pony up for MSRP games and play 4K @ 120Hz, you're going to need the top end cards. If not, cut your cloth accordingly with lower resolutions. Not everyone is entitled as a PC gamer to run 4K.

I remember CustomPC mag stating 30fps was the playable framerate. :cry:

I used DLSS, and ultimately it made me realise my RTX3060TI with any modern RT effects on is just really a weak card. It won't have any longevity with them on, and if I could have got an RX6700XT for a similar price, I know it would last longer just because of the added VRAM. But AMD have apparently given up on the UK market, so we are lucky to get any of their dGPUs near launch at RRP. They are also joining in with the shameless RX7900XT. It's a real shame Intel is having problems; we really need them to at least be competitive under £500.

One of the big problems with modern PC games development is "Early Access", which means we get games that are poorly optimised in every way, and PC gamers just put up with it. Look at Ark. One of my mates pushed me to get it, and I refunded it within an hour. But loads of people threw hardware at it.

I have bought very few of these kinds of games, and only when I feel they are actually going to bother to make them run properly. So I agree people need to hold games to a higher standard, but despite a few of us telling people not to throw money at these companies, FOMO wins. FOMO is half the reason why PCMR puts up with "Early Access" games, microtransactions and crap dGPUs which deserve to stay on shelves.

But if mainstream PC hardware is getting progressively worse and worse, it's adding on top of all the other problems. We have had massive stagnation in VRAM improvements since 2016. Between 2010 and 2016, we went from 1GB to 8GB of VRAM on sub £400 dGPUs. From 2016 to 2023, we are lucky to see even 12GB on sub £500 dGPUs.
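Putting the quoted figures into growth rates makes the stagnation point starker. This just runs the arithmetic on the numbers above:

```python
# Compound annual growth in VRAM on mainstream-priced cards, using the figures quoted.
def annual_growth(start_gb, end_gb, years):
    return (end_gb / start_gb) ** (1 / years) - 1

print(f"2010-2016: {annual_growth(1, 8, 6):.0%} per year")   # ~41% per year
print(f"2016-2023: {annual_growth(8, 12, 7):.0%} per year")  # ~6% per year
```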

As exclusively mainstream dGPU buyers, a few of us have noticed a progression over the last decade or so. The 60 series cards and their AMD equivalents are progressively getting lower and lower tier, whilst prices are going upwards.

You can see that in the GPU tier they sit in, the relative die area, the transistor count and the actual performance relative to the top end. Cards like the famous GTX460TI were made using a second tier dGPU. The same goes for the 8800GT, or the 6600GT and so on. Then you had lower end enthusiast models made using a salvage of the first tier dGPU (9500 Pro, X1950 PRO) and so on. Now look at the RTX4060: it's made using the fourth tier dGPU - it's now a low end card, and will probably cost £400.

The GTX760 and GTX660TI, as an example, were derived from the second tier GK104 dGPUs. They were truly "midrange" dGPUs, as they sat in the middle of the range. What was the fourth tier card in the Kepler range? A GTX650/GT740.

Now compare to a console, and suddenly it makes sense to bunch PC game development in with consoles. They are, after all, now technically within the realms of hardware the average gamer might be using, on similar timeframes. But during the 2000~2012 era, the PC was moving at such a high rate compared to consoles that even the entry level/mainstream tiers were not comparable.

Most on here would never know it as they only buy higher end dGPUs, which are less affected. But someone like me, and most of our mates, have seen this. Most of the builds or suggestions we do for other people include under £500 dGPUs. I barely know anyone with a £700+ dGPU in the real world. I think an RTX3080 is probably the highest end IIRC.
 
They seem to have given up on the entry level and mainstream. Navi 33 is actually more like an RX5500XT replacement:
I'm not sure that is entirely true. I'd say it's probably more accurate that AMD and Nvidia are trying to shift their leftover stock which they hoped to make bank on during lockdown/crypto... and instead of lowering prices like they should do (the joy of a duopoly, more realistically a monopoly in many respects), they're just screwing over consumers, many of whom seem happy to be bent over...
 
But the big issue is that the RX7600S (which is not even fully enabled) is also consuming less power than the preceding RX6600 series, whilst being faster too. AMD is basically using a slightly denser and more efficient RDNA3 version of the RX6600 series, with an even smaller die on a cheaper node. Just look at the RX7600S running at 85W~95W:

10% faster than a 140W laptop RTX3060, with a power rating under 100W. Now look at a 140W RTX4060 and a 140W RTX3060:

The RTX4060 is about 10% faster. So there is the problem - most laptops will have the dGPU running at 75W~95W anyway. Desktop variants of both cards will run at between 100W~200W. The AD107 is about 150mm² on TSMC 4nm, and Navi 33 is about 200mm² on 6nm, so I am not sure costs are even in favour of Nvidia.

They seem to have given up on the entry level and mainstream. Navi 33 is actually more like an RX5500XT replacement:


I used DLSS, and ultimately it made me realise my RTX3060TI with any modern RT effects on is just really a weak card. It won't have any longevity with them on, and if I could have got an RX6700XT for a similar price, I know it would last longer just because of the added VRAM. But AMD have apparently given up on the UK market, so we are lucky to get any of their dGPUs near launch at RRP. They are also joining in with the shameless RX7900XT. It's a real shame Intel is having problems; we really need them to at least be competitive under £500.

One of the big problems with modern PC games development is "Early Access", which means we get games that are poorly optimised in every way, and PC gamers just put up with it. Look at Ark. One of my mates pushed me to get it, and I refunded it within an hour. But loads of people threw hardware at it.

I have bought very few of these kinds of games, and only when I feel they are actually going to bother to make them run properly. So I agree people need to hold games to a higher standard, but despite a few of us telling people not to throw money at these companies, FOMO wins. FOMO is half the reason why PCMR puts up with "Early Access" games, microtransactions and crap dGPUs which deserve to stay on shelves.

But if mainstream PC hardware is getting progressively worse and worse, it's adding on top of all the other problems. We have had massive stagnation in VRAM improvements since 2016. Between 2010 and 2016, we went from 1GB to 8GB of VRAM on sub £400 dGPUs. From 2016 to 2023, we are lucky to see even 12GB on sub £500 dGPUs.

As exclusively mainstream dGPU buyers, a few of us have noticed a progression over the last decade or so. The 60 series cards and their AMD equivalents are progressively getting lower and lower tier, whilst prices are going upwards.

You can see that in the GPU tier they sit in, the relative die area, the transistor count and the actual performance relative to the top end. Cards like the famous GTX460TI were made using a second tier dGPU. The same goes for the 8800GT, or the 6600GT and so on. Then you had lower end enthusiast models made using a salvage of the first tier dGPU (9500 Pro, X1950 PRO) and so on. Now look at the RTX4060: it's made using the fourth tier dGPU - it's now a low end card, and will probably cost £400.

The GTX760 and GTX660TI, as an example, were derived from the second tier GK104 dGPUs. They were truly "midrange" dGPUs, as they sat in the middle of the range. What was the fourth tier card in the Kepler range? A GTX650/GT740.

Now compare to a console, and suddenly it makes sense to bunch PC game development in with consoles. They are, after all, now technically within the realms of hardware the average gamer might be using, on similar timeframes. But during the 2000~2012 era, the PC was moving at such a high rate compared to consoles that even the entry level/mainstream tiers were not comparable.

Most on here would never know it as they only buy higher end dGPUs, which are less affected. But someone like me, and most of our mates, have seen this. Most of the builds or suggestions we do for other people include under £500 dGPUs. I barely know anyone with a £700+ dGPU in the real world. I think an RTX3080 is probably the highest end IIRC.


I hear what you're saying, completely correct on the tier of the cards actually getting lower, and yes, as a consumer that is a really poor strategy from Nvidia, which could be disguised by naming the tiers something completely different - though we'd all be having a stab at comparing whatever it was called to its prior equivalents.

I had an 8800GT and that came in 2 VRAM flavours, 320MB & 640MB; that's when the VRAM thing started off. Plus I think these were the first cards with unified stream processors, where the processors could do all the jobs from raster pipelines to geometry, rather than specific cores doing specific jobs.

More recently, apart from one or two games that did have VRAM issues when first released, it really isn't an issue aside from expectations of what folk think they are entitled to; again, it's still somewhat resolution dependent.

I get the ire for AMD for 'joining in', but they don't have the excuse of using the expensive GDDR6X from Micron - not that it makes a blind bit of difference despite apparently having 4x the throughput. Why NV don't just put normal GDDR6 on and drop all the prices I don't know, as 6X seemingly has no benefits over plain ol' 6, only an increase in cost.

I see DLSS & FSR as comfort technologies, like motion blur was; they just help with comfort. I know it's not pure rasterised frames, but it does a good job in some games, as does FG.

RDR2 will make my 4090 fall over if I stick 16X AA on, but you don't need so much AA at 4K. Part of the issue is that people are gaming on the settings reviewers use for apples-to-apples comparisons to show the difference in performance. But you wouldn't use an ultra preset to game on; as a starting point or benchmark check to make sure your card is performing, yes, but then, like we always did, you fettle to get the best performance without losing IQ. That's what gamers of old have always been doing. Apparently these days the Ultra preset, with all the techs you'd turn off at low resolutions, needs to be on for some reason. No one expects to go in and fettle graphics settings, even though as a PC gamer they are all available in the menus.
 
I'm not sure that is entirely true, I'd say it's probably more accurate that AMD and Nvidia are trying to shift their left over stock which they hoped to make bank on during lockdown/crypto... and instead of lowering prices like they should do (they joy of a duopoly, more realistically a monopoly in many respects), they're just screwing over consumers, many of whom seem happy to be bent over...
Have people not seen what happened with Turing? Nvidia tried to jack prices up so that not only could they sell older stock at RRP, but they could also reset pricing to mining pricing. Then we had both the pandemic and mining together, with silly people paying way above RRP. Having seen that, they want a second go at fixing RRPs as high as possible. This is the problem with the whole endless margin growth model: the margins themselves, revenue, profits, etc are less important than quarter-on-quarter margin growth. Nvidia's margins are higher than Apple's.

AMD has probably realised this, so they figured they could just tweak the RX6600 series, put a low effort dGPU out and still compete OK. If Nvidia had priced this series in line with even Ampere, it would have been:
1.)RTX4090(AD102)
2.)RTX4080(salvaged AD102)
3.)RTX4070TI(AD103)=RX7900XT(Navi 31). Not more than £600~£700 IMHO.
4.)RTX4070(salvaged AD103)=RX7800XT(salvaged Navi 31). Not more than £500~£600.
5.)RTX4060TI(salvaged AD103)=RX7700XT(salvaged Navi 31 or Navi 32). Not more than £400~£450.
6.)RTX4060(AD104)=RX7600XT(salvaged Navi 32). Not more than £350.
7.)RTX4050(AD106)=RX7500XT(Navi 33)
8.)Laptop MX card(AD107)


Instead we got:
1.)RTX4090(AD102)
2.)RTX4080(AD103)=RX7900XTX(Navi 31)
3.)RTX4070TI(AD104)=RX7900XT(salvaged Navi 31)
4.)RTX4070(AD104)=RX7800XT(Navi 32?)
5.)RTX4070(AD106)=RX7700XT(salvaged Navi 32?)
6.)RTX4060TI(AD106)=RX7700(salvaged Navi 32 or full Navi 33?)
7.)RTX4060(AD106)=RX7600XT(Navi 33)

It's almost like they are both doing enough informally to make sure they can get good prices for their dGPUs.

I hear what you're saying, completely correct on the tier of the cards actually getting lower, and yes, as a consumer that is a really poor strategy from Nvidia, which could be disguised by naming the tiers something completely different - though we'd all be having a stab at comparing whatever it was called to its prior equivalents.

I had an 8800GT and that came in 2 VRAM flavours, 320MB & 640MB; that's when the VRAM thing started off. Plus I think these were the first cards with unified stream processors, where the processors could do all the jobs from raster pipelines to geometry, rather than specific cores doing specific jobs.

More recently, apart from one or two games that did have VRAM issues when first released, it really isn't an issue aside from expectations of what folk think they are entitled to; again, it's still somewhat resolution dependent.

I get the ire for AMD for 'joining in', but they don't have the excuse of using the expensive GDDR6X from Micron - not that it makes a blind bit of difference despite apparently having 4x the throughput. Why NV don't just put normal GDDR6 on and drop all the prices I don't know, as 6X seemingly has no benefits over plain ol' 6, only an increase in cost.

I see DLSS & FSR as comfort technologies, like motion blur was; they just help with comfort. I know it's not pure rasterised frames, but it does a good job in some games, as does FG.

RDR2 will make my 4090 fall over if I stick 16X AA on, but you don't need so much AA at 4K. Part of the issue is that people are gaming on the settings reviewers use for apples-to-apples comparisons to show the difference in performance. But you wouldn't use an ultra preset to game on; as a starting point or benchmark check to make sure your card is performing, yes, but then, like we always did, you fettle to get the best performance without losing IQ. That's what gamers of old have always been doing. Apparently these days the Ultra preset, with all the techs you'd turn off at low resolutions, needs to be on for some reason. No one expects to go in and fettle graphics settings, even though as a PC gamer they are all available in the menus.

The big issue with VRAM is more about longevity, and now you also see things like PCI-E 5.0 being gated behind expensive motherboards (especially with AMD!). So small framebuffers, low cache and low memory bandwidth are going to start affecting these upsold lower end dGPUs. The problem is most reviewers will test them on higher end systems. A famous example was the 8800GT 256MB, where the performance fell off a cliff! I also remember HUB showing the GTX1060 3GB had problems in systems with less than 16GB of system RAM, etc.

I don't have anything against DLSS, FSR, etc. My issue is the marketing trying to sell them as technologies that "improve" image quality, which is not totally true IMHO. Instead they should be more honest in saying that RT, etc is very taxing and these will enable you to upscale with a lower image quality hit than earlier technologies (which is true).

A lot of these Ultra settings seem to be put in either by Nvidia/AMD to screw over their competition in benchmarks, or for future generations of dGPUs. I think testing at one notch below Ultra settings makes sense, unless there is a massive image quality reduction (then it should also be tested).
 
Have people not seen what happened with Turing? Nvidia tried to jack prices up so that not only could they sell older stock at RRP, but they could also reset pricing to mining pricing. Then we had both the pandemic and mining together, with silly people paying way above RRP. Having seen that, they want a second go at fixing RRPs as high as possible. This is the problem with the whole endless margin growth model: the margins themselves, revenue, profits, etc are less important than quarter-on-quarter margin growth. Nvidia's margins are higher than Apple's.
There is one difference between now and Turing... lockdown/crypto stupidity has shown Nvidia and AMD that people will pay well above what was RRP for their products... they're now choosing to ignore the financial downturn and real world economics people are facing. We can only hope that enough people refuse to buy to see prices drop, but somehow I don't think that is going to happen, and at least in the case of Nvidia they'll just divert chips to the AI market.
 
There is one difference between now and Turing... lockdown/crypto stupidity has shown Nvidia and AMD that people will pay well above what was RRP for their products... they're now choosing to ignore the financial downturn and real world economics people are facing. We can only hope that enough people refuse to buy to see prices drop, but somehow I don't think that is going to happen, and at least in the case of Nvidia they'll just divert chips to the AI market.

The problem is the financial issues are starting to affect companies too, because if end consumers are affected, so are all the companies providing the services. We can see the Rainforest site cutting tons of jobs as an example. It's just sheer madness - they should be cutting margins to stimulate more volume, but they are going the other way.
 
The problem is the financial issues are starting to affect companies too, because if end consumers are affected, so are all the companies providing the services. We can see the Rainforest site cutting tons of jobs as an example. It's just sheer madness - they should be cutting margins to stimulate more volume, but they are going the other way.
Nvidia only has about 25-30,000 employees (quick Google)... not sure why Nvidia needs that many in all honesty, but that is a tiny amount of staff compared with some of the other tech companies; some of them have laid that many off in the last year. Amazon has 1.5 million staff (not exactly the same due to warehouses), Facebook 75,000ish, MS 200,000ish, Google 140,000ish... all from a quick Google so might be off.

Their profit to staff ratio is huge and it's unlikely they'll be getting rid of any staff (leaked emails seem to say as much)... don't get me wrong, I think they should be lowering the prices like you do, but Nvidia are still riding on the stupid people train.
 
Nvidia only has about 25-30,000 employees (quick Google)... not sure why Nvidia needs that many in all honesty, but that is a tiny amount of staff compared with some of the other tech companies; some of them have laid that many off in the last year. Amazon has 1.5 million staff (not exactly the same due to warehouses), Facebook 75,000ish, MS 200,000ish, Google 140,000ish... all from a quick Google so might be off.

Their profit to staff ratio is huge and it's unlikely they'll be getting rid of any staff (leaked emails seem to say as much)... don't get me wrong, I think they should be lowering the prices like you do, but Nvidia are still riding on the stupid people train.

The big issue is if these companies start cutting back on acquisitions themselves, and start deferring orders for new hardware, i.e. reducing capital expenditure. Also the US trade restrictions with China mean a whole lot of revenue Nvidia can't depend on as much now. Considering Nvidia still derives a lot of revenue from gaming, they really need to be wary of painting themselves into a corner. They can, after all, afford to cut the enormous margins they make and still be fine. The issue is going to be volume - the less volume they have, the higher the cost of R&D amortisation per unit. AMD sells fewer dGPUs, but they are primarily a CPU company which makes dGPUs on the side.
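A toy illustration of the amortisation point: the same fixed R&D spend divided over fewer shipped units raises the per-unit cost. The budget and volumes below are made up purely to show the shape of the argument:

```python
def rnd_per_unit(rnd_budget_usd, units_shipped):
    return rnd_budget_usd / units_shipped

budget = 2_000_000_000             # made-up R&D bill for a GPU generation
for units in (30e6, 15e6, 7.5e6):  # shrinking shipment volume
    print(f"{units / 1e6:.1f}M units -> ${rnd_per_unit(budget, units):.0f} of R&D per unit")
```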

AMD, OTOH, at 15,000~25,000 employees is already pretty lean as companies go, especially as they are developing both CPUs and GPUs. In that sense, using MS and Sony to indirectly help with GPU R&D via consoles certainly helps them out. So in that sense AMD is more diversified.
 
We shouldn't be needing FSR and DLSS to run titles on higher end dGPUs. It additionally highlights how poor a lot of these dGPUs are, and the reality of how much performance stagnation we have had over the last 5 years, along with VRAM stagnation. I can understand consoles needing them because they are built to a cost and are relatively affordable. But a lot of these dGPUs will soon fall off a cliff, especially with the stingy amount of VRAM, which is even worse in the under £600 market. By now any over £200 card should have at least 12GB~16GB of VRAM. We went from 2GB to 4GB to 8GB under £400 between 2012 and 2016! AMD has been a bit better in this regard, but it's quite clear they are both in on the act.
From my point of view, if the performance given by the transistors doing the DLSS magic is higher than the same transistors doing native => DLSS FTW regardless of the tier.
 
From my point of view, if the performance given by the transistors doing the DLSS magic is higher than the same transistors doing native => DLSS FTW regardless of the tier.

If I need to do upscaling at QHD from day one with a £500 graphics card, it's not going to have any longevity over a few years, and comparing DLSS with native rendering isn't apples to apples. I might as well buy a console. Plus, what happens when, two years later, the next generation gets support for the next DLSS and the one your "old" card supports gets dropped? Those are wasted transistors at that point.

All these added features should sit on top of a decent rasterised/RT performance boost, as the "icing on the cake". Except it has become the icing itself and the cake has no sponge.
 
If I need to do upscaling at QHD from day one with a £500 graphics card, it's not going to have any longevity over a few years, and comparing DLSS with native rendering isn't apples to apples. I might as well buy a console. Plus, what happens when, two years later, the next generation gets support for the next DLSS and the one your "old" card supports gets dropped? Those are wasted transistors at that point.

All these added features should sit on top of a decent rasterised/RT performance boost, as the "icing on the cake". Except it has become the icing itself and the cake has no sponge.
Well, the last paragraph is what I'm talking about.
I was pointing out that DLSS offers better value per die space used vs. native, to me.
You're speaking about a pricing problem, which there is. There isn't a tech problem or even a GPU problem. The 4070 Ti is OK as a 4060 Ti card and the 4080 as a 4070 Ti, both priced according to last gen at least. The 4090 is a bit over 60% quicker in raster and about double in RT vs the 3090 at native.

DLSS hasn't been nerfed for Turing yet (touching wood), quite the contrary - it keeps getting better and it's super simple to upgrade to the latest version by copying a DLL, which is impossible for FSR. My 2080 has aged like fine wine thanks to it. :)
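For what it's worth, the DLL swap being described is literally a file copy. A minimal sketch, assuming a hypothetical install path and that you have already sourced a newer nvngx_dlss.dll; the paths here are examples, not real defaults, and you should back up the original first:

```python
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")           # hypothetical install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you have sourced

target = game_dir / "nvngx_dlss.dll"
if target.exists():
    shutil.copy2(target, target.with_suffix(".bak"))  # keep a backup of the original
shutil.copy2(new_dll, target)
print(f"Replaced {target} with {new_dll}")
```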
 
I had an 8800GT and that came in 2 VRAM flavours, 320MB & 640MB; that's when the VRAM thing started off

To be pedantic, it was the GTS that came in 320 and 640, the GT in 256, 512 and 1024, and the GS in 384 or 768. Then they made a G92 GTS with 512... then for good measure basically rebadged all the cards as 9x00.

I miss the days when the AIBs could customise RAM more - I had several cards back in the early to mid 2000s where Gainward doubled up on the VRAM or used faster modules, etc.
 
To be pedantic, it was the GTS that came in 320 and 640, the GT in 256, 512 and 1024, and the GS in 384 or 768. Then they made a G92 GTS with 512... then for good measure basically rebadged all the cards as 9x00.

I miss the days when the AIBs could customise RAM more - I had several cards back in the early to mid 2000s where Gainward doubled up on the VRAM or used faster modules, etc.


Cheers for remembering the whole stack; I can't remember there being that many. The GTs & GTSs I do remember, the GS I can't. Oh, and then a rebadge. The 9000 series was a complete flop performance-wise and was, just as you say, a rebadge, wasn't it?
 