
NVIDIA 4000 Series

HUB are not saying that the range of options should be reduced; my take is that they are saying there is a market for 8GB cards, but only at the entry level and not for products priced at around $400/£400.
That's exactly what doesn't work here; it's a very hypothetical way of looking at things, short of forming a gamers' union.

Edit: a better framework for judging the value of the 4060 Ti 8GB variant would be to line up all cards currently in circulation priced no higher than $400 (launch prices) and then pick the best. I think that's going to be a bit difficult these days, though, with how complex these cards have become to rank with RT, upscaling and all that jazz; there's no longer a single score to base your decision on. This video by HUB somehow sends a very confusing message.
 
The real issue is almost certainly the 8 lanes.

They shouldn't sidestep that altogether, and should also be using the full 16 lanes as a comparison. The reduced number of lanes will likely also create latency issues regardless of PCIe generation.

However, to HUB's credit, they appear to be the only ones who even mentioned the 8-lane limitation, among the many other reviewers who are using the 4060 Ti models for VRAM comparisons.

That did of course lead to the basis for this video in the first place. :o
 

8 lanes at PCIe 4.0 causes zero issues for a 4060 Ti. I have a dual 3090 system NVLinked on PCIe 4.0 x8 for both cards, and it sees only a 1-2% difference from an x16/x16 setup. The main issue is VRAM, then the issue of 8 lanes on PCIe 2.0; even PCIe 3.0 should be more than enough at 8 lanes for a 4060 Ti. They showed that in the video with the 8GB card vs the 16GB card, where the 8GB card was still having issues while the 16GB card was fine, or a lot better.

The problem is Nvidia being Nvidia; it was really silly of them to have gone with 8 lanes only, as these cards are aimed at people who want to upgrade a midrange system, and most midrange systems will have either PCIe 3.0 or maybe even older PCIe 2.0. Even their 3060 Ti had 16 lanes of PCIe 4.0 before.
 
8 lanes at PCIe 4.0 causes zero issues for a 4060 Ti. I have a dual 3090 system NVLinked on PCIe 4.0 x8 for both cards, and it sees only a 1-2% difference from an x16/x16 setup. The main issue is VRAM, then the issue of 8 lanes on PCIe 2.0; even PCIe 3.0 should be more than enough at 8 lanes for a 4060 Ti. They showed that in the video with the 8GB card vs the 16GB card, where the 8GB card was still having issues while the 16GB card was fine, or a lot better.
That could be because the cards aren’t bandwidth starved in the first place on an x8 Gen 4.0, so you only see a small difference. On the other hand, if they were bandwidth starved, the difference would be much greater because more of the data would be waiting to get onto the lanes.

PCIe 3.0 x8 has roughly the same bandwidth as full-lane PCIe 2.0, so it's effectively a pretty edge case even for a 4060 Ti at high resolutions.

The increased VRAM reduces the instantaneous bandwidth requirements, which is great if you’re bandwidth starved.
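As a rough sanity check on the lane/generation trade-off discussed above, here is a back-of-the-envelope calculation; the per-lane figures are approximate effective rates after encoding overhead, and the helper name is just for illustration:

```python
# Approximate effective per-lane throughput in GB/s for each PCIe generation
# (after 8b/10b encoding overhead for Gen 1-2, 128b/130b for Gen 3+).
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """One-direction link bandwidth in GB/s (illustrative helper)."""
    return PER_LANE_GBPS[gen] * lanes

# PCIe 3.0 x8 is roughly the same as PCIe 2.0 x16 (~8 GB/s each),
# while PCIe 2.0 x8 halves that again (~4 GB/s) - the worst case for
# an x8-only card dropped into an older system.
print(pcie_bandwidth(3, 8))   # ~7.9 GB/s
print(pcie_bandwidth(2, 16))  # ~8.0 GB/s
print(pcie_bandwidth(2, 8))   # ~4.0 GB/s
```

The point being that an x8 card only loses meaningful bandwidth once the slot drops a generation or two behind.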
The problem is Nvidia being Nvidia; it was really silly of them to have gone with 8 lanes only, as these cards are aimed at people who want to upgrade a midrange system, and most midrange systems will have either PCIe 3.0 or maybe even older PCIe 2.0. Even their 3060 Ti had 16 lanes of PCIe 4.0 before.
Yes, these cards are clearly aimed at more mainstream segments, so no need for the extra lanes, or even the VRAM in that regard.

The real question is why bother with 16 GB for these cards outside of very limited situations?

I suppose the reviewers seem to have found something to jump on. ;)
 
What they also highlight are the problems of the PCIe x8 link on the RTX 4060 Ti - in their extensive testing, even on PCIe 3.0 systems (which are common), the performance drop is noticeable. I suspect the RTX 3060 Ti will have fewer issues.

Also the degradation in image quality. The 16GB model doesn't seem to have as many issues, although the lack of PCIe bandwidth can sometimes be noticed.

This is something Nvidia and AMD (RTX 4060 Ti, RTX 4060, RX 7600) have done with some of their recent mainstream cards - 8GB VRAM and a cut-down PCIe bus. It's very cynical, because I expect many people buying these cards will be upgrading older systems. Ironically, it means a high-end card is going to have fewer issues on an older PCIe system than these newer budget cards.

If these sorts of cards were well under £250, it would be less of a problem.
 
Yes, these cards are clearly aimed at more mainstream segments, so no need for the extra lanes, or even the VRAM in that regard.

The real question is why bother with 16 GB for these cards outside of very limited situations?

I suppose the reviewers seem to have found something to jump on. ;)

Totally disagree, being a mainstream user myself. Enthusiasts need to understand the basic market, and I talked about how PCIe limitations are a real problem.

Almost all of the gamers I know use sub-£500 cards, and more importantly, many keep their systems for years.

Lots of Intel systems are still limited to PCIe 3.0! Lots of AMD APU systems are limited to PCIe 3.0 too. Lots of Zen 2 and Zen 3 systems are limited to PCIe 3.0, because B450 is still in use now. Even for almost the first year of Zen 2, there was no B550!
 
Yes, and they don’t have these for 4K VRAM comparisons.

You can get 1440p monitors with 120Hz and higher refresh rates for under £150 now. They tested games at QHD and 1080p. The only reason there is a 16GB model is that they can't do 12GB due to the memory controller layout. Next generation, a 12GB RTX 5060 will be possible with 3GB GDDR7 modules.

I run games at 1440p and have an RTX 3060 Ti on a PCIe 3.0 system with a Ryzen 7 5700X. £400 was the limit I put on an 8GB card in 2021, and in the end, seeing how the RTX 3070/RTX 3070 Ti are hitting the same bottleneck, I think I struck the best balance there.

I see the VRAM limitations already - my mate on an RX 6700 XT has fewer problems. HUB also tested at 1080p and QHD and showed problems there too. The consoles can use at least 10GB of VRAM, so the writing was on the wall already.

However, my system is old now - but so many people upgraded to similar configurations much later on. Lots of prebuilt systems have similar specifications. Lots of Intel and AMD systems are still PCIe 3.0!

But it isn't 2021, it's 2024, and a £300+ 8GB card is a joke; sadly, too many people overspent on 8GB cards. People need to realise Nvidia and AMD had you, one way or another.

The PCIe x8 bus was always a big problem with 8GB of VRAM. I told people for years that this was an issue. Even the RX 6600 XT was the same configuration, and I told people it wasn't worth £300+ even at launch. It is OK at almost £200, but I thought it was a crap dGPU with a crap launch price. I don't like the RX 7600/RX 7600 XT either - too expensive for a cut-down dGPU.

This is why my next platform will have PCIe 5.0, unless I end up going the budget route. I fully expect to see the same PCIe x8 crap on most sub-£500 cards going forward.
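The memory controller point above (why a 128-bit card can do 8GB or 16GB but not 12GB until 3GB modules arrive) can be sanity-checked with a quick calculation; the function name and module sizes here are illustrative, not a real API:

```python
def vram_options(bus_width_bits: int, module_gb: int, clamshell: bool = False) -> int:
    """Total VRAM in GB for a given bus width and GDDR module density.

    Each GDDR module occupies a 32-bit channel; "clamshell" mode puts
    two modules on each channel, doubling capacity but not bandwidth.
    """
    channels = bus_width_bits // 32
    modules = channels * (2 if clamshell else 1)
    return modules * module_gb

# A 128-bit bus with 2GB GDDR6 modules gives 8GB, or 16GB in clamshell;
# 12GB on the same bus only becomes possible with 3GB GDDR7 modules.
print(vram_options(128, 2))                  # 8
print(vram_options(128, 2, clamshell=True))  # 16
print(vram_options(128, 3))                  # 12
```

This is why the 4060 Ti's only capacity options were 8GB and 16GB: the step in between requires a module density that didn't exist for GDDR6.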
 
Totally disagree, being a mainstream user myself. Enthusiasts need to understand the basic market, and I talked about how PCIe limitations are a real problem.

Almost all of the gamers I know use sub-£500 cards, and more importantly, many keep their systems for years.

Oh I thought it was just follow the banana route and get the best value card! :p
 

You would think!

A big issue is also how many prebuilt systems are still being configured with PCIe 3.0 platforms. The RTX 4060 Ti is often sold at quite a premium over the RTX 4060 too. I just tell people to find the extra for an RTX 4070. AMD options are not as common in the UK.

Some companies are trying to charge £1000 for RTX4050 6GB laptops! :cry:
 
You can get 1440p monitors with 120Hz and higher refresh rates for under £150 now. They tested games at QHD and 1080p. The only reason there is a 16GB model is that they can't do 12GB due to the memory controller layout. Next generation, a 12GB RTX 5060 will be possible with 3GB GDDR7 modules.

I run games at 1440p and have an RTX 3060 Ti on a PCIe 3.0 system with a Ryzen 7 5700X. £400 was the limit I put on an 8GB card in 2021, and in the end, seeing how the RTX 3070/RTX 3070 Ti are hitting the same bottleneck, I think I struck the best balance there.

I see the VRAM limitations already - my mate on an RX 6700 XT has fewer problems. HUB also tested at 1080p and QHD and showed problems there too. The consoles can use at least 10GB of VRAM, so the writing was on the wall already.

However, my system is old now - but so many people upgraded to similar configurations much later on. Lots of prebuilt systems have similar specifications. Lots of Intel and AMD systems are still PCIe 3.0!

But it isn't 2021, it's 2024, and a £300+ 8GB card is a joke; sadly, too many people overspent on 8GB cards. People need to realise Nvidia and AMD had you, one way or another.

The PCIe x8 bus was always a big problem with 8GB of VRAM. I told people for years that this was an issue. Even the RX 6600 XT was the same configuration, and I told people it wasn't worth £300+ even at launch. It is OK at almost £200, but I thought it was a crap dGPU with a crap launch price. I don't like the RX 7600/RX 7600 XT either - too expensive for a cut-down dGPU.

This is why my next platform will have PCIe 5.0, unless I end up going the budget route. I fully expect to see the same PCIe x8 crap on most sub-£500 cards going forward.
You can’t have it all.

I sure can’t, and I’m certainly not going to believe everything I read or hear.

It has to genuinely fit my requirements.
 
On the AMD side, B650E and X670E boards are pretty power hungry AFAIK, even at idle.

Unsure about LGA1700, as even something like the cheap(ish) Asus H610M Pro has x16 PCIe 5.0 - but from the CPU.

While motherboard manufacturers obviously love to upsell and want to make as much profit as possible (unsure, but I would suspect PCIe 5.0 traces need better design, possibly even better materials/layers), I am worried about the power rise I saw on AM5 (and on AM4 with PCIe 4.0).

Sure, a full x16 card is far better at masking low VRAM by streaming things from system RAM, but the main problem is too little VRAM - and Nvidia keep doing that*.

I do wonder if x4/x8/x16 is just about costs. When every mm² on the die gets counted double by the bean counters, increasing internal structures would cost something, and the analog parts of chips don't scale well. But some of this seems to be mostly a packaging cost. Well, that and the market segmentation thing.


* And any VRAM threads get shut down quickly on OCUK. And 12GB is going to face this soon too.
 
I was about to agree, but then thinking about it, I wouldn't say marketing masterclass - more that they pretty much have us over a barrel :/

We get 8GB because they don't NEED to give us any more, and we get x8 lanes because they don't NEED to give us x16 - "we" here being the market in general, until a decent-value alternative becomes available or the AI market crashes and they need to start caring about us again.
 
Clearly Nvidia are not fools, as they've managed to get people buying 70-class cards for $800+, 60-class for $600 and 50-class for $300-500.

That they have achieved this in the space of a few years is nothing short of a marketing masterclass.
Just one of the many changes since lockdown.

The world’s a complicated place. :)
 
What they also highlight are the problems of the PCIe x8 link on the RTX 4060 Ti - in their extensive testing, even on PCIe 3.0 systems (which are common), the performance drop is noticeable. I suspect the RTX 3060 Ti will have fewer issues.
Yeah, that's a good finding, but the video still relies a lot on 1440p stats, while the card has been marketed for 1080p, which may make for a flawed comparison, especially against the 16GB variant, because the latter is dearer by $100.
HUB seem to be opining that $399 would be a fair price for the 16GB variant... if only they could enlighten the decision makers at Nvidia!

We get 8GB because they don't NEED to give us any more, and we get x8 lanes because they don't NEED to give us x16 - "we" here being the market in general, until a decent-value alternative becomes available or the AI market crashes and they need to start caring about us again.
Market segmentation 101. Though I think 1080p will be won over by APUs within the next generation, or at worst by 2026, so Nvidia will be working out a new scheme lol
 
You can get 1440p monitors with 120Hz and higher refresh rates for under £150 now. They tested games at QHD and 1080p. The only reason there is a 16GB model is that they can't do 12GB due to the memory controller layout. Next generation, a 12GB RTX 5060 will be possible with 3GB GDDR7 modules.

I run games at 1440p and have an RTX 3060 Ti on a PCIe 3.0 system with a Ryzen 7 5700X. £400 was the limit I put on an 8GB card in 2021, and in the end, seeing how the RTX 3070/RTX 3070 Ti are hitting the same bottleneck, I think I struck the best balance there.

I see the VRAM limitations already - my mate on an RX 6700 XT has fewer problems. HUB also tested at 1080p and QHD and showed problems there too. The consoles can use at least 10GB of VRAM, so the writing was on the wall already.

However, my system is old now - but so many people upgraded to similar configurations much later on. Lots of prebuilt systems have similar specifications. Lots of Intel and AMD systems are still PCIe 3.0!

But it isn't 2021, it's 2024, and a £300+ 8GB card is a joke; sadly, too many people overspent on 8GB cards. People need to realise Nvidia and AMD had you, one way or another.

The PCIe x8 bus was always a big problem with 8GB of VRAM. I told people for years that this was an issue. Even the RX 6600 XT was the same configuration, and I told people it wasn't worth £300+ even at launch. It is OK at almost £200, but I thought it was a crap dGPU with a crap launch price. I don't like the RX 7600/RX 7600 XT either - too expensive for a cut-down dGPU.

This is why my next platform will have PCIe 5.0, unless I end up going the budget route. I fully expect to see the same PCIe x8 crap on most sub-£500 cards going forward.

My problem with this point is that current-gen consoles have been sacrificing settings, including resolution, for a long time now... They use adaptive resolution most of the time, and as evidenced by DF's testing, it can drop as low as 480p a lot of the time :/ Not to mention the settings are usually a mix of low, medium and high, with RT significantly reduced or off completely unless targeting a 30fps lock... i.e. despite having a unified 16GB for VRAM/memory, it is doing sweet **** all, and you would still be better off buying a 3070 to get better overall visuals, performance and experience, as DF have showcased so many times now in various games.
 