
NVIDIA 4000 Series

Can we stop arguing (again) about what each card 'should' be called, before someone has an aneurysm?

The names don't set the prices anyway; it's irrelevant.
 
The names do matter, as Nvidia are clearly trying to pass off lower-end products as higher-end ones, and even charging a premium on top of last gen's prices.

It is totally relevant. People haven't learnt anything from Pascal V1, when Nvidia tried the same trick. By passing off lower-end products as higher-end ones, the percentage transistor jump is lower than at the top.

I don't even understand how people cannot realise this. The cache amounts also start to go down on the lower-end models, meaning memory bandwidth will start to hit more issues at the bottom.

An example:

AD102 (608mm²) from GA102 (628mm²): 76.3 billion transistors from 28.3 billion transistors, i.e. a 2.69x jump. AD102 = 98MB cache.
The RTX 4090 is an 89% salvage of AD102, and the RTX 3090 Ti is the full GA102.

RTX 4090 from RTX 3090 Ti:

46% increase in performance at 4K, and 39% at QHD.

AD103 (379mm²) from GA102 (628mm²): 45.9 billion transistors from 28.3 billion transistors, i.e. a 1.62x jump.
The RTX 4080 uses a fully enabled chip.

15% performance jump from the RTX 3090 Ti at 4K, and 19% at QHD.

AD104 (295mm²) from GA102 (628mm²): 35.8 billion transistors from 28.3 billion transistors, i.e. a 1.27x jump. The RTX 4070 Ti uses a fully enabled chip.

8.9% slower at 4K, and 1.6% slower at QHD.

So as the transistor jump becomes smaller and smaller, the core performance gain decreases. The lower-end cores also have smaller memory buses. AD104 = 49MB cache.
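The transistor-jump arithmetic above can be sanity-checked in a few lines, using the transistor counts quoted in this post:

```python
# Transistor counts in billions, as quoted in the post above.
# GA102 (Ampere flagship die) is the baseline.
GA102 = 28.3
ada_dies = {
    "AD102": 76.3,
    "AD103": 45.9,
    "AD104": 35.8,
}

for name, transistors in ada_dies.items():
    jump = transistors / GA102
    print(f"{name}: {jump:.2f}x transistor jump over GA102")
```

The printed ratios (~2.70x, 1.62x, 1.27x) line up with the figures quoted above, and the performance deltas shrink in the same order.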

This is why the RTX 3080 12GB is nearly 20% faster in Cyberpunk 2077 with RT on:

The RTX 3080 12GB is around 30 TFLOPs and the RTX 4070 is around 29 TFLOPs.
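A quick sketch of why two cards with near-identical TFLOPs can diverge: raw memory bandwidth is bus width (in bytes) times per-pin data rate. The bus widths and data rates below are taken from public spec sheets, not from this post, so treat them as assumptions:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Publicly listed specs (assumed here): 384-bit/19Gbps vs 192-bit/21Gbps GDDR6X.
rtx3080_12gb = mem_bandwidth_gb_s(384, 19.0)  # 912 GB/s
rtx4070 = mem_bandwidth_gb_s(192, 21.0)       # 504 GB/s

print(f"RTX 3080 12GB: {rtx3080_12gb:.0f} GB/s")
print(f"RTX 4070:      {rtx4070:.0f} GB/s")
print(f"Ratio:         {rtx3080_12gb / rtx4070:.2f}x")
```

The 4070's much larger L2 cache offsets some of that ~1.8x raw-bandwidth gap in practice, which is why the difference mainly shows up in bandwidth-heavy cases like RT.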
 
It is totally relevant. People haven't learnt anything from Pascal V1, when Nvidia tried the same trick. By passing off lower-end products as higher-end ones, the percentage transistor jump is lower than at the top.

I don't even understand how people cannot realise this.

I can. It begins with the mindshare excusing everything, saying the companies are not charities (after they have been milking the mining booms, and now the AI boom, in plain sight). People are waking up, just very slowly.
 
I can. It begins with the mindshare excusing everything, saying the companies are not charities (after they have been milking the mining booms, and now the AI boom, in plain sight). People are waking up, just very slowly.

You saw the test of the RTX 3080 12GB against the RTX 4070 12GB I posted above? Similar speed cores, but because the RTX 3080 has much more memory bandwidth, it is up to 20% faster in Cyberpunk 2077 with RT.

It was the same issue when AMD tried to quietly replace the RX 5700 XT with the RX 6600 XT at a similar price. At 1080p it was ahead, but at 4K it lost, despite the RX 6600 XT having more TFLOPs than the RX 5700 XT! Navi 33 has 1/3 the Infinity Cache of Navi 32, despite the latter only having 25% more shaders.
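The resolution-dependent flip can be illustrated with a toy effective-bandwidth model: cache hits are served at on-die cache speed, misses fall back to VRAM, and the hit rate drops as the working set grows with resolution. All numbers below are made-up illustrative values, not measured figures for any card:

```python
def effective_bandwidth(vram_gb_s: float, cache_gb_s: float, hit_rate: float) -> float:
    """Toy model: hits served at cache bandwidth, misses at VRAM bandwidth."""
    return hit_rate * cache_gb_s + (1 - hit_rate) * vram_gb_s

# Hypothetical numbers for illustration only.
vram, cache = 256.0, 2000.0  # GB/s
for res, hit in [("1080p", 0.55), ("1440p", 0.40), ("4K", 0.25)]:
    print(f"{res}: {effective_bandwidth(vram, cache, hit):.0f} GB/s effective")
```

With a generous hit rate at 1080p the small-bus card punches well above its raw VRAM bandwidth, but at 4K the falling hit rate drags it back toward that raw figure, which is the pattern the 6600 XT showed.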
 
You saw the test of the RTX 3080 12GB against the RTX 4070 12GB I posted above? Similar speed cores, but because the RTX 3080 has much more memory bandwidth, it is up to 20% faster in Cyberpunk 2077 with RT.

If it was the Dan Owen one, I watched it after work earlier, but you're preaching to the converted, I'm afraid. I entertain it as it just confirms what I have been saying for the whole of the Ampere era: bandwidth is important, then VRAM. As HUB put it some time ago (and the selective ones are now using them about 'better than native'): if you're spending over $500 on a premium product, it has to come with at least 12GB of VRAM. There were people literally laughing at the planned obsolescence references not long ago, and now it's come home to roost (again).
 
The rumour mill is putting out some interesting things: a 16GB 4060 Ti and an AD103-based 4070 (my own speculation: maybe also opening the door to 16GB?!).


 
It's the utter contempt these companies show to PCMR, which has made them billions of USD in profits. I don't know why PCMR defends these companies. The whole bunch of them are chancers! There used to be a time when loyal customers were rewarded. Now loyal customer = mug. It's happening everywhere.

I once heard an immensely cheesy line in a romcom a few years ago, where a man has an affair because his wife is physically and mentally abusive. She asks him why he likes this other woman and he replies, "Because she doesn't punish me for loving her."

Pure cheese and cringe, but it reminds me of how PC component manufacturers treat the customers who keep buying from them with utter contempt.
 
There were people literally laughing at the planned obsolescence references not long ago, and now it's come home to roost (again).

Well, they were warned about it... There's a reason the 3080 10GB was £650, the 12GB version was £1000+ with no real MSRP, and the 3080 Ti was £1200 with the same 12GB.

The 3080 10GB was great value on day one, and 1-2 years later the skeletons in the closet were clearly showing to the people who didn't realise the card had planned obsolescence written all over it from day 1. Same with the 8GB cards; sadly the 3070 Ti, 3070 and 3060 Ti were all good cards with way too little VRAM for the time.

Anyway, we have been through this enough now with VRAM, and the reality is out there, so anyone arguing otherwise is living in a different world from the reality of what these companies are doing to their customers, and how they make sure you update sooner rather than later.
 
Well, they were warned about it... There's a reason the 3080 10GB was £650, the 12GB version was £1000+ with no real MSRP, and the 3080 Ti was £1200 with the same 12GB.

The 3080 10GB was great value on day one, and 1-2 years later the skeletons in the closet were clearly showing to the people who didn't realise the card had planned obsolescence written all over it from day 1. Same with the 8GB cards; sadly the 3070 Ti, 3070 and 3060 Ti were all good cards with way too little VRAM for the time.

Anyway, we have been through this enough now with VRAM, and the reality is out there, so anyone arguing otherwise is living in a different world from the reality of what these companies are doing to their customers, and how they make sure you update sooner rather than later.
You've got that all wrong with the 3080, mate; it's totally fine, you just have to turn on DLSS and that keeps it relevant, and it's still way better than everything! :p
 


Ohh look, DLSS 4 in the making... and why you won't need lots of VRAM in the future...

It's good tech, but Nvidia will find reasons with it to limit VRAM on GPUs again, for sure... Too busy with software tricks instead of adding more hardware... Let's see how long they can carry on that way.
 
The 4060 Ti... I mean 4070 :D would have pushed the 3080 10GB to the xx60 tier of cards... maybe it should have been called a 3070 or 3070 Ti with 10GB/20GB, and the 3080 Ti 12GB (with maybe 24GB of VRAM) should have been the OG 3080.

A true 3090 should have been closer to a 4080, going off TPU relative performance numbers, as there's a massive 30% between them.

A smart move would have been to move a 3070 Ti/3080 on if you can get a high price for it, and pick up a new 4070 with warranty, knowing the card's not been jetwashed and mined to death.
 
You've got that all wrong with the 3080, mate; it's totally fine, you just have to turn on DLSS and that keeps it relevant, and it's still way better than everything! :p

Yup, madness isn't it. Could be worse: you could pay ££££ and still not manage to get a locked 60 fps in broken/unoptimised games:

[benchmark screenshots]

Heck even with upsampling, you may not get a locked 60:

[benchmark screenshots]

Hopefully Nvidia will provide you with a 16GB version of the 4070 soon, otherwise you're going to be suffering through another 2-3 years :cry:

Remember, it's never the game's fault :) ;)

Too busy with software tricks instead of adding more hardware... Let's see how long they can carry on that way.

Until Nvidia have competition in "all" areas, they'll be leading the market. It would be nice for AMD to come out swinging with a new feature first, tbh, rather than just following what Nvidia do, but as shown, AMD give no ***** about the PC market.

The 3080 10GB was great value on day one, and 1-2 years later the skeletons in the closet were clearly showing to the people who didn't realise the card had planned obsolescence written all over it from day 1. Same with the 8GB cards; sadly the 3070 Ti, 3070 and 3060 Ti were all good cards with way too little VRAM for the time.

Despite all the complaints from HUB of "zOMG VRAM!!!!", it seems they would still rather have a 3080 10GB over a 6800 XT, and haven't really found any "serious" issues :cry:

 
I don't see the point of that comparison at the moment.

The RX 6900 XT and 6950 XT are generally a bit faster than both the RTX 3080 and RTX 4070, and the 6950 XT can be had for £620.

If you want better RT performance, then you would choose a modern Nvidia card.
 
Here's a video of an RTX 3080 Ti struggling in The Last of Us at 1440p Ultra:

Very high GPU usage throughout the video.

He's using an i7 12700 CPU, which I would've thought would be good enough.

Maybe some of the Ultra preset options are just awful for GPU utilization?

Is the game sensitive to RAM frequencies? He is using DDR4 @ 3600MHz.

There's a huge amount of system RAM being allocated (>30GB), what the hell?
 