
NVIDIA 4000 Series

It seems better to buy a 3060 12GB, giving up some performance but gaining VRAM compared to the 4060 Ti 8GB, and saving some money over the 16GB version.

Apparently Nvidia has said that the 4060 Ti isn't even designed to play at 1080p ultra settings, so it's already out of date as a 1080p card before launch: https://youtu.be/ocAi9y4n1UQ?t=440

I can certainly see some people being tempted by the 3060 instead.
 
Eh! So you're saying that if they'd not renamed the 4080 12GB we'd never have had a 4070 Ti?
We may have got something further down the line, as there is a big gap in CUDA cores between the 4070 and the 4070 Ti/4080 12GB, but I think the plan to call it a 4080 12GB was more to do with making the daft price seem justified.

The 4070 spec has not changed and would always have been what Nvidia intended to release; the 4070 Ti was just Nvidia's way to get the 4080 12GB on the market, albeit having to concede $100 off what they originally wanted for the card.
 



What a fantastic video by Digital Foundry again; over to you, Steve/HUB ;) :cool:
Oh wow, another AMD-sponsored game where - as I've said repeatedly - the more VRAM it uses, the worse it looks. Who would have thought? I was quite convinced that the game was fine and it was the cards that didn't have a lot of VRAM, but maybe I was wrong... Well...

 

  • GeForce RTX 4060: MSRP $299/€329
  • GeForce RTX 4060 Ti 8GB: MSRP $399/€439
  • GeForce RTX 4060 Ti 16GB: MSRP $499/€549
 
It would likely have been released a year after the 4070 as a replacement, with a few more CUDA cores at the same price.
You're trolling, right? You seem to be responding to my posts without addressing the salient point raised in them. For the third time now: I wasn't wondering if or when they would've released the real 4070 Ti, I was wondering how much worse the real 4070 Ti would've performed.
 
You're trolling, right? You seem to be responding to my posts without addressing the salient point raised in them. For the third time now: I wasn't wondering if or when they would've released the real 4070 Ti, I was wondering how much worse the real 4070 Ti would've performed.

If Nvidia hadn't renamed the 4080 12GB, the "real" 4070 Ti would just have been the current 4070 we have now.
 
You're trolling, right? You seem to be responding to my posts without addressing the salient point raised in them. For the third time now: I wasn't wondering if or when they would've released the real 4070 Ti, I was wondering how much worse the real 4070 Ti would've performed.
Nvidia had the plan and that plan didn't include a 4070 Ti until the 4080 12GB got cancelled. If they had released a 4070 Ti in the future then it would probably have been more akin to the performance lift the 3070 Ti got over the 3070, so around 5%.
 
Nvidia had the plan and that plan didn't include a 4070 Ti until the 4080 12GB got cancelled. If they had released a 4070 Ti in the future then it would probably have been more akin to the performance lift the 3070 Ti got over the 3070, so around 5%.
Because apparently you have insider knowledge that nobody outside of Nvidia knew about. :cry:
 
Because apparently you have insider knowledge that nobody outside of Nvidia knew about. :cry:
I think it's just common sense given how predictable Nvidia has been in their releases so far this generation. Also, a 70 Ti / Super card has never been something Nvidia has released before a 70 until they were forced to by the poor press the 4080 12GB got.
 
Someone is in a pleasant mood.

You said this "It is only made to look decent value by the stupidly high prices of the GPU's below it".

£1,050 for the 4080 is much less 'stupidly high' than £1,579 for the 4090.

£1,000 or less would be more ideal, but I think it will fall a bit further.

You've got it the wrong way around: Nvidia's pricing strategy for each series is based around charging as much as possible for the flagship card.

They will probably charge close to £2,000 for the RTX 4090 Ti, but there's no chance it will deliver 2x the performance of the RTX 4080.
My mood is fine, though reasoning with my seven-year-old seems easier than with you! ;)

You started off by stating that "...the 4090 is not good value", which is like saying "the sun is hot". I repeat what others have said: the top-tier card has never been "good value", nor is it intended to be. Where have you been for the last ten years? (Maybe you're only ~15 years old, which would explain things and for which I apologise. :))

The top-tier card (normally called "Titan") was usually the preserve of the graphics/video professional, because it generally had twice the video memory of the consumer gaming variant and cost near enough 50% more than the x80 Ti class card, though the gaming performance itself was barely better, if at all.

So if you go back ten years, if memory serves me correctly, the GTX Titan cost ~$1,000 while the same-generation 780 Ti cost ~$700; gaming-performance-wise both were pretty much the same, but the Titan had twice as much memory. A similar pattern continued up until the RTX 3090, where Nvidia departed from the naming scheme and dropped the "Titan" name, but everybody knew it was the Titan-class card: barely an improvement in gaming performance over the 3080 Ti, but with twice as much video memory, for which users paid a premium even though it was really only graphics professionals that reaped the benefit.

Fast-forward to today, where we have the 4090, which is in effect the Titan-class card, but Nvidia has decided to break with precedent and create a much bigger performance gap between the x80 and the "Titan" 4090. You seem to forget that the 4080 launched at £1,200, so yes, that was a "stupidly high price" based on last-gen pricing, which made the 4090 seem like decent value for a change; it is also why the 4080 sold relatively poorly and why they have dropped the price, though many 4080s are still priced around £1,200.

You also seem to have been hoodwinked by Nvidia into trying to justify the price of the 4080 by comparing it to the 4090, which is exactly what Nvidia want you to do. What you should be doing instead is comparing the price (and cost per frame) of the 4080 to that of the 3080.
 
Oh wow, another AMD-sponsored game where - as I've said repeatedly - the more VRAM it uses, the worse it looks. Who would have thought? I was quite convinced that the game was fine and it was the cards that didn't have a lot of VRAM, but maybe I was wrong... Well...

For people wondering what is happening here, in order to allow low VRAM cards to run the game without stutter, they have had to reduce the amount of textures held in VRAM and stream them in late. Lower VRAM requirements but more pop in.
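
(For anyone who wants to picture the mechanism being described, here's a minimal sketch of the general idea of a VRAM texture budget with streaming: only the highest-priority textures stay fully resident, and the rest start at low resolution and stream in later, which is the pop-in. The names and numbers below are entirely hypothetical and not taken from this game's engine.)

```python
# Hypothetical sketch of a texture-streaming budget; names and sizes are
# made up and do not reflect any real engine.
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    size_mb: int      # VRAM cost of keeping the full-resolution mips resident
    priority: float   # e.g. screen coverage / distance to camera

def pick_resident_textures(textures, vram_budget_mb):
    """Keep the highest-priority textures fully resident until the budget is
    spent; everything else starts low-res and streams in late (pop-in)."""
    resident, streamed_late = [], []
    used = 0
    for tex in sorted(textures, key=lambda t: t.priority, reverse=True):
        if used + tex.size_mb <= vram_budget_mb:
            resident.append(tex)
            used += tex.size_mb
        else:
            streamed_late.append(tex)   # loads late -> visible pop-in
    return resident, streamed_late

scene = [
    Texture("hero_character", 256, priority=1.0),
    Texture("nearby_wall", 128, priority=0.8),
    Texture("distant_building", 512, priority=0.3),
    Texture("skybox", 256, priority=0.2),
]

# A smaller budget (what an 8GB card effectively has left for textures)
# pushes more of the scene into the "stream late" bucket.
resident, late = pick_resident_textures(scene, vram_budget_mb=512)
print([t.name for t in resident])   # ['hero_character', 'nearby_wall']
print([t.name for t in late])       # ['distant_building', 'skybox']
```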
 
My mood is fine, though reasoning with my seven-year-old seems easier than with you! ;)

You started off by stating that "...the 4090 is not good value", which is like saying "the sun is hot". I repeat what others have said: the top-tier card has never been "good value", nor is it intended to be. Where have you been for the last ten years? (Maybe you're only ~15 years old, which would explain things and for which I apologise. :))

The top-tier card (normally called "Titan") was usually the preserve of the graphics/video professional, because it generally had twice the video memory of the consumer gaming variant and cost near enough 50% more than the x80 Ti class card, though the gaming performance itself was barely better, if at all.

So if you go back ten years, if memory serves me correctly, the GTX Titan cost ~$1,000 while the same-generation 780 Ti cost ~$700; gaming-performance-wise both were pretty much the same, but the Titan had twice as much memory. A similar pattern continued up until the RTX 3090, where Nvidia departed from the naming scheme and dropped the "Titan" name, but everybody knew it was the Titan-class card: barely an improvement in gaming performance over the 3080 Ti, but with twice as much video memory, for which users paid a premium even though it was really only graphics professionals that reaped the benefit.

Fast-forward to today, where we have the 4090, which is in effect the Titan-class card, but Nvidia has decided to break with precedent and create a much bigger performance gap between the x80 and the "Titan" 4090. You seem to forget that the 4080 launched at £1,200, so yes, that was a "stupidly high price" based on last-gen pricing, which made the 4090 seem like decent value for a change; it is also why the 4080 sold relatively poorly and why they have dropped the price, though many 4080s are still priced around £1,200.

You also seem to have been hoodwinked by Nvidia into trying to justify the price of the 4080 by comparing it to the 4090, which is exactly what Nvidia want you to do. What you should be doing instead is comparing the price (and cost per frame) of the 4080 to that of the 3080.
Why'd you need such a long reply for what was quite a simple post?

Obviously you're quite emotionally invested in getting a flagship GPU like the 4090.

I'm not particularly advocating the RTX 4080 either, but I was comparing its value with that of the 4090.

The RX 7900 XT beats them both on cost per frame.

I don't care how the GPU is named, performance and value are what count.
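
(Since cost per frame keeps coming up as the yardstick here, a minimal sketch of the arithmetic: price divided by average FPS from whichever benchmark suite you trust, lower being better. The card names, prices and frame rates below are placeholders purely to show the calculation, not real benchmark results.)

```python
# Illustrative cost-per-frame calculation; the prices and FPS figures are
# placeholders, not real benchmark data.

def cost_per_frame(price_gbp, avg_fps):
    """Price divided by average FPS: lower is better value."""
    return price_gbp / avg_fps

# Hypothetical cards: (price in GBP, average FPS at your chosen
# resolution/settings). Substitute real review numbers before drawing conclusions.
cards = {
    "Card A": (1579, 140.0),
    "Card B": (1050, 105.0),
    "Card C": (800, 95.0),
}

for name, (price, fps) in sorted(cards.items(),
                                 key=lambda kv: cost_per_frame(*kv[1])):
    print(f"{name}: £{cost_per_frame(price, fps):.2f} per frame")
```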
 
For people wondering what is happening here, in order to allow low VRAM cards to run the game without stutter, they have had to reduce the amount of textures held in VRAM and stream them in late. Lower VRAM requirements but more pop in.

The devs added the option for texture streaming to the game too (the fastest setting is equivalent to the launch version, so it hasn't been dumbed down).

They fixed/optimised the game's texture handling, which is why, even maxed out with the texture-streaming settings, the game runs and looks better on GPUs while still using less VRAM than pre-1.05 versions; i.e. the game was a buggy mess with the texture settings, especially below the high texture setting.
 
Wow, they've caved to the gamers with potato PCs who complained about loading times.

Now it loads much faster, but it introduces lots of stutters due to shaders not getting cached.

Even on NVMe, the game's loading and especially its first-time shader compilation were a joke; plenty of other games with way better textures etc. run smooth as butter and handle it far better.
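
(To make the cause of that stutter concrete, here's a minimal sketch of the trade-off: compiling and caching shaders up front makes the load longer but gameplay smooth, while compiling them lazily on first use makes the load fast but hitches mid-game. The shader names and the timing stand-in are entirely hypothetical.)

```python
# Hypothetical sketch of precompiled vs. compile-on-first-use shaders.
# Names and the sleep() stand-in are made up; real engines use driver/PSO caches.
import time

SHADERS = ["terrain", "water", "character_skin", "particles"]

def compile_shader(name):
    time.sleep(0.05)              # stand-in for an expensive driver compile
    return f"binary({name})"

def precompile_all(cache):
    """Pay the whole cost during loading: longer load screen, smooth gameplay."""
    for name in SHADERS:
        cache[name] = compile_shader(name)

def draw_with(name, cache):
    """Compile lazily: fast load, but the first use of each shader hitches."""
    if name not in cache:         # cache miss mid-game -> visible stutter
        cache[name] = compile_shader(name)
    return cache[name]

# Option A: long load, no in-game hitches.
cache_a = {}
precompile_all(cache_a)

# Option B: near-instant load, hitch the first time each effect appears.
cache_b = {}
for shader in ["terrain", "terrain", "water", "particles"]:
    draw_with(shader, cache_b)
```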
 
For people wondering what is happening here, in order to allow low VRAM cards to run the game without stutter, they have had to reduce the amount of textures held in VRAM and stream them in late. Lower VRAM requirements but more pop in.
No, that's not what's happening. There is a separate option for texture streaming, and it has nothing to do with the textures themselves.
 