RTX 4070 12GB, is it worth it?

Status
Not open for further replies.
The difference is pretty small, comparing GDDR6 at up to 20 Gbps (for the RX 7900 XT/XTX) with 21 Gbps GDDR6X (e.g. the RTX 4090).

Also, 20 Gbps GDDR6 was released relatively recently compared to the development timelines of the 3000 and 4000 series GPUs. For something like the 4090 it would likely be a bottleneck, and it gets more complicated on the 4080; lower down the stack it's a hit-and-miss story between bandwidth requirements on one side and the cost and power/thermals of getting that bandwidth out of GDDR6 on the other.
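For anyone wanting to sanity-check those figures: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch (bus widths and data rates are taken from public spec listings, so treat them as assumptions):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

cards = {
    "RTX 4090 (GDDR6X, 384-bit @ 21 Gbps)": bandwidth_gbs(384, 21.0),
    "RX 7900 XTX (GDDR6, 384-bit @ 20 Gbps)": bandwidth_gbs(384, 20.0),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")  # 1008 GB/s vs 960 GB/s
```

On the same 384-bit bus, the per-pin gap works out to only about a 5% difference in total bandwidth, which is why the bus width paired with the memory matters at least as much as the memory type.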
 
Yeah, but I wonder how much of this is because Nvidia insists on using GDDR6X for desktop graphics cards instead of lower spec, cheaper VRAM (like AMD does).

I think either Nvidia doesn't want to admit that any advantage from 6X is small (clearly seen when comparing the RTX 3070 and 3070 Ti), or they believe it's a good selling point for marketing RTX cards.

Consoles use 16GB of GDDR6, at 'just' 14 Gbps, no issues there (although they do use >192 bit memory buses).

As far as I know, Nvidia has only released one card with 12GB of GDDR6, the RTX 3060 12GB version.

I'd take 16GB of GDDR6 over 12/8GB of GDDR6X any day. 6X also has the problems of running significantly hotter and using more power, no doubt having an impact on the overall card design.
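The console point above can be run through the same arithmetic: a slower per-pin rate on a wide bus lands in the same ballpark as faster memory on a narrower one. A rough sketch (bus widths here are assumptions based on published console and card specs):

```python
# Peak bandwidth in GB/s = (bus bits / 8) * per-pin Gbps
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

# 'Just' 14 Gbps GDDR6 on wide console-style buses vs 19 Gbps GDDR6X on a 256-bit card
print(bandwidth_gbs(256, 14.0))  # PS5-style 256-bit: 448 GB/s
print(bandwidth_gbs(320, 14.0))  # Series X-style 320-bit pool: 560 GB/s
print(bandwidth_gbs(256, 19.0))  # RTX 3070 Ti (GDDR6X, 256-bit): 608 GB/s
```

The wide bus does most of the work: 14 Gbps GDDR6 on 320 bits gets within ~10% of 19 Gbps GDDR6X on 256 bits, without the extra heat and power.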
Didn't Nvidia order too much GDDR6X? Is that why there are GDDR6X RTX 3060 and RTX 3060 Ti cards? I suspect Nvidia ordered so much GDDR6X because it might have been useful for mining cards, so I'm not sure cost is a problem.
 
GTX 1070: 8GB, 2016.
RTX 2070: 8GB, 2018.
RTX 3070: 8GB, 2020.

That's just pure planned obsolescence.

We talk a lot about both AMD and Nvidia not being your friend, and I hold to that, but at least AMD doesn't flog you expensive GPUs that they know will start to struggle to run the latest games, even at 1080p, just 2 to 3 years later. AMD would never get away with that, and what narks me is that Nvidia knows they will, because they're the ones with an army of white knights.

8GB is dead. We should have been on a 16GB RTX 2070 by now.
The 1070 and 2070 were fine; they're 5 and 7 year old cards now, so they shouldn't be expected to last forever.
 
The difference is pretty small, comparing GDDR6 at up to 20 Gbps (for the RX 7900 XT/XTX) with 21 Gbps GDDR6X (e.g. the RTX 4090).
GDDR6 is less than that; IIRC it maxes out at around 15 GT/s per pin vs 20 on GDDR6X. GDDR6X actually uses less power to transmit the same amount of data; it runs hotter and uses more power because it's transmitting more data.
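One way to see how "less energy per bit" and "more total power" coexist: total I/O power scales with bandwidth times energy per bit, so a more efficient interface still draws more watts if it moves enough extra data. The pJ/bit figures below are purely illustrative assumptions, not vendor specs:

```python
# Total interface power (W) = bits moved per second * energy per bit
# The pJ/bit values are made-up illustrations, NOT measured or vendor-quoted figures.
def dram_io_power_w(bandwidth_gbs: float, pj_per_bit: float) -> float:
    bits_per_s = bandwidth_gbs * 8e9        # GB/s -> bits/s
    return bits_per_s * pj_per_bit * 1e-12  # pJ -> J

print(dram_io_power_w(448, 7.5))    # slower GDDR6 example: ~26.9 W
print(dram_io_power_w(1008, 7.25))  # faster GDDR6X example: ~58.5 W despite lower pJ/bit
```

Even with a small per-bit efficiency advantage, more than doubling the bandwidth roughly doubles the interface power, which matches the "runs hotter because it's doing more" point.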
 
Or Micron gave them a special deal.

But any extra memory bandwidth also means less die space has to be spent on cache.

Running so hot would be my worry, but has that been a problem on any designs aside from the 3090s?
 
The 1070 and 2070 were fine; they're 5 and 7 year old cards now, so they shouldn't be expected to last forever.
The issue is that before Turing, VRAM kept increasing:
1.)GTX275 896MB
2.)GTX470/GTX570 1.28GB
3.)GTX670/GTX770 2GB
4.)GTX970 4GB
5.)GTX1070 8GB
6.)RTX2070 8GB
7.)RTX2070 Super 8GB
8.)RTX3070/RTX3070TI 8GB

Compare that with AMD/ATI:
1.)HD4870/HD4890 512MB/1GB
2.)HD5870 1GB
3.)HD6950/HD6970 2GB
4.)HD7950/HD7970 3GB
5.)R9 290 4GB/R9 390 8GB
6.)Fury 4GB
7.)Vega 56/Vega 64 8GB
8.)RX5700XT 8GB
9.)RX6700XT 12GB/RX6800 16GB
 
Nvidia has been holding back games and development for a while now.
You'd want 16GB nowadays, with 12GB as the minimum.
As more new games come out on UE5 etc., cards are going to tank as VRAM requirements go up.
I bought the 6700 XT because of its 12GB, and because it was 300 euros cheaper than Nvidia cards like the 3070 8GB for the same or better performance.
Mindfactory shows how well AMD cards are selling; AMD is currently outselling Nvidia there.

 
The issue is that before Turing, VRAM kept increasing:
1.)GTX275 896MB
2.)GTX470/GTX570 1.28GB
3.)GTX670/GTX770 2GB
4.)GTX970 4GB
5.)GTX1070 8GB
6.)RTX2070 8GB
7.)RTX2070 Super 8GB
8.)RTX3070/RTX3070TI 8GB

Compare that with AMD/ATI:
1.)HD4870/HD4890 512MB/1GB
2.)HD5870 1GB
3.)HD6950/HD6970 2GB
4.)HD7950/HD7970 3GB
5.)R9 290 4GB/R9 390 8GB
6.)Fury 4GB
7.)Vega 56/Vega 64 8GB
8.)RX5700XT 8GB
9.)RX6700XT 12GB/RX6800 16GB
I agree that the 3070 should have seen an increase to at least 10GB; that, plus the extra performance, is why I'd always have found the extra money to get a 3080 instead.
 
I agree that the 3070 should have seen an increase to at least 10GB; that, plus the extra performance, is why I'd always have found the extra money to get a 3080 instead.

The only time 8GB really becomes a limitation on the 3070 is when ray tracing enters the equation, where the various extra buffers can demand another 2GB or so over non-RT. For 1440p, 8GB is "fine" for now, and by the time it isn't, the 3070 will have had its day anyhow. (I'm still of the opinion Nvidia should have made it a 10-12GB card, though.)

I only really bought a 3070 because of some ray-tracing-based development I've been doing. Out of anything I've played lately, only Hogwarts Legacy would have required something more than my previous 1070; I rarely do anything other than play older games these days :( 90% of newer releases don't interest me in the slightest.
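A toy budget makes the RT point concrete: an extra ~2GB of ray-tracing buffers can push an otherwise comfortable 8GB card over the edge. Every per-component figure here is a made-up illustration, not a measurement:

```python
# Toy VRAM budget sketch. All per-component sizes are illustrative assumptions.
budget_gb = 8.0
usage = {
    "textures/geometry (1440p, high)": 5.5,
    "frame buffers / post-processing": 1.0,
    "RT acceleration structures + buffers": 2.0,  # the "extra ~2GB" for ray tracing
}
total = sum(usage.values())
print(f"total {total:.1f} GB vs {budget_gb:.0f} GB card; over budget: {total > budget_gb}")
```

Drop the RT line and the same workload fits in 8GB with headroom, which is why the limitation mostly shows up once ray tracing is enabled.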
 
The issue is that before Turing, VRAM kept increasing:
1.)GTX275 896MB
2.)GTX470/GTX570 1.28GB
3.)GTX670/GTX770 2GB
4.)GTX970 4GB
5.)GTX1070 8GB
6.)RTX2070 8GB
7.)RTX2070 Super 8GB
8.)RTX3070/RTX3070TI 8GB

Compare that with AMD/ATI:
1.)HD4870/HD4890 512MB/1GB
2.)HD5870 1GB
3.)HD6950/HD6970 2GB
4.)HD7950/HD7970 3GB
5.)R9 290 4GB/R9 390 8GB
6.)Fury 4GB
7.)Vega 56/Vega 64 8GB
8.)RX5700XT 8GB
9.)RX6700XT 12GB/RX6800 16GB

AMD 5870 also had a 2GB version. https://www.techpowerup.com/gpu-specs/radeon-hd-5870-eyefinity-6.c497
 
So from now on, is RT on 8GB cards going to be a no-no, especially in AAA games? If that's the case, then it seems pretty pointless buying 8GB cards for games that have RT. Luckily, I couldn't give a monkey's about RT :D

It's just a gimmick to get people to buy more expensive cards. Anyone can see that.
 
Oof - This game is really heavy on graphics:

[Image: Ultra_1440p-p.webp (1440p Ultra benchmark chart)]


I think I may rethink getting an RTX 3080, maybe the 10GB would be a limiting factor. Would be interesting to compare it to the 12GB variant.
I keep hearing the 3090 is only 10% faster than a 3080, and that the 3090's lows are higher than the 3080 Ti's average fps?
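On the 10GB vs 12GB question, the two 3080 variants differ in bus width as well as capacity, so the bandwidth gap is easy to work out (bus widths and rates from public spec listings; treat them as assumptions):

```python
# Peak bandwidth in GB/s = (bus bits / 8) * per-pin Gbps
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

# RTX 3080 10GB: 320-bit @ 19 Gbps GDDR6X; 12GB variant: 384-bit @ 19 Gbps
print(bandwidth_gbs(320, 19.0))  # 760 GB/s
print(bandwidth_gbs(384, 19.0))  # 912 GB/s
```

So the 12GB variant brings a ~20% bandwidth uplift along with the extra 2GB, not just more capacity.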
 
What do people think the minimum 'playable' frame rate is for a game?

I used to play hundreds of hours of Perfect Dark multiplayer at sub-20fps; kids these days don't know how good they've got it! :D

On PC, it depends on the game. A shooter needs a minimum of 60, but in a shooter you'd always prioritise frame rate over graphics anyway. For something like an RTS, 30 can be playable.
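Frame-rate targets are easier to reason about as frame-time budgets, since 1000 ms divided by the target fps gives the time each frame may take:

```python
# Frame-time budget: 1000 ms divided by the target frame rate
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (20, 30, 60, 120):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

Going from 30 to 60 fps halves the budget from ~33 ms to ~17 ms per frame, which is why genres that reward reaction time feel so much better at the higher target.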
 