
RTX 4070 12GB, is it Worth it?

Resale value is a genuine reason.

Fake Frame?

For me that's a major minus point although I hear AMD are going to copy that soon so I might have to try Intel!

Intel are also going the fake frame way.

Whilst raw performance would be nice, having a good solution which is only going to get better is a nice bonus. Nvidia are ahead of the game and will likely have the superior option for a good year or more, just as has been the case with previous tech, and arguably still is the case for some things...
 
Fake Frame?

For me that's a major minus point although I hear AMD are going to copy that soon so I might have to try Intel!


It is a feature; you can choose not to enable DLSS 3 or FSR 3. I personally have never used DLSS or FSR of any type and I do not intend to in the future.

We all have different reasons for which GPUs we choose, and for me the deal breaker was the shape and size of the 4090's PCB. Now that I have my 7900 XTX with the waterblock on and vertically mounted, I know I made the correct decision; to my eyes the 4090 would have looked wrong aesthetically.
 
I don't understand why people would rather pay $100 more for an Nvidia card if the performance is the same.
Imagine you were looking for a 48-inch 4K OLED TV which traditionally cost, say, £600, but now, with only two companies left making TVs, they have jacked up the price. So you have the choice of either £1000 for an LG or £900 for a Bush; both use the same panel, but the LG has more features and uses a bit less power.
 
I don't understand why people would rather pay $100 more for an Nvidia card if the performance is the same.
The simple reason is CUDA, when using the GPU for something other than gaming...

The GPU-accelerated software that I use is far better with CUDA than with the OpenCL that AMD (and Nvidia) support... For those of us who can make use of it, the extra £100 is money well spent. I'm not saying the current prices are good though, they're not.
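To illustrate the CUDA point with a minimal sketch (not the poster's actual software; the library and workload here are illustrative assumptions): a lot of GPU-accelerated tools are built on CUDA-backed libraries and simply fall back to the CPU when a CUDA-capable card isn't present, which is where the practical value of CUDA support comes from.

Code:
# Minimal sketch: many GPU-accelerated workflows use CUDA-backed libraries
# and quietly fall back to the CPU on hardware without CUDA support.
import numpy as np

try:
    import cupy as cp        # standard builds target Nvidia's CUDA
    xp, backend = cp, "GPU via CUDA (CuPy)"
except ImportError:
    xp, backend = np, "CPU fallback (NumPy)"

# Illustrative workload: a large matrix multiply, the sort of operation
# render or compute plugins hand off to the GPU.
a = xp.random.rand(2048, 2048).astype(xp.float32)
b = xp.random.rand(2048, 2048).astype(xp.float32)
c = a @ b

print(f"Ran on: {backend}, result shape {c.shape}")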
 
I think what's happened is that a new tier of 4K-capable graphics cards has been created (the RTX 4080 and 4090, plus the 4070 Ti in some games, and the Navi31 GPUs). This tier is only going to be expanded in the future, with cards like the RTX 4090 Ti and possible Navi31 + v-cache GPUs. It looks like the 4090 can just about handle minimums of 60 FPS in demanding games like Hogwarts Legacy (depending on which reviews you look at).

A 4K-capable tier wasn't really something that existed before; even the RTX 3090 and 3090 Ti struggled at 4K in demanding games. This is why both companies think customers will pay, because 4K is still very much a premium/luxury option. You pay through the nose for high-memory-bandwidth cards. Remember the high production cost of the Radeon VII and Vega 64 GPUs? AMD struggled to make money with those.

The other (relatively) new thing is RT hardware; since it was introduced, costs have definitely increased (compare with the GTX 1000 series). You definitely don't get it for free, and it might be a good thing for consumers if high-end cards without RT cores were introduced (not bloody likely :cry:). Maybe avoid cards with lots of RT cores?

I also think the size of the market for 4K cards has grown a lot in the last couple of generations; there are more people willing to buy (almost entirely because of Nvidia).

So, sticking with 1440p-capable cards is where we will see the (somewhat) more affordable cards being released - which was standard for older-generation graphics cards like the GTX 1000 series and RDNA gen 1. To say things have remained the same in the last 5-6 years wouldn't be accurate; games have got significantly more GPU-intensive (as graphical detail has increased) at 1080p, 1440p and higher.
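To put rough numbers on why a separate 4K tier exists (just pixel arithmetic, nothing vendor-specific): 4K pushes 2.25x the pixels of 1440p and 4x the pixels of 1080p, so a card that is comfortable at 1440p can still fall well short at 4K. A minimal sketch:

Code:
# Pixel counts of common gaming resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Prints:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)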
 
In addition, you also have to factor in recent displays. OLEDs and top-end QLEDs, whilst not cheap, are able to push the frames and offer good settings, so the punter has a better choice now compared to two years ago. Most gamers have been satisfied with reasonable 1440p for a long time, so there was no draw for a lot of expense in that bracket.
 
Would an RTX 3080 be fast enough to handle The Witcher 3 (next gen) at 1440p (60 FPS min, RT off)? Or would an RTX 3080 Ti be a better choice?

Are there any reviews with this info?
 
Found this on Google... running a Ryzen 5800X and a 3080 Ti. Based on a quick watch, I'd say you'd likely be safer with the 3080 Ti.

 
I think what's happened is that a new tier of 4K-capable graphics cards has been created (the RTX 4080 and 4090, plus the 4070 Ti in some games, and the Navi31 GPUs)... A 4K-capable tier wasn't really something that existed before; even the RTX 3090 and 3090 Ti struggled at 4K in demanding games.

If the 3090 Ti struggles in some games at 4K, I would say the 4070 Ti definitely does; I wouldn't class it as a 4K card given that the 3090 Ti is faster.



[Image: TPU relative performance chart]
 
It looks like the RTX 3080 struggles a bit at 1440p in The Witcher 3, with RT off:

It can manage a steady framerate with DLSS on Quality mode (dips down to 60 FPS).
The game has had a couple of big patches recently that considerably improve performance, especially around the CPU bottleneck issues.
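For context on that DLSS result: DLSS renders internally at a lower resolution and upscales to the output. The commonly quoted scale factors (treat them as approximations rather than official figures) are about 2/3 per axis for Quality and 1/2 for Performance, so at 1440p output the card is only shading roughly 44% or 25% of the pixels, which is why the 3080 holds 60 FPS far more easily with it on. Rough numbers:

Code:
# Approximate internal render resolution for DLSS at 2560x1440 output.
# Scale factors are commonly quoted approximations, not official figures.
output_w, output_h = 2560, 1440
modes = {
    "Quality":     2 / 3,
    "Performance": 1 / 2,
}

for mode, scale in modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode}: ~{w}x{h} internal ({scale * scale:.0%} of output pixels)")

# Prints roughly:
# Quality: ~1707x960 internal (44% of output pixels)
# Performance: ~1280x720 internal (25% of output pixels)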
 
@humbug
Do you know if that's just an MBA XTX in the TPU slide? I didn't know TPU had it faster than a 4080.

I knew it was faster than the 4080. The only AIB review on TPU I could find was the Asus TUF; it's only 2 to 3% faster than the MBA card.


I think newer drivers have improved it a little since the early reviews; if and when AMD are done with that, I would like to see them retested.
 
The TUF is a good overclocker; it clocks to 3.2GHz, which is a 14% gain over the MBA card, and now it's much closer to the 4090.

[Image: overclocked TUF benchmark result]
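For anyone wanting to sanity-check those figures: if 3.2GHz is a 14% gain, the implied MBA boost clock is around 2.8GHz. Quick arithmetic below (and bear in mind a 14% clock bump rarely turns into a full 14% more frames):

Code:
# Implied reference (MBA) clock from the quoted overclock figures.
oc_clock_ghz = 3.2       # quoted TUF overclock
gain = 0.14              # quoted gain over the MBA card

mba_clock_ghz = oc_clock_ghz / (1 + gain)
print(f"Implied MBA clock: ~{mba_clock_ghz:.2f} GHz")   # ~2.81 GHz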

Reason we aren't looking at RT?

[Image: ray tracing benchmark chart]

Shame overclocking won't get it anywhere close to Ada's RT, not to mention the huge increase in power required for an RDNA 3 OC too.
 
I don't buy the 4K excuse. The IBM T220 had a 3840x2400 resolution in 2001, and the first consumer-orientated 4K monitor arrived in 2013. Monitor resolutions are now hardly going up like they did in the past, as 4K monitors are now around £200 and 4K TVs are cheap. Plus, if you go back far enough, 1680x1050/1680x1200 and 1920x1200 were the "high resolution" resolutions of their era, and using modern logic we would still be stuck at those resolutions, with the average person at 720p! It shows you how much stagnation there is in GPU performance increases and PC monitor technologies, especially at the mainstream pricing tiers (see the quick pixel comparison at the end of this post).

The reality is that modern dGPUs are just overpriced, and you can see that in Nvidia's margins, which are better than Apple's. PC gamers are increasingly mugs and whales, and it's turning into Stockholm Syndrome, because all these companies (as I have said for many years) look at PC gamers as an easy, very high-margin market. Intel and Nvidia were jacking up prices for parts whilst spending billions of USD between them subsidising Atom and Tegra CPUs for tablets, etc. Yet people defended the price increases, when essentially we were subsidising some non-gamer's tablet!

This is why companies are trying such antics, yet all electronics rely on the same silicon manufacturing and I see less of an issue in consumer pricing for other products. Even prebuilt laptops and desktops can be had for less than self-building them now, as it's quite clear the parts are being sold for far less to large OEMs, whilst they jack up pricing for individual parts sold to PC gamers. You also see it with the price inflation in games. Companies have realised PC gamers are a weak-willed captive market, which wasn't the case 15-20 years ago, which is why there was less of this being done back then. This is what the whole marketing-driven FOMO has done to gaming. PC gamers are largely to blame for this situation, as they voted with their pockets.
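To put numbers on the resolution comparison above (the resolutions are ones mentioned in the post; the rest is just pixel arithmetic):

Code:
# Pixel counts for resolutions mentioned above, relative to 720p.
resolutions = {
    "720p":            (1280, 720),
    "1680x1050":       (1680, 1050),
    "1920x1200":       (1920, 1200),
    "4K UHD":          (3840, 2160),
    "IBM T220 (2001)": (3840, 2400),
}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.1f}x 720p)")

The T220's 3840x2400 panel from 2001 actually has more pixels than today's 4K UHD, which is the stagnation point being made.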
 
It looks like the RTX 3080 struggles a bit at 1440p in The Witcher 3, with RT off:

It can manage a steady framerate with DLSS on Quality mode (dips down to 60 FPS).

It doesn't look like it struggles at all; just check the GPU usage - it looks like clear CPU limitations, which should be improved in the recent patch.
1440p 60 FPS should be no problem for the 3080 in The Witcher 3 with RT off. I'll get around to testing the CPU on the old i7 at some point...

If you can get a 3080 Ti for cheap I'd still be looking at that though :p
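On the "check the GPU usage" point, the usual rule of thumb is: if the GPU is sitting well below ~95-100% usage while the frame rate is under target, the limit is the CPU or the engine rather than the card. A rough sketch of that heuristic (thresholds and example numbers are illustrative, not from any particular monitoring tool):

Code:
# Rough rule of thumb for reading GPU usage vs frame rate from an overlay.
def likely_limit(gpu_usage_pct: float, fps: float, target_fps: float = 60.0) -> str:
    """Heuristic only: frame caps, VRAM limits, etc. can also be in play."""
    if fps >= target_fps:
        return "hitting target - no obvious bottleneck"
    if gpu_usage_pct < 95:
        return "likely CPU/engine limited (GPU has headroom)"
    return "likely GPU limited"

# Made-up numbers resembling the situation described above:
print(likely_limit(gpu_usage_pct=70, fps=55))   # likely CPU/engine limited
print(likely_limit(gpu_usage_pct=99, fps=48))   # likely GPU limited

Either way, the patches mentioned above are the first thing to try before deciding between the 3080 and 3080 Ti.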
 