
AMD RDNA 4 thread

Lol, how to say you are incapable of accepting facts without saying you are incapable of accepting facts.
 
**** the whine, imagine the heat!!!!!!!!



I miss the days when AMD and Nvidia would bash each other with funny commercials about the heat output and noise :cry:
 
I wouldn't touch an RTX4070TI with a bargepole at £800 because of its paltry 12GB of VRAM, especially as RT increases VRAM usage and UE5 looks like it will be VRAM heavy. At least the RTX4070 is under £600 so it's more acceptable. But the RTX4070TI should be a sub-£600 RTX4070, the RTX4070 should be an RTX4060TI and the RX7900XT should be the RX7800XT. So it's like trying to win the battle of the overpriced dGPUs.

VRAM is one thing which will age a premium-priced dGPU much quicker than people think. People who spend over £700 on a dGPU want to turn up settings and play at 4K, etc. So it's hilarious to talk about longevity - longevity with PC enthusiasts seems to be the two years between new launches, but looking at Steam it's more like 3~5 years. OFC, Nvidia could have solved this by giving the card 24GB quite cheaply, but the more you buy the more you spend.
 
I like the 4070, but as you ^^^^ say, not quite at $600. I could forgive it its 12GB of VRAM at $500, but not much more. It's not a bad GPU, not at all.
 

The problem is Nvidia not only pushed their dGPUs up by two levels (AMD pushed theirs up one level because of the smaller generational increase), but it also led to the VRAM amounts being mismatched.

An example during Turing was the RTX2080 8GB replacing the GTX1080TI 11GB. This is what we should be having:
1.) RTX4060TI (RTX4070) 12GB under £400
2.) RTX4070 (RTX4070TI) 12GB under £600
3.) RX7800XT (RX7900XT) 20GB under £600

OFC, with GDDR6/GDDR6X being cheaper now, Nvidia could have made the RTX4070TI a 24GB card but chose not to. They hope that in two years' time, when they launch the RTX5070TI 16GB, people will upgrade when the VRAM starts to run out, which will ironically also hurt RT performance more and more.

Again, the VRAM wouldn't be so much of an issue, but the RTX4070TI has premium pricing just like the RTX2080 did.
 

Don't I know it. As you know I have a 2070S, and the more games I install from Game Pass the more I'm reminded that my GPU isn't lacking horsepower. It's not great, but it could still handle the latest and greatest reasonably well at 1440P if not for the lack of VRAM.

All I want is a better GPU with more VRAM, and not for £600+. I don't care what colour it comes in, well... I'm not ready to go blue on that front, but anyway, is that too much to ask from someone with a £500 GPU from two generations ago?
 

Many who are buying it are probably buying the RTX4070TI at £800 over the RTX4070 in the hope it will last longer. However, if the PS5 Pro has more RAM, and the Xbox Series X replacement arrives in 2025 with more RAM, then it will be another case of a decent core held back by cost cutting. Nvidia will then release a 16GB RTX5070. 16GB of VRAM can't be that expensive if the PS5 has been profitable since last year - Nvidia even put GDDR6X on an RTX3060TI because there was too much of it that couldn't be sold.
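
To put very rough numbers on the "can't be that expensive" point, here's a quick back-of-the-envelope sketch; the per-GB price is an assumed illustrative figure, not a quoted spot or contract price:

```python
# Back-of-the-envelope memory cost, purely illustrative.
# GDDR6_PRICE_PER_GB is an assumed figure, not a quoted spot/contract price.

GDDR6_PRICE_PER_GB = 3.5  # USD, assumed for illustration

for gb in (12, 16, 24):
    print(f"{gb:>2} GB of GDDR6 ~= ${gb * GDDR6_PRICE_PER_GB:.0f} in memory chips alone")

# Caveat: more capacity usually means a wider bus (more die area) or denser chips,
# so the chip cost above is a floor on the real bill-of-materials delta, not the total.
```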
 

That's what worries me. I do game development as a hobby (that BTW is not me claiming to be an expert, it's a hobby), but I follow the game dev world and I have to tell you they aren't following the Nvidia lead on VRAM. They want to make better looking games, and I do understand what that means. I live it.

12GB... from now on that is going to age like milk in the sun, mark my words.
-----

To expand on my team blue comments: yeah, I'm sorry but he's right.

 


My big issue is that the RTX4070 and RTX4070TI have the same VRAM amount and memory bandwidth. So when that 12GB of VRAM or the memory bandwidth is the limiting factor, the RTX4070 and RTX4070TI will have the same playable experience. It's like with the RTX3060TI, RTX3070 and RTX3070TI when they hit the VRAM limits. Considering the RTX3060TI was much cheaper, it's the RTX3070 and RTX3070TI owners who got the worst deal here.
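
A toy model of that point, with made-up numbers just to illustrate the shape of it (the core times and the PCIe figure are assumptions, not benchmarks):

```python
# Toy model: once the working set spills out of VRAM, per-frame PCIe traffic
# dominates and the core speed gap between the two cards mostly disappears.
# All numbers below are assumed for illustration, not measured.

def frame_time_ms(core_ms: float, spill_gb_per_frame: float, pcie_gb_per_s: float = 25.0) -> float:
    """Frame time = GPU core work + time to fetch spilled data over PCIe."""
    return core_ms + (spill_gb_per_frame / pcie_gb_per_s) * 1000.0

for name, core_ms in (("RTX4070 (slower core)", 12.0), ("RTX4070TI (faster core)", 10.0)):
    in_vram = frame_time_ms(core_ms, 0.0)
    spilling = frame_time_ms(core_ms, 0.5)  # 0.5 GB of textures pulled in per frame
    print(f"{name}: fits in VRAM {in_vram:.1f} ms, spilling {spilling:.1f} ms")
```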
 
I took a £137 hit when I sold my 3060 Ti FE. I downgraded from a 3080 10GB for the same reason I upgraded to the XT: VRAM. I was maxing out the 3080 on Far Cry 6 (sold it for MSRP) and decided to make do with a 3060 Ti for 18 months, knowing I'd make less of a loss. I would have kept it if I could have played TLOU on ultra with 12GB of VRAM, which would have been ample for me.

Hence I embarked on an upgrade I didn’t envisage 2 years ago.
 
It's why I never bought one; I saw the writing on the wall. I've been saying it since the 3070 was brand new, remember? :)
 
UE5 looks like it will be VRAM heavy.
Actually, the opposite. It's much lighter on VRAM than UE4 (thanks to Nanite) and than most AAA games released up to now, relative to asset quality. I'd worry more about custom engines in that regard (in particular ports from consoles, cough Sony cough). Its problems are the perennial ones for UE: how to properly handle open-world asset streaming without breaking the game, and how to make better use of the CPU - basically defeating the stutterfest it's plagued by. More so for smaller external devs, because there are a lot of ways to trip yourself up as a dev using UE, and it's much harder to actually modify it to your use case to run well when you don't have an elite programming team that specializes in it (like The Coalition).

CDPR for example has completely solved this problem for themselves, and now that they're on UE5 and major contributors to it, perhaps that will put UE5 on the right track. Ubisoft also does very well on this front (though their older approaches tend to lean heavier on VRAM), but then they have huge teams and tune their engines specifically for open-world games. I would say that out of all the current open-world games with (at least some form of) RT, the one with the best mix of HQ assets, good LoD management and no undefeatable stutter is Watch Dogs: Legion. And if we look at how that handles VRAM, we can see it's eminently playable on even an 8 GB card (with minimal difference vs ultra HD textures and max streaming budget), so for sure 12 GB will be fine for the remainder of this generation.

In fact, the cross-gen titles that aren't completely gimped to low settings (or lacking some key graphical features entirely, like say basic GI) already prove to be more than enough to stress even the PS5/XSX, so we're not going to see games push memory requirements that much higher when all the assets are done with the consoles in mind. If anything, it's stuff like path tracing that gets pushed as the stressor option on PC, and that's super heavy on compute but not more so on VRAM than basic RT.

For UE5 you can see CPU usage issues but modest vram usage:

And if we look at Remnant 2 which just recently launched as a UE5 title with Nanite (but no Lumen), we can see it do very well with little vram but have very HQ assets on show (which makes sense, it's basically what Nanite exists for):
 
@Poneros those textures in Remnant 2 look quite flat and not high res. They also seem to be lacking any Specular, Gloss and Metallic maps. I can see Albedo, Normal and Dirt maps but that's about it, which is why they look flat.
I'm currently playing Sniper Elite 5, and the textures in that look miles better than they do in your example.
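
For a sense of scale on those maps, here's a rough VRAM footprint for one 4K material's texture set; it assumes BC7-style block compression (about 1 byte per texel) plus roughly a third extra for mips, and the map list is just illustrative:

```python
# Rough VRAM footprint of one PBR material's 4K texture set, illustrative only.

RES = 4096                 # 4K textures
BYTES_PER_TEXEL_BC7 = 1.0  # BC7 is 8 bits per texel
MIP_OVERHEAD = 1.33        # full mip chain adds roughly a third

maps = ["albedo", "normal", "specular", "gloss", "metallic", "dirt/AO"]

per_map_mb = RES * RES * BYTES_PER_TEXEL_BC7 * MIP_OVERHEAD / (1024 ** 2)
print(f"~{per_map_mb:.0f} MB per 4K map, ~{per_map_mb * len(maps):.0f} MB for a {len(maps)}-map material")

# In practice roughness/metallic/AO usually get channel-packed into one texture,
# which cuts this down, but a handful of hero materials still adds up fast.
```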

PS: you would only use asset streaming if you didn't have enough VRAM to begin with, and it's not cost free, even with the fastest CPUs and SSDs.
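
Rough peak bandwidth of each hop makes that cost obvious; these are ballpark round numbers for a 4070-class card, a PCIe 4.0 x16 slot and a fast Gen4 NVMe drive:

```python
# Ballpark time to move a 2 GB burst of assets over each hop (round numbers).

paths_gb_per_s = {
    "GDDR6X VRAM (~500 GB/s class card)": 500,
    "PCIe 4.0 x16 link (~32 GB/s)": 32,
    "Fast Gen4 NVMe SSD (~7 GB/s)": 7,
}

asset_burst_gb = 2.0  # hypothetical chunk of textures/geometry to bring in

for path, bw in paths_gb_per_s.items():
    print(f"{path}: {asset_burst_gb / bw * 1000:.0f} ms to move {asset_burst_gb} GB")

# At 60 fps a frame is ~16.7 ms, so anything that has to come across PCIe or the
# SSD mid-frame costs multiple frames' worth of time, i.e. streaming is never free.
```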

The best performance always comes from streaming directly out of VRAM. Nanite is a method of reconstructing tessellated geometry on the fly; it does away with LOD asset swapping.
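
For contrast, here's a minimal sketch of the traditional distance-based LOD swap that Nanite replaces; the thresholds and mesh names are hypothetical:

```python
# Classic discrete LOD selection: pick a pre-authored mesh by camera distance.
# Thresholds and asset names below are made up for illustration.

LODS = [  # (max distance in metres, mesh asset)
    (10.0, "rock_LOD0_200k_tris"),
    (40.0, "rock_LOD1_50k_tris"),
    (120.0, "rock_LOD2_5k_tris"),
]

def pick_lod(distance_m: float) -> str:
    for max_dist, mesh in LODS:
        if distance_m <= max_dist:
            return mesh
    return "rock_LOD3_billboard"

for d in (5, 30, 100, 300):
    print(f"{d:>3} m -> {pick_lod(d)}")

# Nanite instead streams and rasterises small clusters of the full-detail mesh on
# demand, so there is no discrete swap (and no hand-authored LOD chain) to manage.
```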
 