Worth going from a 3070 to a 5070?

Leaks are showing a massive leap for the 9070 XT over the 7900 XTX, specifically in Black Myth: Wukong ray tracing.

The new Radeon cards will probably not be as good for RT, but it looks like the gap will lessen, certainly against the 5070.

Hopefully. Without competition, Nvidia is going to kill PC gaming by pricing it out of reach for the majority of gamers.
 
That's Black Myth: Wukong at 4K but using FSR 3 upscaling set to Quality. It's from the Digital Foundry 5090 review.

It's worth noting that Indiana Jones full path tracing is a heavier workload than this, so AMD would fare even worse against the RTX cards.

[Attached image: Chiphell RX 9070 XT rumour benchmark chart]

The 9070 XT leak shows Black Myth: Wukong with RT (left-side results) at native 4K running at around the same level as the 4080 Super. That's pretty good for a mid-range card.
 
The 9070 XT leak shows Black Myth: Wukong with RT at native 4K running at around the same level as the 4080 Super. That's pretty good for a mid-range card.
Goes to show how overpriced the so-called 4080 is. Clearly a 70-class card.
 
Goes to show how overpriced the so-called 4080 is. Clearly a 70-class card.
Sure was, but at least it obliterated the previous generation. This new generation is looking like an overclock. It's crazy. Unless the 5080 somehow finds an uplift from architectural differences that the 5090 couldn't find beyond adding more cores and power, this will be the first gen where the new 80-class card is embarrassed by the previous flagship... I think? And the rest of the stack looks just as mild.

Considering that AMD's GPUs regressed this gen, Nvidia did a glorified overclock, Intel is dead in the water, and AMD's new CPU generation was a steamer, this must be the worst period in tech in a long time? Is this the future unless they figure out those new transistors or quantum tunnelling or some **** I'm not going to pretend I know about beyond YouTube cartoons?

What a ******' diamond in the rough the 9800X3D is though. GOAT.
 
I'm in the same boat. I have a 3070 Ti, and it's still pulling its weight in my most-played games (e.g. Destiny 2), but any newer titles are asking for a GPU with a bit more under the hood.

I'm torn between the 5070 Ti and the 9070 XT; the 5070 isn't even a consideration with only 12 GB of VRAM. AMD have dropped the ball once again with their launch: having to wait another month after the release of the 5070 series will not go well for them. I have a feeling most will pull the trigger on the 5070 Ti.

Personally, if I can, I will wait for all the reviews to come out before making a decision.
 
I'm currently running a 1440p ultrawide on a 3070 and feel it still does pretty well. Is it worth upgrading to a 5070? Or wait another gen?

What sort of things do you do with your monitor at 1440p UW? If you're only ever gaming with lightweight games, you would see a good uplift just grabbing a 4070 Super, and the 5070 is supposed to be about a 20% uplift over the 4070 Super.
 
So you are considering another card where the VRAM will also be tight?

That depends on what you do; going from 8 GB to 12 GB is a good uplift.
I used a 3070 for a long time and never had any memory problems.
Both of the kids' PCs have 8 GB cards and they game, and again they have never had a memory problem.
 
That depends on what you do; going from 8 GB to 12 GB is a good uplift.
I used a 3070 for a long time and never had any memory problems.
Both of the kids' PCs have 8 GB cards and they game, and again they have never had a memory problem.

I had a 3070, and there were several games I played where I had problems.

I was playing at 1440p. Fallout 76 would run at 140 fps most of the time, then the FPS would dive now and then with 100% GPU memory use, and that's completely unmodded as well. Both Cities: Skylines games maxed the VRAM. Fallout 4, admittedly with mods, would easily get to 8 GB use, and don't forget that game was released in 2015.

The AMD equivalent at the time had 12 GB, and if the 3070 had also had 12 GB it would have been about right. Nvidia are right buggers for doing this; it's nothing new.

I had a 1070 before that, which was heading towards five years old; that also had 8 GB of VRAM, and the 1070 was released in 2016. Why now, in 2025, nine years later, is the equivalent card only shipping with 4 GB more?

The only reason I bought that 3070 was because it was in the middle of the mining crisis and I had no other option really. The card (in my opinion) was a lemon; I got shot of it, bought a 7900 XT and never looked back.

I don't think it's acceptable at all for a modern GPU that they are charging good money for to ship with the amount of memory they have.

Remember: the more you buy, the less VRAM you have, the more you save... or something like that.
 
Aside from VRAM limitations, unless you are going to at least double GPU performance, I don't see much point. It's kind of depressing how few graphics settings you usually have to change to get a 50% uplift or more.
For example, if you go from a 4090 to a 5090, the only benefit might be moving from DLSS Balanced to DLSS Quality at essentially the same frame rate.
 
Aside from VRAM limitations, unless you are going to at least double GPU performance, I don't see much point. It's kind of depressing how few graphics settings you usually have to change to get a 50% uplift or more.
For example, if you go from a 4090 to a 5090, the only benefit might be moving from DLSS Balanced to DLSS Quality at essentially the same frame rate.

I agree that DLSS has made upgrading less relevant.
 
My opinion on a 3070-to-5070 upgrade is a no, for many of the reasons mentioned above. Either get the 5070 Ti or the 9070 XT, or wait for the refresh. I can imagine a quick refresh if the 5070 doesn't sell well or the 9070/9070 XT is competitive: a 5070 Super will appear from nowhere, or the 5070 Ti will drop in price.
 
I don't run RT at all, but I game at native 1440p with no DLSS. It can have a hit on FPS, but it's the drops that were the problem for me.


It is if you're at 1440p/4K native with no DLSS.

The uplift in visuals provided by RT is a lot higher than the perceived loss of visuals that you get with DLSS, especially seeing as most people cannot tell the difference between DLSS Quality and native res.
RT off vs. on is easy to tell apart. Not using RT just to avoid DLSS is taking a huge visual hit overall. You're spending high-end money on a graphics card to run mid-range visuals.
 
The uplift in visuals provided by RT is a lot higher than the perceived loss of visuals that you get with DLSS, especially seeing as most people cannot tell the difference between DLSS Quality and native res.
RT off vs. on is easy to tell apart. Not using RT just to avoid DLSS is taking a huge visual hit overall. You're spending high-end money on a graphics card to run mid-range visuals.
The only game I really play doesn't support RT, so I suppose I just got used to not seeing it.
 
The uplift in visuals provided by RT is a lot higher than the perceived loss of visuals that you get with DLSS, especially seeing as most people cannot tell the difference between DLSS Quality and native res.
RT off vs. on is easy to tell apart. Not using RT just to avoid DLSS is taking a huge visual hit overall. You're spending high-end money on a graphics card to run mid-range visuals.
It's not as if gamers haven't tried running DLSS/RT on their GPUs.

An awful lot of 4090 users preaching don't realise what it's like not to have a 4090, or anywhere near that level of performance...
 