
Has Anyone Moved From Nvidia To AMD And Not Regretted It?

Nvidia EOL'd the 4080 to launch the 2 fps faster 4080S at the price point created by AMD, so Nvidia users waited a year to get a 4080 at that price, which diminishes the value of the late-to-market 4080S, rather than sticking two fingers up at Nvidia and voting with their wallets by going AMD. :thumbsup:

Yes, but when you say it like that it looks like you're saying it's AMD's fault. The 4080 launched first, to replace the $700 3080 at $1200: 49% more GPU for 72% more money.

The 7900 XTX launched next, replacing the $1000 6900 XT at $1000: 47% more GPU for 0% more money, plus an extra 8 GB of VRAM.
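To make that value comparison concrete, here is the arithmetic behind those percentages as a quick sketch (it just plugs in the launch MSRPs and the rough "% more GPU" figures quoted above, nothing measured):

```python
# Performance-per-dollar change, generation on generation,
# using the launch prices and rough performance gains quoted above.
def value_change(perf_gain_pct: float, old_price: float, new_price: float) -> float:
    """Percentage change in performance-per-dollar versus the old card."""
    perf_ratio = 1 + perf_gain_pct / 100
    price_ratio = new_price / old_price
    return (perf_ratio / price_ratio - 1) * 100

print(f"4080 vs 3080:    {value_change(49, 700, 1200):+.0f}% perf per dollar")   # ~ -13%
print(f"XTX  vs 6900 XT: {value_change(47, 1000, 1000):+.0f}% perf per dollar")  # ~ +47%
```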

Nvidia then launched the 4080S, at basically the same performance as the standard 4080, a relaunch at $1000 to stick it to AMD.
Rumour is AMD have huge piles of RDNA 3 stock they can't shift, all of it 7900 series; even now at $800 they still can't move it. It's probably one of the contributors to AMD no longer competing in the high end: Nvidia like to deny AMD sales at that tier, and AMD can't afford to engage in price wars with Nvidia.
Nvidia have effectively forced them out.

Some people might say "well then make them $600, wub wubf werb web wer." That simple, right? Now you have to make the 7900 XT $500, the 7900 GRE $400, the 7800 XT $350, the 7700 XT $270, the 7600 XT $220 and the 7600 $170, and you're losing money. But why would people who only ever buy Nvidia care about that?

The only thing you can realistically do is abolish the high end.
 
DLSS + ray tracing + more efficient cards. Other than getting a good deal on an AMD card, I don't see any significant reason to move to AMD. Some extra VRAM and that's about it?
I like the fact that I can just buy an Nvidia card and know it'll do everything. Want to use RT? Go for it. Want to use resolution scaling and maintain image quality? DLSS is the best option.

Obviously if you don't care about either of those and aren't looking at the very top end of performance (4090), AMD have decently priced cards with good pure raster performance.
 
DLSS + ray tracing + more efficient cards. Other than getting a good deal on an AMD card, I don't see any significant reason to move to AMD. Some extra VRAM and that's about it?
I like the fact that I can just buy an Nvidia card and know it'll do everything. Want to use RT? Go for it. Want to use resolution scaling and maintain image quality? DLSS is the best option.

Obviously if you don't care about either of those and aren't looking at the very top end of performance (4090), AMD have decently priced cards with good pure raster performance.
All good points, but after using two NV systems for over ten years and then going from NV to AMD at the high end: AMD has less stutter/texture pop-in when gaming, higher texture settings are accessible and won't auto-downgrade due to a lack of available VRAM, and the card is less likely to become a paperweight when DLSS technologies evolve to eat even more VRAM on outgoing NV GPUs, as shown across the 30 series stack (outwith the 3090).

My 3070 and 3080 were terrible at times for stutter/texture pop-in; my 4070 still has the same problem.

Lots of 3070/80/Ti users in here have shifted to AMD, largely because of the VRAM issues/longevity.

Both vendors have their pros/cons, just depends what features you value more.
 
I think if stuttering and texture pop-in were noticeably worse on Nvidia GPUs, that would be a huge talking point that [insert whichever tech YouTuber here] would still be covering, as it does make for a perceivably worse experience for the user.

Having gone back and forth between the two for the past 20 years, I've only noticed the texture thing back in the ATI days, and I'm susceptible to stuttering and know how jarring it can be when textures suddenly change in quality.
 
Yeah, in the last 10 years I have gone 7950 -> 970 -> 980 -> 5700 -> 3060ti, and never had any issues with cards from either company.

At this point though, due to a small MATX case and 750w PSU, wattage and heat are very important factors for me.
Then do what I've done for the longest time with AMD cards: dial back the maximum clocks. You barely lose any performance for the first 10-20%, depending on the GPU, and power consumption drops a lot. It's a thing with AMD; they push way past the efficiency sweet spot for another silly 5% performance and a truckload of extra power draw.

I think if stuttering and texture pop-in were noticeably worse on Nvidia GPUs, that would be a huge talking point that [insert whichever tech YouTuber here] would still be covering, as it does make for a perceivably worse experience for the user.

Having gone back and forth between the two for the past 20 years, I've only noticed the texture thing back in the ATI days, and I'm susceptible to stuttering and know how jarring it can be when textures suddenly change in quality.
It's the worst and an absolute deal-breaker for me, and one of the reasons I got rid of the RTX 3070 I once had. But I don't care what brand it is. Had it been a Radeon GPU, I would still have sold it and scolded it afterwards :D
 
Then do what I've done for the longest time with AMD cards: dial back the maximum clocks. You barely lose any performance for the first 10-20%, depending on the GPU, and power consumption drops a lot. It's a thing with AMD; they push way past the efficiency sweet spot for another silly 5% performance and a truckload of extra power draw.

The mid to high end Nvidia cards are exactly the same; with a 3090 you could save around 100 W in some scenarios and only lose 5-10% on your framerates.
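For anyone who wants to try this on Linux, here's a rough sketch of capping board power on either vendor; it assumes root, the stock drivers, and that the card0/hwmon paths match your system (on Windows the same thing is done with the power limit sliders in Adrenalin or MSI Afterburner):

```python
# Rough sketch: cap GPU board power to trade a few percent of performance
# for a much larger drop in power draw. Run as root; paths and wattages
# are illustrative, and the cap must sit within the card's allowed range.
import glob
import subprocess

def cap_nvidia(watts: int, gpu_index: int = 0) -> None:
    # nvidia-smi exposes a software power limit per GPU
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

def cap_amd(watts: int, card: str = "card0") -> None:
    # amdgpu exposes the power cap through hwmon, in microwatts
    for path in glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_cap"):
        with open(path, "w") as f:
            f.write(str(watts * 1_000_000))

if __name__ == "__main__":
    cap_amd(240)       # e.g. ~20% below a 300 W stock limit
    # cap_nvidia(280)  # e.g. a 3090 dropped from ~350 W
```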
 
I think if stuttering and texture pop-in were noticeably worse on Nvidia GPUs, that would be a huge talking point that [insert whichever tech YouTuber here] would still be covering, as it does make for a perceivably worse experience for the user.

Having gone back and forth between the two for the past 20 years, I've only noticed the texture thing back in the ATI days, and I'm susceptible to stuttering and know how jarring it can be when textures suddenly change in quality.

There have been problems in the past with Nvidia cards that none of them talk about.

Texture problems absolutely are due to VRAM, and they have talked about this; credit to HUB, who have even briefly mentioned it for the 4070.

What I will say is that the 7800 XT is by far the silkiest, smoothest card I have ever had; even Star Citizen, which is renowned for being an unpolished alpha mess, is butter smooth.

But I'm not going to harp on about it, as my last card was an 8 GB Nvidia that had a lot of problems with this.
 
All good points, but after using two NV systems for over ten years and then going from NV to AMD at the high end: AMD has less stutter/texture pop-in when gaming, higher texture settings are accessible and won't auto-downgrade due to a lack of available VRAM, and the card is less likely to become a paperweight when DLSS technologies evolve to eat even more VRAM on outgoing NV GPUs, as shown across the 30 series stack (outwith the 3090).

My 3070 and 3080 were terrible at times for stutter/texture pop-in; my 4070 still has the same problem.

Lots of 3070/80/Ti users in here have shifted to AMD, largely because of the VRAM issues/longevity.

Both vendors have their pros/cons, just depends what features you value more.
Haven't had a single issue with my 3080, but can appreciate that some might struggle with 4K/UW, especially with how poorly made most modern games are.

I plan on moving everything to 4K for my next upgrade, which will probably be AMD Zen 5 (don't trust Intel) and an RTX 5080, since it sounds like AMD are skipping this generation (hopefully that means a more competitive option in the future).
 
I think if stuttering and texture pop-in were noticeably worse on Nvidia GPUs, that would be a huge talking point that [insert whichever tech YouTuber here] would still be covering, as it does make for a perceivably worse experience for the user.
Techtubers/press have been talking about it since the 30 series; the lack-of-VRAM topic literally blew up last gen.

Devs were getting the blame, because 8/10/12 GB "should be plenty".

HUB/Daniel Owen/TPU/others now repeatedly mention, when benchmarking GPUs and covering new games, that performance is fine with X amount of VRAM, but they can't guarantee it's going to be enough while actually playing the game, and it could introduce stutter/texture pop-in etc.
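Some rough back-of-envelope numbers show why "X amount of VRAM is fine in a benchmark run" doesn't guarantee much once you actually play; these are purely illustrative figures for block-compressed textures and render targets, not taken from any specific game:

```python
# Illustrative VRAM arithmetic: BC7-compressed colour textures cost about
# 1 byte per texel, and a full mip chain adds roughly a third on top.
MIB = 1024 ** 2

def bc7_texture_mib(width: int, height: int, with_mips: bool = True) -> float:
    base = width * height * 1                     # ~1 byte per texel for BC7
    return base * (4 / 3 if with_mips else 1) / MIB

def render_target_mib(width: int, height: int, bytes_per_px: int) -> float:
    return width * height * bytes_per_px / MIB

textures = 400 * bc7_texture_mib(4096, 4096)      # ~400 unique 4K materials streamed in
targets = 6 * render_target_mib(3840, 2160, 8)    # a handful of 4K RGBA16F buffers
print(f"textures ≈ {textures / 1024:.1f} GiB, render targets ≈ {targets:.0f} MiB")
# ≈ 8.3 GiB of textures alone, before geometry, BVH, game data and the OS/compositor.
```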
 
Lots of 3070/80/Ti users in here have shifted to AMD, largely because of the VRAM issues/longevity.
So lots of people buying high-end Nvidia are now buying high-end AMD, and yet AMD decide to pull out of the high-end market?
Humbug gave the impression it was because they weren't selling cards, but if lots of people were moving over from Nvidia, they must have been.
So this makes their decision to pull out of the high end even more curious if they were gaining market share in that space.
 
Techtubers/press have been talking about it since the 30 series; the lack-of-VRAM topic literally blew up last gen.

Devs were getting the blame, because 8/10/12 GB "should be plenty".

HUB/Daniel Owen/TPU/others now repeatedly mention, when benchmarking GPUs and covering new games, that performance is fine with X amount of VRAM, but they can't guarantee it's going to be enough while actually playing the game, and it could introduce stutter/texture pop-in etc.

So it's a VRAM issue rather than a vendor-specific problem? Granted, you're less likely to experience it on AMD cards, as they are more generous with VRAM at the lower tiers.
 
Wife went from a 3080 to a Nitro 7900 XT. She plays at ultrawide 3840x1600, and everything runs better than on the 3080, which was really beginning to struggle a bit at times.

She gets better performance at UW with the 7900 XT than I do at QHD with a 3080... so yeah, definitely more powerful, and no worries at all about VRAM anymore, as the card has 20 GB compared to the 10 GB of the 3080, so it should avoid any memory-related hitching from the framebuffer being exceeded.

She's had very few issues too; it just works. Our 3080s also consume roughly 350 W, so there's very little power consumption difference.

If the 8900 XTX or whatever it's called comes out and is genuinely a 7900 XTX that's marginally faster, with better ray tracing, and costing £500, I'd be very tempted, as we all know Nvidia's Blackwell is not going to come cheap, even if it is decently faster than the 40 series.
A newer-gen GPU with an MSRP of £900 being better than a GPU that launched 2 years prior for £650: no surprises there.
 
Techtubers/press have been talking about it since the 30 series; the lack-of-VRAM topic literally blew up last gen.

Devs were getting the blame, because 8/10/12 GB "should be plenty".

HUB/Daniel Owen/TPU/others now repeatedly mention, when benchmarking GPUs and covering new games, that performance is fine with X amount of VRAM, but they can't guarantee it's going to be enough while actually playing the game, and it could introduce stutter/texture pop-in etc.

Crazy to think that most of us these days insist on putting 32 GB or more of RAM in our systems, yet 10/12 GB is supposedly more than enough for the thing rendering everything that makes up what you see in your games.

Memory is a hierarchy: the fastest is the VRAM on your GPU, then the system RAM, then the page file on disk. The reasoning is often "my 3070 works fine, I don't see the problem"; yes, it works fine in the same way a 75 Hz screen works fine. If you don't have a 144 Hz screen you might not see how it can be better.

I've watched people put up videos of themselves playing games, even on this forum, and say "look, see... it's fine", while their frame time line looks like a seismograph reading an event; my frame time line is hair-thin and perfectly flat.
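To put numbers on the "seismograph vs. hair-thin line" comparison, a frametime capture (e.g. a CapFrameX or PresentMon export) can be boiled down to average FPS, 1% lows and jitter; a minimal sketch with made-up sample data:

```python
# Summarise a list of frametimes (ms) into the usual smoothness metrics.
# The two sample traces below are invented purely for illustration.
import statistics

def summarise(frametimes_ms: list[float]) -> dict[str, float]:
    n = len(frametimes_ms)
    worst_1pct = sorted(frametimes_ms)[-max(1, n // 100):]   # slowest 1% of frames
    return {
        "avg_fps": 1000 / statistics.fmean(frametimes_ms),
        "1%_low_fps": 1000 / statistics.fmean(worst_1pct),
        "jitter_ms": statistics.pstdev(frametimes_ms),        # a flat line is near zero
    }

smooth = [16.7] * 99 + [17.0]            # hair-thin, flat frametime line
spiky = [14.0] * 90 + [45.0] * 10        # similar average, but seismograph spikes
print(summarise(smooth))                 # ~60 avg FPS, ~59 1% low, tiny jitter
print(summarise(spiky))                  # ~58 avg FPS, but ~22 1% low and ~9 ms jitter
```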
 
Devs were getting the blame, because 8/10/12 GB "should be plenty".
Devs should get part of the blame. Game comes out, runs like trash, 2 patches later and it's playable.

Multi-threading and efficient asset management require work, and are usually low on the list when a game is being rushed out a year early.
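As a toy illustration of what "efficient asset management" involves (hypothetical, not how any particular engine does it): textures have to live inside a VRAM budget, and when the budget is blown something has to be evicted and later re-streamed, which is exactly where pop-in and stutter creep in.

```python
# Toy VRAM-budgeted texture pool with least-recently-used eviction.
from collections import OrderedDict

class TexturePool:
    def __init__(self, budget_mib: int):
        self.budget = budget_mib
        self.used = 0
        self.resident: "OrderedDict[str, int]" = OrderedDict()  # name -> size in MiB

    def request(self, name: str, size_mib: int) -> None:
        if name in self.resident:
            self.resident.move_to_end(name)              # touched: now most recently used
            return
        while self.used + size_mib > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict the LRU texture
            self.used -= freed                            # it must be re-streamed later
        self.resident[name] = size_mib
        self.used += size_mib

pool = TexturePool(budget_mib=8192)                       # e.g. an 8 GB card
for i in range(600):
    pool.request(f"material_{i}", 21)                     # ~21 MiB per 4K BC7 texture
print(len(pool.resident), "textures resident,", pool.used, "MiB in use")
```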
 
So lots of people buying high-end Nvidia are now buying high-end AMD, and yet AMD decide to pull out of the high-end market?
Humbug gave the impression it was because they weren't selling cards, but if lots of people were moving over from Nvidia, they must have been.
So this makes their decision to pull out of the high end even more curious if they were gaining market share in that space.
AMD just can't compete with NV on sheer all-round performance (ray tracing included), and my guess is they weren't gaining as much market share as they'd like, so it's not cost-beneficial to keep pursuing the top-end gaming market. Those with the cash to spend will always pay more for the best, IMO.
Those 3070/80/Ti users who switched to AMD probably did so over cost, I think, or to get on the anti-Nvidia bandwagon :D.
I remember reading AMD are going all-in on AI, so maybe that's part of the reason too. More profit to be made there than assigning resources to top-end gaming GPUs, maybe?
 
A newer-gen GPU with an MSRP of £900 being better than a GPU that launched 2 years prior for £650: no surprises there.

Sorry, but the 7900 XT has not been anywhere close to that joke MSRP for at least 1.5 years. Let's at least compare actual current new prices from both. Or maybe we should compare to the 4080 MSRP, or the 3080 Ti MSRP?
 
Sorry, but the 7900 XT has not been anywhere close to that joke MSRP for at least 1.5 years. Let's at least compare actual current new prices from both. Or maybe we should compare to the 4080 MSRP, or the 3080 Ti MSRP?
You can't really buy the 3080 new anymore; you can pick one up used for around £300.

The point I was making was that at release the 7900 XT cost nearly 50% more than a 3080 did at release, and it came 2.5 years later, so it should be better.
 
You can't really buy the 3080 new anymore; you can pick one up used for around £300.

The point I was making was that at release the 4080 cost nearly 100% more than a 3080 did at release, was only 45% faster, and came almost 2.5 years later, so it should be better.

FTFY

I get it, the release prices were a joke. But if someone is buying a GPU today, we need to look at current prices. I'm going to judge the 7900 XT I can buy today for £630 with two free games.
 