For those who dislike Nvidia, why do you keep buying Nvidia and not AMD?

I don't dislike Nvidia GPUs. I don't dislike AMD GPUs. I've owned both.

I dislike bad value and being taken for a ride. My current card is still a 1080, but my next will probably be AMD, purely for price/performance reasons.
Same here; I don't blindly follow any brand. I just want the best mid-range bang for buck.

Having competing brands is better for us consumers; I'm glad Intel is on the scene now too. It pushes them all to innovate and hopefully drives down prices. Well... usually. Maybe not when it comes to graphics cards over the last few years.
 
I just buy from whichever company has the best value for money. Coincidentally, that's been AMD/ATI for over 20 years. Nvidia has been by far the worst value at the time of every single upgrade I've done since the GeForce2 MX, which was very good value.

Judging by the data, 92% of gamers don't care about value for money, which is strange. Gamers can't use the RTX/DLSS excuse either, because the enormously popular 10 series and older didn't have those features.

I'm so bored with the consistency of it: everyone gets excited for a new Nvidia generation, then asks when they can pre-order the new Nvidia generation, then moans about Nvidia prices. Then moans about Nvidia VRAM. Then buys Nvidia anyway, because they don't realise things will only get better if Nvidia's market share shrinks.
 
I just buy from whichever company has the best value for money. Coincidentally, that's been AMD/ATI for over 20 years. Nvidia has been by far the worst value at the time of every single upgrade I've done since the GeForce2 MX.

Judging by the data, 92% of gamers don't care about value for money, which is strange.

I love value for money. But it is not the be-all and end-all.

DLSS being better is alone worth paying a little extra for, personally. The fact that I have to go out of my way to install a mod to get DLSS working in Resident Evil 4 Remake, because FSR just isn't as good, says a lot. Plus DLSS is in many more titles.

At the end of the day, if paying more brings extra happiness, then it is worth it. Within reason, of course, and it will be different for each person.

To me, Nvidia has the better overall product and can command more for similarly performing cards. AMD either need to up their game or simply stop charging silly money.
 
I was going to get the 6800 XT when it was announced, but the 3080 was easier to get. I use RT and the image-scaling features, so it worked out.

When I decide to upgrade, if AMD have closer feature parity with Nvidia, I'll give them a go.
 
I love value for money. But it is not the be-all and end-all.

DLSS being better is alone worth paying a little extra for, personally. The fact that I have to go out of my way to install a mod to get DLSS working in Resident Evil 4 Remake, because FSR just isn't as good, says a lot. Plus DLSS is in many more titles.

At the end of the day, if paying more brings extra happiness, then it is worth it. Within reason, of course, and it will be different for each person.

To me, Nvidia has the better overall product and can command more for similarly performing cards. AMD either need to up their game or simply stop charging silly money.

Yes, I agree: RTX and DLSS are great features which add value. However, the GTX 10 series and older didn't have those features, so I'm not sure what gamers got in return for paying more.
 
The fact that I have to go out of my way to install a mod to get DLSS working in Resident Evil 4 Remake, because FSR just isn't as good, says a lot.
All it says is that Resident Evil 4's FSR implementation is awful. Even Digital Foundry called it out as being terrible and couldn't understand how Capcom had made it look so bad. I've seen plenty of bad DLSS implementations too: the ghosting was so bad in God of War at launch that I had to turn it off. Maybe they fixed it later. Maybe Capcom will fix RE4 later.

Personally, I can't tell the difference between FSR, DLSS and XeSS when all are done well. Hogwarts Legacy is a game I've tried all three in (with XeSS at its best, since the game came out during the brief time I owned an Arc card), and there's no appreciable difference. Maybe if I were taking freeze frames and zooming in by 500% I could tell them apart, but not playing the game from a normal viewing distance.

As for the thread question: I don't. I've owned more AMD cards than Nvidia ones. None of the latest cards from either are appealing, however.
 
All it says is that Resident Evil 4's FSR implementation is awful. Even Digital Foundry called it out as being terrible and couldn't understand how Capcom had made it look so bad. I've seen plenty of bad DLSS implementations too: the ghosting was so bad in God of War at launch that I had to turn it off. Maybe they fixed it later. Maybe Capcom will fix RE4 later.

Personally, I can't tell the difference between FSR, DLSS and XeSS when all are done well. Hogwarts Legacy is a game I've tried all three in (with XeSS at its best, since the game came out during the brief time I owned an Arc card), and there's no appreciable difference. Maybe if I were taking freeze frames and zooming in by 500% I could tell them apart, but not playing the game from a normal viewing distance.

As for the thread question: I don't. I've owned more AMD cards than Nvidia ones. None of the latest cards from either are appealing, however.

Not sure what resolution/settings you were using, but at 1440p quality in Hogwarts Legacy, FSR was poor for me compared to DLSS. Though FSR looks slightly sharper in screenshots, in motion there was quite a bit of shimmering, pixel crawl on edges, and small thin objects in the distance flickering in and out of existence. At 4K it was a bit less noticeable, but DLSS was still ahead for image stability. Add in Nvidia Freestyle for a little image sharpening/detail post-processing, and it looks a lot better than FSR.
 
To answer the OP's question: I only built my first rig in 2017, after 20 years out of the game. I didn't really have much of a clue and went blindly for a 1070 Ti based on reviews and price-to-performance.

Then I got a 2070 Super at a good price, as I fancied an upgrade. At the time, I was helping a friend at work with his Vega 56, which kept black-screening. Long story short, it was RMA'd; he got a refund and bought another, but a different model. A few months later that started doing the same. That was eventually returned and he bought a 5700 XT. That lasted a bit longer, and it was several driver releases before it started black-screening. He tried everything: looked through here for all the fixes people suggested at the time, reinstalled the OS, tried different driver versions, BIOS updates, you name it... but it would always black-screen at some point. No idea if he was just unlucky or was doing something wrong, but that was enough to put me off AMD GPUs.

In the end he bought a 3070 FE and hasn't had any issues since, still using the same CPU, board, RAM etc. I've since impulse-bought a 3060 Ti FE whilst the mining craze was in full swing, though I wanted a 3080 FE. It was cheap and it's a great-value card, to be fair. I am now seriously needing/wanting an upgrade, and the AMD cards are not very appealing for the reasons other people mention. They are too close in price to the 4070 Ti and 4080, offer fewer features, and I've no idea if drivers are still an issue with them; all of that makes me think twice. Now, if the 7900 XTX were £700-800, then I would bite, as it would be considerably cheaper than a 4080, with fewer features but similar raster performance. For me, buying an AMD GPU just feels like a gamble.
 
I think I dislike both AMD and Nvidia at the moment. They certainly don't like us as customers. :D
I bought an Nvidia card last time round as it was the best deal I could find at the time. When it comes time to replace it, I'll buy from whoever is offering the best performance for the price; maybe even Intel will be viable by then.
 
Remember all of that stuff about how Nvidia was buying into loads of proprietary tech to lock their competitors out? Well, it worked. CUDA has made their GPGPU stack way better (where AMD has horribly neglected their own), DLSS arrived earlier and better than FSR, and so on.
 
I buy both, although I didn't have an AMD card between my... either X1900 XTX or X1950 XTX, I can't remember which, and my Vega 64. Actually, I was also given a 2900 XT, but I passed it on to a mate.

During that time I felt that Nvidia were producing better GPUs, so my PCs had Nvidia GPUs. The situation was the other way around back in the GeForce FX days, when I ran Radeon 9500 Pros in both my PCs.

My other half will only have Nvidia GPUs in her PC, however, because in her words they just work. Twenty years of seeing a variety of Radeon cards "just work" in my PCs has done nothing to change her mind. She doesn't base her decision on features like RT or DLSS either, since she doesn't care enough about PC hardware to know what they are.

I don't think there's anything AMD can do to fight that mindset. I don't think pricing alone would do it. I think you'd need a brand with name recognition like Samsung to enter the market and immediately compete at the highest levels to start to break Nvidia's hold.
 
I don't particularly dislike or like either brand, but I've always ended up with Nvidia cards, because either AMD weren't competitive in performance or Nvidia was the better value-for-money option.

With the latest-gen cards I was considering a 4070 Ti, 7900 XT and 7900 XTX, and I chose the 4070 Ti. With the relatively poor power efficiency of the 7900 XT and XTX, both in gaming (with a frame cap) and at idle/low loads, they would easily add an extra £100-£200 to the overall cost over the card's lifetime. Nvidia has Reflex to keep input lag low in GPU-bound scenarios; AMD has no equivalent feature. Then of course there is DLSS upscaling, which is still superior to FSR and supported in a lot of older games that FSR isn't. I also considered that the 4070 Ti was going to run a lot cooler and quieter on the cheapest £800 AIB model; to get a similar level of noise from a 7900 XT would cost £900+. The only advantage AMD has is extra VRAM. The 4070 Ti was an easy choice in the end; AMD is just not competitive in overall price/performance.
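
To put a rough number on that running-cost point, here's a back-of-the-envelope sketch. Every figure in it is an illustrative assumption (wattage gap, hours of use, tariff), not a measurement:

```python
# Back-of-the-envelope lifetime electricity cost difference between two GPUs.
# All figures below are assumptions for illustration, not measured values.

extra_watts = 60        # assumed extra draw of a 7900 XT vs a 4070 Ti (W)
hours_per_week = 25     # assumed gaming/load hours per week
years = 4               # assumed ownership period
pence_per_kwh = 30      # assumed UK tariff (p/kWh)

extra_kwh = (extra_watts / 1000) * hours_per_week * 52 * years
extra_pounds = extra_kwh * pence_per_kwh / 100
print(f"Extra energy over {years} years: {extra_kwh:.0f} kWh (~£{extra_pounds:.0f})")
# With these numbers: ~312 kWh, roughly £94. Heavier use, a bigger wattage
# gap (e.g. multi-monitor idle draw) or a higher tariff pushes this into
# the £100-£200 range quoted above.
```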
 
All it says is that Resident Evil 4's FSR implementation is awful. Even Digital Foundry called it out as being terrible and couldn't understand how Capcom had made it look so bad. I've seen plenty of bad DLSS implementations too: the ghosting was so bad in God of War at launch that I had to turn it off. Maybe they fixed it later. Maybe Capcom will fix RE4 later.

Personally, I can't tell the difference between FSR, DLSS and XeSS when all are done well. Hogwarts Legacy is a game I've tried all three in (with XeSS at its best, since the game came out during the brief time I owned an Arc card), and there's no appreciable difference. Maybe if I were taking freeze frames and zooming in by 500% I could tell them apart, but not playing the game from a normal viewing distance.

As for the thread question: I don't. I've owned more AMD cards than Nvidia ones. None of the latest cards from either are appealing, however.

But when you are way behind at 9% market share, and you sponsor a game and tell the developers not to use DLSS, you'd better damn well make sure your equivalent launches in a great state, no? If you can't be bothered to get that right when you're in such a sorry state, what message does that send?

FSR just isn't in the same league, IMO. It is not in as many games, and it can't be easily updated to the latest version like DLSS can.

I really hope someone higher up at AMD reads this thread and understands what they need to do. That said, I think they already know all of that, unless they really are that deluded.


Yes, I agree: RTX and DLSS are great features which add value. However, the GTX 10 series and older didn't have those features, so I'm not sure what gamers got in return for paying more.

We already know Nvidia's mindshare was much stronger, so most people just did not see AMD as an option, full stop. I even ask regular mainstream people, and they have never heard of AMD, which I find so weird.

Then you had people like the poster above, who saw others have a nightmare with the black-screen issue, which did not help.

Then you have people like me, who have owned more Radeon cards, slowly slipping away due to AMD's "we are a premium brand" strategy.
 
I've owned two Nvidia cards, a 1070 and a 3060 Ti FE; both times it was simply because they were the only reasonably priced options. I'd much rather have an AMD GPU: I find they always work well for me, have good multi-monitor support, and are usually more competitively priced with more VRAM.

However, I'm not blindly loyal to a brand, and if they change their behaviour to be more like Nvidia, then I'll just buy what's best for my bank balance. AMD are losing sight of what kept them in the game: good value for money. I'd forget the halo products and just go for the xx50-xx70 class cards. Make AMD great again ;)
 
I think the whole PC GPU/gaming landscape is pretty broken, but my card choice tends to just be whatever I consider best at the time. I pick a budget, then shop around it.

My best cards were Radeon 7950s, because they came at a time when PC gaming was top dog for me. Now I just run a 3060 Ti I bought second-hand after selling a couple of 3070 FEs in the mining boom.

I can't justify high-end PC gaming right now, because I genuinely prefer the console experience for the simplicity, repeatability and speed of use.

The pricing on all GPUs now is a killer, with both brands equally taking the mick. I think I'm just getting old, but I kinda ask myself, "What am I getting from this £1,000 purchase?"... and the answer is the same crap ports running faster. So I don't bother any more, lol.
 
But when you are way behind at 9% market share, and you sponsor a game and tell the developers not to use DLSS, you'd better damn well make sure your equivalent launches in a great state, no? If you can't be bothered to get that right when you're in such a sorry state, what message does that send?

FSR just isn't in the same league, IMO. It is not in as many games, and it can't be easily updated to the latest version like DLSS can.

I really hope someone higher up at AMD reads this thread and understands what they need to do. That said, I think they already know all of that, unless they really are that deluded.
They did the same thing with Boundary. The game was coming out with DLSS, then AMD sponsored it and... it's gone.

That's why I refuse to buy AMD. They specifically make the games they sponsor run like crap on any hardware but theirs.
 
They did the same thing with Boundary. The game was coming out with DLSS, then AMD sponsored it and... it's gone.

That's why I refuse to buy AMD. They specifically make the games they sponsor run like **** on any hardware but theirs.

Oh, come on, man. Nvidia have done that and way worse. What do we do, buy Intel? :cry:

I agree with some of the stuff you say, but comments like that make me think the guys saying Rollo may have a point ;)
 
I need the CUDA cores Nvidia offers on their GPUs, so I'm basically locked into Nvidia for the foreseeable future. Even without that, I'd still go with a 4090 over what AMD offers; but if AMD had equal performance at a lower price point in that range, and the CUDA situation weren't so important for my use case, I'd go AMD.
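
For anyone wondering what that lock-in looks like in practice, here's a minimal sketch (assuming a PyTorch workload; the matmul is just a hypothetical stand-in for real work):

```python
import torch

# A lot of GPU-compute code is written against CUDA first, so the typical
# device check in the wild looks like this:
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(4096, 4096, device=device)
y = x @ x  # lands on the GPU only when a CUDA device is available
print(f"matmul ran on: {device}")

# AMD's ROCm builds of PyTorch do expose themselves through the same
# torch.cuda API, but hardware/OS coverage is narrower, which is the
# practical source of the lock-in described above.
```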
 