• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

Sponsoring games to push your line's advantage is Nvidia 101. If they cheaped out on a part, presumably to squeeze their own customers, and AMD is pressing that advantage, then I say fair play to them. Nvidia's had it coming.
That's just a silly thing to say, but sure, just remember (you and whoever liked your post) what you are calling fair play. Next time, if Nvidia is the one doing the shenanigans, call it fair play as well, as I'm sure you did in the past, right? :D

The bias is overflowing but whatever
 
Even if Nvidia did pay to get their tech in games (which they aren't in terms of DLSS, Reflex and frame gen, as those are easy for devs to implement thanks to Nvidia's open source Streamline solution), I'd rather have this than AMD's approach, i.e. chucking some "documentation" over the fence with a "do as you please, we don't care", hence the lack of uptake of their technology. You're paying not just for quality and consistently good features delivered in a timely manner, but also for widespread adoption.

Poor support/guidance and an over-the-fence approach, with no interest (or rather no ambition) in helping to achieve more widespread adoption of internal and external products, is why we dump so many third-party software vendors at work.
 
AMD will release a 7800 XT which is 5-10% faster than the 4070 but with worse ray tracing and power consumption for an RRP of ~$649, and everyone will sigh and roll their eyes.
Interesting you should say that, because TechPowerUp predicts that the recently released W7800 pro card should, theoretically (and with a big pinch of salt), be approximately 10% faster than the 4070, and it has specs that could easily fit a 7800 XT: 256-bit bus, a further cut-down Navi 31, and roughly 44 TFLOPS of throughput.
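That headline throughput figure can be sanity-checked with basic arithmetic. A rough sketch, where the shader count and boost clock are illustrative assumptions rather than confirmed W7800 specs:

```python
# Peak FP32 throughput estimate: shaders x clock x FLOPs-per-shader-per-clock.
# Assumed figures: 70 CUs x 64 shaders = 4480, ~2.5 GHz boost clock,
# an FMA counting as 2 ops, doubled again by RDNA 3's dual-issue.
shaders = 70 * 64          # 4480 stream processors (assumed)
boost_ghz = 2.5            # assumed boost clock
flops_per_clock = 2 * 2    # FMA = 2 ops, dual-issue doubles it
tflops = shaders * boost_ghz * flops_per_clock / 1000
print(round(tflops, 1))    # ~44.8, in line with the ~44 TFLOP figure
```

Whether RDNA 3's dual-issue is actually reachable in real workloads is a separate argument, which is why paper TFLOPS only loosely predict the "10% faster than the 4070" claim.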


And I can't imagine that it would be much cheaper than $649, considering...

At least it'd probably get more VRAM. :D
 
The bias is overflowing

Using the word bias while complaining loudly that it's not fair that games aren't limited to the memory capacity that Nvidia cards have :eek:

Naturally this is upvoted by Nexus18 who is a bastion of bias for Nvidia with not a fig leaf to hide it.

Even if nvidia did pay to get their tech in games ... I rather have this than amds approach.

AMD has been giving out plenty of VRAM for a long time now, they've even said it's cheap to do so.

Surely this is good competition but no, weird complaints declare it unfair because games companies use that extra VRAM :mad:

Nvidia could, with little effort, wipe out this problem that embarrasses their users in forum arguments: certain AMD cards sometimes having more VRAM, and thus better performance in memory-limited scenarios.

That would upset the segmentation so they don't. That's how much they care about their competitor having this niche advantage.
 
Using the word bias while complaining loudly that it's not fair that games aren't limited to the memory capacity that Nvidia cards have :eek:

Naturally this is upvoted by Nexus18 who is a bastion of bias for Nvidia with not a fig leaf to hide it.



AMD has been giving out plenty of VRAM for a long time now, they've even said it's cheap to do so.

Surely this is good competition but no, weird complaints declare it unfair because games companies use that extra VRAM :mad:

Nvidia could, with little effort, wipe out this problem that embarrasses their users in forum arguments: certain AMD cards sometimes having more VRAM, and thus better performance in memory-limited scenarios.

That would upset the segmentation so they don't. That's how much they care about their competitor having this niche advantage.

The point wasn't about hardware/VRAM aspects? :confused: It was about features/sponsored titles having Nvidia features, i.e. a **** ton of games with Nvidia tech versus AMD-sponsored games.

Bencher has a point: AMD have fans that foam at the mouth when Nvidia do "bad", hence why they are more vocal online and why threads explode with "Nvidia bad, AMD good". You don't see AMD threads exploding when they have issues, e.g. the RDNA 3 vapour chamber overheating/catching fire, but Nvidia and the power adapter (which ended up being user error, granted, with a poor design) gets "zOMG, how does this company get away with it!?" and so on.

Are people with more VRAM really getting a benefit from it, though? Are the textures etc. really that much better than the "high" setting, or than what consoles (with a unified total memory pool of 16GB) offer? Or is the benefit of all that extra VRAM on PC just to paper over poor optimisation, the lack of a unified memory config and of a good DirectStorage implementation, in a handful of notoriously broken/buggy games? Since apparently it's never the game's/developer's fault, then I guess it isn't CDPR's fault that Cyberpunk 2077 path tracing runs better on a 3070 than a 7900 XTX, right?
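For a sense of scale on the texture question, here's a back-of-the-envelope sketch of what a single block-compressed 4K texture costs in memory. The format choice and mip overhead are textbook assumptions, not figures from any particular game:

```python
# One 4096x4096 texture in BC7 (8 bits per texel), plus ~33% for the mip chain.
texels = 4096 * 4096
base_bytes = texels * 1            # BC7 stores 1 byte per texel
with_mips = base_bytes * 4 // 3    # full mip chain adds roughly a third
mib = with_mips / 2**20
print(round(mib, 1))               # ~21.3 MiB per texture
```

At roughly 21 MiB each, a few hundred unique 4K materials would occupy several gigabytes, so whether a game "needs" 12GB+ depends heavily on how aggressively it streams and reuses textures rather than on raw capacity alone.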

As I have said before, I've got no loyalty to any company; my AMD purchases far outweigh both Nvidia and Intel combined. But people need to stop getting brainwashed into thinking "AMD good, Nvidia bad" and thinking that everything bad which happens in the GPU space is Nvidia's fault.
 
I got no loyalty to any company
[image attachment]
 

It's funny though, the last Nvidia GPU I owned before the 3080 was an 8800 GT; after that, and before the 3080, I had several AMD GPUs (and no real problems). Likewise with CPUs, my last Intel CPU was an i5 750. But nah, "Nvidia fanboy" :cry:

As I've got older, I appreciate having a good experience "now" and not in maybe 2-3 years' time or longer. You can't put a price on that.

AMD fans just don't like the harsh truth and get overly defensive about it. The best part, as I always keep pointing out, is that even when the diehard AMD fans/Nvidia haters keep crying on here about how bad Nvidia are, they still keep buying Nvidia products over AMD :D
 
That's just a silly thing to say, but sure, just remember (you and whoever liked your post) what you are calling fair play. Next time, if Nvidia is the one doing the shenanigans, call it fair play as well, as I'm sure you did in the past, right? :D

The bias is overflowing but whatever
Nvidia has been doing the shenanigans for a long time, and they'll do it again irrespective of what AMD does. The whole business of pushing absurd, invisible-to-the-naked-eye levels of tessellation at the time comes to mind in particular.

But in any case I'm on a 3060 12GB so my only real bias is thinking the upgrade paths they're offering at the moment look a bit ****.
 
Nvidia has been doing the shenanigans for a long time, and they'll do it again irrespective of what AMD does. The whole business of pushing absurd, invisible-to-the-naked-eye levels of tessellation at the time comes to mind in particular.
And I'm sure you and whoever liked your post were also claiming "fair play" and saying they were doing great, RIGHT? Oh no, you only say that when AMD pulls it. Ah, okay.
 
Using the word bias while complaining loudly that it's not fair that games aren't limited to the memory capacity that Nvidia cards have :eek:

Naturally this is upvoted by Nexus18 who is a bastion of bias for Nvidia with not a fig leaf to hide it.
That is pure strawmanning. Makes sense, when you can't argue against what I'm saying.

Games can take as much as 58GB of VRAM for all I care. What I'm saying is, the game that takes 58GB of VRAM should freaking look better than e.g. A Plague Tale, which needs 5. The fact that 97% of the games that hog VRAM are AMD-sponsored and look anywhere from average to plain bad (Godfall, Forspoken) is my issue. If they looked great and needed the VRAM, that would be absolutely fine. But they don't.
 
Nvidia has been doing the shenanigans for a long time, and they'll do it again irrespective of what AMD does. The whole business of pushing absurd, invisible-to-the-naked-eye levels of tessellation at the time comes to mind in particular.

But in any case I'm on a 3060 12GB so my only real bias is thinking the upgrade paths they're offering at the moment look a bit ****.

Yup, that with Crysis 2 and The Witcher 3 was very scummy; the over-tessellation provided zero benefit to the IQ. At least with RT, the higher resolution, denoising, increased light bounces and so on do provide noticeable IQ improvements.

They have been on a good track with their sponsored titles over the last couple of years though, i.e. their sponsored titles run just as well on competing GPUs as on Nvidia's own, or sometimes better; look at how well Metro Exodus EE (full technical sponsorship by Nvidia) runs on RDNA 2.

AMD have been the more scummy ones recently, removing/blocking Nvidia tech such as DLSS and not supporting initiatives that would benefit all customers and developers regardless of brand loyalty, i.e. Streamline.
 
That is pure strawmanning. Makes sense, when you can't argue against what I'm saying.

Games can take as much as 58GB of VRAM for all I care. What I'm saying is, the game that takes 58GB of VRAM should freaking look better than e.g. A Plague Tale, which needs 5. The fact that 97% of the games that hog VRAM are AMD-sponsored and look anywhere from average to plain bad (Godfall, Forspoken) is my issue. If they looked great and needed the VRAM, that would be absolutely fine. But they don't.

I must admit, it is rather amusing how Nvidia got **** back in the day for their sponsored games that ran awfully at launch, e.g. Batman: Arkham Knight and so on, yet now most of the titles that have serious issues/run badly are AMD-sponsored :cry:

Maybe this is AMD's plan... to force people onto consoles? :D
 
I must admit, it is rather amusing how Nvidia got **** back in the day for their sponsored games that ran awfully at launch, e.g. Batman: Arkham Knight and so on, yet now most of the titles that have serious issues/run badly are AMD-sponsored :cry:

Maybe this is AMD's plan... to force people onto consoles? :D
With their current pricing you could say the same about Nvidia ;)
 
The point wasn't about hardware/VRAM aspects? :confused:

Bencher has a point: AMD have fans that foam at the mouth when Nvidia do "bad", hence why they are more vocal online and why threads explode with "Nvidia bad, AMD good". You don't see AMD threads exploding when they have issues, e.g. the RDNA 3 vapour chamber overheating/catching fire, but Nvidia gets "zOMG, how does this company get away with it!?" and so on.

Are people with more VRAM really getting a benefit from it, though? Are the textures etc. really that much better than the "high" setting, or than what consoles (with a unified total memory pool of 16GB) offer? Or is the benefit of all that extra VRAM on PC just to paper over poor optimisation, the lack of a unified memory config and of a good DirectStorage implementation, in a handful of notoriously broken/buggy games? Since apparently it's never the game's/developer's fault, then I guess it isn't CDPR's fault that Cyberpunk 2077 path tracing runs better on a 3070 than a 7900 XTX, right?

As I have said before, I've got no loyalty to any company; my AMD purchases far outweigh both Nvidia and Intel combined. But people need to stop getting brainwashed into thinking "AMD good, Nvidia bad" and thinking that everything bad which happens in the GPU space is Nvidia's fault.

I'd rather see someone prove these VRAM issues come from incompetence or sabotage than claim that, because Game B uses less memory, it must be unfair tactics that Game A uses more and exceeds an Nvidia limit (but not an AMD limit).

That is pure strawmanning. Makes sense, when you can't argue against what I'm saying.

Games can take as much as 58GB of VRAM for all I care. What I'm saying is, the game that takes 58GB of VRAM should freaking look better than e.g. A Plague Tale, which needs 5. The fact that 97% of the games that hog VRAM are AMD-sponsored and look anywhere from average to plain bad (Godfall, Forspoken) is my issue. If they looked great and needed the VRAM, that would be absolutely fine. But they don't.

No it is not. Earlier you described how games requiring more memory cause problems that can only be overcome by Nvidia being less of a stingy git with memory. Shock horror at the idea of a competitor forcing Nvidia to give the customer more.

Again, you are missing the point. I agree it has caused problems, but that's not because of a lack of VRAM. If that were the problem, then you wouldn't have better-looking games using half or less of the VRAM of horrible-looking games. No matter how much VRAM Nvidia adds, there will always be AMD-sponsored games that ask for more than that. The only thing Nvidia can do to stop these AMD schemes is to add more VRAM than AMD cards have; otherwise they are screwed and it will keep on happening.

But you have instead rationalised the fact that Nvidia doesn't care about this by shifting the blame onto game companies being incompetent and AMD scheming to make sure games companies are incompetent.

All to make sure that in very specific scenarios, specific AMD cards will have a memory advantage.
 
No it is not. Earlier you described how games requiring more memory cause problems that can only be overcome by Nvidia being less of a stingy git with memory. Shock horror at the idea of a competitor forcing Nvidia to give the customer more.


But you have instead rationalised the fact that Nvidia doesn't care about this by shifting the blame onto game companies being incompetent and AMD scheming to make sure games companies are incompetent.

All to make sure that in very specific scenarios, specific AMD cards will have a memory advantage.
Again, for the ninth time: if a game takes three times the VRAM of another game, then it SHOULD look better. The games that hog VRAM do NOT look better than e.g. A Plague Tale. Therefore it necessarily follows that the problem lies within the game and not with the amount of VRAM. That's literally logic 101.
 
Again, for the ninth time: if a game takes three times the VRAM of another game, then it SHOULD look better. The games that hog VRAM do NOT look better than e.g. A Plague Tale. Therefore it necessarily follows that the problem lies within the game and not with the amount of VRAM. That's literally logic 101.

I do not think you have enough information to claim different games "should" be using X or Y amount of memory.
 
And I'm sure you and whoever liked your post were also claiming "fair play" and saying they were doing great, RIGHT? Oh no, you only say that when AMD pulls it. Ah, okay.
The thing is, it's just part of the competition now. It's established, and Nvidia had a big hand in establishing it as such. Would I prefer it if all such behaviour just magically evaporated? Sure! It's very much a case of playing stupid games and winning stupid prizes for all involved, when looking in from a magical hypothetical universe where none of this behaviour took place.

But we can't really expect AMD to play goodie-two-shoes when Nvidia has clearly benefited from this behaviour.
 
The thing is, it's just part of the competition now. It's established, and Nvidia had a big hand in establishing it as such. Would I prefer it if all such behaviour just magically evaporated? Sure! It's very much a case of playing stupid games and winning stupid prizes for all involved, when looking in from a magical hypothetical universe where none of this behaviour took place.

But we can't really expect AMD to play goodie-two-shoes when Nvidia has clearly benefited from this behaviour.
Nvidia used to do it and it was disgusting. Now AMD is doing it and you called it fair play. At this point we are moving towards AMD-exclusive games: if you don't have an AMD card you can't play them, and that's kind of true, because the majority of gamers do not actually have 4090s and 4080s but low/mid-range Ampere/Turing.

My issue is, why don't you put in the effort to at least make the game LOOK great on top of hogging VRAM? Because right now I have no incentive to buy an AMD card with lots of VRAM, since I have to pay for that VRAM, it consumes more power, and I'll end up playing games that actually look worse.
 
Nvidia used to do it and it was disgusting. Now AMD is doing it and you called it fair play. At this point we are moving towards AMD-exclusive games: if you don't have an AMD card you can't play them, and that's kind of true, because the majority of gamers do not actually have 4090s and 4080s but low/mid-range Ampere/Turing.

My issue is, why don't you put in the effort to at least make the game LOOK great on top of hogging VRAM? Because right now I have no incentive to buy an AMD card with lots of VRAM, since I have to pay for that VRAM, it consumes more power, and I'll end up playing games that actually look worse.
"Used to"? What makes you think Nvidia aren't behaving like this today as well? Their absolute slew of proprietary value-adds would suggest otherwise, whereas AMD's equivalents are always open access. You can't tell me that e.g. Control's and Cyberpunk's RT settings weren't configured with disadvantaging AMD cards in mind, either. You're basically expecting AMD to bring a knife to a gun fight.

But my issue with Nvidia here is that their problem is actually trivially resolvable: just offer higher-VRAM alternatives to their current products. It's an easy thing to do, requires zero re-architecting and doesn't even cost very much. But they don't want to do that, because ultimately Nvidia want those parts to get crippled by VRAM requirements so that you buy their next gen as well... It's just coming home to roost a bit faster than anticipated.
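For what it's worth, the "zero re-architecting" point checks out arithmetically: VRAM capacity on a GDDR6 board is set by bus width and chip density, and a clamshell layout (two chips sharing each 32-bit channel, as workstation cards use) doubles capacity on the same GPU and bus. A rough sketch, assuming the common 16Gb (2GB) GDDR6 chip density:

```python
# Capacity from bus width: each GDDR6 chip occupies a 32-bit channel.
bus_bits = 256
channels = bus_bits // 32           # 8 memory channels
gb_per_chip = 2                     # assuming common 16Gb (2GB) chips
standard = channels * gb_per_chip   # one chip per channel -> 16GB
clamshell = standard * 2            # two chips per channel -> 32GB
print(standard, clamshell)          # 16 32
```

Bandwidth is unchanged either way, since the bus width stays the same; only the capacity per channel doubles, which is exactly why it is a cheap board-level option rather than a redesign.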
 