• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

This is such a lousy discussion.

There are about three people with no evidence trying to claim VRAM usage is deliberately pro-AMD, but when you poke them even slightly about what they have to show for it, they curl up and say it's actually an honourable complaint about inefficient VRAM usage. The continuing issue is that they still have nothing.

Your reckoning is not worth spit, and if you can't actually tie it to AMD, then re-read the thread title.
 
It isn't. It was/is just crappy that such an old game needs more than 8GB at 1440p.

The funny thing, though, is that they launched 4GB Fury cards (plenty of talk back then about 4GB being enough for 4K), then 8GB cards like Vega and the 5700 XT. Why, hadn't they heard about Rust? Or was it simply because they ignore outliers/badly optimised games?

How is this still the apologia? Game A suffers VRAM overflow: "it's the developer's fault." Game B suffers VRAM overflow: "it's the developer's fault." Game C, Game D, E, F, G, H... the list grows and grows, and it's never the hardware (unless it's AMD); it's always the developer, and there are now dozens of bad developers. Meanwhile, the developers are saying: PCs are weak, but we don't have to worry about them anymore. Consoles have the bigger market share, they are better than PCs, and now we can do more of the stuff we want to do; we are no longer hamstrung by crummy hardware.
 
OK then, so the 3070 running CP2077 path tracing better than a 7900 XTX isn't the developers' fault either, gotcha! :D

Duh?

It's so hypocritical for the RT/DLSS quarantine thread inhabitants to complain that cards have an inherent edge because of their design. Yeah, a card designed for an advantage at X has an advantage when X turns up :eek:
 
Except it's not a hardware limitation, it's a game/dev issue... the 7900 XTX should be performing on par with a 3080/3090, not being beaten by a 3070 :) It may very well be down to Nvidia...

But using the above logic, it's not a game/dev fault at all :)
 
If a game like Rust needs more than 8GB of VRAM for 1440p, it's not the hardware's fault; it doesn't matter if it's AMD or Nvidia.
If a game like TLOU needs more than 8GB at 1080p plus a beefy CPU, it's not the hardware's fault; it doesn't matter if it's AMD or Nvidia.
If FC6 can't properly stream textures in and out and needs more than 8GB at 1080p for the high-res texture pack, it's not the hardware's fault; it doesn't matter if it's AMD or Nvidia.

etc.

HZD does fine with 8GB even at high resolutions.
GoW does fine with 8GB even at high resolutions.

If Stalker 2 comes out needing something like a 12GB minimum for 1080p/1440p with high/ultra textures and at least 16GB for 4K, yeah, understandable.
In the meantime, if the devs think the PC is weak and consoles are strong, what can I say... lol.
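
For a rough sense of why a high-res texture pack can blow past 8GB, here is a back-of-the-envelope sketch; the texture count and formats are illustrative assumptions, not measurements from any of the games above.

```cpp
#include <cstdio>

// Back-of-the-envelope VRAM maths for a texture budget. The texture count
// and formats below are illustrative assumptions, not data from a real game.
int main() {
    // A full mip chain adds roughly 1/3 on top of the base level.
    const double mipFactor = 4.0 / 3.0;

    // One 4096x4096 texture, BC7-compressed (1 byte per texel)...
    const double bc7Bytes   = 4096.0 * 4096.0 * 1.0 * mipFactor;  // ~22 MB
    // ...and the same texture uncompressed as RGBA8 (4 bytes per texel).
    const double rgba8Bytes = 4096.0 * 4096.0 * 4.0 * mipFactor;  // ~90 MB

    // Suppose a "high" texture pack keeps ~300 such textures resident.
    const int resident = 300;
    std::printf("BC7:   %.1f GB resident\n", bc7Bytes   * resident / 1e9); // ~6.7 GB
    std::printf("RGBA8: %.1f GB resident\n", rgba8Bytes * resident / 1e9); // ~26.8 GB

    // Render targets, geometry, and driver overhead come on top of this,
    // which is how an 8GB card ends up short at 1080p with an HD pack.
    return 0;
}
```

Whether the total lands under or over 8GB depends almost entirely on how aggressively the engine compresses and streams, which is exactly the optimisation argument being made here.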
 
Sounds like a distraction from the garbage that is the memory discussion.

Continue with the memory discussion that has filled half this thread, and provide the backing for why it is bad for the retail GPU segment that AMD has typically been generous with memory.

As far as I have observed, the only issue is that running out of VRAM embarrasses Nvidia's segmentation and embarrasses people who tie their self-worth to Nvidia.

People without any loyalty wouldn't be emotionally trapped into defending and deflecting an issue which has no defence whatsoever.
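
For concreteness, "running out of VRAM" has a measurable meaning on Windows: the OS reports a per-adapter budget, and once an application's usage approaches it, the video memory manager starts demoting allocations to system RAM over PCIe, which is where the frame-time spikes come from. A minimal sketch using the standard DXGI 1.4 query (the API is real; the loop and reporting are just illustration):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

// Windows-only sketch: ask the OS for the per-adapter VRAM budget vs usage.
// QueryVideoMemoryInfo is the real DXGI 1.4 API; everything else here is
// plumbing. When CurrentUsage nears Budget, allocations start getting
// demoted to system RAM over PCIe, and that is where the stutter comes from.
int main() {
    using Microsoft::WRL::ComPtr;

    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            std::printf("Adapter %u: budget %.1f GB, in use %.1f GB\n",
                        i, info.Budget / 1e9, info.CurrentUsage / 1e9);
        }
    }
    return 0;
}
```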
 
Like I've said many times before, I've got no problem with having more VRAM "if" it is actually being put to proper use, rather than just to avoid/brute-force past issues inherent to how the game has been ported. Again, see TLOU when it released and watch the DF video: a PC with similar specs to the PS5 runs it worse in a number of ways, yet it's somehow not a fault with the game... :cry:

People should not just be pushing Nvidia to provide more VRAM but also pushing developers to do a better job of porting/optimising for PC hardware; otherwise, keep enjoying having to spend ££££ just to avoid/brute-force past issues...

If you can't see or understand that, then you're part of the problem with these well-regarded yet broken PC titles.
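
On the "proper use" point, what good streaming looks like at the engine level is essentially a residency budget with eviction, so textures move in and out of VRAM as needed instead of piling up until the card chokes. A minimal LRU sketch of the idea (the class, sizes, and names are hypothetical, not any real engine's API):

```cpp
#include <cstdint>
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

// Minimal LRU texture-residency budget. Sizes and names are made up;
// this is a sketch of the idea, not any particular engine's streamer.
class TexturePool {
public:
    explicit TexturePool(uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Call when a texture is needed this frame: streams it in if absent,
    // evicting least-recently-used textures to stay under the budget.
    void touch(const std::string& name, uint64_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {             // already resident: mark as fresh
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            auto& victim = lru_.back();       // least recently used goes first
            std::printf("evict %s (%llu MB)\n", victim.first.c_str(),
                        (unsigned long long)(victim.second >> 20));
            used_ -= victim.second;
            index_.erase(victim.first);
            lru_.pop_back();
        }
        lru_.emplace_front(name, sizeBytes);  // "stream in" the new texture
        index_[name] = lru_.begin();
        used_ += sizeBytes;
    }

private:
    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<std::pair<std::string, uint64_t>> lru_;
    std::unordered_map<std::string,
        std::list<std::pair<std::string, uint64_t>>::iterator> index_;
};

int main() {
    TexturePool pool(6ull << 30);               // pretend 6GB texture budget
    pool.touch("rock_albedo_4k", 22ull << 20);
    pool.touch("cliff_albedo_4k", 22ull << 20); // ...called per frame, per asset
}
```

Roughly speaking, this is the difference between the ports that stay comfortable on 8GB and the ones people end up brute-forcing with bigger cards.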
 
This is the swivel position I've seen in this thread a few times now, and it raises an important question.

Why does this discussion belong in this thread if you abandon trying to pin it on AMD?

Surely this is manufacturer-agnostic and deserves its own thread for speculating that developers find it a lot easier to optimise for their fixed-hardware consoles than for the mess that is PC.
 
Consoles are better than Nvidia GPUs.
 
There is a big thread on AT forums about this:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K (sorry, 8GB), because a forum user said so.
  14. It's completely acceptable for 3070/3070 Ti/3080 owners to turn down settings while 3060 users have no issue.
  15. It's an anomaly.
  16. It's a console port.
  17. It's a conspiracy against nVidia.
  18. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  19. It's completely acceptable to disable ray tracing on nVidia cards while AMD users have no issue.
They keep adding new ones when they pop up! I thought it was amusing.

:p
 
Seems a comprehensive list; we've barely scratched the surface in here.

Not that it should be in here at all, but literally three people wanted to press the "AMD's fault" angle.
 
Pure gold.
 
You can use higher resolution and more diverse textures on consoles than you can on Nvidia GPUs.
Lol, how come? Off the top of my head, there are at least three GPUs with the same or more VRAM than the total memory the consoles have (if you want to play that angle), and others that don't have any issues at lower resolutions.

Not to mention the RT (in)capabilities of consoles, or how they lag behind in rasterisation.


It's not AMD's fault.
 