That's just a silly thing to say but sure, just remember (you and whoever liked your post) what you are calling fair play. Next time, if Nvidia is the one doing the shenanigans, call it fair play as well, as I'm sure you did in the past, right?

Sponsoring games to push your line's advantage is Nvidia 101. If they cheaped out on a cheap part, presumably to screw their own consumers, and AMD is pressing that advantage, then I say fair play to them; Nvidia's had it coming.
Interesting you should say that, because TechPowerUp predicts that the recently released W7800 Pro card should, theoretically, and with a lot of salt etc., be approximately 10% faster than the 4070, and it has specs that could easily fit a 7800 XT (256-bit bus, further cut-down Navi 31 with approx. 44 TFLOPs of throughput).

AMD will release a 7800 XT which is 5-10% faster than the 4070 but with worse ray tracing and power consumption for an RRP of ~$649, and everyone will sigh and roll their eyes.
The bias is overflowing
Even if Nvidia did pay to get their tech in games... I'd rather have this than AMD's approach.
Using the word bias while complaining loudly that it's not fair that games aren't limited to the memory capacity that Nvidia cards have.
Naturally this is upvoted by Nexus18 who is a bastion of bias for Nvidia with not a fig leaf to hide it.
AMD has been giving out plenty of VRAM for a long time now, they've even said it's cheap to do so.
Surely this is good competition, but no, weird complaints declare it unfair because game companies use that extra VRAM.
Nvidia could, with little effort, wipe out this problem that embarrasses their users in forum arguments: certain AMD cards sometimes having more VRAM and thus better performance in memory-limited scenarios.
That would upset the segmentation so they don't. That's how much they care about their competitor having this niche advantage.
I got no loyalty to any company
Nvidia has been doing the shenanigans for a long time; they'll do it again irrespective of what AMD does. The whole thing of pushing for absurd, invisible to the naked eye levels of tessellation at the time comes to mind in particular.

That's just a silly thing to say but sure, just remember (you and whoever liked your post) what you are calling fair play. Next time, if Nvidia is the one doing the shenanigans, call it fair play as well, as I'm sure you did in the past, right?
The bias is overflowing but whatever
And I'm sure you and whoever liked your post were also claiming "fair play" and that they were doing great. RIGHT? Oh, no, you also say that when AMD pulls it. Ah, okay.

Nvidia has been doing the shenanigans for a long time; they'll do it again irrespective of what AMD does. The whole thing of pushing for absurd, invisible to the naked eye levels of tessellation at the time comes to mind in particular.
That is purely strawmanning. Makes sense, when you can't argue against what I'm saying.

Using the word bias while complaining loudly that it's not fair that games aren't limited to the memory capacity that Nvidia cards have.
Naturally this is upvoted by Nexus18 who is a bastion of bias for Nvidia with not a fig leaf to hide it.
Nvidia has been doing the shenanigans for a long time; they'll do it again irrespective of what AMD does. The whole thing of pushing for absurd, invisible to the naked eye levels of tessellation at the time comes to mind in particular.
But in any case I'm on a 3060 12GB so my only real bias is thinking the upgrade paths they're offering at the moment look a bit ****.
That is purely strawmanning. Makes sense, when you can't argue against what I'm saying.
Games can take as much as 58 GB of VRAM for all I care. What I'm saying is, the game that takes 58 GB of VRAM should freaking look better than e.g. Plague Tale, which needs 5. The fact that 97% of the games that hog VRAM are AMD-sponsored and they look anywhere from average to plainly bad (Godfall, Forspoken) is my issue. If they looked great and needed the VRAM, that's absolutely fine. But they don't.
With their current pricing you could say the same about Nvidia.

I must admit, it is rather amusing how Nvidia got **** back in the day for their sponsored games that ran awfully on launch, i.e. Batman: Arkham Knight and so on; now most of the titles that have serious issues/run badly are AMD-sponsored.
Maybe this is AMD's plan... to force people to go to consoles?
The point wasn't about hardware/VRAM aspects?
Bencher has a point: AMD have fans that foam at the mouth when Nvidia does "bad", hence why they are more vocal online and why threads explode with "Nvidia bad, AMD good". You don't see AMD threads exploding when they have issues, i.e. RDNA 3 vapour chambers overheating/catching fire, but with Nvidia it's "zOMG, how does this company get away with it!?" and so on.
Are people with more VRAM really getting a benefit from having more VRAM though? Are the textures etc. really that much better than the "high" setting, or than what consoles (with a unified total memory of 16GB) offer? Or rather, is the benefit of all that extra VRAM on PC just to avoid poor optimisation/issues, the lack of a unified memory config, or a good implementation of DirectStorage in a handful of well-regarded broken/buggy games...

Since apparently it's never the games'/developers' fault, then I guess it isn't CDPR's/the game's fault that CP 2077 path tracing runs better on a 3070 than a 7900 XTX, right?
As I have said before, I got no loyalty to any company; my AMD purchases far outweigh both Nvidia and Intel combined, but people need to stop getting brainwashed into thinking "AMD good, Nvidia bad" and thinking that everything bad which happens in the GPU space is Nvidia's fault.
That is purely strawmanning. Makes sense, when you can't argue against what I'm saying.
Games can take as much as 58 GB of VRAM for all I care. What I'm saying is, the game that takes 58 GB of VRAM should freaking look better than e.g. Plague Tale, which needs 5. The fact that 97% of the games that hog VRAM are AMD-sponsored and they look anywhere from average to plainly bad (Godfall, Forspoken) is my issue. If they looked great and needed the VRAM, that's absolutely fine. But they don't.
Again, you are missing the point. I agree it has caused problems, but that's not because of a lack of VRAM. If that was the problem, then you wouldn't have better-looking games using half or less of the VRAM of horrible-looking games. No matter how much VRAM Nvidia adds, there will always be AMD-sponsored games that ask for more than that. The only thing Nvidia can do to stop these AMD schemes is to add more VRAM than AMD cards have. Otherwise they are screwed; it will keep on happening.
Again, for the 9th time, if a game takes 3 times the VRAM of another game, then it SHOULD look better. The games that hog VRAM do NOT look better than e.g. Plague Tale. Therefore, it necessarily follows that the problem lies within the game and not within the amount of VRAM. That's literally logic 101.

No it is not. Earlier you described how games requiring more memory cause problems that can only be overcome by Nvidia being less of a stingy git with memory. Shock horror at the idea of a competitor forcing Nvidia to give the customer more.
But you have instead rationalised the fact that Nvidia doesn't care about this by shifting the blame onto game companies being incompetent and AMD scheming to make sure game companies are incompetent.
All to make sure that in very specific scenarios, specific AMD cards will have a memory advantage.
Again, for the 9th time, if a game takes 3 times the VRAM of another game, then it SHOULD look better. The games that hog VRAM do NOT look better than e.g. Plague Tale. Therefore, it necessarily follows that the problem lies within the game and not within the amount of VRAM. That's literally logic 101.
The thing is, it's just part of the competition now. It's established, and Nvidia had a big hand in establishing it as such. Would I prefer it if all such behaviour just magically evaporated? Sure! It's very much a case of playing stupid games and winning stupid prizes for all involved, when looking in on it from a magical hypothetical universe where none of this behaviour took place.

And I'm sure you and whoever liked your post were also claiming "fair play" and that they were doing great. RIGHT? Oh, no, you also say that when AMD pulls it. Ah, okay.
Nvidia used to do it and it was disgusting. Now AMD is doing it and you call it fair play. At this point we are moving into AMD-exclusive games: if you don't have an AMD card you can't play them, and that's kinda true, because the majority of gamers do not actually have 4090s and 4080s but low/mid-range Ampere/Turing.

The thing is, it's just part of the competition now. It's established, and Nvidia had a big hand in establishing it as such. Would I prefer it if all such behaviour just magically evaporated? Sure! It's very much a case of playing stupid games and winning stupid prizes for all involved, when looking in on it from a magical hypothetical universe where none of this behaviour took place.
But we can't really expect AMD to play goodie-two-shoes when Nvidia has clearly benefited from this behaviour.
"Used to?" What makes you think Nvidia aren't behaving like this today as well? Their absolute slew of proprietary value adds would suggest otherwise whereas AMD's equivalents are always open access. You can't tell me that e.g. Control and Cyberpunk's RT settings weren't configured with disadvantaging AMD cards in mind, either. You're basically expecting AMD to bring a knife to a gun fight.Nvidia used to do it and it was disgusting. Now AMD is doing and you called it fairplay. At this point we are moving into AMD exclusive games, if you don't have an amd card you cant play them, and that's kinda true cause the majority of gamers do not actually have 4090s and 4080s but low / mid range ampere / turing..
My issue is: why don't you put in the effort to at least make the game LOOK great on top of hogging VRAM? Because right now I have no incentive to buy an AMD card with lots of VRAM, since I have to pay for that VRAM, it consumes more power, and I'll end up playing games that actually look worse.