
Do AMD provide any benefit to the retail GPU segment?

Oh...... RX 6700XT, Sapphire Pulse, good one that. £350.


This one also down to £350.
They were both at least £375 a week ago, possibly a little more.
 
If anything, my next GPU is likely to be AMD; it seems Nvidia only want to provide enough VRAM if I pay an arm and a leg for it.

I see more and more of this comment or similar, particularly from owners of 3070s getting a bit bitter, and although I doubt it'll be much, I wonder whether, and by how much, Nvidia will lose market share over the next year or so because of this.
 
I see more and more of this comment or similar, particularly from owners of 3070s getting a bit bitter, and although I doubt it'll be much, I wonder whether, and by how much, Nvidia will lose market share over the next year or so because of this.
It must grate when you know your GPU has enough grunt to run a game but find you're running out of VRAM. I suppose that's the business model you buy into, regular upgrades. Wouldn't be so bad if the prices were more reasonable.
 
Typically I find the game that is my "go to" to be the one where my 3070 runs out of VRAM....

[Attached image: memory.png]


.....what difference that makes I'm not sure; it's not as though I have an 8GB/16GB switch to throw. Otherwise the card is rock solid, absolutely, and has been great at either 1440p or UW. It runs really well. There are some later titles it seems to run out of grunt with, well, to a point; Hogwarts springs to mind, I need to get back to that one. But a part of me reckons that inadequate development and optimisation plays as big a part in dragging the performance down as any GPU limitation.

It wasn't my first choice of that gen; that went to a Red Devil Limited Edition 6800 XT. It lasted around 2 weeks before being returned for a refund. That AMD card ran as expected in the AAA titles I had to test it with, but in the indie-type games I played it was horrendous. The AMD support forums were of little use, but it did remind me of historical problems I'd had in the past with ATI.

With the above in mind there will probably be a resignation and acceptance of sticking with another Nvidia card, even though I would like that to be a choice made knowing an AMD one would work as expected; for me, it doesn't.
 
I see more and more of this comment or similar, particularly from owners of 3070s getting a bit bitter, and although I doubt it'll be much, I wonder whether, and by how much, Nvidia will lose market share over the next year or so because of this.

Nvidia won't lose any market share over this.

We have seen, generation after generation, how well AMD's cards perform over years of updates, the "fine wine", yet people still buy Nvidia anyway.

Nvidia have won the mind-game war. The only thing AMD can do is put something out there where you would actually have to be clinically insane to buy Nvidia... and even then people would still buy Nvidia over AMD.
 
Nvidia won't lose any market share over this.

We have seen, generation after generation, how well AMD's cards perform over years of updates, the "fine wine", yet people still buy Nvidia anyway.

Nvidia have won the mind-game war. The only thing AMD can do is put something out there where you would actually have to be clinically insane to buy Nvidia... and even then people would still buy Nvidia over AMD.
I still don't get how people think "fine wine" is a good thing.
Would you like 95% of the performance now, or in 12 months (with 80% of it now)? "I'll go for 80% of the performance now and then have it increase up to 95% over 12 months, please."

I mean I'd rather have them start at 80% and increase to 95% over 12 months than remain at 80% for the life of the card, but I'd prefer 95% from the start even if it meant very little in the way of boosts in the future.

Also, when they get reviewed they get reviewed with the performance available at release, nobody's gonna wait 12 months to review a card so the drivers have matured.

If a game comes out buggy and then gets patched over the course of a year so that it's now mostly bug-free and optimised, nobody praises that approach. Look at No Man's Sky: awful state at release, but it sounds like it's in a good place now. I doubt anyone is hoping that every future game takes the No Man's Sky "fine wine" approach.

As for the whole 8/12/16GB VRAM thing, the way I see it, you knew what you were buying and if you didn't maybe that's on you (at least to a degree). If you were thinking that 8GB isn't enough (at the time) or wouldn't be enough going forward, maybe don't buy it. The last time I bought an 8GB card was in January 2019 and it was a 1070. At the time, playing at 1440p that seemed OK, I didn't expect the card to be compromise free at the time given there was also a 1070Ti, 1080 and 1080Ti. So I wasn't surprised when 8GB started being an issue, although these days I imagine lack of grunt is as much of an issue for the card as lack of VRAM. Besides which at that point the other options were Vega 56, Vega 64, 1070Ti and 1080 that were also 8GB and then the 1080Ti at 11GB (and probably a Titan card of some description).

I mean would it be nice if graphics cards had more VRAM, of course it would, but why stop at 16GB? I see posts saying how cheap VRAM is and AMD putting 16GB on cheaper cards seems to prove this, so why not go to 32GB or 64GB just to really sort this out?
 
Typically I find the game that is my "go to" to be the one where my 3070 runs out of VRAM....

[Attached image: memory.png]


.....what difference that makes I'm not sure; it's not as though I have an 8GB/16GB switch to throw. Otherwise the card is rock solid, absolutely, and has been great at either 1440p or UW. It runs really well. There are some later titles it seems to run out of grunt with, well, to a point; Hogwarts springs to mind, I need to get back to that one. But a part of me reckons that inadequate development and optimisation plays as big a part in dragging the performance down as any GPU limitation.

It wasn't my first choice of that gen; that went to a Red Devil Limited Edition 6800 XT. It lasted around 2 weeks before being returned for a refund. That AMD card ran as expected in the AAA titles I had to test it with, but in the indie-type games I played it was horrendous. The AMD support forums were of little use, but it did remind me of historical problems I'd had in the past with ATI.

With the above in mind there will probably be a resignation and acceptance of sticking with another Nvidia card, even though I would like that to be a choice made knowing an AMD one would work as expected; for me, it doesn't.

Look at the VRAM in the OSD on this, 120Hz. Keep your eye on the system RAM as I approach the scrapyard; the VRAM is already full, it needs more, so where does it put it? System RAM. Keep your eye on it and watch the frame rates go from 100-120 down to 30.

RTX 2070 Super.
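
For anyone who wants to watch the same thing happen on their own card, here's a rough little monitor I'd knock together: a minimal sketch, assuming Python with the nvidia-ml-py (pynvml) and psutil packages installed, printing the same VRAM-used / system-RAM-used numbers you'd normally have on an OSD.

```python
# Minimal sketch: poll VRAM and system RAM usage once a second.
# Assumes the nvidia-ml-py (pynvml) and psutil packages are installed.
import time

import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)   # bytes: .used / .total
        ram = psutil.virtual_memory()                # bytes: .used / .total
        print(
            f"VRAM {vram.used / 2**30:5.1f} / {vram.total / 2**30:.1f} GiB | "
            f"System RAM {ram.used / 2**30:5.1f} / {ram.total / 2**30:.1f} GiB"
        )
        # If VRAM sits pinned at the top while system RAM keeps climbing,
        # that's the overspill described above, and frame rates usually follow.
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```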

 
Look at the VRAM in the OSD on this, 120Hz. Keep your eye on the system RAM as I approach the scrapyard; the VRAM is already full, it needs more, so where does it put it? System RAM. Keep your eye on it and watch the frame rates go from 100-120 down to 30.

RTX 2070 Super.

And that's a perfect example of terrible optimization. You think this game, the way it looks, should require more than 8GB of VRAM? The game looks like it came straight from 2008, something like Fallout 3.
 
And that's a perfect example of terrible optimization. You think this game, the way it looks, should require more than 8GB of VRAM? The game looks like it came straight from 2008, something like Fallout 3.
Isn't this exactly the opposite: an example of excellent optimisation?

It's something Hardware Unboxed mentioned in a recent 8GB/16GB video: some games run without a framerate hit on 8GB because they automatically select very low-res textures if the VRAM buffer is insufficient for the specified quality preset. The game runs fine but looks rubbish, and the internet goes: "See, there's nothing wrong with 8GB on a high-end card!"
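
Purely to illustrate the kind of fallback being described (this isn't any particular engine's code, just a sketch of the idea, with made-up preset sizes): the game quietly steps down the texture tier until the pool fits the VRAM it has left, so the frame rate holds but the image quality doesn't.

```python
# Illustrative sketch only, not a real engine: pick the highest texture
# tier whose (made-up) memory budget fits in the VRAM left on the card.
TEXTURE_TIERS_MB = {"ultra": 6000, "high": 3500, "medium": 1800, "low": 700}

def pick_texture_tier(requested: str, vram_free_mb: int) -> str:
    """Silently step down from the requested preset until the pool fits."""
    order = ["ultra", "high", "medium", "low"]
    for tier in order[order.index(requested):]:
        if TEXTURE_TIERS_MB[tier] <= vram_free_mb:
            return tier
    return "low"  # nothing fits; load the smallest mips and carry on

# An 8GB card with roughly 4.5GB left after the framebuffer and other assets:
print(pick_texture_tier("ultra", 4500))  # -> "high": frame rate fine, textures not
```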
 
Isn't this exactly the opposite: an example of excellent optimisation?

It's something Hardware Unboxed mentioned in a recent 8GB/16GB video: some games run without a framerate hit on 8GB because they automatically select very low-res textures if the VRAM buffer is insufficient for the specified quality preset. The game runs fine but looks rubbish, and the internet goes: "See, there's nothing wrong with 8GB on a high-end card!"
You look at that game up there and you think it's justified to use more than 8GB? LOL. The Witcher 3 was using 1.5 to 2GB of VRAM and looked much better. A Plague Tale uses 5GB of VRAM and looks 20 times better. What are you suggesting exactly?
 
And that's a perfect example of terrible optimization. You think this game, the way it looks, should require more than 8GB of VRAM? The game looks like it came straight from 2008, something like Fallout 3.

It's a 2013 game.

something like Fallout 3

It looks nothing like Fallout 3.

It's built in Unity (https://unity.com/), one of the best engines out there; IMO second only to Unreal Engine.

I do wish people would stop using "optimisation" as a catch-all for whatever doesn't run well on gimped hardware; doing that just encourages said hardware vendor to keep selling you gimped low-end hardware at enthusiast-grade pricing.

Besides that, people who use the "optimisation" catch-all have no clue what it means in game development.
 
It's a 2013 game.



It looks nothing like Fallout 3.

It's built in Unity (https://unity.com/), one of the best engines out there; IMO second only to Unreal Engine.

I do wish people would stop using "optimisation" as a catch-all for whatever doesn't run well on gimped hardware; doing that just encourages said hardware vendor to keep selling you gimped low-end hardware at enthusiast-grade pricing.

Besides that, people who use the "optimisation" catch-all have no clue what it means in game development.
I'm sorry, but if a 2013 game that looks like that needs more than 8GB of VRAM, then what should, e.g., Cyberpunk require? 40GB? If that's not awful optimization then I don't know what to tell you. The game looks way worse than games that require a quarter of the VRAM. Crysis 3 came out around the same time, and so did The Witcher 3.
 
Look, the behaviour it's exhibiting, seamlessly pushing VRAM overspill into system RAM without so much as a slight stutter or reduction in texture quality, is testament to the quality of the code.

I'm sorry, but if a 2013 game that looks like that needs more than 8GB of VRAM, then what should, e.g., Cyberpunk require? 40GB? If that's not awful optimization then I don't know what to tell you. The game looks way worse than games that require a quarter of the VRAM. Crysis 3 came out around the same time, and so did The Witcher 3.

Cyberpunk has nice lighting; beyond that it doesn't look good at all, just OK, in my view. Lighting is everything in that game; without it, it looks "meh".
 
I still don't get how people think "fine wine" is a good thing.
Would you like 95% of the performance now, or in 12 months (with 80% of it now)? "I'll go for 80% of the performance now and then have it increase up to 95% over 12 months, please."

I mean I'd rather have them start at 80% and increase to 95% over 12 months than remain at 80% for the life of the card, but I'd prefer 95% from the start even if it meant very little in the way of boosts in the future.

Also, when they get reviewed they get reviewed with the performance available at release, nobody's gonna wait 12 months to review a card so the drivers have matured.

If a game comes out buggy and then gets patched over the course of a year so that it's now mostly bug-free and optimised, nobody praises that approach. Look at No Man's Sky: awful state at release, but it sounds like it's in a good place now. I doubt anyone is hoping that every future game takes the No Man's Sky "fine wine" approach.

As for the whole 8/12/16GB VRAM thing, the way I see it, you knew what you were buying and if you didn't maybe that's on you (at least to a degree). If you were thinking that 8GB isn't enough (at the time) or wouldn't be enough going forward, maybe don't buy it. The last time I bought an 8GB card was in January 2019 and it was a 1070. At the time, playing at 1440p that seemed OK, I didn't expect the card to be compromise free at the time given there was also a 1070Ti, 1080 and 1080Ti. So I wasn't surprised when 8GB started being an issue, although these days I imagine lack of grunt is as much of an issue for the card as lack of VRAM. Besides which at that point the other options were Vega 56, Vega 64, 1070Ti and 1080 that were also 8GB and then the 1080Ti at 11GB (and probably a Titan card of some description).

I mean would it be nice if graphics cards had more VRAM, of course it would, but why stop at 16GB? I see posts saying how cheap VRAM is and AMD putting 16GB on cheaper cards seems to prove this, so why not go to 32GB or 64GB just to really sort this out?

I didn't say it's a good thing at all. I actually do agree with you.
But if you have two different GPUs that perform relatively similarly for similar prices, is the better card the one that stays roughly the same, or the one that could gain 10-20% more over the next couple of years? Sure, it's not 100% a good thing at the start, but the idea that the GPU will potentially last longer does make it more appealing.

Regardless, my point was less about "fine wine" and more that AMD have already lost the marketing game. Even if AMD released a product 20% cheaper with 20% more performance, they still wouldn't outsell Nvidia. They need to really go with the Ryzen Gen 1 mentality and absolutely crush the price difference. Ryzen 1 came out almost 50% cheaper than Intel's comparable top end even if it was 20% slower, and that difference really helped with market share.

Memory costs more as the sizes get larger and takes up more PCB, so I can see why they don't want to go to endless sizes; 16GB is, however, relatively cheap, and cheap enough that Nvidia could easily do it. It's not so much "why shouldn't AMD just put more than 16GB on the cheaper models" and more that it's clearly planned obsolescence on Nvidia's side: we already know some games are being bottlenecked by GPU memory, so Nvidia could create a product to fix that issue, they just don't want to.

 
I didn't say it's a good thing at all. I actually do agree with you.
But if you have two different GPUs that perform relatively similarly for similar prices, is the better card the one that stays roughly the same, or the one that could gain 10-20% more over the next couple of years? Sure, it's not 100% a good thing at the start, but the idea that the GPU will potentially last longer does make it more appealing.

Regardless, my point was less about "fine wine" and more that AMD have already lost the marketing game. Even if AMD released a product 20% cheaper with 20% more performance, they still wouldn't outsell Nvidia. They need to really go with the Ryzen Gen 1 mentality and absolutely crush the price difference. Ryzen 1 came out almost 50% cheaper than Intel's comparable top end even if it was 20% slower, and that difference really helped with market share.

Memory costs more as the sizes get larger and takes up more PCB, so I can see why they don't want to go to endless sizes; 16GB is, however, relatively cheap, and cheap enough that Nvidia could easily do it. It's not so much "why shouldn't AMD just put more than 16GB on the cheaper models" and more that it's clearly planned obsolescence on Nvidia's side: we already know some games are being bottlenecked by GPU memory, so Nvidia could create a product to fix that issue, they just don't want to.


I have some sympathies for Nvidia, to a very limited extent....

When it comes to memory (or rather the analogue side of the silicon), Moore's law really is dead, and if that is what Nvidia are talking about they are not wrong. As a result the memory architecture is taking up more and more room within the chip, so Nvidia are saving die space by having fewer memory lanes on the chip; it is perhaps why we now have a 192-bit 4070/Ti, down from 256-bit on the 3070/Ti, which saves quite a bit of space. The problem with that is even if you're using 2GB memory ICs you're still only getting 12GB, so it may be more an optimisation exercise than something cynically deliberate, though I do think the 8GB 3070 and 10GB 3080 were cynical.
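
The arithmetic behind that is straightforward: GDDR6/GDDR6X chips have a 32-bit interface, so the bus width fixes how many chips you can hang off the die, and the chip density then caps the capacity (ignoring clamshell designs that double up chips per channel). A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope: each GDDR6/GDDR6X IC has a 32-bit interface, so
# bus width / 32 = number of chips, and chips * density = capacity.
def vram_capacity_gb(bus_width_bits: int, ic_density_gb: int = 2) -> int:
    chips = bus_width_bits // 32
    return chips * ic_density_gb

print(vram_capacity_gb(256))  # 256-bit (3070/3070 Ti class): 8 x 2GB = 16GB possible
print(vram_capacity_gb(192))  # 192-bit (4070/4070 Ti class): 6 x 2GB = 12GB
print(vram_capacity_gb(128))  # 128-bit: 4 x 2GB = 8GB
```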

It has to be said AMD face the same problem; they are the ones who have explained this issue of analogue silicon no longer shrinking.
But AMD came up with a clever solution: remove the analogue portion from the logic die. That way you can make it on a cheap older node and it wouldn't matter; what's more, you can pick and choose how many memory lanes you have, because you can "glue" them to the logic die like Lego.

For all the bluster and talk of AMD being a poor second to Nvidia...... mmmmmmm...... maybe in software, but even that's debatable.
 
I didn't say it's a good thing at all. I actually do agree with you.
But if you have two different GPUs that perform relatively similarly for similar prices, is the better card the one that stays roughly the same, or the one that could gain 10-20% more over the next couple of years? Sure, it's not 100% a good thing at the start, but the idea that the GPU will potentially last longer does make it more appealing.

Regardless, my point was less about "fine wine" and more that AMD have already lost the marketing game. Even if AMD released a product 20% cheaper with 20% more performance, they still wouldn't outsell Nvidia. They need to really go with the Ryzen Gen 1 mentality and absolutely crush the price difference. Ryzen 1 came out almost 50% cheaper than Intel's comparable top end even if it was 20% slower, and that difference really helped with market share.

Memory costs more as the sizes get larger and takes up more PCB, so I can see why they don't want to go to endless sizes; 16GB is, however, relatively cheap, and cheap enough that Nvidia could easily do it. It's not so much "why shouldn't AMD just put more than 16GB on the cheaper models" and more that it's clearly planned obsolescence on Nvidia's side: we already know some games are being bottlenecked by GPU memory, so Nvidia could create a product to fix that issue, they just don't want to.

Yeah, sorry, it wasn't aimed specifically at you, I just quoted you as you mentioned the "fine wine", it was more general than that.
And yes, if there are two GPUs the same and one will stay the same while the other improves 10-20% over the next couple of years, that'd be good. However, if the card was just 10-20% better at release you wouldn't have to wait a couple of years to get that performance (by which time the card is probably already having to turn down settings for the newer games, and the older ones you'll already be done with while it wasn't working at its full capabilities).
Now consider two cards, about the same price, one 10% better than the other as shown by every review site, but the other card will gain 20% performance over two years to end up being 10% better than the first card (by which point people have mostly stopped buying either of them). Which one will most people buy? Are you going to buy the card that's 10% worse but has the possibility of gaining more performance (or not, it's not a guarantee), or the one that's better now?

I don't disagree that more VRAM would be better, I'm just saying that to a degree it's also on the people buying the cards. If they knew when they bought it that 8GB wasn't going to be enough for them in a couple of years time, why did they buy the card? If people stop buying the cards then maybe Nvidia will stop selling them...
 
I have some sympathies for Nvidia, to a very limited extent....

When it comes to memory (or rather the analogue side of the silicon), Moore's law really is dead, and if that is what Nvidia are talking about they are not wrong. As a result the memory architecture is taking up more and more room within the chip, so Nvidia are saving die space by having fewer memory lanes on the chip; it is perhaps why we now have a 192-bit 4070/Ti, down from 256-bit on the 3070/Ti, which saves quite a bit of space. The problem with that is even if you're using 2GB memory ICs you're still only getting 12GB, so it may be more an optimisation exercise than something cynically deliberate, though I do think the 8GB 3070 and 10GB 3080 were cynical.

It has to be said AMD face the same problem; they are the ones who have explained this issue of analogue silicon no longer shrinking.
But AMD came up with a clever solution: remove the analogue portion from the logic die. That way you can make it on a cheap older node and it wouldn't matter; what's more, you can pick and choose how many memory lanes you have, because you can "glue" them to the logic die like Lego.

For all the bluster and talk of AMD being a poor second to Nvidia...... mmmmmmm...... maybe in software, but even that's debatable.

Yeah, they really do have good hardware design; it just always seems to fall short, but they are the company who keep pushing new tech and new ideas. Software-wise, as someone who has used both, AMD are far superior, at least in the standard UI; I've always liked AMD's better than Nvidia's Windows 98-looking thing.
Yeah, sorry, it wasn't aimed specifically at you, I just quoted you as you mentioned the "fine wine", it was more general than that.
And yes, if there are two GPUs the same and one will stay the same while the other improves 10-20% over the next couple of years, that'd be good. However, if the card was just 10-20% better at release you wouldn't have to wait a couple of years to get that performance (by which time the card is probably already having to turn down settings for the newer games, and the older ones you'll already be done with while it wasn't working at its full capabilities).
Now consider two cards, about the same price, one 10% better than the other as shown by every review site, but the other card will gain 20% performance over two years to end up being 10% better than the first card (by which point people have mostly stopped buying either of them). Which one will most people buy? Are you going to buy the card that's 10% worse but has the possibility of gaining more performance (or not, it's not a guarantee), or the one that's better now?

I don't disagree that more VRAM would be better, I'm just saying that to a degree it's also on the people buying the cards. If they knew when they bought it that 8GB wasn't going to be enough for them in a couple of years time, why did they buy the card? If people stop buying the cards then maybe Nvidia will stop selling them...

I don't think either of those choices would be wrong, but people do tend to want "now" rather than "later", I do agree there. We can only hope people vote with their wallets; it does appear to be working this gen, with really early price cuts on both brands, bar the top cards.
 
I don't like either of the control panels.
Nvidia's looks really dated, but I sort of like the simplicity.
AMD's looks better and less dated, although it is starting to look a little dated now (IMO). What I dislike the most about AMD's is that it looks like it was made for a smartphone or tablet with everything being ****-off great big buttons that look like I'm going to be dabbing at them with my fat fingers rather than a mouse pointer. Make it about 1/4th or 1/8th of the size and I think it would seem much more reasonable and useable.

I've not used the GeForce Experience thing so I've no idea what that's like.

Maybe what both of them need is the ability to use skins (like Afterburner) including community made ones. Maybe then everyone could find something they liked and it would take the burden off of AMD and Nvidia.
 
Look at the VRAM in the OSD on this, 120Hz. Keep your eye on the system RAM as I approach the scrapyard; the VRAM is already full, it needs more, so where does it put it? System RAM. Keep your eye on it and watch the frame rates go from 100-120 down to 30.

RTX 2070 Super.



Not only do I now know to be wary of counting "Rust" as a game that will run without any issues on a 3070, but I also now fully understand what is meant by the term "trickle-down economics" :D

Maybe I need to try out my friend's 2080 Ti over my 3070. It might help out a little. I never get beyond playing the basket game when it comes to buying anything from this latest gen.
 