
Any news on 7800 xt?

£350 for an 8GB card is hard to swallow in 2023 though. Same with the 4070's 12GB: versus the 7800 XT, in theory 4-5% slower for 50W less isn't a bad trade, but 12GB is too little for £500+ at this stage.
I'm not saying it's a good buy, just that right now it's offering better price/performance than any of AMD's new stuff aside from the 7600.
 
I was referring to value overall, because it has more VRAM, which I believe will be important at 1440p for example. Obviously value is subjective, but you are correct: judged objectively purely on price/perf, the 3060 Ti 8GB is better. But by your metric the 7600 is better again, as long as you game at 1080p.

If you are in the market for a very good 1440p GPU, the 12GB 7700 XT is the best value.
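(To make the "price/perf metric" being argued about concrete, here's a rough sketch of a £-per-FPS comparison. The function is illustrative only; the prices and FPS values in the commented example are placeholders, not benchmark results from this thread.)

```python
# Rough sketch of the price/perf metric under discussion: lower £ per FPS is
# better value, and the ranking can flip between 1080p and 1440p because each
# card's average FPS changes with resolution. All inputs are placeholders --
# substitute your own prices and your own benchmark averages.
def pounds_per_fps(price_gbp: float, avg_fps: float) -> float:
    """Cost per frame per second; lower is better value."""
    return price_gbp / avg_fps

def best_value(cards: dict[str, tuple[float, float]]) -> str:
    """cards maps name -> (price_gbp, avg_fps); returns the best-value card."""
    return min(cards, key=lambda name: pounds_per_fps(*cards[name]))

# Example shape only (hypothetical numbers, not measured results):
# best_value({"RX 7600": (260, 95), "RX 7700 XT": (430, 120)})
```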
 
I think for 1440p the minimum you should be looking at is a 7800 XT, especially if you want to hang onto the card for a while.
 
So I'm an AMD guy, but I own a 4070. Compared to AMD (which I also run in my 2nd rig, along with both machines' CPUs), you'll find the 4070 uses 3.5-4GB less VRAM than a 6800 XT in EVERY game my mate and I have compared at the exact same settings/res - whether that's games being better optimised for Nvidia than AMD, I don't know. But my point is, if my card uses 4GB less than a 16GB card, I've effectively got the same remaining VRAM headroom as that 16GB card, because it's using an extra 3.5-4GB more than me.

Even the 'allocated' amount differs by around the same margin, and neither card ever seems to come close to its limit. Pair that with how good DLSS 3.5/frame generation is, and I don't think I'd have a problem using this card for 3 years...

Regarding the wattage, a 4070 undervolts to real-world usage of 105-145W - 145W being native 1440p with RT on at max settings in Control, and around 105-125W in TLOU at native 1440p ultra :) (in the most demanding areas it'll sometimes go to 135W). So in reality that's nearly 120W less consumption. From my experience undervolting my AMD cards, I'd imagine it wouldn't be unreasonable to get a 7800 XT to sit at 180-190W, but not much less, seeing as the 6800 XT doesn't undervolt that far from its stock draw.
The 4070 IMHO has that RX 6600 XT kind of power-consumption-versus-performance character, which is why I chose it. You could get them for £520 last month as well - £40 more than a 7800 XT - so I wouldn't call it a rip-off at this point.
FWIW mine never even turns its fans on (fan stop below 65°C at stock) and it runs on average at 53-57°C at native 1440p ultra.
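(As a rough illustration of the performance-per-watt point, using the numbers quoted in this thread - the earlier "4-5% slower for 50W less" comment and the undervolted draws above. These are owner estimates and assumptions, not measured benchmarks.)

```python
# Back-of-the-envelope perf-per-watt comparison using the figures in this post:
# the 4070 is assumed ~5% slower than a 7800 XT while drawing ~130 W undervolted,
# versus an assumed ~185 W for an undervolted 7800 XT. Owner estimates only.
perf_4070, watts_4070 = 0.95, 130        # relative performance, typical undervolted draw (W)
perf_7800xt, watts_7800xt = 1.00, 185    # the post's guess at an undervolted 7800 XT

ppw_4070 = perf_4070 / watts_4070
ppw_7800xt = perf_7800xt / watts_7800xt
print(f"Undervolted 4070 perf/W is ~{ppw_4070 / ppw_7800xt:.0%} of the 7800 XT's")
# With these assumptions it works out to roughly 135%, i.e. ~35% better perf/W.
```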

So when your Nvidia card genuinely uses 3.5-4GB less VRAM than the equivalent AMD cards, it's effectively up against a 16GB card wasting an extra 3.5-4GB in all current and previous games. I've tested 40 games since buying it in July and compared them against my mate's 6800 XT, and we've yet to find a single game where his 6800 XT isn't the hungrier card for VRAM... I'd put money on a 7800 XT/7900 XT/XTX doing the same at the same res/settings...

This seems to be something people don't want to admit when banging on about having another 4GB over these cards, or the 3080, etc... And remember, I'm an AMD guy myself, but I won't be biased - hence why I gave this card a chance, or it would have gone straight back under the 14-day no-questions-asked returns policy!

I'm in no way saying this is AMD's fault, FWIW - maybe games are just written to be better optimised on Nvidia in terms of VRAM usage versus the performance achieved? Who knows, but it's a fact that this is what happens with my specific card at least.

I'll stress this point: I'm talking about actual usage while playing, not the allocation predicted in the settings menu...

So TLDR: when you're actually playing a game on my card or a 6800 XT, both cards end up with the same remaining 'free' VRAM (going by actual usage, NOT the predicted allocation in the settings menu), because the 16GB 6800 XT uses 3.5-4GB more per game than my 4070 at the same settings/res natively. A 16GB card therefore ends up with the same remaining VRAM as my 12GB 4070... which means the whole 12GB worry isn't a factor with my card versus, say, a 6700 XT...
I have tested this in pretty much every current and previous game (bar BG3, as that's not my cup of tea) and found the same pattern throughout.
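(A minimal sketch of the headroom argument above, assuming the claimed 3.5-4GB usage gap holds; the usage figure below is an example value, not benchmark data.)

```python
# Illustration of the "same free VRAM" claim: if the 16GB card really does use
# 3.5-4GB more for the same game and settings, the remaining headroom on both
# cards ends up about the same. The 8GB usage figure is just an example value.
vram_4070, vram_6800xt = 12.0, 16.0     # GB of VRAM on each card
usage_4070 = 8.0                        # GB actually in use on the 4070 (example)
extra_amd_usage = 3.75                  # GB extra used by the 6800 XT (claimed 3.5-4)

free_4070 = vram_4070 - usage_4070                          # 4.0 GB headroom
free_6800xt = vram_6800xt - (usage_4070 + extra_amd_usage)  # 4.25 GB headroom
print(f"Free VRAM: 4070 {free_4070:.2f} GB vs 6800 XT {free_6800xt:.2f} GB")
```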

I am not in any way trolling - that really IS what my card/system uses at 1440p ultra native, and I've tested it against my mate's system with a 6800 XT... I can't really be any fairer than that when it comes to actual ownership/real-world testing.
The first thing I did when I tried the card out was test all of this; if actual in-game usage had been sky high, I'd have used the 14-day returns policy, sent it packing and got a different card.
 
Is this reduced use of VRAM to do with Nvidia using more compression? Whether that is a good thing or not, I'll let others comment.
 
SAM/ReBAR increases VRAM usage, usually by about 1-2GB. Nvidia only enable it for a very select few games on their end, so even though a lot of sites will say ReBAR is on, that only refers to the BIOS setting and not the game/driver side. But yes, even with ReBAR/SAM on or off, AMD generally does use more VRAM; whether that's AMD not doing as good a job with VRAM management and/or Nvidia doing better optimisation here, well, take your pick.
 
I've genuinely no idea mate. As I say, I'm an AMD guy by preference, but I'll always try something new no matter who makes it, same as when I do engine swaps in project cars - I have zero brand loyalty if something is better or suits the use case better.
I think in reality it'd be very hard to tell by appearance if there is some kind of sneaky compression. I just put it down to most games being Nvidia-sponsored or biased towards Nvidia, and assumed the AMD cards must be less well optimised for and have a harder time VRAM-wise.

In this scenario I just wanted to build a silent, air-cooled PC that wasn't a furnace next to my leg and used sweet FA electricity, as this system sits right next to an airing cupboard/boiler, so the room doesn't need any help being a sauna. TLDR: with the efficiency of the parts in my sig, paired with undervolting, I achieved all of the above, so in THIS case the 4070 did the job. When it no longer does, it'll go in my SFF mITX 2nd rig (I know it fits, which was another bonus when choosing it). But yeah, no bias.

The bonus, though, is that the money I saved by not getting a 6950/7900 XT means that in 3 years I'll have made the 4070's cost back to put towards another GPU upgrade. My total system, including my 32" 1440p 165Hz monitor, amp and speakers, only draws 260-300W at the wall (measured with a plug-in power meter): 260W on average, 300W when I play something like Control with RT on at native 1440p max settings, and around 280W for something like RE4 Remake at native 1440p max settings with RT on.

A 6950 XT or 7900 XT would use that 260-300W (or more, in the 7900 XT's case) just on its own! I've gone from a 550-650W wall draw to 260-300W, which at my disgusting kWh rate and 6 hours of evening use every day works out at a saving of £192 a year - it soon adds up... By year 3, when I chuck it in my 2nd mITX rig, I'll have made the cost of the card back and have just shy of £600 towards a new GPU - free money. Can't really say no to that versus it being wasted on another 250-320W of GPU draw every time I power the rig on with the AMD cards I would originally have used...
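(Back-of-the-envelope check: the ~£192/year figure is consistent with roughly 300W less at the wall for 6 hours a day at a unit rate of about £0.29/kWh - the unit rate is an assumption, as it isn't stated in the post.)

```python
# Sanity check of the ~£192/year saving: ~300 W less at the wall (550-650 W
# down to 260-300 W), 6 hours per day, and an assumed unit rate of roughly
# £0.29/kWh (typical 2023 UK rate; not stated in the post).
saved_watts = 300
hours_per_day = 6
unit_rate_gbp_per_kwh = 0.29

kwh_saved_per_year = saved_watts / 1000 * hours_per_day * 365   # ≈ 657 kWh
saving_gbp_per_year = kwh_saved_per_year * unit_rate_gbp_per_kwh  # ≈ £191, close to the quoted £192
print(f"~{kwh_saved_per_year:.0f} kWh saved -> ~£{saving_gbp_per_year:.0f} per year")
```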
 
The £650 3080 gets a lot of hate over VRAM, yet after 3 years it's still putting out slightly better performance than AMD's new £500 GPU and Nvidia's near-£600 card, which is something even the legendary 1080 Ti couldn't manage.
 
Crazy - I know the 12GB 3080 trades blows with mine these days and sometimes beats it, but that's decent for a 3-year-old card!
 

Probably better VRAM management in the drivers too.

So out of interest, for the next 3-5 years do you promise that the 12GB on the RTX 4070/RX 7700 XT versus the 16GB on the RX 7800 XT will have zero effect? Because most people buying a card will be keeping it that long.

Were you not defending 8GB cards in the past? How did that pan out?
 
I know you weren't replying to me personally mate, but just so everyone knows what I meant (as I'm not the best at wording things): my point is that I don't think I'll have any trouble, the same way a 6800 XT/7800 XT won't. If they waste another 3.5-4GB where I save it, then theoretically we always have the same amount of VRAM left to use. So for the next 3 years I reckon I'll be fine, and I'll use DLSS 3.x onwards and FG as required when it chugs.
 
2 years from now all the cards in this performance bracket will only be good for 1080p in new games anyway, just like the 1080 Ti is today, so the 10GB on the 3080 will be fine.
 

Yup, no matter what your take on VRAM is, the 10GB 3080 for £650 was incredible value :cool:


I think Daniel Owen's latest video kind of summed up the whole VRAM/8GB thing quite well:


Essentially what a lot of us have been saying for a long time: most 8GB GPUs don't even have the grunt to get good performance in the first place, so a more aggressive upscaling preset or dropped settings are required, which in turn also reduces VRAM usage. Look at the 3090: 24GB, yet it's proving of no real use for gaming outside of a select few "controversial" games, or unless you want to add loads of texture packs.

Going forward though, when GPUs are costing £500+ they should have at least 16GB of VRAM now, more out of principle given the cost.
 

Yup, same way a 4090 in some UE5 titles can't even be considered a "4K" GPU now unless you use frame generation AND upscaling.
 

People said that about the RX 6700 XT 12GB/RTX 3060 12GB too - that the extra VRAM would have no impact over the RTX 3060 Ti 8GB/RTX 3070 8GB. But over the last few years there have been more and more issues. Nvidia probably do more VRAM management to some degree - even AMD admitted as much with the Fury X 4GB (against the GTX 980 Ti 6GB).

I personally wouldn't be unhappy with a 12GB card myself TBH, but my big concern is if Nvidia makes the RTX 5070 16GB, or introduces new compression tech next generation which won't work with older cards. If they don't, 12GB might be fine for a while.

The 8GB RTX 4060 cards are more of a concern to me.


Really, like when you defended 8GB cards and they fell off a cliff? My RTX 3060 Ti now has more issues, and there are cards with similar performance and more VRAM which are running better. 12GB-16GB is what I would be looking at for QHD now.

So basically you think the RTX 4070 should only have 8GB-10GB of VRAM?

So I expect to see you not upgrade your RTX 3080 10GB until September 2025. Or will this be another case where you make an excuse and just happen to upgrade to another card with more VRAM as a coincidence, while arguing until the very end that 8GB-10GB is perfectly fine?

You are literally trying to defend the RTX 4060 Ti 8GB:
I don't think it's there yet, as even a 4060 Ti priced at £350 compared to the cheapest 7700 XT at £430 is a 23% price increase for just a 14% performance increase.

Especially when the £430 RX 7700 XT cards come with a copy of Starfield, which is £60 to buy. Or when an RX 6700 XT was as low as £300 with the same game.

It gets beaten by an RTX 3060 12GB in some games because it runs out of VRAM.
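(Quick arithmetic check of the quoted figures; the 14% performance gap is taken from the quote above rather than independently verified, and the Starfield adjustment assumes the £60 game has full value to the buyer.)

```python
# Check of the quoted 4060 Ti vs 7700 XT figures: £430 vs £350 is ~23% more
# money; the ~14% performance gap is taken from the quote, not verified here.
price_4060ti, price_7700xt = 350, 430
perf_4060ti, perf_7700xt = 1.00, 1.14

price_increase = price_7700xt / price_4060ti - 1           # ≈ 0.23 (23%)
cost_per_perf_4060ti = price_4060ti / perf_4060ti          # £350 per perf unit
cost_per_perf_7700xt = price_7700xt / perf_7700xt          # ≈ £377 per perf unit

# If the bundled Starfield code is worth the full £60 to you, the effective
# price per unit of performance drops below the 4060 Ti's:
effective_7700xt = (price_7700xt - 60) / perf_7700xt       # ≈ £325 per perf unit
print(f"{price_increase:.0%} more money, "
      f"£{cost_per_perf_4060ti:.0f} vs £{cost_per_perf_7700xt:.0f} per perf unit "
      f"(£{effective_7700xt:.0f} counting the game)")
```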
 

That's always such a silly statement, and I never get why people insinuate that anyone upgrading from an 8/10GB GPU is doing it solely for VRAM :confused: You do realise more 3090 owners have jumped to a 4090 than 3070/3080 users have... People upgrade to newer cards mainly for the extra performance, particularly the 4090 because of the huge lead it has over every other card this gen. If people just wanted more VRAM, why aren't we seeing more of them making do with a 7800 XT/7900 XT or a second-hand 3090?
 
Yeah, I think I'll personally be fine. Factors/research like this, along with the performance per watt when undervolted and the impressive DLSS 3.x/FG/RT the 4070 can do, were what sold me on it. At worst I end up running 1440p medium settings with DLSS 3/4 (or whatever comes out in 2-3 years - and as with the 3000 series supporting DLSS 3.x, I can't see DLSS 4 not working on mine when it comes out). By that point I'll have a whisker under £600 towards another upgrade, i.e. the card will have refunded itself, given me 3 years of free service, and it can go and live in my mITX SFF 2nd rig. I literally can't lose in my situation, can I :)

I just think it's interesting that people never mention this 3.5-4GB extra that rival cards use compared to mine. It makes me think maybe the same happens with the higher-tier cards and their rivals as well? I can't be bothered to look into it, but I wouldn't be surprised.

Either way, it means I currently have the same remaining VRAM headroom as the 16GB AMD card that rivals mine... without even using DLSS/FG etc.
 
I think the more concerning issue for PC gaming right now is that CPUs aren't keeping pace. A 4090 is already bottlenecked at 4K in some instances even on the latest CPUs, so next gen, when 4090-level performance is considered a 1440p tier and is more affordable, the issue will be even more prevalent - especially if the new CPUs are only 20% faster by then. Frame-generation technologies help out in this area, but they're not everyone's cup of tea given the latency penalties they bring.
 

Yeah, Nvidia really need to sort out their driver overhead as well as give ReBAR more attention - at the very least enabling it at driver level so we don't have to test and enable it ourselves. They're leaving a lot of performance on the table here.

I'm very tempted to get a 4090, with CP2077 (which I love) coming up soon and some good games like Avatar on the way, but it's a bit off-putting when you see a 4090 ******** the bed in some of these newest UE5 titles :o
 
Nvidia tends to be heavier on the CPU than AMD. In any case, if you're spending that kind of money you'd be expected to have a top-of-the-line system to support it.
 