Any news on the 7800 XT?

OCUK is in full scalping mode, with the 7800 XT Pulse now £30 above MSRP. What happened to the prices very close to MSRP this week that you promised, @Gibbo?

I would buy one, but not for this price. I'm probably going to pre-order for almost MSRP somewhere else instead. It's a shame, I would have preferred to buy here.
 
You are into cars, so let’s use a car analogy:

Car A has a fuel tank capacity of 25 litres.

Car B has a fuel tank capacity of 50 litres.

Both have the same fuel consumption and have to do a 100-mile trip; neither is going to run out of fuel. Now make them do a 500-mile trip and Car B will make it, but Car A will have to refuel or be left on the roadside waiting for a tow truck.
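To put rough numbers on the analogy, here's a minimal sketch; the miles-per-litre figure is an illustrative assumption, not something stated in the thread:

```python
# Worked version of the fuel-tank analogy. Consumption is assumed equal
# for both cars at 10 miles per litre (an illustrative figure only).
MILES_PER_LITRE = 10

def range_miles(tank_litres: float) -> float:
    """Maximum distance on one full tank."""
    return tank_litres * MILES_PER_LITRE

for name, tank in (("Car A", 25), ("Car B", 50)):
    for trip in (100, 500):
        outcome = "makes it" if range_miles(tank) >= trip else "must refuel"
        print(f"{name} ({tank} L): {trip}-mile trip -> {outcome}")
# Both manage 100 miles; only Car B (500-mile range) manages 500.
```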

Except in my situation my car has a smaller fuel tank but does better mpg, so it can still do the distance/trip the bigger-tank car does at worse mpg? Neither car runs out of fuel, and both get it done?
That's what I'm getting at. Mine is no worse off; they're equal.
 
You realise what an analogy is, right? It is used to get a point across. Assume they have the exact same performance and fuel consumption.

Go check the 4060 Ti 8GB vs 16GB reviews to see my point. Even better, go look at the tech videos Tommyboy pointed you to earlier.
 
You realise what an analogy is, right? It is used to get a point across. Assume they have the exact same performance and fuel consumption.

Go check the 4060 Ti 8GB vs 16GB reviews to see my point. Even better, go look at the tech videos Tommyboy pointed you to earlier.
So why use it, if it's not relevant?

In my case, both cards end up with the same physical remaining memory whilst achieving the same performance/settings/res natively. So your analogy wasn't correct for my point.

I don't get why you're relying on me looking at other people's reviews when I own the hardware and the games myself, have physically seen it, and have measured the usage against my mate's 6800 XT. It's a fact that we both end up achieving the same fps/settings/res, and a fact that his card uses 3.5-4GB more VRAM in every one of the 30-40 games we've tested, while both cards end up with the same unused amount of VRAM left. I.e. both cards are more than capable and should last as long as each other, given the way each card does its thing in games. That's all I'm getting across here: it doesn't matter in reality with my card, as it's physically using less but achieving the same results, and it ends up with the same remaining VRAM in the tank, so the headroom for the future is the same.

Why would I need to watch some clickbait, revenue-driven moan-fest video about a gimped 4060 Ti?
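For what it's worth, this kind of side-by-side logging is easy to script rather than eyeball. A minimal sketch using the pynvml bindings (NVIDIA only; the device index, duration, and interval are arbitrary choices, and note the caveat raised later in the thread: NVML reports memory the driver has allocated, not what a game actively needs):

```python
# Sample allocated VRAM once per second while a game runs.
# Requires the NVML bindings: pip install nvidia-ml-py
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo)

def sample_vram(seconds: int = 60, interval: float = 1.0) -> list[int]:
    """Return a list of allocated-VRAM readings in bytes."""
    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
        samples = []
        for _ in range(int(seconds / interval)):
            samples.append(nvmlDeviceGetMemoryInfo(handle).used)
            time.sleep(interval)
        return samples
    finally:
        nvmlShutdown()

if __name__ == "__main__":
    used = sample_vram(seconds=30)
    print(f"peak allocated: {max(used) / 2**30:.1f} GiB")
```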
 
You're missing my point: my 12GB card has the same amount of VRAM left as a rival 16GB AMD card at the same native 1440p graphics quality/fps, so the whole 'you'll run out of VRAM' thing isn't an issue for my card if a 16GB 6800 XT has the same amount remaining with its heavier VRAM usage to achieve the same quality/res/settings.
You were making a point? ;)

I didn't say anything about running out of VRAM, so IDK why that's come up; that sort of thing is pretty rare with modern memory management techniques.
I'll stress this point: I'm talking about actual usage when playing, not the allocation predicted in the settings menu...
I know; again, I said nothing about supposed actual usage vs what it says in the settings, and I say supposed because it's highly unlikely that whatever tool you're using to see VRAM usage is reporting what's actually being used; it's probably reporting what's being allocated.

GPUs are pretty much black boxes; AFAIK even with CUDA you still only get a high-level understanding of what the GPU is doing. Unless you have access to the sort of tools Nvidia/AMD use, there's no way to know why an Nvidia card seemingly uses less memory to achieve the same result (that's even if it does; without low-level access you can't know whether one card is using spare cycles to decompress, what the memory is being used for, or basically anything). Heck, I'd be surprised if Nvidia fully understands AMD cards and vice versa.
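As a concrete illustration of the allocated-vs-used split, here is a small sketch using PyTorch's CUDA caching allocator. Game engines aren't PyTorch, but the same principle applies: external tools see what's reserved from the driver, not what is actively needed:

```python
# "Allocated" vs "actually in use": PyTorch's caching allocator keeps
# freed blocks reserved, so driver-level tools still count them as used.
import torch

assert torch.cuda.is_available()
x = torch.empty(256 * 2**20, dtype=torch.uint8, device="cuda")  # 256 MiB
del x  # the tensor is gone, but the allocator keeps the block cached

in_use = torch.cuda.memory_allocated()   # bytes backing live tensors: ~0
reserved = torch.cuda.memory_reserved()  # bytes held from the driver: ~256 MiB
print(f"in use: {in_use / 2**20:.0f} MiB, reserved: {reserved / 2**20:.0f} MiB")
# Afterburner-style tools report something closer to the reserved figure.
```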
 
You were making a point? ;)

I didn't say anything about running out of VRAM, so IDK why that's come up; that sort of thing is pretty rare with modern memory management techniques.

I know; again, I said nothing about supposed actual usage vs what it says in the settings, and I say supposed because it's highly unlikely that whatever tool you're using to see VRAM usage is reporting what's actually being used; it's probably reporting what's being allocated.

GPUs are pretty much black boxes; AFAIK even with CUDA you still only get a high-level understanding of what the GPU is doing. Unless you have access to the sort of tools Nvidia/AMD use, there's no way to know why an Nvidia card seemingly uses less memory to achieve the same result (that's even if it does; without low-level access you can't know whether one card is using spare cycles to decompress, what the memory is being used for, or basically anything). Heck, I'd be surprised if Nvidia fully understands AMD cards and vice versa.
So Adrenalin/Nvidia Experience/MSI Afterburner/insert brand's own software bundle is lying then, along with the readouts/graphs you can get from the in-game console? They're all lying, yeah?

Are all the temperature sensors also lying? And the wattage draw reported in software, which isn't far off what a physical plug-in power meter/smart meter measures? All lying?

And the game allocating VRAM based on all these readings, just like the monitoring software does? The game's lying too, yeah?

So if that's the case, why would I believe any of these YouTube reviews/rants when they use the same monitoring software?
 
You’ve decided you are right and everyone else is wrong. The videos I am referring to are not moan-fest clickbait; they are comparisons run with identical settings under controlled conditions. Have you ever watched a technical video about cars?

The reason I mentioned the 4060 Ti is that the two versions are identical apart from VRAM, so any issues relating to VRAM limits will be purely down to the VRAM and no other factor.
 
Except in my situation my car has a smaller fuel tank but does better mpg, so it can still do the distance/trip the bigger-tank car does at worse mpg? Neither car runs out of fuel, and both get it done?
That's what I'm getting at. Mine is no worse off; they're equal.

NVIDIA cards are not more efficient with VRAM; once 12GB becomes a hard limit, your card will not perform as well as a card with more VRAM. Of course, it's highly unlikely you'll still have your card when 12GB becomes a problem, because 8GB is only just becoming an issue in some poorly optimised recent releases. If you're in it for the long haul, a card with more VRAM is a much more sensible purchase.

I'm not sure why you're looking for validation for your purchase. You've bought a very good GPU with arguably the most complete feature set; you shouldn't have issues for years to come. Enjoy it.
 
You’ve decided you are right and everyone else is wrong. The videos I am referring to are not moan-fest clickbait; they are comparisons run with identical settings under controlled conditions. Have you ever watched a technical video about cars?
No, you are telling me to disregard actual real-life experience of owning the games/hardware and to rely on some random kid's YouTube video, to make your opinion of something you don't own fit your argument...
 
Interesting take, though basically you are trying to do some anti-AMD trolling. The thing is, your attempt was quite amateurish, truth be told. Here’s how I would take a well-aimed dig at AMD on this issue.

Way to go, AMD: your idiotic pricing of the 7700 XT failed, and you were told about this when the prices were announced.

So now for my theory on this price drop. Why? Because your upsell to the 7800 XT worked too well and there are none available anywhere near MSRP. People don’t want a 7700 XT, but they can’t get a 7800 XT.

Hmm, actually, maybe AMD executed this perfectly. Upsell the early adopters so the 7800 XT sells out, reduce the price of the 7700 XT, and then they sell like hotcakes. £430 for a GPU faster than a 2080 Ti, plus a Starfield bundle.

Hairy PC bloke coined the phrase on the 11700KF.

I just bought one for £140.

It’s all about the price.
 
By next year, when the 7700 XT is going for £350, it’ll be considered the best-value GPU on the market.

Exactly.

I don’t hold animosity toward AMD for launch prices. In fact I don’t hold animosity to any launch price of anything.

Moving to a smaller process node costs millions. Designing a new PCB also costs a fortune, then there are production costs, logistics, etc. It’s not a cheap outing.

The 7700XT is a great card at the wrong price. AMD know this, we know this. It’s a huge uplift from the card it replaces.
 
NVIDIA cards are not more efficient with VRAM; once 12GB becomes a hard limit, your card will not perform as well as a card with more VRAM. Of course, it's highly unlikely you'll still have your card when 12GB becomes a problem, because 8GB is only just becoming an issue in some poorly optimised recent releases. If you're in it for the long haul, a card with more VRAM is a much more sensible purchase.

I'm not sure why you're looking for validation for your purchase. You've bought a very good GPU with arguably the most complete feature set; you shouldn't have issues for years to come. Enjoy it.
No mate; as I said in my original post, it's paid for itself in 3 years, given me a whisker under £600 back, and it will live in my SFF second rig and replace the AMD card in there at that point.

I wasn't looking for validation; I was just making a point that no one seems to mention despite slating lower-VRAM cards like mine: the actual utilisation, versus what's predicted/allocated in the game settings, ends up being more efficient than on the rival, thirstier, less optimised 6800 XT. At the end of it we both, ironically, end up with the same amount of VRAM, so neither card should have a problem for a few years. I've tested this in nearly 40 games, new and slightly older, with a friend who has pretty much the same game library, so it isn't a fluke. Nor was it using DLSS/RT.

I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some for the RT and DLSS 3.5/FG/Super low latency mode etc.

I'm just genuinely shocked no one said, "Oh, that's decent; at least you're not worrying about how much it'll use. The 4070 must be better optimised/games must be written or optimised better for Nvidia (out of admitted bias/sponsoring), so that negates the lesser VRAM."

There was literally zero bias/argument intended; it just sounded like people were confusing actual usage with what the game 'allocates/predicts' in the settings menu versus what you actually use. Then to say all the readouts are wrong, yet point me at a video using the same readouts/programs, is ridiculous. That's like saying all the sensors/temps are also lying. Then why have them!

If anything it came across as biased against me, versus someone saying, "Oh, in real-life usage that's interesting; you shouldn't have a problem due to how well optimised that appears to be" :)
 
No, you are telling me to disregard actual real-life experience of owning the games/hardware and to rely on some random kid's YouTube video, to make your opinion of something you don't own fit your argument...

Nope; you asked a question and refuse to accept the answer, and it isn’t because we aren’t explaining it right. It’s because you stubbornly think you know best. Dunning-Kruger at its finest.
 
Nope; you asked a question and refuse to accept the answer, and it isn’t because we aren’t explaining it right. It’s because you stubbornly think you know best. Dunning-Kruger at its finest.
A question you couldn't answer, so you said to just look at random YouTubers' reviews of a GPU I don't own... that's completely irrelevant, versus answering it with your own experience/knowledge? Which it now comes across as though you don't have?

Instead, you could have just said, "Oh, that works out well for you; you won't run out of VRAM then, judging by your actual usage of the hardware you own, and if both your card and the rival end up with the same left in the tank while delivering the same performance, you're both winning." But it seems you won't accept that?

I even said I'd happily provide screenshots of settings/performance to show anyone, and you declined and said no, look at some random's YouTube videos? I know what my own hardware does, strangely enough, as I own it and have tested it.
 
No mate; as I said in my original post, it's paid for itself in 3 years, given me a whisker under £600 back, and it will live in my SFF second rig and replace the AMD card in there at that point.

I wasn't looking for validation; I was just making a point that no one seems to mention despite slating lower-VRAM cards like mine: the actual utilisation, versus what's predicted/allocated in the game settings, ends up being more efficient than on the rival, thirstier, less optimised 6800 XT. At the end of it we both, ironically, end up with the same amount of VRAM, so neither card should have a problem for a few years. I've tested this in nearly 40 games, new and slightly older, with a friend who has pretty much the same game library, so it isn't a fluke. Nor was it using DLSS/RT.

I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some for the RT and DLSS 3.5/FG/Super low latency mode etc.

I'm just genuinely shocked no one said, "Oh, that's decent; at least you're not worrying about how much it'll use. The 4070 must be better optimised/games must be written or optimised better for Nvidia (out of admitted bias/sponsoring), so that negates the lesser VRAM."
I think you may have missed the context of the 4070 being a 4060 (based on the underlying chip) priced like a 4080. As such, people expect a decent amount of VRAM. Throw large textures at it that a 16GB card would cope with and you will see issues, but that depends on your expectations/uses. It was just the least-worst option from Nvidia, in the same way as the 7800 XT: not perfect, but not as bad as some other options.
 
I think you may have missed the context of the 4070 being a 4060 (based on the underlying chip) priced like a 4080. As such, people expect a decent amount of VRAM. Throw large textures at it that a 16GB card would cope with and you will see issues, but that depends on your expectations/uses. It was just the least-worst option from Nvidia, in the same way as the 7800 XT: not perfect, but not as bad as some other options.
I don't care about those pathetic name-war arguments; otherwise we can go that route with the 7900 XT/7800 XT, blah blah blah. I just care about it doing what I want at X res/settings/wattage, mate.

As I said above: "I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some for the RT and DLSS 3.5/FG/Super low latency mode etc.

I'm just genuinely shocked no one said, 'Oh, that's decent; at least you're not worrying about how much it'll use. The 4070 must be better optimised/games must be written or optimised better for Nvidia (out of admitted bias/sponsoring), so that negates the lesser VRAM.'"
 
The reality is many here upgrade very quickly, but if you don't, 12GB-16GB of VRAM will give you longer service than an 8GB/10GB card of similar power. So I would always take an RTX 4070 12GB/RX 7800 XT 16GB over an RTX 3080 10GB. There's also the fact that driver support and optimisations will continue for longer on newer-generation cards.

So you need to ask whether the RTX 3080 10GB will have more issues than the RTX 4070 12GB/RX 7800 XT 16GB in, say, 2-3 years. I would think it probably will.

Just like someone buying an RTX 3060 Ti 8GB in 2020 would have had a few years of decent service, buying an RTX 4060 Ti 8GB now won't mean the same lifespan. Things move on.

I upgrade only when I absolutely have to.

My 2080 Ti was unable to run the game I wanted to play at relatively moderate settings, so it had to go. I’d had it for over 3 years.

A 6700 XT replaced my other, failed 2080 Ti, which I’d had even longer, and so on.

So it’s every 3 years for me regardless of when they launched etc.
 
A question you couldn't answer.

I answered perfectly fine; I also gave advice on where to find a video that demonstrates how hitting VRAM limits manifests.

The problem is you just don’t seem to understand the answer. VRAM allocation =/= VRAM needed. When the VRAM needed exceeds the VRAM available, the min FPS drops like a brick. The fact “your” GPU has not hit such limits does not mean those limits don’t exist.
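A toy model of that cliff; every number below is an illustrative assumption, not a benchmark:

```python
# Once the working set exceeds VRAM, the overflow has to be shuffled over
# PCIe each frame and frame time balloons. All numbers are illustrative.
VRAM_GB = 12
PCIE_GB_PER_S = 25      # assumed effective transfer rate
BASE_FRAME_MS = 10.0    # assumed render time when everything fits

def frame_time_ms(working_set_gb: float) -> float:
    overflow_gb = max(0.0, working_set_gb - VRAM_GB)
    return BASE_FRAME_MS + overflow_gb / PCIE_GB_PER_S * 1000

for ws in (10, 12, 13, 14):
    ms = frame_time_ms(ws)
    print(f"working set {ws} GB -> {ms:.0f} ms/frame ({1000 / ms:.0f} fps)")
# 10-12 GB: 100 fps. 13 GB: 20 fps. 14 GB: 11 fps. The drop is a cliff,
# which is why minimum FPS collapses rather than degrading gently.
```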
 
No mate; as I said in my original post, it's paid for itself in 3 years, given me a whisker under £600 back, and it will live in my SFF second rig and replace the AMD card in there at that point.

I wasn't looking for validation; I was just making a point that no one seems to mention despite slating lower-VRAM cards like mine: the actual utilisation, versus what's predicted/allocated in the game settings, ends up being more efficient than on the rival, thirstier, less optimised 6800 XT. At the end of it we both, ironically, end up with the same amount of VRAM, so neither card should have a problem for a few years. I've tested this in nearly 40 games, new and slightly older, with a friend who has pretty much the same game library, so it isn't a fluke. Nor was it using DLSS/RT.

I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some for the RT and DLSS 3.5/FG/Super low latency mode etc.

I'm just genuinely shocked no one said, "Oh, that's decent; at least you're not worrying about how much it'll use. The 4070 must be better optimised/games must be written or optimised better for Nvidia (out of admitted bias/sponsoring), so that negates the lesser VRAM."

There was literally zero bias/argument intended; it just sounded like people were confusing actual usage with what the game 'allocates/predicts' in the settings menu versus what you actually use. Then to say all the readouts are wrong, yet point me at a video using the same readouts/programs, is ridiculous. That's like saying all the sensors/temps are also lying. Then why have them!

If anything it came across as biased against me, versus someone saying, "Oh, in real-life usage that's interesting; you shouldn't have a problem due to how well optimised that appears to be" :)

Your card isn't more efficient; it uses less and caches less because it has less to work with. This isn't an issue as it produces the same results; it only becomes an issue if you ever reach the hard limit within the card's lifespan.

I bought an 8GB RX 480 in 2016; to this day it is still usable and in use, but the 4GB version is now effectively obsolete.

It all depends on how long you keep your cards.
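To illustrate the 'uses less and caches less' point, here's a toy LRU texture cache; the class, sizes, and hit counts are hypothetical, but they show why a bigger VRAM budget caches more without being 'less efficient':

```python
# Toy model: engines keep recently used assets resident until a VRAM
# budget is hit. A smaller budget evicts more, so it "uses less" VRAM
# while doing the same work -- until the working set no longer fits.
from collections import OrderedDict

class TextureCache:
    """LRU cache with a VRAM budget in MiB (hypothetical model)."""
    def __init__(self, budget_mib: int):
        self.budget = budget_mib
        self.resident = OrderedDict()  # texture name -> size in MiB
        self.used = 0

    def request(self, name: str, size_mib: int) -> bool:
        """True on a cache hit; False if the texture had to be (re)loaded."""
        if name in self.resident:
            self.resident.move_to_end(name)
            return True
        while self.used + size_mib > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[name] = size_mib
        self.used += size_mib
        return False

scene = [(f"tex{i}", 512) for i in range(28)]  # ~14 GiB of assets
for budget_gib in (12, 16):
    cache = TextureCache(budget_gib * 1024)
    hits = sum(cache.request(n, s) for n, s in scene * 2)  # two passes
    print(f"{budget_gib} GiB budget: {hits} hits, "
          f"{cache.used / 1024:.1f} GiB resident")
# The 16 GiB card holds the whole scene and hits on the second pass;
# the 12 GiB card shows less VRAM "used" but reloads constantly.
```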
 
I don't care about those pathetic name-war arguments; otherwise we can go that route with the 7900 XT/7800 XT, blah blah blah. I just care about it doing what I want at X res/settings/wattage, mate.

As I said above: "...Nor was it using DLSS/RT.

I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some for the RT and DLSS 3.5/FG/Super low latency mode etc.

I'm just genuinely shocked no one said, 'Oh, that's decent; at least you're not worrying about how much it'll use. The 4070 must be better optimised/games must be written or optimised better for Nvidia (out of admitted bias/sponsoring), so that negates the lesser VRAM.'"
The name thing is insignificant; however, it is literally a more cut-down chip. It's like renaming a Fiesta as a Mondeo, upping the price by ten grand, and telling you, "it's fine, it's just a name, it still does 70mph". AMD did the same thing, so no bias there ;)
 