Any news on the 7800 XT?

So Adrenalin/Nvidia Experience/MSI Afterburner/insert any brand's own software bundle is lying then, along with being able to go into the console in games and also get readouts/graphs? They're all lying, yeah?
No, they're probably reporting allocated not actual usage, and they're not lying. You've just not read what it is those applications are actually reporting.
Are all the temp sensors also lying? And the wattage draw in software, which isn't far behind what's measured by a physical plug-in power meter/smart meter? All lying?

And the game allocating VRAM based on all these readings, just like the monitoring software does? The game's lying too, yeah?

So if that's the case, why would I believe any of these YouTube reviews/rants when they use the same monitoring software?
What does any of that have to do with the price of eggs? You're obviously getting angry because you don't understand how GPUs/software work, but taking that out on people who are trying to help you understand isn't very constructive.
 
The name thing is insignificant; however, it is literally a more cut-down chip. It's like renaming a Fiesta to a Mondeo, upping the price ten grand and telling you, "it's fine, it's just a name, it still does 70mph". AMD did the same thing, so no bias there ;)
Exactly, so it doesn't matter as everyone does it. What matters is the actual performance achieved, which is on par with a 6800 XT (which is what I wanted) but at much lower wattage/temps, which I achieved. As for the price, it was £520 elsewhere last month, which is £40 more than the RRP of the 7800 XT... so hardly a 'rip off' or major price hike? And as people have just mentioned on this thread, OCUK has just jacked the 7800 XT up £30 over RRP, so it's only a tenner cheaper than mine was last month. Hardly a rip off now, is it?
 
Your card isn't more efficient, it uses less and caches less because it has less to work with. This isn't an issue as it produces the same results, it only becomes an issue if you ever reach the hard limit in the card's lifespan.

I bought an 8GB RX480 in 2016, to this day it is still usable and in use, but the 4GB version is now effectively obsolete.

It all depends on how long you keep your cards.
In terms of wattage vs performance at the 105-145W it runs at, it is hugely efficient. As previously stated, that was a major factor, along with the fact that it never turns the fan on, bar in Control where it touches the 65°C fan-on point when using RT at 1440p native maxed out. The rest of the time, as mentioned, it runs at 53-57°C at 1440p ultra native.
 
No, they're probably reporting allocated not actual usage, and they're not lying. You've just not read what it is those applications are actually reporting.

What does any of that have to do with the price of eggs? You're obviously getting angry because you don't understand how GPUs/software work, but taking that out on people who are trying to help you understand isn't very constructive.
You're clutching at straws now. They use the same software, which you say doesn't work, then say it does when it suits your YouTube narrative.
I'm pretty sure I can read the difference between on-the-fly usage and what's allocated and remaining.
 
Exactly, so it doesn't matter as everyone does it. What matters is the actual performance achieved, which is on par with a 6800 XT (which is what I wanted) but at much lower wattage/temps, which I achieved. As for the price, it was £520 elsewhere last month, which is £40 more than the RRP of the 7800 XT... so hardly a 'rip off' or major price hike? And as people have just mentioned on this thread, OCUK has just jacked the 7800 XT up £30 over RRP, so it's only a tenner cheaper than mine was last month. Hardly a rip off now, is it?
It's a thing this generation, hence I bought a 6950 XT, better value with a full-fat chip. I'll skip this round and see what happens on the next one. A small undervolt works wonders, as does capping the framerate to your monitor; no point rendering more than your screen can display ;) At current prices I'd go for the 7800 XT, but then I don't really care about RT until it becomes the default. Ultimately if you're happy, I'm happy; you pay your money and make your choice. At the time a last-gen flagship was a better deal for me, at a similar price to the 4070.
 
It's a thing this generation, hence I bought a 6950 XT, better value with a full-fat chip. I'll skip this round and see what happens on the next one. A small undervolt works wonders, as does capping the framerate to your monitor; no point rendering more than your screen can display ;) At current prices I'd go for the 7800 XT, but then I don't really care about RT until it becomes the default. Ultimately if you're happy, I'm happy; you pay your money and make your choice. At the time a last-gen flagship was a better deal for me, at a similar price to the 4070.
Yeah, whatever works for your intended usage. Mine is a literally silent, cool-running PC that, including a monitor/amp/speakers, uses the same or less power than a 6950/7900 XT on its own (both of which I heavily considered before getting the 4070). But at the end of the day, I like to game 6 hours an evening, every day, which at the rate I pay per kWh works out at about £192 a year...
So when my entire setup draws 260W at the wall on a proper wall-mounted wattage meter, or 280-300W in RT games, there's no way I'd think it was a good idea to pair a 6800 XT with my hardware and pull 500-650W at the wall, i.e. literally throwing away double that £192 a year to run the system?
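To put rough numbers on that (just a ballpark sketch: the ~31p/kWh unit rate and the 280W vs 560W comparison are my assumptions, not quoted figures):

```python
# Ballpark yearly electricity cost for a PC gamed on 6 hours a day.
# The 31p/kWh unit rate is an assumed figure, not an actual tariff.
HOURS_PER_DAY = 6
DAYS_PER_YEAR = 365
RATE_GBP_PER_KWH = 0.31

def yearly_cost(watts_at_wall: float) -> float:
    """Annual cost in GBP for a constant draw of `watts_at_wall` during gaming hours."""
    kwh_per_year = watts_at_wall / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * RATE_GBP_PER_KWH

print(f"~280W setup: £{yearly_cost(280):.0f}/year")                             # ≈ £190
print(f"~560W setup: £{yearly_cost(560):.0f}/year")                             # ≈ £380
print(f"Extra over 3 years: £{3 * (yearly_cost(560) - yearly_cost(280)):.0f}")  # ≈ £570
```

So roughly £190 a year for a ~280W setup at those hours, and getting on for double that if the whole system pulled twice the power, which is where the 'just under £600 over 3 years' figure below comes from.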

For what? The same fps/settings as a 6800 XT but without RT/DLSS 3.5/FG/SLL etc... Pointless? In 3 years, when I upgrade it with the money saved, my GPU has effectively cost me nothing and given me just under £600 back in money that would otherwise have been wasted powering a 550-650W system with a rival card. I may love AMD, but I'm sure you'd agree that throwing away £600-ish to run one for the same performance with fewer features for 3 years is ridiculous, no?

It just seems mental: we know full well how blinkered the Nvidia shills are, but hearing this from fellow AMD people shocked me. It's like the new fanboy trend this year is AMD guys hating on people? I just don't get why I can't like/own both, like I do in two different rigs and have done all my life, same as with cars/parts/brands.

I guarantee that if I'd made these points about an AMD card, everyone would just nod.
 
You're clutching at straws now. They use the same software, which you say doesn't work, then does.
I'm pretty sure I can read the difference between on-the-fly usage and what's allocated and remaining.
I'm not sure what your problem is. People have tried to explain in the simplest terms possible how and why your anecdotal comparison is meaningless, and even why it would be meaningless in a lab environment: you're basically trying to compare apples with oranges when you're 100 miles away, it's the middle of the night, and you're blind in one eye. People have been more than courteous to you and have spent time trying to educate you, but for some reason you seem to be purposefully antagonising people.

Whoever mentioned the Dunning–Kruger effect was spot on; you don't even understand that thermal diodes and/or registers are entirely different from memory, or how the software works.
 
In terms of wattage vs performance at the 105-145W it runs at, it is hugely efficient. As previously stated, that was a major factor, along with the fact that it never turns the fan on, bar in Control where it touches the 65°C fan-on point when using RT at 1440p native maxed out. The rest of the time, as mentioned, it runs at 53-57°C at 1440p ultra native.

We were talking about your comments on your GPU being more efficient with VRAM.

I've already confirmed that you have bought a very good GPU. It wouldn't be my choice as, going back over a decade now, ATI/AMD have always provided me a better option at a lower price point. This is of course relative to my budget and usage.
 
I'm not sure what your problem is. People have tried to explain in the simplest terms possible how and why your anecdotal comparison is meaningless, and even why it would be meaningless in a lab environment: you're basically trying to compare apples with oranges when you're 100 miles away, it's the middle of the night, and you're blind in one eye. People have been more than courteous to you and have spent time trying to educate you, but for some reason you seem to be purposefully antagonising people.

Whoever mentioned the Dunning–Kruger effect was spot on; you don't even understand that thermal diodes and/or registers are entirely different from memory, or how the software works.
I don't get how it's meaningless if I achieve the same res/visual quality/fps as something that uses more VRAM to do so, and end up with the same amount left in reserve; both cards will last just as long and each has its own way of achieving the same performance. Why is that something to slate? I got what I wanted at my choice of res/fps/texture quality and settings, without DLSS, and match a card that would eat what my entire system uses.
Why would I want to set fire to twice the electricity bill every year, when that money could go on a new GPU in 3 years, effectively netting me a free GPU when this one goes into my second rig as a free upgrade?
 
We were talking about your comments on your GPU being more efficient with VRAM.

I've already confirmed that you have bought a very good GPU. It wouldn't be my choice as, going back over a decade now, ATI/AMD have always provided me a better option at a lower price point. This is of course relative to my budget and usage.
I just don't understand how, if I achieve the same performance/fps/res/settings as a rival, more power-hungry card, that's a bad thing, or how it's "not doing what people think" when I'm using it and it is giving me the performance/visuals that match my mate's 6800 XT. If something does, say, 60fps at 1440p ultra with no DLSS/FSR, then it does that; and if it does that and leaves you with the same amount of VRAM unused as a rival card, then you're both doing fine for a couple of years of gaming, aren't you? That was all I was saying, man.
 
I just don't understand how, if I achieve the same performance/fps/res/settings as a rival, more power-hungry card, that's a bad thing, or how it's "not doing what people think" when I'm using it and it is giving me the performance/visuals that match my mate's 6800 XT. If something does, say, 60fps at 1440p ultra with no DLSS/FSR, then it does that; and if it does that and leaves you with the same amount of VRAM unused as a rival card, then you're both doing fine for a couple of years of gaming, aren't you? That was all I was saying, man.

I never said it was a bad thing, I just tried to explain the VRAM situation as you're hung up on this "unused VRAM" malarkey.

I think you've got about three different conversations going with three different posters, so I'll bow out.

Like I said, enjoy your GPU and stop worrying about what others think of your purchase.
 
I never said it was a bad thing, I just tried to explain the VRAM situation as you're hung up on this "unused VRAM" malarkey.

I think you've got about three different conversations going with three different posters, so I'll bow out.

Like I said, enjoy your GPU and stop worrying about what others think of your purchase.
Fair enough mate, I apologise if anything came across the wrong way.
I like that you don't have a bias; that was the point I was trying to make, as I don't either. I just hate how people jump on the bandwagon and think you're only allowed to like one brand, when in reality both make good hardware (depending on the time and what's on sale).
I merely mentioned it because, when the 4070 came up and given it was only £520 when I last checked a month ago, I thought that for £40 more it might be worth considering based on my actual usage compared to a 6800 XT, as I'd like to think most people aren't biased...
 
The only way to be 100% sure about VRAM usage is to run actual GPU debugging/profiling software, the kind game developers use. It would be a lot of work.

However, some people go to Windows Task Manager and get worried that 90% of memory is used. Well, any system should cache as much as it can. It's like someone looking at a server and complaining that physical memory usage on this or that box is at 95%; sorry, but I would expect a database server to cache as much as possible. Once it has read a file, it should only evict it from main memory if something more important comes along.
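To illustrate the allocated-versus-actually-used point from earlier in the thread, this is roughly the sort of query the monitoring overlays make (a minimal sketch, assuming an NVIDIA card and the nvidia-ml-py/pynvml package; AMD tools read equivalent driver counters). The number it reports is memory the driver has reserved for running processes, not what a game actively touches each frame:

```python
# Minimal sketch: read VRAM figures the way monitoring overlays typically do, via NVML.
# Requires an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py).
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    # 'used' is memory the driver has allocated/reserved across all processes,
    # not the working set a game is actually reading and writing each frame.
    print(f"total:     {mem.total / 2**20:.0f} MiB")
    print(f"allocated: {mem.used / 2**20:.0f} MiB  (what overlays show as 'used')")
    print(f"free:      {mem.free / 2**20:.0f} MiB")
finally:
    nvmlShutdown()
```

Per-frame, per-resource usage is only visible in the profilers mentioned above (the tools game developers use), which is why overlay numbers from two different cards aren't directly comparable.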
 
OCUK is in full scalping mode with the 7800XT Pulse now £30 above MSRP. What happened to prices very close to MSRP this week that you promised, @Gibbo?

I would buy one, but not for this price. I'm probably going to pre-order for almost MSRP somewhere else instead. It's a shame, I would have preferred to buy here.
I was looking yesterday at buying and I feel like today it's gone up £10? That's annoying!
 