
7800XT is going to be a 5-year card!

Removed User 345456 said:

I love how people make up their own prices. As stated multiple times :rolleyes: it was £520 - only £40 more than the 7800xt, a 3 year old 6800xt in a dress, with FAR better modern features/future proofing for the next few years!


Future proof with 12GB? - good luck with that!

:cry:
You haven't got a clue. Youtube clickbait rant videos made by teenagers decide what you spend your money on.

If you actually bothered to read what I and other owners have stated multiple times, our cards use 3.1gb LESS vram than a 6800xt when it comes to actual in game usage, i.e. the games are more biased/optimised towards nvidia, thus we end up with THE SAME amount of vram remaining unused as a 16gb 6800xt, which uses up more to achieve the same settings/res.

I can run everything at 1440p ultra natively, you haven't got a clue. Even CP/Control/TLOU.
Sure I could use dlss/fg if I wanted to achieve triple figure fps... But why would I want that in a SP game? It's pointless.

ALL of your posts are made up of trolling, you're clearly some teen with too much time on his hands who probably doesn't even own a pc!

If you actually bothered to try out each gpu at the same settings/res natively you'd see this.
 

Our cards?

Why does that not hold true whenever someone does a VRam comparison?
 
How is a 4070 futureproofed when it's slower in the majority of games than my 3 year old card that only cost me £129 more?
No it isn't. Let's see some actual proof conducted by yourself and not some biased clickbait graph/youtuber footage...
Let's see your 3 year old card do psycho ray tracing at 1440p max settings in CP, or RT maxed in Control natively at 1440p. Do go on to tell me how fsr is better too with its shimmering...

Why would I care about what you bought 3 years ago when I bought this at the end of July this year and don't own last gen tier cards? Let alone the fact I paid £129 less than your old card that doesn't have future proofing like frame generation/dlss3.5/super low latency/ray tracing?

You've just contradicted the whole argument about my card being overpriced when it's £40 more than a 7800xt which is a 3 year old 6800xt in a dress with no gains in performance nor rdna4 nor fsr3... Well done you.
 
This VRAM argument is either valid or not based on how prepared you are to lower settings.

If my experience with the 3060ti is anything to go by it still plays the vast majority of games at 1440p 60fps at high settings and it is approaching 3 years old. I suspect the 4070 will hold up in a similar fashion by being able to play the vast majority of games at 1440p 60fps at high settings in 2025.
 
Our cards?

Why does that not hold true whenever someone does a VRam comparison?
Who's someone? Some biased rigged nonsense by a youtube clickbaiter taking a backhander to sell you X brand and Y model card/stirring up bs to create content...
Try actually testing your own card next to someone with mine with the same cpu/ram/screen res/settings then get back to me with how I'm lying, as my mate literally has a 5700x and 32gb like me, we bought them in the same sale, only difference is he has a 6800xt... The card I'd considered last year amongst others...

You're the only person that's questioned this vram usage, everyone else says well it's hardly surprising as game devs are either sponsored or favour nvidia as a bias/they could be using sneaky compression - regardless this IS true.

I'll use 9.3gb vram ACTUAL usage mid game, my friend uses 12.7... how is that fake? We both have roughly 2-3gb vram remaining at the same res/settings/game. We've tested it on many new and older games and it rings true.

Next you'll be telling me the low vram usage we've all shown in Starfield/CP/Resi4/TLOU/Control is a lie.
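For what it's worth, the headroom arithmetic in the post above does add up on its own terms. A quick sketch using the quoted figures (a 12GB card using 9.3GB vs a 16GB card using 12.7GB) - treating them as given, since nothing here independently verifies them:

```python
# Headroom = capacity minus in-game usage, using the posters' own numbers.
# These are their claims, not independent measurements.
cards = {"12GB card": (12.0, 9.3), "16GB card": (16.0, 12.7)}

headroom = {name: round(total - in_use, 1)
            for name, (total, in_use) in cards.items()}
print(headroom)
```

Both cards end up with roughly 2.7-3.3GB free at the same settings, which matches the "2-3gb remaining" claim, though of course equal headroom today says nothing about how future games will behave.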
 
Also, you do realise, don't you, that A, when a GPU runs out of VRam that doesn't necessarily mean your MSI OSD will show the VRam as filled, or even close to filled, and B, the GPU doesn't just stop when the VRam is full. It will do one of three things: it will try to swap files around constantly as you move through the world, it will compress the textures, making them a lower resolution/quality, or it will send what it can't fit in VRam to the next level in the memory hierarchy, that being system RAM. None of these are good things.
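That spill-over behaviour can be sketched as a toy model. Everything below is illustrative - the sizes are made up and real drivers are far smarter than a greedy loop - but it shows why an OSD can report VRAM as not-quite-full while the working set has actually overflowed:

```python
# Toy model: textures that fit stay in VRAM, the rest spill to system RAM.
# Numbers are illustrative, not measurements from any real GPU or driver.
VRAM_GB = 12.0

def place_textures(textures_gb, vram_gb=VRAM_GB):
    """Greedy placement of texture sizes (in GB) into a fixed VRAM budget."""
    in_sysram = []
    used = 0.0
    for t in textures_gb:
        if used + t <= vram_gb:
            used += t
        else:
            in_sysram.append(t)  # demoted; every access now crosses PCIe
    return used, in_sysram

used, spilled = place_textures([4.0, 3.0, 3.0, 2.5])
# The counter reports 10.0 GB "used" even though the working set was 12.5 GB;
# the overflow is invisible in the VRAM number and shows up as stutter instead.
print(f"VRAM shown as used: {used:.1f} GB, spilled to system RAM: {sum(spilled):.1f} GB")
```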
 
This VRAM argument is either valid or not based on how prepared you are to lower settings.

If my experience with the 3060ti is anything to go by it still plays the vast majority of games at 1440p 60fps at high settings and it is approaching 3 years old. I suspect the 4070 will hold up in a similar fashion by being able to play the vast majority of games at 1440p 60fps at high settings in 2025.
I don't lower anything, I play everything at 1440p ultra as I've said, and the most any game has used was 9.3-9.7gb vram actual use in game. The only time I use dlss/fg is on Starfield, as I like to run it at 90-110fps and know no matter how hectic the fight/planet etc is, it'll always be rock solid way above 60fps. I have dabbled with RT in CP and the new 2.0 because it supports dlss3.5/FG/super low latency etc, and again 83-93, or mid 70s in intense areas, or 120-130fps with RT off and DLSS3.5/FG on.

I don't see the point in playing any SP games at more than 60fps tbh as why would I need an advantage over an NPC, I just run uncapped for MP games.
 
No it isn't. Let's see some actual proof conducted by yourself and not some biased clickbait graph/youtuber footage...
Let's see your 3 year old card do psycho ray tracing at 1440p max settings in CP, or RT maxed in Control natively at 1440p. Do go on to tell me how fsr is better too with its shimmering...

Why would I care about what you bought 3 years ago when I bought this at the end of July this year and don't own last gen tier cards? Let alone the fact I paid £129 less than your old card that doesn't have future proofing like frame generation/dlss3.5/super low latency/ray tracing?

You've just contradicted the whole argument about my card being overpriced when it's £40 more than a 7800xt which is a 3 year old 6800xt in a dress with no gains in performance nor rdna4 nor fsr3... Well done you.

While I may not have put it in such strong terms, I think there is definitely an argument to be made for the 4070 being more future proofed than the 7800XT, based on the superior features Nvidia offers despite its lower VRAM.

We have yet to see how good AMD's frame gen tech is going to be, and they don't have an answer for Ray Reconstruction, but you should only buy a product based on the features that are available now, not future promises.
 
If a game sees memory it will use it if it needs to, there is no such thing as one brand of GPU using less VRam vs another, the game determines that.
 
Also, you do realise, don't you, that A, when a GPU runs out of VRam that doesn't necessarily mean your MSI OSD will show the VRam as filled, or even close to filled, and B, the GPU doesn't just stop when the VRam is full. It will do one of three things: it will try to swap files around constantly as you move through the world, it will compress the textures, making them a lower resolution/quality, or it will send what it can't fit in VRam to the next level in the memory hierarchy, that being system RAM. None of these are good things.
So the AMD Adrenaline/Nvidia GF Experience/MSI afterburner/ASUS tuning/console debug modes in games etc, all lying, yeah? So why do you lot live off quoting that live data output and linking screens/videos of it?
It's all just a lie, is it. haha ok m9.

I'm well aware that IF something runs out of vram it'll chug and try to use my ram/page file, but that is blatantly obvious and you can tell from a drop in fps, let alone if you're monitoring it or it's locked...
The point is nothing even uses the 'allocated' amount, let alone comes close. Neither does it on a 6800xt, so neither card has an issue, and mine seems to be lighter on the vram workload due to devs favouring/being sponsored by nvidia. So neither card will have a problem, and when it does I'll just cheat with Frame Gen/DLSS3.5/lower settings as required. By that point I'll have retired it to my 2nd rig or sold it, I literally couldn't care less.

I have better things to do than sit here and debate stuff with someone who doesn't own my hardware and bases their knowledge on youtube speculation and argues about things they've never bought/wont try for themselves.

I decide what my wallet is used for, not youtube/reddit/forums.

The END. unsubbed :)
 
I don't lower anything, i play everything at 1440p ultra as i've said and the most any game has used was 9.3-9.7gb vram actual use in game. The only time I use dlss/fg is on Starfield as I like to run it at 90-110fps and know no matter how hectic the fight/planet etc is it'll always be rock solid way above 60fps, I have dabbled with RT with CP and the new 2.0 because it supports dlss3.5/FG/super low latency etc etc and again 83-93 or mid 70s in intense areas or 120-130fps with RT off and DLSS3.5/FG on.

I don't see the point in playing any SP games at more than 60fps tbh as why would I need an advantage over an NPC, I just run uncapped for MP games.

There is a reasonable chance you will have to drop settings in the years to come as that is just the way of things. I personally wouldn't care too much about having to run high settings to achieve 60fps. It's when you have to start dropping to medium I tend to upgrade.

If a game sees memory it will use it if it needs to, there is no such thing as one brand of GPU using less VRam vs another, the game determines that.

Yes, you can't trust the readings these programs give you. Some games will fill up all the VRAM you have available.
 
If a game sees memory it will use it if it needs to, there is no such thing as one brand of GPU using less VRam vs another, the game determines that.
Yeah, and if the game says you need 12-13gb and you end up using 9.3/9.7, then you're using less than it's allocated, thus you're fine. When playing it and clearly seeing no chugging/lag/frame drops, it's doing fine and not dropping frames, nor running out of vram.

Next you'll be telling me the game's part of a conspiracy when the settings menu shows what it'll require to allocate at those settings :rolleyes:

It's all lies, everyone's lying, you know everything, and all the devs who made the software to monitor this, along with console debug features, are ALL LIARS. You are right, they're all wrong. My mistake.

Bye now, you're an NPC to me, Unsubbed, life to live.
 
There is a reasonable chance you will have to drop settings in the years to come as that is just the way of things. I personally wouldn't care too much about having to run high settings to achieve 60fps. It's when you have to start dropping to medium I tend to upgrade.



Yes you can't trust the readings these programs give you. Some games will fill up all the VRAM you have available.
Yeah, and by that point, as I've said so many times mate, I'll have sold the card and made a bit back on what I've saved not having a more power hungry card, so I'll have a nice amount to go towards the next gpu. Can't really fault that.
 
So the AMD Adrenaline/Nvidia GF Experience/MSI afterburner/ASUS tuning/console debug modes in games etc, all lying, yeah? So why do you lot live off quoting that live data output and linking screens/videos of it?
It's all just a lie, is it. haha ok m9.

I'm well aware that IF something runs out of vram it'll chug and try to use my ram/page file, but that is blatantly obvious and you can tell from a drop in fps, let alone if you're monitoring it or it's locked...
The point is nothing even uses the 'allocated' amount, let alone comes close. Neither does it on a 6800xt, so neither card has an issue, and mine seems to be lighter on the vram workload due to devs favouring/being sponsored by nvidia. So neither card will have a problem, and when it does I'll just cheat with Frame Gen/DLSS3.5/lower settings as required. By that point I'll have retired it to my 2nd rig or sold it, I literally couldn't care less.

I have better things to do than sit here and debate stuff with someone who doesn't own my hardware and bases their knowledge on youtube speculation and argues about things they've never bought/wont try for themselves.

I decide what my wallet is used for, not youtube/reddit/forums.

The END. unsubbed :)

Did you even read anything I said? A texture is a texture; it's a fixed size. It will use the same amount of VRam on brand A as it does on brand B. If brand A is showing less VRam used than brand B, it's not because brand A is somehow able to make 100MB fit in a 50MB volume; it's because that texture isn't in the VRam, or it's been compressed, and this is not a good thing.
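The fixed-size point can be made concrete with back-of-envelope arithmetic. The bytes-per-texel figures below are standard for those formats (uncompressed RGBA8 at 4 bytes/texel, BC7 block compression at 1 byte/texel); the 4K texture and the rough 1/3 mip-chain overhead are just a worked example, not data from any particular game:

```python
# Texture footprint depends only on format and dimensions, never on the GPU
# brand rendering it. RGBA8 = 4 bytes/texel; BC7 = 1 byte/texel.
def texture_mib(width, height, bytes_per_texel, mipmapped=True):
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmapped else base
    return total / (1024 * 1024)

rgba8 = texture_mib(4096, 4096, 4)   # uncompressed 4K texture: ~85 MiB
bc7   = texture_mib(4096, 4096, 1)   # same texture, BC7 compressed: ~21 MiB
print(f"4K RGBA8: {rgba8:.0f} MiB, 4K BC7: {bc7:.0f} MiB")
```

The same asset costs the same memory on either vendor's card; a lower reading for the same settings means something else (residency, compression, streaming) is different, which is exactly the point being made above.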
 
No it isn't. Let's see some actual proof conducted by yourself and not some biased clickbait graph/youtuber footage...
Let's see your 3 year old card do psycho ray tracing at 1440 max settings in CP or RT maxed in Control natively at 1440p, do go on to tell me how fsr is better too with it's shimmering...

Why would I care about what you bought 3 years ago when I bought this at the end of July this year and don't own last gen tier cards? Let alone the fact I paid £129 less than your old card that doesn't have future proofing like frame generation/dlss3.5/super low latency/ray tracing?

You've just contradicted the whole argument about my card being overpriced when it's £40 more than a 7800xt which is a 3 year old 6800xt in a dress with no gains in performance nor rdna4 nor fsr3... Well done you.
The situations where a 4070 is faster than a 3080 are with FG, which isn't available in the majority of games, so that makes the 3080 faster in the majority. Also, with things like FG, what's to say Nvidia won't change the way it works and gate it behind the 50 series, leaving you a card that's forced back to normal DLSS?

When the 3080 released it offered 88% of the 3090's performance, so almost a top-tier gaming experience for 2 years for only £129 more than you've spent. Now contrast this with the 4070, which only offers 50% of a 4090's, a net loss of 38% for a £129 saving. This is why the 3080 was a far more futureproofed purchase: it's still beating every card below its price point 3 years later in the majority of cases, while the 4070 couldn't even do this on day one.

I'm not claiming the 7800XT is good either, but rather that cards of this performance level should now be under £400.

This shows how the 70-class cards have done against the previous gen 80-class cards; notice the odd one out.
GTX 780 > 970: +21% performance
980 > 1070: +29%
1080 > 2070: +14%
2080 > 3070: +26%
3080 > 4070: -6%
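Those deltas can be reproduced from relative performance index scores. The indices below are hypothetical, chosen so the percentages match the list above; they are not benchmark data:

```python
# Gen-on-gen uplift of a 70-class card over the previous 80-class card,
# from hypothetical performance indices (old card normalised to 100).
def uplift(old_score, new_score):
    return round((new_score / old_score - 1) * 100)

pairs = {
    "GTX 780 -> 970": (100, 121),
    "980 -> 1070":    (100, 129),
    "1080 -> 2070":   (100, 114),
    "2080 -> 3070":   (100, 126),
    "3080 -> 4070":   (100, 94),   # the only negative entry in the series
}
for name, (old, new) in pairs.items():
    print(f"{name}: {uplift(old, new):+d}%")
```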
 
You haven't got a clue. Youtube clickbait rant videos made by teenagers decide what you spend your money on.

If you actually bothered to read what I and other owners have stated multiple times, our cards use 3.1gb LESS vram than a 6800xt when it comes to actual in game usage, i.e. the games are more biased/optimised towards nvidia, thus we end up with THE SAME amount of vram remaining unused as a 16gb 6800xt, which uses up more to achieve the same settings/res.

I can run everything at 1440p ultra natively, you haven't got a clue. Even CP/Control/TLOU.
Sure I could use dlss/fg if I wanted to achieve triple figure fps... But why would I want that in a SP game? It's pointless.

ALL of your posts are made up of trolling, you're clearly some teen with too much time on his hands that probably doesn't even own a pc!

If you actually bothered to try out each gpu at the same settings/res natively you'd see this.

It's not me that bought an inferior card with 12GB of VRAM, and it's not me continually writing essays filled with mental gymnastics to justify my purchase, dude. Chill.

:cry:

The more you ramble on, repeating the same justifications over and over again and throwing out personal insults, the weaker your arguments look.

;)
 