
Any news on 7800 xt?

I think the truth in all of this is that if you want to keep up and not suffer performance issues, whether VRAM or pure grunt, you need to upgrade every other generation. They pretty much design it this way; if users didn't upgrade for long periods, they'd have no sales. Cards like the 1080/Ti showed that.
 
Either way, that means I literally have the same remaining VRAM currently as a 16GB AMD card that rivals it... without even using DLSS/FG etc.
You only compared VRAM usage. The NV driver uses (shares) more system RAM than AMD, so it isn't as simple as 'NV uses less VRAM', especially when pushing higher settings.
 
I want to make it clear I'm in no way trolling, as I own both AMD and Nvidia GPUs across my two systems, along with AMD CPUs in both...

I'll stress this point: I'm talking about actual usage while playing, not the allocation predicted in the settings menu...

I just find it crazy that no one other than me has spotted this 3.5-4GB VRAM wastage when using AMD cards against my card? Makes me wonder if this happened on previous-gen cards as well, or if it's just a case of mine or the entire 4xxx series being better optimised than previous generations?

Either way, if 3.5-4GB is wasted on a 16GB card vs a 12GB one, then they BOTH have the same amount of remaining VRAM - in EVERY modern game, or ones a bit older - I've tested 30-40 myself so far against my mate's 6800xt. Doesn't that shut down this whole VRAM argument when people troll about 12 vs 16GB on current rival cards?

It doesn't matter what the reasoning behind it is; if they both achieve the owner's resolution/fps range/texture settings of choice, then either card is suitable, no?
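For anyone wanting to sanity-check the "actual usage while playing" numbers on the NVIDIA side rather than eyeballing an overlay, here's a minimal logging sketch using the nvidia-ml-py (pynvml) bindings. Note it reports what the driver has resident for the whole GPU (desktop, browser and overlays included), which is still closer to allocation than to what a game strictly needs, and the AMD card would need a different tool:

```python
# Log dedicated VRAM in use once a second while a game runs.
# Requires: pip install nvidia-ml-py  (NVIDIA GPUs only)
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"{time.strftime('%H:%M:%S')}  VRAM {used_gb:5.2f} / {total_gb:.1f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Leave it running in a second window, alt-tab through a play session, and compare the peak it logs against whatever the in-game settings menu predicted.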
 
Yeah, Nvidia really need to sort their driver overhead out, as well as giving ReBAR more attention - at the very least enabling it at driver level so we don't have to test and enable it ourselves. They're leaving a lot of performance on the table here.
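On the ReBAR point: one rough way to check whether resizable BAR is actually active on an NVIDIA card (besides the control panel) is to compare the BAR1 aperture size to the card's VRAM - a small fixed aperture (typically 256 MiB) usually means it's off, while an aperture close to the full VRAM size means it's on. A minimal sketch using the nvidia-ml-py (pynvml) bindings; treat the 90% threshold as a heuristic I've picked for illustration, not an official ReBAR status API:

```python
# Rough ReBAR check: compare the CPU-visible BAR1 aperture to total VRAM.
# Requires: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

bar1 = pynvml.nvmlDeviceGetBAR1MemoryInfo(gpu)
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)

bar1_gb = bar1.bar1Total / 1024**3
vram_gb = mem.total / 1024**3
print(f"BAR1 aperture: {bar1_gb:.2f} GB, VRAM: {vram_gb:.1f} GB")

# Heuristic: if the CPU-visible aperture covers (most of) the VRAM,
# resizable BAR is almost certainly enabled.
print("ReBAR looks", "enabled" if bar1_gb >= 0.9 * vram_gb else "disabled/limited")

pynvml.nvmlShutdown()
```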

I'm very tempted to get a 4090, what with CP2077 (which I love) and some good upcoming games like Avatar and so on coming soon, but it's a bit off-putting when you see a 4090 ******** the bed in some of these newest UE5 titles :o
I think we can agree that if anything struggles on a $1000+ card it's a serious case of bad optimization.
 
I think we can agree that if anything struggles on a $1000+ card it's a serious case of bad optimization.
It's never the game's fault though...... ;) :D :p

A good example of where I would love more VRAM now is Starfield and its HD texture packs (same way it was for FO, Skyrim and so on); that is where more VRAM really shines when it comes to texture mod packs. But again, I wouldn't solely upgrade a GPU just for VRAM like some make out. The good thing with Starfield is it already uses incredibly little VRAM as is, which gives more room to add more texture packs.
 
You only compared VRAM usage. The NV driver uses (shares) more system RAM than AMD, so it isn't as simple as 'NV uses less VRAM'.
Does it? From what I saw we were using the same amount of system RAM? I mean, I'd be happy to go back and play X game and compare it with someone on here who actually owns a 6800xt at the same settings/res, and both post screenshots? But genuinely, from what I saw at my given usage (1440p maxed out native), there wasn't anything in it?
If there is then my mistake; I was mainly interested in temps/VRAM/fps/wattage usage tbh, as anyone can bang in more RAM considering how dirt cheap it is/becomes once a newer platform comes out, vs shelling out for an entire new GPU - just look at DDR4 vs 5...
Also, a 2nd-hand CPU is a drop in the ocean as a future upgrade compared to a GPU.

I'll stress that I'm talking about actual usage while playing, not the allocation predicted in the settings menu...
 
I think the more concerning issue for PC gaming right now is that CPUs aren't keeping pace. A 4090 is already bottlenecked at 4K in some instances, even on the latest CPUs, so next gen, when 4090-level performance is considered a 1440p card and is more affordable, the issue will be even more prevalent, especially if the new CPUs are only 20% faster by then. Frame-gen technologies help out in this area, but they are not everyone's cup of tea with the latency penalties they bring.

The reality is many here upgrade very quickly, but if you don't, 12GB~16GB of VRAM will give you longer service than an 8GB/10GB card of similar power. So I would always take an RTX4070 12GB/RX7800XT 16GB over an RTX3080 10GB. There's also the fact that driver support and optimisations will last longer for newer-generation cards.

So you need to ask whether the RTX3080 10GB will have more issues than the RTX4070 12GB/RX7800XT 16GB in, say, 2~3 years. I would think it probably will.

Just like someone buying an RTX3060TI 8GB in 2020 would have had a few years of decent service, buying an RTX4060TI 8GB now won't mean the same lifespan. Things move on.
 
It's never the game's fault though...... ;) :D :p

A good example of where I would love more VRAM now is Starfield and its HD texture packs (same way it was for FO, Skyrim and so on); that is where more VRAM really shines when it comes to texture mod packs. But again, I wouldn't solely upgrade a GPU just for VRAM like some make out. The good thing with Starfield is it already uses incredibly little VRAM as is, which gives more room to add more texture packs.
Personally I look for extra VRAM just to have a little bit of future-proofing; it worked well on my HD7970 and RX 590.
Had the 7600 gotten more than 8GB it would have been the perfect upgrade, as I'd get double the performance for the same money; instead I'll either wait another gen or splurge on a 6950xt.
 
Does it? From what I saw we were using the same amount of system RAM? I mean, I'd be happy to go back and play X game and compare it with someone on here who actually owns a 6800xt at the same settings/res, and both post screenshots? But genuinely, from what I saw at my given usage (1440p maxed out native), there wasn't anything in it?
If there is then my mistake; I was mainly interested in temps/VRAM/fps/wattage usage tbh, as anyone can bang in more RAM considering how dirt cheap it is/becomes once a newer platform comes out, vs shelling out for an entire new GPU - just look at DDR4 vs 5...
Also, a 2nd-hand CPU is a drop in the ocean as a future upgrade compared to a GPU.
It's not as simple as adding more system RAM.

NV shares system RAM, which is slower than VRAM.

Check the YT vids by Daniel Owen, the 8GB vs 16GB 60-class comparison in particular; he points out VRAM usage better than most.

For reference, I upgraded both my systems from a 3070 and a 3080 to a 4070 and a 7900 XTX.
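If you'd rather measure the system-RAM side on your own machine than take it from videos, Windows exposes "GPU Adapter Memory" performance counters (roughly the numbers behind Task Manager's dedicated/shared GPU memory readouts). A minimal sketch, assuming a recent Windows build where those counters exist and the stock typeperf tool is on the path; counter names can vary with locale, so treat this as a starting point:

```python
# Sample dedicated vs shared (system RAM) GPU memory on Windows.
# Uses the built-in typeperf tool; run while the game is active.
import subprocess

COUNTERS = [
    r"\GPU Adapter Memory(*)\Dedicated Usage",
    r"\GPU Adapter Memory(*)\Shared Usage",
]

# -sc 5: take five one-second samples, then exit.
result = subprocess.run(
    ["typeperf", *COUNTERS, "-sc", "5"],
    capture_output=True, text=True, check=False,
)
print(result.stdout or result.stderr)
```

Comparing the "Shared Usage" column between an NV and an AMD system at the same settings is the more complete picture than VRAM alone.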
 
It's not as simple as adding more system RAM.

NV shares system RAM, which is slower than VRAM.

Check the YT vids by Daniel Owen, the 8GB vs 16GB 60-class comparison in particular; he points out VRAM usage better than most.

For reference, I upgraded both my systems from a 3070 and a 3080 to a 4070 and a 7900 XTX.
I thought AMD did that as well with Smart Access Memory? I have 32GB and no game seems to use more than 10.5GB, at my settings/res anyway, so I don't think that's a factor for me?
 
I thought AMD did that as well with Smart Access Memory? I have 32GB and no game seems to use more than 10.5GB, at my settings/res anyway, so I don't think that's a factor for me?
No, that's not how SAM/ReBAR works; that's a whole different conversation.

Again, NV do more memory swapping with system RAM because they generally have less VRAM. It mostly works until it doesn't, and it stops working sooner than it would on a GPU with the extra VRAM you currently think is just a 'waste'.

When NV tell you 8GB is enough, then release the exact same GPU with double the VRAM and it massively outperforms the 8GB version while using less system RAM, that should be an indicator for everyone to take note of.
 
I just find it crazy that no one other than me has spotted this 3.5-4GB VRAM wastage when using AMD cards against my card?
Using VRAM or RAM is never wasted; the whole point of it is to be used.

You don't want to go fetching data from slower storage like an SSD/HDD, and you don't want to spend cycles that could be used for other work on decompressing data.
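To put rough numbers on that, here's a back-of-the-envelope sketch of how long moving a 2 GB batch of assets takes over each path, using ballpark bandwidth figures assumed purely for illustration (on-card GDDR6 around 500 GB/s, PCIe 4.0 x16 around 25 GB/s in practice, a fast Gen4 NVMe around 7 GB/s):

```python
# Back-of-the-envelope: time to move a 2 GB chunk of texture data
# at ballpark bandwidths (assumed figures, not measurements).
BANDWIDTH_GBPS = {
    "VRAM (GDDR6, ~500 GB/s)": 500,
    "PCIe 4.0 x16 (~25 GB/s usable)": 25,
    "Gen4 NVMe SSD (~7 GB/s)": 7,
}

chunk_gb = 2.0  # e.g. a batch of high-res textures

for path, gbps in BANDWIDTH_GBPS.items():
    ms = chunk_gb / gbps * 1000  # transfer time in milliseconds
    print(f"{path:32s} -> {ms:7.1f} ms for {chunk_gb} GB")
```

Order-of-magnitude it's ~4 ms in VRAM vs ~80 ms over PCIe vs ~285 ms from the SSD, which is exactly why a card with spare VRAM keeps extra data cached rather than refetching it.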
 
Using VRAM or RAM is never wasted; the whole point of it is to be used.

You don't want to go fetching data from slower storage like an SSD/HDD, and you don't want to spend cycles that could be used for other work on decompressing data.
You're missing my point: my 12GB card has the same amount of VRAM left as a rival 16GB AMD card at the same native 1440p graphics quality/fps, so the whole 'you'll run out of VRAM' thing isn't an issue for my card if a 16GB 6800xt has the same amount remaining, with its heavier VRAM usage, to achieve the same quality/res/settings.

I'll stress this point: I'm talking about actual usage while playing, not the allocation predicted in the settings menu...
 
You are simply misunderstanding how VRAM allocation and limits work and manifest.

Look up the guy Tommyboy mentioned to get some tech comparison videos that show that even at 1440p, 12GB GPUs can hit VRAM limits. It even impacts 8GB GPUs at 1080p, and it is happening in more and more games.

The Last of Us
Ratchet and Clank
A lot of the new UE5 games
 
You're missing my point: my 12GB card has the same amount of VRAM left as a rival 16GB AMD card at the same native 1440p graphics quality/fps, so the whole 'you'll run out of VRAM' thing isn't an issue for my card if a 16GB 6800xt has the same amount remaining, with its heavier VRAM usage, to achieve the same quality/res/settings.

Clearly the games you have tested and the settings haven't limited a 12GB card then. Regardless of whether they both appear to have the same overhead, a card with more VRAM should try to cache more data.
 
Clearly the games you have tested and the settings haven't limited a 12GB card then. Regardless of whether they both appear to have the same overhead, a card with more VRAM should try to cache more data.
Yeah, I've played 30-40 games on it so far since installing the card on July 26th.

Now I'll stress this point: I'm talking about actual usage while playing, not the allocation predicted in the settings menu...

Nothing has gone above 9.5GB (even with RT on natively), whereas the 6800xt at the same res/settings (without RT) was using 11.5-12GB.
All I'm saying is neither brand reaches its 'allocated' amount in game, and both end up with the same total of VRAM remaining, due to the cards' different actual usage when run at the same settings. I.e. if mine uses 4GB and the 6800xt uses 7.5-8GB, we still end up with the same remaining VRAM left in the tank.
 
You are simply misunderstanding how VRAM allocation and limits work and manifest.

Look up the guy Tommyboy mentioned to get some tech comparison videos that show that even at 1440p, 12GB GPUs can hit VRAM limits. It even impacts 8GB GPUs at 1080p, and it is happening in more and more games.

The Last of Us
Ratchet and Clank
A lot of the new UE5 games
Nope, you are misunderstanding: I'm talking about actual usage in game, not the allocated/predicted figure in the settings menu.

I don't need to watch a video online when I have the hardware/games and have tested them against my mate's rig with a 6800xt at the same settings.

The reality is, I have TLOU/R&C and neither uses above 9.5GB of VRAM at native 1440p max settings. R&C uses a bit more with RT on natively, but that's the only exception, and RT isn't comparable to my point as the 6800xt cannot do RT at the fps/settings my 4070 can. So I purposely haven't included RT as a factor in the comparison of actual in-game usage, not the 'allocated' VRAM in the settings menu, which actual usage still never hits!

So TL;DR: when you're playing a game on my card or a 6800xt, both cards' actual VRAM usage (NOT the predicted/allocated amount in the settings menu) leaves the same amount of 'free' VRAM, because the 16GB 6800xt uses 3.5-4GB more per game than my 4070 at the same settings/res natively. Thus a 16GB card ends up with the same remaining VRAM as my 4070 12GB... So I wouldn't be affected the way I would be with a 6700xt, for example... (which wouldn't match my performance anyway)

I have tested this in pretty much every current and previous-gen game - nearly 40 in total, bar BG3 as that's not my cup of tea - and have had the same result throughout.

I am not in any way trolling, but in reality that IS what my card/system uses at 1440p ultra native, and I've tested it against my mate's system with a 6800xt... So I can't really be any fairer than that when it comes to actual ownership/real-world testing.

The first thing I did when I tried out the card was test all this. If actual in-game usage had been sky high, I'd have used the 14-day returns policy, sent it packing and got a different card, as I don't really have any brand loyalty, bar using AMD for CPUs by choice because I don't like how hot Intel's run plus the cooling/fan noise required - but if Intel made a CPU that stayed as cool, of course I'd get that if it was better.
 
Nope, I’m really not misunderstanding your point. I am telling you that your point does not mean what you think it means.

In simple terms a game will check the VRAM it has available and use as much as it needs to use above its minimum. The game could simply be using more VRAM cache because it has more space to do so. Allocated VRAM =/= minimum VRAM. Try the same settings with an 8GB GPU and you could be hitting VRAM limits. Or compare Nvidia vs AMD with the same VRAM.
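To make that concrete, here is a toy sketch of the idea; the numbers, thresholds and function name are made up for illustration and don't reflect any real engine's streaming logic:

```python
def plan_vram_budget(total_vram_gb: float, required_gb: float,
                     driver_reserve_gb: float = 1.0) -> dict:
    """Toy model: a game takes what it strictly needs, then fills most of
    the leftover space with optional streaming cache. Illustrative only."""
    usable = total_vram_gb - driver_reserve_gb
    if required_gb > usable:
        return {"status": "over budget", "used_gb": round(usable, 1), "cache_gb": 0.0}
    # Spend ~80% of the headroom on cache, keep the rest as a safety margin.
    cache = 0.8 * (usable - required_gb)
    used = required_gb + cache
    return {"status": "ok", "used_gb": round(used, 1), "cache_gb": round(cache, 1)}

# Same game, same settings (say it strictly needs 8 GB of assets):
for card, vram in [("12GB card", 12), ("16GB card", 16)]:
    print(card, plan_vram_budget(vram, required_gb=8.0))
```

With the same 8 GB real requirement, the 16GB card reports a few GB more "usage" purely as optional cache; the gap only becomes a problem once the real requirement grows past what the smaller card can hold.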

You are into cars, so let's use a car analogy:

Car A has a fuel tank capacity of 25 litres

Car B has a fuel tank capacity of 50 litres.

Both have the same fuel consumption and have to do a 100 mile trip, so you fill both up to 3/4 full and neither is going to run out of fuel on such a short trip. Now they need to do a 500 mile trip and you fill both to max fuel. Car B will make it, but car A will have to refuel, or be left on the roadside waiting for a tow truck.
 
Nope, I’m really not misunderstanding your point. I am telling you that your point does not mean what you think it means.

In simple terms a game will check the VRAM it has available and use as much as it needs to use above its minimum. The game could simply be using more VRAM cache because it has more space to do so. Try the same settings with an 8GB GPU and you could be hitting VRAM limits. Or compare Nvidia vs AMD with the same VRAM.
OK, so in the case that both my 12GB 4070 and a 6800xt achieve the same visual quality/resolution/fps - i.e. everyone's happy with either card and has achieved the settings they want - and end up with the same physical VRAM left over (actual, not allocated), then what difference does that make to my point / why would it matter either way?

Both will have the same physical amount left? Thus the 12GB won't have any issues running said game, nor will the 6800xt; that's my point. So both should be reasonably safe for 2-3 years as per my original post :)

Genuinely not arguing, just asking? As I don't think it matters how you word it: if both don't run out of VRAM and both end up with the same amount physically left, then they're both doing what they need to do to play the game at the user's preferred settings/res/quality? Thus a 4070 should last as long as a 6800xt?
 