
12GB VRAM enough for 4K? Discuss..

Status
Not open for further replies.
Ah yes, here we go with Matt and his "those results are wrong when it comes to AMD though, as I get much better performance from my AMD GPUs*"

*highly fine-tuned, best-of-the-best components, with AC right beside his PC and a fresh Windows install

:D

Basically, as you play at 3440x1440 you are safe even with 10GB. Not that it matters; even if you played at 4K you would need DLSS, which would mean rendering at a lower resolution anyway :cry:

As for Forspoken, the less spoken the better. Lol.

Tommy should remove it from the OP :cry:

Exactly :cry:

But but but you aren't allowed to use upscaling tech because it doesn't fit the narrative even though....

The level of detail is good


In terms of image reconstruction, the upsampling variants cannot score any points, because the game generally has no problems there. What FSR and DLSS once again do very well, though, is image detail. Despite the significantly lower render resolution, they are at the level of the native TAA resolution, and this applies to both the quality and the performance modes. If, on the other hand, you reduce the native resolution, you also lose image sharpness and thus detail.




Upsampling is a good choice


Even if the flickering at native resolution cannot be prevented with upsampling, both DLSS 2 and FSR 2 leave a good impression in Dead Space Remake. If you need more FPS, you should forget about the meaningless graphics details and go straight to AMD's or Nvidia's upsampling. In quality mode, the image quality is comparable to the native resolution including TAA, but the performance is clearly better. At the same render resolution, DLSS and FSR are then superior. In performance mode, DLSS has small advantages over FSR, but these are hardly noticeable in the horror game.


In general, upsampling is worth considering in Dead Space Remake regardless of any performance problems, due to the higher frame rate and comparable image quality.
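The review's point about render resolution can be made concrete with a quick calculation. As a rough sketch (the scale factors below are the commonly documented per-axis ratios for DLSS 2 / FSR 2 quality and performance modes; exact values can vary by title and driver):

```python
# Rough sketch: per-axis render scale for common upscaler modes.
# Quality mode typically renders at ~67% of output resolution per axis,
# Performance mode at 50% (commonly documented DLSS 2 / FSR 2 ratios).
SCALES = {"quality": 2 / 3, "performance": 1 / 2}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    w, h = render_resolution(3840, 2160, mode)
    pct = 100 * (w * h) / (3840 * 2160)
    print(f"4K {mode}: renders at {w}x{h} (~{pct:.0f}% of the pixels)")
```

So "4K with DLSS quality" is really shading about 2560x1440 worth of pixels, and performance mode only 1920x1080, which is why the upscalers claw back so much frame rate.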

So to sum up, the ComputerBase results:

- they are wrong on upscaling
- they are wrong on the performance of AMD/RDNA 3 GPUs

However, they are right on VRAM, despite showing nothing to back up the issues they pointed out "that may happen"

:cry:


I did see comments on Forspoken running just fine on a 3080 at 4K, but was hoping the others would answer; it was pretty obvious though, given the silence. Now we are on to Dead Space, but it seems that is possibly actually OK too, so we'll soon be back to FC6 again :D
 
I did see comments on Forspoken running just fine on a 3080 at 4K, but was hoping the others would answer; it was pretty obvious though, given the silence. Now we are on to Dead Space, but it seems that is possibly actually OK too, so we'll soon be back to FC6 again :D

:cry::cry::cry:
 
I did see comments on Forspoken running just fine on a 3080 at 4K, but was hoping the others would answer; it was pretty obvious though, given the silence. Now we are on to Dead Space, but it seems that is possibly actually OK too, so we'll soon be back to FC6 again :D
I got the demo installed on my desktop and it "runs OK" on the 3080.
I do have an issue with flickering textures (they flicker between low- and high-quality versions), but even lowering them to medium doesn't fix it.
Will try it on my laptop today, as that has 16GB of VRAM.
Don't know, might be an issue with the amount of RAM, as the devs recommend more than the 16GB that's in my desktop.
I gotta say though, DLSS in this game looks great even on Performance at 4K.
 
The state of the GPU market really is lol-worthy. The 4080 really is a turd of a card, and here you've got AMD users getting excited at being able to match it.

Just imagine had AMD priced the card well: you would have had loads of people here on your team with their 7900 XTXs. Instead I keep seeing people buying 4080s, which has got to be up there as one of the most slated cards.
I bet AMD were over the moon when they saw the 4080 specs and pricing. If Nvidia had made the 4080 a true 3080 successor and used a 102 die priced around $800, AMD would have been in serious trouble.
 
The state of the GPU market really is lol-worthy. The 4080 really is a turd of a card, and here you've got AMD users getting excited at being able to match it.

Just imagine had AMD priced the card well: you would have had loads of people here on your team with their 7900 XTXs. Instead I keep seeing people buying 4080s, which has got to be up there as one of the most slated cards.

The 4080 is not a turd of a card, it’s just badly priced. The card itself is very capable on its technical merits.
 
The 4080 is not a turd of a card, it’s just badly priced. The card itself is very capable on its technical merits.

Thanks for stating the obvious. It's the price that makes it a turd; I thought that much was obvious. Otherwise my statement would amount to saying anything that is not a 4090 is rubbish, which is obviously not the case.

For future record, as far as I am concerned, what makes any GPU great or a turd is the price, assuming no driver issues and that it works.
 
No one has shared this graph, but if you disable blurry DLSS/FSR and run native, the (power-limited and throttling) XTX is actually 8%+ faster than the 4080 at 4K native with RT.
[image: iCaCwH2.png]

With an AIB XTX you can and will see much better numbers than that; I'll put a video up soon, don't worry. :)
This just proves the 7900 XTX is the better card. Really not surprised to see it selling so well, going by MLID's recent video.
 
Just tested Forspoken on my laptop (16GB of VRAM) and the flickering textures aren't there.
Went back to my desktop and yeah, textures do flicker or don't load properly.
The game seems to have some artificial limit on my 10GB 3080, as it doesn't seem to use more than 8GB of it, and that is memory allocated, not actually used.
On my laptop it was using around 10GB of VRAM.
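The allocation-vs-usage distinction above matters when reading overlay numbers. As a back-of-the-envelope sketch of where texture VRAM goes (the bytes-per-pixel figures for uncompressed RGBA8 and BC7 block compression are standard, and a full mip chain adds roughly a third on top; the 4096x4096 texture is just an illustrative size):

```python
# Illustrative VRAM estimate for a single texture, assuming standard
# format sizes: uncompressed RGBA8 is 4 bytes/pixel, BC7 block
# compression is 1 byte/pixel. A full mip chain adds ~1/3 overhead.
MIP_OVERHEAD = 4 / 3

def texture_mib(width, height, bytes_per_pixel, mips=True):
    """Approximate VRAM footprint of one texture, in MiB."""
    size = width * height * bytes_per_pixel
    if mips:
        size *= MIP_OVERHEAD
    return size / 2**20

# A single 4096x4096 texture:
print(f"RGBA8 + mips: {texture_mib(4096, 4096, 4):.1f} MiB")
print(f"BC7 + mips:   {texture_mib(4096, 4096, 1):.1f} MiB")
```

Note too that a typical overlay's "VRAM used" figure reports what the application has allocated from the driver, not the working set actually touched each frame, which is why a card can report headroom while still swapping textures in and out.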
 
Just tested Forspoken on my laptop (16GB of VRAM) and the flickering textures aren't there.
Went back to my desktop and yeah, textures do flicker or don't load properly.
The game seems to have some artificial limit on my 10GB 3080, as it doesn't seem to use more than 8GB of it, and that is memory allocated, not actually used.
On my laptop it was using around 10GB of VRAM.

Interesting. Is this game AMD-sponsored by any chance? I did not see any AMD logos when loading the demo :p
 
This just proves the 7900 XTX is the better card. Really not surprised to see it selling so well, going by MLID's recent video.
It was always the better card, just like the 4090 was always the better card vs the XTX. There are always outliers, of course, in both examples, but for the most part it holds true.

You can argue about features, software, etc., but my view on the actual hardware is set, having owned the XTX and the 4090 and tuned them both.

You'll never see a 4080 faster than a 4090, but you will occasionally see an XTX faster than a 4090 in favourable titles.
 
Ah yes, here we go with Matt and his "those results are wrong when it comes to AMD though, as I get much better performance from my AMD GPUs*"

*highly fine-tuned, best-of-the-best components, with AC right beside his PC and a fresh Windows install

:D



Exactly :cry:

But but but you aren't allowed to use upscaling tech because it doesn't fit the narrative even though....



So to sum up, the ComputerBase results:

- they are wrong on upscaling
- they are wrong on the performance of AMD/RDNA 3 GPUs

However, they are right on VRAM, despite showing nothing to back up the issues they pointed out "that may happen"

:cry:


I did see comments on Forspoken running just fine on a 3080 at 4K, but was hoping the others would answer; it was pretty obvious though, given the silence. Now we are on to Dead Space, but it seems that is possibly actually OK too, so we'll soon be back to FC6 again :D


Don't forget, folks, some reframing happened over the weekend. Firstly, make sure when you actually play a game that motion blur is on, as apparently you are meant to game on the settings we all use to VALIDATE REVIEWER TEST settings. You know, the ultra preset: make sure you turn on all the settings for potato enhancement @ 4K. Apparently we must all game at that; I never knew I'd been doing it wrong all this time. Clouded judgement from playing benchmarks all day.

Re-framed the whole 'is 10GB enough' now to 'is 10GB enough to validate reviewer test settings'. Well, no one said the settings and result weren't valid, just that you wouldn't actually game on reviewer test settings. In other news: fine for actually playing the game. Further evidence the cust serv op chatbot careerist is as divisive as first thought.

Plays benchmarks, or games for short periods with large perf overlays, which is valid if one wants to try and make a channel around OC'd cards compared with review sites' findings; fine for comparing apples-to-apples performance. Oh, and that card-crunching graphical masterpiece that is chavball manager.

Dunno why you even converse with the cust serv op chatbot; soon to be outsourced to ChatGPT.

Then there's her groom of the stool, who can't work out why people bought a 4090 this time round and oddly says one is fine with prices this time. I waited for the 3090 results to be released before buying a 3080. Saw 10% extra perf for 100% extra cost and went for a 3080, which was 90% of a 3090. Price was never a problem, but near-as-makes-no-difference performance for double the price? Er, nah. Purchase based on what you get for the money, not affordability. I don't get the difficulty in understanding, but it does go well with never actually adding anything to the debate, just gifs and smilies. Probably gets on a sunshine bus each morning to go to what she thinks is work.

Too hard to work out the 10% for double the money last time. The money is still in the bank on a 3080, plus the second-hand value of a 3080 vs a 3090. If she 'wanted' a 4090, she'd be up to her armpits in the £3,300 difference, being the £550 banked for not getting a 3090 and the £500 resale price of a 3080, which held way better than a 3090's. 3090s are half price second-hand; still paying £200+ extra second-hand for a 3090, for 10% perf.
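The value argument above is just a performance-per-pound ratio. A sketch using only the post's own relative figures ("10% extra perf for 100% extra cost", i.e. the 3080 delivers 90% of a 3090's performance at half its price; the function and variable names are illustrative):

```python
# Perf-per-cost sketch using the post's relative figures:
# a 3090 was "10% extra perf for 100% extra cost" over a 3080.
def perf_per_cost(rel_perf, rel_cost):
    """Relative performance divided by relative cost."""
    return rel_perf / rel_cost

rtx3080 = perf_per_cost(rel_perf=0.9, rel_cost=1.0)  # 90% of a 3090's perf
rtx3090 = perf_per_cost(rel_perf=1.0, rel_cost=2.0)  # double the price

print(f"3080 delivers {rtx3080 / rtx3090:.1f}x the performance per pound")
```

On those numbers the 3080 is 1.8x the value, which is the whole argument in one line.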

So two games in the OP... where is Farcrysis 6 in that list, or was it fixed and now works? When I look at FC6 and max everything it says 9.3GB, even with the low-fps enhancement (blur) setting on.

Apparently posting early AM means one must have been drinking, not out night shooting with mates until the wee hours. Just goes to show how narrow-minded people are. I'm in the thick of the rat race, apparently; someone needs to google 'rat race' to work out what it is, but that'll get re-framed, I imagine.
 
This just proves the 7900 XTX is the better card. Really not surprised to see it selling so well, going by MLID's recent video.
I'm surprised, considering it's an 80-class competitor and usually 80-class cards go for £700.

I think people are paying for the name rather than the performance level, as I don't think it would have sold at all had AMD named it a 7800 XT.
 
It was always the better card, just like the 4090 was always the better card vs the XTX. There are always outliers, of course, in both examples, but for the most part it holds true.

You can argue about features, software, etc., but my view on the actual hardware is set, having owned the XTX and the 4090 and tuned them both.

You'll never see a 4080 faster than a 4090, but you will occasionally see an XTX faster than a 4090 in favourable titles.
Obviously the faster card from the same architecture will be faster in every game.
I don't know why anybody would expect a 4080 to be faster than a 4090 in anything.
 
Don't forget, folks, some reframing happened over the weekend. Firstly, make sure when you actually play a game that motion blur is on, as apparently you are meant to game on the settings we all use to VALIDATE REVIEWER TEST settings. You know, the ultra preset: make sure you turn on all the settings for potato enhancement @ 4K. Apparently we must all game at that; I never knew I'd been doing it wrong all this time. Clouded judgement from playing benchmarks all day.

Re-framed the whole 'is 10GB enough' now to 'is 10GB enough to validate reviewer test settings'. Well, no one said the settings and result weren't valid, just that you wouldn't actually game on reviewer test settings. In other news: fine for actually playing the game. Further evidence the cust serv op chatbot careerist is as divisive as first thought.

Plays benchmarks, or games for short periods with large perf overlays, which is valid if one wants to try and make a channel around OC'd cards compared with review sites' findings; fine for comparing apples-to-apples performance. Oh, and that card-crunching graphical masterpiece that is chavball manager.

Dunno why you even converse with the cust serv op chatbot; soon to be outsourced to ChatGPT.

Then there's her groom of the stool, who can't work out why people bought a 4090 this time round and oddly says one is fine with prices this time. I waited for the 3090 results to be released before buying a 3080. Saw 10% extra perf for 100% extra cost and went for a 3080, which was 90% of a 3090. Price was never a problem, but near-as-makes-no-difference performance for double the price? Er, nah. Purchase based on what you get for the money, not affordability. I don't get the difficulty in understanding, but it does go well with never actually adding anything to the debate, just gifs and smilies. Probably gets on a sunshine bus each morning to go to what she thinks is work.

Too hard to work out the 10% for double the money last time. The money is still in the bank on a 3080, plus the second-hand value of a 3080 vs a 3090. If she 'wanted' a 4090, she'd be up to her armpits in the £3,300 difference, being the £550 banked for not getting a 3090 and the £500 resale price of a 3080, which held way better than a 3090's. 3090s are half price second-hand; still paying £200+ extra second-hand for a 3090, for 10% perf.

So two games in the OP... where is Farcrysis 6 in that list, or was it fixed and now works? When I look at FC6 and max everything it says 9.3GB, even with the low-fps enhancement (blur) setting on.

Apparently posting early AM means one must have been drinking, not out night shooting with mates until the wee hours. Just goes to show how narrow-minded people are. I'm in the thick of the rat race, apparently; someone needs to google 'rat race' to work out what it is, but that'll get re-framed, I imagine.
@gpuerrilla He's daytime drinking now; must be triggered, as it's another classic nonsensical post. Getting things wrong since September 2013 and still going strong. :cry:
 
@gpuerrilla He's daytime drinking now; must be triggered, as it's another classic nonsensical post. Getting things wrong since September 2013 and still going strong. :cry:


Answer this: do you play FC6, or any games, on a 4090/7900 XTX with motion blur on?

Probably another sentence one won't like, as one can't understand it, as nonsensical. Answer the question, chatAMD.
 
Answer this: do you play FC6, or any games, on a 4090/7900 XTX with motion blur on?

Probably another sentence one won't like, as one can't understand it, as nonsensical. Answer the question, chatAMD.
Live scenes in Exmouth, as senile old drunk asks random question.
[image: CByED37.gif]
Here are the Far Cry 6 and 10GB threads. Ask me in there; maybe I'll humour your troll with a reply. ;)
 