12GB VRAM enough for 4K? Discuss..

Status: Not open for further replies.
That's impossible considering it launched end of March 2022... not even a year.
At this point you are arguing for no reason other than to argue. The 3090 Ti is a chip that existed three years ago, but Nvidia decided to release it in 2022. That's like saying the 3080 12GB or, god forbid, the 3050 is 2022 performance. No it isn't. They're three-year-old chips that were released recently for whatever reasons Nvidia saw fit.
 
As promised, here is part 1 which is the intro part and the most demanding part of the game I've encountered so far. Once you take control of the character, FPS are 60-75 at 4K max settings with RT. Although the game pre-compiles the shaders before you load into the game, it does do some shader compiling at various points as you move around the world. Part 2 to follow...

That's not what I've found. Dying Light 2 RT was incredibly taxing on my 4090 and 3090 Ti, and the XTX delivers quite impressive performance there in comparison. Same for F1 2021/22 and The Callisto Protocol.
And Part 2

Found a stock 4090 for comparison using the same settings with no DLSS/FSR, if you would like to see how they stack up.
I would have uploaded my own 4090 footage, but it's sold now.
That's impossible considering it launched end of March 2022... not even a year.
That's a good point, I forgot it was less than a year old. :)
 
That's raytracing? LOL! That's even less than Far Cry 6! C'mon bruh. Tell the whole truth.

Computerbase:
Raytracing: Image quality and performance in detail
Dead Space Remake offers ray tracing on the PC, but only in the most minimal form possible: ray-traced ambient occlusion. That is not wrong per se, but it has hardly any visual effect in the game, partly because Dead Space Remake is generally very dark. More importantly, the feature simply has little impact in general.
Visually almost entirely without effect
90 percent of the time ray tracing makes no visual difference at all in Dead Space Remake, 8 percent of the time there's a very slight difference, and only 2 percent of the time does ray tracing actually look better. Admittedly, no measurements were taken, but that is roughly how the distribution feels.
For the best possible image quality, ray tracing should be enabled in Dead Space Remake; occasionally the rays in the horror game do add some visual value. But if you have to do without ray tracing, whether for performance reasons or because your graphics card does not support it, you are ultimately giving up almost nothing. The implementation is primarily a checklist feature.

 
That's raytracing? LOL! That's even less than Far Cry 6! C'mon bruh. Tell the whole truth.

Computerbase:





rofl-lol.gif
 
Here's a comparison of my Nitro XTX vs Bang4Bucks' water-cooled 4090 at 3GHz. The most interesting part is that he is using a second PC to record the gameplay, while I am recording on the same PC. Look at the difference in allocated video memory. Don't believe the myth that AMD always uses more video memory; for every example that says they do, there are just as many out there where they use less. It really just varies on a game-to-game basis.
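For anyone who wants to log allocated video memory themselves rather than eyeballing an overlay, here's a minimal Python sketch using the pynvml bindings. Caveats: NVIDIA only, it assumes the nvidia-ml-py package is installed, and it reports device-wide usage rather than a single game's allocation; AMD cards need a different tool.

```python
# Minimal VRAM logger - a sketch, assuming an NVIDIA GPU and the
# nvidia-ml-py package (imported as pynvml). Reports device-wide usage,
# which tracks "allocated" memory rather than what the game actively uses.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```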
 
Here's a comparison of my Nitro XTX vs Bang4Bucks' water-cooled 4090 at 3GHz. The most interesting part is that he is using a second PC to record the gameplay, while I am recording on the same PC. Look at the difference in allocated video memory. Don't believe the myth that AMD always uses more video memory; for every example that says they do, there are just as many out there where they use less. It really just varies on a game-to-game basis.


Your 7900 XTX is using 70W more than the 4090, yet it's running several degrees cooler than the water-cooled 4090. What are your fan speeds? The 7950X is showing its hunger here too; I've experienced that as well, pulling 40W more than the 13900K.
 
Your 7900 XTX is using 70W more than the 4090, yet it's running several degrees cooler than the water-cooled 4090. What are your fan speeds? The 7950X is showing its hunger here too; I've experienced that as well, pulling 40W more than the 13900K.
Yes, the 4090 is more efficient; the extra performance (4 FPS) the 7900 XTX offers is costing about 70W more here. I will add that I am still power limited. If I could increase the power limit, my FPS would keep going up, but this is the maximum power available to the AIB XTX. Give it more and you get more performance. The 4090 has BIOSes available that go up to 660W, and even the FE allows 600W. The XTX is stuck at 464W, and this holds back performance. The MBA card is stuck at 355W, which is why AIB XTX cards offer a 20%+ performance improvement when tuned.

My edge temp may be lower, but I suspect my hotspot temperature is higher. Fan speed was 60%. He doesn't monitor hotspot, but on my 4090 the hotspot was usually about 10°C higher than the edge temperature. Yes, the 7950X uses more power in games too; that's what 16 high-performance cores will do vs 8. Those roles will be completely reversed in any other kind of multithreaded workload, though. It will be interesting to see how this (gaming power draw) changes when I put a 7950X3D into this system.
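To put rough numbers on that efficiency trade-off, here's a back-of-the-envelope perf-per-watt sketch. Only the ~4 FPS gap, the ~70W difference and the power limits come from the posts above; the absolute FPS values are hypothetical placeholders.

```python
# Back-of-the-envelope perf-per-watt comparison. The wattage gap and the
# ~4 FPS delta come from the discussion above; absolute FPS values are
# hypothetical placeholders for illustration only.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

xtx_fps, xtx_watts = 104.0, 464.0  # AIB 7900 XTX at its 464W power limit
rtx_fps, rtx_watts = 100.0, 394.0  # 4090 drawing roughly 70W less here

print(f"7900 XTX: {perf_per_watt(xtx_fps, xtx_watts):.3f} FPS/W")
print(f"4090:     {perf_per_watt(rtx_fps, rtx_watts):.3f} FPS/W")
# ~4% more FPS for ~18% more power: the 4090 comes out ahead on efficiency.
```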
 
Did anyone check the other thread? Polaris can't play Forspoken, so you need to choose your GBs wisely.

:cry:


AMD boys were all happy and laughing about Forspoken getting added to the '12GB is not enough' list. Then the game comes out and it turns out it's another pile of turd:


shocked-shocked-face.gif



brostradamus putting this one to bed with his post above:


mati-bodoh.gif



Rest of the people following the thread:

kelce.gif



Mods and newcomers to the thread:

funny-animals-dog.gif
 
Here's a comparison of my Nitro XTX vs Bang4Bucks' water-cooled 4090 at 3GHz. The most interesting part is that he is using a second PC to record the gameplay, while I am recording on the same PC. Look at the difference in allocated video memory. Don't believe the myth that AMD always uses more video memory; for every example that says they do, there are just as many out there where they use less. It really just varies on a game-to-game basis.
Yikes, AMD CPU + AMD GPU = paying their weight in electricity. Holy cow, the power draw is absurd.

I'd love to see how AMD + AMD does in Cyberpunk against Intel + Nvidia :D
 
Yikes, AMD CPU + AMD GPU = paying their weight in electricity. Holy cow, the power draw is absurd.

I'd love to see how AMD + AMD does in Cyberpunk against Intel + Nvidia :D

Didn't you get the memo.....

Power efficiency & consumption is no longer important ;) :p :D




Finally got round to watching some Dead Space gameplay. Is it just me, or do the graphics look a bit meh?
 
Didn't you get the memo.....

Power efficiency & consumption is no longer important ;) :p :D




Finally got round to watching some Dead Space gameplay. Is it just me, or do the graphics look a bit meh?

I am somewhat tempted to just play the original again. According to ultrawide mrk, the HDR is not a proper implementation and he is playing in SDR. The RT is almost non-existent (which explains why the 7900 XTX is doing so well :p). Plus there are reports of DLSS/FSR creating ghosting. Then again, I am in no rush to play the game, full stop, so I'm happy to wait until the price drops significantly and I've cleared my backlog of games.
 
Bottom line:

- Can dropping settings to target (at least) 60fps offer a good experience, or is it vRAM limited?

- How about for 45-60 fps?

- How about for 30-45 fps?

If there is a problem, how many games have that problem? In how many games does the problem exist due to lazy coding or the developer not giving a sh**t?

Not least... how many games run without issues, and how much do you overpay for silly amounts of vRAM that do nothing other than consume energy for nothing?
 
Bottom line:

- Can dropping settings to target (at least) 60fps offer a good experience, or is it vRAM limited?

- How about for 45-60 fps?

- How about for 30-45 fps?

If there is a problem, how many games have that problem? In how many games does the problem exist due to lazy coding or the developer not giving a sh**t?

Not least... how many games run without issues, and how much do you overpay for silly amounts of vRAM that do nothing other than consume energy for nothing?

Nah mate. You are just reframing things. It does not matter if the developer is lazy or the game is crap, etc. All that matters is that a single title exists, in any shape or form, that uses more than x amount of vRAM (whatever the flavour of the year is), and you have lost the argument there and then. I have forspoken! :p
 

[benchmark charts]

Enabling ray tracing comes with a surprisingly small performance hit, around 15-20%, but the ray tracing effects aren't that impressive anyway. On the other hand, looking at our quality comparisons I can support the inclusion of RT; the shadows especially look much better. With RT, at 1080p you'll need an RTX 3070 or RX 6800 XT. For 1440p60 an RTX 3090 or RX 7900 XT is required, and 4K60 with RT is only smooth on the RTX 4090. Given the visual fidelity offered, RT on or off, I'd say these hardware requirements are crazy and the developer should have spent more time optimizing their game for the PC platform.
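A quick sanity check on what a 15-20% RT hit means for framerate targets; the numbers below are illustrative arithmetic, not measurements:

```python
# What a 15-20% RT performance hit implies for a 60 FPS target.
# Illustrative arithmetic only - not measured data.
for hit in (0.15, 0.20):
    base_needed = 60 / (1 - hit)  # FPS needed with RT off to hold 60 with RT
    print(f"{hit:.0%} hit: need ~{base_needed:.0f} FPS with RT off")
# 15% hit -> ~71 FPS; 20% hit -> ~75 FPS without RT to stay at 60 with it.
```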



[benchmark charts]

There isn't enough GPU power for Forspoken


Forspoken puts a strain on the graphics card like no other game. Even 45 FPS in Full HD without ray tracing is not easily achieved; even reasonably modern hardware quickly fails. In concrete terms this means: in the test field, which is admittedly still small (an update will follow), it takes a Radeon RX 6800 XT or GeForce RTX 3080 to achieve that. No, this is not a joke.


45 FPS is still possible with a GeForce RTX 3060 Ti, but only if the graphics preset is turned down to the "Standard" setting. And then the aggressive LOD is visibly annoying.


The GeForce RTX 3080 still achieves 45 FPS in WQHD, but the Radeon RX 6800 XT no longer does; the Radeon RX 6900 XT should do the trick. For Ultra HD with upsampling set to "Quality", it has to be at least a GeForce RTX 3080 Ti or a Radeon RX 7900 XT - that's tough.

In the duel between AMD and Nvidia, Nvidia has the edge in Forspoken - at least as far as the old generation is concerned. The GeForce RTX 3080 is 17 percent faster than the Radeon RX 6800 XT in WQHD, versus an average of only 4 percent across AAA games otherwise. The RDNA 3 based Radeon RX 7900 XTX, on the other hand, does much better: AMD's flagship is in a close duel with Nvidia's GeForce RTX 4080; depending on the resolution, the Radeon achieves a tie or trails by a maximum of 4 percent. However, the RDNA 3 driver is already optimized for the game, while RDNA 2 owners have been waiting for an update since December 2022.




Ada Lovelace performs as usual relative to its predecessor Ampere, but RDNA 3 is significantly faster than RDNA 2. The gap between the Radeon RX 6800 XT and the Radeon RX 7900 XTX in WQHD is usually 48 percent, but in Forspoken it is a significantly higher 72 percent. It is unclear whether this is due to the older Radeon driver for RDNA 2 or simply to the fact that RDNA 3 can fully utilize its dual-issue shader units in the game.

According to the developer, the graphics card should have 12 GB of VRAM in order to enjoy Forspoken with maximum texture details without any problems. Whether it really has to be 12 GB is not entirely clear, but 8 GB is definitely not enough. Without ray tracing there were no problems in Full HD, but with the rays enabled not all textures load. Here the texture level has to be turned down to solve the problem.


The problems were not noticed on a 10 GB graphics card, but the editors did not take a close look at the phenomenon either. The game itself quickly occupies almost 16 GB of graphics card memory, but this has no real significance.
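The quote's "turn the texture level down" advice is essentially what a VRAM-aware settings picker would automate. Here's a hypothetical sketch of that logic; the function and thresholds are mine, loosely following the 8/10/12 GB figures above, not anything the game actually does.

```python
# Hypothetical texture-quality picker. Thresholds loosely follow the
# article: 12 GB recommended for max textures, 10 GB showed no obvious
# problems, and 8 GB fails to load all textures once RT is enabled.
def pick_texture_quality(vram_gib: float, rt_enabled: bool) -> str:
    if vram_gib >= 12:
        return "maximum"
    if vram_gib >= 10:
        return "high"
    if vram_gib >= 8:
        # with ray tracing on, 8 GB cards need textures turned down
        return "medium" if rt_enabled else "high"
    return "low"

print(pick_texture_quality(8, rt_enabled=True))  # -> medium
```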
 
Though the game has been critically panned... I think we should do an Nvidia fanboy routine with these benchmarks. We could also build a narrative about how this is supposed to be a next-gen, cutting-edge game, like the AMD fangirls did when comparing performance in Assassin's Creed, when everybody and their dog knows that nothing has yet surpassed Doom Eternal in implementation.

And I forgot the thing about hardware schedulers as well.
 