Didn't you just say it offered the same performance as the 3090 Ti? That's basically two years' worth of performance.
That's impossible considering it launched end of March 2022... not even a year.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Didn't you just say it offered the same performance as the 3090 Ti? That's basically two years' worth of performance.
At this point you are arguing for no reason other than to argue. The 3090 Ti is a chip that existed 3 years ago, but Nvidia decided to release it in 2022. That's like saying the 3080 12GB or, god forbid, the 3050 is performance from 2022. No it isn't. They're 3-year-old chips that released recently for whatever reasons made sense to Nvidia.
As promised, here is part 1, which is the intro and the most demanding part of the game I've encountered so far. Once you take control of the character, FPS are 60-75 at 4K max settings with RT. Although the game pre-compiles the shaders before you load into the game, it does do some shader compiling at various points as you move around the world. Part 2 to follow...
And Part 2
That's not what I've found. Dying Light 2 RT was incredibly taxing on my 4090 and 3090 TI and the XTX delivers quite impressive performance there in comparison. Same for F1 2021/22 and the Callisto Protocol.
That's impossible considering it launched end of March 2022... not even a year.
That's a good point, I forgot it was less than a year old.
Raytracing: Image quality and performance in detail
Dead Space Remake offers ray tracing on the PC - but only in the most minimal form possible: ray-traced ambient occlusion. That is not wrong per se, but it has hardly any visual effect in the game. This is partly because Dead Space Remake is generally very dark, but more importantly because the feature simply has little impact in this game.
Visually almost entirely without effect
90 percent of the time ray tracing makes no visual difference at all in Dead Space Remake, 8 percent of the time there's a very slight difference, and only 2 percent of the time does ray tracing actually look better. Admittedly, no measurements were taken, but that's roughly how the distribution feels.
For the best possible image quality, ray tracing should be enabled in Dead Space Remake; occasionally the rays do add visual value in the horror game. But if you have to do without ray tracing - be it for performance reasons or simply because your graphics card doesn't support it - you're ultimately giving up almost nothing. The implementation is primarily a checklist feature.
Here's a comparison of my Nitro XTX vs Bang4Bucks' water-cooled 4090 at 3GHz. The most interesting part is that he is using a second PC to record the gameplay, while I am recording from the same PC. Look at the difference in allocated video memory. Don't believe the myth that AMD always allocates more video memory: for every example that says they do, there are just as many where they use less. It really just varies from game to game.
Your 7900 XTX is using 70W more than the 4090 yet it's running several degrees cooler than the water-cooled 4090, what are your fan speeds? The 7950X is showing its hunger here too, I've experienced that as well, pulling 40W more than the 13900K.
Yes, the 4090 is more efficient; the extra performance (4 FPS) the 7900 XTX offers is costing about 70W more here. I will add that I am still power limited: if I could increase the power limit my FPS would keep going up, but this is the maximum power available to the AIB XTX. Give it more and you'll get more performance. The 4090 has BIOSes available that go up to 660W and even the FE has 600W. The XTX is stuck at 464W, and this holds back performance. The MBA card is stuck at 355W, which is why AIB XTX cards offer a 20%+ performance improvement when tuned.
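The power-limit figures in the post above can be sanity-checked with a quick sketch (the wattages come from the post; the note about sub-linear scaling is my own assumption):

```python
# Power-limit headroom of the AIB 7900 XTX over the reference (MBA) card,
# using the wattages quoted in the post above.
aib_xtx_limit = 464  # W, maximum power limit of the AIB XTX
mba_xtx_limit = 355  # W, power limit of the reference (MBA) XTX

headroom_pct = (aib_xtx_limit / mba_xtx_limit - 1) * 100
print(f"AIB XTX power headroom over MBA: {headroom_pct:.0f}%")
# ~31% more power budget, broadly consistent with the 20%+ tuned
# performance gap mentioned, since performance scales sub-linearly with power.
```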
Did anyone check the other thread.. Polaris can't play Forspoken - so you need to choose your GBs wisely.
Yikes, AMD cpu + AMD GPU = paying their weight in electricity. Holy cow the power draw is absurd.
I'd love to see how AMD + AMD does in cyberpunk against Intel + nvidia
Didn't you get the memo.....
Power efficiency & consumption is no longer important
Finally got round to watching some dead space gameplay, is it just me or do the graphics look a bit meh?
Bottom line:
- dropping settings to target (at least) 60 fps: does that offer a good experience, or is it vRAM limited?
- how about for 45-60 fps?
- how about for 30-45 fps?
If there is a problem, how many games have that problem? And in how many games does the problem exist due to lazy coding or a developer not giving a sh**t?
Last but not least... how many games run without issues, and how much do you overpay for silly amounts of vRAM that do nothing other than consume energy for nothing?
is it just me or do the graphics look a bit meh?
They do on my 3080 cos I play it at 800x600 due to having sod all vram, lol.
Enabling ray tracing comes with a surprisingly small performance hit, around 15-20%, but the ray tracing effects aren't that impressive anyway. On the other hand, looking at our quality comparisons I can support the inclusion of RT; the shadows in particular look much better. With RT, at 1080p you'll need an RTX 3070 or RX 6800 XT. For 1440p60 an RTX 3090 or RX 7900 XT is required, and 4K60 with RT is only smooth on the RTX 4090. Given the visual fidelity offered, RT on or off, I'd say these hardware requirements are crazy and the developer should have spent more time optimizing their game for the PC platform.
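As a quick illustration of what a 15-20% ray-tracing hit means in practice (the percentages are from the review above; the 60 fps target is an assumption for the example):

```python
# Frame rate needed with RT off to still hold 60 fps after a 15-20% RT hit.
target_fps = 60
for rt_hit in (0.15, 0.20):
    needed = target_fps / (1 - rt_hit)
    print(f"{rt_hit:.0%} RT hit: ~{needed:.0f} fps needed with RT off")
```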
There isn't enough GPU power for Forspoken
Forspoken puts a strain on the graphics card like no other game. Even 45 FPS in Full HD without ray tracing is not easily achieved; even reasonably modern hardware quickly falls short. In concrete terms this means: in the test field, which is admittedly still small (an update will follow), it takes a Radeon RX 6800 XT or GeForce RTX 3080 to achieve that. No, this is not a joke.
45 FPS is still possible with a GeForce RTX 3060 Ti, but only if the graphics preset is turned down to "Standard". And even then the aggressive LOD is visibly annoying.
The GeForce RTX 3080 still achieves 45 FPS in WQHD, but the Radeon RX 6800 XT no longer does; the Radeon RX 6900 XT should do the trick. For Ultra HD with upsampling set to "Quality", it takes at least a GeForce RTX 3080 Ti or a Radeon RX 7900 XT - that's tough.
In the duel between AMD and Nvidia, Nvidia has the edge in Forspoken - at least as far as the old generation is concerned. The GeForce RTX 3080 is 17 percent faster in WQHD than the Radeon RX 6800 XT, whereas in the AAA-game average the gap is only 4 percent. The RDNA 3-based Radeon RX 7900 XTX, on the other hand, does much better: AMD's flagship fights a close duel with Nvidia's GeForce RTX 4080, and depending on the resolution the Radeon achieves a tie or trails by at most 4 percent. However, the RDNA 3 driver is already optimized for the game, while RDNA 2 owners have been waiting for an update since December 2022.
Ada Lovelace scales against its predecessor Ampere as usual, but RDNA 3 is significantly faster than RDNA 2 here. The gap between the Radeon RX 6800 XT and the Radeon RX 7900 XTX in WQHD is usually 48 percent, but in Forspoken it is a significantly higher 72 percent. Whether this is due to the older Radeon driver for RDNA 2 or simply to the fact that RDNA 3 can fully utilize its dual-issue shader units in this game is unclear.
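To put those two figures side by side (the 48 and 72 percent gaps are from the review; the derived number is just the ratio between them):

```python
# How much further Forspoken's RDNA3-vs-RDNA2 gap stretches beyond the norm.
typical_gap = 1.48    # 7900 XTX vs 6800 XT, WQHD, usual game average (+48%)
forspoken_gap = 1.72  # the same comparison in Forspoken (+72%)

extra_scaling_pct = (forspoken_gap / typical_gap - 1) * 100
print(f"Forspoken scales ~{extra_scaling_pct:.0f}% beyond the usual gap")
```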
According to the developer, the graphics card should have 12 GB of VRAM to enjoy Forspoken with maximum texture details without problems. Whether it really has to be 12 GB is not entirely clear, but 8 GB is definitely not enough. Without ray tracing there were no problems in Full HD, but with the rays enabled not all textures load; here the texture level has to be turned down to solve the problem.
The problems were not noticed on a 10 GB graphics card, but the editors did not examine the phenomenon closely either. The game itself quickly allocates almost 16 GB of graphics memory, but that figure has no real significance.