The RTX 4070 uses RTX 4080 dies that had defects
He's using an i7 12700 CPU, which I would've thought would be good enough.
Yup, that shouldn't be bottlenecking the Ti at all.
You've got that all wrong with the 3080 mate, it's totally fine you just have to turn on DLSS and that keeps it relevant and it's still way better than everything!
Apologies about that @Bill Turnip
No need! Honestly not sure what you're apologising to me for, no offence taken. Just like to remind people I'm not dead yet. I think there are perfectly valid points to a 3080 starting to creak at full whack on new games. I said that it would happen a while ago.
I don't see the point of that comparison at the moment.
The RX 6900 XT and 6950 XT are generally a bit faster than both the RTX 3080 and RTX 4070, and the 6950 XT can be had for £620.
If you want better RT performance, then you would choose a modern Nvidia card.
Here's a video of an RTX 3080 Ti struggling in The Last of Us at 1440p Ultra:
Last of Us RTX 3080 Ti 1440p max details gameplay
RTX 3080 Ti - 2K gaming performance - The Last of Us
Test PC details: CPU: Intel i7 12700 (non-K); Cooler: Deepcool AG620; RAM: G.Skill Trident Z 2x32GB ... (www.youtube.com)
Very high GPU usage throughout the video.
He's using an i7 12700 CPU, which I would've thought would be good enough.
Maybe some of the Ultra preset options are just awful for GPU utilization?
Is the game sensitive to RAM frequency? He is using DDR4 @ 3600 MHz.
There's a huge amount of system RAM being allocated (>30GB), what the hell?
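For what it's worth, the RAM-frequency question can be put into numbers: theoretical peak bandwidth for DDR4 is the transfer rate times 8 bytes per 64-bit channel. A quick back-of-the-envelope sketch (standalone arithmetic, not a profile of his actual system):

```python
# Theoretical peak bandwidth of a DDR memory setup.
# DDR4 channels are 64 bits (8 bytes) wide; the effective transfer
# rate is the advertised figure, e.g. 3600 MT/s for DDR4-3600.
def ddr_bandwidth_gb_s(mt_per_s: int, channels: int = 2) -> float:
    bytes_per_transfer_per_channel = 8  # 64-bit channel
    return mt_per_s * 1e6 * bytes_per_transfer_per_channel * channels / 1e9

# Dual-channel DDR4-3600, as in the video's test PC:
print(ddr_bandwidth_gb_s(3600))  # 57.6 GB/s
# For comparison, dual-channel DDR4-3200:
print(ddr_bandwidth_gb_s(3200))  # 51.2 GB/s
```

So the step from 3200 to 3600 is only ~12% more peak bandwidth; if the game were bandwidth-bound you'd expect smaller frame-rate swings than what the video shows.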
A patch for The Last of Us Part I is now live. This update primarily focuses on performance improvements, reduced shader building times, and texture fidelity on Low and Medium presets.
Downloading this patch will trigger a full shader rebuild.
For optimal improvements to your experience, please ensure you have the most up-to-date version of your respective driver. At the time of publishing these notes, the most recent versions are:
AMD Software: Adrenalin Edition 23.4.3, which includes a fix for “longer than expected shader compilation time” in The Last of Us Part I
NVIDIA GeForce Game Ready Driver 531.79
- Reduced shader building times
- Optimized code to improve global CPU performance
- Optimized content to improve performance across several levels
- Improved level loading to help reduce the amount of 'Please Wait' and loading screens
- Added a new Effects Density setting, which adjusts the density and number of non-critical visual effects (Options > Graphics)
- Increased crowd sizes on Low and Medium Ambient Character Density settings and added a Very Low option (Options > Graphics)
- Implemented additional scalability tuning for Low and Medium in-game Graphics settings
- Reduced the VRAM impact of texture quality settings, allowing most players to increase their texture quality settings or experience improved performance with their current settings
- Fixed a crash that would occur on boot on Windows 11
- Fixed a crash that could occur on Intel Arc
- Fixed a crash that may occur when starting a New Game in Left Behind prior to the completion of shader building
- Corrected an issue where pointing the camera at the floor while aiming would cause the player and camera to visually stutter
- Fixed an issue where Sniping Sensitivity settings were not applying to all scoped weapons. Additionally, Sniping Sensitivity has been renamed to Scoped Sensitivity (Options > Controls > Mouse Camera Sensitivity)
- Fixed an issue where players could not click on 'Jump To Unbound' when prompted in the custom controls settings (Options > Controls > Input > Customize Controls)
- Fixed an issue where changing Graphics settings (Options > Graphics > Graphics Preset) during gameplay wouldn't restart the player at the correct checkpoint
- Fixed an issue where adjustments to Lens Flare (Options > Graphics > Visual Effects Settings > Lens Flare) were not applied
- [Pittsburgh] Fixed an issue where players may consistently fall out-of-world when restarting at the checkpoint in the bookstore
- [Left Behind] Fixed an issue where players may get soft locked when jumping into the electrified water
AMD-Specific
- Implemented improvements to load times
- Fixed an issue where shaders may take an abnormally long time to load
We at Naughty Dog and our partners at Iron Galaxy are closely watching player reports to support future improvements and patches. We are actively optimizing, working on game stability, and implementing additional fixes, all of which will be included in regularly released future updates.
For other issues we are currently tracking or investigating, please refer to our Known Issues. If you encounter any of these problems, please contact support to help us gather more data and insights.
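The texture-VRAM item in those notes tracks with basic mip-chain arithmetic: each texture quality tier typically halves resolution, which roughly quarters the footprint. A rough sketch (illustrative numbers only, assuming a block-compressed format at ~1 byte per texel, not the game's actual asset sizes):

```python
# Approximate VRAM cost of one texture including its full mip chain.
# Block-compressed formats like BC7 store 16 bytes per 4x4 block,
# i.e. about 1 byte per texel.
def texture_vram_mib(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    total = 0.0
    w, h = width, height
    while True:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        w = max(1, w // 2)
        h = max(1, h // 2)
    return total / (1024 * 1024)

# A 4K texture costs roughly 4x a 2K one, which is why dropping a
# single texture-quality tier frees so much VRAM:
print(round(texture_vram_mib(4096, 4096), 1))  # 21.3 MiB
print(round(texture_vram_mib(2048, 2048), 1))  # 5.3 MiB
```

The mip chain only adds about a third on top of the base level (the 1/4 + 1/16 + ... series), so the base resolution dominates the cost.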
Luckily some of us have a sense of humour! Even @Bill Turnip gave up being objective. Upgrading to the 4090 was the solution though... only a mere £900 extra on top of the original investment, eh. Oh, and the 4070 being faster than the 3090 - hope that upgrade went well for them!
No need! Honestly not sure what you're apologising to me for, no offence taken. Just like to remind people I'm not dead yet. I think there are perfectly valid points to a 3080 starting to creak at full whack on new games. I said that it would happen a while ago.
Owning one doesn't make it the greatest super thing ever with no flaws, though. BUT this is old hat now; the 4000 series has the latest issues, and I'm pretty sure the 5000 series will definitely ride in to save the day with a low, low price.
Can the 3080 with 12GB handle The Witcher 3 next-gen at 1440p smoothly?
The 3080 Ti seems to be able to, with RT disabled (lows of ~68 FPS):
The Witcher 3 Next Gen UPDATE - RTX 3080 Ti FPS Test (1440p/4K)
Timeline:
0:00 - 1440p Ultra+; RT ON ALL; DLSS QLTY
2:11 - 1440p Ultra+; RT ON ALL; TAAU
3:36 - 1440p Ultra+; RT OFF ALL; TAAU
5:08 - 4K Ultra+; RT ON ALL; DLSS ...
(youtu.be)
Do you mean v4.03/v4.02?
Bit of a pointless video now too, as the game had a recent patch that considerably improved performance.
Do you mean v4.03/v4.02?
Can't remember the version. The CPU utilisation was the main issue with the performance though.
Looks like it was focused on improving CPU performance.
smaller 4090s
I bet the price isn't reduced...
60 fps is trash tier
Seems like it's not an issue (at least getting >60 FPS) in version 4.0, if RT is disabled or just RTXGI is enabled, at least with a 12900K.
Digital Foundry tested this really well in this video:
https://youtu.be/HH87uJzUoew?t=950
Frame generation seems to make a very large difference when CPU limited, at least in this game.
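That matches how frame generation behaves under a CPU bottleneck: one interpolated frame is presented per rendered frame, so the presented rate can roughly double even though the CPU still caps the rendered rate. A toy model (my own simplification, not the actual DLSS 3 pipeline):

```python
# Toy model of frame generation when the CPU caps rendered FPS:
# one interpolated frame is presented for every rendered frame.
# Note input latency still tracks the rendered rate, not the
# presented rate, so responsiveness doesn't improve.
def presented_fps(cpu_capped_fps: float, frame_gen: bool) -> float:
    return cpu_capped_fps * (2.0 if frame_gen else 1.0)

# CPU-limited to 55 rendered FPS:
print(presented_fps(55, frame_gen=False))  # 55.0
print(presented_fps(55, frame_gen=True))   # 110.0
```

In practice the uplift is a bit under 2x because interpolation has its own cost, but the doubling is a reasonable first approximation when the GPU has headroom.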