NVIDIA 4000 Series

You've got that all wrong with the 3080, mate. It's totally fine, you just have to turn on DLSS and that keeps it relevant, and it's still way better than everything! :p

Luckily some of us have a sense of humour! Even @Bill Turnip gave up being objective. Upgrading to the 4090 was the solution though... only a mere £900 extra on top of the original investment, eh. Oh, and the 4070 being faster than the 3090; hope that upgrade went well for them! :p
 
I still have my sense of humour, don't fret! I've just run out of helpful and objective things to say (or stupid things that might entertain). Discussions just become cases of throwing enough **** at the wall to see what sticks in the long run.

Decided to just sit back and enjoy playing games... and turning down settings every now and then!
 
Apologies about that, @Bill Turnip
No need! Honestly not sure what you're apologising to me for, no offence taken. Just like to remind people I'm not dead yet. I think there are perfectly valid points about a 3080 starting to creak at full whack on new games. I said it would happen a while ago.

Owning one doesn't make it the greatest super thing ever with no flaw though. BUT this is old hat now, the 4000 series is the latest issue, and I'm pretty sure the 5000 series will definitely ride in to save the day with a low, low price.
 
I don't see the point of that comparison at the moment.

The RX 6900 XT and 6950 XT are generally a bit faster than both the RTX 3080 and RTX 4070, and the 6950 XT can be had for £620.

If you want better RT performance, then you would choose a modern Nvidia card.

Agreed, nowadays it's a different scene, but what could people do 2-3 years ago? Pay an extra £500+ to get a GPU that has arguably not aged as well as the peasant 10GB 3080...

If buying now, you either go with a 4090 or look at second-hand last-gen GPUs; don't settle for anything else, imo.

Here's a video of an RTX 3080 Ti struggling in The Last of Us at 1440p Ultra:

Very high GPU usage throughout the video.

He's using an i7-12700 CPU, which I would've thought would be good enough.

Maybe some of the Ultra preset options are just awful for GPU utilization?

Is the game sensitive to RAM frequencies? He's using DDR4 @ 3600 MHz.

There's a huge amount of system RAM being allocated (>30GB), what the hell?
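If anyone wants to sanity-check those numbers on their own machine, here's a rough sampler, assuming an NVIDIA card with nvidia-smi on the PATH and psutil installed (pip install psutil):

    # Samples the numbers discussed above once a second: system RAM in use,
    # plus GPU utilisation and VRAM as reported by nvidia-smi.
    import subprocess
    import time

    import psutil

    def sample():
        ram_gb = psutil.virtual_memory().used / 2**30
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        gpu_pct, vram_mb = (v.strip() for v in out.split(","))
        return ram_gb, int(gpu_pct), int(vram_mb)

    if __name__ == "__main__":
        while True:
            ram_gb, gpu_pct, vram_mb = sample()
            print(f"RAM {ram_gb:5.1f} GB | GPU {gpu_pct:3d}% | VRAM {vram_mb} MiB")
            time.sleep(1)

Run it in the background while the game loads a level and you'll see whether that >30GB allocation reproduces on your setup.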

Most comparisons for TLOU are pointless now given the patch released yesterday, tbh:

A patch for The Last of Us Part I is now live. This update primarily focuses on performance improvements, reduced shader building times, and texture fidelity on Low and Medium presets.
Downloading this patch will trigger a full shader rebuild.
For optimal improvements to your experience, please ensure you have the most up-to-date version of your respective driver. At the time of publishing these notes, the most recent versions are:
AMD Software: Adrenalin Edition 23.4.3, which includes a fix for “longer than expected shader compilation time” in The Last of Us Part I
NVIDIA GeForce Game Ready Driver 531.79
  • Reduced shader building times
  • Optimized code to improve global CPU performance
  • Optimized content to improve performance across several levels
  • Improved level loading to help reduce the amount of 'Please Wait' and loading screens
  • Added a new Effects Density setting, which adjusts the density and number of non-critical visual effects (Options > Graphics)
  • Increased crowd sizes on Low and Medium Ambient Character Density settings and added a Very Low option (Options > Graphics)
  • Implemented additional scalability tuning for Low and Medium in-game Graphics settings
  • Reduced the VRAM impact of texture quality settings, allowing most players to increase their texture quality settings or experience improved performance with their current settings
  • Fixed a crash that would occur on boot on Windows 11
  • Fixed a crash that could occur on Intel Arc
  • Fixed a crash that may occur when starting a New Game in Left Behind prior to the completion of shader building
  • Corrected an issue where pointing the camera at the floor while aiming would cause the player and camera to visually stutter
  • Fixed an issue where Sniping Sensitivity settings were not applying to all scoped weapons. Additionally, Sniping Sensitivity has been renamed to Scoped Sensitivity (Options > Controls > Mouse Camera Sensitivity)
  • Fixed an issue where players could not click on 'Jump To Unbound' when prompted in the custom controls settings (Options > Controls > Input > Customize Controls)
  • Fixed an issue where changing Graphics settings (Options > Graphics > Graphics Preset) during gameplay wouldn't restart the player at the correct checkpoint
  • Fixed an issue where adjustments to Lens Flare (Options > Graphics > Visual Effects Settings > Lens Flare) were not applied

  • [Pittsburgh] Fixed an issue where players may consistently fall out-of-world when restarting at the checkpoint in the bookstore
  • [Left Behind] Fixed an issue where players may get soft locked when jumping into the electrified water

AMD-Specific

  • Implemented improvements to load times
  • Fixed an issue where shaders may take an abnormally long time to load
We at Naughty Dog and our partners at Iron Galaxy are closely watching player reports to support future improvements and patches. We are actively optimizing, working on game stability, and implementing additional fixes which will all be included in regularly released future updates.
For other issues we are currently tracking or investigating, please refer to our Known Issues. If you encounter any of these problems, please contact support to help us gather more data and insights.
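That "Reduced the VRAM impact of texture quality settings" line is the one that matters for 10GB cards. For a feel of why texture quality dominates VRAM, here's a back-of-the-envelope sketch; the per-tier resolutions, BC7 compression and texture count are illustrative guesses on my part, not anything from the patch notes:

    # Back-of-the-envelope VRAM cost of a texture set at different quality tiers.
    # Assumes BC7 block compression (~1 byte/texel) and a full mip chain (~4/3).
    # Tier resolutions and the resident texture count are illustrative guesses.
    TIERS = {"Low": 1024, "Medium": 2048, "High": 4096}
    BYTES_PER_TEXEL = 1.0   # BC7-compressed colour data
    MIP_FACTOR = 4 / 3      # a full mip chain adds roughly a third
    NUM_TEXTURES = 500      # rough guess at textures resident at once

    for name, side in TIERS.items():
        total_gb = side * side * BYTES_PER_TEXEL * MIP_FACTOR * NUM_TEXTURES / 2**30
        print(f"{name:6s}: {side}x{side} -> ~{total_gb:.1f} GB resident")

Each tier quadruples the texel count, which is why dropping one texture notch can free several GB at the top end, and why a patch note like this is a big deal for 8-10GB cards.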


Ask tommy how he is finding the 4070 :cry:

At least 4090 users are getting practically a whole generational leap compared to every other current-gen GPU; meanwhile the 3090, an extra £750 well spent... Jensen be loving it.
 
Don't think anyone has ever said it's the best GPU ever, however, as above, given the choices and for those who got it for £650, it is arguably the best bang-per-buck GPU in a long time; the alternatives to a £650 3080 were quite frankly ****:

- pay an extra £750 for a 4-15% perf difference, where the main advantage has proved to be pretty pointless thus far except for those who needed the VRAM for their work (rough maths sketched below)
- pay similar or £500+ extra for an alternative which had no upscaling tech at all and pitiful ray tracing, which is proving to be very valuable

Given current-gen GPUs and even a 4090 are ******** the bed in broken games without the aid of FSR/DLSS, you would kind of expect a 3-year-old GPU to also be ******** the bed without DLSS/FSR :cry:
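A minimal sketch of that first point in numbers, using the prices quoted in this thread (£650 for the 3080, an extra £750 for a 3090) and a hypothetical 10% performance gap picked from the middle of the 4-15% range:

    # Rough cost-per-performance comparison using the thread's prices.
    # The 10% performance delta is a hypothetical mid-point of the 4-15% above.
    cards = {
        "RTX 3080 10GB": {"price": 650,  "rel_perf": 1.00},
        "RTX 3090":      {"price": 1400, "rel_perf": 1.10},  # £650 + £750 extra
    }

    for name, c in cards.items():
        per_unit = c["price"] / c["rel_perf"]
        print(f"{name}: £{c['price']} -> £{per_unit:.0f} per unit of 3080 performance")

    extra_cost = cards["RTX 3090"]["price"] - cards["RTX 3080 10GB"]["price"]
    extra_perf_pct = (cards["RTX 3090"]["rel_perf"] - 1.00) * 100
    print(f"Marginal cost: £{extra_cost / extra_perf_pct:.0f} per extra 1% of performance")

At those numbers that works out to £75 per extra 1% of performance, which is the bang-per-buck argument in a nutshell.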
 
Can the 3080 12GB handle The Witcher 3 next-gen update at 1440p smoothly?

The 3080 Ti seems to be able to, with RT disabled (lows of ~68 FPS):

Bit of a pointless video now too, as the game had a recent patch that considerably improved performance.

The Daniel Owen video above is much more relevant now.
 
Seems like it's not an issue (you get >60 FPS) in version 4.0 if RT is disabled or just RTXGI is enabled, at least with a 12900K.

Digital Foundry tested this really well in this video:
https://youtu.be/HH87uJzUoew?t=950

Frame generation seems to make a very large difference when CPU-limited, at least in this game.
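For intuition on why it helps so much at a CPU limit, a toy calculation; the one-generated-frame-per-rendered-frame model matches how DLSS 3 is described, but the 10% overhead figure is a guess on my part, not a measurement:

    # Toy model: at a CPU limit, frame generation inserts one generated frame
    # per rendered frame, so presented FPS approaches double the CPU-capped rate.
    # The 10% overhead is a hypothetical assumption, not a measured number.
    def presented_fps(cpu_capped_fps: float, overhead: float = 0.10) -> float:
        rendered = cpu_capped_fps * (1 - overhead)  # generation eats some frame time
        return rendered * 2                         # one generated per real frame

    for cap in (40, 60, 80):
        print(f"CPU-limited at {cap} fps -> roughly {presented_fps(cap):.0f} fps presented")

Presented frames, mind; input latency still tracks the underlying render rate.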
 
smaller 4090s

I bet the price isn't reduced...

Got to be honest, I was kind of expecting them to do this when they announced the ProArt models and was surprised the 4090 wasn't an option. Having said that, it looks like the one dimension most people actually need reduced hasn't been touched; it's still just as wide as the old one, meaning there could be cable-clearance issues...
 