• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

The leaks show it more than 15% faster; well, Nvidia's own slides do!

And with DLSS 3? If it wasn't much faster I'd get a 3090 Ti and marvel that I've got last year's flagship beast.

With DLSS 3, yes, but going by the DF video there are some clear shortcomings with it at the moment (latency and interpolation artifacts being the biggest); this may improve over time.

In straight raster performance in MSFS, I'll give you, it was faster, but in A Plague Tale it was 9%, so on average 15% is probably where it will land.
 
DLSS 3 ain't all that.


Kudos to Tim for making a video about it, but how is this acceptable when it's on an Nvidia GPU, when the same sort of image artefacts aren't when it's on an Intel GPU?

Nvidia can literally get away with just about anything.

 
Most people in here, then, lol. It only took a few reviews and some FOMO and it was acceptable :cry:.

There isn't a 4K gamer, high-end VR user or multi-screen simmer that doesn't want one. I wasn't expecting the 4090 to give that much pure raster; the RT improvements are excellent, and if DLSS 3.0 manages to mitigate heavy games such as MSFS, then 4K high-FPS gaming is here with the 4090 for most current titles.

I did try to put one of the AIB cards at £1,699 into a basket yesterday, and would have paid if I'd managed it, though I was a bit apprehensive. Do I need it? No. Do I want one? Yes. I do wonder, though, whether in a few months they'll drop in price a bit, especially the AIB cards, which are hundreds of pounds more for not much performance you'd actually 'feel' when gaming. I have an 850W be quiet! unit, bottom line, so I think I'll have to budget for an ATX 3.0 PSU as well, as I only have 2x PCIe ports on my PSU.

Then there's my 3800X too, so I want to see what others find at 4K with this CPU and whether it causes a bottleneck, as the rest of my rig, except the NVMe drives, is relatively mid-range these days.

Heat the room and not the house, they say, regarding home heating. The 4090 will make a nice office heater, as the 3080 does now.

Wonder how well the 4090 will undervolt, as the 3000 series did well there.
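On the PSU point above, here's a quick back-of-envelope check. The 450 W board power, 150 W-per-8-pin rating, and the ~200 W platform estimate are my own assumed numbers for illustration, not figures from this thread:

```python
# Rough headroom check for running a 4090 off an older 850 W unit.
GPU_W = 450          # assumed RTX 4090 stock board power
PCIE_8PIN_W = 150    # rated power per 8-pin PCIe connector
PLATFORM_W = 200     # rough guess: 3800X, board, drives, fans

# The bundled 12VHPWR adapter wants one 8-pin plug per ~150 W,
# so a PSU with only two PCIe ports comes up short:
adapters_needed = -(-GPU_W // PCIE_8PIN_W)  # ceiling division
print(adapters_needed)  # 3 plugs needed

system_w = GPU_W + PLATFORM_W
headroom = 850 - system_w
print(system_w, headroom)  # 650 W draw, 200 W left before transient spikes
```

Steady-state it fits, but the missing third 8-pin and Ada's transient spikes are why an ATX 3.0 PSU is the safer budget line.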
 
DLSS 3 doesn't look like something I want to use in its current state.
I agree. For me, DLSS 2 still seems to be the sweet spot: it provides a large performance boost without introducing input latency or major artifacting (usually). I hope Nvidia don't move away from it to go all-in on this frame-generation stuff; they should keep that as an optional toggle.
 
PCIE 5.0 and DP 2.0 for the 4090Ti? Would not surprise me!

They may be able to do DP 2.0 with a firmware update to make it compatible, but it may not allow full DP 2.0 speeds; they did that back in the day to update some cards for DP 1.4 monitors that didn't display video. PCIe 5 is on-die, so unless it's hidden in there and not enabled, it won't be possible unless they respin the chips with it added. I think that boat has sailed this gen for PCIe 5, sadly.
 
They may be able to do DP 2.0 with a firmware update to make it compatible, but it may not allow full DP 2.0 speeds; they did that back in the day to update some cards for DP 1.4 monitors that didn't display video. PCIe 5 is on-die, so unless it's hidden in there and not enabled, it won't be possible unless they respin the chips with it added. I think that boat has sailed this gen for PCIe 5, sadly.

Think it was MLID a while ago who said that, in tandem with this, they have the capability to use faster memory, so there's a chance that on a refresh later Ada cards could have better specs and the features they currently lack. Not great if you've just sunk two grand on the best, but let's wait and see.
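For context on why "full DP 2.0 speeds" matter, a rough bandwidth comparison of DP 1.4 (HBR3) against DP 2.0 at UHBR20. The link rates and encoding overheads are from the DisplayPort specs as I understand them; the display-timing figure ignores blanking, so real requirements are a bit higher:

```python
# Effective (post-encoding) link bandwidth over 4 lanes.
def effective_gbps(lanes, gbps_per_lane, encoding_efficiency):
    return lanes * gbps_per_lane * encoding_efficiency

dp14 = effective_gbps(4, 8.1, 8 / 10)      # DP 1.4 HBR3, 8b/10b coding
dp20 = effective_gbps(4, 20.0, 128 / 132)  # DP 2.0 UHBR20, 128b/132b coding

# Raw pixel bandwidth for a given mode (no blanking interval).
def video_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

uhd_240_10bit = video_gbps(3840, 2160, 240, 30)  # 4K 240 Hz, 10-bit RGB

print(f"DP 1.4 effective: {dp14:.1f} Gbit/s")                 # ~25.9
print(f"DP 2.0 effective: {dp20:.1f} Gbit/s")                 # ~77.6
print(f"4K240 10-bit needs ~{uhd_240_10bit:.1f} Gbit/s")      # ~59.7
```

So a 4K 240 Hz 10-bit mode only fits DP 1.4 with DSC, but sits comfortably inside DP 2.0 uncompressed, which is why a firmware update that doesn't unlock full speed would be a half-measure.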
 
DLSS 3 doesn't look like something I want to use in its current state.

Right, what's the point of it other than to make bar charts look better? It's just a marketing tool.

Those visual artefacts can clearly be seen unless you're running at over 120 FPS, and even then you can still see them.

The input latency increases by a noticeable amount. The whole point of high frame rates is to make the game feel more responsive; with DLSS 3, 120 FPS actually feels like 40 FPS. Not my words, Tim said that.

So all it does is produce an image with blurring and artefacts in it and make the game feel less responsive, but it is something reviewers can put on a slide as a nice big FPS bar, like Nvidia did almost exclusively at their launch, and we all went "oooo, look at those big bars".

We have now got to a stage where Nvidia are marketing something that makes your gaming experience worse as a positive feature.
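To make the latency point concrete, here's a back-of-envelope sketch. The model (frame generation doubles displayed FPS by interpolating between rendered frames, which means holding one rendered frame back) and the numbers are my own simplification, not figures from Tim's video:

```python
# Rough felt latency per frame, with and without frame generation.
def felt_latency_ms(displayed_fps: float, frame_gen: bool) -> float:
    """Base frame time, plus one extra buffered frame when frame
    generation is on (input is only sampled at the base render rate)."""
    base_fps = displayed_fps / 2 if frame_gen else displayed_fps
    frame_time = 1000.0 / base_fps
    return frame_time * (2 if frame_gen else 1)

print(felt_latency_ms(120, frame_gen=False))  # native 120 FPS: ~8.3 ms
print(felt_latency_ms(120, frame_gen=True))   # generated 120 FPS: ~33.3 ms
```

Under this toy model, a generated "120 FPS" carries the latency of a ~30 FPS frame time, which is roughly why it can feel like 30-40 FPS despite the bar chart.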
 
Right, what's the point of it other than to make bar charts look better? It's just a marketing tool.

Those visual artefacts can clearly be seen unless you're running at over 120 FPS, and even then you can still see them.

The input latency increases by a noticeable amount. The whole point of high frame rates is to make the game feel more responsive; with DLSS 3, 120 FPS actually feels like 40 FPS. Not my words, Tim said that.

So all it does is produce an image with blurring and artefacts in it and make the game feel less responsive, but it is something reviewers can put on a slide as a nice big FPS bar, like Nvidia did almost exclusively at their launch, and we all went "oooo, look at those big bars".

We have now got to a stage where Nvidia are marketing something that makes your gaming experience worse as a positive feature.
Caveat emptor, as they say ;) As true now as it was then.
 
Kudos to Tim for making a video about it, but how is this acceptable when it's on an Nvidia GPU, when the same sort of image artefacts aren't when it's on an Intel GPU?

Nvidia can literally get away with just about anything.




I called it.... @TNA


Bets on how many will use their screen captures of the "fake frames" to say "DLSS 3/FG sucks and this is why", even though they have said "it is very hard to see these in normal gameplay and the gameplay itself is mostly good".
:cry: ;) :D
 
Caveat emptor, as they say ;) As true now as it was then.

The problem is we deserve this. I can't wait for the green army to get warmed up on acid pills so they can run around forums telling us how DLSS 3 is now the new "better than native" image quality feature, without any sense of their own self-respect.
 