
NVIDIA 4000 Series

Yeah, honestly, you could argue very reasonably that a 3090 does 4K 'easy' because it does. Even with all the settings maxed out in most cases (FR at 60 or greater). The 4090 won't be breaking a sweat on anything in 4K for years to come.

As another person said, we're held up by the console cycle, while the consoles themselves are arguably held up by the late arrival of next-gen engines like UE5.
 
Nvidia has released benchmarks for Overwatch 2

This is average FPS at 2560x1440 (1440p), max settings.

Using the RTX 3080 as a baseline: the RTX 4080 12GB is 19% faster than the 3080, the RTX 4080 16GB is 47% faster, and the RTX 4090 is 105% faster.
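For anyone who wants to turn those "% faster" figures into rough FPS estimates, here's a quick sketch. Note the 3080 baseline FPS below is a made-up placeholder for illustration, not a number from Nvidia's slide:

```python
# Convert "x% faster than a 3080" claims into estimated FPS.
# The baseline figure is hypothetical, NOT Nvidia's published number.
baseline_3080_fps = 200  # assumed Overwatch 2 average at 1440p max settings

gains_vs_3080 = {
    "RTX 4080 12GB": 0.19,  # 19% faster
    "RTX 4080 16GB": 0.47,  # 47% faster
    "RTX 4090":      1.05,  # 105% faster
}

for card, gain in gains_vs_3080.items():
    est_fps = baseline_3080_fps * (1 + gain)
    print(f"{card}: ~{est_fps:.0f} FPS ({gain:.0%} over the 3080)")
```

With a 200 FPS baseline this works out to roughly 238, 294 and 410 FPS respectively; scale the baseline to whatever the 3080 actually does in your game.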



God I wish they would stop using games like Fortnite and Overwatch to advertise how fast these cards are. The general population that plays these games is kids, or streamers who want views to appeal to kids!

Do you see where I'm coming from with this?
 
Yeah, honestly, you could argue very reasonably that a 3090 does 4K 'easy' because it does. Even with all the settings maxed out in most cases (FR at 60 or greater). The 4090 won't be breaking a sweat on anything in 4K for years to come.

As another person said, we're held up by the console cycle, while the consoles themselves are arguably held up by the late arrival of next-gen engines like UE5.

Sounds like someone who has never used 4K on a 3090. As a 3090 owner gaming at 4K, I can assure you that many games need much more power, especially if you want to turn up all the settings and even look at the RT options. This is assuming you're on a 100Hz+ monitor/TV; 4K at 60fps is much easier, though of course it feels horrible to game at.
 
Yeah, honestly, you could argue very reasonably that a 3090 does 4K 'easy' because it does. Even with all the settings maxed out in most cases (FR at 60 or greater). The 4090 won't be breaking a sweat on anything in 4K for years to come.

As another person said, we're held up by the console cycle, while the consoles themselves are arguably held up by the late arrival of next-gen engines like UE5.

Don't get me wrong, it'd be great if it happens, but I can't see it, as "4K" is a selling point. If you make one of your main selling points moot in the latest game at max settings, RT, bells and whistles etc., you have to come up with something new, and it's easier to keep the narrative going.
 
4K will become easy this generation because we're now limited by console hardware, so games target console graphics while PC hardware performance charges ahead. It happened in the last two generations as well.

When the PS3 launched in 2007, playing Crysis at 1080p was hard. By the time the PS4 launched in 2013, 1080p was very easy for gaming and 1440p was hard. Then the PS5 arrived in 2020 and 1440p has become very easy. By the time the PS6 arrives, around 2027 or even before it, 4K will have become easy.

Even in 2027, we'll have numpties buying 8090 Tis and using them at 1080p/1440p, claiming 4K "isn't worth it" or "isn't mainstream" :D
 
Sounds like someone who has never used 4K on a 3090. As a 3090 owner gaming at 4K, I can assure you that many games need much more power, especially if you want to turn up all the settings and even look at the RT options. This is assuming you're on a 100Hz+ monitor/TV; 4K at 60fps is much easier, though of course it feels horrible to game at.
First of all, I've had a 3090 since day one and game exclusively at 4K, and I've built no fewer than 11 3090-based PCs in the interim. Second of all, you should read more carefully, as I specifically stated 4K60.
 
First of all, I've had a 3090 since day one and game exclusively at 4K, and I've built no fewer than 11 3090-based PCs in the interim. Second of all, you should read more carefully, as I specifically stated 4K60.

No, you stated "FR at 60 or greater". The "or greater" part means above 60. Why are you even trying to lie when your post is publicly visible?

If you've owned a 3090 from day one, you can't have been playing modern games. I'm not sure if I'm being trolled, and I feel ridiculous posting these, but here are some benchmarks showing 3090 performance at 4K in modern games.


[Benchmark chart images: average FPS at 4K in several modern games]

Note the above are average FPS. An average of 60, 70, even 80 means you'll get loads of dips below 60 FPS.

The 3090 clearly can't run all modern games consistently above 60 FPS, and 60fps feels horrible in most games. You need an average of 100+ FPS for a game to feel smooth IMO, but each to his own.
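The point about averages hiding dips is easy to demonstrate with a toy set of per-second FPS samples (the numbers below are invented for illustration, not real benchmark data):

```python
# Toy illustration: an average well above 60 FPS can still hide dips below 60.
# These per-second FPS samples are invented, not measured.
samples = [82, 75, 68, 71, 55, 48, 77, 80, 52, 72]

average = sum(samples) / len(samples)
dips = [s for s in samples if s < 60]

print(f"average: {average:.0f} FPS")   # comfortably above 60...
print(f"seconds below 60: {dips}")     # ...yet several seconds dip under it
```

Here the average is 68 FPS, yet three of the ten seconds fall well under 60, which is exactly why percentile lows (1% / 0.1%) matter more than averages for smoothness.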
 
Same. I don't need it as it will only offer me more fps atm in the games that I play, and the 6900XT handles them nicely as it is. I'm going to try and wait to see what AMD offers :)
Exactly, playing at 3440x1440 (monitor) and 4K TV I am getting great frames on my 3080Ti even with a relatively low end CPU.
It's only the likes of Cyberpunk that push it but I really don't need to max out RT.

I've been playing Destiny 2 recently and have been able to remain very close to the 144Hz cap most of the time with everything maxed and I'm getting about 100 to 130 FPS at 4K.
Not a new game but I think it looks great personally.

Really don't NEED an upgrade, I'm just getting itchy fingers lol.
 
No, you stated "FR at 60 or greater". The "or greater" part means above 60. Why are you even trying to lie when your post is publicly visible?

If you've owned a 3090 from day one, you can't have been playing modern games. I'm not sure if I'm being trolled, and I feel ridiculous posting these, but here are some benchmarks showing 3090 performance at 4K in modern games.


[Benchmark chart images: average FPS at 4K in several modern games]

Note the above are average FPS. An average of 60, 70, even 80 means you'll get loads of dips below 60 FPS.

The 3090 clearly can't run all modern games consistently above 60 FPS, and 60fps feels horrible in most games. You need an average of 100+ FPS for a game to feel smooth IMO, but each to his own.
Mate, just say 'sorry, I misread your post'. We all make mistakes. I said 60fps+. There is no ambiguity there. And posting examples where there's only ONE game that doesn't average over 60fps (Cyberpunk) hardly proves your point. It disproves it.

Second, you're telling me 'so you can't run ALL modern games at 60 or greater' — again, put your reading glasses on, I said MOST.

And trying to convince a two-year 3090 owner who's run just about every game there is at 4K60 that it somehow struggles to do that is, quite frankly, mental.
 
Exactly, playing at 3440x1440 (monitor) and 4K TV I am getting great frames on my 3080Ti even with a relatively low end CPU.
It's only the likes of Cyberpunk that push it but I really don't need to max out RT.

I've been playing Destiny 2 recently and have been able to remain very close to the 144Hz cap most of the time with everything maxed and I'm getting about 100 to 130 FPS at 4K.
Not a new game but I think it looks great personally.

Really don't NEED an upgrade, I'm just getting itchy fingers lol.

Get a bigger SSD, more/better cooling, better fans, lighting, a custom keyboard, high-end headphones with an external amp/DAC... lots of stuff that can improve your experience before you go the GPU route :)
 
I did understand it, thank you. Games are only getting more and more complex, with ever more complex graphical FX being added over time. What we had in the past doesn't matter, as we are now heading into the likes of path tracing, which is computationally exceptionally expensive and makes even the RT in Cyberpunk look primitive. This is why I don't think 4K will be "easy" to run going forward on the latest games at max settings with decent FPS, as the bar is being sharply raised.

And that's where you are wrong.

Games are tied to console hardware and always have been. They are already heading into the lull for this generation, and RT won't stop that.
 
Get a bigger SSD, more/better cooling, better fans, lighting, a custom keyboard, high-end headphones with an external amp/DAC... lots of stuff that can improve your experience before you go the GPU route :)
Yeh I've got a few "grown up" things to prioritize anyway.

I think I'm sorted headphones/sound wise and also SSDs, but I do want to get a better case.

Also, I did a budget upgrade to my platform a few weeks ago from something 7 or 8 years old to B550/5600 - I think I'll look at getting a real platform upgrade first (probably in a year) to the 13900k (or future 14900k), better case, DDR5 and next gen PSU.
 
This is what causes so much misinformation online these days. People like you who seemingly do not understand tentative language.

There's a difference between "expect" (for the 3000 series) and "has already arrived" for the 4000 series. If you need further clarification, it's time to go back to school.
I'll bookmark this post for when Gibbo starts the backorder thread.

I think what you're also forgetting this time is that all 3 cards will be above 1k, and only one of them is any good, so that's where the demand will go.
 
No, you stated "FR at 60 or greater". The "or greater" part means above 60. Why are you even trying to lie when your post is publicly visible?

If you've owned a 3090 from day one, you can't have been playing modern games. I'm not sure if I'm being trolled, and I feel ridiculous posting these, but here are some benchmarks showing 3090 performance at 4K in modern games.


[Benchmark chart images: average FPS at 4K in several modern games]

Note the above are average FPS. An average of 60, 70, even 80 means you'll get loads of dips below 60 FPS.

The 3090 clearly can't run all modern games consistently above 60 FPS, and 60fps feels horrible in most games. You need an average of 100+ FPS for a game to feel smooth IMO, but each to his own.

No one experienced with 4K turns ON all the settings, let alone maxes them out. Lots of gfx technologies are there to make low resolutions look better; at 4K those get turned off or set low.

Whilst what you posted is good for assessing how each GPU tier stacks up against the next, those aren't the settings anyone experienced uses at 4K. Ultra presets are for comparing apples to apples using a simple, consistent baseline; they show relative performance in a GPU review. They are not how you get the best out of a game.

An experienced gamer will fettle all the gfx settings to get the best performance with IQ. If you are just using presets, you are doing it wrong. Why would you max out AA at 4K?

Select a preset and then adjust from there to maximise performance without losing image quality. No one using 4K should be using the Ultra preset, as it turns on loads of technologies meant to enhance potato resolutions.

Then there is DLSS, which works really well at higher resolutions. Though you could argue it mitigates the Ultra preset problem, using DLSS to max performance whilst retaining image quality... which is what experienced gamers do at native pure raster anyway.

If you use Ultra presets at 4K, then you will be drawn into buying the most expensive gfx card because you are unable to understand, or too lazy to fettle, all those gfx settings to get the best image and performance for your setup. Old gamers have been doing this for years.

After all, this is what the PC master race is all about: a range of hardware to suit your budget, and then gfx options to get the best from it.

Otherwise - get a console.
 