Reviews and release FPS charts are for apples-to-apples comparisons, so that a percentage performance increase can be measured against earlier cards.
Anyone who actually games at 4K knows that you don't turn some settings up to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, freeing up GPU resources. Anyone using review FPS to claim whether a card is a 4K card or not doesn't know what they are doing.

Probably the same people who use graphics presets, whereas an experienced PC gamer will spend time working through all the available settings to get the best FPS/IQ trade-off for a particular game, especially at higher resolutions. Presets are just four or five shortcuts for folk who don't understand what all the graphical settings do. The added bonus of PC gaming is the plethora of settings available for fettling to get the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.
Amazes me how many people go on about 4K but have never gamed at it.
Exactly this. Even people with all the VRAM in the world turn down settings because they want to hit a certain FPS. As tna said, things like motion blur, DOF, film grain, lens effects, chromatic aberration (CA), etc. all go straight off.
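As a concrete example: in Unreal Engine games you can often force that lot off globally in Engine.ini instead of relying on each game's menu. A rough sketch below; the cvar names are illustrative and vary by engine version (film grain in particular is usually tied to the tonemapper and may not have a simple cvar), so treat it as a starting point rather than gospel:

```ini
; Engine.ini - add under [SystemSettings]
; cvar names vary by UE version, so double-check for your game
[SystemSettings]
; motion blur off
r.MotionBlurQuality=0
; depth of field off
r.DepthOfFieldQuality=0
; chromatic aberration off
r.SceneColorFringeQuality=0
; lens flares off
r.LensFlareQuality=0
```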
Funnily enough, I actually have more settings at max for 4K than I do when gaming on my 3440x1440 144Hz display, and this is largely because:
- it is easy to hold a locked 60 FPS at 4K when using DLSS on the 3080
- whereas on my 3440x1440 IPS 144Hz panel, I want at least a constant 80/90 or even 100+ FPS depending on the game (in order to get some of the benefit of a 144Hz panel; FreeSync is great, but low FPS is still low FPS when it comes to motion clarity and input lag, as the quick frame-time sums below show) and the only way to do that is to reduce settings
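To put numbers on the "low fps is still low fps" point: frame time is just 1000 ms divided by the frame rate, so VRR or not, each frame sits on screen for that long. A quick sketch in Python (plain arithmetic, nothing game- or vendor-specific):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 90, 100, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Output: 60 fps -> 16.7 ms, 90 -> 11.1 ms, 100 -> 10.0 ms, 144 -> 6.9 ms.
# FreeSync removes tearing and stutter from refresh-rate mismatch, but on a
# sample-and-hold panel each frame is still displayed for the full frame
# time, so perceived motion blur and input lag scale with frame time,
# not with the panel's maximum refresh rate.
```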