I'm a PC gamer. That means I can configure my game settings to my liking. You'll find that most PC gamers do this; at least, those with the mental faculties to understand why configuring settings is advantageous.
I mentioned in my post that I configure my settings so that the minimum FPS I get in games is 40, as that's the lower limit of my monitor's FreeSync range.
Depending on the game, a minimum FPS of 40 could mean a maximum of 100 or 140, with a healthy average of 70-90. That's what I shoot for.
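If you want to check whether your settings actually hold that target, here's a minimal sketch (not something from my original post) that reads a frametime log in milliseconds, one value per line (e.g. exported from whatever capture tool you use), and reports min/avg/max FPS plus any dips below the 40 FPS FreeSync floor. The file name and log format are assumptions.

```python
# Summarize a frametime log against a FreeSync lower bound.
# Assumes a plain text file with one frametime (in ms) per line.

def load_frametimes_ms(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(frametimes_ms, freesync_floor_fps=40.0):
    fps = [1000.0 / ft for ft in frametimes_ms]
    # Average FPS = total frames / total seconds, not the mean of per-frame FPS.
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    dips = sum(1 for f in fps if f < freesync_floor_fps)
    print(f"min {min(fps):.0f} / avg {avg_fps:.0f} / max {max(fps):.0f} FPS")
    print(f"{dips} frames below the {freesync_floor_fps:.0f} FPS FreeSync floor")

if __name__ == "__main__":
    summarize(load_frametimes_ms("witcher3_frametimes.txt"))  # hypothetical log file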
I do think that those still playing at 1080P are rather foolish, as 1440P is so superior in terms of image quality. Perhaps you haven't compared gaming at 1080P to 1440P; if not, you really should, as the difference is remarkable.
The only settings I usually lower are GameWorks settings. For example, I turned the tessellation in The Witcher 3 down one notch (via the AMD drivers) for a massive performance increase with no obvious IQ decrease. Many people did the same, since the GameWorks features are so badly optimized. I also disabled the useless godrays in Fallout 4. That kind of thing.
At the end of the day, to my eyes, playing games at 1440P at a healthy FPS with only the badly optimized settings disabled looks far better than 1080P. Though I can understand people with 4GB cards sticking with 1080P in the latest titles such as Tomb Raider, as even at 1080P 4GB just doesn't cut it anymore. Glad I have a card with 8GB.