Most people I know IRL who are PC gamers are tech savvy. The "casuals" who don't notice the difference between 60 Hz and 120 Hz are the ones playing on consoles, which is why they find 30 fps smooth.
100 Hz 3440x1440 and 200 Hz 2560x1080 are roughly as demanding as each other for a single GPU, so there's no point arguing which is better or what the point of either is. If you want more screen real estate and a sharper, clearer image (people use these gaming monitors for work, browsing, etc. too), go with 3440x1440 at 100 Hz; if you want better motion smoothness, go with 2560x1080 at 200 Hz.
Anyway, my original point was that IMO 2560x1080 is too low for 35", and sacrificing the resolution isn't worth the extra 100 Hz. If you disagree with that, fair enough.