Because you're making the mistake that 10% = 10%. It all depends on where you're taking your baseline from. X is 10% more than Y =/= Y is 10% less than X.
Let's plug some figures in. We'll pull these from https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html and use the 4K RT figures (if you're buying this high end, chances are you aren't wanting to play 1080p low).
4090 is listed at 55.9fps
4080 is listed at 37.8fps
So let's add 10% to the 4080 (so 10% faster than a 4080) = 41.58fps
Let's now take 10% off the 4090 (so 10% slower than a 4090) = 50.31fps <<<< Notice how different those are.
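If you want to double-check that arithmetic yourself, here's a quick Python sketch (nothing fancy, just the two Tom's Hardware figures above):

```python
# The same +10% / -10% arithmetic, using the 4K RT numbers above
fps_4090 = 55.9
fps_4080 = 37.8

print(f"4080 + 10%: {fps_4080 * 1.10:.2f} fps")  # 41.58 fps
print(f"4090 - 10%: {fps_4090 * 0.90:.2f} fps")  # 50.31 fps
```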
Let's do it for 1440p - just to give the 4080 a better shot.
103.9fps (4090) vs 76.0fps (4080)
76 + 10% = 83.6fps
103.9 - 10% = 93.51fps <<< Still very different.
This is where you made the mistake. Being 10% quicker than a 4080 does not mean it's 10% slower than a 4090. The real gap between the two cards is a lot more "significant" than 10% either way.
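If you want the actual percentages rather than eyeballing it, here's a rough sketch (same figures as above) that measures the gap from each card as the baseline:

```python
# How big the real gap is, measured from each card as the baseline
def pct_faster(a, b):
    """How much faster a is than b, as a percentage of b."""
    return (a - b) / b * 100

def pct_slower(a, b):
    """How much slower a is than b, as a percentage of b."""
    return (b - a) / b * 100

for res, fps_4090, fps_4080 in [("4K RT", 55.9, 37.8), ("1440p", 103.9, 76.0)]:
    print(f"{res}: 4090 is {pct_faster(fps_4090, fps_4080):.1f}% faster than the 4080, "
          f"but the 4080 is {pct_slower(fps_4080, fps_4090):.1f}% slower than the 4090")
```

So at 4K RT the 4090 is roughly 48% faster than the 4080, but the 4080 is only about 32% slower than the 4090. Same gap, different baseline.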