Total performance improvement in current games ranges from an unimpressive 25% to a very impressive 68% in Wolfenstein 2. In the most demanding location the 1080 Ti gets 50 fps minimum and 61 fps average, while the 2080 Ti gets 90 fps minimum and 103 fps average (111 fps with OC).
Average performance improvement is around 30%, but if I only count the games that run with problems on the 1080 Ti, those particular games run around 40-50% better now (see the quick math after the list below):
Hellblade - 42 fps on 1080 Ti, 69 fps on a stock 2080 Ti
Rise of the Tomb Raider - 59 fps on 1080 Ti, 75 fps on 2080 Ti
Shadow of the Tomb Raider - 40 fps on 1080 Ti, 59 fps on 2080 Ti; with high detail the 2080 Ti, unlike the 1080 Ti (50 fps), runs the game at a very impressive 80 fps, as expected
The Witcher 3 - 59 fps before on 1080 Ti, 81 fps now on 2080 Ti, so the game basically runs above 60 fps 99% of the time on the 2080 Ti, while on the 1080 Ti it stays below that target 99% of the time. There are also other games like Crysis 3, Mass Effect Andromeda, The Evil Within 2, TC The Division, The Crew 2, and many others that ran around 40-50 fps on the 1080 Ti and now run either above 60 fps or very close to it (a very basic detail tweak should already be enough for locked 60 fps gameplay).
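For anyone who wants to double-check these percentages, here is a minimal sketch (Python, with the fps numbers copied straight from this post; the Wolfenstein 2 entry uses the average figures) of how the per-game gains work out:

```python
# Per-game % improvement of the 2080 Ti over the 1080 Ti, from the fps quoted above.
results = {
    "Wolfenstein 2":             (61, 103),  # average fps, most demanding location
    "Hellblade":                 (42, 69),
    "Rise of the Tomb Raider":   (59, 75),
    "Shadow of the Tomb Raider": (40, 59),
    "The Witcher 3":             (59, 81),
}

for game, (fps_1080ti, fps_2080ti) in results.items():
    gain = (fps_2080ti / fps_1080ti - 1) * 100
    print(f"{game}: {fps_1080ti} -> {fps_2080ti} fps (+{gain:.0f}%)")
```

That gives roughly +69%, +64%, +27%, +48% and +37% respectively, which is where the 25-68% spread and the ~40-50% figure for the problem games come from.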
So I'd say 25-68% in current games is already good performance; the 2080 Ti results look very similar to GTX 1080 SLI (the same cost as a single 2080 Ti), and future games will show even better gains. There is native HDR support (in some games the 1080 Ti is 20% slower in HDR, while the 2080 Ti should not have problems like that), new shading features like variable rate shading alone should improve performance by 20%, DLSSx1 should give an additional 50% up to 100% (min fps on the 2080 Ti in Final Fantasy is 18 fps without DLSS and 40 fps with DLSSx1), and DLSSx2 should provide SSAA-like quality, although we still don't know how much DLSSx2 will cost in performance. DLSSx1 renders internally at 1440p, yet performance is noticeably worse than native 1440p, so DLSSx2 will probably cost some additional fps compared with native 4K too, although probably not as much as native 4K plus SSAAx4.
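On the DLSS numbers, a quick back-of-the-envelope check (this assumes DLSSx1 really shades internally at 2560x1440 and outputs 3840x2160, as described above; the leftover gap to native 1440p performance would then be the cost of the upscaling/inference pass):

```python
# Pixel-count ratio behind the DLSSx1 gains quoted above (assumption: 1440p internal render).
native_4k = 3840 * 2160        # 8,294,400 pixels at native 4K
internal  = 2560 * 1440        # 3,686,400 pixels at the assumed DLSSx1 internal resolution
print(native_4k / internal)    # 2.25 -> at most ~2.25x fewer pixels to shade

# Final Fantasy minimums from the post: 18 fps without DLSS, 40 fps with DLSSx1
print(40 / 18)                 # ~2.2x, close to the pixel-count ceiling
```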
Then there's the RTX feature. In the Star Wars RTX tech demo the 1080 Ti scores 9 fps minimum / 11 fps average, while the 2080 Ti gets 45-65 fps; that's roughly a 6x improvement in performance, although we still don't know how RTX performance will look in real games. According to Digital Foundry, the Gamescom RTX tech demos were prepared in a rush for the RTX GPUs and were originally made with the Titan V in mind (and that card doesn't have RT cores). So these tech demos were probably far from optimized. Digital Foundry also mentioned a very interesting point: apparently RTX effects are currently calculated 1:1 with the native resolution, and if developers instead rendered the RTX effects at 1080p and then upscaled them, the end result should still look great and performance should be good even at higher resolutions.
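The same kind of rough math applies to that Digital Foundry point about ray tracing resolution (assuming the ray workload scales with the number of pixels the effects are traced at, which is an assumption on my part):

```python
# If RTX effects were traced at 1080p and upscaled instead of 1:1 with native 4K,
# the ray workload would drop by the pixel-count ratio (assuming ~constant rays per pixel).
native_4k = 3840 * 2160
rt_1080p  = 1920 * 1080
print(native_4k / rt_1080p)    # 4.0 -> roughly a quarter of the 4K ray tracing work
```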
So to sum it up, although Turing GPUs are already here, we still don't know exactly how fast they are. But because current games only use about half of the Turing chip, we can say for sure that with time the performance gap between Pascal and Turing will widen.