Grim - Not sure I have much trust in GPU teraflops as a metric for determining performance. The Vega 64 offers 12.66 TFLOPS, but if that reflected actual performance it would overtake the RTX 2080 Super (11.15 TFLOPS).
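To show where those headline figures come from (and why they say little about real-world performance), here's a rough sketch of the usual paper-TFLOPS calculation, 2 FLOPs per shader per clock; the shader counts and boost clocks are the publicly listed specs, and the function name is just illustrative:

```python
def fp32_tflops(shaders: int, boost_mhz: float) -> float:
    # Paper FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock
    return 2 * shaders * boost_mhz * 1e6 / 1e12

print(fp32_tflops(4096, 1546))  # Vega 64        -> ~12.66 TFLOPS
print(fp32_tflops(3072, 1815))  # RTX 2080 Super -> ~11.15 TFLOPS
```

It's purely shader count times clock, so it ignores how well the architecture actually keeps those shaders fed.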
For the last few GPU upgrades I've bought (HD 4870 > HD 7870 > R9 390), I've always tried to get at least a doubling of the texture and pixel rates, which I think is a reasonably effective way to guarantee a significant boost in GPU performance.
Another useful stat is memory bandwidth, but I believe this matters less.
Assuming the Xbox Series X GPU spec is correct, and a PC equivalent were created, it would offer more than double the pixel rate (64,000 vs 165,600 MPixel/s) and texture rate (160,000 vs 386,400 MTexel/s) of my R9 390, but 'just' a 75% improvement in memory bandwidth.
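As a rough sketch of that "at least double the fill rates" check, assuming the figures above are in MPixel/s and MTexel/s; the dictionary layout and function name here are purely illustrative:

```python
def worthwhile_upgrade(old: dict, new: dict, factor: float = 2.0) -> bool:
    # Require at least `factor` x improvement in both pixel and texture rate
    return (new["pixel_rate"] >= factor * old["pixel_rate"]
            and new["texture_rate"] >= factor * old["texture_rate"])

r9_390   = {"pixel_rate": 64_000,  "texture_rate": 160_000}   # MPixel/s, MTexel/s
series_x = {"pixel_rate": 165_600, "texture_rate": 386_400}

print(series_x["pixel_rate"] / r9_390["pixel_rate"])      # ~2.59x
print(series_x["texture_rate"] / r9_390["texture_rate"])  # ~2.42x
print(worthwhile_upgrade(r9_390, series_x))               # True
```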
I think it's worth pointing out that the Xbox Series X GPU apparently has a higher pixel rate and memory bandwidth than even the ~£1000 RTX 2080 Ti.
The pixel rate should certainly help it perform well at 4k resolution.
It's conceivable that an RDNA2 graphics card variant could match the RTX 2080 Ti on texture rate too (the Radeon VII already manages 420.0 GTexel/s).