
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I certainly hope so too bud. I would go with daft myself. But yer, I will be giving a vid or written review (probably the latter) with benchmarks, and I bloody hope there are some serious gains over my current 1080Ti.
A single 1080 Ti won’t manage a minimum of 60 FPS in 4K in most newish games.
Hell, I'm playing The Division with 2 1080 Tis and I can't maintain a minimum of 60 FPS with HFTS on and particle quality on very high.
I can’t maintain a minimum of 60 FPS in Ghost Recon Wildlands either.
I strongly doubt that a single 2080 Ti is more powerful than 2 1080 Tis.
Nvidia have mentioned that they will be renewing multi GPU performance with NVLink - so if they stick to that we should see some improvements moving forward.

That's because SLI is, and I will not put it lightly, utter garbage in its current state.
In most games with my 1080Ti SLI I was seeing less than a 20% increase over a single card (The Division being one, plus BF1, Shadow of Mordor, Ghost Recon Wildlands, Origins, etc.), and in other games no increase at all or worse performance overall.
I expect a single 2080Ti to beat 1080Ti SLI purely based on the fact that SLI as of right now is crap. More so in newer games.
The average FPS increase from overclocking a 1080Ti is around 14%, with the best overclocked benches around 23% faster than stock - so 35-45% is significantly faster than a 1080Ti (around 12-22% vs a max-overclocked 1080Ti, and around 21-31% vs an average overclock) - but yeah, keep making crap up because it's so much easier than using facts...
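For anyone who wants to reproduce that arithmetic, here's a rough sketch using only the percentages quoted above (a strict ratio comes out a little lower than simply subtracting the percentages; none of these are new measurements):

```python
# Compare a card that is 35-45% faster than a stock 1080 Ti against an
# overclocked 1080 Ti. The 14% / 23% overclock gains are the figures
# quoted in this post, not new benchmark data.

avg_oc = 1.14   # average 1080 Ti overclock gain vs stock
max_oc = 1.23   # best-case 1080 Ti overclock gain vs stock

for uplift in (1.35, 1.45):          # claimed 2080 Ti range vs a stock 1080 Ti
    print(f"+{uplift - 1:.0%} vs stock 1080 Ti:")
    print(f"  vs avg OC: {uplift / avg_oc - 1:+.0%}")
    print(f"  vs max OC: {uplift / max_oc - 1:+.0%}")
```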
Ok cool, in which case could the raytracing not be done at 1080p (which divides nicely into "4k") but the rest of the game still be 4k? (sorry if I sound stupid)

The ray tracing is done on a per-pixel basis. For each pixel displayed on screen, a number of rays are cast into the scene to calculate exactly what colour the pixel should be. At 4K vs 1080p, either the number of rays cast per display pixel would need to be 1/4 of the rays used at 1080p to maintain performance, or the frame rate would have to be cut to 25%. The more rays per pixel, or the higher the resolution, the more processing power required.
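To put some rough numbers on that scaling argument (a minimal sketch; the four rays per pixel is just an illustrative budget, not anything NVIDIA has confirmed):

```python
# Total ray work per frame scales with pixel count * rays per pixel,
# so quadrupling the pixels means quartering the rays per pixel (or the
# frame rate) to keep the same ray budget. Numbers are illustrative only.

def rays_per_frame(width, height, rays_per_pixel):
    return width * height * rays_per_pixel

budget_1080p = rays_per_frame(1920, 1080, rays_per_pixel=4)

print(budget_1080p)                                   # 8,294,400 rays per frame at 1080p
print(rays_per_frame(3840, 2160, 4) / budget_1080p)   # 4.0x the work at 4K, same rays/pixel
print(rays_per_frame(3840, 2160, 1) / budget_1080p)   # 1.0x the work at 4K with 1/4 the rays/pixel
```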
What is not true?
Ok cool, in which case could the raytracing not be done at 1080p (which divides nicely into "4k") but the rest of the game still be 4k? (sorry if I sound stupid)
That the framerate difference between ray tracing a scene at 1080p and 4K would be 25%, which is what you were suggesting, wasn't it?
If we forget any potential optimisations, then it would not be unreasonable to expect that the raytracing portion of frame generation would indeed take around four times as long at 4K as at 1080p, if only due to the fourfold increase in pixels. But that tells us nothing about the actual overall framerate difference, because raytracing takes up an unknown portion of the total time it takes to generate a frame.
Put it this way:
Using this link for 1080 Ti performance in The Witcher 3, we can see that at 1080p the card gets around 140fps, whereas at 4K it gets around 66fps.
https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,26.html
Now, using those numbers, let's do some calculations, remembering that for 60fps, a single frame has to complete in 16.8ms.
- 140fps at 1080p means that each frame is taking 7.1ms to render.
- Let's add raytracing onto that, and say that it takes another 3ms to complete.
This works out to a combined total of 10.1ms to render a single raytraced frame at 1080p and would equal 99FPS.
- 66fps at 4K means that each frame is taking 15.2ms to render.
- And for the sake of argument, let's say the raytracing at 4K is 4 times slower than it was at 1080p, making it take 12ms to complete.
That's a combined total of 27.2ms, which equals around 37-38fps for fully raytraced Witcher 3 at 4K.
So that's 99FPS with RT on at 1080p, and 38FPS with RT on at 4K.
The point being that even if raytracing a scene is 4 times slower at 4K versus 1080p, that doesn't mean that the overall framerate would also be four times slower, since it would depend entirely upon the proportion of raytracing versus rasterisation, which, right now, we have no idea about.
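If it helps, here's the same arithmetic as a small Python sketch (the 3ms and 12ms raytracing costs are the made-up figures from above, not measurements):

```python
# Combine a measured rasterisation frame time with an assumed raytracing
# cost and see what the overall frame rate becomes. The raytracing costs
# (3ms at 1080p, 12ms at 4K) are assumptions for the sake of argument.

def combined_fps(raster_fps, raytrace_ms):
    raster_ms = 1000.0 / raster_fps           # time to rasterise one frame
    return 1000.0 / (raster_ms + raytrace_ms)

print(round(combined_fps(140, 3)))   # ~99 fps at 1080p with raytracing
print(round(combined_fps(66, 12)))   # ~37 fps at 4K with raytracing
```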
I'm really sorry if I'm articulating this poorly; hopefully that makes sense.
I'm by no means an expert on modern GPU architectures, but I'm a professional mobile/app developer who has worked on several quite large scale 3D projects, so working out rendering pipelines and calculating performance is something I do quite a bit.
On a loosely related note, I wouldn't be surprised if NVIDIA's raytracing implementation allows for a variable resolution which is completely decoupled from the main rendering/rasterisation resolution, meaning that the raytracing can be done at 1080p while the rasterisation is still done at 4K. Given that shadows and lighting don't necessarily need the same level of fidelity and detail as textures and geometry, this would seem like an obvious optimisation to me. If NVIDIA themselves aren't baking it in, then I suspect games developers and the big engines will certainly offer something similar.
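Purely to illustrate what that decoupling could buy, reusing the made-up Witcher 3 numbers from the earlier example (this is speculation on my part, not anything NVIDIA has described):

```python
# Hypothetical: rasterise at 4K but keep the raytracing at its 1080p cost.
# 66fps is the measured 4K raster figure; 12ms and 3ms are the assumed
# raytracing costs at 4K and 1080p from the earlier worked example.

def combined_fps(raster_fps, raytrace_ms):
    return 1000.0 / (1000.0 / raster_fps + raytrace_ms)

print(round(combined_fps(66, 12)))  # ~37 fps: raytracing done at full 4K
print(round(combined_fps(66, 3)))   # ~55 fps: raytracing kept at 1080p resolution
```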
Just to be that guy: 16.6ms is the available window for 60fps, 33.3ms for 30fps, and so on.
RTX 2080 Ti Founders Edition vs GTX 1080 Ti Founders Edition 10 games benchmarks leaked:

Happy with that, if true.