GeForce GTX 1180/2080 Speculation thread

I would go with daft myself :D But yeah, I'll be doing a video or written review (probably the latter) with benchmarks, and I bloody hope there are some serious gains over my current 1080 Ti :D
I certainly hope so too, bud ;)... Although I'm currently playing through AOE3 on mine... I think it's only using about 30% of the GPU at 4K!!
 
A single 1080 Ti won’t manage a minimum of 60 FPS in 4K in most newish games.

Hell, I'm playing The Division with two 1080 Tis and I can't maintain a minimum of 60 FPS with HFTS on and particle quality on very high.

I can’t maintain a minimum of 60 FPS in Ghost Recon Wildlands either.

I strongly doubt that a single 2080 Ti is more powerful than 2 1080 Tis.

That's because SLI, and I will not put it lightly, is utter garbage in its current state.

In most games with my 1080 Ti SLI I was seeing less than a 20% increase over a single card (The Division being one, plus BF1, Shadow of Mordor, Ghost Recon Wildlands, Origins, etc.), and in other games no increase at all, or worse performance overall.

I expect a single 2080 Ti to beat 1080 Ti SLI purely based on the fact that SLI, as of right now, is crap. Even more so in newer games.
 
Nvidia have mentioned that they will be renewing multi-GPU performance with NVLink, so if they stick to that we should see some improvements moving forward.
 
The average FPS increase from a 1080 to a 1080 Ti is 14%, with a max overclocked bench at 23% faster - so 35-45% over a 1080 is significantly faster than a 1080 Ti (around 12-22% vs a 1080 Ti max overclock, and around a 21-31% FPS increase vs stock) - but yeah, keep making crap up because it's so much easier than using facts...
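As an aside, uplift percentages against different baselines compound as ratios rather than subtracting, so here's a minimal sketch of converting a rumoured gain over a 1080 into a gain over a 1080 Ti (assuming the 35-45% rumour is measured against a stock 1080; the ratio maths comes out a touch lower than the quick subtraction above):

```python
# Convert an uplift measured vs a GTX 1080 into an uplift vs a GTX 1080 Ti.
# Performance ratios divide; percentage points don't simply subtract.

def uplift_vs_ti(uplift_vs_1080, ti_uplift_vs_1080):
    """Relative FPS gain over the Ti, given both gains vs a stock 1080."""
    return (1 + uplift_vs_1080) / (1 + ti_uplift_vs_1080) - 1

for rumoured in (0.35, 0.45):
    stock = uplift_vs_ti(rumoured, 0.14)   # Ti is 14% over a 1080 (stock)
    max_oc = uplift_vs_ti(rumoured, 0.23)  # Ti is 23% over a 1080 (max OC)
    print(f"{rumoured:.0%} over a 1080 -> {stock:.0%} over a stock Ti, "
          f"{max_oc:.0%} over a max-OC Ti")
# 35% over a 1080 -> 18% over a stock Ti, 10% over a max-OC Ti
# 45% over a 1080 -> 27% over a stock Ti, 18% over a max-OC Ti
```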

All these benchmarks are at 1080p, so you would expect the 1080 and 1080 Ti to be much closer; it's only at higher res where the Ti's bandwidth really shines. If you watch AdoredTV's latest video he goes through this. It's quite interesting and sheds some light on the Nvidia performance bar charts released.
 
The ray tracing is done on a per-pixel basis: for each pixel displayed on screen, a number of rays are cast into the scene to calculate exactly what colour that pixel should be. 4K has four times the pixels of 1080p, so to maintain performance at 4K you would either need to cast a quarter of the rays per pixel used at 1080p, or accept the frame rate being cut to roughly 25%. The more rays per pixel, or the higher the resolution, the more processing power is required.
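To put rough numbers on that scaling, here's a minimal sketch; the one-ray-per-pixel figure is purely an assumed example, not anything NVIDIA has stated:

```python
# How the total ray count scales with resolution (illustrative numbers only).
RAYS_PER_PIXEL = 1  # assumed example value; real games will vary

def rays_per_frame(width, height, rays_per_pixel=RAYS_PER_PIXEL):
    """Total rays cast for a single frame at the given resolution."""
    return width * height * rays_per_pixel

r_1080p = rays_per_frame(1920, 1080)  # ~2.07 million rays
r_4k = rays_per_frame(3840, 2160)     # ~8.29 million rays

print(f"4K casts {r_4k / r_1080p:.0f}x the rays of 1080p")  # -> 4x
```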
OK cool, in which case could the raytracing not be done at 1080p (which divides nicely into 4K) while the rest of the game is still rendered at 4K? (Sorry if I sound stupid.)
 
What is not true?

That the framerate difference between ray tracing a scene at 1080p and 4K would be 25%, which is what you were suggesting, wasn't it?

If we forget any potential optimisations, then it would not be unreasonable to expect the raytracing portion of frame generation to run at a quarter of its 1080p speed at 4K, if only due to the fourfold increase in pixels. But that tells us nothing about the actual overall framerate difference, because raytracing takes up an unknown portion of the total time it takes to generate a frame.

Put it this way:

Using this link for 1080 Ti performance in The Witcher 3, we can see that at 1080p the card gets 140fps, whereas at 4K it gets around 66fps.

https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,26.html

Now, using those numbers, let's do some calculations, remembering that for 60fps, a single frame has to complete in 16.8ms.

- 140fps at 1080p means that each frame is taking 7.1ms to render.
- Let's add raytracing onto that and say that it takes another 3ms to complete.

This works out to a combined total of 10.1ms to render a single raytraced frame at 1080p, which equals 99 FPS.

- 66fps at 4K means that each frame is taking 15.2ms to render.
- And for the sake of argument, let's say that raytracing at 4K is 4 times slower than it was at 1080p, making it take 12ms to complete.

That's a combined total of 27.2ms, which equals around 37-38fps for fully raytraced Witcher 3 at 4K.

So that's 99 FPS with RT on at 1080p, and 38 FPS with RT on at 4K, which is significantly more than 25% of the 1080p performance.

The point being that even if raytracing a scene is four times slower at 4K versus 1080p, that doesn't mean the overall framerate would also be four times slower, since it depends entirely upon the proportion of raytracing versus rasterisation, which, right now, we have no idea about.
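Here's that back-of-envelope maths as a quick sketch (the 3ms and 12ms raytracing costs are the assumed figures from above, not measurements):

```python
# Back-of-envelope hybrid frame times using the assumed figures above.

def fps(raster_ms, raytrace_ms):
    """Overall FPS when the rasterisation and raytracing costs are summed."""
    return 1000.0 / (raster_ms + raytrace_ms)

# 1080p: 7.1ms raster (140fps) plus an assumed 3ms of raytracing
print(f"1080p + RT: {fps(7.1, 3.0):.0f} fps")    # ~99 fps

# 4K: 15.2ms raster (66fps) plus an assumed 4x raytracing cost (12ms)
print(f"4K + RT:    {fps(15.2, 12.0):.0f} fps")  # ~37 fps

# The 4K figure is ~37% of the 1080p figure, not 25%
print(f"4K keeps {fps(15.2, 12.0) / fps(7.1, 3.0):.0%} of 1080p performance")
```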

I'm really sorry if I'm articulating this poorly; hopefully that makes sense.

I'm by no means an expert on modern GPU architectures, but I'm a professional mobile/app developer who has worked on several quite large-scale 3D projects, so working out rendering pipelines and calculating performance is something I do quite a bit.

On a loosely related note, I wouldn't be surprised if NVIDIA's raytracing implementation allows for a variable resolution which is completely decoupled from the main rendering/rasterisation resolution, meaning that raytracing can be done at 1080p while the rasterisation is still at 4K. Given that shadows and lighting don't necessarily need the same level of fidelity and detail as textures and geometry, this would seem like an obvious optimisation to me. If NVIDIA themselves aren't baking it in, then I suspect games developers and the big engines will certainly offer something similar.
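If that decoupling did exist, the win is easy to sketch with the same assumed numbers as before (the 12ms full-resolution raytracing cost is still just my made-up figure):

```python
# Sketch of decoupled resolutions: rasterise at 4K, raytrace at a fraction
# of the pixel count. Costs reuse the assumed figures from the post above.

def hybrid_fps(raster_ms, rt_ms_full_res, rt_pixel_fraction):
    """FPS when raytracing runs on a fraction of the rasterised pixels."""
    return 1000.0 / (raster_ms + rt_ms_full_res * rt_pixel_fraction)

# 4K raster is 15.2ms; raytracing assumed to cost 12ms at full 4K.
print(f"RT at full 4K: {hybrid_fps(15.2, 12.0, 1.0):.0f} fps")   # ~37 fps
print(f"RT at 1080p:   {hybrid_fps(15.2, 12.0, 0.25):.0f} fps")  # ~55 fps
```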
 
Just to be that guy: 16.6ms is the available window for 60fps, 33.3ms for 30fps, and so on :)
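That rule generalises: the frame budget in milliseconds is just 1000 divided by the target FPS. A trivial sketch:

```python
# Frame-time budget in milliseconds for a given target frame rate.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

for target in (30, 60, 144):
    print(f"{target} fps -> {frame_budget_ms(target):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms
```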
 
Did anyone else miss the forest for the trees with BFV?
The only thing DICE has showcased, demonstrated, and spoken about in BFV is ray-traced reflections. That I've seen, anyway. If there are more ray-traced elements in BFV (not concept), please link me.

So with only reflections, BFV's performance tanks, and you have to drop to 1080p just to get decent FPS.
Perhaps if DICE maintains a one-element ray-traced strategy there may be optimisations to be had (according to the Digital Foundry video), which is why they're confident they might eke out higher FPS at 1080p.

Unfortunately, in my view, that doesn't change much. Although I would agree it made no sense to ray trace shadows, AO, and the like, as it's simply too costly.
I still believe this would take off better and quicker if Intel, AMD, or Nvidia created a stand-alone card for ray tracing, be it from last gen or something new.
 
There's always one :p

LOL :P

 
@melmac Seeing as TP has been quoted above with a few numbers based on what was said, I think some text has been left out:
"Turing is a beast. It’s going to significantly improve the gaming experience on old games and it’s going to rock it when you adopt new technology…”

If at least two of those three things make folks smile, then it's wallets-out time soon :p. We know NV will likely dangle a big enough carrot.
 
RTX 2080 Ti Founders Edition vs GTX 1080 Ti Founders Edition benchmarks in 10 games, leaked:


Just watched it through and oh dear, the performance difference between the two cards is marginal at best. Fair enough, a couple of titles were showing a decent improvement, and I suppose it's running full tilt at 4K res, but let's just hope the drivers are early stage, because if not a fair few people are going to be returning the new RTX and sticking with their 1080 Tis. I'll hold out for more reviews, but if it's more of the same I may end up getting a second 1080 Ti.
 