The RTX 3080 TI isn't going to be- EDIT - No We Were All Wrong!

The problem is that you pay a lot more just for a few hundred MHz: the MSI RTX 2080 Ti @ 1755 MHz boost goes for ~£1,295 at the moment, while the Zotac GeForce RTX 2080 Ti @ 1545 MHz boost is ~£1,050. I can't see why it would be different for the RTX 3080 Ti; there's a lot of money to be made out of 10-20% overclocked models.
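To put rough numbers on that premium, here's a quick sketch using the two cards and prices quoted above (a snapshot only; prices obviously move around):

```python
# Factory-OC premium: prices and boost clocks are the ones quoted above.
msi_price, msi_boost = 1295, 1755      # MSI RTX 2080 Ti: £, boost MHz
zotac_price, zotac_boost = 1050, 1545  # Zotac RTX 2080 Ti: £, boost MHz

clock_gain = (msi_boost / zotac_boost - 1) * 100   # ~13.6% higher boost
price_gain = (msi_price / zotac_price - 1) * 100   # ~23.3% higher price

print(f"Clock advantage: {clock_gain:.1f}%, price premium: {price_gain:.1f}%")
```

So roughly a 23% price premium for a ~14% on-paper clock advantage.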

But like I said, the base model will probably have a slightly higher boost clock than the RTX 2080 TI.

I hope they do a POS edition, I'd be tempted :D

BTW, the Ampere Wikipedia page says the fabrication process is 7nm (TSMC), so I'm going with that for now. Link:
https://en.wikipedia.org/wiki/Ampere_(microarchitecture)#cite_note-verge-A100-1

We already know that GA100 is built on TSMC 7nm. What we don't know is whether GA102/104/106 are also on TSMC 7nm or Samsung 8nm/10nm.
 
It's worth noting that the boost clock of the RTX 2080 Ti was very similar to the previous generation's Titan X Pascal (1545 MHz vs 1531 MHz). Based on this, I'd estimate that the base model RTX 3080 Ti will have a very similar boost clock to the current-gen Titan RTX (1770 MHz), just to guarantee an overall advantage in benchmarks.

Another thing to take into account is that Nvidia will want to leave a significant performance gap between the RTX 3080 Ti and the Ampere-based Titan, just like the Titan RTX did with the current generation. So the Ampere-based Titan will probably get an extra couple of hundred MHz on the boost clock and around 256 extra shaders, just like the current gen. Maybe, if production yields are good, it will be called the RTX 3090 instead? I also think that if any card goes over 300 W TDP, it would be this one, not the 3080 Ti.

The boost figures quoted for the RTX Titan and the 2080 Ti don't mean anything useful at all.

If you actually use both cards they end up boosting about the same in real life.

The quoted boost figures are all about sales waffle and not much else.

There is not much margin in it on benches between the RTX Titan and 2080 Ti either; the Titan edges it by about 7%.
 
We already know that GA100 is built on TSMC 7nm. What we don't know is whether GA102/104/106 are also on TSMC 7nm or Samsung 8nm/10nm.

I am rather hoping the Titan will use the same GA100 silicon.

This is not unheard of, as the Titan V used the same silicon as the professional cards on Volta.

This made the Titan V a bargain, as it was a monster gaming card and just as capable for professional work; it could even do ray tracing before Turing ever saw the light of day.
 
We already know that GA100 is built on TSMC 7nm. What we don't know is whether GA102/104/106 are also on TSMC 7nm or Samsung 8nm/10nm.

All the rumours/leaks point towards 8nm. I can't see there being the capacity on 7nm for the kind of quantities Nvidia sells, bearing in mind all the other companies also using it.
 
I am rather hoping the Titan will use the same GA100 silicon.

This is not unheard of, as the Titan V used the same silicon as the professional cards on Volta.

This made the Titan V a bargain, as it was a monster gaming card and just as capable for professional work; it could even do ray tracing before Turing ever saw the light of day.

Wasn't the Titan V a £2.5-£3k part, though, compared to £1.2k for the Titan X and XP? You can't define that as a bargain, surely.
 
Wasn't the Titan V a £2.5-£3k part, though, compared to £1.2k for the Titan X and XP? You can't define that as a bargain, surely.

The Titan V was expensive if you use it just for gaming, but its DP (double-precision) ability will totally demolish any Turing professional card.

It is actually the Titan X and XP that are expensive as they are very poor at professional work.
 
The boost figures quoted for the RTX Titan and the 2080 Ti don't mean anything useful at all.

If you actually use both cards they end up boosting about the same in real life.

The quoted boost figures are all about sales waffle and not much else.

There is not much margin in it on benches between the RTX Titan and 2080 Ti either; the Titan edges it by about 7%.

It's just an estimate, but it makes sense to me that Nvidia would want the GPU clocks to be very similar to the best (premium) GPU of the last generation, to encourage people to upgrade. It's useful to know the boost clock to work out the overall processing power of the card (TF); see the rough sums below. I know these cards may clock higher, but that can't be relied on when you're trying to make fair comparisons. There's also the 'silicon lottery' to consider.
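For anyone following the TF maths, the usual back-of-the-envelope formula is peak FP32 TFLOPS ≈ 2 × shader count × boost clock. A minimal sketch using the published reference figures for the two Turing cards discussed here (actual sustained clocks will differ, as the quoted post points out):

```python
def fp32_tflops(shaders: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: 2 ops (one FMA) per shader per clock, in TFLOPS."""
    return 2 * shaders * boost_mhz / 1e6

print(fp32_tflops(4352, 1545))  # RTX 2080 Ti reference -> ~13.4 TF
print(fp32_tflops(4608, 1770))  # Titan RTX             -> ~16.3 TF
```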

I agree with you, though: the higher clocks on some factory-overclocked GPUs are used to encourage customers to pay hundreds more (especially for high-end models) for a relatively small improvement.

The difference in performance between the RTX 2080 TI and the RTX Titan is more like 18%, according to this:
https://www.techpowerup.com/gpu-specs/titan-rtx.c3311
 
The Titan V was expensive if you use it just for gaming, but its DP (double-precision) ability will totally demolish any Turing professional card.

It is actually the Titan X and XP that are expensive as they are very poor at professional work.

Right, so if you happen to be a professional in, say, deep learning, data analytics or AI, working on your own PC (not a work one), and you are also a gamer, then the Titan V was good for you and your niche-of-niches case, especially if a company purchased the GPU for you or you could claim it back through tax expenses etc.
 
You still need to enable overclocking to get higher clocks than the official boost clocks, right? I think this is how it works on my brother's GTX 1080.

I bet most GPUs can get at least 100-200 MHz higher than this figure, so higher boost clocks should still equal higher clocks overall.

I think I may be flogging a dead horse, but does anyone think the RTX 3080 Ti will have a higher GPU boost clock than the 2000-series Titan RTX, which is 1770 MHz?

I think Nvidia will need to leave a performance gap for the RTX 3090 / next Titan GPU, which I think could have a boost clock of up to 2000-2100 MHz.
 
Does Nvidia's 'OC Scanner' algorithm run in the background after installing the Nvidia drivers + software, or does it need to be enabled first in software like MSI Afterburner?

Also, does it only work on GTX / RTX series GPUs, or older GPUs too?
 
Anyone know the answer to my question in the post above?

Also, on second thoughts, I don't think there will be an RTX 3090, because the xx90 part of the model name usually refers to dual-GPU designs with Nvidia graphics cards.
 
Does Nvidia's 'OC Scanner' algorithm run in the background after installing the Nvidia drivers + software, or does it need to be enabled first in software like MSI Afterburner?

Also, does it only work on GTX / RTX series GPUs, or older GPUs too?

I believe OC Scanner only works on RTX so far, as it's Nvidia's algorithm, and it's not automatic: you need to install something like Afterburner and press the OC Scanner button.
 
Would've thought an 18 TF card (assuming an approx. 1770 MHz GPU clock) would be more than enough for most gamers playing at 4K resolution. I suppose the point of this is really to try to calculate the maximum performance of the RTX 3080 Ti, assuming it is built on the Ampere architecture rather than a different design and has a TDP of around 300 W.

Is it realistic to expect more than a 5-6 TF improvement in a single graphics card generation? Has this ever happened with an Nvidia graphics card before? The biggest jump seems to be from the 980 Ti to the 1080 Ti, which was 5.28 TF (~67% overall perf. improvement!).
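Quick sanity check of that 980 Ti to 1080 Ti figure using the same 2 × shaders × clock arithmetic (published reference boost clocks; treat it as a rough sketch, since real cards boost higher):

```python
def fp32_tflops(shaders, boost_mhz):
    # Peak FP32: 2 ops (one FMA) per shader per clock, returned in TFLOPS
    return 2 * shaders * boost_mhz / 1e6

gtx_980_ti = fp32_tflops(2816, 1075)    # ~6.05 TF
gtx_1080_ti = fp32_tflops(3584, 1582)   # ~11.34 TF
print(gtx_1080_ti - gtx_980_ti)         # ~5.28 TF generational jump
```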

For some reason, some people think there will be close to a 100% performance increase in a single generation, at the same / slightly higher TDP...

Sadly, even if the price were twice that of the RTX 2080 Ti, this would not be reflected in the performance...
 
The 1080 Ti was actually a brute of a card. Hindsight has shown it was the card to get, as it has been so powerful for so long. If ray tracing and other current gimmicks are of no interest, then this card has been quite the unicorn, as it still handles games really well.

Sadly even this card is not quite enough for enthusiast displays so we are waiting for the one that is.
 