
The RTX 3080 TI isn't going to be- EDIT - No We Were All Wrong!

Soldato
Joined
21 Jul 2005
Posts
20,148
Location
Officially least sunny location -Ronskistats
Going by the rumours, the 3080 seems the most interesting, because people say/assume it has a 102 die. That means Nvidia are giving you more potent silicon than they would ideally have wanted to, probably because they see a threat from Big Navi, not out of the kindness of their corporate hearts. Which means you should take AMD's offering seriously.

This. nVidia have never been kind to anyone's wallet. Then there's the tribe of posters who retort "if only AMD could compete at the high end, X, Y, Z"... well, if AMD put out something credible... oh, forget it, they'd still buy nVidia lol.
 
Soldato
Joined
15 Oct 2019
Posts
11,832
Location
Uk
Going by the rumours, the 3080 seems the most interesting, because people say/assume it has a 102 die. That means Nvidia are giving you more potent silicon than they would ideally have wanted to, probably because they see a threat from Big Navi, not out of the kindness of their corporate hearts. Which means you should take AMD's offering seriously.
If they're giving away more performance I would also expect a price increase. No such thing as a free lunch where Nvidia are concerned.
 
Soldato
Joined
16 Jan 2006
Posts
3,023
If they're giving away more performance I would also expect a price increase. No such thing as a free lunch where Nvidia are concerned.

Why do you think it should work like that? Yes, the cards go up in price each generation, but they shouldn't get, say, 50% more expensive for 50% more performance.

Otherwise these cards will be costing 10k in 2030

And more performance is kind of the whole point of a new generation.

What's next...no new cards but prices go up by 50% every year anyway?
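That "10k in 2030" quip is just compound growth. A quick sketch (assuming a hypothetical $1,200 flagship in 2020, a new generation every two years, and a 50% price bump per generation; all of these figures are illustrative, not from the thread):

```python
# Compound a hypothetical 50%-per-generation price increase,
# starting from an assumed $1,200 flagship in 2020.
price = 1200.0
for year in range(2022, 2031, 2):  # five generations: 2022..2030
    price *= 1.5
    print(year, round(price))
```

Five 50% bumps lands around $9,100 by 2030, so the "10k" figure is roughly what you'd get if prices really did scale one-for-one with performance.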
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
I think it's a silly comparison tbh, because, assuming a 3080 Ti is coming any time soon (it could well be a year or more away), the performance measure for it would be graphics, while for GA100 it would be compute, deep learning and analytics.

Going by the rumours, the 3080 seems the most interesting, because people say/assume it has a 102 die.

I would've thought NV would want to release the 3080 TI this year, to make sure they maintain a significant performance advantage over AMD's RDNA 2 GPUs. RDNA 2 GPUs will probably be a lot more competitive with the RTX 3080 and 3070 than with the 3080 TI. We also know NV can already produce more powerful GPUs than the 3080 TI, so there should be nothing stopping them from releasing the TI this year.

Also, the GA100 is obviously a very capable GPU for graphics and rendering (as well as other areas: compute, deep learning), so I'm not sure why you think it's silly to compare it to the 3080 TI. It represents the best Nvidia can produce with the current technology (it almost certainly has the most transistors of its generation, like previous Tesla GPUs). We can see this from the performance stats:

Pixel Rate: 225.6 GPixel/s · Texture Rate: 609.1 GTexel/s · FP32: 19.49 TFLOPS

We can't possibly work out the performance of the 3080 or 3080 TI without looking at the specs for the Tesla A100.

No idea what die the 3080 and 3080 TI will use. Could it end up being the GA102 for both (this is what TechPowerUp estimates), or is the 3080 TI definitely going to use a different die?
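For what it's worth, that 19.49 TFLOPS figure falls straight out of the shader count and clock, since a fused multiply-add counts as two operations. A quick check using the commonly reported A100 specs (6,912 FP32 CUDA cores, ~1,410 MHz boost; treat both as assumptions):

```python
# Peak FP32 throughput = cores x clock x 2 ops per fused multiply-add (FMA)
cuda_cores = 6912        # reported A100 FP32 CUDA core count
boost_clock_ghz = 1.410  # reported boost clock
tflops = cuda_cores * boost_clock_ghz * 2 / 1000
print(f"{tflops:.2f} TFLOPS")  # 19.49 TFLOPS
```

So once leaked core counts and clocks firm up for the 3080/3080 TI, the same arithmetic gives their theoretical FP32 numbers.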
 
Last edited:
Soldato
Joined
20 Aug 2019
Posts
3,033
Location
SW Florida
If they're giving away more performance I would also expect a price increase. No such thing as a free lunch where Nvidia are concerned.

More performance at a given price point is just normal generational progress. It's not a "free lunch".

It's why the 1070 I bought new for under $400 was much faster than something like the GTX 580 that launched at $500.
 
Soldato
Joined
15 Oct 2019
Posts
11,832
Location
Uk
More performance at a given price point is just normal generational progress. It's not a "free lunch".

It's why the 1070 I bought new for under $400 was much faster than something like the GTX 580 that launched at $500.
The 570 was also under $400, which would be the equivalent-tier card to the 1070, would it not?

The point I was making is that if Nvidia is going to increase the performance over and above their usual bump this time around, then I would fully expect a price increase from that "given price point" to go hand in hand.

What's next...no new cards but prices go up by 50% every year anyway?

Not quite, but Turing pretty much gave the same performance at the same price as Pascal, with only the 2080 Ti offering better performance than was previously available, and with a hefty price increase.
 
Last edited:
Soldato
Joined
21 Jul 2005
Posts
20,148
Location
Officially least sunny location -Ronskistats
The 1080 Ti was an example of a unicorn-like 'free lunch': although it was a top-of-the-stack release, they probably didn't foresee that it would be king of the hill for so long, and instead of a steady % bump like the other flavours it gave a meaty % boost. If you were lucky enough to get one at standard RRP before the miners and the hardcore mob, you had a bargain, it turns out.

They won't be giving that away this time around, unless AMD kick 'em in the sack and force them to with a product that trades blows.
 
Soldato
Joined
20 Aug 2019
Posts
3,033
Location
SW Florida
The 570 was also under $400, which would be the equivalent-tier card to the 1070, would it not?

The point I was making is that if Nvidia is going to increase the performance over and above their usual bump this time around, then I would fully expect a price increase from that "given price point" to go hand in hand.



Not quite, but Turing pretty much gave the same performance at the same price as Pascal, with only the 2080 Ti offering better performance than was previously available, and with a hefty price increase.


A ~30% increase with each generation should be reasonable. Now that we are looking at *two* generations since Pascal, 60% faster than a 1080 Ti *for 1080 Ti money* would be "okay".

Historically speaking, Turing's lackluster price/performance is an outlier.
 
Soldato
Joined
15 Oct 2019
Posts
11,832
Location
Uk
Why do you think it should work like that? Yes the cards go up in price each generation but they shouldn't get say 50% more expensive for 50% more performance.

GTX 1080 Ti > RTX 2080 Ti was almost a 50% price increase for only 30% more performance.
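Put rough numbers on that and performance-per-dollar actually went backwards. A sketch using the cards' launch MSRPs ($699 and $999) and a ~30% uplift; treat all three figures as assumptions:

```python
# Did perf-per-dollar improve from the 1080 Ti to the 2080 Ti?
price_1080ti, price_2080ti = 699, 999   # assumed launch MSRPs (USD)
perf_uplift = 1.30                      # assumed ~30% faster

price_ratio = price_2080ti / price_1080ti
perf_per_dollar = perf_uplift / price_ratio
print(f"price +{(price_ratio - 1) * 100:.0f}%, perf/$ {(perf_per_dollar - 1) * 100:+.0f}%")
# price +43%, perf/$ -9%
```

Anything under 1.0 on that last ratio means the newer card delivered less performance per dollar than its predecessor.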
 
Soldato
Joined
20 Aug 2019
Posts
3,033
Location
SW Florida
GTX 1080 Ti > RTX 2080 Ti was almost a 50% price increase for only 30% more performance.

Turing is the outlier. They would need to stagnate/regress for *many* generations before Turing becomes the norm.

And I really think we should stop getting distracted with naming schemes and "tiers".

I'll take a $600 "bottom tier" 3050ti if it's 60% faster than my 1080Ti.

Likewise, I will pass on a "God tier" "3090 Super Duper THIS IS SPARTA! Ti" if it only offers 35% more performance than my 1080 Ti and costs $1500.
 
Last edited:
Associate
Joined
21 Apr 2007
Posts
2,494
I would've thought NV would want to release the 3080 TI this year, to make sure they maintain a significant performance advantage over AMD's RDNA 2 GPUs. RDNA 2 GPUs will probably be a lot more competitive with the RTX 3080 and 3070 than with the 3080 TI. We also know NV can already produce more powerful GPUs than the 3080 TI, so there should be nothing stopping them from releasing the TI this year.

Also, the GA100 is obviously a very capable GPU for graphics and rendering (as well as other areas: compute, deep learning), so I'm not sure why you think it's silly to compare it to the 3080 TI. It represents the best Nvidia can produce with the current technology (it almost certainly has the most transistors of its generation, like previous Tesla GPUs). We can see this from the performance stats:

Pixel Rate: 225.6 GPixel/s · Texture Rate: 609.1 GTexel/s · FP32: 19.49 TFLOPS

We can't possibly work out the performance of the 3080 or 3080 TI without looking at the specs for the Tesla A100.

No idea what die the 3080 and 3080 TI will use. Could it end up being the GA102 for both (this is what TechPowerUp estimates), or is the 3080 TI definitely going to use a different die?

Dude, this is all emotion, not fact; you're spinning on noise and speculation. Nvidia, if they wanted, could give you right now a GPU that outperforms a 2080 Ti for half the price... if they wanted. No one cares about the A100 in the consumer space; hell, it's probably on a different process node to consumer GPUs and has very little bearing on what we might get. Avoid the smoke and mirrors. It's silly to make comparisons on things like TFLOPS because they don't translate into graphics performance, and it's doubly silly if you're not comparing against competitive products.

You're asking questions we can't answer and fanning the flames to keep the topic going... we don't know **** is the simple truth, and bizarrely that seems, at this point, to suit AMD. A 102 die on an Nvidia 80-class card says something if true; beyond that, your guess is as good as mine.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
30% seems quite likely to me; they were near the die size limit even with the RTX 2080 TI. You could maybe get a 40% increase with the factory OC'd models.

The fab process shrink to 5nm should be significant though, I think, for both AMD and Nvidia, as it should cram a lot more transistors into the same space, if TSMC's transistor density estimates end up being true.

On the other hand, the fab process shrinks don't seem to result in as large a density improvement for GPUs as they do for some CPUs:

  • With the 7nm Samsung process, the transistor density is 95.02M / mm²
  • With the A100 7nm GPU, the transistor density is 65.6M / mm²
This makes me wonder if part of the reason AMD is behind Nvidia is related to the lower transistor density of their RDNA v1 and v2 GPUs.
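Those density figures are easy to sanity-check from reported die specs. Taking commonly cited numbers (A100: ~54.2B transistors on a ~826 mm² die; Navi 10: ~10.3B on ~251 mm²; both are assumptions, not from this thread):

```python
# Transistor density = transistor count / die area
dies = {
    "A100 (Ampere, 7nm)":  (54_200, 826),  # (millions of transistors, mm^2)
    "Navi 10 (RDNA 1, 7nm)": (10_300, 251),
}
for name, (mtransistors, area_mm2) in dies.items():
    print(f"{name}: {mtransistors / area_mm2:.1f} M/mm^2")
# A100 (Ampere, 7nm): 65.6 M/mm^2
# Navi 10 (RDNA 1, 7nm): 41.0 M/mm^2
```

On those assumed numbers, Navi 10 does indeed pack noticeably fewer transistors per mm² than A100 on the same nominal node.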
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Dude this is all emotion not fact,

If you say so. I was mostly just stating what we already know about the A100 Ampere GPU, like its theoretical performance. Even the benchmarks are looking good...

Honestly, I'd like to hear other opinions, or extra information, we don't have that much to go on really.
 