
But Jensen, how fast is it?

Associate
Joined
16 Oct 2012
Posts
218
Seeing as there were no relative performance figures for the 20xx vs the 10xx series in non-raytracing applications, it's a relatively safe bet there aren't many architecture improvements on the pixel-shader side, and the improved 12nm process isn't likely to let them clock much faster. Assuming that, how fast are the 20xx cards?

For a while the FLOP calculation for Nvidia GPUs has been:
CUDA cores x clock speed x 2

TITAN V: GPU Cores 5120
1455 MHz (Turbo)= 14.9 TFLOP
1900MHz (overclock) = 19.5 TFLOP

RTX 2080 Ti : GPU Cores 4352
1545 MHz (Turbo)= 13.4 TFLOP
2000MHz (overclock ?) = 17.4 TFLOP


GTX1080 Ti : GPU Cores 3584
1582 MHz (Turbo) = 11.3 TFLOP
2000MHz (overclock) = 14.3 TFLOP

RTX 2080 : GPU Cores 2944
1710 MHz (Turbo) = 10.1 TFLOP
2000MHz (overclock ?) = 11.8 TFLOP


GTX1080 : GPU Cores 2560
1733 MHz (Turbo) = 8.9 TFLOP
2000MHz (overclock) = 10.2 TFLOP


Keep in mind there are some assumptions going on here, but FLOPS generally translate pretty linearly to frame rate within the same general architecture. If I'm right, the RTX 2080 looks like a pretty terrible deal, while the RTX 2080 Ti isn't actually that bad, despite the initial sticker shock.
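The back-of-the-envelope numbers above can be reproduced in a few lines of Python. This is just a sketch of the post's own "cores x clock x 2" rule of thumb (the factor of 2 being the usual one-FMA-per-cycle assumption); the core counts and clocks are the ones quoted in the post, with the RTX 2080 at its actual 2944 cores:

```python
# Rough single-precision FLOPS estimate: CUDA cores x clock x 2
# (one fused multiply-add per core per cycle counts as 2 operations)
def tflops(cuda_cores, clock_mhz):
    return cuda_cores * clock_mhz * 1e6 * 2 / 1e12

# (cores, turbo clock MHz, overclock MHz) as quoted in the post
cards = {
    "TITAN V":     (5120, 1455, 1900),
    "RTX 2080 Ti": (4352, 1545, 2000),
    "GTX 1080 Ti": (3584, 1582, 2000),
    "RTX 2080":    (2944, 1710, 2000),
    "GTX 1080":    (2560, 1733, 2000),
}

for name, (cores, turbo, oc) in cards.items():
    print(f"{name}: {tflops(cores, turbo):.1f} TFLOP turbo, "
          f"{tflops(cores, oc):.1f} TFLOP overclocked")
```

It's only an upper bound, of course; real-game throughput depends on memory bandwidth and how well the shaders are fed, which is exactly why the per-architecture caveat matters.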
 
Soldato
Joined
12 Mar 2006
Posts
22,990
Location
N.E England
Best post I’ve seen for hours. I expect marginal gains over a 1080 Ti (I really want it to smash it, though :(); drivers may optimise it to gain more over time, but I reluctantly say I’m out.

I’m very interested to see if AMD can capitalise on this pricing and pull a rabbit out of the hat.
 
Associate
Joined
27 May 2010
Posts
1,051
Location
Kent
Best post I’ve seen for hours. I expect marginal gains over a 1080 Ti (I really want it to smash it, though :(); drivers may optimise it to gain more over time, but I reluctantly say I’m out.

I’m very interested to see if AMD can capitalise on this pricing and pull a rabbit out of the hat.

It would be nice but I can't see it. Surely we'd have heard something by now if they had an ace up their sleeve.
 
Associate
Joined
9 Jul 2009
Posts
1,008
There is a reason they didn't show us any benchmarks. Sure, they may be 8x faster in ray tracing, but in actual games they will be only marginally faster. In fact I fully expect the 2080 to be slower than the 1080 Ti unless ray tracing is enabled, in the few games that support it. Why else would they release the 2080 Ti alongside it?
 
Soldato
Joined
26 Sep 2010
Posts
7,157
Location
Stoke-on-Trent
Surely we'd have heard something by now if they had an ace up their sleeve

There may be a renewed drive to do a 7nm RX Vega now. The 18-month gap until Navi felt weird to me; I can't get my head around the notion that Vega is respun onto 7nm for Instinct cards, but RX Vega will continue to sit overpriced on 14nm until the end of next year. Does not compute.

I always had a feeling we'd see a 7nm RX Vega with Hynix HBM2 to improve yields, boost performance a bit, bring power draw down and lower prices a touch until Navi lands (and also give those 7nm muscles plenty of flex so the process is mature for Navi). I think there is an opportunity for AMD now to fill a gap.
 
Associate
OP
Joined
16 Oct 2012
Posts
218
Titan V overclocks a bit better than you have there and actually performs much the same as Pascal for clock speed.
I did lowball the clock on the Titan V, but only because I've not seen them consistently hit 2GHz, and counting 50MHz jumps seems a bit mealy.

The OCUK Time Spy results are actually pretty revealing. Taking the top overclocked GPU score for each of the cards on OCUK:
Titan V @1972/1000, GFX score 14598 (2.85 points per CUDA core)
Titan Xp @2050/3106, GFX score 11405 (2.97 points per CUDA core)
1080 Ti @2151/3000, GFX score 11267 (3.14 points per CUDA core)
1080 @2190/2850, GFX score 8552 (3.34 points per CUDA core)

There's an 11% difference in clock speed between the Titan V and the GTX 1080, and a 17% difference in points per CUDA core. In the scheme of things that 6% gap is pretty small compared to the overall gain in performance. Like I said, there are assumptions here which I know are flawed; I'm just trying to "guess with the benefit of experience".
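The per-core comparison above is easy to check directly from the listed scores. A small sketch, using the GFX scores and clocks as posted (core counts are the public specs for each card, including 3840 for the Titan Xp, which the post's 2.97 figure implies):

```python
# Time Spy graphics score divided by CUDA core count,
# as a crude per-core efficiency metric across architectures
results = {
    "Titan V":  {"cores": 5120, "clock_mhz": 1972, "gfx_score": 14598},
    "Titan Xp": {"cores": 3840, "clock_mhz": 2050, "gfx_score": 11405},
    "1080 Ti":  {"cores": 3584, "clock_mhz": 2151, "gfx_score": 11267},
    "1080":     {"cores": 2560, "clock_mhz": 2190, "gfx_score": 8552},
}

for name, r in results.items():
    per_core = r["gfx_score"] / r["cores"]
    print(f"{name}: {per_core:.2f} points per CUDA core at {r['clock_mhz']} MHz")

# Compare the extremes: how much of the per-core gap is just clock speed?
clock_ratio = results["1080"]["clock_mhz"] / results["Titan V"]["clock_mhz"]
ppc_ratio = (results["1080"]["gfx_score"] / results["1080"]["cores"]) / \
            (results["Titan V"]["gfx_score"] / results["Titan V"]["cores"])
print(f"clock +{(clock_ratio - 1) * 100:.0f}%, "
      f"points-per-core +{(ppc_ratio - 1) * 100:.0f}%")
# -> clock +11%, points-per-core +17%
```

That 11% vs 17% split is where the post's "6% difference" comes from: most of the per-core gap between the Titan V and the 1080 is explained by clock speed alone.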
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
I did lowball the clock on the Titan V, but only because I've not seen them consistently hit 2GHz, and counting 50MHz jumps seems a bit mealy.

The OCUK Time Spy results are actually pretty revealing. Taking the top overclocked GPU score for each of the cards on OCUK:
Titan V @1972/1000, GFX score 14598 (2.85 points per CUDA core)
Titan Xp @2050/3106, GFX score 11405 (2.97 points per CUDA core)
1080 Ti @2151/3000, GFX score 11267 (3.14 points per CUDA core)
1080 @2190/2850, GFX score 8552 (3.34 points per CUDA core)

There's an 11% difference in clock speed between the Titan V and the GTX 1080, and a 17% difference in points per CUDA core. In the scheme of things that 6% gap is pretty small compared to the overall gain in performance. Like I said, there are assumptions here which I know are flawed; I'm just trying to "guess with the benefit of experience".

You really need to discount the 1080 and 1080 Ti scores above, as they were done on waterblocks with exceptional cards. Normal 1080 and 1080 Ti cards hover around 2000MHz on air, much the same as the Titan V and Titan Xp do.

Here is another one with the Titan V @2047MHz on air.

https://www.3dmark.com/fs/14722347

To be fair, though, the average is about 2000MHz for the card. :)
 
Associate
Joined
27 May 2010
Posts
1,051
Location
Kent
There may be a renewed drive to do a 7nm RX Vega now. The 18-month gap until Navi felt weird to me; I can't get my head around the notion that Vega is respun onto 7nm for Instinct cards, but RX Vega will continue to sit overpriced on 14nm until the end of next year. Does not compute.

I always had a feeling we'd see a 7nm RX Vega with Hynix HBM2 to improve yields, boost performance a bit, bring power draw down and lower prices a touch until Navi lands (and also give those 7nm muscles plenty of flex so the process is mature for Navi). I think there is an opportunity for AMD now to fill a gap.

How long would we be looking at for that to happen, though? If they want to take advantage of the whole Turing price debacle then they need something out, even a hint or a carefully planted rumour, to pick up the disgruntled customers who would have gone green.

Surely they couldn't get something as complicated as you describe done and out before year's end?
 
Soldato
Joined
19 Oct 2002
Posts
5,048
Location
Pembrokeshire
The entire thing seemed to be about shadows, something you're not going to notice in-game or appreciate if you did. I would have been interested in a game demo playing maxed out at 4K, or a comparison with a 1080 Ti.
 
Soldato
Joined
26 Sep 2010
Posts
7,157
Location
Stoke-on-Trent
How long would we be looking at for that to happen, though? If they want to take advantage of the whole Turing price debacle then they need something out, even a hint or a carefully planted rumour, to pick up the disgruntled customers who would have gone green.

Surely they couldn't get something as complicated as you describe done and out before year's end?

Well, in real terms I don't know. From what I understand, Vega is a complex package to build, with the HBM being built into it, so it's not just a case (as I thought previously) of taking the defective Vega 20 dies used for Instinct and deactivating some parts to get the 7nm equivalent of a Vega 10.

That being said, Instinct is supposed to be landing at the end of this year, and Lisa Su said back in May (was it?) that Vega 20 was already sampling to vendors.

How much of a stretch is it to think that if AMD have been producing 7nm Vega 20 for most of this year so far, they can spin out a 7nm Vega 10 too?

But even if they can, you'd need Hynix HBM2 to bring the costs down (keep the Samsung HBM2 for Instinct), package everything up, and get it launched and on the shelves by the end of September in order to capitalise on the RTX's stupid prices. And even then that's assuming the minor speed bump you'd get on 7nm will consistently push Vega 64 over GTX 1080 performance and Vega 56 over the GTX 1070.
 
Associate
OP
Joined
16 Oct 2012
Posts
218
You really need to discount the 1080 and 1080 Ti scores above, as they were done on waterblocks with exceptional cards. Normal 1080 and 1080 Ti cards hover around 2000MHz on air, much the same as the Titan V and Titan Xp do.

With respect, you probably are correct, but I don't see why that's really relevant to the core argument of my post; you're getting a little too hung up on the detail. The point is that the RTX 2080 is a total dog at that price point, and nothing from the RTX 2080 Ti will get anywhere close to the Titan V.

P.S. My 1080 is on air and matches the other one's score. Anecdote is anecdote, but still.
 
Associate
Joined
27 May 2010
Posts
1,051
Location
Kent
Well, in real terms I don't know. From what I understand, Vega is a complex package to build, with the HBM being built into it, so it's not just a case (as I thought previously) of taking the defective Vega 20 dies used for Instinct and deactivating some parts to get the 7nm equivalent of a Vega 10.

That being said, Instinct is supposed to be landing at the end of this year, and Lisa Su said back in May (was it?) that Vega 20 was already sampling to vendors.

How much of a stretch is it to think that if AMD have been producing 7nm Vega 20 for most of this year so far, they can spin out a 7nm Vega 10 too?

But even if they can, you'd need Hynix HBM2 to bring the costs down (keep the Samsung HBM2 for Instinct), package everything up, and get it launched and on the shelves by the end of September in order to capitalise on the RTX's stupid prices. And even then that's assuming the minor speed bump you'd get on 7nm will consistently push Vega 64 over GTX 1080 performance and Vega 56 over the GTX 1070.

It sounds like a nice idea, but one that is beyond the capabilities of AMD to pull off, I fear. Don't get me wrong, I'd love for them to do it, but you get the impression that part of the reason Turing is priced so high is because Nvidia know they have the market to themselves and can pretty much act with impunity.

Pretty thoroughly depressing state of affairs really!
 
Soldato
Joined
21 Jan 2016
Posts
2,915
The entire thing seemed to be about shadows, something you're not going to notice in-game or appreciate if you did. I would have been interested in a game demo playing maxed out at 4K, or a comparison with a 1080 Ti.

True, the improvements individually don't seem all that amazing, but I do think that the real-time reflections shown (especially in that Battlefield V part) and the shadows will contribute to a general overall fidelity that, while you don't necessarily notice all the individual components, just gives a much more immersive experience... especially when you consider ray tracing in VR, where immersion is everything.

That said, I don't think I'm stumping up north of £1k to replace my 980 Ti Hydro Coppers. Sadly I think I'll be eking another year or two out of them.
 

ljt

Soldato
Joined
28 Dec 2002
Posts
4,540
Location
West Midlands, UK
True, the improvements individually don't seem all that amazing, but I do think that the real-time reflections shown (especially in that Battlefield V part) and the shadows will contribute to a general overall fidelity that, while you don't necessarily notice all the individual components, just gives a much more immersive experience... especially when you consider ray tracing in VR, where immersion is everything.

That said, I don't think I'm stumping up north of £1k to replace my 980 Ti Hydro Coppers. Sadly I think I'll be eking another year or two out of them.

Battlefield games are quite fast-paced. I don't really care whether my teammate's eye reflects the muzzle flash of the tank shooting at us :D. It seems a bit of a waste of resources. Games like that need higher fps and smoother gameplay; I don't spend much time looking down at a puddle for the reflections.
 
Associate
Joined
21 May 2010
Posts
550
There is a reason they didn't show us any benchmarks. Sure, they may be 8x faster in ray tracing, but in actual games they will be only marginally faster. In fact I fully expect the 2080 to be slower than the 1080 Ti unless ray tracing is enabled, in the few games that support it. Why else would they release the 2080 Ti alongside it?
Or is it an excuse to ride the hype train even further? Start the conversation about ray tracing, then spin out a load of internet chatter for a week or so before telling everyone the £750 card is more powerful than the £800 one you have been selling. More hype and more price gouging. A frenzy to buy yet more of the expensive product that reviewers are now saying is the most advanced product ever, etc etc.
 
Don
Joined
19 May 2012
Posts
17,185
Location
Spalding, Lincolnshire
The entire thing seemed to be about shadows, something you're not going to notice in-game or appreciate if you did. I would have been interested in a game demo playing maxed out at 4K, or a comparison with a 1080 Ti.

Pretty much :(

I was sceptical before the announcement and hoped the ray tracing would be the next step that people were hyping it to be, but as predicted it seems to be token visual effects tacked on here and there; nothing game-changing.
 