
The (Ada) Lovelace architecture Thread.

Soldato · Joined 6 Jan 2013 · Posts 21,839 · Location Rollergirl
An OC'd 3090 was 50% faster than a 2080 Ti in some games (remember they got a lot of stick for the 2080 Ti's uplift over the 1080 Ti)... 2080 Ti to 3080 is really a 25-35% increase at most depending on the game, and sometimes less again.

Is there any particular reason why you've decided to compare cards that didn't directly replace each other from generation to generation?

If you've chosen the 3080 to gauge the increase then surely you should be comparing it to the 2080? Similarly, 2070 to 3070?
 
Soldato · Joined 7 Dec 2010 · Posts 8,221 · Location Leeds
Is there any particular reason why you've decided to compare cards that didn't directly replace each other from generation to generation?

If you've chosen the 3080 to gauge the increase then surely you should be comparing it to the 2080? Similarly, 2070 to 3070?

Well, like I said, Nvidia moved them all up one tier for the 3000 series. Really the 3080 should have been on GA104 (the chip the 3070 Ti, 3070, 3060 Ti and 3060 now use), not GA102.


Reality is they should be compared by chip. If things had been done right, the 3080 would have been on GA104, so I'm guessing it would have been the 3070, and the 3080 Super would have been the 3070 Ti. Then the 3080 Ti would have been on GA102, probably the 3080 we have now, and the Titan would have been what we have now as the 3080 Ti at a $2k MSRP, or 3090 spec at $2.5k, probably called the Titan 30 RTX like the last Titan (the Titan RTX), with the full chips going to the A6000 (the Quadro range).



2080 was on TU104.

2080ti was on TU102.


3070 is on GA104.

3080 is on GA102.

3090 is on GA102.


So really the 2080 should be compared to the 3070: https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/35.html


If we are comparing them by model name then yes, 2080 vs 3080 shows a larger increase: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/34.html

I was just comparing the top cards really: the top one from last gen was the 2080 Ti, and I compared it to the 3080 and 3090. The 3080 Ti is basically the same as the 3090, +/- 3-5% depending on the game, and in some cases exactly the same fps.

The way I normally compare them is last gen's top card against the latest top card (on the same class of chip). Coming back to the normal increase: if you look back through history it has been 20-30% from one generation's top card to the next.
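
If anyone wants to sanity check those percentages, here's a quick sketch with made-up fps numbers (not from any real benchmark, just to show how the uplift maths works):

Code:
def uplift(old_fps, new_fps):
    # Percentage gain of the new card over the old one
    return (new_fps / old_fps - 1.0) * 100.0

# Made-up 4K averages, purely illustrative
fps_2080ti = 60.0
fps_3080 = 78.0
fps_3090 = 90.0

print(f"2080 Ti -> 3080: {uplift(fps_2080ti, fps_3080):.0f}%")   # ~30%
print(f"2080 Ti -> 3090: {uplift(fps_2080ti, fps_3090):.0f}%")   # ~50%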



Some good videos worth watching, showing how Nvidia increased performance from gen to gen.







 
Soldato · Joined 7 Dec 2010 · Posts 8,221 · Location Leeds
Yea, ok you win.

There is no winning, mate, it's just the sad reality of the GPU market now. Without AMD this time, the 3080 would really have been a 3070. Nvidia users need to thank them for it. The 3090 is just a Titan renamed to 3090, and thankfully it didn't have Titan pricing, which would have been $2k to $2.5k, but we know the reality of that now too...

Meanwhile 3080s are selling for £1.5k, 3080 Tis for £2k and 3090s for £3k... today's prices...


The xx90 class from Nvidia was always a card with two chips on it (basically SLI on a single card); this is the first xx90-class card with a single chip. Basically they didn't want to do that this time, but they used the xx90 name anyway because those cards always had a silly price tag on them, so they can keep the price high even though the 3090 didn't come with Titan drivers.


The true xx90-class Nvidia cards were these, with two GPUs on one card:

GTX 295 ($500)

https://www.techpowerup.com/gpu-specs/geforce-gtx-295.c239

https://www.anandtech.com/show/2708

GTX 590 ($699)

https://www.techpowerup.com/gpu-specs/geforce-gtx-590.c281

https://www.anandtech.com/show/4239/nvidias-geforce-gtx-590-duking-it-out-for-the-single-card-king

GTX 690 ($999)

https://www.techpowerup.com/gpu-specs/geforce-gtx-690.c361

https://www.anandtech.com/show/5805...-review-ultra-expensive-ultra-rare-ultra-fast

Titan Z (at least came with Titan drivers and a really silly price of $3k)

https://www.techpowerup.com/gpu-specs/geforce-gtx-titan-z.c2575

https://www.anandtech.com/show/8069/nvidia-releases-geforce-gtx-titan-z
 
Associate · Joined 4 Nov 2015 · Posts 250
As a more general question, should we as consumers and climate change-aware citizens accept 500w+ requirements for a single component? It's an enormous amount of power and that's not going to be cheap. Moreover, you could argue that power spent on gaming is equally as wasteful as power spent on crypto mining.

Thoughts?
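
To put a rough number on the running cost, here's a back-of-envelope sketch; the tariff and daily hours are just my assumptions, swap in your own:

Code:
# Back-of-envelope running cost of a 500W GPU while gaming.
# The tariff and hours are assumptions, not figures from this thread.
gpu_watts = 500          # assumed board power while gaming
hours_per_day = 3        # assumed gaming time per day
price_per_kwh = 0.30     # assumed tariff in GBP per kWh

kwh_per_day = gpu_watts / 1000 * hours_per_day
cost_per_day = kwh_per_day * price_per_kwh
print(f"{kwh_per_day:.1f} kWh/day, about £{cost_per_day:.2f}/day, £{cost_per_day * 365:.0f}/year")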
 
Soldato · Joined 6 Jan 2013 · Posts 21,839 · Location Rollergirl
As a more general question, should we as consumers and climate change-aware citizens accept 500w+ requirements for a single component? It's an enormous amount of power and that's not going to be cheap. Moreover, you could argue that power spent on gaming is equally as wasteful as power spent on crypto mining.

Thoughts?

It's not a compulsory purchase (for some)
 
Associate · Joined 3 Oct 2008 · Posts 1,393
I have a Seasonic 500W PSU that I've had for 12-14 odd years and it's run multiple systems like a champ, though I've had to be careful not to use anything too power hungry as time has gone on, so I'll have to upgrade it at some point, probably when it dies.

It's really weird to me how an upcoming graphics card could use up to 500W, as I run my whole system on that. :eek:
 
Soldato · Joined 30 Aug 2014 · Posts 5,960
Same, I've only just bought a new system with an 850W PSU in it. If that's not enough for a 4070 I'll give it a miss; I don't really want a system drawing silly power. The exception would be if the performance leap was massive.
With the cost of living crisis and the environmental impact, this ludicrous power draw is even more insane. AMD are rumoured to have equal or better performance to Nvidia in the next generation while using significantly less power. I would go that way if I felt I had to upgrade.
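
For what it's worth, this is the kind of rough headroom check I'd do before deciding; every wattage in it is a guess rather than a real 40-series figure:

Code:
# Rough PSU headroom check including transient spikes.
# Every wattage here is an assumption, not a real 40-series spec.
psu_watts = 850
rest_of_system = 250     # assumed CPU + board + drives + fans under load
gpu_average = 300        # assumed GPU average board power
spike_factor = 1.8       # GPUs can transiently spike well above average draw

sustained = rest_of_system + gpu_average
worst_case = rest_of_system + gpu_average * spike_factor

print(f"Sustained: {sustained} W ({sustained / psu_watts:.0%} of the PSU)")
print(f"Spike worst case: {worst_case:.0f} W ({worst_case / psu_watts:.0%} of the PSU)")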
 
Soldato · Joined 7 Dec 2010 · Posts 8,221 · Location Leeds
With the cost of living crisis and the environmental impact, this ludicrous power draw is even more insane. AMD are rumoured to have equal or better performance to Nvidia in the next generation while using significantly less power. I would go that way if I felt I had to upgrade.

I'll believe it when I see it... There are even 6900 XT cards that recommend an 850W-1000W PSU. Also, the power spikes of the 6900 XT are worse than a 3090's.

AMD is on TSMC 7nm and can spike worse than a 3090, while Nvidia is on a worse node, Samsung 8nm (comparable to TSMC 10nm). Next time they will both be on 6nm or 5nm and you will see the reality: power use will be silly for both AMD and Nvidia. It is a brute-force war now, and that means silly power draw and a lot of heat coming from your PC.


Linus had to swap out a 1000W Seasonic premium PSU in his tests because the PSU would shut down, and that was with a reference AMD card, so can you imagine the AIB OC cards and what they can pull and spike to.


Watch from 7 minutes.
 