NVIDIA GeForce RTX 3090 Ti, the flagship reinvented

Associate
Joined
31 Dec 2008
Posts
2,260
A 40W chip losing to a 100W chip - news at 10, shocker.

Most laptops aren't used for gaming.
Nope. The M1 Max GPU on its own is about 70W, and since the M1 Max is about 30% slower than a 100W 3080 in Tomb Raider, performance per watt is about the same.
I agree that MacBook Pros are not made for gaming, but we are comparing them to Nvidia GPUs, aren't we?
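As a quick sanity check on that claim, here is a minimal sketch in Python, assuming only the figures quoted above (a ~70W M1 Max GPU, a 100W RTX 3080 Laptop, and a roughly 30% deficit in Tomb Raider):

```python
# Performance-per-watt comparison using the figures quoted in the post above.
m1_power_w = 70            # claimed M1 Max GPU power draw
rtx_power_w = 100          # claimed RTX 3080 Laptop power limit

rtx_fps = 100.0            # normalise the 3080 Laptop to 100 fps
m1_fps = rtx_fps * 0.70    # "about 30% slower" in Tomb Raider

print(f"M1 Max: {m1_fps / m1_power_w:.2f} fps/W")    # 1.00 fps/W
print(f"3080L:  {rtx_fps / rtx_power_w:.2f} fps/W")  # 1.00 fps/W
```

On those numbers, both land at exactly 1.0 fps/W, which is the poster's point.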
 
Soldato
Joined
17 Jun 2004
Posts
7,587
Location
Eastbourne, East Sussex.
I got the 40W number from AnandTech; it's also an apples-to-oranges comparison, as the RTX 3080 Laptop is a separate chip, whereas the M1's GPU is on-die. The other interesting part is that it's using LPDDR5 system RAM, whereas the RTX 3080 Laptop is using dedicated RAM (MSI GE76, 16GB of GDDR6).
 
Associate
Joined
31 Dec 2008
Posts
2,260
I got the 40W number from AnandTech; it's also an apples-to-oranges comparison, as the RTX 3080 Laptop is a separate chip, whereas the M1's GPU is on-die. The other interesting part is that it's using LPDDR5 system RAM, whereas the RTX 3080 Laptop is using dedicated RAM (MSI GE76, 16GB of GDDR6).

I agree it's comparing apples to oranges, but again, it wasn't me who first compared it to Nvidia's architecture.
The 30-40W in the AnandTech review is for a CPU benchmark, not the GPU. With both loaded, package power draw is over 90W, and about 120W from the wall.
Regarding RAM/VRAM, the M1 Max uses unified memory, which has higher bandwidth than the laptop 3080: 400GB/s for the M1 Max versus 384GB/s for the 3080 Laptop.
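For reference, both bandwidth figures fall out of the memory configurations. A minimal sketch, assuming the commonly quoted specs (512-bit LPDDR5-6400 for the M1 Max, 256-bit 12 Gbps GDDR6 for the 3080 Laptop; neither spec is stated in the post itself):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in GT/s.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits / 8 * data_rate_gtps

print(peak_bandwidth_gbs(512, 6.4))   # 409.6 -> Apple rounds this to 400GB/s
print(peak_bandwidth_gbs(256, 12.0))  # 384.0 -> matches the 3080 Laptop figure
```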
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Except M1s are 5nm chips, and the M1 Max is losing by a big margin to a 100W laptop 3080 in Tomb Raider, which is a native macOS game.
So clearly it's not that great for gaming.

Indeed, M1s are 5nm, unlike anything from Nvidia.

They are also not designed for gaming, hence it is a poor comparison.

They also use less than half the wattage of the 100W laptop GPU you quoted.

If you use an M1-based laptop like the MacBook Pro with the M1 Max for graphics-based work, it is a beast and will beat your 100W 3080, even more so when both are unplugged from the mains, as the Apple one does not downclock when running on battery.

Having said all that, the point I was making is that for efficiency and power, Apple's M1 chips leave anything from Nvidia a long way behind.

Nvidia are selling us ancient 8nm Ampere technology and charging prices even Apple would be embarrassed to charge.
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
Indeed, M1s are 5nm, unlike anything from Nvidia.

They are also not designed for gaming, hence it is a poor comparison.

They also use less than half the wattage of the 100W laptop GPU you quoted.

If you use an M1-based laptop like the MacBook Pro with the M1 Max for graphics-based work, it is a beast and will beat your 100W 3080, even more so when both are unplugged from the mains, as the Apple one does not downclock when running on battery.

Having said all that, the point I was making is that for efficiency and power, Apple's M1 chips leave anything from Nvidia a long way behind.

Nvidia are selling us ancient 8nm Ampere technology and charging prices even Apple would be embarrassed to charge.

That's not true. Years ago, when the Mac Pro line was sitting stagnant, professionals were screaming for system updates, and Apple still had no issue charging huge prices for years-old technology. Many got to the stage where they switched to Windows for a while, because otherwise you were purchasing very old technology at premium prices.

It was so bad at one point that many thought Apple had dropped the pro line. I think the Mac Pro went about four years without changes.
 
Soldato
Joined
17 Jun 2004
Posts
7,587
Location
Eastbourne, East Sussex.
That's not true. Years ago, when the Mac Pro line was sitting stagnant, professionals were screaming for system updates, and Apple still had no issue charging huge prices for years-old technology. Many got to the stage where they switched to Windows for a while, because otherwise you were purchasing very old technology at premium prices.

It was so bad at one point that many thought Apple had dropped the pro line. I think the Mac Pro went about four years without changes.

The same four years that Intel didn't move on from Skylake...
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
That's not true. Years ago, when the Mac Pro line was sitting stagnant, professionals were screaming for system updates, and Apple still had no issue charging huge prices for years-old technology. Many got to the stage where they switched to Windows for a while, because otherwise you were purchasing very old technology at premium prices.

It was so bad at one point that many thought Apple had dropped the pro line. I think the Mac Pro went about four years without changes.

I totally agree; Apple's pricing still leaves a lot to be desired.

The Mac Pro desktops still have not seen any changes and are still using Intel CPUs.

It gets even more embarrassing for them when the top-of-the-range MacBook Pro (a laptop using the M1 chip) can beat the best Mac Pro desktops (using Intel CPUs) at a lot of tasks.

The point I am trying to make is that when you see a run-of-the-mill GPU like the one below selling for nearly 3k from a shop, it is enough to make Apple blush.

https://www.overclockers.co.uk/giga...dr6x-pci-express-graphics-card-gx-1bz-gi.html
 
Soldato
Joined
6 Feb 2019
Posts
17,464
I totally agree; Apple's pricing still leaves a lot to be desired.

The Mac Pro desktops still have not seen any changes and are still using Intel CPUs.

It gets even more embarrassing for them when the top-of-the-range MacBook Pro (a laptop using the M1 chip) can beat the best Mac Pro desktops (using Intel CPUs) at a lot of tasks.

The point I am trying to make is that when you see a run-of-the-mill GPU like the one below selling for nearly 3k from a shop, it is enough to make Apple blush.

https://www.overclockers.co.uk/giga...dr6x-pci-express-graphics-card-gx-1bz-gi.html


Quite cheap :p
 
Associate
Joined
8 Sep 2020
Posts
1,432
Fits my Corsair HX1000 perfectly! If it's 10-15% faster than the 3090, I am seriously considering selling the 3080 Ti and switching to it, but then I think Lovelace is seven months away, so is it worth the effort?

10-15% faster than a 3090? That would be a no :cry: It will probably perform like a good OC'd 3090 out of the box, with a little more when overclocked itself. At best I reckon 5%, and that is being generous, but we shall see. The difference between a 3080 Ti and a 3090 is about the same as the difference we are getting between the 3090 and the 3090 Ti, and we all know how close the 3080 Ti was to the 3090: an extra 1fps, if that. Benchmarks may show a difference, but not gaming. If you're after a 15%+ jump, then wait a few months for the 4000 series :D
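The "about the same difference" point checks out on paper. A minimal sketch using the published Ampere CUDA core counts:

```python
# Published CUDA core counts for the three cards being compared.
cores = {"3080 Ti": 10240, "3090": 10496, "3090 Ti": 10752}

for a, b in [("3080 Ti", "3090"), ("3090", "3090 Ti")]:
    gain = cores[b] / cores[a] - 1
    print(f"{a} -> {b}: +{gain:.1%} cores")
# 3080 Ti -> 3090:  +2.5% cores
# 3090 -> 3090 Ti:  +2.4% cores
```

Both steps add roughly 2.5% more cores, which is why neither gap amounts to much in games (clock and power-limit differences aside).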
 
Caporegime
Joined
12 Jul 2007
Posts
40,410
Location
United Kingdom
10-15% faster than a 3090? That would be a no :cry: It will probably perform like a good OC'd 3090 out of the box, with a little more when overclocked itself. At best I reckon 5%, and that is being generous, but we shall see. The difference between a 3080 Ti and a 3090 is about the same as the difference we are getting between the 3090 and the 3090 Ti, and we all know how close the 3080 Ti was to the 3090: an extra 1fps, if that. Benchmarks may show a difference, but not gaming. If you're after a 15%+ jump, then wait a few months for the 4000 series :D
My money is on Jay's 3090 being faster than most 3090 Tis. Log it. :cool:
 
Soldato
Joined
19 Feb 2007
Posts
14,254
Location
ArcCorp
I won't

The lesson I get from the 3090 is that Samsung Ampere is a dreadfully inefficient, power-hungry architecture, and any more of the same is not worth wasting money on.

I only have to look at what Apple have been doing with their latest M1 chips, which have integrated graphics, to know how rubbish Nvidia's Ampere is.

Best to wait until the next gen of GPUs is available rather than waste money on a pointless 3090 Ti.

I am not even using my 3090 at the moment; it is sat in my spare PC. I am using my Titan V 24/7, which, although getting on a bit, is still good enough for my games.

When the 3090 Ti does turn up, if anyone can get one, it may be able to reclaim a couple of benchmark titles from the 6900 XT, but for real-world use there is no benefit for the likely huge cost.

I spent about 10k in December on new tech, and not a penny of it went to the greedy $%^& at Nvidia.

@NVidia I will buy your products again, but only when you make something worth buying.

Nicely surprising statement. Good to hear.

Fits my Corsair HX1000 perfectly! If it's 10-15% faster than the 3090, I am seriously considering selling the 3080 Ti and switching to it, but then I think Lovelace is seven months away, so is it worth the effort?

It has 2% more CUDA cores than a 3090, the same identical Ampere cores that are in a "standard" 3090, and that's the only notable change. 2% more cores of the same architecture will not result in it being 15% faster than a "standard" 3090; more like 5% faster, and I'm being very liberal with that estimate.
 
Soldato
Joined
15 Oct 2019
Posts
11,656
Location
Uk
Nvidia should have built it on TSMC 7nm, clocked it to 2.5GHz, and given it the Titan name plus drivers if they really wanted to make it a worthwhile card.
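For a sense of what that wish would mean on paper, here is a minimal sketch; the 10752-core count is the 3090 Ti figure from the news post below, and the 2.5GHz clock is purely the poster's hypothetical:

```python
# Ampere FP32 throughput: 2 FLOPs per CUDA core per clock.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return 2 * cuda_cores * clock_ghz / 1000

print(f"{fp32_tflops(10752, 1.86):.1f} TFLOPS")  # ~40.0 at the announced boost
print(f"{fp32_tflops(10752, 2.50):.1f} TFLOPS")  # ~53.8 at a hypothetical 2.5GHz
```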
 
Soldato
Joined
7 Dec 2010
Posts
8,221
Location
Leeds
NVIDIA GeForce RTX 3090 Ti pictured, specifications confirmed

https://videocardz.com/newz/nvidia-geforce-rtx-3090-ti-pictured-specifications-confirmed

The RTX 3090 Ti Founders Edition is a triple-slot design, pretty much identical to the original model. The card will make use of a single 16-pin power connector, which will feed up to 450W of power. We have already seen custom models with a recommended 1000W power supply, so this SKU is definitely not for mid-range systems.

According to our information, the card will have a significantly higher base clock of 1560 MHz and a boost of 1860 MHz, respectively 12% and 10% higher than the non-Ti RTX 3090. This means the card will offer up to 40 TFLOPS of single-precision compute power.

The NVIDIA GeForce RTX 3090 Ti obviously won't be cheap, but NVIDIA has yet to confirm the MSRP of this model, possibly doing so during its CES 2022 special address tomorrow, where the card will be presented for the first time.
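The quoted percentages and the 40 TFLOPS figure are easy to verify against the reference RTX 3090 clocks (1395 MHz base, 1695 MHz boost). A quick check:

```python
# Verify the article's claimed clock increases over the reference RTX 3090.
base_3090, boost_3090 = 1395, 1695  # RTX 3090 reference clocks (MHz)
base_ti, boost_ti = 1560, 1860      # quoted RTX 3090 Ti clocks (MHz)

print(f"base:  +{base_ti / base_3090 - 1:.0%}")    # +12%
print(f"boost: +{boost_ti / boost_3090 - 1:.0%}")  # +10%

# FP32 throughput with 10752 CUDA cores at the quoted boost clock.
print(f"{2 * 10752 * boost_ti / 1e6:.1f} TFLOPS")  # 40.0
```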
 