
NVIDIA GeForce RTX 3090 Ti, the flagship reinvented

Associate
Joined
23 Oct 2019
Posts
482
I can name the 3090 owners on this forum who will be frantically hitting refresh on various store pages come launch day for the 3090 Ti. Way more money than sense :D

To be honest, I bet plenty of them will not spend a single penny, considering how much 3090s still sell for.
 
Soldato
Joined
6 Feb 2019
Posts
10,250
To be honest, I bet plenty of them will not spend a single penny, considering how much 3090s still sell for.


If the 3090 Ti is LHR and 3090 production stops, then every 3090 owner will be able to upgrade to the 3090 Ti for free. That's what my mates did with their 3080s: once the 3080 LHR came out, the non-LHR 3080 went up in value like crazy, to the point where you could sell it and not have to put in a cent to buy a 3080 Ti.
 
Soldato
Joined
31 Oct 2002
Posts
8,591
To be honest, I bet plenty of them will not spend a single penny, considering how much 3090s still sell for.

Most 3090 owners have made double or triple the value of the 3090 mining Ethereum on it. It's common for people to just assume "more money than sense", when in reality the 3090 was a money earner. It generates 200-300W (depending on mining setup) of heat, saving on gas central heating costs this winter. Profit per day has been consistently between £5-20 (though most often closer to £5 as of late).

There's a very good reason GPUs are so hard to get hold of... I wonder what it could be :D
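Purely as a back-of-the-envelope sketch of the payback maths above: the card price here is an illustrative assumption (not a quote from any retailer), and the daily profit figures are just the £5-20 range mentioned in the post.

```python
# Rough payback estimate for a mining GPU.
# All figures are illustrative assumptions: an assumed street
# price for a 3090, and the £5-20/day profit range from the post.

def payback_days(card_cost_gbp: float, profit_per_day_gbp: float) -> float:
    """Days of mining needed to recoup the card's purchase price."""
    return card_cost_gbp / profit_per_day_gbp

card_cost = 1800.0  # assumed scalped-market price for a 3090, GBP

for daily in (5.0, 10.0, 20.0):
    print(f"£{daily:.0f}/day -> {payback_days(card_cost, daily):.0f} days")
```

Even at the pessimistic £5/day end, that works out to roughly a year to break even, which squares with "it paid for itself a while ago" from posters who bought at launch.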
 
Soldato
Joined
21 Jul 2005
Posts
15,984
Location
N.Ireland
Most 3090 owners have made double or triple the value of the 3090 mining Ethereum on it. It's common for people to just assume "more money than sense", when in reality the 3090 was a money earner. It generates 200-300W (depending on mining setup) of heat, saving on gas central heating costs this winter. Profit per day has been consistently between £5-20 (though most often closer to £5 as of late).

There's a very good reason GPUs are so hard to get hold of... I wonder what it could be :D

Can't believe I am reading this, tbh... I said this from the start, @TNA etc., when folk were scoffing at the price. What a difference a year makes! I don't think I made triple the value, that's quite strong on a single card alone, but let's just say it paid for itself a while ago. :)
 

TNA

Soldato
Joined
13 Mar 2008
Posts
21,218
Location
London
Can't believe I am reading this, tbh... I said this from the start, @TNA etc., when folk were scoffing at the price. What a difference a year makes! I don't think I made triple the value, that's quite strong on a single card alone, but let's just say it paid for itself a while ago. :)
:D
 
Soldato
Joined
31 Dec 2007
Posts
12,913
Location
The TARDIS, Wakefield, UK
Is it just me, or does the 3090 Ti name on the packaging look like it's been stuck on over the original text?

I thought it was more pixelated than the rest of the box, as though it had been smoothed artificially.

Did you click the right arrow to see picture 2? It just looks like a lower-res version of the one on the left.
 
Soldato
Joined
26 Aug 2004
Posts
4,890
Location
South Wales
Can't believe I am reading this, tbh... I said this from the start, @TNA etc., when folk were scoffing at the price. What a difference a year makes! I don't think I made triple the value, that's quite strong on a single card alone, but let's just say it paid for itself a while ago. :)
I made roughly 2x the cost of my 3080 with a 3060Ti going at the same time for a bit, but had to stop for a while because the room was getting roasting hot in summer.

Now just trying to see best way to sell off the 3080 and when.
 
Soldato
Joined
21 Jul 2005
Posts
15,984
Location
N.Ireland
I made roughly 2x the cost of my 3080 with a 3060Ti going at the same time for a bit, but had to stop for a while because the room was getting roasting hot in summer.

Now just trying to see best way to sell off the 3080 and when.

Yes, there were moments when the time to sell some of the proceeds was insane, so much so that you said to yourself, man, I need another! The 3080 was the best outlay for not far off the same hashrate, but AIB prices were ridiculous (at or beyond a 3090) and the FE versions were hard to come by. There were a couple of members who outright denied it was possible to make that money back, and were so anti-miner they would never comprehend what they could have recouped if they had played along.

Then you had people selling their cards at second-hand places for more than they paid in the first place, so they either got their new 30-series card for free or paid a very low net cost to upgrade. Even some core 'gamers' on here sold their 30-series cards to a local high street pawn outlet to double their initial purchase. It's been a crazy year, but capitalising on a bad situation has been straightforward if you bothered.
 
Soldato
Joined
30 Jul 2006
Posts
3,008
Yes, there were moments when the time to sell some of the proceeds was insane, so much so that you said to yourself, man, I need another! The 3080 was the best outlay for not far off the same hashrate, but AIB prices were ridiculous (at or beyond a 3090) and the FE versions were hard to come by. There were a couple of members who outright denied it was possible to make that money back, and were so anti-miner they would never comprehend what they could have recouped if they had played along.

Then you had people selling their cards at second-hand places for more than they paid in the first place, so they either got their new 30-series card for free or paid a very low net cost to upgrade. Even some core 'gamers' on here sold their 30-series cards to a local high street pawn outlet to double their initial purchase. It's been a crazy year, but capitalising on a bad situation has been straightforward if you bothered.

It's true. I expect my upgrade each generation to be completely free in future, thanks to shortages. I'm able to work from home with stock scanners running next to me, alerting me all day until I get a 4090 at launch on a credit card, much like my 3090. Then I can offload the 3090 at a pawn shop or on FB Marketplace/eBay when prices spike, supplies run out and people get desperate. 2080 Ti > 3090 > 4090 = £0. Any extras I get, I sell to friends and colleagues at RRP, just as I have with a few PS5s and Series Xs, as I'm better at stock tracking than they are.
 
Soldato
Joined
7 Dec 2010
Posts
6,040
Location
Leeds
EVGA GeForce RTX 3090 Ti Kingpin to require dual 12-pin power connectors.
EVGA RTX 3090 Ti Kingpin might cost more than original Kingpin
New details on the next-gen EVGA flagship graphics card.


https://videocardz.com/newz/evga-geforce-rtx-3090-ti-kingpin-to-require-dual-12-pin-power-connectors

The new information suggests that the RTX 3090 Ti Kingpin will require dual 12-pin power connectors. This is the new-generation PCIe Gen 5.0 power connector that we talked about. It has 12 standard pins and 4 optional data pins. It is unclear for now what manufacturers will call this connector, but it seems to be fully compatible with the existing NVIDIA 12-pin (Molex Micro-Fit 3.0) connector.

By introducing dual 12-pin power connectors, the card could theoretically consume up to 1275W of power, which is a massive upgrade with nearly 750W more power than a potential 3x 8-pin configuration (max 525W).

docbrown-backtothefuture.gif

:cry:
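For anyone wondering where the quoted 1275W and 525W come from, they follow from the per-connector limits plus the PCIe slot's 75W. A quick sketch, assuming the commonly cited maxima (600W per 12-pin Gen 5 connector, 150W per 8-pin):

```python
# Theoretical board power budgets from connector limits.
# Assumed, commonly cited maxima: 600W per 12-pin PCIe Gen 5
# connector, 150W per 8-pin PCIe connector, 75W from the slot.

SLOT_W = 75
PIN12_W = 600
PIN8_W = 150

dual_12pin = 2 * PIN12_W + SLOT_W   # dual 12-pin card
triple_8pin = 3 * PIN8_W + SLOT_W   # 3x 8-pin card

print(dual_12pin, triple_8pin, dual_12pin - triple_8pin)  # 1275 525 750
```

So the "nearly 750W more" in the article is the gap between the two theoretical ceilings, not what the card would actually draw.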
 
Soldato
Joined
6 Feb 2019
Posts
10,250
EVGA GeForce RTX 3090 Ti Kingpin to require dual 12-pin power connectors.
EVGA RTX 3090 Ti Kingpin might cost more than original Kingpin
New details on the next-gen EVGA flagship graphics card.


https://videocardz.com/newz/evga-geforce-rtx-3090-ti-kingpin-to-require-dual-12-pin-power-connectors

The new information suggests that the RTX 3090 Ti Kingpin will require dual 12-pin power connectors. This is the new-generation PCIe Gen 5.0 power connector that we talked about. It has 12 standard pins and 4 optional data pins. It is unclear for now what manufacturers will call this connector, but it seems to be fully compatible with the existing NVIDIA 12-pin (Molex Micro-Fit 3.0) connector.

By introducing dual 12-pin power connectors, the card could theoretically consume up to 1275W of power, which is a massive upgrade with nearly 750W more power than a potential 3x 8-pin configuration (max 525W).

docbrown-backtothefuture.gif

:cry:


The power supply when you turn the PC on:

 
Soldato
Joined
7 Dec 2010
Posts
6,040
Location
Leeds
The power supply when you turn the PC on:

AIB version prototypes with normal connectors to power & cool it... :cry:

V3a9qwx.jpg



Who else is waiting for the 4060 Ti, which will best it?

People are in for a shock when the 40 series comes out: the 4080 will be back on the AD104 chip for sure, and AD102 will again be reserved for the 4080 Ti model and up.

This time AMD spooked Nvidia and made them move every card up one tier; people who purchased a 3080 this time really got what should have been the 3080 Ti (but they crippled the VRAM to make sure the 3080 has a short life on high-resolution screens). Nvidia will not make that mistake again next time. Also, wait for the prices and power use on next gen. :eek:
 
Man of Honour
Joined
21 May 2012
Posts
31,348
Location
Dalek flagship
I can name the 3090 owners on this forum who will be frantically hitting refresh on various store pages come launch day for the 3090 Ti. Way more money than sense :D

I won't

The lesson I take from the 3090 is that Samsung Ampere is a dreadfully inefficient, power-hungry architecture, and any more of the same is not worth wasting money on.

I only have to look at what Apple have been doing with their latest M1 chips which have integrated graphics to know how rubbish NVidia's Ampere is.

Best wait until the next gen of GPUs are available rather than waste money on a pointless 3090Ti.

I am not even using my 3090 ATM, it is sat in my spare PC. I am using my Titan V 24/7 which although getting on a bit is still good enough for my games.

When the 3090Ti does turn up if anyone can get one, it may be able to reclaim a couple of benchmark titles from the 6900XT but for real world use there is no benefit for the likely huge cost.

I spent about 10k in December on new tech and not a penny of it went to the greedy $%^& at NVidia.

@NVidia I will buy your products again but only when you make something worth buying.
 
Soldato
Joined
6 Feb 2019
Posts
10,250
I won't

The lesson I take from the 3090 is that Samsung Ampere is a dreadfully inefficient, power-hungry architecture, and any more of the same is not worth wasting money on.

I only have to look at what Apple have been doing with their latest M1 chips which have integrated graphics to know how rubbish NVidia's Ampere is.

Best wait until the next gen of GPUs are available rather than waste money on a pointless 3090Ti.

I am not even using my 3090 ATM, it is sat in my spare PC. I am using my Titan V 24/7 which although getting on a bit is still good enough for my games.

When the 3090Ti does turn up if anyone can get one, it may be able to reclaim a couple of benchmark titles from the 6900XT but for real world use there is no benefit for the likely huge cost.

I spent about 10k in December on new tech and not a penny of it went to the greedy $%^& at NVidia.

@NVidia I will buy your products again but only when you make something worth buying.


And it's funny to remember that the Titans are now dead, because Nvidia has switched to a dedicated gaming architecture with Lovelace.

HPC users will need to use a Hopper card, as Lovelace likely won't be good for HPC since it's focused purely on gaming.

Someone such as yourself may need to buy two GPUs going forward to replace your Titan; either way, Nvidia wins and your pocket loses.
 
Associate
Joined
31 Dec 2008
Posts
1,476
I only have to look at what Apple have been doing with their latest M1 chips which have integrated graphics to know how rubbish NVidia's Ampere is.
Except M1s are 5nm chips, and the M1 Max is losing by a big margin to a 100W laptop 3080 in Tomb Raider, which is a native macOS game.
So clearly it's not that great for gaming.
 