NVIDIA GeForce RTX 3090 Ti, the flagship reinvented

Now, this is not uncommon; we've seen it on several other cards that share PCBs this generation. What it tells us is that the 3090 Ti shares its PCB with something else, something that is not Ampere; it's probably the 4080/4090/4090 Ti.

We've actually been told that the next Nvidia chip - Lovelace or Hopper - is going to be pin-compatible. If Nvidia had been clever they'd have made the GPU socketed, thus offering the prospect of a cheaper upgrade and locking customers in.
 
Sorry, I did just say 'apparently'. Lmao, not sure if that price is accurate. Just illustrating that gaming consumes power, which is something to be aware of when going for a higher-TDP card.

Unless you're gaming for many hours a day, every day, is this really an issue? It was less than 20p/kWh before the recent price increases. Say it's gone up to 30p per kWh. That means you get 2 hours of gaming on a 500W GPU for 30p. That's a trivial amount. 2 hours per day, every day, for a year is about £110. And I doubt many of those who purchase higher-end GPUs fit that mould. 30p a day is just not going to bother someone paying hundreds of pounds for a GPU.
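For anyone wanting to sanity-check the maths against their own tariff and usage, here's a rough Python sketch of that same calculation. The 500W, 30p/kWh and 2-hours-a-day figures are just the examples above, not measured numbers; swap in your own.

```python
# Rough GPU running-cost estimate. Figures are the illustrative ones from the
# post above (500 W card, 30p/kWh tariff, 2 hours of gaming per day), not measurements.
gpu_watts = 500          # power draw under load, in watts
price_per_kwh = 0.30     # electricity tariff, £ per kWh
hours_per_day = 2        # daily gaming time

kwh_per_day = gpu_watts / 1000 * hours_per_day   # 1.0 kWh
cost_per_day = kwh_per_day * price_per_kwh       # £0.30
cost_per_year = cost_per_day * 365               # ~£110

print(f"Per day:  £{cost_per_day:.2f}")
print(f"Per year: £{cost_per_year:.2f}")
```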
 
Unless you're gaming for many hours a day, every day, is this really an issue? It was less than 20p/kWh before the recent price increases. Say it's gone up to 30p per kWh. That means you get 2 hours of gaming on a 500W GPU for 30p. That's a trivial amount. 2 hours per day, every day, for a year is about £110. And I doubt many of those who purchase higher-end GPUs fit that mould. 30p a day is just not going to bother someone paying hundreds of pounds for a GPU.
Over 2 years, call it £220... that might be enough to make you pick a different, more expensive but more efficient card?
 
Over 2 years, call it £220... that might be enough to make you pick a different, more expensive but more efficient card?

I do think people that are willing to spend 4 figures on a graphics card likely won't base their choice on efficiency.

In the same way someone buying a supercar won't look at an alternative if it returns better mpg.

For me personally the only thing that would make me consider an alternative is if the temps become uncontrollable in my case/environment.
 
We've actually been told that the next Nvidia chip - Lovelace or Hopper - is going to be pin-compatible. If Nvidia had been clever they'd have made the GPU socketed, thus offering the prospect of a cheaper upgrade and locking customers in.

IMO it would be the opposite of clever to invite the public to disassemble a GPU (the tension bracket alone will cause some people to break their cards), replace the thermal pads on the VRAM (these get disturbed when taking the cooler off, especially considering the horrible pads Nvidia use) and reassemble it with the new GPU.



It also makes much more financial sense to sell a new PCB, VRAM and power delivery for extra profit on a whole new card.


Terrible for the environment of course, but that's capitalism; profit trumps any thought of protecting the environment, and always has.
 
Over 2 years, call it £220... that might be enough to make you pick a different, more expensive but more efficient card?
True, it would certainly influence me. If I spent £1,900 on a GPU (gasp, jaw on floor) then I'd keep it for 5 years, so I'd have to factor in another £500 of electricity. So total cost £2,400.
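Extending the same back-of-the-envelope sum to a whole ownership period (the £1,900 purchase price, 5-year lifetime and usage figures are just the ones from this thread, so treat it as illustrative; it lands in the same ballpark as the £500 / £2,400 above):

```python
# Total cost of ownership: purchase price plus electricity over the card's lifetime.
# All inputs are the example figures quoted in the thread, not real-world measurements.
purchase_price = 1900    # £, hypothetical flagship card
years_kept = 5
gpu_watts = 500          # power draw under load, in watts
hours_per_day = 2        # daily gaming time
price_per_kwh = 0.30     # £ per kWh

electricity = gpu_watts / 1000 * hours_per_day * price_per_kwh * 365 * years_kept  # ~£550
total_cost = purchase_price + electricity

print(f"Electricity over {years_kept} years: £{electricity:.0f}")
print(f"Total cost of ownership: £{total_cost:.0f}")
```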
 
I wonder whether the 50 series will be more power efficient.

Individual dies may be, but once they start cramming 2 or 4 of them into a package and then slap 32GB of GDDR RAM on for good measure, I somehow doubt it. If the MCM architecture is to be believed, of course.
 
Individual dies may be, but once they start cramming 2 or 4 of them into a package and then slap 32GB of GDDR RAM on for good measure, I somehow doubt it. If the MCM architecture is to be believed, of course.

I believe the logic is they can use more cores and not need to clock them so high, so potentially less power draw, although that would depend on exactly how many cores are used. Speculation of course but I’m hopeful MCM will provide us with a greater leap in performance than we’ve become used to.
 
I do think people that are willing to spend 4 figures on a graphics card likely won't base their choice on efficiency.
I'm sorry, you're missing my point. I'd rather spend more on a better chip than spend less on a lower-tier chip that clocks higher. The better chip, underclocked, might cost less net over time.

Think underclocking a 3080 vs getting an overclocked 3070?
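To put a rough shape on the "costs less net over time" idea, here's a break-even sketch. The extra purchase cost and wattage saving below are purely made-up hypothetical numbers (not real 3070/3080 figures); whether the better chip actually works out cheaper depends entirely on what you plug in.

```python
# Break-even between a pricier-but-more-efficient card and a cheaper, thirstier one.
# All inputs are hypothetical, purely to illustrate the trade-off.
extra_cost = 200         # £ more paid for the better chip (hypothetical)
watts_saved = 100        # W less drawn once underclocked (hypothetical)
hours_per_day = 2        # daily gaming time
price_per_kwh = 0.30     # £ per kWh, as in the thread's example

saving_per_year = watts_saved / 1000 * hours_per_day * 365 * price_per_kwh
break_even_years = extra_cost / saving_per_year

print(f"Electricity saved per year: £{saving_per_year:.2f}")
print(f"Break-even after: {break_even_years:.1f} years")
```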
 
I can see it now - putting solar panels on the roof

Neighbour: "Is that to power your house"

"No, just my PC"



Oh wait this is OCUK I may have to explain what solar panels are, but before I do so.. There's this thing called the sun and it's not a newspaper. We have it here in Australia. Many other countries also have it.
Oh this is hard - like trying to explain a colour to a blind person.
:p
 
The 3090 Ti tells me the 3090 was power-limit bottlenecked, hence it being so close to the 3080. The 3090 Ti is what the 3090 should have been, but of course that power usage is insanity, and I would never use such a card.
 
Oh wait this is OCUK I may have to explain what solar panels are, but before I do so.. There's this thing called the sun and it's not a newspaper. We have it here in Australia. Many other countries also have it.

Yet you failed to mention how your own short-sighted Government caved in under pressure from the fossil-fuel industry and is removing solar panel subsidies by 2030 (in fact, it was initially going to be 2021...) ;)

Still, no better here in the UK, with the pathetic decline in the rate of the "feed-in tariff".

Guess people will just have to install mice and wheels to drive those 3090 Tis. :)
 
Meanwhile, people in America are boasting that the power is no issue, and the heat is no issue, as they can buy cheap-to-run A/C to cool the cards down, lol, at their 10 cents/kWh. Crazy how far apart this is now.
 
Meanwhile, people in America are boasting that the power is no issue, and the heat is no issue, as they can buy cheap-to-run A/C to cool the cards down, lol, at their 10 cents/kWh. Crazy how far apart this is now.

If you'd said to someone 10 years ago that in 2022 people would be buying an AC unit just to cool a graphics card, they would have laughed at you. Yet here we are today.

Today we can say that in the future you'll need a dedicated PSU JUST for your graphics card, and we'd laugh about it... yet I can see this becoming true!

What's the world coming to! Crazy.
 