> 2.9GHz base clock on the 5090 apparently, take that with a grain of salt.

Increasing clock speed is free performance, but remember that Nvidia don't give you anything for free.
> Increasing clock speed is free performance, but remember that Nvidia don't give you anything for free.

It's one thing to have to order a new PSU off Amazon before you can use your new GPU. It's another thing entirely when you have to wait for work on Hinkley Point to conclude as well.
> Increasing clock speed is free performance, but remember that Nvidia don't give you anything for free.

Do they not? I thought the more I bought, the more I saved.
> Do they not? I thought the more I bought, the more I saved.

I love how that is still taken out of context today lol
> I love how that is still taken out of context today lol

Jensen himself keeps bringing it up. He said it again during his Computex keynote last month.
> Do they not? I thought the more I bought, the more I saved.

Well, higher clocks rather than (even) bigger dies* = more heat. So if you are using your GPU anyhow, that's "free" heating. And if you can convince yourself of that equation, there's no helping you!

* Except the 5090, where rumours do point to a bigger die plus higher clocks. 2.9 might be the price in $1,000s rather than (just) the clocks in 1,000MHz's.

> Well, higher clocks rather than (even) bigger dies = more heat.

That's good to hear; this 4080 Super is very disappointing in that regard. It runs at 56 °C, and a stronger CPU might push it to the low 60s. But that isn't going to heat my room, unlike my AMD HD4850, which was "designed" to run at 95 °C and warmed my feet under the desk just nicely.

> That's good to hear; this 4080 Super is very disappointing in that regard. It runs at 56 °C, and a stronger CPU might push it to the low 60s. But that isn't going to heat my room, unlike my AMD HD4850, which was "designed" to run at 95 °C and warmed my feet under the desk just nicely.

I don't think it matters that much how warm the GPU runs... If a card is 300W but runs at 50 °C or at 90 °C, the same amount of heat is dumped into the room, since nearly all of that 300W is converted to heat...
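For anyone who wants the back-of-the-envelope numbers behind the "it all ends up as heat" point, here's a quick sketch. The 300 W draw, session length, and electricity price are all assumptions for illustration, not anything official:

```python
# Back-of-the-envelope: a GPU's electrical draw ends up as heat in the room,
# regardless of whether the die itself sits at 50 °C or 90 °C.
power_w = 300           # assumed card power draw, watts
hours = 4               # an evening's gaming session (assumption)
price_per_kwh = 0.30    # assumed electricity price per kWh

energy_kwh = power_w * hours / 1000   # 1 kWh = 1000 W for one hour
heat_output_kwh = energy_kwh          # ~all of the electrical energy becomes heat
cost = energy_kwh * price_per_kwh

print(f"{energy_kwh:.1f} kWh drawn, {heat_output_kwh:.1f} kWh dumped into the room, costing {cost:.2f}")
```

Die temperature only tells you how quickly the cooler moves that heat off the chip, not how much heat there is.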
C'mon AMD, forget efficiency. Get a competing product sorted, I don't care if it's hotter than the sun, we're in a cost of living crisis
> I don't think it matters that much how warm the GPU runs... If a card is 300W but runs at 50 °C or at 90 °C, the same amount of heat is dumped into the room, since nearly all of that 300W is converted to heat...

I feel better now
Do we have any idea of actual dates yet?
> 2.9GHz base clock on the 5090 apparently, take that with a grain of salt.

Is that a Red Gaming Tech pinch of salt with significant interest?
> Do we have any idea of actual dates yet?

Juneteenth.
> Twelfthtieth of Sepruary

Preorders where?
> I don't think it matters that much how warm the GPU runs... If a card is 300W but runs at 50 °C or at 90 °C, the same amount of heat is dumped into the room, since nearly all of that 300W is converted to heat...

They need to figure out a way to use a thermoelectric generator to convert GPU & CPU heat into electricity - stick a water turbine on the back of the case or something
> They need to figure out a way to use a thermoelectric generator to convert GPU & CPU heat into electricity - stick a water turbine on the back of the case or something

Or use an AC. I've said it before: the cost of something like a 4090 compared to cheaper cards (even a 4080) is not just the card itself... I don't think power usage will go down anytime soon.
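On the thermoelectric idea: the physics is against it. Any heat engine running between a ~90 °C hot side and a ~25 °C room is capped by the Carnot limit, and real TEGs only manage a fraction of that. A rough sketch, with the temperatures, the 300 W figure, and the TEG fraction all assumed for illustration:

```python
# Carnot limit on recovering electricity from GPU waste heat.
t_hot_k = 90 + 273.15    # assumed GPU hot-side temperature, in kelvin
t_cold_k = 25 + 273.15   # assumed room temperature, in kelvin

carnot_eff = 1 - t_cold_k / t_hot_k   # upper bound for ANY heat engine
teg_eff = carnot_eff * 0.15           # real TEGs reach only a small fraction of Carnot (assumption)

recovered_w = 300 * teg_eff           # from an assumed 300 W of waste heat
print(f"Carnot limit: {carnot_eff:.1%}, realistic TEG: {teg_eff:.1%}, ~{recovered_w:.0f} W back")
```

At that temperature difference you'd claw back single-digit watts from a 300 W card, before counting the extra thermal resistance the TEG adds between die and cooler.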