
NVIDIA ‘Ampere’ 8nm Graphics Cards

It's gonna come quick enough, can't wait to see what size of lube I'm gonna buy lol.
Thought you were happy with your 2080 Ti. Knew you would want to upgrade. Funny how it works every time a new gen of cards comes out: suddenly the 2080 Ti becomes a mid-range card, or certainly a hell of a lot less desirable anyway :p

Remember, if you want the best you will need to buy the Titan, if not settle for the second best which will be the 3090 ;):p:D
 
It appears memory is a real problem for Ampere:
https://www.igorslab.de/en/nvidia-a...elf-the-board-partners-are-still-in-the-dark/

It's consuming a ton of power and is clocked to the very edge! I wonder if Micron screwed up?
I like this comment

"There are no final performance data yet, Nvidia keeps the ball extremely flat for fear of leaks and the AICs even on the shortest dog leash I have ever seen at a launch." Also known as Tom Bradying :D

Isn't it more to do with the board design, memory on both sides creating a high concentration of heat?
 
Not something I would risk saying if true. But you can bet your bottom dollar that if they screw up and it's a noisy heat fest, the GIFs from the 290X era will be out for sure! :)
 
Not something I would risk saying if true. But you can bet your bottom dollar that if they screw up and it's a noisy heat fest, the GIFs from the 290X era will be out for sure! :)
And the George Foreman Grill :D.

No doubt, a 300-350W card is going to have cooling challenges.
 
Both AMD and Nvidia have had problems cooling GDDR6. If I remember right, the high failure rate of early 2080 Tis was down to overheating GDDR6, as was the "Space Invaders" graphics corruption. AMD hasn't got away with it either: integrated VRAM heatsinks on some AMD cards are particularly bad because they're heat-soaked by the GPU, and you often see those coolers running the memory in the high 80s. One or two AIB cards killed their VRAM too (Asus TUF Gaming). GDDR6 is rated for 95C; really they all need large finned discrete VRAM coolers, like mine has. The memory runs in the mid 70s on my GPU, and that's about the only thing they got right with the cooling; the GPU cooler is crap.

GDDR6X may well be worse.
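To put the shared-heatsink vs discrete-cooler point in numbers, here's a back-of-the-envelope sketch. All the figures in it (memory power draw, ambient temperature, thermal resistances) are illustrative assumptions, not measured values for any real card:

```python
# Back-of-the-envelope VRAM temperature check.
# All numbers below are hypothetical, chosen only to illustrate
# why a GPU-heat-soaked plate runs memory hotter than a discrete sink.

def vram_temp(ambient_c, power_w, theta_c_per_w):
    """Steady-state memory temperature: ambient plus dissipated power
    times the cooling path's effective thermal resistance (C per watt)."""
    return ambient_c + power_w * theta_c_per_w

# Assume ~20 W across the GDDR6 packages and 30 C inside the case.
shared_plate = vram_temp(30, 20, 3.0)  # heat-soaked shared plate, ~3 C/W assumed
discrete     = vram_temp(30, 20, 2.0)  # finned discrete cooler, ~2 C/W assumed

print(shared_plate)  # 90.0 -- high 80s/low 90s, scraping the 95 C rating
print(discrete)      # 70.0 -- comfortably in the 70s
```

With those (made-up) numbers the shared plate leaves almost no headroom under the 95C GDDR6 rating, while the discrete cooler lands in the mid-70s region described above.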
 

First bit of decent potential competition they've had in a while has got them paranoid AF ;) :D
 