NVIDIA 5000 SERIES

Who feels sorry for the AIBs...
*snip*

Godspeed EVGA... also kinda funny how MSI left AMD GPUs and yet are still somehow happy to get shafted by leather jacket man... how is Nvidia any better than AMD in this sense?

At this point, I expect other AIBs to follow EVGA's footsteps.
 
EVGA saw it all coming. Sad, as they were by far the best AIB out there.
 
What's the latest? 4 days till the 5090 release and price exposure.

What's it gonna be like this release? Day one sell out? FE mass scalping? People spending an extra £500 just to get a 5090? Gonna be 6 months of customers wanting to bag a card? Won't be enough to go around, or plenty to go around? There's no solid info.
 
Scalpers will buy them all up for sure, then start returning their unsold stock… The base price is just too high; they'll struggle to make much profit.

The 4080 was a good example of stock sitting around for weeks/months.
 
Unfortunately, I'm pretty sure the 5090s won't struggle to sell at a profit. Even two-year-old used 4090s are selling for at least their RRP, and in some cases at a profit.
 
I've been gaming for decades now and never once thought about latency. The games I've played have always felt responsive to me.

It's definitely something I don't want to become sensitive to lol.

Xbox 360, 1080p 37" TV, Gears of War, co-op, mine cart level where you split up. Internet delay and TV input lag meant many deaths. Drove my mate insane with me dying over and over. He would die over and over if he was joining my game/server :D

Game mode on TVs back then, I thought, just made the picture look weird. :)
 
Godspeed EVGA... also kinda funny how MSI left AMD GPUs and yet are still somehow happy to get shafted by leather jacket man... how is Nvidia any better than AMD in this sense?

At this point, I expect other AIBs to follow EVGA's footsteps.
People actually buy MSI Nvidia GPUs. MSI probably sold 20 AMD GPUs.
 
Perhaps I could have been more precise with my original statement. The 32GB is the major factor in its AI appeal. Even if it was just a 4090 with 32GB of VRAM, it would be more attractive for AI use and get gobbled up at a higher price tag. As I'm sure you know, the more VRAM, the larger the model you can load into said VRAM, and the more power you have at your disposal. It'll be interesting to see what the 60 series will even be at this point, let alone how much it will cost. It's going to be more and more difficult for Nvidia to justify selling product to the gamer market when the markup for datacentres/AI companies will be so much higher.
It's better than 24GB, but with modern big models even 32GB is not enough for truly pro work. For larger datasets one needs at least 64GB of VRAM, and often 96GB+ is required. Most consumer-level AI models will fit in 24GB though, so currently 32GB doesn't make much of a difference (but that could be tweaked) - the useful jump is usually from 16-24GB straight to 64GB, not to 32GB.
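
To put rough numbers on that 24GB-vs-64GB+ jump, here's a minimal back-of-envelope sketch of how much VRAM it takes just to hold a model's weights at different precisions. The parameter counts and the ~20% overhead factor are illustrative assumptions, not figures for any particular model:

# Back-of-envelope VRAM estimate for loading a model for inference.
# Model sizes and the 20% overhead factor below are illustrative assumptions.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def vram_needed_gb(params_billion, precision, overhead=1.2):
    # Weights only, plus ~20% assumed headroom for KV cache / activations.
    total_bytes = params_billion * 1e9 * BYTES_PER_PARAM[precision] * overhead
    return total_bytes / 1024**3

for params in (7, 13, 34, 70):                  # illustrative sizes in billions of parameters
    for precision in ("fp16", "int8", "int4"):
        need = vram_needed_gb(params, precision)
        fits = [cap for cap in (24, 32, 64, 96) if need <= cap]
        print(f"{params}B @ {precision}: ~{need:.0f} GB -> fits in: {fits or 'none of 24/32/64/96'} GB")

Under these assumptions even a heavily quantised 70B-class model lands around 40GB, which is roughly why the useful tiers end up being 24GB and then 64GB+, with 32GB only stretching what already fits.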
 
Right, that's it. Cancelling my 5090 preorder and buying an M4 Max MBP with 96GB RAM.
 
Sure. But I think they had to give it 32GB, as that's what will entice the AI market (even those who own a 4090). If you're a gamer on a 4090, there's no reason for you to buy a 5090.
 