• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Blackwell GPUs

What I don't understand is why Nvidia got rid of SLI when apparently the more you buy, the more you save. Some people here, not mentioning any names, would buy 4 GPUs if they could SLI them.
What's the biggest PSU you can get these days?

I really miss crossfire and SLI, those were some of the coolest (in the figurative sense only) builds
 
surely not out till late October-November timeframe?

I have a feeling this is going to be very expensive; Nvidia couldn't care less if any gamers buy them. They would rather sell to the AI industry instead. Tell me, £2000 or £40000? But I was under the impression that what they sell us gamers are the chips that cannot be used for AI, because they are imperfect chips from manufacturing and hence get relegated to the gaming industry.

Eventually the competition will catch up with Nvidia: AMD and Intel will come up with competing products and the AI craze will die down. Might take 10 years though, so in the meantime, everyone buy lots of LUBE. :D
That's what I thought, but surely Nvidia won't leave their customers with nothing to buy for the next three months. Can't make sense of it unless there's a ton of unsold 'old' inventory.
 
What's the biggest PSU you can get these days?

I really miss crossfire and SLI, those were some of the coolest (in the figurative sense only) builds


I've had many multi-GPU systems; they looked good but could be a headache: games not scaling well, micro-stutter, airflow concerns and the sheer heat they put out.

And with today's super bulky cards taking 3-4 slots, the cards would be sandwiched together so tightly that watercooling would be a total necessity.
 
I've had many multi-GPU systems; they looked good but could be a headache: games not scaling well, micro-stutter, airflow concerns and the sheer heat they put out.

And with today's super bulky cards taking 3-4 slots, the cards would be sandwiched together so tightly that watercooling would be a total necessity.

There was a peak period for multi-GPU, but the results before and after it weren't great. Mid-generation DX9 games usually worked well: they were advanced enough for the technology to shine, but mostly didn't lean heavily on the graphical effects that broke multi-GPU compatibility.

I never had much of a problem with micro-stutter personally, though I did do a fair bit of tuning with tools like nvinspector and whichever tool came before it, whose name escapes me now.

I remember having a case with two ridiculous 380mm fans to tame the heat, though, hah (Xclio A380). https://www.techpowerup.com/review/apluscase-twinengine/5.html
 
I raise you a 2.5kW and a 2.8kW...


 
There was a peak period for multi-GPU, but the results before and after it weren't great. Mid-generation DX9 games usually worked well: they were advanced enough for the technology to shine, but mostly didn't lean heavily on the graphical effects that broke multi-GPU compatibility.
I guess it was TAA and similar rendering methods that use data from previous frames which raised feasibility hurdles for SLI (in AFR mode).
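For anyone curious, here's a rough toy model of why frame-history effects clash with AFR. This is my own sketch, not anything from Nvidia's actual drivers: under alternate-frame rendering, GPUs take turns rendering whole frames, so the previous frame a TAA pass reads from usually sits in the *other* GPU's memory and has to be copied across the bridge or PCIe every single frame.

```python
# Sketch: alternate-frame rendering vs. frame-history effects like TAA.
# Hypothetical model for illustration only.

NUM_GPUS = 2

def afr_owner(frame: int) -> int:
    """Which GPU renders a given frame under alternate-frame rendering."""
    return frame % NUM_GPUS

def history_transfers(num_frames: int) -> int:
    """Count frames whose TAA history (frame N-1) was rendered by a
    different GPU and so must be transferred before the TAA pass."""
    return sum(
        1
        for n in range(1, num_frames)
        if afr_owner(n) != afr_owner(n - 1)
    )

# With 2 GPUs, every frame's predecessor lives on the other GPU,
# so the history copy happens on every frame after the first.
print(history_transfers(100))  # 99
```

That constant cross-GPU traffic (plus the driver having to know which render targets are history buffers at all) is a big part of why AFR scaling fell apart once games leaned on previous-frame data.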
 
 
I think me getting the Alienware 32" 4K OLED would pair well with a 5070 for my needs in 6-12 months time :D

I reckon you will end up on a 5080 @Nexus18, as I can't see you dropping close to £2K for a 5090 :p
 
I think me getting the Alienware 32" 4K OLED would pair well with a 5070 for my needs in 6-12 months time :D

I reckon you will end up on a 5080 @Nexus18, as I can't see you dropping close to £2K for a 5090 :p

Yeah, I'm in no rush and my next purchase is going to be all about bang per buck, so most likely a 5070/5080.
 
But I was under the impression what they sell us gamers are the cores that cannot be used for AI based stuff because they are imperfect chips from manufacturing hence they get relegated to gaming industry.
Nope. You may, if defect rates are high enough, collect enough defective dies to put into another SKU, but if there are imperfections in the finished product that can't be worked around by fusing off part of the core, it goes in the bin.

It's possible to give up some die area and design in redundancy so you can work around a defect, e.g. make each cache 2-3% bigger so any defects there can be fused off, or even fabricate dies with extra CUDA, RT or Tensor cores so defective ones can be disabled. However, doing so gives up valuable die area: the dimensions of the dies are set at the very beginning of the fabrication process, and every chip in a SKU has to use dies of the same dimensions (heatsink contact plate, board size, number of solder bumps, etc.).
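To put rough numbers on that trade-off, here's a back-of-the-envelope sketch using the classic Poisson yield model. This is my own illustration; the defect density, die size and fusible fraction are made-up parameters, not Nvidia's figures:

```python
import math

def yield_fraction(defect_density: float, die_area: float,
                   fusible_fraction: float = 0.0) -> float:
    """Fraction of sellable dies under a simple Poisson defect model.

    defect_density: defects per cm^2 for the process (made-up here)
    die_area: die size in cm^2 (made-up here)
    fusible_fraction: share of the die that is redundant and can be
                      fused off if a defect lands in it

    A die is sellable if every defect falls inside the fusible area;
    for uniformly placed, Poisson-distributed defects this works out
    to exp(-D * A * (1 - f)).
    """
    return math.exp(-defect_density * die_area * (1.0 - fusible_fraction))

# Illustrative numbers: 0.1 defects/cm^2 on a 6 cm^2 die.
perfect_only = yield_fraction(0.1, 6.0)           # ~55% of dies defect-free
with_redundancy = yield_fraction(0.1, 6.0, 0.10)  # ~58% sellable if 10% of
                                                  # the die is fusible spare
```

The catch the post describes shows up as the cost of that spare area: the die got bigger (or carries fewer sellable cores per mm²), so those few extra points of yield aren't free.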
 
I doubt there would be a price increase on paper (note that Nvidia is no longer using the most advanced node), but what transpires on the street remains to be seen. Also, I'm bored and itching for an upgrade, so the sooner the better.
 