
When will GPU prices go down?

Soldato
Joined
25 Jan 2007
Posts
4,738
Location
King's Lynn
It's not being phased out; it's only being added to server GPUs and very high-end GPUs this generation, and is being sold as a high-end feature now. PCIe 5 can't do what NVLink does so far, and there is no silicon on Ada for any form of NVLink function, so it can't share VRAM or pool CUDA cores.

That's why they still have the Ampere A-series being made and sold for those features; check Nvidia's pro line-up and site regarding all this.

In short, NVLink has become a luxury feature for their top-of-the-line products. I'm guessing that when they phase out the Ampere A-series they will have another GPU to take on that role once 48GB of VRAM becomes more the norm on pro cards, and by then 96GB cards with NVLink will be about. I'm hoping they may bring out a 48GB 5090/6090 with NVLink again by then.
In most cases the only real loss from no NVLink is the merging of VRAM; most software can use more than one GPU, and in my experience CUDA cores are CUDA cores irrespective of how many GPUs, at least from the software's perspective.

There's also no obvious reason that VRAM couldn't be shared over PCIe 4/5. NVLink, at a very fundamental level, is/was just a connector that allowed a faster transfer than the then-current PCIe allowed; they just need to design the hardware and driver/software around it. It's arguably not much different to DirectStorage and/or Nvidia RTX IO, which communicate over PCIe.

Some of the rumours for the 4090 Ti say it might have 48GB. I can't see it myself, because Nvidia wants people to buy the 'pro' cards, but at the same time Nvidia can see the extra money coming from smaller companies and 'uber' gamers.
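The bandwidth gap the posts above are describing can be made concrete with a rough back-of-envelope transfer-time comparison. The bandwidth figures below are approximate published peak numbers (an assumption for illustration, not measurements from the thread):

```python
# Rough transfer time for moving a chunk of data between two GPUs over
# different links. Bandwidths are approximate one-direction peak figures
# and are assumptions for illustration only.

PCIE4_X16_GBPS = 32.0   # PCIe 4.0 x16, ~32 GB/s
PCIE5_X16_GBPS = 64.0   # PCIe 5.0 x16, ~64 GB/s
NVLINK3_GBPS = 112.5    # NVLink bridge on a 3090, ~112.5 GB/s

def transfer_ms(size_gb: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move size_gb at the given peak bandwidth."""
    return size_gb / bandwidth_gbps * 1000.0

size_gb = 8.0  # e.g. an 8 GB slice of model weights or textures
for name, bw in [("PCIe 4.0 x16", PCIE4_X16_GBPS),
                 ("PCIe 5.0 x16", PCIE5_X16_GBPS),
                 ("NVLink (3090)", NVLINK3_GBPS)]:
    print(f"{name:14s} {transfer_ms(size_gb, bw):6.1f} ms")
```

At these assumed peaks, NVLink moves the same 8 GB in roughly a third of the PCIe 4.0 time, which is the gap that hardware/driver design over PCIe 5 would have to close.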
 
Soldato
Joined
6 Jun 2009
Posts
5,445
Location
No Mans Land
It would annoy me, had I held onto my 3090 FE from 2021 (bought for £1,399 and sold for £1,800), if I had to lose £700 on its value to repurchase a flagship for £1,599 (4090), only to lose 50% of its value by Sept 2024 (i.e. next year). I've decided mid-range GPUs are the way to go for me: I bought a 3060 Ti FE for £369 and plan on replacing it next Sept with a 5070, or whatever AMD have to offer mid-range. I figured that over 10 years, if I upgrade mid-range three times (taking the 3060 Ti price as the upgrade cost), it would cost £1,200, whereas if I purchased a 4080 I would be stuck at that performance level for 5-6 years, so I would end up paying more to upgrade over the same period.
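The 10-year maths in that post can be sketched in a few lines. The per-upgrade figure rounds the £369 3060 Ti up to £400, as the poster does; the 4080 price is an assumed ballpark UK figure, not one stated in the post:

```python
# Sketch of the upgrade-cost comparison above. The flagship price is an
# assumption (~£1,200) for illustration; the mid-range figures come from
# the post (3060 Ti-class card, upgraded roughly every 3 years).

midrange_upgrade_cost = 400   # £, rounds up the £369 3060 Ti
upgrades_over_decade = 3      # one mid-range card every ~3 years
midrange_total = midrange_upgrade_cost * upgrades_over_decade

flagship_4080 = 1200          # £, assumed ballpark price

print(f"Mid-range path over 10 years: £{midrange_total}")
print(f"Single 4080 up front:         £{flagship_4080}")
```

The totals come out similar, which is the poster's point: the mid-range path costs about the same but refreshes the performance level three times instead of once.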

I'm looking for a cheapish upgrade.
Would a second-hand 3090 for £600 be considered good value in today's market?
 
Soldato
Joined
6 Jun 2009
Posts
5,445
Location
No Mans Land
That's a good deal, assuming the card works correctly.

Yeah, that's what I was thinking. It wasn't long ago that 3090 FEs were £1,400, so if I can get one for £600 that's not a bad upgrade for not-stupid money. I don't need the latest and greatest as I don't game much these days, but I want a step up from my current card, plus I want HDMI 2.1 as my PC is plugged into my 65" OLED.
 
Soldato
Joined
7 Dec 2010
Posts
8,310
Location
Leeds


Nvidia announced a new dual-GPU product, the H100 NVL, during its GTC Spring 2023 keynote. This won't bring back SLI or multi-GPU gaming, and won't be one of the best graphics cards for gaming, but instead targets the growing AI market. From the information and images Nvidia has released, the H100 NVL (H100 NVLink) will sport three NVLink connectors on the top, with the two adjacent cards slotting into separate PCIe slots.

It's an interesting change of pace, apparently to accommodate servers that don't support Nvidia's SXM option, with a focus on inference performance rather than training. The NVLink connections should help provide the missing bandwidth that NVSwitch gives on the SXM solutions, and there are some other notable differences as well.

Take the specifications. Previous H100 solutions — both SXM and PCIe — have come with 80GB of HBM3 memory, but the actual package contains six stacks, each with 16GB of memory. It's not clear if one stack is completely disabled, or if it's for ECC or some other purpose. What we do know is that the H100 NVL will come with 94GB per GPU, and 188GB HBM3 total. We assume the "missing" 2GB per GPU is for ECC now.

Power is slightly higher than the H100 PCIe, at 350–400 watts per GPU (configurable), an increase of 50W. Total performance meanwhile ends up being effectively double that of the H100 SXM: 134 teraflops of FP64, 1,979 teraflops of TF32, and 7,916 teraflops FP8 (as well as 7,916 teraops INT8).

Basically, this looks like the same core design of the H100 PCIe, which also supports NVLink, but potentially now with more of the GPU cores enabled, and with 17.5% more memory. The memory bandwidth is also quite a bit higher than the H100 PCIe, at 3.9 TB/s per GPU and a combined 7.8 TB/s (versus 2 TB/s for the H100 PCIe, and 3.35 TB/s on the H100 SXM).

As this is a dual-card solution, with each card occupying a 2-slot space, Nvidia only supports 2 to 4 pairs of H100 NVL cards for partner and certified systems. How much would a single pair cost, and will they be available to purchase separately? That remains to be seen, though a single H100 PCIe can sometimes be found for around $28,000. So $80,000 for a pair of H100 NVL doesn't seem out of the question.
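The memory and bandwidth figures quoted in that article hang together as simple arithmetic. A quick check, using only the numbers the article itself states:

```python
# Arithmetic check of the H100 NVL figures quoted in the article above.
# All inputs come from the article text; nothing here is new data.

stacks, stack_gb = 6, 16
package_gb = stacks * stack_gb            # 96 GB physically on package
usable_per_gpu = 94                       # usable per GPU, per the article
reserved_gb = package_gb - usable_per_gpu # the "missing" 2 GB (ECC, perhaps)

total_pair_gb = 2 * usable_per_gpu        # 188 GB for the dual-card NVL

bw_per_gpu_tbs = 3.9
bw_pair_tbs = 2 * bw_per_gpu_tbs          # 7.8 TB/s combined

print(reserved_gb, total_pair_gb, round(bw_pair_tbs, 1))
```

So the headline 188GB and 7.8 TB/s are just the per-GPU figures doubled, and the 94GB per GPU implies 2GB of the 96GB on package is held back.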
 
Associate
Joined
22 Nov 2020
Posts
1,479
There are second-hand cards with warranty out there, you know? :D

I got my 3080 Ti from a reputable member here and it had over two years of warranty left on it. Not that I'm expecting it to fail; GPUs rarely do.
Sadly for me, I won't be able to access the Members Market for another 2-3 years at least, at my rate of activity!
 