
NVIDIA 4000 Series

Prob same guy who put up the Timespy Extreme one yesterday
I feel the 7900 is going to land between the two 4080 models; Nvidia looked pretty confident with their product segmentation.
Also, I was wondering how an MCM architecture can be more efficient than a monolithic die on the same or an inferior fab process, since off-chip communication is generally more expensive by orders of magnitude.
Interesting, though, the path AMD has taken.
 
That's the thing, man. I recently lost someone close to me and they always begrudged spending; now they're gone, they can't take what they had with them. If you can afford it, do what makes you happy.

I didn't even know you could get projectors that expensive. I think the only purchase I made this year that I regret was spending £2,000 on some IEMs, and I'm too scared to use them in case I break them lol.
That's never a good situation to be in but IEMs just feel so delicate I'd feel the same! My £2k+ sound purchases went on headphones instead (Focal Utopia & Stellia).
 
I think the power limitation of 150 W is probably due to the wire gauge of the 8-pin cables rather than what the PSU can supply to the 12 V rail on the GPU cables, whereas the new 16-pin has thicker-gauge cables to handle the 600 W / higher current.
150 W is what a PSU needs to be able to supply via an 8-pin PCIe cable to meet the ATX 2.2 standard. Higher-quality, more premium PSUs are often able to exceed 150 W quite happily.
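To put the two cable limits in perspective, here is a minimal sketch of the current each power budget implies on a 12 V rail (Ohm's-law arithmetic only; the wattage figures come from the posts above and are illustrative, not spec quotations):

```python
# Current drawn at 12 V for the old 8-pin PCIe budget vs the new
# 16-pin (12VHPWR) maximum. Figures are from the discussion above.
RAIL_VOLTAGE = 12.0  # volts on the GPU power rail

def current_amps(power_watts: float, volts: float = RAIL_VOLTAGE) -> float:
    """I = P / V for a DC rail."""
    return power_watts / volts

eight_pin = current_amps(150.0)    # ATX 2.2 per-cable budget
sixteen_pin = current_amps(600.0)  # 12VHPWR maximum

print(f"8-pin  @ 150 W: {eight_pin:.1f} A")   # 12.5 A
print(f"16-pin @ 600 W: {sixteen_pin:.1f} A") # 50.0 A
```

Four times the power at the same voltage means four times the current, which is why the thicker gauge (and more live pins) on the 16-pin cable matters.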
 
Why did Nvidia cut the price of the 3090Ti by almost half?
The 3090 Ti was released a year too late (Mar 2022) and was too close to this 40-series release, so it was doomed price-wise from the start, especially as it wasn't much better than a 3090 for gaming. The main thing was it was **** at mining (power hungry), so miners didn't want the card either, meaning they ended up just sitting on the shelves, and the only way to get them moving was to slash the price from the £2,000 MSRP.
 
150 W is what a PSU needs to be able to supply via an 8-pin PCIe cable to meet the ATX 2.2 standard. Higher-quality, more premium PSUs are often able to exceed 150 W quite happily.
Have a think. The socket on the PSU side is interchangeable with the CPU 8-pin EPS connector.
EPS is 366 W, so AT MINIMUM the 8-pin socket on the PSU side must be able to withstand 366 W.
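The connector-side limit comes down to how many live 12 V pins a connector has and what each terminal is rated for. A rough sketch, where the pin counts and the 7 A per-pin rating are illustrative assumptions (real limits depend on the terminal type, not these exact numbers):

```python
# Electrical capacity of a connector = live 12 V pins x per-pin current x 12 V.
# Pin counts and the 7 A rating below are illustrative assumptions.
def connector_watts(live_pins: int, amps_per_pin: float, volts: float = 12.0) -> float:
    return live_pins * amps_per_pin * volts

# A PCIe 8-pin carries 3 live 12 V pins; an EPS 8-pin carries 4.
pcie_8pin = connector_watts(3, 7.0)  # 252.0 W electrically, though spec'd at 150 W
eps_8pin = connector_watts(4, 7.0)   # 336.0 W with assumed 7 A terminals
```

This is why the same physical socket on the PSU side can happily serve EPS duty well above the 150 W PCIe budget: the electrical headroom is much larger than the spec figure.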
 
I don't know; I assume there were plenty of buyers when you couldn't get them for over a year, but I guess they then made too many and didn't see the crypto crash etc. coming. Maybe they got greedy, I don't know.
After the cards could no longer print money, we got a look at what people were willing to pay to play games... and it wasn't $2k (at least not in any real volume).
 
I am not sure the 4090 is a Ferrari; it's more like a Bugatti Chiron going through a 40 mph speed camera (console port locked at 60 fps) on a Welsh mountain road (Unreal Engine 4), stuck behind a slow-moving, broken caravan (Cyberpunk development team).
More like an Alfa Romeo; you need to get into things like the A100 for the real supercar GPUs.

The A100 80 GB is currently £28,000 with 10 grand off!
 
I feel the 7900 is going to land between the two 4080 models; Nvidia looked pretty confident with their product segmentation.
Also, I was wondering how an MCM architecture can be more efficient than a monolithic die on the same or an inferior fab process, since off-chip communication is generally more expensive by orders of magnitude.
Interesting, though, the path AMD has taken.
Nvidia are confident of their product segmentation on AIR; it remains to be seen what happens to the AIB models against AMD AIB cards under liquid.

If the Nvidia 4090 boards are rated for 600 watts and so are the air coolers, it does not leave much headroom; maybe max sustained boost for liquid.

AMD, on the other hand, might be able to get closer to the 4090 with a 7900 XT Liquid Devil-type card rated at 500 W+, as the new AMD chip won't be as thermally limited on liquid.

Nvidia don't have EVGA to lean on to create a 4080/4090 FTW4 hydro-copper, and we could see a surprise from ASRock, PowerColor or Sapphire at the top end. Probably won't see these top-end AMD AIB cards until April, though.

Thinking about it, now EVGA is gone, ASRock (OC Formula), PowerColor (Liquid Devil) and Sapphire (Toxic) have better, more tweakable custom cards than anyone in the new Nvidia AIB lineup. Interesting times for AMD AIBs, and an opportunity there for them.
 
I feel the 7900 is going to land between the two 4080 models; Nvidia looked pretty confident with their product segmentation.
Also, I was wondering how an MCM architecture can be more efficient than a monolithic die on the same or an inferior fab process, since off-chip communication is generally more expensive by orders of magnitude.
Interesting, though, the path AMD has taken.
If that is the case then AMD would have failed hard, since the 7900 XT is rumoured to have 140% more cores than the 6900 XT.
 
That's never a good situation to be in but IEMs just feel so delicate I'd feel the same! My £2k+ sound purchases went on headphones instead (Focal Utopia & Stellia).
Honestly mate, I wasn't expecting to be scared of them lol. If I could go back I would 100% have spent it on headphones instead.
 
 
If the Nvidia 4090 boards are rated for 600 watts and so are the air coolers, it does not leave much headroom; maybe max sustained boost for liquid.

There's a difference of opinion here: 450 W corresponds to 2.8 GHz, and it looks like the 4090 has a lot of headroom when it comes to overclockability; 3.45 GHz seems to be the BIOS limit, though, and thermals are looking promising as well. Also, the leaked AMD numbers aren't making much sense either: TPU states 12,288 shaders, 768 TMUs, 256 ROPs and 192 RT cores on 350 mm², which looks like one of those RX Vega hype trains.

Nvidia isn't showing the same kind of urgency we saw during the Ampere launch (the 3090/3080 clubbed under the same die, that kind of stuff), which makes me wonder if Nvidia knows a lot more about AMD than we'd care to admit.

Edit: oh, and they also made this special AD103 die this time, which is particularly intriguing.
 
If that is the case then AMD would have failed hard, since the 7900 XT is rumoured to have 140% more cores than the 6900 XT.

I think that number is from before AMD cut down the specs.

Edit: ah, OK, looks like the latest specs. I think the core count was 16k or something before, and now it's 12k. I wonder if they're doing the 2x FP32 unit trick like Nvidia to claim double the core count when it's actually half that? Like Nvidia claims a 4090 has 16k cores, but it's really only 8k.

If it's 12k real cores and they're properly fed, then the 7900 XT would be a failure if it's not 50% faster than a 4090.
 
I think that number is from before AMD cut down the specs.

Edit: ah, OK, looks like the latest specs. I think the core count was 16k or something before, and now it's 12k. I wonder if they're doing the 2x FP32 unit trick like Nvidia to claim double the core count when it's actually half that? Like Nvidia claims a 4090 has 16k cores, but it's really only 8k.

If it's 12k real cores and they're properly fed, then the 7900 XT would be a failure if it's not 50% faster than a 4090.
Nvidia and AMD architectures are not comparable. Nvidia uses a superscalar architecture, and the CUDA core is basically an FMA unit, whereas AMD's shader is a full-fledged ALU with support for vector operations.
So when Nvidia says they have packed in "x" CUDA cores, it just means x FP32 FMA operations per clock cycle. But Nvidia tends to share some of that hardware with the INT datatype, so you'd have some CUDA cores alternating between INT and FP32 datatypes between clock signals.
Physically, though, you'd still have the same number of cores as shown on the label.
AMD has a few interesting options, but they'd probably count the same way when publishing specs.
But the bigger question is that a single Nvidia CUDA core takes a lot less die space than an AMD shader, so we are now looking at specs that call for a 140% increase in shader count plus other beefed-up proportions that don't tie up with the die size.
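The "x CUDA cores = x FMA ops per clock" counting above can be sketched numerically. The unit count and clock below are placeholder assumptions for illustration, not confirmed specs for any real card:

```python
# Peak FP32 throughput when a GPU markets 2x FP32 issue per physical
# datapath (the dual-issue counting described above). Placeholder numbers.
def peak_tflops(physical_fp32_units: int, clock_ghz: float,
                issue_per_unit: int = 2) -> float:
    # Each issue is one FMA = 2 FLOPs (multiply + add) per clock.
    flops = physical_fp32_units * issue_per_unit * 2 * clock_ghz * 1e9
    return flops / 1e12

# 8,192 physical units marketed as 16,384 "cores" via dual issue:
print(peak_tflops(8192, 2.5))  # ≈ 81.92 TFLOPS
```

The marketed core count and the physical unit count give the same peak FLOPS figure; the difference only shows up in die area and in how often the shared INT/FP32 paths actually dual-issue.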
 