Upgrade from i7 10700k

This is pretty outdated advice. The correct answer is AM5. AM5 has two more CPU generations coming, offers support for 256GB of RAM, and will beat the Mac into next week for significantly less money.
I mean, I’m happy to be wrong if AM5 now easily supports 256GB of RAM at moderate-to-good speeds.

My understanding is that it’s difficult to run a lot of AM5 systems with that much RAM, and those that can will often only run it at 3600 MT/s.

I’ll do some digging, but I don’t think the Mac Studio as an AI machine is a bad idea. I’m happy to be wrong if there is a better solution.

Edit: I was wrong, the Mac Studio supports 512GB of RAM, not 256.

Edit Edit: yeah, I'm going to stand by my recommendation of the Mac Studio as a dedicated AI/LLM machine for intense use. Its unified RAM can be allocated to either the system or the GPU, so it can hold massive AI/LLM models. That matters more than an AM5 rig having 256GB of RAM that it can't allocate to the GPU, leaving it reliant on a dedicated card like the RTX 5090 with far less VRAM.
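To put rough numbers on that (pure back-of-envelope; the model sizes and quantisation levels below are illustrative assumptions, not measurements):

```python
# Rough memory footprint of an LLM's weights: parameters x bytes per
# parameter. Ignores KV cache and runtime overhead, so treat these as
# lower bounds.
def model_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 70B model at 8-bit needs ~70 GB just for weights -- beyond any
# single consumer GPU, but comfortable in 256-512 GB of unified memory.
print(model_footprint_gb(70, 8))    # 70.0 GB
print(model_footprint_gb(405, 4))   # 202.5 GB at 4-bit
print(model_footprint_gb(7, 4))     # 3.5 GB -- fits a 16 GB card easily
```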

You can split a model across RAM and VRAM as per this article, but it's not amazing. Here's another piece on Apple Silicon for AI/LLM use. There was a video from Level1Techs on an AM5 prosumer rig with 256GB of RAM; I'll try to find that as a source, as I think he mentioned AI/LLM usage with it.
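For anyone curious what that RAM/VRAM split looks like in practice, here's a sketch of the llama.cpp-style layer-offload arithmetic. The per-layer size and the VRAM reserved for cache/buffers are assumptions for illustration, not real measurements:

```python
# Sketch: how many transformer layers fit in a given VRAM budget when
# splitting a model between GPU and system RAM (layer offload). The
# layers left over run from system RAM, which is much slower.
def layers_on_gpu(n_layers: int, layer_gb: float, vram_budget_gb: float) -> int:
    """Number of whole layers that fit on the GPU; the rest stay in RAM."""
    return min(n_layers, int(vram_budget_gb // layer_gb))

# e.g. an 80-layer model quantised to ~0.55 GB/layer, on a 16 GB card
# with ~2 GB assumed reserved for KV cache and buffers:
print(layers_on_gpu(80, 0.55, 16 - 2))  # 25 layers on GPU, 55 in RAM
```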

If all you want to do is play around with small models that can fit in 16GB of VRAM, then fine, any AMD or Intel system will do, since you'll be using a dedicated GPU anyway.
 
Upgradability aside: with a half-decent motherboard and chip you can go over DDR5-6400 with 4x64GB. The door is also now open for 128GB sticks.
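Worth noting bandwidth as well as capacity here, since LLM token generation is largely memory-bandwidth-bound. A quick back-of-envelope, using nominal peak figures only:

```python
# Peak DDR5 bandwidth: transfers/s x bytes per transfer x channels.
# Nominal figures; real-world throughput is lower.
def ddr5_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# Dual-channel DDR5-6400: ~102.4 GB/s. For comparison, Apple quotes
# around 800 GB/s of unified memory bandwidth for its Ultra-class chips.
print(ddr5_bandwidth_gbs(6400))  # 102.4
```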

The Mac’s performance is decent for a small, low-power device, and for a time it could beat desktop x86, but the price of admission for the Mac has remained high, even increased, while x86 has moved on a lot since the M1. The entry Ultra version is over £4,200, the 256GB config is nearly £7,500, and 512GB is £10k!

The Mac Studio is great if you’ve got £5-15k and must have a low-powered, pocket-sized system and macOS.

If all you want to do is play around with small models, you don’t need a discrete GPU at all.
 
Have you seen the recent M5 Silicon reviews against Intel and AMD?

The argument against the Mac Studio needs to take into account matching its VRAM capacity with dedicated Nvidia AI GPUs, which are hella expensive.

I don't think your argument is quite apples to apples, but I hear you that the Mac Studio at £10k fully specced isn't for your average Joe. I just think that if you are arguing the RAM aspect, you need to argue VRAM too, since that's more relevant to AI/LLM usage.

Edit: The cheapest dedicated Nvidia AI card with 48GB of VRAM is €5,699.00, and even that doesn't come close to matching the Mac Studio's ability to run much larger models (at a slower rate, I'll admit).

The NVIDIA A100 80GB PCIe 4.0 starts at €23,850.00, and that's still only 80GB of VRAM.
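Working out price per GB of GPU memory from those two quoted figures (just the arithmetic; no other system costs factored in):

```python
# Price per GB of GPU memory, from the EUR figures quoted above.
def eur_per_gb(price_eur: float, vram_gb: int) -> float:
    return price_eur / vram_gb

print(round(eur_per_gb(5699.00, 48), 1))   # 118.7 EUR/GB (48 GB card)
print(round(eur_per_gb(23850.00, 80), 1))  # 298.1 EUR/GB (A100 80GB PCIe)
```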
 

The OP's budget is £2k and he wants to add a GPU. The Mac is simply not an option, especially an M5 Ultra Studio.

You also seem to be dismissing x86 as a viable option to run AI workloads. x86 is absolutely a valid choice for most AI workloads and has some big advantages for many, like the ability to upgrade and use add-in cards.
 
If all you want to do is play around with small models that can fit in 16gigs of VRAM then fine, any AMD or Intel system will do since you'll be using a dedicated GPU anyway.
No, I'm not. I'm exploring options and you're refusing to acknowledge points like the one I made and have quoted above.

I took money into account, and I still do, but my argument is based on VRAM requirements relative to price.
 

The OP is simply better off with an x86 system. Even assuming the highest-end video card, an x86 system bests the equivalently priced Mac by some way.

The Mac is a great option if you need macOS and/or must have a low-powered portable system with Apple's support behind it. And, of course, are happy to pay a significant premium.
 
If you really want to max out your mobo with 256GB of 6000-speed RAM, the top spot goes to MSI, so if you're looking at the B850 Edge Ti, go for it. Second spot goes to Asus, especially the Strix range: no problem there. ASRock I'd probably avoid if you're looking at X3D CPUs, given even a remote chance of failure. And avoid the Gigabyte boards: 50% chance they'll refuse to run.


 
If you then want to run up to four M.2 drives, that takes a bit more looking into, to see which M.2 slot shares lanes with the GPU slot, etc. For example, my B850E-E Strix board can take five M.2 drives, but slots 2 and 3 will cut the GPU from x16 to x8. The B850-A board has four M.2 slots, but again, slot 3 shares with the GPU, cutting it from x16 to x8. A quick look at the B850 Edge Ti: it has four M.2 slots, and slot 3 shares with the PCIE_E3 slot, which isn't the GPU slot (that's E1); it runs at PCIe 4.0 x2, slot 4 is PCIe 4.0 x4, and slots 1 and 2 are PCIe 5.0 x4. Seeing as you like the Edge, it ticks the boxes you're looking for. You could save a bit by going with the B850 Tomahawk Max WiFi (a black board, though), which is basically the same board.
It's a minefield going through all these boards.
 