New build for 3D art, game dev, and gaming

Associate · Joined 22 Jan 2010 · Posts: 1,480
I am looking to build a new desktop for 3D art, game dev, and gaming. I've been out of the desktop game for a while so might be missing some nuances...

My monitor is a 3440x1440 ultra wide and I'll be using the 2TB 980 pro that's in my laptop.

CPU: I'm thinking the 7950X for all the cores and speed in my 3D applications. The X3D seems like it would be wasted, as I will surely be GPU-bottlenecked in most games?

GPU: I'm kinda stuck with Nvidia, as CUDA support is important for some of my applications. This gen seems kinda underwhelming though, so I want to go as cheap as I can while still driving my monitor well, and maybe upgrade next gen if the offering is better. I feel like the 4060 Ti won't quite cut it, especially the 8GB VRAM, so I went with the 4070.

My basket at OcUK:

Total: £1,938.80 (includes delivery: £0.00)

Thoughts?
 
I am looking to build a new desktop for 3D art, game dev, and gaming. I've been out of the desktop game for a while so might be missing some nuances...

GPU: I'm kinda stuck with Nvidia, as CUDA support is important for some of my applications.

Have you done much investigation of the GPU/CPU bias that's most beneficial to you?

In these benchmarks, for example, the 4090 is way faster than a 4070 in Blender, though it's also true that many apps don't care.

Another thing you might want to be aware of is that according to this, only the 4070 Ti and up have two encoders.

Intel alternative:

My basket at OcUK:

Total: £1,918.87 (includes delivery: £11.99)

With this cooler (under £50).

The 13700K seems to do pretty well in UE5, C++ and Blender (here), and I upped the memory to 64GB and the graphics card by one tier. The Gaming X is expensive, but seemed to perform very well in HUB's roundup of 4070s.
 
Thanks. That's definitely a tempting-looking alternative. I was leaning heavy on the CPU 'cause I spend a lot of time in ZBrush, but the Ti would be a lot nicer in other places and for gaming. Hard choice. I don't really work with video much, so I don't think the encoder matters.

I should have said that I am looking to stay in the £2k range.

Does the i7 not mind the slower RAM? The AMD chips much prefer tighter timings, I think?
 
Does the i7 not mind the slower RAM? The AMD chips much prefer tighter timings, I think?

It does matter for games (less so at higher resolutions than in the video below), but the non-X3D CPUs do care more. For most apps, the difference between 4800 and 6000 is small.

 
This plus an RTX A4500. You'll drop a little CPU performance versus a 7950X, but the A4500 is really fast for CUDA.
Honestly, unless you need the extra VRAM there's no real reason for an RTX A-series card in most 3D software; you might as well just grab the GeForce version and save some cash, or go one step higher, in this case a 4080. An on-offer A4500 is just under £1,100 (it's usually £1,600), while the 4080 can be had for around £1,100-1,200... 4GB less VRAM but far faster CUDA performance.

Also, I'd argue you might as well go AM5 7900/7900X, as it's about the same multi-threaded performance as the 5950X (I have a 5950X) but opens up the AM5 platform for future upgrades etc. and gives you better single-core performance. Having said that, I'm not sure I could recommend an AMD system with all the current issues... if I was buying this very minute, I think I'd be picking Intel.

@Boozebeard I'd be looking at 64GB of RAM personally for 3D art/game dev.
As mentioned above, you really need to look into how much your CPU is used versus your GPU in the programs you use; it might be far more cost-effective to put the bulk of your money into the GPU if the programs can make full use of CUDA.
 
@Boozebeard I'd be looking at 64GB of RAM personally for 3D art/game dev.
As mentioned above, you really need to look into how much your CPU is used versus your GPU in the programs you use; it might be far more cost-effective to put the bulk of your money into the GPU if the programs can make full use of CUDA.

I spend most of my time in ZBrush, which is CPU-only. I don't do that much with CUDA; I just need it for when I need it, if you know what I mean. But I'm also getting more into doing game dev stuff, so the GPU will also get used for rendering game engine viewports and such.

Stepping down to the i7 and stepping up to the 4070Ti might be the play.
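For what it's worth, the "CUDA when I need it" setup I mean is basically a runtime check with a CPU fallback. This is just an illustrative Python sketch (the `pick_device` helper is hypothetical, not from any particular app), which probes for an Nvidia driver without importing any GPU library:

```python
import shutil

def pick_device() -> str:
    """Return "cuda" if an Nvidia driver appears to be present, else "cpu".

    A crude proxy check: nvidia-smi being on the PATH usually means a
    working Nvidia driver is installed, so CUDA paths can be attempted.
    """
    return "cuda" if shutil.which("nvidia-smi") else "cpu"

device = pick_device()
print(f"Rendering on: {device}")
```

Real renderers do something similar internally, which is why the same scene file works on a CUDA box and a CPU-only laptop.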
 
Honestly, unless you need the extra VRAM there's no real reason for an RTX A-series card in most 3D software; you might as well just grab the GeForce version and save some cash, or go one step higher, in this case a 4080. An on-offer A4500 is just under £1,100 (it's usually £1,600), while the 4080 can be had for around £1,100-1,200... 4GB less VRAM but far faster CUDA performance.

Also, I'd argue you might as well go AM5 7900/7900X, as it's about the same multi-threaded performance as the 5950X (I have a 5950X) but opens up the AM5 platform for future upgrades etc. and gives you better single-core performance. Having said that, I'm not sure I could recommend an AMD system with all the current issues... if I was buying this very minute, I think I'd be picking Intel.

@Boozebeard I'd be looking at 64GB of RAM personally for 3D art/game dev.
As mentioned above, you really need to look into how much your CPU is used versus your GPU in the programs you use; it might be far more cost-effective to put the bulk of your money into the GPU if the programs can make full use of CUDA.

I'm currently working on moving away from being dependent on Nvidia for CUDA now that it's not Nvidia-exclusive, but the A4500 has a lot more performance for much less heat for me. That's compared to a desktop RTX 3080 12GB. It's a shame Nvidia locked the GeForce cards out of the Quadro features.

The RTX Quadro cards seem to always be good value. I wouldn't personally use a non-professional card past hobby workloads, but the OP's mileage might vary.
 
I spend most of my time in ZBrush, which is CPU-only. I don't do that much with CUDA; I just need it for when I need it, if you know what I mean. But I'm also getting more into doing game dev stuff, so the GPU will also get used for rendering game engine viewports and such.

Stepping down to the i7 and stepping up to the 4070Ti might be the play.
Yeah, ZBrush is one of those weird 3D programs that doesn't make use of the GPU; you'd think it would with the type of work it's doing...
I still say look towards 64GB of RAM. 32GB is pretty borderline in 3D work these days IMO, and like I said earlier, I'm not sure I could honestly recommend AMD right now, so I'd be playing around with what Tetras suggested for Intel.

I'm currently working on moving away from being dependent on Nvidia for CUDA now that it's not Nvidia-exclusive, but the A4500 has a lot more performance for much less heat for me. That's compared to a desktop RTX 3080 12GB. It's a shame Nvidia locked the GeForce cards out of the Quadro features.
The RTX A4500 is basically the RTX 3080 10GB with double the RAM, clocked a bit slower to help reduce power draw. Unless you're doing something that requires more RAM, the RTX 3080 12GB should be faster when using CUDA, so it's a bit weird if that's not the case... they're all based on GA102 cores.

The RTX Quadro cards seem to always be good value. I wouldn't personally use a non-professional card past hobby workloads, but the OP's mileage might vary.
I've seen loads of smaller professional 3D companies using Titans etc. instead of Quadro/A-series, and I've seen plenty of places selling workstations based around GeForce too. Why? Because outside of more VRAM (a valid reason), a couple of extremely expensive pieces of software with artificial limitations (SolidWorks, I'm looking at you) and Nvidia support (important for big business with custom CUDA code), there is little to no benefit to having a Quadro for most 3D design tasks using off-the-shelf software.
I've not seen any difference in Autodesk (Inventor/Max) using a Quadro versus a GeForce; if anything, the GeForce was slightly faster.
 
IIRC the Titan cards did keep the Quadro features. TBH, now you can port CUDA to Radeon or maybe even Arc hardware, it's probably time for CUDA to become something else.
 
IIRC the Titan cards did keep the Quadro features. TBH, now you can port CUDA to Radeon or maybe even Arc hardware, it's probably time for CUDA to become something else.
I think they did originally keep the full feature set, then gradually had bits removed until it was basically a top-tier GeForce.

Porting CUDA to other hardware is only useful if the software we use actually ports it... unless there is some sort of 'fake process' added to suggest there's a CUDA card available. I also don't think Nvidia will let it last long if it can be ported; CUDA is part of their AI strategy, so I'll bet they put some sort of hardware lock on it at some point... proprietary lock-in for the cash cow.
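To illustrate the "software has to actually port it" point: most apps dispatch through some kind of backend table, and a ported CUDA runtime does nothing until the app registers a backend for it. A rough sketch (the registry and `matmul` dispatcher here are hypothetical, just to show the shape of the problem; only a plain-Python CPU path is registered):

```python
def _cpu_matmul(a, b):
    """Naive CPU matrix multiply over nested lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

# Hypothetical backend registry: a real app would register CUDA or HIP
# kernels here. Until it does, the dispatcher below can only ever pick
# the CPU path, no matter what runtimes the hardware supports.
BACKENDS = {"cpu": _cpu_matmul}

def matmul(a, b, prefer=("cuda", "hip", "cpu")):
    """Dispatch to the first backend available in preference order."""
    for name in prefer:
        if name in BACKENDS:
            return BACKENDS[name](a, b)
    raise RuntimeError("no compute backend registered")

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # → [[19, 22], [43, 50]]
```

That's why a CUDA port to Radeon or Arc only matters once each application ships and registers kernels for it.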
 