Regarding your point about the Adobe software, if using an AMD card, would that then mean the only option would be to choose software rendering, say for rendering a video in Premiere?

Thanks. I kind of thought that would be the case. I use Adobe stuff quite a lot too, so I would miss the CUDA cores too.
If you are using creative stuff, I'd steer clear of Radeons for now:
[snip]
Radeon ****e for creative. Job done.

Don't suppose you can summarise for the lazy/time-starved?
Regarding your point about the Adobe software, if using an AMD card, would that then mean the only option would be to choose software rendering, say for rendering a video in Premiere?
It’s more to do with NVIDIA getting ahead of the game and introducing a proprietary standard in CUDA. Radeon ****e for creative. Job done.
edit - Basically it says that AMD need to spend more on their driver teams to optimise for the creative side; it's not like they're short on cash atm, raking it in from server products etc.
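For anyone who wants to sanity-check their own machine: Premiere's CUDA renderer option depends on a CUDA-capable NVIDIA card being visible to the driver. A minimal sketch below, assuming nvidia-smi is installed and on PATH (the cuda_gpu_present helper name is just for illustration, not anything from Adobe):

import shutil
import subprocess

def cuda_gpu_present() -> bool:
    # Return True if nvidia-smi reports at least one NVIDIA GPU.
    if shutil.which("nvidia-smi") is None:
        return False  # no NVIDIA driver/tools on this machine
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
    except subprocess.CalledProcessError:
        return False
    return any(line.strip() for line in out.stdout.splitlines())

if __name__ == "__main__":
    if cuda_gpu_present():
        print("NVIDIA GPU found - the CUDA renderer should be selectable.")
    else:
        print("No NVIDIA GPU found - expect software rendering (or OpenCL on AMD).")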
Definitely wouldn't bother for those 3 games you listed.
The only thing that might be a good reason to consider it is if VRAM on the 3080 is becoming a limiting factor for other stuff. What resolution do you play at, OP?
I'm still on my 3090 FE and have no intention to upgrade for a while. 24GB of VRAM was overkill at the time, but in the last year it's really starting to get used by more and more games, especially as I use VR and play modded stuff a lot.
That's because the 4080 has 16GB of VRAM, so it should still be fine for a while; the 4070 Ti with 12GB of VRAM, on the other hand, already has some stuttering issues in games like Hogwarts Legacy with higher-quality ray tracing turned on at 1440p and above.

I think some games use more than they need; I've never seen my 4080 use all of its VRAM, but pretty close.
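If you want to know whether VRAM is actually the limiting factor rather than guessing from what games allocate, here's a rough sketch that polls nvidia-smi once a second while you play. It assumes an NVIDIA card with nvidia-smi on PATH, and bear in mind "used" is what the driver has allocated, which can overstate what a game strictly needs:

import subprocess
import time

def vram_used_total_mib():
    # Ask the driver for allocated vs total VRAM, in MiB.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    used, total = (int(x) for x in out.stdout.splitlines()[0].split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_total_mib()
        print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
        time.sleep(1)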