Worth upgrading from i7-9700?

Hi,

My main use is photo processing (LR), MS Office and occasionally some strategy gaming (XCOM, the Total War series and the FM series). Current setup: i7-9700, 32GB RAM and a 2070 Super. Resolution: 3440x1440. I haven't yet found a game that doesn't perform well, so my main interest is photo processing.

thanks
Yannis
 
I don't see any point. If the 9900K is the end-game CPU for your board, there will be no real difference in LR.

I only just upgraded to a 7700X from a 4790K, as I moved to a 120Hz screen.
 
Hi all!

I am coming back to my question raised last February, checking whether a 13700 upgrade would yield noticeable speed gains in LR. I see the latest versions of LR getting a bit slower and was wondering whether a CPU or GPU upgrade would make sense...

thanks
 
The 13700 is a big upgrade in productivity performance in general (though you might want to wait for the rumoured 8 P-Core, 12 E-Core 14700). But if you're referring to Lightroom, then from the benches I've seen it comes down to: which specific features are you using, and, as I said in my post back in February, how much/how long are you waiting, and when you are waiting, which part of the PC is the bottleneck?

It's very important to clarify this with such apps, because if you find a specific tool or task running slow and upgrade your CPU, only to discover the SSD was the bottleneck, that's an expensive oops (or similar with the GPU).
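As a rough self-check (a minimal sketch using the psutil library; GPU load isn't covered here and needs vendor tools such as Task Manager or nvidia-smi), you can watch whether the CPU or the disk saturates while a long task like an LR export runs:

    # Sample overall CPU load and disk throughput once per second while an
    # export (or other long task) is running. Sustained ~100% CPU suggests a
    # CPU limit; a mostly idle CPU with a busy disk points at the SSD.
    # Requires: pip install psutil
    import psutil

    def sample_utilisation(seconds: int = 60, interval: float = 1.0) -> None:
        prev = psutil.disk_io_counters()
        for _ in range(int(seconds / interval)):
            cpu = psutil.cpu_percent(interval=interval)  # blocks for `interval`
            disk = psutil.disk_io_counters()
            read_mb = (disk.read_bytes - prev.read_bytes) / 1e6
            write_mb = (disk.write_bytes - prev.write_bytes) / 1e6
            prev = disk
            print(f"CPU {cpu:5.1f}% | read {read_mb:7.1f} MB/s | write {write_mb:7.1f} MB/s")

    if __name__ == "__main__":
        sample_utilisation()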

This is an example of what I'm talking about:

"We’ve been benchmarking Adobe’s Lightroom since the first version released, and until this point in time, we’ve never used it for GPU testing. That’s because in all of our previous testing, we’ve never noticed our export processes use it. Well, a change happened at some point, and it could have very well been a while ago, and we just didn’t notice because we use the same GPU in all of our CPU testing.

What’s clear from the above result is that exporting to DNG has nothing to do with the GPU whatsoever. We’re only including the result to make that point clear. Exporting to JPG can take great advantage of a GPU, and this becomes another test where Arc shines.

As the bottom five results show, Lightroom doesn’t support as wide a range of GPU hardware as we’d like to see. All of those 200 second results are the same because the work ended up being relegated to the CPU. No AMD GPU was able to help with the process, nor did the Arc A380 – which might possibly be the first driver-related bug we’ve encountered. But even NVIDIA’s GTX 1660 – a GPU a mere two generations old – failed to be utilized, as well. We hope Adobe will expand support in the future, because GPUs do speed up the JPG export process nicely.

Oct 5 Edit: The lack of GPU acceleration in Lightroom on the AMD side of things is due to another driver bug. The just-released 22.10.1 driver notes that this issue is fixed, so we’ll retest the Radeons in time for the RTX 4090 launch.

Because it really stands out, yes – the RTX 3060 Ti placed behind the RTX 3060. This is far from being the first time we’ve seen this sort of behavior from Lightroom, but it’s the first time to happen with GPUs involved. Months ago, we found that AMD’s Ryzen 9 5900X outperforms the 5950X – a result that has been oddly consistent even across generations."


In this video, he finds little improvement from using a high-end GPU instead of the CPU's IGP.

But this new feature appears to be a monster, and it shows up old PCs very badly when used.

As for Puget's benchmarks, they say this:

For Lightroom Classic, our testing is split into two categories: passive tasks like exporting and generating previews, and active tasks like culling and switching modules. These are combined into a single overall score, but depending on the workflow, either the passive or active score may be more important than the overall score.
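For illustration only (Puget doesn't state the exact weighting in the passage above; a simple average of the two category scores is assumed here), the takeaway is to read the category score that matches your workflow rather than just the combined number:

    # Hypothetical combined score; the real Puget formula may differ.
    def overall_score(active: float, passive: float) -> float:
        # active: culling, switching modules; passive: exports, previews
        return (active + passive) / 2.0

    # A chip strong in exports but average interactively still looks
    # "good overall", which can mislead an interactive-heavy workflow.
    print(overall_score(active=95.0, passive=120.0))  # -> 107.5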

 
Hi all!

I am coming back to my question raised last February, checking whether a 13700 upgrade would yield noticeable speed gains in LR. I see the latest versions of LR getting a bit slower and was wondering whether a CPU or GPU upgrade would make sense...

thanks
I'd not upgrade for now in any case. Intel are due to launch a refreshed 13th gen as the 14th gen in the coming months, and it seems the 14700 will gain a few more cores over the 13700. From what I've read, both AMD and Intel will launch all-new products in 2024, so I think it would be an even better upgrade to hold out until then and see what both companies offer.
 
I looked into this as I'm also currently using an i7-9700, and the price of an i9-9900(X) was so high for the small performance gain that it just wasn't worth it. I'm currently building a completely new system based on a Ryzen 7 7700, but the GPU has yet to be selected.
 
I looked into this as I'm also currently using an i7-9700, and the price of an i9-9900(X) was so high for the small performance gain that it just wasn't worth it. I'm currently building a completely new system based on a Ryzen 7 7700, but the GPU has yet to be selected.
OK, I'll be curious to read your impressions of this upgrade, not so much in games as in productivity.
 
Any news on upgrading from the 9700 to a newer CPU?
The 14700 is the only 14th gen CPU with a meaningful performance difference, thanks to its extra E-Cores over the 13700, so not much has changed. Since single-core performance is slightly higher because of the higher clocks, office and photo apps tend to show some gains, but we're talking very marginal improvements (a few % at most).
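To put a number on "a few %" (illustrative figures, not measured: assuming, for illustration, max boost clocks of roughly 5.4 GHz and 5.6 GHz for the outgoing and refreshed parts), clock scaling alone caps the single-thread gain at around 4%:

    # Back-of-the-envelope single-thread gain from clocks alone.
    old_clock_ghz = 5.4  # assumed max boost, outgoing part
    new_clock_ghz = 5.6  # assumed max boost, refreshed part
    gain = new_clock_ghz / old_clock_ghz - 1
    print(f"Best-case gain from clocks alone: {gain:.1%}")  # ~3.7%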
 
The 14700 is the only 14th gen CPU with a meaningful performance difference, thanks to its extra E-Cores over the 13700, so not much has changed. Since single-core performance is slightly higher because of the higher clocks, office and photo apps tend to show some gains, but we're talking very marginal improvements (a few % at most).
So that means that LR doesn’t benefit from multiple cores?
 
The 14700 is the only 14th gen CPU with a meaningful performance difference, thanks to its extra E-Cores over the 13700, so not much has changed. Since single-core performance is slightly higher because of the higher clocks, office and photo apps tend to show some gains, but we're talking very marginal improvements (a few % at most).
What about upgrading to a 13700K? When I compare it to my humble 9700, I see several technical differences, such as more cores and higher frequency.
 
Hi,

My main use is photo processing (LR), MS Office and occasionally some strategy gaming (XCOM, the Total War series and the FM series). Current setup: i7-9700, 32GB RAM and a 2070 Super. Resolution: 3440x1440. I haven't yet found a game that doesn't perform well, so my main interest is photo processing.

thanks
Yannis
I didn't get the actual point.
 
So that means that LR doesn’t benefit from multiple cores?
The honest answer is: I don't know. I rely on the benchmarks.

In Puget's 13th gen article, they have three different scores, and from what I can see of the overall score, the answer is that LR is mainly single-core/single-thread (which matches the historical assumption that 2D editing is single-threaded while 3D editing is multithreaded).

How can I make that assumption?
  1. The difference between the CPUs is relatively small in their scores: for example, there's only +10% between the 13600K and the 13900K, which is much less than the gap in their multithread performance (+55%, according to PassMark, here); see the sketch below this list.
  2. The 5950X is behind even the Ryzen 5 7600X, whereas in something like Blender (heavily multithreaded) it performs more like a Ryzen 9 7900 (here).
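To put rough numbers on point 1 (a minimal sketch under simplifying assumptions: the serial part runs at the same speed on both chips, and the parallel part scales with the PassMark multithread ratio), Amdahl's law suggests only about a quarter of the workload scales with core count:

    # Solve Amdahl's law S = 1 / ((1 - p) + p / k) for the parallel
    # fraction p, given the observed speedup S and the multithread
    # performance ratio k between the two CPUs.
    def parallel_fraction(observed_speedup: float, mt_ratio: float) -> float:
        return (1 - 1 / observed_speedup) / (1 - 1 / mt_ratio)

    # +10% observed in LR vs +55% in multithread benchmarks (figures above).
    p = parallel_fraction(observed_speedup=1.10, mt_ratio=1.55)
    print(f"Estimated parallel fraction: {p:.0%}")  # ~26% -> mostly serial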

What about upgrading to a 13700K? When I compare it to my humble 9700, I see several technical differences, such as more cores and higher frequency.
Sure, you could, but since they released at pretty much the same price and the 14700K is superior, what's the point?

In my opinion, the only one that would make sense is the 13700F, because it's around £60 cheaper, but going without the IGP isn't usually something we'd suggest for usage like yours.
 