
Current generation CPU longevity estimates

How long will current CPUs last for web browsing? I can feel the sluggishness with a Q6600 these days. I don't know if web pages or browsers are getting more demanding, or if it's OS bloat or something else...

Single-core PCs can browse the internet with no problems, as can cheap Android tablets and phones. It's a software issue on your end.
 

Modern CPUs/GPUs (mobile SoCs included) have a lot of fixed-function units that come into play for web browsing: encryption/decryption, shading, plotting, math ops, acceleration of certain JavaScript functionality, video decoding, etc. On an old Q6600 all of that is done in software, significantly increasing the load on the CPU cores. So don't expect a snappy web-browsing experience on that machine; your experience on cheap but modern computers doesn't compare.
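As a rough analogy for the accelerated-vs-software gap (not a benchmark of any real browser), here's a Python sketch: hashlib's SHA-256 runs in optimised C (and can use dedicated CPU instructions where available), while the byte-by-byte loop below stands in for a "software fallback" path doing the same amount of data touching.

```python
import hashlib
import time

data = b"x" * 1_000_000  # 1 MB of data to "process"

# Accelerated path: hashlib's SHA-256 is implemented in optimised C.
t0 = time.perf_counter()
hashlib.sha256(data).hexdigest()
fast = time.perf_counter() - t0

# "Software fallback" stand-in: touching every byte in pure Python.
t0 = time.perf_counter()
checksum = 0
for b in data:
    checksum = (checksum + b) & 0xFFFFFFFF
slow = time.perf_counter() - t0

print(f"accelerated: {fast*1000:.2f} ms, pure Python: {slow*1000:.2f} ms")
```

The pure-Python loop is typically an order of magnitude or two slower, which is the same shape of gap an old CPU faces when it has to decode video or run crypto without dedicated hardware.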
 
@HangTime no disagreements.



Not really. PCs and consoles run the same logic, and the compute workload per frame is generally the same. Last-gen console games were OK with 30fps, so developers had more leeway with game logic, which left room for 60/120fps on PCs with more powerful CPUs. With 60fps now generally the minimum accepted on consoles, developers need to ensure a Zen 2 core can deliver 60fps, which is what I'd call adequate gaming. That's 16.6ms per frame: all game AI will be written so that a Zen 2 core can process a frame's worth of compute in 16.6ms, until we get new consoles.
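The frame-budget arithmetic above can be sketched in a few lines of Python (the fps targets are just the figures from this post, nothing official):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds of CPU time available per frame at a given fps target."""
    return 1000.0 / target_fps

print(frame_budget_ms(60))   # ~16.67 ms: the Zen 2 console baseline discussed above
print(frame_budget_ms(30))   # ~33.33 ms: last-gen 30fps games had double the budget
print(frame_budget_ms(120))  # ~8.33 ms: a 120fps target halves it again
```

This is why a higher fps floor on consoles constrains how much per-frame work (AI included) developers can schedule on one core.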



You've been able to accelerate AI on the GPU ever since there were GPUs, just not every type of AI. GPUs suit high-throughput, latency-insensitive problems such as parameter optimisation or ML model training, which are generally matrix operations, and those don't happen during gaming. Game AI runs on the CPU in every single game. Now if we're talking chess or Go, those are different because the nature of the problem completely changes: it's no longer latency-sensitive and the solutions are matrix operations, so those can run on GPUs, even though inference still happens on CPUs in 99.99% of cases.

Generally, games use 6-8 main threads these days doing the bulk of the compute, and most of the AI workload sits in one or two of them, depending on the type of game. Game AI is by nature a sequential problem, so it's very ill-suited to GPU processing, and the added latency of copying data to GPU memory would make it unfeasible anyway. In the end game AI is generally not a parallelisable problem, so multithreading doesn't help (that's why adding more cores beyond 6-8 doesn't give you more fps).
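The sequential nature can be shown with a toy sketch (the AI "logic" here is hypothetical, purely to illustrate the dependency chain): each tick's decision reads the state the previous tick produced, so the ticks must run one after another no matter how many cores you have.

```python
# Toy sketch: game AI as a chain of frame-by-frame dependencies.
# Each tick consumes the state produced by the previous tick, so the
# ticks cannot be distributed across cores and run in parallel.

def ai_tick(state: dict) -> dict:
    """One frame of a (hypothetical) AI: the next decision reads the last state."""
    seen = state["last_seen"]
    decision = "chase" if seen else "patrol"
    return {"last_seen": not seen, "decision": decision}

state = {"last_seen": False, "decision": "idle"}
for frame in range(4):      # each iteration needs the previous result first
    state = ai_tick(state)
print(state["decision"])
```

Contrast this with the matrix workloads mentioned above, where thousands of independent operations can be dispatched at once.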

Game logic, AI, physics, graphics, etc. all come together on the CPU. As elements were added on the PC side, the requirements went up, and I don't see why that won't keep happening now as it pretty much always has. How much further it goes will depend on the game and on the developer's ability to optimise.

For the AI, here's a demo from AMD with the AI doing "gaming AI" actions such as pathfinding and avoiding dangers on the GPU. That demo ran 16,384 AIs; with current GPUs it could probably go to 100,000+:
https://youtu.be/p7PlQ-q17tM?t=409
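What makes a demo like that work is that every agent applies the same simple rule independently, so the whole population updates as one data-parallel operation. A hypothetical NumPy sketch of that idea (a made-up "step toward target" rule, not AMD's actual demo code):

```python
import numpy as np

# 16,384 agents, each independently stepping toward its own target.
# Because the update is the same rule applied elementwise across the whole
# population, it vectorises cleanly - the GPU-friendly shape of workload.
N = 16_384
rng = np.random.default_rng(0)
positions = rng.uniform(-100, 100, size=(N, 2))
targets = rng.uniform(-100, 100, size=(N, 2))

step = targets - positions
dist = np.linalg.norm(step, axis=1, keepdims=True)
positions += step / np.maximum(dist, 1e-9) * 0.5  # move 0.5 units toward target

print(positions.shape)
```

Each agent's update here is independent of every other agent's, which is exactly the opposite of the sequential per-frame game AI discussed above.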

And here's The Witcher 3, showing how the old FX stretched its legs once in the city where the AI is more present, leaving the i3 behind:
https://www.youtube.com/watch?v=Rutk9ErhKG4



Single-core PCs can browse the internet with no problems, as can cheap Android tablets and phones. It's a software issue on your end.

Not really, it's a pain. They were fine back in the day, but now, with all the "bling-bling", it's not worth bothering with.
 
How long will current CPUs last for web browsing? I can feel the sluggishness with a Q6600 these days. I don't know if web pages or browsers are getting more demanding, or if it's OS bloat or something else...
I guess it depends what "web browsing" looks like in the future; it has certainly got more demanding with extensive use of JS, embedded video and whatnot. But really, a modern CPU should be fine for a long time. Maybe, as mentioned, the issue will be if some new instruction set comes out and gets used extensively.

As for your situation, the obvious approach is to do a fresh W10 install on an SSD and see how that goes. Make sure you have sufficient RAM too (4GB isn't really enough for the modern web). The Q6600 is over 15 years old now, so you might have an old/upgraded Windows install on there.
 
Modern CPUs/GPUs (mobile SoCs included) have a lot of fixed-function units that come into play for web browsing: encryption/decryption, shading, plotting, math ops, acceleration of certain JavaScript functionality, video decoding, etc. On an old Q6600 all of that is done in software, significantly increasing the load on the CPU cores. So don't expect a snappy web-browsing experience on that machine; your experience on cheap but modern computers doesn't compare.

This.

Also video decoding. Plenty of old machines struggle with YouTube playback now if they don't have GPU assist for the modern codecs.
Pair an old CPU with a modern GPU and things may be fine. Hmm, that may be something for me to test somewhere down the line with a Pentium 4 and an Ampere GPU.
 
In terms of the consoles being the baseline, the Series X/S allocate 7 cores/14 threads and the PS5 allocates 6½ cores/13 threads to games. Seems like a fast 8c/16t will be fine for a long time. I suspect few games will be able to take advantage of more cores, considering how hard it is to take advantage of even 8 atm.

Source is Richard Leadbetter from DF
 
Modern cards can have issues being put in such old boards though :o
It's not even just outright compatibility issues. Run a motherboard that old and, even if the CPU is powerful enough, you'll be I/O-limited with SSDs and networking: no M.2 slots, etc.
The CPU was powerful, but in general things have moved on, as you'd expect.
 
Viable to whom?

There's folk still using black-and-white TVs; they'd say that's a "viable" experience.

A 4690 cannot run games at 4K while doing typical background tasks (Discord, YouTube, Spotify, a few other apps) at the same time as an AAA game. A 6700K can't either. Even 9900Ks struggle with some games when running typical apps at the same time.
To general home users: people who just surf the web, use office applications and watch media. There are plenty of reasons someone might hang on to a CPU. Heck, my mate just upgraded to a 5950X, and prior to that he had been running the same setup for 12 years, used for all of the above plus some photo editing. He's hoping his 5950X lasts as long, and I see no reason why it shouldn't. Even the type of games people play may determine how long the CPU lasts.
 
Viable to whom?

There's folk still using black-and-white TVs; they'd say that's a "viable" experience.

A 4690 cannot run games at 4K while doing typical background tasks (Discord, YouTube, Spotify, a few other apps) at the same time as an AAA game. A 6700K can't either. Even 9900Ks struggle with some games when running typical apps at the same time.
It really depends what you are doing. My dad was running an i7-860 for about 8.5 years. It was generally OK, but was bottlenecked more by the lack of an SSD and by 4GB of RAM than by CPU grunt for his needs.
I ran a 3570K for over 5 years, primarily for gaming. If I'm honest, I'd have been better off holding out longer, waiting for DDR4 prices to drop (16GB of 3000MHz cost me £150) and maybe picking up Zen+ instead of Zen. I didn't get a huge gain from moving to Ryzen.

Quite why someone wants to be running YouTube, Spotify and an AAA game at 4K all at the same time I'm not sure, but for me that's not the sort of use case to warrant an upgrade; if I was struggling I'd just switch off the background apps I can't really be getting full value from anyway. Having the game run at 4K should reduce the CPU load too. Obviously hardcore gamers will upgrade sooner to unlock the potential of powerful GPUs, but I don't think those are the type of people looking to buy a CPU and use it for many years.
 