I have recently started monitoring GPU and CPU usage while gaming, and I have yet to see a game that uses 100% of all 4 cores and 99-100% of my GPU. That's with a pretty old i5 760 @ 4GHz and a 780 Ti.
GTA V, for instance, generally never uses 100% of the GPU or the full 100% of the CPU cores, but the fps can still dip to 45. Having said that, it's the most demanding game, as the CPU cores can sometimes all be at ~95% with 90-95% GPU usage.
I would understand the dips if usage of either the GPU or the CPU were maxed out (i.e. if one of them were bottlenecking), but that generally doesn't seem to be the case. Crysis is the exception: 99% of the GPU is used, the CPU cores all sit at 70-80%, and the fps still dips below 60.
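For reference, here's a minimal Python sketch of one way to log this kind of data from a script rather than an overlay. It assumes psutil is installed and nvidia-smi is on the path (fine for a 780 Ti); the 0.5s interval is just my choice. Sampling per core at a short interval like this can show brief spikes that an averaged reading hides.

```python
# Rough sketch: log per-core CPU usage and overall GPU usage.
# Assumes psutil is installed and nvidia-smi is available (NVIDIA card).
import subprocess
import psutil


def gpu_utilization() -> int:
    """Query overall GPU utilization (%) via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])


def log_usage(interval: float = 0.5) -> None:
    """Print per-core CPU usage and GPU usage every `interval` seconds."""
    while True:
        # cpu_percent blocks for `interval` seconds and returns one value per core
        cores = psutil.cpu_percent(interval=interval, percpu=True)
        gpu = gpu_utilization()
        cores_str = " ".join(f"{c:5.1f}%" for c in cores)
        print(f"CPU per core: {cores_str} | GPU: {gpu:3d}%")


if __name__ == "__main__":
    log_usage()
```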
I was thinking of upgrading my CPU and motherboard to a newer i5, but if my current CPU isn't maxed out during games (and thus limiting the GPU), then there isn't any point, is there?
To those who have the latest and greatest CPUs / general computing hardware, why is it required?