> My evidence is my own personal experience. I can play BF1 multiplayer with 64 people, a CPU-demanding situation, at 1080p/solid 60fps absolutely fine with a 3.4GHz 3570K.

So you have posted no evidence then.
And who said the 4670k was no good for 1080p?
Do get your facts right please.
Have you got a 4690K? I did; in fact I still do, as it's in my media server now.
You're making broad, sweeping statements when, as Kaap has already said, you have no evidence of this.
I game at high FPS as I have a 144Hz monitor, so anything less than 140fps is noticeable to me. I do not want to settle for 60fps.
At 1080p, whether at low, medium, high or ultra settings, my CPU was at 100% in BF1 multiplayer games and my GPU was hardly being used.
I switched to a 6-core HT system and, bingo, GPU usage is now 90-100% and I've got a constant 140fps on ultra settings. I'm telling you how it is.
Here you go, these images may help you process this 'absurd' information.
Yup, what I'm seeing there is even a 3.3GHz 2500K is still *plenty* good for achieving good performance.
Your CPU could do with an overclock, as 3.9GHz is a little too slow for 1080p.
Nobody specified anything about playing with 144Hz monitors. The claim was that a 3.9GHz 4790K was 'too slow' for 1080p gaming. That's it. That's what I'm disputing.
Please pay attention here and don't confuse the context of what I'm saying.
EDIT: And really, if we're talking 144Hz monitors, then people at higher resolutions will face the same issues here in terms of CPU bottlenecking. If your system can't do 140fps at 1080p with a given CPU, it certainly isn't going to do any better turning the resolution up. You're just more likely to introduce a GPU bottleneck at some point, reducing performance even more.
> I'm at 3440x1440 and still getting 99% usage.

I did not mention Hz rate; you are busted.
As to resolution, the higher you go the easier it is for the CPU.
What the hell are you talking about? I responded to somebody else who was talking about 144Hz monitors.
Even if that were true (which it isn't), it doesn't support your claim that an i7 Haswell at 3.9GHz is 'too slow' for 1080p gaming.
Depends what settings you are using.
If you are using lower settings and getting high fps, then it will put more load on the CPU.
Normally at 2160p you are limited to a 60Hz monitor, so high settings and low fps make it easy for the CPU.
The CPU does not care what settings are used in the frames, just how many there are.
I do see your point, Seanspeed, but please bear in mind that owners of the previous Sandy Bridge CPUs were clocking their processors on average around 4.5GHz, and that was then.
Now, for example, Gears of War 4's ideal CPU requirement is a Haswell i7 at 4GHz.
I am not defending anyone here; I am just saying that to get the best out of a GTX 1070 and higher, a CPU overclock to over 4GHz is recommended, in my opinion.
I do apologize if I caused any offense.
Recommended, sure. If you notice, I am also recommending the person overclock their CPU. But not because it's simply 'too slow' if they don't, rather because there will be the occasional situation or title where it does help some.
> The CPU does not care what settings are used in the frames, just how many there are.

This is also not correct.
> I'm at 3440x1440 and still getting 99% usage. Is that a bad thing? Game is smooth, have it under an H115; temps max 65.

If you're getting the performance you desire, then don't worry about CPU usage being high. It's only a problem if it's causing performance issues.
Cheers Sean.
There are settings in games that are largely or partly CPU-dependent. A big one being draw distance-related settings, whether environmental, object or shadow-related. These involve extra draw calls for the system and the CPU is the processor that typically handles that. It's a big reason that open-world titles can often be more CPU-heavy than other types of games.
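To make the draw-distance point concrete, here is an illustrative sketch (not from the thread; the object density and distances are made-up numbers): each visible object costs the CPU a draw call, and in a flat open world the visible area, and so the number of submitted draw calls, grows with the square of the draw distance regardless of resolution.

```python
import math

OBJECTS_PER_SQUARE_KM = 500  # hypothetical object density for illustration

def draw_calls(draw_distance_km: float) -> int:
    """CPU-side draw calls per frame for a circular view of the given radius."""
    visible_area = math.pi * draw_distance_km ** 2
    return round(visible_area * OBJECTS_PER_SQUARE_KM)

for d in (0.5, 1.0, 2.0):
    print(f"draw distance {d} km -> ~{draw_calls(d)} draw calls/frame")
```

Doubling the draw distance in this toy model roughly quadruples the CPU's submission work per frame, which is why open-world titles lean on the CPU the way the post describes.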
And Kaap is wrong. CPUs don't 'take a breather' just because a GPU bottleneck is introduced. The CPU is still required to do the same workload it would if you were running at a lower resolution.
> I only game on a couple of 2160p monitors, what would I know lol.

Obviously not enough.
> I think you need to give this a break as you are making yourself look silly.

By correcting your inaccurate or exaggerated claims? Not sure how that makes me look silly.
Buying expensive stuff does not mean you understand the technology behind it better than others, unfortunately for you.
Again, you seem to misunderstand that hitting a GPU bottleneck at a high resolution DOES NOT mean the CPU is alleviated of any workload. The CPU does not care what resolution a piece of gaming software is running at. Just because it's not the performance bottleneck anymore doesn't mean it's taking a breather. It's simply not how it works. And hitting a GPU bottleneck at a higher resolution when the CPU was the bottleneck at a lower resolution simply means your performance is WORSE than it was before. This is not a better scenario to be in unless you are purposefully prioritizing IQ over framerate.
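The argument above can be sketched with a toy bottleneck model (all timing numbers here are hypothetical, chosen only for illustration): frame time is set by whichever processor is slower, the CPU's cost per frame is independent of resolution, and only the GPU's cost scales with pixel count.

```python
# Toy model: frame time = max(CPU time, GPU time). Numbers are made up.
CPU_MS_PER_FRAME = 7.0       # hypothetical per-frame CPU cost, resolution-independent
GPU_MS_PER_MEGAPIXEL = 3.0   # hypothetical GPU cost per million pixels

def fps_and_cpu_load(width: int, height: int) -> tuple[float, float]:
    gpu_ms = (width * height / 1e6) * GPU_MS_PER_MEGAPIXEL
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)    # the bottleneck sets frame time
    fps = 1000.0 / frame_ms
    cpu_load = CPU_MS_PER_FRAME / frame_ms      # fraction of each frame the CPU is busy
    return fps, cpu_load

for w, h in ((1920, 1080), (3440, 1440)):
    fps, load = fps_and_cpu_load(w, h)
    print(f"{w}x{h}: {fps:.0f} fps, CPU busy {load:.0%} of each frame")
```

In this sketch, 1080p is CPU-bound at ~143fps, while 3440x1440 becomes GPU-bound at a lower frame rate: the CPU's work *per frame* is unchanged, its utilisation only drops because fewer frames are being produced, and overall performance is worse, exactly the point being made above.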
But anybody who tells you your 4670K isn't good for 1080p or *anything else* has absolutely no idea what they're talking about, or is at the very least quite confused.