Must be something wrong with my eyes then, as I don't notice 50fps vs 120fps on my FreeSync screen and 290 when playing Division 1.
It's your eyes. The difference is noticeable to the point that anyone playing any kind of shooter at a high level online would very likely describe 50fps as unplayable. Most of the top streamers go into meltdown if fps tanks below 90, to the point that they restart their game. If you're good at a game and play for money, the extra gains Intel provides are pretty big. I personally find sub-50fps jarring but am happy with 60fps+, with 80-120 being ideal. The higher you go, the less noticeable the difference; the top end want 120fps+. I'm on a Dell 27" 1440p G-Sync. No substitute for fps, sync technology or not.
The i5 8400 is up to £200 now; who would buy that over the cheaper 2600? AMD must be getting a lot more custom thanks to these insane Intel prices.
The best way to describe it is that there are multiple ways a game gets smooth. One is screen quality, one is smoothness of motion and one is consistent delivery.
Adaptive sync takes care of consistent delivery. With big frame rate swings, 30fps one second and 90fps the next, what you really notice is the change in frame rate more than the frame rate itself; humans perceive differences easily. If you can smooth the transition from 30 to 90fps over a more gradual curve, decreasing or increasing frame time more smoothly, bingo: that is what adaptive sync actually achieves for us.
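To put rough numbers on that (my own Python illustration with made-up figures, not how any driver actually implements it): frame time is just 1000/fps in ms, and the difference between an abrupt jump and a gradual ramp is easy to see:

# Frame time in ms for a given fps: 1000 / fps
def frame_time_ms(fps):
    return 1000.0 / fps

# Abrupt jump: 33.3ms frames one moment, 11.1ms the next - a 22ms cliff.
abrupt = [frame_time_ms(30)] * 3 + [frame_time_ms(90)] * 3

# The same span with fps stepped up gradually - frame time descends smoothly.
ramped = [frame_time_ms(f) for f in (30, 40, 55, 70, 80, 90)]

print([round(t, 1) for t in abrupt])   # [33.3, 33.3, 33.3, 11.1, 11.1, 11.1]
print([round(t, 1) for t in ramped])   # [33.3, 25.0, 18.2, 14.3, 12.5, 11.1]

The numbers are invented; the point is that the second sequence has no single frame-to-frame jump big enough to register as a stutter.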
Smoothness of motion comes from frame rate itself. Imagine turning quickly in a game: if that turn is cut into 5 slices by a very low frame rate, you essentially get massive blur and a massive disconnect from one frame to the next. Cut it into 30 slices and it's better; cut it into 90 or 120 slices and it gets better and smoother still. Again, as with frame times, we notice the differences: by lessening the difference between each frame of the turn, we perceive it as smoother. A low frame rate isn't the same as a high frame rate just because you have adaptive sync.
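As a quick worked example of those slices (made-up numbers, Python just for illustration): take a 90 degree flick done over a quarter of a second:

# A 90 degree turn executed over 0.25 seconds.
TURN_DEGREES = 90.0
TURN_SECONDS = 0.25

for fps in (20, 30, 60, 120):
    frames = fps * TURN_SECONDS         # frames rendered during the turn
    per_frame = TURN_DEGREES / frames   # how far the view jumps each frame
    print(f"{fps:>3} fps: {frames:4.1f} frames, {per_frame:5.1f} deg per frame")

At 20fps the whole turn is 5 frames of 18 degrees each, a huge jump between images; at 120fps it's 30 frames of 3 degrees each, which is why the motion reads as smooth.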
Screen quality/refresh rate/responsiveness plays into both. A screen which can refresh faster provides a crisper image. That same turn at 30Hz means the screen is refreshing slower and has to hold each image for longer; this causes our eyes to 'remember' the frame, which can induce an after-image both in our own eyes and on the panel itself. The slower the refresh rate and the bigger the change in pixels, the slower the response time, so the image on the screen is less accurate, with more ghosting. Overdrive works the same way: the smaller the frame-to-frame change, the less work overdrive has to do, and the less overshoot and fewer artefacts there will be.
Higher refresh rate wins in every single way. It increases the crispness of the image on the screen, it reduces the slight burn-in effect in our own eyes, and it increases the smoothness of in-game motion. Adaptive sync tackles a single type of smoothness; it in no way negates the better all-round experience that high refresh rates provide. It lessens the jarring motion in games which have jarring frame rate changes, nothing more or less.
Your 1-5ms panels at 144Hz don't give the same performance when they're only running at 30Hz; you're getting either a lot of overdrive artefacts or significantly slower response times.
As for gains in frame rate, Intel provides very few. They only appear if you buy a £700 GPU and then run it at lower res/lower settings, and, well, not many people actually do that.
The difference in frame time between 30fps and 60fps is 16.67ms; the difference between 90 and 120fps is about 2.8ms, literally a sixth of that. There are monumentally diminishing gains as frame rate increases, and at any sensible resolution and settings you are 95% GPU limited. AMD has more than enough juice to be in the same ballpark, and that's before you get to the many, many people saying that the same game feels smoother on an AMD system. Honestly, it's something I've felt for some 15 years. I've had Intel since a 2500K and am waiting on Zen 2, as Zen came out right after I bought a house and Zen 2 was always going to be the biggest deal of the series with 7nm. But going way back to the times of P4Ds and the 2500 XP, every single AMD system I ever had felt marginally smoother in everything, but particularly in games.
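For anyone who wants to check that arithmetic, here's the frame-time maths in Python (trivially, 1000/fps in ms):

def frame_time_ms(fps):
    return 1000.0 / fps

for low, high in [(30, 60), (60, 90), (90, 120), (120, 144)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: frame time drops by {saved:.2f} ms")

# 30 -> 60 saves 16.67ms per frame; 90 -> 120 saves only 2.78ms,
# roughly a sixth as much - hence the diminishing returns.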
I don't in any way think Intel provides a better gaming experience. It only wins in benchmarks when you take a top-end card, pair it with dire settings no one would use, and push fps well past the point where the extra frames make any meaningful difference.
These price hikes are disgraceful: £250 for an 8600K, £400 for an 8700K, and yet the chips are full of security holes. It's a joke, and a rather bad one at that. The 9700K is probably going to be £450, with the 9900K at £550-580. With AMD on the rise they should be slashing prices, not increasing them.
Intel knows there are plenty of people who'll buy and advertise Intel regardless of how much sense the price makes for what you get.
Massive difference between 60 and 120/144/165 hertz... can even be seen and felt on the desktop :/
If you jerk the mouse from side to side, you will see the difference. But games are designed with controllers in mind, so the need to jerk the mouse is reduced greatly. You really don't need high FPS unless you are making a living playing CSGO.
It's fine not to game at a high level, I don't either.