Intel Price hike

Must be something wrong with my eyes then, as I don't notice 50fps vs 120fps on my FreeSync screen with a 290 when playing Division 1.

It's your eyes. The difference is noticeable to the point that anyone playing any kind of shooter at a high level online would very likely describe 50fps as unplayable. Most of the top streamers go into meltdown if fps tanks below 90, to the point that they restart their game. If you're good at a game and play for money, the extra gains Intel provides are pretty big. I personally find sub-50fps jarring but am happy with 60fps+, with 80-120 being ideal. The higher you go, the less noticeable the difference. The top end want 120fps+. I'm on a Dell 27" 1440p G-Sync monitor. There's no substitute for fps, sync technology or not.
 

The best way to describe it is that there are multiple ways a game feels smooth: one is screen quality, one is smoothness of motion, and one is consistency of delivery.

Adaptive sync takes care of consistent delivery. When the frame rate swings wildly, 30fps one second and 90fps the next, the thing you really notice is the change in frame rate more than the frame rate itself. Humans perceive differences easily. If you can smooth that transition from 30 to 90fps over a more gradual curve, with frame time decreasing or increasing smoothly rather than in a lurch... bingo, that is what adaptive sync achieves for us.
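
To put rough numbers on it (a quick Python sketch; the 30fps/90fps figures are just the example above):

    # Frame time in milliseconds for a given frame rate.
    def frame_time_ms(fps):
        return 1000.0 / fps

    # A swing from 90fps to 30fps is a jump of roughly 22ms per frame,
    # and that jump in frame time is what registers as stutter.
    print(frame_time_ms(30) - frame_time_ms(90))  # ~22.2ms
    # Adaptive sync refreshes the panel when each frame is actually ready,
    # rather than on a fixed tick, so the change in frame time is presented
    # as gradually as the GPU delivers it.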

Smoothness of motion comes from frame rate itself. Imagine turning quickly in a game: if that turn is cut into 5 slices at a very low frame rate, you essentially get massive blur and a massive disconnect from one frame to the next. Cut it into 30 slices and it's better; cut it into 90 or 120 slices and it gets better and smoother still. Again, as with frame times, we notice the differences: by lessening the difference between each frame of the turn, we perceive it as smoother. A low frame rate isn't the same as a high frame rate just because you have adaptive sync.
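
The slice arithmetic, written out (a minimal Python sketch; the 90-degree quarter-second flick is an assumed example):

    # How far the camera moves between consecutive frames during a quick turn.
    def degrees_per_frame(turn_degrees, turn_seconds, fps):
        frames = turn_seconds * fps  # number of "slices" the turn is cut into
        return turn_degrees / frames

    # A 90 degree flick lasting 0.25s:
    for fps in (20, 120):
        print(fps, degrees_per_frame(90, 0.25, fps))
    # 20fps  -> 5 slices, 18 degrees between frames (big visual jumps)
    # 120fps -> 30 slices, 3 degrees between frames (much smoother)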

Screen quality/refresh rate/responsiveness plays into both. A screen that can refresh faster provides a crisper image. During that turn at 30Hz the screen is refreshing slowly and has to hold the same image for longer, which causes our eyes to 'remember' the frame and can induce an after-image, both in our own eyes and in the panel itself. The slower the refresh rate and the bigger the change in pixels, the slower the response time, so the image on the screen is less accurate, with more ghosting. Overdrive works the same way: the smaller the change from frame to frame, the less work overdrive has to do, and the less overshoot and fewer artefacts there will be.
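
For the hold-time part, the numbers are just the refresh interval (a minimal Python sketch; the refresh rates are picked for illustration):

    # How long each image stays on screen at a given refresh rate.
    def hold_time_ms(refresh_hz):
        return 1000.0 / refresh_hz

    for hz in (30, 60, 144, 240):
        print(hz, round(hold_time_ms(hz), 1), "ms per refresh")
    # 30Hz  -> 33.3ms held per image: more persistence/after-image, bigger pixel changes to correct
    # 60Hz  -> 16.7ms
    # 144Hz -> 6.9ms
    # 240Hz -> 4.2ms held per image: each frame is replaced quickly, less smearing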

Higher refresh rate wins in every single way. It increases the crispness of the image on the screen, it reduces the slight burn-in effect in our own eyes, and it increases the smoothness of in-game motion. Adaptive sync tackles a single type of smoothness; it in no way negates the fact that high refresh rates provide a better experience all round. It lessens the jarring motion in games which have jarring frame rate changes, nothing more or less.


Your 1-5ms panels at 144Hz don't give the same performance when they're only running at 30Hz; you're getting either a lot of overdrive artefacts or significantly slower response times.

As for gains in frame rate, Intel provides very few. They only really show up if you buy a £700 GPU and then run it at lower resolution and lower settings, and, well, not many people actually do that.

The difference in frame time between 30fps and 60fps is 16.67ms; the difference between 90fps and 120fps is about 2.8ms, literally a sixth of that. There are monumentally diminishing gains as frame rate increases, and at any sensible resolution and settings you are 95% GPU limited, and AMD has more than enough juice to be in the same ballpark. That's before you get to the many, many people saying the same game feels smoother on an AMD system. Honestly it's something I've felt for some 15 years. I've had Intel since the 2500K and am waiting on Zen 2, as Zen came out right after I bought a house and Zen 2 was always going to be the biggest deal of the series with 7nm. But ever since, way back in the times of P4Ds and the 2500 XP, every single AMD system I ever had felt marginally smoother in everything, but particularly in games.
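
The diminishing-returns arithmetic in full (a quick Python sketch of the same figures):

    # Frame time saved per frame when moving between two frame rates.
    def frame_time_gain_ms(fps_low, fps_high):
        return 1000.0 / fps_low - 1000.0 / fps_high

    print(frame_time_gain_ms(30, 60))   # 16.67ms saved per frame
    print(frame_time_gain_ms(90, 120))  # ~2.78ms saved per frame, about a sixth as much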

I don't in any way think Intel provides a better gaming experience. It only wins in benchmarks when you take a top-end card, pair it with dire settings no one would use, and push fps well past the point where the extra frames make any meaningful difference.
 

It doesn't for most people; for others it does. Who wants mostly idle threads when you're sacrificing fps due to clock speed? Sure, do those extra few fps make much difference? Not really. Would you take the extra fps over idle threads doing nothing if all you do is game? Yes. It then becomes a question of whether the extra is worth the money.

The other point that seems to get ignored is that by the time 8/16 actually becomes recommended for games, I can guarantee the stuff out now will be classed as pretty average. There will be lower-end 8/8 parts available that will probably be far quicker.

I'd happily buy AMD come upgrade time, but not till they get that core speed up; it's just too much of a deficit. I don't buy the whole upgrade-path argument either, as next round i7 users can simply drop in an i9 when they're cheaper.

If AMD get competitive with Zen 2 core for core, watch the price rocket (although still cheaper, no doubt).
 
These price hikes are disgraceful: £250 for an 8600K, £400 for an 8700K, yet they're full of security holes. It's a joke, and a rather bad one at that. The 9700K is probably going to be £450, with the 9900K at £550-580. With AMD on the rise they should be slashing prices, not increasing them.
 

Just sticking to the old formula of supply and demand. Even AMD did it during the mining boom, when a Vega card was costing £1,000. Ridiculous, but the mining performance was limited to those cards. In this case, however, due to the competition it could actually harm them. This is probably not the best time to be charging a premium when people can get the performance somewhere else.
 
Supply and demand. Intel can't supply because they have too much crammed onto 14nm, so retailers push prices up with limited stock. In the long run it will hurt Intel, as people will just buy AMD.
 
Intel Whiskey Lake Shortage Impacting Notebook Supply, Ryzen Mobile Plentiful - Reports
https://www.tomshardware.co.uk/intel-14nm-cpu-shortage-whiskey-lake,news-59143.html

"TrendForce expects Intel's worsening 14nm CPU shortage to impact notebook shipments during the lucrative holiday season as Whiskey Lake processors are falling behind schedule."
"The company's 10nm production hasn't ramped as expected, which is pushing unanticipated demand back to overbooked 14nm production lines."
 
These price hikes are disgraceful: £250 for an 8600K, £400 for an 8700K, yet they're full of security holes. It's a joke, and a rather bad one at that. The 9700K is probably going to be £450, with the 9900K at £550-580. With AMD on the rise they should be slashing prices, not increasing them.
Intel knows there are plenty of people who'll buy and advertise Intel regardless of whether the price makes sense for what you get.
 
It's your eyes. The difference is noticeable to the point that anyone playing any kind of shooter at a high level online would very likely describe 50fps as unplayable. Most of the top streamers go into meltdown if fps tanks below 90, to the point that they restart their game. If you're good at a game and play for money, the extra gains Intel provides are pretty big. I personally find sub-50fps jarring but am happy with 60fps+, with 80-120 being ideal. The higher you go, the less noticeable the difference. The top end want 120fps+. I'm on a Dell 27" 1440p G-Sync monitor. There's no substitute for fps, sync technology or not.

Massive difference between 60 and 120/144/165 hertz... can even be seen and felt on the desktop :/
 
Must be something wrong with my eyes then, as I don't notice 50fps vs 120fps on my FreeSync screen with a 290 when playing Division 1.
If you jerk the mouse from side to side, you will see the difference. But games are designed with controllers in mind, so the need to jerk the mouse is reduced greatly. You really don't need high FPS unless you are making a living playing CSGO.

On my 1440p/60Hz panel, shaking the mouse from side to side will show gaps of up to 2" between the mouse pointers. At 120Hz that gap will still be 1" and at 240Hz, 0.5".
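
Those gaps are just pointer speed divided by refresh rate (a minimal Python sketch; the ~120 inches-per-second sweep speed is an assumption that reproduces the figures above):

    # Distance the pointer jumps between consecutive refreshes.
    def pointer_gap_inches(sweep_speed_in_per_s, refresh_hz):
        return sweep_speed_in_per_s / refresh_hz

    sweep = 120.0  # inches per second, a fast side-to-side shake (assumed)
    for hz in (60, 120, 240):
        print(hz, pointer_gap_inches(sweep, hz))
    # 60Hz  -> 2.0 inches between pointer positions
    # 120Hz -> 1.0 inch
    # 240Hz -> 0.5 inches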
 
I game at 240Hz.

At the time of the switch I didn't notice a visual difference when I swapped from 60Hz, but with no other changes to my setup my gameplay improved instantly and demonstrably.

Now I can tell, visually and by feel, when the display settings drop below 240 to 144 or lower, and I'll stop to change it back.

Seems odd saying it like that, but it's how I found it. I would have thought I'd notice the switch to 240, but it's more of a "you miss it when it's gone" kind of experience.

TLDR:
The initial change from 60 to 240Hz wasn't noticeable visually to me, but it showed in my gameplay. After using 240Hz, dropping to settings below 240 is immediately obvious in feel and look.
 