Intel to launch 6 core Coffee Lake-S CPUs & Z370 chipset 5 October 2017

1440p is just an interim resolution imo.

Problem is, for most users, once you get to around 4K there is less and less benefit to the extra resolution in terms of additional screen estate. Windows really doesn't take advantage of the extra resolution very well, for instance by increasing the number of pixels used to represent text and icons, and its scaling in general is patchy. A lot of desktop users still do things where you are working at the pixel scale as well, which becomes more complicated at those kinds of resolutions.
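To put a rough number on the screen-estate point, here's a quick sketch using nothing but the standard Windows scaling steps on a 4K panel (the figures are illustrative, not from any particular monitor):

# Rough sketch: effective layout space at a given Windows DPI scaling step.
def effective_workspace(width, height, scale):
    return int(width / scale), int(height / scale)

for scale in (1.0, 1.25, 1.5, 2.0):
    w, h = effective_workspace(3840, 2160, scale)
    print(f"4K at {int(scale * 100)}% scaling ~ {w}x{h} of usable layout space")

# At 150% a 4K screen offers roughly the layout space of a native 1440p one,
# so the extra pixels mostly buy sharpness rather than extra screen estate.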

Everyone will have a somewhat different appreciation and usage at these kinds of resolutions, but personally I've gone back to 1440p for most uses, only swapping over to my 4K monitor when playing some games with a controller, watching 4K movies, etc.
 
It is, but no matter how you look at it, at 144Hz you need every bit of power you can get. Even if this is only a 10fps difference for an extra £100 or whatever, people will want it.

Well it depends how good you are I suppose, but I'll take increased image quality after a point. If you really want the best feels and money is no object then find a Sony GDM-FW900 and have it refurbished. Nothing will touch that monitor for feels.
 
85Hz on a CRT is a whole different ball game to an LCD monitor though - even 160-200Hz LCDs don't have the motion clarity of a good CRT, which can have pixel persistence in the microseconds, while even with ULMB the very best case is 1ms, and it's typically more like 6-7ms for a fast LCD and 16+ms for a slow one.
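Purely to illustrate the scale of that difference, a simple sample-and-hold estimate (blur width ~ persistence x on-screen motion speed; the motion speed and the exact persistence figures are assumptions, the rest are the numbers above):

# Illustrative only: approximate smear width from pixel persistence,
# using blur_px ~ persistence in seconds * motion speed in px/s (sample-and-hold model).
def blur_px(persistence_ms, motion_px_per_s=1000):
    return persistence_ms / 1000 * motion_px_per_s

for label, ms in [("CRT (~0.05ms)", 0.05), ("ULMB best case (~1ms)", 1.0),
                  ("fast LCD (~6ms)", 6.0), ("slow LCD (~16ms)", 16.0)]:
    print(f"{label}: ~{blur_px(ms):.1f}px of smear at 1000px/s")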
 
I urge you to try a 144Hz monitor if you can. Resolution won't mean as much to you then. Even if you're not a gamer the difference can be felt on the desktop.
Yeah, maybe one day. It's not that I can't afford to buy one, I just can't justify it at the minute. I'm not gaming anywhere near as much as I used to, and to be honest what I have already does the trick.
 
I'm l33t as at 75Hz with sub-8ms. I'm a little bit more l33t with a CRT at 85Hz. With two beta blockers and 20 cans of Red Bull down I might do slightly better at *maybe* 95Hz and might get one more KTD.

Increasing the refresh rate of an LCD past 90Hz is pretty pointless when you can have a fundamentally better technology from the start. Anyone who is truly super sensitive to response time would notice just how gooey LCDs are, and the higher the refresh rate, the more apparent all the other LCD shortcomings would become. I have never met anyone that sensitive, but I've met a lot of people that claim to be. Sorry, I had this argument with a couple of *feelers* a week ago. But either way, if you want the best feels, buying a CRT is the first place to put your money.
 
CRT isn't an option for a lot of people and certainly not me. Whilst I'm not an elitist, I do appreciate the difference between 60 and 144Hz even if I'm on an LCD.
 
I wouldn't worry about it. 99.95% of people will not notice the difference and 90% of today's games suck for above 100FPS anyway. Most games fall apart at 60FPS TBH, but if you really want the very best for the hell of it and money is no object, buy a CRT and then build a system tuned around low latency.
 
I'm pretty happy with my G-Sync Dell, however I wouldn't pay today's prices for one given they were sub-£400 when I got mine. 1440p 144Hz is still the sweet spot I think. For photo editing and slower games/non-FPS titles I'd probably prefer IPS though. There are more than a few TN owners who haven't even tried to calibrate their monitors who moan about the washed-out look; drop the gamma slightly and it's vastly improved. Saying that, perhaps there's more difference between 8-bit and 6-bit panels than I remember as well.
 
No, the i5 8400 is listed at the same price as the i3 8350K. That can't be correct, as why would you get the i3 when you can have two more cores for the same money?

For the old generation the similar price perhaps made a bit of sense, as there was the same number of threads. For Coffee Lake it doesn't seem to make sense, especially as the 8400 boosts to 3.8GHz all-core or 4.0GHz single-core. Why would anyone buy the 8350K?
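Quick back-of-envelope comparison, just multiplying cores by all-core clock as a very crude throughput proxy (same IPC assumed, and the 8350K overclock figure is only an example):

# Crude throughput proxy: cores * all-core clock (GHz), assuming similar IPC.
cpus = {
    "i5 8400 (6c/6t, 3.8GHz all-core)": 6 * 3.8,
    "i3 8350K stock (4c/4t, 4.0GHz)": 4 * 4.0,
    "i3 8350K overclocked to 4.8GHz (assumed)": 4 * 4.8,
}
for name, core_ghz in cpus.items():
    print(f"{name}: {core_ghz:.1f} core-GHz")

# Even with a healthy overclock the four-core part trails the 8400 on this crude
# metric, which is why the same price is hard to justify.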

The Core i3 7350K was originally priced close to a locked Core i5 7400 too:

https://ark.intel.com/products/97527/Intel-Core-i3-7350K-Processor-4M-Cache-4_20-GHz
https://ark.intel.com/products/97147/Intel-Core-i5-7400-Processor-6M-Cache-up-to-3_50-GHz
 
buy a CRT and then build a system tuned around low latency

Most of the decent gaming CRTs these days are well over 10, more like 20+, years old sadly and usually on their way out - you might be able to patch them up a couple more times but their day is mostly done unless you get a bit lucky. Some live on forever heh - I've got an old 800x600 (colour) Wyse CRT, date stamped 1984, in storage that last I checked was still working fine.

Sadly the ones that combine decent resolution with good refresh rates are pretty expensive as well, since those who do have them tend to know their value.

Despite the total crazy BS you were accusing me of before, hey look, I've been posting about this stuff for years: https://forums.overclockers.co.uk/posts/14817180 and https://forums.overclockers.co.uk/posts/13134030 etc.
 

I know, but as I pointed out in my post that you quoted, they at least had the same thread count (2c/4t vs 4c/4t). Also the clocks were much higher on the i3. However, this time the i5 has two more cores (6c/6t vs 4c/4t) and still has a max boost of 4.0GHz.
 
This is Intel - they seem to add some weird tax for overclocking. They are probably scared more people would buy a bargain-basement Core i3 8350K over the Core i5 8600K.
 
I'd go so far as to say Intel are anti-enthusiast and are pre-overclocking their chips.

OFC they are - look at your setup for example. The Xeon E3 CPUs used to work in normal consumer motherboards, as I have a Xeon E3 V2, and they changed that with the "new" C232 chipset.

Then when the socket 1151 V1 platform had the overclocking "bug", they subsequently fixed it with a "new" release, aka Kaby Lake.
 
That CPU-Z benchmark makes no sense. It's also fairly unreliable since different versions give different performance figures and it shows Ryzen in a better light than most other benchmarks, for example. Why would it have roughly the same multi-core performance as Kaby Lake (with the i7-7700K being 21% faster at the same clock, due to SMT) but 15% better single-core performance per clock?

Waiting for actual reviews.
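A quick sanity check on why those two figures don't sit together, using only the percentages quoted above (the assumption that a 4c/4t chip's multi-thread score scales with its single-thread score is mine):

# Sanity check on the leaked CPU-Z figures.
kaby_single = 100.0                   # normalised Kaby Lake single-thread per clock
coffee_single = kaby_single * 1.15    # leak implies +15% single-thread per clock

kaby_multi = kaby_single * 4          # 4c/4t, assuming near-ideal scaling (no SMT)
coffee_multi = coffee_single * 4

print(f"Expected 4c/4t multi-thread gain: {coffee_multi / kaby_multi - 1:.0%}")
# Prints ~15%, yet the leak shows roughly the same multi-core score as Kaby Lake,
# which is why the benchmark looks suspect.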
 
Assuming those figures are true, that is what, about a 6% performance increase over Skylake at identical clock speeds? I thought Intel were saying an 11% improvement over Kaby Lake.
 
For cherry-picked benchmarks, sure.

There will be some cases where you get 11%, others where you don't.
 
So wccftech have stated that:
In multi-threading, the Core i7-7700K scores 2648 points but we can see that it is solely due to the higher thread count and within the range that i3 can easily overcome with a slight overclock.
erm, ok, seems legit. A 6-11% IPC increase (maybe) makes up for half the threads.
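For what it's worth, plugging the thread's own numbers in shows how "slight" that overclock would need to be (the 21% SMT figure comes from the CPU-Z discussion above, the 7700K all-core clock is an assumption, and the IPC gain is the claimed 6-11%):

# Clock the 4c/4t i3 8350K would need to match a 4c/8t i7-7700K in CPU-Z multi-thread.
i7_clock = 4.5        # GHz, assumed 7700K all-core turbo
smt_gain = 0.21       # 7700K ~21% faster at the same clock thanks to SMT (from above)
for ipc_gain in (0.06, 0.11):
    needed = i7_clock * (1 + smt_gain) / (1 + ipc_gain)
    print(f"With +{ipc_gain:.0%} IPC the i3 needs ~{needed:.1f}GHz to draw level")

# Comes out around 4.9-5.1GHz from a stock clock of about 4GHz - not exactly a
# "slight" overclock.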
 