Intel to launch 6 core Coffee Lake-S CPUs & Z370 chipset 5 October 2017

Soldato
Joined
28 May 2007
Posts
18,257
The 1800X would be a tempter too at those prices. An 8-core, 16-thread chip for less than what the 6700K cost six months ago.

Are people getting better memory speeds from the 1800X over the 1700?
 
Soldato
Joined
16 May 2007
Posts
3,220
The 1800X would be a tempter too at those prices. An 8-core, 16-thread chip for less than what the 6700K cost six months ago.

Are people getting better memory speeds from the 1800X over the 1700?

The RAM speeds seem similar; the type and quality of RAM has more influence in my experience, as does the motherboard in general.

Mine is completely stable at 3466 C15. It boots at 3600 but crashes after a while under any real load.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
The 1800X would be a tempter too at those prices. An 8-core, 16-thread chip for less than what the 6700K cost six months ago.

Are people getting better memory speeds from the 1800X over the 1700?

It depends on the motherboard too. Both of ASRock's X370 boards (Taichi & Pro Gaming) are very strong; others vary, and some are unfortunately weaker in this area (e.g. the CH6).
AGESA 1007 comes out shortly as well, so we might see better behaviour from the IMCs and at last manage the 4000MHz that AGESA 1006 initially brought to the platform.

That would add another 10-12% over 3466 in games.
 
Associate
Joined
27 Apr 2007
Posts
963
Actually, given the prices of the 6900K/6950X, even used ones, it isn't worth upgrading to them if you have a 6800K, for example. It's better to sell and build from scratch.
You are missing the point that some people are allergic to upgrading the platform, so in their case jumping to a 6900K, or even a Xeon if they don't require high clock speeds, is the preferred option.
Intel don't like to cut prices on older chips unless they have to, and with orphaned platforms there is less pressure to.
Most of their chips go to OEMs in bulk orders, so as far as they are concerned the retail prices aren't a big deal when you look at the big picture.
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
It depends on the motherboard too. Both of ASRock's X370 boards (Taichi & Pro Gaming) are very strong; others vary, and some are unfortunately weaker in this area (e.g. the CH6).
AGESA 1007 comes out shortly as well, so we might see better behaviour from the IMCs and at last manage the 4000MHz that AGESA 1006 initially brought to the platform.

That would add another 10-12% over 3466 in games.

We won't be seeing anywhere near 4000MHz. Hell, most CPUs can't even do that, let alone the IMC.
 
Soldato
Joined
22 Apr 2016
Posts
3,432
A lot of AAA titles are going to more than 4 cores now.
BF1, Overwatch, Destiny 2 and The Division, just to name a few.
I was referring to software in general rather than games. You say AAA titles are 'going to more than 4 cores', but most existing/older games will still run at very high fps on a four-core i7. It's hardly a pressing case for more than four cores when you can still get ridiculous fps from a four-core and only a few more from a six/eight-core.
 
Soldato
Joined
19 Feb 2011
Posts
5,849
I was referring to software in general rather than games. You say AAA titles are 'going to more than 4 cores', but most existing/older games will still run at very high fps on a four-core i7. It's hardly a pressing case for more than four cores when you can still get ridiculous fps from a four-core and only a few more from a six/eight-core.

This needs to be put into perspective though...

200fps on a 60Hz 1080p screen is useless, and 200fps on a 165Hz or 144Hz screen is still pretty useless. Add in adaptive sync tech: as long as you're within its range, who cares if you're at 200fps? If you're capping your fps to stay within that range, your 7700K is effectively wasting fps at that point, since adaptive sync + your GPU are keeping the game silky smooth, and you don't really have much room for other tasks on a 4c/8t chip.

Swap to a 6c/12t or 8c/16t chip and you might take a hit in overall fps, but who cares as long as you're still within your adaptive sync range? All those extra cores can then be put to use streaming your game to Twitch/YouTube etc, or doing other stuff. Software doesn't really need to catch up if people get the chance to do other stuff while they game. Dunno about you, but I play a lot of MMOs, which are notoriously CPU bound, and quite often I read websites for info while playing (raid tactics etc) or watch YouTube videos and whatnot. You're limited in this scenario on an i7, and on an i5 it's just painful. More cores alleviate these issues.
 
Soldato
Joined
22 Oct 2008
Posts
11,493
Location
Lisburn, Northern Ireland
This needs to be put into perspective though...

200fps on a 60Hz 1080p screen is useless, and 200fps on a 165Hz or 144Hz screen is still pretty useless. Add in adaptive sync tech: as long as you're within its range, who cares if you're at 200fps? If you're capping your fps to stay within that range, your 7700K is effectively wasting fps at that point, since adaptive sync + your GPU are keeping the game silky smooth, and you don't really have much room for other tasks on a 4c/8t chip.

Swap to a 6c/12t or 8c/16t chip and you might take a hit in overall fps, but who cares as long as you're still within your adaptive sync range? All those extra cores can then be put to use streaming your game to Twitch/YouTube etc, or doing other stuff. Software doesn't really need to catch up if people get the chance to do other stuff while they game. Dunno about you, but I play a lot of MMOs, which are notoriously CPU bound, and quite often I read websites for info while playing (raid tactics etc) or watch YouTube videos and whatnot. You're limited in this scenario on an i7, and on an i5 it's just painful. More cores alleviate these issues.

Bang on the money.

I remember in the 2000s people talking about how dual cores were the mainstream and how quad cores were "overkill". Now quads are a minimum, with 6 or 8 cores preferred. Developers are making use of more cores, and that's how it'll go over the next few years.

MOAR CORES!

Being able to play games AND do other stuff (streaming, web browsing, etc) will become more and more popular, so more cores will facilitate that way of using a PC in the future too.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
This needs to be put into perspective though...

200fps on a 60Hz 1080p screen is useless, and 200fps on a 165Hz or 144Hz screen is still pretty useless. Add in adaptive sync tech: as long as you're within its range, who cares if you're at 200fps? If you're capping your fps to stay within that range, your 7700K is effectively wasting fps at that point, since adaptive sync + your GPU are keeping the game silky smooth, and you don't really have much room for other tasks on a 4c/8t chip.

Swap to a 6c/12t or 8c/16t chip and you might take a hit in overall fps, but who cares as long as you're still within your adaptive sync range? All those extra cores can then be put to use streaming your game to Twitch/YouTube etc, or doing other stuff. Software doesn't really need to catch up if people get the chance to do other stuff while they game. Dunno about you, but I play a lot of MMOs, which are notoriously CPU bound, and quite often I read websites for info while playing (raid tactics etc) or watch YouTube videos and whatnot. You're limited in this scenario on an i7, and on an i5 it's just painful. More cores alleviate these issues.
I'm curious how you can multitask whilst using G-Sync or FreeSync though. For me anyway, most games lose a lot of FPS when played in borderless window mode compared to fullscreen and they also have nowhere near as smooth FPS. I think FreeSync support still isn't as good for windowed mode games. I usually have to play in fullscreen and then suffer the multiple-second alt-tabs when doing something outside of a game.

I feel like this is an advantage of using a Linux GPU pass-through VM solution since the Windows VM can be running a game full-screen constantly and a quick KVM switch (via a keyboard shortcut) gets you back to Linux instantly without interrupting the game.
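For anyone curious, a rough sketch of what that pass-through setup involves on the host side: the guest GPU has to be reserved for the vfio-pci driver so the host never initialises it. The PCI address (01:00.0) and the vendor:device IDs below are placeholders for illustration only; your own come from `lspci -nn`.

```shell
# /etc/default/grub - enable the IOMMU (use amd_iommu=on on AMD boards)
GRUB_CMDLINE_LINUX="intel_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf - claim the guest GPU and its HDMI audio
# function for VFIO before the host GPU driver loads (placeholder IDs)
options vfio-pci ids=10de:1b81,10de:10f0

# After rebuilding the initramfs and rebooting, confirm the binding:
lspci -nnk -s 01:00.0   # "Kernel driver in use" should read vfio-pci
```

The VM then receives the card via a libvirt `<hostdev>` entry, and input can be toggled between host and guest with evdev pass-through or a hardware KVM switch, presumably the keyboard-shortcut switch described above.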
 
Soldato
Joined
19 Feb 2011
Posts
5,849
I'm curious how you can multitask whilst using G-Sync or FreeSync though. For me anyway, most games lose a lot of FPS when played in borderless window mode compared to fullscreen and they also have nowhere near as smooth FPS. I think FreeSync support still isn't as good for windowed mode games. I usually have to play in fullscreen and then suffer the multiple-second alt-tabs when doing something outside of a game.

I feel like this is an advantage of using a Linux GPU pass-through VM solution since the Windows VM can be running a game full-screen constantly and a quick KVM switch (via a keyboard shortcut) gets you back to Linux instantly without interrupting the game.

By using more than one screen on the GPU? Also, adaptive sync is usually a range; my BenQ's is 45-144Hz, so as long as the game sits in that range, and I fps-lock it at the top end, I don't care what fps it's running at as long as it doesn't drop out of the range. It could run at 60fps, 90fps, 83fps etc; it doesn't matter, adaptive sync is doing its thing.
 
Permabanned
Joined
23 Apr 2014
Posts
23,553
Location
Hertfordshire
I tend to run full screen on the G-Sync display and have monitoring apps, web pages and videos/football running on the other screen.

If I need to switch the pages/videos I just alt-tab out, which, given the performance of modern multicore CPUs, works quite well. Though a few games still don't like alt-tabbing much.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
I tend to run full screen on the G-Sync display and have monitoring apps, web pages and videos/football running on the other screen.

If I need to switch the pages/videos I just alt-tab out, which, given the performance of modern multicore CPUs, works quite well. Though a few games still don't like alt-tabbing much.
There should be a proper way to lock/unlock the mouse within fullscreen applications. I tried a program called Actual Multiple Monitors, but it just made my FPS tank, and I think the mouse stayed on-screen in-game too.
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
This needs to be put into perspective though...

200fps on a 60Hz 1080p screen is useless, and 200fps on a 165Hz or 144Hz screen is still pretty useless. Add in adaptive sync tech: as long as you're within its range, who cares if you're at 200fps? If you're capping your fps to stay within that range, your 7700K is effectively wasting fps at that point, since adaptive sync + your GPU are keeping the game silky smooth, and you don't really have much room for other tasks on a 4c/8t chip.

Swap to a 6c/12t or 8c/16t chip and you might take a hit in overall fps, but who cares as long as you're still within your adaptive sync range? All those extra cores can then be put to use streaming your game to Twitch/YouTube etc, or doing other stuff. Software doesn't really need to catch up if people get the chance to do other stuff while they game. Dunno about you, but I play a lot of MMOs, which are notoriously CPU bound, and quite often I read websites for info while playing (raid tactics etc) or watch YouTube videos and whatnot. You're limited in this scenario on an i7, and on an i5 it's just painful. More cores alleviate these issues.

Let's not forget frame times here. As shown in a few games, Ryzen is smoother than an i7 due to having more headroom. Not in all of them though; in fact quite the opposite happens in some, but I assume that's down to clock speed/IPC.
 
Caporegime
Joined
17 Mar 2012
Posts
47,650
Location
ARC-L1, Stanton System
CPU cores are like GPU memory: when the nVidia GTX 680 had 2GB, the 3GB on the HD 7970 was overkill and wouldn't be needed for many decades... the 290X had 4GB, that was too much, the 3GB 780 Ti was perfect... remember all that?

As apparent Hardware Enthusiasts, some of us are a pretty strange bunch, arguing till blue in the face that less hardware for more money is better.
 
Soldato
Joined
19 Oct 2008
Posts
5,951
CPU cores are like GPU memory: when the nVidia GTX 680 had 2GB, the 3GB on the HD 7970 was overkill and wouldn't be needed for many decades... the 290X had 4GB, that was too much, the 3GB 780 Ti was perfect... remember all that?

As apparent Hardware Enthusiasts, some of us are a pretty strange bunch, arguing till blue in the face that less hardware for more money is better.
There's that saying: opinions, like *******, everyone has one :). And they are just opinions, not facts.
Regarding the GPU memory, it also comes down to the compression used. And which was the overall "optimal" solution, the 290X or the 780 Ti? Nobody knows; it depends on exact usage, which reviews won't even cover. I'd rather have the faster GPU for my resolution with less memory, as long as it's sufficient for the next year or two at that resolution - I don't keep a GPU longer than that.
All IMO... burp :D
 