Intel to launch 6-core Coffee Lake-S CPUs & Z370 chipset on 5 October 2017

A 1070 or 1080 will probably be faster than a 1080 Ti at this resolution. This is the case in PUBG as well. Once you go over 1440p you see the 1080 Ti stretch its legs.
 
Thanks guys, I've been looking on YouTube for similar setups, but none have Afterburner running. I'm not sure how a 1070 could be faster than a 1080 Ti at any resolution; it doesn't make sense.
 
I assume this benchmark leak (8700K 51% faster than 7700K in multitasking) has been discussed here. Unless I'm missing something, this is awful! It's a 51% increase in performance based on adding 50% more cores. So in essence, the 8700K is just a 7700K with two more cores. Have I missed something here, or is this really what Intel are bringing to the table?
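
To spell the arithmetic out (a quick sketch in Python, assuming the leaked 51% figure and nothing more than linear scaling from the two extra cores):

    # Rough scaling check on the leaked 8700K vs 7700K multitasking numbers.
    cores_7700k = 4
    cores_8700k = 6
    observed_uplift = 0.51  # leaked multitasking gain of the 8700K over the 7700K

    # If performance scaled purely with core count, the expected uplift would be:
    expected_from_cores = cores_8700k / cores_7700k - 1  # 0.50

    # Whatever is left over would have to come from clocks or IPC per core.
    per_core_gain = (1 + observed_uplift) / (1 + expected_from_cores) - 1
    print(f"expected from cores alone: {expected_from_cores:.0%}")  # 50%
    print(f"implied per-core gain:     {per_core_gain:.1%}")        # ~0.7%

On those numbers the per-core improvement is well under 1%, which is exactly the complaint being made.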
 
Thanks guys, I've been looking on YouTube for similar setups, but none have Afterburner running. I'm not sure how a 1070 could be faster than a 1080 Ti at any resolution; it doesn't make sense.

Could this be related to the generally poor state of Nvidia drivers for higher thread-count CPUs? In some titles I play, even Nvidia Experience will recommend medium settings for my two high-clocked 1080s with my 1800X @ 4.0, versus the same thing recommending all ultra on my 6700K.
 
I assume this benchmark leak (8700K 51% faster than 7700K in multitasking) has been discussed here. Unless I'm missing something, this is awful! It's a 51% increase in performance based on adding 50% more cores. So in essence, the 8700K is just a 7700K with two more cores. Have I missed something here, or is this really what Intel are bringing to the table?

Hardly awful when you look at the alternatives from all the other manufacturers, that being one other.
 
I assume this benchmark leak (8700K 51% faster than 7700K in multitasking) has been discussed here. Unless I'm missing something, this is awful! It's a 51% increase in performance based on adding 50% more cores. So in essence, the 8700K is just a 7700K with two more cores. Have I missed something here, or is this really what Intel are bringing to the table?

A 50% performance increase for the same/similar price sounds pretty good to me.
 
Could this be related to the generally poor state of Nvidia drivers for higher thread-count CPUs? In some titles I play, even Nvidia Experience will recommend medium settings for my two high-clocked 1080s with my 1800X @ 4.0, versus the same thing recommending all ultra on my 6700K.

I thought this rumour of Nvidia having poor performance on higher-threaded CPUs had been put to bed.
It only affected Tomb Raider as far as I can see.
 
I think this video proves my point, and it's what is influencing my decision to ditch Ryzen for Coffee Lake.


http://gamegpu.com/action-/-fps-/-tps/destiny-beta-test-gpu-cpu

Now, we know this engine uses multi-core CPUs very well: 8 cores on mine and 16 on amigafans' Threadripper.
Yet the 1800X, despite having double the cores and threads, is being beaten by a 6700 clocked 200MHz lower.
How many times are we going to hear that the engine isn't optimised, or other excuses?
This is probably one of the better engines out there at using all the available cores, and yet Ryzen is still behind Intel.

There is a slide showing a 1070, and things are a lot more even there, so it might be the recording software/settings I was using that caused my GPU usage to drop,
but it still doesn't fill me with confidence going forward with faster GPUs when an Intel quad from 2015, clocked 200MHz lower, is beating an 8-core Ryzen.
When Volta comes this gap will only increase. It appears it isn't just clock speed that's hurting Ryzen, it's also IPC.

As I said above, if you're gaming at higher resolutions or at 60Hz it won't matter too much, but for me at 165Hz the 20fps increase the 6700 is giving is enough to make me jump ship.
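
For what it's worth, the clock-versus-IPC point at the end can be sanity-checked with simple arithmetic. A minimal sketch in Python; the clocks mirror the "4.0 vs 200MHz lower" description above, but the fps figures are made-up placeholders, not the gamegpu results:

    # Split a frame-rate gap into a clock-speed part and a per-clock (IPC/engine) part.
    # NOTE: hypothetical example numbers, purely for illustration.
    intel_fps, intel_clock_ghz = 120.0, 3.8   # e.g. a 6700-class quad, 200MHz lower
    ryzen_fps, ryzen_clock_ghz = 100.0, 4.0   # e.g. an 1800X at 4.0GHz

    clock_ratio = intel_clock_ghz / ryzen_clock_ghz                      # < 1: Intel clocked lower
    per_clock_ratio = (intel_fps / intel_clock_ghz) / (ryzen_fps / ryzen_clock_ghz)

    print(f"clock ratio (Intel/Ryzen): {clock_ratio:.2f}")      # 0.95
    print(f"per-clock fps ratio:       {per_clock_ratio:.2f}")  # 1.26
    # If the per-clock ratio is above 1 while the clock ratio is at or below 1,
    # the gap cannot be explained by clock speed alone.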
 
How many times are we going to hear that the engine isn't optimised, or other excuses?

I think it is a bit of both. The reality is there are a lot of game engines that are less than optimal for multi-threading, and partly because of the complexity and highly serialised nature of some parts of your average game engine, that is going to be the story for quite a while.
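
The "highly serialised parts" point is essentially Amdahl's law. A minimal sketch (the 70% parallel fraction is an illustrative guess, not a measurement of any engine):

    # Amdahl's law: speedup on n cores when only a fraction p of the work parallelises.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    # Illustrative only: if 70% of a frame's CPU work runs in parallel,
    # going from 4 to 8 or 16 cores gains far less than the core count suggests.
    for n in (4, 6, 8, 16):
        print(n, round(amdahl_speedup(0.70, n), 2))
    # prints: 4 2.11, 6 2.4, 8 2.58, 16 2.91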
 
I think this video proves my point, and it's what is influencing my decision to ditch Ryzen for Coffee Lake.


http://gamegpu.com/action-/-fps-/-tps/destiny-beta-test-gpu-cpu

Now, we know this engine uses multi-core CPUs very well: 8 cores on mine and 16 on amigafans' Threadripper.
Yet the 1800X, despite having double the cores and threads, is being beaten by a 6700 clocked 200MHz lower.
How many times are we going to hear that the engine isn't optimised, or other excuses?
This is probably one of the better engines out there at using all the available cores, and yet Ryzen is still behind Intel.

There is a slide showing a 1070, and things are a lot more even there, so it might be the recording software/settings I was using that caused my GPU usage to drop,
but it still doesn't fill me with confidence going forward with faster GPUs when an Intel quad from 2015, clocked 200MHz lower, is beating an 8-core Ryzen.
When Volta comes this gap will only increase. It appears it isn't just clock speed that's hurting Ryzen, it's also IPC.

As I said above, if you're gaming at higher resolutions or at 60Hz it won't matter too much, but for me at 165Hz the 20fps increase the 6700 is giving is enough to make me jump ship.

Ryzen is being used with 2666MHz RAM here, for heaven's sake. This is August 2017, not March. Anything less than 3200MHz is a crooked benchmark for Ryzen, costing it around 20% performance.

Even between 3000 and 3466 the difference is 10%+:
https://community.amd.com/community...emory-oc-showdown-frequency-vs-memory-timings
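
To put a rough number on that (a quick sketch; both frequencies are the ones quoted in the post, and the ~20% performance figure is the poster's claim rather than a measurement):

    # How big is the memory-clock gap being complained about?
    ddr_slow = 2666  # MT/s, what the post says the benchmark used
    ddr_fast = 3200  # MT/s, what the post considers a fair baseline

    freq_uplift = ddr_fast / ddr_slow - 1
    print(f"memory frequency uplift: {freq_uplift:.1%}")  # ~20.0%
    # A ~20% performance swing would mean Ryzen gaming results scale almost 1:1
    # with memory clock; the follow-up posts argue the real effect is nearer 7%.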
 
Ryzen is being used with 2666MHz RAM here, for heaven's sake. This is August 2017, not March. Anything less than 3200MHz is a crooked benchmark for Ryzen, costing it around 20% performance.

Fair point, not that I could see anywhere on that site showing what memory speeds Ryzen is using.
That said, it's more like 7%.
Take that 7% into account and a quad-core Intel is still beating it; it's not looking good here, is it?
 
@gavinh87

You can't expect games to make effective use of 8 cores at what is now only an average clock speed, on a new platform, when most of the gaming player base uses 4-core/4-thread CPUs from a different manufacturer. There's no reason to seriously cater for them yet, as not enough gamers use them. For gaming alone, 6 cores and 12 threads at a good clock speed will be the sweet spot for years, with 4c/8t and 6-core parts becoming the new staple i5. This isn't to say they're bad, but people don't seem realistic: the market share is still small, plus they're new tech.
 
Fair point, not that I could see anywhere on that site showing what memory speeds Ryzen is using.
That said, it's more like 7%.
Take that 7% into account and a quad-core Intel is still beating it; it's not looking good here, is it?

When was this posted? Before or after AGESA 1.0.0.6? Give me the original link :)
 
@gavinh87

You can't expect games to make effective use of 8 cores at what is now only an average clock speed, on a new platform, when most of the gaming player base uses 4-core/4-thread CPUs from a different manufacturer. There's no reason to seriously cater for them yet, as not enough gamers use them. For gaming alone, 6 cores and 12 threads at a good clock speed will be the sweet spot for years, with 4c/8t and 6-core parts becoming the new staple i5. This isn't to say they're bad, but people don't seem realistic: the market share is still small, plus they're new tech.

Except the game does use 8 cores effectively, on both AMD and Intel, with Intel somehow coming out on top despite a clock-speed and core deficit.

When was this posted? Before or after AGESA 1.0.0.6? Give me the original link :)

April 20th. Not that AGESA would make much of a difference; 3200 is 3200. AGESA just made it easier to get there.
 
Except the game does use 8 cores effectively, on both AMD and Intel, with Intel somehow coming out on top despite a clock-speed and core deficit.



April 20th. Not that AGESA would make much of a difference; 3200 is 3200. AGESA just made it easier to get there.

Give me the source of the image :)
 
Give me the source of the image :)

In case I made it myself in Photoshop or something, lol?

Here's a better one.


Ryzen is using 3200MHz RAM here, as stated on the website: http://www.gamersnexus.net/game-bench/3038-destiny-2-beta-cpu-benchmarks-testing-research
He did say that SMT isn't working on Ryzen; that being said, it's losing to a damn i3, lol.
I wonder what excuses people are going to make here; the game uses 8 cores and it's still losing out.
I took a gamble on Ryzen but it seems it's not going to pay off. Early days, I know, but GPUs will keep getting faster while the CPU won't.
 
I think this video proves my point, and it's what is influencing my decision to ditch Ryzen for Coffee Lake.


http://gamegpu.com/action-/-fps-/-tps/destiny-beta-test-gpu-cpu

Now, we know this engine uses multi-core CPUs very well: 8 cores on mine and 16 on amigafans' Threadripper.
Yet the 1800X, despite having double the cores and threads, is being beaten by a 6700 clocked 200MHz lower.
How many times are we going to hear that the engine isn't optimised, or other excuses?
This is probably one of the better engines out there at using all the available cores, and yet Ryzen is still behind Intel.

There is a slide showing a 1070, and things are a lot more even there, so it might be the recording software/settings I was using that caused my GPU usage to drop,
but it still doesn't fill me with confidence going forward with faster GPUs when an Intel quad from 2015, clocked 200MHz lower, is beating an 8-core Ryzen.
When Volta comes this gap will only increase. It appears it isn't just clock speed that's hurting Ryzen, it's also IPC.

As I said above, if you're gaming at higher resolutions or at 60Hz it won't matter too much, but for me at 165Hz the 20fps increase the 6700 is giving is enough to make me jump ship.

Something weird is going on with the game engine.

The 1300X/1600X/1800X are all pushing similar numbers, and in the wrong order at that.

The 5960X clearly shows that more cores should result in big gains.
 