
Core 9000 series

Just to point out, guys, I didn't upgrade to 32 GB for a novelty.

FF15 makes Windows auto-shut-down apps after about an hour of gameplay on some 16 GB machines (unless a very large pagefile is set, e.g. a 20 GB swapfile) due to memory leaks which won't be resolved by the developers. FiveM (modded GTA5 for roleplay) will saturate a 16 GB machine after about 2-3 hours, and much earlier if you have Chrome running in the background. Games are definitely starting to get more RAM hungry. Since the release of the current-gen consoles, developers have got very lazy on memory usage. Apparently the reason for the really high FF15 memory usage is that the developers felt it was a good idea to give the game a dynamic texture limit of about 500 MB below the total memory in the system, including pagefile memory, so it just keeps loading more and more textures for eternity :). Someone on Reddit with a 1080 Ti and 32 GB of RAM had, I think, almost 60 GB of data loaded by the game: 9+ GB in VRAM, 25 GB in system RAM and the rest in the pagefile. Then the game crashed :D
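
If that's accurate, the streaming logic would amount to something like this (a minimal sketch in Python using psutil; the function names are made up and the ~500 MB headroom is simply the figure claimed above, so treat it as an illustration rather than the game's actual code):

```python
import psutil

HEADROOM = 500 * 1024**2  # the ~500 MB the game supposedly leaves free

def texture_budget_bytes():
    # The alleged bug: the budget is derived from total commit capacity
    # (physical RAM plus pagefile), not from physical RAM alone.
    ram = psutil.virtual_memory().total
    pagefile = psutil.swap_memory().total
    return ram + pagefile - HEADROOM

def should_stream_more_textures(loaded_bytes):
    # Because the pagefile is counted, this keeps returning True long
    # after physical RAM is full, so texture streaming never backs off.
    return loaded_bytes < texture_budget_bytes()
```

That would explain both symptoms: a bigger pagefile postpones the out-of-memory kills, and on a 32 GB machine the game happily fills RAM, VRAM and pagefile until it falls over.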
 
I can't believe people still think 32 GB of RAM is overkill. I'm just sat here browsing the internet, and my machine is using 14 GB of RAM. I'm glad I've got 32, because the usage regularly goes into the 20s... Do people enjoy paging? Or having to shut down other applications every time they launch a game?
 
Just to point out, guys, I didn't upgrade to 32 GB for a novelty.

The games are not meant to be using comedy amounts of RAM; you're using 32 GB to compensate for incompetent original or third-party code.

Which means you do have 32 GB for a novel reason.
 
Thanks for the link. Actually it highlights that something is wrong with my system.

https://www.guru3d.com/articles_pages/amd_ryzen_threadripper_2990wx_review,30.html

Looking there, the 2600K (I assume at stock?) is getting a score of 6,000 points. When I run Time Spy I can't break 4,000, and that's running at 4.6 GHz. How can I check to see what's happening?

Err... You are severely bottlenecked by your ancient central processor :D

Hmm, I wonder how the title on the chart got cropped off. All this chart shows is that if you are GPU limited it doesn't really matter what CPU you have.

I am wondering why Guru3D removed the 3840x2160 (4K) graphs from their reviews. Is it to discourage the resolution, or what? So, because I couldn't find what I had been looking for, the title of the chart didn't matter...
 
They were testing with a 1080, which is not a 4K card. In games like Tomb Raider it would have shown the same story as the 1440p chart you posted, in which the GPU becomes the bottleneck and all the results differ by only 1 or 2 frames: the CPU is just sat waiting for the GPU. The same pattern can be seen across all the games they tested at 1440p, whilst at 720p the results have a much wider spread, because the GPU is no longer the limiter and Intel is always at the top of the chart. But for some reason you didn't post one of those charts.

The title provides the details of the benchmarking, so it certainly does matter. Crop off the title and the data loses all context and just becomes misinformation.
 

Who realistically plays on a 1080 at 720p though? Come on, put these things into perspective... it's pointless basing any discussion of GPUs and CPUs on unrealistic resolutions. I think the largest user base now is at 1080p, right? Anything above that is still a minority, though it is slowly moving up.

At 1080p there is not much difference in reality between AMD and Intel chips, and when you add in the GPU and adaptive-sync tech, it really becomes a non-issue for 99% of people.

This whole argument of "oh, Intel is 15% faster in a specific title with a specific card at a specific resolution" is lame, old and tired.

The way I look at it is this: at the low to mid tier you have the Nvidia 1060/1070 and the AMD 580/Vega 56. Most people running a 1060 are probably doing so on a budget, and I bet there are a lot more people running a 1060 with a FreeSync screen than with a G-Sync screen, primarily because they bought a cheap 1080p monitor and did not even realise it had FreeSync.

Now if the same user base buys a 580 and a cheap 1080p screen, again bundled with FreeSync, guess who is getting the better experience? In fact I think the 580 actually beats the 1060 fairly often, but that's by the by; the overall experience is better because adaptive sync is delivering it. At that point, whether you're running an 8700K or a 2600X, you are never really seeing the benefit the CPU is giving you, because adaptive sync is basically giving you the best experience anyhow.

People need to start realising that GPU and monitor choice are probably bigger decisions when building a PC than the CPU, especially if you're working to a set budget. Buying the 2700X or 2600X over the 8700K or 9900K means you can probably put the saved cash towards a G-Sync screen and get a much better experience if you're buying a 2080 or 1080 Ti. The CPU is going to give a similar experience whether it's an Intel or an AMD chip.
 
I can't believe people still think 32 GB of RAM is overkill. I'm just sat here browsing the internet, and my machine is using 14 GB of RAM. I'm glad I've got 32, because the usage regularly goes into the 20s... Do people enjoy paging? Or having to shut down other applications every time they launch a game?
Windows these days is pretty smart about using the available RAM for caching and the like. That doesn't mean it needs it.

I run 32 GB, but only because I do a lot of graphics work, and that definitely does need it. 64 GB would be useful on occasion and I've considered upgrading, but I've been put off by the prices.
 
I can't believe people still think 32 GB of RAM is overkill. I'm just sat here browsing the internet, and my machine is using 14 GB of RAM. I'm glad I've got 32, because the usage regularly goes into the 20s... Do people enjoy paging? Or having to shut down other applications every time they launch a game?

I browse on my Phenom II rig with 4 GB of RAM and 2.5 GB to spare. I game just fine on my other AM3 rig with 8 GB. :D
 
I can't believe people still think 32 GB of RAM is overkill. I'm just sat here browsing the internet, and my machine is using 14 GB of RAM. I'm glad I've got 32, because the usage regularly goes into the 20s... Do people enjoy paging? Or having to shut down other applications every time they launch a game?

Must be something wrong with your machine. I have two copies of Visual Studio open, about eight Chrome tabs, Visual Studio Code, two remote desktops, two programs running in debug mode, SQL Server Management Studio and Outlook, and I'm using a whole 7 GB.
 
The OS and applications will adapt their RAM usage according to how much RAM you have, since unused RAM is essentially wasted. I thought this was common knowledge by now.
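
You can see the distinction in Task Manager's 'In use' versus 'Available' counters, or with a quick script (a rough sketch using Python's psutil; the tooling is my assumption, nothing from this thread):

```python
import psutil

GB = 1024**3
vm = psutil.virtual_memory()

# "used" is what's actively in use right now; "available" counts free
# pages plus standby/cache pages the OS can reclaim instantly, which
# is why a high "in use" figure doesn't always mean memory pressure.
print(f"total:     {vm.total / GB:5.1f} GB")
print(f"in use:    {vm.used / GB:5.1f} GB")
print(f"available: {vm.available / GB:5.1f} GB (includes reclaimable cache)")
```

If 'available' is still large, the machine isn't actually short of RAM, however big the 'in use' number looks.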
 
I guess I must be running a hell of a lot more background apps than most people. I do run quite a bit of stuff all the time, but this is what annoys me: who on earth spends all this money JUST to game? Yeah, gaming may be the primary purpose, but if that were the only reason, I'd buy a console. Tbh I do run Chrome all the time with around 30 tabs open, I've got a Plex server running and some other things along those lines, spreadsheets open, etc. But equally I never get any slowdown; nothing ever has to page to disk, and it's pretty damn responsive. I'd never go back from 32 GB, because I had 16 GB, and I can tell you that with the way I use my computer, it's slower with 16 GB.
 

So, you do have other stuff going on. Not just browsing. That I understand.
 
People need to start realising that GPU and monitor choice are probably bigger decisions when building a PC than the CPU, especially if you're working to a set budget. Buying the 2700X or 2600X over the 8700K or 9900K means you can probably put the saved cash towards a G-Sync screen and get a much better experience if you're buying a 2080 or 1080 Ti. The CPU is going to give a similar experience whether it's an Intel or an AMD chip.

Amen.

I've written many times that the only games where someone might need high speed are a handful, of which one is CS:GO and the rest are made by PDX (HoI4, EU4, CK2, Stellaris).
 
So, you do have other stuff going on. Not just browsing. That I understand.

Yes, but like most people, when making a comment online I'm not going to reel off every little thing my computer is doing. I was telling the truth, though: I personally was just web browsing at the time I looked at Task Manager. There's always other stuff going on in the background on my computer.

I guess 16 GB would be plenty if literally ALL you did was install a fresh copy of Windows, put Steam on it, and go straight to Steam and double-click a game every time you boot up. If that's ALL you do, then in essence you have a glorified console, just with a keyboard and mouse attached.
 