Overclocked it
How have you managed to get that then?
Amazon USA have the i9-9900K up for sale... It does seem to be a 5GHz chip, but $582.50 isn't exactly cheap: https://www.tomshardware.com/news/intel-core-i9-9900k-price-amazon,37871.html
For price comparison, a Threadripper 1950X is $710.
A Ryzen 2700X is $299...
Just to point out, guys, I didn't upgrade to 32GB for the novelty of it.
FF15 makes Windows auto-shut-down apps after about an hour of gameplay on some 16GB machines (unless a very large pagefile is set, e.g. a 20GB swapfile) due to memory leaks which won't be resolved by the developers. FiveM (modded GTA5 for roleplay) will saturate a 16GB machine after about 2-3 hours, and much earlier if you have Chrome running in the background. Games are definitely starting to get more RAM-hungry; since the release of the current-gen consoles, developers have got very lazy about memory usage.
Apparently the reason for the really high FF15 memory usage is that the developers felt it was a good idea to code the game with a dynamic texture limit equal to about 500MB below the total memory in the system, including pagefile memory, so it just keeps loading more and more textures forever. Someone on Reddit with a 1080 Ti and 32GB of RAM, I think, had almost 60GB of data loaded by the game: 9+GB in VRAM, 25GB in system RAM, and the rest in the pagefile. Then the game crashed.
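If that description is accurate, the behaviour works out roughly as in the sketch below. It is purely illustrative (not the game's actual code, and the pagefile sizes are made-up examples), but it shows why a huge pagefile only delays the problem rather than fixing the leak:

```python
# Illustrative sketch of the texture-budget behaviour described above.
# Not FF15's real code; it just shows why a bigger pagefile postpones
# the crash rather than fixing the leak.

GiB = 1024 ** 3
MiB = 1024 ** 2

def claimed_texture_budget(ram_bytes, pagefile_bytes, headroom=500 * MiB):
    """Claimed limit: total memory (RAM + pagefile) minus roughly 500 MB."""
    return ram_bytes + pagefile_bytes - headroom

# 16 GB machine with a typical ~4 GB pagefile vs the 20 GB pagefile workaround:
print(claimed_texture_budget(16 * GiB, 4 * GiB) / GiB)   # ~19.5 GB before apps get killed
print(claimed_texture_budget(16 * GiB, 20 * GiB) / GiB)  # ~35.5 GB, so it survives far longer
```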
Thanks for the link. Actually it highlights that something is wrong with my system.
https://www.guru3d.com/articles_pages/amd_ryzen_threadripper_2990wx_review,30.html
Looking there, the 2600K (I assume at stock?) is getting a score of 6,000 points. When I run Time Spy I can't break 4,000! And that's running at 4.6GHz. How can I check to see what's happening?
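If it's any use, one rough way to see whether the chip is actually holding 4.6GHz during the run, or whether background load or throttling is eating the score, is to log clocks and per-core load while the benchmark runs. A quick sketch using the psutil Python library (just a diagnostic idea, and note Windows doesn't always report live turbo clocks precisely):

```python
# Log per-core load and reported clocks for ~30 seconds while Time Spy runs.
# If something in the background is eating cores, or the clocks sag under
# load, it should show up here. Requires `pip install psutil`.
import psutil

for _ in range(30):
    load = psutil.cpu_percent(interval=1, percpu=True)   # % load per logical core
    freq = psutil.cpu_freq(percpu=True)                  # may be a single entry on Windows
    mhz = [round(f.current) for f in freq] if freq else []
    print("load %:", load, "| MHz:", mhz)
```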
Hmm, I wonder how the title on the chart got cropped off. All this chart shows is that if you are GPU limited it doesn't really matter what CPU you have.
They were testing with a 1080, which is not a 4K card. In games like Tomb Raider it would have shown the same story as the 1440p chart you posted, in which the GPU becomes the bottleneck and all the results only differ by 1 or 2 frames. The CPU is just sat waiting for the GPU. The same pattern can be seen across all the games they tested at 1440p. At 720p, meanwhile, the results have a much wider spread because the GPU is no longer the limiter, and Intel is always at the top of the chart. But for some reason you didn't post one of those charts.
I am wondering why guru3d removed the 3840x2160 (4K) graphs from their reviews? I suspect it is to discourage the resolution, or what?
So, because I couldn't find what I had been looking for, the title of the chart didn't matter...
The title provides the details of the benchmarking and so certainly does matter. By choosing to crop off the title, the data loses all context and just becomes misinformation.
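To put the GPU-bottleneck point from a couple of posts up in concrete terms: the frame rate you see is roughly capped by whichever of the CPU or GPU is slower for that game and resolution, which is why the 1440p charts bunch every CPU together while 720p spreads them out. A toy sketch, with every number invented purely for illustration:

```python
# Toy model: delivered fps is roughly min(what the CPU can feed, what the GPU
# can draw). All figures below are invented to illustrate the shape of the
# charts, not taken from any review.

def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

gpu_caps = {"720p": 300, "1440p": 90}           # hypothetical GTX 1080 limits per resolution
cpu_caps = {"fast CPU": 160, "slow CPU": 130}   # hypothetical, mostly resolution-independent

for res, gpu in gpu_caps.items():
    for cpu, cap in cpu_caps.items():
        print(f"{res:>6} | {cpu}: {delivered_fps(cap, gpu)} fps")
# 720p  -> 160 vs 130 fps: the CPU difference is plainly visible
# 1440p ->  90 vs  90 fps: both CPUs wait on the GPU, results within a frame or two
```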
Windows these days is pretty smart about using the available RAM for caching and stuff. Doesn't mean it needs it.
I can't believe people still think 32GB of RAM is overkill - I'm just sat here browsing the internet and my machine is using 14GB of RAM. I'm glad I've got 32, because the usage regularly goes into the 20s... Do people enjoy paging, or having to shut down other applications every time they launch a game?
The OS and applications will adapt RAM usage according to how much RAM you have, since not using RAM is essentially a waste. I thought this was common knowledge by now.
I guess I must be running a hell of a lot more background apps than most people - I do run quite a bit of stuff all the time. But this is what annoys me: who on earth spends all this money JUST to game? Yeah, gaming may be the primary purpose, but if that were the only reason I'd buy a console. Tbh I do run Chrome all the time with around 30 tabs open, I've got a Plex server running and some other things along those lines, spreadsheets open, etc. But equally, I never have any slowdown, nothing ever has to page to disk, and it's pretty damn responsive. I'd never go without 32GB, because I had 16GB and I can tell you that with the way I use my computer, it's slower with 16GB.
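For anyone wanting to check this on their own machine, the number that usually matters is "available" (which includes standby cache the OS can hand back instantly) rather than raw "used". A minimal sketch using the psutil Python library, assuming it's installed; just a quick readout, not a definitive measurement:

```python
# Prints how much RAM could be handed to programs right now ("available",
# which includes standby cache the OS can reclaim instantly) versus how
# much is genuinely committed. Requires `pip install psutil`.
import psutil

GiB = 1024 ** 3
vm = psutil.virtual_memory()
print(f"total     : {vm.total / GiB:.1f} GiB")
print(f"available : {vm.available / GiB:.1f} GiB")
print(f"in use    : {vm.used / GiB:.1f} GiB ({vm.percent:.0f}% of total)")
```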
Who realistically plays on a 1080 at 720p, though? Come on, put these things into perspective... it's pointless basing any discussion of GPUs and CPUs on unrealistic resolutions. I think the largest user base now is at 1080p, right? Anything above that is still a minority, although it is slowly moving up.
At 1080p there is not much difference in reality between AMD and Intel chips, and when you add in a GPU and adaptive-sync tech, it really becomes a non-issue for 99% of people.
This whole argument of "oh, Intel is 15% faster in a specific title with a specific card at a specific resolution" is lame, old and tired.
The way I look at it is this: at the low to mid tier you have the Nvidia 1060 / 1070 and AMD 580 / Vega 56. Most people running a 1060 are probably doing so due to budget, and I bet there are a lot more people running a 1060 with a FreeSync screen than with a G-Sync screen, primarily because they bought a cheap 1080p monitor and did not even realise it had FreeSync.
Now if the same user base instead buys a 580 and a cheap 1080p screen, again bundled with FreeSync, guess who is getting the better experience? In fact I think the 580 actually beats the 1060 fairly often in titles, but that's by the by; the overall experience is better because adaptive sync is delivering it. At this point, whether you're running an 8700K or a 2600X, you are not really seeing the benefit the CPU gives you, because adaptive sync is basically giving you the best experience anyhow.
People need to start realising that GPU and monitor choice are probably bigger decisions when building a PC than the CPU, especially when working to a set budget. Buying the 2700X or 2600X over the 8700K or 9900K means you can probably put the saved cash towards a G-Sync screen and get a much better experience if you're buying a 2080 or 1080 Ti, etc. The CPU is going to give a similar experience whether it's an Intel or AMD chip.
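Using the prices quoted earlier in the thread, the trade-off being described works out roughly like this (the monitor comment in the sketch is only a ballpark illustration, not a quoted price):

```python
# Rough budget arithmetic from the prices mentioned earlier in the thread.
i9_9900k    = 582.50   # Amazon listing quoted above
ryzen_2700x = 299.00   # price quoted above

saving = i9_9900k - ryzen_2700x
print(f"Picking the 2700X over the 9900K frees up ${saving:.2f}")  # $283.50
# That is a decent chunk of the premium an adaptive-sync/high-refresh monitor
# or a GPU tier bump typically carries (ballpark only; prices vary).
```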
So, you do have other stuff going on. Not just browsing. That I understand.