i7-5820k Questions

Soldato
Joined
28 May 2007
Posts
18,262
Maybe, but I'd take 12 cores on a much better platform any day. If you want to build a 1080p games console for 120Hz+ gaming, then you buy Intel.

A 7640X is probably the way to go, TBH.
 
Soldato
Joined
28 May 2007
Posts
18,262
The argument is to spend as much as possible on the graphics card, as CPU performance is less important. The fastest graphics card currently available runs DX11, and that code path can't scale with cores. 4 cores is probably overkill, so you want the fastest, highest-clocking 4-core you can buy. In born to kill's case, the cheapest too.
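As a rough illustration of that scaling argument, here's an Amdahl's-law sketch: if a chunk of each frame is serial driver/submission work, extra cores stop helping quickly. The 40% serial share is an assumed figure for illustration, not a measurement of any real game or driver.

```python
# Rough Amdahl's-law sketch of why a DX11-style frame loop stops scaling with
# core count once the largely single-threaded driver/submission work dominates.
# SERIAL_FRACTION is an illustrative assumption, not a measured figure.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Theoretical speedup when only the parallel part scales with cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

SERIAL_FRACTION = 0.40  # assumed share of frame time spent in serial work

for cores in (2, 4, 6, 8, 12):
    print(f"{cores:>2} cores -> {amdahl_speedup(SERIAL_FRACTION, cores):.2f}x")
# 2 -> 1.43x, 4 -> 1.82x, 6 -> 2.00x, 8 -> 2.11x, 12 -> 2.22x
```

Under that assumption the gain from 4 to 12 cores is only around 20%, which is why per-core speed matters more than core count for this kind of workload.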
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
[benchmark screenshot]


Even DX12 likes a fast CPU.
 
Soldato
Joined
28 May 2007
Posts
18,262
Pretty sure that's a DX11-based game with a DX11 card running some extensions. Take a look at the Kaby Lake i3-7100 performance. Dual cores love Nvidia drivers running DX11 code.
 
Associate
Joined
19 Jul 2011
Posts
1,899
Location
Reading
I left the CPU input voltage on auto and it fluctuates around 1.904-1.92V. What would be the upper limit? 1.95V?

As for overclocking the RAM, doesn't it affect the BCLK along with the overclock on the CPU? My RAM is currently at 2400MHz, timings 15-15-15-35 at 1.2V.

Don't leave voltages like memory, input and vcore on auto; it's asking for trouble on any platform. Set these manually. And no, overclocking the RAM won't affect the BCLK. When X99 was in its infancy, high memory speeds needed different BCLK dividers to work; this is no longer the case. Just make sure you're on the latest BIOS revision.
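For a quick sense of how the BCLK and memory ratio relate: the effective DDR speed is just BCLK multiplied by the memory ratio, so 2400 is reachable on the stock 100MHz BCLK without touching the 125MHz strap. The ratios below are examples for arithmetic, not a list of what any particular board exposes.

```python
# Effective DDR transfer rate from BCLK and memory ratio (illustrative only).

def ddr_speed(bclk_mhz: float, memory_ratio: int) -> float:
    """Effective DDR rate in MT/s for a given BCLK and memory ratio."""
    return bclk_mhz * memory_ratio

print(ddr_speed(100.0, 24))  # 2400.0 -> DDR4-2400 on the stock 100 MHz BCLK
print(ddr_speed(125.0, 24))  # 3000.0 -> the same ratio on the 125 MHz strap
```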

And to reiterate what I said earlier, nothing you buy right now (unless you're obsessed with benchmarking) will give you any tangible performance gains over what you have now. Sure, if you have money to burn, go for it; new tech is fun.
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
Interesting benchmark that. It's almost like Nvidia have started marketing for Intel. Switch to an AMD card and the results are very different.

Whilst that might be true, if you want the fastest GPU you have to go Nvidia.

[benchmark screenshots]


Unless AMD can do something special with Navi, those who want the fastest will be going Intel/Nvidia.
 
Soldato
Joined
28 May 2007
Posts
18,262
At 1080p with a 1080Ti. If Nvidia are twisting the results, you're better off going all-AMD if the main reason for buying a PC is gaming.
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
No, the arguments you make are flawed. What's wrong with my server board? And the 290X is just about holding up, although it is due an upgrade.

You speak of Intel being anti-consumer and misleading, yet you actually own probably the most controversial board in modern times (artificially locking Xeons to a specific chipset).
You say Intel are only decent at 1080p:

[benchmark screenshots]


Even in DX12 titles at 1440p, games still like a fast CPU. When GPUs are fast enough not to bottleneck at 1440p, this gap in CPU performance is going to increase.
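A toy way to picture that: delivered frame rate is roughly the minimum of what the CPU and GPU can each sustain, so a faster GPU exposes the CPU ceiling. The frame-rate caps below are made-up numbers purely to show the shape of the effect, not benchmark results.

```python
# Toy bottleneck model: whichever of CPU or GPU is slower sets the frame rate.
# All caps are assumed values for illustration only.

def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_caps = {"fast high-clocked CPU": 160.0, "slower CPU": 110.0}
gpu_caps_1440p = {"current flagship GPU": 100.0, "hypothetical faster GPU": 180.0}

for gpu, gpu_cap in gpu_caps_1440p.items():
    for cpu, cpu_cap in cpu_caps.items():
        print(f"{gpu} + {cpu}: {delivered_fps(cpu_cap, gpu_cap):.0f} fps")
# With the current GPU both CPUs show 100 fps (GPU-bound, gap hidden);
# with the faster GPU the gap opens up to 160 vs 110 fps.
```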
 
Man of Honour
Joined
30 Oct 2003
Posts
13,259
Location
Essex
Who spends 4k on a rig to play at 1080p? Surely those with a 1080Ti/Vega and a 1k CPU like a TR/i9 would be pushing 4K resolutions?
 
Soldato
Joined
29 Jan 2015
Posts
4,904
Location
West Midlands
Ahh, fair play. I think for me I would prefer 4K 60 but then put up with a little tearing at 4K in games like CS; horses for courses, I guess.

Have you tried 144Hz or above?
It really is something you need to try.
A lot of people dismiss it without trying it.

This looks to be a good monitor: http://www.guru3d.com/news-story/acer-delays-predator-x27-4k-hdr-gsync-monitor-to-next-year.html

I've started to prefer ULMB over G-Sync lately.

Though driving that will take a lot of GPU power.
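Back-of-envelope pixel throughput shows why. Raw pixels per second is only a crude proxy for GPU load, and the 4K 144Hz figure assumes the X27's advertised refresh rate, but the relative scale is the point.

```python
# Rough pixel-throughput comparison for common gaming targets (crude proxy
# for GPU load; refresh rates and resolutions are the usual advertised ones).

def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    return width * height * refresh_hz

targets = {
    "4K @ 60 Hz":     (3840, 2160, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 144 Hz":    (3840, 2160, 144),
}

for name, (w, h, hz) in targets.items():
    print(f"{name:>15}: {pixels_per_second(w, h, hz) / 1e6:,.0f} Mpixels/s")
# ~498, ~531 and ~1,194 Mpixels/s respectively: a 4K high-refresh panel asks
# for well over double the pixel rate of 4K 60 or 1440p 144.
```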
 