Hi guys,
Sorry to be "that person" who asks for advice on whether I need to upgrade or not.
I'm currently waiting on the new AMD offering and the 980 Ti to see what happens there, and I plan to buy two of whichever turns out to be better and Crossfire/SLI them. I'm running an "old" setup from 2011 that Nvidia were gracious enough to build for me. As far as I'm aware it was a rather high-spec build at the time, and I'm not sure the technology has come along enough to warrant an upgrade, considering Intel have mainly focused on energy efficiency over the past couple of years.
The current specs are:
Motherboard: Asus Rampage IV Extreme
CPU: Intel i7-3930K 3.2 GHz, OC'd to 3.8 GHz
Memory: 16GB DDR3 Corsair Vengeance 1600 MHz (CMZ16GX3M4A1600C9)
PSU: 1200W Corsair
Video: 780 Ti
What I really need to know is this: given that I want to replace the 780 Ti with 2x 980 Tis or the AMD equivalent (whichever turns out to be better), will my current system bottleneck them? Am I missing out on any important new technology by not upgrading the mobo/CPU/memory, and is it worth the additional £800-£1,000 it'll cost to upgrade them? (By "worth it" I mean: would we literally be talking about a difference of 1-5 FPS in a game that already runs at 90 FPS? Because to me, that wouldn't be worth it.) A quick look at some CPU benchmarks shows my processor still scores ridiculously well compared to modern-day equivalents. Like, the difference between a score of 12,000 and a score of 7,000...? What the chuff is that about? Have CPUs lost power or something in 4 years?
I run 3x 30'' Dells at 2560x1600 but only game on one of them. I don't need the rig to be 4K-proof and I don't game across multiple screens, though I do leave the two side screens on to keep up with emails/Skype/Facebook while I'm playing.
Thank you for any and all help you wise men (or women) of yore can provide.