A Card For A Core I7 920, Gigabyte EX58 UD5 Setup - £150 to £200

Daft question, but why are you running 16GB when it's a tri-channel motherboard?

Because when the system was built in 2010, I had 6GB (3x2GB config). In 2013 it was expanded with a 4x4GB set of DDR3 RAM, which brought it to 16GB; 8GB sticks weren't available then. The mobo supports six DDR3 DIMMs, but one is impeded by the fan of the HSF. I didn't discover that till later, but it doesn't matter unless I wanted to max out at 48GB (6x8GB), and the manual, which dates back to 2008/9, says it supports 24GB anyway. Obviously I could replace the mobo, but that would just mean more money and stepping onto the ladder of upgrading to a newer system of components and all that malarkey.

Obviously there are options, as others have highlighted in this thread, to boost what I have with my current system. I could replace the 4GB DIMMs with 8GB ones and sell the 4GB ones, I guess, but I'm still using Windows 7 Home Premium as my main OS, which has a 16GB limit. I could have upgraded to Windows 10, but for many reasons I didn't. It's on my old laptop's hard drive, so technically I do have it, but that's also why I use Linux as my secondary OS on my Core i7 920 system.
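(Side note, in case anyone wants to sanity-check their own box: a rough Python sketch of my own, nothing official, that needs the third-party psutil package and just reports how much physical RAM the OS actually exposes against an assumed edition cap like Home Premium's 16GB.)

# Rough sketch: report the physical RAM the OS exposes and compare it
# against an assumed edition cap (16 GiB for Windows 7 Home Premium x64).
# Needs the third-party psutil package: pip install psutil
import psutil

EDITION_CAP_GIB = 16  # assumed cap; change for other Windows editions

total_gib = psutil.virtual_memory().total / (1024 ** 3)
print(f"Physical RAM visible to the OS: {total_gib:.1f} GiB")

if total_gib >= EDITION_CAP_GIB - 0.5:  # small margin for hardware-reserved memory
    print("At (or near) the edition cap; DIMMs beyond this won't be usable under this OS.")
else:
    print("Below the cap; there is headroom for more usable RAM.")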

Other than the important aspects of memory and storage, the graphics card is the most significant addition to my system (no SSD yet, but some day that'll come). I'll be eagerly awaiting the arrival of the Sapphire RX 570 8GB in the next few days, after lots of intense research and deliberation in the past week... and the past 5/6 years, it feels more like. Going from what is really an 8800 GTX spec-wise to this new card is going to be quite a leap in performance. Perhaps not the bleeding edge of what the PC could have, but beyond a certain point you can strangle your CPU if you go too far and don't really gain much... so yeah, it's 7.45am and thus way past my bedtime now lol. Other than the huge jump I made from the Athlon 64 3500 to the i7 system, I'm having flashbacks to going from the old Nvidia GeForce 5200 256MB AGP I used to have in 2003/4 to the Gainward Bliss GeForce 7800 512MB AGP, comparison-wise.

Feels strange switching from Nvidia after all these years.
 
As already mentioned, your system is basically begging for an X5650 or X5660 CPU. It's a drop-in upgrade, the chips are stupidly cheap second-hand, and you end up with a 6-core / 12-thread processor that should overclock to around 4GHz with your existing cooler.

I didn't know that. I built an i5 (820?) rig for a mate, but it's getting a bit past it. Useful advice :)
 
Feels strange switching from Nvidia after all these years.

With regard to Nvidia, I've run Nvidia almost exclusively since 1998, starting with the Riva 128.

I'm a software developer, I run glass-front multi-monitors, and I take care over picture quality and reducing eye strain. For whatever reason I find Nvidia has better picture quality; I'm referring to text and gamma control. Their cards are just easier on the eye when working, and other software developers I've discussed it with have said the same. I had an AMD R9 290X and tried it for a while, but I could not get the gamma control / contrast / text sharpness set the same as on the Nvidia.

I personally run an Nvidia Quadro because the drivers are rock solid, though I understand you getting a gaming card, as they offer much more power for the money.
 
The newest AMD cards have better image quality; it's been tested by several tech channels (Vega 64).
 
For whatever reason I find Nvidia has better picture quality; I'm referring to text and gamma control. Their cards are just easier on the eye when working.

Well, the current card I've been using for years has been a cheap Nvidia gaming card, and I've been happy to lower resolutions and settings to get the fastest FPS and the best visuals I can in games like Battlefield 3. On the application side of things relating to contrast and text sharpness, it's not something I've ever heard mentioned on graphics-related forums or even in YouTube videos. I do occasional scripting, but I typically scale text up so it's more visible anyway. I also design GUI interfaces for instruments and skins for VST instruments, including ones I create for Native Instruments Reaktor, and I use a 24" Samsung SyncMaster B2430 as my main PC screen. I don't think it's going to make much difference... there are a lot of variables between what one person sees, or perceives to see, on their screen and another.

I've been gaming since 1981 on a Sinclair ZX81, I used to use Deluxe Paint on the Amiga, and I've got my Amiga 1200 set up alongside my i7 920 to this day. Super Stardust AGA still blows my mind on it. I need a converter cable to scale it up to 24" LCD screen size, though, as it's quite pixelly. :D
 
On the application side of things relating to contrast and text sharpness, it's not something I've ever heard mentioned on graphics-related forums or even in YouTube videos.

Nvidia have been better at text quality for some time; this was known about going back to before 2000. And it's nothing to do with expensive versus cheap Nvidia cards: you could take the cheapest or oldest Nvidia card you can find and it would still have the best text quality and gamma control for working and reducing eye strain. As I said, I last tried with an AMD R9 290 and had to remove it because I could not get the text right.
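Just to pin down what I mean by gamma control, here's a generic little Python illustration of what a gamma adjustment does to pixel values; it's only a sketch of the maths, nothing to do with either vendor's driver panel:

# Generic gamma curve: output = input ** (1 / gamma).
# A gamma setting above 1.0 lifts the midtones, which is what the driver slider does.
def apply_gamma(value_8bit: int, gamma: float) -> int:
    normalised = value_8bit / 255.0
    return round((normalised ** (1.0 / gamma)) * 255)

for v in (32, 128, 224):
    print(f"{v:3d} -> {apply_gamma(v, 2.2)}")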

I started programming on an Atari 400 when I was 7, had the Amiga 500 and the BBC Micro, and have had just about every PC from the 286 upwards. I had a TRS-80 once with 8" drives. I work in software and have a lot of the T-shirts, looking back. I wouldn't mind two Amiga 1200s just to play Stunt Car Racer again over a link-up.
 
FWIW I'm still running my i7 930. I originally had an AMD 6950 that I was using for gaming, but for the last 2-3 years it has really struggled with AAA titles even at the lowest settings. I managed to get FO4 running OK though.

Anyway, I did plan an upgrade just as the last Intel generation got released, but after seeing what they priced it at, and it still being stuck on the same process node as the previous 3-4 generations, I wasn't overly keen on upgrading.

Ultimately I needed a new GPU; sometimes my old 6950 would black-screen and force me to restart. I ended up buying the 570 in the Black Friday sales for about £150.

Obviously my next concern was that the latest AAA titles list the minimum CPU spec at around 4th/5th gen, and I was worried that the CPU wouldn't keep up.

I'm still running with a 1080p monitor, but so far I've been able to run FO76 and BFV at max graphics without the system so much as skipping a beat.

I am still looking at a whole new system, but I'll wait to see what Ryzen does, and whether Intel releases on a new process node.

I might look at snapping up an X5660 on eBay for a bit of an upgrade - thanks to those who pointed it out.
 
Ultimately I needed a new GPU; sometimes my old 6950 would black-screen and force me to restart. I ended up buying the 570 in the Black Friday sales for about £150.

There's a video Wendell made with older OEM systems where they didn't POST with RX 570 cards, so it's a bit of a gamble. My Athlon 64 3500 2.2GHz never had its HSF upgraded, and its system RAM never got beyond 3GB, which I considered quite a lot for the time in 2008. It's more than the dual-core 1.7GHz laptop I still have today.

In the case of the Sapphire RX 570, which arrives in the next few days, I'm hoping I won't have to deal with updating the BIOS. Currently my i7 920 board is on BIOS version F9m, according to the CPU-Z utility.
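(For anyone without CPU-Z handy, here's a rough Python sketch of my own for checking the BIOS version; it reads the DMI info on Linux and leans on the stock wmic tool on Windows. Treat it as a sketch rather than gospel.)

# Rough sketch: read the motherboard BIOS version without CPU-Z.
# Linux exposes it via sysfs; on Windows the stock 'wmic' tool is queried.
import platform
import subprocess

def bios_version() -> str:
    system = platform.system()
    if system == "Linux":
        with open("/sys/class/dmi/id/bios_version") as f:
            return f.read().strip()
    if system == "Windows":
        out = subprocess.check_output(
            ["wmic", "bios", "get", "smbiosbiosversion"], text=True)
        # First non-empty line is the column header, the next is the value
        lines = [line.strip() for line in out.splitlines() if line.strip()]
        return lines[1] if len(lines) > 1 else "unknown"
    return "unknown"

print("BIOS version:", bios_version())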


As for the image quality / calibration difference between AMD and Nvidia, that's a bit of an unknown for me at the moment, and something I might not even notice.
 
As I'm awaiting delivery of the card (the email says Tuesday, DPD tracking says Wednesday; the latter is the real ETA, I imagine), I'm preparing what I need for it.

My current i7 setup is different from all the other computers I've had in that it supports two cards (normally intended for SLI/CrossFire), so I'm wondering: would I run into any problems by keeping the existing drivers for my current Nvidia card installed while fitting the new card, which will be an AMD one?

Also, this isn't something I've ever thought about doing, because I didn't think I'd ever buy an AMD card, but to follow up on that question: has anyone tried mixing an AMD card with an Nvidia card in their system, and did they encounter any problems, other than possibly their PSU holding them back? It's not something I need to do given my current setup, but it would help answer the first question by identifying any conflicts I might have.

I'm guessing, from my own knowledge and understanding of PCs, that the video drivers would be independent of each other, so theoretically it should work, but I'm not 100% sure. Why would I want to keep the Nvidia drivers installed, you might ask... I guess it's more about convenience.

In previous PC systems I've always uninstalled the driver unless the replacement card was of the same type.
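If I want to check rather than guess, a rough Windows-only Python sketch like the one below (leaning on the stock wmic tool; an assumption-laden sketch, not a polished tool) should list every display adapter Windows currently sees together with its driver version, which would show whether the old Nvidia driver and the new AMD one are both registered.

# Sketch (Windows only): list display adapters and their driver versions
# via the stock wmic tool, to see which GPU drivers are currently registered.
import subprocess

out = subprocess.check_output(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion", "/format:list"],
    text=True)

# /format:list prints Key=Value pairs, one per line, records separated by blanks.
record = {}
for line in out.splitlines():
    line = line.strip()
    if "=" in line:
        key, value = line.split("=", 1)
        record[key] = value
    elif record:
        print(f"{record.get('Name', '?')}: driver {record.get('DriverVersion', '?')}")
        record = {}
if record:
    print(f"{record.get('Name', '?')}: driver {record.get('DriverVersion', '?')}")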
 
I really can't imagine Nvidia and AMD cards mixing. Also with your ancient Nvidia card, I'm not sure why you'd want to try and couple them together.
 
I really can't imagine Nvidia and AMD cards mixing. Also with your ancient Nvidia card, I'm not sure why you'd want to try and couple them together.

True, but I'm wondering more about the driver aspect really... For example, should I have any issues with the AMD one, I'll have the Nvidia as a backup, and I can just put it back in without having to reinstall its drivers again.
 