
A Card For A Core I7 920, Gigabyte EX58 UD5 Setup - £150 to £200

Associate · Joined 19 Oct 2009 · Posts 234 · Location: Moray, Scotland
Last April I was looking for a new graphics card (well, add on another 5 years prior to that, really). I've held out thus far, weathered the storm of the bitcoin mining craze and waited till Black Friday, but alas, whilst tempted, the type of card I was looking at, the Nvidia GeForce GTX 1060 6GB, was, and pretty much still is, hovering around £230 once one adds up all the costs, including the postage to Scotland. The 3GB version didn't seem like a great leap memory-wise from the 1GB card I've been using for the past 9 years. It's been a great card and has been fine playing games like Battlefield 3; Battlefield 4 not so much, though, at 1080p, which I'm still using.


Gaming is really secondary to why I want to upgrade, though; it's mainly because I've been working on big 3D rendering and graphics projects in Cinema 4D, such as futuristic cityscapes and flying spaceships. All this has also involved video editing with all manner of visual effects, but my current graphics card just can't cope with processing this movie, which is just over 10 minutes long; the result comes out corrupt, not properly rendering and displaying in the HitFilm viewport. In Cinema 4D, displaying high-poly objects is a bit of a nightmare; rendering isn't so much of an issue with my quad-core i7 920 CPU.


In the 19 years I've been using PCs (an AMD Athlon 750 with an ATI Rage 128 was my first), I have always stuck with Nvidia since. I bought the last Gainward GeForce (the fastest AGP card, with 512MB of RAM) back in 2006 for my Athlon 64 3500 2.2GHz PC, for about £240, which I consider pretty silly even with the three and a half years or so I got out of it. The Palit 1GB card cost just £95 by comparison; I had expected to upgrade it a few years later, but strangely, that never happened.


Having always used what I've trusted, that being Nvidia, it's really only in the past few days that I've seriously considered switching to AMD (previously ATI, as I once knew it). This is primarily down to price/performance and value for money. What does concern me, though, is the reliability and power consumption of AMD graphics cards compared with Nvidia's. I also produce music with Presonus Studio One 4 on this PC, so a relatively quiet card is important at present.


I'm aware of two graphics cards as options here: the Sapphire Radeon RX 570 Pulse 8192MB and its Nitro sibling, the latter having just a 54MHz overclock, which doesn't seem like much. Given what I do creatively and the number of polygons, textured scenes and video editing involved, when I compare Nvidia's equivalent GTX 1060s (3GB and 6GB) with AMD's RX cards and their 8GB of onboard RAM, I'm curious what benefit that extra memory has in applications, other than in games and on higher-resolution displays. Nvidia seems pretty mean, given the £40–60 price difference between similarly specced cards out there.

I'm not looking to upgrade my CPU or move to a new PC etc. until, well, it stops functioning as a working PC. It actually has 18GB of RAM, but only Windows 10 (a pre-release build) and Linux Mint 18 are able to see all of it.

I think I also need a 6-pin PSU to 8-pin graphics card cable for these cards, as my Antec 750W New Blue PSU only has a 6-pin connector.
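For what it's worth, here's a rough back-of-envelope sketch in Python of the PCIe power budget for this sort of card. The slot and connector limits are the published PCIe spec figures, and the card and CPU wattages are AMD's and Intel's rated numbers; the 150 W allowance for the rest of the system is just my own guess, not a measurement.

```python
# Rough PCIe power-budget sanity check for an RX 570 on a 750 W PSU.
# Connector limits are from the PCIe spec; card/CPU figures are rated values.

PCIE_SLOT_W = 75      # a PCIe x16 slot delivers up to 75 W
SIX_PIN_W = 75        # 6-pin PCIe auxiliary connector: 75 W
EIGHT_PIN_W = 150     # 8-pin PCIe auxiliary connector: 150 W

RX570_BOARD_POWER_W = 150   # AMD's typical board power for the RX 570
CPU_TDP_W = 130             # Intel i7 920 rated TDP

# An 8-pin feed plus the slot comfortably covers the card's rated draw,
# which is why the card only needs one auxiliary cable connected.
available_to_card = PCIE_SLOT_W + EIGHT_PIN_W
assert available_to_card >= RX570_BOARD_POWER_W

# Whole-system estimate, with a generous 150 W allowance for the
# motherboard, RAM, disks and fans (my assumption).
system_estimate = RX570_BOARD_POWER_W + CPU_TDP_W + 150
print(f"Estimated load: {system_estimate} W of a 750 W PSU")
```

So even with pessimistic round numbers, a 750 W unit has plenty of headroom for one RX 570; the only real question is whether the PSU physically provides the 8-pin plug.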



My Setup (Jan 2010 Build):
Coolermaster Scout Case.
Windows 7 64-bit Home Premium.
Intel i7 920 CPU @ 3.57GHz,
(with a Zalman CNPS10X Extreme CPU Cooler)
Gigabyte EX58 UD5 Motherboard.
16GB of DDR3 1600 Kingston RAM.
Nvidia GeForce Palit GTS 250 1GB Graphics Card. (9 years old and still kicking)
Antec 750W New Blue PSU.
Dual 1080p-capable LED Monitors.
Western Digital 500GB Hard Disk - 1 & 2TB External Disks.
 
From various comments around the web from those who have bought a Radeon RX 580 Red Dragon V2 8192MB GDDR5 PCI-Express graphics card, it's loud, runs hot, and uses more power than an RX 570, so that's a turn-off for my preferences. Is the Sapphire RX 570 better in this respect by comparison, and is there an 8-pin cable in the box that I can connect to my PSU? Vega 56 models are way over my budget, and around a third of the cost of the system I put together 9 years ago...

Side topic...

I use Cinema 4D R11.5 XL with various modules I bought while upgrading, but it got too expensive to keep upgrading. There's no upgrade path because Maxon are greedy and penalise you if you fall behind their upgrade schedule. OpenGL is supported for it, though, and there's always Blender, which has made significant progress over the past few years in harnessing newer graphics card tech. I also use Pixologic ZBrush 2018 for building the spaceships and cityscapes, which I then transfer to Cinema 4D for the animation work.

Yeah, I've been aware of the 6-core, 12-thread chips available for my motherboard; at £12, though, it sounds too good to be true when the i7 920 I have cost around £200, lol. (I did spy one on eBay in my search, whilst on Amazon they're about £100.) I've never removed the HSF cooler from this system or any previous one, and the Zalman I have is a bit of a monster compared to some others. I was in two minds about getting this one, but I've been really happy with it due to the external remote control for adjusting noise levels on the fly, from near silent to a jet engine for heavy rendering purposes; medium is all I need at my current overclock. 4GHz on the CPU is quite wasteful electricity-wise, and noisier with having to keep the CPU below 70 degrees.

WIP project, hence need for new graphics card.

[Attached image: MEGA CITY.png]
 
AMD GPUs are cheap at the moment because sellers are clearing their stocks for the new Radeon VII GPUs.

Given the specs, though, they will be stupidly expensive and in a market of their own competing with Nvidia, so I really can't see how pricing in the lower budget market would be affected to the point where retailers are clearing their stocks. That's my take on it, anyway.
 
It looks like I'll be pulling the trigger on the Sapphire RX 570 I linked to in my opening post, having watched some YouTube videos from some well-known testers... variants above it seem more power-hungry than I'd like.

So what am I looking for now? Well, the Sapphire has both a 6-pin and an 8-pin PCIe power port. Have Overclockers got any 6-pin (from the PSU) to 6 & 8-pin dual cables?

Thanks...
 
Is this your PSU? https://static.raru.co.za/pdf/5036605-5438-user-manual.pdf

Maybe it's not the exact model, but it's worth a double check. That one has 2 x 8-pin (6+2) and 2 x 6-pin; the latter are modular.

Yep, that's the one I have. It's been an excellent PSU thus far, and it's even better now because you're right: I had another look in the case and spotted two little wires, each with 2 pins, that bring a 6-pin connector up to 8. I've never needed to take the card out, so I'd never used them. I didn't see them on the extra modular cables, so I assumed they were all like that, but they're only on the primary system cables. My current card uses two 6-pins. I guess if I were using two cards I might want to utilise the extra power ports of the PSU and have another cable for that, but as I'm only using one screen most of the time, it's never been necessary. That's great; that'll knock off the extra £7 I thought I'd have to add to my purchase for unneeded cables :-)

Cheers
 
Daft question, but why are you running 16GB when it's a tri-channel motherboard?

Because when the system was built in 2010, I had 6GB (3x2 config). In 2013 it was expanded with a 4x4GB set of DDR3 RAM, which brought it to 16GB; 8GB sticks weren't available then. The mobo supports six DDR3 DIMMs, but one is impeded by the fan of the HSF (I didn't discover that till later, but it doesn't matter unless I wanted to max out at 48GB (6x8); the manual says it supports 24GB, but that dates back to 2008/9). Obviously I could replace the mobo, but that would just mean more money and stepping onto the ladder of upgrading to a whole newer system of components and all that malarkey. There are options, as others have highlighted in this thread, to boost my current system, and yeah, I could replace the 4GB DIMMs with 8GB ones and sell the 4GB ones, I guess. But I'm still using Windows 7 Home Premium as my main OS, which has a 16GB limit. I could have upgraded to Windows 10, but for many reasons I didn't; it's on my old laptop's hard drive, so technically I do have it, and that's also why I use Linux as my secondary OS on my Core i7 920 system.

Other than the important aspects of memory and storage, the graphics card is the most significant addition to my system (no SSD yet, but some day that'll come). I'll be eagerly awaiting the arrival of the Sapphire RX 570 8GB in the next few days, after lots of intense research and deliberation over the past week... though it feels more like 5 or 6 years. Going from what is really a GTX 8800 spec-wise to this new card is going to be quite a leap in performance. Perhaps not the bleeding edge of what the PC could take, but then, beyond a certain point you can strangle your CPU if you go too far and not really gain much... so yeah, it's 7.45am and thus way past my bedtime now, lol. Other than the huge jump I made from the Athlon 64 3500 to the i7 system, I'm having flashbacks to going from the old Nvidia 5200 256MB AGP I used to have in 2003/4 to the Nvidia GeForce Gainward Bliss 7800 512MB AGP, comparison-wise.

Feels strange switching from Nvidia after all these years.
 
In regard to Nvidia, I've run Nvidia almost exclusively since 1998, starting with the Riva 128.

I'm a software developer; I run glass-front multi-monitors, and I'm someone who takes care over picture quality and reduced eye strain. For whatever reason, I find Nvidia has better picture quality (I'm referring to text and gamma control); their cards are just easier on the eye when working. I've had discussions with other software developers who have said the same. I had an AMD R9 290X and tried it for a while, but I could not set the gamma control / contrast / text sharpness the same as on the Nvidia.

I personally run Nvidia Quadro because the drivers are solid as a rock, however understand you getting a gaming card as they offer much more power for the money.

Well, the current card I've been using for years has been a cheap Nvidia gaming card, and I've been happy to lower resolutions and settings to get the fastest FPS and best visuals I can in games like Battlefield 3. On the application side of things, contrast and text sharpness aren't something I've ever heard mentioned on graphics-related forums or even in YouTube videos. I do occasional scripting, but I typically scale text up so it's more visible anyway. I also design GUI interfaces for instruments, skins for VST instruments and those I create for Native Instruments Reaktor, and use a 24" Samsung SyncMaster B2430 as my main PC screen. I don't think it's going to make much difference... there are a lot of variables in what one sees, or perceives to see, on one's screen versus another's.

I've been gaming since 1981 on a Sinclair ZX81, and I used to use Deluxe Paint on the Amiga; I've got my Amiga 1200 set up alongside my i7 920 to this day. Super Stardust AGA still blows my mind on it. I need a converter cable to scale up to the 24" LCD screen size, though, as it's quite pixelly. :D
 
FWIW, I'm still running my i7 930. I originally had an AMD 6950 that I was using for gaming, but for the last 2-3 years it's really struggled with AAA titles, even at the lowest settings. I managed to get FO4 running OK, though.

Anyway, I did plan an upgrade just as the last Intel generation got released, but after seeing what they priced it at, and it still being stuck on the same nm process as the previous 3-4 generations, I wasn't overly keen on upgrading.

Ultimately I needed a new GPU; sometimes my old 6950 would black-screen and force me to restart. I ended up buying the 570 in the Black Friday sales for about £150.

Obviously my next concern was that the latest AAA titles list the minimum spec of the CPU to be around the 4th/5th gen, and I was worried that the CPU wouldn't keep up.

I'm still running with a 1080p monitor, but so far I've been able to run FO76 and BFV at max graphics without the system so much as skipping a beat.

I am still looking at a whole new system, but will wait to see what ryzen does, and whether intel release on a new nm process.

I might look at snapping up an X5660 on eBay for a bit of an upgrade - thanks to those pointing it out.


There's a video Wendell made with older OEM systems where they didn't POST with the RX 570 cards, so it's a bit of a gamble. My Athlon 64 3500 2.2 never had its HSF upgraded, and its system RAM never got beyond 3GB, which I considered quite a lot at the time in 2008. That's more than the dual-core 1.7GHz laptop I still have today.

In the case of the AMD Sapphire RX 570, which arrives in the next few days, I'm hoping that I'll not have to deal with updating the BIOS. Currently my i7 920 board is on BIOS version F9m, according to the CPU-Z utility.


--
As for image quality / calibration differences between AMD and Nvidia, that's something of an unknown at the moment, which I might not even notice.
 
As I'm awaiting delivery of the card (the email says Tuesday, DPD tracking says Wednesday; the latter is the real ETA, I imagine), I'm preparing what I need for it.

My current i7 setup is different from all the other computers I've had in that it supports two cards (normally intended for SLI/CrossFire), so I'm wondering whether I'll encounter any problems keeping the existing drivers for my current Nvidia card installed whilst installing the new AMD card?

Also, this isn't something I've ever thought about doing, because I didn't think I'd ever buy an AMD-based card, but to follow up on that question: has anyone tried mixing an AMD card with an Nvidia card in their system, and did they encounter any problems, other than possibly their PSU holding them back? It's not something I need to do given my current setup, but it would help answer the first question by identifying any conflicts I might have.

I'm guessing, from my own knowledge and understanding of PCs, that the video drivers would be independent of each other, so theoretically it should work, but I'm not 100% sure. Why would I want to keep the Nvidia drivers installed, you might ask? I guess it's more about convenience.

I've always uninstalled the driver if it wasn't for a replacement card of the same type in previous PC systems.
 
I really can't imagine Nvidia and AMD cards mixing well. And with your ancient Nvidia card, I'm not sure why you'd want to try and couple them together.

True, but I'm wondering more about the driver aspect, really... For example, should I have any issues with the AMD card, I'll have the Nvidia one as a backup, and thus I can just put it back in without having to reinstall its drivers again.
 