
Best card with DVI output?

Associate · Joined 14 May 2012 · Posts: 53
Hi All,

I was thinking of building a new rig, but have parked that idea due to other financial needs :(

But I do have some cash for a new graphics card, mainly to play RDR2 (Doesn't need to run @ max settings)

It needs to have a DVI output though as I have a Korean 1440p monitor and DVI is the only input.

Current specs: i7 3770K @ 4.4 GHz, 8 GB RAM, GTX 780 3 GB.

Can spend up to £500 if necessary.

Thanks in advance :)
 
Whatever you get, double check your cable connections.

It's well used, but last night I needed a DVI connection, took the cap off the DVI output and found that the Asus 1050 Ti's DVI socket lacks the four little pins around the large spade pin.

I could not use my DVI cable and had to switch to the onboard GPU.
 
And one of those will work ok? Sorry for being a noob lol. I guess I could try with my existing card first :)
 
It should do. DisplayPort carries a DVI signal.

I am using a DisplayPort to active DVI adapter right now. Mine is an active adapter because my old Dell 3007 monitor needs a dual-link DVI input, as its resolution is higher at 2560x1600, and my GPU only has DisplayPort outputs. But I think you would probably be fine with a standard passive DisplayPort to DVI adapter at only 1440p. However, it's worth waiting for someone else to confirm a passive adapter will work before getting the GPU.
 

Have you considered selling your monitor and spending ~£400 on both a GPU and a monitor?
You would still have change left at the end :)
£220 32" 2560x1440 Freesync monitor
https://www.overclockers.co.uk/aoc-...reesync-widescreen-led-monitor-mo-04m-ao.html

£176
https://www.overclockers.co.uk/powe...ddr5-pci-express-graphics-card-gx-197-pc.html

~£400

Otherwise, if you want to stick to a graphics card only, both the 5700 XT and 2070S will serve you well. Pick whichever is cheapest.
 
If you have one of the high-Hz Korean 1440p monitors like the QNIX QX2710 etc., then nothing but dual-link DVI will work.

I know the GTX 1080 works and was a massive upgrade from the 780 I had before.
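A rough back-of-the-envelope check shows why dual-link is needed (illustrative blanking figures, not an exact CVT calculation): single-link DVI tops out at a 165 MHz TMDS pixel clock, and 2560x1440 at 60 Hz needs well over that.

```python
# Rough check of whether a video mode fits single-link DVI.
# Blanking figures below are approximate CVT-RB-style values, for illustration.
SINGLE_LINK_MAX_MHZ = 165  # single-link DVI TMDS pixel-clock limit

def approx_pixel_clock_mhz(h, v, hz, h_blank=160, v_blank=41):
    """Approximate pixel clock: total pixels per frame times refresh rate."""
    return (h + h_blank) * (v + v_blank) * hz / 1e6

clk = approx_pixel_clock_mhz(2560, 1440, 60)
print(f"2560x1440@60 needs ~{clk:.0f} MHz -> "
      f"{'dual-link' if clk > SINGLE_LINK_MAX_MHZ else 'single-link'} DVI")
```

That works out to roughly 240 MHz, well past the single-link limit, which is also why cheap passive DisplayPort adapters (single-link only) fail on these monitors.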
 
If you don't want to replace the monitor then some 2070 models have DVI output, not sure if anything better does.
My 1080 Ti doesn't have the DVI port and doesn't work with any cheap passive adaptor on my old Korean 1440p. 1080 Tis with DVI do exist though :)
 
Thanks for all your help guys. I bit the bullet and got the 2060 super. I got the "advanced" one, not sold here unfortunately, but it has higher clocks.

I also got 16gb of ram off fleabay for £50.

Red Dead 2 runs very well @ 1440p with a mix of high and ultra settings. Massive improvement over the 780!

And the 3770k isn't bottlenecking it from what I can see.

Happy camper :)
 

Those pins are used to transmit an analogue signal over a DVI cable. The 1050 Ti does not have analogue output, and if the monitor has a DVI input, it's highly unlikely it's an analogue-only DVI input. It will work fine unless you're connecting via VGA at the monitor end.
 
Lol. It's impossible for it to work, as the cable has pins where the GPU's socket has no holes.
 
Those pins are not used for a digital connection, so it doesn't matter.

If the issue is that the connector on the card does not have those pin holes cut out at all, simply replace the cable with a dual-link DVI-D cable. They cost about a fiver.
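As a toy illustration of the mechanical point (the pin names are real DVI pins, but the code is just a sketch, not any real API): a DVI-I plug carries four extra analogue pins (C1-C4) around the flat spade blade, and a DVI-D socket has no holes for them, so the plug physically won't seat. A DVI-D cable has no analogue pins, so it fits either socket.

```python
# Toy model of DVI plug/socket compatibility. The set holds the analogue
# pins a cable carries (or the holes a socket provides) beyond the digital set.
CABLE_PINS = {"DVI-D": set(), "DVI-I": {"C1", "C2", "C3", "C4"}}

def fits(cable: str, socket: str) -> bool:
    """A cable seats only if the socket has holes for all its analogue pins."""
    return CABLE_PINS[cable] <= CABLE_PINS[socket]

print(fits("DVI-I", "DVI-D"))  # DVI-I cable in DVI-D socket: won't seat
print(fits("DVI-D", "DVI-I"))  # DVI-D cable fits either socket
```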
 
Lol.

That was the point I was making with my original post, as that change had caught me unawares.

I am not using the 1050 Ti with a DVI cable, but while building my lad's PC on the dining room table I was using a DVI cable and monitor.

If I had known that when ordering the card, I would have ordered a cable.
 