
Why is there still a DVI connector on Nvidia reference cards?

I use DVI and wouldn't be interested in using any kind of adapter. Once you start adding adapters you run a greater risk of signal degradation to the screen, which is not something I would want to suffer as I use my PC for photography.
 
I use DVI only because I have a DVI port. If I didn't, I'd just use DP.

I agree with you, OP; I posted the same in another thread. If it didn't have the DVI port, the exhaust opening would have been around half as big again.

Unfortunately, there are way too many native DVI monitors out there; people still want DVI ports and don't want to use adapters.
 
That's one that's capable of VGA output. A regular DP-to-DVI adapter is much, much cheaper and included with some cards; an HDMI-to-DVI adapter costs pennies and is included with even more cards.

The cheaper DP/mDP-to-DVI adapters produce a single-link DVI output, which tops out around 1920x1200@60Hz (from memory). To go to 1440p or above, or 120/144Hz, you need a dual-link DVI output, which is where the adapters get pricey (£70 rather than £25).

HDMI before 2.0 tops out around the same resolution/refresh, but I don't know if there are any cheap HDMI 2.0 to DVI adapters out there that go higher; I've never looked as I've never needed one. :)
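To put rough numbers on that, here's a back-of-the-envelope sketch (assuming CVT reduced-blanking-style timings, which are only approximate; exact figures vary per monitor) of why 1920x1200@60Hz squeezes under the single-link 165 MHz TMDS ceiling while 1440p or higher refresh rates need dual-link:

```python
# Rough sketch of why single-link DVI tops out around 1920x1200@60Hz.
# Assumes CVT reduced-blanking-style timings (roughly 160 extra horizontal
# pixels and ~3% extra vertical lines) -- ballpark figures only, real
# monitor timings vary.

SINGLE_LINK_MHZ = 165.0   # single-link DVI TMDS clock ceiling
DUAL_LINK_MHZ = 330.0     # dual-link roughly doubles the usable bandwidth

def approx_pixel_clock_mhz(h, v, refresh_hz):
    h_total = h + 160          # rough horizontal total including blanking
    v_total = int(v * 1.03)    # rough vertical total including blanking
    return h_total * v_total * refresh_hz / 1e6

for h, v, hz in [(1920, 1200, 60), (2560, 1440, 60), (1920, 1080, 120)]:
    clk = approx_pixel_clock_mhz(h, v, hz)
    if clk <= SINGLE_LINK_MHZ:
        link = "fits single-link DVI"
    elif clk <= DUAL_LINK_MHZ:
        link = "needs dual-link DVI"
    else:
        link = "beyond dual-link DVI"
    print(f"{h}x{v}@{hz}Hz ~ {clk:.0f} MHz -> {link}")
```

Run as-is it just prints each mode's approximate pixel clock and which DVI link it would need; the cut-off lines up with the cheap single-link versus pricey dual-link adapter split mentioned above.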
 
Also you're back to horrible, badly secured, easily breakable HDMI rather than nice, sturdy, well-affixed DVI :p

I put my case on its side to gain access to the right side, so the DVI cable gets twisted. I ended up replacing a monitor unnecessarily because the cable broke. I still do the same thing.

The way I see it, the snobby people are looking for a forum which doesn't have the word overclocker in it, because snobby people can just buy the most expensive kit and forget about it, while the overclocker gets value from a bit of tinkering. I was told by a snob here that my card wasn't worth overclocking because it wasn't as expensive as my cooling solution. Looking at his sig, it was a case of all the gear and no idea.
 
I'm curious to know what the DVI holdouts think should be the cutoff for dropping the port... 2017? 2020??

Funny you should mention that, I've been wondering this myself. Do you think by removing the DVI port they could extend the exhaust area allowing more heat to be dispersed?
 
They appear to have done that with Fiji going by the render that is floating about.

That could be good for AMD, but I'd like to see the green team's reference coolers do the same. Not sure what would be involved, but two buying options, with and without the DVI, would be nice.

I would imagine the temps would be lower as the airflow would be better!
 
In the interests of full disclosure, given I've been pro-DVI in this thread so far, I should probably mention that I don't actually use it on any of my own machines. However, whenever I'm advising someone who isn't very computer-savvy about connecting stuff up, I'll recommend it if possible, as it'll be easy and problem-free.
 
If my card didn't have the DVI ports I'd happily just swap over to DP/HDMI, but while it still does I'm not going to, because that's money in the bank!
 
Is that DVI going to be a pain for watercooling? If the DVI wasn't there, and in theory if there was no cooler (or a very thin waterblock), it could plausibly take up only one slot, which would be nice IMO.

I guess it is only there to be converted to VGA, but I would have thought that if you are buying a card that powerful, you will more than likely have a monitor that has DP or DVI. I see no need for people to use analogue any more, unless your monitor doesn't support digital inputs.
 
To be fair, DP is pretty straightforward, if not easier. You don't have fiddly screws and it's just as secure with the clip!

If removing the DVI did mean a single-slot card, could we see micro/ITX boards with more PCIe slots?
 
Yeah, DP is my second choice. The main reason I wouldn't pick it for people who I think are likely to mess up is actually really silly, but it's because of daisy chaining... I've now had several people say they can't get it to work, and on checking in person (after asking them to check themselves) I've discovered they're trying to use the DP out as a DP in...

That, and in the very early days there were occasional standby issues etc. with DP, though I think they've basically all been ironed out.
 