Why would you use any monitor connection that wasn't DisplayPort?

I'm using DisplayPort because my monitor doesn't support anything else and a cable was included in the box. Previously I was using DVI-D because a cable was included in the box. Why go to the extra effort of sourcing a DisplayPort cable instead of using whatever came with your monitor if you'll get no benefit from it? As long as it gets the job done, it doesn't matter at all.
 
I use DVI-D.

My 980 Ti has HDMI and DP as well.

Problem is my 120Hz monitor doesn't.

Are you advising me to use HDMI instead? So I can only achieve 60Hz instead of 120Hz?

Are you actually asking a question, or making a statement without really having the knowledge?

I also have an old SyncMaster... that's VGA...
 
I use DVI as that's what my ancient 2006 monitor uses, so I hope they don't drop that port on the next iteration of cards. I'm currently using the DP port for a VGA adaptor, as my old Samsung 46" HDTV (still useful) only allows its native panel res (1366x768) over the VGA port; the HDMI port only allows 720p or 1080i, causing scaling issues.

Ideally I would have at least two DisplayPorts so I can run the adaptor and a DP monitor, as I would like to upgrade someday to a 27" FreeSync 120Hz VA screen, should one come into existence at an affordable price.

I think monitor pricing and even display tech has stalled or gone a bit backwards, since my current BenQ 241W is still a good screen (24" 1200p) that only cost £399 in 2008 (actually that was quite a lot of coin back then), but it needs replacing due to excessive image retention and there just does not seem to be a good screen at that £399 price point. (A TN panel is not acceptable at this price point.)
 
I use DVI as that's what my ancient 2006 monitor uses, so I hope they don't drop that port on the next iteration of cards. I'm currently using the DP port for a VGA adaptor, as my old Samsung 46" HDTV (still useful) only allows its native panel res (1366x768) over the VGA port; the HDMI port only allows 720p or 1080i, causing scaling issues.

You sure it's the TV at fault here? I've had exactly the same trouble with the HDMI port on an HD 7950, an R9 290 and now a Fury X. It'll only spit out 720p or 1080i. Previously I used a DVI-HDMI dongle, but as the Fury has done away with DVI I had to cough up for a DP-HDMI dongle. All to get 1080p out of the thing.

No-one seems to have any explanation as to why the default HDMI won't do 1080p but the other ports will.
 
I use DVI, simply because it's what came in the box with the monitor. Can't really see what benefit DisplayPort is going to give me and I'm not even sure if my graphics card has it.
 
You sure it's the TV at fault here? I've had exactly the same trouble with the HDMI port on an HD 7950, an R9 290 and now a Fury X. It'll only spit out 720p or 1080i. Previously I used a DVI-HDMI dongle, but as the Fury has done away with DVI I had to cough up for a DP-HDMI dongle. All to get 1080p out of the thing.

No-one seems to have any explanation as to why the default HDMI won't do 1080p but the other ports will.

On older drivers there were some tweaked graphics modes available for HDMI, but they still scaled weirdly on the TV. I have used a DVI-HDMI adaptor, but it gave the same result as the HDMI port. The TV just seems to work better in PC mode over VGA than HDMI. DVI doesn't offer VGA compatibility any more, so I have to use a VGA adaptor on the DP port.
 
Have to use DL-DVI with my Qnix Q270 OC @ 120Hz; no other input on it.

Will only switch to DP when GPUs & monitors have & require DP 1.3 for a 4K, 120+Hz and/or HDR screen.
 
Blackjack Davy, MadMatty:

The problem arises from the fact that the HDMI standard in TVs (at least at the time) didn't allow many custom resolutions such as 1366x768. Also, if the TV's own panel is 1366x768, then it doesn't matter what connector or graphics card you use: you can never display a native 1920x1080. And very few HD Ready (as opposed to 1080p Full HD) televisions offered 1:1 pixel mapping, so even 720p was not possible without upscaling (roughly 1:1.067). In the end, the TV had to fit either a 1080i/p or 720p signal to another resolution, resulting in a blurry image. It affected all 1366x768 HD Ready televisions, but only when you used HDMI; VGA was OK.

Ps. The manufacturers' choice of using 1366x768 panels instead of 1280x720 panels was, quite frankly, idiotic.
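
If anyone wants the arithmetic behind that, here's a rough sketch (just illustrative Python, not anything from a driver or the TV's firmware):

```python
# Why a 1366x768 panel can never show a 720p or 1080p HDMI signal 1:1:
# the scale factor is never exactly 1.0 in either direction.
signals = {"720p": (1280, 720), "1080p": (1920, 1080)}
panel_w, panel_h = 1366, 768

for name, (w, h) in signals.items():
    print(f"{name}: horizontal scale {panel_w / w:.3f}, vertical scale {panel_h / h:.3f}")

# 720p:  horizontal scale 1.067, vertical scale 1.067 -> every pixel gets resampled (blur)
# 1080p: horizontal scale 0.711, vertical scale 0.711 -> downscaled, also blurry
```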
 
Wow. You accuse me of having more money than sense when a DisplayPort cable costs less than £10. I'm not rich but even I can afford £10.

Be wary of cheap DP cables. I bought one from Amazon for £2 or so, and had enormous problems with it.

Monitor would not wake from sleep whilst using it (Win7). DVI worked fine.

Then I upgraded to Win10, and now the monitor won't wake from sleep with either. That's progress!
 
Blackjack Davy, MadMatty:

The problem arises from the fact that the HDMI standard in TVs (at least at the time) didn't allow many custom resolutions such as 1366x768. Also, if the TV's own panel is 1366x768, then it doesn't matter what connector or graphics card you use: you can never display a native 1920x1080. And very few HD Ready (as opposed to 1080p Full HD) televisions offered 1:1 pixel mapping, so even 720p was not possible without upscaling (roughly 1:1.067). In the end, the TV had to fit either a 1080i/p or 720p signal to another resolution, resulting in a blurry image. It affected all 1366x768 HD Ready televisions, but only when you used HDMI; VGA was OK.

Ps. The manufacturers' choice of using 1366x768 panels instead of 1280x720 panels was, quite frankly, idiotic.

It's a Sony Bravia 40" Full HD/1080p HDTV. (Well, mine is anyway; I've no idea about Matty's.)

(P.S. 1366x768 seems to be common: my mother's 32" has that native resolution (it's not advertised as "HD Ready" though), and it's unrelated to the issues I have :p)
 
@Blackjack Davy:

Like MadMatty said, he has a 1366x768 native panel, so that's definitely the issue he's having. And there simply is no way he can display 1080p (or 720p) on it without scaling. The best he can get is 1366x768, and with HDTVs, even that is usually not possible over HDMI.

But, if you have problems with a real native 1080p (1920x1080) HDTV, then that's something to be worried about.

But a couple of notes, just to be sure:

Note1:
Make sure that the Sony is indeed a 1080p HDTV, not just one that SUPPORTS 1080p at the input level. I had a Panasonic HD Ready (1366x768) HDTV which also supported 1080p. It couldn't show it natively, of course, but it would still try to reproduce the signal as well as it could. It's very probable that your mother's TV also supports a 1080p signal.

Note2:
With AMD GPUs and APUs, the default setting for an HDMI signal to an HDTV is to underscan the image by 10%. This usually results in black bars around the image. If this is the case, then you should always change it back to 0%. You can find the setting in the CCC.

I'm not sure how the "handshake" works with DP-HDMI, but it's possible that the GPU doesn't treat the signals going through the DP port the same way as with HDMI, and that's why it defaults the overscan/underscan to 0% with DP ports.

This shouldn't affect the HDTV identifying the input resolution, though. It should still be the same "1080p".
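
Just to put a number on that 10% (a quick sketch only; I'm assuming the percentage shrinks both axes linearly, which is how I understand the CCC slider to work):

```python
# Rough effect of the default 10% underscan on a 1080p desktop
# (assumption: the percentage is applied linearly to both axes).
native_w, native_h = 1920, 1080
underscan = 0.10

drawn_w = round(native_w * (1 - underscan))  # 1728
drawn_h = round(native_h * (1 - underscan))  # 972

print(f"Image drawn at {drawn_w}x{drawn_h} inside the {native_w}x{native_h} panel")
print(f"Borders: {(native_w - drawn_w) // 2}px left/right, {(native_h - drawn_h) // 2}px top/bottom")
# -> 1728x972 with 96px/54px black borders, and nothing is mapped 1:1 any more.
```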

Note3:
For a quick test, try this:
http://www.lagom.nl/lcd-test/clock_phase.php

If you can see a solid grey color (in reality it's a black-and-white checkerboard pattern), then you're probably operating at the native resolution. But if you can see banding, unevenness, etc., then you're probably not operating at the native resolution.
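
(And if you'd rather test offline than use the lagom page, a home-made equivalent is easy enough. This is just a hypothetical little script, assuming you have Pillow installed; it's not anything from the lagom site:)

```python
# Writes a full-screen 1-pixel black/white checkerboard. Shown unscaled at the
# native resolution it blends into a uniform grey; any scaling produces bands/moire.
from PIL import Image  # assumes the Pillow package is installed

def checkerboard(width=1920, height=1080, path="checker.png"):
    img = Image.new("L", (width, height))
    px = img.load()
    for y in range(height):
        for x in range(width):
            px[x, y] = 255 if (x + y) % 2 else 0  # alternate black/white every pixel
    img.save(path)

checkerboard()  # then open checker.png full-screen with no scaling
```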

And with regards to the 1366x768 being common, just to make sure:
There are indeed WAY more 1366x768 televisions than there are 1280x720 televisions. Actually, you'd probably have a very hard time finding a real native 1280x720 HDTV; they are VERY rare.
 
I am one of those people who utilize HDMI 2.0 instead of DisplayPort 1.2, the sole reason being that I need around 4m of cable length to connect my PC to the monitor.

I tried using a 5m DisplayPort cable, but it was prone to interference and caused artefacts and driver crashes (yes, I switched out the motherboard and graphics card before realising it was the fault of the cable). So I'm going to stick with a quality HDMI cable before trying DisplayPort again.
 
But, if you have problems with a real native 1080p (1920x1080) HDTV, then that's something to be worried about.

But a couple of notes, just to be sure:

Note1:
Make sure that the Sony is indeed a 1080p HDTV, not just one that SUPPORTS 1080p at the input level.
Sony KDL-40X3000. If it's not native 1920x1080 resolution then it's news to me, as it was certainly advertised as Full HD and not HD Ready, and the display specs state it is. If it's not, then I want my money back! :p

Note2:
With AMD GPUs and APUs, the default setting for an HDMI signal to an HDTV is to underscan the image by 10%. This usually results in black bars around the image. If this is the case, then you should always change it back to 0%. You can find the setting in the CCC.

With the latest drivers (Crimson) they seem to have changed that to 0% by default (no black bars). The underscan was a bit silly, honestly.
 
@Blackjack Davy:

The KDL-40X3000 should indeed be a genuine 1920x1080 HDTV.

And just to make sure:
You still can't get 1080p through the HDMI-HDMI connection, from the Fury X to the Sony?

Have you tried with another monitor/TV? I assume you have tried the different HDMI ports on the TV, and changing the resolution both from Windows' own resolution utility and from AMD's? Maybe even tried another HDMI-HDMI cable?

Also, did you already try the http://www.lagom.nl/lcd-test/clock_phase.php link, on both the DP-HDMI and the HDMI-HDMI connections? Does it look OK on either one (solid grey)?

Because for HDMI not to be able to support 1080p is very odd. 1080p is basically the cornerstone of the whole HDMI technology. It's pretty much the first resolution any manufacturer would look into, and there is no viable reason why it would be an unsupported resolution.

Ps. Just as a side note, were you aware of this?

Ps2. Actually, one thing occurred to me: have you tried 1920x1080@24Hz? The spec sheet indeed only states 1920x1080 and 1080p, so it's possible that it only supports them at 24Hz. Very far-fetched, though, especially if the DP-HDMI is working correctly. Worth a try, in any case.
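
For what it's worth, the standard CEA timings would be consistent with a 24Hz-only quirk: 720p60, 1080i60 and 1080p24 all fit in the same ~74 MHz pixel clock, while 1080p60 needs double that. A quick back-of-the-envelope sketch (just the well-known total frame sizes, nothing measured from your kit):

```python
# Pixel clock = total pixels per frame (active + blanking) x frames per second.
# Totals below are the standard CEA-861 values for these modes.
modes = {
    "720p60":  (1650, 750, 60),
    "1080i60": (2200, 1125, 30),  # interlaced: 30 full frames per second
    "1080p24": (2750, 1125, 24),
    "1080p60": (2200, 1125, 60),
}
for name, (h_total, v_total, fps) in modes.items():
    print(f"{name}: {h_total * v_total * fps / 1e6:.2f} MHz")

# 720p60 / 1080i60 / 1080p24 all come out at 74.25 MHz; 1080p60 needs 148.5 MHz,
# so a source or sink stuck at the lower clock only offers the first three.
```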
 
I mean, all modern GPUs have at least one DisplayPort connection (and normally more than one; these might be full-sized DisplayPort or Mini DisplayPort, but there really isn't much of a difference between them except that you might need an adaptor for Mini DisplayPort), and all modern monitors have a DisplayPort connection. So why would you choose to use HDMI when DisplayPort is the technically superior connection?
In what way is it superior when using a standard 1080p monitor? For 4K, yes, you'll need DisplayPort, but for anything below 1440p it does not have any advantages at all.
I can understand people who want to also plug a console (or consoles via an HDMI hub) into their monitor but that shouldn't stop them using DisplayPort for connecting to their actual PC.
Again, why? What is the advantage, really? Display quality will be equal.
DisplayPort has also supported full 4K at 60Hz for way longer than HDMI has, and even modern AMD GPUs don't come with HDMI 2.0, so you are pretty much forced to use DisplayPort there anyway.
If you use 4K 60Hz, yes. If you use 1080p, or even 1440p 60Hz, you will normally be fine with HDMI.
And for people with high-end professional monitors like my main one with 10-bit colour (or even higher), DisplayPort is the only option that actually supports the higher colour quality.
True, but not everyone has a 10-bit monitor. You are in the minority.
So are there actually any reasons why someone would use HDMI (or even, god forbid, DVI-I/DVI-D, or even worse, VGA) over DisplayPort? Because from where I am standing, if you have a choice of DisplayPort or something else, the choice is obviously going to be DisplayPort.
God forbid DVI-I/DVI-D? What is wrong with DVI? DVI is awesome, even for 1080p at 144Hz without Adaptive Sync... In fact, I would say DVI-D is the least problematic connection of them all.

For example:
VGA: quality issues, adjustment needed
DVI: just works, period
HDMI: sometimes overscan issues
DisplayPort: cable quality and short length are extremely important, and good certified cables are expensive. Disconnects from the system when in sleep, causing issues with dual-monitor setups; compatibility issues between monitors and video cards (sleep issues, wake issues, disconnect issues, etc.)

If this topic was about VGA I would agree with you 100%, but DVI and HDMI are fine for the majority of monitors sold and DisplayPort has NO advantage at all. In fact, I would say it possibly causes more issues than needed.

My advice:
Only use DisplayPort when absolutely needed, for things like: Adaptive Sync, 4K 60Hz, 1440p 144Hz, or 10-bit color depth. Otherwise, there is no need. Save yourself some money.
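
To put some rough numbers on that advice (uncompressed active pixels only, blanking ignored, and the link figures are approximate usable rates after encoding overhead; a sketch, not gospel):

```python
# Crude comparison of uncompressed video data rates vs approximate usable
# link bandwidth (after 8b/10b overhead); blanking intervals are ignored.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

modes = {
    "1080p 60Hz":     gbps(1920, 1080, 60),
    "1080p 144Hz":    gbps(1920, 1080, 144),
    "1440p 144Hz":    gbps(2560, 1440, 144),
    "4K 60Hz":        gbps(3840, 2160, 60),
    "4K 60Hz 10-bit": gbps(3840, 2160, 60, bpp=30),
}
links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.2": 17.28}

for mode, need in modes.items():
    fits = [name for name, cap in links.items() if need <= cap]
    print(f"{mode}: ~{need:.1f} Gbps -> fits on: {', '.join(fits) or 'none of these'}")

# 1080p (even 144Hz) squeezes into HDMI 1.4; 1440p 144Hz and 4K 60Hz need
# HDMI 2.0 or DP 1.2; 4K 60Hz at 10-bit only fits DP 1.2 of the three.
```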
 