
What will monitors requiring two video cables mean for future graphics cards?

Discussion in 'Graphics Cards' started by Quartz, Feb 9, 2019.

  1. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    The Acer Nitro 4K requires two video cables for best performance. So does the Dell 8K monitor. There are only three DP ports on my RTX 2080 Ti, so I couldn't run both at the same time without some performance loss. Will requiring two cables become the standard? If so, what will that mean for future graphics cards?

    I've been waiting for GPUs to switch to triple slot for a while and changing to having six or eight outputs seems an obvious excuse to do so.

    Or will we see some new display output technology?
     
  2. Donnie Fisher

    Gangster

    Joined: Jun 22, 2018

    Posts: 298

    Location: Vegas baby !

    Why would you need triple-slot depth when you can already fit 4 outputs in a single slot, and 8 in a double?

    In general terms, though, cable standards seem to trend a little behind the curve of the GPU and the display. I.e. it takes a while for the cables to match the maximum throughput the other two are capable of.
     
  3. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    Because you need one slot for a vent.
     
  4. Confused Stu

    Mobster

    Joined: Dec 27, 2009

    Posts: 2,587

    Location: Gillingham, Kent

    In the early days of 4K, I remember some monitors requiring 2x HDMI 1.4 cables to run; then HDMI 2.0 came out with enough bandwidth to drive those screens through just one cable. To me it seems to be a constant race between cable standards and monitors over supporting particular resolutions. Mostly the cables are ahead, but occasionally a bleeding-edge monitor will exceed them and require two cables, and then the cable standards catch up and you're back to one monitor = one cable. Sometimes the cables are even ahead of the monitors - remember when DP 1.2 added daisy chaining, so you could run two 1080p displays from a single DP output on the PC?

    You've also got bleeding edge equipment that supports multiple outputs without needing to go wider than double slot - the 5870 back in 2009 had a special 'Eyefinity Edition' that had 6 outputs on it: https://www.amd.com/en-gb/products/graphics/desktop/5000/5870-eyefinity-6#

    I genuinely believe the standard will always be one screen means one cable, with only the top 1% needing two cables per screen, or even running two screens per cable.
     
  5. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    DisplayPort hasn't advanced since 2016.
     
  6. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 2,981

    HDMI 2.1 is coming out in full force later this year, which will solve these issues for anything up to and including 8K. It's going to be a loooong time until we'll need to move past that.
     
  7. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    Actually, no. HDMI 2.1 only has a bandwidth of 48 Gbps and will require compression to display 8K at 8 bpc and 60 Hz; without compression you only get 50 Hz. Which is fine for TV, but not for computer usage.
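    A rough back-of-envelope check bears this out. This sketch assumes HDMI 2.1's FRL link uses 16b/18b encoding and ignores blanking intervals entirely, so the real requirement is even a bit higher than the raw figure:

```python
# Does uncompressed 8K60 at 8 bpc fit in HDMI 2.1?
# Assumption: 16b/18b FRL line encoding; blanking overhead ignored.

H_PIXELS, V_PIXELS = 7680, 4320   # 8K UHD
REFRESH_HZ = 60
BITS_PER_PIXEL = 3 * 8            # RGB at 8 bits per channel

raw_gbps = H_PIXELS * V_PIXELS * REFRESH_HZ * BITS_PER_PIXEL / 1e9
effective_gbps = 48 * 16 / 18     # 48 Gbps link rate after 16b/18b encoding

print(f"raw video data:  {raw_gbps:.2f} Gbps")   # ~47.78 Gbps
print(f"usable HDMI 2.1: {effective_gbps:.2f} Gbps")  # ~42.67 Gbps
print("fits uncompressed?", raw_gbps <= effective_gbps)
```

    Even before adding blanking overhead, the raw pixel data alone exceeds the usable link rate, which is why compression (or a lower refresh rate) is needed.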
     
  8. pastymuncher

    Man of Honour

    Joined: Jul 12, 2005

    Posts: 17,667

    Location: Aberlour, NE Scotland

    @Quartz Did I see in another thread that you are using optical display cables? Is that monitor specific or can they be used with any monitor? Can you see any difference in image quality? Sorry for all the questions but I have never heard of optical display cables before.
     
  9. spoffle

    Capodecina

    Joined: Jul 4, 2012

    Posts: 16,211

    Not with DisplayPort over USB-C.
     
  10. spoffle

    Capodecina

    Joined: Jul 4, 2012

    Posts: 16,211

    Optical display cables aren't really a thing, but even so modern displays are digital. Using a different type of cable wouldn't grant better image quality.
     
  11. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 2,981

    It uses DSC, which is visually lossless, i.e. you won't be able to tell the difference. And that's good for 8K 120 too.
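    For what it's worth, the compression ratio DSC would need for 8K120 is fairly modest by its standards. A rough sketch, again assuming 16b/18b encoding on the 48 Gbps link and ignoring blanking (DSC is commonly quoted as running at up to around 3:1 for 8 bpc content):

```python
# Compression ratio needed to carry 8K120 at 8 bpc over HDMI 2.1.
# Assumption: 16b/18b FRL encoding; blanking intervals ignored.

raw_gbps = 7680 * 4320 * 120 * 24 / 1e9     # uncompressed 8K at 120 Hz, RGB 8 bpc
effective_gbps = 48 * 16 / 18               # usable HDMI 2.1 throughput

ratio_needed = raw_gbps / effective_gbps
print(f"compression ratio needed: {ratio_needed:.2f}:1")  # roughly 2.2:1
```

    That lands comfortably inside what DSC is designed for, which is why 8K120 over a single cable is feasible at all.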
     
  12. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    Yes, I have a 10m optical DisplayPort 1.4 cable. I had to go optical because copper cables weren't working over 3 metres (and can be twitchy at that length - I get occasional screen flashes on my 34" monitor) and I needed a 5m cable. The cable just has standard DisplayPort connectors so will work with any monitor. I believe the cable is directional - one plug is labelled 1 and the other 2 but I can't be bothered to test it. They are seriously expensive - £179 - but the one I received seems to work just fine. The picture quality seems to be perfect and the monitor's OSD says it's displaying up to 120 Hz. (I spent £1000 on the GPU and £900 on the monitor, so another £180 to get the two working at their best is relatively small.)

    It's too early to say, but this optical cable really could be the greatest cable since sliced bread.

    USB C has insufficient bandwidth - only 10 Gbps. USB 3.2 doubles that to 20 Gbps, but that's still not enough. And then there are cable length considerations. Note that Thunderbolt 3 cables are typically very short.

    ITYM 8K60. And the veracity of that claim has yet to be tested in practice. It may well be good enough for TV, but computer monitors are a different matter.
     
  13. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 2,981

    https://en.wikipedia.org/wiki/HDMI#Version_2.1

    It has already been tested - it's been shipping with DP 1.4 for a while now.

    https://www.youtube.com/watch?v=SUnqGL5j4ZU
     
  14. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    Now think about the amount of compression necessary to achieve that. Yes, it's going to be good for TVs, but PCs are another matter.
     
  15. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 2,981

    Idk why you keep repeating this line, and you could've seen the (lack of) difference in the first video anyway.

    https://youtu.be/dFbpcBuQg9s?t=272
     
  16. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    I did see it. And it wasn't an 8K or even 4K input, so it's not a relevant test. But for a computer image, the image has to be pixel-perfect. Imagine being a computer chip designer and placing traces.
     
  17. pastymuncher

    Man of Honour

    Joined: Jul 12, 2005

    Posts: 17,667

    Location: Aberlour, NE Scotland

    Many thanks for the detailed reply, Quartz. That is some serious money for a cable, although having spent as much as you have on your setup, I'd say it's a fair investment.
     
  18. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    It was a bit of a distress purchase. Hopefully OCUK will be able to supply such cables at lower cost.
     
  19. spoffle

    Capodecina

    Joined: Jul 4, 2012

    Posts: 16,211

    USB-C is a connector type; it doesn't denote the bandwidth at all. Given that the Thunderbolt 3 standard also works over USB-C with 40 Gbps of bandwidth, you're wrong there.

    You also missed the point of what I was saying. It's about the connector size and the ability to shoehorn loads of them into the space of a single PCIe slot.
     
  20. Quartz

    Sgarrista

    Joined: Apr 1, 2014

    Posts: 7,539

    Location: Aberdeen

    Actually, the USB 3.x standards, which use the USB-C connector, do specify the bandwidth.

    AIUI Thunderbolt 3 requires active cables.

    Indeed I did miss that point; you might have tried mentioning it straight off! Clearly I had not had sufficient tea.