
Intel back FreeSync and Vulkan, G-Sync meeting its deserved end.

Caporegime
Joined
18 Oct 2002
Posts
29,865
Also, the thing is, if Nvidia can't drastically reduce the pricing of their new HDR G-Sync modules (never mind the need for active cooling!!) then it's going to open up an even wider pricing gap between G-Sync and FreeSync monitors (which I don't think the wider market will stomach).
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
But I was talking about a 4K monitor mate ;)

For anything lower there are great FreeSync choices, to be fair :)

Personally I can't stand 1080p now that I have been on 4K for over 4 years; I can see the pixels very clearly. Screen door effect on a monitor? No thanks :p
Thank god my eyes are a bit messed up. As sad as it is, it's true: that's why I don't wear my glasses if I don't really need them. I'm at 1.15 and 1.5 from, well, hundreds of thousands of hours of screen use.
 
Soldato
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
Tbh on a TV 1080p is fine, probably less so with a monitor since you sit so close.

You only need to see comparisons of films on VHS vs 1080p Blu-ray to see how clear 1080p looks; it's strange how people seem to bash it when really there's nothing wrong with it. Some people make out as if 1080p is completely terrible, but then I don't know how they managed with DVD. Might as well have bought a disc and thrown it straight in the bin, right?
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
Also, the thing is, if Nvidia can't drastically reduce the pricing of their new HDR G-Sync modules (never mind the need for active cooling!!) then it's going to open up an even wider pricing gap between G-Sync and FreeSync monitors (which I don't think the wider market will stomach).

Nvidia cannot reduce the price of the HDR G-Sync module much, since they have to buy the parts to make it.
The FPGA that retails for $2,600 costs them, at BEST, around $500, and the other board parts another $250+.
So the absolute minimum that module costs them is $750 to $900, once you add the cost of putting the parts together, design time and profit.

And before someone points to the news articles saying the module costs $500: if you follow the original article and videos, it is only the FPGA itself that is quoted at $500, and that is a best-case price estimate assuming Nvidia buys them in batches of thousands.
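
For what it's worth, here's that back-of-the-envelope maths written out as a quick sketch; every figure in it is the estimate from this post (best-case bulk pricing), not confirmed BOM data:

```python
# Rough sketch of the cost estimate above. All numbers are the poster's
# best-case estimates, not confirmed BOM figures.

fpga_bulk_cost = 500            # FPGA that retails at ~$2,600, assumed ~$500 in bulk
other_board_parts = 250         # remaining board components, estimated minimum
assembly_design_profit = 150    # assumed extra for assembly, design time and profit

floor = fpga_bulk_cost + other_board_parts
ceiling = floor + assembly_design_profit
print(f"Estimated cost of the HDR G-Sync module to Nvidia: ${floor}-${ceiling}")
# -> Estimated cost of the HDR G-Sync module to Nvidia: $750-$900
```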
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,582
Location
Greater London
Tbh on a TV 1080p is fine, probably less so with a monitor since you sit so close.

You only need to see comparisons of films on VHS vs 1080p Blu-ray to see how clear 1080p looks; it's strange how people seem to bash it when really there's nothing wrong with it. Some people make out as if 1080p is completely terrible, but then I don't know how they managed with DVD. Might as well have bought a disc and thrown it straight in the bin, right?

It is different on a monitor vs a TV. I have a 1080p 50" plasma TV which I am happy with; I sit far enough away that I do not see the pixels firing, which surprisingly is not that far. But I sit close to monitors, where I can see the individual pixels on a 1080p screen. Before, I actually did not even notice them. But once you have seen the clarity a 4K monitor offers and you go back to 1080p, you can see a huge difference, and it is off-putting afterwards, unfortunately, as once you see the pixel grid, you cannot un-see it.
 
Soldato
Joined
22 Nov 2003
Posts
2,933
Location
Cardiff
The nail in the coffin for G-Sync, for me personally, is the price of its upcoming high-performance HDR modules, coupled with the fact that they need to be actively cooled.

As screens become faster and resolutions increase beyond 4K, these modules are going to become more and more expensive as they require more horsepower to function, which in turn means they will get hotter and hotter.

If G-Sync survives, I can guarantee that within 5 years we will see some form of water-cooled G-Sync module :p
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
The nail in the coffin for G-Sync, for me personally, is the price of its upcoming high-performance HDR modules, coupled with the fact that they need to be actively cooled.

As screens become faster and resolutions increase beyond 4K, these modules are going to become more and more expensive as they require more horsepower to function, which in turn means they will get hotter and hotter.

If G-Sync survives, I can guarantee that within 5 years we will see some form of water-cooled G-Sync module :p


I think the price of the module has to be put in perspective.
How many 4K HDR, 10-bit, 144Hz displays exist, and how many of those can use FreeSync?

The G-Sync chip currently has to be very customised, and the screens are bleeding edge. Asus, who are notoriously expensive, sell this for $2,000, and both Asus and Nvidia are likely making very healthy profits. The few leading producers have an open market.

Nvidia will almost certainly move to some kind of custom ASIC long term. The FPGA solution enabled rapid deployment and is easily modified for changing HDR standards and technology, but likely costs a few hundred dollars. Once standards are better defined and things stabilise, an ASIC could likely do this for $100, and long term it will only get cheaper.

FreeSync 2 HDR monitors of the same calibre still seem some way away, so accurate price comparisons are hard. And FreeSync 2 seems to have some limitations: it requires active development from game developers, and monitor manufacturers have to provide accurate calibration data. It's also not clear what happens if you buy a newer FreeSync 2 monitor that a game doesn't support.

IF adaptive sync technology can really handle 4K 10-bit HDR at massively reduced cost without major issues, then I don't see why Nvidia wouldn't switch in the future.
 
Associate
Joined
22 Jul 2004
Posts
1,332
Tbh on a TV 1080p is fine, probably less so with a monitor since you sit so close.

You only need to see comparisons of films on VHS vs 1080p Blu-ray to see how clear 1080p looks; it's strange how people seem to bash it when really there's nothing wrong with it. Some people make out as if 1080p is completely terrible, but then I don't know how they managed with DVD. Might as well have bought a disc and thrown it straight in the bin, right?
DVD is fine for TVs under 32" assuming a high bitrate, and 1080p is fine under 50", but I would say screens of 55" and above benefit most from 4K, although people will still notice the extra sharpness of 4K below 55" too.
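
Those size/distance cut-offs line up with the usual rule of thumb that a pixel stops being resolvable once it subtends less than about one arcminute of visual angle. A rough sketch under that assumption (20/20 acuity, square pixels, 16:9 panels; the specific panel sizes are just illustrative):

```python
import math

# Rough sketch: distance beyond which a panel's pixel grid blends together
# for 20/20 vision, using the common ~1 arcminute-per-pixel rule of thumb.
# Assumes square pixels and 16:9 panels.

ONE_ARCMIN = math.radians(1 / 60)

def blend_distance_inches(diagonal_in, horizontal_res):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from its diagonal
    pixel_pitch_in = width_in / horizontal_res
    return pixel_pitch_in / math.tan(ONE_ARCMIN)      # distance where 1 px spans ~1 arcminute

for diag, res, label in [(50, 1920, '50" 1080p TV'),
                         (24, 1920, '24" 1080p monitor'),
                         (27, 3840, '27" 4K monitor')]:
    d = blend_distance_inches(diag, res)
    print(f'{label}: pixels stop being visible beyond ~{d / 12:.1f} ft')
```

Which roughly matches the thread: a 50" 1080p TV looks clean from ~6-7 ft or more, a 24" 1080p panel at arm's length doesn't, and a 27" 4K panel does.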
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,582
Location
Greater London
DVD is fine for TVs under 32" assuming a high bitrate, and 1080p is fine under 50", but I would say screens of 55" and above benefit most from 4K, although people will still notice the extra sharpness of 4K below 55" too.
The thing is, this all started when I was talking about a monitor, which we sit close to. Somehow we moved on to TVs and DVDs... lol.

I think the price of the module has to be put in perspective.
How many 4K HDR, 10-bit, 144Hz displays exist, and how many of those can use FreeSync?

The G-Sync chip currently has to be very customised, and the screens are bleeding edge. Asus, who are notoriously expensive, sell this for $2,000, and both Asus and Nvidia are likely making very healthy profits. The few leading producers have an open market.

Nvidia will almost certainly move to some kind of custom ASIC long term. The FPGA solution enabled rapid deployment and is easily modified for changing HDR standards and technology, but likely costs a few hundred dollars. Once standards are better defined and things stabilise, an ASIC could likely do this for $100, and long term it will only get cheaper.

FreeSync 2 HDR monitors of the same calibre still seem some way away, so accurate price comparisons are hard. And FreeSync 2 seems to have some limitations: it requires active development from game developers, and monitor manufacturers have to provide accurate calibration data. It's also not clear what happens if you buy a newer FreeSync 2 monitor that a game doesn't support.

IF adaptive sync technology can really handle 4K 10-bit HDR at massively reduced cost without major issues, then I don't see why Nvidia wouldn't switch in the future.

Good point.
 
Soldato
Joined
6 Jan 2013
Posts
21,849
Location
Rollergirl
I've got a 43" 4K TV in my office that's hooked up to a Sky box, so it's only receiving an HD signal. I thought it looked fine until I switched from the BBC coverage of a World Cup game over to the iPlayer 4K coverage. The difference in clarity was noticeable.

I'd wager that most people think there's no improvement to see until they actually see it, if that makes sense.
 
Associate
Joined
23 Jun 2018
Posts
347
Location
Close to the sea, UK
It is different on a monitor vs a TV. I have a 1080p 50" plasma TV which I am happy with; I sit far enough away that I do not see the pixels firing, which surprisingly is not that far. But I sit close to monitors, where I can see the individual pixels on a 1080p screen. Before, I actually did not even notice them. But once you have seen the clarity a 4K monitor offers and you go back to 1080p, you can see a huge difference, and it is off-putting afterwards, unfortunately, as once you see the pixel grid, you cannot un-see it.

Agree with this. My 50" 1080p TV looks fine from 8-10ft away, but I moved to a 4K monitor for desktop real estate, and when I now look at my 24" 1080p monitors at work I can't help but notice the difference in clarity; it really is a huge difference.

Same with games, the difference is enormous. Pin-sharp clarity vs smudgy pixellation (for me at least).
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I'm sure Nvidia would license Gsync.

I'm shocked you'd even write that; you know perfectly well it'd never happen. I'm pretty sure Tom Petersen has said as much in the past.
Nvidia tout G-Sync as the superior adaptive sync tech and a big part of what makes PC gaming with Nvidia "the way games are meant to be played".
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
How many 4K HDR, 10-bit, 144Hz displays exist, and how many of those can use FreeSync?

The G-Sync chip currently has to be very customised, and the screens are bleeding edge.

IF adaptive sync technology can really handle 4K 10-bit HDR at massively reduced cost without major issues, then I don't see why Nvidia wouldn't switch in the future.

FYI also @TNA

Both the PG27UQ and the Acer Predator X27 are NOT true 10-bit panels; they are 8-bit + FRC, unfortunately.
Neither can run 4:4:4 chroma at refresh rates higher than 60Hz, so it's 4:2:2 only above that, due to port bandwidth limitations.
And that is bad for PC gaming graphics, making everything look dull.

Samsung is bringing to market a 32" 4K HDR FreeSync 2 120Hz quantum dot monitor later this summer. (We don't know yet whether it has HDMI 2.1; if not, colour quality above 98Hz will suffer the same fate as above due to DP 1.4 bandwidth.)
Philips already has the 436M6BPAB, an HDR1000 FreeSync 4K monitor, albeit only 60Hz.
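
For anyone wondering where that ~98Hz figure (and the 4:2:2 fallback above) comes from, here's a rough bandwidth sketch. DP 1.4 HBR3 carries about 25.9 Gbit/s of usable payload after 8b/10b encoding; the ~7% blanking overhead below is an assumption, so treat the results as ballpark only:

```python
# Rough sketch: maximum refresh rate a DP 1.4 link can carry for 4K at
# various bit depths / chroma subsampling. The ~7% blanking overhead is
# an assumption (reduced-blanking-style timing); figures are ballpark.

DP14_PAYLOAD_GBPS = 32.4 * 8 / 10        # HBR3 raw 32.4 Gbit/s, 8b/10b -> ~25.9 Gbit/s usable
BLANKING_OVERHEAD = 0.07                 # assumed extra pixels for blanking intervals

def max_refresh_hz(width, height, bits_per_channel, chroma):
    bits_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bits_per_channel
    pixels_per_frame = width * height * (1 + BLANKING_OVERHEAD)
    return DP14_PAYLOAD_GBPS * 1e9 / (pixels_per_frame * bits_per_pixel)

for bits, chroma in [(10, "4:4:4"), (8, "4:4:4"), (10, "4:2:2")]:
    hz = max_refresh_hz(3840, 2160, bits, chroma)
    print(f"4K {bits}-bit {chroma}: up to ~{hz:.0f}Hz over DP 1.4")
# -> roughly 97Hz for 10-bit 4:4:4, ~122Hz for 8-bit 4:4:4, ~146Hz for 10-bit 4:2:2,
#    which is why 144Hz HDR ends up as 4:2:2 on these monitors.
```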

As for the AU Optronics panels used in the G-Sync monitors being bleeding edge: everyone complains that the panels aren't good, especially when it comes to the multi-zone dimming backlight. On a dark (not even black) background you can see a bright halo around the mouse pointer and around bright objects.
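
On that halo effect: a quick sketch of why even a full-array backlight with hundreds of zones still blooms around something as small as a mouse cursor. The 384-zone count and 27" size below are assumptions for illustration, not quoted specs:

```python
import math

# Rough sketch: how coarse a FALD backlight zone is compared to a mouse cursor.
# The 24 x 16 = 384 zone grid and 27" 4K panel are assumptions for illustration.

h_res, v_res = 3840, 2160
zones_x, zones_y = 24, 16
diagonal_in = 27

width_in = diagonal_in * 16 / math.hypot(16, 9)
height_in = diagonal_in * 9 / math.hypot(16, 9)

zone_w_px, zone_h_px = h_res // zones_x, v_res // zones_y        # pixels per zone
zone_w_in, zone_h_in = width_in / zones_x, height_in / zones_y   # physical zone size
cursor_px = 24                                                   # typical cursor height in pixels

zone_area = zone_w_px * zone_h_px
cursor_area = cursor_px * cursor_px
print(f"Each dimming zone covers {zone_w_px}x{zone_h_px} px "
      f"(about {zone_w_in:.2f} x {zone_h_in:.2f} inches)")
print(f"A {cursor_px}px cursor lights a zone ~{zone_area // cursor_area}x its own area "
      f"-> bright halo on a dark background")
```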
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,582
Location
Greater London
Agree with this. My 50" 1080p TV looks fine from 8-10ft away, but I moved to a 4K monitor for desktop real estate, and when I now look at my 24" 1080p monitors at work I can't help but notice the difference in clarity; it really is a huge difference.

Same with games, the difference is enormous. Pin-sharp clarity vs smudgy pixellation (for me at least).

Good to know I am not the only one who notices this. Most people here make it sound like 4K does not bring much improvement on a monitor. It's like, wut... It is either denial due to being on a lower resolution themselves, comparing it to a TV, not having tried it in person, or they need to go to Specsavers :p


FYI also @TNA

Both the PG27UQ and the Acer Predator X27 are NOT true 10-bit panels; they are 8-bit + FRC, unfortunately.
Neither can run 4:4:4 chroma at refresh rates higher than 60Hz, so it's 4:2:2 only above that, due to port bandwidth limitations.
And that is bad for PC gaming graphics, making everything look dull.

Samsung is bringing to market a 32" 4K HDR FreeSync 2 120Hz quantum dot monitor later this summer. (We don't know yet whether it has HDMI 2.1; if not, colour quality above 98Hz will suffer the same fate as above due to DP 1.4 bandwidth.)
Philips already has the 436M6BPAB, an HDR1000 FreeSync 4K monitor, albeit only 60Hz.

As for the AU Optronics panels used in the G-Sync monitors being bleeding edge: everyone complains that the panels aren't good, especially when it comes to the multi-zone dimming backlight. On a dark (not even black) background you can see a bright halo around the mouse pointer and around bright objects.

That Samsung monitor sounds interesting, but I really want an OLED one at this point, to be honest. Hopefully they can make it happen in 2-3 years' time.
 
Soldato
Joined
28 Jan 2008
Posts
6,038
Location
Manchester
I have one FreeSync, one G-Sync and two normal screens. Problem solved, at least till something new comes out. :)
Both my PCs currently have Nvidia cards though; hopefully by the time I come to upgrade one of them AMD will have some new cards out.
 