I was wondering about this simple fact: the Acer Predator XR341CKA (G-Sync version) should mount the same panel as its sibling, the XR341CK (FreeSync), and as the Dell U3415W, and I reckon it's the same panel as the LG ones too.
Now, these panels have been tested for overclocking before, and they failed in every case. They can accept higher refresh rates, but they start skipping frames just past 60Hz, so it's basically a no-go. There's no point overclocking something beyond its specs if you don't get the benefit you overclocked it for in the first place, right?
Wrong, apparently, since that's exactly what Acer did with the aforementioned Predator, both versions (and presumably what Asus is doing with the upcoming ROG). One works at 75Hz; the other is rumored to support up to 100Hz.
Can someone please explain to me how that's possible? I mean technically: how do a different PCB and the G-Sync/FreeSync module alter the panel's behaviour so that it works so far beyond its specs? How can one be sure that if the GPU is generating 100 frames per second, all of them are actually shown and none is skipped yet goes undetected because everything is "synced"? Moreover, isn't this a bit worrying, in the sense that we should expect these screens to wear out much faster than they should? I don't know about you, but when I pay 1k quid for a monitor I want it to last forever, at the very least, and not to be running outside its actual specs either.
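For what it's worth, the only reliable check I know of is the classic frame-skipping test (the moving-square pattern TestUFO uses): draw a square that advances one slot per vsync'd frame, then photograph the screen with an exposure long enough to cover a full sweep; a dark slot in the photo means a frame was dropped. Here's a minimal Python/pygame sketch of the idea; the slot count and window size are my own arbitrary choices, and pygame's vsync flag is only a request to the driver, not a guarantee:

```python
# Minimal frame-skipping test pattern (same idea as the TestUFO test):
# a white square steps across SLOTS positions, one step per displayed frame.
# Photograph the screen with an exposure of at least SLOTS / refresh_rate
# seconds; if any slot stays dark, the monitor skipped that frame.
import pygame

pygame.init()
# vsync=1 (pygame 2+) asks the driver to sync flips to the panel's refresh.
screen = pygame.display.set_mode((1280, 200), pygame.DOUBLEBUF, vsync=1)

SLOTS = 20                      # arbitrary: 20 positions per sweep
slot_w = screen.get_width() // SLOTS
frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    x = (frame % SLOTS) * slot_w
    pygame.draw.rect(screen, (255, 255, 255), (x, 50, slot_w - 4, 100))
    pygame.display.flip()       # one square position per vsync'd flip
    frame += 1

pygame.quit()
```

At 100Hz with 20 slots, a 1/5-second exposure covers one full sweep, so every slot should appear lit exactly once if no frames are skipped.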