+2.
I get so tempted to grab a cheap monitor from the typical spam emails the popular UK outlets send out; it's only because this tech is so near that I'm holding out.
It's definitely worth the wait.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
The wait for Surround support was shorter than the gap between Titan and the 290X.
Uneducated silence would have sufficed
Depends really. I see the benefit for my 1440 screen, but when I'm FPS gaming I run at 1080/120 with slightly lowered settings to get a really high frame rate, so I doubt I'd notice much difference. And I love my 1440 monitor, so I don't intend to swap it any time soon.
You'll still get stuttering and tearing without vsync on, even at 144 Hz at 1080p, so the technology still benefits you regardless, mate. With vsync on, you'd simply retain the stuttering whilst introducing input lag, even at high refresh rates, which again isn't an issue with this tech.
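To put some rough numbers on that trade-off: with vsync on a fixed-refresh display, a finished frame has to sit and wait for the next refresh tick, and that idle wait is the added input lag. The sketch below is a back-of-envelope simulation, not real graphics code, and the render times are illustrative assumptions:

```python
import math

REFRESH_HZ = 144
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~6.94 ms per refresh at 144 Hz

def vsync_wait_ms(render_ms):
    """With vsync on, a finished frame waits until the next refresh tick;
    the idle time between finishing and displaying is added input lag."""
    ticks_needed = math.ceil(render_ms / REFRESH_MS)
    return ticks_needed * REFRESH_MS - render_ms

def tears_without_vsync(render_ms):
    """Without vsync, the buffer swap happens immediately; unless it lands
    exactly on a refresh boundary, scanout shows parts of two frames (a tear)."""
    return (render_ms % REFRESH_MS) != 0

for r in (5.0, 8.0, 12.0):  # hypothetical frame render times in ms
    print(f"{r:5.1f} ms frame: vsync adds {vsync_wait_ms(r):.2f} ms wait, "
          f"tears without vsync: {tears_without_vsync(r)}")
```

With adaptive sync (G-Sync/FreeSync), the display refreshes when the frame is ready instead of on a fixed tick, so the wait term drops to zero without reintroducing tearing.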
@ Rusty, your 290 @ 1220/1675, is that your 24/7 clock?
Can't say I really notice any stuttering or tearing tbh at 1080/120. Both on my 780 when I had it or my 290. Obviously there is a very minor amount of tearing because vsync is off. But can't say it really bothers me at all. I'm more bothered about everything else going on. But I only use this monitor for FPS gaming where a lot of things tend to go on so that will somewhat mask any visual disturbances.
People were telling me Mantle was smoother, but when I swapped I didn't really notice anything different to DirectX on my 780. 60 Hz to 120 Hz, however, is a different matter.
Nope but it could be if I wanted it to be. That's with +100mV over whatever is the stock voltage. I have another 100mV to play with if I want to use Trixx.
Aye, I know what you mean by that, but in fast FPS games the effect of G-Sync/FreeSync is still pretty damn noticeable. The most glaringly obvious game for me so far is still Battlefield 4, as it's just so much nicer to play and the image feels almost "static". It's very hard to explain, as you have to see it to really know what I mean by that, but I think you'll be pleasantly surprised when you get round to having a go when you decide to try out the tech.
Yeah same with the volts, 1.3v in MSI AB and 1.4v in Trixx.
1220 is not bad going at 1.3v. I need to run about 1.37v to get it properly stable at 1225; I also can't get my VRAM over 1550.
A good card you have there.
Not using a single gpu.
Sep 2009 single gpu Eyefinity
March 2010 single gpu 6 screen Eyefinity.
It took Nvidia until March 2012 to enable single gpu Surround, what was that you said?
Stop trying to derail!
Lolz, usual reply when caught talking ****.
Doesn't matter whether one or more GPUs is needed when you're talking about copying each other's tech and the time frames involved. The point was that both of them bring out similar features over time, in direct response to your original off-topic comment.
/last word in
Seeing as you've completely gone off from my point about having in situ buffering might as well abandon thread now. GG
Edit: don't try to copy my sig either. Mines betterer
Man, I just tried a lightboost trick on my 700D monitor and it was gorgeous but very dark.
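The darkness is expected with LightBoost-style strobing: the backlight is only lit for a fraction of each refresh, so perceived brightness drops roughly in proportion to that duty cycle. A back-of-envelope sketch, where both the panel brightness and the duty cycle are illustrative assumptions rather than measured values:

```python
def perceived_brightness(peak_nits, duty_cycle):
    """Rough model: a strobed backlight's average output scales with the
    fraction of each refresh cycle it is actually lit (the duty cycle)."""
    return peak_nits * duty_cycle

# Hypothetical numbers: a 300-nit panel strobed at ~25% duty cycle.
print(perceived_brightness(300, 0.25))  # 75.0 nits -> noticeably dark
```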
Change Magic Angle to group view.