• Competitor rules

    Please remember that mentioning competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD FreeSync coming soon, no extra cost... shocker.

Depends really. I see the benefit for my 1440 screen, but when I'm FPS gaming I run at 1080/120 with slightly lowered settings to get a really high frame rate, so I doubt I'd notice much difference. And I love my 1440 monitor so don't intend to swap it any time soon.
 

You'll still get stuttering and tearing without vsync on, even at 144 Hz at 1080p, so the technology still benefits you regardless, mate. With vsync on, you'd simply retain the stuttering whilst introducing input lag, even at high refresh rates, which again isn't an issue with this tech.
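The tearing point above can be illustrated with a toy model (my own sketch, not anything from the thread): a tear happens whenever a frame finishes rendering partway through a monitor refresh, and since frame times almost never line up with the scan-out interval, this keeps happening no matter how high the refresh rate is.

```python
import itertools

REFRESH_HZ = 144
SCAN_MS = 1000.0 / REFRESH_HZ  # ~6.94 ms per refresh cycle

def count_tears(frame_times_ms, duration_ms=1000.0):
    """Count refresh cycles during which a frame completes mid-scan."""
    tears = 0
    t = 0.0
    i = 0
    frame_done = frame_times_ms[0]  # completion time of the first frame
    while t < duration_ms:
        scan_end = t + SCAN_MS
        # a frame finishing strictly inside this scan-out causes a tear
        if t < frame_done < scan_end:
            tears += 1
        # advance past any frames that complete during this scan
        while frame_done <= scan_end and i + 1 < len(frame_times_ms):
            i += 1
            frame_done += frame_times_ms[i]
        t = scan_end
    return tears

# a game averaging ~120 fps with realistic frame-time jitter
jittery = list(itertools.islice(itertools.cycle([7.5, 9.0, 8.0, 8.8]), 200))
print(count_tears(jittery))  # the large majority of the 144 scans tear
```

With vsync the flip is held until the scan boundary, which removes the tear but adds the waiting time as input lag; adaptive sync instead moves the scan boundary to the frame, which is the whole trick.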
 

Can't say I really notice any stuttering or tearing tbh at 1080/120, either on my 780 when I had it or on my 290. Obviously there is a very minor amount of tearing because vsync is off, but can't say it really bothers me at all. I'm more bothered about everything else going on. But I only use this monitor for FPS gaming, where a lot tends to be happening, so that will somewhat mask any visual disturbances.

People were telling me Mantle was smoother but when I swapped I didn't really notice anything different to DirectX on my 780. 60 Hz to 120 Hz, however, is a different matter :p.
 

Aye, I know what you mean by that, but in fast FPS the effect of G-Sync/FreeSync is still pretty damn noticeable. The most glaringly obvious game for me so far is still Battlefield 4, as it's just so much nicer to play and the image feels almost "static". It's very hard to explain as you have to see it to really know what I mean by that, but I think you'll be pleasantly surprised when you get round to having a go when you decide to try out the tech.
 
Nope but it could be if I wanted it to be. That's with +100mV over whatever is the stock voltage. I have another 100mV to play with if I want to use Trixx.

Yeah same with the volts, 1.3v in MSI AB and 1.4v in Trixx.

1220 is not bad going at 1.3v; I need to run about 1.37v to get it properly stable at 1225. I also can't get my VRAM over 1550.

A good card you have there.
 

Think it depends what kind of variance you have on your frame rate, really. If I ran 4x MSAA (with the associated frame rate variance) it would be more pronounced, but I don't. I drop it to 2x, so my minimums are often around 80-90 at worst, or 100-110 on most maps, and the averages are well in excess of that and don't tend to zip about much.

I'm probably due a new 120 Hz monitor at some point so will no doubt end up trying it due to that but I don't really expect to notice much. If I do then woohoo :)

Yeah same with the volts, 1.3v in MSI AB and 1.4v in Trixx.

1220 is not bad going at 1.3v, I need to run about 1.37v to get it (properly stable) at 1225, I also can't get my Vram over 1550.

A good card you have there.

Aye I'm really happy with how it overclocks. Don't think it's 1.3V with 100mV extra. Will have to check next time. Temps are obviously OK but from memory it was 1.2 something.
 
I just find that I win more with high FPS; people I shoot at drop dead more often before I do. Can't explain why, it's not input lag.

So no V-Sync, and I don't run any post-AA; on some maps I will even run 0x MSAA.
 
Dat placebo :).

Nah, higher frame rates than the refresh can definitely help. In the original CoD (2003) I had a really bad PC to start with which would get like 40-70 FPS. Then I got a new PC so I was able to cap it at 125 FPS. I became a lot, lot better at the game then.

Part of that was due to it being a Q3 engine game, which benefits from being capped to 125/333/ another FPS number I can't remember... but the other part (which carried forward into all games I played) was just due to the game feeling better at a higher frame rate, even though technically my TFT monitor couldn't display more than 60 frames. I am young enough to have successfully sidestepped the CRT era. They were around when I started gaming but not the only option :D.

120 Hz monitors followed a few years down the line from then.

Edit: eek... a fair few. I can't actually remember the year I bought my first 120 Hz monitor.
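The "magic" Q3 frame caps mentioned above are usually explained by integer-millisecond frame timing. A minimal sketch of that idea (my own illustration of the commonly cited explanation, not actual engine source): the engine rounds each frame's duration to a whole number of milliseconds, so a cap only behaves cleanly when 1000 ms divides by it exactly, and caps like 125 (8 ms frames) became the standard numbers.

```python
def effective_fps(cap):
    # frame duration stored as an integer number of milliseconds
    frame_ms = max(1, round(1000 / cap))
    return 1000 / frame_ms

for cap in (60, 76, 125, 250, 333):
    print(f"{cap} fps cap -> ~{effective_fps(cap):.1f} fps effective")
# 125 and 250 come out exact (8 ms and 4 ms frames), while a cap like 60
# rounds to 17 ms frames and actually runs at ~58.8 fps.
```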
 
Not using a single GPU.

Sep 2009: single-GPU Eyefinity.

March 2010: single-GPU six-screen Eyefinity.

It took Nvidia until March 2012 to enable single-GPU Surround; what was that you said?

LOL. Using Google, are we? Look again. But PM me with the correct answer.

Why start arguing the toss and using crummy examples? It's bad enough using one GPU for Surround now, let alone back then.


Stop trying to derail!
 

Lolz, usual reply when caught talking ****.

Doesn't matter whether one or more GPUs is needed when you're talking about copying each other's tech and the time frames involved; the point was that both of them bring out similar features over time, in direct response to your original off-topic comment.
 

/last word in

Seeing as you've completely gone off from my point about having in situ buffering, might as well abandon the thread now. GG

Edit: don't try to copy my sig either. Mines betterer :p
 
Man, I just tried the LightBoost trick on my 700D monitor and it was gorgeous, but very dark.

If FreeSync is anything like this I'll be over the moon! When are these 1.2a monitors coming out!? :(
 

Wot you bumping your gums about?

[Image: Z4Eqny1.png]


^
Never went off anything; the only comment I made was in direct reply to that nonsense. That's the response you get for trash talking.

Edit: don't try to copy my sig either. Mines betterer :p

That's debatable.
 