I'll confess to having fallen a little behind on current tech. Whilst I still have an interest, my attention has drifted away from gaming over the past couple of years and I've lost touch a little, so please bear with me here.
My machine is starting to feel a little old. Whilst I've not noticed anything specifically wrong with it, except for the noise, it's been a few years since I've upgraded anything and I have the urge to improve again. I think that the monitor and graphics are probably where I need to look.
So, I'd like to upgrade my monitor, and having spent a while gaming at 1080p, I think another step up the resolution ladder might be good, which I'm guessing means 1440p. Last time I bought a monitor, it was a case of deciding what size and resolution I wanted and then getting the lowest response time available for my money. Now, though, I'm seeing a lot of people talking about 144Hz panels. Presumably the higher refresh rate gives a smoother experience, but only if your gfx card can push enough fps to keep up? Given that the benchmarks I'm seeing for the latest and greatest at 1440p aren't anywhere near 144fps, is there a benefit to 144Hz that I'm missing? Or have I completely misunderstood?
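To show where my confusion comes from, here's the back-of-envelope sanity check I ran (a quick Python sketch; the 90 fps figure is purely illustrative, not a benchmark, and I'm assuming a simple vsync-style cap where the panel can't show more frames per second than it refreshes):

```python
# Back-of-envelope check: refresh rate vs frame rate.
# Assumption (mine, not gospel): with vsync on, a panel displays at most
# as many distinct frames per second as its refresh rate, so a 60 Hz
# panel caps a 90 fps card at 60, while a 144 Hz panel shows all 90.

def frame_interval_ms(rate_hz: float) -> float:
    """Time between refreshes (or rendered frames) in milliseconds."""
    return 1000.0 / rate_hz

for hz in (60, 144):
    print(f"{hz:>3} Hz panel: new image every {frame_interval_ms(hz):.1f} ms")

gpu_fps = 90  # illustrative number only
for hz in (60, 144):
    shown = min(gpu_fps, hz)  # vsync-style cap
    print(f"At {gpu_fps} fps on a {hz} Hz panel: ~{shown} distinct frames/sec")
```

If my assumptions there are right, a 144Hz panel would still show more distinct frames at 90fps than a 60Hz one would, even without ever hitting 144fps, but I'd appreciate someone confirming whether that's actually how it works.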
I'm obviously drawn to the GTX 1070, as that seems a price/performance bargain, and I understand that waiting till later in the year will make things even more competitive once Ti versions arrive; but will the 1070 be sufficient for a good experience at 1440p? I'm assuming that AMD are out of the game for now at those kinds of resolutions and that I'd have to hold off for another six months for a 490 or something?
Lastly, I'm toying with the idea of a Vive. I'm guessing that any of this generation of cards will drive one perfectly happily, so I don't really need to factor that in. Is my assumption correct?
Thanks in advance for the help and sorry for the noddy questions; I've tried reading around the subject, but keep getting lost down dark alleys of tech specs and impenetrable jargon.