• Competitor rules

    Please remember that any mention of competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom-right corner of your screen. Just don't mention competitors in any way, shape, or form and you'll be OK.

GeForce GTX 1180/2080 Speculation thread

Well, the three guys in the video all reckon it looks better than the 4K run.



Of course this is the golden question.

It looks better than 4K only because of the TAA blur in FF15, not because it is actually better than 4K. I felt it was very poor to keep banging on about that in the video. I appreciate what they were saying, but your quote out of context is just misleading, and they should have known people would do that.

1440p with good AA will be better than 2160p with a poor TAA implementation. That's not an advantage unique to DLSS.

To be clear, they said it was behind 4K in terms of pure resolution, not overall IQ.
 
Shhh. Run, you're going to get burned alive by the cult, err, forum members. Run for your life...

Well, it's nonsense, that's why: the Vega 64 gets similar frame rates in games at 2560x1440 to what a 2080 Ti gets at 3840x2160.

Even with the sync technologies, you would definitely notice the minimums on the Vega 64 compared to the 2080 Ti at 4K.

If we were speaking purely about the 2080, then I'd agree.

Edit: looking back at reviews, the 2080 Ti gets higher frame rates at 3840x2160 than the Vega 64 does at 2560x1440.

Most people here agree that neither of the sync technologies is that great once frame rates start dipping into the 30s, so I think it would be rather easy for people to pick the 2080 Ti out in a blind test; you'd be able to feel it in mouse/controller response.

This isn't some cultish post either, as I'm looking to go the 64/FreeSync route until the next cards are released.
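
For anyone wanting to sanity-check the resolution comparison, the pixel counts themselves are simple arithmetic. A quick sketch (Python, purely illustrative; the fps figure is a made-up placeholder, not benchmark data):

```python
# Rough pixel-throughput comparison between 1440p and 4K.
# The fps value below is a hypothetical placeholder, not a benchmark result.

def pixels_per_second(width: int, height: int, fps: float) -> float:
    """Pixels the GPU has to render per second at a given resolution and fps."""
    return width * height * fps

qhd_pixels = 2560 * 1440  # 3,686,400 pixels per frame
uhd_pixels = 3840 * 2160  # 8,294,400 pixels per frame

print(uhd_pixels / qhd_pixels)  # 2.25 -> 4K pushes 2.25x the pixels of 1440p

# So "similar fps at 4K vs 1440p" would imply roughly 2.25x the pixel
# throughput (ignoring CPU limits and work that doesn't scale with resolution).
example_fps = 60.0  # placeholder
print(pixels_per_second(3840, 2160, example_fps)
      / pixels_per_second(2560, 1440, example_fps))  # 2.25
```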
 
More FPS isn't just that; it also means reduced input lag and motion blur.
I agree that "some" people might not notice, but I bet most PC gamers will be able to tell the difference.
 

Turing is locked to a max of 1068 mV (or is it 1062?), which is the same for all AIB/Founders cards. People are talking about cancelling preorders for the "extreme" overclocking variants because, without more voltage, they don't think overclocking will be any better than on the "regular" AIB/Founders cards; others are waiting for people to test the cards and see what mods can be done, or saving money by skipping the extra charge for beefier power delivery when it's probably not going to give anything extra. This was basically the same with Pascal.

Nvidia's argument is that they've tested the hell out of their silicon and know that at 1.068 V (or 1.062 V) the GPU will last for five years pretty much regardless of what else you do to it. Obviously end users and competitive benchmarkers always want to push it that little bit more and "accept" that it could die sooner. The problem for AIBs is that soft mods are difficult to detect, and obviously they don't want to be issuing RMAs for users who have destroyed their chip through overvolting.

My guess would be that hardware mods will be worked out that do allow it, but a load of people are still going to get uppity that these aren't undetectable, and so doing them bins their warranty.
 
More FPS isn't just that; it also means reduced input lag and motion blur.
I agree that "some" people might not notice, but I bet most PC gamers will be able to tell the difference.

No, completely different things.

This isn't some cultish post either, as I'm looking to go the 64/FreeSync route until the next cards are released.

I completely agree with you, as I am using both a 2560x1440 FreeSync monitor and a 4K FreeSync HDR TV with my Vega 64, and my performance is 20% higher than those reviewers' at sub-300 W consumption.
And I'm not going to change until next year's round of AMD GPUs, unless AMD pushes Vega 20 into the mainstream market this year.
 
The big talking point on the Web now is the performance hit that comes with raytracing, even on the high-end GeForce RTX 2080 Ti card. Can you reveal how MechWarrior 5: Mercenaries is currently handling the RTX features in terms of resolution and frame rate? Do you expect significantly improved performance after the final optimization pass?

This may go without saying, but it’s worth stressing that we can only speak to our specific case when it comes to performance. I expect every team, every title is likely to be a little different, even between titles that are using the same engine. Optimization is a long process, and at least at this early stage, we weren’t focusing our resources toward it a great deal. By the end of our work in this phase, we were operating at 60fps with raytraced shadows and ambient occlusion at 1080p and maxed out quality settings, down to around 30 when we activated raytraced reflections together with those two features. The reflections aren’t cheap, no doubt about that, but with additional optimization on our end and continued work from NVIDIA, I expect all of the raytraced features to be a perfectly performant option in our case.

https://wccftech.com/mechwarrior-5-mercenaries-dev-nvidia-rtx-dlss/
 
That just shows how knowledgeable you are.

More FPS always = less input lag and more Hz = less motion blur.

It's all down to the monitor and panel, mate, not just the GPU, and that shows how knowledgeable you are.
a) Vsync adds input lag, full stop.
b) IPS and VA panels have motion blur even at 144 Hz, compared to a good TN.
c) You can have a 144 Hz monitor with 20 ms input lag, and a 144 Hz monitor with 1 ms input lag.
The same applies to 60 Hz displays, e.g. TVs: there are those with 20 ms input lag (which are good) and those with 48 ms+.

Don't confuse things.
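
To put some rough numbers on that: total input lag is a chain, and the GPU's frametime is only one link in it. A quick sketch (Python; every figure below is an illustrative assumption, not a measurement of any real monitor or GPU):

```python
# Crude end-to-end input-lag model. All numbers are illustrative
# assumptions, not measurements of any real display or GPU.

def frametime_ms(fps: float) -> float:
    """Time to render one frame at a given fps."""
    return 1000.0 / fps

def total_lag_ms(fps: float, display_lag_ms: float,
                 input_sampling_ms: float = 1.0) -> float:
    """Latency chain: input sampling + render time + display processing."""
    return input_sampling_ms + frametime_ms(fps) + display_lag_ms

# Fast GPU on a slow display vs slower GPU on a fast display:
print(total_lag_ms(fps=144, display_lag_ms=20.0))  # ~27.9 ms
print(total_lag_ms(fps=60, display_lag_ms=1.0))    # ~18.7 ms

# More fps always shrinks the render-time term, but a laggy display
# can dominate the chain no matter how fast the GPU is.
```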
 
It's all down to the monitor and panel, mate, not just the GPU, and that shows how knowledgeable you are.
a) Vsync adds input lag, full stop.
b) IPS and VA panels have motion blur even at 144 Hz, compared to a good TN.
c) You can have a 144 Hz monitor with 20 ms input lag, and a 144 Hz monitor with 1 ms input lag.
The same applies to 60 Hz displays, e.g. TVs: there are those with 20 ms input lag (which are good) and those with 48 ms+.

Don't confuse things.

We are not talking about monitors here, only GPUs and FPS.
Oh, and 1 ms input lag monitors do not exist.
 
Of course, they wouldn't be able to see the screen? Sorry :D.
Unless people try such an experiment, we'll never know. It would at least be good if some of those getting the new GPUs posted up real-world opinions on how a 2080 at least (Ti too, ideally) feels in gameplay compared to a 1080 Ti. Note that I said "opinions" there, as they could of course be biased towards the new purchase.

Did you keep the 2080 that was delivered after cancellation?

I have requested a return, but I still have it, unopened. I would certainly like to see some real-world reports of people using a 2080 with 4K G-Sync across several games.
 
We are not talking about monitors here, only GPUs and FPS.
Oh, and 1 ms input lag monitors do not exist.
LOL.

Asus VG248QE: 0.6 ms input lag
BenQ XL2420G, XL2730Z: almost 0
EIZO Foris FS2333-BK: 0.3 ms
BenQ XL2720T: about one frame

Want me to expand the list?
All of these monitors are rated 1 ms GtG, except the Eizo at 3 ms.
 
LOL.

Asus VG248QE: 0.6 ms input lag
BenQ XL2420G, XL2730Z: almost 0
EIZO Foris FS2333-BK: 0.3 ms
BenQ XL2720T: about one frame

Want me to expand the list?
All of these monitors are rated 1 ms GtG, except the Eizo at 3 ms.
How about you share the source of those claims, because I still don't believe it. :cool:

Or maybe you are talking about response time, which is usually a lot less than input lag and something completely different.
 
How about you share the source of those claims, because I still don't believe it. :cool:

Or maybe you are talking about response time, which is usually a lot less than input lag and something completely different.

Surely you are talking about the same thing, then, when you talk about frametimes. No different to response times.
 
That just shows how knowledgeable you are.

More FPS always = less input lag and more Hz = less motion blur.

Oh, and I'm talking from experience, not something that I read.

...

More fps is the same thing as saying lower frametimes, so no idea why you are being inconsistent.

No, we are talking about input lag, which has nothing to do with panel response time or frametimes.

I don't think you actually understand what you are talking about. Surely input lag relates to the delay between your input and it appearing on screen. Ideally you want them perfectly synced.
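
Since fps and frametimes keep coming up: they are the same quantity inverted, and neither says anything about panel response time or display input lag. A one-liner sketch (Python):

```python
# fps and frametime are reciprocals; converting between them is trivial.
for fps in (30, 60, 144):
    print(f"{fps:>3} fps = {1000.0 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms, 60 fps = 16.7 ms, 144 fps = 6.9 ms
```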
 