
Geforce GTX1180/2080 Speculation thread

...

More fps is the same thing as saying lower frametimes, so no idea why you are being inconsistent.

I don't think you actually understand what you are talking about. Surely input lag relates to your input being displayed on screen.

No he doesn't understand, hence I give up. Especially when he doesn't know what Google is.
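For what it's worth, the fps/frametime point above is plain arithmetic: frame time in milliseconds is 1000 divided by frames per second, so the two figures are reciprocals. A quick Python sketch (the function name is mine, purely for illustration):

```python
def frametime_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate: the two are reciprocals."""
    return 1000.0 / fps

# Higher fps always means a lower frame time, and vice versa.
for fps in (30, 60, 144, 240):
    print(f"{fps:>3} fps -> {frametime_ms(fps):.2f} ms per frame")
```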
 
...

More fps is the same thing as saying lower frametimes, so no idea why you are being inconsistent.



I don't think you actually understand what you are talking about. Surely input lag relates to your input being displayed on screen. Ideally you want them perfectly synced.
Wow just wow. Some people here. Lol.
 
Well that is what input lag means.

Unless you have a different definition, you are talking nonsense.
Input lag is the time between when a display receives a signal and when it displays it, and response time is the time it takes pixels to shift fully from one colour to another.
You can have a display with low input lag and high response time or one with high input lag and low response time. It won't be pretty but it's possible.
Obviously for best results you want both to be as low as possible.
 
Input lag is the time between when a display receives a signal and when it displays it, and response time is the time it takes pixels to shift fully from one colour to another.

Input lag whilst gaming is actually the time from when you press a button: that signal goes to your computer, runs through the game engine, gets put into a frame, and then that frame gets displayed on your screen. (With internet games there is also network lag, which is important.)

You were focusing on a single aspect of input lag, whilst Panos was saying there are so many more considerations that the frame time (or FPS, although you don't seem to know they are the same thing) differences become insignificant compared to the total overhead in that process (one of which is display response time).

I challenge you to get one other person on this forum to agree with you.
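The button-to-photon chain described above can be sketched as a simple sum of stage latencies. The stage names and millisecond figures below are made up purely for illustration, not measurements:

```python
# Hypothetical stage latencies (ms) in the button-to-photon chain;
# the figures are illustrative, not real measurements.
stages = {
    "input device / USB polling": 2.0,
    "game engine processing":     10.0,
    "render queue + frame time":  16.7,   # roughly one frame at 60 fps
    "display input lag":          10.0,
    "pixel response time":        4.0,
}

# Total input lag is the sum of every stage, which is why shaving a
# couple of ms off the frame time alone may not be noticeable.
total = sum(stages.values())
print(f"total input lag = {total:.1f} ms")
```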
 
Input lag whilst gaming is actually the time from when you press a button: that signal goes to your computer, runs through the game engine, gets put into a frame, and then that frame gets displayed on your screen. (With internet games there is also network lag, which is important.)

You were focusing on a single aspect of input lag, whilst Panos was saying there are so many more considerations that the frame time differences become insignificant compared to the total overhead in that process (one of which is display response time).

I challenge you to get one other person on this forum to agree with you.
No, actually it was Panos who started talking about monitors, and clearly he doesn't know the difference between input lag and response time.
All I said is that more frames per second equals lower input lag, but he said that isn't right because it's the display that is responsible for input lag, which isn't fully the case.
You are totally right in your description of input lag muon.

He also said that there are monitors with sub-1ms input lag, so where is the proof of that?
 
Well I think that's the misunderstanding then. I don't think Panos was saying that.

I think what he was referring to is that input lag often comes down to the lowest common denominator in that chain, which I suspect is rarely FPS when you get into higher numbers. Most often it is probably the monitor.

One thing I would say is that I would find it hard to believe that there are displays with provable <1ms input lag.

It's all down to the monitor and panel, mate, not only the GPU, and that shows how knowledgeable you are.
a) Vsync adds input lag, full stop.
b) IPS & VA panels have motion blur even at 144Hz, compared to a good TN panel.
c) You can have a 144Hz monitor with 20ms input lag, and a 144Hz monitor with 1ms input lag.
The same applies to 60Hz displays, e.g. TVs. There are those with 20ms input lag (which are good) and those with 48ms+.

Don't confuse things.
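Point (c) above can be put in numbers: the refresh rate fixes the interval between frames, but the monitor's own input lag is an independent, additive term. The two-term model and the figures below are mine, purely illustrative:

```python
def button_to_photon_ms(frame_interval_ms: float, monitor_lag_ms: float) -> float:
    """Crude two-term model: time to produce a frame plus the monitor's own lag."""
    return frame_interval_ms + monitor_lag_ms

interval_144hz = 1000.0 / 144  # ~6.9 ms between refreshes at 144 Hz

# Same 144 Hz panel speed, very different totals depending on monitor lag:
print(button_to_photon_ms(interval_144hz, monitor_lag_ms=1.0))   # low-lag monitor
print(button_to_photon_ms(interval_144hz, monitor_lag_ms=20.0))  # high-lag monitor
```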
 
Apparently Nvidia use Vulkan for raytracing... If so, that means their new hardware is probably set up similarly to AMD hardware.

I'm going to fall about laughing if AMD also does raytracing just as decently. Obviously Vega etc. is not going to be anywhere near Nvidia's level at it, as the chips are much smaller etc.

However, if both chips were identical in size etc., I wonder which would give the better raytracing performance?

Something tells me AMD will also be able to run raytracing-type workloads if it's done via Vulkan, and that AMD might be surprisingly good at it.
 
Apparently Nvidia use Vulkan for raytracing... If so, that means their new hardware is probably set up similarly to AMD hardware.

I'm going to fall about laughing if AMD also does raytracing just as decently. Obviously Vega etc. is not going to be anywhere near Nvidia's level at it, as the chips are much smaller etc.

However, if both chips were identical in size etc., I wonder which would give the better raytracing performance?

Something tells me AMD will also be able to run raytracing-type workloads if it's done via Vulkan, and that AMD might be surprisingly good at it.

Bit of a weird situation with nvidia using something that was originally an AMD API that has been rebuilt. Not sure if that means games using RTX effects have to run in Vulkan, or whether they can run in D3D with Vulkan providing the effects on the side. Especially as it needs a Microsoft update in October to even be enabled in the first place; MS are making the push for RT, yet nvidia are using Vulkan. :confused:
 
Apparently Nvidia use Vulkan for raytracing... If so, that means their new hardware is probably set up similarly to AMD hardware.

I'm going to fall about laughing if AMD also does raytracing just as decently. Obviously Vega etc. is not going to be anywhere near Nvidia's level at it, as the chips are much smaller etc.

However, if both chips were identical in size etc., I wonder which would give the better raytracing performance?

Something tells me AMD will also be able to run raytracing-type workloads if it's done via Vulkan, and that AMD might be surprisingly good at it.
Why would you fall about laughing if AMD does raytracing just as well as Nvidia? Surely this is a good thing? I would love to see it and, in fact, would love to see them do it better. I seriously think that you and Panos are still under adult supervision and shouldn't be allowed on the internet.
 
Why would you fall about laughing if AMD does raytracing just as well as Nvidia? Surely this is a good thing? I would love to see it and, in fact, would love to see them do it better. I seriously think that you and Panos are still under adult supervision and shouldn't be allowed on the internet.

Because it means Nvidia will be doing AMD's job for them.

Nvidia pays off all the devs to implement RTX features and AMD gets a free ride, as their cards are similar hardware, pipeline-wise, so it should be easy to make it work on AMD; thus Nvidia can't simply leave a huge performance gap, as AMD can also benefit from it.

Adult supervision? Oh noes you hurt my feels :(
 
Something tells me AMD will also be able to run raytracing-type workloads if it's done via Vulkan, and that AMD might be surprisingly good at it.

Especially as it needs a Microsoft update in October to even be enabled in the first place; MS are making the push for RT, yet nvidia are using Vulkan. :confused:

People seem to have missed how much nVidia have pushed Vulkan, not out of any interest in the goals of Vulkan but, rather than fighting it, so as to be able to shape it to their advantage. While AMD have done relatively little with it, nVidia have been on the offensive: https://developer.nvidia.com/Vulkan

So I don't believe that in the long run their embracing Vulkan will really be to AMD's advantage; quite the opposite, in fact.
 
People seem to have missed how much nVidia have pushed Vulkan, not out of any interest in the goals of Vulkan but, rather than fighting it, so as to be able to shape it to their advantage. While AMD have done relatively little with it, nVidia have been on the offensive: https://developer.nvidia.com/Vulkan

Maybe so, but it's still something that was originally an AMD API; I don't know how much of that remains in Vulkan, but it's a bit odd seeing nvidia jump on it. And there are still no real answers as to how the RTX effects are implemented: is Vulkan somehow layered into D3D to produce the effects, or is the game running in a Vulkan mode, like BF4 had with its Mantle mode?
 
Why would you fall about laughing if AMD does raytracing just as well as Nvidia? Surely this is a good thing? I would love to see it and, in fact, would love to see them do it better. I seriously think that you and Panos are still under adult supervision and shouldn't be allowed on the internet.
Would love to see it also. I would actually like to see AMD catch up and Intel come in and take market share away from Nvidia. This would ultimately benefit everyone, and we would never have an xx80 Ti class GPU cost a four-figure sum again. Those should be around £600-650 at launch, but currently even the non-Ti is not priced like that.

Truth is, I cannot see AMD catching up for at least 2-3 years (at the high end), and I also cannot see Intel coming in and taking the performance crown with their first cards. So I'm happy to stay with my G-Sync monitor until the warranty runs out in 2 years' time at the very least. If the 30 series does a much better job on price for performance and I get that, then it would probably last me many years anyway.

If Nvidia also price the 30 series silly and AMD do a better job of providing better price for performance, I might just sell my G-Sync monitor and go to FreeSync 2. RTX will also have a big say in that though: if it turns out to be good in 12 months' time and games like Cyberpunk 2077 support it, I will be happy to pay the premium for that.
 
Interesting; you can see AMD were in the running in 2010-2012. Look at the NV pricing back then.
Yep. That was the first thing I noticed :(

Those were the great days. At one point I think I purchased only AMD cards for nearly a decade, as they had the best price for performance. They lost the plot with Fury and Vega, unfortunately :(
 