
Geforce GTX1180/2080 Speculation thread

Can you explain to me why you have to reduce your frame rate to below your refresh rate with G-Sync?

And more to the point, doesn't G-Sync make high-end cards redundant if all you have to do is maintain the refresh rate? There are many out there that still have 1080p monitors at 60Hz, 100Hz and 120Hz.
A 1070 Ti/1060 will more than likely satisfy one's need for smooth gameplay, which again is the point of having such a monitor: it mitigates the need for a faster video card.

Vsync adds a lot of input lag and also needs a constant 60fps to remain smooth; any time you're using vsync and your framerate drops even by a few frames, it gets choppy.

With G-Sync you don't get the input lag, and your framerate can drop and still remain smooth.

You need to limit your frames with G-Sync because if it goes above your refresh rate it turns into vsync and you get input lag.
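
To make the "drops even by a few frames" point concrete, here is a minimal sketch (my own illustration, not from the thread) of how vsync rounds every frame up to the next refresh interval, while a VRR display tracks the render time directly - assuming a 60Hz panel:

```python
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def vsync_interval(render_time, vblank=VBLANK):
    """With vsync, a finished frame waits for the next vblank, so the
    effective frame interval is rounded UP to a whole number of
    refresh periods."""
    return math.ceil(render_time / vblank) * vblank

def vrr_interval(render_time, min_interval=VBLANK):
    """With G-Sync/FreeSync the panel refreshes when the frame is
    ready, so the interval simply tracks the render time."""
    return max(render_time, min_interval)

# Rendering at 60 fps (16.0 ms): hits every vblank -> smooth 16.7 ms cadence.
# Rendering at ~55 fps (18.2 ms) under vsync: misses the vblank and waits
# for the next one -> 33.3 ms intervals, i.e. an effective 30 fps stutter.
# The same 18.2 ms frame on a VRR panel just displays 18.2 ms later.
```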
 
You can still get tearing at high refresh rates but yes, it certainly works better towards the lower end of its working range.

Vsync adds a lot of input lag and also needs a constant 60fps to remain smooth; any time you're using vsync and your framerate drops even by a few frames, it gets choppy.

With G-Sync you don't get the input lag, and your framerate can drop and still remain smooth.

You need to limit your frames with G-Sync because if it goes above your refresh rate it turns into vsync and you get input lag.
Great, thanks for the explanation. I thought G-Sync would clamp down on frame rates, since it's hardware-based in the monitor.
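
The frame cap described above is usually done with an in-game limiter or an external tool; a bare-bones sleep-based limiter looks something like this (a hypothetical sketch - the 144Hz refresh and the "refresh minus 3" cap are assumed numbers, not from the thread):

```python
import time

REFRESH_HZ = 144
CAP_FPS = REFRESH_HZ - 3      # stay a few fps below refresh,
MIN_FRAME = 1.0 / CAP_FPS     # keeping G-Sync engaged instead of vsync

def limit_frame(frame_start):
    """Sleep out the remainder of the frame budget so each frame
    takes at least MIN_FRAME seconds (i.e. fps <= CAP_FPS)."""
    elapsed = time.perf_counter() - frame_start
    if elapsed < MIN_FRAME:
        time.sleep(MIN_FRAME - elapsed)

# Typical game-loop usage:
# while running:
#     start = time.perf_counter()
#     render_frame()
#     limit_frame(start)
```

Real limiters use busy-waiting for the last fraction of a millisecond, since `time.sleep` granularity is too coarse for precise frame pacing, but the principle is the same.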
 
I game on a 4K 55-inch TV; can't say I've noticed this lag, to be honest.

Whether you notice the lag from vsync depends on what games you play and how used to it you are.

If I'm playing something like Monster Hunter or GTA, where I'm just walking about relaxing, then vsync is fine; I don't notice the lag at all.

But if I'm playing something fast-paced and competitive like Mortal Kombat or Overwatch, then the 70ms or so vsync adds is a big deal, and there's no way I would be able to compete at high rank with it on.

An easy way to test the difference is to play with vsync off for an hour or so on your favourite game and then go back to vsync on; you will probably notice the lag straight away.
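
The "70ms or so" figure above is plausible as back-of-the-envelope arithmetic: at 60Hz each refresh takes ~16.7ms, and vsync with buffering can queue a frame for several refreshes before scan-out. The numbers below are illustrative assumptions, not measurements:

```python
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh at 60Hz

buffered_frames = 3                        # assumed: render queue + back buffers
queue_lag_ms = buffered_frames * FRAME_MS  # ~50 ms waiting for scan-out
display_lag_ms = 20                        # assumed display/TV processing time
total_ms = queue_lag_ms + display_lag_ms   # ~70 ms, in line with the post
```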
 
Was the 1080Ti the full-fat chip? I thought that was the Titan Xp?
I meant we have to make sure the 2080Ti is using the "big chip" (i.e. like the current Titan and the 1080Ti), not a "medium chip" (like the 1070/1080).

Wouldn't put it past Nvidia to try to pull a "wool over our eyes" renaming of the 70/80 cards as 80/80Ti, with all the bait and switch they have done with the current-gen cards. That's why I said people need to keep their eyes wide open and watch carefully.

If the 2080Ti turns out to be using the same medium-sized chip as the 2080, then it might not be long before we see the return of the "Ultra" naming, with big-chip cards being called something like 2080 Ultra; or if they're going to release two big-chip cards, they might call them 2080 Ultra/2080 Ultra+ :p
 
Exactly.

Also, if Gibbo is saying Pascal card prices are going up, that means his stock is running low and he won't need to drop the price to clear out the remainder. Believe me, if he has not sold them by the time Turing hits his warehouse in high quantities, he will either get some kind of rebate or lower the price in a one-time special to get rid of the remainder.

He has already said in another thread they have low numbers of Pascal cards apart from maybe one type. All retailers have known this release was coming, so they would have only bought the bare minimum of stock to cover themselves.
 
If they are RTX cores they are going to be pretty big irrespective of their traditional gaming performance, heh - it is going to be interesting in comparison against Pascal there.
 
I meant we have to make sure the 2080Ti is using the "big chip" (i.e. like the current Titan and the 1080Ti), not a "medium chip" (like the 1070/1080).

Wouldn't put it past Nvidia to try to pull a "wool over our eyes" renaming of the 70/80 cards as 80/80Ti, with all the bait and switch they have done with the current-gen cards. That's why I said people need to keep their eyes wide open and watch carefully.

If the 2080Ti turns out to be using the same medium-sized chip as the 2080, then it might not be long before we see the return of the "Ultra" naming, with big-chip cards being called something like 2080 Ultra; or if they're going to release two big-chip cards, they might call them 2080 Ultra/2080 Ultra+ :p
I doubt it is using the same chip given the big difference in number of Cuda cores.
 
I meant we have to make sure the 2080Ti is using the "big chip" (i.e. like the current Titan and the 1080Ti), not a "medium chip" (like the 1070/1080).

Wouldn't put it past Nvidia to try to pull a "wool over our eyes" renaming of the 70/80 cards as 80/80Ti, with all the bait and switch they have done with the current-gen cards. That's why I said people need to keep their eyes wide open and watch carefully.

If the 2080Ti turns out to be using the same medium-sized chip as the 2080, then it might not be long before we see the return of the "Ultra" naming, with big-chip cards being called something like 2080 Ultra; or if they're going to release two big-chip cards, they might call them 2080 Ultra/2080 Ultra+ :p

This is exactly what I expect to happen. A 2080 Ultra (or 2090) will use the 102 chip. If Nvidia are feeling particularly devious they could even release a full-fat 102 chip branded as a 2180/2180Ti and release as many variants as they like. Think of the marketing: you just bought a 2080Ti (104 chip) six months ago, then Nvidia release a 2180 and possibly a 2180Ti. You'd feel like you were on the previous generation and want to upgrade. Might explain the jump from 10 to 20 instead of 10 to 11...

edit: Look at the way they snuck in the 1070Ti. The more price bands they have, the more they can push up the RRP of the top card.

edit 2: The only way the 2080Ti is the full-fat chip is if the 2080 is slower than the 1080Ti.
 
I meant we have to make sure the 2080Ti is using the "big chip" (i.e. like the current Titan and the 1080Ti), not a "medium chip" (like the 1070/1080).

Wouldn't put it past Nvidia to try to pull a "wool over our eyes" renaming of the 70/80 cards as 80/80Ti, with all the bait and switch they have done with the current-gen cards. That's why I said people need to keep their eyes wide open and watch carefully.

If the 2080Ti turns out to be using the same medium-sized chip as the 2080, then it might not be long before we see the return of the "Ultra" naming, with big-chip cards being called something like 2080 Ultra; or if they're going to release two big-chip cards, they might call them 2080 Ultra/2080 Ultra+ :p
They will get a lot of stick if they do that, but no doubt people will get used to it in a generation or two and accept it.

I do think the 2080Ti is the big chip and not the same as the 2070/80, but no doubt the Titan will come along with more cores.
 
I know, I know, I will get moaned at for this, but:

HDMI isn't free; it requires an adopter's fee and royalties of $0.15 per unit sold. If the manufacturer uses the HDMI logo, this fee drops to $0.05 per unit sold.

Of course people will moan at you, because the point you are trying to make is daft. You still don't have to pay AMD anything to use VRR, and the licence fee for HDMI has to be paid if you use HDMI whether it is VRR-compatible or not.
 
The 2070, I would imagine, would be the perfect card as you are on 1440p. You get fresh new tech, a shiny toy to bench and play with, and some boost in fps until 7nm comes out. But the question is, can you wait a month, or will you just say **** it and buy a 2080Ti on day one while part-financing Jensen's next leather jacket? :p;)
Unfortunately I think it will be the latter, based on previous experience with willpower :p
 
I meant we have to make sure the 2080Ti is using the "big chip" (i.e. like the current Titan and the 1080Ti), not a "medium chip" (like the 1070/1080).
I mean, you are exactly right. The 1080 is a medium-sized chip, and a mid-range card.

But God help you if you call it a mid-range card on this forum. The owners will jump down your throat.

The 1080 is and always has been a mid-range GPU, and I say this as an owner!
 
Everyone in here, after Nvidia announced the 20 series.

[image: JQw5TgG.png]


:D
 
Guys,

Remember the CUDA core count - CCC, as it's dubbed. If there are unusual gaps in CCC between the 2070, the 2080 and the 2080Ti, you know there is going to be another SKU that may be released to fill that gap. Just like the 1070 to 1080.

1070: released Spring/Summer 2016 with 1920 CC, 120 TU
1080: released Spring/Summer 2016 with 2560 CC, 160 TU

Whoa there, before you buy that 1070: that's a pretty large gap between the 1070 and 1080. It almost looks like there is a missing SKU between those two cards. 1070 bought anyway, and all is happy in the land until one fateful morning...

In October 2017, a 1070 Ti was released with 2432 CC, 152 TU. That left a 512 CC gap between the 1070 and 1070 Ti, but only a 128 CC gap between the 1070 Ti and the 1080.

Moral of the story: if you see gaps like this again for Turing, pay attention to history. It might repeat itself.
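
The gap-spotting above is trivial to do yourself with the numbers from the post (Pascal CUDA-core counts; a quick sketch of the same arithmetic):

```python
# CUDA-core counts quoted in the post above
cuda_cores = {"1070": 1920, "1070 Ti": 2432, "1080": 2560}

def ccc_gap(a, b):
    """Absolute CUDA-core-count gap between two SKUs."""
    return abs(cuda_cores[a] - cuda_cores[b])

# The original 1070 -> 1080 line-up left a 640-core hole,
# which the 1070 Ti later split into a 512 + 128 pair.
```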
 