Geforce GTX1180/2080 Speculation thread

Only 30-40 fps on a 2080Ti? Is this right?

http://www.pcgameshardware.de/Grafi...ormance-in-Shadow-of-the-Tomb-Raider-1263244/

Looks like it's on par with or slightly better than a 1080Ti in performance? If true, this is truly the next step in laughable.


That isn't an apples-to-apples comparison; the 20 series is tested with RT.


Put it this way: Turing is an iteration of Pascal with more CUDA cores. The worst-case scenario is that it performs the same per core and per clock. There is a 30% bump at the lower bound, probably 40% at least.
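A quick back-of-the-envelope check on that lower bound (a minimal sketch only: the core counts and reference boost clocks below are the announced spec-sheet figures, not measured performance, and per-core/per-clock parity with Pascal is exactly the assumption being tested):

```python
# Sketch: scale by CUDA cores x boost clock, assuming identical per-core,
# per-clock throughput to Pascal (the "worst case" in the post above).
# Spec figures are the announced reference numbers and may differ from
# shipping/Founders Edition cards.
cards = {
    "GTX 1080 Ti": {"cores": 3584, "boost_mhz": 1582},
    "RTX 2080 Ti": {"cores": 4352, "boost_mhz": 1545},
}

def relative_throughput(card, baseline="GTX 1080 Ti"):
    """Cores x clock for `card`, relative to `baseline`."""
    a, b = cards[card], cards[baseline]
    return (a["cores"] * a["boost_mhz"]) / (b["cores"] * b["boost_mhz"])

print(f"2080 Ti vs 1080 Ti, worst case: {relative_throughput('RTX 2080 Ti'):.2f}x")
# ~1.19x on these numbers, so a 30% lower bound already assumes some
# per-core or per-clock improvement over Pascal.
```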
 
Makes you wonder how Pascal and Vega and all the other earlier cards will cope when ray tracing is on :D

Pretty sure you can just turn those options off :) From what I can gather this stuff's going to be added later as a patch to many titles, with the option to enable it if your hardware supports it... Not even Nvidia will gimp their old products just to put one over on AMD. Well, yeah, they would, but not that blatantly obviously... or would they? :)

Anyhow, it doesn't bode well for the 2080 Ti when it's dipping to 30fps in that new Tomb Raider game @ 1080p; the highest fps I saw it hit was like 78fps or something...
 
That isn't an apples-to-apples comparison; the 20 series is tested with RT.


Put it this way: Turing is an iteration of Pascal with more CUDA cores. The worst-case scenario is that it performs the same per core and per clock. There is a 30% bump at the lower bound, probably 40% at least.
Personally think that is very optimistic but we shall see.
 
That isn't an apples-to-apples comparison; the 20 series is tested with RT.


Put it this way: Turing is an iteration of Pascal with more CUDA cores. The worst-case scenario is that it performs the same per core and per clock. There is a 30% bump at the lower bound, probably 40% at least.

While you are probably right, how can you know the CUDA cores are the same as the ones on Pascal? If the 2080 is to beat the 1080Ti it will need to be a fair bit more efficient per core or have a significantly higher clock. Only then would it come close to justifying the price.
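Following that logic with the announced core counts (a rough sketch; the clocks are spec-sheet reference figures, and the required uplift is just arithmetic, not a prediction):

```python
# Sketch: how much more work per core, per clock (or how much more clock)
# a 2080 would need just to match a 1080 Ti, given the announced core counts.
# Reference boost clocks are assumptions taken from the spec sheets.
p1080ti = {"cores": 3584, "boost_mhz": 1582}
p2080   = {"cores": 2944, "boost_mhz": 1710}

# Required per-core, per-clock uplift at the listed boost clocks:
uplift = (p1080ti["cores"] * p1080ti["boost_mhz"]) / (p2080["cores"] * p2080["boost_mhz"])
print(f"Per-core efficiency needed just to match a 1080 Ti: {uplift:.2f}x")  # ~1.13x

# Or, with Pascal-identical cores, the clock the 2080 would need:
clock_needed = p1080ti["cores"] * p1080ti["boost_mhz"] / p2080["cores"]
print(f"Clock needed with no per-core gains: {clock_needed:.0f} MHz")  # ~1926 MHz
```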
 
Pretty sure you can just turn those options off :) From what I can gather this stuff's going to be added later as a patch to many titles, with the option to enable it if your hardware supports it... Not even Nvidia will gimp their old products just to put one over on AMD. Well, yeah, they would, but not that blatantly obviously... or would they? :)

Anyhow, it doesn't bode well for the 2080 Ti when it's dipping to 30fps in that new Tomb Raider game @ 1080p; the highest fps I saw it hit was like 78fps or something...
I could happily play Tomb Raider games at 30 fps in truth. Twitch shooters not so much but G-Sync does a decent job of keeping everything smooth. And if it is too much of a frame hog, I can always turn it off also. Not like it is compulsory ;)
 
That isn't an apples-to-apples comparison; the 20 series is tested with RT.


Put it this way: Turing is an iteration of Pascal with more CUDA cores. The worst-case scenario is that it performs the same per core and per clock. There is a 30% bump at the lower bound, probably 40% at least.

Umm, not buying it. Supposedly Turing has hardware dedicated to the ray tracing stuff; if they enable that, the rest of the GPU picks up the rasterization slack... IMHO Turing is basically Pascal with ray tracing hardware bolted on and a small bump in perf...

I reckon the whole thing's going to be a massive disappointment.
 
That isn't an apples-to-apples comparison; the 20 series is tested with RT.


Put it this way: Turing is an iteration of Pascal with more CUDA cores. The worst-case scenario is that it performs the same per core and per clock. There is a 30% bump at the lower bound, probably 40% at least.
Except 30% doesn't cover the price increases...
 
I could happily play Tomb Raider games at 30 fps in truth. Twitch shooters not so much but G-Sync does a decent job of keeping everything smooth. And if it is too much of a frame hog, I can always turn it off also. Not like it is compulsory ;)

Can honestly say the games they demoed yesterday don't really interest me. However, should they lob that fancy RTX stuff into Division 2, or a decent ARPG or MMORPG, then I'm all over it :) That said, shadows are generally the first thing that gets turned off on most raiding PCs, as they're notoriously bad for FPS with tons of people on screen at once.

But I play a lot on my Xbox now, and 30fps doesn't bother me at all either. But you're also correct, you can switch this stuff off; it would just suck to have to do that if you'd just weighed out £1100+ for a card designed to make use of that stuff. Kinda defeats the purpose of the card itself :(
 
Given that when these prices (in the UK) were announced the £ was at a hell of a low ($1.25/6, I think),

assuming it picks up again, do you think NV will re-evaluate its RRP in Blighty and therefore pressure third-party cards / retailers to do the same?

Right now, one of the things grinding my gears is the terrible conversion rate we are getting.
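For a feel of how much the rate matters, here is a rough sketch (assumptions: the $1,199 Founders Edition US MSRP, the roughly £1,100 UK price mentioned above, and 20% UK VAT on top of the tax-free US price; real RRPs also bake in distribution costs and currency hedging):

```python
# Sketch: what a $1,199 ex-tax US MSRP implies for a UK price, inc. 20% VAT,
# at different GBP/USD exchange rates. Figures are assumptions, not quotes.
USD_MSRP = 1199
UK_VAT = 1.20

for gbp_usd in (1.25, 1.30, 1.35, 1.40):
    uk_price = USD_MSRP / gbp_usd * UK_VAT
    print(f"At £1 = ${gbp_usd}: ~£{uk_price:.0f} inc. VAT")
# ~£1,151 at $1.25 but ~£1,028 at $1.40, so a sterling recovery would
# leave room to trim a ~£1,100 RRP if NV chose to pass it on.
```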

No way am I going down to 30fps. Even if the eye candy is nice, for most games I would rather have it off and have 4K/60.
 
Won't that be just when Ray Tracing is enabled? We need to know what the FPS is with RT turned off...

That's what I thought, but surely the rest of the card picks up the rasterization parts of the rendering if you turn off the RTX features. Hopefully you'd get better FPS then, but that defeats the purpose of buying a card designed to use those features you just turned off???
 