Another RTX GPU Moan Thread

  • Thread starter: Guest
Either way, this is the first time I've ever actually regretted buying from OcUK; if this had been Amazon I'd have returned it by now for a refund and bought a 1080 Ti or a crap card till the new AMD offerings arrive.

What does your place of purchase matter here? I fail to see how buying from OcUK has exacerbated the sht show that is RTX :confused:
 
I tried loads of stuff, then deleted it; I'm not going to download the game again, and everyone else was saying it would only work at 4K. Either way, this is the first time I've ever actually regretted buying from OcUK; if this had been Amazon I'd have returned it by now for a refund and bought a 1080 Ti or a crap card till the new AMD offerings arrive.
I'm enjoying mine. FF doesn't seem like a game I'll enjoy, so I'm not bothered, but I really hope DLSS gets implemented in other games soon and supports 1440p too, without downsampling from 4K. Quite enjoying BFV with RT on though. There was an NV slide somewhere showing 1440p and 4K DLSS performance, but I cannot find it now; it is meant for 1440p too. It is a bit rubbish, though, that they've only done 4K in that game, especially as DLSS seemed the easier of the two new techs to support, yet they've only done half a job on it :).
 
DLSS is a framerate booster; you don't need that at 1440p on an RTX card...
The more FPS the merrier :). BF5 with RT on gets me around 73+ FPS at 1440p; a boost of another 25%+ would be most welcome :). I'm using the 2080 with a 6700K at the moment and getting a few more FPS out of it than I did with a 1950X, but gameplay felt smoother with the 1950X. Either way, a 25% boost even with a 2080 will be good to have. Some say they can't game at anything under 120 FPS :D.
I don't think anyone would turn down a 25% boost in FPS whatever the resolution when that potential exists in the GPU.
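
As a rough sanity check of those numbers (a minimal sketch; the 73 FPS figure and the 25% uplift are just the figures quoted above, not measurements):

    # Projected frame rate if DLSS delivered the hoped-for ~25% uplift
    # on the ~73 FPS quoted above for BF5 with RT on at 1440p.
    base_fps = 73
    uplift = 0.25
    print(f"{base_fps * (1 + uplift):.0f} FPS")  # prints "91 FPS"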
 
DLSS in FF15 is 4K only; I found a site (forgot what it was) where they did screenies showing the option greyed out at 1440p and 1080p. Pretty big fail in my opinion.

Also check the Gamers Nexus video on it; it causes a ton of shimmering (flickering), even when standing still.
 
DLSS in FF15 is 4K only; I found a site (forgot what it was) where they did screenies showing the option greyed out at 1440p and 1080p. Pretty big fail in my opinion.

Also check the Gamers Nexus video on it; it causes a ton of shimmering (flickering), even when standing still.


Well, the whole purpose of DLSS was to make 4K gaming more viable in demanding games, so it doesn't really have much value at lower resolutions.
It could be made to work, but new models would have to be generated. By default DLSS will render at something like 1800p and then upscale to 4K, giving an image quality that should be better than 4K with standard temporal AA.
I imagine there might be value in a 1080p mode for low-end cards that render at, say, 850p and upscale, but those cards would need the Tensor cores, so it is not obvious whether that would make sense versus simply offering more CUDA cores.

The shimmering does need to be corrected though.
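
To put that rendering-resolution trade-off in concrete terms, here is a minimal pixel-count sketch (the internal resolutions are just the illustrative figures from this post, not published DLSS scale factors):

    # Rough pixel-count comparison for DLSS-style "render low, upscale to native".
    # 1800p -> 4K is the example above; ~850p (1512x850) -> 1080p is the
    # hypothetical low-end mode also mentioned above.
    def pixels(width, height):
        return width * height

    cases = [
        ("4K (3840x2160) rendered internally at 1800p (3200x1800)", (3200, 1800), (3840, 2160)),
        ("1080p (1920x1080) rendered internally at ~850p (1512x850)", (1512, 850), (1920, 1080)),
    ]

    for label, internal, native in cases:
        saving = 1 - pixels(*internal) / pixels(*native)
        print(f"{label}: ~{saving:.0%} fewer pixels shaded per frame")

That pixel saving is where the extra framerate headroom comes from; the Tensor-core upscale then has to recover the image quality.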
 
I would love to see this in 3440x1440. 4K isn't in my future plans either, so DLSS could be a waste for me if it doesn't do my res :(
 
DLSS in FF15 is 4K only; I found a site (forgot what it was) where they did screenies showing the option greyed out at 1440p and 1080p. Pretty big fail in my opinion.

Also check the Gamers Nexus video on it; it causes a ton of shimmering (flickering), even when standing still.

The flickering could be solved by improving the analysis back at Nvidia HQ, or even by defaulting transparent textures to TAA only. I would hope more resolutions are catered for in the long term, as the RTX 2060 is on its way.
 
The same process that makes it viable for 4K needs to be done for other resolutions in order to make it beneficial. Inherently, the DL technology relies heavily on looking at raw pixels to distinguish onscreen objects.
 
I would love to see this in 3440x1440. 4K isn't in my future plans either, so DLSS could be a waste for me if it doesn't do my res :(
Guess each res is at NV's discretion. Not clear to me if they have to run each resolution point through the machine.

Would be pretty dumb of NV to limit it to 4K only though; as has been said, it would be wasted on most of the 20 series range.
 
Have you seen the benchmarks for DLSS in FF? The 2070 isn't very far behind the 2080 at all; I was expecting a much bigger gap.
I didn't think there was much between any of them with DLSS? I remember seeing 60 FPS for the 2080 and 68 FPS for the 2080 Ti, which makes the Ti only 13% faster than the 2080.

On the shimmering, one of the reviewers said it was there without DLSS. Maybe people are noticing it more because they're examining the screen more closely with DLSS?
 
I would love to see this in 3440x1440. 4K isn't in my future plans either, so DLSS could be a waste for me if it doesn't do my res :(

I think in a couple of years widescreen monitors will be running at higher resolutions than 3440x1440 and using more pixels than 4K; when this happens, DLSS will be an option to use.
 
Well, the whole purpose of DLSS was to make 4K gaming more viable in demanding games, so it doesn't really have much value at lower resolutions.
It could be made to work, but new models would have to be generated. By default DLSS will render at something like 1800p and then upscale to 4K, giving an image quality that should be better than 4K with standard temporal AA.
I imagine there might be value in a 1080p mode for low-end cards that render at, say, 850p and upscale, but those cards would need the Tensor cores, so it is not obvious whether that would make sense versus simply offering more CUDA cores.

The shimmering does need to be corrected though.

Don't know why you think it has no value at lower resolutions; I have played plenty of games where I cannot hit my frame target at 1080p and 1440p. Ironically, FF15 is one of those games: an RTX 2080 cannot sustain 60 FPS at 1440p in FF15, for example.
 
I didn't think there was much between any of them with DLSS? I remember seeing 60 FPS for the 2080 and 68 FPS for the 2080 Ti, which makes the Ti only 13% faster than the 2080.

On the shimmering, one of the reviewers said it was there without DLSS. Maybe people are noticing it more because they're examining the screen more closely with DLSS?

In the GN video he does a side-by-side with and without; you can clearly see that without DLSS there is either no shimmering or significantly less.
 
Don't know why you think it has no value at lower resolutions; I have played plenty of games where I cannot hit my frame target at 1080p and 1440p. Ironically, FF15 is one of those games: an RTX 2080 cannot sustain 60 FPS at 1440p in FF15, for example.
Agree. Everyone wants higher FPS, and if the capability is built into the card to provide it then it should be used, and it will be in this case I think. There was definitely a chart somewhere showing the gains at both 1440p and 4K, but for the life of me I cannot find it now. I think it was one provided by NV at some point. Maybe that's why I can't find it: they've removed it!!!! :D
I tried the FF demo and was for sure only getting around 60 FPS out of the 2080, which will likely be about the same as the 1080 Ti. DLSS should give the 2080 another 10+ FPS if it were added.
Reminds me of the "At 1080p you don't need anything more than a 1070" talk or similar :). Mooorreeeee power (well, FPS anyway).
In the GN video he does a side-by-side with and without; you can clearly see that without DLSS there is either no shimmering or significantly less.
Maybe the shimmering is the AI filling in shimmering with more shimmering :).
The review I was referencing was from the young lad. Dunno what his name is; he looks like he should be in school :p
 
Agree. Everyone wants higher FPS, and if the capability is built into the card to provide it then it should be used, and it will be in this case I think.

I agree, but we all know that if AMD had been the first to lower IQ by upscaling for faster performance, there would have been hell on here; the Nvidia lot would have been livid, calling them cheating *****, the lot of them :p

They actually did it before, when AMD were lowering the IQ through their drivers for faster performance; it was all over the usual sites as well, and everyone was having a go at them :D

They had to stop it in the end, and just made some excuse about it being a bug that they'd fixed. :p
 
I agree, but we all know that if AMD had been the first to lower IQ by upscaling for faster performance, there would have been hell on here; the Nvidia lot would have been livid, calling them cheating *****, the lot of them :p

They actually did it before, when AMD were lowering the IQ through their drivers for faster performance; it was all over the usual sites as well, and everyone was having a go at them :D

They had to stop it in the end, and just made some excuse about it being a bug that they'd fixed. :p

I actually thought this myself... Nvidia is just "cheating" the resolution in a new way and marketing it as a good thing. It's confusing to see people praising them for something that, had even Nvidia attempted it 5 years ago, people would have called out, let alone AMD implementing anything similar (as shown by the uproar when AMD toned down over-tessellation to improve performance with ZERO visual degradation).
 
I actually thought this myself... Nvidia is just "cheating" the resolution in a new way and marketing it as a good thing. It's confusing to see people praising them for something that, had even Nvidia attempted it 5 years ago, people would have called out, let alone AMD implementing anything similar (as shown by the uproar when AMD toned down over-tessellation to improve performance with ZERO visual degradation).

The tessellation thing was a total joke, especially considering the screenshots of Crysis 2 showing flat concrete barriers getting a ton of tessellation added to them for absolutely zero visual difference. Even random pieces of wood could get a ton added to them and it made barely any difference. Some examples of that here:

https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Thankfully that kind of crap is mostly in the past now.
 
Tessellation can be pretty nice visually, but ultimately the same thing can also be achieved without it, and I sort of agree that it's a tech perhaps designed to slow down GPUs with the aim of selling faster GPUs.

It's sort of like how I don't like DX11 vs DX9: DX9 games can look very nice and tend to need a lot less GPU power to do it than DX11, and DX9 also supports MSAA far more easily and, as a result, SGSSAA. In the DX11 era we have had many AA methods used by devs, but they are all inferior to the much older MSAA and SSAA, and of course SGSSAA.

Half the time I think software enhancements are only done to sell new GPUs.
 