thick and fast? come on now
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"Show us all these 3080s, yo!"

Nah, they are coming thick and fast now.
"Same."

Then I, for one, will leave it. My 2080 Ti is perfectly good enough for now and will keep me going until the 40xx or 7xxx series arrive. I suspect many people will do the same.
Actually, there's a modest chance my next discrete GPU will be from Intel. If the leaks are true then Intel could have a very competitive product. It just comes down to price.
"Considering DLSS has comparatively little negative visual impact, I don't see what the problem is with the 3080 using DLSS to get playable framerates with RT Ultra on."

Same.
The 3080 can't do 4K with ultra RT on.
It still needs DLSS turned on.
"True."

Considering DLSS has comparatively little negative visual impact, I don't see what the problem is with the 3080 using DLSS to get playable framerates with RT Ultra on.
"Well, yes, DLSS works comparatively well on every card that supports DLSS... but a 3080 is around 50% faster than a 2080 Ti at 4K whether DLSS is on or off."

True.
Then again, DLSS works really well on a 2080 too.
"Nice."

Well, yes, DLSS works comparatively well on every card that supports DLSS... but a 3080 is 50% faster than a 2080 Ti at 4K whether DLSS is on or off.
"How long have you been a member talking about GPUs in these threads? Six months after Ampere's release, and a gazillion reviews later, you still don't know how much faster a 3090 is than a 3080? Go and see a doctor, because I think someone lobotomised you while you were sleeping."

Nice.
How about the 3090?
Well, yes, DLSS works comparatively well on every card that supports DLSS... but a 3080 is around 50% faster than a 2080 Ti at 4K whether DLSS is on or off.
"I find it interesting that RT is supposed to save us from faked lighting effects, but it requires faked resolution to do so. We are just trading one cheat for another."

Indeed.
"Only because they butchered the non-RT IQ."

The difference is that the 'faked resolution' has a minimal effect on image quality, and the ray tracing generally has a significant improvement.
Only because they butchered the non-RT IQ.
Check out many non-RT games and the IQ looks very similar to the RT Ultra games. Barely any difference.
"Yea, I think DLSS is of course a more meaningful feature than RT."

Ok. It's still irrelevant. That kind of technology works with rasterised games as well. You can push higher detail and more frames on the same hardware...
RT or not, upscaling technology clearly benefits all gamers. You can do more with less hardware without a noticeable drop in rendering quality. Just think about the quality of the experience you could get out of something like a Nintendo Switch if they could get it working properly. Like I said, people who argue against it are just clutching at straws.
Note: I used the term 'upscaling technology'. I couldn't care less who makes it, as long as it works without a noticeable quality loss.
Only because they butchered the non-RT IQ.
Check out many non-RT games and the IQ looks very similar to the RT Ultra games. Barely any difference.
Also, let's take the latest RT title, The Medium.
Taken from here:
"With DLSS 2.0, our RTX3080 was able to offer constant 60fps at both 1080p and 1440p."
So again, at 4K it will probably be around the 30 fps mark (rough sums below).
The NEXT GEN titles render the 3080 a 1440p card.
Unless one is happy with 30-40 fps performance, that is.
Which, don't get me wrong, is completely playable, especially with a G-Sync/FreeSync display.
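For what it's worth, the back-of-the-envelope arithmetic behind that ~30 fps guess looks like this: a minimal Python sketch assuming frame rate scales inversely with pixel count. Real games are rarely 100% resolution-bound, so treat the result as a rough floor, not a prediction.

```python
# Naive scaling estimate: assume frame time grows linearly with pixel count.
# Real results are usually a bit better, since not all GPU work scales with
# resolution, so treat this as a rough floor.

def pixels(width: int, height: int) -> int:
    return width * height

fps_1440p = 60  # the constant 60 fps at 1440p with DLSS 2.0 quoted above

ratio = pixels(3840, 2160) / pixels(2560, 1440)  # = 2.25
fps_4k = fps_1440p / ratio

print(f"4K has {ratio:.2f}x the pixels of 1440p")
print(f"Naive 4K estimate: {fps_4k:.1f} fps")  # ~26.7 fps, i.e. 'around 30'
```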
I find it interesting that RT is supposed to save us from faked lighting effects, but it requires faked resolution to do so.
We are just trading one cheat for another.
Indeed. I've tried DLSS and don't like it; too much of a drop in image quality, and very noticeable to me. I'd rather have RT off, which is what I've done with CP. Control runs well with RT without the need for DLSS, though, so that's nice.
You are making a pointless statement. No one will play 4K without DLSS 2.0.
Nvidia's aim is that higher resolutions are now achieved through image reconstruction, not brute-force rendering.
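To put rough numbers on that reconstruction point, here is a quick sketch using the per-axis input scale factors commonly reported for DLSS 2.0's modes. These are the widely cited values, not something Nvidia documents or guarantees, so take them as approximate.

```python
# Share of output pixels actually rendered per commonly reported DLSS 2.0
# mode (per-axis scale factors as widely cited in reviews; approximate).

TARGET_W, TARGET_H = 3840, 2160  # 4K output
OUT_PX = TARGET_W * TARGET_H

modes = {
    "Quality":           2 / 3,  # ~2560x1440 internal
    "Balanced":          0.58,
    "Performance":       0.50,   # ~1920x1080 internal
    "Ultra Performance": 1 / 3,  # ~1280x720 internal
}

for name, scale in modes.items():
    w, h = round(TARGET_W * scale), round(TARGET_H * scale)
    print(f"{name:17s} renders {w}x{h} = {w * h / OUT_PX:.0%} of the output pixels")
```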
I've been using DLSS and it is fine.
I don't see the same number of people complaining about variable rate shading, for example. People seem to pick on DLSS all the time, yet VRS is pretty obvious if you deliberately look for it.
Nearly all types of AA (except SSAA) are also means of avoiding using higher resolutions.
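A tiny illustration of why SSAA is the exception: it multiplies the pixels shaded, while post-process AA (FXAA, SMAA, TAA and friends) works on the native-resolution image. Purely illustrative arithmetic, nothing vendor-specific.

```python
# Pixels shaded per frame at 1080p: post-process AA adds (roughly) nothing
# to the render resolution, while 4x SSAA (2x per axis) shades as many
# pixels as native 4K would.

NATIVE_W, NATIVE_H = 1920, 1080

def shaded_pixels(axis_scale: float) -> int:
    return round(NATIVE_W * axis_scale) * round(NATIVE_H * axis_scale)

print(f"Native / post-process AA: {shaded_pixels(1.0):>9,} px")  # 2,073,600
print(f"4x SSAA (2x per axis):    {shaded_pixels(2.0):>9,} px")  # 8,294,400 (= 4K)
```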