
Anybody else resenting AMD because of DLSS?

Honestly, I've lost faith in game developers. When a GTX 1080 gets 35-40 FPS at 1080p in Cyberpunk on mixed low-medium settings, and drops below 30 FPS at literally medium settings, it tells me developers are already optimising around DLSS. AMD will surely provide similar tech, and once that happens, probably every developer will just optimise their games around these techs. At that point it only becomes a matter of image quality, which I can't compare yet, since AMD hasn't even been able to show a demo of their tech. Then again, AMD also has qualified engineers; maybe their version really will be competitive with the latest iteration of DLSS. Honestly, I'm 100% sure AMD will manage to deliver at least a mediocre alternative to the technology, and once it lands, developers will budget their optimisation headroom around these techs. Why wouldn't they? On the quality front, I'd simply bet on Nvidia. But anything can happen!

If DLSS 2.0 ends up looking better than AMD's alternative, I will stick with Nvidia, because I feel these AI upscaling technologies will become necessities very quickly.
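
For anyone wondering what these upscalers actually do to the render workload, here's a minimal sketch using the per-axis scale factors commonly reported for DLSS 2.0's modes (treat the exact values as approximations rather than an official spec; the point is just how little of the output resolution is really rendered):

```python
# Rough illustration of why upscalers change the performance picture: the game
# renders internally at a fraction of the output resolution and the upscaler
# reconstructs the rest. Scale factors are the commonly reported per-axis values
# for DLSS 2.0 modes; treat them as approximations, not an official spec.
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given output resolution and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(3840, 2160, scale)
    share = scale ** 2  # fraction of native pixels actually shaded each frame
    print(f"4K {mode}: renders at {w}x{h} (~{share:.0%} of native pixels)")
```

By those rough numbers, even the quality mode only shades around 44% of the native pixels each frame, which is why a game tuned around an upscaler can look so demanding when run natively.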
 
Don't resent AMD because somebody else implemented a technology and won't let anybody else use it. AMD introduced SAM and now everybody can use it. Everything AMD does, they open source; everything Nvidia does, they keep to themselves. Adaptive sync was a thing before Nvidia muscled in on it and stuck 150 to 250 quid on top of many monitors for using it. AMD and others worked on it before Nvidia implemented it. Cash grab central.

Blame Nvidia for not letting anybody else use their stuff.


Huh? SAM is a PCIe specification called Resizable BAR. All AMD did was rename it for their marketing to hide what it really was. AMD are no angels...

"
The original FreeSync is based on DisplayPort 1.2a, using an optional feature VESA terms Adaptive-Sync.[5] This feature was in turn ported by AMD from the Panel-Self-Refresh (PSR) feature of Embedded DisplayPort 1.0,[6] which allows panels to control their own refresh, intended for power saving on laptops.[7] AMD FreeSync is therefore a hardware–software solution that uses publicly available protocols to enable smooth, tearing-free and low-latency gameplay.

FreeSync has also been implemented over HDMI 1.2+ as a protocol extension. HDMI 2.1+ has its own variable refresh rate system.[8]
"

Again showing AMD took another industry standard and renamed it for marketing, again to hide where it came from.

All these companies are as bad as each other. Don't get me started on Intel either; they vandalised PC CPU progress for years, and then there are all the security issues with their CPUs.
 
I doubt RE will have many RT features, as AMD's cards just can't handle them while staying above 20 fps, so maybe just shadows or something like that.
 
I doubt RE will have many RT features, as AMD's cards just can't handle them while staying above 20 fps, so maybe just shadows or something like that.
It looks like, apart from the usual RT shadows, it also has global illumination, which is a big deal. Will AMD force the dev to lock Nvidia cards out of RT like they did with Godfall, though?

Guess we will see; if adoption rates in games carry on as they are, it'll be a thing of the past :)
I already see lots of people posting "No DLSS, no buy". It feels like this forum lives in its own little alternate universe where DLSS, 4K and ray tracing don't matter. It's all about prehistoric resolutions like 1440p here :rolleyes:. Even console gamers laugh at 1440p these days.
 
It won't be. If anything, we'll need it even more when 8K becomes mainstream; even now it's needed for 4K.
How long has 4K been out, yet it's still considered a premium option in some instances?

I'll give it another decade before people can even talk about 8K becoming mainstream. At which point some other tech will come along.
 
How long has 4K been out, yet it's still considered a premium option in some instances?

I'll give it another decade before people can even talk about 8K becoming mainstream. At which point some other tech will come along.


Well, the first 4K TV came out in 2012, so we're heading for 10 years now. It will be replaced for sure in five years' time, with 8K options becoming standard, just as happened with 1080p when they phased it out and made it near impossible to buy a large 1080p set, leaving 4K as the only option.

So I say in five years' time it will become mainstream, depending on adoption and price. I think price will end up where 4K is now and won't be the main issue; the real question is whether we actually need 8K for TVs of a certain size. For me, 8K only becomes sensible at a minimum of 85" to 100"; anything smaller and 4K is enough from a normal viewing distance. UK houses are also not really great for huge TVs, as we have found out with large 4K sets; unless you make a room for it or redesign your living room, rooms are just too small in most UK houses. To me the perfect size for 8K would be 150", but where would you put a TV that size to make real use of 8K?
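
If anyone wants rough numbers on that, here's a quick back-of-the-envelope sketch comparing angular pixel density for 4K and 8K at a few screen sizes. The ~2.5 m sofa distance is an assumption for illustration, and the usual rule of thumb is that around 60 pixels per degree is about the limit of 20/20 vision:

```python
import math

def pixels_per_degree(width_px, height_px, diagonal_in, distance_m):
    """Approximate angular pixel density of a flat panel at a given viewing distance."""
    ppi = math.hypot(width_px, height_px) / diagonal_in    # linear pixels per inch
    distance_in = distance_m / 0.0254                      # metres -> inches
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Compare 4K vs 8K at a few diagonals from an assumed ~2.5 m viewing distance.
for diagonal in (55, 85, 100):
    ppd_4k = pixels_per_degree(3840, 2160, diagonal, 2.5)
    ppd_8k = pixels_per_degree(7680, 4320, diagonal, 2.5)
    print(f'{diagonal}": 4K ~ {ppd_4k:.0f} px/deg, 8K ~ {ppd_8k:.0f} px/deg')
```

By those rough numbers a 4K panel is already above that threshold at 2.5 m even at 100", which is pretty much the point: 8K only starts to pay off on enormous screens or when sitting much closer.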
 
Honestly, I've lost faith in game developers. When a GTX 1080 gets 35-40 FPS at 1080p in Cyberpunk on mixed low-medium settings, and drops below 30 FPS at literally medium settings, it tells me developers are already optimising around DLSS. AMD will surely provide similar tech, and once that happens, probably every developer will just optimise their games around these techs. At that point it only becomes a matter of image quality, which I can't compare yet, since AMD hasn't even been able to show a demo of their tech. Then again, AMD also has qualified engineers; maybe their version really will be competitive with the latest iteration of DLSS. Honestly, I'm 100% sure AMD will manage to deliver at least a mediocre alternative to the technology, and once it lands, developers will budget their optimisation headroom around these techs. Why wouldn't they? On the quality front, I'd simply bet on Nvidia. But anything can happen!

If DLSS 2.0 ends up looking better than AMD's alternative, I will stick with Nvidia, because I feel these AI upscaling technologies will become necessities very quickly.

You mean in Nvidia-sponsored titles they're 'optimised around DLSS'. I wonder why that might be...

It's too consistent now to be anything other than exactly what Nvidia want, and these games aren't exactly taking graphics to a different level to justify the performance requirements either.

It won't be. If anything, we'll need it even more when 8K becomes mainstream; even now it's needed for 4K.

The most popular gaming resolution: 1080p. The fastest-growing gaming resolution: 1440p. Oh yeah, 8K is gonna be mainstream *real* soon...
 
The most popular gaming resolution: 1080p. The fastest-growing gaming resolution: 1440p. Oh yeah, 8K is gonna be mainstream *real* soon...

I was talking about TVs there; whether it happens for computer monitors we will see, but monitors are getting very big too. A lot of people game on TVs, remember.
 
8K :p

There's only half a **** given about 4K, and it's been around a heck of a long time.

If you mug yourself into buying screens you can't run games on natively, you're just continuing the ancient tradition of lowering the resolution to make it playable :D
 
Still find 4K super meh on my OLED TV for gaming, well, unless I fancy sitting 3-4 feet from a bright 55" display...

It's a shame more people don't get to see HDR on an OLED in their own home setup; that is truly something to shout from the rooftops about :cool:
 
Typical ignorant responses from people who I'd wager have never actually used it.

And if you have actually used it, maybe you've only seen 1.0? DLSS 2.0 at the quality setting is incredibly good. I was skeptical myself, until I actually tried it for myself.
DLSS 3.0 will be ridiculously good, I've no doubt.

I literally just got a 3000-series card, and the first thing I did was try it on Cyberpunk; it's exactly the same as using a monitor at a non-native res. It reminded me of 2008, trying to run Crysis.
 
8K :p

There's only half a **** given about 4K, and it's been around a heck of a long time.

If you mug yourself into buying screens you can't run games on natively, you're just continuing the ancient tradition of lowering the resolution to make it playable :D


Welcome to a PC enthusiast forum. That's what we do here, we want the best tech available ;)
 
Still find 4K super meh on my OLED TV for gaming, well, unless I fancy sitting 3-4 feet from a bright 55" display...

It's a shame more people don't get to see HDR on an OLED in their own home setup; that is truly something to shout from the rooftops about :cool:

Agree, HDR is the game changer this time, more than resolution; I wonder what the game changer will be with 8K to make them easier to sell. HDR and larger screens are what sold me on 4K, as I knew anything larger than 55" at 1080p would look rubbish because there isn't enough resolution for the size.
 
I literally just got a 3000-series card, and the first thing I did was try it on Cyberpunk; it's exactly the same as using a monitor at a non-native res. It reminded me of 2008, trying to run Crysis.

Have you made sure it is set to "quality", and is your monitor running at its native resolution?

I never lower the resolution on my monitor, even if it means awful fps, purely because of how terrible it looks; that's down to how **** monitors are at scaling (no problem doing this on my TV, as even 1080p looks bloody good). But DLSS 2.0 is fantastic even on a **** LCD monitor.





Either way, I completely agree with wunkley; it only seems to be people who have never used DLSS, or who are AMD loyalists, that diss DLSS 2.0. If the fundamental tech weren't impressive, I don't think AMD and Microsoft would also be working on the exact same thing. Plus, every tech outlet on YouTube in the know rates DLSS 2.0 as a superb feature and has gone into in-depth analysis to back up that claim, rather than one-liners on a random forum with no proof or explanation of why it is just "lowering resolution".

Thankfully all the games I have or intend to play have DLSS 2.0. 150 hours in Cyberpunk; just waiting on the next patch to finish my second playthrough :cool: Currently playing Control. Pretty hyped for Dying Light 2 (expecting it to be a mess on release though...).
 
Poor old Jesse and her poor eyesight.

[screenshot: MBmgNkD.png]


Has to take a couple of steps closer so she can see the pictures clearly. :p

[screenshot: oZ5l8qP.png]
 