
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

My issue with DLSS is it leaves the door open for devs to actually be a bit lazy with fidelity as they can leverage DLSS to cover for them so to speak.
Well, the latest implementations seem to actually provide an IQ improvement over native resolution textures in some titles, so if in the end it does a better job with less effort then it could lead to being a benefit for some devs who are not as skilled as others. I can't see many downsides to DLSS when compared with the practical and tangible benefits it will bring. It's impressive.
 
My issue with DLSS is it leaves the door open for devs to actually be a bit lazy with fidelity as they can leverage DLSS to cover for them so to speak.

Was looking into wafer yields this morning after reading someone on this forum spouting nonsense about 7nm yields not being good.

Well, actually 7nm is doing extremely well with regards to yields, and not only that, 5nm is ahead of where 7nm was at the equivalent point in its life, which bodes well for the future. Even better news is that TSMC has leveraged EUV, and it looks like their 6nm node is an EUV-refined 7nm which brings a bit more performance. It also seems 7nm products can be moved to 6nm more easily than to 7nm+, and 6nm is supposedly cheaper due to being able to use EUV stages.

What this means is AMD can do a refresh of some products on a cheaper, better-performing node, much like they respun Zen to Zen+ etc.

I think AMD's partnership with TSMC is paying huge dividends; it let them take the fight to Intel. Nvidia, using what seems to be an inferior Samsung 8nm node, may be left open to AMD chasing them with 7nm and potentially refining with 6nm.

Some good points. This is also partly why I'll hold off maxing out the potential of my motherboard, to see what AMD bring out before the move to AM5. I would think a fast Zen3 CPU is going to be able to feed the next 3+ gens of GPUs.
 
Some good points. This is also partly why I'll hold off maxing out the potential of my motherboard, to see what AMD bring out before the move to AM5. I would think a fast Zen3 CPU is going to be able to feed the next 3+ gens of GPUs.
Easily... especially as resolutions increase and games become less CPU-dependent. Many reviews online show that at 4K the difference in CPU performance is negligible between even mid-range and high-end CPUs... even a 7+ year old CPU is within spitting distance of the new ones! Even at 1440p the difference is much less apparent than at 1080p. https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

As usual, 4K is the great CPU equalizer. The demands on the GPU are so high that even a seven-year-old Core i7-4770K now only trails the Core i9-10900K by 8% at 4K ultra. It's still a noticeable 13% behind at 4K medium, but hopefully you get the point we're trying to make: The RTX 3080 is very much a graphics card designed to tackle high-resolution gaming. 4K ultra at 60 fps or more is now totally possible, and you can do it with a Core i7-4770K, Ryzen 5 3600, or go all the way and pair it with a Core i9-10900K or Ryzen 9 3900X.
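To make those percentages concrete, here is a quick back-of-the-envelope calculation (the 70 fps baseline is an assumed number purely for illustration; only the 8% gap comes from the quote above):

```python
# Hypothetical illustration only: the baseline fps is made up,
# the 8% gap is the figure quoted above for 4K ultra.
i9_10900k_fps = 70.0                        # assumed RTX 3080 + i9-10900K average
i7_4770k_fps = i9_10900k_fps * (1 - 0.08)   # "trails ... by 8% at 4K ultra"
print(f"i7-4770K estimate: {i7_4770k_fps:.1f} fps vs {i9_10900k_fps:.1f} fps")
# -> i7-4770K estimate: 64.4 fps vs 70.0 fps, i.e. still comfortably over 60 fps.
```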

If you already have a Zen2 3700x or 3900x now then you are good for a few years at least... I intend to get a Zen3 on release and then sit tight for a good while and just upgrade my GPU. :)
 
Why are you only thinking of now... don't you also think ahead when making purchases? It really does provide a notable improvement in performance without sacrificing visual quality, has received a lot of acclaim from users and media, and clearly it is going to be implemented in a lot of future titles in 2021/2022. It will prolong the life of GPU purchases as it will increase performance at higher resolutions, or even allow higher resolutions or settings to be used where there otherwise wouldn't be the power to do so. It's an impressive technology that has some significance.

Your entire argument rests on the adoption rate of DLSS. I'm not just thinking about now... The evidence so far (since the Turing launch) suggests a very slow adoption rate in its current form. Developers wouldn't leave that much free performance on the table unless something wasn't as rosy as Nvidia would like us to believe.
 
The other stumbling block as far as DLSS adoption goes is it won't be console native, and as much as we PC gamers hate it, console gaming revenue drives the gaming industry.

DLSS, like a lot of Nvidia proprietary stuff, may never get a true foothold because of this.
 
Ignoring the fact that AI upscaling looks good, I can't help but laugh at the change in tone from PC enthusiasts when it comes to "omg consoles aren't native 4k", "they use checkerboarding!" etc.

When nvidia push something to save resources everyone is hyped to ****.

I don't see many next gen games nailing a constant 60fps on 3080s without AI upscaling.
 
Well, the latest implementations seem to actually provide an IQ improvement over native resolution textures in some titles, so if in the end it does a better job with less effort then it could lead to being a benefit for some devs who are not as skilled as others. I can't see many downsides to DLSS when compared with the practical and tangible benefits it will bring. It's impressive.

DLSS, 4K (with TXAA) and FidelityFX compared.

https://www.dsogaming.com/screensho...ative-4k-vs-fidelityfx-upscaling-vs-dlss-2-0/

Apologies, not getting at you personally, but I really wish this lie would die already. It is pure Nvidia marketing BS as usual: DLSS is not actually better than 4K native. The comparisons are generally made against 4K with TAA, which softens the image. If you compare to actual 4K with MSAA then DLSS is inferior. DLSS will cause some artifacting during movement, but most of the time it does very well. The problem with 4K MSAA is it takes a lot of GPU grunt, and this is where DLSS helps, but the reality is you are not at TRUE 4K when you enable DLSS.
 
Really, Richdog? Are you going there? You?
Indeed. Hahaha :D

Made me laugh when he came up with that. It was funny watching him get triggered when I copied and pasted the exact words he'd used against me a few days ago back at him; he did not like the taste of his own medicine. He basically ran out of intellectual steam and started attacking me personally to try and win :p

What he did not realise when doing so is that he was digging himself a bigger hole. I mean, how can he go around calling 3090 owners mugs (not just in this thread) and then go on to say I am “not used to having adult discussions in real-life outside of your bedroom”? Pot calling the kettle black or what. Too funny. The guy needs to chill out, step back and think a little before posting. We all enjoy this hobby and like to discuss and debate things; no need to get butthurt and personal :D
 
The other stumbling block as far as DLSS adoption goes is it won't be console native, and as much as we PC gamers hate it, console gaming revenue drives the gaming industry.

DLSS, like a lot of Nvidia proprietary stuff, may never get a true foothold because of this.
It's in Nvidia's interests to make sure it's either in many major titles at release, or added shortly afterwards. From what I have read and understood, it's Nvidia who do the leg-work to train it on their own supercomputer network and then patch it in via drivers.

Nvidia Q&A https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/

Wiki https://en.wikipedia.org/wiki/Deep_learning_super_sampling#:~:text=DLSS 2.0 works as follows,the training of the network.
DLSS 2.0
DLSS 2.0 works as follows:[14]

  • The neural network is trained by Nvidia using "ideal" ultra-high-resolution images of video games rendered on supercomputers, together with low-resolution images of the same games. The result is stored in the video card driver. It is said that Nvidia uses DGX-1 servers to perform the training of the network.
  • The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs used by the trained neural network are the low-resolution aliased images rendered by the game engine, and the low-resolution motion vectors from the same images, also generated by the game engine. The motion vectors tell the network which direction objects in the scene are moving from frame to frame, in order to estimate what the next frame will look like.[15]
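Purely as a mental model of that second bullet, here is a toy temporal-upscaling sketch. Nothing here is Nvidia's actual code: the tiny array sizes, the function names and the fixed 50/50 blend standing in for the trained network are all made up, and the motion vectors are given at output resolution to keep the example short.

```python
import numpy as np

def warp(history, motion_vectors):
    """Toy reprojection: shift last frame's pixels by per-pixel motion vectors."""
    h, w = history.shape
    warped = np.empty_like(history)
    for y in range(h):
        for x in range(w):
            dy, dx = motion_vectors[y, x]
            sy = int(np.clip(y - dy, 0, h - 1))
            sx = int(np.clip(x - dx, 0, w - 1))
            warped[y, x] = history[sy, sx]
    return warped

def toy_temporal_upscale(low_res, motion_vectors, prev_high_res):
    """Very rough sketch of the flow described above: reproject the previous
    high-res output using the motion vectors, then combine it with the new
    (naively upscaled) low-res sample. A real DLSS implementation replaces
    this fixed blend with the trained neural network shipped in the driver."""
    upscaled = np.kron(low_res, np.ones((2, 2)))   # nearest-neighbour 2x upscale
    history = warp(prev_high_res, motion_vectors)  # temporal reprojection
    return 0.5 * upscaled + 0.5 * history          # stand-in for network inference

# Tiny made-up frames just to show the shapes involved.
low_res = np.random.rand(4, 4)         # new aliased low-res frame from the engine
prev_high_res = np.random.rand(8, 8)   # last frame's high-res output
motion_vectors = np.zeros((8, 8, 2))   # static scene for simplicity
frame = toy_temporal_upscale(low_res, motion_vectors, prev_high_res)
print(frame.shape)  # (8, 8)
```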
 
It's in Nvidia's interests to make sure it's either in many major titles at release, or added shortly afterwards. From what I have read and understood, it's Nvidia who do the leg-work to train it on their own supercomputer network and then patch it in via drivers.

Nvidia Q&A https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/

Wiki https://en.wikipedia.org/wiki/Deep_learning_super_sampling#:~:text=DLSS 2.0 works as follows,the training of the network.
Also, just like SLI was never in consoles, yet many titles were compatible with it.

Will AMD have DLSS?
 
The other stumbling block as far as DLSS adoption goes is it won't be console native, and as much as we PC gamers hate it, console gaming revenue drives the gaming industry.

DLSS, like a lot of Nvidia proprietary stuff, may never get a true foothold because of this.

Has any Nvidia proprietary technology ever gained wide adoption? I presume not as they'd never open their technology up. That is one thing putting me off Nvidia, they like to tie users in and then drop technology when it doesn't take off. I like to keep my options open, especially with fast moving technology.
 
Also, just like SLI was never in consoles, yet many titles were compatible with it.

Will AMD have DLSS?
I'm not sure, but I really hope AMD come out with their own version of DLSS, so long as it doesn't present any patent or technology barriers.

Has any Nvidia proprietary technology ever gained wide adoption? I presume not as they'd never open their technology up. That is one thing putting me off Nvidia, they like to tie users in and then drop technology when it doesn't take off. I like to keep my options open, especially with fast moving technology.

I don't understand your logic... even without DLSS, Nvidia are strong performers with the 3080. DLSS is a bonus feature and benefit for any owner... it has no downsides to speak of and doesn't tie you in to anything.
 
From @Hedge:

Ignoring the fact that AI upscaling looks good, I can't help but laugh at the change in tone from PC enthusiasts when it comes to "omg consoles aren't native 4k", "they use checkerboarding!" etc.

When nvidia push something to save resources everyone is hyped to ****.

I don't see many next gen games nailing a constant 60fps on 3080s without AI upscaling.

I too don't see the 3080 pushing 4K. That's why I'm skipping this gen most likely.
 
Every thread? Exaggerating much?

Seems you didn’t like what I said and now you’re getting upset. It’s OK for you to go around in “every thread” insinuating/calling 3090 users mugs, but when called out on being happy to pay £200 extra for 10GB of VRAM, instead of replying and explaining your choice you get emotional and accuse me of being childish and of defending my hardware choice? Jog on mate ;)

Whoever you're quoting, I already have on ignore. It seems I made a wise decision ;)

Can never understand those who openly mock other people's purchasing decisions, as they differ from their own. The same people also tend to be very critical of things/people/opinions online, though if you saw them in person, they'd never dare to say such things. Such is the power of the keyboard warrior - protected by their LCD tan.
 
Whoever you're quoting, I already have on ignore. It seems I made a wise decision ;)

Can never understand those who openly mock other people's purchasing decisions, as they differ from their own. The same people also tend to be very critical of things/people/opinions online, though if you saw them in person, they'd never dare to say such things. Such is the power of the keyboard warrior - protected by their LCD tan.
Indeed :D
 
Whoever you're quoting, I already have on ignore. It seems I made a wise decision ;)

Can never understand those who openly mock other people's purchasing decisions, as they differ from their own. The same people also tend to be very critical of things/people/opinions online, though if you saw them in person, they'd never dare to say such things. Such is the power of the keyboard warrior - protected by their LCD tan.

The only time I think it's justified is when people are paying stupidly inflated prices and it enables companies to just milk us for every penny they can get.

If we could show some self restraint it might help us get products for more reasonable prices.

I appreciate that's hippy talk.
 
Can never understand those who openly mock other people's purchasing decisions, as they differ from their own. The same people also tend to be very critical of things/people/opinions online, though if you saw them in person, they'd never dare to say such things. Such is the power of the keyboard warrior - protected by their LCD tan.

We aren't in the wild west and life isn't Football Factory. Sounds like you need to work on your confidence if that kind of thing weighs on your mind so much.

The only time I think it's justified is when people are paying stupidly inflated prices and it enables companies to just milk us for every penny they can get.

If we could show some self restraint it might help us get products for more reasonable prices.

Agreed... the 3090 is bad enough, but buying two of them for SLI is just completely daft.
 
Every thread? Exaggerating much?

Seems you didn’t like what I said and now you’re getting upset. It’s OK for you to go around in “every thread” insinuating/calling 3090 users mugs, but when called out on being happy to pay £200 extra for 10GB of VRAM, instead of replying and explaining your choice you get emotional and accuse me of being childish and of defending my hardware choice? Jog on mate ;)
Well said.
 