
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Fanboys think like that. You aren't one, so you can't 'understand' why, but it's easy to see why they do.

Spot the new guy!


Watched overvolted yesterday; they have slides of AMD's MI100 data centre GPU. It has 120 CUs, that's 7,680 shaders, three times as many as the 5700 XT. This will not become a graphics GPU, but it shows AMD are now willing to push way past 64 CUs.... if anyone thinks 80 CUs is too many for AMD? Think again :)

https://youtu.be/WQo32d_rmOE?t=1449
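For anyone checking the maths, the CU-to-shader conversion is just the CU count times 64 (the stream processors per CU in the GCN/RDNA layout; I'm assuming MI100 keeps that ratio):

```python
# Assumption: 64 stream processors per Compute Unit, as in GCN/RDNA.
SHADERS_PER_CU = 64

def shader_count(cus: int) -> int:
    """Total stream processors for a GPU with the given CU count."""
    return cus * SHADERS_PER_CU

mi100_shaders = shader_count(120)     # rumored MI100: 120 CUs
rx5700xt_shaders = shader_count(40)   # RX 5700 XT: 40 CUs

print(mi100_shaders)                     # 7680
print(mi100_shaders // rx5700xt_shaders) # 3 — three times the 5700 XT
```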


To be fair, they could do a "Titan"-type card from the 120 CU part, say a 100 CU card, and price it stupidly high the same as Nvidia do.
 
Current rumors are that RDNA 2 won't beat a 3090/3080 Ti, but it will beat a 3080. No real confirmation on this as of yet, and it's not clear which SKU was used to produce these numbers. It's assumed to be a 6900 XT, but there's no solid info on the identity of the card.

If we play the "let's believe the slides" game, were that to be true it'd be a bit disappointing (admittedly said disappointment would be somewhat price dependent!)... they need to do better than averaging around 30% faster than a two-year-old card if they want to make meaningful progress against Nvidia's entrenched userbase, given Nvidia's next gen will supposedly already have been out for a month or two at that point. Of course those slides say 6900 XT, so that's assuming the 6900 XT would be their top product, which it may not be.

I do think AMD need to worry about DLSS 3.0; I'm not sure that even an updated FidelityFX CAS will suffice. I'm really hoping they have some news regarding DirectML and a DLSS equivalent.... for the mid range especially, it could make a huge difference.
 
I do think AMD need to worry about DLSS 3.0

Was taking you seriously until this part.




(timing Grim5 and the posse entrance) :)
 
Think whatever makes you happy; I'm not going to get into a debate about whether you personally hate DLSS or not.

I'm completely serious though. Regardless of your own personal feelings about it, and even if you think DLSS is the crappiest thing ever, if AMD don't have a viable competitor to DLSS soon they are going to be missing out on a pretty important feature. Reviewers will show Nvidia cards significantly outperforming their AMD equivalents with what, to most "normal" non-pixel-peeping people (i.e. the mass market, which is what really matters to AMD's bottom line), will look like incredibly similar visuals. I would bet that the majority couldn't even spot the difference between FidelityFX and DLSS, which would almost invalidate my argument but for the fact that DLSS has a significantly bigger performance gain: FidelityFX only reduces render resolution to around 75%, whereas DLSS can, for example, upscale 1440p to 4K.

They really need to show off some similar tech or they will lose out... that's it, that's all I'm saying. I'm not saying whether that's fair or whether the tech is the best thing ever... I think it is fair to say however that machine learning based upscaling is only going to get better over the coming years, not worse. I'd like to know for sure AMD is at least on the train rather than standing at the station.
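For what it's worth, the render-cost gap I'm describing can be put into rough numbers (these are my own back-of-envelope figures, not anything official):

```python
# Fraction of the target's pixels actually shaded before upscaling/sharpening.
def pixel_fraction(render_w, render_h, target_w, target_h):
    return (render_w * render_h) / (target_w * target_h)

# FidelityFX-style scaling at ~75% linear resolution of a 4K target
fidelityfx = pixel_fraction(int(3840 * 0.75), int(2160 * 0.75), 3840, 2160)

# DLSS rendering 1440p internally and upscaling to 4K
dlss = pixel_fraction(2560, 1440, 3840, 2160)

print(round(fidelityfx, 2))  # 0.56 of the pixels
print(round(dlss, 2))        # 0.44 — a noticeably bigger saving
```

That's the crux of it: even granting similar image quality, DLSS shades meaningfully fewer pixels per frame, which is where its headline performance numbers come from.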
 
I doubt Nvidia will put DLSS on the lower-end GPUs, as if it works so well then why bother spending more on a higher-end card?
 
I doubt Nvidia will put DLSS on the lower-end GPUs, as if it works so well then why bother spending more on a higher-end card?

Cause it works even better for a high-end card? In Death Stranding with DLSS a 2080 Ti gets nearly 50fps at 8K, so right now DLSS seems to be the only way people will be able to make use of their 8K TVs, while DLSS at the entry level provides entry-level 4K 60fps gaming to compete with consoles.
 
If we play the "let's believe the slides" game, were that to be true it'd be a bit disappointing (admittedly said disappointment would be somewhat price dependent!)... they need to do better than averaging around 30% faster than a two-year-old card if they want to make meaningful progress against Nvidia's entrenched userbase, given Nvidia's next gen will supposedly already have been out for a month or two at that point. Of course those slides say 6900 XT, so that's assuming the 6900 XT would be their top product, which it may not be.

I do think AMD need to worry about DLSS 3.0; I'm not sure that even an updated FidelityFX CAS will suffice. I'm really hoping they have some news regarding DirectML and a DLSS equivalent.... for the mid range especially, it could make a huge difference.
I do not doubt, per Nvidia's boilerplate for reviewers, that they are going to use DLSS and not tell you about it, making you think that is native performance. Nvidia is known for pulling tricks like that in their NDAs with reviewers.

Most others will let you know that they're using DLSS to increase performance, which will ultimately debunk and discredit those few reviewers who try to do it without saying so or making it clear in their reviews and charts.

Let's not forget that regardless of whether they use DLSS, even if it's available in the drivers, it still has to be implemented in the game via an update, in which case everyone will know what the update entails.

Therefore, I don't see the need for AMD to be concerned about it. More than likely it will be implemented in "canned" benchmarks for a quick and dirty comparison.

So stop being afraid of them.
 
Cause it works even better for a high-end card? In Death Stranding with DLSS a 2080 Ti gets nearly 50fps at 8K, so right now DLSS seems to be the only way people will be able to make use of their 8K TVs, while DLSS at the entry level provides entry-level 4K 60fps gaming to compete with consoles.
Most entry level gamers don't have 4K monitors though.

My point is that mid-range gamers would most likely have a 144Hz 1440p screen, which currently requires a £500 GPU, if not more, to maintain 144fps. Now if you could upscale 720p to 1440p on a £200 RTX 3050 and maintain DLSS 1440p 144fps, then why spend more on a GPU? Surely Nvidia isn't going to allow that to happen?
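The pixel maths behind that upscale is simple enough to sanity-check:

```python
# 720p rendered internally, upscaled to a 1440p display.
render = 1280 * 720     # internal 720p render
target = 2560 * 1440    # 1440p output after upscaling
print(render / target)  # 0.25 — only a quarter of the pixels get shaded
```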
 
Most entry level gamers don't have 4K monitors though.

My point is that mid-range gamers would most likely have a 144Hz 1440p screen, which currently requires a £500 GPU, if not more, to maintain 144fps. Now if you could upscale 720p to 1440p on a £200 RTX 3050 and maintain DLSS 1440p 144fps, then why spend more on a GPU? Surely Nvidia isn't going to allow that to happen?

I think this is what will settle when we get into 2021. You will pick up a mid-range card for £300ish that gets the best of both worlds. People will flock to it, as they will already have a decent monitor, or they will see it as an investment and get the monitor later. As pointed out though, as fancy as DLSS may be, if the game doesn't use it then it's a shame... people need to see it getting traction or it will irritate everyone.
 
Most entry level gamers don't have 4K monitors though.

Doesn't really matter for the time being. Next-gen entry-level GPUs are competing with $500 consoles, consoles which offer up to 4K 60fps gaming, so those entry-level GPUs use that as their performance target.
 
I do not doubt, per Nvidia's boilerplate for reviewers, that they are going to use DLSS and not tell you about it, making you think that is native performance. Nvidia is known for pulling tricks like that in their NDAs with reviewers.

Most others will let you know that they're using DLSS to increase performance, which will ultimately debunk and discredit those few reviewers who try to do it without saying so or making it clear in their reviews and charts.

It's not deceptive reviewers that concern me, as that generally all comes out in the wash. Honest and respectable reviewers will of course have to show native performance comparisons, but in the name of thoroughness you would expect them to also compare DLSS and/or FidelityFX figures and footage where they are an option. Where I think AMD is potentially left vulnerable in this scenario is both FidelityFX and DLSS being "more than good enough" for the average consumer, but Nvidia showing a clear performance edge with DLSS vs FidelityFX or DLSS vs native.

Let's not forget regardless if they use DLSS even if it's available in the drivers it still has to be implemented in the game via update. In which case everyone will know what the update will entail.

Well therein lies the rub.... if, and I appreciate it's very much an if, you believe the rumours that Nvidia have a DLSS 3.0 in the works which can automatically be enabled on any game using TAA (see here) then DLSS could become much more ubiquitous. The main thing (other than the crappy first implementation) that in my opinion has stopped DLSS from being a bigger factor this generation is the relative lack of games utilising it, which is why this is potentially huge news if it turns out to be true.

Therefore, I don't see the need for AMD to be concerned about it.

Respectfully disagree. I believe they very much should be concerned, unless of course they have a suitable counter in the works in which case I'd love to hear more about it when they launch these new cards. I'm fairly confident that AMD will have no choice in the matter ultimately, but to me it's more a question of will it be this year? Next year? Year after?


So stop being afraid of them.

Let's not be childish eh?
 
The 10 owners of 8K TVs in the UK thank nVidia ;)

Well, I don't see next-gen GPUs offering enough performance without DLSS techniques, and those GPUs will hang around for two years prior to replacement. So we're not just talking about today's 8K TV owners; we're including anyone who buys one in the next two to three years too.

I believe I saw a Samsung 8K panel on sale at the end of last year matching their Q90R 4K TV's price at a few stores, which suggests there are a fair few price reductions to come. 8K is ridiculously expensive now, but as with all technology it will become much cheaper every year going forward.

Just like next-gen consoles are supporting 8K image output, just for those 10 8K owners you claim to know of... it's about looking forward.
 