AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

five8five said:
If these dates are accurate, I can understand why Nvidia have gone with a "fill yer boots" strategy.
So they don't care about a bad launch? What you said makes no sense: if they knew when AMD were going to launch, they could have delayed by at least a month, if not more, launched a week before AMD with plenty of stock, and destroyed AMD's launch. :rolleyes:
 
Your reply was a little baffling, as I didn't specifically mention the "lower stack".

Maybe it only makes sense to me. What we know is this:

Nvidia announced at the beginning of September, with Jensen in his kitchen.
Pre-orders for the 3080 on 17th September (two and a half weeks later).
Stock dissolves; people are being told it could be Christmas before cards ship to their door.

I'll clarify what I meant. If those rumoured release dates are true, Nvidia would know that AMD cards weren't in production, so they could capitalise with an early launch.

OK, so we know AMD are announcing in late October; if we follow the same pattern, that means mid-November for pre-orders or retail orders. This was always the case, as they said they'd launch just before the consoles.
So unless your quote meant "look, it's going to be 2021 before some of the lower-stack Navi is available", it still doesn't make sense, as Nvidia also knew by this point that they didn't have any stock to 'fill yer boots' with ahead of AMD.
What would make more sense, like the 3070 delay, would have been for Nvidia, if they knew AMD weren't in production, to actually delay the 3080 by a month and prepare more stock.
 
Even if you're just a gamer, your friends will hear you farting when you use a mic, and you're sacrificing performance without some sort of DLSS product.
I don't play with other people, so no need for a mic, and DLSS is a crutch barely implemented in any games. Buying a card that best matches your required gaming resolution, plus a VRR monitor, has been perfectly fine for donkey's years. I have no intention of joining the simpletons who are prepared to hand over obscene amounts of money to Nvidia for lacklustre rasterisation performance propped up by premium silicon that never gets used without explicit developer implementation.

This adoration of DLSS is so goddamn retarded it's not even funny any more.
 
You sound mad though, don't worry. It's nothing to be ashamed of that the 6900XT is slower than the RTX 3080 in ray tracing.

It's good to see the hype train coming down; it wasn't long ago that some people thought these next-gen consoles were going to be running AAA games at 4K 120fps with ray tracing that beats today's PCs.
The only people to have said any of this are the little AMD fanatics that exist only in your fantasy land where you can berate them. In the real world, nobody with half a brain has ever said, nor expected, the new consoles to push 4K 120.

And I'd like to see your evidence that proves the 6900XT (if there is such a thing) is slower in ray tracing than the 3080. Because until you cite your source, or benchmarks come out, you really need to hush your drivel. And again, nobody expected Big Navi to outperform Ampere's ray tracing. You're either making things up to pointlessly bait, or you're deluded and have convinced yourself it's happening so you can act all superior.
 
This adoration of DLSS is so goddamn retarded it's not even funny any more.

I've called this out before and it's getting tiring. People on here massively overrate it; like you said, it's so selectively implemented that reviewers aren't helping by having to cherry-pick benchmarks that include it just to show off the feature.
 
I never quite understood the excessive emphasis placed on the importance of DLSS. It's an almost entirely superfluous feature (and will be for the foreseeable future) until it becomes ubiquitous, universally supported and widely implemented in hundreds of games, of which there is no guarantee. Unless of course you are someone who exclusively plays games that contain DLSS and no other games, but that's a rather strange scenario.

I don't even want to get started on the gimmick that is ray tracing. Until it offers a much more significant improvement in visual quality (i.e. it reaches a point where I can easily differentiate between rtx on/off without needing to squint or perform an in-depth visual analysis of images) and becomes implemented in hundreds of games, it's nothing more than a gimmick.

Considering all of that, I must once again hand it to Nvidia for their marketing prowess. Judging by the sentiment of many online comments, Nvidia have succeeded in capturing the public's imagination to the point where their "superior suite and feature set" have become a differentiating factor for many when it comes to choosing Nvidia GPUs over AMD's future offerings. Many are ignoring/rejecting the fact that it will take at least several generations of GPUs before ray tracing performance and supersampling capabilities become an essential aspect of GPU choice. Until these features mature and reach a point where we simply cannot do without them, they are at best non-essential complementary features which, for the observant consumer, should not demand a price premium, especially where they hinder your FPS potential.
 
I never quite understood the excessive emphasis placed on the importance of DLSS. It's an almost entirely superfluous feature (and will be for the foreseeable future) until it becomes ubiquitous, universally supported and widely implemented in hundreds of games, of which there is no guarantee. Unless of course you are someone who exclusively plays games that contain DLSS and no other games, but that's a rather strange scenario.

I don't even want to get started on the gimmick that is ray tracing. Until it offers a much more significant improvement in visual quality (i.e. it reaches a point where I can easily differentiate between rtx on/off without needing to squint or perform an in-depth visual analysis of images) and becomes implemented in hundreds of games, it's nothing more than a gimmick.

Considering all of that, I must once again hand it to Nvidia for their marketing prowess. Judging by the sentiment of many online comments, Nvidia have succeeded in capturing the public's imagination to the point where their "superior suite and feature set" have become a differentiating factor for many when it comes to choosing Nvidia GPUs over AMD's future offerings. Many are ignoring/rejecting the fact that it will take at least several generations of GPUs before ray tracing performance and supersampling capabilities become an essential aspect of GPU choice. Until these features mature and reach a point where we simply cannot do without them, they are at best non-essential complementary features which, for the observant consumer, should not demand a price premium, especially where they hinder your FPS potential.

I don't think ray tracing is ever going to offer a significant visual difference. It's pretty obvious that the main improvements come from resolving technical limitations and improving accuracy, both of which offer only minor gains outside of reflections.

High-quality, accurate reflections are hard or impossible to do without ray tracing, so the difference with it on/off is very noticeable, but that's about it.
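
To make the reflection point concrete, here's a minimal sketch of my own (not from any particular engine): the core of a ray-traced reflection is just "bounce the view ray and trace it into the full scene", which is exactly what screen-space reflections can't do, since they can only reuse pixels already rendered on screen.

```python
# Hedged illustration: the standard reflection formula r = d - 2(d.n)n.
# A ray tracer follows the bounced ray into the whole scene, so it can hit
# geometry that was never rasterised; screen-space reflections can't.
import numpy as np

def reflect(d: np.ndarray, n: np.ndarray) -> np.ndarray:
    """Reflect incoming unit direction d about unit surface normal n."""
    return d - 2.0 * np.dot(d, n) * n

view_dir = np.array([0.6, -0.8, 0.0])  # view ray hitting a floor
normal = np.array([0.0, 1.0, 0.0])     # floor normal pointing straight up

print(reflect(view_dir, normal))       # -> [0.6 0.8 0. ], the bounced ray
```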

When it comes to illumination and shadows, the fake lighting and shadows games have been using are already very good, and ray tracing makes little visual improvement; it's mainly the accuracy of the image that improves, and certainly nothing you'd be prepared to sacrifice performance for.

That's why, personally, in a game like Control that gives the option, I only played with ray-traced reflections; I didn't see the point of the shadows and lighting. I will continue playing games like this with only RT reflections enabled until turning everything on has no performance impact, at which point it becomes a moot point.

And I still stand by my previous comments about ray tracing: HDR has a far bigger impact on the visuals of a game than ray tracing, and if you're thinking about spending extra money on a GPU just for ray tracing, I'd put that money into an HDR1000-certified screen instead; you will get a huge amount more bang for your buck.
 
CyberCatPunk - "Confirmed: Navi21 --- Sienna Cichlid Compute --- GFX10 (gfx1030) 80CUs/5120 shaders/320 TMUs /96 ROPs ~20TF (FP32)"

Who is this guy, and where does he get his info from?

If 96 ROPs is correct, it should make a pretty substantial difference to performance at higher resolutions (especially if combined with a 25% overclock vs the 5700 XT).
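
For what it's worth, the rumoured numbers are at least internally consistent. A quick back-of-the-envelope check (the clock speeds below are my own guesses chosen to line up with the leak's ~20TF figure, not part of the leak):

```python
# Rough sanity check on the rumoured Navi21 spec vs the 5700 XT.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # Each stream processor does one FMA = 2 FP32 ops per clock.
    return shaders * 2 * clock_ghz / 1000

def pixel_fill_gpix(rops: int, clock_ghz: float) -> float:
    # Each ROP writes one pixel per clock.
    return rops * clock_ghz

# Rumoured Navi21 (5120 shaders / 96 ROPs); ~1.95 GHz gives the leak's ~20TF.
print(fp32_tflops(5120, 1.95), pixel_fill_gpix(96, 1.95))  # ~19.97 TF, ~187 Gpix/s
# 5700 XT (2560 shaders / 64 ROPs) at ~1.9 GHz boost for comparison.
print(fp32_tflops(2560, 1.9), pixel_fill_gpix(64, 1.9))    # ~9.73 TF, ~122 Gpix/s
```

On those assumed clocks that's roughly double the shader throughput and about 1.5x the pixel fill rate of a 5700 XT, which is where the higher-resolution argument comes from.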

The trouble I have with believing any leak, regardless of the website it's posted on, is that the leaker never says exactly where (or from whom) they got the information. It's highly likely they just took an educated guess.

If one part of a spec rumour is wrong, all of the rumour is invalidated imo. They either have the correct info from a reliable source, or they (probably) don't.
 
I never quite understood the excessive emphasis placed on the importance of DLSS.

I think it's a smart way of migrating from rasterisation to RT: FPS can be held high while more of the silicon is dedicated to RT. I'm not a great fan of DLSS, though, as it has its own issues with artifacts.
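
The "FPS held high" part is mostly simple pixel arithmetic, since shading cost scales roughly with the number of pixels rendered internally before upscaling. A rough sketch (the per-axis scale factors are the commonly cited DLSS 2.x ones; treat them as approximate):

```python
# Approximate internal render resolutions behind a 4K DLSS output.
target = 3840 * 2160  # 4K output pixels

for mode, axis_scale in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    internal = int(3840 * axis_scale) * int(2160 * axis_scale)
    print(f"{mode}: shades {internal / target:.0%} of the pixels "
          f"(~{target / internal:.1f}x fewer)")
# e.g. Performance mode renders 1920x1080 internally: 25% of the pixels, 4x fewer.
```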
 
I never quite understood the excessive emphasis placed on the importance of DLSS. It's an almost entirely superfluous feature (and will be for the foreseeable future) until it becomes ubiquitous, universally supported and widely implemented in hundreds of games, of which there is no guarantee. Unless of course you are someone who exclusively plays games that contain DLSS and no other games, but that's a rather strange scenario.

I don't even want to get started on the gimmick that is ray tracing. Until it offers a much more significant improvement in visual quality (i.e. it reaches a point where I can easily differentiate between rtx on/off without needing to squint or perform an in-depth visual analysis of images) and becomes implemented in hundreds of games, it's nothing more than a gimmick.

Considering all of that, I must once again hand it to Nvidia for their marketing prowess. Judging by the sentiment of many online comments, Nvidia have succeeded in capturing the public's imagination to the point where their "superior suite and feature set" have become a differentiating factor for many when it comes to choosing Nvidia GPUs over AMD's future offerings. Many are ignoring/rejecting the fact that it will take at least several generations of GPUs before ray tracing performance and supersampling capabilities become an essential aspect of GPU choice. Until these features mature and reach a point where we simply cannot do without them, they are at best non-essential complementary features which, for the observant consumer, should not demand a price premium, especially where they hinder your FPS potential.
The way we've seen RT done on PC is a one-trick pony, as it will only ever complement rasterized games. You will never see a fully ray-traced game, never.
Ray-Traced shadows: a complete joke.
Ray-Traced Reflections: the staple of what people define as "ray tracing". Nvidia PR at its finest.
Ray-Traced Global Illumination: Ah, now that's a tough one to see in all games, as it's costly. Not too many other "RT options" unless you render the image at a lower resolution... Oh wait, DLSS... lol

I'm sure there are other effects, but overall these effects do nothing more than enhance rasterized games. And there is no guarantee that most of these effects will appear within one game without dropping the render resolution, aka DLSS, to compensate for the performance penalty associated with RT. That's the trick... ignore the man behind the curtain.

But that's only half the problem. You also have to have an artist who knows how to properly use just a few of these effects without the game looking "goofy". For example:


As you can see, an HDTV shouldn't reflect its surroundings like a mirror, for example.

Another example is that it's so overdone it looks out of place against the rasterized surroundings.



Only when we see environments that look like the ones below will we really start to see what RT can do.


Unfortunately, as suggested earlier, it will be many GPU generations before that's a possibility. So if RT effects are lowered or reduced to enhance performance, I'm all for it, as it won't look like the above anyway. Making RT effects fit and look similar to a rasterized game is the way to go (while providing a performance boost).
 
CyberCatPunk - "Confirmed: Navi21 --- Sienna Cichlid Compute --- GFX10 (gfx1030) 80CUs/5120 shaders/320 TMUs /96 ROPs ~20TF (FP32)"

Who is this guy, and where does he get his info from?

If 96 ROPs is correct, it should make a pretty substantial difference to performance at higher resolutions (especially if combined with a 25% overclock vs the 5700 XT).

The trouble I have with believing any leak, regardless of the website it's posted on, is that the leaker never says exactly where (or from whom) they got the information. It's highly likely they just took an educated guess.

If one part of a spec rumour is wrong, all of the rumour is invalidated imo. They either have the correct info from a reliable source, or they (probably) don't.

Isn't that just the specs from the TPU database? lol
 
When it comes to illumination and shadows, the fake lighting and shadows games have been using are already very good, and ray tracing makes little visual improvement; it's mainly the accuracy of the image that improves, and certainly nothing you'd be prepared to sacrifice performance for.

When you have proper real-time indirect lighting, caustics and other features like that, it immensely improves the quality of the lighting and shadows; there is an organic quality to it in motion that traditional techniques can't compare to (and that doesn't convey well in static screenshots).

It will be like 60Hz vs 120Hz+ all over again: lots of people talking it down until they spend some time with a proper implementation, then go back to playing without it and realise how horrid that is.

Only when we see environments that look like the ones below will we really start to see what RT can do.

There is nothing in those screenshots beyond the capabilities of the path tracing implementation in Quake 2 RTX.
 
The only people to have said any of this are the little AMD fanatics that exist only in your fantasy land where you can berate them. In the real world, nobody with half a brain has ever said, nor expected, the new consoles to push 4K 120.

And I'd like to see your evidence that proves the 6900XT (if there is such a thing) is slower in ray tracing than the 3080. Because until you cite your source, or benchmarks come out, you really need to hush your drivel. And again, nobody expected Big Navi to outperform Ampere's ray tracing. You're either making things up to pointlessly bait, or you're deluded and have convinced yourself it's happening so you can act all superior.

EastCoastHandle repeatedly stated 4K 120 with RT on top for next-gen consoles, blah blah blah (Call of Duty), among other examples. Are you going to ignore those and sweep them under the rug?

Also, a small sidenote: me not continuing to argue on the previous pages didn't mean any of you were right. The fact that you quote each other calling me a troll and insulting me isn't proof of anything, even less so that you guys are right. I'm just tired of the goalposts moving and of having to repeat myself with things I've recently said (on the previous page), because, you know, it's easier to ignore and cherry-pick arguments and attack me instead of the things being discussed.

Anyway.
 
EastCoastHandle repeatedly stated 4K 120 with RT on top for next-gen consoles, blah blah blah (Call of Duty), among other examples. Are you going to ignore those and sweep them under the rug?

Also, a small sidenote: me not continuing to argue on the previous pages didn't mean any of you were right. The fact that you quote each other calling me a troll and insulting me isn't proof of anything, even less so that you guys are right. I'm just tired of the goalposts moving and of having to repeat myself with things I've recently said (on the previous page), because, you know, it's easier to ignore and cherry-pick arguments and attack me instead of the things being discussed.

Anyway.
What are you going on about?
COD Cold War will support 4K 120fps on console. Not sure of the FPS/resolution with RT yet, though.
https://metro.co.uk/2020/09/10/120f...ps-cold-war-ps5-and-xbox-series-x-s-13251406/
https://www.ign.com/articles/cod-black-ops-cold-war-4k-120hz
https://www.pushsquare.com/news/202...s_cold_war_is_4k_120_frames-per-second_on_ps5
https://www.tweaktown.com/news/7475...20fps-on-playstation-5-xbox-series/index.html

It's not me who's saying it though; I'm only reporting the news. So get your facts straight first and do a little research before you accuse someone of making 4K/120fps up.
:D
 
Sure. Back then you were quoting that Verge article when it said 4K 120 and RT, pretending it would all happen at once, when they were being intentionally misleading...

I should have quoted you before you had the chance to go back and edit.

I’m sure others remember it as well.

And yeah, it's not you that's saying it, but quoting articles and going "wow, so RDNA2 does 4K 120 with RT on!!! Crazy perf, etc. etc." seems like personal input as well.

Anyway, @LePhuronn, going back to August, you can easily see quite a few people believing in 4K 120 with RT on top when that video talking about Call of Duty on consoles was quoted.
 
Sure. Back then you were quoting that Verge article when it said 4K 120 and RT, pretending it would all happen at once, when they were being intentionally misleading...

I should have quoted you before you had the chance to go back and edit.

I’m sure others remember it as well.

And yeah, it's not you that's saying it, but quoting articles and going "wow, so RDNA2 does 4K 120 with RT on!!! Crazy perf, etc. etc." seems like personal input as well.

Anyway, @LePhuronn, going back to August, you can easily see quite a few people believing in 4K 120 with RT on top when that video talking about Call of Duty on consoles was quoted.

There is a difference between what they can do and what they will do. The consoles can do 4K 120 (you might even be able to squeeze RT in as well). What they will do is the choice of the developer; framerate and resolution will always be the developer's choice regardless of GPU specs. If developers wanted to target a 3090 just about achieving 1080p30, they could do it.

Sidenote: page 279 was where I could find mentions (from a quick skim) of the consoles doing 4K 120, and even then it seems like you may be misremembering.
 