AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Just like next-gen consoles are supporting 8K image output, just for those ten 8K owners you claim to know of - it's about looking forward

I really can't see 8K taking off in any meaningful way any time soon, you might go from 10 owners to 20 in the next 2-3 years ;) ... it just doesn't offer enough of an advantage for the typical user over 4K at any reasonable sort of screen size to make up for the cost.

I've mentioned it before, but where I do think AI upscaling in general (not just DLSS; I think people get too hung up on DLSS being an nVidia thing) has some real promise for the near to medium term is in helping to render the two viewpoints needed for high-resolution, high-refresh displays in VR without having to sacrifice graphical fidelity and effects. Due to the high (and still growing) FoVs, extra resolution in VR currently offers very high returns, vs the arguably rapidly diminishing returns of going from 4K to 8K TVs. See Facebook Reality Labs' work on neural net supersampling, for example: https://uploadvr.com/facebook-neural-supersampling/
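For anyone curious what that looks like in practice, here's a minimal sketch of a learned upscaler in the sub-pixel convolution (ESPCN) style, written in PyTorch. Purely illustrative: Facebook's actual network also consumes motion vectors and depth, and every layer size below is made up.

```python
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy ESPCN-style upscaler: conv features, then PixelShuffle to upsample."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Predict scale^2 * 3 channels, then rearrange them into an
            # RGB image that is `scale` times larger on each axis.
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.body(low_res)

# Render each eye at half resolution, then upscale to the panel's native res.
eye_buffer = torch.rand(1, 3, 720, 720)      # hypothetical low-res eye view
upscaled = TinyUpscaler(scale=2)(eye_buffer)
print(upscaled.shape)                        # torch.Size([1, 3, 1440, 1440])
```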

I just honestly don't see how people can view the promise of machine learning/neural net upscaling with anything other than excitement for what the future holds in terms of performance per watt, per buck, whatever metric you like! Ignore the term DLSS; it's as meaningless as regarding RTX as the sole method of ray tracing, when in reality it is purely Nvidia's implementation of it. I simply want to know what AMD plans on this front, because I am pretty confident they must be planning something...
 
It's not deceptive reviewers that concern me, as that generally all comes out in the wash. Honest and respectable reviewers will of course have to show native performance comparisons, but in the name of thoroughness you would expect them to also compare DLSS and/or FidelityFX figures and footage where they are an option. Where I think AMD is potentially left vulnerable in this scenario is both FidelityFX and DLSS being "more than good enough" for the average consumer, but Nvidia showing a clear performance edge with DLSS vs FidelityFX or DLSS vs native.
This is a direct problem with your bias toward Nvidia. It has nothing to do with DLSS. It's more your view of why AMD isn't more like Nvidia. Here is a news flash for you: DLSS doesn't motivate me to want their products. That's the point. IMHO I am not alone. Therefore there is nothing to worry about. :D

Well, therein lies the rub... if, and I appreciate it's very much an if, you believe the rumours that Nvidia have a DLSS 3.0 in the works which can automatically be enabled on any game using TAA (see here), then DLSS could become much more ubiquitous. The main thing (other than the crappy first implementation) that in my opinion has stopped DLSS from being a bigger factor this generation is the relative lack of games utilising it, which is why this is potentially huge news if it turns out to be true.
It's only been one game...lol.
It's not automatic. It will still require their driver team to implement it in the game, which will involve a driver update for the game. Everyone will know.

The gist is that it would be much better than it was before, when it was still all new to them. In other words, they've found a way of doing it now.


Respectfully disagree. I believe they very much should be concerned, unless of course they have a suitable counter in the works, in which case I'd love to hear more about it when they launch these new cards. I'm fairly confident that AMD will have no choice in the matter ultimately, but to me it's more a question of: will it be this year? Next year? The year after?
And this is where we fundamentally disagree. As I mentioned before, either you are biased for NVIDIA or you're not. Any reasonable, sound individual will not expect the same features in a competing product. Which makes it very difficult for me to understand your point of view.

If you want an AMD based on their features, then you buy that. If you want the Nvidia, which is what you are angling for, then you buy that. It's really that simple. No need to be afraid.

Let's not be childish, eh?
Your post came off as fear, uncertainty and doubt. There's a name for that: it's called FUD. I'm just calling it out for what it is. However, I do agree it is childish.
:cool:
 
This is a direct problem with your bias toward Nvidia. It has nothing to do with DLSS. It's more your view of why AMD isn't more like Nvidia. Here is a news flash for you: DLSS doesn't motivate me to want their products. That's the point you don't seem to understand. Therefore there is nothing to worry about.

You see, what you are doing there is taking your own personal motivation and applying it to everyone else in the market at large. That's not how this stuff works; we here are a very small subset of the wider market.


It's not automatic. It will still require their driver team to implement it in the game. The gist is that it would be much better than it was before, when it was still all new to them. In other words, they've found a way of doing it.

You don't know that, I don't know that... literally all the rumour says is that it works by default on any game with TAA. It could be absolute nonsense of course; like I said, it's a big IF, but you are adding conditions that aren't in the rumour. "Moore’s Law Is Dead also notes that NVIDIA could be turning DLSS 3.0 on by default for all games, which goes to show how invested and confident green team is regarding the AI-based technology."



Any reasonable sound individual but not expect the video features in a competing product. Which makes it very difficult for me to understand your point of view.

Your sentence makes no sense, which makes it very difficult for me to understand your point of view.


Is it very difficult to understand or comprehend that if you like DLSS, that means you would want Nvidia vs whatever AMD may offer? Again, if you're so concerned about what AMD offers, wait until they actually offer it. I'm sure they have a plan. Again, no need to be afraid.

Again with the afraid ******** while completely missing the point :rolleyes:
 
DLSS is neither fake nor cheating if it provides high-quality visuals at a high resolution and with a high framerate. It doesn't matter one little bit that it's rendering at a lower res and applying AI upscaling, and that this is somehow 'fake', if that AI upscaling is good and the end result works.
 
DLSS is neither fake nor cheating if it provides high-quality visuals at a high resolution and with a high framerate. It doesn't matter one little bit that it's rendering at a lower res and applying AI upscaling, and that this is somehow 'fake', if that AI upscaling is good and the end result works.

We've become obsessed with "real" and "fake" resolution instead of focusing on what matters: what your eyes see, not how we got there. It's the destination, not the journey - just like there are some really good vegetarian burger patties now that taste pretty much like beef, but some people won't eat them because it's fake beef, instead of focusing on the taste.
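To put numbers on why rendering low and reconstructing up is such a win, here's the back-of-the-envelope arithmetic. The ~2/3-per-axis internal render scale is an assumption based on public reporting about DLSS 2.0's Quality mode, not an official figure:

```python
# 4K output with a DLSS-2.0-style "Quality" mode (~2/3 scale per axis assumed).
native_pixels = 3840 * 2160    # ~8.29M pixels presented per frame
internal_pixels = 2560 * 1440  # ~3.69M pixels actually shaded

print(f"Shaded pixels: {internal_pixels / native_pixels:.0%} of native")
# -> Shaded pixels: 44% of native; the GPU renders less than half the pixels
#    and the upscaler reconstructs the rest, which is where the framerate
#    headroom comes from.
```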
 
You see, what you are doing there is taking your own personal motivation and applying it to everyone else in the market at large. That's not how this stuff works; we here are a very small subset of the wider market.
I think you mistook 'personal motivation' for my conviction in the disposition you "inquired of". :D

It has never been my intention to speak for the whole PC market in my reply to you. What I did make clear is that a feature set does not paint the entire picture regarding market trends.

Price will play an enormous factor this time around when compared to consoles. No matter what features are incorporated into a GPU, if the price isn't right, people will just go to consoles, due to less hassle. So you're right, we are just a subset in that regard.

One thing I've witnessed throughout the years is that Nvidia has never made a noticeable impression on the gaming market. Don't get me wrong, they do command the GPU market, but not the gaming market as a whole. They have made several attempts to gain mindshare but have failed miserably.

For example: HairWorks, PhysX, tessellation, OptiX, GameWorks, etc. All of it diluted to nothing more than 'features' in the game, absent any Nvidia branding. Aided by AMD's rebuttals using competing features, and consoles being consoles, which are heavily influenced by AMD hardware. DLSS will also succumb to the same fate: a feature used by all with no Nvidia branding. History repeating itself. :D

But let's examine price. For example:
$550 for a console with new games that you know will be available,
VS
$550 for just a GPU that offers DLSS, which will be implemented in games that people have already played. The optics of just saying that alone make DLSS look bad. Will the game you want to play have DLSS? Who knows? It's a 12-sided die roll. Will it be just for ray tracing, or just for rasterised games? Will it be disabled for higher-end cards? Will it work at all? Will it blur things up again? Etc., etc. Uncertainty looms.

The point is that DLSS is only enticing to a minority of individuals looking for better performance in existing games they've already played. And that increase in performance doesn't always translate to better gameplay/immersion, just higher numbers.

However, I will throw a bone: we know that Cyberpunk 2077 will incorporate it. That's a new game, right? But how replayable is it, though? Not very, from what I've seen.

Therefore, the true fear, uncertainty and doubt comes from not knowing for certain what new games we're going to get on PC. And if we do get those games, they will come directly from console, be it at the same time as the console release or months/years later.

That is the subset that is paying more for a GPU using DLSS? What a waste!

Personally, I am not motivated to buy a new GPU for it. Nor is performance a factor for me any more. So my interest in it is very low. :cool:

When comparing the IQ of DLSS, Detroit: Become Human comes to mind. It's beautiful at 4K. Furthermore, I've played a few chapters more than once to see different outcomes. Replayability is very high in that game, and performance was never a factor.

You don't know that, I don't know that... literally all the rumour says is that it works by default on any game with TAA. It could be absolute nonsense of course; like I said, it's a big IF, but you are adding conditions that aren't in the rumour. "Moore’s Law Is Dead also notes that NVIDIA could be turning DLSS 3.0 on by default for all games, which goes to show how invested and confident green team is regarding the AI-based technology."
We've seen no demonstration that you can use DLSS without an update from the game itself, so let's not jump to conclusions. The rumour is that you can enable it from the drivers; however, the driver team still has to code it in per game.

A prime example of the lack of DLSS 2.0 is Battlefield 5.

Your sentence makes no sense, which makes it very difficult for me to understand your point of view.
Yes, I noticed that and have corrected it, but will copy and paste it below.

"And this is where we fundamentally disagree as I mentioned before either you are biased for NVIDIA or you're not. Any reasonable, sound individual will not expect the same features in a competing product. Which makes it very difficult for me to understand your point of view.

If you want an AMD based on their features, then you buy that. If you want the Nvidia, which is what you are angling for, then you buy that. It's really that simple. No need to be afraid."

Again with the afraid ******** while completely missing the point :rolleyes:
So you ignore my statement calling your posts out as nothing more than FUD.
'I see what you did there' is an understatement.
:D
 
We need to merge the Navi and Ampere threads; nowadays there seems to be no distinction.
My bad. I got off track there for a moment.
Back to Navi 2X!!

I read that the OEM card is no longer a blower style. I wonder if they went with a three-fan configuration or the same as before?

The largest fan size for a three-fan configuration is what, 92 mm? I've always wanted to see a larger three-fan config.
 
I really can't see 8K taking off in any meaningful way any time soon, you might go from 10 owners to 20 in the next 2-3 years ;) ... it just doesn't offer enough of an advantage for the typical user over 4K at any reasonable sort of screen size to make up for the cost.

As I mentioned, the cost is coming down; I expect during this year's end-of-year sales you'll see 8K TVs sold at the same price as 4K panels again.

When they are the same price, why not get 8K, right? As I said before, several months ago I saw a couple of stores selling the Samsung 8K Q900 at the same price as the Samsung 4K Q90R - it makes little sense to get 4K at that point.
 
For example: HairWorks, PhysX, tessellation, OptiX, GameWorks, etc. All of it diluted to nothing more than 'features' in the game...

****************

But let's examine price. For example:
$550 for a console with new games that you know will be available,
VS
$550 for just a GPU that offers DLSS, which will be implemented in games that people have already played.
However, I will throw a bone: we know that Cyberpunk 2077 will incorporate it. That's a new game, right?
:D

Of those features, tessellation is used all around; it wasn't/isn't an nVIDIA thing. They just overdid it to make the competition look bad in certain titles. :)

The other stuff did not get traction simply because the consoles lack the power to run even basic graphics at native 1080p/4K, never mind extra stuff on top. And, of course, because gamers accepted that and bought them in droves.
Search for the Froblins demo on YouTube. AMD also proved that you could have accelerated AI (thousands of agents, each with their own tasks) on the GPU back in the HD 4xxx days. No dev used it, although it has the potential to bring huge worlds with smarter and larger NPC crowds. Same reasons as above, of course.

Some people will game on the PC due to some of its advantages, so comparing prices is pointless. DLSS lets you use a lower resolution to get higher framerates; ergo, since you're using a lower res anyway, why not turn ON DLSS and have (nearly) the same image quality in motion? Because some guys hate it by default? :)

We don't know how many games will support it, but it is a nice feature to have, and if an nVIDIA card offers that (another type of "fine wine" technology), plus probably better RT performance at relatively the same price bracket as AMD, then the green team will still remain in the favour of the crowds.
 
Of those features, tessellation is used all around; it wasn't/isn't an nVIDIA thing. They just overdid it to make the competition look bad in certain titles. :)

The other stuff did not get traction simply because the consoles lack the power to run even basic graphics at native 1080p/4K, never mind extra stuff on top. And, of course, because gamers accepted that and bought them in droves.
Search for the Froblins demo on YouTube. AMD also proved that you could have accelerated AI (thousands of agents, each with their own tasks) on the GPU back in the HD 4xxx days. No dev used it, although it has the potential to bring huge worlds with smarter and larger NPC crowds. Same reasons as above, of course.

Some people will game on the PC due to some of its advantages, so comparing prices is pointless. DLSS lets you use a lower resolution to get higher framerates; ergo, since you're using a lower res anyway, why not turn ON DLSS and have (nearly) the same image quality in motion? Because some guys hate it by default? :)

We don't know how many games will support it, but it is a nice feature to have, and if an nVIDIA card offers that (another type of "fine wine" technology), plus probably better RT performance at relatively the same price bracket as AMD, then the green team will still remain in the favour of the crowds.

I'm pretty sure he (and others like him) would never buy NVIDIA no matter the price, performance, or features.

Considering how much they're trying to bash every good thing NV does, while praising even the most mundane features AMD sports, while making things up or arguing in bad faith... no chance.

Funny thing is, you can tell by the way a person argues whether they have a brand allegiance or not. The only thing that saddens me is when others come for advice and are given bad advice by these kinds of people. I guess it only makes sense if they own stock or are somehow paid to do this.

Even the "I'm a more conscious buyer; you guys buying NVIDIA GPUs for high prices suck because you're ruining it for the rest of us". Yeah, we should totally not buy high-end GPUs, TVs, consoles etc. to please the ones who can't afford them... it's always that "he has more than me, he stole it, he ruins it for the rest of us! Has less? He's dumber, lazier. I'm the most fair/smartest" mentality.

Also, unlike some, I've had plenty of GPUs from both brands; it's just that lately AMD hasn't been able to compete at the high end - this gen it was features. Had the RTX series not had RT and DLSS, I would have bought the 5700 XT, a great-performing card for the price, if and only if there was no RT/DLSS in the mix for almost the same price. Later on I also found I got a bonus in image quality through VRSS (for my Oculus Rift S), a segment in which AMD also lacks vs NV. I don't care about "but there aren't that many games"; I played all the ones that did have those features, and guess what? It felt good to have new graphical features to play with that in some cases made the game look way better. Even better, I get RT until the next GPU cycle as well, now when more games than ever will use it thanks to the next-gen consoles. Those with 5700 XTs not planning to upgrade? Tough luck; watch the consoles have more impressive features than your 5xxx card. Not my fault, and honestly not sure why I should care. Buying a PS5 day one, skipping this GPU gen because I'm set for now with my 2080, and checking who's on top when I buy again in two years or so when the next GPUs come out.
 
For example: HairWorks, PhysX, tessellation, OptiX, GameWorks, etc. All of it diluted to nothing more than 'features' in the game, absent any Nvidia branding. Aided by AMD's rebuttals using competing features, and consoles being consoles, which are heavily influenced by AMD hardware. DLSS will also succumb to the same fate: a feature used by all with no Nvidia branding. History repeating itself. :D

Proprietary tech from one side or the other rarely takes off, but in many cases these technologies only became a thing at all when nVidia pushed them - ATI/AMD had TruForm and later a dedicated tessellation unit before nVidia bothered with tessellation, but failed to support it properly. Games are generally the poorer for not having hardware-accelerated physics. Mantle largely flopped until it was folded into Vulkan, which only gained some traction after nVidia threw their efforts behind it.

I'm not sure your point is such a glorious one.
 
Some people will game on the PC due to some of its advantages, so comparing prices is pointless. DLSS lets you use a lower resolution to get higher framerates; ergo, since you're using a lower res anyway, why not turn ON DLSS and have (nearly) the same image quality in motion? Because some guys hate it by default? :)

Why not just turn down some graphical settings in-game and have (nearly) the same image quality in motion?
Seriously, if we are satisfied with "nearly the same", you might as well just turn down the graphics.

Edit: Has anyone done a comparison of native with everything on ultra vs native turned down vs DLSS with everything on ultra vs DLSS turned down?
 
DLSS is neither fake nor cheating if it provides high-quality visuals at a high resolution and with a high framerate. It doesn't matter one little bit that it's rendering at a lower res and applying AI upscaling, and that this is somehow 'fake', if that AI upscaling is good and the end result works.
What is fake, though, is marketing it as some killer feature that will change your gaming experience when it's not available in any of the games you play.
 
What is fake, though, is marketing it as some killer feature that will change your gaming experience when it's not available in any of the games you play.

I think it's a rad idea, particularly DLSS 3, which is supposed to work without pre-training.

What I want to see is the performance per transistor, though, which is where the RTX cards regressed with the inclusion of RT and tensor cores, and the consumer picked up the cost. At the time, DLSS 1.0 was pretty useless, so the move was questionable IMHO. With DLSS 2.0 being very good, I can better understand the move now. But how many extra transistors, in the form of tensor cores (assumed), are required to get that DLSS 2.0 boost?

From this metric you can properly infer "value": maybe the RTX cards cost a fortune, but if they're sporting 50% more transistors than the competition, you might say "fair enough". AMD might choose not to compete with DLSS, but for the same transistor count you might get similar performance, DLSS vs. no DLSS, or not.

For example: 5700 XT, 10.3B transistors; RTX 2070 Super, 13.6B transistors. Very similar performance without DLSS.
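A quick sketch of that perf-per-transistor sum, using the figures quoted above (the "very similar performance" equivalence is the post's own assumption, so the output is illustrative only):

```python
# Rough perf-per-transistor comparison from the figures above.
transistors_billion = {"RX 5700 XT": 10.3, "RTX 2070 Super": 13.6}
relative_perf = {"RX 5700 XT": 1.00, "RTX 2070 Super": 1.00}  # assumed ~equal w/o DLSS

for card, t in transistors_billion.items():
    print(f"{card}: {relative_perf[card] / t:.3f} perf units per billion transistors")

# TU104 spends ~32% more transistors for roughly the same raster performance;
# that overhead buys the RT and tensor hardware, and DLSS 2.0 is what makes
# it pay off in supported titles.
print(f"Transistor overhead: {13.6 / 10.3 - 1:.0%}")  # ~32%
```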
 