
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Hmmm, this leak is saying a 225% performance increase over the 5700 XT, but AMD themselves are saying "up to 50% performance per watt increase".

I wonder which will turn out to be true.
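Just to put rough numbers on it (back-of-the-envelope only; I'm reading the leak's "225%" as 2.25x the 5700 XT and treating its roughly 225 W board power as the baseline, none of it confirmed):

```python
# Illustrative arithmetic only - none of these figures are confirmed.
baseline_perf = 1.0        # 5700 XT performance, normalised to 1
baseline_power = 225.0     # roughly the 5700 XT's board power in watts

ppw = baseline_perf / baseline_power   # performance per watt
new_ppw = ppw * 1.5                    # AMD's "up to 50% better perf/W"

# Perf/W alone says nothing about absolute speed until you pick a power budget:
for power in (225.0, 300.0, 337.5):
    print(f"{power:>5.1f} W at +50% perf/W -> {new_ppw * power:.2f}x the 5700 XT")

# 225.0 W -> 1.50x, 300.0 W -> 2.00x, 337.5 W -> 2.25x
# Hitting 2.25x purely from a 50% perf/W gain would need a ~337 W card,
# so the two claims are either about different cards or different things.
```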


I honestly hate this percentage talk. Years ago a card was simply, say, 50% faster than the last card in actual performance. Now there's always this caveat of performance per watt, performance per mm², or even performance per board area. The only reason these are used is that they produce bigger numbers, making the product seem more powerful than it is. It gets irritating that these vendors are always inventing new terminology to push something as more powerful than it actually is.
 
I honestly hate this percentage talk. Years ago a card was simply, say, 50% faster than the last card in actual performance. Now there's always this caveat of performance per watt, performance per mm², or even performance per board area. The only reason these are used is that they produce bigger numbers, making the product seem more powerful than it is. It gets irritating that these vendors are always inventing new terminology to push something as more powerful than it actually is.

Yes, it can be confusing; we saw that with the Polaris launch. If what you say is true and AMD are using the 50% performance per watt figure as the biggest number to show in their marketing slides, to make it seem more powerful than it is, then that doesn't look good, does it?
 
Yes, it can be confusing; we saw that with the Polaris launch. If what you say is true and AMD are using the 50% performance per watt figure as the biggest number to show in their marketing slides, to make it seem more powerful than it is, then that doesn't look good, does it?

It honestly gets irritating that we've got to a point where both vendors pull these numbers out of their respective asses, numbers that NOBODY cares about. We want actual GAME performance; I don't care for anything else and I doubt many others do either.
 
Yes, it can be confusing; we saw that with the Polaris launch. If what you say is true and AMD are using the 50% performance per watt figure as the biggest number to show in their marketing slides, to make it seem more powerful than it is, then that doesn't look good, does it?
80% performance per RGB LED
 
80% performance per RGB LED

Not really anything related to the topic at hand, and I don't know why I thought of it, but when I read this post I was thinking of the Vega 64 Limited Edition card and how it was stunning to look at while having just the most basic lighting. No crazy RGB.
 
Not really anything related to the topic at hand, and I don't know why I thought of it, but when I read this post I was thinking of the Vega 64 Limited Edition card and how it was stunning to look at while having just the most basic lighting. No crazy RGB.


I have one, though I had to change the fan as it was making a weird noise.

 
Not really following: what card have you got? And the new stuff will be on RDNA 2, so are you hinting this is bringing nothing too?

AMD cards are severely lacking in various areas in terms of features compared to Nvidia. Just earlier today I wanted to use Topaz to enhance a video to 4K from 1080p - it's Nvidia only, with no AMD alternatives. Or taking some super-res screenshots with Ansel. Or even just allowing VSR to go past 5K. And the examples could go on all day.

Up until Navi, this was somewhat acceptable because AMD cards had either more forward looking hardware (hence GCN scaling well past its pre-Turing competitors when it comes to DX12 & Vulkan, and just general longevity) or simply a lot of compute & more vram, as well as other niche features of their own (for all its bad rep, Vega was excellent for 4K & HDR, as well as some niche scenarios thanks to HBCC - compared to Pascal). Instead, Navi makes the Radeon card nothing more than a bargain-bin version of Geforce. No extra features, no forward looking anything - just a gimped version that's sold for less - and with added driver headaches to boot, as is usual for AMD's first foray into anything.

Will RDNA 2 be the same? Who knows. In some ways it's clearly not going to gain any ground (e.g. the CUDA ecosystem), but I have yet to hear what it will bring that will set it apart from Turing & Ampere. The 16 GB is a nice start, but there needs to be more, and we'll see how the price shakes out too. But since I've heard nothing (outside of the vram), I have no reason to wait. Yes, I'd like more vram, but in the end I can live with 4-5 GB less for everything else that Nvidia brings to the table. So if AMD doesn't want to lose more sales to impatient people like myself, they'll have to speak up beforehand on why we should wait at all.

I don't delude myself thinking that my one sale makes a difference to their bottom line or decision-making, but I don't do charity for amoral corporations (ie any of them) either.
 
AMD cards are severely lacking in various areas in terms of features compared to Nvidia...

I don't think you answered my question, unless you're alluding to having owned a Vega and since stopped. The weird part was the 15-20 year band: if it is 20 years then you have owned a Radeon right up till now; if it was 15 years then it could be, as they were bought out by AMD.
 
AMD cards are severely lacking in various areas in terms of features compared to Nvidia. Just earlier today I wanted to use Topaz to enhance a video to 4K from 1080p - it's Nvidia only, with no AMD alternatives. Or taking some super-res screenshots with Ansel. Or even just allowing VSR to go past 5K. And the examples could go on all day.

Some devs just get "incentives" to keep things NVIDIA only. The sad fact is that a lot of the time when AMD bring in new features they largely get ignored by the industry. TruForm back around 2001 was something that could make the angular models in games more rounded and realistic, but it never went anywhere, and that was at a point in time when models were very angular and TruForm had an obvious benefit. TrueAudio was used in a few games but again never got any real traction.

So why keep cranking out features if the industry can't be arsed to adopt them? It costs time and money and they basically get ignored for the most part. If AMD don't pay the devs to use them they generally don't, as it's "more work for them to do", and as we know a lot of PC ports have the bare minimum of PC-centric features in them in the first place.
 
Are you talking about DirectML or FidelityFX? In both cases developers need to enable it. DirectML isn't AMD's, and FidelityFX isn't DLSS.



This post is kind of ironic. If Humbug is talking about DirectML, then artificial intelligence is going to be doing exactly that. If he is talking about FidelityFX, then no, it can't add missing detail, as FidelityFX is just an upscaler, but it does need to be added by the developer.


FidelityFX not being DLSS doesn't mean it isn't the same sort of thing in its result.

The one in the middle is FidelityFX. And no, FidelityFX works in any game, old or new, without any developer input.

[Screenshot: three-way comparison - native 4K, FidelityFX (middle), DLSS (right)]
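For anyone curious what the sharpening half of this actually does, here's a toy sketch of contrast-adaptive sharpening in the same spirit as AMD's CAS. It is not the real shader - just a simplified single-channel version with made-up parameter values, paired with a naive nearest-neighbour upscale as a stand-in for "render low, upscale, sharpen":

```python
import numpy as np

def adaptive_sharpen(img, sharpness=1.0):
    """Toy contrast-adaptive sharpen on a single-channel image in [0, 1].
    Loosely modelled on the idea behind FidelityFX CAS, not the real shader."""
    p = np.pad(img, 1, mode="edge")
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]

    # Local contrast from the cross-shaped neighbourhood: where the min/max
    # spread is already large (busy detail, hard edges) we back off.
    mn = np.minimum.reduce([img, up, down, left, right])
    mx = np.maximum.reduce([img, up, down, left, right])
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6), 0.0, 1.0))

    # Negative neighbour weight, kept in roughly the -1/8 .. -1/5 range so the
    # normalisation below never divides by zero.
    w = -amount * (0.125 + 0.075 * sharpness)
    out = (img + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0)

# Cheap nearest-neighbour upscale followed by the sharpen.
lowres = np.random.rand(270, 480).astype(np.float32)            # stand-in low-res render
upscaled = np.kron(lowres, np.ones((4, 4), dtype=np.float32))   # naive 4x upscale
final = adaptive_sharpen(upscaled, sharpness=1.0)
print(final.shape)  # (1080, 1920)
```

The point of the sketch is only that this kind of pass works on the finished frame, which is why it can be applied from the driver side without per-game work, unlike a reconstruction approach that needs engine data.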
 
Can't really tell the difference between those screenshots unless you look really hard; you certainly won't in real time playing a game.

The middle one does look better if you're looking very closely, though.
 
Some devs just get "incentives" to keep things NVIDIA only. The sad fact is that a lot of the time when AMD bring in new features they largely get ignored by the industry. TruForm back around 2001 was something that could make the angular models in games more rounded and realistic, but it never went anywhere, and that was at a point in time when models were very angular and TruForm had an obvious benefit. TrueAudio was used in a few games but again never got any real traction.

So why keep cranking out features if the industry can't be arsed to adopt them? It costs time and money and they basically get ignored for the most part. If AMD don't pay the devs to use them they generally don't, as it's "more work for them to do", and as we know a lot of PC ports have the bare minimum of PC-centric features in them in the first place.

TruForm essentially became tessellation down the line. The problem often is ATI/AMD just not providing the level of documentation and support nVidia does for their features, leaving a lot of developers in the dark, while nVidia will go as far as sending people in to work on a feature, or doing it in-house if a developer is willing to send them the relevant source code/material.

The lack of traction on TrueAudio is a shame - audio in games has stagnated for too long.
 
Can't really tell the difference between those screenshots unless you look really hard; you certainly won't in real time playing a game.

The middle one does look better if you're looking very closely, though.
To me the middle one, which is FidelityFX, looks best; there is more detail in it than in the native 4K one. The one on the right is DLSS before Nvidia fixed it - it's a blurry mess missing all the detail.
I have it turned on all the time; it makes every game look 4K...
 
To me the middle one, which is FidelityFX, looks best; there is more detail in it than in the native 4K one. The one on the right is DLSS before Nvidia fixed it - it's a blurry mess missing all the detail.
Yeah, the DLSS one certainly looks the worst, but like I said, you would never be able to tell these minute differences while you're playing the game in real time. It's like perceiving the difference between 100 fps and 110 fps without an FPS counter.
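In frame-time terms that gap really is tiny (plain arithmetic, nothing vendor-specific):

```python
# 100 fps vs 110 fps expressed as time per frame.
for fps in (100, 110):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 100 fps -> 10.00 ms per frame
# 110 fps -> 9.09 ms per frame, i.e. under a millisecond apart
```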
 
Yeah, the DLSS one certainly looks the worst, but like I said, you would never be able to tell these minute differences while you're playing the game in real time. It's like perceiving the difference between 100 fps and 110 fps without an FPS counter.

Yes, but the point is that image quality upscaling is not a feature unique to Nvidia; AMD has it and it works well.
 
Yes, but the point is that image quality upscaling is not a feature unique to Nvidia; AMD has it and it works well.
Sorry, not trying to miss the point; it's just that it's not really that important outside of screenshots.

Does it come with a performance hit if you activate it?
 
Can't really tell the difference between those screenshots unless you look really hard; you certainly won't in real time playing a game.

The middle one does look better if you're looking very closely, though.
The right-hand one is pure crap! Look at the rivets in the foreground, for one... and the number of visible jaggies.

Middle is best.
 