It can be both.
Edit: I'm not saying the performance rumours are true, just that technically it can be both those things.
Yeah, thought about it for a bit and you're right. If they made a big enough chip and threw a lot of power at it.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Hmmm, this leak is saying a 225% performance increase over the 5700XT, but AMD themselves are saying "up to 50% performance per watt increase".
I wonder which will turn out to be true.
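For what it's worth, the two numbers aren't mutually exclusive: absolute performance is roughly performance-per-watt times board power, so a 50% perf/W gain on a bigger chip with a higher power budget multiplies out to a much larger absolute uplift. A quick sketch of the arithmetic in Python (the 225W is the 5700 XT's official board power; the 300W for the new card is purely an assumed example, not a leak):

```python
# Rough model: performance ~ perf-per-watt x board power.
old_power_w = 225          # 5700 XT total board power (official spec)
new_power_w = 300          # hypothetical power budget for the new card
perf_per_watt_gain = 1.50  # AMD's "up to 50% perf/W" claim

relative_perf = perf_per_watt_gain * (new_power_w / old_power_w)
print(f"Relative performance vs 5700 XT: {relative_perf:.2f}x")  # 2.00x
```

So "up to 50% performance per watt" and a roughly 2x absolute jump can both be true at the same time, which is the "it can be both" point from earlier in the thread.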
I honestly hate this percentage talk. Years ago a card was simply, say, 50% faster than the last card in actual performance. Now there's always a caveat: performance per watt, performance per mm², or even performance per board area. The only reason these metrics get used is that they produce bigger numbers, making the product seem more powerful than it is. It gets irritating when vendors keep inventing new terminology to push something as more powerful than it actually is.
Yes, it can be confusing; we saw that with the Polaris launch. If what you say is true and AMD are using the 50% performance per watt figure as the biggest number in their marketing slides to make it seem more powerful than it is, then that doesn't look good, does it?
80% performance per RGB LED
Not really anything related to the topic at hand, and I don't know why I thought of it, but when I read this post I was thinking of the Vega 64 Limited Edition card: stunning to look at, with just the most basic lighting. No crazy RGB.
Not really following. What card have you got? The new stuff will be on RDNA2, so are you hinting this is bringing nothing too?
Psst, he was buying Nvidia no matter what anyway. And paying extra for the dongle and all. Gotta read between the lines.
This is actually probably true. Lol.
AMD cards are severely lacking in various areas in terms of features compared to Nvidia. Just earlier today I wanted to use Topaz to enhance a video from 1080p to 4K - it's Nvidia only, with no AMD alternative. Or taking some super-res screenshots with Ansel. Or even just allowing VSR to go past 5K. The examples could go on all day.
Are you talking about DirectML or FidelityFX? In both cases developers need to enable it. DirectML isn't AMD's, and FidelityFX isn't DLSS.
This post is kind of ironic. If Humbug is talking about DirectML, then artificial intelligence is going to be doing exactly that. If he's talking about FidelityFX, then no, it can't add missing detail: FidelityFX is just an upscaler, and it does need to be added by the developer.
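For anyone wondering what "just an upscaler" means in practice: FidelityFX's upscaling is a conventional spatial upscale followed by a contrast-adaptive sharpening pass (CAS), with no neural network inferring new detail. Here's a heavily simplified NumPy sketch of that idea - not AMD's actual shader, which fuses the steps into a single pass; the window size, weighting, and strength below are purely illustrative assumptions:

```python
import numpy as np

def upscale_nearest(img, factor):
    # Stand-in for the conventional spatial upscale step.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def adaptive_sharpen(img, max_strength=0.2):
    # Contrast-adaptive sharpening in the spirit of CAS: sharpen more
    # where local contrast is low, back off where it's already high,
    # so edges don't ring. All constants here are made up.
    out = img.copy()
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            n = img[y-1:y+2, x-1:x+2]            # 3x3 neighbourhood
            local_contrast = n.max() - n.min()   # 0 = flat, 1 = busy
            w = max_strength * (1.0 - local_contrast)
            # Unsharp-mask step: push the pixel away from the local mean.
            out[y, x] = np.clip(img[y, x] + w * (img[y, x] - n.mean()),
                                0.0, 1.0)
    return out

# Tiny greyscale test frame in [0, 1], upscaled 2x then sharpened.
frame = np.random.rand(8, 8)
result = adaptive_sharpen(upscale_nearest(frame, 2))
```

The takeaway: there's no AI in that loop, so it can't invent detail that isn't in the source frame, and either way the developer still has to wire the pass into the game's render path.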
Some devs just get "incentives" to keep things Nvidia-only. The sad fact is that when AMD bring in new features, they largely get ignored by the industry. TruForm, back around 2001, could make the angular models in games more rounded and realistic, but it never went anywhere - and that was at a point in time when models were very angular and TruForm had an obvious benefit. TrueAudio was used in a few games but again never got any real traction.
So why keep cranking out features if the industry can't be arsed to adopt them? It costs time and money, and they mostly get ignored. If AMD don't pay the devs to use them, they generally don't, as it's "more work for them to do" - and as we know, a lot of PC ports have the bare minimum of PC-centric features in the first place.
To me the middle one, which is FidelityFX, looks best; there is more detail in it than in the native 4K side. The one on the right is DLSS before Nvidia fixed it: it's a blurry mess missing all the detail.
Can't really tell the difference between those screenshots unless you look really hard, and you certainly won't in real time playing a game. The middle one does look better if you're looking very closely, though.
Yeah, the DLSS one certainly looks the worst, but like I said, you would never be able to tell these minute differences while you're playing the game in real time. It's like perceiving the difference between 100fps and 110fps without an FPS counter.
Yes, but the point is that image-quality upscaling is not a feature unique to Nvidia; AMD has it and it works well.
Sorry, not trying to miss the point, just that it's not really that important outside of screenshots.
Does it come with a performance hit if you activate it?
The right-hand one is pure crap! Look at the rivets in the foreground, for one, and the number of visible jaggies.