
The thread which sometimes talks about RDNA2

And I just scored a 3080FE at cost, or at least I think I did.
Lethargy ... a lack of energy and enthusiasm. I think that perfectly describes AMD's RT performance.
  • It's their first try, even though Nvidia are on their 2nd generation. Why didn't they get involved earlier? Yes, building silicon is expensive, but running silicon compilers is relatively cheap. Manufacturers know the performance before a design is fabricated. AMD knew that Nvidia would be increasing their performance.
  • AMD may have been in financial trouble, and that is often accompanied by a lack of energy and enthusiasm.
  • Microsoft and Sony wanted cheap APUs that would be good enough for console gamers. That's what RDNA2 appears to be. Though Sony were smart enough to build in some additional features that are not part of the RDNA2 found in PC cards.
  • Games using RT so far are built on DXR or Vulkan. 3DMark's Port Royal is built around DXR.
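
For what it's worth, DXR is a vendor-agnostic API, so the same code path runs on GeForce RTX and Radeon RX 6000 alike. As a rough illustration only (a minimal sketch, not taken from any particular game or from 3DMark), this is how a D3D12 title can ask the driver whether it exposes DXR at all:

```cpp
// Minimal sketch: query DXR support through the public D3D12 API.
// Link against d3d12.lib; error handling trimmed for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool DeviceSupportsDXR()
{
    // Create a D3D12 device on the default adapter.
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    // Ray tracing capability is reported via the OPTIONS5 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Tier 1.0 or higher means the driver exposes DXR, whoever made the GPU.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    std::printf("DXR supported: %s\n", DeviceSupportsDXR() ? "yes" : "no");
}
```

If that returns false, a DXR title simply falls back to rasterised effects; how fast the rays trace once it returns true is entirely down to the hardware and driver underneath.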

Do you forget AMD and Freesync? Nvidia got the drop, but Freesync still won because of open standards and the fact the market was saturated with cheaper Freesync monitors. The vast majority of AAA games are most likely to be focused on next-gen consoles and whatever version of RT they will use, and that can and will influence how next-gen games implement RT on PC. So the fact that the only game in town for RT has been Nvidia means nothing longer term. It's not always about being first, or even the best.
 
This is the thing: people are dissing AMD's RT despite the fact the 6800 does have it at roughly 2080 Ti speeds, and we have yet to see how AMD implement their DLSS equivalent. So it might be OK, or it might be crap... wait and see. Yet that does not make the 6800 series bad GPUs.

For the most part I think what AMD promised and what they delivered was quite different as far as availability goes. The RT performance as it stands is meh, but there is a possibility that it will improve if AMD's DLSS efforts are worth it. They are in the ballpark of Nvidia's top end at rasterisation and, despite what many try to claim, that is the main thing that matters.
 
and of course the RT experts on here that are either developers or part of the industry consortium groups.. :rolleyes:



You know the sad thing, if AMD were equal in RT performance then the focus of the RT brigade would be "oh but the 30 series are a bit better in 4K performance". It's the standard BS of picking out something and making a big deal of it.
 
This is the thing: people are dissing AMD's RT despite the fact the 6800 does have it at roughly 2080 Ti speeds, and we have yet to see how AMD implement their DLSS equivalent. So it might be OK, or it might be crap... wait and see. Yet that does not make the 6800 series bad GPUs.

People just need to justify their purchases - and this goes both ways.

TBH most people will be happy with either a 3080 or a 6800XT for the majority of games that exist right now
 
Lethargy ... a lack of energy and enthusiasm. I think that perfectly describes AMD's RT performance.

:rolleyes: Fanboys will fanboy

Lethargy ... a lack of energy and enthusiasm. I think that perfectly describes AMD's RT performance.
  • It's their first try, even though Nvidia are on their 2nd generation. Why didn't they get involved earlier? Yes, building silicon is expensive, but running silicon compilers is relatively cheap. Manufacturers know the performance before a design is fabricated. AMD knew that Nvidia would be increasing their performance.

Because companies' timings and plans are different. Because all their resources are invested in getting RDNA 2 going. You seem to think there is a magic go-faster button that AMD couldn't be bothered to press during development.

  • AMD may have been in financial trouble, and that is often accompanied by a lack of energy and enthusiasm.
:rolleyes: Someone clearly needs a history lesson if you think a lack of energy or enthusiasm has anything to do with their financial troubles.

  • Microsoft and Sony wanted cheap APUs that would be good enough for console gamers. That's what RDNA2 appears to be. Though Sony were smart enough to build in some additional features that are not part of the RDNA2 found in PC cards.

:rolleyes: Fanboys will fanboy

  • Games using RT so far are built on DXR or Vulkan. 3DMark's Port Royal is built around DXR.

Games use DirectX, yet both companies release drivers to optimise performance, and it is known that graphics cards in general perform better in sponsored titles. Therefore we can conclude that just because something uses a platform-agnostic API doesn't mean the hardware will be at maximum performance, and there are advantages to targeted optimisations.
 
The RTX 3080 has 68 RT cores.
The 6800XT has 72 Ray Accelerators, AMD's equivalent of RT cores.

In terms of ray tracing we don't yet know what the true performance of RDNA2 is, not until we have more RT games that aren't sponsored by Nvidia.
I want to see how well they run in rendering. Hopefully the update to Blender happens in 2021 Q1. Depending on how the code is implemented, that will be the first insight into the speed difference between them.

That reminds me, AMD was supposed to have some stuff focused on content creation around the 6900XT launch.
 
:rolleyes: Fanboys will fanboy



Because companies' timings and plans are different. Because all their resources are invested in getting RDNA 2 going. You seem to think there is a magic go-faster button that AMD couldn't be bothered to press during development.


:rolleyes: Someone clearly needs a history lesson if you think a lack of energy or enthusiasm has anything to do with their financial troubles.



:rolleyes: Fanboys will fanboy



Games use DirectX, yet both companies release drivers to optimise performance, and it is known that graphics cards in general perform better in sponsored titles. Therefore we can conclude that just because something uses a platform-agnostic API doesn't mean the hardware will be at maximum performance, and there are advantages to targeted optimisations.

If you look back you will see that I've already said I don't care what manufacturer is in my PC, I simply buy the best at the time. Calling fanboy seems to be something people do on here when feeling aggrieved. What sort of person would feel aggrieved while discussing GPUs?

I said that energy and enthusiasm were linked to financial state.

So how do you explain 3DMark's Port Royal results?
 
If you look back you will see that I've already said I don't care what manufacturer is in my PC, I simply buy the best at the time. Calling fanboy seems to be something people do on here when feeling aggrieved. What sort of person would feel aggrieved while discussing GPUs?
So where did you read that Nvidia handed DLSS to Microsoft? Or that AMD blocked RT on Cyberpunk? You said that too in the past.
If both are using DXR, why did Ubisoft release a patch offering support for AMD RT on WDL?
 
You know the sad thing, if AMD were equal in RT performance then the focus of the RT brigade would be "oh but the 30 series are a bit better in 4K performance". It's the standard BS of picking out something and making a big deal of it.

Yeah, you know that would be one of the tools in the kit coming out. Where one dies off (G-Sync) another one replaces it (RT etc.), only because people fall for the marketing; people believe they need it before it's actually a thing.
 
And I just scored a 3080FE at cost, or at least I think I did.

Fingers crossed, great cards. Running mine with a 3770k, which is still doing surprisingly well. Will grab a 5900x or 5950x when they are more available.

Do you forget AMD and Freesync? Nvidia got the drop, but Freesync still won because of open standards and the fact the market was saturated with cheaper Freesync monitors.

I'm sure I've called G-Sync a scam on here in the past. Freesync was a great move by AMD, but how long has it taken them to bring it up to a decent standard? And when they did, Nvidia came along and called it G-Sync Compatible.

The vast majority of AAA games are most likely to be focused on next-gen consoles and whatever version of RT they will use, and that can and will influence how next-gen games implement RT on PC. So the fact that the only game in town for RT has been Nvidia means nothing longer term. It's not always about being first, or even the best.

I can see the argument: AMD consoles will hold back the PC master race. Bad ports get a lot of flak on PC. I think we will get decent console ports where AMD will run low to medium settings in RT, while Nvidia will be running high.
 
So where did you read that Nvidia handed DLSS to Microsoft?

Handed the DLSS model. I don't know where I read it. I even googled for it, but can't find anything. Perhaps on one of the ML discussions.

Or that AMD blocked RT on Cyberpunk? You said that too in the past. If both are using DXR, why did Ubisoft release a patch offering support for AMD RT on WDL?

I gave up on Ubisoft when they left Steam so I'm not following it closely. Wasn't it covered that the AMD cards were not displaying RT correctly and so needed a patch?

As far as CP2077 goes, the biggest release in memory, I didn't read anything regarding AMD other than that they were working with CDPR to get it working; that's the biggest release in memory that they didn't have and still don't have working. Now Nvidia came out and declared that they were only using DXR for RT, no proprietary extensions, while AMD were trying to sell their new DXR-capable cards. Looking at RT performance so far we can see AMD is ~30-50% slower than Nvidia in RT without considering DLSS. So how many new RDNA2 cards do you think AMD would sell if the biggest release in memory ran at single-digit frame rates, if not fractions, on AMD cards with RT enabled? It's clear AMD does not want RT running in CP2077 anytime soon.
 
Fingers crossed, great cards. Running mine with a 3770k, which is still doing surprisingly well. Will grab a 5900x or 5950x when they are more available.

Thanks, just waiting to see if it turns into an actual order. I have been here before with an MSI 3080 which was cancelled after 4 days by the retailer.

I'm sure I've called G-Sync a scam on here in the past. Freesync was a great move by AMD, but how long has it taken them to bring it up to a decent standard? And when they did, Nvidia came along and called it G-Sync Compatible.

No doubt Nvidia are the masters of marketing (though they sometimes get it wrong). As far as the rebranding of Freesync goes, I don't believe Nvidia have been successful, as most people still say Freesync when they mean VRR. So I think AMD came out on top overall in the VRR "war". The thing with Freesync is that it was an open standard, and as such there were cheap, nasty monitors advertised as Freesync, while at the same time you could get a Freesync monitor that was as good as a more expensive G-Sync model. This is where many made unfair comparisons to declare G-Sync better. Though when you got two similar monitors, one Freesync and one G-Sync, the Freesync one was usually quite a bit cheaper. You can check this out with some of the Asus range of higher-end monitors.

I can see the argument: AMD consoles will hold back the PC master race. Bad ports get a lot of flak on PC. I think we will get decent console ports where AMD will run low to medium settings in RT, while Nvidia will be running high.

Only time will tell but I feel that it will be more a case of developers taking a "good enough" approach unless they are funded by Nvidia as CP2077 is. So overall we will see a case of both AMD and Nvidia being close enough in RT that it won't really matter for most games.

Again, the only caveat is how AMD will do with their DLSS equivalent. If it looks like DLSS 1.0 then it will be a fail IMHO.
 
The AMD UK store web page has been updated and it's working OK, having been broken for me since the day after the 6800XT launch.

It's not much but at least it's a start. The £580 6800XT Reference is looking like the buy of the year; as time passes it's looking even better value than when it launched. There is going to be a very big 2021 hangover for some peeps that blew £1,900 on those 3090s with bottom-tier PCBs (2x 8-pin Zotacs for a start).
 
As far as CP2077 goes, the biggest release in memory, I didn't read anything regarding AMD other than that they were working with CDPR to get it working; that's the biggest release in memory that they didn't have and still don't have working. Now Nvidia came out and declared that they were only using DXR for RT, no proprietary extensions, while AMD were trying to sell their new DXR-capable cards. Looking at RT performance so far we can see AMD is ~30-50% slower than Nvidia in RT without considering DLSS. So how many new RDNA2 cards do you think AMD would sell if the biggest release in memory ran at single-digit frame rates, if not fractions, on AMD cards with RT enabled? It's clear AMD does not want RT running in CP2077 anytime soon.
But CP runs in single digits on Nvidia cards too, and yet that doesn't stop you from praising them and their great RT performance. Really, open CP and try to run the game at 4K native with RT and see if you can get 10 FPS.
Why would AMD be afraid of that performance? It would be almost as good as Nvidia's, close to 0. :)
 
Some box shots from randoms on the internet, not my cards and I don't have any for sale. At least it's evidence that the AMD AIBs shipped at least one card in 2020. LOL.

To my knowledge nobody has managed to review a 6900XT Nitro+ yet, and yet here it is, sitting there like unicorn tears.

[Five attached box-shot photos, including the 6900XT Nitro+]
 
Ray tracing, or better, path tracing, is the holy grail of computer graphics. I don't think you can do more than that in terms of visuals.

Sure, you can do more in terms of physics, AI, animation, etc. Just because RT is used in multiple areas (global illumination, ambient occlusion, shadows and reflections) doesn't mean that it is heavy - as in shooting more rays than necessary - so you can't really compare it with tessellation.

AMD, on the other hand, just puts different demos and tech out there and stands aside, waiting for devs to pick everything up and put it in their games. Tessellation was in AMD's hardware in some form (TruForm it was called, if I recall correctly) waaaay before DX11. They also had AI, GI and other stuff running on the GPU back in the HD4xxx days. TrueAudio rings a bell? Bullet (AMD's alternative to PhysX)?

I couldn't care less if it is AMD or nVIDIA (or Intel?) in my PC. I only care about getting the best card for my budget and needs, and at those launch prices, to me and my needs, the green team has a better offer.


There is absolutely zero mention of ray-tracing in this video about Unreal Engine 5's new Nanite and Lumen technologies which will be used with PlayStation 5.



DLSS-like processing is the future, be it real-time in games as we have now or a combination including AI upscaling to reduce delivery size and storage. Again, you can't blame Nvidia for AMD's lethargy. Indeed, Nvidia handed Microsoft the DLSS model to be used in the DirectX family.

In principle and in general AMD's philosophy is to push technologies that improve the image quality/clarity/fidelity, as seen with the implementation of FidelityFX and Contrast Adaptive Sharpening.

DLSS is the opposite in nature - it reduces the image quality where it sees that more performance is needed, because weird new technologies like DXR are being implemented when the hardware transistor budget is years away from being ready.
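
To make that contrast concrete, below is a toy, single-channel, CPU-side sketch of the general idea behind contrast-adaptive sharpening. It is a simplification for illustration only, not AMD's actual FidelityFX CAS shader: the kernel weight adapts to local contrast, and the whole thing operates on an already rendered, full-resolution frame.

```cpp
// Toy illustration only: a simplified contrast-adaptive sharpen on the CPU.
// NOT AMD's FidelityFX CAS shader - just the general idea: flat areas get
// sharpened more, high-contrast or near-clipping areas are left mostly alone.
#include <algorithm>
#include <vector>

std::vector<float> ContrastAdaptiveSharpen(const std::vector<float>& img,
                                           int w, int h, float strength)
{
    std::vector<float> out(img.size());
    auto at = [&](int x, int y) {
        x = std::clamp(x, 0, w - 1);   // clamp sampling at the borders
        y = std::clamp(y, 0, h - 1);
        return img[y * w + x];
    };

    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // Cross-shaped neighbourhood around the centre pixel.
            float c  = at(x, y);
            float up = at(x, y - 1), dn = at(x, y + 1);
            float lf = at(x - 1, y), rt = at(x + 1, y);
            float mn = std::min({c, up, dn, lf, rt});
            float mx = std::max({c, up, dn, lf, rt});

            // Headroom term: close to 1 in flat mid-range areas, close to 0
            // near clipping or where local contrast is already strong.
            float amp = std::clamp(std::min(mn, 1.0f - mx) /
                                   std::max(mx, 1e-4f), 0.0f, 1.0f);

            // Negative cross weight, roughly -1/8 (gentle) to -1/5 (strong).
            float peak = -1.0f / (8.0f - 3.0f * std::clamp(strength, 0.0f, 1.0f));
            float wgt = amp * peak;

            // Sharpen and renormalise so perfectly flat regions are unchanged.
            float val = (c + wgt * (up + dn + lf + rt)) / (1.0f + 4.0f * wgt);
            out[y * w + x] = std::clamp(val, 0.0f, 1.0f);
        }
    }
    return out;
}
```

Which is the point being made: one approach post-processes a native-resolution image, the other renders fewer pixels and reconstructs them.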
 
I got a good laugh at the number of insecure responses that fail to acknowledge or understand what we are actually seeing in games with ray tracing, and how ineffective it is, for me, as an actual selling point to consider. For example, CB2077... the buggiest game of 2020, one that's been blacklisted, rebuked, meme'd, refunded and sued... has been championed as the pinnacle of gaming "for Ampere". With a 3090 at best and a 3080 at worst. If the frame rates are not to your satisfaction you can enable DLSS. It will blur the image but allow you to enable RT with a slight performance boost.

The images below show the game with and without RT: the left is without RT, the right is with RT (some images may look slightly blurred due to DLSS).

[Attached comparison screenshots]
Vocally, some post the delusion that the entire game looks better with RT on. If you took their word for it and didn't look at the game with RT on vs off yourself, you would think it's true. However, when you do, you realize they are spouting off one of the biggest lies one can tell in order to promote Ampere. If anything, the biggest difference you will see with RT on in this game is a performance penalty. Nothing more, nothing less.

Is it worth it to me to invest in Ampere when this is what I get in return? No, it is not. I cannot invest in a 'closed API' as I already know that real RT is far better than this, as shown in this post:
https://www.overclockers.co.uk/foru...-event-thread.18904106/page-465#post-34373304

To date I have not found any reasonable counter as to why I should consider it. However, if I don't align my opinion to favor Nvidia I'm called an AMD fanboy who only talks negatively about RT/Nvidia. A badge of honor I wear proudly. Because that's the best and only response I've received from cheerleaders when presented with information like this, which they want to counter but cannot. Using ad hominem to vocalize their own frustration. :D

Recap of what I've said:
First, RT as a whole in games offers poor IQ improvements for the performance cost over the rasterized counterpart(s) it replaces. It is not a standard that someone who buys a midrange GPU can enjoy without the performance penalty, or without the blurring of IQ through the use of DLSS to increase performance.

Second, it's not true/real ray tracing, as the game is still rasterized. The IQ will remain rasterized with only a few elements of RT put into the game, which, to me, is only a few drops in the bucket. Games will never be fully RT'd.

Third, all it does is help decrease the development time of implementing lighting, shadows, reflections, ambient occlusion, global illumination, etc. It depends on how it's implemented/tweaked to improve overall performance, and which implementation the developer will use in the game, as all of the assets of RT are not always implemented in a game due to the performance penalty and/or direction. But like I stated before, it doesn't change the fact that it's still a rasterized game. Once developers find a way to use it with a minimal performance penalty I can see it being used, at the cost of very little improvement in IQ... oh wait... Dirt 5, etc... :D

However, since we are in the RDNA 2 thread, you have to decide if this is worth it to you or not. For me, it is not. I'd rather pay a cheaper price, if and when it becomes available, for a better, next-gen level of rasterized performance. If some games include RT, so be it. However, RT won't be a selling point to me. My interest is in the game itself.

Merry Christmas!!!
:D:D
 