The thread which sometimes talks about RDNA2

Oh yeah, I'm very triggered, I've never been so triggered, I'm the most triggered triggered person that has ever been triggered.....

I'm guessing it's not using all the features of DX12 since it was rushed to market. That's just a guess, though; you'd be better off asking the Red Engine developers.

BTW, you missed ....
Since you keep bringing up this bug-ridden game and Ampere, I still need an answer to my question. We know that Nvidia has been working with CD Projekt Red for quite a while on this game. DirectX 12 Ultimate is not something new, especially when developers creating games for consoles can use it, as can Radeon users.

After everything you said, it only appears that Ampere is not as good as RDNA 2 after all. And your linked videos aren't capable of answering the question either. But nice try, lol.
:D
 
Last big title that Nvidia sponsored from the same company:
[image]


Replace the useless HairWorks with the useless RT we have now, and you can see why Nvidia always puts some bloatware in its sponsored games. :)

Is that the same useless RT that Apple, AMD, Intel, Microsoft and Sony are investing in?

I agree HairWorks was awful. TressFX did a better job. But in defence of HairWorks, it could put a lot more hair on screen (that sounds rude) :o
 
Since you keep bringing up this bug-ridden game and Ampere, I still need an answer to my question. We know that Nvidia has been working with CD Projekt Red for quite a while on this game. DirectX 12 Ultimate is not something new, especially when developers creating games for consoles can use it, as can Radeon users.

I didn't bring it up; I answered your hate-fuelled post.

After everything you said, it only appears that Ampere is not as good as RDNA 2 after all. And your linked videos aren't capable of answering the question either. But nice try, lol.
:D

Yep, it certainly looks that way ....


DirectX Raytracing Feature Test

1 GPU
  1. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  2. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  3. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  4. Score 62.25, GPU 3090 @2220/5276, CPU 10900k @5.3, Post No.0373, Jay-G25 - Link Drivers 457.09
  5. Score 61.19, GPU 3090 @2160/5252, CPU 6950X @4.405, Post No.0372, FlyingScotsman - Link Drivers 457.09
  6. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  7. Score 59.34, GPU 3090 @2070/4976, CPU 5950X @4.965, Post No.0474, Grim5 - Link Drivers 460.89
  8. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  9. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  10. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  11. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  12. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  13. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  14. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  15. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  16. Score 34.15, GPU 6900XT @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  17. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  18. Score 32.54, GPU 2080 Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  19. Score 29.91, GPU 2080 Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  20. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  21. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09

I wonder if that's an RDNA2 wrench :D
 
Is that the same useless RT that Apple, AMD, Intel, Microsoft and Sony are investing in?

I agree HairWorks was awful. TressFX did a better job. But in defence of HairWorks, it could put a lot more hair on screen (that sounds rude) :o
Nvidia's emphasis on heavy RT games is nothing but another HairWorks. It won't exist anywhere outside Nvidia-sponsored titles. Discrete RT will be used in more games.
 
Why has a thread about RDNA2 turned into a thread about nVidia Ray-Tracing? At this point in time, it appears the only people concerned about Ray-Tracing are nVidia (because they've invested in it) and nVidia fans. Maybe there should be a poll asking if anyone buys a GPU because it does Ray-Tracing?
 
I didn't bring it up; I answered your hate-fuelled post.

Yep, it certainly looks that way ....

I wonder if that's an RDNA2 wrench :D
Ah, look at that! You triggy :D

But I still need an answer to my post. Those benchmarks are not an answer to why Ampere cannot use DX12U features in CP2077, an open-world game that offers five elements of ray tracing.

I'd seriously like to know why, because I am looking forward to more games that use variable rate shading, mesh shading, etc. To me it's more revolutionary than RT.

Which is why I find RDNA 2 a better option.
:D
 
Why has a thread about RDNA2 turned into a thread about nVidia Ray-Tracing? At this point in time, it appears the only people concerned about Ray-Tracing are nVidia (because they've invested in it) and nVidia fans. Maybe there should be a poll asking if anyone buys a GPU because it does Ray-Tracing?

It's because the trolls are trying to sell Nvidia cards.
 
Ah, look at that! You triggy :D

But I still need an answer to my post. Those benchmarks are not an answer to why Ampere cannot use DX12U features in CP2077, an open-world game that offers five elements of ray tracing.

I'd seriously like to know why, because I am looking forward to more games that use variable rate shading, mesh shading, etc. To me it's more revolutionary than RT.

Which is why I find RDNA 2 a better option.
:D

Maybe due to it not using that level of DXR? Did you ask the devs who worked on Red Engine?
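As an aside, whether a card "can use" DX12U features isn't something hidden per-game; any engine can query the feature tiers at startup. Here's a minimal sketch of those checks against the public D3D12 API (my own illustration, nothing to do with Red Engine's actual code):

[CODE=cpp]
// Query the DX12 Ultimate feature tiers a GPU/driver actually exposes.
// Illustrative only; error handling on the tier queries trimmed for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};  // raytracing (DXR) tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};  // variable rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};  // mesh shaders, sampler feedback
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    std::printf("DXR tier:             %d\n", o5.RaytracingTier);
    std::printf("VRS tier:             %d\n", o6.VariableShadingRateTier);
    std::printf("Mesh shader tier:     %d\n", o7.MeshShaderTier);
    std::printf("Sampler feedback tier:%d\n", o7.SamplerFeedbackTier);
    return 0;
}
[/CODE]

Both Ampere and RDNA 2 advertise the full DX12U set through this path; which of those features a given game actually lights up is the developer's call, which is rather the point being argued here.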
 
Why has a thread about RDNA2 turned into a thread about nVidia Ray-Tracing? At this point in time, it appears the only people concerned about Ray-Tracing are nVidia (because they've invested in it) and nVidia fans. Maybe there should be a poll asking if anyone buys a GPU because it does Ray-Tracing?
Because it gets boring to praise Jensen on Nvidia topics. :)
 
Maybe due to it not using that level of DXR? Did you ask the devs who worked on Red Engine?
Oh, but you don't remember. When Ubisoft launched an AC game sponsored by Nvidia, they removed DirectX 10.1 support from it because Nvidia was lagging far behind AMD at the time. This is the way it's meant to be played: when you sign the pact with the devil, you do what he wants. :)
 
Nvidia's emphasis on heavy RT games is nothing but another HairWorks. It won't exist anywhere outside Nvidia-sponsored titles. Discrete RT will be used in more games.

I don't see any reason why not. Remember, RT is not proprietary to Nvidia. What excuse will you have once Intel enters the market? Evil Intel? Got RL stuff to do, so please post the Steam split on AMD/Nvidia. I'm sure devs can put a slider or two in future games, just as they have done in the past.
 
Maybe due to it not using that level of DXR? Did you ask the devs who worked on Red Engine?
Nvidia had a team working with CDPR. It's standard operating procedure, called developer relations. It's supposed to be in their development kit when optimizing the game to work with their hardware.
Therefore, I question Ampere's ability, not the developer's. CDPR won't implement features that don't work, or don't work well, with Ampere. As it stands, in my opinion, Ampere doesn't work, or doesn't work well, using both high levels of RT and DX12U. So they dumped DX12U for RT.

AMD, meanwhile, is plowing ahead with DX12U console games ported to PC, optimizing and updating SAM for both Zen 3 and apparently Zen 2. They even worked with rival Intel to bring SAM to Intel chipsets as well.

When I factor in all of these great benefits AMD offers, it's impossible for me to look only at Ampere and CP2077. Your view of it is minuscule in comparison.
RDNA 2 offers so much! And it's a direct means of gaming on those ported console games too.

So your post is still not an answer to what's going on.

:D
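For anyone wondering what SAM actually does under the hood: it's AMD's branding of PCIe Resizable BAR, which lets the CPU address all of VRAM instead of the classic 256 MB window. A rough way to check whether it's active is to look for a large device-local, host-visible memory heap, sketched below with Vulkan (my own illustration; the 256 MB threshold is a heuristic, not any official SAM API):

[CODE=cpp]
// Heuristic Resizable BAR / SAM check: with it enabled, the GPU usually
// exposes a DEVICE_LOCAL | HOST_VISIBLE memory type on a heap much larger
// than 256 MB. Illustrative sketch with minimal error handling.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkInstanceCreateInfo ici = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            VkMemoryPropertyFlags f = mem.memoryTypes[i].propertyFlags;
            VkDeviceSize heap = mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
            if ((f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT) &&
                heap > 256ull * 1024 * 1024) {
                std::printf("Large CPU-visible VRAM heap: %llu MB (ReBAR/SAM likely on)\n",
                            (unsigned long long)(heap >> 20));
            }
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/CODE]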
 
I don't see any reason why not. Remember, RT is not proprietary to Nvidia. What excuse will you have once Intel enters the market? Evil Intel? Got RL stuff to do, so please post the Steam split on AMD/Nvidia. I'm sure devs can put a slider or two in future games, just as they have done in the past.
Because it requires too many resources vs. traditional rendering. So they will have to stop somewhere; even Nvidia won't be able to push the narrative for more than two more generations.
The consoles have hardware for discrete RT, so game creators will use RT in the future, but never at the level Nvidia will use in its own sponsored titles. They are not using heavy RT to make the games better; the primary purpose is to make the old cards and the competition's cards obsolete and sell their new cards. When heavy RT becomes too expensive, they will move on to the next feature they can exploit.
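It's easy to put rough numbers on that resource claim. A back-of-envelope sketch below; the rays-per-pixel count and frame rate are illustrative assumptions, not measurements from any title:

[CODE=cpp]
// Back-of-envelope ray budget at 4K/60 with a handful of RT effects.
#include <cstdio>

int main() {
    const double pixels       = 3840.0 * 2160.0; // one 4K frame
    const double raysPerPixel = 4.0;             // assume: primary + shadow
                                                 // + reflection + GI, 1 each
    const double fps          = 60.0;

    const double raysPerSecond = pixels * raysPerPixel * fps;
    std::printf("~%.1f gigarays/s before any bounces or denoising\n",
                raysPerSecond / 1e9);            // prints ~2.0
    return 0;
}
[/CODE]

Two gigarays per second is already a healthy slice of what top-end cards can trace in practice, before bounces, BVH rebuilds or denoising, which is why today's titles ration rays with reduced resolution and upscaling.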
 
Got RL stuff to do, so please post the Steam split on AMD/Nvidia. I'm sure devs can put in a slider or two in future games just as they have done in past.
Lol, you need to understand that only a small percentage of those Steam users have a card that can do RT, and the percentage that owns a 3000-series Nvidia card is minuscule.
 
Well, this thread has turned far too sour. So many people banging on and on about Nvidia trolls etc., but all I see is Nvidia haters. I don't care for any of that, in truth, and would rather everyone got along without this nonsense. Surely we all enjoy our hardware, regardless of whether it's red, green or blue inside?
 