AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
I hope everyone knows that RT, and how the consoles will use it, will be a complete revolution of little relevance.
Can't wait for those microscopes comparing IQ with and without RT for those subtle differences. And the debates that will be had to make shiny shinier and reflective more reflective on PC. :D
 
Can't wait for those microscopes comparing IQ with and without RT for those subtle differences. And the debates that will be had to make shiny shinier and reflective more reflective on PC. :D

If decently powered raytracing hardware becomes widespread, it's not about a few shinies, it's a huge step towards even greater realism, and will massively improve things like reflections and water.
 
To be fair I'm not bothered so much about the ray tracing. If Big Navi does a good job of it, and I mean properly, with similar results to Nvidia, then even better. What I really need is a card that can handle a good resolution on a display while keeping the frames high and tight.
 
I'd bank on at least 56 CUs for the PC part; they may not see the need for bigger/better (i.e. capture 90% and below of the market, leave the really big chips to nV). Imo they should play it smart and leave the big, expensive-to-produce chips to nV.
Isn't the rumour at least two, possibly three, RDNA2 cards? That could provide a good spread.

I just feel they need to be more relevant in the upper reaches of benchmarks. After Nvidia release the 30XX cards, the benchmarks will be a sea of green.
 
If decently powered raytracing hardware becomes widespread, it's not about a few shinies, it's a huge step towards even greater realism, and will massively improve things like reflections and water.




Ray Tracing....architects.

RTX ON

Nope, it's about the shiny.
See how shiny those pipes are. You can shave your face off those pipes. Boom... Now imagine this game at 1440p 60fps on a 3080 Ti.
 

RTX ON

Nope, it's about the shiny.
See how shiny those pipes are. You can shave your face off those pipes. Boom... Now imagine this game at 1440p 60fps on a 3080 Ti.
Ah yes, realism. As we all know, every industrial facility has at least one guy whose job it is to polish all the pipes to a high gloss :D

RT is atm largely a gimmick. It could be used to add realism, or we could just have mirror finishes on every surface :p
 
Ah yes, realism. As we all know, every industrial facility has at least one guy whose job it is to polish all the pipes to a high gloss :D

RT is atm largely a gimmick. It could be used to add realism, or we could just have mirror finishes on every surface :p
Gimmick doesn't even begin to describe it. More like a con. If they didn't have such a following this would be laughed right out of the market. But people take this seriously, even on this forum.
Now let us both go yell at our HDTVs for not reflecting exactly what's across the room when they're off. :D


Edit:
Had Nvidia told developers to make subtle changes and reduce the IQ of RT by a factor of 10, only using it where it made sense, the Turing cards would have put RDNA 1 to shame. But Nvidia's marketing department really screwed up the presentation of RTX, making it more of a meme than a value option.

Now they need Ampere to bring the performance that Turing should have had, because they dialled ray tracing up to 11 instead of just increasing the IQ of ray tracing with a beefier GPU (Ampere). And some talk about AMD's marketing department needing help. The only developer that did this was DICE with BF5. Neither Nvidia nor their marketing department had the sense to take note of how well DICE implemented RT without the need for DLSS (because they lowered the IQ of RT and only used it where it made sense, among other tweaks).


-----------

The point is that right now we are going to get console-ported games using RT sparingly, and only where it makes sense. And the actual IQ per pixel for RT will be much lower than what Nvidia did, for obvious reasons.
 
@EastCoastHandle Nvidia did the same with PhysX. When it was put into games, not only was it dialled up to silly levels, they also removed normal physics effects to make the PhysX effects look like more of a change. It also meant most people with mainstream Nvidia GPUs ended up with crap FPS. If they had used it more globally at a lower level, it would have been more natural and less of a gimmick. The same goes for tessellation, where they had a genuine advantage over AMD but ended up lazily getting devs to massively over-tessellate a few objects, instead of using it at a lower level, more widely, to generate a more global effect. In the latter case AMD could use driver optimisations to drop down the tessellation level without much of a visual difference. They couldn't have done that if it was used at a lower level but more globally.

And the actual IQ per pixel for RT will be much lower than what Nvidia did, for obvious reasons.

Consoles tend to use visual effects in a more efficient way, whereas with PC AMD/Nvidia want to sell their GPUs, so tend to overuse certain effects, which take up too much of the GPU processing power for a small benefit. So you end up having to dial down other effects to compensate. This is why you can have some very pretty games on consoles, despite relatively weak hardware.
 
Actually, we need RT dialled up to silly levels (and the hardware to support it). That level of real-time quality is where it really starts to show how fake and lacking contemporary techniques are. Don't mistake poor developer implementation for a deficiency in the technology.
 
:laugh: RDNA 1 already has 50% better performance per watt than Vega.
Vega, no matter how many improvements it gets, is still an old architecture with many missing features.

Ryzen 5000 APUs have to be RDNA 1st generation; Ryzen 6000 APUs have to be RDNA 2nd generation.


You don't even know what you are talking about. There hasn't been a ban.

It is my free will to stop posting in a censored forum.

Oh crap, you are back and still spouting total rubbish. Had hoped you had gone for good. Your trolling really spoils these forums.
 
Actually we need RT dialled up to silly levels (and the hardware to support it)

Will even an Ampere Titan (or two in SLI) be able to do this?

From what I have seen so far with Turing I think Ampere hardware will come up very short of the mark.

I think RT will really take off with the next gen after Ampere in a couple of years time.
 
Will even an Ampere Titan (or two in SLI) be able to do this?

From what I have seen so far with Turing I think Ampere hardware will come up very short of the mark.

I think RT will really take off with the next gen after Ampere in a couple of years time.

The big problem at the moment is that no developer can really go all out on RT and still support older hardware. Path-tracing implementations have come a long way in the last couple of years or so. When developers can forgo having to build games to run on older hardware and can build games from the ground up using only RT techniques, things will change a lot. Quake 2's path-tracing implementation in its latest incarnation runs well on a 2080 Ti and will run very, very well on newer GPUs.

RTX ON


Nope, it's about the shiny.
See how shiny those pipes are. You can shave your face off those pipes. Boom... Now imagine this game at 1440p 60fps on a 3080 Ti.

There is nothing stopping RTX rendering scenes like the top images. Getting so hung up on 1-2 games that don't implement it well is a bit silly.
 
Are you back from holiday?
Lol.

He is back with his silly statements. AMD should have done this and AMD should have done that. Like they have a magic wand they refuse to use.

While they were at it, AMD should have released a GPU 100% faster than the 2080 Ti with 400% better RT performance. They should have released Arcturus 3 years ago too :p
 
Hey Grim, where are you man? These guys are slaughtering RTX, defend defend!! :p

Papa Jensen won’t be happy with your performance of late, you don’t want to be on his bad side so close to new GPU release do you? :D
 