
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
I'm still holding off on playing Miles Morales until the CPU usage issue is fixed, I refuse to turn off SMT! :p But I've just seen computerbase.de's benchmarks: a 3070 matching a 6950 XT at 4K and just about beating it at 1440p :eek:


rfFHmpA.png

EciIiGk.png

Seems to be a title that favours Nvidia even without RT, although having said that, Nvidia did significantly improve their DX12 performance with that driver update:

6ygoGE9.png

Intel Arc doing well there.
 
I'm still holding off on playing Miles Morales until the CPU usage issue is fixed, I refuse to turn off SMT! :p But I've just seen computerbase.de's benchmarks: a 3070 matching a 6950 XT at 4K and just about beating it at 1440p :eek:
Based on the average RT uplift of 1.6x that AMD has advertised for the 7900 XTX, that would put it on par with the 3090 (at 4K) for this game (around 39 fps), hopefully better. Unfortunately for AMD, Nvidia doubled their RT performance for the 40-series.
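For what it's worth, that projection is just simple scaling. A quick sanity check of the arithmetic (the 1.6x figure is AMD's own marketing number, and the baseline fps below is a hypothetical placeholder, not a measured result):

```python
# Project a frame rate from a claimed generational uplift.
# The baseline here is hypothetical, picked only to illustrate the maths.
def project_fps(baseline_fps: float, uplift: float) -> float:
    """Scale a baseline frame rate by an advertised uplift factor."""
    return baseline_fps * uplift

# A hypothetical ~24 fps RT baseline scaled by the advertised 1.6x:
print(round(project_fps(24.4, 1.6), 1))  # -> 39.0
```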
 

That's cool. I wonder how they managed to make it work. Every other RTX tech demo I've seen either doesn't have mirrors, or the mirrors are disabled when RTX is turned on so they don't show any reflection.

The reason for being cautious with two mirrors is that ray tracing simulates light bounces, so if you have two mirrors facing each other and turn RT on, you would generate an endless series of rays bouncing between the mirrors, which would then cause the game to crash. So somehow they've limited how the light bounces off the surfaces in that demo to prevent the infinite-bounce effect.

My only guess is that they've coded the mirror surface to allow only one bounce to prevent the issue, but that would then reduce the image quality of the reflection itself.
 
I'm still holding off on playing Miles Morales until the CPU usage issue is fixed, I refuse to turn off SMT! :p But I've just seen computerbase.de's benchmarks: a 3070 matching a 6950 XT at 4K and just about beating it at 1440p :eek:


rfFHmpA.png

EciIiGk.png

Seems to be a title that favours Nvidia even without RT, although having said that, Nvidia did significantly improve their DX12 performance with that driver update:

6ygoGE9.png

Intel Arc doing well there.

The 3070's performance in those results is junk too. The so-called "2080 Ti killer for less than half the price", as he sold it to us :rolleyes:, is dead in the water due to its 8GB of VRAM (another lie from Nvidia and another planned-obsolescence card). Look at the 1% lows, they are terrible; it will soon be a 1080p-only card, and soon after, the 3080 10GB will join the 8GB-and-under cards, as Nvidia has made it clear that 1440p and up needs 12GB this time, and really 16GB for 4K and up. Just look at the 3060 12GB, a worse card, and how its 1% lows hold up against its average frame rates in those results; then do the same for the cards with less VRAM than the 2080 Ti's 11GB, so 10GB and down, to see the planned horror coming from Nvidia.
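For anyone wondering what the "1% lows" being thrown around actually measure: one common definition is the average frame rate over the slowest 1% of frames, which is why a card can post a decent average yet stutter badly. A minimal sketch (the frame times are made up for illustration):

```python
def one_percent_low(frame_times_ms):
    """Average fps over the slowest 1% of frames (one common definition)."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 100 smooth frames at ~16.7 ms plus one 50 ms stutter:
times = [16.7] * 100 + [50.0]
print(round(1000.0 / (sum(times) / len(times))))  # average fps -> 59
print(round(one_percent_low(times)))              # 1% low fps -> 20
```

A single big stutter barely moves the average but drags the 1% low right down, which is the behaviour being complained about above.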

Remember how excited everyone was when they said you could now have 2080 Ti performance for less than half the price of a 2080 Ti? Here is the reality now, for a game that plays fine on a standard-model PS4, and on PS5 with all the RT enabled. Even the PS5 has become a 1080p/60fps console, or 4K/30fps with the latest games, proving how underpowered even the PS5 is, and it was sold as an 8K gaming console :cry:. The industry is disgusting now as a whole. Coming soon: PS5 Pro and Xbox Series X Pro... but Digital Foundry said there was no need for them, because they were busy shilling to make you go and buy consoles that were out of date the day they were sold.




The other day in a new video I saw he stated new models may be coming soon... of course new models are coming, because console gamers were lied to as well; the hardware is underpowered for the so-called next-gen games promised for them. :rolleyes:





Just to show you all how bad Digital Foundry can be, with the shilling for the companies that sponsor them and their content.


Latest review from DF of a game that isn't even that amazing to look at, it looks about as good as well-done previous-gen console games to me, and it runs at 4K/30fps or 1080p/60fps... on Xbox Series X and PS5... :cry: You really can't make this stuff up; reality is more comical...


 
That's cool. I wonder how they managed to make it work.
As you suspected, devs can limit the number of times rays can bounce to enable convincing but not infinite mirrors. Offline 3D renderers have been doing this for decades to enable ray-traced reflections and refractions.
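To make that bounce cap concrete, here's a toy sketch of the idea (not how Insomniac's engine actually does it): each recursion level adds a dimmer reflected contribution, and a hard depth limit makes the two-facing-mirrors case terminate instead of recursing forever:

```python
# Toy model of capped reflection recursion between two facing mirrors.
# MAX_BOUNCES and REFLECTIVITY are illustrative values, not engine settings.
MAX_BOUNCES = 4
REFLECTIVITY = 0.8

def trace(depth=0):
    """Brightness seen looking into one of two facing mirrors."""
    base = 1.0                    # light contributed at this surface
    if depth >= MAX_BOUNCES:      # hard cap: stop spawning reflection rays
        return base
    # Each mirror-to-mirror hop increases depth, so recursion bottoms out.
    return base + REFLECTIVITY * trace(depth + 1)

print(round(trace(), 3))  # -> 3.362 (a finite sum instead of infinite recursion)
```

Because each bounce is dimmer than the last, cutting off after a few bounces loses very little visible detail, which is why the trick has worked in offline renderers for so long.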
 
Based on the average RT uplift of 1.6x that AMD has advertised for the 7900 XTX, that would put it on par with the 3090 (at 4K) for this game (around 39 fps), hopefully better. Unfortunately for AMD, Nvidia doubled their RT performance for the 40-series.

Fingers crossed it at least matches a 3090/Ti in RT, because if not, that's a pretty big letdown imo. It's been poor enough having a 3070 match a 6900 XT in previous games with RT, but to now have it matching or besting AMD's absolute flagship isn't a good show, though at the same time I always expected this to happen tbh. I'm very wary of "real world" RDNA 3 RT results, as iirc their RT scenarios weren't maxed-out RT but the setting one down from "max"? And iirc the slides said "up to xxxx amount faster"? RDNA 2 is "ok" with RT when the RT settings are reduced a notch or two, or other settings are sacrificed.

The 3070's performance in those results is junk too. The so-called "2080 Ti killer for less than half the price", as he sold it to us :rolleyes:, is dead in the water due to its 8GB of VRAM (another lie from Nvidia and another planned-obsolescence card). Look at the 1% lows, they are terrible; it will soon be a 1080p-only card, and soon after, the 3080 10GB will join the 8GB-and-under cards, as Nvidia has made it clear that 1440p and up needs 12GB this time, and really 16GB for 4K and up. Just look at the 3060 12GB, a worse card, and its 1% lows in those results.

Remember how excited everyone was when they said you could now have 2080 Ti performance for less than half the price of a 2080 Ti? Here is the reality now, for a game that plays fine on a standard-model PS4, and on PS5 with all the RT enabled. Even the PS5 has become a 1080p/60fps console, or 4K/30fps with the latest games, proving how underpowered even the PS5 is, and it was sold as an 8K gaming console :cry:. The industry is disgusting now as a whole. Coming soon: PS5 Pro and Xbox Series X Pro... but Digital Foundry said there was no need for them, because they were busy shilling to make you go and buy consoles that were out of date the day they were sold.




The other day in a new video I saw he stated new models may be coming soon... of course new models are coming, because console gamers were lied to as well; the hardware is underpowered for the so-called next-gen games promised for them. :rolleyes:




Just to show you all how bad Digital Foundry can be, with the shilling for the companies that sponsor them and their content.

That time of the month again eh.... :cry: :p

That extra 3GB of VRAM is really going to town, giving the 2080 Ti a whole 4 fps over the 3070 8GB :D Doesn't seem like the extra VRAM is really helping that much here tbh

- AMD cards are hit hard regardless of their 16GB VRAM; it can't even help the top-end flagship surpass a "mid-tier" 8GB Nvidia card
- Intel Arc 8GB GPUs are matching Nvidia's 12GB 3060 and the 16GB 6800 XT
- the 2070 8GB is matching the 6800 16GB

It was always expected that consoles would end up going back to 30 fps for full eye candy at 4K, especially when factoring in RT; it's always the way with the console lifecycle. At the end of the day, they are equivalent to a 2070 Super, and how old is that now?
 
Fingers crossed it at least matches a 3090/Ti in RT, because if not, that's a pretty big letdown imo. It's been poor enough having a 3070 match a 6900 XT in previous games with RT, but to now have it matching or besting AMD's absolute flagship isn't a good show, though at the same time I always expected this to happen tbh.



That time of the month again eh.... :cry: :p

That extra 3GB of VRAM is really going to town, giving the 2080 Ti a whole 4 fps over the 3070 8GB :D Doesn't seem like the extra VRAM is really helping that much here tbh

- AMD cards are hit hard regardless of their 16GB VRAM; it can't even help the top-end flagship surpass a "mid-tier" Nvidia card
- Intel Arc 8GB GPUs are matching Nvidia's 12GB 3060 and the 16GB 6800 XT
- the 2070 8GB is matching the 6800 16GB

It was always expected that consoles would end up going back to 30 fps for full eye candy at 4K, especially when factoring in RT; it's always the way with the console lifecycle. At the end of the day, they are equivalent to a 2070 Super, and how old is that now?

With RT on, because as we know Nvidia has better RT. The 6800 is a solid card when not using Nvidia-scaled RT designed to cripple AMD cards. Turn RT off and the 6800 will win clearly... ;)

The rest is all there in the results you posted; the facts are there, showing what Nvidia has done and is doing.

Also updated the post with a new video too.


The clue for people buying GPUs when the new consoles came out was to buy a card with the same amount of memory as the console as a whole: PS5 16GB and Xbox Series X, yes I know they share it with the CPU, but the clues were there, and it's why console devs said they wanted 16GB minimum this time for their games. Watch how quickly the PS5 Pro / Series X Pro get 24GB... and a much better RDNA 2 APU. We saw it before with the PS4 to PS4 Pro: the Pro model was over twice as fast, and they added 1GB more VRAM too so the OS could sit there.
 
With RT on, because as we know Nvidia has better RT. The 6800 XT is a solid card when not using Nvidia-scaled RT designed to cripple AMD cards.
;)


The rest is all there in the results you posted; the facts are there, showing what Nvidia has done and is doing.

Also updated the post with a new video too.


The clue for people buying GPUs when the new consoles came out was to buy a card with the same amount of memory as the console as a whole: PS5 16GB and Xbox Series X, yes I know they share it with the CPU, but the clues were there, and it's why console devs said they wanted 16GB minimum this time for their games. Watch how quickly the PS5 Pro / Series X Pro get 24GB... and a much better RDNA 2 APU. We saw it before with the PS4 to PS4 Pro: the Pro model was over twice as fast, and they added 1GB more VRAM too so the OS could sit there.

RT eats VRAM though. The raster results are a bit odd tbf; look at the Arc 8GB GPU and even the 2060, which holds up well all things considered with 6GB of VRAM...

You forget that RT was present in Spider-Man when it was console-only ;) I don't think the RT in this is implemented via RTX/Nvidia methods, though I could be wrong, although it has been significantly dialled up compared to the console version. The game is also very good at providing scalability in its RT settings, so cards with lesser RT capability can use the appropriate settings; don't forget, even in AMD's sponsored RT games they still see a similar performance loss, so with Miles Morales it's not really a case of "Nvidia-scaled RT". Much like what Uncharted is to AMD, i.e. a partnership/standard sponsorship and that's all, not a "technical" sponsorship like CP2077 or DL2.

What have Nvidia done? Offered 2080 Ti levels of performance for half the price? :D :cry: ;)

Consoles are having to get better hardware largely because they lack the grunt first and foremost; they are already sacrificing not just RT settings but all the other raster/graphical settings, and especially the resolution. The resolution is nowhere near 4K in most console games now, so adding another 8GB of VRAM won't help at all. That's like saying you should buy an 8-core CPU since consoles have 8 cores, when in reality a 6-core like the 5600X will still destroy the console CPU.
 
RT eats VRAM though. The raster results are a bit odd tbf; look at the Arc 8GB GPU and even the 2060, which holds up well all things considered with 6GB of VRAM...

You forget that RT was present in Spider-Man when it was console-only ;) I don't think the RT in this is implemented via RTX/Nvidia methods, though I could be wrong, although it has been significantly dialled up compared to the console version. The game is also very good at providing scalability in its RT settings, so cards with lesser RT capability can use the appropriate settings; don't forget, even in AMD's sponsored RT games they still see a similar performance loss, so with Miles Morales it's not really a case of "Nvidia-scaled RT". Much like what Uncharted is to AMD, i.e. a partnership/standard sponsorship and that's all, not a "technical" sponsorship like CP2077 or DL2.

What have Nvidia done? Offered 2080 Ti levels of performance for half the price? :D :cry: ;)

Consoles are having to get better hardware largely because they lack the grunt first and foremost; they are already sacrificing not just RT settings but all the other raster/graphical settings, and especially the resolution. The resolution is nowhere near 4K in most console games now, so adding another 8GB of VRAM won't help at all. That's like saying you should buy an 8-core CPU since consoles have 8 cores, when in reality a 6-core like the 5600X will still destroy the console CPU.


There is a 20% difference between the 3070 and the 2080 Ti in those results. That's basically a full tier of difference between cards. They have not given you a 2080 Ti for half price; they just want you to believe that through the titles they select for comparison. As we see now, there is a 20% difference, and the 2080 Ti is clearly the winner. ;)


The Spider-Man games only have RT enabled in the PS5 and PC versions; the PS4 versions have no RT. Also, the PC ports of Sony games are nothing like the console versions in code, or even in the engines used; other companies port the PC versions using their own engines.
 
I do think it's kind of wild that we've come so far with what's achievable with rasterization in terms of performance and visual fidelity, and now we're almost going backwards in the rush to embrace real-time ray tracing. Asobo's A Plague Tale: Requiem is visually breathtaking and (currently) features no RT whatsoever. I don't think Asobo are particularly great at optimizing their engine, but the artistry is phenomenal. When we're already capable of creating incredibly beautiful games like Forza Horizon 5 (which can effortlessly render hundreds of fps on AMD or Nvidia hardware), throwing away all of that performance chasing ever more realistic lighting and materials seems like a dreadful waste.

I guess that's just the nature of technological progress and the drive on the part of creatives to create something 'real'. When I used to work in CG visual FX in the '90s we had access to many ray-traced features (shadows, reflections, refraction), but we never used them because they were too damn slow to render. Instead we used shadow and reflection maps and every trick in the book to create realistic-looking scenes without the render cost of doing it 'accurately'. Of course, as time went on, CPUs got more powerful, but our render times never went down, because with more powerful hardware we started to use global illumination, ray-traced area lights, etc.

Funny (and I guess, predictable) how real-time 3D seems to be following the same path :)
 
There is a 20% difference between the 3070 and the 2080 Ti in those results. That's basically a full tier of difference between cards. They have not given you a 2080 Ti for half price; they just want you to believe that through the titles they select for comparison. As we see now, there is a 20% difference, and the 2080 Ti is clearly the winner.
;)



The Spider-Man games only have RT enabled in the PS5 and PC versions; the PS4 versions have no RT.

NS99bXy.png

Yup damn that 3070 at 4k maxed RT!!!! :mad:

:p

It depends largely on the game; most of the time it's quite even, winning some and losing some. At the time of release, I know I certainly wouldn't have been losing sleep over 3GB less VRAM with a saving of £600+. At the end of the day, anyone with either GPU will likely be looking to upgrade next year, at least if you're playing at high resolution and/or high refresh rate.


I do think it's kind of wild that we've come so far with what's achievable with rasterization in terms of performance and visual fidelity, and now we're almost going backwards in the rush to embrace real-time ray tracing. Asobo's A Plague Tale: Requiem is visually breathtaking and (currently) features no RT whatsoever. I don't think Asobo are particularly great at optimizing their engine, but the artistry is phenomenal. When we're already capable of creating incredibly beautiful games like Forza Horizon 5 (which can effortlessly render hundreds of fps on AMD or Nvidia hardware), throwing away all of that performance chasing ever more realistic lighting and materials seems like a dreadful waste.

I guess that's just the nature of technological progress and the drive on the part of creatives to create something 'real'. When I used to work in CG visual FX in the '90s we had access to many ray-traced features (shadows, reflections, refraction), but we never used them because they were too damn slow to render. Instead we used shadow and reflection maps and every trick in the book to create realistic-looking scenes without the render cost of doing it 'accurately'. Of course, as time went on, CPUs got more powerful, but our render times never went down, because with more powerful hardware we started to use global illumination, ray-traced area lights, etc.

Funny (and I guess, predictable) how real-time 3D seems to be following the same path :)

Finished Plague Tale last week and it is definitely a looker without RT, but I think if/when they add RT with a decent implementation, it will take the visuals to the next level. It looked fantastic and almost photorealistic in some scenes, but personally I noticed a lot of the usual raster issues with the lighting and reflections, i.e. light bleeding through objects/walls, reflections distorting or missing, and some areas in general just having a glow about them when they shouldn't.

I wonder how much of people not noticing ray-traced lighting improvements comes down to not using good-quality displays too, as obviously with IPS and TN panels, dark areas with lighting will always have a "glow" about them, so you don't quite get that infinite contrast ratio in dimly lit/dark areas like you would on OLED.

Isn't it meant to be getting RT features soon-ish?

Apparently it was supposed to be a day-one patch, but still no word... Probably because performance will be awful :p I'll be doing another replay once it lands.
 
Isn't it meant to be getting RT features soon-ish?
Supposedly, although their engine's kind of slow (I get around 70-90 fps at 4K DLSS Quality on a Ryzen 5800X3D and 3090 Ti without RT), so I'd hate to see how it performs with RT turned on. Also, the game's rasterized lighting is so carefully crafted that I doubt they'd be able to mess with it much; I'm thinking at most they'd add ray-traced reflections and/or shadows (maybe ambient occlusion). We'll see.
 
Speaking of the RT effects, I'm pretty sure it was mentioned somewhere but I can't find it now; however, rewatching some of the older videos, the colours, fire lighting the surroundings, etc. look considerably better than what I saw in my play-through:

 
Speaking of the RT effects, I'm pretty sure it was mentioned somewhere but I can't find it now; however, rewatching some of the older videos, the colours, fire lighting, etc. look considerably better than what I saw in the current game:
Here's the RTX trailer:

 
Here's the RTX trailer:

The problem with the "RTX" label is that sometimes it refers only to DLSS and not to ray tracing. Obviously not frame-for-frame here:

HtU2rHB.png

My game's colour in this scene looked more like the one from the RTX trailer:

zpNEtu9.png

2 areas that stick out the most to me as being very different and looking like some kind of RT lighting/GI are these scenes/areas:

7S9WMpS.png

tFJHgaD.png

kRhqDQj.png

This scene looks more washed out than what I saw in my game though:

paLGL2M.png

Of course it could also just be a different colour palette affecting the scene rather than ray tracing, but the fire lighting the surrounding areas reminded me of Bright Memory: Infinite with RT on vs. off:

MzzLEa2.jpg

sO8W62N.jpg
 