
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
LOTR Gollum gets 48fps @ 4k on the 4090 :D

[Image: The Lord of the Rings: Gollum, GeForce RTX 3840x2160 NVIDIA DLSS desktop GPU performance chart]




Can't wait! :cry:
Notice how they didn't even add 8GB cards to that list. This is the way things are going with Nvidia now: selling you fake frames that are simply unreal compared to native, and selling it as 4K when it is nothing more than an upscaled lower resolution with fake frames added. A pure scam, really. This I can never fall for or accept as the future of GPUs on PC. Disgusting.

Also, Cyberpunk is nothing more than a tech demo and marketing tool for Nvidia's DLSS tech, and I don't even look at benchmarks of new Nvidia hardware in it, because it has been optimised to show the best-case scenario for their DLSS tech and fake frames. Wake me up when GPUs are sold on their native performance and compared apples to apples again; until then I have zero interest in benchmarks featuring this tech, as it's nothing more than a new cheat for GPU makers that in the past they would have been panned for.
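For context on the "upscaled lower resolution" point: the DLSS quality modes render internally at a fraction of the output resolution before upscaling. A rough sketch of the arithmetic, using the commonly cited per-axis scale factors (treat them as approximate, not official API values):

```python
# Approximate internal render resolutions for common DLSS modes.
# Scale factors are the widely reported per-axis ratios.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: renders {w}x{h}, outputs 3840x2160")
```

So a "4K DLSS Performance" number is produced from a 1920x1080 internal render, which is the complaint above in a nutshell.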
 
According to Nvidia's stats, about 60% of RTX GPU owners turn RT on

So they only got one positive stat out of all of them, a solitary positive stat then?

And yet the majority of gamers are still using <8GB GPUs, so much for all that clamour about wanting/needing more VRAM; people are voting with their wallets :cry:
Yes, they are voting with their wallets and playing within the spec of their hardware. One thing's for sure: they aren't buying/playing RTX titles. :cry:


The RTX titles are coming in thick and fast, with CP at no. 63

Far Cry is at 61 ;) ;) ;)

Hogwarts is in at no. 72
 
Notice how they didn't even add 8GB cards to that list. This is the way things are going with Nvidia now: selling you fake frames that are simply unreal compared to native, and selling it as 4K when it is nothing more than an upscaled lower resolution with fake frames added. A pure scam, really. This I can never fall for or accept as the future of GPUs on PC. Disgusting.

Also, Cyberpunk is nothing more than a tech demo and marketing tool for Nvidia's DLSS tech, and I don't even look at benchmarks of new Nvidia hardware in it, because it has been optimised to show the best-case scenario for their DLSS tech and fake frames. Wake me up when GPUs are sold on their native performance and compared apples to apples again; until then I have zero interest in benchmarks featuring this tech, as it's nothing more than a new cheat for GPU makers that in the past they would have been panned for.

It may be, but it is also a showcase for path tracing and what to expect from future games in terms of visuals.

On Gollum: given how awful it looks, especially the textures, if the game demands more than 10GB for 1440p then it will be another **** optimised title, which will no doubt magically get patched months down the line to run better.

According to Nvidia's stats, about 60% of RTX GPU owners turn RT on

So they only got one positive stat out of all of them, a solitary positive stat then?


Yes, they are voting with their wallets; they aren't buying/playing RTX titles. :cry:


The RTX titles are coming in thick and fast, with CP at no. 63

Far Cry is at 61 ;) ;) ;)

Hogwarts is in at no. 72

We covered this in another thread already.

That is "current" players; I would kind of expect live service/MP games to dominate over triple-A single player games....

FC6 was free recently and at its lowest price too, hence why it got a peak; that's generally how these things work. What do you think will happen when CP2077 gets its new DLC and/or a price drop too?

Since when do old games dictate what current and future games are using?

Don't know why you've got such a hard-on for living in the past with old/ancient methods; it's not like you've got a crap GPU.
 
It may be, but it is also a showcase for path tracing and what to expect from future games in terms of visuals.

Oh, on the graphics tech side I love the whole PT and RT push and ways to make CG more real and more beautiful. I have always liked this tech, and it's nothing new; we did all this back in the day, and it's how movies have always been made. The only real difference is that none of it was real-time, and it needed render farms even to render a single frame.

The problem I have with Nvidia right now is the tricks they are using to overprice their hardware. They are nothing more than software tricks, a way to use hardware designed for their business segment while justifying a price increase to gamers with performance/imagery that is not native or even real frames. I understand why all these upscaling and frame generation techs are useful, but the way they are selling them is just wrong to me; they are using them to fake benchmarks and hide real apples-to-apples hardware increases.

You know what I mean there, Nexus, I'm sure. It's not about how good or bad the tech is; it's about the way they have weaponised it to justify, for example, doubling the price of 80-class cards and selling lower tiers of hardware as so-called higher tiers.

Selling less hardware for more is not how it's meant to be in my book; hardware is meant to get cheaper and better over time, not what is currently happening at many tiers sold by Nvidia and AMD. Sadly, I think they really fear the coming issues with silicon manufacturing, so they want to drip-feed us now and use any tricks to do it. We have had silicon manufacturing issues in the past too, but they still gave us good performance increases each generation, for example when we were stuck on 28nm for a very long time.
 
On Gollum: given how awful it looks, especially the textures, if the game demands more than 10GB for 1440p then it will be another **** optimised title, which will no doubt magically get patched months down the line to run better.
I'll always come back to Cyberpunk on this: that game uses less than 5GB of RAM and 10GB of VRAM at max path tracing, 3440x1440, Ultra quality settings and DLSS Quality. There's no reason why any other game should be using more than 10GB of VRAM in that context, since 2077 is the only game using RT so heavily.
 
Oh, on the graphics tech side I love the whole PT and RT push and ways to make CG more real and more beautiful. I have always liked this tech, and it's nothing new; we did all this back in the day, and it's how movies have always been made. The only real difference is that none of it was real-time, and it needed render farms even to render a single frame.

The problem I have with Nvidia right now is the tricks they are using to overprice their hardware. They are nothing more than software tricks, a way to use hardware designed for their business segment while justifying a price increase to gamers with performance/imagery that is not native or even real frames. I understand why all these upscaling and frame generation techs are useful, but the way they are selling them is just wrong to me; they are using them to fake benchmarks and hide real apples-to-apples hardware increases.

You know what I mean there, Nexus, I'm sure. It's not about how good or bad the tech is; it's about the way they have weaponised it to justify, for example, doubling the price of 80-class cards and selling lower tiers of hardware as so-called higher tiers.

Selling less hardware for more is not how it's meant to be in my book; hardware is meant to get cheaper and better over time, not what is currently happening at many tiers sold by Nvidia and AMD. Sadly, I think they really fear the coming issues with silicon manufacturing, so they want to drip-feed us now and use any tricks to do it. We have had silicon manufacturing issues in the past too, but they still gave us good performance increases each generation, for example when we were stuck on 28nm for a very long time.

Like I always keep saying, these tech demos serve a big purpose for technical advancement and plotting future roadmaps. Yes, it's marketing too; what in this world isn't done for marketing/social media reasons?

I don't disagree about hardware advancement, but at the same time the hardware isn't there to give us the visuals on show with ray tracing, thus we "need" upscaling tech and frame generation to achieve this vision "now" rather than having to wait several years until we can actually use it.

My problem with upscaling and frame generation is that devs are now using them as a reason not to optimise their games. Heck, wasn't it Ubisoft or someone who, when answering a community question about Denuvo impacting performance, came back with "DLSS should overcome these performance issues"? :rolleyes: That is the biggest problem.

I'll always come back to Cyberpunk on this: that game uses less than 5GB of RAM and 10GB of VRAM at max path tracing, 3440x1440, Ultra quality settings and DLSS Quality. There's no reason why any other game should be using more than 10GB of VRAM in that context, since 2077 is the only game using RT so heavily.

Definitely!

People will say the textures are crap, but personally I think they're pretty damn good, especially the character models; some of the most detailed skin/NPCs I have seen, especially for an open-world game.
 
We covered this in another thread already.

That is "current" players; I would kind of expect live service/MP games to dominate over triple-A single player games....

FC6 was free recently and at its lowest price too, hence why it got a peak; that's generally how these things work. What do you think will happen when CP2077 gets its new DLC and/or a price drop too?

Since when do old games dictate what current and future games are using?

Don't know why you've got such a hard-on for living in the past with old/ancient methods; it's not like you've got a crap GPU.
That's why I mentioned FC6, with 3 winks; we know you like repeating yourself.

That singular stat of 60% means nothing; as you explained, there are 'reasons' for that stat.

I'm all for advancement, but it won't advance much if Nvidia wall off the top tier, with embarrassing performance drop-offs and higher pricing at every price point under the 4090, will it?

First time in 10 years I've gone AMD; that's down to higher-priced, lower-performing, VRAM-limited GPUs at launch :cry: They can ram it; more software features that are needed to make your new GPU look better do not justify that kind of price increase.

Turn it on for 30 seconds, see the massive fps drop, then switch it back off, more like. But it still counts!!! :p ;)

Exactly.
 
That's why I mentioned FC6, with 3 winks; we know you like repeating yourself.

That singular stat of 60% means nothing; as you explained, there are 'reasons' for that stat.

I'm all for advancement, but it won't advance much if Nvidia wall off the top tier, with embarrassing performance drop-offs and higher pricing at every price point under the 4090, will it?

First time in 10 years I've gone AMD; that's down to higher-priced, lower-performing, VRAM-limited GPUs at launch :cry: They can ram it; more software features that are needed to make your new GPU look better do not justify that kind of price increase.



Exactly.

Well, tbf, given the most dominant resolution is still 1080p, even top-end RDNA 2 would have no issue with RT at that res. :cry:

I agree: hardware needs to be better, but so does the optimisation/implementation of RT, i.e. like Metro EE, but that isn't going to happen until devs dump raster, which means they need to dump old-gen consoles first and foremost.
 
Well, tbf, given the most dominant resolution is still 1080p, even top-end RDNA 2 would have no issue with RT at that res. :cry:
That's as true as the 4080 and 90 being the only 40-series cards bulletproof on VRAM for >1080p. :cry:

I agree: hardware needs to be better, but so does the optimisation/implementation of RT, i.e. like Metro EE, but that isn't going to happen until devs dump raster, which means they need to dump old-gen consoles first and foremost.

It ALL needs to improve on the hardware side, GPUs and VRAM; they've even ****ed with the memory bus to ensure performance drop-off. They are imposing performance penalties now to keep GPUs within spec, FFS. :cry:

Until then the 90 series will be the only GPU that can drive the new effects decently, as shown by the 4060 Ti, which can't even run DLSS 3 well.
 
Notice how they didn't even add 8GB cards to that list. This is the way things are going with Nvidia now: selling you fake frames that are simply unreal compared to native, and selling it as 4K when it is nothing more than an upscaled lower resolution with fake frames added. A pure scam, really. This I can never fall for or accept as the future of GPUs on PC. Disgusting.

Also, Cyberpunk is nothing more than a tech demo and marketing tool for Nvidia's DLSS tech, and I don't even look at benchmarks of new Nvidia hardware in it, because it has been optimised to show the best-case scenario for their DLSS tech and fake frames. Wake me up when GPUs are sold on their native performance and compared apples to apples again; until then I have zero interest in benchmarks featuring this tech, as it's nothing more than a new cheat for GPU makers that in the past they would have been panned for.
Yes, but the old-style IQ cheating was so, well, old-style.

In 2023 the IQ cheating is being done with AI!

AI is magic, gets better results than native and all that.

AI hype is very much Nvidia's thing nowadays; didn't the latest round of AI hype see their market cap get close to a trillion?
 
Yes, but the old-style IQ cheating was so, well, old-style.

In 2023 the IQ cheating is being done with AI!

AI is magic, gets better results than native and all that.

AI hype is very much Nvidia's thing nowadays; didn't the latest round of AI hype see their market cap get close to a trillion?

AI is the future; there's a reason companies are investing millions and billions into it, and it's certainly showing benefits in many areas, hence why AMD are doing a U-turn on their stance now too.
 
For those who say they are waiting for GPUs that don't have any AI stuff: you may as well invent a time machine, because there is no going back now. You've got one foot in the grave already, like the people who wanted a horse instead of a car.
 
Lol @ fake frames. Bro, there's no perceptible difference between FG and real frames on my 4090. It's always the ones who don't have access to the tech dissing it.

You can keep spamming this over and over; it won't make it true. But I guess it might make you feel better… idk

And the latency? I'm on a goddamn instant-pixel-response 165Hz QD-OLED, and if I'm not feeling it, you won't feel it on a ****** LCD. Unless there's something else whacky going on with your VRR/vsync settings.

AI is the future. So are the AI upscaling and ‘fake frames’.
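On the latency point being argued back and forth: frame generation inserts an interpolated frame between rendered ones, so the presented fps roughly doubles while input latency still tracks the base render rate (plus some interpolation overhead). A back-of-the-envelope sketch, where the overhead figure is an illustrative assumption, not a measured value:

```python
# Rough frame-generation arithmetic: presented fps doubles, but input
# latency follows the base (rendered) frame rate, not the presented one.

def frame_time_ms(fps):
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def with_frame_gen(base_fps, interp_delay_ms=3.0):
    """interp_delay_ms is an illustrative assumption, not a measured value."""
    presented_fps = base_fps * 2  # one generated frame per rendered frame
    latency_ms = frame_time_ms(base_fps) + interp_delay_ms
    return presented_fps, latency_ms

presented, latency = with_frame_gen(60)
print(f"base 60 fps -> presented {presented} fps, ~{latency:.1f} ms per-frame latency")
print(f"native 120 fps would sit nearer {frame_time_ms(120):.1f} ms")
```

Which is why some feel frame generation from a low base framerate and others don't notice it at all from a high one: the generated frames smooth motion, but responsiveness stays at the base rate.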
 