
The RT Related Games, Benchmarks, Software, Etc Thread.

Caporegime
Joined
17 Mar 2012
Posts
47,671
Location
ARC-L1, Stanton System
RT has existed for decades before it was called RTX.

Having said that, Nvidia were the ones to recognise its potential for real-time rendering in games, and they put a lot of hard work into making that a reality; a lot of good and complex work.

Gsync HDR (aka Gsync Ultimate) came out around 2018, and back then a module was required. AMD didn't have a solution for 144Hz HDR with a guaranteed 1000 nits; Nvidia did, hence why a module was necessary and why the whole Ultimate certification exists: to offer that guarantee. Even today, none of AMD's FreeSync tiers guarantees a maximum luminance for HDR.

Obviously things have changed on modern displays and up to 1000 nits is possible without a module. You're talking about the early days, so that's what it was like during those early days of VRR once you factor in HDR performance. At the baseline, ALL of these VRR displays follow the same adaptive sync standard. It's the additional features that were once only possible with the Gsync module.

Yes, the module is basically defunct now, since monitors' onboard scalers have been able to do what the Gsync module does for some time. But let's not gloss over the actual specs and timeframes in question.

Are you saying that because Nvidia couldn't do HDR / 1000 Nits+ without a G-Sync module they needed AMD to step in?
 
Associate
Joined
19 Mar 2024
Posts
49
Location
Cybernet
Is it just me or does DLSS look way worse than native?

Native 1440p vs DLSSQ.

70 FPS is the native figure - I took the screen cap before his MSI AB overlay could update to the 45 FPS it was now running at.

This is a reason I play native with no RT. I still want my natural vision.

Even worse, the YouTuber says they can't see a difference; either a Kool-Aid drinker or someone who needs a checkup.


[Screenshots: Snapshot-2024-03-25-01-40-56.png, Snapshot-2024-03-25-01-41-50.png]
 
Associate
Joined
19 Mar 2024
Posts
49
Location
Cybernet
RE2's RT implementation is still an excellent example of cleaning up bad visuals whilst maintaining great visual clarity.





I am also unsure of the magic behind the original FSR implementation in RE2, but it absolutely destroys the DLSS implementation above. Yes, it's apples to oranges in terms of games, but it's the same RE Engine.

Example of dog doo-doo reflections, beyond the norm even for SSR.

[Screenshot: TywxKtR.jpeg]
 
Caporegime
Joined
4 Jun 2009
Posts
31,058
The story was superficial tbh and suffers greatly from the same issue as other open world RPGs: you should be driven toward a quick resolution by an impending doom (personal or world-ending), but instead you're doing relatively meaningless side quests without any repercussions :)

As for a city, I think Novigrad was pretty good, too, and in general I liked all the communities from TW3.

Still though, what does it better? The only games I would argue do are RDR 2 and GTA 5, but their game worlds are empty and don't have as much going on, not to mention they're just as scripted, if not more so.

Can't give AMD credit for anything; if you do, then the "Nvidia invented everything and AMD make poor copies" narrative can't be maintained.

Seriously, like you I have watched this BS for more than a decade, and it's incredibly childish. Remember the reaction to Mantle? Even some tech journos' reaction to it? Suddenly Microsoft was the gamers' darling and DX11 was the most perfect thing ever created.... five minutes before, everyone agreed DX11 was crap!

Because Nvidia didn't do it.

Mantle was great, and we wouldn't have Vulkan without it, but in typical AMD fashion they didn't want to be stuck supporting it or pushing to get it into games (like all their tech), hence why they handed over the reins. I was a big fan of Mantle because it worked wonders in BF4 and Hardline on my ageing CPU with a powerful GPU, but alas, AMD couldn't or didn't want to get it into more games.

The problem with FreeSync is that it's a better solution than what Nvidia had, so they can't stand that AMD had anything to do with it.

First they rubbished it, then they tried to prove it was rubbish with apples-to-oranges data because simply saying it was rubbish didn't work, and now it's just AMD claiming something that had already existed all along.

Bloody hell...
FreeSync does the same thing G-Sync does, only AMD recognised you could do it without a specialised hunk of £150 hardware; all you needed was a handshake between the display's V-Blank scaler and the GPU, and this is the bit Nvidia thought you needed the hardware for.

That scaler just needed to take instructions from the GPU as to when to display an image. A variable V-Blank scaler already existed; all it needed was a modification to accept software instructions. AMD created that software and did the R&D on the V-Blank scaler modification, and manufacturers now follow those instructions pretty much by default for monitors and, increasingly, TVs.
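To make that handshake idea concrete, here's a rough toy sketch in Python: the GPU reports when a frame is ready and the scaler stretches or shortens the vertical blanking interval within the panel's VRR window. All the names (VBlankScaler, min_hz, max_hz) are made up for illustration; real scaler firmware looks nothing like this.

```python
# Toy model of adaptive sync, for illustration only: the GPU tells the
# scaler when a frame is ready, and the scaler stretches or shortens the
# vertical blanking interval so the refresh tracks the frame rate within
# the panel's VRR window. Names (VBlankScaler, min_hz, max_hz) are made up.

class VBlankScaler:
    def __init__(self, min_hz: float, max_hz: float):
        self.min_interval = 1.0 / max_hz   # shortest gap between refreshes
        self.max_interval = 1.0 / min_hz   # longest the panel can hold v-blank

    def next_refresh(self, frame_interval: float) -> float:
        """Given how long the GPU took to deliver a frame, return the
        interval the scaler actually waits before scanning it out."""
        if frame_interval < self.min_interval:
            # Frame arrived faster than the panel's max refresh allows.
            return self.min_interval
        if frame_interval > self.max_interval:
            # Frame is late: the panel can't hold v-blank forever, so it
            # refreshes again with the previous image at its minimum rate.
            return self.max_interval
        # Inside the VRR window: refresh exactly when the frame is ready.
        return frame_interval


if __name__ == "__main__":
    scaler = VBlankScaler(min_hz=48, max_hz=144)
    for fps in (30, 60, 100, 200):
        interval = scaler.next_refresh(1.0 / fps)
        print(f"GPU at {fps:>3} fps -> panel refreshes every "
              f"{interval * 1000:.1f} ms ({1 / interval:.0f} Hz)")
```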

So you're now a more qualified monitor tech expert than two of the best reviewers in the market? :cry: Post evidence to go along with your ridiculous statements, as no one takes such posts seriously anymore.

Gsync HDR (aka Gsync Ultimate) came out around 2018, and back then a module was required. AMD didn't have a solution for 144Hz HDR with a guaranteed 1000 nits; Nvidia did, hence why a module was necessary and why the whole Ultimate certification exists: to offer that guarantee. Even today, none of AMD's FreeSync tiers guarantees a maximum luminance for HDR.

Obviously things have changed on modern displays and up to 1000 nits is possible without a module. You're talking about the early days, so that's what it was like during those early days of VRR once you factor in HDR performance. At the baseline, ALL of these VRR displays follow the same adaptive sync standard. It's the additional features that were once only possible with the Gsync module.

Yes, the module is basically defunct now, since monitors' onboard scalers have been able to do what the Gsync module does for some time. But let's not gloss over the actual specs and timeframes in question.

It wasn't even just HDR and refresh rate related; Nvidia's G-Sync module had various advantages over FreeSync. The first iteration of FreeSync was basically pointless for a long time because monitor manufacturers were skimping on scalers and so on: there was no low frame rate compensation (the majority of monitors had terrible ranges like 48-75Hz), there was, and still is on the cheaper models, no variable pixel overdrive to match your fluctuating fps, and don't forget all the flickering and black screen issues. AMD somewhat resolved some of these with the FreeSync Premium certification and, as per the Blur Busters chief, the G-Sync module is still held as the gold standard for a VRR experience. But yes, after years FreeSync eventually caught up to be in the same league as the G-Sync module, and with OLED the advantages it had/has matter less, e.g. variable pixel overdrive, since OLED pixel response is near instant.
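Low frame rate compensation, mentioned above, is conceptually simple: when the game's frame rate drops below the panel's minimum VRR rate, each frame is repeated so the refresh lands back inside the supported window, which is exactly what those narrow 48-75Hz ranges often couldn't manage. A rough sketch, with invented function names, of how a multiplier could be picked:

```python
def lfc_multiplier(fps: float, vrr_min: float, vrr_max: float):
    """Pick the smallest frame multiplier that lands fps * m inside the
    VRR window, or None if nothing fits. Toy logic for illustration only;
    frame rates above the window are simply capped at the max refresh."""
    if fps > vrr_max:
        return 1                      # panel just runs at its maximum rate
    if vrr_min <= fps <= vrr_max:
        return 1                      # already inside the window
    m = 2
    while fps * m <= vrr_max:
        if fps * m >= vrr_min:
            return m                  # repeating each frame m times works
        m += 1
    return None                       # window too narrow for this frame rate


if __name__ == "__main__":
    # A narrow early FreeSync window (48-75 Hz) versus a wider one (30-144 Hz).
    for vrr_min, vrr_max in ((48, 75), (30, 144)):
        for fps in (24, 35, 40):
            m = lfc_multiplier(fps, vrr_min, vrr_max)
            if m is None:
                print(f"{vrr_min}-{vrr_max} Hz panel, {fps} fps -> no multiplier fits (judder)")
            else:
                print(f"{vrr_min}-{vrr_max} Hz panel, {fps} fps -> show each frame "
                      f"{m}x, refresh at {fps * m:.0f} Hz")
```

Note how 40 fps on the 48-75Hz panel finds no workable multiplier, which is why those early narrow-range monitors juddered exactly where VRR was needed most.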







To get back on topic.....

RT has existed for decades before it was called RTX.

You're wrong on this too....... Stop falling for Nvidia marketing. RTX is not RT..... RTX is Nvidia's feature set. There are tools which fall under the RTX umbrella to help add RT.
 
Man of Honour
Joined
18 Oct 2002
Posts
100,367
Location
South Coast
Is it just me or does DLSS look way worse than native?

Native 1440p vs DLSSQ.

70 FPS is the native figure - I took the screen cap before his MSI AB overlay could update to the 45 FPS it was now running at.

This is a reason I play native with no RT. I still want my natural vision.

Even worse, the YouTuber says they can't see a difference; either a Kool-Aid drinker or someone who needs a checkup.
1: It's a YouTube video; who knows what quality and compression he's used.
2: Dragon's Dogma 2 is well known to be broken. Do you think such a badly optimised game, one that runs poorly for everyone and has documented issues, implements an upscaler even half properly?
2b: Res Evil 2 also has poor rendering; the black levels are crushed out of the box compared to the other Res Evil remakes. You've also posted two videos from two different people, one using a REC 709 colour profile that makes the game too dark even on OLED, whilst the second video uses the stock colour which is, as mentioned, black crushed. Why so much inconsistency?
3: Why didn't you instead post a comparison from an actual new game that demonstrates good use of DLSS/FSR etc. and doesn't have any technical issues whatsoever, like, oh I don't know, Horizon Forbidden West? No RT in that either, so you can go to town with your "that's why I don't use RT" routine :D

Here:

These are direct PNG screenshots from Steam, so no compression; you can see it clearly after zooming into an area that hasn't changed due to natural movement of the scene (look at the wall structure at the back, or the ground).
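(If anyone wants to repeat that kind of check themselves, here's a minimal sketch using Pillow; the file names and crop box below are placeholders, not the actual screenshots from this post.)

```python
# Minimal sketch: crop the same region from two lossless screenshots and
# measure how much they differ. Requires Pillow; the paths and crop box
# below are placeholders, not actual files from the post.
from PIL import Image, ImageChops, ImageStat

def crop_difference(path_a: str, path_b: str, box: tuple) -> float:
    """Mean absolute per-channel pixel difference over the cropped region.
    0.0 means the crops are identical; larger values mean more divergence."""
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    if a.size != b.size:
        raise ValueError("screenshots must share the same resolution")
    diff = ImageChops.difference(a.crop(box), b.crop(box))
    return sum(ImageStat.Stat(diff).mean) / 3.0


if __name__ == "__main__":
    # Placeholder crop around a static area (e.g. the wall at the back).
    box = (800, 400, 1200, 800)  # left, upper, right, lower in pixels
    score = crop_difference("native_1440p.png", "dlss_quality.png", box)
    print(f"mean per-pixel difference in crop: {score:.2f} / 255")
```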

There are many other games too, but this is an RT thread, so no need to correct you any further with non-RT chat.
 
Man of Honour
Joined
18 Oct 2002
Posts
100,367
Location
South Coast
Back onto RT: Microsoft is not only coming in hot with DirectSR, which aims to join all the upscalers into one single API, they now also plan to improve ray tracing performance and VRAM use, which Tom's Hardware thinks will save 8GB cards (lol).

The 8GB card will never die, will it?


Good idea though. Having an LoD for RT means all cards currently unable to hit 60fps with RT enabled should be able to do just that, whilst the better cards get a nice boost, which means less reliance on frame generation, which in turn means lower input latency, especially at 4K, which taxes even a 4090 when enabling path tracing.
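To illustrate what an LoD scheme for RT could look like in principle, here's a rough sketch that steps the RT workload (rays per pixel, bounces, trace distance) up or down against a frame-time budget. The tiers, numbers and names are invented for illustration; this isn't how DXR or Microsoft's actual proposal exposes it.

```python
# Rough sketch of LoD for ray tracing: step the RT workload (rays per
# pixel, bounce count, trace distance) up or down to hold a frame-time
# budget. Tier values and names are invented for illustration.
from dataclasses import dataclass

@dataclass
class RtTier:
    name: str
    rays_per_pixel: int
    max_bounces: int
    max_distance_m: float

TIERS = [
    RtTier("low",    1, 1,  30.0),
    RtTier("medium", 1, 2,  60.0),
    RtTier("high",   2, 2, 120.0),
    RtTier("ultra",  2, 3, 250.0),
]

def adjust_tier(current: int, frame_ms: float, target_ms: float = 16.7) -> int:
    """Step one tier down when over budget, one tier up when there is
    comfortable headroom; the dead zone in between avoids oscillation."""
    if frame_ms > target_ms * 1.05 and current > 0:
        return current - 1
    if frame_ms < target_ms * 0.80 and current < len(TIERS) - 1:
        return current + 1
    return current


if __name__ == "__main__":
    tier = 3  # start at "ultra" and let the budget pull it down
    for frame_ms in (22.0, 19.5, 17.0, 15.0, 12.0, 12.5):
        tier = adjust_tier(tier, frame_ms)
        t = TIERS[tier]
        print(f"{frame_ms:>5.1f} ms -> {t.name}: {t.rays_per_pixel} rpp, "
              f"{t.max_bounces} bounces, {t.max_distance_m:.0f} m")
```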
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,599
Location
Greater London
Back onto RT: Microsoft is not only coming in hot with DirectSR, which aims to join all the upscalers into one single API, they now also plan to improve ray tracing performance and VRAM use, which Tom's Hardware thinks will save 8GB cards (lol).

The 8GB card will never die, will it?

See. 10GB is enough!

Nexus might decide to keep his for another 2-3 years after seeing the 5000 series price :p
 