Ratchet & Clank: Rift Apart to use DirectStorage 1.2 with GPU decompression.

The interesting part is the developers' comments: they specifically say the asset streaming system is dynamic, and that the game engine adjusts assets to the bandwidth available on your PC.

That means the quality of the textures, and how fast they load, could vary on every PC. I can't say how much they will differ, but this is possibly the first game to do this instead of preloading behind loading screens.

So, potentially: an older PC with a GPU that isn't optimized for GPU decompression, perhaps paired with an older SSD and CPU, may find that when jumping through the game's rifts, the texture quality on the other side is lower than on a PC with the newest hardware, even when both PCs use the same graphics settings. This happens because the developers are focused on fast asset streaming. They don't want to add loading screens, so either your PC loads the data into GPU memory fast enough, or you are presented with lower-quality textures until the data has loaded.
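To make that trade-off concrete, here is a minimal sketch (in Python, purely for illustration; every name and threshold here is my invention, not Insomniac's or DirectStorage's actual code) of how a streamer might drop mip levels when the measured bandwidth can't deliver a full-resolution texture before the frame deadline:

```python
def select_mip_bias(bandwidth_mb_s: float, budget_ms: float,
                    asset_size_mb: float, max_bias: int = 4) -> int:
    """Return how many mip levels to drop (0 = full quality).

    Each mip level dropped quarters the data (half the resolution per
    axis), so a slow SSD/decompression path trades texture quality for
    meeting the streaming deadline instead of showing a loading screen.
    """
    bias = 0
    size_mb = asset_size_mb
    while (size_mb / bandwidth_mb_s) * 1000.0 > budget_ms and bias < max_bias:
        size_mb /= 4.0
        bias += 1
    return bias

# A fast NVMe + GPU-decompression path streams a 64 MB texture in time:
print(select_mip_bias(5000.0, 16.0, 64.0))  # -> 0 (full quality)
# A slower SATA-class path has to drop two mip levels to hit the deadline:
print(select_mip_bias(500.0, 16.0, 64.0))   # -> 2 (lower quality after a rift jump)
```

Under this (hypothetical) scheme, the same rift jump simply resolves to a different mip bias on different hardware, which matches what the developers describe.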
 
It is all very interesting indeed.
I've finished the game at nearly 100% on PS5, but I'll definitely be picking this up for PC to experience the improved graphics and higher framerates, as the game is great.
 
Some people on this forum claim DLSS and XeSS can't be implemented by devs without help; they say it's just too complicated. I guess Nvidia and Intel paid heaps of money to Sony and didn't even get a sponsorship deal out of it.
 
Those people are clueless or intentionally obtuse.
 
Do they?
In that thread, Nixxes (Sony's porting studio) were the ones who tweeted about their abstraction wrapper. And I'm sure those of us who thought about this a bit pointed out that just because a studio which only does ports was able to abstract the differences does not mean everyone can or will.

I'll say it again:
FSR and XeSS work on all vendors.
DLSS only works on SOME Nvidia cards.
Most games are developed for consoles.
FSR works on consoles.
Ergo: if Nvidia want their tech in all games, they have to create a truly neutral wrapper. Streamline is not truly neutral, so they have to try again.
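For what it's worth, the kind of abstraction wrapper Nixxes described could look something like this (a minimal Python sketch; the interface and support rules are my invention, simplified from reality, and not Nixxes' code or any vendor SDK). The game only talks to one interface and asks which upscalers the current hardware supports:

```python
class Upscaler:
    """Common interface: the game code only ever talks to this."""
    name = "base"

    def supported(self, vendor: str) -> bool:
        raise NotImplementedError

class FSR(Upscaler):
    name = "FSR"
    def supported(self, vendor: str) -> bool:
        return True                   # vendor-agnostic, consoles included

class XeSS(Upscaler):
    name = "XeSS"
    def supported(self, vendor: str) -> bool:
        return True                   # has a fallback path on other vendors

class DLSS(Upscaler):
    name = "DLSS"
    def supported(self, vendor: str) -> bool:
        return vendor == "nvidia"     # RTX-only in reality; simplified here

def available_upscalers(vendor: str) -> list[str]:
    """List the upscaler names the current GPU vendor can use."""
    return [u.name for u in (DLSS(), XeSS(), FSR()) if u.supported(vendor)]

print(available_upscalers("amd"))     # -> ['XeSS', 'FSR']
print(available_upscalers("nvidia"))  # -> ['DLSS', 'XeSS', 'FSR']
```

The point of the argument above is that this wrapper layer is extra work: a port-focused studio may build it, but a console-first studio with no such layer just ships FSR and moves on.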
 
Lmao the amd cope trying to encourage developers to use FSR because coping reasons
 
What nonsense.

FSR will be used in most games because console game sales dwarf PC game sales!

It really is that simple; I am surprised some people cannot understand that.

EDIT: to add to that:
No matter whether a PC gamer paid £100 for a low-end card or £2,000 for a high-end one,
and no matter what the rest of their rig cost,
games will be developed for consoles first.
Some porting studios (and Nixxes seem really good) will bother; most will do the minimum.

I'll say it again: if a proprietary upscaler wants to get adopted, there has to be an open-source, 100% neutral wrapper.
Which upscaler is technically better hardly matters. And yes, DLSS currently still beats FSR, but until very recently G-Sync was usually far better than FreeSync, and guess which won?
 
G-Sync? It's still better than FreeSync, and in some cases cheaper (the AW3423DW was cheaper than the DWF), while G-Sync-branded monitors have standards they need to meet before receiving the 'good to go'.

FreeSync monitors are a gamble: some good, some bad, some awful.
 
FSR will be used in most games because console game sales dwarf PC game sales! It really is that simple. Which upscaler is technically better hardly matters. And yes, DLSS currently still beats FSR, but until very recently G-Sync was usually far better than FreeSync, and guess which won?

There you go again

On a side note, the G-Sync comment is interesting, given that the best monitors and TVs on the market are still all G-Sync. Why does someone need to win anyway? Support all the technologies and let consumers buy what they want. We all know DLSS and G-Sync are superior; if someone wants to pay extra for that, then so be it. You don't need to be jealous and claim the inferior-quality alternative is going to "win"; no brand needs to "win". The only ones who need to win are consumers, and consumers win when they have choices. If there is a future where all games use FSR and nothing else, that's an anti-competitive, anti-consumer future where the consumer has no choices, and you should feel bad for wanting that.
 
TVs? Are there any TVs with the Nvidia chip, or are they just using VESA's Adaptive-Sync and have bothered to get G-Sync certified? That is, they are certified because they are premium TVs that they knew would meet the cert, not because they are better for using Nvidia's chip.

"Win" is which tech gets the broader adoption.

I have already (here and the other thread) said what Nvidia should do: get a truly neutral wrapper.

Or even better: DirectX/Vulkan etc. should make upscaling part of the spec, with everyone using that API and the implementation left to the vendors. Then Nvidia (and Intel) could boast about how much better their upscaler is, and we would no longer have the PCMR types going on about how elite they are for paying to win with their £3,000+ rig, only for their upscaler to go unsupported.
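The spec-level idea above can be sketched in a few lines (Python, purely illustrative; no such DirectX/Vulkan entry point exists today, and all names here are invented): the game calls one standard function, and whichever driver is installed supplies the implementation, so game code contains zero vendor branches.

```python
# The "spec" side: one standard entry point the game calls, like any
# other graphics-API feature. The game never names a vendor.
class GraphicsAPI:
    def __init__(self, driver_upscale):
        self._upscale = driver_upscale   # supplied by the installed driver

    def upscale(self, frame: str, target_res: str) -> str:
        return self._upscale(frame, target_res)

# The "vendor" side: each driver ships its own implementation behind
# the same signature (both functions are hypothetical examples).
def nvidia_driver_upscale(frame: str, target_res: str) -> str:
    return f"{frame} -> {target_res} via tensor-core upscaler"

def amd_driver_upscale(frame: str, target_res: str) -> str:
    return f"{frame} -> {target_res} via shader-based upscaler"

# The same game code runs on either machine; only the driver differs.
for driver in (nvidia_driver_upscale, amd_driver_upscale):
    api = GraphicsAPI(driver)
    print(api.upscale("frame_1080p", "4K"))
```

This mirrors how adaptive sync ended up working: the game targets one API, and vendors compete purely on how well their implementation performs.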

So come on Microsoft, where is Win11's DX13?
 
The game gets DirectStorage AND DLSS 3 support; everyone's happy apart from some entitled 4090 owners.
 