Spider-Man

Will a Ryzen 9 3900X and a 2080 Super play this decently?

Yeah, it will be fine. The game has dynamic resolution scaling anyway, so it always performs okay as long as you aren't CPU bound.

If you turn off DRS and try RT with max settings, or play at 4K, it will be a stuttery mess.
 
This game crushes systems at max settings. The next Crysis.

It does not. There is nothing in this game which screams Crysis levels of system punishment. It doesn't look like a next-gen game the way Crysis did at launch, which would warrant the severe performance hit, either lol.

So I gave it a shot on Steam but have now requested a refund. For my tastes it felt quite repetitive, and the game world itself is very bland with virtually no interactivity. The game did run VERY smoothly, however, and mouse/keyboard response out of the box was excellent. Zero technical issues, and 21:9 is supported in game and in cutscenes. The character detail and reflections are excellent too. Nominal framerate appears to be around 100fps on average.

Just a shame about the game world and some of the Spider-Man control mechanics not feeling natural. Spider-Man will walk up the side of a building, but when you reach the top it's a bit janky as he finds his feet on the rooftop, or when you are swinging from rooftop to rooftop. It's not Half-Life "smooth" like how Valve does movement mechanics, for example.

As a switch-your-brain-off button masher it was fine, though I think I've lost interest in such games as time has gone on.

I also noticed some wild CPU utilisation: at some points 80%, at others 40-50%, and that's on a 12700KF. No frame drops or stutters though, absolutely perfectly smooth, and G-Sync seemed to be working really well too, so kudos to Sony for a slick PC port. The CPU usage is a non-issue; I suspect this is efficient use of all the cores to keep the game running super smooth even through the explosions and gunfire, which does seem to get pretty heavy.


But points removed for not allowing you to skip the intro screens at game start. Capcom let you do it immediately, so why can't Sony?

Enabling ray-traced reflections on the highest settings results in a 20fps drop at 3440x1440, but since the framerate is high enough with it on anyway, it's a no-brainer for the nicer reflections.

Here are my settings and observations:

20220813_014647.jpg
then moments later:
20220813_014357.jpg
20220813_013509.jpg
20220813_013527.jpg
Screenshot 2022-08-13 015717.jpg
 
Abysmal sales so far; it got beaten by Cult of the Lamb. I wonder how long Sony will keep doing these ports, since PC gamers are clearly not interested.
 
Dunno, it's been top of Steam sales for a while now, maybe not every day. It's an older game though, having come out on PS4, so a lot of people have already played it.
 
Keyboard + mouse works perfectly!! I was hopeless on the PS4 version.

Game runs silky smooth in ultrawide and looks fantastic, defo worth a punt!
 

You are only running 1440p with a 3080 Ti and DLSS Quality (which renders at an underlying resolution of 960p), and only getting 80-90fps with your 20-thread CPU very heavily utilised.

Let me know what you get at 4K native.
 
They are making money, and they invested in a studio last year to do these ports, with more games on the way. Just be happy they are even making PC versions of their games, and either support them or don't.

Official PC numbers below, and there is just something about playing these games on PC rather than console, so for me it's all good.

 

3440x1440, not "only" 1440p (which is 2560x1440).

Those screens were taken during a firefight with explosions, debris, and loads of enemies all at once; as mentioned in the post itself, the nominal framerate is 100+fps everywhere else.

I cannot play at 4K as my monitor is 3440x1440.

Also note that enabling RT lighting instantly takes away at least 20fps in the settings screen alone. I do not use DRS either, as I prefer outright image quality. DLSS is a given, since it is virtually identical to native in Quality mode in modern games using the latest DLSS version.
 

So for context, 4K is over double the pixels of 1440p. A slightly wider 1440p screen is only about 34% more pixels.

As for saying your statement is fine because DLSS upscaling from 960p is identical: it will be miles off a true 4K image.

Like I said, this game crushes top-end systems at actual very-high-to-max settings. 960p (even if ultrawide) is not max settings.
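For anyone who wants to sanity-check the pixel maths in this back-and-forth, here's a quick Python sketch (just the resolution numbers being argued about, nothing game-specific):

```python
# Pixel-count comparison for the resolutions discussed above.
def pixels(width, height):
    return width * height

uhd = pixels(3840, 2160)    # 4K UHD
qhd = pixels(2560, 1440)    # standard 16:9 1440p
uwqhd = pixels(3440, 1440)  # ultrawide 21:9 1440p

print(uhd / qhd)    # 2.25    -> 4K is 2.25x standard 1440p ("over double")
print(uwqhd / qhd)  # 1.34375 -> ultrawide is ~34% more than 16:9 1440p
print(uhd / uwqhd)  # ~1.67   -> 4K is still ~67% more than ultrawide 1440p
```

So both claims hold up: 4K really is more than double 2560x1440, and 3440x1440 is only about a third more pixels than standard 1440p.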
 
Do you have a comparison link showing that 4K native vs DLSS is miles off? I've yet to see any game using the later DLSS versions showing as such.

It's free performance with virtually no quality loss in most cases.
 

You aren't even running 4K DLSS, which renders an underlying image at 1440p. The settings you have mean it is trying to upscale from 960p.

The only situation where DLSS is truly impressive is 4K DLSS Quality, where the underlying image is already pretty good.
 
True for early versions of DLSS. Not true for the later versions we have had for a while now.

I'd like to see a comparison link showing "miles" of difference in a recent game, or this one.
 

Look for 4K DLSS Performance comparisons. That uses a starting resolution of 1080p (still >25% more pixels than the 960p you are using at 1440p Quality).

Screenshot-2022-08-13-125136.png
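For reference, here's a sketch of the internal render resolutions being argued about, using the commonly cited DLSS scale factors (assumed here: Quality 1/1.5, Balanced 1/1.724, Performance 1/2, Ultra Performance 1/3; the exact factors can vary per game and DLSS version):

```python
# Approximate DLSS internal render resolutions per quality mode.
# Scale factors are the commonly cited defaults, not game-verified values.
DLSS_SCALE = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.724,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3440, 1440, "Quality"))      # (2293, 960)  - the ultrawide case above
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440) - 4K Quality
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) - 4K Performance
```

This matches the numbers in the thread: ultrawide 1440p Quality renders at 960p tall, while 4K Quality starts from a full 1440p image, which is why it holds up better.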
 
That's just a table of render resolutions we already know. I'm asking for modern game comparisons of the latest DLSS showing what you said was miles of difference between native and DLSS.
 
The CPU usage is certainly brutal, even worse than stress tests in Cyberpunk (see link), which I guess is a bit surprising. If I max it out I'll get mid-to-high 30s on the CPU side (6800K) when zipping around, but if I tweak it a bit I can still get away with ray tracing at 45+, and without RT it's an almost locked 60. Haven't decided yet how to play it; still kinda stuck on other games. It looks decent, no doubt, but I think graphically it got praised way too much; in no way does it even touch a Watch Dogs: Legion (given their similarity of look). The ray tracing in particular is very limited in how and where it applies, so I'd put this more in the "hybrid" category, like the RT reflections in Far Cry 6 - i.e. it's judiciously and smartly used, but nonetheless limited.

For anyone that's using a TV as a monitor, give 21:9 a go within the 16:9 aspect ratio (just switch the resolution to 3840x1620, non-exclusive fullscreen; Scaling Mode in the driver control panel should be "Preserve aspect ratio"). It has a much more natural look than the +20 FOV at normal 4K; I was very pleasantly surprised.
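A quick sketch of the arithmetic behind that tip (just aspect-ratio maths, no game-specific values):

```python
# Why 3840x1620 gives a ~21:9 picture inside a 16:9 4K panel.
def aspect(width, height):
    return width / height

print(round(aspect(3840, 1620), 3))  # 2.37  (21:9 is ~2.333; 3440x1440 is ~2.389)
print(round(aspect(3840, 2160), 3))  # 1.778 (16:9)

# With "Preserve aspect ratio" scaling, the unused vertical pixels become
# black bars: (2160 - 1620) / 2 = 270 px at the top and bottom.
print((2160 - 1620) // 2)  # 270
```

So you keep the panel's full 3840-pixel width and just letterbox vertically, which is why it looks like a native ultrawide image.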
As for upscaling, I tested all the modes (and types), and I think any of them, as long as they're on Performance at least (at 4K), look damn good, and I'd happily use them. Heck, I even tried Ultra Performance and it looked decent (but more obviously aliased), though at that point there's not much benefit in power savings for me (with an RX 6800); it could be more useful on some older cards like the RX 480.

Personally I enjoy these mindless open-world, explore-and-collect-stuff/take-over-areas type of games, so I look forward to getting around to it later down the line. I think atm though Path of Exile's gonna steal my playtime with the new season. Back to Wraeclast! :cool:
 

There are a tonne of videos showing what happens to quality as you go from 4K DLSS Quality to Balanced to Performance. The image gets noticeably softer for one, AA becomes a lot worse, and the underlying assets lose quality. Yes, it is a lot better than TAA upsampling, and 4K Quality can often look as good as, and better in some ways than, 4K native TAA, but the more the DLSS algorithm has to do, the more it struggles and fails due to the lack of raw information.
 