Ryzen 9 3900X and a 2080 Super, will this play decent?
Yeah, it will be fine. The game has dynamic resolution anyway, so it always performs okay as long as you aren't CPU bound.
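Roughly how that works, as a minimal sketch (not the game's actual implementation; the target frame time, step size and bounds are made-up illustrative numbers). It also shows why DRS can't rescue a CPU bottleneck, since all it does is shrink the GPU's workload:

# Minimal dynamic-resolution-scaling sketch: if the last frame blew its budget,
# render fewer pixels next frame; if there's headroom, scale back up.
TARGET_FRAME_MS = 16.7           # e.g. targeting 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # clamp the internal render scale
STEP = 0.05

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget -> lower resolution
        scale = max(MIN_SCALE, scale - STEP)
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # comfortable headroom -> raise it
        scale = min(MAX_SCALE, scale + STEP)
    return scale

# Example: a GPU-heavy frame takes 20 ms, so the scale drops a notch.
print(update_render_scale(1.0, 20.0))  # 0.95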
This game crushes systems at max settings. The next Crysis.
It does not. There is nothing in this game that screams Crysis levels of system pwnage. It doesn't look like a next-gen game the way Crysis did at launch either, so there's nothing to warrant a severe performance hit, lol.
So I gave it a shot on Steam but have now requested a refund. IMO, for my tastes it felt quite repetitive, and the game world itself is very bland with virtually no interactivity. The game did run VERY smoothly, however, and mouse/keyboard response out of the box was spot on. Zero technical issues, and 21:9 is supported in game and in cutscenes. The character detail and reflections are excellent too. Nominal framerate appears to be 100fps on average.
Just a shame about the game world and some of the Spider-Man control mechanics not feeling natural. For example, Spider-Man will walk up the side of a building, but it's a bit janky when he finds his feet on the rooftop at the top, or when you are swinging from rooftop to rooftop. It's not Half-Life "smooth", the way Valve does movement mechanics.
As a switch-your-brain-off button masher it was fine though, but I think I've lost interest in such games as time has gone on.
I also noticed some wild CPU utilisation, at some points 80% and at others 40-50%, and that's on a 12700KF. No frame drops or stutters though; it was absolutely, perfectly smooth, and G-Sync seemed to be working really well too, so kudos to Sony for a slick PC port. The CPU usage is a non-issue; I suspect it's efficient use of all the cores to keep the game running super smooth even through the explosions and gunfire, which do seem to get pretty heavy.
Points off, though, for not allowing you to skip the intro screens at game start. Capcom let you do it immediately, so why can't Sony?
Enabling ray-traced reflections at the highest settings results in a 20fps drop at 3440x1440, but since the framerate is high enough with it on anyway, it's a no-brainer for the nicer reflections.
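To put that in context, what a 20fps drop costs in frame time depends entirely on the starting framerate; quick arithmetic below (the baselines are just example numbers, not measurements from this game):

# Losing 20 fps is cheap in frame-time terms at high framerates and expensive at low ones.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for before in (120, 90, 60):
    after = before - 20
    extra = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps costs +{extra:.1f} ms per frame")

# 120 -> 100 fps costs +1.7 ms per frame
# 90 -> 70 fps costs +3.2 ms per frame
# 60 -> 40 fps costs +8.3 ms per frame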
Here are my settings and observations:
then moments later:
They are making money, and they invested in a studio last year to do these ports, with more games on the way. Just be happy they are even making PC versions of their games, and either support them or don't.

Abysmal sales so far; it got beaten by Cult of the Lamb. I wonder how long Sony will keep doing these ports, since PC gamers are clearly not interested.
You are only running at 1440p, using a 3080 Ti with DLSS Quality on (which renders at an underlying resolution of 960p), and you're only getting 80-90fps with your 20-thread CPU being very heavily utilised.
Let me know what you get at 4K native.
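For reference, the underlying render resolution is just the output resolution multiplied by the DLSS mode's scale factor; a quick sketch assuming the commonly published factors (individual games can deviate from these):

# DLSS internal render resolution from output resolution and mode
# (standard published scale factors; games can override them).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def render_res(width: int, height: int, mode: str) -> tuple[int, int]:
    s = SCALES[mode]
    return round(width * s), round(height * s)

print(render_res(2560, 1440, "Quality"))  # (1707, 960) -> the 960p figure at 16:9 1440p
print(render_res(3440, 1440, "Quality"))  # (2293, 960) -> same vertical res at ultrawide
print(render_res(3840, 2160, "Quality"))  # (2560, 1440) -> 4K Quality renders at 1440p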
3440x1440, not "only" 1440p (which is 2560x1440).
Those screens were taken during a firefight with explosions, debris, and loads of enemies all at once; the nominal framerate, as mentioned in the post itself, is 100+fps everywhere else.
I cannot play at 4K as my monitor is 3440x1440.
Also note that enabling RT lighting instantly takes away at least 20fps in the settings screen alone. I do not use DRS either, as I prefer outright image quality. DLSS is a given, since it is virtually identical to native in Quality mode in modern games using the latest DLSS version.
Do you have a comparison link showing that 4K native vs DLSS is miles off? I've yet to see any game using a later DLSS version show that.
It's free performance with virtually no quality loss in most cases.
I'd like to see a comparison link.
True for early versions of DLSS. Not true for the later versions we have had for a while now.
I'd like to see a comparison link showing "miles" of difference in a recent game, or this one.
That's just a table of render resolutions we already know. I'm asking for modern game comparisons of the latest DLSS showing what you said was miles of difference between native and DLSS.