Do game devs for PS4 have a secret deal with Sony to not utilize the hardware fully?

I mean, look at the latest releases. Shadow of the Colossus and Monster Hunter run at 30fps on the standard PS4 and 60fps on the Pro.

Surely the standard version isn't 100% weaker than the Pro?

Do developers have some secret deal with Sony to persuade people to buy the Pro if they want 60fps?
 
Monster Hunter doesn't run at 60fps unfortunately. Either way, there's often a hard cut-off point at 30 and 60fps on consoles. If a console can't maintain a solid 60fps it's better to have a constant 30fps than drifting about in the 40s and 50s.
 
Sony consoles, including PS4, are an absolute dog to develop for in comparison to other platforms.

Devs have always moaned about how tricky it is.
 
That doesn't make any sense.

On console most (mainstream) people will prefer a constant 30fps to a varying framerate - those more sensitive to framerates in the 30s, etc. would be better off moving to PC (if life were that simple).
 
On console most (mainstream) people will prefer a constant 30fps to a varying framerate - those more sensitive to framerates in the 30s, etc. would be better off moving to PC (if life were that simple).
Well, I can't claim to have experienced this on console, but comparing with PC I would rather have 35-50fps than 30. Or, to make it a fairer comparison, 70-100 vs 60.

Is there some magical eye strain that comes with console gaming that doesn't apply to PCs?
 
Is there some magical eye strain that comes with console gaming that doesn't apply to PCs?

On PC, the reason lower framerates feel nasty is as much about input latency as the actual visual update rate. The average mainstream gamer, whose experience is mostly a controller and capped framerates, doesn't tend to build up the same sensitivity to it as those who play with keyboard and mouse, and will notice the "unsmoothness" of a varying framerate before they notice the benefits in responsiveness.

Turning with a controller at a capped framerate is a very different experience to looking around quickly with a mouse.
 
This is an older article, but it helps to explain why a consistent 30fps might be preferable:

http://www.eurogamer.net/articles/digitalfoundry-2014-frame-rate-vs-frame-pacing

Killzone Shadow Fall is our first subject - a PS4 launch title that processes frames as quickly as the console can produce them. Guerrilla's idea is to produce as responsive a game as possible - the sooner your input is registered, the more quickly the result is put on-screen. However, part of the problem with this is the implementation of v-sync. This ties the game update to the refresh of the display, so frames can only arrive at distinct 16ms intervals. If the game isn't done processing a frame in 16ms - as is often the case with Killzone - the game waits until the next 16ms refresh. With some frames arriving at 16ms and a whole lot more at 33ms, the result is an uneven update, resulting in on-screen judder and inconsistency in the controls.
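
To make that quantisation concrete, here's a minimal Python sketch (my own illustration, not from the article), assuming a 60Hz display with v-sync on:

import math

REFRESH_MS = 1000 / 60  # one refresh of a 60Hz display, ~16.7ms

def displayed_ms(render_ms):
    """With v-sync, a finished frame still waits for the next refresh
    boundary, so its time on screen is the render time rounded up to
    a whole number of refresh periods."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Hypothetical frame times hovering around the 16.7ms budget:
for render_ms in (15.0, 18.0, 25.0, 16.0, 33.0):
    print(f"rendered in {render_ms:4.1f}ms -> held for {displayed_ms(render_ms):4.1f}ms")

The output alternates between ~16.7ms and ~33.3ms holds - that uneven cadence is the judder being described.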
 
Well, I can't claim to have experienced this on console, but comparing with PC I would rather have 35-50fps than 30. Or, to make it a fairer comparison, 70-100 vs 60.

Is there some magical eye strain that comes with console gaming that doesn't apply to PCs?

Gaming on a large TV with the image tearing all over the place? No ta
 
Gaming on a large TV with the image tearing all over the place? No ta

You can set the TV to refresh at 60Hz and then use multiples of the refresh (something like triple buffering), which would allow hitting 45fps, etc. without tearing - effectively getting intermediate framerates as it jumps between 30, 45 and 60 (I'd assume the hardware supports it - not sure if it's implemented in software, but it would be trivial to do so).

However, as above, when using a controller you can feel the inconsistency in things like turn rate - where the frames might not quite sync up so nicely due to rounding - before any of the other potential benefits; see the sketch below.
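
A rough sketch of that idea (again my own illustration, assuming a 60Hz panel): intermediate averages like 45fps fall out of repeating patterns of frames held for one or two refreshes, and the uneven hold times in the mixed patterns are exactly where the turn-rate inconsistency comes from.

def effective_fps(pattern, refresh_hz=60):
    """Average framerate when successive frames are each held on screen
    for the given number of refreshes, repeating. Every frame still
    lands on a refresh boundary, so there is no tearing."""
    return len(pattern) * refresh_hz / sum(pattern)

print(effective_fps([2]))        # 30.0 - every frame held two refreshes
print(effective_fps([1, 2]))     # 40.0 - alternating one and two
print(effective_fps([1, 1, 2]))  # 45.0 - three frames over four refreshes
print(effective_fps([1]))        # 60.0 - a new frame every refresh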
 
Anyway, I don't believe Monster Hunter has a 60fps mode?

Digital Foundry said this of Shadow of the Colossus:

Looking at the package as a whole, it's remarkable how dramatic an overhaul we're looking at here. Bluepoint's engine enables some truly breathtaking moments and gloriously detailed scenes - and everything runs locked at a solid 1080p30 on base PlayStation hardware, rising to 1440p30 on Pro (with downscaling supported for 1080p display users), backed by an excellent TAA solution that looks great on both full HD and 4K screens. But there's more: PS4 Pro users also have the option of a performance mode that doggedly locks to 60fps for the vast majority of the experience.

We've seen this kind of performance in an open world title with Metal Gear Solid 5, but how common is it to find a game of this scale operating at such a smooth frame-rate? It's true that the world is mostly empty and this no doubt helps keep the performance up but we were genuinely floored by the experience of playing a game this beautiful at 60fps on a console.

The stability of the frame-rate combined with the excellent motion blur results in one of the smoothest feeling games we've played since Doom 2016. By and large, 60fps means 60fps, but we did note the tiniest of drops in the main temple area, where we could trigger a handful of torn frames. It's entirely possible that other areas could drop but, after finishing the first seven Colossi in the game, we've yet to encounter any other spot in the game with this same issue - it's our preferred way to play the game.
 
How much faster is the Pro, 15%?

Anyway, I would prefer an option to reduce image quality and maintain 60fps even on the base PS4 (non-Pro).
 
Sitting close to the screen may have some impact. I can't handle anything below a constant 60fps - even with FreeSync - when gaming on my PC, but I'm OK with a locked 30fps on console when playing on the lounge TV. 60 is obviously preferred.
 
How much faster is the Pro, 15%?

Anyway, I would prefer an option to reduce image quality and maintain 60fps even on the base PS4 (non-Pro).

Quite a bit faster than that. It does have a Boost Mode for non-optimised OG PS4 games that runs them 15% faster automatically.

Here are the full specs, from here: http://www.trustedreviews.com/news/ps4-pro-vs-ps4-2941086


Here are the PS4 Pro specs compared to the original and PS4 ‘Slim’ versions:


            PS4 (2013)                 PS4 (2016)                         PS4 Pro
CPU         1.6GHz 8-core AMD Jaguar   1.6GHz 8-core AMD Jaguar           2.1GHz 8-core AMD Jaguar
GPU         1.84 TFLOP AMD Radeon      1.84 TFLOP AMD Radeon              4.2 TFLOP AMD Radeon
Memory      8GB GDDR5                  8GB GDDR5                          8GB GDDR5 & 1GB
HDR         Yes                        Yes                                Yes
4K          No                         No                                 Yes
Storage     500GB                      500GB & 1TB                        1TB
USB         2x USB 3.0                 2x USB 3.1                         3x USB 3.1
Wi-Fi       802.11b/g/n (2.4GHz only)  802.11a/b/g/n/ac (2.4GHz & 5GHz)   802.11a/b/g/n/ac (2.4GHz & 5GHz)
Bluetooth   2.1                        4.0                                4.0

Shame the CPU can't match the GPU because that would solve the FPS problems. Same with the X1X.
 
How much faster is the Pro, 15%?

Anyway, I would prefer an option to reduce image quality and maintain 60fps even on the base PS4 (non-Pro).

What? The GPU in the Pro is over 2x the regular PS4's, plus an extra 500MHz on the CPU. Quite a major difference really. What made you think 15%?
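
Working the ratios from the spec table above (just arithmetic on the listed figures):

ps4_gpu, pro_gpu = 1.84, 4.2   # TFLOPs, from the table
ps4_cpu, pro_cpu = 1.6, 2.1    # GHz, same 8-core Jaguar design

print(f"GPU: {pro_gpu / ps4_gpu:.2f}x")  # ~2.28x
print(f"CPU: {pro_cpu / ps4_cpu:.2f}x")  # ~1.31x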
 
Sony consoles, including PS4, are an absolute dog to develop for in comparison to other platforms.

Devs have always moaned about how tricky it is.

I thought this problem ended when the PS4 moved to x86 architecture?

Developers definitely moaned about the PS3 because of its Cell processor, but I was under the impression there's little difference between the PS4 and Xbox One in terms of development challenge. Both have their own challenges compared to PC architecture, which is itself hard because of how varied it can be.
 
I thought this problem ended when the PS4 moved to x86 architecture?

Developers definitely moaned about the PS3 because of its Cell processor, but I was under the impression there's little difference between the PS4 and Xbox One in terms of development challenge. Both have their own challenges compared to PC architecture, which is itself hard because of how varied it can be.

Maybe - I was basing it on PS3 reports and pre-emptive moans from devs about having to learn the forthcoming PS4 architecture. I also assumed Bungie's development excuses for Destiny 2 stemmed from the same issue.
 
Maybe - I was basing it on PS3 reports and pre-emptive moans from devs about having to learn the forthcoming PS4 architecture. I also assumed Bungie's development excuses for Destiny 2 stemmed from the same issue.

The biggest bottleneck for devs with both consoles is the CPU. They have the GPU grunt to do pretty much what they want but the CPUs lag behind pretty much everything on the PC.

This is why, hopefully, if the rumours about next gen using Zen CPUs are true, there will be no excuse from devs regarding bottlenecks.
 