Poll: *The Official PlayStation (PS5/PS5 Pro) Thread*

Will you be buying a PS5 Pro on release?

  • Yes

    Votes: 54 14.3%
  • No (not at £700 Lol)

    Votes: 215 56.9%
  • No (other)

    Votes: 89 23.5%
  • Pancake

    Votes: 20 5.3%

  • Total voters
    378
Throughout the entire current generation one console has been more powerful than the other: the PS4 over the One, and then the One X over the PS4 Pro. However, in both cases the differences between the two were just not massive.
The difference in raw power between the PS5 and Series X is actually smaller than between the One X and the PS4 Pro.

We are going to see in November two very similar consoles with a gnat's whisker of difference in visuals and overall performance, and once again it's going to be about the games.

Wasn't the PS4 Pro and One X difference around two GCN teraflops?

Two RDNA teraflops is definitely a bigger power difference.
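For anyone wanting to sanity-check the teraflop gap: the headline figure is just compute units × shaders per CU × 2 ops per clock × clock speed. Here's a rough sketch using the commonly quoted specs (64 shaders per CU is the standard GCN/RDNA layout; treat the exact clocks as approximate):

```python
def tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    """Peak single-precision throughput in teraflops."""
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

# Current premium consoles (GCN)
ps4_pro = tflops(36, 0.911)   # ~4.2 TF
one_x   = tflops(40, 1.172)   # ~6.0 TF

# Next gen (RDNA 2)
ps5      = tflops(36, 2.23)   # ~10.3 TF (peak boost clock)
series_x = tflops(52, 1.825)  # ~12.1 TF (fixed clock)

print(f"GCN gap:  {one_x - ps4_pro:.1f} TF")   # ~1.8 TF
print(f"RDNA gap: {series_x - ps5:.1f} TF")    # ~1.9 TF
```

So the raw gaps are similar in teraflop terms, which is why the argument comes down to whether an RDNA teraflop "counts for more" than a GCN one.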
 
Wasn't the PS4 Pro and One X difference around two GCN teraflops?

Two RDNA teraflops is definitely a bigger power difference.

There's a theory that it's more efficient to utilise 36 CUs at 2.23GHz with custom cache scrubbers than 52 CUs at 1.8GHz. We'll see soon enough.
 
There's a theory that it's more efficient to utilise 36 CUs at 2.23GHz with custom cache scrubbers than 52 CUs at 1.8GHz. We'll see soon enough.

More is more. The addition of the cache scrubbers will likely boost performance; however, I don't see them making up that deficit. Plus, wasn't it reported that the PS5's clock speed isn't fixed, being a boost clock?
 
More is more. The addition of the cache scrubbers will likely boost performance; however, I don't see them making up that deficit. Plus, wasn't it reported that the PS5's clock speed isn't fixed, being a boost clock?
If that works similarly to how GPUs in a PC work, I don't see the issue.
 
Throughout the entire current generation one console has been more powerful than the other: the PS4 over the One, and then the One X over the PS4 Pro. However, in both cases the differences between the two were just not massive.
The difference in raw power between the PS5 and Series X is actually smaller than between the One X and the PS4 Pro.

We are going to see in November two very similar consoles with a gnat's whisker of difference in visuals and overall performance, and once again it's going to be about the games.
I'm not sure I agree with you there. The PS4 was noticeably more powerful than the original Xbox One, and it showed in games where the Xbox had to use a lower resolution and upscale... in fact, didn't Microsoft upgrade the GPU in the One S? I'm sure they wouldn't have done that if they didn't think there was any disparity in games.
The PS4 Pro and One X are a different matter, as they aren't the primary consoles that developers are making games for, but are merely optimising for them... and IMO the One X is very noticeably better than the PS4 Pro, and it doesn't take more than a cursory glance to see that in games like RDR!

We'll find out soon enough either way, but to say a 20-30% jump in performance is "oh so similar" is extremely presumptive, especially when that could turn out to be a best-case scenario (assuming that is based on the PS5's upper-limit clock speeds), and the disparity could be much more in real-use scenarios.

One thing I do agree with is that it's actually quite difficult to tell unless the consoles are side by side... though you can guarantee the internet will exaggerate any differences with frame-by-frame analysis to fuel the playground and messageboard arguments - but that's half the fun! :D
 
I've never bought a console on day one, as I am slightly worried about hardware issues. Indeed, the PS4 launch had the rubber-peeling DualShocks, which I avoided by waiting six months. But I'm so desperate for a PS5 that I am going to have to get one ASAP. Does anyone pay for those extended warranties you get with Amazon, Argos et al., or is the opinion that if something is going to go wrong, it will more likely go wrong in the first year under the manufacturer's warranty?
 
It's always just about the games and what console your mates have. Outside of discussions like on here, not many give a rat's ass which console is marginally better. :)

Been playing RDR2 this last week on the One X/4K OLED and it's superb; looking forward to seeing what these new ones can do.

Expect we will jump from Xbox to PS due to my son playing it the most and a lot of his friends going PS, but we will see.
 
I'm not sure I agree with you there. The PS4 was noticeably more powerful than the original Xbox One, and it showed in games where the Xbox had to use a lower resolution and upscale... in fact, didn't Microsoft upgrade the GPU in the One S? I'm sure they wouldn't have done that if they didn't think there was any disparity in games.
The PS4 Pro and One X are a different matter, as they aren't the primary consoles that developers are making games for, but are merely optimising for them... and IMO the One X is very noticeably better than the PS4 Pro, and it doesn't take more than a cursory glance to see that in games like RDR!

We'll find out soon enough either way, but to say a 20-30% jump in performance is "oh so similar" is extremely presumptive, especially when that could turn out to be a best-case scenario (assuming that is based on the PS5's upper-limit clock speeds), and the disparity could be much more in real-use scenarios.

One thing I do agree with is that it's actually quite difficult to tell unless the consoles are side by side... though you can guarantee the internet will exaggerate any differences with frame-by-frame analysis to fuel the playground and messageboard arguments - but that's half the fun! :D

But......
You have to remember that the 15-17% raw power advantage the Xbox has (no, not 30% at all) is a perfect-storm situation.
That is the PS5 throttled down while the Xbox, with its fixed clocks, is running at full pelt.

It doesn't take into account any OS optimisation - how well MS's OS works compared to Sony's.

I'm just saying that these headline differences in power are greatly exaggerated (case in point above with this 30%), and that the difference in raw power (so again, raw power before anybody does anything with it) is smaller between the next-gen consoles than between the current premium offerings from both.
 
But......
You have to remember that the 15-17% raw power advantage the Xbox has (no, not 30% at all) is a perfect-storm situation.
That is the PS5 throttled down while the Xbox, with its fixed clocks, is running at full pelt.

It doesn't take into account any OS optimisation - how well MS's OS works compared to Sony's.

I'm just saying that these headline differences in power are greatly exaggerated (case in point above with this 30%), and that the difference in raw power (so again, raw power before anybody does anything with it) is smaller between the next-gen consoles than between the current premium offerings from both.
Sorry, but we may be getting confused with the numbers. I assume you are comparing the XSX's 12.1 tflops to 10.28 tflops for the PS5, and this is where you're getting the 17% difference? If so, then I'm pretty certain you're quoting the maximum possible performance for the PS5 when the speed is boosted up, and the sustainable speed will be nearer 9.2 tflops (which is where I am getting the 30% figure).

As you and I both agree, there are a lot of other variables to take into account, so it'll be interesting when developers have their heads around both systems to see where we sit.
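The 17% vs 30% disagreement is just a matter of which PS5 figure you divide by. A quick check using the numbers quoted above (the 9.2 tflops "sustainable" figure is this thread's assumption, not an official spec):

```python
def advantage_pct(a, b):
    """Percentage advantage of a over b."""
    return (a / b - 1) * 100

xsx = 12.1  # Series X headline tflops as quoted above
print(f"vs 10.28 TF (PS5 boost clock):  {advantage_pct(xsx, 10.28):.0f}%")  # ~18%
print(f"vs 9.2 TF (assumed sustained):  {advantage_pct(xsx, 9.2):.0f}%")    # ~32%
```

So both figures are "right"; they just encode different assumptions about how often the PS5 actually runs at its top clock.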
 
Sorry, but we may be getting confused with the numbers. I assume you are comparing the XSX's 12.1 tflops to 10.28 tflops for the PS5, and this is where you're getting the 17% difference? If so, then I'm pretty certain you're quoting the maximum possible performance for the PS5 when the speed is boosted up, and the sustainable speed will be nearer 9.2 tflops (which is where I am getting the 30% figure).

As you and I both agree, there are a lot of other variables to take into account, so it'll be interesting when developers have their heads around both systems to see where we sit.
From what I have read from Cerny and co., the PS5 doesn't work like that - nothing like how a PC CPU/GPU throttles back. The PS5 will spend most of the time at max speed and only throttle down when it is about to exceed its maximum power. This way temperature and power are constant while the frequency varies; a 2-3% reduction in speed allegedly gives over a 10% reduction in power usage, so the speed drop is minimal and is known for any given situation.
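The "small clock drop, big power drop" claim is at least plausible from first principles: dynamic power scales roughly with voltage squared times frequency, and voltage itself falls with frequency, so power ends up scaling close to the cube of the clock. A toy check (the cubic exponent is a rule of thumb, not a Sony figure; real chips vary):

```python
def power_fraction(clock_fraction, exponent=3.0):
    """Approximate dynamic power as clock^exponent (a common rule of thumb)."""
    return clock_fraction ** exponent

for drop in (0.02, 0.03):
    saved = 1 - power_fraction(1 - drop)
    print(f"{drop:.0%} clock drop -> ~{saved:.0%} power saved")
```

That gets you most of the way to the claimed figure; if the voltage curve is steeper than linear near the top of the frequency range (which it typically is), the saving can exceed 10%.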
 
From what I have read from Cerny and co., the PS5 doesn't work like that - nothing like how a PC CPU/GPU throttles back. The PS5 will spend most of the time at max speed and only throttle down when it is about to exceed its maximum power. This way temperature and power are constant while the frequency varies; a 2-3% reduction in speed allegedly gives over a 10% reduction in power usage, so the speed drop is minimal and is known for any given situation.
That still means that 17% power differential is a best-case scenario for the PS5? When running at the lower sustainable speed, the differential is in excess of 30%? ...That is working purely on throughput, with the assumption that we're comparing apples to apples when we talk about raw power in tflops (which we already know we're not).

That is interesting about the power management (and not what I took from it at all)... I think I need to go and have another look, as I was under the impression that the PS5's power management was intended to maintain a steady fixed power draw (rather than a variable power draw where heat is in flux). High power draw would come when the GPU and CPU are at their highest usage (effectively when you most need them), and that would trigger a throttle-down... I'd be grateful if you have a link to where that is discussed?
 
That still means that 17% power differential is a best-case scenario for the PS5? When running at the lower sustainable speed, the differential is in excess of 30%? ...That is working purely on throughput, with the assumption that we're comparing apples to apples when we talk about raw power in tflops (which we already know we're not).

That is interesting about the power management (and not what I took from it at all)... I think I need to go and have another look, as I was under the impression that the PS5's power management was intended to maintain a steady fixed power draw (rather than a variable power draw where heat is in flux). High power draw would come when the GPU and CPU are at their highest usage (effectively when you most need them), and that would trigger a throttle-down... I'd be grateful if you have a link to where that is discussed?

I think we may have got a few crossed wires, as I said the same as you - fixed power and temperature, variable frequency - but the amounts differ, lol. Digital Foundry's "review" suggested that reducing the clock to 9.2 tflops would require a drop of around 12%, but according to Mark Cerny a 2-3% decrease in clocks would give you over a 10% decrease in power. If it's linear, then a 12% decrease in clocks would give a power decrease in excess of 40% - way more than I think it will need. DF "review": https://www.eurogamer.net/articles/...s-and-tech-that-deliver-sonys-next-gen-vision
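Worth noting that the linear extrapolation is the aggressive end of the estimate; under the cubic rule of thumb often used for GPU power (an assumption, not a Sony figure), a 12% downclock saves quite a bit less:

```python
def linear_saving(clock_drop, per_point=10 / 2.5):
    """Extrapolate 'over 10% power per 2-3% clock' linearly."""
    return clock_drop * 100 * per_point

def cubic_saving(clock_drop):
    """Power assumed to scale as clock^3."""
    return (1 - (1 - clock_drop) ** 3) * 100

drop = 0.12  # the ~12% downclock needed to reach 9.2 TF
print(f"linear: ~{linear_saving(drop):.0f}% power saved")  # ~48%
print(f"cubic:  ~{cubic_saving(drop):.0f}% power saved")   # ~32%
```

Either way the point stands: the clock almost certainly never needs to drop anywhere near 12% to stay inside a fixed power budget.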
 
I think we may have got a few crossed wires, as I said the same as you - fixed power and temperature, variable frequency - but the amounts differ, lol. Digital Foundry's "review" suggested that reducing the clock to 9.2 tflops would require a drop of around 12%, but according to Mark Cerny a 2-3% decrease in clocks would give you over a 10% decrease in power. If it's linear, then a 12% decrease in clocks would give a power decrease in excess of 40% - way more than I think it will need. DF "review": https://www.eurogamer.net/articles/...s-and-tech-that-deliver-sonys-next-gen-vision
Thanks for linking to that - it's an enjoyable read and an interesting synopsis.

My problem is "according to MC a 2-3% decrease in clocks would give you over a 10% decrease in power"... now think of that in reverse. What he is effectively saying is that a 2-3% increase in clock speed takes a 10% increase in power! To me that sounds like the chip is constrained in some way, and if it requires that much additional power to eke out a mere 2-3% in clock speed, then I can't see how that would be sustainable under any sort of load!

To say it is sustainable under load goes against everything I know as a PC gamer and overclocker... if you overclock a chip and you get to a point where you really have to ramp up the power, then it's inevitable that you're ramping up the thermals and requiring a better cooling solution at the same time... and if you can deal with that, then why have a variable clock speed in the first place?

Will be interesting when it's released and we get a teardown of it! :D
 
Thanks for linking to that - it's an enjoyable read and an interesting synopsis.

My problem is "according to MC a 2-3% decrease in clocks would give you over a 10% decrease in power"... now think of that in reverse. What he is effectively saying is that a 2-3% increase in clock speed takes a 10% increase in power! To me that sounds like the chip is constrained in some way, and if it requires that much additional power to eke out a mere 2-3% in clock speed, then I can't see how that would be sustainable under any sort of load!

To say it is sustainable under load goes against everything I know as a PC gamer and overclocker... if you overclock a chip and you get to a point where you really have to ramp up the power, then it's inevitable that you're ramping up the thermals and requiring a better cooling solution at the same time... and if you can deal with that, then why have a variable clock speed in the first place?

Will be interesting when it's released and we get a teardown of it! :D
I would like to see how it works in practice; it's not like any type of boost I've heard of, but that's the idea. The way it sounds, unlike a PC, boost mode is normal mode and this is how it normally runs. When the GPU and CPU together require more power than the maximum, one or both are slightly downclocked until the load decreases. Then it goes back to boost clocks, for want of a better term.
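What's being described is essentially a power-capped governor: run at the top clock by default, and shave frequency only when the estimated power for the current workload would exceed a fixed budget. A toy sketch of that idea (all the numbers here - the 200W budget, the workload-to-power model - are made up purely for illustration):

```python
POWER_BUDGET_W = 200.0  # hypothetical fixed power budget

def modelled_power(load, freq_ghz, k=18.5):
    """Toy model: power grows with workload activity (0..1) and ~clock^3."""
    return k * load * freq_ghz ** 3

def pick_frequency(load, f_max=2.23, step=0.01):
    """Start at the boost clock and downclock only while over budget."""
    f = f_max
    while modelled_power(load, f) > POWER_BUDGET_W and f > step:
        f -= step
    return round(f, 2)

print(pick_frequency(0.5))  # 2.23 -- typical load: full boost clock
print(pick_frequency(1.0))  # 2.21 -- worst-case load: only a ~1% shave
```

The key property is that the downclock depends on workload, not temperature, so the same scene produces the same clocks on every console.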
 
I would like to see how it works in practice; it's not like any type of boost I've heard of, but that's the idea. The way it sounds, unlike a PC, boost mode is normal mode and this is how it normally runs. When the GPU and CPU together require more power than the maximum, one or both are slightly downclocked until the load decreases. Then it goes back to boost clocks, for want of a better term.

There's a lot taken on faith, and I am very dubious. Power consumption and thermal load go hand in hand with clock speed and workload, and I find it rather weird that you'd need a boost mode if you aren't under load... and if you're under load, then that's when you need boost and the thermals crank up! Take the gfx card in your PC - it'll work fine at top clock speed on the desktop or in easy-to-render games, but the moment the action steps up and there's a lot more on screen, that's when you'll get that thermal kick and it'll start to throttle.

I'm sure I recall Mark Cerny stating that they were aiming at 2GHz, and that if they hit that it'd be amazing given the chip's thermal constraints... then all of a sudden they're at 2.2GHz and it isn't a problem - something just doesn't add up there, particularly when you consider Sony's rather poor and crude thermal solutions on the PS4 and PS4 Pro!
 
There's a lot taken on faith, and I am very dubious. Power consumption and thermal load go hand in hand with clock speed and workload, and I find it rather weird that you'd need a boost mode if you aren't under load... and if you're under load, then that's when you need boost and the thermals crank up! Take the gfx card in your PC - it'll work fine at top clock speed on the desktop or in easy-to-render games, but the moment the action steps up and there's a lot more on screen, that's when you'll get that thermal kick and it'll start to throttle.

I'm sure I recall Mark Cerny stating that they were aiming at 2GHz, and that if they hit that it'd be amazing given the chip's thermal constraints... then all of a sudden they're at 2.2GHz and it isn't a problem - something just doesn't add up there, particularly when you consider Sony's rather poor and crude thermal solutions on the PS4 and PS4 Pro!

A few more videos about the PS5 hardware if you want to learn more.
And
(This is part 1 of a 3-part video.) The guy used to work for EA, in engine development I think.
 
Did anyone else get an email with this in it?

Edit: oh, it might be audio product related

[attached image from the email]
What exciting thing did they announce?
 
I think it was a newer model of their £350 headset.

Yeah, it was the new WH-1000XM4 noise-cancelling headphones. The last ones were very popular, so it's caused quite a stir in those circles.

Actually wondering how they would do as a gaming headset - could do with getting rid of some background noise! (I have a two-year-old)
 