Console performance vs PCs over time

This is not a benefit of consoles, as you can also do this on PC (but with the added benefit of being able to play on MKB if that suits the game better).



To actually play games or just to control the UI?

To play games like Fortnite. It puts you in a lobby with other keyboard and mouse players.

And in my experience, lugging my desktop PC over to plug into the living room TV never goes down well with the missus. And no, I'm not running an HDMI cable the length of my house either.
 
The Xbox Series X is going to blow away most PCs, though with Game Pass and all the exclusives coming to PC now, I'm not sure if I'll bother to get one.

I might end up getting a PS5 for the exclusives, unless Sony follow suit (the fact that Horizon: Zero Dawn is coming to PC is very positive).

I say all of this as a man who plays 90% of games on a very speedy Ryzen 3900X/2080 Ti/32GB RAM... and yes, I do use an Xbox control pad for all my PC single-player games.
 
I think what those graphs actually show is just how limiting the various OSes are, because far lower-specced consoles with no Windows DX to drag them down can compete with high-specced PCs without issue. So long story short, the tech doesn't need to change, the OS "needs" to change the way it does graphics!
 
Hell even the Sega Dreamcast had a keyboard :D
Not only that, a modern console is just a stripped-down PC anyway, so what nasher said makes sense.

It's not really an issue for me personally; it's only certain games I'd use a mouse and keyboard for, and those games are normally only worth playing on the PC anyway. Football Manager, Tropico, etc.
 
I think what those graphs actually show is just how limiting the various OSes are, because far lower-specced consoles with no Windows DX to drag them down can compete with high-specced PCs without issue. So long story short, the tech doesn't need to change, the OS "needs" to change the way it does graphics!

It's a bit more complex than that.

Consoles are static systems. Console games are set to 'static' graphics settings. Developers can eke every last bit of performance out of them and optimise the games much better.

PC games have to factor in millions of different configurations. And then you've got people who will whack every menu option to max and then say "This game runs crap", when in reality some of those settings are designed for future GPUs.
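Just to illustrate that point with a completely made-up sketch (hypothetical names and thresholds, not how any real engine actually does it): a console build ships with one fixed, hand-tuned profile, while a PC build has to guess a sensible preset from whatever hardware it finds.

    # One fixed, hand-tuned profile on console vs. crude auto-detect on PC.
    CONSOLE_PROFILE = {"textures": "high", "shadows": "medium", "draw_distance": "low"}

    def pick_pc_preset(vram_gb, cpu_cores):
        # One of millions of possible hardware combinations has to map to something sane.
        if vram_gb >= 11 and cpu_cores >= 12:
            return {"textures": "ultra", "shadows": "ultra", "draw_distance": "ultra"}
        if vram_gb >= 6:
            return {"textures": "high", "shadows": "high", "draw_distance": "medium"}
        return {"textures": "medium", "shadows": "low", "draw_distance": "low"}

    print(pick_pc_preset(vram_gb=11, cpu_cores=12))  # a 2080 Ti / 3900X class machine
    print(CONSOLE_PROFILE)                           # the single profile devs tune against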
 
Not really. Consoles are coded for far more efficiently as you only have one hardware set to code for.

That is why the Xbox One X does 4K pretty well for a console. A 980 Ti would struggle to play RDR2 in 4K; it would fall apart.
Actually, the 980 Ti averages around 28-30fps during the benchmark at 4K with high settings.

XB1X uses a mixture of high/medium/low settings to achieve 30fps.
 
To play games like Fortnite. It puts you in a lobby with other keyboard and mouse players.

So one game then?

And in my experience, lugging my desktop PC over to plug into the living room TV never goes down well with the missus. And no, I'm not running an HDMI cable the length of my house either.

Alternatively, get something like an Nvidia Shield (only about twice the size of a mobile phone) and play seamlessly, whilst also having Prime/Netflix/YouTube, etc. all in one place.
 
I think what those graphs actually show is just how limiting the various OSes are, because far lower-specced consoles with no Windows DX to drag them down can compete with high-specced PCs without issue. So long story short, the tech doesn't need to change, the OS "needs" to change the way it does graphics!
This has always been the case simply because PCs do so much more than play games. Although I'm sure you already know that the Xbox One and its variants run on a modified version of Windows 10.
 
Consoles are usually sold at cost or even at a loss because the profit is in games and services, not the consoles. In addition, all modern consoles use PC parts, or cut-down cheaper versions of PC parts, so development costs are minimal. The whole R&D cost is placed on PCs. So maybe console gamers should pay a tax to PC gamers, since it's PC gamers who make consoles possible at a lower price :)

Also, I wouldn't trust some random vid you can't remember and some specs you can't remember. Bit short on details and checking there. I've just done some very quick checking and you're way out on SSD speed (it's claimed to be up to 2.4GB/s, not 7) and resolution/fps (it's claimed to be 4K at 60fps, not 120). Graphics are handled by the equivalent of a midrange card from the next gen of AMD cards, which should be a fair bit cheaper than a 2080 Ti when it's released. The next gen of consoles will be very high spec for the price, that's for sure.

The main reason I game on PC isn't performance. It's a mouse and keyboard and a mostly open system. For the time being, anyway.

I think the video he was on about was done by LinusTechTips. So pretty credible. It was a 2070 Super though.

The PS5 has an SSD read speed of up to 9GB/s (compressed) over PCIe 4.0. The Xbox Series X is 2.4GB/s raw, or 4.8GB/s when compressed.
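Rough back-of-the-envelope on those numbers (a quick Python sketch; the ~2:1 and ~1.6:1 compression ratios are assumptions inferred from the raw/compressed pairs the vendors have quoted, and the 5.5GB/s raw PS5 figure is Sony's own):

    def effective_gbps(raw_gbps, compression_ratio):
        # effective throughput = raw throughput x assumed compression ratio
        return raw_gbps * compression_ratio

    print(effective_gbps(2.4, 2.0))   # Series X: 2.4GB/s raw -> ~4.8GB/s compressed
    print(effective_gbps(5.5, 1.64))  # PS5: 5.5GB/s raw (Sony's figure) -> ~9GB/s compressed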
 
Oh it just seemed that it was an issue for some.

It works on the X if the devs have implemented it, I believe. Metro Exodus works fine, for example. The bigger issues are that the console version has a different turning speed, aim assist, etc. that's still there even if you are using the KB&M. I played with a XIM4 on the PS4 and it was fairly horrible; the other issue is how to sit comfortably in the lounge using them. The nerdytec Couchmaster seems to be the best option that I've seen.
 
I don't remember the Xbox 360 ever outperforming a good PC. The graphics quality was never anywhere near PC level.

They're mostly games in 720p, right? It's not measuring that; sure, a 1080p game on the PC would look nicer, but this is just looking at performance.
 
They're mostly games in 720p, right? It's not measuring that; sure, a 1080p game on the PC would look nicer, but this is just looking at performance.

But a PC at 720p will perform much better :p

I guess it is an Nvidia graph though. They don't usually reflect reality.

Not really. Consoles are coded for far more efficiently as you only have one hardware set to code for.

That is why the Xbox One X does 4K pretty well for a console. A 980 Ti would struggle to play RDR2 in 4K; it would fall apart.

They don't really do 4K; the screen size is 4K but it's subsampled, I think. It doesn't look as crisp as a PC does at 4K.
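For anyone curious why "not quite native 4K" looks softer, here's the pixel arithmetic, assuming something like checkerboard rendering (a simplified sketch; the exact technique varies per game):

    native_4k = 3840 * 2160           # ~8.3 million pixels shaded per frame at native 4K
    checkerboarded = native_4k // 2   # roughly half are shaded, the rest reconstructed

    print(native_4k, checkerboarded)  # 8294400 vs 4147200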
 
This has always been the case simply because PCs do so much more than play games. Although I'm sure you already know that the Xbox One and its variants run on a modified version of Windows 10.

I always thought the problem is that you are trying to cater for an almost infinite number of hardware combinations, whereas with a console you have a fixed platform to develop for, so you are going to be a lot more efficient at getting the most out of the hardware.
 