20 yrs a PC gamer. Is it time to switch?

They also introduced an option where you can choose to only be matched with people using the same input type.

Which reinforces why it's never been acceptable to use a KB+M on console.

I own a gaming PC and have used both extensively, and I don't want to see them making it compatible with any FPS games on a console. Other games are fine. If you are going to put it into FPS, then separate KB+M matchmaking should be mandatory, not opt-in. Allowing them to play against controller users is like letting Formula 1 drivers use bigger engines or lighter cars: it gives you an advantage over a controller. That doesn't mean you will be a god if you use KB+M; game skill is far more important, so better players will be able to exploit that advantage more and make it bigger, while those who are crap will barely see any improvement.
Correct, the option. It's nice having options, isn't it?

Surely no different than having the option of using a wheel in a racing game.
 
Personally I'd say the comparison is a bit like comparing apples to oranges. I don't think they are quite the same thing (same for fight sticks too). A KB+M can give a massive competitive advantage over a joypad, whereas a racing wheel's advantage is much slighter. Of course there will be joypad users and wheel users who can hold their own.

I think for single-player/PVE/solo games there should be no restrictions on control schemes, so whatever the user wants to use. And consoles allowing KB+M is great. However, I agree with @Psycho Sonny: for multiplayer games like Modern Warfare, matchmaking KB&M players into their own pools is great and the right thing to do. A remaining problem on console is Cronus and XIM devices that allow people to 'silently' use KB&M.
 

Which is why I said the options are nice. Players can choose to use KB/M, and players can choose not to play against people who use KB/M.
 
Absolutely. But it's not an option for the KB&M player to choose to go into the same matchmaking as the joypad users. That's the important bit.
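(Purely as an aside, and not anything from the actual game: below is a minimal sketch of how that kind of 'same input type only' pool filter could work in principle. The Player fields and the eligible_opponents function are entirely my own invention, just to illustrate the idea being discussed.)

# Hypothetical illustration only -- not taken from Modern Warfare or any real
# console SDK; all names here are invented for the example.
from dataclasses import dataclass
from typing import List


@dataclass
class Player:
    name: str
    input_type: str        # e.g. "controller" or "kbm"
    same_input_only: bool  # True if the player opted into same-input matchmaking


def eligible_opponents(player: Player, pool: List[Player]) -> List[Player]:
    """Return the players this player could be matched against,
    honouring a 'same input type only' preference on either side."""
    candidates = []
    for other in pool:
        if other is player:
            continue
        # If either side opted in, only identical input types may be matched.
        if (player.same_input_only or other.same_input_only) \
                and player.input_type != other.input_type:
            continue
        candidates.append(other)
    return candidates


if __name__ == "__main__":
    pool = [
        Player("PadOptIn", "controller", True),
        Player("MouseUser", "kbm", False),
        Player("PadAnyone", "controller", False),
    ]
    # The opted-in pad player only sees the other pad player, not the KB&M user.
    print([p.name for p in eligible_opponents(pool[0], pool)])  # ['PadAnyone']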
 
Personally I do more gaming on my consoles (I've got both the latest Sony/Microsoft consoles).

As a former Amiga guy who only came to PCs because there wasn't any other option, I've never liked the PC for gaming.

Some games are good on PC, like MMOs and indie titles. But I think the big games are a better experience on console, especially for ease of use.

Though I'll balance my comments up and say PC gaming is a lot better these days than it was when I came to PC. We don't have to use command lines to set up each individual game like we did back in the day.
 
Perfect example: The Witcher 3 came out on PC in May 2015. The PS4 wasn't even out yet at that point; it was still around six months away. The base PS4 version of The Witcher 3 had to be toned back to maintain framerate despite the PS4 hardware coming out after the game on PC; the PC could run the game at higher settings on release with the right hardware, several months before the PS4 landed, and the PS3 couldn't even dare to try running it.

Sorry late on this, but I think that's a bad example.
  • The Witcher 3 came out in May 2015 for Windows, Xbox One and PS4. Content wise it was/is exactly the same game.
  • It was patched for PS4 Pro (HDR, 4K Checkerboarding, Adaptive V-Sync, LOD/shadows and other improvements) and Xbox One X (Native 4K, HDR, 60fps @1080p mode, LOD/Shadow and other improvements).
  • Then released for Nintendo Switch in 2019 with cross-progression/cloud save with the PC version.
CD Projekt Red have shown why they are so loved, given they have continued to work on their game. It also highlights that mid-way through the console generation we got significantly enhanced consoles, which then improved existing games. This will continue with games on PS5/XSX, e.g. AI-driven auto HDR applied to all Xbox One games, improved frame-rates, AA etc. This console generation has shown that console games aren't always standing still, and therefore improvement in the performance of games on consoles is being seen too.

The PC version of The Witcher 3 is still graphically the best version and has mod support, but there was no 'delay' in the console version. It was very much a multi-platform game. I definitely agree with many of your other points though, and love PC gaming, although I'm not sure the 'spec' gap is going to matter as much moving forward. But we will see.
 
Don't forget that with console gaming you're pretty much stuck with a TV or 16:9 monitor. I love my ultrawide as it's so immersive, and with 144Hz it's a dream to play games on. Yes, the consoles are specified to run 120Hz, but they will never reach those heights. HDR is a plus for consoles, but monitors are catching up.
 
I think we'll see eSports games and some indie stuff reach those 120Hz heights, a bit like how many of the online MP/BR games currently support 60fps on PS4/XBO. And again, if we see an 'XBX X' or PS5 Pro then there's another potential jump for games at 120Hz or more. So equally, whilst PC clearly wins in this area at the moment, consoles are (or will be) catching up too.
 
I admit some MP games may go up to 60fps, but I can't see them getting any higher. There is no need or desire for developers to go for 120fps. Most games will be 30fps and some 60fps. The Xbox One X and PS4 Pro enhanced games so far have really just been resolution bumps rather than anything else. Developers will use the power of the new consoles for eye candy rather than frame rate.
 
Sorry late on this, but I think that's a bad example.
  • The Witcher 3 came out in May 2015 for Windows, Xbox One and PS4. Content wise it was/is exactly the same game.
  • It was patched for PS4 Pro (HDR, 4K Checkerboarding, Adaptive V-Sync, LOD/shadows and other improvements) and Xbox One X (Native 4K, HDR, 60fps @1080p mode, LOD/Shadow and other improvements).
  • Then released for Nintendo Switch in 2019 with cross-progression/cloud save with the PC version.

Apologies, I got some dates wrong; it appears I either misread or had a brainfart when referencing. It's still a clear-cut example though: on release, The Witcher 3 could be maxed out (albeit with moderate framerates) on powerful PCs. The PS4 version, by contrast, just a year and a half after the console's launch, was already having to settle for compromised graphical and LOD settings which were noticeably poorer than an average gaming PC of the time; go back and look at release-date PS4/Xbox One/PC comparisons. The PC wipes the floor despite this not being that far into the console life cycle.

The title was then optimised later on for PS4 Pro and Xbox One X, as developer work on more powerful hardware, but it doesn't take away from the fact that those were mid-gen console updates, nor are they always guaranteed. In fact the quoted 1080p@60fps mode for the uprated system says it all: PC gamers have targeted 60+ fps at 1080p for many years now, something that just isn't possible on the base PS4/Xbox One without compromising the graphics; the hardware isn't strong enough. The PC is far more versatile, and even a year and a half into the PS4's life it was outshining it significantly. There is also a lot of talk that the game was TONED DOWN to ensure it would run properly on the consoles. Look at the early E3 Witcher 3 demos, or even Watch Dogs: games that were compromised to ensure the console experience was reasonable, because the baseline PS4/Xbox One aren't THAT good. They did OK against gaming PCs on release, but the console always starts out relatively strong and then gets outpaced. Always.

These mid-gen hardware revisions are also a relatively new thing; it didn't happen with older gens. The very fact it did with this gen should tell you how relatively anaemic the consoles were at launch and how much faster the PC was moving. It was a mistake, IMO, on the part of MS and Sony not to attempt this with the 360/PS3 despite how long that gen lasted, which says something either way.
 
I admit some MP games may go up to 60fps but I can't see them getting any higher. There is no need or desire for developers to go for 120fps.

Maybe. I certainly think 30/60fps will be more common, although for some games, like Rocket League, I think 120Hz will become a thing. Stuff like COD has been 60fps forever on console, and I think there will be some developers who want to push the consoles.

It's still a clear-cut example though: on release, The Witcher 3 could be maxed out (albeit with moderate framerates) on powerful PCs. The PS4 version, by contrast, just a year and a half after the console's launch, was already having to settle for compromised settings which were noticeably poorer than an average gaming PC of the time.

No worries, felt a bit detailed bringing the point up :)

Perhaps, but the game did get an upgrade when the PS4 Pro/One X launched, and it didn't look bad on any platform at its own launch.

And there are different types of upgrades: all the work that MS did on previous games (e.g. KOTOR now running at 4K with 16x AA/AF, without any developer work); XSX applying machine-learning auto HDR to all previous games with no developer work required; or all the games that will suddenly run at increased frame-rates (a good example is the Xbox 360 version of BioShock, which runs at 60fps in non-V-Sync mode on Xbox One X). Sony have improvements planned for PS4 games that won't require the developer either. A similar effect to plugging in a new GPU, perhaps. We are seeing games and services improve on consoles though.
 
Oh definitely, it didn't look bad on any machine, and I LOVE the Witcher games, back to Witcher 1. But factor in the heavy suggestions that the game was downgraded visually to allow the consoles to run it (again, see the E3 demos), and it backs up the point that consoles always get outpaced. The consoles were brand new when Witcher 3 was being worked on and finished; it only came out a year and a bit into the lifecycle. That they were already having to downgrade relative to an average gaming PC says something, I think, about the rate at which PCs evolve.
Console development gets optimised and refined, but it'll never keep up with PC's evolution :)

The PS4 Pro/Xbox One X came out in late 2016 and 2017 from memory, so yes, The Witcher got patches at that point to bring it to a higher level, but things like the 60fps at 1080p mode were just bringing it up to a level PC gamers had already experienced 1.5 to 2.5 years earlier.

I'd argue the work that MS/Sony are now doing is simply trying to make themselves more relevant in the face of this! It never used to happen :) And regardless of whether the developer or the console manufacturer is doing it, SOMEONE at MS/Sony or the devs is having to spend time trying to keep the console versions relevant :)

That is the point I was making. The consoles are great, cheaper ways to get into gaming and they get a lot of support, but they can't keep up with PC, and the increased power under the hood always gives developers more options; and if PS4 vs Xbox One is any example, the same is true between consoles. Many developers stated that, in comparison, the Xbox One was anaemic, and games got compromised in terms of visuals or framerate compared to the stronger baseline PS4.
 
Console development gets optimised and refined, but it'll never keep up with PC's evolution

Yes, and historically definitely so. But we are seeing things that have definitely changed. MS don't care where you play, just that you use Game Pass or Live to do so. And in a 'live services' world I think we are seeing more convergence between PC and consoles, which is going to see some of that power gap close, or at least matter less. Although we might have all moved to new cloud, VR or mobile gaming formats before that happens! :p
 
Yes definitely, but conversely that not caring is perhaps not doing them much good.
PC + Switch/PS4 right now is far more justifiable than PC + Xbox: most of the good games on Xbox are coming to PC, and if you have a decent multipurpose system, there's no huge incentive to buy Xbox hardware. Maybe that is Microsoft's plan, as it were, but I'm not sure it makes a great case for going console-only.

These last few years, I've been more disappointed with my Xbox than any other platform I own; and I own them all. Me and the wife bought an Xbox One expecting decent exclusives like earlier machines, but it's just not held up...

Again, it certainly wouldn't push me towards going console-only, especially with the touted numbers for the next-gen consoles and the typically higher cost of console software, which also adds up over time; if you buy enough games around release, that software saving alone likely covers the hardware cost difference. The PC also does a lot MORE than just game, and a graphics upgrade on a modern machine in a few years to keep up to date will also cost you less than a mid-gen console replacement.
 
Prepare to eat your words. Gears 5 multiplayer will be 120fps and I think Fortnite will probably be updated, too.

https://www.destructoid.com/gears-5-wants-to-hit-120fps-on-xbox-series-x-583564.phtml

Ori and the Will of the Wisps developer has mentioned running the game at 120fps as well.

https://www.windowscentral.com/ori-and-will-wisps-may-run-120-fps-xbox-series-x

I was referring to future games, not games that are already out. Maybe the odd game here and there, but in general games will not go as high as 120Hz. The current gen at the moment can't even hit a stable 60fps in most games, let alone in 4K; the occasional game does, but only when the resolution is dumbed down. Rage 2, for instance, is only 1080p.
 

Yes, I realised that after I posted. But we'll see what happens. :p
 
The simple answer is no, you can't use a mouse and keyboard. If you want to game on console, you need to get used to using a gamepad.

Not entirely true. Most games don't allow it, but take Modern Warfare: it has an option that allows you to play with M+K. And then you could always get a XIM Apex, which, even though it doesn't feel how a mouse does on PC, still allows M+K to be used; it just maps the mouse onto the control sticks.
 
Personally, I think M+K would ruin the console. The whole idea of a console is using a single controller that plays games and navigates the menu system. I don't know why, but it doesn't sit well with me.
 
Yes definitely, but conversely that not caring is perhaps not doing them much good.

Financially their gaming division is doing better than ever, and like Sony and Nintendo, they are benefitting hugely from the rapid increase in digital, subscriptions and live services.

But I think you're right: if you have a gaming PC then there is limited point in having an Xbox.
 