I looked into PS2's player counts and there are a lot of ifs and buts: developer restrictions, specific custom servers for the highest player counts, etc. But if it was stable at more than 128 players, that only makes it more suspicious that COD and BF never brought those counts to PC. I did not mention RT as PC-specific tech because it was in consoles before it got any decent support on PC, effectively making it a console tech from the beginning. You must remember how there weren't even tech demos available for people (except the press) to run on their 2080 Tis for months after launch. Also, consoles are/were not necessarily dependent on progress in PC gaming hardware. That might be true now, but the PS3, if I remember correctly, had quite a different architecture from x86/x64 and was in some ways better than PC hardware for some time after its release (in some scenarios). You could say the PS3 had an Nvidia chip, but the point is it was customized, not a close copy of PC GPU tech.
About the industry maturing: it's difficult to say how much of that is organic and how much is just various deliberate choices by devs/publishers/MS/Sony, etc.
PS2 has a high players-per-instance count because it makes loads of trade-offs versus other multiplayer games, like changing the player-server tick rate depending on your distance from the enemy, and a whole bunch of other things like that. In PS1 you could enable a ping/tick-rate readout on the HUD, and I think the update interval went up to 2 seconds (2000 ms) for idle players or players out of combat. Anyway, the point is that extremely large player counts are possible but require specialist optimization, which is why few devs do it. The resource cost also grows non-linearly (a power law, roughly quadratic): the more players you have, the more players each individual has to be updated about, so you get this highly non-linear growth in CPU and bandwidth that prevents huge player counts unless you do a lot of aggressive optimization. We've had 100-player FPS games going back decades, but they're rare and they make a lot of trade-offs to get there.
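To make the scaling point concrete, here's a rough back-of-the-envelope sketch in Python. Everything in it is an assumption for illustration only (the function names, the 30 Hz "nearby" rate, the 0.5 Hz throttled rate, the 30-player nearby set); it is not PlanetSide's actual netcode. It just shows why raw per-player update cost grows roughly with n² and why distance/activity-based throttling is what makes huge counts feasible.

```python
# Illustrative sketch only -- assumed rates, not real PlanetSide netcode values.

def updates_per_tick(players: int) -> int:
    """Naive model: each of n players is updated about the other n - 1 players."""
    return players * (players - 1)

def updates_per_second(players: int, nearby: int,
                       near_hz: float = 30.0, far_hz: float = 0.5) -> float:
    """Throttled model: only 'nearby' players get the high rate, the rest are updated rarely."""
    far = players - 1 - nearby
    return players * (nearby * near_hz + far * far_hz)

for n in (64, 128, 256, 1000):
    naive = updates_per_tick(n) * 30               # everyone updated at 30 Hz
    throttled = updates_per_second(n, nearby=30)   # only ~30 nearby players at 30 Hz
    print(f"{n:>5} players: naive {naive:>12,.0f}/s  throttled {throttled:>12,.0f}/s")
```

At 1000 players the naive model needs roughly 30 million updates per second, while the throttled version needs under 1.5 million, which is the kind of gap that distance-based tick rates are there to close.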
Advancement in RT is PC tech. Nvidia pioneered it with Microsoft, introducing RT into the DirectX API via DXR (DirectX Raytracing), and AMD played catch-up to implement it in their PC hardware from LAST gen, and that last-gen PC hardware is the basis for the new console hardware. Which is why the consoles are so bad at RT, and why any kind of RT implementation requires a lot of "optimization", or in plain English, lowering quality to get it to work.
Each console lifecycle is about 6-8 years; each PC GPU cycle is about 2 years. So from that alone you know the cycle of buying hardware and re-investing back into R&D for graphics is happening on PC. When the consoles launch, they approach Nvidia and AMD to ask what they can provide, and it's always a slightly customized version of their LAST gen hardware. The consoles aren't spending loads of money to provide next-gen graphics; they're borrowing from old PC tech. If the consoles wanted to outpace the PC and provide brand-new, improved visuals unique to them, they'd need to charge massive premiums on the console hardware to cover the R&D required to outpace Nvidia and AMD. But their target audience is budget gamers, so they have to stick with copying old, last-gen hardware.
That's fine, I don't have a problem with it, and most casual gamers don't care, but let's at least be honest about it.