PC Gaming Is Dead!

At least I don't have to buy a new graphics card, mobo, RAM and so on just to play a game that the 360 can play anyway. I even know three PC gamers who have moved to the dark side. I think that explains a lot, if you ask me.

Valid point. I've recently bought a new PC which will play every game out at the moment to a high standard (with the exception of Crysis, which I don't own), and I feel it will be my last "gaming" PC. In future, playing new releases will mean buying a new graphics card at £200 or more, then eventually a new mobo and so on; it all adds up with upgrades just to play a game. When you look at it that way, it's easier just to buy a games console with no fuss, just plug in and play.
 
Why would you need a new motherboard? CPU won't be changing socket for a while yet.

I'd wager current quad/dual cores will last until 2010; the only thing that will need upgrading to play games on HIGH detail will be the graphics card.

People forget you don't have to play games on the highest settings, you can buy cheaper cards and play at lower ones, y'know.
 
I'm just speaking from when AGP went out and PCI-E came in. I know what you're saying about just playing on a lower spec, but it's just not the same, and on the PS3 etc. it's at the highest spec you can get straight away.
 
See, discussion isn't too bad. :)

As Jihad says, you don't need to play games at Crysis quality, which BF: Heroes may or may not prove when it comes out.

Also you'll need to upgrade your PC at some point, even if it’s just for new operating systems or to run your programs well.

I think the major problem PC gaming has is that so many people misunderstand it, especially the gaming press. And because of this, they misrepresent a lot of it to their readers. You always hear the horror stories about having to upgrade drivers and so on. But is it really that hard? It's just the same as installing a program. And with Windows Vista, you never have to worry about drivers again if you don't want to.
It's the same story again with hardware. But if you bought an 8800GTX at launch, you've had a card that will play any game at the highest levels for well over a year, and it's still going strong. So it's hardly upgrading every month.
 
I'm just speaking from when AGP went out and PCI-E came in. I know what you're saying about just playing on a lower spec, but it's just not the same, and on the PS3 etc. it's at the highest spec you can get straight away.

You're right, it's not the same.
On the PC you have the option to stay at the cutting edge, which moves fast,
while on the console you get what you're given & it ain't going to get any better from the day it hits the shelves until a new one comes out 5 years or so later.
 
You're right, it's not the same.
On the PC you have the option to stay at the cutting edge, which moves fast,
while on the console you get what you're given & it ain't going to get any better from the day it hits the shelves until a new one comes out 5 years or so later.

But that's not true. Compare the launch PS2 games to stuff like God of War 2, or launch 360 games to stuff like Assassin's Creed. And with firmware updates for both next-gen consoles, even the consoles themselves don't stay the same.

See, discussion isn't too bad. :)

I think the major problem PC gaming has is that so many people misunderstand it, especially the gaming press. And because of this, they misrepresent a lot of it to their readers. You always hear the horror stories about having to upgrade drivers and so on. But is it really that hard? It's just the same as installing a program. And with Windows Vista, you never have to worry about drivers again if you don't want to.

I think the problem is that console users don't have to install games either (with the exception of DMC 4 on PS3 I believe) and game patching is done automatically when an update is available. Does your average casual gamer keep an eye on gaming sites to see when the latest patches for their games are available?
 
But that's not true. Compare the launch PS2 games to stuff like God of War 2, or launch 360 games to stuff like Assassin's Creed. And with firmware updates for both next-gen consoles, even the consoles themselves don't stay the same.



I think the problem is that console users don't have to install games either (with the exception of DMC 4 on PS3 I believe) and game patching is done automatically when an update is available. Does your average casual gamer keep an eye on gaming sites to see when the latest patches for their games are available?

I'm talking about hardware & not software.
And to add, the console updates don't really make any noticeable difference to gaming performance.
 
Why would you need a new motherboard? CPU won't be changing socket for a while yet.

I'd wager current quad/dual cores will last until 2010; the only thing that will need upgrading to play games on HIGH detail will be the graphics card.

People forget you don't have to play games on the highest settings, you can buy cheaper cards and play at lower ones, y'know.

Well, technology is always advancing with PCs and the mobo is no exception. You can't help feeling bummed when you have just forked out a few hundred £££ on a good mobo & RAM, only to have the whole lot out of date just 6 months later, with faster RAM and new mobos that make yours look inferior.

As for playing at lower resolutions on cheaper hardware, I don't know if it's just me, but I can't settle for anything that will be less than the 360. With me it's this whole "my PC has better graphics than the 360" thing that is keeping me hooked on PC gaming right now. This isn't the sole reason obviously, but when it comes to upgrading my rig so the next high-tech game looks better than the 360 or the next-gen console version, the upgrading starts to get tiring, and that's when the mind starts to wander to the dark side.

It's a fact that PC gaming is more expensive than console gaming. You could argue that you have to buy a new console every X amount of years, but it's still cheaper and more reliable than PC gaming, and without the cheating ******* too! It's also a ball ache getting these games to play with no issues whatsoever on the PC. It's nice to have the best technology available, but it costs a No Swearing! load to get there in PC gaming. With the 360 you just accept it because everyone else is at the same lower level.
 
I think the problem is that console users don't have to install games either (with the exception of DMC 4 on PS3 I believe) and game patching is done automatically when an update is available. Does your average casual gamer keep an eye on gaming sites to see when the latest patches for their games are available?

Exactly, and not only that. Take my recent problem with a game (GoW): I couldn't even get the game to start, installed the newest patch, still no luck, so I had to download a CD crack to get the game working despite it being a legitimate copy. Then when I finally got it working, there's this savegame problem where I lost all my saved games and had to start again. Now, to get back to where I was, I have to amend files... All of this fuss just to PLAY the game, whereas if I had a 360, none of these problems would exist, and if they did it wouldn't be long before they were automatically fixed.

To me it seems like the developers aren't really focusing on major issues with PC games as much anymore, patches etc.
 
I'm talking about hardware & not software.
And to add, the console updates don't really make any noticeable difference to gaming performance.

But surely hardware doesn't matter; it's functionality that matters. The point I was making is that games get better over a console's life as developers learn to use its full potential. So the hardware doesn't need to change to see more advanced games (see the examples I made). Firmware I just added to show that the functionality of consoles can change over time and that their use is no longer fixed as in the past.
 
But that's not true. Compare the launch PS2 games to stuff like God of War 2, or launch 360 games to stuff like Assassin's Creed. And with firmware updates for both next-gen consoles, even the consoles themselves don't stay the same.



I think the problem is that console users don't have to install games either (with the exception of DMC 4 on PS3 I believe) and game patching is done automatically when an update is available. Does your average casual gamer keep an eye on gaming sites to see when the latest patches for their games are available?

True, but if Valve got their way, that would also be true for PC gaming. And we still don't know what the gaming alliance is going to be about.

Microsoft won't give up the living room, that's for sure. So it will either be an Xbox 3 in every living room or a PC. Either way they want it.
 
True, but if Valve got their way, that would also be true for PC gaming. And we still don't know what the gaming alliance is going to be about.

True, I do like Steam a lot (although I do prefer a physical copy of a game myself). But the problem with PC is there won't be a unified standard; there will always be games that aren't part of it. I thought Games for Windows was a big step towards a set of bullet points that games would fulfill (like widescreen support).
 
But surely hardware doesn't matter; it's functionality that matters. The point I was making is that games get better over a console's life as developers learn to use its full potential. So the hardware doesn't need to change to see more advanced games (see the examples I made). Firmware I just added to show that the functionality of consoles can change over time and that their use is no longer fixed as in the past.

What gets added to a console over its life is insignificant compared to what a PC has to offer over the same period.

I have built two PCs with a 2.6 Opteron 939 and 4 gigs of RAM, one with a 1900XT & one with an 1800XT, and both customers are playing the latest games smoothly at 1440x900, a higher res than the default screen res of the consoles, and the games look better and they can choose what server they want and play online for free.
The problem is that a lot of PC gamers don't want to play at console res & quality settings.

If the consoles gave the option of 16xAA and 16x AF, then you would see moaning about having to play at low quality settings all the time.
 
Well, technology is always advancing with PCs and the mobo is no exception. You can't help feeling bummed when you have just forked out a few hundred £££ on a good mobo & RAM, only to have the whole lot out of date just 6 months later, with faster RAM and new mobos that make yours look inferior.

Quite honestly that's your own fault; don't upgrade when something faster is incoming, unless you know you can't afford the faster option down the line.

Also, 6 months later? What are you on... mobos last for ages, as do CPUs; only graphics cards change rapidly, and by rapidly it's usually over 6 months. The 8800GTX is 14 months old and still one of the fastest cards out, enough said. ;)
 
IMO (cba reading all the other threads), gaming is not dead, never will be, but it's on a massive holiday. NO games, I mean NO games released in the last year have kept me concentrated for more than 30 mins. :( Rather frustrating for me.
 
IMO (cba reading all the other threads), gaming is not dead, never will be, but it's on a massive holiday. NO games, I mean NO games released in the last year have kept me concentrated for more than 30 mins. :( Rather frustrating for me.

Should have read the thread really, or at least the OP.
 
Dead? No. Lacking? Yes. The PC hasn't had an A+++ title in donkey's years.

lol @ the Valve fannies.
 
Saying PC gaming is dead is a bit subjective though, innit. I mean it really depends on what you want to play. I'm still playing games like Sim City 4, Live For Speed, Settlers 2, GTR2 and WoW, and as long as those games are around and my 3-year-old PC can still play them I'll not be upgrading, and PC gaming will never be dead for me.
 