Grand Theft Auto 4

If they do the game any justice on the PC, it will destroy the poor console versions, which were terrible in all departments.
 
Serves you right for getting it on PC if you ask me, as it is quite clearly a console game. Plays perfectly on my 360, by the way.

No it doesn't; it felt terribly stuttery and the controls were terrible.
The frame drops were unbearable on the 360 I played on.

The GTA series will always play better on mouse+kb, imo. And on a PC, at least you can drop the gfx detail to make it run smoother; basically I want a minimum of 35 fps... Anyway, all the GTA games had minor graphical bugs, and I doubt it affects ''all'' users with the gfx cards named in the readme. I think it's just a few of those tested that experienced the problems...


The only worry about GTA IV for me is Windows Live. I couldn't and still can't get it working on Fallout 3, whatever I do, so I'm hoping for more luck with GTA IV; otherwise it'll be a long process getting help from R*... I want multiplayer...
 
For those that don't read more than one post before jumping in :rolleyes:

I think this is just one of those locked-in personal preference discussions.

Gutted for those that shell out, though, only to get crap performance.
 
If they do the game any justice on the PC, it will destroy the poor console versions, which were terrible in all departments.

It's a bit unfair to say they were terrible, because they... weren't. The only thing really affecting them was performance, which is largely a result of it probably being the most ambitious console game so far. The amount of detail they packed in meant it was largely inevitable that performance was going to take a hit. It's not like it was running at 10 FPS the entire time.
 
The amount of detail they packed in meant it was largely inevitable that performance was going to take a hit.

See, that's the thing. On a top-end PC it probably won't take a hit... and it's not R*'s fault that the consoles are outdated hardware.
 
See, that's the thing. On a top-end PC it probably won't take a hit... and it's not R*'s fault that the consoles are outdated hardware.

I disagree with your logic. Consoles are a fixed hardware platform, so the onus is on developers to fit their game to that hardware, not to design a game which is too much for the hardware and then release it anyway.
 
If they have spent the time coding and optimizing the engine well, then I'm positive my X1950 Pro 512 and X2 3800+ will run it fairly adequately at 1680x1050.

Fairly old hardware in comparison to most, but I've run a few modern games at nearly full settings without any issues (without AA, though).

However, if this isn't the case, then like BIA:HH I guess it'll be medium-low settings. :o
 
I disagree with your logic. Consoles are a fixed hardware platform, so the onus is on developers to fit their game to that hardware, not to design a game which is too much for the hardware and then release it anyway.

Why is it on R* to scale down the game, rather than on MS to make the 360 hardware better / upgradeable?

When Crysis came out and my PC at the time could not run it, I did not bitch at Crytek about the game; rather, I figured it was time for an upgrade.
 
Why is it on R* to scale down the game, rather than on MS to make the 360 hardware better / upgradeable?

How can MS (or Sony, for that matter) be expected to make the hardware better for a game that wasn't available at the time the console was in development? They can't predict the future.
 
Why is it on R* to scale down the game, rather than on MS to make the 360 hardware better / upgradeable?

When Crysis came out and my PC at the time could not run it, I did not bitch at Crytek about the game; rather, I figured it was time for an upgrade.

I hope I don't really need to explain why consoles shouldn't be upgradeable (I'm talking about the major components like CPU/GPU/RAM etc., and not a bigger hard drive, before someone steps in with pedantry).

By your logic they could release it for the PS2, have it run at 10 fps, and it would be Sony's fault for not making the PS2 faster or upgradeable.

In fact, some of the latest games HAVE been released on the PS2, e.g. Need for Speed: Undercover, and naturally the devs have reduced the graphics detail etc. in order to keep performance at a decent level. That is how it works on consoles.
 
For those that don't read more than one post before jumping in :rolleyes:


I think this is just one of those locked-in personal preference discussions.

Gutted for those that shell out, though, only to get crap performance.


But if we PC users get crap performance, we can fix it by getting new GPUs and CPUs. Console users are stuck with dated GPUs that bottleneck the whole game; you can't throw money at the problem to make it go away :).
Nope, just during the moments when you most needed a high and stable frame-rate. :p

Run @ less than full detail or upgrade hardware?
 
How can MS (or Sony, for that matter) be expected to make the hardware better for a game that wasn't available at the time the console was in development? They can't predict the future.

No, but they can make the console like the PC platform, where you are free to swap out and add components so that games can improve and evolve rather than staying at the same overall GFX level for 3-5 years.

I hope I don't really need to explain why consoles shouldn't be upgradeable (I'm talking about the major components like CPU/GPU/RAM etc., and not a bigger hard drive, before someone steps in with pedantry).

By your logic they could release it for the PS2, have it run at 10 fps, and it would be Sony's fault for not making the PS2 faster or upgradeable.

In fact, some of the latest games HAVE been released on the PS2, e.g. Need for Speed: Undercover, and naturally the devs have reduced the graphics detail etc. in order to keep performance at a decent level. That is how it works on consoles.

Yes, you do need to explain why it should not be upgradeable. In my eyes, any system that is designed to last several years and be used for things that constantly evolve and improve must be able to handle whatever is out during that time frame.
I can understand laptops, which are NOT gaming machines, having un-upgradeable hardware... However, game technology evolves: as new physics systems are developed and improved they require more processing power, and as new graphical features are added the GPU requires more power. It is ridiculous to expect three-year-old hardware to cope with modern games; imagine trying to run Crysis on the 360...
It is also probable that the 360 will be with us for another 18-24 months, meaning that by the time it is gone it will be as obsolete as a Pentium 2 chip is nowadays.
 
But if we PC users get crap performance, we can fix it by getting new GPUs and CPUs. Console users are stuck with dated GPUs that bottleneck the whole game; you can't throw money at the problem to make it go away :).

I think the point is that developers shouldn't make games that are too ambitious for the hardware in the first place, as far as consoles are concerned. Personally, I don't think they were over-ambitious with GTA IV; performance does dip in places, but I've never seen huge amounts of pop-up or abysmal frame rates. It could have been a bit smoother overall, but I think it showed just what the consoles can do with the so-called "outdated hardware". There are plenty of games which look worse and run worse.
 