Graphics card better than 360?

Okay, so I'm getting a new Shuttle computer in August, and I'm wondering: which graphics card from Nvidia or ATI would give better graphics than the Xbox 360?

I will be using Vista Home Premium, a 2.13GHz C2D processor and 2GB RAM.

The Shuttle I'm getting accepts ATI CrossFire. I would like the cheapest graphics card possible that would give me the same graphics as the Xbox or better.
 
Isn't the 360 a G70?

In which case, wouldn't the Nvidia 8800/8600 (the 8600 to an extent) and the HD 2900 XT be faster?

(Don't take any of the above as gospel)
 
Sorry to hijack your topic, but would you say there's anything out there that compares to the PS3?
 
As an owner of a PS3 and an X360, so far I honestly can't say my X1950 XT is better than those consoles. Your best bet would be an 8800, I guess.
 
Lonz said:
Sorry to hijack your topic, but would you say there's anything out there that compares to the PS3?

The PS3 uses a GeForce 7900, and the 360 uses an R500, which outperforms the 7900 in the PS3...

If you want comparable performance to a 360 you should probably get a 320MB GTS (dunno for sure though :p).
 
SebC said:
As an owner of a PS3 and an X360, so far I honestly can't say my X1950 XT is better than those consoles. Your best bet would be an 8800, I guess.

The reason is blindingly obvious: they are consoles, so the games are optimised for that console's exact specification, and developers don't have to cater for a variety of hardware because there isn't any.
With PCs there is a huge variety of hardware, operating systems, drivers etc. that needs to be catered for, which reduces the level of optimisation that can be done.

Taking graphics cores out of the equation, the CPUs in the Xbox and PS3 are again highly focused cores, and very fast too. :)
 
Yeah, the Xbox 360 uses an R500-based (X1800/X1900) GPU called Xenos, and the PlayStation 3 actually uses a less advanced G70-based (7800 series) GPU called RSX. So pretty much anything on the market today above an X1950 Pro is easily equal to or better than this generation of consoles.
 
LoadsaMoney said:
The 360 is in between the X1800 XT and the X1900 XT. :)


Have you seen Gears of War running on the 360 at 1920x1080? No way would an X1900 handle that; it's more on the level of an 8800 Ultra.

And as time goes on, the 360 will have more life in it than any graphics card for the PC.
 
Cyber-Mav said:
Have you seen Gears of War running on the 360 at 1920x1080? No way would an X1900 handle that; it's more on the level of an 8800 Ultra.

And as time goes on, the 360 will have more life in it than any graphics card for the PC.

Yeah, he's right. The X1900 series is probably similar in theoretical performance to the R500. The 8800 series should absolutely stamp all over that and the RSX, and so should the HD 2900 XT. But, and it's a bloody big but, PC titles cannot be properly optimised, due to the different ways ATi and NVIDIA graphics processors do things and the other varied hardware in PCs. If you could optimise Gears of War in the same way for the PC, for just the NVIDIA 8800 series for example, the performance would be ENORMOUS compared to the Xbox 360 version; it would just trample all over the consoles. DirectX 10 and unified shaders are, I believe, supposed to help solve the optimisation problem on PCs by making everything more 'standard'. At least I think that's the overall intention of DX10, and how something like Crysis can possibly run on a single 8800GTX and still attain a decent frame rate. :)
 
Cyber-Mav said:
Have you seen Gears of War running on the 360 at 1920x1080? No way would an X1900 handle that; it's more on the level of an 8800 Ultra.

And as time goes on, the 360 will have more life in it than any graphics card for the PC.


Because they're programming for one spec. If the PC were a fixed platform like the 360 or PS3, it would be relatively easy to squeeze every last ounce of performance out of it and not have to worry about lower-end hardware, dodgy drivers or whatever.
 
Cyber-Mav said:
Have you seen Gears of War running on the 360 at 1920x1080? No way would an X1900 handle that; it's more on the level of an 8800 Ultra.

And as time goes on, the 360 will have more life in it than any graphics card for the PC.


This is partially down to the programmable shaders on the GPU, which basically make it highly optimisable for games and let developers get every last drop out of the core.

But it's also down to the timeframe. In the year of a console's release, basically all gaming companies focus their staff on launch titles and on getting a large number of games available in the first year. Three consoles have launched, and in every console launch year we get next to nothing in the way of decent PC games. But half of this year's upcoming huge titles are 3-5 years in the making, a few based on the UT3 engine.

The UT3 engine is another massive example: it's a PC born-and-bred engine that, because of the console year, was massively pushed for console games, which is why GoW came out. GoW itself will come to PC, sounds like next year (make a title exclusive to a console and get more money; there are normally time limits on the exclusivity). Right now some people still buy 360s just for GoW, which, if it were available on the PC, would mean no money for M$ as they don't get a commission per title just because of Windows. The game should run quite a bit nicer on the PC, and so will all UT3 games. The reason: while GoW is utterly fantastic up close and personal, it has very little depth to it; image quality at a distance is a huge step down from what it is close up.

That game with more memory available to it, higher res and high AF/AA settings... damn.

Anyway, we're getting to the point where we are fairly close to the "next gen" engines for the PC. HL2 isn't a yearly thing, it's a long-term engine; Far Cry, UT3, Quake, they all take a while for a refresh to come out, and PCs have suffered hugely in the "console" year.

I think from here on in, or in a couple of months, the big engine games will start to come back to PCs, and in doing so the graphics quality crown will be back on the PC side.

The problem is, an 8800 GTX can run HL2 looking just as nice as an X1800, probably even an X800 XT, just faster. AA/AF settings aside, you need NEW games to push new hardware, and the PC has had nothing new for ages.
 
Gears of War does indeed look fantastic, but if you analyse the screenshots carefully you'll see why. The actual geometry and textures are fairly basic; even in downsized screenshots you can see plenty of low-res textures and polygonal surfaces.

What makes it look so good is the pixel shading and post-processing effects; they're practically everywhere and completely dominate every aspect of the game. This is similar to how Doom 3 used normal mapping and real-time shadows when it first came on the scene: take them off and it looked like Quake 2!
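To show what normal mapping actually buys you, here's a minimal Python sketch (my own illustration, not code from Doom 3 or any real engine): basic diffuse lighting is just a clamped dot product between a surface normal and the light direction, and a normal map swaps the single flat geometric normal for a perturbed per-pixel one, so a flat polygon shades as if it had bumps and creases it doesn't really have.

```python
# Minimal sketch: why a normal map adds apparent detail without extra geometry.
# All values here are made up for illustration.

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of surface normal and light direction."""
    n = normalize(normal)
    l = normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.3, 0.8, 0.5)

# Flat surface: every pixel shares the same geometric normal -> uniform shading.
flat_normal = (0.0, 0.0, 1.0)
print("flat surface:", round(lambert(flat_normal, light), 3))

# "Normal mapped" surface: each pixel gets its own perturbed normal sampled
# from a texture, so shading varies across the polygon and fakes surface detail.
sampled_normals = [(0.2, 0.1, 0.97), (-0.3, 0.0, 0.95), (0.0, -0.25, 0.97)]
for i, n in enumerate(sampled_normals):
    print(f"mapped pixel {i}:", round(lambert(n, light), 3))
```

The geometry never changes; only the per-pixel normals do, which is why stripping those effects out makes the underlying models look so plain.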

You just have to look at the screenshots close up on your PC monitor (which is how most people play PC games) and you quickly see how the Epic team managed to pull off such amazing graphics, and why most mid-to-high-end computers could handle it easily. Also, some of the screenshots look as if they have been upscaled from a lower resolution. :confused:
 
Bingo! As titaniumx3 said, you'll find that console games in general have much less graphical fidelity.

Look at Oblivion as a prime example. The Xbox 360 version has a lower draw distance, absolutely zero anisotropic filtering, generally lower-quality textures, etc.

Even though it's much easier to optimise code for console hardware, they still have to take huge shortcuts.
 
Ulfhedjinn said:
So pretty much anything on the market today above an X1950 Pro is easily equal to or better than this generation of consoles.

GPU specification-wise, yeah, but don't forget the PS3 was originally designed not to have a GPU at all and just run everything through the eight cores; the card was added as a bit of a boost. Which means it all comes down to programming and how well a developer tailors a game for each setup.


Ulfhedjinn said:
you'll find that console games in general have much less graphical fidelity.

Look at Oblivion as a prime example. The Xbox 360 version has a lower draw distance, absolutely zero anisotropic filtering, generally lower-quality textures, etc.

That's really only down to lazy/rushed development; Oblivion for the PS3 had better textures and LODs until Beth patched the PC version to include the PS3 shaders etc.


I don't want this to turn into a PC vs console war, but the fact is that they can't be compared that easily.
 
dalin80 said:
I don't want this to turn into a PC vs console war, but the fact is that they can't be compared that easily.
I'd agree if consoles weren't trying to be PCs these days, but unfortunately I am going to compare a G70 GPU with a G70 GPU and an R500 GPU with an R500 GPU. Game, set, match in favour of the PC.
 