An Elder Scrolls: Oblivion thread

Richdog said:
I think you need to do your reading up on graphics cards. :p

6600 series is 128-bit, even the new 7600GT is 128-bit.

Is that so?

Take a look at : http://www.nvidia.com/page/geforce256.html


Key points... "GeForce 256

The World's First GPU
August 31, 1999 marked the introduction of the graphics processing unit (GPU) for the PC industry—the NVIDIA GeForce 256. "

and: Graphics-Core: 256 bit.

:rolleyes:

Edit: I should add that it seems you are perhaps referring to the memory interface, which is not the same as the core. Case in point: we talk about 32-bit and 64-bit desktop CPUs, where the width of the interface to memory is somewhat irrelevant.
 
Last edited:
PeterNem said:
Is that so?

Take a look at : http://www.nvidia.com/page/geforce256.html


Key points... "GeForce 256

The World's First GPU
August 31, 1999 marked the introduction of the graphics processing unit (GPU) for the PC industry—the NVIDIA GeForce 256. "

and: Graphics-Core: 256 bit.

:rolleyes:

*groans* i'd keep those rolleye smileys to yourself mate.

Memory interface... 128-bit. That's the bit we're focusing on as that's the bit that matters.
 
Richdog said:
Memory interface... 128-bit. That's the bit we're focusing on as that's the bit that matters.

See my edit... I realised you were talking about the memory interface, but the way you phrased the first post on the subject really sounded as if you meant the core. When you refer to an overall architecture and a bit depth, we do not just instantly assume you mean the memory bus width.
 
PeterNem said:
See my edit... I realised you were talking about the memory interface, but the way you phrased the first post on the subject really sounded as if you meant the core. When you refer to an overall architecture and a bit depth, we do not just instantly assume you mean the memory bus width.

Sorry, I'm just used to talking about them, I guess, so I sometimes presume others will know the important bits too. When you say a card is 128-bit or 256-bit, it is nearly always the memory interface that is being referred to with GPUs, as that's the significant bit. :)
 
Richdog said:
that's the only bit that is significant.

Not really. As you say, the 6600GT has a 128-bit memory interface and the 6800 <vanilla> has a 256-bit memory interface, but I'd imagine the 6600GT to be faster in most scenarios.

Yes, it is important, but it's not the defining factor of performance.
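For what it's worth, the number both sides are arguing about is just one factor in peak memory bandwidth: bus width (in bytes) times effective memory clock. A quick sketch below makes that concrete; the clock figures are approximate launch specs used purely for illustration, not exact numbers for any particular board:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# Clock figures are approximate/illustrative, not exact board specs.

def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

cards = {
    "6600GT (128-bit)": (128, 1000),  # ~500 MHz GDDR3, double data rate
    "6800 vanilla (256-bit)": (256, 700),  # ~350 MHz DDR, double data rate
}

for name, (width_bits, clock_mhz) in cards.items():
    print(f"{name}: {bandwidth_gb_s(width_bits, clock_mhz):.1f} GB/s")
```

So the wider bus can win on raw bandwidth even at a lower memory clock, which is exactly why bus width matters but, as above, still isn't the defining factor: pipelines, core clock and the rest of the architecture decide how much of that bandwidth actually gets used.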
 
PeterNem said:
Not really. As you say, the 6600GT has a 128-bit memory interface and the 6800 <vanilla> has a 256-bit memory interface, but I'd imagine the 6600GT to be faster in most scenarios.

Yes, it is important, but it's not the defining factor of performance.

*sigh*

You're missing the point I was making, which is that when people refer to a card being 256-bit, they are referring to the memory interface and not the core. The same goes for professional sites and every forum I've ever been on. Of course I understand the memory interface is not the be-all and end-all of performance when it comes to low-end cards (the 7600GT, for example), but then again I never said it was.

I was just using the 256-bit 6800 as an example of a cheap yet decent-ish card.
 
Last edited:
Guys, you're being a bit petty arguing about this, aren't you? It's an Oblivion thread, not a graphics card thread. Ignoring the whole 256/128 part of the debate, the original post this came off as a tangent from was about whether the 6600 is capable of running the game. I'd say it'll run it at low to medium settings, as Matt-Page has a reasonable CPU, but the lack of memory, only 512MB, will also hold the game back greatly, especially with the masses of outdoor scenes.

Edit: PeterNem's beaten me to the 'let's shut up and get back on topic' bit. :p
 
PeterNem said:
Which happens to be worse than what he already has.

Anyway, maybe we should take this to the graphics card forum ;) Let's get this back on topic.

I was thinking more of a cheap unlockable 6800, not a vanilla. And yes, this is going OT, so maybe don't start silly little debates in future when you're not 100% sure about what you're arguing? ;)

-------------------------------

By the way, who here plumped for the collectors edition of oblivion?
 
Matt-Page said:
I am worried it won't run very well now.

I upgraded my RAM to run FEAR which ran fine at a medium spec.

Mate, you should be fine at lower res and detail... they recommend a 6800 series card as a minimum, but yours is roughly equal to a vanilla 6800, so don't fret too much. :)
 
Lol, even mine passed and I have a 6200TC at the moment...

[Attached screenshot: obliviontest.JPG]


Good job my 7900GT is arriving next week. :D

Matt-Page said:
Would you recommend a format and clean install of XP? Will that help?

Shouldn't make much difference tbh. :)
 