nVidia GT300 - GeForce GTX 380 yields are sub-30%

The 8800GTX is a different card (G80); the G92 cards (8800GT, etc.) are a newer revision and hence fair game...

The 8800GT, while the same die as the G92 8800GTS, is a different card as well (fewer SPs)...

So really it went 8800GTS (G92) -> 9800GTS -> 9800GTX+ -> GTS250. But strictly the GTX+ is a new, smaller core revision too - it's only the renaming of the 9800GTX+ to the GTS250 that I have big issues with, as the real 200 series cards have several benefits over the G9x cards, especially in quality and color processing.

While there's quite a gap performance-wise between a 285GTX and an 8800GTX nowadays, the 8800GTX is still a pretty decent card and can still hang with its bigger brother at lower resolutions/less FSAA. People who bought the 8800GTX originally, even though they paid quite a bit for it, got a pretty decent bargain really - probably the longest useful lifespan of any card to date.

Not really got a lot to say apart from 'I agree with this post'. I did think the 9800 GTX+'s name was silly though; the difference between it and the 9800 GTX was far greater than that between the 9800 GTX and the 8800 GTS.
 
An nvidia bashing thread where the atibots can scream around like little girls, disguised as a serious thread? Nothing to see here people, move on.
 
An nvidia bashing thread where the atibots can scream around like little girls, disguised as a serious thread? Nothing to see here people, move on.

You amuse me. You're quite willing to take meaningless jabs at ATi ("the way it's not meant to be played" :rolleyes:), but when somebody posts a news story against Nvidia suddenly it's 'atibots screaming around like little girls'. :D
 
I don't think this is a case of nvidia hate as some are calling it, more a concern that if nvidia don't evolve their products soon, there might be less competition in the gfx card market, which isn't any good for any of us. I'm not sure nvidia, as big as they are, can withstand another fiasco like the G2xx series. Yes, they are good cards and good performers, but let's not pretend nvidia aren't losing money with them, because they are, having to lower the price to maintain their place in the market. In an ideal world I want there to be a strong nvidia, a strong ati and intel as well, because that way me and everyone else gets the most for their money. Seriously, fanboys on either side completely miss the point, and if they got their way we would all be worse off for it.
 
An nvidia bashing thread where the atibots can scream around like little girls, disguised as a serious thread? Nothing to see here people, move on.
They really need to introduce a vote-to-ban button on these forums for Duran's posts.

If I wanted to see ****, I'd take some laxatives.

This has nothing to do with ATi versus nVidia.
 
Well, if there was a time for nVidia bashing this would be it...

They backed themselves into a corner trying to produce the next "8800GTX" style monster, and it's looking more like the Voodoo 6.

I'm still hoping they manage to get it together though - in theory the GT300 could seriously move things forward with massive amounts of rendering and GPGPU performance, giving access to next generation physics, AI, etc.

Also, those people thinking the focus on GPGPU is a bad thing and that they should concentrate on tessellation, etc. instead might want to look up what's happening with idtech6 - it's looking like the future of graphics could well include a lot of raytracing-style functionality, both for lighting models and world visibility/geometry (sparse octree).
 
I still love my 4870, but I'm itching for something new to play with. And at this rate, I can't see it happening in 2009.

The graphics card market has never been this slow.

I know what you mean, but I can't complain too much - it gives me time to save for the new cards.
 
They don't know yet, they're still renaming it. :p

I lol'd

An nvidia bashing thread where the atibots can scream around like little girls, disguised as a serious thread? Nothing to see here people, move on.

To be fair, nvidia have produced a less than satisfactory result, but it's early days. As for the GlobalFoundries move, I think it's a good one. If it cuts the cost of nvidia cards and they then compete on price/performance in games rather than extra features, then I'll be happy. Yes, I own a 4870 (piece of junk died on me and it's only 5 months old) but I'm no fanboi. I would have gone (and wanted to go) for a 260 but it was out of my price range at the time. /shrug
 
Do they make blenders that large?

I agree the graphics market is slow, but then it's the games that are slow. Everyone seems to have finally calmed down over Crysis, so there's nothing else worthwhile that needs any more power.

I've been playing Overlord II, which I think looks great, and it performs great too. I've not even bothered to put Crossfire on because it runs fine on one 4850 at 1920x1200. There's just nothing that taxes a PC that much, as long as you have a decent PC.

Oooooh.

Bought that today, gonna start it tonight.
 
Oooooh.

Bought that today, gonna start it tonight.

That was using 8xAA too. Well, either it was on, or they've done something in software to deal with jagged edges.

I caught a few jaggies, but only if I looked for them. There are no settings to change to enable AA though; the graphics config for the game is really poor - just res options and 'low, medium, high' for graphical quality.

I haven't had a look for any .ini files though, I've been too busy enjoying the game. :p

EDIT: Whoops, I'm stupid. I completely overlooked the fact that the config has a 'custom' option. :p
 
An nvidia bashing thread where the atibots can scream around like little girls, disguised as a serious thread? Nothing to see here people, move on.

Agreed. I tend to keep an eye on these threads, but only so I know which users to add to my ignore list. Some of the stuff some users come out with is just ridiculous, and the personal attacks are a joke - they shouldn't be getting away with it.
 
Agreed. I tend to keep an eye on these threads, but only so I know which users to add to my ignore list. Some of the stuff some users come out with is just ridiculous, and the personal attacks are a joke - they shouldn't be getting away with it.

You are aware that this sums Duran right up, aren't you?

This has nothing to do with ATi versus nVidia, yet the people who point the finger claiming 'fanboy' are the ones who are making it an ATi versus nVidia thing.

Duran, as much as he likes to point the finger at people being fanboys, goes into threads ranting about how terrible ATi products are.

Especially at the moment, when ATi offer better value for money.

No, it's not people telling the truth, they're 'hating' on nVidia.

To these people, the truth is only the truth when it pleases them.
 
Ignore them Kyle. There's been a massive influx of these types recently, and all the good intelligent posters have mostly gone as a result.

You won't get through to them, and they just want you to react anyway.
 
Ignore them Kyle. There's been a massive influx of these types recently, and all the good intelligent posters have mostly gone as a result.

You won't get through to them, and they just want you to react anyway.

This forum has indeed seen a huge change.
Luckily I'm still in contact, through other means, with a few that have left.
Some of them have opened their own forums.
 
Well, if there was a time for nVidia bashing this would be it...

They backed themselves into a corner trying to produce the next "8800GTX" style monster, and it's looking more like the Voodoo 6.

I'm still hoping they manage to get it together though - in theory the GT300 could seriously move things forward with massive amounts of rendering and GPGPU performance, giving access to next generation physics, AI, etc.

Also, those people thinking the focus on GPGPU is a bad thing and that they should concentrate on tessellation, etc. instead might want to look up what's happening with idtech6 - it's looking like the future of graphics could well include a lot of raytracing-style functionality, both for lighting models and world visibility/geometry (sparse octree).

The thing is, I thought nvidia said, just after the G80 GTX launch, that big chips were coming to an end and they would have to go down the same route as ati did for their 4*** series: smaller chips that can scale well, maybe using multi-GPU instead of one big chip.

Trouble is, how useful is GPGPU for games? And, like ati tessellation on the older cards, by the time raytracing functionality comes about (we might see it in the Xbox 720, and then MS might push it) there will be better cards out there, and more likely DX12/13 by then.
 
You are aware that this sums Duran right up, aren't you?

This has nothing to do with ATi versus nVidia, yet the people who point the finger claiming 'fanboy' are the ones who are making it an ATi versus nVidia thing.

Duran, as much as he likes to point the finger at people being fanboys, goes into threads ranting about how terrible ATi products are.

Especially at the moment, when ATi offer better value for money.

No, it's not people telling the truth, they're 'hating' on nVidia.

To these people, the truth is only the truth when it pleases them.

+1

I like both ati and nvidia products. Usually, if not always, the ones that claim it's a hate thread, in their own particular and normally aggressive style, are the ones that can't handle others having put forward a constructive and polite argument.
 
The thing is, I thought nvidia said, just after the G80 GTX launch, that big chips were coming to an end and they would have to go down the same route as ati did for their 4*** series: smaller chips that can scale well, maybe using multi-GPU instead of one big chip.

Trouble is, how useful is GPGPU for games? And, like ati tessellation on the older cards, by the time raytracing functionality comes about (we might see it in the Xbox 720, and then MS might push it) there will be better cards out there, and more likely DX12/13 by then.

GPGPU is very useful if developers do start using hardware physics, porting AI code to the GPU, etc.
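To give a rough idea of what I mean by hardware physics on the GPU, here's a tiny CUDA-style sketch - purely illustrative, not from any real engine, and all the names (Particle, integrate, etc.) are made up. One thread per particle:

```cuda
// Illustrative only: one CUDA thread per particle - the kind of
// embarrassingly parallel physics step that maps well onto GPGPU.
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float3 pos, vel; };

__global__ void integrate(Particle* p, int n, float dt, float3 gravity)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Simple explicit Euler step - every particle is independent,
    // so thousands of SPs can chew through them at once.
    p[i].vel.x += gravity.x * dt;
    p[i].vel.y += gravity.y * dt;
    p[i].vel.z += gravity.z * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

int main()
{
    const int n = 1 << 20;                       // a million particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));    // start everything at rest

    float3 g = {0.0f, -9.81f, 0.0f};
    integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f, g);
    cudaDeviceSynchronize();

    printf("stepped %d particles\n", n);
    cudaFree(d_p);
    return 0;
}
```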

I expect that to happen much sooner than developers start using the more advanced DX11 features.

As for their use in raytracing-type functionality, it's hard to say; if id push out tech6 faster then it could be as early as two years - I don't think it will be long after tech 5, as work is ongoing on both.
 
It's got to be some sort of hybrid, because of ati's and nvidia's raster-type GPUs (not sure if that's the right term), but both have talked about a hybrid system.
 
It's got to be some sort of hybrid, because of ati's and nvidia's raster-type GPUs (not sure if that's the right term), but both have talked about a hybrid system.

Not sure if that was in response to what I said...

But what I'm talking about isn't traditional raytracing/lighting - it's raytracing/casting into a sparse voxel octree, using shaders for processing - something that is inherently massively parallel and can get a huge performance boost from SPs rather than the traditional CPU.
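Very roughly, the kind of thing I mean looks like this - a toy CUDA sketch (none of it from id's actual code, all names invented) that marches one ray per pixel through a plain voxel grid; a real implementation would descend a sparse octree to skip empty space, but the point is that every ray is independent:

```cuda
// Toy sketch: one thread per pixel marches a ray through a voxel grid.
// A real renderer would traverse a sparse voxel octree instead of
// stepping a dense grid, but the parallelism is the same - rays don't
// depend on each other, so they spread across the SPs trivially.
#include <cuda_runtime.h>
#include <cstdio>

#define GRID 64      // voxels per axis
#define W 640
#define H 480

__global__ void raycast(const unsigned char* voxels, unsigned char* image)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= W || y >= H) return;

    // Orthographic ray straight down +z from pixel (x, y).
    int vx = x * GRID / W;
    int vy = y * GRID / H;

    unsigned char shade = 0;
    for (int vz = 0; vz < GRID; ++vz) {
        if (voxels[(vz * GRID + vy) * GRID + vx]) {
            shade = (unsigned char)(255 - vz * 255 / GRID); // nearer = brighter
            break;
        }
    }
    image[y * W + x] = shade;
}

int main()
{
    unsigned char *d_vox, *d_img;
    cudaMalloc(&d_vox, GRID * GRID * GRID);
    cudaMalloc(&d_img, W * H);
    cudaMemset(d_vox, 1, GRID * GRID * GRID);   // solid block just for the demo

    dim3 block(16, 16);
    dim3 grid((W + block.x - 1) / block.x, (H + block.y - 1) / block.y);
    raycast<<<grid, block>>>(d_vox, d_img);
    cudaDeviceSynchronize();

    printf("rendered a %dx%d view of the voxel volume\n", W, H);
    cudaFree(d_vox);
    cudaFree(d_img);
    return 0;
}
```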
 