
Nvidia GTX 300 series

I feel so, so sorry for people who get around 40 fps :eek::eek::eek: in Crysis on full at the highest resolution. It must be so painful for you to have around 40 frames every second. C'MON!!

I'm on a 1.9 GHz AMD Athlon 64 X2 with the worst graphics card ever (laptop),

and I can't even play F.E.A.R. or Fallout 3 or any of the great games!!!

So all you people who complain about getting 50 fps or 40 fps in Crysis on high: SHUT UP!!

I'm finished :D

Roflmayo!! Well said!

Anyway, can't be bothered waiting till January next year; I'll just CrossFire the next new ATi cards and flog off my 260s on eBay.
 
I feel so, so sorry for people who get around 40 fps :eek::eek::eek: in Crysis on full at the highest resolution. It must be so painful for you to have around 40 frames every second. C'MON!!

I'm on a 1.9 GHz AMD Athlon 64 X2 with the worst graphics card ever (laptop),

and I can't even play F.E.A.R. or Fallout 3 or any of the great games!!!

So all you people who complain about getting 50 fps or 40 fps in Crysis on high: SHUT UP!!

I'm finished :D

Time for an upgrade?
 
lol, by the time Nvidia come out with that, Larrabee will be round the corner, and with Intel pretty much being able to do what they want, it's going to be interesting. I can see the intro before the game starts now: "Intel, the way it's meant to be played!"

I don't know how Intel are going to play it, as Larrabee is going to be a hybrid graphics card using both a CPU and a GPU, with a 1024-bit ring bus (512 bits each way) plus up to 48 cores :s

Plus Intel graphics will support the x86 instruction set with Larrabee extensions, and cache coherency across all cores.


Interesting times over the next year or so. :D

I also heard that Nvidia isn't going to DX11 but to DX10.1, but I may be wrong...
 
Another rant from Charlie:
http://www.semiaccurate.com/2009/07/29/miracles-happen-gt300-tapes-out/

He could well be right, but Intel have only confirmed H1 2010 for Larrabee, and who knows what cores/cycles/cost will be involved?

Their software has a tendency to be very expensive, as anyone who's bought their threading modules can testify.

Then again, I expect they're not getting into this just for laughs.

We've been speculating for a while now, and just recently decided to look at the CUDA route seriously. Worst case: migrate the x86 code to Larrabee if it does what it says; best case: get a jump on serious CUDA libraries before MIMD and GT300.

There are pros and cons to both approaches, but the CUDA route can at least be validated now (sort of), whereas Larrabee is still very speculative.

I suspect Larrabee will have more to offer parallel systems than CUDA, but from the first generation? Very hard to say at the moment.

That said, can NV deliver? It could easily be H2 2010 before they have something worthwhile!

/endrant
 
4870: no. 4890: no. 275: no. 280: no. 285: no. 4870X2 and 295: borderline playable at 1920x1200.

No, I'm not expecting 100 fps. I'm expecting 40-60 fps that feels smooth, and something like the 280 is far, far away from that, as are all the other single-GPU cards. The dual-GPU cards are somewhat better, but the minimum fps is still terrible.

Now, I don't know how well or badly Crysis is coded, but it's nearly two years old and still doesn't run well; there's certainly something wrong with the game or the cards. I remember how Oblivion didn't run well at all, but two years later it was easy to get solid fps numbers in it. With Crysis, that has not happened.
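For what it's worth, the smoothness complaint is easier to see in frame times than in fps. A minimal sketch of the conversion (assuming a steady frame rate, which real minimums of course aren't):

```python
# Convert frames per second to the time budget for a single frame.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given (steady) fps."""
    return 1000.0 / fps

for fps in (30, 40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

At a 40 fps minimum each frame takes 25 ms, versus roughly 16.7 ms at 60 fps, which is why dips to the minimum are what feel choppy even when the average number looks fine.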

Played fine on my 280 at 1920x1200.
 
4870: no. 4890: no. 275: no. 280: no. 285: no. 4870X2 and 295: borderline playable at 1920x1200.

No, I'm not expecting 100 fps. I'm expecting 40-60 fps that feels smooth, and something like the 280 is far, far away from that, as are all the other single-GPU cards. The dual-GPU cards are somewhat better, but the minimum fps is still terrible.

Now, I don't know how well or badly Crysis is coded, but it's nearly two years old and still doesn't run well; there's certainly something wrong with the game or the cards. I remember how Oblivion didn't run well at all, but two years later it was easy to get solid fps numbers in it. With Crysis, that has not happened.


That's true. I just played through Crysis on my TV at 1920x1080 on my GTX 280, in DX10 on High (not Very High) with no AA, and it still had a choppy frame rate in some places, especially the end boss.
 
I can't wait till we get to play Crysis at maxed-out settings with a minimum fps of at least 40 :)

We already can. My uncle has tri-SLI GTX 295s and his minimum fps is 75.
But it feels weird without any lag at all with the graphics on full. "It's simply orgasmic" - the baby from Family Guy (I love games too much :D)
 
Played OK on mine @ 1920x1080 using CCC level 6.

Not entirely sure of the frame rate, but it felt relatively smooth, apart from a very few points where it felt laggy.

920 @ 3.6 GHz w/ GTX 295.
 
That's what a lot of people say. My own experiences, and all the tests various sites have done, tell a different tale. In the past year I've tried the game with a 4870, a GTX 260 and a GTX 280, with an [email protected] and now a Q9550 @ 4 GHz, and it just doesn't run well with any combination. I've tried it in XP, 32-bit Vista, 64-bit Vista and now 64-bit Windows 7. Warhead was even worse, even though they more or less promised it would run better... well, it really didn't.

I ran Crysis perfectly with all top settings on a 4870 when I had one.

That was at 1280x1024 though, as that's all the monitor supported, but I'm sure the card had more to give, given its performance at that res.
 
We already can. My uncle has tri-SLI GTX 295s and his minimum fps is 75.
But it feels weird without any lag at all with the graphics on full. "It's simply orgasmic" - the baby from Family Guy (I love games too much :D)

Someone correct me if I'm wrong, but I thought only four GPUs could be used when gaming,

in Quad SLI.

Any more than that would only benefit other tasks.
 
We already can. My uncle has tri-SLI GTX 295s and his minimum fps is 75.
But it feels weird without any lag at all with the graphics on full. "It's simply orgasmic" - the baby from Family Guy (I love games too much :D)

It's impossible to have three GTX 295s working in tri-SLI, mate.

NVIDIA only supports the following:

Single mode
SLI mode
Tri SLI mode
Quad SLI mode

What you're talking about is penta-SLI mode (five or more GPUs), which doesn't exist (yet).

Single, double, triple and quad all refer to the number of GPUs being used, not the number of cards. Therefore you could have Quad SLI on one card if it had four GPUs.

An NVIDIA GTX 295 has two GPUs, so you can either have:

Single mode (disable SLI and run purely off one GPU)
SLI mode (one graphics card)
Quad SLI mode (two graphics cards)

I think you may be getting confused with three NVIDIA GTX 285s, which appear to be the favourite Tri SLI configuration atm.

Either that or you are adding to the thread's smell of BS ;)
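The GPUs-not-cards rule above can be sketched out in a few lines. This is purely illustrative (the `sli_mode` helper and the per-card GPU counts dictionary are made up for this example; only the mode names and counts come from the post):

```python
# Toy sketch: SLI mode is named by the TOTAL GPU count, not the card count.
GPUS_PER_CARD = {"GTX 285": 1, "GTX 295": 2}

def sli_mode(card: str, num_cards: int) -> str:
    """Name the SLI mode implied by a card count; GPUs, not cards, decide."""
    total_gpus = GPUS_PER_CARD[card] * num_cards
    names = {1: "Single", 2: "SLI", 3: "Tri SLI", 4: "Quad SLI"}
    if total_gpus not in names:
        return f"unsupported ({total_gpus} GPUs)"
    return names[total_gpus]

print(sli_mode("GTX 295", 2))  # Quad SLI: 2 cards x 2 GPUs = 4 GPUs
print(sli_mode("GTX 295", 3))  # unsupported (6 GPUs) - hence no tri-SLI of 295s
print(sli_mode("GTX 285", 3))  # Tri SLI: 3 cards x 1 GPU = 3 GPUs
```

Three GTX 295s give six GPUs, which lands outside the Single/SLI/Tri/Quad set, while three single-GPU GTX 285s give exactly the three GPUs Tri SLI expects.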
 
Yeah, tri-SLI GTX 295 isn't possible.
The 295 is a card with two GPUs on it (basically two GeForce GTX 260/280 hybrids),
so you can achieve quad SLI, but not tri.
 