
Gibbo is getting an R600!!

willhub said:
I wonder what the Pro will be like, I may go for that, but maybe it won't cope with Crysis? Maybe it won't be as powerful as my 8800GTS, but then again, it's bound to perform better I think.
I don't think we know much about the Pro model yet but I suspect the XT 512MB, XTX 1GB and the dual-core/card XTX2 will be the only reasonable upgrades from an 8800GTS.
 
The XTX2 won't be that good going by the specs in another post I saw; looks like it's all lowered clocks and cut-down features to fit 2 cores.

I don't think the 512MB will be much of an upgrade, because with a 1GB one out, it's gotta be saying something :\

It's funny tbh. I bought the 8800GTS for it being an upgrade, not realising my PC is crap and makes the 8800GTS slow, and that combined with rubbish drivers = a total waste of money. I don't think I have had a good gaming experience since I got it; it's been junk for me. My mind is already made up, I think I might sell the GTS today, since the longer I leave it and the closer it gets to the launch, the less I am gonna get for it.
 
Ulfhedjinn said:
hate it when games force me down to 4x (or 0x, in the case of S.T.A.L.K.E.R.)
:D

I think the drivers will be the fulcrum for this card, as people have based their desire for an ATI card on this argument, well mostly.
If they get the drivers right they could well have another 9700 Pro on their hands.
 
Hmm... well the GTX is a bit variable, starting at around £350, with most being around £380.
By the sounds of it, an XT will sell for around £270, which is very good if it gives GTX-like performance, so I would expect the XTX to be hitting around the same prices as the GTX.
As far as I can work out, the XT is GDDR3 and the XTX is GDDR4, with faster-clocked memory and GPU.
What worries me is that they sent Gibbo an XT... does that mean the XTX won't come out at the same time (they have no spare samples as production is low?), and is that date now mid May as some sites have suggested, or the beginning of May as was the original plan?
 
Blimey!
I was just going to sell my SLI 7900GTs and go for an 8800 GTX, but I think I might wait just a bit longer for some benchies!
I have always preferred NV cards as they work better for 3D apps like SolidWorks etc., but this does look tempting!
 
willhub said:
The XTX2 won't be that good going by the specs in another post I saw; looks like it's all lowered clocks and cut-down features to fit 2 cores.

I don't think the 512MB will be much of an upgrade, because with a 1GB one out, it's gotta be saying something :\

It's funny tbh. I bought the 8800GTS for it being an upgrade, not realising my PC is crap and makes the 8800GTS slow, and that combined with rubbish drivers = a total waste of money. I don't think I have had a good gaming experience since I got it; it's been junk for me. My mind is already made up, I think I might sell the GTS today, since the longer I leave it and the closer it gets to the launch, the less I am gonna get for it.


I think it's something to do with your slow CPU and your 1 gig of RAM in that thing you call a gaming PC :D

You need at least 2 gig of RAM for a gaming PC, period.

You can't blame the card when the rest of your system is not up to scratch.
 
pegasus1 said:
It was the STALKER reference mate, I expected you to mention me looking forward to running it in 24x AA mode on the R600 :D
Ah right, LOL. Don't you dare. ;)

willhub said:
I can blame the drivers though, since they're rubbish and I get stuttering crap in games that don't even reach my system memory limit, then there's crap in games like HL2 and other problems.
Ignore easy, he thinks that attempting to mock your machine will actually bother you somehow. I think that this is his new technique to e-penis people into going Nvidia because he does.
 
willhub said:
I can blame the drivers though, since they're rubbish and I get stuttering crap in games that don't even reach my system memory limit, then there's crap in games like HL2 and other problems.


I would suggest having a closer look at your system.

If you think getting 12k in 06 is crap then you need to do more research.

The reason why you are getting stuttering in games is:

a: you don't have a fast enough CPU
b: you don't have enough system RAM
c: you are paging to disk

I for one have my system tweaked to optimal performance and have nothing but high FPS in all the games I play.

You have an unbalanced system; that's what the problem is.
 
pegasus1 said:
We know, Ulf knows what i mean.
Yeah, stop reminding me. I can almost feel my eyes burning when I have to play something with less than 8x antialiasing, and it's so much worse when I have to use that blurry GRAW thing or nothing at all. :(
 
willhub said:
I can blame the drivers though, since they're rubbish and I get stuttering crap in games that don't even reach my system memory limit, then there's crap in games like HL2 and other problems.

You have 1GB RAM and a 3700+... that's why you get stuttering...
 
Ulfhedjinn said:
Ah right, LOL. Don't you dare. ;)

Ignore easy, he thinks that attempting to mock your machine will actually bother you somehow. I think that this is his new technique to e-penis people into going Nvidia because he does.


I'm saying that for today's games 1 gig of system RAM is not enough.

You think this is incorrect advice?

And that 1 gig of RAM is enough for today's games?

Stop trolling.
 
Richdog said:
You have 1GB RAM and a 3700+... that's why you get stuttering...

Sure, it's why I get stuttering in TDU and UT2004, since BOTH games don't reach the system memory limit in XP, with only 80% of RAM being used.

Either way, NV drivers are bad.

The HL2 problems are nothing to do with my CPU or system, I can guarantee it, since that doesn't even use all my RAM and my X800XTPE could run it at max. So unless there is a 50% bottleneck between CPU and GPU, I shouldn't be getting these problems, other than low FPS or stuttering where my RAM is at 100%, NOT WHEN I HAVE 20% FREE...

I can play GRAW at maximum with AA/AF @ 16xQ forced (not that it works), but my average FPS is 60 with everything fully maxed.

easy, I never said 12K in 06 was crap, but I don't care about 3DMark all that much.
 
Quite a long time ago someone was asking about PSU compatibility.

According to Gibbo the R600 card will run at stock speeds with two 6-pin PCIe 1.0 connectors, but requires one 6-pin and one 8-pin PCIe 2.0 connector to be overclocked.

There will be adaptors available to convert 6-pin connectors to 8-pin ones, but you will require TWO 6-pin connectors to create one 8-pin connector via an adaptor.

Thus, if your existing PSU has only two 6-pin connectors you will be able to run a single R600 card at stock speed only.

If you have four 6-pin connectors you will be able to run R600 Crossfire at stock speeds, or a single card overclocked.

If you want an overclocked R600 Crossfire rig you will need two 8-pin connectors and two 6-pins, or one 8-pin and four 6-pins, or six 6-pin connectors.

An additional complication is that most PSUs that have "PCIe 2.0" connectors actually have the wrong connector on the cable, because the PCI-SIG changed the specification at the last minute. So you will probably need a wrong-8-pin-to-correct-8-pin adaptor.
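The connector arithmetic above can be sketched as a little Python helper (the function names are mine, purely illustrative, and the rules are exactly those described in this post):

```python
def connectors_needed(cards: int, overclocked: bool):
    """Return (8-pin, 6-pin) PCIe connectors an R600 setup wants natively.

    Stock: two 6-pin per card. Overclocked: one 8-pin + one 6-pin per card.
    """
    if overclocked:
        return (cards, cards)
    return (0, 2 * cards)

def six_pins_required(cards: int, overclocked: bool, native_8pin: int = 0):
    """6-pin connectors needed when adaptors turn TWO 6-pins into one 8-pin."""
    eight, six = connectors_needed(cards, overclocked)
    missing_8 = max(0, eight - native_8pin)  # 8-pins we must build from adaptors
    return six + 2 * missing_8

# Overclocked Crossfire with no native 8-pins needs six 6-pin connectors:
print(six_pins_required(2, overclocked=True))            # 6
# ...or four 6-pins if the PSU has one proper 8-pin:
print(six_pins_required(2, overclocked=True, native_8pin=1))  # 4
```

This reproduces the combinations listed above: stock single card = two 6-pins, stock Crossfire = four 6-pins, and overclocked Crossfire = six 6-pins (or fewer with native 8-pins).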

As for the "320 stream processors" I think that probably means 64 ALUs, each of which is capable of 1 vec4 and 1 scalar operation per clock cycle. In a worst case this is equivalent to 128 scalar processors (of the sort 8800GTX uses). In a theoretical best case it's equivalent to 320, but it's much harder to keep vector ALUs constantly fed with data than is the case with scalar ones, so the efficiency will be a lot lower.

Also remember that the ALUs in G80 chips are clocked at a much higher speed than the rest of the chip. As far as we know, ATI won't be using clock domains in R600. If the core clockspeed is (say) 675MHz, then, even at the theoretical peak efficiency, you've got 320 scalar equivalents compared with 256 scalar equivalents on G80, because the G80 ALUs are operating twice as fast. Hopefully the clock speed will be a little higher than 675MHz, but when you factor in the inefficiency of vector units compared to scalar ones, it's not surprising that R600 and G80 shader performance are fairly similar. (In some cases R600 will be faster, in others G80 will).
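The shader comparison above boils down to simple arithmetic; here is a quick sketch using the post's own numbers (the 675MHz R600 clock and 2x G80 shader clock are the assumptions stated above, not confirmed figures):

```python
# R600 (rumoured): 64 vec4+scalar ALUs -> "320 stream processors" = 64 * 5
R600_ALUS = 64
R600_CORE_MHZ = 675           # assumed core clock; no separate shader domain

# G80 (8800GTX): 128 scalar ALUs in a shader clock domain ~2x the core
G80_ALUS = 128
G80_SHADER_MHZ = 2 * R600_CORE_MHZ   # 1350

# Peak scalar operations per clock-second (in MHz * ops units):
r600_best  = R600_ALUS * 5 * R600_CORE_MHZ   # vec4+scalar fully utilised
r600_worst = R600_ALUS * 2 * R600_CORE_MHZ   # vec4 unit doing only 1 useful op
g80_peak   = G80_ALUS * G80_SHADER_MHZ

print(r600_best, r600_worst, g80_peak)   # 216000 86400 172800
```

So even at theoretical peak efficiency R600 only edges out G80 (216,000 vs 172,800), and in the vector-starved worst case it falls well behind, which is why roughly similar shader performance is the expectation.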
 