I didn't...
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
maddness said:where do you want it?
JAG1977 said:Do you ever get the feeling you're being taken for a mug with screenshots of yet-to-be-released games that could be utter crap anyway?
I've seen nothing on DX10 cards that I wouldn't expect to see on 360 and PS3 in the next 12 months.
I'm guessing these 'wondrous' DX10 games would look just as good on current hardware if developers optimised their titles, just as there are massive improvements with each generation of console titles.
If your current card plays the titles you're interested in, why upgrade?
LEUVEN said:Just been thinking about the GX2.
At the OCUK shop it says...
"- QUAD SLI ready – (Quad SLI support will be provided through a future NVIDIA ForceWare driver release. See www.slizone.com for details)
"
What if Nvidia releases some good drivers after the 8800 series launch that make the GX2 in Quad SLI come close to, or even match, the performance of two 8800s?
Might not need to "Step-up" then
Gibbo said:Hi there
Not impressed with that score, to be honest.
I was expecting the 8800 GTX to be hitting at least 14,000 in 3DMark06, but that is just a benchmark; games may tell a completely different story, with much better gains to be had.
However, it is good to see a single card matching ATI X1950s in CrossFire in DX9 benchmarks. That for sure suggests DX10 is going to be even better.
ihatelag said:The test was done on Windows XP - the card is made for Vista, so expect scores to go higher on that platform; the card is future-proof. From the initial testing we can see the DX9 scores are easily strong enough to hold you over until Vista, and when Vista is released the card will increase in performance and you should get the scores you're looking for.
Don't get me wrong, it's a shame the scores aren't higher on Win XP for DX9, but we're looking at future-proofing. When Vista is released, then you can see how cool the card is (using DX10). Until then, we have a cool card which gives the X1950 a run for its money.
I'm getting one. Usually graphics cards decrease in performance over time (and require replacement); this time round, the card will improve in most aspects and provide value for the long term.
CPU benchmarks in 3DMark mean pretty much nothing at all. The fact that the quad core scores 2,000+ more than the dual core, and the dual core no doubt scores 1,500+ more than a single core, doesn't mean a thing. Why? Because games aren't even using dual cores at the moment; it's a totally synthetic benchmark with zero real-world value. The CPU tests aren't even a game, they're just a preset sequence of events - pretty easy to program for multiple cores.

Stelly said:some really nice results with both the X6800 and the quad core Extreme... I'm definitely going for the 8800GTX but don't know about quad core yet
Stelly
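(A minimal illustration of the point above - this is not 3DMark's real code, and every name and number in it is invented for the sketch. It shows why a canned, scripted workload is "pretty easy to program for multiple cores": each thread gets its own slice of the work, with no shared state and no locks, so the time drops almost linearly as you add cores.)

#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the per-"agent" AI/physics work in a scripted demo sequence.
double update_agent(int id, int step) {
    double x = id * 0.001 + step;
    for (int i = 0; i < 200; ++i)
        x = std::sin(x) * std::cos(x) + 1.0001 * x;   // pure busy work
    return x;
}

// Replay the same preset sequence, split evenly across `threads` threads.
double run_benchmark(int agents, int steps, int threads) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back([=] {
            // Each thread owns a contiguous slice of agents: no shared
            // state, no locks - exactly why this scales so easily.
            int lo = agents * t / threads;
            int hi = agents * (t + 1) / threads;
            volatile double sink = 0;
            for (int s = 0; s < steps; ++s)
                for (int a = lo; a < hi; ++a)
                    sink += update_agent(a, s);
        });
    for (auto& th : pool) th.join();
    std::chrono::duration<double> dt = std::chrono::steady_clock::now() - start;
    return dt.count();
}

int main() {
    for (int threads : {1, 2, 4})
        std::printf("%d thread(s): %.2f s\n",
                    threads, run_benchmark(4000, 50, threads));
    return 0;
}

Games of the day, by contrast, had rendering, AI and physics tangled together on one thread, which is part of why extra cores showed up in 3DMark's CPU score long before they showed up in frame rates.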
Durzel said:I get more than that 8800GTX & X6800 (dual core) score in 3DMark06 with my "out of date" X1900 CrossFire setup. So what's the point you're making?
I find it a bit strange that people will make a point of saying they get "X6800 performance for £250" by overclocking an E6600 to X6800 speeds, yet they're quite happy to shell out £500+ on bits of kit that get scores achievable on a 7950GX2, etc.
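For what it's worth, the arithmetic behind that £250 claim: an E6600 runs at 9 × 266 MHz = 2.4 GHz and an X6800 at 11 × 266 MHz ≈ 2.93 GHz, and both are Conroe cores with 4MB of cache. Raise the E6600's FSB to roughly 325 MHz and 9 × 325 ≈ 2.93 GHz, i.e. X6800 clocks from the cheaper chip, assuming your board and RAM can take the higher bus speed.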
What?! DirectX 9 is a hell of a lot more mature than DirectX 10 is; if anything this card will most likely perform better in Windows XP than in Windows Vista. In fact I'd put money on it: benchmark your system in Windows XP and then in Windows Vista RC1 (and don't pull out the "Vista isn't done yet" line - I know it's not, but the system requirements for Vista aren't going to magically cut themselves in half before it goes gold).

ihatelag said:The test was done on Windows XP - it's made for Vista.
As far as I know the water-cooling parts are optional and can be removed. Either that or they took it out of the final design, as we only ever saw it in photographs of the engineering sample.

AWBbox said:What happened to the model of the GTX with the water-cooled heatsink? I'm concerned there are no new pics of it floating around.