BFG 8800 GTX/GTS arriving at OcUK soon and some images for you all!!

I didn't.....
 
Do you ever get the feeling you're being taken for a mug with screenshots of yet-to-be-released games that could be utter crap anyway?

I've seen nothing on DX10 cards that I wouldn't expect to see on 360 and PS3 in the next 12 months.

I'm guessing these 'wondrous' DX10 games would look just as good on current hardware if developers optimised their titles, just as there are massive improvements in each generation of console titles.

If your current card plays the titles you're interested in, why upgrade?
 
JAG1977 said:
Do you ever get the feeling you're being taken for a mug with screenshots of yet-to-be-released games that could be utter crap anyway?

I've seen nothing on DX10 cards that I wouldn't expect to see on 360 and PS3 in the next 12 months.

I'm guessing these 'wondrous' DX10 games would look just as good on current hardware if developers optimised their titles, just as there are massive improvements in each generation of console titles.

If your current card plays the titles you're interested in, why upgrade?


Indeed.

Crysis is reported to be 8 hours long :eek:

The online side is a positive, though.
 
LEUVEN said:
Just been thinking about the GX2.
At the OCUK shop it says.............

"- QUAD SLI ready – (Quad SLI support will be provided through a future NVIDIA ForceWare driver release. See www.slizone.com for details)
"
What if NVIDIA releases some good drivers after the 8800-series launch which make the GX2 in Quad SLI come close to, or even match, the performance of two 8800s? :D

Might not need to "Step-up" then :eek:

In which case, you need to stop thinking. :o
 
Gibbo said:
Hi there

Not impressed with that score, to be honest.

I was expecting the 8800 GTX to be hitting at least 14,000 in 3DMark06, but that is just a benchmark, and games may tell a completely different story with much better gains to be had. :)
However, it is good to see a single card matching ATI X1950s in Crossfire in DX9 benchmarks. That for sure says DX10 is gonna be even better. :)

The test was done on Windows XP - it's made for Vista, so expect scores to be higher on that platform; the card is future-proof. From the initial testing we can see the DX9 scores are easily strong enough to hold you over until Vista, and when Vista is released the card will increase in performance and you should get the scores you're looking for :)

Don't get me wrong, it's a shame the scores aren't higher on Windows XP for DX9, but we're looking at future-proofing. When Vista is released, then you can see how cool the card is (using DX10). Until then, we have a cool card which gives the X1950 a run for its money.

I'm getting one :) Graphics cards usually decrease in performance over time (and need replacing), but this time round the card will improve in most aspects and provide value for the long term.
 
ihatelag said:
The test was done on Windows XP - it's made for Vista, so expect scores to be higher on that platform; the card is future-proof. From the initial testing we can see the DX9 scores are easily strong enough to hold you over until Vista, and when Vista is released the card will increase in performance and you should get the scores you're looking for :)

Don't get me wrong, it's a shame the scores aren't higher on Windows XP for DX9, but we're looking at future-proofing. When Vista is released, then you can see how cool the card is (using DX10). Until then, we have a cool card which gives the X1950 a run for its money.

I'm getting one :) Graphics cards usually decrease in performance over time (and need replacing), but this time round the card will improve in most aspects and provide value for the long term.

Future-proofing is like chasing a rainbow. We don't live in tomorrow's age today; we play with what we've got. Whenever the future (in terms of graphics cards) is revealed, it always turns out pear-shaped, and that "future-proof" piece of kit you paid a stupid amount of money for is no faster than something half its price six months down the line...

That sounds like the summary of a Japanese anime... it makes no sense. Basically, just look back at the GeForce FX series: it did alright in DirectX 8.1, but with DX9, once lots of shaders started to get used, it turned out to be an underpowered turkey.

For all we know, when it comes to DX10 the 8800 GTX could be the same; we just don't know yet. So saying that it's only going to get better with DX10 is total rubbish.

It may get better, it may not. Neither you nor I know, and if the past is anything to go by, it won't be getting better.

Just because it supports DX10 natively doesn't mean it's going to have good DX10 performance.
 
Stelly said:
Some really nice results with both the X6800 and the quad-core Extreme... I'm definitely going for the 8800 GTX but don't know about quad core yet

Stelly
CPU benchmarks in 3DMark pretty much mean nothing at all. The fact that the quad core scores 2,000+ more than the dual core, and the dual core no doubt scores 1,500+ more than a single core, doesn't mean a thing. Why? Because games aren't even using dual cores at the moment; it's a totally synthetic benchmark with zero real-world value. The CPU tests aren't even a game, they're just a preset sequence of events, which is pretty easy to program for multiple cores.
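To illustrate that last point, here's a minimal sketch (nothing to do with 3DMark's actual code - the frame count and the simulate_frame workload are invented for the example): a scripted, deterministic sequence of "frames" can be divided evenly across however many cores are available, precisely because every step is known in advance and independent of player input or simulation state.

```cpp
// Minimal sketch: a preset, deterministic workload split across all cores.
// The workload and numbers here are made up for illustration.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

// Stand-in for one scripted physics/AI step; it depends only on the frame
// index, not on any player input or on previous results.
static double simulate_frame(int frame) {
    double x = 0.0;
    for (int i = 0; i < 100000; ++i)
        x += (frame * 31 + i) % 7;
    return x;
}

int main() {
    const int total_frames = 1200;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> results(total_frames);
    std::vector<std::thread> workers;

    // Each thread takes every Nth frame - trivial to arrange when the whole
    // sequence is fixed up front, which is exactly the CPU test's situation.
    for (unsigned c = 0; c < cores; ++c) {
        const int start = static_cast<int>(c);
        const int step  = static_cast<int>(cores);
        workers.emplace_back([&results, start, step, total_frames] {
            for (int f = start; f < total_frames; f += step)
                results[f] = simulate_frame(f);
        });
    }
    for (auto& t : workers) t.join();

    std::printf("cores used: %u, checksum: %.0f\n", cores,
                std::accumulate(results.begin(), results.end(), 0.0));
    return 0;
}
```

A real game loop can't be carved up anywhere near as easily, which is why a big 3DMark CPU score didn't translate into frames per second in actual titles at the time.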

I get more than that 8800 GTX & X6800 (dual core) in 3DMark06 with my "out of date" Crossfire X1900. So what's the point you're making?

I find it a bit strange that people will make a point of saying that they get "X6800 performance for £250" by overclocking an E6600 to X6800 speeds, but they're quite happy to shell out £500+ on bits of kit that get scores that can be achieved on a 7950GX2, etc.
 
Durzel said:
I get more than that 8800 GTX & X6800 (dual core) in 3DMark06 with my "out of date" Crossfire X1900. So what's the point you're making?

I find it a bit strange that people will make a point of saying that they get "X6800 performance for £250" by overclocking an E6600 to X6800 speeds, but they're quite happy to shell out £500+ on bits of kit that get scores that can be achieved on a 7950GX2, etc.

How much did you pay for those two cards to best a single 8800?
Just under £500?

Yeah, now I can get ONE card, which, for your information, doesn't have any real benchmarks yet, so we can't say for sure what the scores will be.

One card which for me will be cheaper than your Crossfire setup, will most likely match or, as the drivers mature, even better your setup, and on top of that has more features and is more future-proof.

Why would I get two X1900s when I could get an 8800 GTX cheaper?
(Definitely cheaper for me because of where I live, anyway ;) )


Rayb74: I'd pay £500 minus 17.5%, so about £412.50? :)
That and I don't have a gfx card yet :p

You honestly think I, a person with a borrowed 7800 that its owner needs back soon, am going to buy old technology, Crossfire or SLI setups, to match the specs of ONE 8800 GTX and not have as much support for the future?

On top of that, the price shouldn't be half bad for me...

You're kidding right? :P
 
ihatelag said:
The test was done on Windows XP - it's made for Vista.
What?! DirectX 9 is a hell of a lot more mature than DirectX 10 is; if anything, this card will most likely perform better in Windows XP than in Windows Vista. In fact I'd put money on it: benchmark your system in Windows XP and then in Windows Vista RC1 (and don't pull out the "Vista isn't done yet" line - I know it's not, but the system requirements for Vista aren't going to magically cut themselves in half before it goes gold).

It's not "made for Windows Vista" at all; it's made for any OS that can run or emulate DirectX 9, and DirectX 10 as well. That includes most versions of Windows (2000 and XP, as legacy support for '95 and '98 has ended) and Linux too.

AWBbox said:
What happened to the model of the GTX with the water-cooled heatsink? :confused: I'm concerned there are no new pics of it floating around.
As far as I know, the water-cooling parts are optional and can be removed. Either that, or they took it out of the final design, as we only ever saw it in photographs of the engineering sample.
 