Ruby Ruby Ruby Ruby!

The card performs well, if not great, at the moment. When games come out that really
support all its features, I don't think people will be joking about it.
I think it will be a good card for people who want to keep it long term, which
will in turn make it a good investment.
 
Cheers for the info on that unit; sounds like it's got potential. Unfortunately, that phrase has been used many times before...

Can only hope.
 
Yeah, but by the time the tessellator is used, Nvidia will have a card out that has one, the G92 supposedly, just when the odd DX10 game might appear, if we're lucky.
 
LoadsaMoney said:
Yeah, but by the time the tessellator is used, Nvidia will have a card out that has one, the G92 supposedly, just when the odd DX10 game might appear, if we're lucky.


Interesting how, when SM3.0 was on the scene, the exact same reply as yours was made: that by the time games use SM3, ATI will have a new card out.

Funny how the roles just keep reversing back and forth. :D
 
TheRealDeal said:
I think it will be a good card for people who want to keep it long term, which
will in turn make it a good investment.

I don't understand that statement. If card prices went up rather than down, then yes, it would be a good investment, but as it stands, by the time games actually use these cards properly, they will be half the price. :confused:
 
Black Dragon said:
I don't understand that statement. If card prices went up rather than down, then yes, it would be a good investment, but as it stands, by the time games actually use these cards properly, they will be half the price. :confused:

That's a risk you take when you drop £250 on a graphics card. The only time I spent that much money on one, it was a 9700 Pro, and it did me fine for about four years. By then it was effectively worthless, but it still did a very good job.

The screenshots are looking impressive, but I'm going to wait until there's a game that actually looks like that before I get the cards out.
 
Tom|Nbk said:


And here is the HDR showing off the lighting; you can move the mouse about and see how it affects the environment and what have you.

Why do graphics look less and less like real life? :confused:

It is all about shiny surfaces and over-bright HDR these days :(
 
Dutch Guy said:
Why do graphics look less and less like real life? :confused:

It is all about shiny surfaces and over-bright HDR these days :(

It's depressing, isn't it? :( Then there's the bloom/blur rubbish, which makes games look oh so unrealistic. I didn't know until I played a game that Iraq is actually very blurry, with bloom everywhere.

On the plus side, water no longer looks like mercury these days. :D I'm hoping Crysis, like Fear, uses graphical effects for realism rather than for an artistic mess, although I have my doubts; as Fear proved, realistic graphics aren't necessarily embraced.
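For what it's worth, the "over bright" look largely comes down to tone mapping: a scene's high-dynamic-range luminance has to be squeezed into a display's 0–1 range somehow. A minimal sketch (my own illustration with made-up values, not code from any actual engine) of the difference between naively clamping bright values and compressing them with the well-known Reinhard operator:

```python
def reinhard(l):
    # Reinhard tone mapping: compresses luminance in [0, inf) into [0, 1)
    return l / (1.0 + l)

def clamp(l):
    # Naive clamp: everything above 1.0 blows out to pure white
    return min(l, 1.0)

# Hypothetical scene luminances, from deep shadow up to a bright highlight.
hdr_values = [0.05, 0.5, 1.0, 4.0, 16.0]

clamped = [clamp(l) for l in hdr_values]
mapped = [round(reinhard(l), 3) for l in hdr_values]

# The clamp flattens every bright value to 1.0 (the blown-out "shiny" look),
# while Reinhard keeps the highlights distinguishable from one another.
print(clamped)  # [0.05, 0.5, 1.0, 1.0, 1.0]
print(mapped)   # [0.048, 0.333, 0.5, 0.8, 0.941]
```

Whether the result looks washed out or natural then comes down to how aggressively a game's artists push exposure and bloom on top of this.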
 
Boogle said:
I'm hoping Crysis, like Fear, uses graphical effects for realism rather than for an artistic mess, although I have my doubts; as Fear proved, realistic graphics aren't necessarily embraced.
Or it could be like Far Cry and be very colourful.
 
Dutch Guy said:
Or it could be like Far Cry and be very colourful.


You could put Far Cry into cold mode and get a more realistic look.
I prefer the paradise mode; with HDR it looks really nice, with over-the-top lighting.
 
Cyber-Mav said:
You could put Far Cry into cold mode and get a more realistic look.
I prefer the paradise mode; with HDR it looks really nice, with over-the-top lighting.
Yeah, I was a bit harsh; Far Cry looks (looked) good, but I am just a bit disappointed with the next-gen cards and games, in that I think they could make games look much more like the real thing, but fail to because HDR and bright colours sell more games.
 
Cyber-Mav said:
Interesting how, when SM3.0 was on the scene, the exact same reply as yours was made: that by the time games use SM3, ATI will have a new card out.

Funny how the roles just keep reversing back and forth. :D

I guess, but SM3 was a sure thing, i.e. it was KNOWN that it would be used in a lot of games and that both Nvidia and ATI would end up supporting it. Tessellator units (as with N-patches) I just don't see taking off. It's not like it's 'SM4' (yes, it's probably part of SM4), but only one company has embraced it, and when Nvidia do theirs, the implementation will possibly be different. I'm all for new ideas, but industry-wide ones, not ones specific to companies trying to say 'hey, look at this, it's shiny'. I want something that will actually be used, i.e. they could have taken all the transistors spent on the tessellator and made more DX10 shader processors, i.e. more power = better gaming experience.
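For anyone wondering what those tessellator transistors actually buy you: the idea is geometry amplification, i.e. the chip splits each coarse triangle into many smaller ones on the fly, so the game can send a low-poly mesh and still render dense geometry. A rough sketch (my own illustration; real hardware tessellation with displacement is considerably more involved) using simple edge-midpoint subdivision:

```python
def midpoint(a, b):
    # Midpoint of two 2D vertices given as (x, y) tuples
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def subdivide(tri):
    # Split one triangle into four using its three edge midpoints
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

# One coarse triangle sent by the "game"; three tessellation levels
# applied by the "hardware" turn it into 4**3 triangles.
mesh = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
for _ in range(3):
    mesh = [small for tri in mesh for small in subdivide(tri)]

print(len(mesh))  # 64
```

The geometry detail grows 4x per level while the data the application submits stays fixed, which is the whole selling point; the debate above is just whether that silicon would have been better spent on general-purpose shader units.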
 