
ATI Radeon 5870 and 5870X2 specs revealed?

I don't think Nvidia are ready for DX11 with their current architecture; that's why they're having to develop a new card from scratch that can incorporate all the DX11 requirements. As long as they can get the cards out by the end of the year, at the same time as ATI, then all is good ;)
 
It would cause quite a stir, and quite a lot of upset for Nvidia, if the Nvidia crowd decided to move away and over to ATI, especially if Nvidia release their new cards and realise not many people are buying them.
 
With the price drops on the 4000 series, I'm starting to think that we'll see a die shrink of the current architecture before a new one.

Maybe HD4900 and 4700 series?
 
There might just be a die shrink of the current tech that can be run faster with less power drain, sold as "Overclocked" versions, maybe straight from the factory, so that you get more performance for much the same money.
 
Farcry looks better than cry-your-eyes-out-at-the-low-frame-rates-sis does at 1920 x 1200, and you can run it on a "Speak & Spell" and still get great frame rates.

TBH Farcry is the benchmark of how an FPS should look and run. That game never stuttered or dropped to low frame rates once.

You obviously can't remember when Farcry first came out. The top-end cards of its day were the Nvidia 5800/5900 series and the ATi 9800 series. The Nvidia cards couldn't do DX9 without serious problems, so that's already a vast majority of the PC gaming populace who couldn't run Farcry smoothly at all. The lucky 9800 owners still saw a lot of dropped frames. You're just looking through rose-tinted glasses. Farcry was nowhere near as demanding on the top hardware of its time as Crysis is, but it was still a game that needed a colossal beast of a computer to run, and it wasn't until the eventual release of the 6800 and X800 series of cards that we could play the game smoothly on maximum settings.
 
Weren't there optimised drivers for the 5900 series and Farcry (plus many other games of the time)? I seem to remember Farcry being "alright" on my 5950...
Of course, my rose-tinted lenses may be out in force for that one... :D
 
There might just be a die shrink of the current tech that can be run faster with less power drain, sold as "Overclocked" versions, maybe straight from the factory, so that you get more performance for much the same money.

I think nVidia being so far behind has just stunted everything.

ATi were probably ready to go in Q2/Q3, I think, but there's no need, as there's no competition.

So I expect we'll just get the same cards shrunk down for the moment, and then a new architecture in Q4 or Q1 2010. :(
 
You obviously can't remember when Farcry first came out. The top-end cards of its day were the Nvidia 5800/5900 series and the ATi 9800 series. The Nvidia cards couldn't do DX9 without serious problems, so that's already a vast majority of the PC gaming populace who couldn't run Farcry smoothly at all. The lucky 9800 owners still saw a lot of dropped frames. You're just looking through rose-tinted glasses. Farcry was nowhere near as demanding on the top hardware of its time as Crysis is, but it was still a game that needed a colossal beast of a computer to run, and it wasn't until the eventual release of the 6800 and X800 series of cards that we could play the game smoothly on maximum settings.

I agree that it wasn't as demanding as Crysis, good point. However, my ATI 9800SE 128MB ran it rather well, and that was before I overclocked it to a 9800.
 
Very true, the Crytek engine seems to push computers to the limit, even to the point where the tech to run the games as they should be played is not yet available, as if the games are ahead of their time :rolleyes:
 
Very true, the Crytek engine seems to push computers to the limit, even to the point where the tech to run the games as they should be played is not yet available, as if the games are ahead of their time :rolleyes:

Trouble is, there are a number of games that look as good as Crytek engine games but run fine.
 
FC2 only compares favourably in environmental visuals (sun, leaves, grass, lighting etc.).

If you look at objects and gun models, they are completely hideous. Look at the windshield frame on the assault truck - it looks like it belongs in HL1!
 
Why are you so hung up on whether it's a new architecture? There is nothing wrong with the 4xxx series architecture, so if it's just scaled up, that's fine by me.

After all, the 5800 Ultra was a whole new architecture...

Well, I dunno about Loadsa, but for myself, all I keep thinking as I read the "so what if it's the old arch" comments is "what if this were the green team doing this?" There would be wailing and gnashing of teeth and calls for crucifixion.

But when it comes to daft "kit-pocrisy", OCUK's fora are old hands. Look at all the people who will warn you off NV and rave of their love of ATI, who will cite "bang for buck" and "sharp business practice" as their main reasons... and every flaming one of them uses an Intel chip and will tell you to avoid the nasty AMD because it's 3% slower for half the price. I mean, it's not like the noble, heroic ATI are the same company as that horrid AMD, or as if Intel have EVER been questioned/investigated/convicted/hit with a record fine in relation to their dodgy dealings.

It amuses me something rotten. All the more so since, personally, speaking for myself, I find the ultimate in a "peace and quiet" PC to be an AMD machine with an NV card, and Intel/ATI is not a combination I enjoy building for other people, though I still recommend such combinations when appropriate. I find ATI's drivers to be flakier than NV's, and I always feel a huge lag when using Intel machines that's just not there on AMD.
 