Discussion On The ATI Radeon 5*** Series Before They Have Been Released Thread

Status
Not open for further replies.
I wonder if ATi will have revised the design of the X2 so it shares all the memory between cores. I bet they'll have a trick up their sleeve for the X2 :) 2 5870X2s with a super fast interconnect and 4GB usable RAM each...serious power
 
Will the "mars bar" still be a grand when the 5870s are released?

I'm thinking the 5870 will be around the same performance as a grand card, seeing as it's probably faster than a GTX 295! :D
 
So the way I see it:

A) AMD have cornered the developers and truly embraced DX11; look at the support they have from Crytek, Dirt etc.
B) The 5870 is going to be a monster card; it will probably beat the 295 for less.
C) The "Mars Bar" is subsequently a dodo of a card.
D) The 285 effectively becomes a redundant card (where will it sit if it's being destroyed by cheaper cards?).
E) The 295 will only be bought by die-hard fanboys.

Checkmate? I do hope NV sort their house out; I don't see them hitting back with anything more than offering us free cereal-box 3D "glasses" and convincing us Batman is more important than DX11.

This is a big marketing coup for AMD; they did the right thing in jumping on the Windows 7 bandwagon to be the premiere (i.e. only) DX11 partner. NV made a mistake in building their cards around proprietary technology when DX11 will probably form part of next-gen consoles, and developers will be more interested in producing games they can port across easily than in using proprietary libraries. Nvidia should not have thought they were bigger than MS; they picked the wrong battle (i.e. in not embracing DX fully).

Anyone who says AMD have not responded well is frankly delusional.
 
Some of the Eyefinity demos were done in Crossfire ;)

Only the flight sim (24 screens) was multi-GPU, and that wasn't using Crossfire: each GPU was rendering a quadrant of the image in Linux. Afaik Crossfire + Eyefinity doesn't work yet, which makes the demos more impressive imo; it shows the raw power of the 5870.
 
I still don't get why people are equating the multi-display capabilities to normal ingame performance... they are using the same FOV, rendering the same number of polygons and shaders, etc. as you would on one screen - the only thing that it really hits is the fillrates and the amount of VRAM required for the framebuffer and other buffer objects.

nVidia already have these levels of fillrate on their 200 series cards... they just lack the ability to output to this number of displays easily and possibly the VRAM depending on the model.
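On the fillrate/VRAM point, the framebuffer cost of extra screens is easy to ballpark. Here's a rough Python sketch; the 4 bytes/pixel and double-buffer-plus-depth figures are illustrative assumptions, not vendor specs, and real drivers will allocate more:

```python
# Back-of-the-envelope framebuffer cost for a multi-display setup.
# Assumes 32-bit colour (4 bytes/pixel) and three full-resolution
# buffers (front + back + depth/stencil) -- illustrative only.

def framebuffer_mb(width, height, screens, bytes_per_pixel=4, buffers=3):
    """Total buffer memory in MiB for `screens` displays at width x height each."""
    pixels = width * height * screens
    return pixels * bytes_per_pixel * buffers / (1024 ** 2)

single = framebuffer_mb(1920, 1200, 1)   # one screen
triple = framebuffer_mb(1920, 1200, 3)   # 3x1 Eyefinity
print(f"1 screen:  {single:.1f} MiB")    # 1 screen:  26.4 MiB
print(f"3 screens: {triple:.1f} MiB")    # 3 screens: 79.1 MiB
```

So even three screens only adds ~50 MiB of buffers; as the poster says, the bigger hit is fillrate, not capacity.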
 
I still don't get why people are equating the multi-display capabilities to ingame performance... they are using the same FOV, rendering the same number of polygons and shaders, etc. as you would on one screen - the only thing that it really hits is the fillrates and the amount of VRAM required for the framebuffer and other buffer objects.

nVidia already have these levels of fillrate on their 200 series cards... they just lack the ability to output to this number of displays easily and possibly the VRAM depending on the model.

The 6 game demos had an expanded FOV, as did the Crytek demo... just admit it, these cards look fast and ATi answered the critics emphatically. Anyway, we'll see when the benchmarks come out :)
 
I personally view it as: the Ruby demo shown for last gen is roughly what this gen's gfx cards deliver in games. I can't wait for the next batch, when that sort of model is in all games :D


x18xx series ruby (what we have now imo)
http://www.youtube.com/watch?v=YdQkfp72Yls

48xx series ruby (what i expect from this gen)
http://www.youtube.com/watch?v=7fzkHGch12c&feature=fvw/

Exciting times ^^

Except the x18xx was 3/4 generations ago and we are still in the 48xx generation. So I expect 48xx-Ruby-type graphics in 2-4 years and the 58xx Ruby in 4-6 years.
 
I still don't get why people are equating the multi-display capabilities to ingame performance... they are using the same FOV, rendering the same number of polygons and shaders, etc. as you would on one screen - the only thing that it really hits is the fillrates and the amount of VRAM required for the framebuffer and other buffer objects.

nVidia already have these levels of fillrate on their 200 series cards... they just lack the ability to output to this number of displays easily and possibly the VRAM depending on the model.

Surely if they are running multiple displays then it's rendering more polygons/shaders, since for example in the Dirt 2 videos you are seeing scenery and cars that you wouldn't normally see in a single screen until the cars are actually past you.
 
Surely if they are running multiple displays then it's rendering more polygons/shaders, since for example in the Dirt 2 videos you are seeing scenery and cars that you wouldn't normally see in a single screen until the cars are actually past you.

He will be answered when the benchmarks come out, and maybe he will throw a compliment ATi's way :cool:
 
Maybe the FOV was increased a bit, but that's still not like rendering 6-24x the amount of scenery... it still doesn't make it that impressive if you actually think about what it's doing.
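For anyone curious how much geometry a wider view actually adds, the standard Hor+ scaling maths gives a feel for it. This Python sketch assumes the game widens the horizontal FOV with the aspect ratio (Hor+ behaviour), which not every title does:

```python
import math

def expanded_hfov(base_hfov_deg, width_scale):
    """Hor+ scaling: the horizontal FOV when the render surface
    becomes `width_scale` times wider at the same height."""
    half = math.radians(base_hfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half) * width_scale))

# A 90-degree single-screen FOV stretched across a 3x1 setup:
print(expanded_hfov(90.0, 3))  # -> ~143.1 degrees
```

Going from 90 to ~143 degrees is a lot more peripheral vision, but nowhere near 3x the scene; the atan compresses it, which fits the point that the extra screens mostly cost fillrate rather than multiplying the geometry workload.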
 
He will be answered when the benchmarks come out, and maybe he will throw a compliment ATi's way :cool:

As I've said before, the performance of the card is impressive... I'm just perplexed by people equating the multi-display results to the card's performance potential in more normal gaming situations.
 