
7950 Crossfire vs 680 SLI at 5760*1080 (Eyefinity/Surround)

It may not be as good as 60, but 30 FPS does work.
It also is not as important to maintain 60FPS per eye on the monitors you only use for peripheral vision, which is the situation here.
 
Rusty0611, can you honestly tell any difference in image quality between AMD and Nvidia with current-generation cards, using a decent amount of AA/AF, or whatever is best?

Not really. You could maybe tell a difference from a side-by-side still, but that just serves to exaggerate minor differences either way. Colours are perhaps slightly more vivid on AMD when both are at default, but I used a touch of Digital Vibrance on my Nvidia cards, so I didn't notice any difference in real terms.

The guys who are using 3D surround say the peripheral monitors are running the same framerate as the central one.

Correct. Difficult to monitor though so I just trusted my eyes which are super sensitive to low FPS.
 

That wasn't really my point, I was highlighting the fact that 3D at 30FPS per eye is very noticeably different to 3D at 60FPS per eye.

Anyone would notice it, 3D content looks like arse at 30FPS per eye. I know it works, but it looks bad and is noticeably choppy and headache inducing.
 

I do wonder if this is one of those universal truths that just doesn't stand up. In non-3D games, some people demand an average 60fps with occasional dips, others are fine with 30 or even 20 (depending on the game), and others can't tolerate the game dipping below 60 for even a moment. Lots of people demand 4x, 8x or even higher AA; others are fine with none. Everyone's eyes are different.

The industry has universally decided that 3D must be 120FPS, but there's no research I know of that demands it. I understand why they've done it: they know that if it's allowed at lower FPS, some people will claim it doesn't work and is crap, and they want to minimise that kind of reaction because it's a new technology, and they think any negative reaction will harm its adoption rate.

I think they are wrong personally, and would love the opportunity to try it without hassle on my 60fps monitors even if it's a poor experience.
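For anyone wondering where the 120FPS figure comes from: with active-shutter 3D the panel alternates left-eye and right-eye frames, so each eye only sees half the refresh rate. A trivial sketch of the arithmetic (the function name is mine, purely illustrative):

```python
def per_eye_fps(panel_refresh_hz: float) -> float:
    """Frame rate each eye actually receives on an alternating-frame
    (active-shutter) 3D display: half the panel's refresh rate."""
    return panel_refresh_hz / 2

print(per_eye_fps(120))  # 60.0 per eye -- the industry's target
print(per_eye_fps(60))   # 30.0 per eye -- the "choppy" case discussed above
```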
 

Even my sister notices the difference between 120Hz and 60Hz on my 120Hz screen.
The Hobbit was shot at 48fps because it eases the 3D viewing experience; games are no different, and for me it's 60fps in 3D gaming or I don't touch it.
 

I often go to the cinema, and last night I watched The Hobbit in IMAX 3D; a great film. I know it was filmed at a higher frame rate but, if I'm honest, I couldn't spot the difference between the usual rate (24fps, or whatever it is) and 48fps. Others may, but the last film I watched in 3D was Spider-Man, and both looked as smooth as each other.
 
Can anyone confirm what FPS I would likely get using 2 x 7950 in Crossfire running over three 1080p monitors, Ultra in BF3, please?

Also, would there be any issues? I know Crossfire has had problems, but I'm not sure about Eyefinity.

Thanks
 

Better than what Rusty got in the OP (driver updates), and his 7950 OC setup was clearly besting his 680 SLI OC setup. His 680s were golden clockers, 1380MHz+, if my memory serves me correctly. It's a shame he can't answer for himself though. :D

Eyefinity has not had the frame-pacing fix yet, but it is coming. Any serious stutter issues can be resolved using RadeonPro, so I really wouldn't let that put you off too much. It's generally as simple as creating a profile and setting an FPS limit at your average FPS. That said, before the frame-time fix came along I found 90% of games ran perfectly fine anyway, and the few that didn't could be sorted with some combination of RadeonPro, vsync and/or an FPS cap. Since the frame-time fix was released I've stopped using RadeonPro for now, as all my games are smooth.
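For anyone curious why an FPS cap helps with stutter: a frame limiter just pins the frame interval to 1/cap, so frame delivery stays even instead of alternating fast/slow (the micro-stutter pattern). A rough Python sketch of the idea, not RadeonPro's actual implementation:

```python
import time

def run_frame_limited(render_frame, fps_cap, n_frames):
    """Toy FPS limiter: never let a frame be delivered before its deadline."""
    interval = 1.0 / fps_cap
    deadline = time.perf_counter() + interval
    for _ in range(n_frames):
        render_frame()                      # draw the frame (fast or slow)
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # pad short frames out to the interval
        deadline += interval                # fixed cadence, avoids drift
```

Capping at (or just below) your average FPS means the limiter is almost always the thing ending the frame, so pacing stays even.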

EDIT

Don't forget, the question you asked is answered in the OP, st2000.
 

Thanks for your reply.

Yes, but no mention of what in-game settings were used, not as far as I can see anyway, and no mention of what the FPS was; just that AMD was quicker by xx%... or am I being blind? hehe.
 

The second graph in the opening post details both the settings used* and the average/minimum framerate for each config.

*Apart from Skyrim which I assume is running at max settings with max in game AA applied.

IIRC Rusty ran these benchmarks at the maximum settings available in game, unless otherwise stated.
 
I have all settings at max, 4x AA, using RadeonPro's frame limiter, and get an almost constant 58.1 FPS with no stutter that I can perceive (I've set it to limit frames to 58, and I've seen it drop to 51-55 at times but no further).

I'm running at 6016x1200 resolution, so about 16% more pixels than you have to deal with, which means my 7970s are probably not that far above your 7950s.
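Quick back-of-envelope check of that pixel comparison, using the resolutions quoted in this thread:

```python
eyefinity = 5760 * 1080   # three 1080p panels side by side
mine = 6016 * 1200        # my span; the extra width is presumably bezel compensation (assumption)

print(eyefinity, mine)                 # 6220800 7219200
print(f"{mine / eyefinity - 1:.1%}")   # 16.0% more pixels to render
```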

I should point out my figures are for Single Player. Apparently MP is more demanding and CPU comes into it a lot more, but you should still get very good performance.
 