
3D AMD - is this a reason to get a 7000 series?

Soldato
Joined
8 Nov 2006
Posts
9,237
There is plenty of debate on price and performance of these cards, and I don't really want to get into that here.

I recently got an LG DM2780D for some 3D fun, as I thought the price was fair.

Currently have the 6950 Toxics in CrossFire, and was under the impression that they were HDMI 1.4a capable, which I thought would allow 1080p at 60Hz in 3D.

Further reading has led me to believe that the 6950s do not support 1.4a, and so will only do up to 1080p at 30Hz for 3D, or 1280x720 at 60Hz.

Did some checking and it appears that the 79xx cards have proper support for HDMI 1.4a. So would I likely be able to get 1080p at 60Hz with one of these?

Info on these LG monitors is very thin on the ground, so it's hard figuring all this stuff out.
 
Hi there,

Please bear in mind that it is not the graphics cards per se that are the "problem". The issue is with the HDMI 1.4 standard itself. Due to bandwidth restrictions, HDMI 1.4 can only transmit a 1080p image at 30Hz per eye or a 720p image at 60Hz per eye. For proper 1080p 3D at 60Hz per eye you need a display that does 3D via dual-link DVI or DisplayPort 1.2, as these connections have much more bandwidth than HDMI 1.4.

Therefore, even if you went for a 7970 card it would have the exact same issue.

The only way to resolve it is to get a "proper" 3D monitor which does 3D via DisplayPort 1.2 (as AMD cards can't do 3D with normal dual-link DVI 120Hz monitors). These monitors use shutter glasses, so you will have to put up with heavier glasses, but you do get a full-resolution image per eye (with the passive 3D technology used in the LG TV you only see half the resolution per eye).
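To illustrate the bandwidth point, here is a rough pixel-rate sketch (my own back-of-envelope numbers, not figures from the HDMI spec - it counts raw active pixels only and ignores blanking intervals and link encoding overhead):

```python
# Back-of-envelope pixel-rate arithmetic (illustrative only: a real HDMI
# link also carries blanking intervals and encoding overhead, so these
# figures are raw active pixels, not link bandwidth).
def pixel_rate(width, height, refresh_hz, images_per_frame=1):
    """Raw active pixels per second the link must carry."""
    return width * height * refresh_hz * images_per_frame

# 2D 1080p at 60Hz - comfortably within HDMI 1.4's limits.
mono_1080p60 = pixel_rate(1920, 1080, 60)                        # 124,416,000

# 3D doubles the data per frame: one full-resolution image per eye.
stereo_1080p60 = pixel_rate(1920, 1080, 60, images_per_frame=2)  # 248,832,000
stereo_1080p30 = pixel_rate(1920, 1080, 30, images_per_frame=2)  # 124,416,000
stereo_720p60  = pixel_rate(1280, 720, 60, images_per_frame=2)   # 110,592,000

# 1080p 3D at 30Hz needs no more raw pixel rate than 2D 1080p at 60Hz,
# and 720p 3D at 60Hz needs even less - hence the two supported modes.
assert stereo_1080p30 == mono_1080p60
assert stereo_720p60 < mono_1080p60
```

Full-rate 1080p 3D (the 248 million pixels/s line) is roughly double what a 2D 1080p60 link carries, which is why it needs the extra bandwidth of dual-link DVI or DisplayPort 1.2.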
 
Again, you will be restricted by HDMI bandwidth as the TV only does 3D via HDMI 1.4, so you would have to choose to either game in 3D at 1080p@30Hz (per eye) or 720p@60Hz (per eye). Personally I would go with the 720p option so you get much smoother motion, and let the TV upscale the image. Since that TV uses active shutter glasses 3D, you will see the full-resolution image per eye, unlike passive 3D TVs.

For playing films in 3D neither of these TVs will have any trouble with framerate over HDMI, since films are only shot at 24fps, so even with HDMI 1.4 you will be able to run the movie at full 1080p resolution without issue.
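As a rough sanity check (raw active pixels only, ignoring blanking and encoding overhead), a 24fps 3D film carries less data than an ordinary 2D 1080p image at 60Hz:

```python
# Raw active-pixel rates (illustrative figures, not HDMI spec numbers).
film_3d_1080p24 = 1920 * 1080 * 24 * 2   # two full-resolution images per frame
mono_1080p60    = 1920 * 1080 * 60       # ordinary 2D 1080p signal

assert film_3d_1080p24 < mono_1080p60    # 99,532,800 < 124,416,000
```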
 
Keep an eye out for my benchmarks. I think you'd be surprised... You're going to need a 7 series - that's the very short answer.
 
Keep an eye out for my benchmarks. I think you'd be surprised... You're going to need a 7 series - that's the very short answer.

May I ask, when you were testing your LG Flatron D2342 (passive 3D HDMI 1.4) monitor - did you have it set up to run 3D games in 1080p@30Hz per eye mode or 720p@60Hz?

If you were using it in 1080p 3D mode, how did you find the smoothness in faster-paced games like FPSs?
 
OK, I know that to get 60Hz per eye I need a 120Hz monitor, but 60Hz (30 per eye) should be doable. However, I can currently only do 30Hz - not 30Hz per eye, but 30Hz max. So basically 30 frames per second.

Or perhaps I am just not understanding this whole 3D thing very well...
 
Sorry, I should have explained further - with passive technology you don't divide the panel refresh rate by two (as with active shutter glasses, where each eye is blacked out half the time), since each eye sees the panel all the time. The passive technology only requires a 60Hz panel (hence why you can get IPS panel monitors with passive 3D), since the splitting up is done within each frame (each eye only sees half of the full resolution). Therefore if it says it is refreshing at 30Hz, then each eye is seeing that 30Hz - but each eye only sees half the resolution of each on-screen image to give you the 3D effect.

If you set the system to run at 720p in 3D then a 60Hz framerate will be achievable and each eye will see a 60Hz image (which should feel smoother).
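The passive-vs-active trade-off can be sketched as a toy comparison (assumed round numbers; real panels differ in exactly how they interleave):

```python
# Toy model of what each eye sees (assumed round numbers).
def per_eye(panel_width, panel_height, panel_hz, tech):
    """Approximate per-eye resolution and refresh rate for a 3D technology."""
    if tech == "passive":
        # Row-interleaved: each eye sees alternate rows on every refresh,
        # so resolution is halved but the full refresh rate is kept.
        return (panel_width, panel_height // 2, panel_hz)
    if tech == "active":
        # Shutter glasses: each eye sees the full panel, but only on
        # alternate refreshes, so the refresh rate per eye is halved.
        return (panel_width, panel_height, panel_hz // 2)
    raise ValueError(f"unknown 3D tech: {tech}")

print(per_eye(1920, 1080, 60, "passive"))   # (1920, 540, 60)
print(per_eye(1920, 1080, 120, "active"))   # (1920, 1080, 60)
```

This is why passive 3D works on a 60Hz panel while NVIDIA-style active 3D wants a 120Hz one: each approach halves a different quantity to make two views out of one panel.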
 
May I ask, when you were testing your LG Flatron D2342 (passive 3D HDMI 1.4) monitor - did you have it set up to run 3D games in 1080p@30Hz per eye mode or 720p@60Hz?

If you were using it in 1080p 3D mode, how did you find the smoothness in faster-paced games like FPSs?

The setup is all automatic. Tridef has profiles, and pretty much every game is covered by them. As such I've not had to mess around and look at the settings.

As for the results of the benchmarks? I'm shocked. Crysis looks absolutely terrible when you count the FPS. However, it ran perfectly smooth :confused:

And trust me when I say, I am incredibly fussy about FPS dips. So I just don't get how the benchmark can report a minimum of 13 FPS, yet when I played it, it seemed fine.

The odd part is that playing the games in 3D *feels* the same. Like there is no FPS hit at all. I do have a strange feeling though that Tridef forces Vsync, which may go some ways to explaining the low average FPS count. But I don't know how it all works myself tbh. I shall no doubt at some point go and ask them.

I did mention it before, but even with the height reduction (which I'm sure it does) I have noticed absolutely no difference in the graphics quality.
 
I think the reason you are getting such bad results is because the software set your default 3D resolution to 1080p. If you are running a DisplayPort 1.2 monitor then this is fine, as these can run 3D at 1080p@60Hz (per eye), but for an HDMI 1.4 3D monitor 1080p can only run at a maximum of 30Hz due to the lower bandwidth. Therefore, considering your system you should be close to that figure of 30Hz most of the time and action will seem pretty smooth (also helped by Crysis 2's clever motion blur), but invariably there are small sections where your framerate will dip, and this will drag your average framerate down to sub-30 FPS.

If you are happy with how the games feel, then I would stick with how you have it set up, but I wouldn't expect a big performance boost from a new graphics card, since 30FPS@1080p is as good as it can possibly get due to the monitor. If you want to see a higher framerate in 3D then you will need to drop the resolution to 720p, as at that resolution it can use the full 60Hz (60fps with Vsync on) of your monitor.
 
Sorry, I should have explained further - with passive technology you don't divide the panel refresh rate by two (as with active shutter glasses, where each eye is blacked out half the time), since each eye sees the panel all the time. The passive technology only requires a 60Hz panel (hence why you can get IPS panel monitors with passive 3D), since the splitting up is done within each frame (each eye only sees half of the full resolution). Therefore if it says it is refreshing at 30Hz, then each eye is seeing that 30Hz - but each eye only sees half the resolution of each on-screen image to give you the 3D effect.

If you set the system to run at 720p in 3D then a 60Hz framerate will be achievable and each eye will see a 60Hz image (which should feel smoother).

So, being a passive 3D screen, should I not be able to run 1080p at 60Hz?

What I don't understand is that this only seems to be a limitation when trying to use AMD's HD3D support built into games.

When just using Tridef, it does seem to be running 1080p at 60Hz.

I understand that it is interleaved
 
Technically, a passive 3D screen can run 3D in 1080p at 60Hz. However, it is the HDMI connection which is the bottleneck and that can only support 3D 1080p@30Hz or 720p@60Hz.

Tridef and HD3D have the same restrictions on 3D framerate, they just may be detected differently by framerate monitoring software.
 
Technically, a passive 3D screen can run 3D in 1080p at 60Hz. However, it is the HDMI connection which is the bottleneck and that can only support 3D 1080p@30Hz or 720p@60Hz.

Tridef and HD3D have the same restrictions on 3D framerate, they just may be detected differently by framerate monitoring software.

I've a strong feeling that the FPS are indeed being detected incorrectly if I'm honest.

Maybe they're only half?
 
Yep. Passive 3D halves resolution. Active 3D halves framerate. Quality control is the only reason NVIDIA insists on 100, 110 or 120Hz screens for 3D (120Hz being the standard mode for LCDs). Technically it would work on even a 60Hz screen; it just wouldn't be very good.
 
But I don't understand why HDMI suddenly only becomes capable of 1080p at 30Hz. We are not talking per eye here, this being a passive screen.

If it is interleaved, so that one frame is intended for the left eye and the next for the right, surely it should still be able to put out 60 refreshes a second?

So essentially it should still be able to do 60 refreshes per second, just interlaced, as opposed to being restricted to 30 refreshes progressive?
 
Yep. Passive 3D halves resolution. Active 3D halves framerate. Quality control is the only reason NVIDIA insists on 100, 110 or 120Hz screens for 3D (120Hz being the standard mode for LCDs). Technically it would work on even a 60Hz screen; it just wouldn't be very good.

Aye, exactly - with active shutter glasses, 3D would work fine on a standard 60Hz panel. However, each eye would effectively only see 30Hz, which isn't ideal for playing games that involve a lot of motion.

Hence why it is recommended to switch to 720p@60Hz when running games in 3D on an HDMI 1.4 device, since motion is much smoother, giving a better gaming experience.


But I don't understand why HDMI suddenly only becomes capable of 1080p at 30Hz. We are not talking per eye here, this being a passive screen.

If it is interleaved, so that one frame is intended for the left eye and the next for the right, surely it should still be able to put out 60 refreshes a second?

So essentially it should still be able to do 60 refreshes per second, just interlaced, as opposed to being restricted to 30 refreshes progressive?

For 2D, HDMI is indeed perfectly fine for running a 1080p image at 60Hz. However, for 3D, HDMI needs to transmit two full-resolution images per frame instead of one - hence only half the framerate can be supported, so that it fits within the same total bandwidth.

On a passive 3D screen these images can be shown at the same time with the monitor chopping them up and the glasses re-assembling them. In this process some of the image quality is wasted, since each eye only sees half the resolution of the initial image. Therefore, if the HDMI 3D standard was made exclusively for passive TVs/monitors then it would only send two half-resolution images per frame and you could run it at full 60Hz framerate and 1080p (with the images combined) since the total data sent would be the same as a 2D full resolution 1080p image. However, the HDMI 1.4 standard has to work with not only passive screens, but also active ones - and these require two full resolution images per frame to provide their maximum quality.
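The "two full-resolution images per frame" point can be sketched numerically (a hypothetical helper; the budget is assumed equal to a 2D 1080p60 stream in raw active pixels, ignoring blanking and encoding overhead):

```python
def max_3d_refresh(pixel_budget, width, height, images_per_frame=2):
    """Highest whole refresh rate (Hz) at which a stereo stream fits the budget."""
    return pixel_budget // (width * height * images_per_frame)

budget = 1920 * 1080 * 60                    # same raw rate as a 2D 1080p60 stream

print(max_3d_refresh(budget, 1920, 1080))    # 30 -> 1080p 3D tops out at 30Hz
print(max_3d_refresh(budget, 1280, 720))     # 67 -> so 720p 3D can run at 60Hz
```

Halving the frame count (1080p@30Hz) or shrinking each image (720p@60Hz) are just two ways of fitting the doubled per-frame data into the same link.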
 
But I don't understand why HDMI suddenly only becomes capable of 1080p at 30Hz. We are not talking per eye here, this being a passive screen.

If it is interleaved, so that one frame is intended for the left eye and the next for the right, surely it should still be able to put out 60 refreshes a second?

So essentially it should still be able to do 60 refreshes per second, just interlaced, as opposed to being restricted to 30 refreshes progressive?

It all comes down to bandwidth - bits per second (bps).

Higher resolution means a lower frame rate, because fewer frames at a higher resolution carry the same amount of data as more frames at a lower resolution. It's simple elementary algebra: if resolution x at framerate y and resolution X at framerate Y both have to fit the same fixed bandwidth B, then

x * y = B

X * Y = B

so

x * y = X * Y

Now if x > X, then y < Y is a necessary condition.

(Also, expressed as a ratio: X/x = y/Y)
 