
AMD IQ Vs Nvidia IQ - shenanigans or something else?

Personally, I'd be interested in seeing the difference between the default "Quality" setting in the Nvidia control panel and the "High Quality" setting, which disables most of the optimisations to give the best possible image quality.
 
This is gonna sound odd, but have you tried the TX connected via DP to the monitor to see if that makes any difference?

Could it be something to do with the way Nvidia cards use HDMI colour compression to do 4K at 60 Hz without HDMI 2.0? (I know the TX has HDMI 2.0, but if the HDMI capture device doesn't, the card may be seeing a sub-HDMI-2.0 connection and activating its chroma subsampling. /shrug)
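The bandwidth argument behind that guess can be sketched with a quick back-of-the-envelope calculation. The figures below are nominal spec values (CTA-861 4K60 timing, HDMI 1.4/2.0 link rates), not measurements from the setup in this thread:

```python
# Rough check of why a sub-HDMI-2.0 link would force 4:2:0 chroma
# subsampling at 4K 60 Hz. All figures are nominal spec values.

def tmds_data_rate_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Video data rate on an HDMI TMDS link for 8-bit-per-channel colour."""
    return pixel_clock_mhz * bits_per_pixel / 1000.0

# CTA-861 timing for 3840x2160 @ 60 Hz uses a 594 MHz pixel clock.
rate_444 = tmds_data_rate_gbps(594)        # full 4:4:4 colour
rate_420 = tmds_data_rate_gbps(594 / 2)    # 4:2:0 halves the effective rate

HDMI_14_LIMIT = 8.16   # Gbps of video data (10.2 Gbps TMDS minus 8b/10b coding)
HDMI_20_LIMIT = 14.4   # Gbps of video data (18 Gbps TMDS minus 8b/10b coding)

print(f"4K60 4:4:4 needs ~{rate_444:.1f} Gbps; exceeds HDMI 1.4: {rate_444 > HDMI_14_LIMIT}")
print(f"4K60 4:2:0 needs ~{rate_420:.1f} Gbps; fits HDMI 1.4: {rate_420 <= HDMI_14_LIMIT}")
```

So if the capture card only negotiates an HDMI 1.4 link, 4:2:0 subsampling is the only way to carry 4K60 over it, which would degrade colour detail in the capture regardless of GPU.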

IDK

The way it is set up is:

DP from GPU to monitor on both AMD and Nvidia.
HDMI from GPU to capture card on both AMD and Nvidia.
Screen resolution is set to duplicate monitor 1 onto monitor 2 (the capture card) on both AMD and Nvidia.
For the video in the OP, both were recorded at 1080p 30 fps at 30 Mbps.
The final vids are put into Vegas, split and then encoded, then uploaded to YouTube.

Essentially, the exact same setup for both, so any discrepancy should show up, but the vid is such poor quality that it would be unwise to draw comparisons from it.
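As a quick sanity check on those recording settings, bits-per-pixel is a rough proxy for how hard the encoder has to compress. The numbers below come straight from the post (1080p, 30 fps, 30 Mbps); the interpretation is only a rough rule of thumb:

```python
# Rough bitrate-density check for the recording settings in the post:
# 1080p at 30 fps, recorded at 30 Mbps.

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_bps / (width * height * fps)

bpp = bits_per_pixel(30_000_000, 1920, 1080, 30)
print(f"{bpp:.2f} bits per pixel")  # ~0.48 bpp
```

Around 0.48 bpp is a fairly generous density for H.264 at 1080p30, so most of the visible quality loss is more likely to come from YouTube's re-encode than from the local recording.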

Tonight's run will be encoded with much better quality settings, so discrepancies should show, and screenshots will also be posted for scrutiny.
 
Just wait till tonight, chaps. I will have much better 60 fps videos for comparison. I will also add screenshots, which should show things better. Just don't try to find something that isn't there, and watch with an open mind.

Any chance you could take stills and motion screenshots (whole scene rapidly changing, such as when you were running in BF4)? It would be interesting to see whether Nvidia is able to use some form of driver optimisation to forgo detail when a scene is under a high amount of motion. That is neither good nor bad; there is no point rendering detail you can't appreciate, but it would have implications for frame rate comparisons.
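One way to make that still-vs-motion comparison objective rather than eyeballed is to compute mean squared error and PSNR between matching screenshots from each card. This is a pure-stdlib sketch on raw pixel lists; loading real screenshots would need an image library (e.g. Pillow), which is outside the scope of this example:

```python
import math

def psnr(pixels_a, pixels_b, max_value=255):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel sequences.
    Higher means more similar; identical inputs give infinity."""
    assert len(pixels_a) == len(pixels_b), "images must match in size"
    mse = sum((a - b) ** 2 for a, b in zip(pixels_a, pixels_b)) / len(pixels_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_value ** 2 / mse)

# Toy example: a flat grey frame vs the same frame with +/-1 noise on every pixel.
ref = [128] * 100
noisy = [128 + (1 if i % 2 else -1) for i in range(100)]
print(f"PSNR: {psnr(ref, noisy):.1f} dB")  # ~48.1 dB
```

Comparing PSNR on a static scene versus a high-motion scene would show whether one card's output degrades disproportionately under motion, which is exactly the driver-optimisation question being asked.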
 
Personally, I'd be interested in seeing the difference between the default "Quality" setting in the Nvidia control panel and the "High Quality" setting, which disables most of the optimisations to give the best possible image quality.

This is something I will also be doing. I will do a run with "prefer maximum performance" and "prefer maximum quality". I am also wondering if there is an AF difference in my run; no idea what it was set to before, but it will be interesting to find out. Anyone fancy looking into this whilst I sweat at work? :p
 
And in that time, Nvidia will just stand still and will end up miles behind.

:rolleyes:

Gosh, I did not know I had HBM on my Titan Xs. :rolleyes:

The difference is because we are talking about a brand-new tech (HBM) compared to the old tech, GDDR5. It is going to take the people who write the drivers time to get the most out of the new tech, whereas with GDDR5 there is not much more to be had.
 
Gosh, I did not know I had HBM on my Titan Xs. :rolleyes:

The difference is because we are talking about a brand-new tech (HBM) compared to the old tech, GDDR5. It is going to take the people who write the drivers time to get the most out of the new tech, whereas with GDDR5 there is not much more to be had.

You could have just asked me and I'd have told you that. Next time, don't brag about your forum purchases all over the place and you might have more time to investigate what you buy.

The clue is in what Joe has said: 'time', 'drivers', 'meal of anything'. Yeah, AMD have had 2 years and this is the best they come up with; drivers? lol. Meal of anything? You'd think they would at the very least have the 2nd fastest single GPU, but nope, not even that.
 
I have just picked up a 980 Ti after being torn between it and the Fury X. I would be happy to add screens and vids for this. Just let me know what to run and the best way to achieve good results for comparison purposes. Battlefield screens are out though, since I don't own it.
 
You could have just asked me and I'd have told you that. Next time, don't brag about your forum purchases all over the place and you might have more time to investigate what you buy.

The clue is in what Joe has said: 'time', 'drivers', 'meal of anything'. Yeah, AMD have had 2 years and this is the best they come up with; drivers? lol. Meal of anything? You'd think they would at the very least have the 2nd fastest single GPU, but nope, not even that.

I am not the only one on the forums with TXs; perhaps you should go and look in the owners' thread. lol

As for the drivers, AMD have only just got the card in its final form onto the market, so it will be a case of starting from scratch.
 
You could have just asked me and I'd have told you that ? Next time dont brag your forum purchases all over the place and you might have more time to investigate what you buy.

Another thing about the Titan X owners is that a number of us have been open-minded enough to try the Fury X, and most of us like the new card.

Gregster has done a really nice review; it is an unbiased, open-minded and honest piece of work. Maybe you could watch the video if you have not done so already, or better still, go out and buy a Fury X and give it a go yourself.
 
Hang on a minute though, AMD are the ones who have both developed the card and produced the drivers. Performance should be there now, with minor tweaks to come.

Replace AMD with Nvidia and it's the same argument. Performance generally goes up with driver improvements over time, regardless of GPU maker.
 