What’s not changed?
The difference in image quality between radeon and geforce.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
In a game from 15 years ago?
Can you or anyone prove this is true in current games? I'm not talking about the colours (I agree Radeon has better colours by default); I'm talking about actual image quality. Can anyone clearly show AMD has better image quality across, say, 10 games released in the past couple of years?
It's no good picking one or two games and one or two shots. It needs to be shown to be true consistently. No one has done this so far.
Someone posted comparisons from GTA V a while back and the differences in texture sharpness were still there.
I also personally noticed how washed out the GeForce image looked when I switched away from Nvidia. Both times I've switched, in fact.
That's the catch. If it were so easy to prove with amateur equipment, it would have been proved already.
Colours are relative; you shouldn't argue so much about how the colours look, or even about accuracy.
For example, how would you compare the image quality of a GeForce on a 6-bit TN monitor with a Radeon on a 10-bit VA or IPS monitor?
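That point is worth quantifying: panel bit depth alone separates the two setups by orders of magnitude in displayable colours, which swamps any driver-level difference. A quick back-of-the-envelope sketch (assuming the usual 3 RGB channels and ignoring dithering/FRC, which 6-bit TN panels often use to fake 8-bit):

```python
def colours(bits_per_channel: int) -> int:
    """Total displayable colours for an RGB panel: (2**bits) levels per channel, cubed."""
    return (2 ** bits_per_channel) ** 3

print(colours(6))   # 6-bit TN:      262,144 colours
print(colours(8))   # 8-bit:         16,777,216 colours
print(colours(10))  # 10-bit VA/IPS: 1,073,741,824 colours
```

So a 10-bit panel can show roughly 4,000 times as many colours as a 6-bit one, before either GPU's output even enters the picture.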
I don't understand why AMD is completely quiet about this instead of using its marketing to show the differences.
Also, I agree there should be a unified standard across all vendors and graphics cards, one which guarantees that users see exactly what the games' creators intended and allows no freedom for games to be rendered differently.
Why are you still talking absolute nonsense? You have consistently shown in the GPU and CPU sections of this forum that you know nothing, and that the rubbish you spout as fact is based entirely on whichever vendor you happen to be in love with at the time.
Stop posting up screenshots of games from 15 years ago to make an argument about the state of things today. @TNA has posted up something for you to do. Get screenshots of a range of current games and prove once and for all whether there is a difference. Because if what you say is actually true, it should be very easily proved. But of course, most people with common sense know it isn't true, because if it were, it would be huge news. Tech websites would be all over it, especially the pro-AMD ones.
On my OLED TV there was literally 0 difference when I switched from an RX480 to a GTX1080.
Lies! You clearly need to go to Specsavers!
I don't understand why AMD is completely quiet about this
Your TV has its own processor that adjusts the picture quality: it changes colours, compensates for missing pixels, and so on.
Does it? A TV that adds back whatever Nvidia's output is missing would be very clever indeed. How does it know what needs to be added?
I agree with everything you've said so far. I recently upgraded from an Nvidia TNT2 paired with an early Dell 1024x768 LCD monitor to a spanking new RX 5700 and an Asus ultrawide 3440x1440, and the difference is night and day. I couldn't believe my eyes; it was like some sort of technical magic was happening right in front of me. Image quality was better, games ran faster, just incredible.
Don't let people troll you, man!
As I've pointed out (and linked to videos of) before, people have actually sat down with high-end monitors and professional-grade photography equipment and done proper image-difference analysis with professional-grade software. In the vast majority of cases there were no image quality differences at all, only some very slight colour differences, which were probably explained by the link James Miller posted earlier in the thread. If there were the kind of differences you're making out as a general thing, then the likes of Hardware Unboxed and Gamers Nexus would have been all over it.
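For what it's worth, the core of that kind of analysis is simple enough that anyone can do a first pass at home: decode two screenshots captured at identical settings and compare them pixel by pixel. A minimal sketch in NumPy (decoding the screenshots into arrays is left to whatever image library you use; the synthetic frames below just stand in for two captures):

```python
import numpy as np

def diff_stats(a, b):
    """Per-pixel absolute difference between two equal-size RGB frames.

    a, b: uint8 arrays of shape (H, W, 3), e.g. two screenshots of the
    same frame decoded with any image library. Cast to int16 first so
    the subtraction can't wrap around uint8.
    """
    d = np.abs(a.astype(np.int16) - b.astype(np.int16))
    return {
        "max_diff": int(d.max()),            # worst single-channel delta
        "mean_diff": float(d.mean()),        # average delta over all channels
        "pct_pixels_changed": float(d.any(axis=-1).mean() * 100.0),
    }

# Two synthetic 4x4 "screenshots" that differ in exactly one pixel:
a = np.zeros((4, 4, 3), dtype=np.uint8)
b = a.copy()
b[0, 0, 0] = 10
print(diff_stats(a, b))  # max_diff 10; 1 of 16 pixels changed -> 6.25%
```

Two identical renders give all zeros; anything systematic (wash-out, missing detail) shows up as a non-trivial percentage of changed pixels, which is exactly the sort of evidence this thread keeps asking for.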
You keep pulling headlines from searches for stuff that backs up what you want to be the case, without even bothering to read the details, especially the comments, which go into greater depth about what is being seen and why, and often completely and conclusively counter the bit you've cherry-picked from the article.
Almost every time bigger differences are posted about, it comes down to user error and/or non-equivalent settings. There have been some exceptions: both Nvidia and AMD have in the past had bugs or intentional performance optimisations in their filtering algorithms or settings which are effectively cheats, and some games have had better compatibility with one brand or the other. But by and large, the vast majority of the time there isn't a substantial difference between them.
The in-game colour differences between Radeon (on the right) and GeForce (on the left) vary and can be large, sometimes dramatic.
In the following screenshot, you see three main differences:
- loss of detail in the water pool on the GeForce;
- different, washed-out colours on the GeForce;
- lower texture quality on the GeForce, which points to heavier texture compression.
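Claims like the third bullet can be checked numerically instead of by eye. A common sharpness proxy is the variance of a Laplacian filter over the same crop from each card's screenshot: heavier compression blurs texture detail and lowers the score. A minimal sketch (pure NumPy, assuming you've already extracted matching greyscale crops as 2-D float arrays; the checkerboard and flat arrays below are just stand-ins):

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a 2-D greyscale crop.

    Higher values mean more fine detail (sharper textures);
    a blurred or flat crop scores near zero.
    """
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

# Stand-ins for a sharp crop and a detail-free crop:
checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)  # maximal detail
flat = np.full((8, 8), 0.5)                                   # no detail
print(laplacian_variance(checker))  # 16.0
print(laplacian_variance(flat))     # 0.0
```

Run it over the same region of both screenshots: if the GeForce crop consistently scores lower across many frames and games, that would be actual evidence of texture detail loss rather than a one-off impression.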
But it would be better if Nvidia took the relevant code from its graphics drivers and exposed it to the public, so everyone could openly see what it does underneath. Visual examination via pixel-per-pixel comparison is subjective, because reviewers often call black white or vice versa.
Or game developers should examine it in the code, not visually.