Looking at the frame rates I would say the bottom images are on some sort of prehistoric GPU.
Looking at the games, I would say indeed. He was using an 8500GT for the Nvidia card, which was released back in April 2007, and god knows when support for that card was dropped. The test also has no settings shown, no driver info, no date, no GPU listed. It is actually a terrible comparison to post. At least use something more recent to make such claims.
CS Source, maximum settings, date of last modification of the screenshot is 2 December 2008, the driver must be the latest GeForce driver release at that moment, and the graphics card is an 8500GT 512MB DDR2.
CS Source, maximum settings, date of last modification of the screenshot is 2 December 2008, the driver must be the latest Catalyst driver release at that moment, and the graphics card is an HD 4670 512MB GDDR3.
Look, you can take RX Vega 64 and RTX 2080 Ti and test CS Source today. Post the images in this thread.
So 10 years old then. Talk about "during the war"
Honestly, I run AMD and Nvidia and can't notice a difference. On both I set vibrance to +63 on my monitor, as that makes colours pop, but I am sure some will see a placebo effect if they try hard enough.
I have an Asus PG348Q and love it. I also have a 55" OLED LG TV and love that. Not sure if there is something newer that would benefit me?
And the tested resolution is 1024 x 768.
About old things - change your monitor to a more recent one. I changed to an LG 24UD58 and the colours are like on an OLED screen.
I thought that AMD had better colours out of the box, though obviously you can tweak it. The real difference is over HDMI: Nvidia deliberately defaults to a limited colour range, which again you have to turn off/unlock to get the full range.
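For what it's worth, the limited-vs-full-range question is something you can roughly check from a capture rather than argue about. Here is a minimal sketch of my own (not from anyone in this thread) that just looks at whether an image's pixel values span 0-255 or sit inside the 16-235 window; the filenames are made up, and note that an in-game screenshot is usually grabbed before the output-range conversion, so this mainly makes sense for captures taken downstream (a capture card or similar).

from PIL import Image
import numpy as np

def rgb_range(path):
    # Load the capture and look at the overall spread of RGB values.
    img = np.asarray(Image.open(path).convert("RGB"))
    lo, hi = int(img.min()), int(img.max())
    print(f"{path}: min={lo}, max={hi}")
    if lo >= 16 and hi <= 235:
        print("  values stay inside 16-235, consistent with limited range output")
    else:
        print("  values reach outside 16-235, so full range is plausible")

# Hypothetical filenames - replace with real captures.
rgb_range("nvidia_capture.png")
rgb_range("amd_capture.png")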
You need to be standing in the same spot, as the lighting is totally different in both pics, which doesn't help in a comparison. Also, I'm pretty sure a difference wouldn't show in a screenshot, as surely it would be controlled by the card in situ on your machine. Apologies if I missed something previously mentioned.
CS Source: Dust 2 map
Video settings - all maxed out, anisotropic filtering 16x, AA 8x.
Ryzen 5 2500U with Vega 8 Graphics plus RX 560X 4GB GDDR5 switchable graphics.
Windows 10 Pro 1803
AMD driver 23.20.826.3072, date 16.06.2018, latest available on Acer's webpage.
1920x1080:
Different areas in each shot, and such an old system on both - a poor thing to use as a comparison.
But you can't tweak it and not have side-effects. It will cause colour banding.
AMD and Intel have the proper calibration. It's Nvidia who are off. It's like they are adding extra white to the picture.
Remember what you claimed?
How do those images demonstrate that?
Also, how about something a bit more modern than CS Source, which is 14 years old?
Some articles from reputable tech sites would help to prove your point, preferably relatively recent ones.
Otherwise it's just like the claim that AMD has dodgy drivers - they may have done once, but the stigma has stuck.
I didn't read through the thread much but I'll say this, my GPU history:
1. GT 8800
2. GTX 580
3. 7970 (mining)
4. 7990 (more mining)
5. 290X (more mining)
6. 980 Ti
7. Vega 56
I have tested back and forth many times throughout these periods, because I take ages to sell my old hardware and often run 2-3 rigs at a time side by side. I can say for sure that AMD has the following:
1. Better colour, at least by default.
2. Better detail accuracy (models, shadows) in many games.
3. Better frametimes/minimums, generally making games (especially higher-framerate titles) feel much smoother.
An easy example: in CS:GO, playing at 4K (one of the only games where you can hit 300+ FPS at 4K), the 980 Ti is noticeably stuttery and colours are washed out. I switch to the Vega 56 and it is almost unbelievable: I can play at 4K and it feels smoother than running the 980 Ti at 540p@480Hz (I have an X28 that can pull this off), while the darker areas create more "pop", letting models stand out where I would otherwise miss them on the 980 Ti. The same colour/"popping" exists across many titles.
I am in IT by trade and a programmer by hobby, so I can take guesses at where/why these things are happening. My guess is that Nvidia puts more effort into per-title optimization, which ultimately leads to detail quality/fidelity decreasing, because they are trying to strike a balance. You might set some Ultra setting for your shadows in XYZ title, but if Nvidia has reviewed the differences internally for that title and decided that Ultra shadows provide very little benefit over High while having substantial performance setbacks, then they will force High shadows and "lie" to you. It's understandable; to be clear, I'm not trying to make it sound like a negative.
Then, for colours, I believe Nvidia has issues with the cards defaulting to 6-bit or 8-bit output when 8-bit or 10-bit is available on the monitor side. I also believe they do some heavy colour compression to save on performance/memory utilization. These two things, combined with a more washed-out default colour profile, can explain the problem.
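If anyone wants to poke at the bit-depth theory, one crude check (again a sketch of my own, not something from the thread) is to count how many distinct levels each colour channel of a capture actually uses: roughly 64 levels would be consistent with 6-bit output, around 256 with 8-bit. Dithering, and the fact that in-game screenshots are taken before scan-out, can hide the real output depth, so treat it as a hint only; the filenames are hypothetical.

from PIL import Image
import numpy as np

def channel_levels(path):
    # Count the distinct tonal levels used by each channel of the capture.
    img = np.asarray(Image.open(path).convert("RGB"))
    for i, name in enumerate("RGB"):
        levels = len(np.unique(img[:, :, i]))
        print(f"{path} {name}: {levels} distinct levels out of 256")

# Hypothetical filenames - replace with real captures.
channel_levels("vega56_capture.png")
channel_levels("980ti_capture.png")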
Finally, for the "smoothness" issue, I honestly believe that AMD has better drivers. Maybe it's their hardware pipeline, maybe it's some Windows issue, but I am absolutely sure it is there, and the simplest explanation to me is that the driver is more responsive or consistent, and applications are prone to less clock-to-clock hitching/instability running on AMD.
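The smoothness part is at least measurable rather than a matter of opinion. A minimal sketch, assuming a PresentMon-style frametime log with a MsBetweenPresents column (the column name and file names are assumptions on my part), that turns "feels smoother" into average FPS and 1% lows:

import csv
import statistics

def frametime_stats(path):
    # Read per-frame times (in milliseconds) from a PresentMon-style CSV.
    with open(path, newline="") as f:
        frames = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    frames.sort()
    avg_fps = 1000.0 / statistics.mean(frames)
    worst = frames[int(len(frames) * 0.99):]   # the slowest 1% of frames
    low_1pct_fps = 1000.0 / statistics.mean(worst)
    print(f"{path}: avg {avg_fps:.1f} FPS, 1% low {low_1pct_fps:.1f} FPS")

# Hypothetical log files - one capture per card, same scene and settings.
frametime_stats("vega56_csgo.csv")
frametime_stats("980ti_csgo.csv")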
I have run pure Intel on the CPU side all these years, despite supporting AMD, and I have a predominantly Nvidia GPU history, especially if you exclude the time I spent mining on the 5000 and 7000 series. I was also an AMD investor from 2016 until a few months ago. I'm not biased, though; it really is as simple as noticing these differences when swapping out hardware. Some people will notice, others won't, and for some maybe the benefits truly aren't there.
You completely missed the first post. The purpose of the exercise is that you stand in the same position with your Nvidia graphics card and post here what you see there.
The two screenshots are on the same machine, just different places in the tunnel of Dust 2.
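If anyone does take up the Vega 64 / 2080 Ti challenge from the same spot, here is a rough way to compare the shots beyond eyeballing them. This is a sketch of my own, assuming both captures are the same resolution, with made-up filenames; it just prints per-channel means plus crude brightness and saturation numbers for each image.

from PIL import Image
import numpy as np

def summarise(path):
    # Per-channel means plus crude brightness and saturation for one capture.
    img = np.asarray(Image.open(path).convert("RGB")).astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)
    saturation = (img.max(axis=2) - img.min(axis=2)).mean()  # rough chroma proxy
    print(f"{path}: R/G/B means {means.round(1)}, "
          f"brightness {means.mean():.1f}, saturation {saturation:.1f}")

# Hypothetical filenames - same map position, same resolution assumed.
summarise("amd_dust2.png")
summarise("nvidia_dust2.png")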