AMD VEGA confirmed for 2017 H1

It's happened before - IIRC the last Mirror's Edge had it on maximum texture settings, AOTS did it with quad cores, etc. I wish more websites did image quality comparisons like they used to, since they would be able to catch these things beforehand and pass them back to AMD/Nvidia or the dev.

There was also the case during the Polaris launch where AOTS didn't render proper snow on the GTX 1080, due to a driver bug.

Many people accused AMD of running the card at a lower setting, despite the developers coming out and stating it was an NVIDIA driver issue, which in effect boosted the performance since it wasn't rendering as much.
 
Didn't they do that with some other games too? Where shadows and whatnot seemed off compared to AMD cards? Nvidia disabling some stuff to boost performance? Just to win on numbers? Cheeky!!!
Or a game developer not correctly using the API, or a driver bug. No, definitely time to put on that tin-foil hat.
 
There was also the case during the Polaris launch where AOTS didn't render proper snow on the GTX 1080, due to a driver bug.

Many people accused AMD of running the card at a lower setting, despite the developers coming out and stating it was an NVIDIA driver issue, which in effect boosted the performance since it wasn't rendering as much.
And when the bug was fixed the Nvidia card saw a performance boost. Which is not unexpected; just because there is a visual anomaly doesn't mean the card isn't rendering and computing much the same work. In fact, it's just as feasible that the rendering corruption could require additional resources.
 
And when the bug was fixed the Nvidia card saw a performance boost. Which is not unexpected; just because there is a visual anomaly doesn't mean the card isn't rendering and computing much the same work. In fact, it's just as feasible that the rendering corruption could require additional resources.

Oxide stated that the new driver didn't affect performance, and so did Ryan Smith from Anandtech.

https://forum.beyond3d.com/threads/...1070-1060-and-1050.57930/page-17#post-1919727

Has anyone confirmed whether performance is different in the bugged drivers?

Just checked it this morning on a GTX 1070. The average performance difference between 368.19 and 368.39 is 0.6%, well inside the margin of error.

I still have the Prey issue down as a mix of buggy drivers plus the game's dynamic asset scaling.

DF mentioned that on the GTX 1050 it would often drop the textures really low, which was very noticeable, and offered worse visual quality compared to the PS4, despite it upping the FPS.
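
To make the "dynamic asset scaling" point concrete, here's a rough sketch of the idea in Python: the engine keeps a VRAM budget and keeps knocking the biggest textures down to lower-resolution mips until everything fits. All of it (function names, texture names, the budget figures) is invented for illustration; it is not how Prey or any particular engine actually implements streaming.

Code:
def bytes_for_mip(width, height, mip, bytes_per_pixel=4):
    # Rough size of one mip level of an uncompressed RGBA texture.
    return max(1, width >> mip) * max(1, height >> mip) * bytes_per_pixel

def pick_mip_levels(textures, vram_budget):
    # Start every texture at full quality (mip 0) and keep pushing the
    # biggest memory user down a mip until the whole set fits the budget.
    chosen = {name: 0 for name, w, h in textures}
    size = {name: (w, h) for name, w, h in textures}
    def total():
        return sum(bytes_for_mip(*size[n], chosen[n]) for n in chosen)
    while total() > vram_budget:
        worst = max(chosen, key=lambda n: bytes_for_mip(*size[n], chosen[n]))
        if chosen[worst] >= 10:   # largest texture can't shrink any further, give up
            break
        chosen[worst] += 1
    return chosen

# Toy scene, budgets in bytes (made-up numbers).
scene = [("rock_albedo", 4096, 4096), ("wall_albedo", 4096, 4096),
         ("floor_albedo", 2048, 2048), ("prop_albedo", 2048, 2048)]
print(pick_mip_levels(scene, 256 * 1024 ** 2))  # roomy budget: everything stays at mip 0
print(pick_mip_levels(scene, 64 * 1024 ** 2))   # tight budget: the big textures get dropped

The upshot is what DF described: the low-VRAM card ends up with visibly worse textures while the frame rate is protected rather than boosted.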
 
Don't forget that there were/are also issues with Doom's IQ:


So either game developers are to blame and/or Nvidia's driver team is just very incompetent...
 
Game developers are notoriously bad at correctly following API specs. Doom is a great example because there are several well-known graphics defects that affect both AMD and Nvidia that they still haven't fixed.

https://youtu.be/pJwwh_bipJ8
 
Well, the IQ of AMD/ATI cards was better before; I didn't know it still exists. I always found ATI cards to be a bit less blurry/foggy and sharper, but I don't know about now as I have a 1080.

But when moving from a 6970 to a 770 there was still a difference.

Edit: and before that, 4870 vs 260 was even more noticeable.
 
Game developers are notoriously bad at correctly following API specs. Doom is a great example because there are several well-known graphics defects that affect both AMD and Nvidia that they still haven't fixed.

https://youtu.be/pJwwh_bipJ8

Never noticed the first issue mentioned and I've played through the game 3 times now.... Got a link to show this specifically for console and AMD users?

As for the other issues he showed, in brackets, he had "nvidia only".....

I'd certainly be interested in hearing about this as I've tested Doom on a RX 480, Radeon Pro Duo, 295x2 and a HD 7990 and been unable to reproduce that flickering.

If it's a genuine bug we'll fix it, this is the first I've heard.
 
Both output the same visuals there, but DF clearly state that on the GTX 1050 the game was dynamically switching out textures for lower ones due to less VRAM. That increases performance on the NVIDIA one.
Performance benefit would be insignificant. Running lower textures just for vRAM reasons typically doesn't mean notably increased performance, just keeping performance in check in general. Certainly nowhere near the difference in performance shown.

It's not just the mimic that's missing on the GTX 1060; so are some environmental shadows and character shadows. That's quite a lot, and shadows do impact performance a lot.
But it's not that common and these missing effects aren't there all at once, the whole time. It's brief snippets of individual missing shadows.

Seriously, there's no reason to think this is causing the performance discrepancy.
 
Nvidia never cease to amaze me lol. They treat their customers like morons.

Never ceases to amaze me that people think ATI/AMD never did underhanded things like that. The Quake 3 "optimizing" which nuked the mipmap levels so it turned the game into a blurry but much faster mess. How about the "new" Rage Pro Turbo, which was marketed as a brand new, faster card when all it was was the same Rage Pro chip with a new name on the chip, a new BIOS and a driver which could be installed on the Rage Pro anyway. These days it's BS marketing, but only at the same level as everyone else lol.

I guess I'm showing my age with those examples, but never take a company at face value, whether it be Intel/AMD/Nvidia etc.
 
Performance benefit would be insignificant. Running lower textures just for vRAM reasons typically doesn't mean notably increased performance, just keeping performance in check in general. Certainly nowhere near the difference in performance shown.


But it's not that common and these missing effects aren't there all at once, the whole time. It's brief snippets of individual missing shadows.

Seriously, there's no reason to think this is causing the performance discrepancy.

How do you know the performance benefit is insignificant? Even PCGH states it stops the game from becoming extremely stuttery and laggy.

I'm sorry, but missing light shafts, alpha particles, lighting and shadows going into the next room, and then the area with the mimic is not just a snippet. The issues are prevalent throughout their benchmark run between the GTX 1060 and RX 580 at 1080p Very High settings.

It's once they lower the shadows to High that things are similar between the tested cards. Which to me at least means that the 1060, even set to Very High Shadows, still outputs at High, which is a lower quality.

https://www.youtube.com/watch?v=d84gqMzPs2U&feature=youtu.be&t=1m43s


 
It's once they lower the shadows to High that things are similar between the tested cards. Which to me at least means that the 1060, even set to Very High Shadows, still outputs at High, which is a lower quality.
This should easily be testable, surely, by running the 1060 with shadows set to High and then Very High and seeing if the result changes?
 
This should easily be testable, surely, by running the 1060 with shadows set to High and then Very High and seeing if the result changes?

It's done in that video, although at 1080p vs 1440p.

The first half is the two cards at Very High settings and Shadows, and the 1060 is missing a lot of details throughout the footage.

Then they switch to 1440p, but lower shadow quality from Very High to High, and suddenly both cards output the same shadows throughout the rest of the run.
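
That kind of A/B test is easy to script if someone has a 1060 to hand: capture the same spot with shadows on High and then on Very High and diff the two screenshots. If "Very High" is silently falling back to High, the captures should be near-identical. Rough sketch below (needs Pillow and NumPy installed; the file names are just placeholders):

Code:
import numpy as np
from PIL import Image

def compare_captures(path_a, path_b, threshold=2):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("captures must be the same resolution")
    diff = np.abs(a - b).max(axis=2)              # biggest per-pixel channel difference
    changed = (diff > threshold).mean() * 100     # % of pixels that genuinely differ
    print(f"{changed:.2f}% of pixels differ by more than {threshold}/255")

# e.g. compare_captures("1060_shadows_high.png", "1060_shadows_veryhigh.png")

The performance side is the same idea: average the frame times over a few runs per setting and only call a difference real if it's bigger than the run-to-run spread (the ~0.6% figure quoted earlier in the thread).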
 
Performance benefit would be insignificant. Running lower textures just for vRAM reasons typically doesn't mean notably increased performance, just keeping performance in check in general.
Omega drivers mean anything to you? Back in the early 00s Nvidia tried (and succeeded) to gain a performance advantage over ATi by lowering their cards' texture quality settings: the highest setting was removed and the next highest given its name, and this was replicated down the settings so it looked like nothing had changed while the quality of each setting had actually dropped a tier. They never admitted anything untoward, and for multiple generations GeForce users had to use third-party drivers in order to run games at full texture quality.
 
How do you know the performance benefit is insignificant? Even PCGH states it stops the game from becoming extremely stuttery and laggy.
Yes, not because of framerate dips, but because of an overloaded memory bus.

And even if it was to prevent a framerate drop, it would not actually *boost* the framerate. It would just keep it more or less where it was (see the frame-time sketch at the end of this post).

I'm sorry, but missing light shafts, alpha particles, lighting and shadows going into the next room, and then the area with the mimic is not just a snippet. The issues are prevalent throughout their benchmark run between the GTX 1060 and RX 580 at 1080p Very High settings.
The issues can be seen, but they are still isolated missing features. It's not a combination of them causing a constant performance deficit.

It's once they lower the shadows to High that things are similar between the tested cards.
Yet the performance gap is still the same.
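
On the VRAM point above: keeping textures within VRAM prevents stutter rather than boosting the frame rate, and that difference mostly shows up in the worst frame times, not in the average FPS that benchmark bar charts report. A quick illustration with invented numbers:

Code:
import statistics

# Invented frame times in milliseconds: one run where the textures fit in VRAM,
# one where the card occasionally has to swap textures in mid-frame.
fits_in_vram = [16.7] * 100
thrashing = [16.0] * 99 + [120.0]

for name, frames in (("fits in VRAM", fits_in_vram), ("VRAM thrashing", thrashing)):
    avg_fps = 1000 / statistics.mean(frames)
    print(f"{name:14s}  avg {avg_fps:5.1f} fps   worst frame {max(frames):5.1f} ms")

Barely a couple of percent apart on average FPS, but one of them hitches visibly. Dropping texture quality to stay inside VRAM smooths out the second case without really moving the average, which is why it doesn't explain the gap in the bar charts either.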
 