
AMD VEGA confirmed for 2017 H1

And even if it was to prevent a framerate drop, it would not actually *boost* the framerate. It would just keep it more or less where it was.

The issues can be seen, but they are still isolated missing features. It's not a combination of them causing a constant performance deficit.

Yet the performance gap is still the same.

The gap is less, that's easy to see; and it's not isolated when for the entire run it's a visible issue at Very High settings.

What is a "non-isolated" issue to you then? Since these are seen through the entire run.

I don't know about you, but when I set graphics to certain settings, I expect parity on Image Quality with other competing products. I don't want to run at "High", when I set it to "Very High".

"It would just keep it more or less where it was." That *is* upping performance... seriously now. If it didn't lower the image quality, the performance would drop, causing lag.

If you run the game at an average of 60fps and suddenly lag because High settings drop you into the 30s, that's a loss of performance.
If the game then scales the image quality from High to Low during the lag spike to try and keep you at 60fps, that is "boosting" performance. It's just given you an extra ~30 fps.
DF mention it, they say it's very visible and apparent, and it affects the benchmark run; since the FPS doesn't dip as expected under that load.

Now if something similar is happening to Shadows and Lighting for the GTX 1060, I would be rather concerned; although I still think that part is a driver bug.
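A rough sketch of what that kind of dynamic scaling could look like. To be clear, the tier names, thresholds and 60fps target here are my assumptions for illustration, not the game's actual logic:

```python
# Hypothetical dynamic quality scaler: if the frametime budget is blown,
# drop a quality tier to recover fps; with headroom, scale back up.
QUALITY_TIERS = ["Low", "Medium", "High", "Very High"]
TARGET_FRAMETIME_MS = 1000.0 / 60.0  # 60 fps budget (~16.7 ms/frame)

def adjust_quality(current_tier: int, last_frametime_ms: float) -> int:
    """Return the new tier index after one frame's measurement."""
    if last_frametime_ms > TARGET_FRAMETIME_MS * 1.25:
        # Well over budget (e.g. dipping toward 30 fps): scale down a tier.
        return max(0, current_tier - 1)
    if last_frametime_ms < TARGET_FRAMETIME_MS * 0.75:
        # Plenty of headroom: scale back up toward the user's chosen setting.
        return min(len(QUALITY_TIERS) - 1, current_tier + 1)
    return current_tier

tier = 3  # user selected "Very High"
tier = adjust_quality(tier, 33.3)  # a ~30 fps spike
print(QUALITY_TIERS[tier])  # drops to "High"
```

Which is exactly why it would look like "free" fps in a benchmark run: the user set Very High, but parts of the run rendered a tier lower.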
 
Omega drivers mean anything to you? Back in the early 00s, Nvidia tried (and succeeded) to gain a performance advantage over ATi by lowering their cards' texture quality settings: the highest setting was removed and the next highest given its name, and this was replicated down the settings so it looked like nothing had changed, while the quality of each setting had actually been dropped a tier. They never admitted anything untoward, and for multiple generations GeForce users had to use third-party drivers in order to run games at full texture quality.
A lot has changed since then. Nowadays, it's much like texture filtering - so long as you have the bandwidth, the performance gain from lowering texture settings is pretty insignificant. You only do it if you're vRAM-limited, but both the 1050 and 460 are running 2GB on a 128-bit bus, so this shouldn't be an issue for one card but not the other. Memory compression techniques differ and could theoretically be 'the difference' here, but it's unlikely.
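For reference, the back-of-envelope bandwidth maths works out identically for both cards. The 7 Gbps effective GDDR5 data rate is my assumption for both here:

```python
# Peak memory bandwidth for a GDDR5 card:
# (bus width in bytes) * per-pin effective data rate.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

# 128-bit bus at an assumed 7 Gbps effective rate:
print(bandwidth_gbs(128, 7.0))  # 112.0 GB/s for either card
```

So on paper neither card has a raw bandwidth edge; any difference would have to come from compression or driver behaviour.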
 

2010:
iD: Guys we're bringing a game out next year that's going to use the latest OpenGL features, make sure you have your drivers up to date.
ATi: mkay,
Nvidia: mkay,

2011:
iD: Guys users are complaining of graphical issues with our awesome new game, any ideas?
AMD: Who are you?
Nvidia: Oh yeah the driver, I'll get on that ASAP dude.
iD: I hate you guys...

2016:
iD: Nice that you guys finally got your OpenGL drivers in check >.> heads up we're adding Vulkan support to Doom.
AMD: Cool, let us know if you need any help.
Nvidia: Won't Star Trek sue you if you steal their characters?
iD: How are you still in business?

:P
 
The gap is less, that's easy to see
I'm really not seeing it. Still seems like the edge is for the 1060. Which would rule out the theory that what you're describing is the only reason it is ahead.

"It would just keep it more or less where it was. " That is upping performance...seriously now. If it didn't lower the image quality the performance drops, causing lag.
It's pretty common for it to cause frametime spikes rather than an actual drop in overall performance when running settings too high for the vRAM pool to handle. This is a common cause of 'stutter' that people complain about even though their framerate seems fine.

As I said in a post above though, there's no reason for the 1050 to be vRAM-limited while the 460 isn't, as they both run pretty much identical memory configurations. And it doesn't come close to explaining the very wide gap in performance.
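To illustrate the stutter point with a toy example (the frametime numbers are made up): the average fps can still look healthy while a handful of frametime spikes are what people actually notice.

```python
# Average fps hides stutter: a few long frames barely move the mean,
# but each one reads as a visible hitch.
def avg_fps(frametimes_ms):
    """Average framerate implied by a list of frametimes (ms)."""
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def spike_count(frametimes_ms, budget_ms=16.7, factor=2.0):
    """Frames taking more than `factor` times the budget read as stutter."""
    return sum(1 for ft in frametimes_ms if ft > budget_ms * factor)

# One second of mostly smooth 16.7 ms frames with three 50 ms hitches:
frames = [16.7] * 57 + [50.0] * 3
print(round(avg_fps(frames)))  # ~54 fps: the average still looks fine
print(spike_count(frames))     # 3 spikes = the stutter people complain about
```

That's why frametime graphs (like the ones DF publish) tell you more than an fps average.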
 

The gap in performance there is the same as it was when the RX 460 launched, even though that 460 had 4GB of VRAM. Which means that if the 1050 is getting scaled-down image quality, it should be showing better performance.

Now can you please respond to the rest of my post?
What is a "non-isolated" issue?

Do you also believe running at lower settings doesn't affect performance?

Can you explain why the 1060 is missing several features, such as shadows, lighting, alpha particles and more?
Do you believe that if I ran a game and turned those off, my performance would stay the same as if they were on?

As I stated before, either there's a driver bug, or it's a combination of that with the game dynamically scaling image quality.

I'm an NVIDIA user, and I'd be upset running the game at Very High and getting image quality on par with High. What's the point of it then?
 
I'm saying the issues are isolated, as in - not present together with each other. Meaning the performance benefit of any single missing feature, especially something as simple as a single shadow, is unlikely to explain the entirety of the performance difference shown. If it was a combination of effects all missing together for the majority of the run, then sure.

And you're only speculating about Nvidia users getting 'High' overall rather than 'Very High'. This was one card on one run. If it is indeed a bug, then it stands to reason that it may not be present for every GTX1060, much less every Nvidia user in general.
 

Also it's not really a "single" shadow now is it? In some parts entire hallways are missing shadows and correct lighting. That's more than just one really.

I would hope it's a bug that's fixed soon. One bug that is affecting most NVIDIA users is stuttering thanks to the "Game Ready" drivers.

Considering we see none of that (the stuttering) in the DF video, it could be that they were running on the previous drivers, which in turn results in several features not being rendered.

I find it really interesting that similar issues were shown in one of Joker's videos, where he was trolled and lambasted so much that he pulled the video; and now the same issues are in DF's video.
 
I remember Doom 2016 had some weird pop-in issues and undetailed textures on my GTX 1070, and it wasn't until I went into the Nvidia control panel and manually changed texture quality to the highest preset that the pop-in went away. I wouldn't put it past Nvidia to "forget" to fix this sort of issue with Prey if it gives them a numbers advantage in benchmark runs.
 
Speaking of that, seems some nvidia cards aren't rendering shadows properly in the game. Which ends up with higher fps.

Looks like a driver bug at the moment.

Even the creature in that scene has better shadows on the AMD card.

https://www.youtube.com/watch?v=d84gqMzPs2U&t=2m6s

(comparison screenshots and GIFs attached)

Interesting stuff... we used to have this years ago, didn't we, with corners cut to increase fps in either camp.

I remember AMD used to have better rendering quality than Nvidia.
 
Don't forget that there were/are also issues with Doom's IQ:


So either game developers are to blame, and/or Nvidia's driver team is just very incompetent...

Game developers are notorious for failing to correctly follow API specs. Doom is a great example, because there are several well-known graphics defects that affect both AMD and Nvidia that they still haven't fixed.

https://youtu.be/pJwwh_bipJ8

Never noticed the first issue mentioned, and I've played through the game three times now. Got a link showing this specifically for console and AMD users?

As for the other issues he showed, in brackets he had "Nvidia only"...

Same here, I never saw the first issue with my Fury.

The delay with texture loading reminded me of Rage.
 
It's happened before - IIRC the last Mirror's Edge had it on maximum texture settings, AOTS did it with quad cores, etc. I wish more websites did image quality comparisons like they used to, since they would be able to catch these things beforehand and pass them back to AMD/Nvidia or the dev.

Agreed, I do miss this too. It would be great if someone dug deeper into it. One thing's for sure: Nvidia have messed up big time to call them "Game Ready" drivers when the game runs worse using them.
 
Put loads of hours into Doom and didn't get any graphical glitches in my playthroughs with OpenGL and Vulkan.

D.P., have you got any footage from AMD showing this? Or are you just trying to justify another mess-up by the great Nvidia?
 
Not that I want to get sucked into this argument at all but I think this is what DP was on about:

http://www.gamepur.com/news/23042-d...raphics-cards-id-software-confirms-patch.html

I am guessing it needs a patch or driver fix or it can't handle the VRAM needed, so is downgrading graphics.
 

Wasn't that an on-release issue?

I got the game after Vulkan released, and by then all the launch issues must have been fixed, as I never saw any problems.
Doom on Vulkan is what all games should strive to be, imo.

Plus it smells a bit fishy if AMD's release issues cause poor performance while Nvidia's boost performance.
 

That was at release, when AMD was behind on getting a proper OpenGL performance driver out for the game - hence those issues.

Once it was sorted, AMD's performance increased significantly - as in a 35% performance increase.

https://www.overclock3d.net/news/gpu_displays/amd_release_radeon_software_16_5_2_1_driver_for_doom/1
Performance increase by up to 35% on AMD Radeon R9 390 series products in Doom™ versus Radeon Software Crimson Edition 16.5.2
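Quick sanity check on what that figure means in practice, assuming a hypothetical 60fps baseline (the driver notes don't give absolute numbers):

```python
# A "35% increase" multiplies the baseline framerate by 1.35.
baseline_fps = 60.0          # assumed baseline, not from the driver notes
improved_fps = baseline_fps * 1.35
print(improved_fps)  # ~81 fps on that assumed baseline
```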
 
Right - Vega needs to come along before the end of June. My wife has booked an engineer to fit a smart electric meter on 28/06/2017 (you know, the ones that tell you how much electricity you're using in real time). That means the twin 290s need to go and be replaced by a single GPU by then, or I'll be in real trouble when she finds out how much the PC actually draws under heavy load. So come on AMD - get working on releasing Vega so I can choose between it and Pascal. :D
 