AMD VEGA confirmed for 2017 H1

Speaking of Prey, the game doesn't seem to be especially optimized for AMD GPUs. If anything, it leans very slightly towards Nvidia, though it's generally quite close.

Still a bit surprising given AMD's 'engineering' partnership with Bethesda, with Prey talked about specifically and even getting an AMD loading screen. Unless they're doing something special with Vega, I feel this won't be a 'win' for it when the Vega reviews come out.
 
Speaking of Prey, the game doesn't seem to be especially optimized for AMD GPUs. If anything, it leans very slightly towards Nvidia, though it's generally quite close.

Still a bit surprising given AMD's 'engineering' partnership with Bethesda, with Prey talked about specifically and even getting an AMD loading screen. Unless they're doing something special with Vega, I feel this won't be a 'win' for it when the Vega reviews come out.

Speaking of that, it seems some Nvidia cards aren't rendering shadows properly in the game, which ends up giving higher FPS.

Looks like a driver bug at the moment.

Even the creature in that scene has better shadows on the AMD card.

https://www.youtube.com/watch?v=d84gqMzPs2U&t=2m6s

[Screenshots and GIFs comparing shadow rendering on the AMD and Nvidia cards]
 
Speaking of Prey, the game doesn't seem to be especially optimized for AMD GPUs. If anything, it leans very slightly towards Nvidia, though it's generally quite close.

Still a bit surprising given AMD's 'engineering' partnership with Bethesda, with Prey talked about specifically and even getting an AMD loading screen. Unless they're doing something special with Vega, I feel this won't be a 'win' for it when the Vega reviews come out.

I'd imagine that Prey was practically finished by the time AMD finalised their partnership with Bethesda, so it will have had a couple of token tweaks and the logo tagged on the front, and that's about it. It'll be games currently in development that benefit more from the partnership.

Nvidia do seem to have issues with Prey, however, with their cards not rendering shadows and lighting correctly in the game.
 
Speaking of that, it seems some Nvidia cards aren't rendering shadows properly in the game, which ends up giving higher FPS.

Looks like a driver bug at the moment.

Even the creature in that scene has better shadows on the AMD card.

https://www.youtube.com/watch?v=d84gqMzPs2U&t=2m6s

[Screenshots and GIFs comparing shadow rendering on the AMD and Nvidia cards]

Didn't they do that with some other games too, where shadows and whatnot seemed off compared to AMD cards? Nvidia disabling some stuff to boost performance, just to win on the numbers? Cheeky!!!
 
Speaking of that, it seems some Nvidia cards aren't rendering shadows properly in the game, which ends up giving higher FPS.

Looks like a driver bug at the moment.

Even the creature in that scene has better shadows on the AMD card.

https://www.youtube.com/watch?v=d84gqMzPs2U&t=2m6s
I've seen that mentioned, but if you look later in the video, the RX460 is also missing the Mimic shadow.

Just in general, the game has some graphical bugs. I haven't seen anything to suggest this is creating any performance differences between Nvidia and AMD, though. A missing shadow here or there definitely isn't the reason for the existing gap.
 
I've seen that mentioned, but if you look later in the video, the RX460 is also missing the Mimic shadow.

Just in general, the game has some graphical bugs. I haven't seen anything to suggest this is creating any performance differences between Nvidia and AMD, though. A missing shadow here or there definitely isn't the reason for the existing gap.

Nope, shadows can affect performance quite massively. I have a GTX 1080 and play Planetside 2, which is Nvidia-sponsored, and upping shadows to maximum settings can drop performance by up to 20 FPS(!). When I had lesser cards, shadow settings were one of the settings I would drop to increase performance.

Edit!!

Even Nvidia agrees with me. Look at the optimisation guide for FO4.

[Screenshots from Nvidia's Fallout 4 optimisation guide]
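As a rough illustration of why (purely hypothetical numbers, not taken from any particular engine or from Nvidia's guide): each step up in shadow-map resolution quadruples the texels the GPU has to render into and then sample, which is why shadow quality tends to be one of the more expensive sliders.

Code:
// Hypothetical back-of-the-envelope: shadow-map texel and memory cost per
// quality step. Resolutions, cascade count and depth format are made up.
#include <cstdio>

int main() {
    const int resolutions[] = {1024, 2048, 4096};  // e.g. Low / High / Ultra shadow maps
    const int cascades = 4;                        // common cascade count for sun shadows
    const int bytesPerTexel = 4;                   // e.g. a 32-bit depth format

    for (int res : resolutions) {
        long long texels = 1LL * res * res * cascades;
        double mib = texels * double(bytesPerTexel) / (1024.0 * 1024.0);
        std::printf("%4d px: %lld texels across %d cascades, ~%.0f MiB of shadow maps\n",
                    res, texels, cascades, mib);
    }
    return 0;
}

Each quality step is roughly 4x the fill and sampling work, so a 20 FPS swing between minimum and maximum shadow settings isn't surprising.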
 
Nope, shadows can affect performance quite massively. I have a GTX 1080 and play Planetside 2, which is Nvidia-sponsored, and upping shadows to maximum settings can drop performance by up to 20 FPS(!). When I had lesser cards, shadow settings were one of the settings I would drop to increase performance.

Edit!!

Even Nvidia agrees with me. Look at the optimisation guide for FO4.
What are you talking about? Where did I say that shadows don't affect performance?

We're discussing some bugged missing shadows in Prey.
 
Didn't they do that with some other games too, where shadows and whatnot seemed off compared to AMD cards? Nvidia disabling some stuff to boost performance, just to win on the numbers? Cheeky!!!
That's not what's going on at all. The RX460 was missing the same Mimic shadow the 1060 was. None of this seems to be 'the difference' in performance here. It's just an odd, occasional missing shadow.

Crazy how fast people jump to conclusions.
 
That's not what's going on at all. The RX460 was missing the same Mimic shadow the 1060 was. None of this seems to be 'the difference' in performance here. It's just an odd, occasional missing shadow.

Crazy how fast people jump to conclusions.

Yep, the second one brand outperforms the other by even 1 FPS...

"REVIEWER IS BEING PAID OFF" ... "DELIBERATELY GIMPED...."

etc etc.

Whilst I'm sure some of it is of genuine concern, there do seem to be a lot of 'tin foil hat' conspiracy theories at times.

E.g. when Ryzen was first being reviewed, I remember hearing that every reviewer (other than Joker, who showed Ryzen in a good light) was paid off by Intel.
 
I've seen that mentioned, but if you look later in the video, the RX460 is also missing the Mimic shadow.

Just in general, the game has some graphical bugs. I haven't seen anything to suggest this is creating any performance differences between Nvidia and AMD, though. A missing shadow here or there definitely isn't the reason for the existing gap.


Both output the same visuals there, but DF clearly state that on the GTX 1050 the game was dynamically switching out textures for lower-resolution ones due to less VRAM. That increases performance on the NVIDIA card.
https://youtu.be/d84gqMzPs2U?t=6m

It's not the same between the RX 580 and GTX 1060, where the latter isn't rendering any of the dynamic or complex shadows.

It's not just the Mimic shadow that's missing on the GTX 1060; so are some environmental and character shadows. That's quite a lot, and shadows do impact performance a lot.

So either there's some "dynamic" asset switching in the game like on the consoles, there's a bug in the game, or the wrong settings were used.

There are also some lighting differences at 1:43, where you can see soft lighting on the RX 580 and light shafts with particles in the air, not visible on the GTX 1060.

https://youtu.be/d84gqMzPs2U?t=1m43s

[Screenshots comparing lighting and light shafts on the RX 580 and GTX 1060]


In many of those scenes where the 1060 is missing effects, it also has better performance.

Like I said, it could be a driver bug, as PCGH also reported the Game Ready driver introduced a lot of stuttering. Or it's the game dynamically using lower-quality assets for the GTX 1060 as well.

EDIT: Seems PCGH also noticed the game scaling back details to improve performance due to low VRAM.

I wonder if that applies to lighting and shadows as well.

The memory comparison is worthwhile: while the 3 GiByte of the R9 280X still allow undisturbed play, the visual quality on the GTX 770 is noticeably worse, as its 2 GiByte of video memory is insufficient.

Instead of descending into stutter, Prey reduces the texture details massively, so the GTX 770 still shows decent performance despite having too little memory, even at higher resolutions.
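For what that kind of VRAM-driven scaling might look like in principle, here's a minimal, purely hypothetical sketch - not Prey's or CryEngine's actual streaming logic, just the general idea of picking a global texture mip bias from memory pressure:

Code:
// Purely hypothetical sketch of VRAM-driven texture quality scaling.
// The budgets and working-set size are made up for illustration.
#include <cstdio>
#include <initializer_list>

// The tighter the VRAM budget relative to what the scene wants resident,
// the more mip levels get dropped (each level is ~4x less memory per texture).
int pickMipBias(double requiredMiB, double vramBudgetMiB) {
    double pressure = requiredMiB / vramBudgetMiB;
    if (pressure <= 1.0) return 0;   // everything fits: full-resolution textures
    if (pressure <= 2.0) return 1;   // drop one mip level
    return 2;                        // drop two mip levels
}

int main() {
    const double sceneTexturesMiB = 2400.0;            // made-up working set for a scene
    for (double budget : {1600.0, 2700.0, 7000.0}) {   // roughly a 2 GB, 3 GB and 8 GB card
        std::printf("budget %5.0f MiB -> mip bias %d\n",
                    budget, pickMipBias(sceneTexturesMiB, budget));
    }
    return 0;
}

Something along those lines would match what PCGH describe: the 2 GB card keeps its framerate but quietly loses texture detail, while the 3 GB card stays at full quality.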
 
Gibbo, when can we pre-order?


LOL

To those who cry about Vega availability and the claim that there will only be 20,000 units on launch day:
Don't forget that, two months on, pre-orders have been taken for many AIB GTX 1080 Tis that still have unknown delivery dates, yes?

Pay a visit to the other threads in here, like the EVGA GTX 1080 Ti FTW3/SC2 iCX...

And that's a mature tech that everyone has been expecting since last year...
Does that mean there are GDDR5X availability issues, or that NV cannot produce GTX 1080 Tis?


Completely different situations, and you know it. If you want a 1080/1080 Ti you can get one; supply was good (lots of brands and lots of models on release day). Ultra-high-end cards like, say, the Strix always come in the second wave, as fully custom PCB designs take time, and they're always more limited than the reference and "mid-range" models like the MSI Gamings.

As for the EVGA example you gave, if you had read the thread you'd realise it's EVGA screwing over retailers by keeping the majority of the stock, rather than there being a shortage of 1080 Ti chips. It's like saying you can't buy a Ford Focus when a limited-run model is hard to come by.

Compare that to the Fury X: there were only a couple of brands on release and 8-10 cards in stock at OcUK on release day (there was only ever a single board design, compared to the many for the 980 Ti/1080 Ti), with massive supply issues for weeks and months as there were so few of the cards out there.

That's probably why Nvidia stuck with GDDR5/GDDR5X for most of their Pascal boards, as the price and supply of HBM1/2 is probably still not at the levels they need it to be.
 
Both output the same visuals there, but DF clearly state that on the GTX 1050 the game was dynamically switching out textures for lower-resolution ones due to less VRAM. That increases performance on the NVIDIA card.
https://youtu.be/d84gqMzPs2U?t=6m

It's not the same between the RX 580 and GTX 1060, where the latter isn't rendering any of the dynamic or complex shadows.

It's not just the Mimic shadow that's missing on the GTX 1060; so are some environmental and character shadows. That's quite a lot, and shadows do impact performance a lot.

So either there's some "dynamic" asset switching in the game like on the consoles, there's a bug in the game, or the wrong settings were used.

There are also some lighting differences at 1:43, where you can see soft lighting on the RX 580 and light shafts with particles in the air, not visible on the GTX 1060.

https://youtu.be/d84gqMzPs2U?t=1m43s

[Screenshots comparing lighting and light shafts on the RX 580 and GTX 1060]


In many of those scenes where the 1060 is missing effects, it also has better performance.

Like I said, it could be a driver bug, as PCGH also reported the Game Ready driver introduced a lot of stuttering. Or it's the game dynamically using lower-quality assets for the GTX 1060 as well.

EDIT: Seems PCGH also noticed the game scaling back details to improve performance due to low VRAM.

I wonder if that applies to lighting and shadows as well.

I just looked at the Reddit thread on it; someone noticed it in the review done by Joker:
http://imgur.com/6w4MBj5

There is something weird going on with the GTX1060.
 
Both output the same visuals there, but DF clearly state that on the GTX 1050 the game was dynamically switching out textures for lower-resolution ones due to less VRAM. That increases performance on the NVIDIA card.
https://youtu.be/d84gqMzPs2U?t=6m

It's not the same between the RX 580 and GTX 1060, where the latter isn't rendering any of the dynamic or complex shadows.

It's not just the Mimic shadow that's missing on the GTX 1060; so are some environmental and character shadows. That's quite a lot, and shadows do impact performance a lot.

So either there's some "dynamic" asset switching in the game like on the consoles, there's a bug in the game, or the wrong settings were used.

There are also some lighting differences at 1:43, where you can see soft lighting on the RX 580 and light shafts with particles in the air, not visible on the GTX 1060.

https://youtu.be/d84gqMzPs2U?t=1m43s

[Screenshots comparing lighting and light shafts on the RX 580 and GTX 1060]


In many of those scenes where the 1060 is missing effects, it also has better performance.

Like I said, it could be a driver bug, as PCGH also reported the Game Ready driver introduced a lot of stuttering. Or it's the game dynamically using lower-quality assets for the GTX 1060 as well.

EDIT: Seems PCGH also noticed the game scaling back details to improve performance due to low VRAM.

I wonder if that applies to lighting and shadows as well.

That is quite different during the "walk through the chopper room" sequence, but whether there's actual gimping going on is debatable.
Like someone mentioned, it would be interesting to see it running on the 1060 with both the older and the Game Ready drivers, to see if there's a change there.
Personally I think that once any fine-tune patching is done we'll still see an Nvidia lead.
 
That is quite different during the "walk through the chopper room" sequence, but whether there's actual gimping going on is debatable.
Like someone mentioned, it would be interesting to see it running on the 1060 with both the older and the Game Ready drivers, to see if there's a change there.
Personally I think that once any fine-tune patching is done we'll still see an Nvidia lead.


I'm personally putting it down to a mix of the drivers and the game's dynamic quality scaling.
There are just so many scenes where the 1060 isn't rendering effects, lighting and shadows, and it's in the same scenes that it has a decent FPS lead.

Also makes me wonder which drivers DF were using, as the Nvidia Reddit, Computerbase and PCGH all found some severe stuttering on the "Game Ready" ones.

It's possible they used the previous drivers to get smoother frametimes, but as a result there are rendering issues.
If that's the case, they really should have at least added a disclaimer.
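That sort of stuttering tends to show up in the frametime trace rather than in the average FPS, which is why the driver used matters. A minimal sketch of how spikes might be flagged, assuming a plain list of frametimes in milliseconds (the values below are invented, e.g. the kind of thing a FRAPS/PresentMon-style export gives you):

Code:
// Hypothetical stutter check: count frames that take far longer than the median.
// The frametime values are made up for illustration.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> ms = {16.6, 16.8, 16.5, 43.0, 16.7, 17.0, 39.5, 16.6, 16.9, 16.7};

    std::vector<double> sorted = ms;
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];

    int spikes = 0;
    for (double t : ms)
        if (t > 2.0 * median) ++spikes;   // crude threshold: more than 2x the median frametime

    std::printf("median %.1f ms, %d spike(s) in %zu frames\n", median, spikes, ms.size());
    return 0;
}

Two runs can show near-identical average FPS while one has far more of those spikes, so average-FPS charts alone wouldn't reveal that trade-off.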
 
It could certainly be the driver, mixed with the game's engine dynamically reducing quality at times.

It's happened before - IIRC the last Mirror's Edge had it on maximum texture settings, AOTS did it with quad cores, etc. I wish more websites did image quality comparisons like they used to, since they would be able to catch these things beforehand and pass them back to AMD/Nvidia or the dev.
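A basic automated image-quality check along those lines could be as simple as diffing matched screenshots pixel by pixel. A minimal, hypothetical sketch (in practice you'd load captures from both cards taken at the same spot and settings; two tiny synthetic buffers stand in for them here):

Code:
// Hypothetical image-quality diff: report what fraction of pixels differ
// noticeably between two same-sized RGB captures. Buffers here are synthetic.
#include <cstdio>
#include <cstdlib>
#include <vector>

double fractionDiffering(const std::vector<unsigned char>& a,
                         const std::vector<unsigned char>& b,
                         int threshold = 10) {
    if (a.size() != b.size() || a.size() % 3 != 0) return -1.0;  // mismatched captures
    size_t differing = 0, pixels = a.size() / 3;
    for (size_t i = 0; i < a.size(); i += 3) {
        int d = std::abs(a[i] - b[i]) + std::abs(a[i + 1] - b[i + 1]) + std::abs(a[i + 2] - b[i + 2]);
        if (d > threshold) ++differing;
    }
    return double(differing) / double(pixels);
}

int main() {
    // Two fake 2x2 "screenshots": the second is missing a dark (shadowed) pixel.
    std::vector<unsigned char> cardA = {30,30,30,  200,180,160,  90,90,90,  10,10,10};
    std::vector<unsigned char> cardB = {30,30,30,  200,180,160,  90,90,90,  120,120,120};
    std::printf("%.0f%% of pixels differ noticeably\n", 100.0 * fractionDiffering(cardA, cardB));
    return 0;
}

Run over a handful of matched captures per game, that kind of check would flag missing shadows or swapped textures before a review goes live.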
 