
Rumour: AMD's 4096 Shader Big Navi in Q4 2019?

There is no difference when set to the same configuration. He started the post here: https://forums.overclockers.co.uk/posts/32721496/
My R5 2500U, in my answer to it here https://forums.overclockers.co.uk/posts/32734063/, can do 4K60 at 8, 10 or 12 bit, but is obviously limited by HDMI 2.0 bandwidth; over DisplayPort it can do whatever you want, same as my Vega 64.

The Vega 8 does not send only RGB signals as 4k is saying; it is a mixture of bandwidth limitations and colour bit depth over HDMI that will limit you to certain settings.
I believe it's down to his LG 4K monitor and its Ultra HD Deep Colour setting, or something funky with the laptop's dual display setting.
lg-4k.png

The LG 24UD58-B is fine; it receives YCbCr when connected via HDMI to the Vega 8.

It's the laptop screen that doesn't get this privilege, despite its eDP connection.
Radeon-Settings-10-Vega-8.png


But... I can choose how to accelerate 3D apps - with the Vega 8 or with the Polaris 21.
The image quality is different, at least in one title.
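
For reference, here is a rough back-of-the-envelope check (in Python) of the HDMI 2.0 bandwidth point quoted in the post above. The 594 MHz 4K60 pixel clock, the 14.4 Gbit/s effective HDMI 2.0 payload rate and the DisplayPort 1.4 figure are standard numbers assumed for the sketch, not values taken from the thread:

```python
# Sketch: why 4K60 colour depth is constrained by HDMI 2.0 bandwidth.
# Assumptions: CTA-861 4K60 timing (4400 x 2250 total pixels -> 594 MHz pixel
# clock), HDMI 2.0 effective payload of 14.4 Gbit/s (18 Gbit/s raw TMDS minus
# 8b/10b overhead), DisplayPort 1.4 HBR3 x4 at ~25.92 Gbit/s for comparison.

PIXEL_CLOCK_HZ = 594e6
HDMI_2_0_GBPS = 14.4
DP_1_4_GBPS = 25.92

def required_gbps(bits_per_component: int, samples_per_pixel: float = 3.0) -> float:
    """Data rate needed for the given bit depth and pixel format."""
    return PIXEL_CLOCK_HZ * bits_per_component * samples_per_pixel / 1e9

for bpc in (8, 10, 12):
    need = required_gbps(bpc)  # RGB / YCbCr 4:4:4
    print(f"4K60 RGB {bpc}-bit: {need:5.2f} Gbit/s -> "
          f"HDMI 2.0 {'OK' if need <= HDMI_2_0_GBPS else 'needs 4:2:2/4:2:0'}, "
          f"DP 1.4 {'OK' if need <= DP_1_4_GBPS else 'too much'}")

# YCbCr 4:2:2 averages 2 samples per pixel, so even 12-bit 4:2:2 fits in the
# same rate as 8-bit RGB -- hence 10/12-bit 4K60 over HDMI 2.0 only works with
# chroma subsampling, while DisplayPort has headroom for full RGB.
print(f"4K60 YCbCr 4:2:2 12-bit: {required_gbps(12, 2.0):5.2f} Gbit/s")
```

The output shows 8-bit RGB just squeezing under the HDMI 2.0 limit (~14.26 vs 14.4 Gbit/s) while 10- and 12-bit RGB do not, which matches the "mixture of bandwidth limitations and colour bit depth" description above.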
 

That pretty much answers that, then. It says it right there in the manual. I can't remember the last time an AMD card looked any different to my eyes on my monitors. I do find a difference in default presentation between brands, but it's personal choice which you prefer, tbh. I was sure even the little Vega 11 in the HP 705 G4 mini can happily drive multiple 4K screens at whatever colour depth is required over its two DisplayPort outputs.
 
You are lucky; my laptop has one HDMI port and one USB Type-C. No DP except the internal eDP.

The point is that the Vega 8, when gaming, sends RGB signals only?!
 

OK, my apologies. You said: "Oh, and I need confirmations from the others, but I think AMD lowered the image quality on its Vega cards. I think I get better picture on my Polaris 21 than on the Vega 8 which is somehow darker and with less details."
I thought you meant all Vega cards, and in particular that you meant the output from the HDMI.
I can see what you mean when using the internal screen, as it is eDP, RGB at 8-bit and non-configurable, but it is fine when using the HDMI output.
I haven't looked into it properly, but it sounds like the RX 560 in yours uses a different handshake or routing and so can output either at full RGB range or at a higher setting.
 
Now that the fascinating, rather off-topic discussion about AMD cards' colour reproduction seems to have concluded, is anyone able to answer:
Is there any precedent for a GPU releasing several months earlier than the company's road-map said?
If a 5800 XT+ stands a decent chance of coming out by Christmas, I could see myself regretting buying a 5700 XT now.
 

I am wondering just this.
 

This one speculates the 5800 XT will be announced/launched by the end of this summer :eek: Presumably to accompany the Ryzen 9 3950X.
 


I think it's pretty close too. What makes me think that is that Nvidia seem to believe it's close.

Have you noticed how they have recently been pushing how many future games are going to be ray-traced? This can't be to combat AMD's next-gen cards, as those will have ray tracing, so it's more likely the 5700 and something else they think is coming soon.
 
With 3200 shaders, Navi would be at least as fast as a 2080 and still only be ~300 mm^2, a small GPU. A 4096-shader 5900 XT would be ~375 mm^2.

Compare that to the RTX 2080 at 545 mm^2 and the 2080 Ti at 754 mm^2. Those are 12 nm; on 7 nm the 2080 would be around 410 mm^2 and the 2080 Ti around 565 mm^2.

AMD's Navi has greater performance per die area; it's actually a really good GPU, and I think Nvidia know it. On 7 nm they would have to make larger GPUs to keep up.
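
As a quick sanity check of those numbers, a hypothetical Python sketch: the 0.75x area scaling from 12 nm to 7 nm is simply what the quoted 2080/2080 Ti figures imply (not an official foundry figure), the Navi die sizes are the post's own estimates, and shaders per mm^2 is only a crude stand-in for performance per die area.

```python
# Back-of-the-envelope check of the die-size comparison above. The 0.75x
# 12 nm -> 7 nm area factor is implied by the post's own 545 -> ~410 and
# 754 -> ~565 mm^2 numbers; the Navi sizes are the post's estimates, and
# shaders/mm^2 is only a rough proxy for performance per die area.

AREA_SCALE_12_TO_7NM = 0.75

turing_12nm = {"RTX 2080": 545, "RTX 2080 Ti": 754}  # die area in mm^2
for name, mm2 in turing_12nm.items():
    print(f"{name}: {mm2} mm^2 (12 nm) -> ~{mm2 * AREA_SCALE_12_TO_7NM:.0f} mm^2 at 7 nm")

parts = {
    "Navi, 3200 shaders (est.)":  (3200, 300),
    "Navi, 4096 shaders (est.)":  (4096, 375),
    "RTX 2080 at 7 nm (est.)":    (2944, 545 * AREA_SCALE_12_TO_7NM),
    "RTX 2080 Ti at 7 nm (est.)": (4352, 754 * AREA_SCALE_12_TO_7NM),
}
for name, (shaders, mm2) in parts.items():
    print(f"{name}: {shaders / mm2:.1f} shaders per mm^2")
```

On these rough numbers the Navi parts land around 10-11 shaders per mm^2 against roughly 7-8 for the shrunk Turing dies, which is the gist of the "greater performance per die size" argument.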
 
Pretty sure both those Navi options would deliver higher performance than the Nvidia cards.

Clocks depending.

I'm not so sure about that. The 2080 Ti at 4K is 56% faster than the 5700 XT, though I think that's down to the 256-bit bus; at least the 5900 XT will get a 384-bit bus, which in theory should help bring it closer to the 2080 Ti at high resolutions.

At 1080p the 5700 XT sits 35% behind the 2080 Ti, and 46% behind at 1440p.

Source: TPU's 5700 XT review.

The 2080 Ti is a monster of a card; one can't get away from that.

Again, though: similar performance to a 2080 Ti, and if it's significantly cheaper, that's exactly what it needs to be.
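
For anyone juggling the percentages, a quick sketch that reads all three figures as the 2080 Ti's advantage over the 5700 XT (an assumption about how the post words the 1080p/1440p "behind" numbers) and converts them into relative performance:

```python
# Convert the quoted TPU gaps into "5700 XT as a percentage of the 2080 Ti",
# treating all three numbers as the 2080 Ti's advantage (an assumption about
# how the post phrases the 1080p/1440p figures).

gap_pct = {"1080p": 35, "1440p": 46, "4K": 56}  # 2080 Ti advantage in %

for res, pct in gap_pct.items():
    rel = 100 / (1 + pct / 100)
    print(f"{res}: 2080 Ti +{pct}% -> 5700 XT at ~{rel:.0f}% of the 2080 Ti")

# Roughly 74% at 1080p, 68% at 1440p and 64% at 4K: the gap grows with
# resolution, which is what the post attributes to the 256-bit memory bus.
```

The widening gap towards 4K is exactly what the 256-bit-bus argument above is about, and why a 384-bit 5900 XT would be expected to close it at high resolutions.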
 
This sounds really good. Give NVIDIA some competition, so they actually have to make decent 7 nm cards from the off rather than this Super ("here's what they should have been at launch") nonsense.

Bring back the competition so people actually have a choice.
 
The CU count has to be 80, i.e. 5120 shaders, not 64 CUs with 4096. Don't you remember that Navi was supposed to fix the 64-CU limitation of the older GCN architecture?!

A 64-CU part won't provide much of a performance uplift, will it?
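
For context, the CU and shader figures being thrown around here are the same number in different units, since GCN and Navi/RDNA both pack 64 stream processors into each Compute Unit; a minimal sketch:

```python
# CU count and shader count are two units for the same figure:
# GCN and RDNA (Navi) both use 64 stream processors per Compute Unit.

SHADERS_PER_CU = 64

for cus in (40, 64, 80):
    print(f"{cus} CUs -> {cus * SHADERS_PER_CU} shaders")

# 40 CUs -> 2560 shaders (the 5700 XT)
# 64 CUs -> 4096 shaders (the old GCN ceiling, e.g. Vega 64)
# 80 CUs -> 5120 shaders (the 'Big Navi' configuration argued for above)
```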
 