** The AMD Navi Thread **

Associate
Joined
14 Jul 2017
Posts
128
I wish the price was at least 50 bucks cheaper, but we'll have to see what the € price is.
I don't think the cooler is going to be as bad as some of these "lOL veGa zuuler was loLZcrapz hAha Lolz" posters think.
The die is 150 W, and the memory is cooled by the shroud. On Vega 56 and Vega 64 the cooler had to handle much more, 210 W/295 W. You can't go straight off TDP, but there should be a lot less heat for the cooler to handle. And hopefully they have updated it a bit.
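Taking those numbers at face value (and with the poster's own caveat that TDP is only a rough proxy for the heat the cooler actually has to dump), quick back-of-envelope arithmetic:

```python
# Back-of-envelope cooler load, using the wattage figures quoted above.
# Assumption: TDP roughly tracks the heat the cooler must dissipate.
navi_die = 150   # claimed Navi die power (W)
vega56   = 210   # power the Vega 56 reference cooler handled (W)
vega64   = 295   # power the Vega 64 reference cooler handled (W)
print(f"vs Vega 56: {100 * (1 - navi_die / vega56):.0f}% less heat")  # ~29%
print(f"vs Vega 64: {100 * (1 - navi_die / vega64):.0f}% less heat")  # ~49%
```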

...
What's not fine is that the 5700 XT is a mid-tier product at a higher-end price point.
If, for example, this was £300, its relative lack of performance against a 2080 would be irrelevant, etc.
If it's cheaper than the 2070 and faster than the 2070, why would its performance against the 2080 be a problem?

But it has nothing to do with DLSS. The end result is very different.
AMD should have been comparing their sharpening to Nvidia's sharpening, not DLSS. But of course, AMD need to pull some PR stunts in order to trick people into thinking they have something like DLSS.

You still don't get it, though. The image won't be as good as a native 4K image, it won't be as good as DLSS, and it is absolutely nothing new. You could have done this with Nvidia drivers for 2 years.
If you care about the final outcome then you need to care about the differences between sharpening and upscaling, and you need to compare the same intended outcome.
AMD's sharpening can only be compared to Nvidia's sharpening.
What does it matter what makes the image look better or the performance faster if the end product is better overall? Are you hyped because Nvidia used AI!!! on DLSS? Do you want me to post those BFV pictures where the great DLSS is used and it looks worse than the resolution it originally upscaled from?

With RIS you can make the image look better at native resolution, or use it to help in situations where you don't run at native resolution. With DLSS you can only upscale the image. I don't understand what the problem is with comparing RIS vs DLSS if the end product is the same. You can't test an Nvidia card vs an AMD card because they draw the picture in a different way? That is ridiculous. If they both give even somewhat comparable image quality, let's say at 4K upscaled from 1440p, why wouldn't they be compared? Throw in Nvidia's sharpening if you want to; Nvidia can make their own demo if they want.
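To make the comparison concrete, here's a minimal sketch of the generic "render below native, upscale, then sharpen" pipeline being argued about. It uses Python with Pillow and a plain unsharp mask; it is not AMD's RIS or Nvidia's DLSS, and the file names are hypothetical:

```python
# Generic upscale-then-sharpen pipeline (illustrative only).
from PIL import Image, ImageFilter

frame = Image.open("frame_1440p.png")                  # hypothetical 2560x1440 render
upscaled = frame.resize((3840, 2160), Image.LANCZOS)   # spatial upscale to 4K
# Sharpening recovers some perceived detail lost in the upscale.
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=80))
sharpened.save("frame_4k_sharpened.png")               # compare vs a native 4K render
```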
 
Soldato
Joined
18 Feb 2015
Posts
6,480
It isn't junk lol. It is just badly priced IMO.

Yeah, but how low will they go to make it not be junk? I look at the FC5 performance, and the 5700, for example, isn't even besting the V56, while the 2060 has a slight leg up. Are they gonna price them at £300 & under? But wait, we haven't accounted for all the missing features the RTX cards have. However you may feel about them, those are a value-add and have to be kept in mind. And that's without adding speculation about the Super refreshes.

Will they price the 5700 at £250? No? Then it's junk.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
AMD have suggested Q1/Q2 next year for Navi 20, which will be bigger cores.

Nvidia is looking at maybe Q2 next year for Ampere, built on Samsung's EUV 7nm+ node, which in itself is already about 30% faster than TSMC's DUV 7nm node. It is expected AMD will stick with the current DUV 7nm from TSMC.

Nvidia will be looking at a 50-60% performance increase purely from going to 7nm+, regardless of any architecture changes. Given the increased transistor density they could easily increase ray-tracing performance by 2x to 3x.

At the end of the day AMD's new architecture on 7nm is still much less efficient than Turing on 12/16nm. By releasing on 7nm AMD have reduced the efficiency difference a little, but they haven't closed it.
How would nV be getting 50%-60% when AMD only got 1.25x perf from their node shrink?
 
Soldato
Joined
26 Sep 2017
Posts
6,185
Location
In the Masonic Temple
Meh, they're saturating the market while AMD releases two GPUs against Nvidia's 7 (8 if they're evil enough to make the Ti redundant).

It's kinda like with toothpaste: there are at least 6-8 versions of it at different price points. Are you going to buy the single version in the corner, or one of the dozens of boxes laid out by Colgate?
I get the fluoride-free charcoal toothpaste from the internet.
 
Soldato
Joined
6 Aug 2009
Posts
7,070
Some funny comments in here. It's just electronics, why do some try to wind up others over a few benchmarks? If they get the price right I'd buy one. At this power level ray tracing is useless. You either have high FPS or ray tracing, at the moment you can't have both.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
How would nV be getting 50%-60% when AMD only got 1.25x perf from their node shrink?

Because Samsung's EUV 7nm+ is far better than TSMC's DUV 7nm. The node sizes are pointless marketing. Essentially, going from DUV 7nm to EUV 7nm gets you the same performance gain as going from 12/16nm to 7nm DUV.

Plus the reports indicate TSMC's 7nm is about 30% faster than TSMC's 16nm, and Samsung's EUV is 25% faster than TSMC's DUV 7nm.

Add that up and you get a 50% or more improvement.

There is a very good reason Nvidia is skipping 7nm DUV.
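Taking the quoted figures at face value, the two steps compound multiplicatively, which is where the 50%+ comes from (and why AMD's single step only shows ~1.25-1.3x). A quick sanity check, assuming the figures are accurate and independent:

```python
# Compounding the two quoted node gains (both figures are the poster's
# claims, not verified numbers).
tsmc_16_to_7duv = 1.30  # "TSMC's 7nm is about 30% faster than TSMC's 16nm"
duv_to_7euv     = 1.25  # "Samsung's EUV is 25% faster than TSMC's DUV 7nm"
combined = tsmc_16_to_7duv * duv_to_7euv
print(f"16nm -> 7nm EUV: {combined:.3f}x")  # 1.625x, i.e. ~60% faster
```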
 
Associate
Joined
24 Nov 2010
Posts
2,314
But wait, we haven't accounted for all the missing features the RTX cards have. However you may feel about them, those are a value-add and have to be kept in mind. And that's without adding speculation about the Super refreshes.

Will they price the 5700 at £250? No? Then it's junk.

Well, we now know you're trolling.

All the missing features the NVIDIA RTX line has?

Like RTX? Unusable on anything other than the 2080 Ti (or dual 2080 Tis), and to really get playable framerates in the very few titles available you need to drop down to 1920x1080 (and not even then half the time), and then still endure a low-FPS stutter fest. Ludicrous on a card of that price. Moreover, all implementations so far make the games look worse - just shiny. I'm doubtful that it'll really be usable in the Ampere / AMD 'Next Gen' generation either (if AMD do actually do something *dedicated* in their 2020 product stack).

DLSS is a total joke. They tried a 'square wheel' approach to a relatively simple problem. AMD went with the common-sense (and open-source) approach, and seem to have come up with something far better that doesn't require dedicated hardware, runs in software, and has little performance penalty on either AMD or NVIDIA hardware.

What else? Decoding/encoding is now superior on Navi.

RAL (Radeon Anti-Lag)? Well, NVIDIA might respond, but they haven't had anything to say on that front yet. If it works as advertised, then it's the biggest new feature in gaming in many years.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
I smoke whiny AMD fanboys struggling to come to terms with another lacklustre launch.

You're stupid and deluded. Show me where I have shown signs of AMD fanboyism on this very forum. Unlike you, who are an NV fanatic, as all your posts show.

Because I pointed out that the slide from an unknown Korean source is fake?
Get out of this forum and go back to wccftech, troll.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
I'm tight on dosh. If I just slapped in a 2700X, would it be a worthy upgrade from a 1600, and would it propel me through games for 2-3 years?
I went from a 1600X to a 2700X and the improvement in gaming isn't massive, but it does its job well. It'll be running the latest games for years to come, so yeah, I think it'll be happily running games 4 or 5 years from now.
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
Because Samsung's EUV 7nm+ is far better than TSMC's DUV 7nm. The node sizes are pointless marketing. Essentially, going from DUV 7nm to EUV 7nm gets you the same performance gain as going from 12/16nm to 7nm DUV.

Plus the reports indicate TSMC's 7nm is about 30% faster than TSMC's 16nm, and Samsung's EUV is 25% faster than TSMC's DUV 7nm.

Add that up and you get a 50% or more improvement.

There is a very good reason Nvidia is skipping 7nm DUV.
Also, their 12nm is still more than enough against 7nm Navi...

You're stupid and deluded. Show me where I have shown signs of AMD fanboyism on this very forum. Unlike you, who are an NV fanatic, as all your posts show.

Because I pointed out that the slide from an unknown Korean source is fake?
Get out of this forum and go back to wccftech, troll.
As always in the last few years of AMD GPUs... it's not a bad product!!! The price is BAD. Vega was not bad when it came out, but it was bad for 600 freaking quid. With the deals on Vega 56 I sold 2 of them to my mates and they love the bang for buck. Was it £265 with 3 games? That was a deal!!!

It's like AMD is MILKING FANBOYS, and everyone else gets the card at a good price a few months later, sort of a deal.

Maybe Navi is the overclocker's dream I've been waiting for since Fury X times?? ;)
 
Soldato
Joined
22 Apr 2016
Posts
3,425
You're stupid and deluded. Show me where I have shown signs of AMD fanboyism on this very forum. Unlike you, who are an NV fanatic, as all your posts show.

Because I pointed out that the slide from an unknown Korean source is fake?
Get out of this forum and go back to wccftech, troll.
Calm down, sweet pea. You are famous on these forums for being both an AMD fanboy and delusional.

As a helpful reminder, I've posted a couple of threads below where you've been taken to task by others for your AMD fanboyism.

https://forums.overclockers.co.uk/t...ing-7nm-graphic-cards-in-2019.18829322/page-2

https://forums.overclockers.co.uk/threads/amd-threadripper.18779604/page-28
 
Caporegime
Joined
18 Oct 2002
Posts
29,679
Navi is not a bad GPU, just badly priced. They're pricing it the same as the competition even though they're not offering the same features (regardless of whether you will ever use them).

I think most will agree that RTG needs totally gutting and starting again!
 
Soldato
Joined
18 Feb 2015
Posts
6,480
Like RTX? Unusable on anything other than the 2080 Ti (or dual 2080 Tis), and to really get playable framerates in the very few titles available you need to drop down to 1920x1080 (and not even then half the time), and then still endure a low-FPS stutter fest.

1440p Extreme with RTX on at >50 fps... is unplayable? I guess you can sign me right up (and whoever still makes up AMD's pitiful marketshare).

 
Soldato
Joined
18 Feb 2015
Posts
6,480
https://www.extremetech.com/gaming/...ds-planned-software-improvements-for-navi-gcn

Anti-lag is supported in DX11 on all AMD GPUs. Support for DX9 games is a Navi-only feature. DX12 games are not currently supported due to dramatically different implementation requirements in that API.

Radeon Image Sharpening is a feature that pairs contrast adaptive sharpening with the use of GPU upscaling techniques to improve overall image quality above baseline without requiring the penalty of native 4K rendering.
RIS is on in this slide. The effect is very subtle. You may want to open both of the images above in separate tabs, zoom in carefully, and then compare the final product. While there’s a definite IQ improvement in the “ON” image, it’s a small one.
Still, small improvements to IQ are generally welcome. RIS was also designed by Timothy Lottes, who worked on FXAA at Nvidia. There’s no expected performance impact from using the feature (estimated performance hit is 1 percent or less). RIS is a Navi-only feature and is only supported in DX12 and DX9.

Finally, there’s FidelityFX.
FidelityFX is AMD’s new addition to GPUOpen, and is a capability it’s releasing to any developer that wants to take advantage of it. The Contrast Adaptive Sharpening tool can be used on any GPU if developers want to do so.

My emphasis.
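For anyone curious what "contrast adaptive sharpening" actually does: a rough NumPy sketch of the idea follows. This is my own simplification of the concept (sharpen less where local contrast is already high), not the actual GPUOpen FidelityFX CAS shader:

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
    """Simplified contrast-adaptive sharpen on a greyscale image in [0, 1].

    The sharpening weight shrinks where the local min/max spread is already
    large, so high-contrast edges are not over-sharpened. A rough sketch of
    the CAS idea only, not AMD's implementation.
    """
    p = np.pad(img, 1, mode="edge")             # edge-padded neighbours
    n, s = p[:-2, 1:-1], p[2:, 1:-1]            # north / south
    w, e = p[1:-1, :-2], p[1:-1, 2:]            # west / east
    lo = np.minimum.reduce([img, n, s, w, e])   # local minimum
    hi = np.maximum.reduce([img, n, s, w, e])   # local maximum
    # Headroom-based weight: near zero in high-contrast regions.
    amount = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / (hi + 1e-5), 0.0, 1.0))
    weight = amount * sharpness
    # Adaptive unsharp mask: push each pixel away from its neighbour mean.
    out = img + weight * (img - (n + s + w + e) / 4.0)
    return np.clip(out, 0.0, 1.0)

# Example: sharpen a random test image.
result = cas_like_sharpen(np.random.rand(64, 64).astype(np.float32))
```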
 
Soldato
Joined
26 Sep 2013
Posts
10,711
Location
West End, Southampton
1440p Extreme with RTX on at >50 fps... is unplayable? I guess you can sign me right up (and whoever still makes up AMD's pitiful marketshare).


I've been gaming at 100fps and above with adaptive sync for 3 or so years now, and I just can't game on anything under 60fps anymore; it's just a horrible experience. I guess I'm not alone in that. I wouldn't call under 60 unplayable, but I certainly have very little interest in doing so.
 