• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

** The AMD Navi Thread **

So again, if prices are going to "skyrocket", what do you think will happen to the market?

Do you think skyrocketing prices is going to be sustainable?

Don't forget this isn't just the high end. With Navi, the mid-range is now starting at $400. A couple years ago the mid-range started at $150.

You think people aren't going to think twice about buying even a mid-range PC/upgrade?

Otherwise I'd love to know where people have found all these money trees.
The Navi 5700/5700 XT die size is mid-range, but its cost to AMD is high-end. For the 5700 to hit the price of the Polaris RX 480, the die would need to be ~40% of the RX 480's size.
There will always be those cheaper lower-end parts that are small, but their jump over older-gen lower-end parts won't be that big. You can't make a 2080 Ti-sized die on 7nm. So you'll have a smaller but denser die taking the 2080 Ti's place, and sadly the performance jump won't be that big. You also have to ask how much faster GPUs need to become, and how quickly 3D developers can use that extra power. If you had a $400 GPU that can run 4K60, would you really need it to become next year's $200 GPU? If you can buy a console that runs that 4K60 for $500, or you have to pay $1,300, for most people it's not going to be that hard a choice.
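The rough cost argument above can be put into numbers. This is only a sketch with made-up figures (real wafer prices and yields aren't public; the dies-per-wafer formula is a standard approximation), but it shows why a similar-sized die on a pricier node costs far more per chip:

```python
# Hypothetical numbers for illustration only -- real wafer prices and
# yields are not disclosed. The point is the shape of the economics:
# a similar die size on a pricier node costs much more per good die.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation for usable dies on a circular wafer."""
    r = wafer_diameter_mm / 2
    wafer_area = math.pi * r * r
    # Subtract an edge-loss term proportional to the circumference.
    return int(wafer_area / die_area_mm2
               - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost: float, die_area_mm2: float, yield_frac: float) -> float:
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_frac)

# Assumed figures (illustrative): ~$3k for a mature 14nm wafer, ~$9k for
# an early 7nm wafer; Polaris 10 ~232 mm^2, Navi 10 ~251 mm^2.
polaris = cost_per_die(3000, 232, 0.85)
navi = cost_per_die(9000, 251, 0.70)
print(f"Polaris-class die: ~${polaris:.0f}, Navi-class die: ~${navi:.0f}")
```

With these assumed inputs the Navi-class die comes out several times the cost of the Polaris-class one, despite a nearly identical die area, which is the gist of the post above.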
 
I’ll be optimistic and say the 5700XT is a tool to force Nvidia’s hand. If the article a few days ago is correct in saying Sapphire have registered multiple iterations then we haven’t seen anything yet.

Let’s not forget, Nvidia can’t keep releasing tweaked GPUs just because AMD launch a new range, if they do, they’re just going to **** off and alienate current owners more and more.

It would be naive to assume AMD don’t know this :) maybe they can use it to their advantage?
 
The Navi 5700/5700 XT die size is mid-range, but its cost to AMD is high-end. For the 5700 to hit the price of the Polaris RX 480, the die would need to be ~40% of the RX 480's size.
There will always be those cheaper lower-end parts that are small, but their jump over older-gen lower-end parts won't be that big. You can't make a 2080 Ti-sized die on 7nm. So you'll have a smaller but denser die taking the 2080 Ti's place, and sadly the performance jump won't be that big. You also have to ask how much faster GPUs need to become, and how quickly 3D developers can use that extra power. If you had a $400 GPU that can run 4K60, would you really need it to become next year's $200 GPU? If you can buy a console that runs that 4K60 for $500, or you have to pay $1,300, for most people it's not going to be that hard a choice.
Sorry but I very much doubt you are in a position to know how much Navi costs AMD to make. Very much doubt it. Those figures aren't publicly available, so you're best-guessing I imagine.

Also if as you say the perf bumps per gen are going to be "small", then you're not going to be driving sales much in that regard either.

"Here's our offer: a much more expensive card that gives you +15% over a much cheaper card. It's OK tho, devs don't need the extra power."

Compelling! :p

P.S. on a more serious note: devs can use every last drop of power you can give them. It really is that simple.

You will never reach a point where a dev says, "I can't do anything with all that extra grunt." Never. 100% guaranteed.
 
Radeon Image Sharpening is explained a little more in this video. It seems to read the image and its contrast, and change how much gets sharpened on the fly. So it's nothing like ReShade at all! It's not sharpening the full image; it's sharpening what needs to be sharpened, while trying to maintain the best possible AA.

It can be used at native res, or when you upscale an image from 1440p on a 4K display, to help reduce that blur. It might even help more here if you use VSR: one thing I noticed when using VSR is how blurry the image looks, so I never used it. This might help fix that a bit?
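As a toy illustration of the idea described above (not AMD's actual shader, which isn't public in this form): scale the strength of a small sharpen kernel per pixel by local contrast, so flat areas are left mostly alone and detailed areas get sharpened:

```python
# Sketch of contrast-adaptive sharpening on a 2D grayscale array in [0, 1].
# All weights and the contrast measure here are illustrative assumptions.
import numpy as np

def adaptive_sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    p = np.pad(img, 1, mode="edge")
    # Cross-shaped neighbours (up, down, left, right).
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    nmin = np.minimum.reduce([img, up, down, left, right])
    nmax = np.maximum.reduce([img, up, down, left, right])
    contrast = nmax - nmin                 # simple local-contrast estimate
    w = strength * contrast                # per-pixel sharpening amount
    # Unsharp-style kernel whose negative lobes scale with local contrast;
    # where contrast is zero the pixel passes through unchanged.
    out = img * (1 + 4 * w) - w * (up + down + left + right)
    return np.clip(out, 0.0, 1.0)
```

On a flat region the local contrast is zero, so nothing changes; across an edge the amount ramps up, which matches the "sharpen only what needs it" behaviour the video describes.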

 
I think the truth is PC has always been expensive, and at the high end very expensive. There have been times when there were some great price/performance parts, but that is the exception. I remember for many years I didn't even consider PC gaming, as getting decent performance cost an arm and a leg.

The advantage now is your PC lasts much longer than it used to. Imagine your new build being outdated inside two years, you'd really feel the cost then!

I blame the marketing mostly, trying to tell the masses they can all be PC gamers whilst trying to milk them at the same time. Gaming for the masses will always be console, as it's cheaper, easily accessible, and no knowledge is required. PC gaming is for the knowledgeable, or those prepared/able to pay 2-4x as much for higher performance.
 
Radeon Image Sharpening is explained a little more in this video. It seems to read the image and its contrast, and change how much gets sharpened on the fly. So it's nothing like ReShade at all! It's not sharpening the full image; it's sharpening what needs to be sharpened, while trying to maintain the best possible AA.

It can be used at native res, or when you upscale an image from 1440p on a 4K display, to help reduce that blur. It might even help more here if you use VSR: one thing I noticed when using VSR is how blurry the image looks, so I never used it. This might help fix that a bit?


Nice! Will give that a watch thanks :)

Do we know if it's Navi exclusive yet?
 
Sorry but I very much doubt you are in a position to know how much Navi costs AMD to make. Very much doubt it. Those figures aren't publicly available, so you're best-guessing I imagine.

Also if as you say the perf bumps per gen are going to be "small", then you're not going to be driving sales much in that regard either.

"Here's our offer: a much more expensive card that gives you +15% over a much cheaper card. It's OK tho, devs don't need the extra power."

Compelling! :p

P.S. on a more serious note: devs can use every last drop of power you can give them. It really is that simple.

You will never reach a point where a dev says, "I can't do anything with all that extra grunt." Never. 100% guaranteed.
You know designing chips is several times more expensive on a new node, you know its yields are not as good, you know the memory is more expensive, you know the wafer is more expensive. Wish all you want, but Navi is a much more expensive card for AMD than Polaris was. And if you are waiting for Nvidia's next top card to be much faster in rasterization than the 2080 Ti, I have some bad news for you. But you don't have to believe me.

Navi is a Polaris-sized die, and the whole card is Vega 64 money, if not more, even without the HBM2. What I was trying to say is that someone looking for a $200 GPU won't need 4K60 performance with their $150 23" 1080p monitor.

AMD did the same with the Navi "mid-range" as Nvidia did with the 2060: reviewers were praising the card for being so much better than the 1060 without mentioning the $150-200 price increase. Navi replaced Vega 56/64 with a die roughly half the size, but a very similar price and a small performance increase.
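The "yields are not as good" point can be illustrated with the standard Poisson yield model. The defect densities below are made up for illustration; the takeaway is that a new node's higher defect density punishes big dies far harder than small ones:

```python
# Toy Poisson yield model: yield = exp(-defect_density * die_area).
# The defect densities are assumed values, not real foundry data.
import math

def die_yield(defects_per_mm2: float, die_area_mm2: float) -> float:
    """Fraction of dies expected to be defect-free."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

mature_node, new_node = 0.001, 0.003   # assumed defects per mm^2
small, big = 150, 750                  # mm^2: midrange vs 2080 Ti-class die
for area in (small, big):
    print(f"{area} mm^2: mature {die_yield(mature_node, area):.2f}, "
          f"new {die_yield(new_node, area):.2f}")
```

Because the exponent scales with area, a 2080 Ti-sized die on an immature node yields only a small fraction of good chips in this model, which is why nobody builds one there early on.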
 
P.S. on a more serious note: devs can use every last drop of power you can give them. It really is that simple.

You will never reach a point where a dev says, "I can't do anything with all that extra grunt." Never. 100% guaranteed.

A dev might say that, but at 1080p, and avoiding Ultra settings, there is not much evidence to suggest they are able to exploit the extra GPU power consistently. Look at the joke which is depth of field and other dodgy ‘Ultra’ settings.

Destiny 2 - looks great, runs great. Does not need excesses of GPU grunt.

GTX 1080 speed is more than enough for 1080p, and that is where these new Navi cards are going to land.

If anything, given my exploits in Total War, we ran out of CPU grunt and IPC well before GPU.

The price will come down as volume sales start; the important thing is that the Navi GPU has all the features needed in its toolbox to deliver a great experience.
 
Makes sense as it might be better than first thought.

Going by the video it's definitely doing more than just sharpening the full image.

If the Navi XT was $300/350 I would upgrade from my Vega 64.

Come on AMD lol

Edit
Also worth noting: Radeon Image Sharpening will work with all games supporting the API, so there's no waiting for devs to add it to the game. But AMD also has something else called FidelityFX, which does need to be added by the developer; it can be used with Vega and maybe other GCN cards. RAGE 2 has already added support for this.

RIS - Navi only
However, this feature is currently exclusive to the Radeon RX 5700 series and, oddly enough, will only work with DirectX 9, 10, 12 and Vulkan on Windows 10. Hopefully, FidelityFX can help plug the gaps in terms of support for this feature.
 
Think my Radeon 7 is going to have to go to make room for one ;) :D

Do we know when reviews are due to drop yet? 7th?

Maybe on Saturday? Sunday is a strange day to release CPUs and GPUs lol

This isn't the best high-res image, but you can see a difference here. The rocks show much more detail.

[Attached image: ris.jpg]
 
Yeah it's subtle but definitely there. Looking forward to reviews now :)

I think Navi is gonna rock with Borderlands 3 too, which is the only other game this year I'm looking forward to :cool:
 
A dev might say that, but at 1080p, and avoiding Ultra settings, there is not much evidence to suggest they are able to exploit the extra GPU power consistently. Look at the joke which is depth of field and other dodgy ‘Ultra’ settings.

Destiny 2 - looks great, runs great. Does not need excesses of GPU grunt.

GTX 1080 speed is more than enough for 1080p, and that is where these new Navi cards are going to land.

If anything, given my exploits in Total War, we ran out of CPU grunt and IPC well before GPU.

The price will come down as volume sales start; the important thing is that the Navi GPU has all the features needed in its toolbox to deliver a great experience.
I think it comes down to basic economics.

Design for consoles first, then if you can be arsed, make the PC experience better. Or if you can't, just throw in some ultra settings which do naff all but eat FPS. Job done.

Doesn't mean devs wouldn't utilise that extra power if the consoles (their biggest market) had it. They would for sure then.

And any dev who failed to (ie didn't want to) utilise all the power that the console could provide him, would quickly be made redundant by the next dev who would. Understanding of course that it normally takes a little while to fully understand what a new console is capable of.

But I don't think you'd find too many console devs saying, "We've got all this power and we're not going to use it because our game doesn't need it." Outside of 2d/indie games and genres. Certainly no so-called AAA dev making FPS would have that attitude.

e: Might sound like I'm banging on about consoles but this is what is driving/constraining PC graphical quality so it's relevant.
 
I think it comes down to basic economics.

Design for consoles first, then if you can be arsed, make the PC experience better. Or if you can't, just throw in some ultra settings which do naff all but eat FPS. Job done.

Doesn't mean devs wouldn't utilise that extra power if the consoles (their biggest market) had it. They would for sure then.

And any dev who failed to (ie didn't want to) utilise all the power that the console could provide him, would quickly be made redundant by the next dev who would. Understanding of course that it normally takes a little while to fully understand what a new console is capable of.

But I don't think you'd find too many console devs saying, "We've got all this power and we're not going to use it because our game doesn't need it." Outside of 2d/indie games and genres. Certainly no so-called AAA dev making FPS would have that attitude.

e: Might sound like I'm banging on about consoles but this is what is driving/constraining PC graphical quality so it's relevant.

Well, there's always a commercial war going on between console vendors, as well as between consoles and PC hardware vendors. Each console brand and GPU brand would like to have the whole market to themselves, so you also have hardware evolution that seeks to be proprietary, to offer something the others don't.

It also depends on game development engines, and whether those are chasing support for the latest and greatest features within consoles and PC hardware. There's a balance to be had between having a single code base that compiles to each target and writing specific device-targeted code. These have an impact on how quickly a game can be developed, and therefore the associated costs.

I think we're stuck with consoles being the driving force though, due to affordability and thus market share, so I wouldn't expect huge differences in graphics quality on PC, but mainly higher FPS.
 
Yeah it's subtle but definitely there. Looking forward to reviews now :)

I think Navi is gonna rock with Borderlands 3 too, which is the only other game this year I'm looking forward to :cool:

I am looking forward to Navi. With the games I play, the 5700 XT will outperform the VII, simply because the chip is designed to not be hit hard by geometry-heavy games, whilst having decent performance with the AMD driver feature set.
 