AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

I think AMD needs a GPU to address the RTX 3070; that GPU is going to be one of the best performance-per-£ graphics cards available (at 4K resolution), judging by the price/perf of the RTX 3080.

Hopefully this will be the case with the Big Navi GPU being shown on 8th Oct, but I think that is probably too optimistic. The RTX 3070 only has 64 ROPs, so hopefully AMD will exceed this with their similarly priced GPU.

The 3070 competitor is Navi 22
 
I think there's an assumption that AMD needs to beat the RTX 3090. With the high price of £1400 or more, I don't think they do. Hardly anyone will buy an RTX 3090; less than 1% bought the RTX 2080 Ti, according to Steam's hardware survey. Especially since the performance gap between the 3080 and 3090 is quite small.

The 3090 is not about sales per se, though. A big part of a halo product like the 3090 is for Nvidia, AMD, Intel etc. to show off and hold the "world's fastest" title. That can generate sales across the range of cards. People look at a 3090 and say it's 2x the price of a 3080 for so little extra. The top card is not about price; it's about being the fastest, and you pay for that.

Steam, whilst a good indicator, is not a way to see total sales.
 
I could be wrong :) but I thought the 5700 XT was priced within £20 of the 2070 before the Super announcement, and then AMD adjusted the price by a fair bit to counter Nvidia's move. I'm going on memory though, so I could be wrong.
Yes, the retail price was cut at launch, but AMD suggested this was a bait and switch tactic. Whatever the reason, the 5700XT was widely regarded as a good deal.
 

Yeah, but if the 2070 Super hadn't been launched they would have stuck to the price (just my opinion though). My 2 older children both have 5700 XTs in their PCs, and I thought it was an amazing card for the original price (the price has certainly crept up over time for AIB ones). But I felt that even though the 2060 had RTX stuff, it wasn't powerful enough to use it. It was definitely a gimmick, or just there to tick boxes :)
 
I think there's an assumption that AMD needs to beat the RTX 3090. With the high price of £1400 or more, I don't think they do. I doubt many will buy an RTX 3090; less than 1% have an RTX 2080 Ti installed, according to Steam's hardware survey. Especially since the performance gap between the 3080 and 3090 is quite small.

AMD can probably beat the RTX 3090 later, if they rush out more GPUs in 2021.

AMD does need something; it doesn't need to be a 3090 killer, but they need a prosumer replacement for the Radeon VII.
 
I haven't seen anyone give AMD any credit in all this.
AMD knew the Ampere launch would flop!! They sat back and let Nvidia do the song and dance, and when push came to shove, no delivery. Reminds me of the rope-a-dope!
AMD hasn't done or said anything and all leaks are sealed! They have just sat back and let Nvidia wear themselves out as their marketing backfired on them!!
It definitely looks like AMD was expecting yield issues from Ampere, and that is exactly how it looks to me, others and the media.
Now all they need to do is deliver a finishing blow on their launch date. Will they? Time will tell.

But I wouldn't expect availability for RDNA 2 until mid Nov.
I can't see how you can give a company credit for inactivity. If and when they launch a superior product at a competitive price then they will certainly deserve credit.
 
I think another assumption that's been made about RDNA 2 is a 100% increase in compute units vs RDNA gen 1.
This rumour of 80 CUs appears to be purely guesswork, possibly based on the 'Navi 2x' name mentioned by AMD.

A 50% or 75% increase in CUs is just as likely (a total of 60 or 70 compute units). We can't assume that RDNA 2 can utilize substantially more CUs than the Vega architecture, as others on this forum have previously mentioned.

Why do I mention this? It's mostly because the Series X GPU has 52 active CUs, a 30% increase over the RX 5700 XT.
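As a rough sanity check of those percentages (a minimal sketch, assuming the RX 5700 XT's 40 CUs as the RDNA 1 baseline, which isn't stated explicitly above):

```python
# Rough CU-scaling arithmetic; assumes 40 CUs (RX 5700 XT) as the RDNA 1 baseline.
rdna1_cus = 40

# 50% / 75% / 100% uplifts give 60, 70 and 80 CUs respectively.
for pct in (50, 75, 100):
    print(f"+{pct}% -> {rdna1_cus * (1 + pct / 100):.0f} CUs")

# Series X GPU for comparison: 52 active CUs is a ~30% increase over 40.
series_x_cus = 52
print(f"Series X: {(series_x_cus / rdna1_cus - 1) * 100:.0f}% over the RX 5700 XT")
```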

I wonder if the Navi 2x name has anything to do with the actual names of any of the RDNA 2 GPUs...

This AMD slide implies there are two (maybe more) Navi 2X GPUs, which are 'Top-of-Stack', link here:

https://assets.rockpapershotgun.com/images/2020/03/AMD-Big-Navi-RDNA-2.jpg
 
While I agree that they basically want to take credit for correcting the pricing that they jacked up in the first place, I still don't think one generation makes a new norm. All the generations before Turing offered ~30% improvement over the previous generation at or near the previous gen's price points. Now, with another ~30% improvement back down around Pascal's pricing, Turing is clearly the outlier.

Turing was basically Nvidia scalping their own $700 cards for far more money than the performance was worth. Either it didn't work out for Nvidia, or AMD has them scared. (I think it's a little of both.)

Agreed.

I can't see how you can give a company credit for inactivity. If and when they launch a superior product at a competitive price then they will certainly deserve credit.

Inactivity, or watching and waiting, can very much be a valid tactic. Check out military strategy. Whether or not AMD did that in this case we'll never know. The first casualty in war is the truth. I think the same applies to marketing wars too ;)

In all of this I think whoever wins the sub £500 battle is important. AMD have to at least take more market share there.
 
I wonder what the production cost of 'Big Navi' could be? The Series X SoC apparently costs ~$250 to produce (the console total is approx. $500, so they barely break even). So maybe $200 for just the GPU portion. For a Big Navi graphics card I'd roughly guess the production cost could be up to twice as much, depending on how good the die yields are. So maybe a couple of hundred dollars profit per graphics card, if sold at $600?

Edit - I've just been looking at the gross margin figures for AMD since 2018, and they range from 35% to 44.36%. Link here:
https://www.macrotrends.net/stocks/charts/AMD/amd/gross-margin

The gross margin doesn't take into account any other costs, like employee wages or R&D; it's just sales minus the cost of goods sold.
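As a very rough back-of-the-envelope illustration of the guess above (both dollar figures are the estimates from this post, not confirmed numbers):

```python
# Back-of-the-envelope margin check using the rough guesses above
# (both figures are assumptions from the post, not real BOM or pricing data).
production_cost = 400   # ~2x the ~$200 guessed for the Series X GPU portion
selling_price = 600     # hypothetical card price

gross_profit = selling_price - production_cost
gross_margin = 100 * gross_profit / selling_price
print(f"Gross profit ~${gross_profit}, gross margin ~{gross_margin:.0f}%")
# -> ~$200 profit and ~33% margin, near the low end of AMD's reported 35-44% range
```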
 

AMD haven't achieved anything yet in relation to RDNA 2. They'll get credit when they deliver.
 
These cards aren't going to be cheap to produce, and I think the 3080's pricing would have caused concern, so I fully expect these to at least match them, with more VRAM and possibly slightly less performance? We'll soon find out!
 
Seen a twitterer claiming 128 ROPs get accused of having some credibility, based on being correct about something previously.

Hope these people also keep track of the BS guesses.
 
Some more rumours (via 3dcenter.org)
https://twitter.com/Avery78/status/1316145669741051905

- DXR support is apparently only on N21 and its cut-down variants
- Implementing DXR has meant big design changes that have ramifications for how it is accommodated
- Benchmarks with DXR enabled should show increased power consumption (dual clocks?)
- N21 "XTX" consumes a lot of power (> RTX 3080)
- N21 XTX can compete with the RTX 3080 but not the RTX 3090
- Expensive to make and expensive to buy (> RTX 3080)
- Proper niche card - the big daddy
- RT perf less than the 3080
- Lots of ties/trades with the 3080 in traditional raster perf at 4K; really depends on the game, possibly > 3080 overall
- As mentioned in the Sept 20 pinned tweet, there is something funky going on with how throughput performance works, in particular with DXR RT. It takes a hit like Turing does, but maybe more so, is what I am hearing.
- I am unsure about the DXR stuff being N21-only, but that is what was said

https://twitter.com/Bullsh1t_buster/status/1316431341659987968
 
I think it's already in a frenzy with those benches, and by the hints it looks as if it wasn't top Navi. Something around 3070 performance but cheaper will suit me budget-wise. Just hoping it's not a paper launch and there's plentiful stock, though knowing AMD's history I'm not holding my breath on that one.
 