
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Why is it that Nvidia spends a massive $2,376 million on R&D yet they can only beat AMD by approximately 20%? Am I the only one that thinks that's pathetic? Nvidia must be paying huge bonuses to their R&D department for doing pretty much nothing. Either that, or AMD has the best R&D in the world for being able to get within 20% of Nvidia with less than a third of the budget. I think that's commendable, yet everybody seems to give AMD a hard time and says they suck.

Nvidia makes a 59.75% gross margin on their products yet AMD only work to a 41% margin, so they have to sell roughly 50% more product to equal Nvidia's income. How can they possibly compete with such a monster?
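For what it's worth, here's a quick back-of-the-envelope check of that "sell 50% more product" figure. This is just a sketch assuming gross profit is simply margin × revenue, using only the margin percentages quoted above (not anything taken from the actual accounts):

```python
# Rough check of the "~50% more product" claim (a sketch, not from
# either company's actual filings). Gross profit = margin * revenue,
# so matching Nvidia's gross profit at a lower margin needs
# proportionally more revenue.
nvidia_margin = 0.5975  # 59.75% gross margin quoted above
amd_margin = 0.41       # 41% gross margin quoted above

revenue_multiple = nvidia_margin / amd_margin
extra_revenue_pct = (revenue_multiple - 1) * 100

print(f"AMD needs about {revenue_multiple:.2f}x Nvidia's revenue "
      f"(~{extra_revenue_pct:.0f}% more) for the same gross profit")
# Prints roughly 1.46x, i.e. ~46% more, in the same ballpark as the 50% figure.
```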


AMD can only afford $1.4 billion on R&D for both their CPU & GPU divisions. So even if they pumped the lot into Radeon Technologies Group, it would still be about $1 billion less than Nvidia spends.

Don't forget AMD's 7nm advantage versus Nvidia's 12nm disadvantage. So only beating AMD by 30% is quite an achievement.
 
Soldato
Joined
6 Feb 2019
Posts
17,605
Don't forget AMD's 7nm advantage versus Nvidia's 12nm disadvantage. So only beating AMD by 30% is quite an achievement.

Exactly. AMD doesn't have much advantage, at least not yet. The only reason they are even close to Nvidia's 12nm Turing is because they got a nice hand-out by being first to use TSMC's 7nm.

Also CuriousTomCat needs to remember Nvidia doesn't just make graphics cards for PCs. They have a lot of other higher-margin markets, and probably most of the development cost goes to those areas of the business.

If you go to Nvidia's main website, gaming and desktop PCs aren't on the big banner that takes up half the webpage. No, there is a banner advertising Nvidia's AI business, which powers weather prediction modelling, cancer and drug research, security systems, AI drones, self-driving cars etc. The list goes on and on, and if you click on the product stack they put gaming all the way at the bottom.

Then go to AMD's website and the half-screen banner is advertising desktop CPUs; scroll down and you're greeted with more massive banners advertising gaming products.

Even though it's hard to break down from their financial reports, I reckon if we could, we'd find the actual R&D spent on developing gaming graphics cards is probably about the same for both companies, and the extra money Nvidia spends goes to other markets, nothing to do with gaming.
 
Last edited:
Associate
Joined
21 Apr 2007
Posts
2,490
More news by speculation. I think we should ban WccfTech references. They would claim a rumour that the 3000-series Ampere includes a free goat with hay for a year if they thought people would click on it.
 
Soldato
Joined
28 May 2007
Posts
10,071
Don't forget AMD's 7nm advantage versus Nvidia's 12nm disadvantage. So only beating AMD by 30% is quite an achievement.

You have to remember that the 2080 Ti is an 18.6 billion transistor chip compared to the Radeon VII's 13.2 billion and the 5700 XT's 10.3 billion. AMD have plenty in the tank in this aspect. Even the 2070 Super has 13.2 billion transistors. AMD just need to make bigger chips to be competitive, but I don't think they will do this until they can get the power usage under control.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,240
You have to remember that the 2080 Ti is an 18.6 billion transistor chip compared to the Radeon VII's 13.2 billion and the 5700 XT's 10.3 billion. AMD have plenty in the tank in this aspect. Even the 2070 Super has 13.2 billion transistors. AMD just need to make bigger chips to be competitive, but I don't think they will do this until they can get the power usage under control.

Some of that is Tensor cores, which aren't used in games (aside from DLSS and some RTX-specific features).
 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
You have to remember that the 2080 Ti is an 18.6 billion transistor chip compared to the Radeon VII's 13.2 billion and the 5700 XT's 10.3 billion. AMD have plenty in the tank in this aspect. Even the 2070 Super has 13.2 billion transistors. AMD just need to make bigger chips to be competitive, but I don't think they will do this until they can get the power usage under control.
If AMD went bigger, it would certainly need an AIO to keep it cool enough, as they are currently struggling with temps on this arch.
 
Soldato
Joined
28 May 2007
Posts
10,071
If AMD went bigger, it would certainly need an AIO to keep it cool enough, as they are currently struggling with temps on this arch.

I don't think cooling is a problem tbf. Any of the decent 5700 XTs are cool and quiet, along with Vega, which is much more power hungry. My Vega 64 runs in the high 60s without any tweaks. They do need to work on power usage though, as there's no way they can scale up to much bigger chips on the current RDNA arch without work on this aspect. That's what needs to happen though, as without big chips they won't be competing with Nvidia unless they work out some multi-chip design using smaller chips.
 
Soldato
Joined
19 Feb 2007
Posts
14,351
Location
ArcCorp
Heard "Nvidia killer" with the 290X, then the Fury X, then Vega 64. All disappointments for the "killer" claims AMD made prior to release, like "poor Volta" etc... I'll believe it when I see it.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,639
Location
Greater London
Heard "Nvidia killer" with the 290X, then the Fury X, then Vega 64. All disappointments for the "killer" claims AMD made prior to release, like "poor Volta" etc... I'll believe it when I see it.
Exactly. Which in my opinion still won't be anytime soon.

I am far from an Nvidia fanboy, but in my opinion in 2020 the RTX 3000 series is where it will be at.

My prediction is the 3070 will offer a lot better RT performance than a 2080 Ti and traditional performance will be within 10% in either direction. The best part is it will cost less than half as much :D
 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
I don't think cooling is a problem tbf. Any of the decent 5700 XTs are cool and quiet, along with Vega, which is much more power hungry. My Vega 64 runs in the high 60s without any tweaks. They do need to work on power usage though, as there's no way they can scale up to much bigger chips on the current RDNA arch without work on this aspect. That's what needs to happen though, as without big chips they won't be competing with Nvidia unless they work out some multi-chip design using smaller chips.
Fair enough bud. I thought the 5 series ran hot, but I haven't paid too much attention in fairness.
 
Soldato
Joined
28 May 2007
Posts
10,071
Exactly. Which in my opinion still won't be anytime soon.

I am far from an Nvidia fanboy, but in my opinion in 2020 the RTX 3000 series is where it will be at.

My prediction is the 3070 will offer a lot better RT performance than a 2080 Ti and traditional performance will be within 10% in either direction. The best part is it will cost less than half as much :D

Rubbish, you know AMD are pulling an ace out of the hat in 2020 :D:D:D:D:D
 