AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
So do you think the 3080 was going to be slower, then Nvidia got tipped off and had to push the 3080 to its limits? Or, do you think that AMD are now pushing their card to the limits knowing what the 3080 and 3090 can do? Either way round is feasible.

My opinion would be that if Nvidia's spies/insiders knew it was a repeat of the 5700 XT vs 2080/Ti saga, then the special cooling would not have seen the light of day and the red-lined clock speeds for the power draw would not have been so aggressive.
 
Things are getting interesting, if true. But I do wonder how AMD will price these cards. Will they undercut Nvidia with a better or similar GPU, knowing about Nvidia's supply issues?

Let's be realistic for a moment. I don't think AMD's card will best the 3080, but it will be near enough to be benchmarked against it for a good few months. The special edition will get near to the 3090 or even beat it, but I won't be buying it (I can't see it being a value purchase, and they won't give it away knowing they can get over a grand).

So going back to the flagship vs the 6900 XT, or whatever its name will be: as long as they price it around the FE price or below £649, it will sell. Especially if they have stock to shift on day one, as they say.
 
It's looking likely it will be around 3080 performance, give or take. Stock is key. And hopefully, if there is reasonable stock, then efforts will be made to stop bots, stop overseas purchases and allow only one per customer @Gibbo . I actually want to buy one of these.
 
I'm keeping an open mind on this, realistically. I really don't know if RDNA 2 will beat Ampere or not. But if they are truly competing with Ampere, I do have to wonder how they will price RDNA 2.
If it's true they have plenty of stock, will they undercut Ampere to entice people who would otherwise not consider AMD, or will they go with a Zen 3 pricing strategy? In particular for the SKU that is on par with the 3090.
 
If they beat Nvidia, I would expect them to price at the same level or higher, as they want to be seen as a premium brand and so will set premium prices.

They might not even have the margin to cut prices much, as they would have expected a higher price from Nvidia on the 3080 and so might have committed to a higher BOM.
 
If there is going to be any disruption on price, I'd expect it to be in the 3070 slot. If the Nvidia 3080 margins are as slim as people claim, I just can't see where AMD has to go when they are using a more expensive node and don't have access to GDDR6 prices as good as Nvidia's.
 
I think a ~260W 3080 would have been a small incremental improvement over the (stock) 2080 Ti, and Nvidia could have sold plenty just by bringing the price back down to the $700 price point. Blowing out the power looks like an "adjustment" that may not have been in the original plan, though.

Nvidia appears to be trying very hard.
 
I agree; I doubt NV originally planned for 320W and 350W TDPs on the RTX 3080 / 3090. It looks very much like they decided to 'max them out' at a later stage.

For one thing, the RTX 3080 spec recommends a 700W PSU, enough to rule out some customers, who may opt for a lower-TDP GPU from AMD instead.

The RTX 3080 uses ~45% more power than the 3070 but offers only about 13-24% higher performance (assuming 3070 performance is similar to the RTX 2080 Ti at 1080p or 4K), so the RTX 3070 is gonna be one of the top GPUs for performance per £.

Assuming what we hear about the RTX 3070 is true (lower TDP, similar or greater performance vs the RTX 2080 Ti), I think NV will be praised for the value of the RTX 3070, costing less than half the price of the RTX 2080 Ti (typically £1,200 or more in the UK).

It's also gonna be sold for 500 dollars in the US, equivalent to ~£387.15 (tax free, I think, if bought within some US states).
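Just to sanity-check that perf-per-£ claim, here's a quick back-of-envelope in Python. All figures are the thread's own ballpark numbers (3070 at ~2080 Ti performance, 3080 13-24% faster, UK prices assumed at £469 / £649 / £1,200), not measured data:

```python
# Rough perf/£ and perf/W comparison using the ballpark numbers from the post.
# Prices and performance figures are assumptions taken from the thread.
usd_to_gbp = 387.15 / 500  # exchange rate implied by the post (~0.774)

cards = {
    # name: (relative performance vs 2080 Ti = 1.0, board power in W, UK price in £)
    "RTX 3070":    (1.00, 220, 469),   # assumed ~2080 Ti perf, assumed UK MSRP
    "RTX 3080":    (1.20, 320, 649),   # midpoint of the quoted 13-24% uplift
    "RTX 2080 Ti": (1.00, 260, 1200),
}

for name, (perf, watts, price) in cards.items():
    print(f"{name}: perf per £1000 = {perf / price * 1000:.2f}, "
          f"perf per kW = {perf / watts * 1000:.2f}")
```

On those assumptions the 3070 comes out top on both perf/£ and perf/W, which matches the post's conclusion.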
 

Nvidia's over-engineered cooling solution shows they were expecting a hot card. They knew for a very long time that the card would have a high TDP.

A power supply is not really a consideration when buying a GPU for many, many users. I expect anyone buying a £650 graphics card to have a very nice monitor/TV to pair it with; I expect the same for their power supply, and anyone dropping this kind of money will happily upgrade their PSU if needed.

I also think you're way too hung up on price here. I get that you want the best value you can, but AMD will price the graphics card according to performance, and whilst the 3070 looks good value (and I expect the AMD cards to offer good value as well), I doubt we're gonna get 3080 performance for 500 dollars. AMD are a company, after all.
 
Upgrading the cooler from the last gen would've been an easy decision, considering the RTX 2080 Ti can exceed 80 degrees.

Not everyone will be willing to upgrade their PSU if a GPU isn't very power efficient. A TDP over 300W for a single-GPU graphics card isn't exactly efficient when you compare the performance vs the RTX 2080 Ti. And higher power usage always increases max temps...

I think many would rather keep their current PSU and save the £100 or so by getting an equivalent or greater AMD GPU, which (in recent years) haven't had TDPs over 300W.

RDNA 2/3 are both planned to make large improvements to power efficiency, so I think this will become more relevant to customers in the next year or so.
 
Depends a bit on the implementation - IIRC Quake 2 RTX grabs an additional 2.5GB allocation for screen buffers alone at 4K.

Mhm, that's full path tracing though; we probably won't see many other games like that.



These cards were definitely clocked over the intended design.

They do undervolt very well though. One of the best results I've seen to date is one person who managed to cut his 3080 FE from 320W at stock to 170W with an undervolt and downclock, and kept 95% of stock performance; now their card only gets up to 50°C at 100% load.

Nvidia could have kept all the 3080 cards at 200W or less if they wanted, with very minimal performance loss, but they really wanted that extra 5%.
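The efficiency gain implied by that undervolt claim is pretty striking. A quick sketch of the arithmetic (taking the poster's 320W → 170W at 95% performance figures at face value):

```python
# Perf/W improvement implied by the claimed 3080 FE undervolt:
# 320 W at stock -> 170 W undervolted, keeping 95% of stock performance.
stock_power, uv_power = 320.0, 170.0
perf_retained = 0.95  # fraction of stock performance kept

perf_per_watt_gain = (perf_retained / uv_power) / (1.0 / stock_power)
print(f"perf/W improvement: {perf_per_watt_gain:.2f}x")  # ~1.79x
```

Nearly 1.8x better performance per watt from tuning alone, if the claim holds.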
 
170W is very good. If NV can create a lower-power variant (with the same performance) of the RTX 3080 at ~250W, most gamers would be able to install one without upgrading their PSU. Maybe possible on 7nm or 7nm EUV?

Lots of people have criticised NV's decision to use Samsung's 8nm fab process. I think they just went with the tech that would get them the highest product yields. This makes perfect sense, with AMD still trying to catch up with NV's top GPUs.
 

There are many ways to upgrade a cooler, but Nvidia's way involved a redesign of the PCB and a different cooler direction than their normal type. We've seen many cards with three fans and a large heatsink; my Zotac GPU is massive, and look how far that gets me. If Nvidia were expecting a lower TDP, they would have made a simple cooler, not what they have.

I'm glad you brought up the 2080 Ti; many of those cards are north of 300W, and many have 520W 3-pin designs. And most people with a 2080 Ti will have a top-end PSU. My PSU is a Corsair AX760i, and it's nearly 9 years old; anyone who is going for a top-end GPU will have a PSU to match it. Sorry, but if I have a 9-year-old platinum PSU for my high-end system, I expect everyone looking at these types of cards to have a very good PSU.

You are probably right about power efficiency, but you're wide of the mark too. Look at AMD CPUs: more power efficient than Intel, but each generation they're more and more power hungry, and a 3900X is just as hard to cool as a 10900K when pushing both to the limits.
 
Do you have any proof of this GDDR6 claim you are making?

I'm not convinced AMD's 7nm GPUs are going to be as expensive to manufacture as Nvidia's shoddy Samsung 8nm ones are.

AMD has a lot of custom with TSMC; sure, wafer prices are higher than on previous nodes, but yields are very, very good by all accounts, as has been said and shown here many times. I'm fairly certain AMD are fabbing more at TSMC than Nvidia are at Samsung, probably to the point where they are getting enough discounts, and good enough yields, to get a better ROI than Nvidia.

So I ask you once again: please prove where Nvidia get a better GDDR6 price than other customers. GDDR6X, yes, I believe, as Nvidia helped design it.

Please back up your claims with evidence, as I (and probably many others) would be interested to see this info.

I don't think AMD's cost to manufacture is as high as everyone believes it is, and I think they are making big margins on their CPUs. I'm personally hoping they are going to use the markup on the CPUs to offset a lower price when bringing GPUs to market, and disrupt with an aggressive pricing campaign.

Again, as I said previously, it's their Zen tactic: bring in a competing product stack that offers insanely good price to performance and go from there.

AMD did this in the CPU space and are now leading it. The fight will be 100x tougher against Nvidia in the GPU space, but Intel will probably join in soon, so I think AMD will make hay while the sun shines.

If AMD can come to market with similar raster performance at a lower cost than Ampere, they will hoover up sales. The lack of DLSS and RT performance will keep some diehards on the green side, but there are a lot of fence-sitters who don't care about anything other than raster performance and price, and they will jump.

AMD's biggest problem under the above scenario will be stock; global demand will be huge.
 