
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

The 5700 XT was targeted at 1440p gaming. The 1080 Ti, RTX 2080 and RTX 2080 Ti still provided higher performance at that resolution.

We know AMD is targeting the Navi 2x GPUs at 4K. As with the 5700 XT, I don't think they will be offering the most powerful GPU at the targeted resolution.

Based on the performance advantage NVIDIA had over RDNA gen 1, it seems likely that one or two GPUs, such as the RTX 3080 and RTX 3090, will still hold an advantage.
 

It wouldn't surprise me. I'd be going for the best performance-to-profit ratio, calculated from performance, profit margin and sales volume. Make a lot of them and sell them. At the right price, slightly lower performance won't matter to 99.9% of buyers; the other 0.1% have the 3090 :)
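The "performance-to-profit ratio" idea is easy to sketch. A minimal illustration of why a high-volume, slightly slower part can out-earn a halo part (the scoring function and every figure below are made up for illustration, not real pricing, margin or sales data):

```python
# Hypothetical scoring of GPU SKUs by performance x margin x volume.
# All numbers are invented illustrations, not real market data.

def profit_weighted_score(relative_perf: float, margin: float, units: int) -> float:
    """Total profit a SKU generates, weighted by how competitive it is."""
    return relative_perf * margin * units

# A cut-down "best value" part vs a low-volume halo part:
volume_sku = profit_weighted_score(relative_perf=0.85, margin=120.0, units=500_000)
halo_sku   = profit_weighted_score(relative_perf=1.00, margin=400.0, units=40_000)

print(volume_sku > halo_sku)  # the volume part wins on total weighted profit
```

With these (invented) numbers the volume SKU scores 51M against the halo SKU's 16M, which is the "make a lot of them and sell them" logic in one line.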
 

I mean, you are welcome to speculate, but I don't see how you can assume anything like that. Things change quickly, and just because something happened last generation doesn't mean it has to happen this generation.

I look forward to the competition in the high end so the better AMD does the better it is for everyone.
 

No ;)

If you game at 1080p or 1440p, you buy AMD's Big Navi. At 4K and up you buy whatever suits you best, as that's a different group that just buys expensive hardware anyway.

But there's a reason it's called the "NVIDIA killer": people wanting to upgrade at 1080p and 1440p buy at around the $500 price point, and with Big Navi you have that :D
 
I dunno @RBImGuy, if the slides we've seen (and more later) show that the 6000 series cards can push games to around 60 fps at 4K, then that's perfect for me. I agree with people on the price sentiments; I won't be buying any flagship from either brand, but whichever card in the £500 segment can cope with that is the winner.

I don't mind dropping AAA or demanding titles back down to 1440p to get a happy quality and framerate. I just know my Vega can't quite do that, so it's the next progression for me.
 
I mainly game at 1440p 165 Hz and am setting a budget of £550 max, which I firmly believe AMD will cover, hopefully with a cracking card.

I had a pre-order in for a 3080 on the 17th, ordered at 14:18, and my queue position was ridiculous. I cancelled the order over a week ago and have decided to wait for the AMD announcements and 3070 availability.

According to Moore's Law Is Dead, there should be an absurd amount of stock for the 3070... we'll see about that. After this fiasco, I'll wait and see what's on offer from both camps at the end of this month.
 

The 3070 supply would make sense if there have been yield issues, or problems hitting the clocks on the higher SKUs. It's easier to produce 3070s out of the same silicon.
 
I suppose it makes sense; we will see soon enough though. If AMD gets the performance, price and drivers right this year, we could be in for a great one... depending on stock levels!
 
I can't say for sure that this is what they are doing at Samsung, but one way products like this are often fabricated, to reduce wastage, is to put multiple products of different sizes on the same wafer (I'm not sure how advanced Samsung's facilities are in this respect). Typically, under that kind of scenario, you'd be creating something like ~7 GA104 cores for every GA102, so in theory, as far as the cores go, they should be in much higher volume than the higher-end parts. (You then obviously have the question of what the functional cores can actually do frequency-wise, etc.)

EDIT: Though the AMD 7nm wafers I've seen so far have all been end-to-end the same product, which surprised me a bit.

I'm not sure what the situation is stock-wise for GDDR6 either.
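To put rough numbers on the volume argument: a standard first-order die-per-wafer estimate shows how many more of the smaller GA104 dies fit on a 300 mm wafer than GA102 dies. The die areas below are the widely reported figures (GA102 ≈ 628 mm², GA104 ≈ 392 mm²); this is a geometric sketch only, ignoring scribe lines, yield, and any mixed-product wafers:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order die-per-wafer estimate: gross wafer area divided by die
    area, minus the usual correction for partial dies lost at the edge."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

ga102 = dies_per_wafer(300, 628)  # big die: fewer candidates per wafer
ga104 = dies_per_wafer(300, 392)  # smaller die: notably more per wafer

print(ga102, ga104)  # roughly 85 vs 146 candidate dies
```

Even on single-product wafers that's roughly 85 vs 146 candidates; a ratio like the ~7:1 quoted above would come from how much wafer capacity is allocated to each product, not from geometry alone.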
 
Someone pointed out to me that those cards use different memory and are cut from GA104, so not the same silicon as the 3080s.

Correct, but I reckon there will be hundreds of thousands of 3070s available.

Nvidia will be:

"Can't buy a 3080 or 3090 anywhere? Don't worry, we have plenty of 3070s and they are only £700 - a bargain!"
 
Good point.


This sounds like how they make TV screens: one big sheet with some 77" panels on it, plus a few 48" panels to maximise the usable area.

We really need chiplets for GPUs. That would revolutionise production and hopefully bring prices down to a reasonable level.
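The chiplet argument comes down to defect yield: under a simple Poisson yield model, a defect kills a whole monolithic die but only one small chiplet, so less fabricated silicon is thrown away. A toy sketch (the defect density and die areas are illustrative assumptions, not foundry data):

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Poisson yield model: fraction of dies that have zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

D0 = 0.1  # defects per cm^2 -- purely illustrative

monolithic = poisson_yield(D0, 600)  # one big 600 mm^2 GPU die
chiplet    = poisson_yield(D0, 300)  # one of two 300 mm^2 chiplets

# Fraction of fabricated silicon area that is discarded as defective:
waste_monolithic = 1 - monolithic
waste_chiplet    = 1 - chiplet  # good chiplets are binned individually

print(round(waste_monolithic, 2), round(waste_chiplet, 2))  # 0.45 vs 0.26
```

Because good chiplets can be tested and paired independently, the wasted-silicon fraction drops from roughly 45% to 26% in this toy example, which is exactly the production win the post is hoping for.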
 

It doesn't sound like an amazing play if AMD have their supposedly better-performing cards out in quantity.
 
Assassin's Creed Valhalla 4K system requirements:

Target - 4K / 30 FPS
Graphics Preset - Ultra High Preset
GPU - AMD RX 5700 XT / Nvidia Geforce RTX 2080
CPU - AMD Ryzen 7 3700X / Intel i7 9700K
RAM - 16 GB (Dual Channel Mode)

From here: https://www.overclock3d.net/news/so...f89b4dc3fae9023&at_ab=per-2&at_pos=1&at_tot=5

They don't even speculate on the 4K 60 fps requirements; maybe the most demanding game released this year?

At least they are (finally) using DX12...

You can run it at 1080p + 60 fps on the High preset, however, with a Vega 64 / GTX 1080.

It's odd, really, considering Ubisoft has confirmed 4K + 60 fps for the Series X.
 