
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
So what are the chances of Lisa Su riding onto the stage on an AMD bike for the reveal, whenever it finally happens? :D
I hope not; for a company pushing itself as a "premium brand" now, those bikes look like really cheap kit :P

I want her to play up the leather jacket and spatulas. How awesome would it be if she unveiled Ampere-beating cards with a bigger and bolder leather jacket AND more spatulas in her kitchen!
 

They need something, and it needs to happen sooner rather than later.

With the leak that's as near as damn it confirmed that the Xbox ships on the 10th of November, their window to release with nothing else going on is slowly shrinking.
 
I'm kind of thinking the September announcement and October 7th launch is accurate, if I'm honest. But I can also see AMD actually building up stock for a real launch, rather than Nvidia's alleged small quantity for day-one purchases and then nothing until next year. Also bear in mind the 3070 isn't coming until October either, and given how most people on this thread seem to think AMD will struggle to compete with that, if AMD are waiting to see how the competition lines up, they'll wait until the actual competitor shows.

But I think that's tosh anyway :P
 
I must admit this isn't the right place, but the other thread is full of hype.
I was wondering: if the consoles hadn't been teased earlier in the year, could AMD have blindsided Nvidia this generation?

When the consoles were announced with their specs, Nvidia would have looked at that and re-evaluated their internal estimates for AMD's RDNA2 GPUs. The consoles are the only reliable public information we have on RDNA2 and there have been no leaks regarding RDNA2.

We know that the prices of the Ampere GPUs are affected by the imminent consoles and AMD's potential GPUs.

It is highly unlikely that when Nvidia was planning these GPUs they wanted the 3090 to have a TDP of 350W; they probably would have wanted to stick to 250-300W. So we can reasonably assume they felt the need to push the clocks and go all out, just to make sure they were not going to lose to AMD.

To me it seems that the consoles being revealed this year may have saved Nvidia from defeat this generation.

(this all assumes that AMD can deliver the goods.)
 
The consoles are the only reliable public information we have on RDNA2 and there have been no leaks regarding RDNA2.

The 1660 Ti was running a low resolution, but the 2080 Tis are running the same resolution as the AMD card, and they were the fastest 2080 Tis in the database at the time (so probably heavily overclocked). Going off the recent overclocked 2080 Ti vs 3080 Doom comparison, this GPU is faster than a stock 3080.

[Attachment: BigNaviPerf.png]
 

Nvidia might not have wanted a 350W TDP, but a big factor in the production of Ampere GPUs was striking the best balance of cost, availability and so on (things like yields and other supply constraints), and I'm guessing Samsung 8nm is something of a compromise to get that balance, hence the somewhat higher TDP. I suspect the result will be AMD struggling to keep prices down at the same performance level.
 
@Twinz I had forgotten about that. Thanks.

Good point on the Samsung production. One thing to factor in is that the AMD chips will be smaller than the Nvidia performance equivalents, due to the lack of tensor cores and dedicated ray tracing cores.
 
Yup, it's going to be great. St Denis at night probably still won't hold 60 fps, but otherwise, as long as the killer settings are off or reduced (water physics, tree tessellation, etc.), 4K 60 should generally be achievable on otherwise ultra settings.

I'd take 1440p any day; 4K is still too taxing in my opinion. I'm on the 5700 XT, and at high-ish settings at 1440p I'm struggling to get 50fps at times in built-up areas (St Denis etc.). A 3080/Big Navi should hopefully get me over 100fps on average, and 80+ in built-up areas. I have a 165Hz monitor and that is still the sweet spot for gaming, imo, although I am an FPS whore. It just allows room for dips.

I'm getting about 100fps in Gears 5 multiplayer on Ultra, but would like to get close to my 165Hz refresh; hopefully Big Navi/3080 will get me there.

4K will be viable for high refresh in the next GPU gen after this, in my opinion.
 
I would generally agree with your assessment, but personally, for RDR2 in particular, I would sacrifice so much just to hit that sweet 4K spot, because the game is otherwise so blurry it bothers me a lot. It's too bad their MSAA implementation is broken; it would help a bunch.
 
Theory 1: AMD doesn't want to compete too much in order to avoid cannibalizing their console position.

Theory 2: RDNA 1's lackluster performance was purely down to tech issues, and RDNA 2 could well be 50% on top of what RDNA 1 was originally going to be; that's why Nvidia is being more competitive this time and holding a 3080 Ti in reserve.
 
Yeah, I forgot about that too, and apparently there are some very overclocked 2080 Tis on that page.
 

I am more hopeful of Intel's Xe than of any Radeon release. They are off by 15 billion transistors to offer any kind of competition at the enthusiast level (3080). I feel they have squandered their 7nm maturity purely through a lack of ambition; they could have really turned the tables this time. I think it would be safe to write off the Radeon brand and hope for Intel to fill the void.

Edit: And don't forget that the flagship Ampere chip is 54 billion transistors, while the RTX 3090's is roughly half of that. Nvidia is just slacking off while thanking AMD for letting them extend Ampere's shelf life by two more years.
 
Last edited:
A third player would be brilliant, and it is needed, but I have zero confidence Intel will compete against Nvidia or AMD in gaming graphics.
 