AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Makes me wonder, if those specs are correct, whether that could be part of the reason Nvidia has held back the 3070 - so that if AMD has a competitor with more VRAM, they can tweak the product.

Can you please share those assumptions?
I have been trying to find comparable Borderlands benchmarks but couldn't find any

The 5700XT gets around 29-30 FPS in Borderlands DX12 4K with badass settings. If you assume no architectural changes, no node changes, etc., you'd need a minimum of a 5120 shader part to get to 61 FPS; in reality, due to the way performance scales with hardware, you'd need more. Even when you add in realistic architectural refinements for RDNA2, realistic node refinements for frequency, etc., a 3840 shader part is still some way short - a 4608 shader part would just about get you through the door.
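That back-of-envelope scaling can be sketched in a few lines of Python. The baseline numbers come from the post above; the scaling-efficiency figure is purely an assumed illustration of sub-linear scaling, not measured data:

```python
# Rough estimate of the shader count needed to roughly double the
# RX 5700 XT's Borderlands 3 4K result. Baseline numbers are from the
# post above; scaling_efficiency is an assumption for illustration.
BASE_SHADERS = 2560   # RX 5700 XT shader count
BASE_FPS = 30.0       # approx. Borderlands 3 DX12 4K, badass settings

def required_shaders(target_fps, scaling_efficiency=1.0):
    """Shaders needed to hit target_fps if performance scales linearly,
    degraded by scaling_efficiency (< 1 models sub-linear scaling)."""
    return BASE_SHADERS * (target_fps / BASE_FPS) / scaling_efficiency

ideal = required_shaders(60)           # linear scaling: exactly double, 5120
realistic = required_shaders(60, 0.8)  # assumed 80% efficiency: 6400
```

With ideal linear scaling, doubling the framerate needs double the shaders (5120); any realistic efficiency figure below 1.0 pushes the requirement higher, which is the post's point.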
 
That's not going to happen. If Big Navi was your employee he'd have started job hunting by now. :D

But then you're probably looking at an $899 card that matches the 3090 in non-RT performance.
Yeah, Nvidia have the head start and know what is what (or should do by now), but a 3090-matching card for a good price would be awesome. I personally am a massive proponent of RT - it was the reason I bought the 2080 Ti - and whilst others slate RT, I am massively impressed with how it works in games like Control and Exodus.
 
Well, their most popular SKU has been superseded by something 33% more expensive. That's not to say it's not a good-performing CPU, but the bottom end and the 8-core have risen by more than double the amount the Ryzen 9s have.

That's a fair point. Maybe a 5700 or 5800 will fill that gap?

For those saying the RX 6000 in the teaser had an unfair advantage due to the CPU, did you see the uplift vs the 10900K at 1080p? Mostly around 5%, at a CPU-limited resolution. At 4K the difference will be barely discernible, if at all, especially for average FPS.

We can only speculate at the moment, but it wouldn't surprise me if this is the 6900, with drivers/clocks yet to be finalised. As such, when AMD pointed out it was just a 6000 series card and gave no specifics, it left room for a binned top-tier card with more performance.

Whatever the case, I'm impressed AMD have managed to increase performance over their last generation so significantly and have shown numbers in the same performance tier as the 3080. When was the last time AMD could claim such a feat? Pricing will of course be key. Two and a half weeks to go!
 
I mean really... it's not even remotely close to rocket science how I am arriving at this conclusion.

You're right. It's not remotely close to rocket science how you got there... It is quite a logical possibility, one I even mostly agree with. What takes some degree of Jupiter brain complexity is how you got from it being a likely possibility to "I'M 100% RIGHT AND YOU'RE ALL IDIOTS FOR HAVING OTHER IDEAS."

It's not the conclusion you reach that I'm taking issue with (if you actually read my previous post, you'd see I'm mostly at the same conclusion), it's the unwavering certainty and attitude about it.
 
I can't decide if you are joking or if you are actually being serious and so slow at the same time.

To break it down:

1) The leaked Newegg document lists the part names and specs of each of the 6700XT, 6800XT and 6900XT cards.
2) The AMD benchmarks and the fps results showed performance that was on a par with the 3080... between 95 and 105%.
3) Taking this to the next logical step: if, of the cards in the leaked Newegg document, only the 6900XT's specs look like they can threaten a 3080, then how can the card used in the AMD presentation logically be anything other than the 6900XT (assuming the leaked specs are accurate)?

I mean really... it's not even remotely close to rocket science how I am arriving at this conclusion.

Yes, this, thanks for explaining in more detail Rroff!

That leaked Newegg document was fake, by the way. If you go back to the Newegg page you can see they've taken the information down and state it was taken from TechPowerUp's pages about the cards - and if you read the bottom of the page, TechPowerUp's pages are just placeholders based on speculation.
 
You're right. It's not remotely close to rocket science how you got there... It is quite a logical possibility, one I even mostly agree with. What takes some degree of Jupiter brain complexity is how you got from it being a likely possibility to "I'M 100% RIGHT AND YOU'RE ALL IDIOTS FOR HAVING OTHER IDEAS."

It's not the conclusion you reach that I'm taking issue with (if you actually read my previous post, you'd see I'm mostly at the same conclusion), it's the unwavering certainty and attitude about it.
I provided a rationale based on the only information we have... if you want to speculate that the card shown was a 6800XT then you are free to do so. Similar specs to the Newegg doc were leaked before by Moore's Law, so given that consistency I would say it is in that ballpark and seems realistic. I'm happy to agree to disagree on any of the above, and your opinion on my attitude means nothing, same as my opinion on yours. We can see on the 28th and then revisit these posts. :)
 
I know that's what he's referring to; there's a missing link though - how one gets from those specs to the GPU shown by AMD being the 6900XT. I mean, it's gonna be one of those 3 cards, and the likelihood of it being the 6700 is virtually nil.. so they previewed either the 6800XT or the 6900XT. That's literally as far as you can go without injecting some large quantities of bias while pretending to know for sure and acting all high and mighty about it *shrug*

Personally, I can see it being one of two scenarios - and yes, this is inserting some of my bias. I'm not gonna pretend to know for certain on anything.

1) The card shown was the 6900XT and AMD can just about manage to compete with the 3080 -- sort of a win for AMD as it shows they're at least still relevant in the top end, but it's not really a "win"
2) The card shown was the 6800XT and the framerate numbers were slightly inflated and the 6800XT is not quite as close to the 3080 as we're led to believe. This leaves the 6900XT, which on paper looks to be significantly faster than the 6800XT, to sit above the 3080 and possibly trade with the 3090.

The third, which I can't really see happening, is that it was the 6800XT and the numbers weren't inflated - that'd leave the 6900XT to blow away the 3090... I mean, it's possible, but I can't see it as likely.

Edit: This estimation of positioning obviously excludes RT. I've not seen enough information to assess with any accuracy what effect turning on RT will have on 6000 series framerates. The only thing I'm fairly confident of is that AMD will take a bigger hit from RT than Nvidia.

Do you think the rumoured Infinity Cache could boost performance above the estimates based on the rumoured specs?
 
I provided a rationale based on the only information we have... if you want to believe and engage in wild speculation, despite a lack of evidence, that the card shown was a 6800XT then you are free to do so.

This. This is what I'm taking issue with. There's no wild speculation in my posts - and the lack of evidence is something we're all in the same boat with. Such a condescending bullkaka way of communicating......

Do you think the rumoured Infinity Cache could boost performance above the estimates based on the rumoured specs?
*shrug* It's certainly a possibility. There are a lot of unknowns about the cards - AMD have been... not very forthcoming with info (which is uncharacteristic of AMD for GPU launches). Infinity Cache may or may not exist, and what effect it'll have is a pretty big question mark, as we've not seen it done - no data points to extrapolate from.
 
Do you think the rumoured Infinity Cache could boost performance above the estimates based on the rumoured specs?

The rumours around Infinity Cache confuse me - it is a valid technique for keeping costs down in a console-like architecture, especially if all the game designers are aware of it and work to leverage it, but the general concept is an inferior approach in a PC.

Probably unrelated, but the firmware stuff, etc. seems to suggest there are some additional IO capabilities on the Navi cards that hint at a feature we've not seen yet. It's impossible to infer anything from it, though - for all I know it might just be facilities for debugging the GPU in development.
 
*shrug* It's certainly a possibility. There are a lot of unknowns about the cards - AMD have been... not very forthcoming with info (which is uncharacteristic of AMD for GPU launches). Infinity Cache may or may not exist, and what effect it'll have is a pretty big question mark, as we've not seen it done - no data points to extrapolate from.

Cool, thanks for the constructive input, it is refreshing :) It's tech like this, which we have yet to see and which may not even exist, that makes me wonder if it was a 6800 or 6900. Just keeping my mind open to the possibilities, and it's nice not to be belittled for it.
 
I personally would love to see the top AMD card beating the 3090 and doing so in RT. If it costs as much as the 3090, so be it but at least competition is back.

Same performance, same money (or more)... well, it doesn't really matter; it's competition for the sake of the statistics, not for end-user gain. If competition doesn't bring better products at lower prices, then it could be a monopoly for all I care. There isn't really a choice.
 
The rumours around Infinity Cache confuse me - it is a valid technique for keeping costs down in a console-like architecture, especially if all the game designers are aware of it and work to leverage it, but the general concept is an inferior approach in a PC.

Yeah, those rumours did kind of come out of left field... I can see it being a possible benefit - if we look at it like the L1-L3 caches, the bandwidth could be anywhere from around 8TB/s down to 200GB/s, so looking at it that way it could be a significant boon to the architecture if it's providing a memory pool with notably higher data throughput... but I suspect it's more of a bandage to cover for the "low" bandwidth (512GB/s on the 6900XT, less than 400GB/s on the 6800XT if the leaked specs are to be believed) - as you surmise, just a cost-cutting exercise.
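For what it's worth, the potential upside of a large on-die cache can be sketched with a crude weighted-average model. Every figure below (cache bandwidth, hit rate) is an assumption for illustration, not a leaked spec:

```python
def effective_bandwidth(cache_bw_gbs, dram_bw_gbs, hit_rate):
    """Crude model: a fraction hit_rate of memory traffic is served from
    the on-die cache, the rest from GDDR6. All values in GB/s."""
    return hit_rate * cache_bw_gbs + (1 - hit_rate) * dram_bw_gbs

# Assumed figures: 2 TB/s on-die cache, 512 GB/s GDDR6, 50% hit rate.
effective_bandwidth(2000, 512, 0.5)  # 1256 GB/s effective
```

Under those assumptions, even a modest hit rate more than doubles the effective bandwidth - which is why the cache could plausibly compensate for a narrow memory bus.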
 
Just page after page of bickering... can't tell if this is an AMD or an Nvidiot's thread :p :rolleyes:


Seems to be mostly Nvidiots scared they're about to not have the best GPU in town, and the anxiety is overflowing into all sorts of laughable stuff. It's amusing to watch, and even to jebait a little.

Edit to add: I don't care if it's AMD or Nvidia; I'll decide who I want to purchase from when the AMD cards finally drop.
 
Yeah, those rumours did kind of come out of left field... I can see it being a possible benefit - if we look at it like the L1-L3 caches, the bandwidth could be anywhere from around 8TB/s down to 200GB/s, so looking at it that way it could be a significant boon to the architecture if it's providing a memory pool with notably higher data throughput... but I suspect it's more of a bandage to cover for the "low" bandwidth (512GB/s on the 6900XT, less than 400GB/s on the 6800XT if the leaked specs are to be believed) - as you surmise, just a cost-cutting exercise.

I'm not sure the memory bandwidth is "low" - assuming AMD use faster-than-14Gbps modules and a reasonable frequency, it would have more bandwidth than the 3070 and be about adequate for a card within a small percentage of the 3080.
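The arithmetic behind that is straightforward: bus width times per-pin data rate, divided by eight bits per byte. The bus widths below are assumptions based on the rumoured specs, not confirmed figures:

```python
def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Total memory bandwidth in GB/s: bus width (bits) x per-pin
    data rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

gddr6_bandwidth_gbs(256, 14)  # 448 GB/s - the 3070's configuration
gddr6_bandwidth_gbs(256, 16)  # 512 GB/s - a plausible Big Navi figure
```

So on an assumed 256-bit bus, anything above 14Gbps modules already clears the 3070's bandwidth.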
 