AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

This is a rumours thread, so any rumour should be taken with a ton of salt. At least Wccftech and AdoredTV have a better track record than all the moaners. In the end, if you want 100% accurate performance figures, wait until launch and ignore these threads!

People keep forgetting this: a rumours thread is for speculation and guesswork, and none of it is expected to be right...
 
@TheRealDeal will take issue with that statement :p

Nah, I just take offense at people who can't watch his videos because he's Scottish. His theories are based on leaks which may or may not be true, and he mainly lets that be known. If you start off with the wrong info you will never come to the right answer, so when I listen to his theories from the given information I know they could be wrong. The difference between Adored and other leakers is that he at least gives a good breakdown of why he thinks that way. Again, though, if the information he receives is wrong, his conclusions will not be correct. New tech is slow these days, but if he were English, old chap, I wouldn't complain, as he puts on a jolly good show.
 
Actually you might not be far off. Bring out a 3090 (which is meant to be the Ti/Titan), it turns out AMD knock it about a bit, and then Nvidia can say "now here is our real Titan" and bring out something they have been saving in the wings that's even faster... for £2,499.

It's certainly possible: hold back on a Titan/Ti just in case, and once yields on the design are good enough, new memory is available and they have enough parts binned, then we might see some sort of Titan/Ti. It screams of a bit of caution, and you would expect NV to have a better idea of what Big Navi is capable of by now. I don't care if I'm wrong; I will likely buy Big Navi if it is a step up over my 7 in my workloads (not gaming). I don't really care that much about gaming.
 
Nah, I just take offense at people who can't watch his videos because he's Scottish. His theories are based on leaks which may or may not be true, and he mainly lets that be known. If you start off with the wrong info you will never come to the right answer, so when I listen to his theories from the given information I know they could be wrong. The difference between Adored and other leakers is that he at least gives a good breakdown of why he thinks that way. Again, though, if the information he receives is wrong, his conclusions will not be correct. New tech is slow these days, but if he were English, old chap, I wouldn't complain, as he puts on a jolly good show.

Basically my own mindset when I watch his videos and to be honest, his accent is easy on the ears and English is not my main language.
 
People keep forgetting this: a rumours thread is for speculation and guesswork, and none of it is expected to be right...

And why not :)

6700XT: 40 CUs (2560 shaders) at 2.4GHz, 8GB GDDR6 256-bit at 16Gbps

6800XT: 60 CUs (3840 shaders) at 2.3GHz, 12GB GDDR6 384-bit at 16Gbps

6900XT: 80 CUs (5120 shaders) at 2.1GHz, 16GB GDDR6 512-bit at 16Gbps
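
For what it's worth, here is a quick sketch of how those numbers hang together, assuming RDNA's 64 shaders per CU and reading the memory speed as 16Gbps per pin (so bandwidth is bus width / 8 × Gbps). The TFLOPS column is just the theoretical FP32 peak from shaders × 2 ops × clock, not a performance claim:

Code:
# Back-of-envelope check of the rumoured specs above.
# Assumptions: 64 shaders per CU (as in RDNA1) and 16Gbps GDDR6 per pin.
SHADERS_PER_CU = 64

rumoured = [
    # (name, CUs, clock in GHz, bus width in bits, memory speed in Gbps)
    ("6700XT", 40, 2.4, 256, 16),
    ("6800XT", 60, 2.3, 384, 16),
    ("6900XT", 80, 2.1, 512, 16),
]

for name, cus, clock, bus_bits, gbps in rumoured:
    shaders = cus * SHADERS_PER_CU            # 40 CUs -> 2560 shaders, etc.
    tflops = shaders * 2 * clock / 1000       # FP32 peak: shaders x 2 ops x clock
    bandwidth = bus_bits / 8 * gbps           # GB/s = (bus bits / 8) x Gbps per pin
    print(f"{name}: {shaders} shaders, ~{tflops:.1f} TFLOPS, {bandwidth:.0f} GB/s")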
 
And why not :)

6700XT: 40 CUs (2560 shaders) at 2.4GHz, 8GB GDDR6 256-bit at 16Gbps

6800XT: 60 CUs (3840 shaders) at 2.3GHz, 12GB GDDR6 384-bit at 16Gbps

6900XT: 80 CUs (5120 shaders) at 2.1GHz, 16GB GDDR6 512-bit at 16Gbps

You could easily be correct, but why would the top-end part be clocked so low? History should tell you the top-end part will have the same or higher clocks than the lower-end parts, as it's the all-guns-blazing part and the buyers will have the PSU to support it.

Basically my own mindset when I watch his videos and to be honest, his accent is easy on the ears and English is not my main language.

Yeah mate, his accent is not very strong compared to other parts of Scotland, or some parts of the UK for that matter.
 
Actually you might not be far off. Bring out a 3090 (which is meant to be the Ti/Titan), it turns out AMD knock it about a bit, and then Nvidia can say "now here is our real Titan" and bring out something they have been saving in the wings that's even faster... for £2,499.
Or the 3090 beats Big Navi and nVidia get to say 'That's nothing, have you seen our Titan?'.
 
Or the 3090 beats Big Navi and nVidia get to say 'That's nothing, have you seen our Titan?'.

I don't think they would squeeze a 3090 into the lineup if that was the case. It would just be "look, Titan: you got nothing". The fact that early leaks suggest bringing the Titan into the normal high-end stack says that perhaps they won't do that. I dunno, massively interesting.
 
You could easily be correct, but why would the top-end part be clocked so low? History should tell you the top-end part will have the same or higher clocks than the lower-end parts, as it's the all-guns-blazing part and the buyers will have the PSU to support it.

I'm thinking that with the size of the die it might run into power issues; smaller dies are usually easier to clock.
 
I'm thinking that with the size of the die it might run into power issues; smaller dies are usually easier to clock.

Yeah, but I have never seen it done on the high end; they usually push it, as it's the benchmark. If AMD have achieved a 50% power reduction there will be no need to hold back, especially if Nvidia's rumoured power numbers are true. I guess if AMD have caught up as much as we are hearing, they might not push to the absolute limits like they have done over the past few years.
 
Yeah, but I have never seen it done on the high end; they usually push it, as it's the benchmark. If AMD have achieved a 50% power reduction there will be no need to hold back, especially if Nvidia's rumoured power numbers are true. I guess if AMD have caught up as much as we are hearing, they might not push to the absolute limits like they have done over the past few years.

True. The 5700XT ranges from 180 to 210 watts depending on the AIB; clocks range from 1.85GHz reference, to 1.95GHz for most AIBs, to 2.05GHz for the high-end ones. The stock power limit is 190 watts, and an 80 CU part is literally twice the 5700XT.
The way they calculate this is performance per watt. So with the claimed 50% improvement, that would put the 6900XT at about 300 watts, presumably at 1.85GHz. The PS5 runs at 2.23GHz; even that is a 400MHz (20%) increase in frequency. Add that on top of the 300 watts and you're up to 360 watts, probably more, given that the performance-to-power returns diminish when pushed closer to the limit.
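
Spelled out as a rough sketch (assumptions throughout: the 80 CU part is two 5700XTs' worth of CUs, AMD's claimed +50% perf/watt holds, power scales at least linearly with clock, and the 225 watt starting point is the 5700XT's total board power rather than the 190 watt GPU-only limit quoted above):

Code:
# Rough power estimate for a rumoured 80 CU part -- all inputs are
# rumours/assumptions, not confirmed figures.
RX_5700XT_BOARD_POWER = 225   # watts, 5700XT total board power (assumption as noted)
RX_5700XT_CLOCK = 1.85        # GHz, reference clock as quoted above

cu_scale = 80 / 40            # twice the CUs of a 5700XT
perf_per_watt_gain = 1.5      # AMD's claimed +50% perf/watt for RDNA2

# Two 5700XTs' worth of CUs at the same clock, but 1.5x more efficient:
power_at_ref_clock = RX_5700XT_BOARD_POWER * cu_scale / perf_per_watt_gain  # ~300 W

# Push clocks towards the PS5's 2.23GHz and assume power scales at least
# linearly with frequency; in practice it is worse than linear:
clock_scale = 2.23 / RX_5700XT_CLOCK                                        # ~1.2
power_at_high_clock = power_at_ref_clock * clock_scale                      # ~360 W

print(f"~{power_at_ref_clock:.0f} W at {RX_5700XT_CLOCK} GHz, "
      f"~{power_at_high_clock:.0f} W at 2.23 GHz")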

The direct 5700XT replacement probably has a reduced 150 watt power limit but is clocked much higher; in itself it's probably quite a good GPU in relative terms, equivalent to an RTX 2080 at least. Make that a £300 GPU and it's a winner... well, that is if it stood alone in that market segment, which it won't.

Exciting times...
 
BTW: that's a 2.23GHz 5700 in a box the size of a large book.

This thing clocks. I don't think it goes as far as 2.76GHz, but why not 2.4GHz in desktop form? And it's power efficient.

The higher clocks alone make it a pretty good prospect; 2.4GHz is 30% higher than the reference 5700XT.

 
It’s gonna be a pain if either Nvidia or AMD launch considerably ahead of the other so you can’t compare when buying.

That said with likely stock levels I might have to pre-order before even seeing a review.
 
People keep forgetting this: a rumours thread is for speculation and guesswork, and none of it is expected to be right...


Yes, but there is a first time for everything. :D

In fact, just like Wccftech, if we put out enough drivel somebody will hit the nail on the head, just by the law of averages alone. :p
 
I'm only going to say this once; if you can't discuss cordially with respect for the views of others, I will start handing out thread bans.

Simple as that.
 
And why not :)

6700XT: 40 CUs (2560 shaders) at 2.4GHz, 8GB GDDR6 256-bit at 16Gbps

6800XT: 60 CUs (3840 shaders) at 2.3GHz, 12GB GDDR6 384-bit at 16Gbps

6900XT: 80 CUs (5120 shaders) at 2.1GHz, 16GB GDDR6 512-bit at 16Gbps


Quite liking that.

I agree; I reckon clock speeds will be higher across the board than what we have seen previously.
 
Let’s hope AMD and Nvidia release simultaneously and do us all a favour.

I’ll have whichever one is best please, regardless of brand.
 
I'm only going to say this once; if you can't discuss cordially with respect for the views of others, I will start handing out thread bans.

Simple as that.

Well said, it was getting a bit fruity. It's no news that some have a particular preference, and everybody knows mine. You can still be relatively impartial and sensible with your views; a good product is a good product, no matter who makes it :)
 
Let’s hope AMD and Nvidia release simultaneously and do us all a favour.

I’ll have whichever one is best please, regardless of brand.

I think NV will be approx 3 weeks behind. In my eyes it's actually more interesting if they don't release together. Personally I won't go straight in, so I want to see the full run of it. I have an idea of how I think this plays out, and I do have a nugget of info that nobody seems to have shared, or at least not that I have seen yet, and I'm wondering if that info is right. It will be super interesting to see for sure.

These days I do more OpenCL-type stuff than I do CUDA; in fact the only area where CUDA might be better for me is Adobe Creative Suite, and tbh I don't use that all that much. Work-wise I do a little Photoshop (preparing design filings for high-end clothing companies) and help our marketing team produce in-house content, but really my usage isn't all that much. In terms of computational OpenCL-type things I still do plenty, as well as auditing code; for example, a few months back I was auditing quantum code libraries looking for certain things, as we were looking to patent what was effectively a quantum API for a UK-based company, so I still play with lots of code.

I also game a little as well. Given I don't see my 7 becoming obsolete overnight for my usage, I am just gonna sit back and watch the carnage.

When the dust settles I will probably snap something up and hand the 7 down to the wife.
 