
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
There was a rumour that next year AMD will have a 3090 beater with AIO cooling, but I'm not even bothered about that price bracket as I will never buy/afford that.

I am more interested in the 3080 bracket; spending more than that on a GPU is ludicrous to me.
 
Will we see Zen 3 benchmarked with an unknown Radeon in tomorrow's event? I doubt it, but that would stir the pot further.

I expect the higher end 6000 series SKUs to beat the 2080 Ti but have no idea by how much.
 
There was a rumour that next year AMD will have a 3090 beater with AIO cooling, but I'm not even bothered about that price bracket as I will never buy/afford that.

I am more interested in the 3080 bracket; spending more than that on a GPU is ludicrous to me.

I'm fully up for AMD having a 3090 beater, as you say. In all honesty though, I'd be really very happy with something that comes within, say, ±5% of the 3080's rasterization performance, sits a bit below it in ray tracing (I play VR mostly and I'm not aware of a single VR game with ray tracing yet!), and comes in around the £550-600 mark.

In saying that, I'm assuming that if AMD do in fact have a card that can trade blows with the 3090, it will also carry a price tag heftier than I can justify. Of course, the perfect scenario for me is a card around the price of the 3080 FE that matches the 3090 in most regards, but I just don't see AMD doing that if they do in fact have the goods. Nvidia have given them carte blanche for prices well upwards of £1k in that scenario, and that's a bit spicy for my tastes...
 
I would not be surprised if they did that.

Certainly possible, but in order to show off the new Zen's gaming performance, the games shown need to have previously been CPU bottlenecked... Any benchmark of GPU-bottlenecked games would not show off the performance of Zen 3, which is the whole point of the event, and any benchmark of CPU-bottlenecked games wouldn't directly show off the strength of the GPU, making it very hard to draw any firm conclusions anyway.

My personal gut feeling is that since they separated the events intentionally, tomorrow will very much all be about Zen 3 and they won't want to muddy the waters. Not long to wait to find out!
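The bottleneck argument above can be sketched with a toy model (purely illustrative, not a real performance model): the delivered frame rate is capped by whichever of the CPU or GPU is slower for a given game and settings, so only the limiting component's speed matters.

```python
# Toy model of the bottleneck argument: frame rate is capped by whichever
# of the CPU or GPU is the slower component for a given game/settings combo.
# The fps caps below are made-up illustrative numbers.

def fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Delivered frame rate under a simple min() bottleneck model."""
    return min(cpu_fps_cap, gpu_fps_cap)

# GPU-bottlenecked: swapping in a faster CPU changes nothing, so Zen 3 can't shine.
assert fps(cpu_fps_cap=300, gpu_fps_cap=144) == fps(cpu_fps_cap=400, gpu_fps_cap=144)

# CPU-bottlenecked: swapping in a faster GPU changes nothing, so the GPU can't shine.
assert fps(cpu_fps_cap=120, gpu_fps_cap=300) == fps(cpu_fps_cap=120, gpu_fps_cap=500)
```

Either way, one of the two products being shown off would be hidden behind the other, which is why a combined demo would be hard to draw conclusions from.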
 
Certainly possible, but in order to show off the new Zen's gaming performance, the games shown need to have previously been CPU bottlenecked... Any benchmark of GPU-bottlenecked games would not show off the performance of Zen 3, which is the whole point of the event, and any benchmark of CPU-bottlenecked games wouldn't show off the strength of the GPU, making it very hard to draw any firm conclusions anyway.

My personal gut feeling is that since they separated the events intentionally, tomorrow will very much all be about Zen 3 and they won't want to muddy the waters. Not long to wait to find out!

I guess they could skirt the issue by showing 720p benchmarks with a 5700XT.
 
In no way was it a strawman by any definition, so I can only assume that either you don't actually know what a strawman is, or you completely missed the point. I don't care to guess which, but I do note that a lot of people seem to love to peddle accusations of logical fallacies to try and look smarter than they are.
It was a strawman, and I will prove it with the quote below:

You said, merely having seen the similar shape of the PS5 die itself, "I'm not sure what corteks [sic] provided is from the 6000 series but from the PS5." My argument directly addressed that statement and in no way did it set up a different proposition to be knocked down. Ipso facto, absolutely not a strawman. There is also absolutely no requirement for me to possess nor to provide a picture of the die given my statements and it is most confusing that you would try to suggest that there is.
And here you have it, your strawman. Can you provide any proof to refute that the photoshopped die is in fact from the 6000 series? Anything else is irrelevant at this point. Thus, the strawman. You are implying that I cannot hold an objective opinion that the die isn't from the 6000 series. And the fact that he actually removed the circuitry around the die lends credence to the idea that the die could come from the PS5. Since we know he did alter the pic, there is no telling how much was altered. This is what happens when you provide an altered pic as proof of concept for a claim: it's subject to scrutiny.

So, again, beyond posting your strawman, can you refute the information I've provided with some sort of original pic? So far, no, you haven't. Through your own mental gymnastics you weren't able to address my concerns about the pic, nor were you able to provide a rebuttal to it, other than showing you are upset about it, which is something I cannot help you with (a you problem at this point). I don't believe that altered photo is from the 6000 series, and based on how it's presented, it looks similar to the die shot from the PS5. Problem?
:D
 
You quite literally can’t be debated with. To do so is seemingly an exercise in banging one’s head against a wall.

I already addressed all of those points.

You are free to think whatever you want and no it doesn’t upset me in the slightest, but I am equally free to think you aren’t firing on all cylinders with that comment. I laid out why I believe your opinion is logically flawed - that is not disallowing you an opinion, it is discussing it.

edit - and no, it’s still not a strawman.
 
It's quite easy to imagine that a 5nm, Xbox Series X-like desktop GPU (52 CUs), probably released in 2021, would use about 120W of power (60% of the Series X's TDP). It's possible the speculated 200W Series X TDP is a bit high, though...

This kind of (small die) GPU would probably scale quite well; maybe we would get a GPU with 104 CUs at around twice the TDP (240W) of a 5nm-based Series X.

I wonder if Microsoft and Sony will want a 5nm refresh next year for consoles too?
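The arithmetic above can be sketched as follows. All figures (the 200W Series X TDP, the 60% power-after-shrink factor, and linear power scaling with CU count) are the post's own speculative assumptions, not measured numbers:

```python
# Rough sketch of the node-shrink power estimate. All numbers are
# speculative assumptions from the post, not measured figures.

SERIES_X_TDP_W = 200   # speculated Xbox Series X GPU TDP
SHRINK_FACTOR = 0.60   # assumed power at 5nm vs 7nm for the same design

def shrunk_tdp(tdp_w: float, factor: float = SHRINK_FACTOR) -> float:
    """Estimated TDP of the same GPU after a node shrink."""
    return tdp_w * factor

small_gpu = shrunk_tdp(SERIES_X_TDP_W)   # 52 CU part at 5nm
big_gpu = small_gpu * 2                  # 104 CUs, assuming power scales linearly with CUs

print(f"52 CU @ 5nm:  ~{small_gpu:.0f} W")   # ~120 W
print(f"104 CU @ 5nm: ~{big_gpu:.0f} W")     # ~240 W
```

In practice power doesn't scale perfectly linearly with CU count (clocks and voltage matter more), so this is very much a back-of-the-envelope estimate.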
 
There was a rumour that next year AMD will have a 3090 beater with AIO cooling, but I'm not even bothered about that price bracket as I will never buy/afford that.

I am more interested in the 3080 bracket; spending more than that on a GPU is ludicrous to me.

If you look at it logically (sort of): if AMD had another mid-range-ish offering coming, then the 3070 would have been the 3080, the 3080 the 3080 Ti, and the 3090 a Titan; if AMD had a GPU coming that would trouble the 3080, then the 3090 likely wouldn't even exist and the 3080 would be what the 3090 is. Most likely they have products that slot in alongside the current nVidia line-up, with the 3090 (a model number nVidia rarely use, and more commonly for dual-GPU boards and/or when they need an ultra card to keep the crown) existing just so nVidia can retain the crown.

Purely guessing on my part though.
 
At the risk of repeating myself, maybe it's worth waiting for 5nm GPUs in 2021.

The transistor density on 5nm (likely 5nm+) is estimated to be up to 170M/mm². I would've thought 5nm GPUs could have densities of around 100M/mm² (based on the fact that 7nm GPUs achieve around 2/3 of the maximum stated density for the 7nm process).

Assuming that's true, that's about 7 times the transistor density of the R9 390 (my current GPU :D) at 14.2M/mm². Old GPU is old...

The best we can hope for is significantly greater power efficiency for RDNA 2 vs the RTX 3000 series, and greater still for RDNA 3.
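Checking the rough arithmetic above (the 170M/mm² quoted 5nm peak, the 2/3 achieved-vs-quoted ratio, and the R9 390's 14.2M/mm² are all the post's figures, not verified data):

```python
# Back-of-the-envelope density estimate using the figures from the post.
# None of these are verified numbers; they are the post's own assumptions.

quoted_5nm_peak = 170.0    # MTr/mm², quoted maximum for the (5nm+) process
achieved_fraction = 2 / 3  # GPUs reportedly reach ~2/3 of the quoted peak
r9_390_density = 14.2      # MTr/mm²

est_density = quoted_5nm_peak * achieved_fraction  # ~113, post rounds to ~100
ratio = est_density / r9_390_density               # ~8x (~7x using the rounded 100 figure)

print(f"Estimated 5nm GPU density: ~{est_density:.0f} MTr/mm²")
print(f"vs R9 390: ~{ratio:.1f}x")
```

So the estimate comes out at roughly 7-8x the R9 390's density, depending on how aggressively you round the intermediate figure.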
Make a thread about it :p
 
To make sure we have all bases covered Big Navi is:
1.)Slower than an RTX2080TI
2.)Faster/similar than an RTX2080TI
3.)Slower than an RTX3070
4.)Faster/similar than an RTX3070
5.)Slower than an RTX3080
6.)Faster/similar than an RTX3080...

all at the same time. That should cover all the leaks!!

:D

Sounds accurate :D
 
To make sure we have all bases covered Big Navi is:
1.)Slower than an RTX2080TI
2.)Faster/similar than an RTX2080TI
3.)Slower than an RTX3070
4.)Faster/similar than an RTX3070
5.)Slower than an RTX3080
6.)Faster/similar than an RTX3080...

all at the same time. That should cover all the leaks!!

:D

Depending on the benchmark used - it's probably all of them.
 
If you look at it logically (sort of): if AMD had another mid-range-ish offering coming, then the 3070 would have been the 3080, the 3080 the 3080 Ti, and the 3090 a Titan; if AMD had a GPU coming that would trouble the 3080, then the 3090 likely wouldn't even exist and the 3080 would be what the 3090 is. Most likely they have products that slot in alongside the current nVidia line-up, with the 3090 (a model number nVidia rarely use, and more commonly for dual-GPU boards and/or when they need an ultra card to keep the crown) existing just so nVidia can retain the crown.

Purely guessing on my part though.

Sorry @Rroff, I had to read that a few times as I am still waking up... Are you saying you think AMD do have something, otherwise Nvidia would have dropped down a gear with their stack?
 
Maybe we'll see a sneaky glimpse of RDNA2 tonight!
 
Sorry @Rroff, I had to read that a few times as I am still waking up... Are you saying you think AMD do have something, otherwise Nvidia would have dropped down a gear with their stack?

It has been quite a while since the x80 card was on the xx2 die rather than being the top part on xx4 to maximise profits, as we've seen lately. Though there are other factors, I think a good part of it is a response to what nVidia think AMD are bringing; if AMD were just bringing circa 2080 Ti performance, it would be highly likely the 3080 was on GA104 and GA102 was a 3080 Ti and Titan instead.
 
To make sure we have all bases covered Big Navi is:
1.)Slower than an RTX2080TI
2.)Faster/similar than an RTX2080TI
3.)Slower than an RTX3070
4.)Faster/similar than an RTX3070
5.)Slower than an RTX3080
6.)Faster/similar than an RTX3080...

all at the same time. That should cover all the leaks!!

:D

Lisa Su has also been hand crafting the drivers for Big Navi, to ensure no black screens, BSOD, flickering, sporadic clock rates, texture issues in games and hardware acceleration bugs. AMD's drivers are about to transcend years of instability.
 
If you look at it logically (sort of): if AMD had another mid-range-ish offering coming, then the 3070 would have been the 3080, the 3080 the 3080 Ti, and the 3090 a Titan; if AMD had a GPU coming that would trouble the 3080, then the 3090 likely wouldn't even exist and the 3080 would be what the 3090 is. Most likely they have products that slot in alongside the current nVidia line-up, with the 3090 (a model number nVidia rarely use, and more commonly for dual-GPU boards and/or when they need an ultra card to keep the crown) existing just so nVidia can retain the crown.

Purely guessing on my part though.

I feel differently: the 3080 is in fact the 3080 Ti, and hence great value for enthusiasts...
though nVidia could launch a 3090 with less RAM and call it the 3080 Ti.
Overall, Ampere is a one-time offer; it all depends on how AMD competes.
I believe there was a rumour of a GA103 which was cancelled, probably due to product positioning overlap, so RDNA2 in some form is at least one class above GA104.

edit: TU104 to GA104
 