
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

A third player would be brilliant and it is needed, but I have zero confidence Intel will be competing against Nvidia or AMD in gaming graphics.

I am optimistic after seeing the Tiger Lake benchmarks; it houses the integrated LP version of Xe. Intel will also be using the MCM approach, which means a linear price increase rather than the exponential growth we are accustomed to due to yield issues. I would like to get my hands on a 4-tile version of the graphics card if they launch one. Nothing like 60 billion transistors inside your case.
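As a rough illustration of that yield argument (every number below is made up purely for illustration; these are not real Intel or foundry figures), a simple Poisson yield model shows why several small tiles end up cheaper than one monolithic die of the same total area:

```python
import math

def good_die_cost(area_mm2, defect_density=0.001, wafer_cost=10000, wafer_area=70000):
    """Toy cost of one *working* die under a Poisson yield model.
    defect_density is defects per mm^2; every figure here is illustrative only."""
    yield_fraction = math.exp(-defect_density * area_mm2)   # chance a die has no defects
    dies_per_wafer = wafer_area / area_mm2                  # ignores edge losses
    cost_per_die = wafer_cost / dies_per_wafer
    return cost_per_die / yield_fraction                    # amortise the dead dies

# One hypothetical 800 mm^2 monolithic GPU vs four 200 mm^2 tiles (same total silicon)
print(f"monolithic 800 mm^2: ~${good_die_cost(800):.0f}")
print(f"4 x 200 mm^2 tiles : ~${4 * good_die_cost(200):.0f}")
# Small dies yield far better, so total cost grows roughly linearly with tile count
# instead of blowing up with die size - the point made above about MCM pricing.
```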
 

Yeah, Intel finally caught up with AMD's three-year-old iGPUs.

 
I am more hopeful of Intel's Xe than of any Radeon release. They are off by 15 billion transistors from offering any kind of competition at the enthusiast level (3080). I feel they have squandered their 7 nm maturity purely through a lack of ambition; they could have really turned the tables this time. I think it would be safe to write off the Radeon brand and hope for Intel to fill the void.

Edit: And don't forget that the flagship Ampere is 54 billion transistors while the RTX 3090 is roughly half of that. Nvidia is just slacking off while thanking AMD for letting them extend Ampere's shelf life by two more years.
Good grief, that's the funniest thing I've read in a while. Good job, that man.
 
AMD chips will be smaller... due to the lack of tensor cores and dedicated ray tracing cores.
That could also make them slower.
But without dedicated cores for that stuff taking up die space, they can use smaller dies and still have more transistors for normal GPU work... and thus be faster.
Well, this is a fun little thing when you consider how AMD's RT implementation is supposed to work: shaders can do RT calculations instead of rasterisation when needed.

If you're not using RT then you get all the shaders doing rasterisation. If you are doing RT then a portion of the shaders switch over to RT "mode", so you have a chunk of transistors doing rasterisation and a chunk of transistors doing RT. That's Turing and Ampere, only the difference is that Turing and Ampere can't repurpose their RT cores when they're not in use. So imagine then, instead of AMD having smaller dies because there are no dedicated RT cores, they increase the die size to have extra shaders.

You're always using 100% of the die, and if you're not doing RT then you get some extra oomph because there are more shaders to play with.
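To make that concrete, here is a tiny utilisation sketch of the two approaches. The 100-unit budget and the 20% RT share are invented for illustration; they are not real Turing/Ampere/RDNA2 die breakdowns.

```python
DIE_BUDGET = 100  # abstract transistor "units" available for shading/RT work

def dedicated_rt(rt_enabled, rt_share=0.20):
    """Turing/Ampere style: a fixed slice of the die is RT-only hardware."""
    if rt_enabled:
        return {"raster": DIE_BUDGET * (1 - rt_share), "rt": DIE_BUDGET * rt_share, "idle": 0}
    # RT silicon sits dark when ray tracing is off
    return {"raster": DIE_BUDGET * (1 - rt_share), "rt": 0, "idle": DIE_BUDGET * rt_share}

def repurposable_shaders(rt_enabled, rt_share=0.20):
    """Rumoured RDNA2 style: the same shaders do raster or RT work as needed."""
    if rt_enabled:
        return {"raster": DIE_BUDGET * (1 - rt_share), "rt": DIE_BUDGET * rt_share, "idle": 0}
    return {"raster": DIE_BUDGET, "rt": 0, "idle": 0}  # everything rasterises

for rt in (False, True):
    print(f"RT {'on ' if rt else 'off'}  dedicated: {dedicated_rt(rt)}  unified: {repurposable_shaders(rt)}")
# With RT off, the unified design puts the whole transistor budget behind
# rasterisation, which is the "extra oomph" being described above.
```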
 
I would generally agree with your assessment, but personally, for RDR2 in particular, I would sacrifice SO much just to hit that sweet 4K spot, because the game is otherwise so blurry it bothers me a lot. It's too bad their MSAA implementation is broken; it would help a bunch.

Oh, I absolutely agree! The blurriness in that game drives me wild too. Upping the resolution scaling obviously sorts that, but it's almost as taxing as running at the higher resolution anyway (the quick pixel maths below shows why). I found using AMD sharpening helped a great deal, along with TAA. But yes, it's far from ideal.

The game is just so demanding, and although it would look breathtaking at native 4K, it would take a LOT of GPU horsepower, and even with this upcoming GPU gen I wouldn't be confident that you wouldn't notice significant FPS drops in highly populated areas. I just can't deal with FPS drops/stuttering. I have been spoiled by high-refresh gaming.

I mean, GTA V was released five years ago, and I can still bring my 5700 XT to its knees by using ultra settings and a lot of MSAA. 4K just isn't viable yet in my opinion, if you want high FPS as well. Games like Doom Eternal etc.? Yes, they're fine.
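On the render-scaling point above: the reason it's almost as taxing as the higher resolution is that shading cost tracks pixel count. A quick back-of-envelope sketch, assuming a 1440p panel purely for the sake of example:

```python
def pixels(width, height, render_scale=1.0):
    """Number of pixels actually shaded per frame at a given render scale."""
    return int(width * render_scale) * int(height * render_scale)

print(f"1440p native      : {pixels(2560, 1440):,}")        # ~3.7 M pixels
print(f"1440p @ 1.5x scale: {pixels(2560, 1440, 1.5):,}")   # renders 3840x2160 internally
print(f"4K native         : {pixels(3840, 2160):,}")        # ~8.3 M pixels
# 1.5x scaling on a 1440p panel shades exactly as many pixels as native 4K,
# so the GPU load is essentially the same; only the output resolution differs.
```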
 
I have this.

[Image: Intel Xe vs AMD comparison slide]


I believe Tiger Lake goes under the 11xxx naming. It's already launched for laptops, though I think it's a paper launch; the laptops are yet to hit shelves.


Fair enough :)

However, I'm not being a #### here, a couple of things. :)

This is an Intel marketing slide, so don't put too much faith in it. But mainly, the 4800U iGPU is pre-RDNA; it's old GCN and it's 8 CUs. Because AMD have had zero competition in iGPUs, they have literally been shoving the same junk-level GPU cores into their chips for years. Believe me, many of us have been watching AMD recycle the same iGPU over and over again for years, crying "when are they going to put a proper GPU in their chips?"

When they stop putting junk in their own chips, they will make everything on this slide look like what it actually is: junk.

AMD are rumoured to be putting proper RDNA2 CUs in their APUs next year, with DDR5.
 
I'm just stoked for Zen 3. Surely AMD must bring the latency down this time, and up the frequency? That will all but eliminate Intel's last selling point. I have come very close to switching back to Intel a few times with the 10900K, when I see a sometimes 20+ fps advantage to Intel, but I have faith AMD will really bridge the gap this coming gen.
 

All that, and more, yes. :)
 

There are, however, limits to concurrency with that approach, and a good bit of what you would/could be doing is shading ray-traced results, which is one of the changes between Turing and Ampere. A lot in this respect is going to hinge on the uptake of RT features in upcoming games.
 

I get that marketing-slide comment often, but these are laptops; there's not much latitude to take liberties.

But again, it's pure speculation on my part. However, know this: until Nvidia faces a credible competitor, they will hold off from offering their best.
 

Also, the AMD SoC has double the cores.
 