
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Associate · Joined 19 Jun 2017 · Posts 1,029
Apple might be pushing for an early release of RDNA 3. Apple doesn't look like a company that would be content with the "second-fastest platform" tag (for seven years in a row).
But again, it's a trade-off between time and money.
 
Associate · Joined 21 Apr 2007 · Posts 2,494
Exactly. I would prefer to be able to support AMD, but if they aren't giving me proper price/performance GPUs, or something seriously problematic is up, I'll just turn to the Leather Jacket Man.

I feel the same way, tbh. For me, 3080 performance is the benchmark. We can talk about price, RTX, extra VRAM and drivers after AMD show me Big Navi doing above-3080 performance. Not head-to-head, because obviously that can't happen for a week or two at the absolute earliest, but some taxing 4K benchmark like RDR2 would give us some clue.
 
Associate · Joined 19 Jun 2017 · Posts 1,029
I have a different perspective. I believe Intel will decimate AMD Radeon with their Xe graphics launch. I have nothing against AMD; it will just be good for the gaming market.
And as far as CPUs are concerned, there's a big shift expected towards ARM-based architectures. Betting against Intel doesn't sound prudent.
 
Soldato · Joined 9 Nov 2009 · Posts 24,904 · Location: Planet Earth
So they will decimate Nvidia too if that is the case; Nvidia derives a huge percentage of its revenue from consumer graphics, and AMD doesn't.

Nvidia also lacks CPU design experience. AMD, OTOH, had an ARM core in development at the same time as Zen, but canned it.

Intel, despite its experience with x86, which it created, is still plodding along, and this is something like the third or fourth time in 20 years that they have fallen behind a tiny competitor.
 
Man of Honour · Joined 13 Oct 2006 · Posts 91,553

I'm not convinced when it comes to Intel. They just have the wrong mindset for gaming GPUs, and there have already been issues behind the scenes with staff: several of the high-profile engineers and developers they took on have already left.
 
Associate · Joined 19 Jun 2017 · Posts 1,029
I am just going by the data flow; Xe does look promising. If the GPU organisation is crumbling internally, then they might not be able to sustain the initial momentum and could disappear from the market like a one-trick pony, but I would prefer to be optimistic :)
 
Soldato · Joined 9 Nov 2009 · Posts 24,904 · Location: Planet Earth
Yes, but if they decimate AMD, then Nvidia is more screwed. Last year Nvidia derived 65% of its revenue from consumer graphics, and this year it's still nearly 50% after including revenue from Mellanox.

Nvidia has more to be worried about than AMD, especially as their sales share is 80% and Nvidia wants higher margins than AMD. Nvidia has most of the prebuilt market when it comes to dGPUs.

Also, most Intel-based devices tend to ship with Nvidia dGPUs, while most AMD dGPUs ship with AMD CPUs. So Intel dGPUs will eat more into Nvidia's dGPU OEM sales, and as AMD can attest, Intel has no problem with contra-revenue. Nvidia experienced it too with their chipset business.
 
Associate · Joined 19 Jun 2017 · Posts 1,029
I am okay with it; I have zero brand loyalty.

Nvidia is holding back this generation. If they had AMD breathing down their necks, they would probably have delayed the 3000-series launch by a year, but the chips could have been a lot bigger on TSMC. Folks seem to be ignoring that the flagship Ampere is 54 billion transistors while the RTX 3090 is a little more than half of that. The market as a whole is worse off.
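As a rough check on that "a little more than half" claim, the published die specs do line up. A one-liner in Python, taking GA100 at 54.2 billion transistors and GA102 (the RTX 3090 die) at 28.3 billion as given:

```python
# Published transistor counts, in billions of transistors.
ga100 = 54.2  # flagship Ampere (the A100 accelerator die)
ga102 = 28.3  # GA102, the RTX 3090 die

print(f"GA102 is {ga102 / ga100:.0%} of GA100's transistor budget")  # ~52%
```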
 
Soldato · Joined 26 Sep 2010 · Posts 7,175 · Location: Stoke-on-Trent
But interesting if AMD does make something like that for RDNA 3.
That's the rumour: RDNA 3 vs Hopper for the first viable MCM architecture. In a crazy world, Intel sort out their interconnect issues before AMD and Nvidia and launch MCM first! Their tiles allegedly scale performance exceedingly well, but the interconnect scales power draw through the roof. By all accounts there was a four-tile design running in Intel's labs early this year which obliterated the 2080 Ti, but used well over 500 W to do so.

Of course, Nvidia seem to think it's fine to ramp power usage through the roof this gen, so maybe Intel should just run with that 500 W lab sample and get something out there :p
 
Soldato · Joined 20 Apr 2004 · Posts 4,365 · Location: Oxford

https://www.youtube.com/watch?v=-dJolYw8tnk

Power is meaningless unless you know how to use it
 
Associate · Joined 19 Jun 2017 · Posts 1,029
But not for AMD, it seems. Gotta love some hypocrisy now and then...
Dude, I am speculating on the basis of the data flow (Xe is looking too good in purely theoretical terms); I don't have any blind bias towards any brand.
Btw, I have used AMD products: my current gaming platform is Ryzen-based, and before moving to 2080 Tis I used a Vega 64 CrossFire configuration for a full year (again, based on the data flowing at that time), but I was seriously ****** by their drivers (again, a purely data-driven assessment).
And decimating AMD in graphics basically translates into a position that will make Nvidia extremely uncomfortable; those are just words that can be interpreted in many ways.
Also, I don't like personal ad hominem arguments; they don't achieve anything.
 
Associate · Joined 2 Jun 2016 · Posts 2,382 · Location: UK
Count the CUs in this. It might be real, it might be fake, or it might be real and AMD are just not counting dual-CU shaders as two CUs, like Nvidia do.

Whatever it is, this is what I was talking about.

4 shader engines
2 shader arrays in each engine
10 shaders in each array
2 CUs in each shader
= 160 CUs


[Attached image: the purported block diagram being counted above]
I know, you posted this in the Nvidia thread and I queried the potential wattage.

The GPU in the Xbox Series X is around 140 W; ×3 = 420 W, but an RDNA 2 GPU with that level of power would crush a 3090's performance.

But you know it would be 500 W+ (180 W × 3) :D

If this is you in "grounded mode" then I'm getting really scared thinking what you could be like :p
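For anyone wanting to sanity-check the arithmetic in this exchange, here is a quick back-of-envelope sketch in Python. Both the CU breakdown (taken from the possibly fake diagram above) and the per-slice wattage figures are the posters' guesses, not confirmed specs:

```python
# CU count implied by the (possibly fake) block diagram.
shader_engines = 4
arrays_per_engine = 2    # shader arrays per engine
shaders_per_array = 10
cus_per_shader = 2       # dual-CU shaders counted as 2 CUs, as Nvidia would

total_cus = shader_engines * arrays_per_engine * shaders_per_array * cus_per_shader
print(f"Implied CU count: {total_cus}")  # 160

# Wattage guesses: three Xbox Series X-sized GPU slices.
xsx_gpu_watts = 140      # poster's ~140 W estimate, not an official figure
print(f"3 slices at 140 W: {3 * xsx_gpu_watts} W")  # 420 W
print(f"3 slices at 180 W: {3 * 180} W")            # 540 W, hence '500 W+'
```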
 