It depends on how much the devs are willing to implement RT in games, and also on the level of implementation. Also on how much AMD is willing to spend sponsoring its own RT games, like Nvidia does.

I am curious to see what they do to improve RT performance. The rumours about Lovelace are that it will double down on huge core counts (and consequently RT cores), which means it will further smack any previous-gen card of any brand around when that's enabled. And make no mistake about it: two years from now, RT will be a staple feature for most AAA games and even some AA studios, even if it's only one or two effects as a toggle.
Considering AMD's recent track record, I guess the question isn't whether they will achieve the 50% perf-per-watt target, but how much they will exceed it by and how they will achieve it.

I for one would prefer AMD/MS to take their time with a quality solution rather than rush something out half-arsed.
Also, a few tidbits in here I saw last month:
https://wccftech.com/amd-rdna-3-radeon-rx-gpus-and-zen-4-ryzen-cpus-perf-per-watt-gains-5nm/
Also, for the current gen:
"Finally and most importantly, Rick unveiled that AMD is targeting 1440p as the standard resolution for its raytracing solution. AMD themselves have not revealed much about the raytracing graphics performance of its ray-accelerator core powered Radeon RX lineup which will include the Radeon RX 6900 XT, RX 6800 XT & RX 6800 graphics cards but at least we get a baseline of where to expect the Radeon GPUs to land."
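Worth spelling out what a "50% perf-per-watt" gain actually buys, since it can be cashed in as either more performance or less power. A minimal sketch with hypothetical numbers (illustrative only, not AMD figures):

```python
# Illustrative only: what a 50% perf-per-watt uplift could mean in practice.
# The baseline numbers below are made up, not AMD's.

def perf_per_watt(perf_fps: float, watts: float) -> float:
    """Performance per watt for a hypothetical card."""
    return perf_fps / watts

# Hypothetical RDNA 2 baseline: 100 fps at 300 W.
base = perf_per_watt(100, 300)
uplifted = 1.5 * base  # the claimed >=50% gain

# Spend the gain on performance: same 300 W power budget.
same_power_fps = uplifted * 300    # ~150 fps

# Or spend it on efficiency: same 100 fps target.
same_fps_watts = 100 / uplifted    # ~200 W

print(same_power_fps, same_fps_watts)
```

The same uplift reads very differently depending on which axis the marketing chooses, which is worth keeping in mind when the slides arrive.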
The software side means more driver releases, working with game devs on optimizations, having game-ready drivers on release day, an AI upscaling feature, features for streamers, etc. Not (only) the DXR part. Nvidia does these things better. Yes, they have more money to spend, but there is no reason why AMD can't do much better if they want to be competitive.

It will be years before developers start implementing DXR in more games.
You probably know that game development depends on the lowest common denominator - that is, what the average Joe uses in their home entertainment rig.
That is mostly GTX 1660 / Radeon RX 580 and below.
Why would it be stupid? Marketing is just as important to selling a product as the product itself. There's a reason any big-budget release of hardware or software/media has a huge marketing budget. Unless, of course, AMD doesn't have the ability to supply that many GPUs, or doesn't think it will be able to compete.
I think it will never beat Nvidia in Nvidia-sponsored titles, and it would be stupid to join an RT race like the tessellation one if there are no RT-heavy games except those sponsored by Nvidia. I would be more concerned if I had a 3090 or a 3080, since Jensen will sponsor even heavier RT games in the future for his new cards.
Since the new consoles have weaker hardware capabilities than the 6000 series (and the Nvidia 3000 series), we should see many light-RT games in the future, and some of those games will come to PC too.
About software, yes, AMD are very lazy and far behind Nvidia. But I think their AI upscaling solution also depends on Microsoft. Anyway, DirectML will become the default for every card manufacturer in the future.
Maybe a 6900 XT OC'ed to the gills on water with a custom power limit can do some sort of competent RT at 1440p, but it's not likely. For the most part they're all 1080p cards with RT on.
The lowest common denominator is consoles. Everything revolves around them if a game isn't PC-exclusive. What people miss when counting the multitude of 1060 rigs is that those users were never going to be interested in all these games anyway (and many of those rigs were in net cafés) - those people are playing esports titles or legacy MP games (e.g. WoW). They're not rushing to buy Cyberpunk or Flight Sim, and the devs certainly aren't going to shy away from raising min specs on PC because of them.
How many RT games are out there that are not sponsored by Nvidia? And how is AMD supposed to compete with Nvidia in its sponsored titles? Nvidia will always have an advantage, because this is what they have always done: focus on a feature and get the devs to use it heavily in Nvidia-sponsored games, ruining performance for everyone else, even owners of Nvidia's own older cards.
Technically 14 posts.

How long before the Jensen hit squad starts polluting this thread?
Not really our problem; that's for AMD to figure out. Frankly, Lisa Su should have fired their whole marketing department on day 1, because it has been by far the worst aspect of AMD. In reality AMD has always had good selling points vs Nvidia, but they could never market them properly. That's why Nvidia could make GeForce synonymous with GPUs, or ray tracing with RTX, etc. They're genius-level marketers, and I'm sure their attempts at mimicking Apple are no coincidence in the matter.
How can you compete with that? The only way is to try to sponsor your own games, and AMD did that this year... well, it is a start anyway. Dirt 5 has a nice market on consoles, and Far Cry will come in 2021. Maybe they will be able to sponsor better games in the future; it looks like they were also involved in UE5 engine optimization, so that might help them down the line.
But again, they will never be even close on Nvidia sponsored titles.
Clicking through to the sources, we get this interesting quote from "AMD Talks Zen 4 and RDNA 3, Promises to Offer Extremely Competitive Products":
"When it comes to RDNA 3, the company has plans to offer an architecture that has a high performance-per-watt. Just like AMD improved performance-per-watt of RDNA 2, it plans to do the same with RDNA 3, bringing the efficiency of the architecture to the first spot and making it very high-performance for any possible task."
"We're happy with RDNA2 on performance per watt, and overall performance, and we have a lot of focus on RDNA3. On elements such as AI specific integration, we are making investments. CDNA launched in November, and you will see us adding more AI capability to our CPUs and GPUs."
Bergman: “Absolutely. In terms of inferencing and AI, and so on. Yes. Again, we have that capability. Certainly, we have our high-end...training or inferencing solutions, to potentially solutions that are more for the client or endpoint devices as well. And again unfortunately...I can't give you details on what we may have up our sleeves or what's coming, but it's certainly a very interesting area.”
You forgot to add a COVID delay into your equation.

Watch as I use my patented method to attempt to guess when the first 5nm part will be released. Amaze your friends, astound your co-workers, etc.
So the Apple A12 Bionic was first released in September 2018. The first 7nm AMD GPU was the Radeon VII, which came out in February 2019. That means a gap of about 5 months between an Apple mobile part and an AMD GPU part.
So the Apple 5nm part, the A14, was released in September 2020, meaning the AMD 5nm parts must be coming... next month!
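The "patented method" above is just date arithmetic, so here it is as a quick sketch (release dates as quoted in the post; the COVID delay mentioned earlier is deliberately ignored):

```python
# Sketch of the joke extrapolation: measure the Apple-to-AMD gap on 7 nm,
# then apply the same gap to 5 nm. Dates are the ones cited in the post.
from datetime import date

a12_7nm = date(2018, 9, 1)      # Apple A12 Bionic, first 7 nm mobile part
radeon_vii = date(2019, 2, 1)   # Radeon VII, first 7 nm AMD GPU
gap = radeon_vii - a12_7nm      # roughly 5 months

a14_5nm = date(2020, 9, 1)      # Apple A14, first 5 nm part
amd_5nm_guess = a14_5nm + gap   # lands in February 2021 - "next month"

print(amd_5nm_guess)  # 2021-02-01
```

Which is, of course, exactly as scientific as it sounds, COVID delay or not.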
Yeah, very interesting
Could it be only for yields and decreased production costs, or could it be a new breakthrough, like the Infinity Cache, that greatly improves performance?
The so-called "uncore" part of the GPU could go onto its own silicon, while the compute units sit on their own die.
I thought the ray-tracing cores could have their own silicon, too.
Must have taken a wrong turn; I came here to talk about CP2077 and all I see is posts about RDNA3.