RDNA 3 rumours Q3/4 2022

Maybe it was originally designed with the 3080 Ti in mind, which was going to cost 1k, use a cut-down die with 8704 CUDA cores and have 12GB of VRAM. But then Nvidia caught wind of AMD's performance with cards like the 6800 XT, which would come in much cheaper, so they scrapped the 80 Ti and instead used the same die with only 10GB of VRAM for the 3080 so they could remain price and performance competitive.

The card that was originally going to be the 3080, based on GA104, was rebadged as the 3070.

This seems perfectly plausible, since it's the way Nvidia went with the previous two generations, when there was no real competition in the market.

If Nvidia were that worried about AMD's raster performance they would have simply gone with 7nm.

Now we have the odd situation where, once the 3080 Ti is released, Nvidia will have three different cards priced at £650, £1,000 and £1,400 but separated by only about 10% in performance (see the rough numbers sketched below).

Makes me wish I had shares in Nvidia.
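
To put that line-up in perspective, here's a quick back-of-the-envelope in Python. The prices are the ones quoted above; the relative performance figures are illustrative assumptions spacing the three cards across that ~10% gap.

```python
# Rough value comparison for the rumoured three-card line-up.
# Prices are from the post above; the relative performance numbers
# are illustrative assumptions (baseline 100, ~10% top to bottom).
cards = {
    "3080":    {"price_gbp": 650,  "rel_perf": 100},
    "3080 Ti": {"price_gbp": 1000, "rel_perf": 105},
    "3090":    {"price_gbp": 1400, "rel_perf": 110},
}

for name, card in cards.items():
    value = card["rel_perf"] / card["price_gbp"]
    print(f"{name}: {value:.3f} perf points per pound")
```

On those assumed numbers the 3080 delivers roughly twice the performance per pound of the 3090 while giving up only ~10% of its absolute performance, which is exactly why the line-up looks so odd.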
 
It should be enough for any critic to be more cautious when it comes to predicting AMD's abilities.

These days AMD are in the habit of making naysayers eat their words. If they were smart they would avoid making predictions about AMD, as many now do.

There is no predicting. Ampere launched on 17/9/2020, RDNA2 on 18/11/2020.
 
Are you claiming that AMD have/had no intention of competing with Ampere?

They do compete with Ampere in traditional rendering which is still 99.99% of the games on the market.

They started on a smaller node than Nvidia did. Today they are a generation behind despite a smaller node and a higher clock frequency.

Clock frequency doesn't mean dick for two different types of GPU from two different companies. You can't compare one company running at 2GHz on Ampere with another running at 2.5GHz on Navi, as they are totally different designs; if anything it's just a marketing bullet point that really means very little.

As for Big Navi, it's obvious that ray tracing was a very late addition to the design; I doubt it was ever intended to include ray tracing cores when it was first conceived. They did the best they could at the time, which compared to Nvidia's first effort is around the same level of performance, if not a bit faster. That performance will improve as time goes on.
 
If Nvidia were that worried about AMD's raster performance they would have simply gone with 7nm.


Is your memory really that bad? They tried to strong-arm TSMC into a better deal and ended up being told to **** off, so they had to use the Samsung 8nm node instead. There were plenty of articles knocking about at the time about Nvidia's attempted power play backfiring.
 
They do compete with Ampere in traditional rendering which is still 99.99% of the games on the market.

We were talking about ray tracing and DLSS.

Clock frequency doesn't mean dick for two different types of GPU from two different companies. You can't compare one company running at 2GHz on Ampere with another running at 2.5GHz on Navi, as they are totally different designs; if anything it's just a marketing bullet point that really means very little.

As for Big Navi, it's obvious that ray tracing was a very late addition to the design; I doubt it was ever intended to include ray tracing cores when it was first conceived. They did the best they could at the time, which compared to Nvidia's first effort is around the same level of performance, if not a bit faster. That performance will improve as time goes on.

I didn't simply compare frequency; I mentioned node and frequency. I can say that if Ampere were on 7nm, it would most likely have a ~10% higher clock and less power draw.
 
They started on a smaller node than Nvidia did. Today they are a generation behind despite a smaller node and a higher clock frequency.
From RDNA 1 to RDNA 2, AMD managed a 50% performance-per-watt gain on the same node. Now think what they can manage when they move to the improved TSMC 5nm process with RDNA 3, while still having the node advantage over Nvidia if Nvidia use Samsung again, which looks likely, albeit probably Samsung 5nm.
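
As a quick sanity check on how those gains stack, here's a back-of-the-envelope sketch in Python. The 50% figure is the one claimed above; the 35% RDNA 3 figure is just an assumption borrowed from the guess in the reply below. At a fixed power budget performance scales directly with perf/watt, so successive gains multiply rather than add.

```python
# At a fixed power budget, performance scales with performance-per-watt,
# so successive perf/W gains multiply rather than add.
def perf_at_fixed_power(base_perf: float, perf_per_watt_gain: float) -> float:
    """Relative performance at the same power draw after a perf/W gain."""
    return base_perf * (1.0 + perf_per_watt_gain)

rdna1 = 100.0                             # arbitrary baseline
rdna2 = perf_at_fixed_power(rdna1, 0.50)  # 150.0: the 50% same-node gain
rdna3 = perf_at_fixed_power(rdna2, 0.35)  # 202.5: assuming a further ~35%
print(rdna2, rdna3)
```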
 
Nvidia tried to pressure TSMC into giving them 7nm on the cheap; TSMC told them "you don't have any influence" and gave what would have been Nvidia's allocation to AMD.

Yes, that was where the discussion started. Nvidia went for profit over performance, covered many times over, yet still came out on top, having the better GPU on a larger node and a lower clock than they could have had.
 
From RDNA 1 to RDNA 2, AMD managed a 50% performance-per-watt gain on the same node. Now think what they can manage when they move to the improved TSMC 5nm process with RDNA 3, while still having the node advantage over Nvidia if Nvidia use Samsung again, which looks likely, albeit probably Samsung 5nm.

I'd guess today up to 35%, depending on whether they go for power saving or performance.
 
We were talking about ray tracing and DLSS.

And I'm talking about competing in the vast number of games that don't support either. It really is mind-boggling how you think a deficit in a tiny number of games somehow overshadows their performance in just about every other game on the market.


I didn't simply compare frequency; I mentioned node and frequency. I can say that if Ampere were on 7nm, it would most likely have a ~10% higher clock and less power draw.

Less power draw would be likely; clock frequency is anyone's guess.
 
Yes, that was where the discussion started. Nvidia went for profit over performance, covered many times over, yet still came out on top, having the better GPU on a larger node and a lower clock than they could have had.

All things being equal, Nvidia will always come out on top; it's called mindshare. But over time that can be turned around, tell me it can't.
 
All things being equal, Nvidia will always come out on top; it's called mindshare. But over time that can be turned around, tell me it can't.
Well, AMD have certainly thrown down the gauntlet this time around, and while Nvidia cards like the 3080 are still slightly ahead of AMD in terms of features, RDNA 2 reminds me of AMD's Zen 2 moment, where Intel still had the gaming lead but AMD were close, and then Zen 3 arrived and blew Intel away in every way.

Now, maybe overtaking Nvidia in RT with RDNA 3 is a bit of a stretch, but I could certainly see them getting close while also taking the lead in rasterisation.
 
Well, AMD have certainly thrown down the gauntlet this time around, and while Nvidia cards like the 3080 are still slightly ahead of AMD in terms of features, RDNA 2 reminds me of AMD's Zen 2 moment, where Intel still had the gaming lead but AMD were close, and then Zen 3 arrived and blew Intel away in every way.

Now, maybe overtaking Nvidia in RT with RDNA 3 is a bit of a stretch, but I could certainly see them getting close while also taking the lead in rasterisation.

The possibility exists. Certainly Intel, not for the first time, have learned that AMD are not second best when they have some R&D money; in fact they are intimidatingly good at what they do.

I'm optimistic that AMD's GPU future is good. That's not to say they can do to Nvidia what they did to Intel, but they can certainly make Jensen sweat.
 
And I'm talking about competing in the vast number of games that don't support either. It really is mind-boggling how you think a deficit in a tiny number of games somehow overshadows their performance in just about every other game on the market.

I've already said AMD is great if all you do is play old games or games based on old engines. If you want new tech then you will have to go with Nvidia. Is that really so hard to understand?
 
All things being equal, Nvidia will always come out on top; it's called mindshare. But over time that can be turned around, tell me it can't.

I'm just going with the best card at that moment, I really don't care who makes it. As far as AMD and Nvidia are concerned, I'd rather give my money to AMD due to more open standards, but I'm not going to waste my money on a lesser product.
 
I've already said AMD is great if all you do is play old games or games based on old engines. If you want new tech then you will have to go with Nvidia. Is that really so hard to understand?

Old games on old engines? You talk like ray tracing is the de facto standard in every game engine now. It's a long way off getting that kind of traction; a few engines have support for it, but on the whole not many. It'll be quite a few years before a large percentage of games/engines have it as a minimum-spec necessity. Right now the only game with that requirement is Exodus, and that's a two-year-old game, with likely some brown envelopes changing hands behind the scenes to get this enhanced version out with RT as a mandatory requirement.


And how great is it that yet ANOTHER thread has turned into a ray tracing circle jerk with the same arguments being rehashed. :rolleyes:
 
Old games on old engines? You talk like ray tracing is the de facto standard in every game engine now. It's a long way off getting that kind of traction; a few engines have support for it, but on the whole not many. It'll be quite a few years before a large percentage of games/engines have it as a minimum-spec necessity. Right now the only game with that requirement is Exodus, and that's a two-year-old game, with likely some brown envelopes changing hands behind the scenes to get this enhanced version out with RT as a mandatory requirement.

I consider ray tracing and DLSS to be modern tech, today's tech. The sort of tech that would make me want to buy a new GPU. Unreal Engine, Unity, IW and RE Engine have all added support, well, RE Engine in a small way. I don't see it as a requirement any time soon, but I do see it improving graphical quality and immersion already.

And how great is it that yet ANOTHER thread has turned into a ray tracing circle jerk with the same arguments being rehashed. :rolleyes:

It's the nature of an enthusiast's forum :)
 
AMD does not have problems with their hardware; they can match Nvidia easily in RT with the next card. Look at the perf/watt: AMD is far ahead. OK, Nvidia is on a worse node, but there is a big difference between how much power RDNA 2 needs and how much power Ampere needs. There is room for RDNA 3 to use more hardware than Nvidia for the same power usage.
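
A minimal sketch of that headroom argument, using placeholder wattages rather than measured figures, since the exact gap depends on which cards you compare:

```python
# If RDNA 2 reaches its performance at a lower board power than Ampere,
# the difference is a power budget RDNA 3 could spend on extra
# shader/RT hardware. Wattages below are placeholder assumptions.
ampere_board_power = 320.0  # assumed Ampere-class board power (W)
rdna2_board_power = 260.0   # assumed RDNA 2-class board power (W)

headroom_w = ampere_board_power - rdna2_board_power
headroom_pct = 100.0 * headroom_w / rdna2_board_power
print(f"~{headroom_w:.0f} W (~{headroom_pct:.0f}%) of budget free for "
      "extra hardware before matching Ampere's power draw")
```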

AMD's problems are with the game developers and their own software team. They need a much better/bigger software team, and I am not sure how they can get the game devs to spend their time optimizing games for AMD cards. Unless they spend a ton of money, and Jensen stops spending and playing his dirty tricks, AMD will always be behind Nvidia.

It's funny to see Epic working closely with Sony to optimize their games for PlayStation but ignoring AMD cards on PC. That's Nvidia money at work, and I am not sure there is anything AMD can do about it. Same for the other game devs; heck, Control's developers made a custom version for the new consoles but say they will never release that version on PC. :)
 
I consider ray tracing and DLSS to be modern tech, today's tech. The sort of tech that would make me want to buy a new GPU. Unreal Engine, Unity, IW and RE Engine have all added support, well, RE Engine in a small way. I don't see it as a requirement any time soon, but I do see it improving graphical quality and immersion already.
I'd say it's more of a future tech that still won't be what I'd consider good enough for a couple more generations of GPUs, by which time this current crop won't cut it anyway.
 