
RDNA 3 rumours Q3/4 2022

I don't know about the performance of RDNA3 and I wouldn't get ahead of myself. :)

But I can expand on what I said earlier: we are getting to a point now where each die shrink offers less and less over the previous generation, a slowdown in Moore's law.

So the problem is that to make more CPU or more GPU you have to keep making the dies bigger; look at the last few generations of Nvidia GPUs. Aside from the cost of that, you also have the laws of physics: the more transistors you pack into these things, the more electrical resistance you have, and that drives up power consumption. Again, look at Nvidia, and Intel.

Now, wouldn't it be nice if you could split your big die up into lots of little ones? Your wafer yields would go up, the small dies don't have the electrical resistance problem so they're nice and efficient, and if you do it right, designing them to be modular, like Lego, you can scale them almost indefinitely. The old limit on how much CPU or GPU you can make no longer applies; how many cores you can have in a CPU is limited only by the size of the package you glue them to. 64 cores? No problem. 96? Yeah. 128? Done that...
Can you imagine how big a monolithic CPU would need to be to accommodate 64 cores? It's why Intel can't do it, not even close. AMD have doubled that, and they will double it again with Zen 5!
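
To put some rough numbers on the yield point above, here's a toy sketch. To be clear, the defect density, die sizes and wafer maths are made-up illustrative figures (it uses the simple Poisson yield model, Y = exp(-D*A), and ignores how rectangular dies actually tile a round wafer), not anything from a real foundry:

```python
import math

# Toy illustration of why splitting one big die into chiplets helps yield.
# All numbers are made up for illustration, not real foundry figures.
WAFER_DIAMETER_MM = 300      # standard 300 mm wafer
DEFECT_DENSITY = 0.1         # assumed defects per cm^2

def dies_per_wafer(die_area_mm2):
    """Very rough candidate-die count (ignores how rectangles tile a circle)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def poisson_yield(die_area_mm2):
    """Simple Poisson yield model: Y = exp(-defect_density * area)."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100.0)  # mm^2 -> cm^2

def good_dies(die_area_mm2):
    return dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2)

# One hypothetical 600 mm^2 monolithic die versus eight 75 mm^2 chiplets
# delivering the same total silicon per product.
MONO_AREA, CHIPLET_AREA, CHIPLETS_PER_PRODUCT = 600, 75, 8

mono_products = good_dies(MONO_AREA)
chiplet_products = good_dies(CHIPLET_AREA) / CHIPLETS_PER_PRODUCT

print(f"Monolithic die yield: {poisson_yield(MONO_AREA):.1%}")
print(f"Chiplet die yield:    {poisson_yield(CHIPLET_AREA):.1%}")
print(f"Products per wafer (monolithic): {mono_products:.0f}")
print(f"Products per wafer (8 chiplets): {chiplet_products:.0f}")
```

Even a crude model like that shows the same wafer turning into noticeably more sellable products when it's cut into small dies, before you even get to binning or reusing the same chiplet across SKUs.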

The problem with all that ^^^^ is there are A LOT of technical hurdles to overcome to make it work. AMD are the first to have done it, and they might be the only ones...

IBM were the first, and arguably the Sony Cell (a Lisa Su design) was an early example. Anyway, GlobalFoundries, then heavily in bed with AMD, purchased IBM's microelectronics division (the firm I work for facilitated the exchange of IP back in the day). The rest after that is history, but if you follow Lisa's career back you can also trace her to IBM and early CPU design. In fact, it's said that she is partly the reason why CPUs are designed the way they are today, using copper rather than aluminium-alloy interconnects, and she is also widely credited for her work developing silicon-on-insulator technology.

Put it this way: Lisa was in and amongst MCM designs before MCM designs were even considered for x86. Put simply, she is an extremely gifted engineer. Anyway, thought you might be interested :)
 

I remember reading into the Cell cores back when the consoles were using them. Very interesting, but they didn't maximise the potential, chiefly due to devs not embracing it (amongst other things which naturally hinder progress, like time, budget, direction etc.).
 

The Cell was pretty cool, just probably a bit over-complex to quickly and easily make games play nicely, as you say. A bit like the Sega Saturn, I guess, which failed partly because it was overly complex to develop for.
 

Coming from the same site that confirmed AMD was using the new connectors only a few days before, as many others did too. So have AMD just shelved their stock with the new-style connector, or never had any cards made? Because I don't believe such a manufacturing change was made that quickly with cards already in inventory; basically, these so-called changes (or non-changes) mean no cards have actually been made yet. So at this rate cards from AMD won't be seen till next year.
:rolleyes:

Or it means that the sites were wrong before, probably all copying the rumour from the same source.

The message that RDNA3 cards will not use the 12-pin connector came from AMD. The earlier message that RDNA3 cards would use it did not.
 


Yup, people just jumped to conclusions, including YouTubers. People also seem to forget AMD took a dump on the power connector on Ampere during an RDNA 2 event, saying something along the lines of it being bad for cable management.

When they eventually do implement something like this, hopefully they do it in a way that dodges the issues of the Nvidia implementation.
 
Smart move from AMD. Users are already angry at the high upgrade costs for Zen 4 CPUs, so keeping the older connectors on RDNA3 means no one needs to run around looking for new cables or a new PSU.

And of course letting others be the first adopters is smart too. AMD did that with DLSS and RT, letting Nvidia grind their teeth on them so that when AMD came in they had a more mature offering. AMD is only really going in on RT with RDNA3 and they will have a good start, and they are letting Nvidia grind it out again with their buggy implementation of DLSS 3's frame generation; wait a while and come out next year, maybe, with their own version that's more mature.
 
Allow Nvidia buyers to force the uptake of new PSUs and THEN release their cards on the 12pin?
 

If the connectors aren't needed I really don't see the point of them; they make cable management harder, for one. Baffles me why GPU manufacturers can't lay out the board so the power connector is in a more convenient location. Gigabyte have done that with "Project Stealth", where the power connector is on the bottom of the card, which makes it practically invisible.


 
Hmmm, the connector debacle has made me rule out this series from NV; I'm never going to play "bendy/burny Russian roulette" inside my case.

On top of that, the adaptor cables are only rated for 30 cycles in perfect conditions, i.e. never bent. No typo: 30 cycles. Not 300, or 3,000, or 30,000. 30 cycles. Because they're that fragile and that close to their limits right from the start, even in perfect conditions.

Caveat emptor if your card is ever pulling more than 150W directly from the PSU. You'll probably be OK, as long as you don't bend the cable, don't swap graphics cards much (good luck to hardware reviewers), and don't use an old cable in a couple of years. Or use any adapter other than the official one from Nvidia, because that will void the warranty on your card, even if it's better than the Nvidia one.
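
To put rough numbers on "close to their limits": a back-of-the-envelope sketch, assuming the commonly quoted six +12V pins in the connector and roughly 9.5 A per terminal. Those figures are assumptions for illustration, check the actual spec for your own cable and card:

```python
# Back-of-the-envelope per-pin current for a 12VHPWR-style connector.
# Assumes 6 current-carrying +12V pins and ~9.5 A per terminal; these are
# commonly quoted figures used here for illustration, not a spec sheet.
VOLTAGE = 12.0       # volts
POWER_PINS = 6       # +12V pins sharing the load
PIN_RATING_A = 9.5   # assumed per-terminal rating, in amps

def per_pin_current(card_power_w):
    """Current through each +12V pin if the load is shared perfectly evenly."""
    return card_power_w / VOLTAGE / POWER_PINS

for watts in (150, 300, 450, 600):
    amps = per_pin_current(watts)
    print(f"{watts:>4} W -> {amps:4.1f} A per pin "
          f"({amps / PIN_RATING_A:.0%} of the assumed rating)")
```

The point being that at full load there isn't much headroom per pin even when everything is seated perfectly, and any pin that loses contact pushes the current on the remaining ones higher still.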
 
^ What a great idea

I was put off by the presenter rabbiting on about various degrees of uniqueness. That's a pet peeve of mine. There are no degrees of uniqueness; it's binary. Either something is unique or it isn't. If someone doesn't know what the word means, they should learn, or not use it, or just say what they mean: uncommon, rare, unusual, very, extremely. There are plenty of ways to express degrees of rarity. Why are people trying to eradicate the very idea of uniqueness by corrupting the word, removing any way to express the idea that there is only one of a thing? It annoys me.

But the build, yes, that's a great idea. That's people who have built PCs and who understand building PCs and who have sat down and tried to devise the best setup they can come up with. A design for purpose, for function, and one done well. Minimal restriction to airflow in order to optimise cooling. Need to access something? Take off the side panel and it's right there, easily accessible. "That's clearly the right way to do it...why isn't it always done that way?" is generally an indication of a good design. I'm going to look at that right now.
 
With both coil whine and Adapter Gate landing just days before the 7000 series launch, AMD's Radeon division has been gifted a clear shot at a mind-share win this generation.
All they have to do is be a hair faster and cost a lot less, and they will deliver a knock-out. The question is whether they will do the right thing for consumers, for the sake of optics, or try to appease shareholders with fat margins. You have to run a business, but you can make a profit by making your product attractive to everyone.

This is, by far, the easiest open goal handed to AMD ahead of a new GPU generation, EVER. I hope they don't screw it up.


 



Neither. They will optimise for wafer use, to try and make sure the supply/demand ratio is where it needs to be. There's no point overcharging and choking off demand if you have excess supply, and no point undercharging and creating excess demand if you can't supply it.

With the reduction in Zen 4 client demand I expect AMD have excess wafers, so they might price keenly to shift the volume, especially after NV have tripped over their own feet.
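
As a toy illustration of what "optimise for wafer use" means, here's a sketch with entirely made-up numbers (the demand curve, wafer count and dies per wafer are all invented); the point is just that you pick the price where the demand you create roughly matches what you can actually ship:

```python
# Hypothetical sketch of pricing to balance supply and demand.
# The demand curve, wafer count and dies per wafer are invented numbers.
WAFERS = 10_000
GOOD_DIES_PER_WAFER = 100                # sellable GPUs per wafer (assumed)
SUPPLY = WAFERS * GOOD_DIES_PER_WAFER    # units that could actually ship

def demand(price):
    """Made-up linear demand curve: fewer buyers as the price rises."""
    return max(0, 3_000_000 - 2_000 * price)

# Scan candidate prices and pick the one where demand roughly clears supply.
clearing_price = min(range(400, 1601, 50), key=lambda p: abs(demand(p) - SUPPLY))

print(f"Supply: {SUPPLY:,} units")
print(f"Price that roughly clears it: ${clearing_price} "
      f"(demand there: {demand(clearing_price):,} units)")
```

Real life obviously adds margins, competitor pricing and the question of how many wafers can be shuffled between Zen 4 and RDNA 3, but the basic balancing act has that shape: overprice and stock sits on shelves, underprice and you leave demand you can't meet.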
 
Lol, ECH back to shilling the new and shiny AMD product before it launches.

Also noticed his buddy RIBMguy coming in force, lololol.
I don't understand your aggressiveness, friend. You should try not to take things personally, LOL. I see it the way it is. That cluster of a failure happening post-launch, less than a week before the Radeon 7000 series launch, is one of the biggest flukes known to man. NV making any kind of recall announcement (which they should) would only bolster interest in what AMD have to offer.

I am reading posts from people with 4090s looking at what the 7900 XT/XTX will actually offer, for consideration. A first for me, seeing such enthusiasm from people who thought they already had the best of the best.

But like I said, it's no automatic win for them. Far from it. It will all depend on what the 7900 XT/XTX offer in rasterised performance and what they charge for it. If it beats the 4090 and it's priced lower, then the choice is obvious, barring brand loyalty, something this adapter fiasco has greatly eroded and will continue to erode in the days ahead.

But we will see what AMD brings to the table.
;)
 
I doubt AMD's new GPUs will beat the 4090, even at a cheaper price. But it would be nice!

I remember all the noise from the 3090 vs the 6900 XT. Look how that turned out in the end.
 