AMD RDNA3 unveiling event

Just a thought....

Worst case, assuming the dual-issue ALUs are of no help: the 7900XTX has 20% more shaders at 2.5GHz vs the 6950XT's 2.3GHz (+8.7%), so roughly 1.3x a 6950XT overall.
The memory architecture needs to be factored in on top of that, but who knows what that actually means; it's impossible to calculate. With a much improved fabric link, less but much faster cache, a 384-bit bus and faster GDDR6, the total effective memory bandwidth is, according to AMD, 2.7x higher, and that must account for something.
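A quick back-of-envelope version of that worst case in Python, assuming 96 CUs for the 7900XTX vs 80 for the 6950XT (which is where the 20% comes from) and no benefit at all from the dual-issue ALUs; purely illustrative, not a benchmark claim:

```python
# Rough worst-case scaling: shader count x clock speed only,
# ignoring dual-issue ALUs, cache and memory bandwidth changes.

cu_7900xtx, cu_6950xt = 96, 80          # compute units (assumed counts)
clk_7900xtx, clk_6950xt = 2.5, 2.3      # quoted clocks in GHz

shader_ratio = cu_7900xtx / cu_6950xt   # 1.20
clock_ratio = clk_7900xtx / clk_6950xt  # ~1.087

print(f"Worst-case uplift: {shader_ratio * clock_ratio:.2f}x a 6950XT")  # ~1.30x
```

The two ratios multiply rather than add, which is why the estimate lands at roughly 1.3x.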
I'm hoping they have really improved the rasterisation. I'd like to trade the 3060Ti in for something AMD but only if it's a decent uplift and not too silly on prices.
 

I'm looking to update my 2070S. It's been a great card, and still is, but at 1440p it's not what it once was.

These days Nvidia are too "premium" for my blood; I don't need or want the Apple of GPUs. I have a Samsung phone too....
 
Yesss..mmm.. Yess... join the dark side.. join the dark red side.. it will be cheaper.. you will be happy.. it will complete you!
 
Out of interest, not counting RT, what is pushing these cards harder at resolutions they used to handle fine?
 

It's a mix of newer games with ever higher graphical requirements: higher resolution textures, more detailed 3D models, longer draw distances, bigger areas with fewer loading screens, etc.

But it should be remembered that some older games were always difficult to run well on the then-current-gen GPUs. A seven-year-old game like The Witcher 3 with HD texture updates is only just playable at 4K on a 2070S, and the same game on the same GPU at 1440p will consistently give you less than 120fps, let alone 144.

If I recall, we have now had the Titan, the 1080 Ti, the 2080 Ti, the 3090 Ti/6950 XT and now the 4090 all labelled "4K killers".

EDIT: This is a good thing because graphical advances are always welcome.
 

I don't know; games are just becoming harder to run. Nothing is bad, everything is OK or better. I'm playing Isonzo, a WWI game from an indie developer on Unreal Engine, at 60 to 90 FPS on the highest settings.

No one reasonable would call that bad, but Insurgency: Sandstorm, on the same engine and a couple of years older, does 100 to 140.
 

Tell a lie, Isonzo is Unity, not Unreal Engine. My bad, I assumed it was UE because it looks like it.
 
I was about to say, I see what you mean about it looking like an Unreal game. It reminded me of The Forest (no idea why, just a feeling). Then I googled The Forest and found out it's a Unity game :cry:
Looks interesting.

It's good fun. It's not like a lot of mainstream FPS games where everyone is a bullet sponge; it's bolt action, you get one shot, so make it quick and be accurate because you only get 5 rounds in the mag.

No unloading the whole mag into someone's face and watching them run away....
 
Hmm, rumor has it that when RDNA 3 is released it's going to be version 1, which is the AMD version, better known as MBA (Made By AMD). But they've fixed whatever errata the GPU has, which includes higher clocks, and the AIB variants will use version 2.

I say "version 2" because I haven't read what they are calling it yet. If I were buying, I would wait until next year to see whether all of this pans out or not.

What? As in they are physically different?

Tech journalists have leaked AMD products that never materialised, then claimed it was because AMD cancelled them. Now those same tech journalists are saying the reason RDNA3 is only clocked to 2.5GHz is that there is a problem with it. They had said it would clock over 3GHz, some even as high as 4GHz; that didn't happen, hence the "something is wrong with it".
Oh, but wait, AIBs will launch RDNA3 GPUs that clock higher, crap... damn! Erm? There must be two different versions.....

ffs..... AMD gave us a hint long before they launched it: they said they didn't want to make large, inefficient GPUs, but they would let the AIBs do what they want.


Rumor mill rumor mill

To be clear, it was rumored that the chip was much larger than what we are seeing announced. Whether or not that comes out next year is a different story.
The 306 mm² die is much smaller, almost midrange-like. I heard the original die size was going to be 5xx mm², and the smaller size was due to whatever errata was discovered in the larger die.
The 306 mm² die is fine as far as I know; it's just a mid-to-high-range spec'd GPU. The high-end/ultra and workstation segment is for the 5xx mm² die.
But this was months ago, making it old news. That errata was discovered before summer.
If it's true that the larger die was respun, it happened a few months back, before summer. I am sure it's near validation between now and Q1 2023. Remember, AMD is designing these dies to be scalable.

You gotta remember something here. Although this is a rumor, we saw something similar with the 6900xt.
Roughly a few months after the release of the 6900xt we had the 6900xtx (1.2v), and a while after that, about a year after the initial release, we had the 6900xtxh.
After a year we got the 6950xt, which brings overclocked RAM ICs and the GPU at 1.2v.

Now, that was more of a refresh, not a respin of the die. But they did improve it from the 6900xt to the 6950xt in incremental refreshes/updates, which some here have never heard of.
And it wasn't labelled as such on the box until we got to the 6950xt.
 

I didn't know this. I mean, I knew about the 6950xt, but I didn't know they had 6900xtx and xtxh versions.
 
Yes, they did. They were able to increase the voltage to 1.2v, but that's the maximum on air; to be honest it simply allowed voltages past 1.1v on air, with 1.2v really reserved for watercooling to control the heat.
The 6900xtxh was a hybrid of the xtx and the 6950xt IMO, as some of them allowed VRAM OCs higher than the 6900xtx, though not as good as the 6950xt. The xtxh and 6950xt could do at least 2600MHz easily on air. I'm not sure what the max of the 6950xt is as I stopped following it, but with recent drivers I assume it's going to be better than the 6900xtxh on air.


But the point of this was that AMD originally wanted to release the 6950xt, and worked towards that goal with a few refreshes until they achieved the desired performance target. So in the end, the 6950xt is what the original 6900xt should have been at release. They kept their lips tight about why it needed several refreshes; if I didn't know any better, I'd say it looked like an issue with the node.
 

Yeah, that seems to match what RGT touched on. It seems later in the product cycle they will get a nice card out, and the AIBs could produce some nice models. Hopefully we shall also see some competitive price drops as the battle commences!
 

Wouldn't a 5xx mm² GPU be a power hog?

The 7900xtx at 306 mm² = 355W, so an RDNA3 GPU at 506 mm² and the same clock speed = roughly 590W TDP. I can't see AMD releasing a 600W GPU given that one of their core goals is efficiency.
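A minimal sketch of that estimate in Python, under the rough assumption that board power scales linearly with die area at a fixed clock (it ignores the MCDs, memory and voltage behaviour, so treat it as illustrative only):

```python
# Naive estimate: scale board power linearly with GCD die area,
# holding clock speed (and therefore voltage) constant.

power_306mm2 = 355                  # W, quoted 7900XTX board power
area_small, area_large = 306, 506   # mm^2, announced die vs rumored larger die

estimated_power = power_306mm2 * (area_large / area_small)
print(f"Estimated TDP: {estimated_power:.0f} W")  # ~587 W, i.e. roughly 590W
```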
 