*** The AMD RDNA 4 Rumour Mill ***

Actually defective, fully throw-away dies are rare; there's redundancy in the design, and it's all about binning. A GB202 die that narrowly fails to qualify as a 5090 should be stockpiled and destined to become a 5090 GRE or whatever Ti variant once there are enough of them. Somewhere down the line either the binning process failed or there was deception to bump up 5090 numbers a bit; the latter seems unlikely, as if that could be proven they'd open themselves to a world of hurt.
 
Actually defective, fully throw-away dies are rare; there's redundancy in the design, and it's all about binning.
There's redundancy in the design only if you design it with enough redundancies. If you're trying to maximise yields and/or reduce cost, you'd design it with as few redundancies as possible, because those increase the area and cost of each die.
 
There's redundancy in the design only if you design it with enough redundancies. If you're trying to maximise yields and/or reduce cost, you'd design it with as few redundancies as possible, because those increase the area and cost of each die.
That's a bit self-contradictory. If you want a good yield, you *need* to design in redundancies, otherwise you have fewer working dice, and hence a lower yield...

There are some bits that you can't/wouldn't have spares of - e.g. PLLs, PHYs (PCIe/DisplayPort/GDDR) - if they fail, the chip is indeed e-waste...
 
That's a bit self-contradictory. If you want a good yield, you *need* to design in redundancies, otherwise you have fewer working dice, and hence a lower yield...
That's not how yields work.
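
To make the trade-off both posters are circling concrete, here's a minimal sketch assuming a simple Poisson defect model. Every number in it (defect density, block areas, spare counts) is invented for illustration; real binning and yield models are far more involved:

```python
import math

DEFECT_DENSITY = 0.001   # defects per mm^2 (hypothetical, ~0.1 per cm^2)
BLOCK_AREA = 5.0         # mm^2 per repeated compute block (hypothetical)
OTHER_AREA = 100.0       # mm^2 of non-redundant logic: PHYs, PLLs, etc.
BLOCKS_NEEDED = 60       # working blocks a sellable die must have
WAFER_AREA = 70000.0     # roughly a 300 mm wafer, ignoring edge losses

def p_block_ok() -> float:
    """Probability a single block is defect-free (Poisson model)."""
    return math.exp(-DEFECT_DENSITY * BLOCK_AREA)

def p_die_ok(total_blocks: int) -> float:
    """P(at least BLOCKS_NEEDED of total_blocks work) times
    P(the non-redundant area, which has no spares, is defect-free)."""
    p = p_block_ok()
    p_blocks = sum(
        math.comb(total_blocks, k) * p**k * (1 - p)**(total_blocks - k)
        for k in range(BLOCKS_NEEDED, total_blocks + 1)
    )
    return p_blocks * math.exp(-DEFECT_DENSITY * OTHER_AREA)

for spares in (0, 2, 4):
    total = BLOCKS_NEEDED + spares
    die_area = OTHER_AREA + total * BLOCK_AREA
    dies_per_wafer = WAFER_AREA // die_area
    good_dies = dies_per_wafer * p_die_ok(total)
    print(f"{spares} spares: {die_area:.0f} mm^2/die, "
          f"yield {p_die_ok(total):.1%}, ~{good_dies:.0f} good dies/wafer")
```

With numbers in this ballpark, a couple of spare blocks buys a large jump in sellable dies for a small area cost, after which the returns diminish - which is why designs carry some redundancy, but not redundancy everywhere.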
 
The issue no doubt originated with TSMC (nV didn't just magic up fab machines to disable ROPs). But the blame also lies squarely with nV, who undoubtedly saw the issue.

As I stated in another thread, the most likely scenario here by leaps and bounds is that nV knew about the issue and let the chips go to AIBs. What I can't decide is whether the AIBs missed the missing/non-functional ROPs in their QC, or whether they also knew about it and chose to ship anyway (of their own accord or under instruction from nV).

A conspiracy-minded person might even postulate that TSMC told nV they had a number of defective dies with non-functional ROPs, nV instructed them to furnish them anyway, and nV then passed the chips to AIBs, telling them to package them up for distribution.

I think this is true, both NVIDIA and AIBs knew and didn't care...

Both MILD and HUB have said before that, having spoken to engineers at NVIDIA and AMD, they are not all that concerned with variances of plus/minus 5%.

I'm convinced NVIDIA 100% do not care about the ROPs issue. What's interesting is that this would fail even the most basic validation process, and they shipped it anyway.
 
We should also all remind ourselves to stop comparing it with Nvidia's ludicrously priced cards and encourage AMD to release cards priced properly.

FTFY.

It is amazing how many people want AMD to price their cards based on Nvidia's card pricing. Unless you own shares in AMD, it makes no sense.
 
FTFY.

It is amazing how many people want AMD to price their cards based on Nvidia's card pricing. Unless you own shares in AMD, it makes no sense.

That's why I only look at generation-on-generation improvements at a similar price. Just because something is cheaper does not mean it isn't still overpriced!
 
I don't mind cheaper as I am a cheapskate! But for me I always wanted around RX7900XT-level performance at £480 to £500, because it would be a 25% to 30% progression over an RX7800XT. Now if AMD gets close to that, I will get one. Say if it ends up being £600 for that level of performance, I probably won't bother, and neither will I bother with an RTX5070 at that price.

Well I won't be buying then! It's coming up to Spring and Summer so I can wait.

If these AMD shareholders in here get their way you won't be getting a 9070XT or a 9070. Lol.

That's 2 more years of 3060Ti for you then, unless you face reality and up your budget :p
 
If these AMD shareholders in here get their way you won't be getting a 9070XT or a 9070. Lol.

That's 2 more years of 3060Ti for you then, unless you face reality and up your budget :p

I already upped my budget. I play a mix of older titles, Indy and newer titles, and a lot of my mates actually don't have expensive cards either, so the peer pressure isn't there to upgrade; it's just me wanting to keep up with new tech. There comes a point where my interest in gaming doesn't justify throwing money at overpriced, shrinkflated hardware, as I have so many other hobbies vying for the same budget.

Ever since all these companies started targeting the people with loose purses, my expenditure on other hobbies in the last few years has been more than on PC parts.
 
This is what you said, as you have clearly forgotten it:

But basically you said the RX9070XT had to be priced below an RTX5070, assuming the performance figures are legit. So you are saying the RTX5070 and RTX5070TI are fine at those RRPs.

I only mentioned Cyberpunk 2077 as even DF admitted that CDPR was extensively optimising for that game. So in most other games with RT, which use more economical RT because they need to run fine on an RTX4060TI or RTX4070, the difference will be less.

Glad we cleared that up and I already computed the numbers based on Cyberpunk 2077.

So you basically admitted that for 20% extra performance in one game you will pay 50% more. Also thank you for admitting that you would still buy a more expensive RTX5070 even though:
1.) It isn't any faster than an RX9070XT in Cyberpunk 2077 by your own estimations
2.) That means in most other games it won't be quicker, assuming the said numbers hold true
3.) The RX9070XT will be faster in rasterised performance
4.) The RX9070XT will have more VRAM

Even if AMD were to match the RTX5070TI in RT and rasterised and price it at $400, I doubt you would buy it. My best advice to you is to put in an RTX5070 order next week and stop looking at this thread! :cry:
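
As a sanity check on that arithmetic, here is a minimal sketch using only the figures quoted in this exchange ($500 vs $750, roughly 20% extra performance); the numbers are the thread's assumptions, not measured results:

```python
# Quick check of the price/performance claim, using the figures quoted
# in the thread ($500 vs $750, ~20% performance gap). Illustrative only.
radeon_price, geforce_price = 500, 750
radeon_perf = 1.00              # normalised performance
geforce_perf = 1.20             # ~20% faster in the cited case

price_delta = geforce_price / radeon_price - 1   # 0.50 -> 50% more money
perf_delta = geforce_perf / radeon_perf - 1      # 0.20 -> 20% more perf
print(f"{price_delta:.0%} more money for {perf_delta:.0%} more performance")

# Cost per unit of performance: higher means worse value.
print(f"Radeon:  ${radeon_price / radeon_perf:.0f} per perf unit")
print(f"GeForce: ${geforce_price / geforce_perf:.0f} per perf unit")
```

On a pure cost-per-performance basis the cheaper card wins; the counter-argument later in the thread is that the extra 20% moves the card into a different tier, which is a value judgment the arithmetic alone can't settle.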


What some of these people are not doing is optimising for AMD; so far as some of them are concerned, AMD don't exist. They aren't really wrong, with AMD at 10% and falling market share: there are consoles and then there's Nvidia, and that's it. Cyberpunk, as an example, still doesn't have the latest FSR; the moment Nvidia launch a new DLSS it's put in as soon as possible, while AMD get the most basic implementation and are left at that.

Read the slide @Poneros posted: AMD's GPUs are capable of Pre-Primitive Culling, Nvidia's GPUs aren't, so they decided to code it for consoles (AMD GPUs) but not for PC, at all...
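
For anyone unfamiliar with the term from the slide: primitive culling broadly means rejecting triangles before rasterisation when they can't contribute to the image. A minimal, purely illustrative CPU-side sketch of the basic back-face/zero-area test follows - this is not AMD's hardware path or any engine's actual implementation:

```python
# Minimal sketch of the idea behind primitive culling: discard a triangle
# before rasterisation if it faces away from the camera or has zero area.
# Purely illustrative maths, not a real hardware or engine pipeline.
import numpy as np

def cull_triangle(v0, v1, v2, camera_pos) -> bool:
    """Return True if the triangle can be safely discarded."""
    v0, v1, v2 = map(np.asarray, (v0, v1, v2))
    normal = np.cross(v1 - v0, v2 - v0)    # face normal (winding-dependent)
    if np.allclose(normal, 0.0):
        return True                         # degenerate: zero-area triangle
    to_camera = np.asarray(camera_pos) - v0
    return float(np.dot(normal, to_camera)) <= 0.0   # back-facing

# Example with hypothetical scene data: camera at the origin.
print(cull_triangle((0, 0, -5), (1, 0, -5), (0, 1, -5), (0, 0, 0)))  # front: False
print(cull_triangle((0, 0, -5), (0, 1, -5), (1, 0, -5), (0, 0, 0)))  # back:  True
```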
 
This is what you said, as you have clearly forgotten it:
I've not forgotten anything; you just aren't understanding the context. For one, if you were aiming at me, then you should have quoted me; otherwise, when I responded, I said it meaning people in general (which is the verbiage you used). For ME it would have to be $500 vs a $750 70 Ti, because I value all the benefits of going GeForce that highly in this match-up (features AND performance, and not only for gaming). That's different from making pronouncements about what the price should be, taking into consideration the wider market.

But basically you said the RX9070XT had to be priced below an RTX5070, assuming the performance figures are legit. So you are saying the RTX5070 and RTX5070TI are fine at those RRPs.
The 9070 XT would obviously have to be priced below the 5070 Ti; that's just business 101. Even with identical performance and features that would be the case due to market dynamics, and of course they're not really identical. The 9070 / 5070 I haven't paid too much attention to; I'd rather wait for launch for those.

I only mentioned Cyberpunk 2077 as even DF admitted that CDPR was extensively optimising for that game. So in most other games with RT, which use more economical RT because they need to run fine on an RTX4060TI or RTX4070, the difference will be less.
That's not even true, and as more games come out on UE5 the difference is still worse by default. But more importantly, it's irrelevant, because it's still about the relative RT difference between GeForce & Radeon, which as yet hasn't improved and has in fact gotten worse.

So you basically admitted that for 20% extra performance in one game you will pay 50% more.
No, I said the 20% extra perf. is enough to move these cards across tiers. Again - pay attention to the context!

Also thank you for admitting that you would still buy a more expensive RTX5070 even though:
But I didn't. The only way a 5070 would be more expensive is if the 9070 XT were <$550, which is what I said I'd want as the price in the first place. And I only said I'd buy the 5070 Ti, so why you conflate the 70 and the 70 Ti is not clear to me.

1.) It isn't any faster than an RX9070XT in Cyberpunk 2077 by your own estimations
But it is faster, a tier above in fact, at a minimum.
Actually, it's way worse than that because I'd also use PT so there it will be twice as fast.

2.) That means in most other games it won't be quicker, assuming the said numbers hold true
I don't care about "most other games", I only care about the games where I need the performance/features.

3.) The RX9070XT will be faster in rasterised performance
No, parity (at best). In my case not even, due to game selection.

4.) The RX9070XT will have more VRAM
Exact same VRAM.

Even if AMD were to match the RTX5070TI in RT and rasterised and price it at $400, I doubt you would buy it. My best advice to you is to put in an RTX5070 order next week and stop looking at this thread! :cry:
You can doubt all you want, but I have 22 years of Radeon ownership, including the 6800 I'm using right now (and never mind how many I got for friends & family). So long as it made sense, I bought ATI/AMD. Now it doesn't, so I don't. It's really that simple. The only corp I care about is Me Inc.
 
Imagine being AMD: developing advanced technologies for your GPUs that then don't get used because your competitor doesn't have those technologies. The lack of optimisation for AMD also means AMD's GPUs are effectively a tier below where they should be, and then they get screamed at by tech journalists for not being 40% cheaper than that lower tier they are forced to compete in.

As I keep asking, why do AMD bother? It seems insane to me...
 
Oh, and BTW... Intel get the same treatment; they have no chance of gaining a foothold here. This segment is rotten to the core.
 
I already upped my budget. I play a mix of older titles, Indy and newer titles, and a lot of my mates actually don't have expensive cards either, so the peer pressure isn't there to upgrade; it's just me wanting to keep up with new tech. There comes a point where my interest in gaming doesn't justify throwing money at overpriced, shrinkflated hardware, as I have so many other hobbies vying for the same budget.

Ever since all these companies started targeting the people with loose purses, my expenditure on other hobbies in the last few years has been more than on PC parts.

I am streaming BG3, Total War and Indy just fine. Maybe you should think about joining the Apple M4 Mini game streaming master race where there is no noise, no heat and nothing for people to rob.
 
Imagine yourselves years from now watching videos about 'where did it all go wrong'

Consoles now exist as empty boxes with an internet connection for you to stream your games to; RTX ##60-series GPUs now start at £1000, and Nvidia are heavily promoting their GeForce Now streaming service at £55 a month. AMD's fight with Nvidia is now solely in workstations, after having pulled out of consoles and dGPUs a couple of years ago.

It's all a bit ****. Our hobby is dead. What do we think the "where did it all go wrong" video will say?
 
It's all a bit ****. Our hobby is dead. What do we think the "where did it all go wrong" video will say?

The rot set in when we started to see proper voltage control and power limits locked behind a 1,500-quid Kingpin-sized paywall.

Said paywall was then eliminated as an option.
 

I have had over 23 years of mixed AMD and Nvidia ownership - I go by price/performance. Whichever company gives me that, I will get the card. If neither does, I have no interest. I have seen all the games played with tessellation, G-Sync, etc.

I get that you want more RT performance, and if the cards were closely priced I could understand it.

But then you put out a price which is lower than the $540 RTX5070, which is what YOU stated. The RTX5070 is $540, and you think in that context the RX9070XT should be $500. I mentioned the RTX5070 to see your response.

So I will ask you again to clarify a few things, based on the assumption that AMD or Videocardz isn't fibbing about the numbers:
1.) Assume the RX9070XT is as fast as an RX7900XTX in rasterised
2.) Assume the RX9070XT is around an RTX4070 Super to RTX4070TI in an average of RT games
3.) Assume the RTX5070 is an RTX4070 Super in rasterised/RT (maybe even an RTX4070TI in RT)

So around parity in RT (let's say 10% better for Nvidia, because we all love Nvidia here), and the AMD card offering 20% to 30% better rasterised performance and 33% more VRAM.

Now, using these assumptions, if the RX9070XT was priced the same as the RTX5070, what would you get and why? Not $500, but say $550, which is around the same price.
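
For what it's worth, the question can be framed as a toy weighted-score comparison using only the assumptions listed above, at price parity (~$550 each). The weights are hypothetical placeholders - the entire disagreement in this thread is about what they should be:

```python
# Toy weighted-score comparison of the two cards at price parity, using
# only the thread's assumed numbers. Weights are hypothetical stand-ins
# for how much a given buyer values each axis.
cards = {
    "RX9070XT": {"raster": 1.25, "rt": 1.00, "vram_gb": 16},  # ~RX7900XTX raster
    "RTX5070":  {"raster": 1.00, "rt": 1.10, "vram_gb": 12},  # ~4070 Super class
}

weights = {"raster": 0.5, "rt": 0.4, "vram": 0.1}  # hypothetical buyer

for name, c in cards.items():
    vram_norm = c["vram_gb"] / 16          # normalise against the larger pool
    score = (weights["raster"] * c["raster"]
             + weights["rt"] * c["rt"]
             + weights["vram"] * vram_norm)
    print(f"{name}: weighted score {score:.3f}")
```

Shift enough weight onto RT (or path tracing) and the GeForce comes out ahead; weight raster and VRAM and the Radeon does. That's the whole argument in three lines of arithmetic.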
 