Soldato · Joined: 21 Apr 2007 · Posts: 2,651
Seems like we are in for a good fight this Winter.
Failing that, at least we'll be warm with all these new power limits.

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I think the extra VRAM will persuade many.
Can't it be 256-bit with 16GB VRAM and 80 CUs if it's 505mm²?
Yeah, I had one of those and a 290.
That slide (here it is in my stash) has six quad-channel 64-bit IMCs, so 384-bit in total. I can't see anyone doing a 512-bit GPU these days; at that point it would be better to put HBM on it.
TBF, if AMD can improve their memory bandwidth utilisation then there's no reason they can't get RTX 3070-level performance on a 256-bit memory bus. If they could do that, it would improve efficiency and significantly lower the BOM for AMD's GPUs.
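For a rough sanity check on the bus-width talk above: theoretical bandwidth is just bus width times per-pin data rate. A minimal sketch, with illustrative GDDR6 speeds rather than confirmed specs for any of these cards:

```python
# Theoretical memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps
# The 16 Gbps figure below is a typical GDDR6 speed, used purely for illustration.
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps  # result in GB/s

print(bandwidth_gbps(256, 16.0))  # 512.0 GB/s on a 256-bit bus
print(bandwidth_gbps(384, 16.0))  # 768.0 GB/s on a 384-bit bus
```

So a 256-bit card starts a long way behind a 384-bit one at the same memory speed, which is why better bandwidth utilisation (or faster memory, or a cache) would have to make up the gap.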
Hype building.
What about a 56CU one? They wouldn't do that with an 80CU GPU though?
What about a 56CU one?
XFX Triple Dissipation "Peasant Edition"?
"Fabric": if that block diagram is real, Big Navi will be the world's first MCM graphics card.
No different to TDP and heat either. AMD make a card that requires 300W and the usual Nvidiots come out mocking the hell out of it, finding every meme possible playing on the power and heat issues and calling it a disaster. Nvidia bring out a 350W card at extortionate pricing and those same Nvidiots get the tissues out, fapping over how incredible the card will be and how much power it will use. All of a sudden the electric bill isn't an issue.
Speaking of R&D, this just landed in my inbox...
https://www.hardwaretimes.com/intel...d-over-the-years-nvidia-spent-less-than-both/
Holy crap...
WTF are Intel doing? And Nvidia don't like spending money, do they?
Intel have a lot of fingers in a lot of pies, not just CPUs (NICs, modems, NAND, SSD controllers, GPUs, etc.), as well as the fab stuff.
There is a difference between having a high-TDP card and having a high-TDP card with the performance to match. Navi has high power consumption for the performance it gives on the process it's on, only just reaching Pascal's perf-per-watt despite being on 7nm vs 16nm.
Say the 3090 had 2080 Ti performance at 350W; that would be a speed and design fail on the level of the FX GPUs. But 2x+ the performance and 2x the RAM on a 350W card is not too bad, is it?
I still think it's pretty amazing AMD are where they are, considering they're fighting on two fronts against two bigger companies while producing two different core products.
Lol, too true. The problem is people like JayZ2Cents, who constantly bangs on about how he's not an Nvidia fanboy because he gets accused of it so often, and who probably believes he's not... And then he says stupid things like "There is no way AMD could ever catch Nvidia. I mean, think about it: they'd have to increase the performance of the 5700XT by 350% to match the 2080 Ti."
And his post-Ampere-reveal video was a cringe fest of sycophantic drivel about how brilliant Ampere is and how "Nvidia never exaggerate anything, not like AMD do".
And he's far from the only one with this deranged, almost religious reaction to Nvidia; just watch pretty much all the main reviewers' Ampere follow-up videos, they're all the same. Other than Hardware Unboxed, that is... this is what AMD are up against: a religion. Religious people don't realise they're nuts!
350W is due to Samsung's 8nm; it's basically broken.
Yields are likely to be really poor.
According to sources? I thought Nvidia had 80% of the market share though?
Nope, the 8nm is a robust process with a D0 of approx 0.12 defects per sq cm. There are not going to be any supply issues whatsoever... in fact, Nvidia might be purposely binning good 3090 cores just for product positioning purposes.
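For context, a quoted D0 feeds straight into a back-of-envelope yield estimate. A minimal sketch using the simple Poisson yield model; the 505mm² die area is the rumoured figure from upthread and purely illustrative, and real foundry yield models are more involved than this:

```python
import math

# Poisson yield model: fraction of defect-free dies Y = exp(-A * D0)
#   A  = die area in cm^2 (505 mm^2 is the rumoured figure, used for illustration)
#   D0 = defect density in defects per cm^2 (0.12 is the figure quoted above)
def poisson_yield(area_mm2: float, d0_per_cm2: float) -> float:
    area_cm2 = area_mm2 / 100.0  # 100 mm^2 per cm^2
    return math.exp(-area_cm2 * d0_per_cm2)

print(f"{poisson_yield(505, 0.12):.1%}")  # roughly 55% of dies defect-free
```

So even taking 0.12/cm² at face value, a die that size would see nearly half its candidates carry at least one defect, which is where binning and cut-down SKUs come in.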