
AMD "Tonga" Silicon Features 384-bit Wide Memory Interface

Well, I'm a 'won't ever buy Nvidia again'-er, ever since I was burned by their defective hardware and their poor choice of solder. Lesson: never trust a hardware company which boasts that it is a software company (I seem to recall some JHH quote along those lines).

First to go was my 8800GT, back in 2011, shortly after BFG went bankrupt. This was followed shortly after by my brother's 8800GT (not a BFG). Then came a bunch of other failures in hardware that either I or someone I knew owned: 8400M GS cards, 7150 chipsets and some laptop chipsets, all manufactured around the same time. Nvidia never did come clean about how widespread the problem was; the Wikipedia entry:
http://en.wikipedia.org/wiki/GeForce_8_series#Problems
only mentions G84 and G86, but I don't believe that's the whole story.

Anyway, I won't ever buy Nvidia again. And if all these AMD doom-and-gloom stories turn out to be true and AMD go bankrupt, I'll just put up with Intel HD. At least Intel stand behind their products (even when the problem is minor, like the SB SATA ports), unlike Nvidia.

As for the thread subject: this would be a surprise. The die size made it look like Tonga was a step back in efficiency, but if a larger 384-bit version with more shaders is coming, that might not be true. Would this be the first time AMD have die-harvested and binned a chip by bringing out the cut-down version first? I've thought ever since the 7950 GHz Edition that AMD should really do more binning: the 7950 GHz Edition at 1.25 V was hot and loud, while a binned version at 1.00 V would have been far cooler and quieter. After all, a lot of miners saved 20-40 W by doing exactly that.
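
As a back-of-envelope check on that last claim: dynamic power scales roughly with f x V^2, so at the same clock a drop from 1.25 V to 1.00 V cuts dynamic power to (1.00/1.25)^2 = 64%. A minimal sketch, assuming an illustrative 150 W GPU-only draw (not a measured 7950 figure):

    # Rough estimate of the undervolting saving, assuming dynamic power
    # dominates and scales as P ~ f * V^2 (a first-order approximation).
    def dynamic_power_scale(v_new: float, v_old: float) -> float:
        """Relative dynamic power at the same clock after a voltage change."""
        return (v_new / v_old) ** 2

    gpu_power_w = 150.0                      # assumed GPU-only draw at 1.25 V
    scale = dynamic_power_scale(1.00, 1.25)  # (1.00 / 1.25)^2 = 0.64

    print(f"power scale factor: {scale:.2f}")                      # 0.64
    print(f"estimated saving: {gpu_power_w * (1 - scale):.0f} W")  # ~54 W

The ideal-case saving comes out higher than the 20-40 W miners actually reported, which is consistent: static and board power don't scale with voltage squared, and not every chip is stable at 1.00 V.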


That's the spirit.

Just like I will never buy a game that has Mantle support. :)
 
Go for it. :D

Yep, them Nvidia cards are crap: the drivers don't work, they are the wrong colour, the wife hates them, my car broke down, the Titans make fantastic house bricks, look where they got ET, there is a 50% higher chance of slipping on a banana skin, and the aliens were using them in Independence Day.

Got any more? :D
 
Even though I have an Nvidia card ATM (and bought another two in the last year and a half), people cannot just laugh off the G80-series bumpgate problems. I knew of nearly 20 cases of people affected by it (mostly laptops) at the time, and it was handled badly: OEMs were often issuing fan-speed updates on laptops to try and "fix" the problem before, in the end, the machines had to be replaced, and in many cases I had to inform people of the issue so they could get replacements from said OEMs. The problem seemed to be far worse in laptops than in desktops.

My only card to ever go bang was my 8800GTS 512MB, but to this day I don't know whether I was just unlucky or whether it was a bumpgate issue. I think I might have been unlucky, TBH, since my 8500GT survived, but the latter had a massive aftermarket heatsink attached to it (so it would be quieter), and the 9300 IGP in my other system is fine too.

The problem cost Nvidia at least $200 million, and that did not include the numerous class-action lawsuits in the US.
 

Nvidia got off very lightly indeed for the amount of damage they caused. Desktop GPUs were fairly easy to replace with something else, but for laptops that was not the case, as most of the time there was nothing to replace them with. Apple, for instance, were about the only manufacturer who actually tried to sort this out for their customers, and some people ended up getting their MacBook Pro replaced two or three times. The problem was that all of Nvidia's chips from that period were inherently faulty. (Other manufacturers like HP and Dell mostly just fobbed off their customers.)

In comparison, the Sandy Bridge SATA recall was supposed to have cost Intel $700 million, and that was for a problem which would 'only' have resulted in the SATA 3Gbps ports degrading over time rather than the whole chip failing.

But the real problem I have with Nvidia is that they never came clean. Their line of "only some mobile G84/G86 chips were affected" was rather suspect. Those mobile parts were the first to fail (thermal stress inside laptops, etc.), but when this happened Nvidia must have known that almost every part they sold during that time had the same solder defect.

So the scale of this was something similar to the 'capacitor plague', causing billions of dollars of damage to consumers, yet it got so little coverage in the media. That is the true genius of Nvidia's press relations. I rather suspect that Charlie from the Inquirer, who exposed this story, is hated by a lot of people not because of his rants but because he exposed Nvidia as a bunch of cowboys unwilling to stand behind the quality (or lack thereof) of their products.
 
There are plenty of 290s popping up on MM now, so it's fair to say members on both sides feel it's worthwhile upgrading to a 970/980 for the improvements outside of pure benchmarking. Christmas is a busy period as well.

Yes, or in my case a replacement for my broken 7990; it broke because I tried to reduce its temperatures... so the 970 is perfect, it runs cool and quiet. Let's hope all cards run cool and quiet from now on.

About the benchmark... that's a very interesting point. Easy: all I do is buy another card next month :cool:

Why not the 295X2? Too much money all at once... plus I want to buy another PC case tomorrow :D:D
 
OK, fair enough; the next question is why?

Why would AMD design a Tahiti replacement that is going to be a midrange part, take the time to bring the new compression technology to the table, making it about 25% more efficient in its memory bandwidth needs, and then make the whole exercise a waste by using a 384-bit bus anyway?

It is never going to be a Hawaii replacement; it is just not fast enough.

A 384-bit bus makes no sense on this chip.
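
For what it's worth, here is the rough arithmetic behind that objection, as a minimal Python sketch. The 5.5 Gbps GDDR5 data rate matches the R9 285 as shipped, and the 25% compression gain is the figure from the post above; treat the outputs as illustrative:

    # Effective-bandwidth arithmetic behind the 384-bit question.
    def raw_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
        """Raw memory bandwidth in GB/s for a given bus width and data rate."""
        return bus_bits / 8 * data_rate_gbps

    COMPRESSION_GAIN = 1.25  # ~25% more effective bandwidth, per the post
    DATA_RATE = 5.5          # Gbps, the R9 285's GDDR5 speed

    for bus in (256, 384):
        raw = raw_bandwidth_gbs(bus, DATA_RATE)
        eff = raw * COMPRESSION_GAIN
        print(f"{bus}-bit: {raw:.0f} GB/s raw, ~{eff:.0f} GB/s effective")

On those assumptions a 256-bit bus already behaves like a ~220 GB/s part, not far off Tahiti's raw 264-288 GB/s, while a 384-bit bus with compression (~330 GB/s effective) would exceed even Hawaii's 320 GB/s raw, which is what makes the combination look odd on a midrange chip.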

I get what you are saying regarding the bus, as it does seem quite large for the targeted bracket, but the thing is I expect a fully enabled Tonga to run at a higher effective clockspeed than the R9 290. The R9 285 is clocked at a maximum of only 918MHz, which is lower than the R9 280X and the R9 290.

I have a feeling it could get to similar performance, or close enough, as the R9 290 is only 30% to 40% faster overall at 1920x1080 and 2560x1600 than the R9 285, and 25% faster than an R9 280X (see the rough scaling sketch after this post).

A few things a fully enabled Tonga will have over Tahiti:
1.) Increased effective memory bandwidth due to compression.
2.) Far better tessellation performance.
3.) Probably higher base clockspeeds out of the box.
4.) I suspect it will have a 50% increase in ROPs, i.e. 48.
5.) Large improvements in media-specific features like VCE and video decoding.
6.) Later-generation GCN shaders.

We have seen AMD do something similar with the HD6870: it had fewer shaders than an HD5870 but got close to it in many instances.

If we do see a fully enabled Tonga GPU, I do hope power consumption is reasonable too.
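
To put rough numbers on the 'close enough' argument: a full Tonga die is believed to carry 2048 shaders against the R9 285's 1792. A naive sketch, assuming linear scaling with shaders and clock and a purely hypothetical 1000 MHz clock (both assumptions, so read it as an optimistic upper bound):

    # Naive scaling estimate for a fully enabled Tonga vs the R9 285.
    # Performance rarely scales linearly with shaders and clock, so this
    # is an upper bound, not a prediction.
    R9_285_SHADERS, R9_285_CLOCK_MHZ = 1792, 918  # shipping part
    FULL_TONGA_SHADERS = 2048                     # full die (assumed)
    HYPOTHETICAL_CLOCK_MHZ = 1000                 # illustrative clock target

    speedup = (FULL_TONGA_SHADERS / R9_285_SHADERS) * (
        HYPOTHETICAL_CLOCK_MHZ / R9_285_CLOCK_MHZ
    )
    print(f"naive speedup over R9 285: {speedup:.2f}x")  # ~1.24x

A ~24% gain from shaders and clocks alone would land near the bottom of the 30-40% gap to the R9 290 quoted above, before counting the ROP, tessellation and bandwidth improvements in the list.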
 