
Radeon 5770?

They aren't disabling WORKING SIMDs; it's a low-yield process, meaning not many cores off the wafer actually work completely. Those that do become 5870s, those where almost everything works become 5850s, and those that don't have 1440 working shaders drop to the next level down.

It's saving LOTS of money, because you can't sell a core with only 1120 working shaders as a 5870/5850. That means you either throw those cores out, or you sell them as lower-end parts.

At this stage, with awful yields, it's very unlikely they'd do either of two things: one, not come up with extra core configurations to increase the number of salvaged cores; and two, waste time on a lower-profit part right now when demand for their £300/£200 parts is so high and they only have a small amount of capacity at TSMC.
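To put that sorting argument in concrete terms, here's a minimal sketch (Python, purely illustrative) of the kind of binning rule being described. The 1600 and 1440 cut-offs are the shader counts mentioned in this thread; the 1120 "salvage bin" is exactly the speculation under discussion, not a confirmed SKU.

```python
def bin_cypress_die(working_shaders: int) -> str:
    """Map the number of functional shader units on a tested die to a product bin."""
    if working_shaders >= 1600:
        return "HD 5870"        # fully working die
    if working_shaders >= 1440:
        return "HD 5850"        # tolerates some fused-off shader clusters
    if working_shaders >= 1120:
        return "salvage bin"    # hypothetical lower bin - the speculation in this thread
    return "scrap"              # too much broken to sell at all

print(bin_cypress_die(1500))    # -> HD 5850
print(bin_cypress_die(1120))    # -> salvage bin
```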

As for the bracket, the bracket size is determined by the componentry it has to fit around more than anything else. With 50% of the memory bus not connected, you have FAR less circuitry connected.

We might in fact see two versions of the 5770: early cut-down versions until yields are high, and eventually an actually physically smaller version with the same specs.

Remember, the yields simply suck, and not just because of the core design; you'd be throwing out a lot of cores from a wafer of smaller cores as well. Increasing effective yields by salvaging parts offsets low yields.
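As a rough illustration of why salvage bins matter so much when yields are poor, here's a back-of-the-envelope sketch using the classic Poisson yield model. The defect density and both die sizes are assumed placeholder figures for a "big die vs. small die" comparison, not TSMC's actual 40nm numbers.

```python
import math

DEFECT_DENSITY_PER_CM2 = 0.5   # assumption, for illustration only

def defect_free_fraction(die_area_mm2: float) -> float:
    """Fraction of dies expected to have zero defects: exp(-defect_density * area)."""
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100.0)

for label, area in [("big die (~330mm^2, assumed)", 330.0),
                    ("small die (~185mm^2, assumed)", 185.0)]:
    print(f"{label}: ~{defect_free_fraction(area):.0%} defect-free")

# Dies that aren't defect-free but still have enough working SIMDs can be sold
# in a lower bin instead of scrapped - that's the "salvaging offsets low yields" point.
```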

I think we could argue about this all day and basically make no progress due to lack of information, so I think we can just agree to disagree on this; we'll find out eventually anyway. :)
 
Pretty interested in this since I doubt I will be getting a 5850 anytime soon. (The 3870 does everything I want it to at present.)
Certainly interested if it can be overvolted and possibly unlocked.
Like it for the low power draw as well.
 
He is right. Throwing away cores with 50% of their shaders functioning is akin to giving money away. If they can sell them, they will.

Wait, I wouldn't mind you essentially declaring my argument a failure if you'd actually concluded with something substantial, but by the looks of it you're just going by your favourite piece of speculation (not that anything I said was much more than speculation based on the given evidence, either). If you've got a source or something, do let me know, though.

Regardless, new information: Charlie of SemiAccurate seems to reckon it is a different die, with a size of 185mm^2, going by this:

http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/

There is no way that the 260 and 275 can make money if ATI prices Juniper at $125 and $175 for the low and high end variants respectively.

The problem in a nutshell is that Nvidia has a card that is based on a 484mm^2 die against a 185mm^2 die with a much cheaper board and memory subsystem.
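For a sense of scale on that 484mm^2 vs. 185mm^2 comparison, here's a quick sketch using a common gross-dies-per-wafer approximation for a 300mm wafer. It ignores wafer cost, yield and packaging, so it only shows the raw candidate-die disparity, not actual margins.

```python
import math

WAFER_DIAMETER_MM = 300.0

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Standard approximation: pi*r^2/A - pi*d/sqrt(2*A)."""
    d = WAFER_DIAMETER_MM
    dies = math.pi * (d / 2.0) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2.0 * die_area_mm2)
    return int(dies)

for area in (484.0, 185.0):   # the two die sizes quoted above
    print(f"{area:.0f}mm^2 die: ~{gross_dies_per_wafer(area)} candidate dies per 300mm wafer")
```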
 
Looks interesting - am thinking of 2 in CrossFire instead of a 5850/5870.

Do wonder how much 'damage' would be caused by running them on a P45 (2x8 rather than 2x16). If memory serves, the loss when running 2x 4870s was only 10-15% at 1920x1200, and if the 'new' cards were to run similarly (or better :) ) then I would be very tempted.
 
Wait, I wouldn't mind you essentially declaring my argument a failure if you'd actually concluded with something substantial, but by the looks of it you're just going by your favourite piece of speculation (not that anything I said was much more than speculation based on the given evidence, either). If you've got a source or something, do let me know, though.

Do I need a source? Whatever the 5770 is, is speculation right now until somebody takes a look at one. But regardless of that, it's common practice. They all do it.

If you've got a good reason to throw away otherwise working cores, especially given the poor yields at this time, then tell me why you'd do it. If you know ATI are doing it - you know, do you have a source? Then straight back at you - let me know.

I'll tell you this much - if the 5770 isn't based on a full-fat core, I'm sure the 5830 will be.
 
Do I need a source? Whatever the 5770 is, is speculation right now until somebody takes a look at one. But regardless of that, it's common practice. They all do it.

If you've got a good reason to throw away otherwise working cores, especially given the poor yields at this time, then tell me why you'd do it. If you know ATI are doing it - you know, do you have a source? Then straight back at you - let me know.

I'll tell you this much - if the 5770 isn't based on a full-fat core, I'm sure the 5830 will be.

Firstly, yes, if you're going to try to conclude an argument like that, it helps to have a source definitively proving your point - in this case, something saying that the core is not a new core, but the same Cypress core with shaders disabled. So far, nothing has so much as hinted at that; it's an assumption. It's rude. ;)

And I know full well that GPU manufacturers disable parts of cores to help yields and to differentiate their products - you can see that with the 5850 and 5870: the 5850 has 1440 of the core's total 1600 processing units enabled. However, by the time you get to just over half of the processing units not working, there's a reasonable chance other parts of the core with less redundancy are a little buggered too. For example, if a defect is in the core's only tessellation unit, you may as well call it a write-off.
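To make that redundancy point concrete, here's a tiny illustrative check. The block names and the 800-shader threshold are assumptions for the sake of the example, not a real die layout: shader clusters can be fused off, but a defect in a single-instance block writes the die off regardless of how many shaders survive.

```python
def salvageable(working_shaders: int, tessellator_ok: bool, mem_controllers_ok: bool) -> bool:
    """Shaders have redundancy; single-instance blocks do not."""
    if not (tessellator_ok and mem_controllers_ok):
        return False               # no lower bin can hide a dead non-redundant block
    return working_shaders >= 800  # assumed minimum for the lowest hypothetical bin
```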

Since Juniper is supposed to be a high-volume part ($100-$200 segment), I don't think this would be very reliable, but that's just opinion. It seems like a bit of a crap trade-off, at any rate - the number of cores that fit the 5700 series' specific requirements whilst being financially beneficial to AMD would be quite small - and as soon as yields pick up, you're forced to sell a product that could easily be sold for much more for a lot less, just because of consumer demand (or make a new core, as Drunkenmaster suggested, with the same processing power as the cut-down core, which is another several million in R&D down the drain).

It's like how Nvidia produced G94 (505 million transistors) with 64 shaders for the 9600 GT - which, I remind you, didn't even cost much less than the 8800 GT and didn't perform a whole lot worse than it - despite G92 (754 million transistors) having just had horrible supply issues at the time. Just because something is common practice doesn't mean it will automatically happen.

Also, in terms of a source, I direct you to the post you just quoted. Whilst it's not definitive, it's at least something. Here's another if you want; I didn't notice this one when reading the review of the 5800 series cards, but it seems pretty conclusive.

http://techreport.com/articles.x/17618/2

Juniper is a separate, smaller chip aimed at the range between $100 and $200.

And yes, you're probably right, if there's a 5830, it likely will be based on Cypress. I don't think there are any details of such a card as yet, but it wouldn't surprise me if one were to surface.
 
You're going off on a tangent. I said AMD won't be throwing away working cores, nothing more. You haven't done anything to disprove that.

He is right. Throwing away cores with 50% of their shaders functioning is akin to giving money away. If they can sell them, they will.
And yes, you're probably right, if there's a 5830, it likely will be based on Cypress. I don't think there are any details of such a card as yet, but it wouldn't surprise me if one were to surface.

After all that, you're agreeing, lol.
 
Wait - your argument was the one off on a tangent (at least the one you declared Drunkenmaster to be right about). Of course they wouldn't throw away working cores (provided they'd tested for them). :p

Well, I got my point across and found a reliable source for it. I'm happy.
 