PS3 hardware "slow and broken".

Caustic said:
Read my post a few replies up. It is not vital at all. It won't make the slightest difference. The article seems 100% correct, but is just highly misleading.

*The CPU does not need to read from GPU mem!*

Agreed, I think the story is true also but I don't think it's going to affect the overall performance of the console.
 
Caustic said:
Yup, but the 16MB/s is only for the GPU reading CPU mem. All other combinations are fine, as the table shows.


OK, it just seems to me like he's talking about the Cell reading local memory, not the RSX reading the Cell's memory

For local memory, the measured vs. theoretical bandwidth is missing; I wonder why? RSX is at a solid 22.4GB/s for both read and write, good job there, green team. Then comes the blue team with Cell. Local memory write is about 4GB/s, 40% of the next slowest bandwidth there. Then comes the bomb from hell: the Cell's local memory read bandwidth is a stunning 16MB/s. Note that's a capital M to connote Mega vs. a capital G to connote Giga. This is a three-order-of-magnitude oopsie, and it is an oopsie, as Sony put it: "(no, this isn't a typo...)".
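To put those figures in perspective, here's a quick back-of-the-envelope sketch using the bandwidth numbers quoted above. Decimal MB/GB units and the RSX's 256MB GDDR3 pool size are assumptions on my part, not from the slide:

```python
# Bandwidth figures quoted in the post above (decimal units assumed).
GB = 1_000_000_000
MB = 1_000_000

rsx_local_read = 22.4 * GB   # RSX reading its own local (GDDR3) memory
cell_local_write = 4.0 * GB  # Cell writing to RSX local memory
cell_local_read = 16 * MB    # Cell reading RSX local memory: the "oopsie"

# Ratio: roughly three orders of magnitude slower than the RSX's own reads.
ratio = rsx_local_read / cell_local_read
print(f"Cell local-memory reads are {ratio:.0f}x slower than RSX's")  # 1400x

# Time for the Cell to read the entire (assumed) 256MB of GDDR3 at that rate:
vram = 256 * MB
print(f"Reading all of VRAM would take {vram / cell_local_read:.0f} s")  # 16 s
```

Which is exactly why the posters above keep stressing that the CPU reading GPU memory is the one path sane engines avoid anyway: everything else in the table runs at GB/s rates.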
 
Anyone care to explain what the whole thing is about in plain English, please?

So far here is my understanding: PS3 CPU bad and rubbish because of the setup... :confused:
 
da_mic_1530 said:
tbh I just want the PS3 out now, so we know whether it's good or bad. I don't want to buy one, but this constant bombardment is even grating on me, and I've slagged it off a bit
Step outside then. If it's grating, why get involved in something that shouldn't EVER affect you?
 
james.miller said:
OK, it just seems to me like he's talking about the Cell reading local memory, not the RSX reading the Cell's memory

I think the problem is the wording on the slide more than anything else. I thought it was the speed at which an SPE could read its own 256KB local store. I think they were probably talking about the GPU at the time in the presentation, hence 'local mem' means the graphics card's memory.

PS. I got it wrong, in my last post, I meant:

*The CPU does not need to read from GPU mem!*

(I got CPU and GPU the wrong way around)
 
Whappers said:
Thanks for that, Kumar. God damn Inquirer are scaremongers :rolleyes: That is the last time I ever take anything the Inquirer says half seriously :rolleyes:

Inquirer is where aspiring Daily Sport writers serve out their apprenticeships.
 
kumar101 said:
Inquirer is where aspiring Daily Sport writers serve out their apprenticeships.

I have always taken most things from the Inquirer with a pinch of salt, but this is the final straw. It'll now be a granule of salt and nothing more :rolleyes:
 
Whappers said:
I have always taken most things from the Inquirer with a pinch of salt, but this is the final straw. It'll now be a granule of salt and nothing more :rolleyes:

If you're gonna take it with less salt in future, it means you trust them more than when you used a pinch of salt doesn't it? :p
 
dirtydog said:
If you're gonna take it with less salt in future, it means you trust them more than when you used a pinch of salt doesn't it? :p

While I see where you are coming from, a grain is less effort to pick up than a pinch - so it depends which way you want to look at it :D

Think this part of the thread is more enlightening than the real subject myself - no offence to OP meant though
 
kumar101 said:

The Inquirer's editor needs a foot up his/her rear end. It is an appalling site where the writers post tenuous facts at best and, to boot, do it with the worst spelling and grammar that I have ever seen.

You would think that on a tech-based news page, fact would be first and foremost in your requirements. I picked an article at random (unsurprisingly by Mr Fuad Abazovic) and immediately found a spelling mistake... who proofreads these things before they are put up?

The Inquirer said:
"DDR 3 should take off at PC 8500, DDR 1066 MHz speed but the spec is jet to be finished so this can easily change. "

OK, rant over :D

Thank you for that article Kumar - I think one statement of note is:

"I'd say PS3 was a challenge to work on," he said, "but every new platform takes a while to get used to. Put it like this, I worked on early PS2 games, and those were a real nightmare - we're getting code up and running on PS3 much faster than we did last time around."

"Once people start doing really impressive stuff on PS3 and Xbox 360, they're both going to be much the same [in terms of difficulty]," he concluded. "Sony's giving us better tools this time around - they're still not great at communicating and there are some weird holes in their developer support, but they've learned a lot of lessons from PS2."
 
FrankJH said:
Think this part of the thread is more enlightening than the real subject myself - no offence to OP meant though

lol no offence taken... I just post articles that I see may create discussion or interest... as this one has. The article slamming The Inq's claims comes as no surprise... I think The Inq has started to lose its grip on reality of late (more so than usual).
 
Richdog said:
lol no offence taken... I just post articles that I see may create discussion or interest... as this one has. The article slamming The Inq's claims comes as no surprise... I think The Inq has started to lose its grip on reality of late (more so than usual).

Richdog, why are you posting this Inq rubbish anyway? You bought a ** screen yet, mate?

If not, I want you to shell out £425 on a super-duper PS3 and a Benq FP241W with HDMI/1080p for another £900.
 
The most irritating thing about all this is sites like Bit-Tech, who gleefully posted an article stating the hardware might be hobbled, but as soon as credible info came out to debunk the claims, there was nothing. I'll be e-mailing them tonight from my non-work address to ask why they were so keen to spread rubbish but apparently unwilling to post a follow-up article that explained the reality.
 