Nvidia attacks PS4 hardware, calls it "low-end"

It was basically an R600 type of core with the performance of an X1900 GPU. But it's not murky at all. It was the first unified shader GPU to come out and be used commercially.

And for all real-world intents and purposes it was used as a fixed-function GPU - it was a forward-looking design that was too far ahead of its time to be utilised, same as the tessellation unit. So I consider it murky: you can split hairs over it, but in reality it wasn't useful as a programmable shader device.
 
No, as in, your post about it wasn't really about that, but I've noticed that when you're picked up on something, you'll commonly go "no no, what I ACTUALLY meant was this", which is why the other day I said something like "I know I meant something else based on your rebuttal".
 

Yeah, I don't always put what I see in my head into words as eloquently as I'd like, and sometimes present the wrong idea to people who can't see what I see in my head. Not sure why it upsets you so much, but I find it somewhat amusing that it does.
 

You do know what "upsets" means, don't you? Commenting about it isn't indicative of me being upset; you're just reading into it because that's how you want to see it.

It seems like quite a coincidence that you regularly post stuff that isn't really relevant to the subject anyway, and when questioned about it, decide that you actually meant something else, which again is just as irrelevant.

Why you think I'm upset is just bizarre.
 
Dunno, it certainly seems to bother you enough that you feel the need to post in the manner you do in response to it, which is certainly taking it more personally than the majority of people who reply to my posts do. Maybe upset was the wrong word :D but it seems to bother you.
 
Taking it personal? I'm talking about things you've said and the way you react when questioned on it. How could it be anything other than personal when I'm speaking about you? :confused:

Some people might not question you simply because the way you post makes it look like you know what you're talking about, and because they don't know about the subject, they have no reason to question what you've said.
 
Dunno, it certainly seems to bother you enough that you feel the need to post in the manner you do in response to it, which is certainly taking it more personally than the majority of people who reply to my posts do. Maybe upset was the wrong word :D but it seems to bother you.

Seems that way to me :)
 
Spoffle, you really think that Sony didn't go through some kind of tendering process/negotiation with NVidia and instead only went to ATi?

Only an idiot would do this and not literally play each vendor off against the other on price and spec.

I work in B2B IT, I work on tenders amongst other things, and I know how these types of business work.
 
Taking it personal? I'm talking about things you've said and the way you react when questioned on it. How could it be anything other than personal when I'm speaking about you? :confused:

Some people might not question you simply because the way you post makes it look like you know what you're talking about, and because they don't know about the subject, they have no reason to question what you've said.

Why is that a problem for you? There's almost always someone who knows more about a subject. I almost always have a good understanding of anything I talk about, but a lot of it comes from self-learning, so I might not have the same understanding as someone with a formal education in the subject. That not unusually leads to situations where what I actually mean is right but the way I'm putting it isn't, and clumsy use of the terminology through lack of formal training leads people better qualified in the subject to misunderstand what I'm actually saying. Most don't make a deal of it and post something constructive I can learn from; a few, like yourself, are for some reason bothered by it, and it ends up littering the forum with rubbish posts.

PS despite what you're trying to imply, I rarely post about something I don't have a reasonable level of actual experience with, e.g. here are a couple of things I coded to gain experience of programming physics:

http://www.youtube.com/watch?v=MJfj_mnoAwg
http://www.youtube.com/watch?v=XSPm5d8GwLI

I've also built my own DirectX 7 based game engine from scratch back in the day, and there's a map I created for COD4: http://forum.i3d.net/modifications-cod4/25564-cod4-map-woddrr.html - I've also got maps and other work included in id Software titles, e.g. Sidrial, and a ton more, but I'll wait for the next time you accuse me of not knowing what I'm talking about before elaborating.
 
nVidia have various ARM-based SoCs on tap and the ability to scale up custom versions; the decision to focus on x86 might have made nVidia a less attractive choice though.

Maxwell is actually closer in design to the hardware in a console than a traditional GPU as well so they aren't entirely lacking in ability to develop that kind of hardware for a console.

EDIT: See http://en.wikipedia.org/wiki/Project_Denver


So, is it true that Project Denver is out this year then??
 
I do think we need to consider timescales here too. Sony for whatever reasons wants to launch the PS4 this year.

Jaguar is already entering production now, and moreover there is another important factor here: it is already made on the same 28nm TSMC process as the AMD and Nvidia GPUs. This probably means Sony knows it will be a credible design to make on the TSMC 28nm process as part of an SoC, so there is less risk for them. Even if PD were a potential design to be used, it would have to be made to work on a very different process node.

Edit!!

Poor yields on the Cell BE used in the PS3 were one of the reasons it cost so much:

http://www.dailytech.com/IBM+Says+Its+Lucky+to+Get+10+to+20+Yields+on+Cell+Processor/article3295.htm
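
To put rough numbers on that (purely illustrative: the standard Poisson yield model, a made-up defect density picked to land in the 10-20% band the article mentions, and only an approximate die size for Cell):

[code]
# Rough sketch only: standard Poisson yield model with an assumed defect density.
# Cell at 90nm was roughly 220-235 mm²; the defect density here is made up.
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies expected to come out defect-free."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

assumed_d0 = 0.008  # defects per mm² (assumed, chosen to give ~10-20% yield)
for area in (230, 100):
    print(f"{area} mm² die -> ~{poisson_yield(area, assumed_d0) * 100:.0f}% yield")
# ~16% for a Cell-sized die vs ~45% for a 100 mm² die at the same defect density:
# the bigger the die, the more of each wafer you throw away.
[/code]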
 
Spoffle, you really think that Sony didn't go through some kind of tendering process/negotiation with NVidia and instead only went to ATi?

Only an idiot would do this and not literally play each vendor off against the other on price and spec.

I work in B2B IT, I work on tenders amongst other things, and I know how these types of business work.

Who's ATi? :p

I'm not disputing that; I also know the process. The real question is, if there were realistically only two companies that could do what you needed, would you really go through the tendering process if one of them was constantly causing you problems and costing you money? Personally, I don't buy that they would, because this isn't a typical situation.

Why is that a problem for you? There's almost always someone who knows more about a subject. I almost always have a good understanding of anything I talk about, but a lot of it comes from self-learning, so I might not have the same understanding as someone with a formal education in the subject. That not unusually leads to situations where what I actually mean is right but the way I'm putting it isn't, and clumsy use of the terminology through lack of formal training leads people better qualified in the subject to misunderstand what I'm actually saying. Most don't make a deal of it and post something constructive I can learn from; a few, like yourself, are for some reason bothered by it, and it ends up littering the forum with rubbish posts.

PS despite what you're trying to imply, I rarely post about something I don't have a reasonable level of actual experience with, e.g. here are a couple of things I coded to gain experience of programming physics:

http://www.youtube.com/watch?v=MJfj_mnoAwg
http://www.youtube.com/watch?v=XSPm5d8GwLI

I've also built my own DirectX 7 based game engine from scratch back in the day, and there's a map I created for COD4: http://forum.i3d.net/modifications-cod4/25564-cod4-map-woddrr.html - I've also got maps and other work included in id Software titles, e.g. Sidrial, and a ton more, but I'll wait for the next time you accuse me of not knowing what I'm talking about before elaborating.

It's not so much that it's a problem; I find it bizarre. I'm not disputing that you actually have experience in the topics you talk about - I think it's actually obvious that you do.

It's the fact that you have experience that I find so bizarre, because despite your experience you come out with a lot of what, to be blunt about it, looks to be rubbish - the type of stuff that people who don't have a clue would say.

So basically, because you have the experience, you should know better.

I do think we need to consider timescales here too. Sony for whatever reasons wants to launch the PS4 this year.

Jaguar is already entering production now, and moreover there is another important factor here: it is already made on the same 28nm TSMC process as the AMD and Nvidia GPUs. This probably means Sony knows it will be a credible design to make on the TSMC 28nm process as part of an SoC, so there is less risk for them. Even if PD were a potential design to be used, it would have to be made to work on a very different process node.

Edit!!

Poor yields on the Cell BE used in the PS3 were one of the reasons it cost so much:

http://www.dailytech.com/IBM+Says+Its+Lucky+to+Get+10+to+20+Yields+on+Cell+Processor/article3295.htm

I was going to touch on this too before you posted it.

Sony aren't going with any custom chips here; they have literally chosen off-the-shelf parts, something they wouldn't be able to do with nVidia, because nVidia's 660/660 Ti, which competes with the 7850/7870, is based on GK104, a ~300mm² part, whereas the 7850/70 are a ~200mm² part.

Clearly a 300mm² GPU isn't ideal to have in a console, and for nVidia to supply a 200mm² GPU with enough performance would likely have required a custom chip, as I can't see them having any interest in GK107; the performance wouldn't be there without a big bump to clock speeds.
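
Back-of-envelope on the die sizes (rough figures only, ignoring yield, scribe lines and edge loss entirely):

[code]
# Why a ~300 mm² GPU is a harder sell for a console than a ~200 mm² one:
# candidate dies per 300 mm wafer, ignoring yield and edge loss.
import math

WAFER_DIAMETER_MM = 300

def candidate_dies(die_area_mm2):
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

bigger = candidate_dies(300)   # roughly GK104-sized
smaller = candidate_dies(200)  # roughly Pitcairn (7850/7870)-sized
print(bigger, smaller, round(smaller / bigger, 2))
# ~235 vs ~353 candidate dies, i.e. ~1.5x more chips per wafer from the smaller part,
# before larger dies take a bigger yield hit on top.
[/code]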
 
Personally I think the GPU is under-powered; it's great if you look at it from the perspective of 720p and 30fps, but quite a lot more limited if you start talking 1080p or even beyond, 60fps and some real AA in future titles.
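
Rough pixel-rate arithmetic behind that (raw resolution and frame rate only, ignoring AA, overdraw and everything else):

[code]
# Raw pixels per second at each target; the multipliers are the point.
px_720p30 = 1280 * 720 * 30       # ~27.6 million pixels/s
px_1080p30 = 1920 * 1080 * 30     # ~62.2 million pixels/s
px_1080p60 = 1920 * 1080 * 60     # ~124.4 million pixels/s

print(px_1080p30 / px_720p30)     # 2.25x the pixel rate of 720p/30
print(px_1080p60 / px_720p30)     # 4.5x the pixel rate of 720p/30
# Add proper AA on top and the multiplier grows again.
[/code]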

You do realise Killzone 4 is running at 1080P 30FPS and is a launch title :confused:

Being a launch title, not even running on full-spec hardware, and considering it takes a few years to get the most out of the hardware, it's looking pretty good going forward.
 
It's the fact that you have experience that I find so bizarre, because despite your experience you come out with a lot of what, to be blunt about it, looks to be rubbish - the type of stuff that people who don't have a clue would say.

So basically, because you have the experience, you should know better.

As I mentioned, a lot of it comes from the nature of my experience with it. For example, someone pulled me up on the comment about "scraping the API" - what I actually meant was correct, but scraping, correctly used, usually refers to parsing human-readable data from an application, and an API is for computer programs to interface with each other, so what I actually said, through lack of structured training in the subject, sounded idiotic even though the concept I was actually referring to made sense.
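
To show the distinction I was getting at (made-up example data, nothing to do with any real site or API):

[code]
# "Scraping": parsing data that was formatted for humans (HTML, screen output).
# An API: the same data handed over in a form meant for programs (e.g. JSON).
import json
import re

# Scraping - dig the value out of human-readable markup.
html = "<div class='price'>GPU price: <b>$199</b></div>"
scraped_price = re.search(r"\$(\d+)", html).group(1)

# API - the same information arrives already structured.
api_response = '{"product": "GPU", "price_usd": 199}'
api_price = json.loads(api_response)["price_usd"]

print(scraped_price, api_price)  # same data, very different ways of getting at it
[/code]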
 
You do realise Killzone 4 is running at 1080P 30FPS and is a launch title :confused:

Being a launch title, not even running on full-spec hardware, and considering it takes a few years to get the most out of the hardware, it's looking pretty good going forward.

If developers started to use hardware physics and GPU-compute-based processing for advanced lighting effects like real-time global illumination and so on, though, IMO it's going to end up being less than adequate a lot quicker than it should. After seeing TressFX in action, hopefully people are starting to realise that the CPU alone isn't sufficient for proper physics implementations beyond the primitive functions (simple rigid/animated bodies, basic ragdolls, etc.). Sure, you can throw a few hundred thousand boxes around and have it all running smooth, but that doesn't represent the workload of more advanced physics effects.
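
Back-of-envelope on the kind of workload I mean (every number below is my own rough assumption, just to show the scale of something TressFX-like versus throwing rigid boxes around):

[code]
# Rough scale of a TressFX-style hair sim; all figures are assumptions.
strands = 20_000            # simulated hair strands (assumed)
verts_per_strand = 16       # vertices per strand (assumed)
constraint_iters = 4        # length/shape constraint passes per step (assumed)
fps = 60

constraints_per_frame = strands * (verts_per_strand - 1) * constraint_iters
per_second = constraints_per_frame * fps
print(f"{constraints_per_frame:,} constraint solves per frame, {per_second:,} per second")
# ~1.2 million per frame, ~72 million per second - before collisions or shading.
# Embarrassingly parallel, so a natural fit for GPU compute, but a lot to ask of a few CPU cores.
[/code]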
 
With the release of 4K TVs very soon, a PS4 and Xbox 720 will look pretty shoddy on one of those IMO.

Release, maybe. Adoption? LOL, no.

The takeup of 1080p screens is probably still nowhere near 50%. Lots have SD still and then you have all the "HD Ready" screens that were bought because people didn't understand the difference between "HD Ready" and 1080p.

4k is a pipe dream.
 
"Nvidia-the way its meant to be baked!"

AMD response:

"AMD -baking evolved!"

Heh, some people were baking their RSX to temporarily sort the YLOD.

[animated GIF: Untitled-1_zps2045fc1f.gif]

LoL! What was that originally from? Seen this before.... it was downright creepy.

The vast majority of PC gamers know absolutely nothing whatsoever about the chips and companies they discuss but become experts at a moment's notice.

I've noticed it for years, more so now: some have very big egos and loud mouths.

 