
PowerVR demonstrates 2- to 16-core GPU chip for mobile devices!

Fair enough, but I still think Nvidia/AMD will come up with something similar before anything like what you suggest would happen.



I agree, and if this chip is so good, Intel/AMD could easily incorporate something like it into CPUs, as the wattage would be so low.

I still honestly can't see mobiles becoming as powerful as desktops, though; closer, sure.

Time will tell though :)
 
The Kyro was destroyed in its day by the GeForce 2 MX; only at high resolutions and in certain games did it come close. Most reviewers thought the tech was very promising, but decent drivers were never written for it. Also, ATI has been using tile-based rendering for many years, so it's nothing that will make PowerVR's technology advantageous :|
 
metalmackey said "I agree, and if this chip is so good, Intel/AMD could easily incorporate something like it into CPUs, as the wattage would be so low."
After a night's sleep I remembered that Intel has taken out a licence and started working on this for the desktop, though it's unclear which generation of chip they are using.

As for those thinking PowerVR's tile-based rendering is the same as ATI/AMD's: you need to do more research. The architectures are vastly different. ATI/AMD do not do hardware tile-based rendering; a few tricks in software do not make your card a tile-based card or give it many of the advantages, and putting two non-tile-based cards together does not make a real tile-based card. PowerVR still has many advantages.
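To make the distinction concrete, here is a toy Python sketch of the binning step that hardware tile-based renderers perform: geometry is sorted into screen-space tiles up front, so each tile can later be shaded entirely in fast on-chip memory. The tile size and triangle setup here are purely illustrative, not PowerVR's actual scheme.

```python
# Toy illustration of tile-based rendering: bin triangles into screen
# tiles first, then shade one tile at a time from fast on-chip memory.
# An immediate-mode GPU instead shades triangles in submission order,
# touching external framebuffer memory throughout the frame.

TILE = 32  # tile size in pixels (real hardware varies)

def bin_triangles(triangles, width, height):
    """Assign each triangle to every tile its bounding box overlaps."""
    cols = (width + TILE - 1) // TILE
    rows = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for tx in range(cols) for ty in range(rows)}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        for tx in range(min(xs) // TILE, max(xs) // TILE + 1):
            for ty in range(min(ys) // TILE, max(ys) // TILE + 1):
                if (tx, ty) in bins:
                    bins[(tx, ty)].append(tri)
    return bins

tris = [[(5, 5), (60, 10), (10, 60)],          # spans several tiles
        [(100, 100), (110, 100), (100, 110)]]  # fits in one tile
bins = bin_triangles(tris, 128, 128)
print(sum(1 for b in bins.values() if b))  # number of tiles with work queued
```

The point of the up-front binning is that once a tile's list is complete, hidden surfaces can be rejected and the whole tile shaded without round trips to external memory, which is where the power saving comes from.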

As for the Kyro 1 & 2, it was not destroyed by the MX, and even if it was just under MX speed, that proves my point: a low-clocked Kyro can match a higher-clocked MX.
 
Thanks for informing me, I wasn't aware of that. All I could find were references to two non-tile-based cards acting as tile-based via software in CrossFire.

Still, I don't understand AMD's thinking. AMD list the benefits of tile-based rendering and use it in mobiles and consoles, but not in desktops. Then they sell this efficient architecture so they no longer have access to it. What's the idea behind all that?
 
No, AMD were in deep trouble and needed to get something back from the ATi purchase. They had the mobile department on sale for about a year before Qualcomm finally bought it.
Snapdragon was still being finalised at that time, so Qualcomm would have licensed the IP anyway, a great deal for them. AMD just didn't have the foresight to see where the market was headed (all the bloggers at the time tagged Snapdragon as vapourware :D)

The ATi Z460 (essentially a cut-down Xbox 360 Xenos GPU) became the Adreno 200, and in 2012/2013 it'll reach 360/PS3 performance levels with the Adreno 330 :eek:

With the ULP GeForce (and nVidia paying devs off :D), ARM's unique tri-pipe architecture and multicore Mali, PowerVR SGX and Adreno, things are getting very interesting... (there are a few others too)
 
So the Adreno 330 will be as powerful as an Xbox 360 in 2012. That's still 6 to 7 years behind a high-end desktop GPU, and as said before, no one knows how fast mobile GPUs can go before they hit power limits etc.
 
Prebaked.

A lot of stuff is prebaked, but you can only do that in very limited games. You can get away with that kind of thing in COD and the like as well, since you wouldn't really notice. However, get into more complex games like Oblivion or Stalker and try to do prebaked shadows, and it becomes a joke.

This from someone who went on, and on, and on about the immersion of incredibly basic (and not very good) PhysX effects and how they'll change gaming by improving immersion, while dismissing things we've had for over a decade now: real-time lighting, real-time shadows, and day/night cycles in games where they would matter. In COD it wouldn't matter, as you don't stand in a level and watch the time of day change; you're playing a short level set at a specified time. In other games you just couldn't prebake everything.

This is where ultra-accelerated but ultra-LIMITED mobile graphics will fall down, and why a top-end GPU uses 200 W+: every effect you add not only needs more power but also forces you to redo other effects so they all work together.

I.e., you can't add a realistic night/day cycle to that game, because ultimately the shadows wouldn't change. You'd need more power to do realistic lighting, and the massive amount of texture data required to have a prebaked version of everything, for every possible lighting position, is not feasible. So you have to remove prebaked shadows/lighting and add real-time ones instead, and that increases complexity massively.
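Some rough back-of-envelope arithmetic (all numbers invented purely for illustration, not taken from any real game) shows why prebaking lighting for every sun position blows up storage:

```python
# Illustrative arithmetic: storage cost of prebaking lightmaps for a
# day/night cycle.  Both numbers below are made up for the example.
lightmap_mb   = 64   # lightmap set for ONE fixed sun position
sun_positions = 48   # e.g. one bake per 30 minutes of game time

baked_total = lightmap_mb * sun_positions
print(f"{baked_total} MB of lightmaps")  # clearly impractical on a phone
```

Even with these modest made-up figures the baked data runs to gigabytes, which is why a moving sun more or less forces real-time lighting and the GPU power that comes with it.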

We're a decade or more away from Crysis type performance/effects on a mobile chip that can run under 2W.


Yet again Pottsey is comparing apples to... not even oranges; it's not a fruit, it's not even from the same solar system.
 
No, AMD were in deep trouble and needed to get something back from the ATi purchase. They had the mobile department on sale for about a year before Qualcomm finally bought it.
Snapdragon was still being finalised at that time, so Qualcomm would have licensed the IP anyway, a great deal for them. AMD just didn't have the foresight to see where the market was headed (all the bloggers at the time tagged Snapdragon as vapourware :D)

Actually, I'd suggest one of the reasons Meyer was fired was the Qualcomm deal. That's a tad harsh, as the board would have approved the deal; however, if he gave them an incorrect outlook of the market, that's a problem.

At the point the Qualcomm sale happened, AMD weren't really in that much trouble any more. I'm fairly sure this was after the GloFo/AMD split, which essentially guaranteed AMD's future. The people that own GloFo own 10-15% of AMD, and AMD are still their most important partner by a country mile. I was going to say GloFo would be in serious trouble without them, but that's not true: they could run 10 top-end fabs with no customers for 20 years before the cost would even make them notice :p GloFo are owned by people who are collectively trillionaires, multiple times over.

However, no one likes to throw away money, and AMD are a very important partner. $65 million, compared to the roughly $6 billion spent to buy ATi, or compared to existing debt, is basically nothing.

It was seemingly a very bad move, as they had many options at that point. Frankly, I think there would have been no issue getting a loan from GloFo, or selling half of what they sold, to ATIC (who own GloFo and part of AMD already), for the same price while keeping control of it, which would have worked out well for both.

Of course $65 million seems like not much, but it's really the cost of keeping the R&D teams going that's the real expense in designing chips; the team might have cost $500 million a year to run.

It's very hard to judge if they made the right call at that point.
 
drunkenmaster said "This is where ultra-accelerated but ultra-LIMITED mobile graphics will fall down, and why a top-end GPU uses 200 W+: every effect you add not only needs more power but also forces you to redo other effects so they all work together."
http://www.youtube.com/watch?v=9ssd4P0bgSM
http://www.youtube.com/watch?v=sy2hXtUN2t0&feature=related
http://www.youtube.com/watch?v=VC2ul99iRkk
Make sure you watch the above in HD.

Yes, mobiles really fall down at all those effects :) Epic Citadel has full bump offset mapping (AKA parallax mapping), Unreal's global illumination providing realistic lighting and shadows, dynamic specular lighting with texture masks, real-time reflections and animation, and many more features. If that is somehow not good enough, most of those effects will be far beyond what rasterisation-based desktop cards can do once we get ray tracing.
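As a rough illustration of the bump offset (parallax) mapping mentioned above, here is a minimal Python sketch of the texture-coordinate shift it performs. The scale value and vectors are made up for the example, and a real implementation runs per fragment in a GPU shader, not on the CPU.

```python
# Minimal sketch of bump offset (parallax) mapping: shift the texture
# lookup along the view direction by an amount taken from a height map,
# so flat geometry appears to have depth.  Illustrative values only.

def parallax_uv(uv, view_dir, height, scale=0.05):
    """Offset texture coordinates by the sampled height along the view ray.
    view_dir is in tangent space: (x, y, z) with z out of the surface."""
    u, v = uv
    vx, vy, vz = view_dir
    u += vx / vz * height * scale
    v += vy / vz * height * scale
    return (u, v)

# A grazing view shifts the lookup more than a head-on view:
head_on = parallax_uv((0.5, 0.5), (0.0, 0.0, 1.0), height=1.0)
grazing = parallax_uv((0.5, 0.5), (0.7, 0.0, 0.3), height=1.0)
print(head_on, grazing)
```

Because it is only a per-pixel texture-coordinate trick, it is cheap enough for mobile GPUs, which is part of why it shows up in demos like Epic Citadel.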

Anyway, a realistic night/day cycle, shadows and realistic lighting all look way, and I do mean way, better via ray tracing than via rasterisation on desktop cards.

All those people saying mobiles are 16 years behind, or that their N64 could do mobile graphics better than PowerVR, should really go watch the above videos in HD. I like the middle one best.



drunkenmaster said "We're a decade or more away from Crysis type performance/effects on a mobile chip that can run under 2W."
If the 4-core chip comes out on time this year, you do realise, don't you, that we are only about half a year away from getting a GeForce 8600 desktop card's worth of power in mobiles. If I am not mistaken, the 8600 can run Crysis.

An 8-core PowerVR SGX543MP8 chip at a low 400 MHz would deliver around 532 million polygons and 16 billion pixels per second, which is around GeForce GTX 260-216 level. Mobile tech is catching up fast. Now take that power and add real-time ray tracing alongside rasterisation graphics.
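Taking those quoted figures at face value (they are the poster's numbers, not verified specs), a quick arithmetic sanity check shows the per-core, per-clock rates they imply:

```python
# Back-derive per-core, per-clock rates from the quoted totals:
# 532 Mpolys/s and 16 Gpix/s for an 8-core part at 400 MHz.
cores = 8
clocks_per_sec = 400e6  # 400 MHz

polys_per_clock_per_core  = 532e6 / (cores * clocks_per_sec)
pixels_per_clock_per_core = 16e9  / (cores * clocks_per_sec)

# Roughly one triangle every ~6 clocks and 5 pixels per clock per core.
print(polys_per_clock_per_core, pixels_per_clock_per_core)
```

Those implied rates are at least in the range of plausibility for a GPU pipeline, though fill-rate and polygon figures alone say little about real game performance compared with a desktop card.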

EDIT: It's interesting to see how far mobiles have advanced just in the past 6 months.
 
A lot of stuff is prebaked, but you can only do that in very limited games. You can get away with that kind of thing in COD and the like as well, since you wouldn't really notice. However, get into more complex games like Oblivion or Stalker and try to do prebaked shadows, and it becomes a joke.

Quoted for truth.

Anyone who has played a properly immersive PC game will appreciate why this mobile tech is so far off from its desktop counterparts. The whole gameplay experience in games such as STALKER or Crysis would fall apart if it weren't for the array of dynamic effects in the game. These effects require raw computing horsepower, something these mobile chips simply cannot provide.

Pretty much every one of these 'impressive' 3D games that have come out on mobile devices is extremely static and plays more like a rollercoaster ride than an interactive, simulated gaming world.
 
Yes, mobiles really fall down at all those effects :) Epic Citadel has full bump offset mapping (AKA parallax mapping), Unreal's global illumination providing realistic lighting and shadows, dynamic specular lighting with texture masks, real-time reflections and animation, and many more features. If that is somehow not good enough, most of those effects will be far beyond what rasterisation-based desktop cards can do once we get ray tracing.

Anyway, a realistic night/day cycle, shadows and realistic lighting all look way, and I do mean way, better via ray tracing than via rasterisation on desktop cards.

All those people saying mobiles are 16 years behind, or that their N64 could do mobile graphics better than PowerVR, should really go watch the above videos in HD. I like the middle one best.



drunkenmaster said "We're a decade or more away from Crysis type performance/effects on a mobile chip that can run under 2W."
If the 4-core chip comes out on time this year, you do realise, don't you, that we are only about half a year away from getting a GeForce 8600 desktop card's worth of power in mobiles. If I am not mistaken, the 8600 can run Crysis.

An 8-core PowerVR SGX543MP8 chip at a low 400 MHz would deliver around 532 million polygons and 16 billion pixels per second, which is around GeForce GTX 260-216 level. Mobile tech is catching up fast. Now take that power and add real-time ray tracing alongside rasterisation graphics.

I watched all the vids and tried the Citadel game myself on an iPhone, and apart from DX8-level bump mapping and reflections I saw nothing dynamic at all. As I said before, all the complex shadows and lighting effects are completely static, i.e. prebaked.

Also, an 8600 can indeed run Crysis, but it looks like this:

[Image: crysis_low.jpg]


LOL

If you honestly think we're close to this (screenshot below) on a mobile phone, you need to get your eyes checked:

[Image: crysis_both_03.jpg]
 
You guys should forget the iPhone 4; it's now just a mid-range gaming device (the SGS has well over twice the graphics power ;))

Games like Galaxy on Fire 2 feature four times the texture size and six times the polygon count on their spaceships compared with the iOS version :p

 
drunkenmaster said "This is where ultra-accelerated but ultra-LIMITED mobile graphics will fall down, and why a top-end GPU uses 200 W+: every effect you add not only needs more power but also forces you to redo other effects so they all work together."
http://www.youtube.com/watch?v=9ssd4P0bgSM
http://www.youtube.com/watch?v=sy2hXtUN2t0&feature=related
http://www.youtube.com/watch?v=VC2ul99iRkk
Make sure you watch the above in HD.

Yes, mobiles really fall down at all those effects :) Epic Citadel has full bump offset mapping (AKA parallax mapping), Unreal's global illumination providing realistic lighting and shadows, dynamic specular lighting with texture masks, real-time reflections and animation, and many more features. If that is somehow not good enough, most of those effects will be far beyond what rasterisation-based desktop cards can do once we get ray tracing.

^^
Pretty much every one of these 'impressive' 3D games that have come out on mobile devices is extremely static and plays more like a rollercoaster ride than an interactive, simulated gaming world.

^^ This. Also, I did not see any real-time lighting or shadows in the vids above.
 
I watched all the vids and tried the Citadel game myself on an iPhone, and apart from DX8-level bump mapping and reflections I saw nothing dynamic at all. As I said before, all the complex shadows and lighting effects are completely static, i.e. prebaked.

Also, an 8600 can indeed run Crysis, but it looks like this:

[Image: crysis_low.jpg]


LOL

Wow, it's like I'm almost there!!! :D
 
Well, we shouldn't have long to wait. If I am right, on the 27th of January we should see mobile graphics chips over 8 times more powerful than the iPhone 4, perhaps even as much as 10 times more powerful. That will be yet another giant leap towards Crysis on mobiles.
 