PowerVR demonstrate x2 to x16 core GPU chip for mobile devices!

It should also be noted that iPhone Rage has literally zero dynamic lighting. Everything is baked into lightmaps by the looks of it, which is 20th century tech. Additionally, iPhone Rage is an 'on-rails' shooter: the player has no control over movement and limited control over viewpoint, which allows some big, big optimisations that you can't do in a free-roaming game. In short, to compare what it's doing to a normal PC shooter is a joke.
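
For readers unfamiliar with the distinction, here is a minimal, purely illustrative Python sketch (not Rage's actual shading code) of why baked lightmaps are so cheap at runtime: the baked path is a single lookup into precomputed data, while the dynamic path has to re-evaluate the light for every pixel, every frame.

```python
# Illustrative sketch only -- not Rage's actual shading code.
# Shows why a baked lightmap is cheap: shading becomes one lookup,
# with all light transport pre-computed offline.

import math

# A tiny "lightmap" pre-baked offline: brightness stored per texel.
LIGHTMAP = {
    (0, 0): 0.9, (1, 0): 0.7,
    (0, 1): 0.4, (1, 1): 0.2,
}

def shade_baked(texel):
    """Runtime cost: a single lookup, regardless of scene lighting."""
    return LIGHTMAP[texel]

def shade_dynamic(surface_pos, surface_normal, light_pos, light_intensity):
    """Runtime cost: re-evaluate the light every frame for every pixel
    (a simple Lambert term here; real engines do far more)."""
    to_light = tuple(l - p for l, p in zip(light_pos, surface_pos))
    dist = math.sqrt(sum(c * c for c in to_light))
    to_light = tuple(c / dist for c in to_light)
    n_dot_l = max(0.0, sum(n * l for n, l in zip(surface_normal, to_light)))
    return light_intensity * n_dot_l / (dist * dist)

print(shade_baked((0, 0)))                                   # static result
print(shade_dynamic((0, 0, 0), (0, 0, 1), (1, 1, 2), 5.0))   # moves with the light
```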

I don't know why you're so determined that mobile graphics should surpass desktop GPUs, but it's pure horse****. Mobile graphics are where they always have been (maybe slightly ahead historically, since there is now more investment), which is many years behind console tech and even further behind PCs.
 
Did type out a big explanation but can't be arsed; basically any gaming on phones and tablets is CRAP, so PowerVR for gamers is completely irrelevant except for FarmVille atm.
And yes I have one in all the iPhones round the house. Crashes more than my desktop...

You typed out a big explanation but you could not be arsed to hit submit?

Very odd.
 
Baboonanza said “I don't know why you're so determined that mobile graphics should surpass desktop GPUs, but it's pure horse****. Mobile graphics are where they always have been (maybe slightly ahead historically, since there is now more investment), which is many years behind console tech and even further behind PCs.”
It's not pure horse****. You and a lot of people don't seem to realise the world is changing. Give me one good reason why a mobile real-time ray tracing chip cannot look better than a desktop card without real-time ray tracing. Just because something has always been ahead does not mean it will always be ahead. PDAs used to always be ahead of phones; now look at how far behind PDAs are.


It's a perfectly reasonable possibility that mobile chips, at the current rate of expansion, can overtake desktops. Just look at the upcoming technology and what's being developed. I am not saying it's going to happen 100%, but it is now a very real possibility. Anyone saying it flat out cannot happen doesn't understand the technology in development. The improvements between mobile generations are vastly bigger than the improvements between PC generations.

More and more investment is being put into the mobile market and it's predicted to vastly overtake PCs. Year after year, mobile markets have seen bigger improvements between generations than PC markets.

It is widely believed the mobile market will outship the desktop market by 2012 and shortly after be twice the size and still growing. Once a company's main source of income is in the mobile market and more R&D is put into mobile chips than desktop ones, it's not hard to imagine mobile chips overtaking desktops. That's why I am investing in mobile-based companies. The future is mobile. Desktop PCs are not going to die out, but I strongly believe they will become the secondary market, dwarfed by the mobile market.

Real-time ray tracing is a prime example. It's looking increasingly likely that mobiles will get that before desktops. If that happens, desktops could well fall behind graphically. How can you not see that?

As some people don't get it, please take note: I said might, not will, overtake desktops.
 
Mobile chips will never overtake desktops.

If, somehow, a mobile chip was faster than a desktop chip they'd simply run the same chip in a desktop machine but with faster clocks due to better heat dissipation.
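
That heat-dissipation point follows from the standard CMOS dynamic-power relation P ≈ C · V² · f: the same silicon under a desktop cooler can take a higher clock and voltage. A rough back-of-the-envelope sketch, with purely illustrative capacitance, voltage and clock figures (not measurements of any real chip):

```python
# Back-of-the-envelope sketch of the standard CMOS dynamic-power relation
# P ~ C * V^2 * f. The capacitance, voltage and clock figures below are
# purely illustrative, not measurements of any real chip.

def dynamic_power(c_eff, voltage, freq_hz):
    return c_eff * voltage ** 2 * freq_hz

C_EFF = 1e-9          # effective switched capacitance (assumed, farads)

# Same silicon, two thermal envelopes.
mobile = dynamic_power(C_EFF, 0.9, 500e6)    # ~0.9 V, 500 MHz
desktop = dynamic_power(C_EFF, 1.2, 1500e6)  # ~1.2 V, 1.5 GHz

print(f"mobile budget : {mobile:.2f} W")
print(f"desktop budget: {desktop:.2f} W")
print(f"power ratio   : {desktop / mobile:.1f}x for a 3x clock bump")
```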

You're living in a dream world sadly.
 
The problem with your argument, Pottsey, is that it goes against all logic and common intuition.

For a tiny device such as a mobile phone to overtake something as big as a PC in terms of computing power would mean that CPU/GPU manufacturers for the PC have failed miserably in terms of architecture design and power efficiency (something that they currently are not doing). In fact, the level of failure required for such a feat would imply that either the high-end CPU/GPU market has long been forgotten, or that the definition of terms such as "mobile" and "desktop" no longer holds any relevance, in which case your argument is already invalid.
 
titaniumx3 said “The problem with your argument, Pottsey, is that it goes against all logic and common intuition.”
It might do at first glance, but not once you investigate. PowerVR are vastly more efficient with their architecture. Like I said before, PowerVR's cheap 75MHz card outperformed Nvidia's top-end desktop card at ray tracing by a massive amount.

Nvidia's next-gen 1000MHz chips barely outperform PowerVR's older 100MHz chip, let alone the next gen. PowerVR only need to make a small increase to get usable ray tracing while Nvidia are very far away. Ray tracing is the key: once that arrives, and it looks like it will, then mobiles should be able to look better than desktops. Ray tracing can pull off graphics that desktop cards without ray tracing cannot dream of. That's the reason lots of people call real-time ray tracing the holy grail.



FrenchTart said “Mobile chips will never overtake desktops.
…..
You're living in a dream world sadly. “

PowerVR are not currently in the desktop space, so if they come out with a great mobile ray tracing chip we can end up with amazing graphics in the mobile sector and no ray tracing on the desktop. ATI and Nvidia are currently years behind PowerVR when it comes to ray tracing.

How is it a dream to believe that PowerVR will get ray tracing at usable speeds before the desktop market? All the facts point towards it happening. Do you think PowerVR will not be the first to get usable ray tracing?
 
You keep using this 75MHz chip's raytracing as some magical and brilliant benchmark.

The problem is, how would that 75MHz chip do at gaming? Answer: almost certainly utterly rubbish.

Likewise, aside from the 75MHz figure I don't know much more about it. It could be a 40k-shader card running very slowly that costs $2500 per core to produce.

Likewise, WHAT did it raytrace? Wasn't it a chip highly accelerated for one specific purpose? Have a look at Quick Sync on Sandy Bridge: when you have hardware acceleration it's magnitudes faster than general-purpose hardware programmed in software to do the same job.

What happens if a game has a new feature this fantastic, unavailable, unproduced chip doesn't support? Does fps go from 60fps to 0.00001fps?

Your claim is so random, unquantified, and unknown.

Anyway, back to the current chips: as has been pointed out to you many, many times, the PC versions of the engines these chips run so well are so insanely more powerful than the mobile versions that you can't compare them at all.

I think you'd almost certainly find that if those mobile engines could be run on current AMD/Nvidia cards they'd be many, many times faster.

Why do people insist on ignoring logic? A basic tablet/smartphone ARM chip has power advantages over desktop chips because it is designed with low power as the PRIMARY goal.

The big problem is what most people compare: look at this 2W chip running an OS that's comparable to Windows and doing the basic things it can, then compare it to how a 125W AMD/Intel chip runs similar stuff like a browser. That's all well and good, but the actual power being used to run a browser on the AMD/Intel chip is NOT 125W, it's just not. It will use 125W to do the most demanding thing it can; actually running Firefox, say, will only use a few watts on an uber-powerful chip.

A smartphone/tablet chip is magnitudes slower than a desktop chip, and the same goes for GPUs.

A sub 5W mobile chip will never, ever, ever, ever come anywhere near close to the capabilities of a 300W desktop GPU.

A 10W piece of silicon inside an i5, the transcoding block, is many times faster than a 300W GPU, because it's got one function; that's how life works. Gaming is not a fixed-function process; there's only so much you can accelerate with dedicated hardware in the GPU.
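
A software analogy of that fixed-function vs programmable point (nothing more than an analogy): a hard-wired routine does exactly one job directly, while a general-purpose interpreter pays decode overhead to do the same job but can be retargeted to anything. The program and data below are made up purely for illustration.

```python
# Software analogy for the fixed-function vs programmable argument, nothing
# more: a hard-wired routine does exactly one job directly, while a general
# "programmable" interpreter pays overhead to do the same job -- but can be
# retargeted to any other job.

import timeit

def fixed_function(data):
    # "Dedicated hardware": one operation, wired in, no flexibility.
    return [x * 3 + 1 for x in data]

def run_program(program, data):
    # "Programmable hardware": a tiny interpreter that can run any program,
    # at the cost of decoding every step.
    out = []
    for x in data:
        for op, arg in program:
            if op == "mul":
                x *= arg
            elif op == "add":
                x += arg
        out.append(x)
    return out

PROGRAM = [("mul", 3), ("add", 1)]
DATA = list(range(10_000))

assert fixed_function(DATA) == run_program(PROGRAM, DATA)
print("fixed :", timeit.timeit(lambda: fixed_function(DATA), number=100))
print("interp:", timeit.timeit(lambda: run_program(PROGRAM, DATA), number=100))
```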

It's pretty simple: if this 75MHz chip was THAT good, it would be out, making a killing and putting the other guys out of business. When you have a product that far ahead, you don't just turn around and say "those billions of dollars aren't for me, I'd prefer to shelve it".

What you're talking about is a highly, highly specialised, single-function chip that can do one thing very well and can't in any way be released into an industry for gaming that doesn't give a monkeys about raytracing yet.

I've had quite enough of PhysX and Nvidia's "tech demo" games being released that have PhysX and no gameplay.

A PowerVR card that is the slowest card of the past decade for any available game, but can run some soon-to-be-released game paid for by PowerVR really, really well, isn't interesting in the least.
 
Ray tracing can pull off graphics that desktop cards without ray tracing cannot dream of. That's the reason lots of people call real-time ray tracing the holy grail.

Like what ?

Maybe more accurate reflections & crisper shadows, but other than that ?

The 3D graphics industry realised a long time ago that pure raytracing wasn't the holy grail & that's why scanline / hybrid renderers took over (Renderman & all its other variants).

Screen space or horizon-based ambient occlusion looks good enough & is a damn sight faster than Monte Carlo ambient occlusion / radiosity.

Sure, Monte Carlo might be more accurate, but what's the point when an approximation looks almost as good & is much, much quicker? Same with reflections & refraction.
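
To make the cost side of that concrete, here is a toy Monte Carlo ambient-occlusion estimate for a single shading point next to one occluding sphere. The scene and sample counts are invented purely to show how many rays a "ground truth" estimate needs before it settles down, which is exactly the cost that cheaper screen-space approximations avoid.

```python
# Toy Monte Carlo ambient-occlusion estimate for one shading point sitting
# on a ground plane next to a single occluding sphere. Everything here
# (scene, sample counts) is invented purely for illustration.

import math
import random

random.seed(1)

POINT = (0.0, 0.0, 0.0)        # shading point on the plane z = 0
SPHERE_C = (1.0, 0.0, 0.5)     # occluder centre
SPHERE_R = 0.5

def random_hemisphere_dir():
    """Uniform direction on the upper hemisphere (z > 0)."""
    while True:
        d = (random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
        n = math.sqrt(sum(c * c for c in d))
        if n > 1e-9:
            d = tuple(c / n for c in d)
            return d if d[2] > 0 else (-d[0], -d[1], -d[2])

def ray_hits_sphere(origin, direction):
    """Standard ray/sphere intersection test (nearest hit with t > 0)."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_C))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_R ** 2
    disc = b * b - c
    return disc > 0 and -b - math.sqrt(disc) > 1e-6

def mc_occlusion(n_samples):
    hits = sum(ray_hits_sphere(POINT, random_hemisphere_dir())
               for _ in range(n_samples))
    return hits / n_samples

for n in (16, 64, 256, 4096):
    print(f"{n:5d} rays -> occluded fraction ~ {mc_occlusion(n):.3f}")
```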

Have a read of this from 2008. Crytek's CEO talking about why a pure raytracing engine isn't worth it:

http://www.pcper.com/article.php?aid=546

Imho, a hybrid engine that combines rasterisation & ray tracing will no doubt be the best of both worlds, but certainly not a ray tracing only engine.
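
For what it's worth, a purely conceptual sketch of that hybrid idea (no real renderer behind it): most pixels take the cheap rasteriser-style path, and rays are only spent on the few pixels flagged as actually needing them.

```python
# Conceptual sketch of the hybrid rasterisation + ray tracing idea only --
# no real renderer behind it. Most pixels take the cheap rasteriser-style
# path; rays are spent only on pixels flagged as needing them
# (mirror-like surfaces here).

def shade_raster(pixel):
    # Cheap path: direct lighting / baked data, no rays.
    return {"pixel": pixel, "path": "raster", "rays": 0}

def shade_raytraced_reflection(pixel, bounces=1):
    # Expensive path: trace one reflection ray per bounce (stubbed out).
    return {"pixel": pixel, "path": "hybrid", "rays": bounces}

def render(frame):
    out = []
    for pixel, is_mirror in frame:
        if is_mirror:
            out.append(shade_raytraced_reflection(pixel))
        else:
            out.append(shade_raster(pixel))
    return out

# Tiny made-up frame: only 2 of 6 pixels are mirror-like.
frame = [((x, 0), x in (2, 4)) for x in range(6)]
result = render(frame)
print(sum(p["rays"] for p in result), "rays spent on", len(result), "pixels")
```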
 
titaniumx3 said "Sorry but I find your claims absurd, show us some evidence of this 75mhz powervr gpu outperforming Nvidia's top end GPU."
This http://www.youtube.com/user/CausticGraphics#p/u/4/LSwjXDCknpo shows an old developer card running at 75MHz from 2009, and it's over 10x faster than Nvidia's top-end desktop GPUs, which do about 0.6fps on a single GPU. That's a developer card far from final performance, and the newer generations are meant to be much faster; the 2nd gen is meant to be 14x faster.
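
Taking the figures quoted in this thread at face value (they are claims, not verified benchmarks), the implied per-clock gap works out as below; the 1000MHz desktop clock is simply the round number used earlier in the thread.

```python
# Taking the figures quoted in this thread at face value (claims, not
# verified benchmarks): ~0.6 fps on a single desktop GPU vs roughly 10x
# that on the 75 MHz developer board, with a 1000 MHz-class desktop clock
# assumed for comparison.

desktop_fps = 0.6
caustic_fps = 0.6 * 10          # "over x10 faster"
desktop_clock_mhz = 1000        # assumed round figure from the thread
caustic_clock_mhz = 75

throughput_ratio = caustic_fps / desktop_fps
per_clock_ratio = throughput_ratio * (desktop_clock_mhz / caustic_clock_mhz)

print(f"throughput ratio : {throughput_ratio:.0f}x")
print(f"per-clock ratio  : {per_clock_ratio:.0f}x (if the claims hold)")
print(f"claimed 2nd gen  : {throughput_ratio * 14:.0f}x throughput")
```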



drunkenmaster said "The problem is, how would that 75Mhz chip do at gaming, answer, almost certainly utterly rubbish."
Your really don't get it do you? The technology is to be mixed with PowerVR tile based chips. The point is if a slow 75Mhz developer card can performed that well imagine what it's going to do on a full blown none developer PowerVR chip running at full speed. The answer is its going be way faster.

A PowerVR 75MHz chip is about the same as a 700MHz Nvidia chip. All they need to do is bump the speed up to, say, 150 to 200MHz, add in the ray tracing parts, and you have a pretty impressive chip.



drunkenmaster said " Likewise WHAT did it raytrace, was it not a highly accelerated for one specific purpose chip. Have a look at quicksync on Sandybridge, when you have hardware acceleration its magnitudes faster than general purpose hardware programmed in software to do the same job."
Anything and everything. The whole point of the Caustic technology is its a full real-time, fully interactive renderer for a wide range of 3D graphics applications. It's not highly accelerated for one specific purpose which is what's so impressive about Caustic technology ,it's made for 3D artists to designers and end users.

It takes no shortcuts and does full ray tracing.



drunkenmaster said " What happens if a game has a new feature this fantastic, unavailable, not produced chip doesn't support, does fps go from 60fps to 0.00001fps?"
What a strange thing to say. What normally happens when a new feature comes out and your current chip does not support? As you know 0.00001fps would be an odd case. DX11 games don't drop my DX10 card from 60fps down to 0.00001fps.

What's more likely to happen is Nvidia will not have this nice feature, so the games and software go from 60fps on a PowerVR GPU to 0.1fps on an Nvidia GPU. Anyway, on many of the platforms PowerVR have 100% market penetration, so all the apps and software are made for PowerVR. The only time a new feature is going to be added is when a new generation of PowerVR hardware comes out. PowerVR are the market leaders in the space this will be used in.



drunkenmaster said "Your claim is so random, unquantified, and unknown."
My claims are 100% provable; you just like to ignore anything that doesn't agree with you, as this thread shows when you said the iPad has low-quality textures and a low resolution even though you have been proven wrong on that many times.



drunkenmaster said " Why do people insist on ignoring logic. A basic tablet/smartphone ARM chip has advantages over desktop chips in power because they are designed to be that low power as the PRIMARY goal."
I am not ignoring logic, you are. The architecture of PowerVR is vastly different from ATI or Nvidia. PowerVR need vastly less power requirement to match ATI or NV .

PowerVR chips are magnitudes slower in raw clock speed than ATI and Nvidia chips yet match them in 3D speed. As has been pointed out, a 100MHz PowerVR chip matches a 1000MHz Nvidia chip.

Sure, desktops have access to more power, but PowerVR don't need as much power, and PowerVR are magnitudes faster at ray tracing.



drunkenmaster said "A smartphone/tablet chip is magnitudes slower than a desktop chip, and the same goes gpu's."
But if the technology keeps changing as it is, mobiles chip will be magnitudes faster in areas like ray tracing. This could mean far better looking games. Please note I said could, I did not say will.



drunkenmaster said " Its pretty simple, if this 75Mhz chip was THAT good, it would be out, making a killing and putting the other guys out of business."
It is out since 2009 and is the market leader in the hardware ray tracing world isn't it? Lots of software packages and people use it. Lots of companies use it.


drunkenmaster said "What you're talking about is a highly highly specialised, single function chip that can do one thing very well and can't in any way be released into an industry for gaming that doesn't give a monkeys about raytracing yet.c"
So what happens when 100% of Ipads 3/4 and/or 100% Iphones 6/7 have this chip? Do you really think no one will give monkeys about it? Do you really think no apps or games will come out that use it? All it will take is one nice looking app and lots of developers will get interested in ray tracing. Lots will be interested as 100% of that gen will be able to use it.

Considering how big the gaming market is for mobiles, how can you say this isn't a chip for gaming? You might not like mobile gaming, but it doesn't change the fact it's massive and predicted to overtake desktop gaming.


drunkenmaster said " A powervr card that is the slowest card in the past decade for any available game but can run some soon to be released game paid for by PowerVr really really well isn't interesting in the least."
You really have no concept of the markets do you? Or any real understand of technology. How am I meant to respond to something as wrong as what you just said? I have done a lot of research to this to the point where I am feel as safe as one can investing money in it. I guess one off two things are going happen, I am right and make a ton of money or you are right and I lose money. Time will tell.



MoodyB " http://www.pcper.com/article.php?aid=546
Imho, a hybrid engine that combines rasterisation & ray tracing will no doubt be the best of both worlds, but certainly not a ray tracing only engine."

I am pretty sure that the PowerVR/Caustic technology offers a choice of ray tracing only or a hybrid with rasterisation. I need to re-watch the videos.

For those interested, these are old examples of Caustic technology.
http://www.youtube.com/watch?v=YQ2OltdXGZo
http://www.youtube.com/watch?v=aUa-2_KBwCs
http://www.youtube.com/watch?v=gO42parB9G4&feature=related
http://www.youtube.com/watch?v=PwjQk7dJOFQ&feature=related
http://www.youtube.com/watch?v=sJoA4LdzhQs
 
Is this the same kind of tech as the "Unlimited Reality" that was on this forum a few months ago? Where it looked ray traced and was all done on a laptop CPU, I believe. Looked amazing, but nothing moved (apart from the camera).
 
Sorry, but all I see is a chip designed to carry out a fairly specific function and hence do it very efficiently. To compare it to something as complex and versatile as a high-end Nvidia GPU is complete nonsense.

The reason why it runs so slowly on a normal GPU is probably because it's not making full use of the math capability in the GPU - the bloke in the video says this himself.

Now I wonder why this is? Maybe Nvidia and the rest of the GPU industry found more feasible alternatives to raytracing that work better in actual playable games.
 
drunkenmaster said " Likewise WHAT did it raytrace, was it not a highly accelerated for one specific purpose chip. Have a look at quicksync on Sandybridge, when you have hardware acceleration its magnitudes faster than general purpose hardware programmed in software to do the same job."
Anything and everything. The whole point of the Caustic technology is its a full real-time, fully interactive renderer for a wide range of 3D graphics applications. It's not highly accelerated for one specific purpose which is what's so impressive about Caustic technology ,it's made for 3D artists to designers and end users.

Its takes no shortcuts and does full raytraceing.



drunkenmaster said " What happens if a game has a new feature this fantastic, unavailable, not produced chip doesn't support, does fps go from 60fps to 0.00001fps?"
What a strange thing to say. What normally happens when a new feature comes out and your current chip does not support? As you know 0.00001fps would be an odd case. DX11 games don't drop my DX10 card from 60fps down to 0.00001fps.

What's more likely to happen is Nvidia will not have this nice feature so the games and software and go from 60fps with a PowerVR GPU to 0.1fps with a Nvidia GPU. Anyway for many of the platforms PowerVR are 100% market pretention so all the apps and software are made for PowerVR. The only time a new feature is going to be added is when a new generation of PowerVR hardware comes out. PowerVR are the market leaders in the space this will be used in.



drunkenmaster said "Your claim is so random, unquantified, and unknown."
My claims are 100% provable you just like to ignore anything that doesn't agree with you as this thread shows when you said the Ipad has low textures and low res even though you have been proven wrong many times on that.



drunkenmaster said " Why do people insist on ignoring logic. A basic tablet/smartphone ARM chip has advantages over desktop chips in power because they are designed to be that low power as the PRIMARY goal."
I am not ignoring logic, you are. The architecture of PowerVR is vastly different from ATI or Nvidia. PowerVR need vastly less power requirement to match ATI or NV .

PowerVR chips are magnitudes slower in raw speed then ATI and NV yet match them in 3D speed. Like its been pointed out a 100mhz PowerVR matches a 1000Mhz NV chip.

Sure desktops have access to more power but PowerVR don't need as much power and PowerVR are magnitudes faster at ray tracing.


drunkenmaster said " Its pretty simple, if this 75Mhz chip was THAT good, it would be out, making a killing and putting the other guys out of business."
It is out since 2009 and is the market leader in the hardware ray tracing world isn't it? Lots of software packages and people use it. Lots of companies use it.


drunkenmaster said "What you're talking about is a highly highly specialised, single function chip that can do one thing very well and can't in any way be released into an industry for gaming that doesn't give a monkeys about raytracing yet.c"
So what happens when 100% of Ipads 3/4 and/or 100% Iphones 6/7 have this chip? Do you really think no one will give monkeys about it? Do you really think no apps or games will come out that use it? All it will take is one nice looking app and lots of developers will get interested in ray tracing. Lots will be interested as 100% of that gen will be able to use it.

Considering how big the gaming market is for mobiles how can you say this isn't a chip for gaming? You might not like mobile gaming but it doesn't change the fact it's massive and predicted to overtake desktop gaming.


drunkenmaster said " A powervr card that is the slowest card in the past decade for any available game but can run some soon to be released game paid for by PowerVr really really well isn't interesting in the least."
You really have no concept of the markets do you? Or any real understand of technology. How am I meant to respond to something as wrong as what you just said? I have done a lot of research to this to the point where I am feel as safe as one can investing money in it. I guess one off two things are going happen, I am right and make a ton of money or you are right and I lose money. Time will tell.

I asked what it raytraced; it's worth pointing out you have many, many times suggested they could overtake desktop gaming cards. Well, Intel already massively surpassed desktop gaming cards in one specific area of performance; if you want to use that logic, that's fine. If you want to suggest that a very low power GPU can overtake GPUs using what, 100x or more power, in gaming in general, it's just fantasy.

So, name a game this fantastic 75MHz card can raytrace, please, go on. The card takes no shortcuts and does full raytracing, that's fine; can it run a DX11 game?

As for what happens when a new feature comes out, again you utterly utterly utterly fail to comprehend the very simple argument here.

When you make something a small, dedicated hardware accelerator you can speed up ANYTHING, and not only that but VERY EASILY. Hardware acceleration is incredibly easy.

AMD/Nvidia hardware is not mostly dedicated hardware for specific functions; the VAST majority of both companies' cores are PROGRAMMABLE hardware, which adds complexity, size and power usage but, most importantly, versatility.

What you seem to be suggesting is that this one card runs ONE piece of software, and herein lies the problem: said company doesn't make games, nor does anyone own one of these cards for gaming, so penetrating the gaming market will be almost impossible.

As for new features, actually most cards can be programmed to do just about anything; new DX features tend to add acceleration and speed-ups for certain things to improve their usage. Some would be uber slow without them, some not.

As for iPhones and gaming: yes I hate mobile gaming, and no it won't ever be serious. Get over it, it's already massive. Facebook gaming surpasses WoW's player numbers many, many times over.

Basic rubbish game playing is and really always has been way bigger than "gaming".

The iPhone will never, ever, ever run an engine anywhere near the power/complexity of a PC engine; it's just not going to happen. Current iPhone games are a joke, nothing more or less, for people with the attention span of a turnip who can't read a book on the train because reading is too difficult these days for an embarrassing number of people.

Being able to run Minecraft on your iPhone really well does not mean running Crysis is also possible; comparing these two things is ridiculous.


You're using the idea that a company can create a 75MHz card, which you've provided next to no info on but which can supposedly do raytracing really, really well, as a reason why a different chip will eventually surpass desktop cards in performance.

It's lunacy, sorry: a chip with completely unconfirmed performance, that can run an insanely cut-down engine offering very little in terms of gameplay, is not a reason to insist it will pummel AMD/Nvidia into the ground.
 