
AMD likely to win Nintendo NX and continue console dominance

That's pretty much the point, though. While we have infinitely more capable hardware on this platform, all we're met with are abstract middleware layers from NVIDIA that improve quality by a fraction, if at all, because nobody else cares. At least AMD let out a disgruntled fart with Mantle to push Microsoft into moving D3D development along, but a year on that's still all we're left with. DirectX 12 really is the swan song for PC gaming; if it doesn't grab the graphics market in the next 3 to 5 years, it's time to throw in the towel on the enthusiast GPU. NVIDIA knows this already - why do you think they're so obsessed with Tegra? You adapt, or you die a death.

I'm feeling particularly cynical tonight, sorry :D. I'm not wrong, though.

PC gaming is as strong as ever and to say DX12 has to save it?...

I may have misunderstood what you mean by that, but look at the figures for LoL and Dota. This is the first time in my life I know as many people with enthusiast gaming PCs as I do console owners. The market cares more about consoles and we suffer, that much I know, but to say PC gaming is on its last legs is absurd.
 
Agreed, the PS4 keeps surprising me tbh. Arkham Knight and The Witcher 3 look awesome and play great on PS4.

The desktop APUs are poor in comparison, more suited to general PC use, web gaming and 720p gaming, but the custom console APUs are decent.

A future console APU could utilise HBM and be really quite powerful, with die shrinks allowing many more shaders etc.

It's full 7870 silicon. You're just witnessing a great deal of man-hours. Getting 60 frames per second out of current hardware shouldn't really be "****ing hard". Those are the words of Naughty Dog's director Neil Druckmann, not mine :D

PC gaming is as strong as ever and to say DX12 has to save it?...

I may have misunderstood what you mean by that, but look at the figures for LoL and Dota. This is the first time in my life I know as many people with enthusiast gaming PCs as I do console owners. The market cares more about consoles and we suffer, that much I know, but to say PC gaming is on its last legs is absurd.


PC gaming is thriving. That's somewhat different from the prestige of cutting-edge hardware and visuals that one comes to expect and want when purchasing a high-end GPU.
 
Yeah, it's the best it's been in a while. I hope Steam doesn't screw with that by splitting off - that would be pretty crazy.

I think it's going to get a huge boost from Windows tablets over the next few years too.
 
It's worth remembering the consoles have unified memory - I'd say they're a long way ahead of PCs architecturally. PCs are just throwing more and more brute force at an outdated design. The unified memory could be the reason why most PC ports have generally been poor.
 
“If you notice, your latest find references statements they'd already made. It's clear they made headline-grabbing statements about a "hundred times increase"; the fact that you are having to find increasingly hyperbolic statements to "prove" they were right goes to show how pointless those statements are.”
Yes, I did notice the first statement they made; I gave you the links to both statements. The links I gave show the only person making hyperbolic statements is you. You created a number of hyperbolic statements that neither I nor Imagination made, statements different from the ones Imagination actually made. I could find no evidence at all to match the hyperbolic statements you created, and everyone can read the links I posted and see that's true. You were wrong about ray tracing and you're clearly wrong about this as well. The only pointless statements are the ones you made up.



“The comparison you made was also not of products that were available 5 years apart, but more like 7, and longer if you take an actual "on the shelf" date”
At the point I posted that, no one had mentioned a 5-year limit. It's not fair to impose a 5-year limit after I'd already made my post. The statement you posted was Series5 against Series6, which covers a 7+ year timeframe. But if you want the SGX535 as a baseline and a 5-year limit, fine:

The SGX535 in the iPhone 4 came out in 2010, or 2009 if you count the iPhone 3GS. The iPhone 6 came out in 2014. That's a 4- or 5-year difference depending on whether you start from the 3GS or the 4. Going from the iPhone 4 to the iPhone 6, the offscreen ALU test went from 0.4fps to 95.1fps, the fill rate went from 51 MTexels/s to 3756 MTexels/s, and T-Rex Offscreen went from 0.4fps to 45fps.
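Just to make the multipliers explicit (my own arithmetic from the figures above, nothing more):

```python
# Speed-up factors implied by the GFXBench numbers quoted above
# (SGX535-era iPhone 4 baseline vs iPhone 6).
alu = 95.1 / 0.4      # offscreen ALU test: ~238x
fill = 3756 / 51      # texel fill rate:    ~74x
trex = 45 / 0.4       # T-Rex Offscreen:    ~112x

print(f"ALU {alu:.0f}x, fill {fill:.0f}x, T-Rex {trex:.0f}x")
```

Two of the three land either side of the 100x mark, which is presumably the point.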

And that's not counting the Metal API, which gives an even bigger performance difference.

So clearly what you said about performance only doubling is flat-out wrong, and those numbers fit with Imagination's original statement.



“If you notice, you actually posted an image that shows them claiming a 700x improvement over 5 years”
Yes, I did, as that includes the upcoming Series7 GPU. What's the problem with that 700x number? Considering the context of where the file came from, I don't see one.
 
PC gaming is thriving. That's somewhat different from the prestige of cutting-edge hardware and visuals that one comes to expect and want when purchasing a high-end GPU.

Most of what I see of PC gaming thriving is coming from small-to-medium dev studios, the majority being indies selling through digital distribution.

Yet the quality of many of those games still surpasses most AAA games :P
 
Yeah, it's the best it's been in a while. I hope Steam doesn't screw with that by splitting off - that would be pretty crazy.

I think it's going to get a huge boost from Windows tablets over the next few years too.

It's something I'm really looking forward to. I don't think I've ever been this interested in a Windows release, but the potential of "cross-play" between my XB1, PC and SP3 has me really excited :)
 
It's worth remembering the consoles have unified memory - I'd say they're a long way ahead of PCs architecturally. PCs are just throwing more and more brute force at an outdated design. The unified memory could be the reason why most PC ports have generally been poor.

No - the Xbox One is the lowest common denominator; its eSRAM pool isn't nearly as good as the PS4's unified pool of GDDR5. That still has its limits, though, as streaming all that geometry and shader data is constrained by the same bandwidth envelope the GPU is working within.
 
I was actually talking to a dev with experience on the PS3, PS4, Xbox 360 and Xbox One - he said the main issue is that the embedded memory in both Xbox consoles (eDRAM on the 360, eSRAM on the One) was never quite big enough, meaning things had to be done in multiple passes, which was inefficient.
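To put rough numbers on why that happens (my own back-of-the-envelope sums, not the dev's): the Xbox One has 32MB of eSRAM, and a typical 1080p deferred G-buffer simply doesn't fit.

```python
# Back-of-the-envelope: a typical 1080p deferred G-buffer vs the
# Xbox One's 32MB of eSRAM (illustrative render-target sizes only).
width, height = 1920, 1080
rt_bytes = width * height * 4            # one RGBA8 target: ~7.9 MB
gbuffer = 4 * rt_bytes                   # four G-buffer targets
depth = width * height * 4               # 32-bit depth/stencil

total_mb = (gbuffer + depth) / 2**20
print(f"~{total_mb:.0f} MB needed vs 32 MB available")   # ~40 MB
```

Once the working set overflows the fast on-chip pool, render targets have to be shuffled in and out, hence the multiple passes.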
 
Yes, I did notice the first statement they made; I gave you the links to both statements. The links I gave show the only person making hyperbolic statements is you. You created a number of hyperbolic statements that neither I nor Imagination made, statements different from the ones Imagination actually made. I could find no evidence at all to match the hyperbolic statements you created, and everyone can read the links I posted and see that's true. You were wrong about ray tracing and you're clearly wrong about this as well. The only pointless statements are the ones you made up.




At the point I posted that, no one had mentioned a 5-year limit. It's not fair to impose a 5-year limit after I'd already made my post. The statement you posted was Series5 against Series6, which covers a 7+ year timeframe. But if you want the SGX535 as a baseline and a 5-year limit, fine:

The SGX535 in the iPhone 4 came out in 2010, or 2009 if you count the iPhone 3GS. The iPhone 6 came out in 2014. That's a 4- or 5-year difference depending on whether you start from the 3GS or the 4. Going from the iPhone 4 to the iPhone 6, the offscreen ALU test went from 0.4fps to 95.1fps, the fill rate went from 51 MTexels/s to 3756 MTexels/s, and T-Rex Offscreen went from 0.4fps to 45fps.

The original quote you were supposed to be working off was Series5XT to Series6 (non-XT), and you chose the original base-model Series5 against a top-model Series6XT, expanding the timeframe because that was the only way you could get anywhere near it. You then found another quote that mentioned a 5-year limit - you imposed that limit, not me :rolleyes:
You're having a go at me for "imposing" a time limit that you yourself posted.

According to PowerVR's website, the SGX520 (the original chip you chose) dates back to 2005 and the SGX535 to 2007, so no, neither of those to the iPhone 6 in 2014 is 5 years.
 
“The original quote you were supposed to be working off was Series5XT”
Since when? That looks to me like something you made up. One quote very clearly mentioned core for core; the other clearly mentioned the SGX535 as a baseline. When was the 5XT ever mentioned? Give me a link, please.


“you're having a go at me for "imposing" a time limit that you yourself posted”
The time limit was imposed after I posted the SGX520. You had a go at me for posting a 7-year-old chip before I knew about the 5-year time limit. That's what I wasn't happy about.


“According to PowerVR's website, the SGX520 (the original chip you chose) dates back to 2005 and the SGX535 to 2007, so no, neither of those to the iPhone 6 in 2014 is 5 years”
That’s it keep change the goal posts just to convinced yourself that you are not wrong again. It’s not fair to compare the IP announcement date against a product date. It’s either product to product. Or IP announcement to IP announcement. Either way you are wrong as the facts show.
 
I've already admitted that the original article I read was a misquote; I'm not sure what else you want from me on that one.
It's not really my fault if PowerVR consider their products "released" when it takes them another 3-4 years to get them into real products.

At the point they made that statement, in 2011, the SGX535 was, according to PowerVR, already 4 years old. It then took them until 2014 to get a chip out that was "100x" faster - so 8 years. Look at what any other GPU manufacturer did in 5-8 years and it's not all that impressive; it's a nonsense comparison. PowerVR love throwing around figures like "100x" to grab headlines, but the comparisons behind them aren't worth making. Compare a 65nm product to a 28nm product from anyone and you'll turn up some very large numbers - nobody else bothers doing it for headlines. It's not surprising that, with comparisons this tenuous, some journos end up misquoting them.
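To see how much of that "100x" the process node alone hands you, consider ideal area scaling from 65nm to 28nm (an idealised figure of my own; real processes scale worse):

```python
# Ideal transistor-density gain going from a 65nm to a 28nm process.
# Real-world scaling is worse, so treat this as an upper bound.
density_gain = (65 / 28) ** 2
print(f"~{density_gain:.1f}x transistors in the same die area")   # ~5.4x
```

Multiply that by clock and architectural gains and almost any two-node comparison produces headline-sized numbers.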

Do the same comparison with NVIDIA and you're comparing an 8800 GT to a 980 Ti or 980.
 
It's full 7870 silicon. You're just witnessing a great deal of man-hours. Getting 60 frames per second out of current hardware shouldn't really be "****ing hard". Those are the words of Naughty Dog's director Neil Druckmann, not mine :D

Yeah, and I think that's where people confuse the console APUs (actually decent) with the current desktop APUs (budget, low performance).

The PS4 APU has a shader count and GPU performance similar to an HD 7870, paired with GDDR5 memory, whereas the desktop APUs have far fewer shaders and are hamstrung by DDR3. The PS4 APU is far superior. In future, HBM and die shrinks could make desktop APUs viable for budget gaming, but the current ones can barely manage 720p.
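The memory side alone makes the point. A rough peak-bandwidth comparison, using the commonly quoted figures (256-bit GDDR5 at 5.5GT/s for the PS4 versus dual-channel DDR3-2133 for a desktop APU):

```python
# Peak bandwidth (GB/s) = bus width in bytes * transfer rate in GT/s.
ps4 = (256 / 8) * 5.5         # 256-bit GDDR5 @ 5.5 GT/s -> 176 GB/s
apu = (128 / 8) * 2.133       # dual-channel DDR3-2133   -> ~34 GB/s

print(f"PS4 ~{ps4:.0f} GB/s vs desktop APU ~{apu:.0f} GB/s")
```

Roughly a 5x gap before the GPU even starts working.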

The next lot of console APUs could be little monsters as well. The PS4 is a good console imho, and for the price it gives a really good gaming experience.

Hopefully AMD do get the next Nintendo console and it has at least PS4-level performance or better.
 
It's not an HD 7870 - the full silicon has the same number of shaders as an HD 7870, but to maximise yields only 1152 are enabled. It also has some significant enhancements over the HD 7870:

http://techreport.com/news/24725/ps4-architect-discusses-console-custom-amd-processor

A 256-bit interface links the console's processor to its shared memory pool. According to Cerny, Sony considered a 128-bit implementation paired with on-chip eDRAM but deemed that solution too complex for developers to exploit. Sony has also taken steps to make it easier for developers to use the graphics component for general-purpose computing tasks. Cerny identifies three custom features dedicated to that mission:

- An additional bus has been grafted onto the GPU, providing a direct link to system memory that bypasses the GPU's caches. This dedicated bus offers "almost 20GB/s" of bandwidth, according to Cerny.
- The GPU's L2 cache has been enhanced to better support simultaneous use by graphics and compute workloads. Compute-related cache lines are marked as "volatile" and can be written or invalidated selectively.
- The number of "sources" for GPU compute commands has been increased dramatically. The GCN architecture supports one graphics source and two compute sources, according to Cerny, but the PS4 boosts the number of compute command sources to 64.
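As a purely illustrative sketch of what 64 compute command sources buys you (a toy model, not any real console API): many subsystems can each own a queue and feed the GPU independently, instead of contending for one or two.

```python
# Toy model of GPU command sources (not a real console API).
# Stock GCN: 1 graphics + 2 compute sources; the PS4 raises the
# compute sources to 64, so subsystems don't fight over a queue.
from collections import deque

graphics = deque(["draw_scene", "draw_ui"])
compute = [deque() for _ in range(64)]   # 64 independent compute sources

compute[0].append("physics_step")        # each subsystem submits to its
compute[1].append("audio_raycast")       # own queue, with no contention

# The hardware scheduler pulls work from every non-empty source.
for queue in (graphics, *compute):
    while queue:
        print("dispatch:", queue.popleft())
```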

It's probably more like Tonga or Fiji in uarch.

Plus even the CPU has access to a massive amount of bandwidth too.
 
^^

This is why it irks me to see people saying consoles have weak APUs, lol - they're not the same as the current, genuinely weak desktop APUs. These are custom-built chips with more shaders, and Sony's has extra enhancements and GDDR5. There's no comparison with the Kaveri DDR3 '720p' desktop APUs etc.
 
^^

This is why it irks me to see people saying consoles have weak APUs, lol - they're not the same as the current, genuinely weak desktop APUs. These are custom-built chips with more shaders, and Sony's has extra enhancements and GDDR5. There's no comparison with the Kaveri DDR3 '720p' desktop APUs etc.

It's only because of PC-master-race e-peen. It's a console that sells for under £300 profitably. They should see the chips used for the central computers on the Mars missions - a modern phone might have more processing power.
 
^^

This is why it irks me to see people saying consoles have weak APUs, lol - they're not the same as the current, genuinely weak desktop APUs. These are custom-built chips with more shaders, and Sony's has extra enhancements and GDDR5. There's no comparison with the Kaveri DDR3 '720p' desktop APUs etc.

It is a weak APU; you're the only one comparing it to AMD's desktop offerings. :rolleyes:

Look at it in perspective. I'm playing a poorly optimised port at 3840x2160. The best people in the business can't get their title to run at 60 frames per second at 1080p, and the majority of developers struggle to hit 900p. Most of the time the GPU isn't pushing the envelope hard enough for the GDDR5 to make any real difference, but it's great to see people like Naughty Dog raise the bar.

By comparison, it is weak. It was chosen because of what Sony was prepared to spend. You can't call it what it's not.
 
It is a weak APU; you're the only one comparing it to AMD's desktop offerings. :rolleyes:

Look at it in perspective. I'm playing a poorly optimised port at 3840x2160. The best people in the business can't get their title to run at 60 frames per second at 1080p, and the majority of developers struggle to hit 900p. Most of the time the GPU isn't pushing the envelope hard enough for the GDDR5 to make any real difference, but it's great to see people like Naughty Dog raise the bar.

By comparison, it is weak. It was chosen because of what Sony was prepared to spend. You can't call it what it's not.


No it isn't. I'm running a GTX 660, which is still one of the most common cards for gaming, and the PS4 has comparable GPU power.

You are massively out of touch with what most gamers are running, and it comes from thinking PC hardware enthusiast forums are representative of what most gamers use.

It's like thinking my Nikon D600 is the most common Nikon DSLR just because I post on a camera forum! :rolleyes:
 
lol, of course - so if you asked people whether the 660 was a weak part, they'd tell you otherwise...

Half the performance of a GTX 680. "Argument in an empty house" springs to mind.
 