Possible proof AMD will release GPU & CPU combined

I just read that the new rumoured Xbox 360 'Valhalla' may have the GPU and CPU combined on one chip. Now IBM make the 360's CPU and ATI/AMD make the GPU. It seems unlikely the two would combine efforts, and as far as I know IBM don't have any decent GPUs (not sure on this, so prove me wrong), so AMD/ATI would seem like the more obvious choice.

Taking this into account, maybe we'll see this on future AMD CPUs, especially considering the possible power savings for mobile devices, and mobile devices being where the money is at the moment.

Of course, take this all with a huge pinch of salt as I am speculating.
 
I think this would be more of a glue solution than AMD's Fusion core. By combining both chips they could build a smaller console, due to less circuitry and the need for only one HSF.

*Disclaimer: working from memory/guesswork!*
AMD's Fusion is about making use of the huge amount of processing power on a GPU that is never used by standard apps, as they aren't programmed to use it.
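To make that concrete, here's a rough sketch of the kind of code an app currently needs before it can borrow the GPU for ordinary number crunching (my own illustration, assuming an OpenCL-style programming model, not anything AMD have actually shown for Fusion). It just adds two arrays on whatever GPU it finds, falling back to the CPU:

/* Hypothetical sketch, not AMD's Fusion API: an app only gets at the GPU's
 * compute power if it is explicitly written against something like OpenCL.
 * Error handling omitted for brevity. Build with e.g.: gcc vecadd.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void vadd(__global const float *a,"
    "                   __global const float *b,"
    "                   __global float *c) {"
    "    size_t i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* Ask for a GPU; fall back to the CPU if none is available. */
    if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Copy the inputs into device buffers. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    /* Compile the kernel at runtime and launch one work-item per element. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

    /* Read the result back and spot-check one element. */
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);
    printf("c[10] = %f (expected 30.0)\n", c[10]);

    clReleaseMemObject(da); clReleaseMemObject(db); clReleaseMemObject(dc);
    clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}

The point being: none of that happens for free. Unless an app is written like this, all that GPU silicon just sits idle.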
 
If we go down this path we will be intrinsically stuck with the CPU and GPU tied together. If you want better graphics you'll have to upgrade your CPU and GPU in one; no swapping just the card. The PC will lose a lot of its flexibility if this is the market of the future.

Matthew
 
Surely integrating the CPU and graphics card into one chip would require a massive change in architecture for the 360? I assume it just means the CPU and graphics card will be moved so that they can both be cooled together?
 
Heinz, thank you
Yep, I'll have Heinz as well, ta

Ha ha ha :p Here you go:
Heinz205l.jpg
 
AMD have been talking about Fusion for years now; it's basically integrating onboard graphics into the CPU to reduce costs. You will still be able to disable it and use a separate card, but as said, the performance CPUs probably won't feature it, to keep heat down.

Later low-end Nehalem chips will have an integrated GPU as well.
 
Now IBM make the 360's CPU and ATI/AMD make the GPU.

ATI designed the GPU but have nothing to do with the manufacturing.
I think Microsoft actually own the rights to the GPU, and ATI were just paid to come up with the design and test it.

Isn't this what happened with the PSOne? All the silicon was combined into one chip, allowing the console to become much smaller?
 
ATI designed the GPU but have nothing to do with the manufacturing.
I think Microsoft actually own the rights to the GPU, and ATI were just paid to come up with the design and test it.

Isn't this what happened with the PSOne? All the silicon was combined into one chip, allowing the console to become much smaller?

The later models of the PS2 did this as well. Yes, Microsoft do own the rights; they learnt from their mistake with the original Xbox, where Nvidia still owned the rights to the GPU and consequently had to be paid a small fortune.
 
AMD have been talking about Fusion for years now; it's basically integrating onboard graphics into the CPU to reduce costs. You will still be able to disable it and use a separate card, but as said, the performance CPUs probably won't feature it, to keep heat down.

Later low-end Nehalem chips will have an integrated GPU as well.

I think you are missing the point completely. It's about using the power that a GPU has for normal applications. If we weren't limited by the backwards compatibility of x86 systems then current CPUs would be much different, i.e. faster, than they are now.
 
I think you are missing the point completely. It's about using the power that a GPU has for normal applications. If we weren't limited by the backwards compatibility of x86 systems then current CPUs would be much different, i.e. faster, than they are now.
But that can be done already using separate GPUs, and "backwards compatibility" with x86 is somewhat important for the Windows-based PC. Ultimately the functionality that an onboard GPU would provide is basically similar to SSE extensions, so it's not doing anything that isn't done already.

As I've said before, this is about reducing costs (and as a result increasing profit for AMD) and limiting their competitors' opportunities, as there will be no requirement for a separate graphics processor from Nvidia or Intel. The power and transistor requirements of high-end gaming graphics guarantee that it will not be incorporated into CPUs anytime soon.
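To illustrate the SSE comparison (again, just my own sketch, not anything AMD or Intel have shown): the same array add as the earlier GPU example, done four floats per instruction on the CPU with SSE intrinsics, no separate device or driver involved:

/* Hypothetical sketch: element-wise add using SSE, which every x86 CPU since
 * the Pentium III already offers. Build with e.g.: gcc -msse sse_add.c */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Process four floats at a time with one 128-bit add. */
    for (int i = 0; i < N; i += 4) {
        __m128 va = _mm_loadu_ps(&a[i]);
        __m128 vb = _mm_loadu_ps(&b[i]);
        _mm_storeu_ps(&c[i], _mm_add_ps(va, vb));
    }

    printf("c[10] = %f (expected 30.0)\n", c[10]);
    return 0;
}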
 
No one else has thought about the massive decrease in power consumption this would allow? For less graphically intensive applications you could turn off the external solution and use what's on the chip.

Sounds like a pretty damn good idea to me.
 