
AMD-ATI to make a GPU on a CPU

Associate · Joined: 15 Jun 2006 · Posts: 2,178 · Location: Amsterdam
By Fuad Abazovic: Tuesday 15 August 2006, 08:13

ENGINEERS from AMD and ATI are going to start working on a unified chip that will have a GPU and CPU on the same silicon. We learned this from high-ranking sources close to the companies, more than once. Don't get too excited, as it will take at least eighteen months to see such a dream come true.

This is the ultimate OEM chip, as it will be the cheapest way to have the memory controller, chipset, graphics function and CPU on a single chip. This will be the ultimate integration, as it will decrease the cost of the platform and make even cheaper PCs possible.

CPUs are being shrunk to a 65 nanometre process as we speak and the graphics guys are expected to migrate to this process next year. The graphics firms are still playing with 80 nanometre but will ultimately go to 65 nanometre later next year.

DAAMIT engineers will be looking to shift to 65 nanometre if not even to 45 nanometre to make such a complex chip as a CPU/GPU possible.

We still don't know whether they are going to put a CPU on a GPU or a GPU on a CPU, but either way will give you the same product. µ
 
AMD's desire to put GPUs onto the CPU has been known for a long time (sorry!).

Makes the mass-market business chips/mobile chips cheaper to build and run.

I would have thought that we'd see some interesting hybrid vector/GPU processors too. The big problem is memory bandwidth: transforming big datasets in parallel (that's essentially what GPUs do) takes more bandwidth than current CPU memory subsystems provide.
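A quick back-of-envelope comparison illustrates the gap. The figures below are rough 2006-era assumptions picked for illustration (dual-channel DDR2-800 system memory versus a mid-range GPU's 256-bit GDDR3), not measurements of any specific part:

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate).
# Numbers are illustrative 2006-era assumptions, not vendor specs.

def bandwidth_gb_s(bus_bits, transfers_mt_s):
    """Peak bandwidth in GB/s for a memory bus of the given width and rate."""
    return bus_bits / 8 * transfers_mt_s * 1e6 / 1e9

cpu_bw = bandwidth_gb_s(128, 800)    # dual-channel DDR2-800: 128-bit bus
gpu_bw = bandwidth_gb_s(256, 1400)   # GDDR3 on a 256-bit bus, 700 MHz DDR

print(f"CPU system memory: {cpu_bw:.1f} GB/s")  # 12.8 GB/s
print(f"GPU local memory:  {gpu_bw:.1f} GB/s")  # 44.8 GB/s
```

Even on these rough numbers the GPU's local memory has several times the bandwidth, which is why a GPU sharing the CPU's memory bus would be starved for data.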
 
I'll probably stay with my ol' oc'd X2 4400 until the K8L comes out :D
I seem to have a habit of going for the new shifts in technology (very early adopter with the 512KB cache Barton, pre-ordered X2, ... ) and currently I see no point in going for either of the current Intel/AMD offerings..

Although I have found myself looking at the MaxPro several times.. but I'll wait till I have a use for it! PC motherboards need more memory slots!
 
Wow that will be one hot chip (no pun intended)

It also makes a great business strategy, as it will mean that every time you decide to upgrade the weaker of your graphics card or CPU, you will also be replacing the other.

I don't suppose this matters too much unless you buy the top-of-the-line CPU and then a new graphics technology comes along, making it a much more expensive upgrade.

It's certainly an interesting concept and will mean the end of AMD / Nvidia combos.

Major changes are definitely on the horizon

Deks
 
Should be loads faster and less bandwidth intensive, since the link between the GPU and the CPU should be very quick.

Like I said, it's also eventually going to end up with a physics processor built in too.
 
I once had a dream of a 5-core CPU (weird, I know): 2 cores for general processing, 2 for graphics, and 1 for physics. But that would require more RAM, so let's say 2GB for the CPU cores and 2GB for graphics and physics.
 
Deks said:
Wow that will be one hot chip (no pun intended)

It also makes a great business strategy, as it will mean that every time you decide to upgrade the weaker of your graphics card or CPU, you will also be replacing the other.
It won't be hot; it is initially for low-end integrated/business/laptop solutions, the sort that currently use integrated graphics.
Also, the sector this will initially be aimed at does not really upgrade. After the 2/3-year lifespan of the computers, they buy new again, so this is not a problem.

Deks said:
It's certainly an interesting concept and will mean the end of AMD / Nvidia combos.
Not true, it will still need a chipset to run on, which nVidia would still be able to make. Also, as said before, this is only the low-cost, integrated, business & laptop sector. There are still plenty of other areas of the market.
In the long term this will probably evolve and become more mainstream, and that is where nVidia might struggle, but that is a LONG way off yet. Even this might not come about for 5/6 years!
 