
AMD HD7XXX Series

Cheers drunken master.
So what stops them from putting 30 smaller cores on a card and making better drivers to support it? I believe Nvidia were planning to do something like that. Wasn't it called re voodoo?

Cores need ways of communicating with each other and with the RAM. With multiple cores in a single package, as GPUs are designed these days, that isn't such an issue, but with separate packages on one board the complexity of the design, and therefore the cost, increases exponentially. I very much doubt we'll ever see more than two or three individual cores on a GPU again.



One question I had is why are CPU/GPU wafers round? Is it something to do with the way they have to be manufactured?

I guess I could google it :D

Wafers are grown, and part of the process is rotating the silicon; this naturally makes them round :)

http://www.pcpro.co.uk/blogs/2010/05/06/why-are-processor-wafers-round/
 

Cheers James :D
 
Well, the only news on the ATI 7xxx is that AMD aim to get their 28nm card out before Nvidia's 6xx. Rumour has it that AMD aim to get the first 7xxx out sometime between November and December of this year.
 
Cheers drunken master.
So what stops them from putting 30 smaller cores on a card and making better drivers to support it? I believe Nvidia were planning to do something like that. Wasn't it called re voodoo?

There are two issues here: one is communication latency (not input latency), which I will come on to later, and the other is the overall rendering scheme when using multiple GPUs.

If you used 30 smaller cores, each building a complete frame (alternate frame rendering, AFR), you'd end up with hideous input latency: to be performance effective they'd all have to be working on subsequent frames at the same time, unlike with 1-2 cores where you can react to changes very quickly.
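The latency point can be sketched with a toy model (my own numbers, not from the thread): to hit a target frame rate with N cores in AFR, each core gets N frame intervals to render, and all N frames are in flight at once, so the gap between sampling input and displaying the frame that reflects it grows linearly with N.

```python
# Toy alternate-frame-rendering (AFR) latency model -- illustrative only.
# Assumption: the driver keeps all GPUs busy by queueing one frame per GPU,
# so an input event is only visible once a freshly started frame completes.

def afr_input_latency_ms(num_gpus: int, target_fps: float) -> float:
    # To sustain target_fps with num_gpus cores in AFR, each core may
    # spend num_gpus / target_fps seconds per frame; that per-frame
    # render time is also the input-to-display latency.
    per_gpu_frame_time_s = num_gpus / target_fps
    return per_gpu_frame_time_s * 1000.0

for n in (1, 2, 30):
    print(f"{n:2d} core(s) at 60 fps -> ~{afr_input_latency_ms(n, 60):.1f} ms input latency")
```

At the same 60 fps, one core reacts in roughly 17 ms while 30 pipelined cores take half a second between input and display.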

If you tasked all 30 cores with different parts of the same frame (split frame rendering, SFR), you'd hit massive performance drops whenever one or two of those cores were handling a graphically intensive part of the screen. You could adaptively have cores with a lighter workload on one frame pre-emptively start on more intensive parts of the next frame, but whether you got it right would be hit and miss, due to the fast-changing scenes typical in games, and it would likely introduce input latency to keep rendering performance effective.
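The split-frame problem can be sketched the same way (the tile costs are made-up numbers): the frame is only done when the slowest tile is done, so one expensive region of the screen leaves every other core idle.

```python
# Toy split-frame-rendering (SFR) model -- illustrative only.
# Each core renders one tile of the frame; the frame completes when the
# slowest tile does, so uneven scene complexity wastes the other cores.

def sfr_frame_time_ms(tile_costs_ms):
    return max(tile_costs_ms)

def sfr_utilisation(tile_costs_ms):
    # Fraction of total core-time actually spent rendering.
    return sum(tile_costs_ms) / (max(tile_costs_ms) * len(tile_costs_ms))

even   = [10.0] * 30                # complexity spread evenly
uneven = [2.0] * 28 + [40.0, 45.0]  # two tiles cover an intensive area

print(sfr_frame_time_ms(even), sfr_utilisation(even))      # 10.0 1.0
print(sfr_frame_time_ms(uneven), sfr_utilisation(uneven))  # 45.0 ~0.10
```

With the uneven split, the frame takes 4.5x longer and the 30 cores are only about 10% utilised, which is the "massive performance drop" described above.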

Alternatively, you could spread the functionality of one core over the PCB in discrete chips, to increase the potential performance of each sub-system, but you'd still have a lot of power to remove from the board, and the latency on the interconnects from one chip to the next would probably kill any performance if you tried things like a dedicated chip for geometry setup and another dedicated chip for shader processing. Yes, older GPUs used to do this, having a different chip for texture management, etc., but you can't really get away with it on a modern high-performance GPU.
 
Will these new cards require new motherboards with PCI-e 3.0? I understand they will work with the old ones, but won't they perform below what they can in PCI-e 3.0?
 
Currently, designs for both are backwards compatible with PCI-e 2.0 - you may have some issues with some PCI-e 1.x boards, but mostly it should work.
 
The only thing that may stop the next-generation cards from exceeding GTX 580 SLI or 6970 CrossFire levels would be power draw limitations.

Nice explanations there! However, you have also forgotten Amdahl's Law, by which the performance increase would be limited unless there's a major change in architecture :cool:
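For anyone who hasn't met it, Amdahl's Law says the speedup from N cores is 1 / ((1 - p) + p/N), where p is the fraction of the work that parallelises. A quick sketch (p = 0.95 is an assumed figure for illustration, not from the thread):

```python
# Amdahl's Law: the serial fraction of the work caps the speedup no
# matter how many cores you add.

def amdahl_speedup(parallel_fraction: float, num_cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / num_cores)

# Even if 95% of the rendering work parallelises perfectly:
print(f"30 cores: {amdahl_speedup(0.95, 30):.1f}x")     # ~12.2x, not 30x
print(f"ceiling:  {amdahl_speedup(0.95, 10**9):.1f}x")  # ~20x, ever
```

So even a generous 95%-parallel workload gets barely 12x from 30 cores, which is why architecture matters more than core count.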
 
Cheers drunken master.
So what stops them from putting 30 smaller cores on a card and making better drivers to support it? I believe Nvidia were planning to do something like that. Wasn't it called re voodoo?

I suppose in a way it could be said that both companies already do this: they have looked at different ways to get a smaller rendering unit to work with other rendering units, and putting them together is what builds up the current type of GPU that we have today.
Of course, these cores (CUDA cores on Nvidia cards and stream processors on AMD cards) are not individual GPUs, but they are what each company has decided to go with at this time. Yes, I know AMD are just on the verge of changing theirs completely, and it will be as funny as you like if they end up with the bigger core after they change, just to hear people say how suddenly bigger cores are better. ;)

Anyway, what it boils down to is that current GPUs are a collection of smaller processors arranged into groups of individual cores. Whether we will see several (as in more than two) GPUs on a single card, or even on a single die, remains to be seen. I doubt it, though, as certain parts of the GPU do not need to be duplicated.
 
wasn't PCI-e 3 going to be able to deliver up to 300w from the slot rather than needing extra power connectors, or is that me getting confused with something else...?
 
wasn't PCI-e 3 going to be able to deliver up to 300w from the slot rather than needing extra power connectors, or is that me getting confused with something else...?

Not sure on the exact specs, but there will definitely be an increase in what you can draw - which may mean we see some high-end cards that actually are PCI-e 3.0 only.
 
Not sure on the exact specs, but there will definitely be an increase in what you can draw - which may mean we see some high-end cards that actually are PCI-e 3.0 only.

I can't remember where I saw it, but an article said there was going to be no increase in the power you can draw from the slot and it would remain at 150 watts.
 
I can't remember where I saw it, but an article said there was going to be no increase in the power you can draw from the slot and it would remain at 150 watts.

Yeah, just reading up on it: the primary focus for power is consolidation and refinement of the current standards, though it doesn't rule out extending the spec.
 