Some news on Piledriver

Does anyone else feel that it's a bit silly that AMD haven't released any technical documentation to the press, even though Piledriver is due to be released within a few months?

The last time they did that was with Bulldozer; since then (for many reasons, including some facepalm-worthy doings of their own) they have completely clammed up.

Trying to get anything out of AMD these days is like getting blood from a stone.
 

They've been that way for ages; hell, even Agena was the same.
 

Yes, but not as bad as this. I mean, they even deny that those slides floating about are theirs, let alone acknowledge in any way that they resemble anything they have done, despite it becoming increasingly obvious that they are AMD's slides, or at least represent their work.
 
That's the reason Bulldozer was disappointing: because of the lack of information, people thought it was going to be incredible.

It's not a bad processor, but it's nowhere near as good as it was hyped up to be.

Sort it out, AMD.
 

Well, this time they have the advantage that few people are expecting any improvements at all, aside from power improvements, that is. They could hardly keep the resonant clock mesh technology a secret, lol.
 
[Attached images: piledriver_compute.png, pile_imp.jpg]

Found these two.
 
The problem with Bulldozer is that they announced it over half a decade ago.
It was touted on three different sockets.
There was a massive rumour it could do thread fusing (8 cores on 1 thread, so it would have been the best bar none :p).

And the lies of AMD employees saying IPC had gone up when it hadn't.

EDIT: Looks like AMD are reworking the module (one of the downfalls of Zambezi).
 
@ Number_25

Ah, there are lots of them, lol... I think I put the second one in one of these threads.

If you were to ask AMD about them, even as someone with a genuine technical interest, they will tell you they only supply data to approved partners, and only as needed ("we will contact you").

Believe me, I have tried, even with a way of gaining greater access than your average Joe.

That, at least, was never a problem before Bulldozer.
 
Realistically, am I better off waiting to see what Piledriver offers - or should I go straight in for a 3570k?

Up to you.
I'd take a second-hand 2500k personally.

There's a 50% IPC deficit to make up; even if AMD can get 20% better performance, an overclocked 2500k will still be better for gaming, as games won't suddenly become heavily multithreaded overnight (rough sums in the sketch below).
However, heavily threaded situations (so, specialised tasks etc.) will see a PD 8-core besting a 2500k by a fair amount.

EDIT: ^^ A perfectly balanced viewpoint, come at me, bros.
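
To put rough numbers on that argument, here is a quick back-of-the-envelope sketch. The 50% per-core gap and the 10-20% uplift are the estimates quoted in this thread, not measured data, and the 8-thread case assumes ideal scaling:

```python
# Back-of-the-envelope per-core comparison using the figures quoted in the
# thread (illustrative assumptions, not benchmark data).
bulldozer_per_core = 1.00        # Bulldozer baseline
sandy_bridge_per_core = 1.50     # assume SB is ~50% faster per core

for uplift in (0.10, 0.15, 0.20):
    piledriver_per_core = bulldozer_per_core * (1 + uplift)
    gap = sandy_bridge_per_core / piledriver_per_core - 1
    print(f"{uplift:.0%} uplift -> PD per core {piledriver_per_core:.2f}, "
          f"still {gap:.0%} behind SB per core")

# Heavily threaded case, assuming ideal scaling across all cores:
print("8 x PD cores:", round(8 * bulldozer_per_core * 1.15, 2))  # mid 15% uplift
print("4 x SB cores:", round(4 * sandy_bridge_per_core, 2))
```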
 
@ Martini1991, you're making assumptions; no one knows anything yet. It's foolish to state assumptions as facts. For all we know, Piledriver could beat Intel in gaming.

Realistically, am I better off waiting to see what Piledriver offers - or should I go straight in for a 3570k?

Unless you're in a rush? Wait.
 
@ Martini1991, you're making assumptions; no one knows anything yet. It's foolish to state assumptions as facts. For all we know, Piledriver could beat Intel in gaming.

It's ludicrous to think AMD are going to get a 50% core-for-core performance increase (which, since that's the IPC deficit, is what they would need in the majority of games), especially when they themselves tout it at 10-15%.
But even if, in some twisted turn of fate, they did get that 50% core-for-core improvement, SBs aren't bottlenecking GPUs and neither are Ivys. If he's going to keep his CPU for a while he should consider the PCI-E standard: AMD are on 2.0 (stupidly, given they were the first with 3.0 cards), and at that level of CPU performance his PCI-E bandwidth would probably limit him before the CPU in X or Y GPU generations.

I don't get the constant waiting for stuff.
If he waits for PD, he may as well wait for Haswell, then he may as well wait for SR, etc.

For all we know Vishera is 5 months away, or it could be delayed, or anything.
Ivy isn't going to struggle, nor is SB, for a decent few years, so he may as well buy what he can now (unused i5 3570k for 162 on MM) and then replace it with something better in X/Y years' time, while having very good performance right now.
 
It's ludicrous to think AMD are going to get a 50% core-for-core performance increase (which, since that's the IPC deficit, is what they would need in the majority of games), especially when they themselves tout it at 10-15%.

There is more to gaming than IPC and using only one thread in a single-threaded game, or even using the CPU for physics at all...

Famous last words: "how did they do that?" Don't pretend you're a chip designer, or that you know enough about it to state your predictions as facts.
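
The "games won't become heavily multithreaded overnight" point is essentially Amdahl's law: extra cores only speed up the fraction of the work that can actually run in parallel. A minimal sketch, with made-up parallel fractions purely for illustration:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the work
# is parallelisable. The values of p below are illustrative guesses.
def amdahl_speedup(p, n_cores):
    return 1.0 / ((1.0 - p) + p / n_cores)

for p in (0.3, 0.6, 0.9):   # lightly -> heavily threaded workloads
    print(f"p = {p:.0%}: 4 cores -> {amdahl_speedup(p, 4):.2f}x, "
          f"8 cores -> {amdahl_speedup(p, 8):.2f}x")
# A lightly threaded game (p around 30%) barely gains from 8 cores,
# while a heavily threaded workload (p around 90%) gains a lot.
```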
 
Likewise, Humbug.
They'll put some pixie dust on the cores; that'll do the job.
They can certainly catch up to the point where they're not bottlenecking current GPUs; I wasn't suggesting they need 50% for that.
Either way, there's still that PCI-E 2.0 limitation (it doesn't make any difference at all yet, though).

EDIT: You pretend to know more than you really do, but you mixed up SSSE3 and SSE3, you stated SSE4 works on all AMD CPUs, etc., when that's a flat-out lie, and you refused to comment when I called you out.
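
For anyone who would rather check the SSE3/SSSE3/SSE4 point on their own machine than argue about it: on Linux the supported extensions are listed in /proc/cpuinfo, so a rough sketch like the one below works (on Windows, a tool such as CPU-Z shows the same information):

```python
# Rough Linux-only check of which SSE-family extensions the CPU reports.
# Note: SSE3 shows up in /proc/cpuinfo under its old name "pni"
# (Prescott New Instructions).
def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for label, flag in [("SSE3", "pni"), ("SSSE3", "ssse3"),
                    ("SSE4.1", "sse4_1"), ("SSE4.2", "sse4_2"),
                    ("SSE4a", "sse4a")]:
    print(f"{label:7s}: {'yes' if flag in flags else 'no'}")
```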
 
Either way, there's still that PCI-E 2.0 limitation (it doesn't make any difference at all yet, though).

You do realize that PCIe 2.0 will not even bottleneck an nVidia 690?

Likewise, a PCIe 3.0 GPU in a PCIe 2.0 slot has lost performance to the tune of zero.
PCIe 2.0 is a long way from needing to be faster; once GPU cards become too fast for PCIe 2.0, the chances are AMD will have PCIe 3.0 on the shelves.
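
For reference, the theoretical one-direction bandwidth of an x16 slot under each spec, worked out from the published PCIe 2.0/3.0 line rates and encodings; whether any single GPU of this generation actually saturates the 2.0 figure is exactly the point being argued:

```python
# Theoretical per-direction bandwidth of an x16 slot.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding    -> 0.5 GB/s per lane.
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane.
lanes = 16
pcie2_per_lane = 5.0 * (8 / 10) / 8      # GB/s per lane
pcie3_per_lane = 8.0 * (128 / 130) / 8   # GB/s per lane
print(f"PCIe 2.0 x16: {lanes * pcie2_per_lane:.1f} GB/s each way")
print(f"PCIe 3.0 x16: {lanes * pcie3_per_lane:.2f} GB/s each way")
```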
 

I'm sure I said it makes no difference yet...
Having used GTX 680 SLI, I found the scaling to be meh; I can't see the 690 being any different.

I never said PCI-E 2.0 was limiting cards yet.

But to get PCI-E 3.0 support on AMD you need yet another socket, whereas IB users have their PCI-E 3.0 and will push GPUs for a long time.

And any card fast enough to be limited by PCI-E 2.0 would be moot on AMD anyway, since the CPU would limit it first (taking current CPUs into account).
 
Because rabid fanboys make me laugh.
Also, I said pointless because it's an OP with just a link to Softpedia, of all places. (Take a look through some of the last threads he's made: the rather messed-up Softpedia link that is so conflicting it makes no sense, or the next Softpedia article on the 7990. Everyone remembers the Softpedia article where "AMD said" FX would best the i7 920 by 50%.)

Not that Piledriver/AMD are pointless, but no...

I said pointless due to the nature of the OP's post, nothing about PD or AMD.

And if anyone's pointless, it'd be you, with half the trolling and flaming you do.




That's just a little sensationalist, isn't it?
Is it not the same thing when AMD users jump down my throat for absolutely sod-all reason?

You are usually the first poster in any AMD CPU thread that crops up; it's like you sit pressing F5 24/7 until one appears, just so you can post something derogatory.

Chill out. AMD have a reasonably attractive range apart from the high-end desktop stuff, which is a tiny part of the market. In the grand scheme of things it's a setback for them, but they are working on solving it.

If they can drop power consumption whilst adding 20% or so performance, it will be a reasonable chip for the budget-minded.
 
Post something derogatory, like "Thuban left Lynnfield for dead in heavily threaded apps"?

Good one.
It's like people flame me upon seeing my username, not because they've read my post or because my post is negative towards AMD/PD (because it damn well isn't, and the AMD users flaming me are really ticking me off); hell, in this thread I've said I want PD to kill my 2500k.
I'm such an Intel fanboy.
 

I would say that everything you have stated in this thread so far has been fully justified.

I too want Piledriver to smash Ivy Bridge in power and performance.

Odds of it happening?

Unlikely, but still possible.
 
But to get PCI-E 3.0 support on AMD you need yet another socket

Erm, no. My AM3 DDR3 CPU drops right into an AM2/AM2+ DDR2 socket and works perfectly right off the bat with DDR2 memory.

We have had this discussion before: AMD only change their sockets when they absolutely have to, and even then they design the next-generation CPU to fit the old one, so you don't have to change sockets if you don't want to, or are a bit short on cash.

Do Sandy Bridge CPUs have DDR2 controllers in them?

There is no reason why AMD would not continue this approach with the PCI-E controller change.

@ Number 25, no blind assertion is ever justified. It's possibly inherent to deep-seated, agenda-driven issues, and that's how it should be treated.
 