Looking back, IPC, Intel Pentium 4 vs AMD Athlon XP

Caporegime (OP) | Joined 17 Mar 2012 | Posts 47,668 | Location ARC-L1, Stanton System
Would buy you a dual-core G4600 today, give or take. Nothing quite like it in the AM4 socket yet.

The G4600 is £60, and for YouTube, browsing, or a personal server for Arma III and/or Insurgency etc... yes, actually it's a good chip for that price.

For an entry-level gaming CPU? No, its iGPU is useless in that sense, so one would have to get a discrete GPU like a GT 1030 - a GDDR5 GT 1030, mind you, as the DDR4 GT 1030 is also useless for gaming - and that's another £70 on top of the £60 G4600.

So for an entry-level gaming rig the Ryzen 3 2200G, with its 4 real overclockable cores, is better than the G4600, and its iGPU is just as good as the GDDR5 GT 1030. It's £100.
 
Caporegime | Joined 26 Dec 2003 | Posts 25,666
I am going by memory too - a lot of my mates were gamers, and the Pentium 4 was a laughing stock until the Northwood series,

I've never stated otherwise; my point was that for around a year nothing from AMD could touch an overclocked Northwood 1.6A or 1.8A, and an overclocked 2.4A released in April 2002 was probably better than most later Prescotts.

The whole premise of this thread, that the Athlon 64 had 60%+ better IPC than the P4, is based on 64-bit benchmarks between the Athlon 64, which was built from the ground up for 64-bit, and the P4 Prescott, which IIRC had 64-bit support tacked on. It's all well and good sitting here in 2018 saying that the Athlon 64 had 60% faster IPC using 64-bit benchmarks as evidence, but back then there was hardly any 64-bit software and the P4's 32-bit performance was just fine. The Pentium 4 only started to trail with Prescott, when Intel lengthened the pipeline, and because they ran so hot the Athlon 64 was just a better choice.
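
To illustrate where a figure like that comes from (hypothetical numbers for illustration, not taken from any specific review): if an Athlon 64 at 2.0GHz matches a Pentium 4 at 3.2GHz in a given benchmark, then per-clock performance scales as 3.2 / 2.0 = 1.6, i.e. roughly 60% higher IPC for the Athlon 64 in that test. Pick a 64-bit benchmark instead of a 32-bit one and that ratio can swing dramatically, which is exactly the point.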

and as the reviews showed even the initial release was beaten by the XP 2100+, and it took the later Northwood A P4 chips over 2.4GHz to start getting anywhere, and OFC the Northwood B.

The P4 Northwood 1.6A and 1.8A, released in January 2002, both easily overclocked above 2.5GHz and 2.7GHz respectively, with a 150MHz+ FSB. The Northwood 'B' was not only expensive, but its 133MHz FSB out of the box was more of a hindrance to overclocking than anything else.
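
(To put rough numbers on that: those chips were multiplier-locked, 16x and 18x respectively on a 100MHz (400MT/s quad-pumped) bus, so overclocking was done by raising the FSB - around 160MHz gives 16 × 160 ≈ 2.56GHz on the 1.6A, and 150MHz gives 18 × 150 = 2.7GHz on the 1.8A, which is where those figures come from.)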
 
Soldato | Joined 30 Jan 2009 | Posts 17,189 | Location Aquilonem Londinensi
The G4600 is £60, and for YouTube, browsing, or a personal server for Arma III and/or Insurgency etc... yes, actually it's a good chip for that price.

For an entry-level gaming CPU? No, its iGPU is useless in that sense, so one would have to get a discrete GPU like a GT 1030 - a GDDR5 GT 1030, mind you, as the DDR4 GT 1030 is also useless for gaming - and that's another £70 on top of the £60 G4600.

So for an entry-level gaming rig the Ryzen 3 2200G, with its 4 real overclockable cores, is better than the G4600, and its iGPU is just as good as the GDDR5 GT 1030. It's £100.

I was noting the exceptional value of the Tbred 1700+ back in the day: £40 then, ~£60 now. An overclockable low-end chip able to mix with the top-end parts. I never mentioned gaming graphics etc... Sometimes it feels like the CPU/GPU sections are full of chat bots.
 
Soldato | Joined 4 Jan 2009 | Posts 2,682 | Location Derby
The AMD K6-II was my first processor ever; I went to an Athlon 500 and then an Athlon 64, which was unfortunately my last AMD chip.

Ryzen is a big jump in the right direction, which may make me ditch Intel.
 
Associate | Joined 27 Apr 2007 | Posts 963
It's OK saying that, but it's not as if this was AMD's first CPU; there were obviously reasons they made the choices they did when designing the CPU. Whether that was a gamble on what direction software development would go or not, I've no idea.

You appear to be rambling now!
If it had been particularly good in heavily multi-threaded workloads then it might have been given some leeway, but it wasn't, in spite of its prodigious hunger for watts. It was a dog:
[attached benchmark chart: 43697.png]
 
Caporegime | Joined 18 Oct 2002 | Posts 39,323 | Location Ireland
You appear to be rambling now!
If it had been particularly good in heavily multi-threaded workloads then it might have been given some leeway, but it wasn't, in spite of its prodigious hunger for watts. It was a dog:
[attached benchmark chart: 43697.png]

As I said, the decisions taken at the design stage must have had some validity to them for it to get approval to go ahead. What happened between then and the time it came out is anyone's guess. It was basically slightly slower than Phenom II at the time, which was a head scratcher come review time.
 
Soldato | Joined 9 Nov 2009 | Posts 24,846 | Location Planet Earth
You appear to be rambling now!
If it had been particularly good in heavily multi-threaded workloads then it might have been given some leeway, but it wasn't, in spite of its prodigious hunger for watts. It was a dog:
[attached benchmark chart: 43697.png]

The CPU was actually meant to be released earlier to compete with the previous generation of Intel CPUs IIRC, as it was meant to be released first to priority customers in late 2010, but the 32nm process had issues - you could see that with Llano, which was clocked quite low and had a huge number of APUs with defective GPUs.

However, by the time they got it out Intel had released SB, and the rest was history.

Even Ryzen was delayed by six months too!!
 
Caporegime | Joined 18 Oct 2002 | Posts 39,323 | Location Ireland

The CPU was actually meant to be released earlier to compete with the previous generation of Intel CPUs IIRC, as it was meant to be released first to priority customers in late 2010, but the 32nm process had issues - you could see that with Llano, which was clocked quite low and had a huge number of APUs with defective GPUs.

However, by the time they got it out Intel had released SB, and the rest was history.

Even Ryzen was delayed by six months too!!

Yup, I had a Sandy Bridge setup myself from 2011 to last year, a 2600K. Amazing little chip. People were pouring scorn on AMD on the release of Bulldozer; little did we know that was the start of several years of stagnation on the CPU side.
 
Soldato | Joined 24 Feb 2003 | Posts 4,203 | Location Stourport-On-Severn
People were pouring scorn on AMD on the release of Bulldozer; little did we know that was the start of several years of stagnation on the CPU side.

Actually Gerard, some of us did. At the time I used to get CPUs to test from both AMD and Intel. It was completely obvious to me that Bulldozer was a complete shambles... I never did get another CPU from AMD lol after giving them my feedback. Intel, though, were always forthcoming with pre-release samples :D
 
Associate | Joined 27 Apr 2007 | Posts 963
The CPU was actually meant to be released earlier to compete with the previous generation of Intel CPUs IIRC, as it was meant to be released first to priority customers in late 2010, but the 32nm process had issues - you could see that with Llano, which was clocked quite low and had a huge number of APUs with defective GPUs.
However, by the time they got it out Intel had released SB, and the rest was history.

Intel were meant to release 10nm CPUs in 2016 or so, but they didn't, so it has no relevance to the actual marketplace. Likewise for BD, and I am not going to defend either product, which in Intel's case is still a virtual one.
 
Soldato | Joined 9 Nov 2009 | Posts 24,846 | Location Planet Earth
Intel were meant to release 10nm CPUs in 2016 or so, but they didn't, so it has no relevance to the actual marketplace. Likewise for BD, and I am not going to defend either product, which in Intel's case is still a virtual one.

Who said I was defending it?? I bought SB over it and Llano, and this is a discussion forum, so guess what, we can also talk about how certain things happened, even the failures. Or are channels like Techmoan meant to stop talking about the loads of audiovisual formats which didn't quite work out? It's a great channel.

It's like with this thread - the P4 happened the way it did because Intel expected clockspeeds to rise much higher than they actually did.

AMD experimented with cut-down cores and shared resources in an attempt at a compact modular core, to cut down on transistor numbers and make it easier to update and manufacture, and expected much higher clockspeeds than materialised. IIRC, one of the reasons they wanted to do this was that Intel was jumping to higher-density nodes before them, so they were attempting to keep CPU core size down, and their cache tech tended to be less dense too (which took up more die area).

They tried making it on 45nm, which didn't work, and then expected 32nm SOI to increase clockspeeds enough, but that didn't work either. It basically hit the same issue as the P4, and guess what?? The P4 was a response to the P3 not scaling high enough in clockspeed, and BD was a response to CPUs like the K8 and K10 hitting the same clock wall, except it arrived late and realistically didn't clock much higher than a Phenom II X6.

When BD failed AMD went back to the drawing board and decided to go with a more traditional core design; however, again like with Bulldozer, AMD wanted to make a compact modular core that would be easier to update and manufacture. Hence the whole CCX thing and using IF to connect core complexes together.

So they were facing similar problems, but it's interesting to see how one strategy worked and one didn't.
 
Soldato | Joined 19 Nov 2009 | Posts 4,387 | Location Baa
Interesting thread. I had a few Northwood CPUs (a 1.8 @ 2.7, various 2.4s @ <4GHz, and an unlocked 2.8 ES @ 3.8 on a 250+ FSB) and a Prescott CPU (a 3.0E IIRC, @ 3.9GHz).

I extended the life of my old s478 mobo by swapping the P4 out for an Asus CT479 socket adaptor and a Dothan CPU. That thing was amazing. Overclocked from 1.6 to 2.4 with very modest cooling (compared to the Prescott), it trounced the P4 in gaming. It was a sign of things to come (Conroe).
 