20 years ago today....

That's true, but things moved on: Intel took the lead again and it all went pear-shaped for AMD for the next 15 years or so.


I had this same DFI P35 and E6600 setup, too! I ended up throwing in a Q6700 ES later down the line. I had an X1950XTX around the same time. Pretty sure I still have it in a box at my 'rents somewhere.

Conroe was an absolute monster and pretty much ruined the CPU market overnight. The Athlon 64's reign was far too short and hampered by Intel's dirty and illegal (albeit savvy) business tactics. It took way too long for AMD to become competitive again. We went 1 > 2 > 4 cores so rapidly, then Intel decided 4 cores would be enough and that was that.

After Bulldozer was the floppiest flop to ever flop, I gave up on the desktop market. I finally got back into it with the release of Ryzen back in 2017, and it's been fantastic to watch the perf climb year-on-year. Alas, the price has also climbed year-on-year.
 
I don't think we can blame Intel solely for AMD's Bulldozer era.

Like most things, it was a combination of factors. Intel's shenanigans hit AMD hard financially, but so did the 2008 crash, and Bulldozer itself was a mistake on AMD's part, one they didn't have the cash to put right quickly. The low performance was exacerbated by Windows not knowing what to do with it.
Bulldozer was a hybrid modular architecture: it had 8 integer cores, but arranged as four two-core modules, each pair sharing resources within its module.
The floating-point unit in each two-core module could run as two 128-bit units, one per core, or combine into a single 256-bit unit serving one core.
What was supposed to happen was that if you needed a lot of multitasking you had up to 8 physical cores, and if you needed single-threaded performance the module's shared resources would combine to give a much fatter core.

This never happened. Instead, Windows treated it like a normal 8-core CPU, so only half of each module's shared resources were ever used, and you never got the fat double core.

I don't know why this was never resolved over the life of that architecture. This was also a time when Microsoft had no interest in resolving the issues with their ancient DirectX, very much a "we don't have competition, so let's spend $0 and do nothing" attitude. That was something AMD had to resolve themselves as well, but that's another story.

Anyway, it's good to see AMD back on form.

Edit: also, I don't think CPUs are expensive. Entry-level £200 CPUs these days are pretty good, from both AMD and Intel, and you can get the latest and greatest monster CPUs for under £600.

The highest-end Gen 1 Zen was $500; a 7950X is more than 4X as fast as that for $100 more, with 6 years of inflation included. That's not bad...
 
I may be even older. I remember when DOS handled 640 KB of conventional RAM, and anything above this limit had to be configured as extended RAM (if memory serves me...). I remember it being a right pain to sort out properly.

I think I had 1 or 2 MB of RAM at the time, and a 286.
 
Like when we had to modify CONFIG.SYS and AUTOEXEC.BAT.
 
Exactly. It was so long ago now I don't remember the details; I just remember it was a right pain, and anything above 640 KB did not function like readily accessible RAM.
Those were the days. I remember having a game that couldn't run as I didn't have enough 'conventional memory', so MicroProse sent me a boot disk that would allocate more 'upper memory area' or something like that, haha.
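For anyone who never had to fight it: a memory-manager setup of that era looked something like this in CONFIG.SYS (driver names and paths are illustrative; every machine's setup differed):

```
REM Enable extended memory (XMS) above 1 MB
DEVICE=C:\DOS\HIMEM.SYS
REM Use EMM386 to map upper memory blocks (UMBs)
DEVICE=C:\DOS\EMM386.EXE RAM
REM Load DOS itself into the High Memory Area and allow UMBs
DOS=HIGH,UMB
REM Load drivers into upper memory instead of conventional memory
DEVICEHIGH=C:\DOS\ANSI.SYS
```

Pushing drivers into upper memory freed up conventional memory below 640 KB, which is essentially what those game boot disks did.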
 
Yes, I started with a CGA monitor and 128 KB of RAM, but I also remember the move to 64-bit computing, and at the time many were questioning whether it was really needed! The main selling point was the RAM: not infinite, but something like 16 million TB if I remember right. We won't need 128-bit for a while yet.
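That "16 million TB" figure checks out: a 64-bit address space covers 2^64 bytes. A quick sanity check:

```python
# A 64-bit address space spans 2^64 bytes.
address_space_bytes = 2 ** 64

# Express it in binary terabytes (1 TiB = 2^40 bytes).
tib = address_space_bytes // 2 ** 40

print(tib)  # 16777216, i.e. roughly 16.8 million TB (16 EiB)
```

So "16 million TB" is right to within rounding, and it is indeed why nobody is in a hurry to move to 128-bit addressing.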
 
Some serious memories coming back in this thread. Messing with pencils and demister repair pens, crazy-clocking Athlons and modifying DOS files. Those OC screenshots with Prime95 - it's like being 14 again :o

These days I don't even touch a PC outside of my working hours.
 

I didn't blame Intel solely for Bulldozer; I said Bulldozer was a flop. I did watch it all unfold and I'm aware of the nuances and intricate details. It was still a monumental flop.

My final statement wasn't about CPU pricing specifically - it was about desktop pricing. If you think the pricing is 'not bad', that's your opinion to draw. Alas, I don't share the same view.

And just to get your numbers straight: the 7950X launched at ~£750 and only dropped in price when the 7950X3D was on the horizon, which has since become the desktop flagship in that bracket. You can see it for yourself here: https://uk.camelcamelcamel.com/product/B0BBHD5D8Y
 
Wendell made yet another amazing video... love L1Techs.

He's a dying breed of tech journalist, in that he actually understands the stuff he talks about. Others frequently get things wrong with such confidence and authority, and never correct themselves, because they don't even know they're being thick.

He has a passion for the subject, one surpassed only by his knowledge of it. This is an example: how many of the others are even aware of AMD64, let alone understand its significance?
Well, how many others celebrated its 20th? Not one.
 
There was of course one small silver lining to Intel's stagnation of CPUs (pretty much after Sandy Bridge, IIRC): you didn't need to upgrade your CPU, as there wasn't really much improvement.
I only recently stopped using my Sandy Bridge-E 3930K.
 
The first system I built was an AMD Athlon 64 4000+, which was a rebadged FX-53. It was such a great chip; I remember that system running so well. Socket 939! I pored over the system build guide printout for hours before starting.
 

Yeah, I get the impression they're more like presenters/streamers turned techies, rather than a tech turned presenter/streamer, which is what he is.
 
Not old enough to remember this!

My first CPU was an AMD Phenom II X4 955, though. That thing was quite beastly for its time, and it was the first CPU I really tried my hand at overclocking. Back then it was rather easy: just increase the multiplier and voltage step by step.

AIOs weren't really a thing yet then, or were just entering the market. I still remember the jet-engine-style heatsinks with copper fins.
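For context on how simple multiplier-era overclocking was: core clock was just reference clock times multiplier, so each multiplier bump was a fixed, predictable step. A rough sketch (the 955's stock values here - 200 MHz reference, 16x multiplier - are the commonly cited figures, treat them as illustrative):

```python
# Multiplier-era overclocking arithmetic: core clock = reference clock * multiplier.
# Phenom II X4 955 stock: 200 MHz reference * 16x multiplier = 3200 MHz (3.2 GHz).
ref_mhz = 200

for multiplier in (16, 17, 18):  # stock, then two bumped steps
    core_mhz = ref_mhz * multiplier
    print(f"{multiplier}x -> {core_mhz} MHz")
```

Each multiplier step is a clean 200 MHz jump, which is why the "multiplier and voltage, step by step" routine was so approachable on unlocked Black Edition chips.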
 

Very true! It does make you wonder where the market could be today, though. We could have had HEDT-level core counts on regular desktops; who knows where it could have led. Intel reserved >4-core counts for HEDT, and I don't believe that would have happened if AMD had remained competitive.

This thread has been rather nostalgic.

Here's some random facts some may not know:

The original x86-64 release of Windows (Windows XP Professional x64 Edition) used something called "WoW64" - Windows on Windows 64 - to run 32-bit applications on a 64-bit OS. It was similar to the WOW system used in the move from 16-bit to 32-bit computing. This meant Microsoft could start pushing OSes towards 64-bit without relying on developers to rewrite or convert their software to 64-bit. There were exceptions, however: things like drivers had to be 64-bit. WoW64 is still with us - Windows 11 ships as 64-bit only, but it still relies on WoW64 to run 32-bit applications.

All Windows Server releases since 2008 R2 have shipped as 64-bit only at the OS level. Enterprise usage is where the largest compute benefits were, so the transition was pushed significantly harder there.
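As an aside, the distinction WoW64 papers over is simply the process's pointer width: a 32-bit binary sees 4-byte pointers even on a 64-bit OS. A minimal, portable way for a program to check its own bitness (illustrative sketch, not Windows-specific):

```python
import struct

# struct.calcsize("P") is the size of a native pointer in bytes:
# 4 in a 32-bit process, 8 in a 64-bit process.
bits = struct.calcsize("P") * 8
print(f"running as a {bits}-bit process")
```

On a 64-bit Windows machine, a 32-bit program run under WoW64 would report 32 here, because WoW64 presents it with a full 32-bit environment regardless of the host OS.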
 
I had a build with an Athlon XP 2100+.

Prior to that I had a Pentium 75, then a Pentium 500.

I stuck with AMD up until "Faildozer", then bought a Sandy Bridge i7 - best CPU I've ever had, to be fair. But I'm back to AMD now, with a 5800X build a few years ago, and literally today I put a nice little build together with a 5600G for downstairs.
 