Why couldn't they make today's best CPU ten years ago?

I'm hoping ray tracing and more advanced simulations in gaming will drive performance forward. I want Avatar-level graphics at 60 fps in the not-too-distant future. Will it ever happen, and if so, when?

You don't think that'll freak you out?

Seeing the guts and blood spilling from a person as they choke on their own fluids, begging you to save them, after you hit them with your car in GTA 29?
 
You don't think that'll freak you out?

Seeing the guts and blood spilling from a person as they choke on their own fluids, begging you to save them, after you hit them with your car in GTA 29?

It will make it all the more exciting. :p
 
I remember us chasing the megahertz back in the '90s and early '00s, when the top clock speed would be about 1.5x higher each year than the year before. E.g. 200MHz in 1996, 300MHz in 1997, 450MHz in 1998, and so on. I remember looking on OcUK in late 2003 and they were stocking 3.6GHz and 3.8GHz as the top two Intel processors. In 2004, the top speed went back down to 3.4GHz with Hyper-Threading introduced. In 2006, Hyper-Threading got displaced by dual-core. Then dual-core got replaced by Core 2 Duo™, then 4, 6, 8 cores in later years. For all of this time, the top speed has stayed around 3.4GHz, but I guess a dual-core at that speed would be like having a 6.8GHz CPU?

CPU MHz isn't everything. A single core of a modern CPU at 3GHz would murder one of those old 3.8GHz chips.
 
You don't think that'll freak you out?

Seeing the guts and blood spilling from a person as they choke on their own fluids, begging you to save them, after you hit them with your car in GTA 29?

I can only dream. :D

And no. Most of us are desensitised to virtual violence at this point. We'll probably be raging about how the guts were low res.
 
The Voodoo 5 5500 graphics card... I remember when my dad bought that for his rig back in 1999/2000, I think it was, and he paid around £450 for it from a well-known computer shop. Crazy. From what I remember, at the time that card was the bee's knees! Well, it was when it didn't cause his computer to lock up, because the card kept sliding out of the AGP port slightly as it was too long!

In that same rig, I think he was running the slot version of the Intel Pentium III, which, if I remember right, ran at around 600MHz. To this day I've still got that very rig in my bedroom cupboard. I last powered it up about six months ago and everything still works, even the old Seagate 12GB HDD, which was manufactured back in 1998!

Liam
 
I can only dream. :D

And no. Most of us are desensitised to virtual violence at this point. We'll probably be raging about how the guts were low res.

That's because virtual violence looks very virtual. He's talking about Avatar-plus levels of detail.

Think of Tron: Legacy and the fake Jeff Bridges. At those levels it wouldn't look virtual any more, it would look much more like video.
 
These days hard disks are the biggest bottleneck in a computer system, though that's being solved by the introduction of solid state drives. They're still very expensive for a lot of people, though, and not really viable for bulk storage.

Your average hard disk has a sequential write speed of 50-60MB/s (laptop drives are even slower). A time will come when even your internet connection will be bottlenecked by a hard disk.

unless they get rid of the hard disk altogether.... ;)
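To put rough numbers on that claim: line speeds are quoted in megabits per second while disk throughput is quoted in megabytes per second, so a connection has to hit roughly eight times the disk's MB/s figure before the disk becomes the bottleneck. A quick sketch, using assumed illustrative figures (the 55MB/s write speed and the line speeds are made-up round numbers, not measurements):

```python
# Back-of-the-envelope check: at what line speed does a download outrun the disk?
# All figures are illustrative assumptions, not measurements.

def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert a line speed in megabits/s to megabytes/s (8 bits per byte)."""
    return mbps / 8

hdd_write_mb_s = 55  # assumed average HDD sequential write speed, MB/s

for line_mbps in (50, 100, 500, 1000):
    down_mb_s = mbps_to_mb_per_s(line_mbps)
    bottleneck = "disk" if down_mb_s > hdd_write_mb_s else "internet"
    print(f"{line_mbps:>4} Mbit/s line = {down_mb_s:6.2f} MB/s -> bottleneck: {bottleneck}")
```

On these assumptions, a 100Mbit/s connection (12.5MB/s) is still well under the disk's write speed, but a gigabit line (125MB/s) would indeed be held back by the disk.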
 
A time will come when even your internet connection will be bottlenecked by a hard disk.

Lol asim. No, that time won't come, because we'll have a different technology by then. Remember when program loading times were bottlenecked by tapes or floppy disks? Do we still have that problem now?
 
The "slowing" of Moore's law isn't really a contradiction of it.

Silicon is coming to the end of its useful life in computing. It still has some life in it, and new ways are coming along to make it more useful, but in the long run I fully expect nano-carbon to replace it. It's just far too expensive to use right now for any commercial gain.

Thing is, there's also the prospect of a complete change in how computing works in the first place with "quantum" computing, which I'm not sure about at the moment. Every few months a piece of the puzzle seems to be solved, but it will make computing more difficult to understand, at least beyond a point, considering the physics involved.

Going even further we have photonic computing, which will be a truly wonderful thing.

They've talked about quantum computing for ages. I remember lying in bed with my Nokia 3410, browsing BBC Sci-Tech on the free internet allowance I had god knows how many years ago, when they mentioned computers without wires.
 
That's because virtual violence looks very virtual. He's talking about Avatar-plus levels of detail.

Think of Tron: Legacy and the fake Jeff Bridges. At those levels it wouldn't look virtual any more, it would look much more like video.

The gore in future video games might, at best, match what we currently see in films. Is gore in films really an issue? If not, then why would it be for video games?
 
The "slowing" of Moore's law isn't really a contradiction of it.

Silicon is coming to the end of its useful life in computing. It still has some life in it, and new ways are coming along to make it more useful, but in the long run I fully expect nano-carbon to replace it. It's just far too expensive to use right now for any commercial gain.

Thing is, there's also the prospect of a complete change in how computing works in the first place with "quantum" computing, which I'm not sure about at the moment. Every few months a piece of the puzzle seems to be solved, but it will make computing more difficult to understand, at least beyond a point, considering the physics involved.

Going even further we have photonic computing, which will be a truly wonderful thing.


There's also work being done on silicon-germanium hybrids. Germanium is a better semiconductor than silicon, but it's gloriously expensive, so the idea is to use it in a silicon mix to reduce costs while retaining performance. In theory you should still be able to use your existing silicon fabs as well.
 
Surprised you'd even make a topic over something so stupid. Obviously technology advances. Why didn't they make guns 2000 years ago instead of using spears?
 
Who's to say they couldn't make them 10 years ago? But why jump straight from a DX2-50 to an i5 when you can make so much more money through controlled, stepped releases?

It's the same with GPUs. By releasing new graphics cards with 5-10% better performance every year for four years, they make more money than by releasing one card with a 40% improvement straight away.
 
Who's to say they couldn't make them 10 years ago? But why jump straight from a DX2-50 to an i5 when you can make so much more money through controlled, stepped releases?

It's the same with GPUs. By releasing new graphics cards with 5-10% better performance every year for four years, they make more money than by releasing one card with a 40% improvement straight away.

Except they wouldn't, because they have to deal with competitors. If they'd released an i5 to compete with the DX2-50's rivals, it would have given them complete market dominance. Then they could do incremental upgrades of only 1% a year, and no one would be able to catch up and threaten their inflated prices and crap upgrades.


Not to mention that the processes to make these things the way we do now, at the volume we do now, didn't exist. Heck, I wouldn't be surprised if entire compounds used now had, back then, never even been thought of, let alone made.
 