IT in 10 years?

What do you think? I find it quite hard to imagine. Forget hardware specs: what will it do for us?

I can't see it changing, it's still going to provide and store information.

What do you think?
 
I reckon computers will take over the human race and make us all their slaves :p


Seriously?

I think there will obviously be new technologies evolving; IT is moving so fast at the moment.
 
Graphics cards will have really ridiculous model names (as if some aren't bad enough already) and will provide almost realistic graphics.

Quad-core (or more) 10GHz processors

Etc, etc. for everything else; in 10 years' time computers will be used for everything, even more so than now, imo.
 
R34P3R said:
Etc, etc. for everything else; in 10 years' time computers will be used for everything, even more so than now, imo.

They already are used for everything. Think of a modern business building: computers control the doors, lifts, tills, lighting, aircon.

It doesn't seem like there's anything left to do
 
India will become the dominant IT market in the world. And then they will become the "new" threat to the US.

But, as to IT. Quad cores, quad CPUs, OSs that actually use AA and AF.

Anything really :D
 
R34P3R said:
Quad-core (or more) 10GHz processors

I don't see clock speed bumping up that far, in all honesty; the boundaries of what we can do with SOI are starting to become more apparent.

As for number of cores, Intel's roadmap sees us firmly at 32 cores by then.
 
R34P3R said:
Graphics cards will have really ridiculous model names (as if some aren't bad enough already) and will provide almost realistic graphics.

Quad-core (or more) 10GHz processors

Etc, etc. for everything else; in 10 years' time computers will be used for everything, even more so than now, imo.

Processor speed won't increase markedly; we *might* see 5GHz, but not 10.

More cores is the way forwards, with greater emphasis on concurrency.
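
To illustrate the point about concurrency: once the speed-up comes from more cores rather than higher clocks, software has to split its work up explicitly. A minimal sketch in Python, using the standard-library process pool; the workload (summing squares in chunks) is a made-up stand-in for real per-core work:

```python
# Minimal sketch: spreading an embarrassingly parallel workload
# across cores with a process pool. The chunked sum-of-squares
# workload is an invented example, not anything from this thread.
from concurrent.futures import ProcessPoolExecutor

def sum_squares(bounds):
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def parallel_sum_squares(limit, chunks=4):
    step = limit // chunks
    ranges = [(i * step, (i + 1) * step) for i in range(chunks)]
    ranges[-1] = (ranges[-1][0], limit)  # cover any remainder
    with ProcessPoolExecutor() as pool:
        # Each chunk runs in its own process, so separate cores
        # can work on separate chunks at the same time.
        return sum(pool.map(sum_squares, ranges))
```

The division of labour is the whole trick: the per-chunk function stays sequential, and the pool handles distributing chunks across cores.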

We'll see more convergence, with set-top boxes and PCs merging. Content aggregators will become much more powerful, with iTunes-like services providing a wide range of DRM-protected content on demand through a wide variety of portable and fixed devices, mostly delivered over WiFi on the move, or 100Mb connections to the home.

Entertainment will become 'richer', with enhanced metadata improving our ability (or, more accurately, our systems' ability on our behalf) to perform collaborative filtering to sift and sort content for our perusal on demand.
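
At its simplest, collaborative filtering just recommends things that similar users liked. A toy sketch with invented users, films, and ratings (user-based, cosine similarity):

```python
# Toy user-based collaborative filtering. All names and ratings
# are invented for illustration.
from math import sqrt

ratings = {
    "alice": {"film_a": 5, "film_b": 3, "film_c": 4},
    "bob":   {"film_a": 5, "film_b": 3, "film_d": 5},
    "carol": {"film_b": 1, "film_c": 5, "film_d": 2},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(x * x for x in u.values())) *
                  sqrt(sum(x * x for x in v.values())))

def recommend(user):
    """Suggest items the most similar other user rated but this user hasn't."""
    others = [(cosine(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)
    return sorted(set(ratings[nearest]) - set(ratings[user]))
```

Real systems add weighting, normalisation, and far more data, but the sift-and-sort idea is just this: measure similarity, then borrow the neighbour's taste.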

In effect the next decade will be one of 'joining up' services that are, by and large, already present.
 
( |-| |2 ][ $ said:
They already are used for everything. Think of a modern business building: computers control the doors, lifts, tills, lighting, aircon.

It doesn't seem like there's anything left to do

I mean they will be used instead of humans for a lot more things, like, I don't know, maybe driving buses or something. In 10 years' time it will advance a lot.
 
Quantum computing will be in the mainstream, and all content will be delivered by little applets... on portable devices, like the Star Trek gizmos.
 
When I was in school I used to think that, even though I had an ability with the subject, it wouldn't be career-worthy: everyone would come to own a PC and know how to use one, and there would be a lot less need for people who fix them.

I now have a career in IT, and it's more along the lines of: everyone has a PC, and there are lots of people who think they know what they're doing and break things as a result. Furthermore, things like moving from NT > 2000 > XP haven't simplified things at all, and Vista looks to add another layer of complexity. Admittedly the basics should be easier and more user-friendly, but beyond that I think there will still be plenty for the technical folk to deal with.
 
Morlan said:
Uhhh: "Gordon Moore (co-founder of Intel) "

Uhhh: read the "law". It's to do with processor complexity, transistor count, and work done per second per $1K. Nothing inherently to do with raw clock speed at all ;)
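
For what it's worth, the commonly quoted form is that transistor counts double roughly every two years, so over a decade that compounds to 2^5 = 32x. A quick sanity check (the 300-million starting figure is just a rough mid-2000s high-end CPU ballpark, not from this thread):

```python
# Moore's law as commonly stated: transistor count doubles roughly
# every two years. This projects a count forward in time.
def projected_transistors(current, years, doubling_period=2):
    return current * 2 ** (years / doubling_period)

# Ten years at a two-year doubling period is five doublings: 32x.
# e.g. a rough ~300M-transistor chip projected a decade out:
projected_transistors(300e6, 10)  # 32 times the starting count
```

Note the law says nothing about how those transistors are spent: higher clocks, more cores, or bigger caches are all consistent with it.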
 
Think how far it's moved in the last ten years. Amazing eh?

Well no, not really. We were running Windows 95 and XP isn't that different really. Computers were plenty fast enough for the software we had and - if you leave out the games market - did most of what we do today. We typed documents and filled in spreadsheets, we created databases and searched them.

The big changes since then have been screen sizes and the Internet. Games have improved, but I don't play games these days; Doom Deathmatch was an amazing experience, and the modern ones have more detail, but are they really that different?

So what will change over the next ten years?

More voice control (except in overcrowded offices!), web pages that interact with the user, information overload from RSS & widgets, no great killer apps that you just must have.

On the other hand, we'll also have DRM, software that tries to control what we can look at, mass refusal to comply, and cracks for everything that tries to stop us. I truly hope for a future IT world where the masses get to invoke anarchy against the corporations and the Internet becomes the true voice of the people. But who am I kidding? The sheeple will bow down to the megacorps, and we'll all rush out to buy Vista and let it dictate what we can and can't do.
 
I think there will be increases in parallelism, bio-informatics and possibly quantum computing.
It's hard to say if there will be any ground-breaking changes; who knows.

DRM technology will become more prevalent, and more effort will be put into removing it.

I also think genetic algorithms may be used to solve more complex problems.
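
For anyone unfamiliar, a genetic algorithm just evolves a population of candidate solutions through selection, crossover, and mutation. A minimal sketch on a toy problem (maximising the number of 1-bits in a string; the problem, population size, and rates are all arbitrary choices for illustration):

```python
# Minimal genetic algorithm: evolve bit-strings toward all ones.
# Problem and parameters are toy choices, not from this thread.
import random

def evolve(bits=20, pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)  # count of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, bits)    # single-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(bits)] ^= 1  # point mutation: flip one bit
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the fittest half survives unchanged each generation, the best fitness never decreases; the interesting engineering in real uses is the fitness function, not the loop.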

There will also be an increase in people using bloody annoying social networking sites :p
 
// Disclaimer: I may be talking rubbish

If my understanding is correct, a superconducting processor would have zero resistance in its wiring and therefore be vastly more efficient, allowing incredibly high clock speeds (in the region of hundreds of GHz). If within the next 10 years a superconductor is discovered that superconducts at room temperature and isn't too expensive, hundreds of GHz could become a reality.

And even if one were discovered that functions only *near* room temperature, they could always stick micro-refrigerators on the CPUs (they wouldn't need much capacity, because a superconducting CPU would output very little heat: the cooling isn't for heat removal as such, more to hold the temperature down).

// End disclaimer

The other thing is, I would like to see Linux etc. become easier to use, and Windows-compatible software/equivalents being made (games, for example).

null :)
 