Where is computing going?

Tablets / mobile devices are the way it's going.

This is for the near future.

In the long term, miniaturised beyond belief. The power source will be tiny or blood-driven and integrated, in some sort of HUD: battery and contact lens, or blood-powered and implanted.
I'm thinking contact lens; I don't think the public likes the idea of implants.

3D printing for manufacturing.


Tablets like the iPad will not replace desktops and laptops because of their small displays and lack of integrated input devices.

They do have keyboard support. Companies already use laptops and just dock them in the office to a keyboard/mouse and screen. A tablet can do that. Windows 8 will move this along massively.

The company has also already realised the benefits of mobile computing: by around July all 13,000 maintenance staff will have iPhones, with iPads to be rolled out after that (not sure who will get these; probably one per team, plus communal ones), and a new app every 90 days.

Why waste money on paperwork? An electrical diagram changes and you have to reprint tens of thousands of books, and people constantly have outdated info. Now it'll be computerised: you can pull the info up on site as well as in the office, and it'll be up to date.
Need to take a photo? It's all integrated. Need to log something or know where you are? You have a GPS app. An electronic trail is much better than a paper trail, and when forms are designed to work from a touchscreen or keyboard they are easy to fill in on such a device.
 
This is for the near future.

In the long term, miniaturised beyond belief. The power source will be tiny or blood-driven and integrated, in some sort of HUD: battery and contact lens, or blood-powered and implanted.
I'm thinking contact lens; I don't think the public likes the idea of implants.

3D printing for manufacturing.

He's speaking riddles :eek:
 
I don't know whether you mean at a higher level in terms of "gadgets" or at a more fundamental level. For the higher level, I imagine networked devices sharing workspaces, data and displays in a seamless manner, and add to that surface computing. Ubiquitous computing will pervade the home and workplace. Imagine sliding a document off your monitor and on to your tablet, for example, while you move from your desk to the couch, or some such thing. I think motion-sensing tech like Kinect will play a big role, as will new types of projection (such as some of the 3D projection work Microsoft Research is doing) where images are projected individually to each person's eye. The display also changes based on what you're looking at.

That's mostly speculation. I can say a few more concrete things about what will happen on the microelectronics end. Silicon chips become increasingly leaky due to quantum tunnelling as transistor feature sizes decrease. Sooner or later the use of quantum annealing and other techniques that exploit tunnelling will enter the mainstream. This will result in some interesting speed-ups on optimization problems. One effect for the consumer is that this will enable much more intelligent systems to be built. Once the application of quantum tunnelling is mastered there will be a leap forward in performance on many types of optimization problems (applications include, e.g., various types of AI). Tied into ubiquitous computing, we will see interesting advances in the "intelligent computing environment". By around 2015, silicon will have become increasingly difficult to work with because conventional microelectronics will break down. We will have entered the commercial phase of true nanoelectronics -- something that is principally restricted to academic research at the moment.
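
To make the optimisation point concrete: quantum annealers target problems in QUBO form (minimise x^T Q x over binary vectors x). Below is a rough classical analogue, simulated annealing on a small random instance; this is purely my sketch, and every parameter (problem size, temperature, cooling rate) is invented for illustration.

[code]
import math
import random

random.seed(0)
n = 12
# Random symmetric QUBO matrix Q; the "energy" to minimise is x^T Q x
# over bit vectors x in {0,1}^n.
Q = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
for i in range(n):
    for j in range(i):
        Q[i][j] = Q[j][i]

def energy(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

x = [random.randint(0, 1) for _ in range(n)]
cur_e = energy(x)
best, best_e = x[:], cur_e
T = 2.0                                  # starting "temperature"
for step in range(5000):
    i = random.randrange(n)
    x[i] ^= 1                            # propose flipping one bit
    new_e = energy(x)
    if new_e <= cur_e or random.random() < math.exp((cur_e - new_e) / T):
        cur_e = new_e                    # accept the move (Metropolis rule)
        if cur_e < best_e:
            best, best_e = x[:], cur_e
    else:
        x[i] ^= 1                        # reject: flip the bit back
    T *= 0.999                           # cool slowly
print("best energy found:", round(best_e, 3))
[/code]

A quantum annealer explores the same kind of energy landscape but can tunnel through barriers rather than hopping over them thermally, which is where the hoped-for speed-ups on hard instances would come from.
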
Various labs around the world have been funding university research on single-molecule organic transistors. This will likely reach maturity and we will see organic transistors enter the commercial sphere. Graphene may well be the next great substance, but whatever the case... the Age of Silicon will have ended.

The Age of Carbon will have begun. These will not be true quantum computers: while having to grapple with quantum phenomena at this scale, and indeed turning some of them to computational/algorithmic advantage, most advances will be tied to neutralising quantum effects rather than leveraging them.
It's not clear yet when the breakthrough to true quantum computation will come (i.e. using quantum entanglement to implement Shor's algorithm, Grover's algorithm and yet-undiscovered mathematical algorithms). Work on graphene quantum-dot qubits, for example, shows some promise. There are numerous other approaches, mostly restricted to university laboratories working on a few qubits. However, the breakthrough, when it comes, will come in the form of leveraging entanglement to perform computation on a scale that is viable for commercial applications. This will have very interesting consequences.

Hard to see, the future is... because the technologies will be highly disruptive by 2015/2016 and beyond. For example, entanglement-based quantum devices will negate all of asymmetric-key cryptography and thus be a devastating blow to the security industry, which underpins virtually everything... However, I'm convinced we won't see entanglement used in devices this decade.
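
To see why that's such a threat: an RSA public key is a modulus n = p*q, and the private key is easy to derive once n is factored; Shor's algorithm factors in polynomial time on a quantum computer. The toy sketch below is mine, with deliberately tiny textbook numbers and brute force standing in for Shor, but it shows the private key falling straight out of the factors.

[code]
def egcd(a, b):
    # Extended Euclid: returns (g, s, t) with a*s + b*t == g == gcd(a, b)
    if b == 0:
        return a, 1, 0
    g, s, t = egcd(b, a % b)
    return g, t, s - (a // b) * t

p, q, e = 61, 53, 17             # textbook-sized RSA parameters
n = p * q                        # public modulus (3233)
c = pow(42, e, n)                # "intercepted" encryption of message 42

# The attacker sees only (n, e, c). Factoring n is the hard step --
# trivial brute force here, polynomial-time via Shor on a quantum machine.
f = next(k for k in range(2, n) if n % k == 0)
phi = (f - 1) * (n // f - 1)
d = egcd(e, phi)[1] % phi        # private exponent recovered from the factors
print(pow(c, d, n))              # prints 42: the plaintext is recovered
[/code]

Symmetric ciphers survive much better (Grover's algorithm only halves their effective key length), which is why the damage is specifically to public-key schemes like RSA and Diffie-Hellman.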
 
Cloud computing.

I go along with this. I really see no reason at all why average home users should have anything remotely complicated hardware-wise, nor have to carry their computational needs around in any kind of mobile setup.

All software, processing, storage, backups and maintenance done on a server. We all own a variety of user interface devices that connect to that, each optimised for the task and/or environment at hand.
E.g. your TV is one such device that will simply connect to the internet and allow you to watch movies and TV, stream your photos, and play games. A radio would let you listen to music; in your car you can access your music (maybe movies), address book, etc. You may have a desktop-type setup which is merely a nice monitor, keyboard and mouse for usual computer work (writing docs, surfing, gaming). You will have a smartphone which will really just be an interface device to the cloud storage and won't do anything smart in itself.
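
In software terms, such an "interface device" reduces to a thin client that holds no state and does no real processing; every action is a round trip to the server. A minimal sketch, assuming a hypothetical cloud endpoint and JSON payloads (nothing here describes a real service):

[code]
import json
import urllib.request

CLOUD = "https://cloud.example.com/api"   # hypothetical server doing all the work

def fetch(resource):
    # Pull state (documents, playlists, address book, ...) from the server.
    with urllib.request.urlopen(f"{CLOUD}/{resource}") as resp:
        return json.load(resp)

def push(resource, payload):
    # Send user input back; storage, backup and processing happen remotely.
    req = urllib.request.Request(
        f"{CLOUD}/{resource}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# A TV, car stereo or phone would differ only in which resources it renders:
# playlist = fetch("music/playlists/driving")
# push("documents/notes", {"text": "draft written on the sofa"})
[/code]

The thin part is the point: lose the device and you've lost nothing but a screen.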

The technology is there; most large IT companies, e.g. Google, are pushing in that direction. The limiting factors ATM are network speed, reliability, infrastructure and cost.

Utility computing is the only sensible future.
 
Wrong. A number of tablets now let you connect keyboards and gamepads... The point about screens is correct, although again I think many are starting to adapt so they can output to external screens.


They do have keyboard support. Companies already use laptops and just dock them in the office to a keyboard/mouse and screen. A tablet can do that. Windows 8 will move this along massively.


I specifically said integrated. Who wants to carry a dozen input peripherals with them when they can just use a laptop? It's not even practical to use a keyboard with a tablet at home.
 
I specifically said integrated. Who wants to carry a dozen input peripherals with them when they can just use a laptop? It's not even practical to use a keyboard with a tablet at home.

So the Asus Transformer looks so different to a laptop?
It's impossible for you to dock a computer and use a screen, keyboard and mouse? The same as already happens in many companies with laptops.

Got a gripe with tablets by any chance?
 
My FD was looking to replace his laptop earlier this year and considered a tablet solution, but they just aren't quite there yet. I expect in another year or two they might be, but I can't see them replacing the desktop for the normal low- to mid-level worker. At the moment you probably get people using a tablet alongside their laptop/desktop.
 
So the Asus Transformer looks so different to a laptop?

Did you read what I wrote? I said like the iPad, and specifically not like the older-style dual laptop/tablet, e.g. the Transformer.

It's impossible for you to dock a computer and use a screen, keyboard and mouse? The same as already happens in many companies with laptops.

And if I want to sit on the sofa?
 
The problem with cloud computing is that most of the world doesn't have access to reliable, fast broadband.

It's going to be a long, long time until it's a sensible solution. I'm not saying it's not one future of computing... but it won't be mainstream for many, many years.
 
The "Desktop! tower will be reserved for the enthusiast and the gamer in the future as home computers migrate to thin client sized machines, they already have in fact, I have an ASRock Wii sized HTPC sitting below the TV in the living room which serves as a fully usable desktop PC but is mainly used for watching movies and showing people stuff online on the big TV. It runs Win7 Pro with fully functional graphics capabilities as well.

Then there's portable computing, where smartphones sporting high-resolution screens and ultra-slim designs, coupled with decent battery life, allow people to connect to their home networks over the air and do their computing wherever they are.

That too is already here. I've ditched the laptop and netbook for a year now in favour of my Android phone, which does the same things I used to do on those but with far more convenience, and in many ways offers more scope as well.

These things get faster and smaller every year and will continue to do so right into the quantum computing age that should kick off in 10-20 years.
 
Did you read what I wrote? I said like the iPad, and specifically not like the older-style dual laptop/tablet, e.g. the Transformer.
So it's already perfectly possible, and you're hanging on to current/past tech in a thread about the future. No, I don't think you have a point at all.

And if I want to sit on the sofa?

You use it as a tablet, a laptop, or a laptop that streams the screen to a TV/projector.
 
Web browsing, music, video and general use are where a tablet shines. I may even get one soon, as my laptop is very old and the battery has gone. For casual use it's ideal.

For anything serious and demanding, a desktop is still going to be the only real option. Trying to do video editing, game development, rendering or editing a PSD with a hundred layers would be a nightmare on a mobile device. These things will always remain something you do at a desk or otherwise stationary.
 