One of the problems as I see it is that IT is quite a fast-moving industry. If you teach programming to school kids, then by the time they actually enter the job market 10 years later there's a fair chance you've stitched them up by teaching them the 'wrong' (i.e. legacy) language, or at the very least some of it will no longer be relevant. OK, some would carry it on through higher education and keep up to date, but chances are they could have done that anyway without a foundation at school.
Yes, many of the principles are likely to remain the same (relational databases, object-oriented languages etc.), but unlike mature subjects like Mathematics or English, the reality is that the world of software engineering will look different as time moves on.
I mean realistically, if I'd learnt computer programming at school based on a syllabus created pre-mainstream internet (no HTML, no XML etc.), it would be pretty useless nowadays. In fact, come to think of it, I did learn some programming at school: BASIC. Somehow I doubt my GOSUB skillz are going to land me a programming job.
I was chatting to a guy at work (aged ~40) about this a few months back. He said his daughter had no real concept of how computers work behind the scenes. When we were growing up you needed to use the command line to get things done, and you'd tinker around to get games working etc. Nowadays kids just click around GUIs, so they never really learn the basics of how software works underneath. I'm not saying that's a bad thing overall (making computers more accessible to people is good), just that it potentially stunts the old-school inquisitive learning process we used to have: you'd edit a config file, your PC would stop booting the OS properly, and you'd learn something from fixing it.
