Why aren't young people taking computer courses at school?

Soldato
Joined
21 Jan 2010
Posts
22,215
Similar experience to you. I dropped out of A levels (lol, didn't read the syllabus - it was super hard) and signed up for a Certificate (because they didn't think I was studious enough to go for the Diploma). The experience was exactly as you stated and I fought to get onto the Diploma course, which was absolutely night and day, but you're right, the criteria were easy to play to. Luckily a 'triple D' was enough to get me through the door at a decent Uni. I have been rejected for roles, though, for not having A levels despite having an equivalent and a degree.
 
Caporegime
Joined
29 Jan 2008
Posts
58,912
I'm in my 40s these days and when I was at school we didn't have the opportunity to take IT subjects at GCSE level. We had computer classes, but never a recognised course. If there had been a course at GCSE level open to me I'd have done it.

So can the younger folks explain why students aren't taking these IT-based GCSE subjects these days? How come the take-up rate is low?

I was going to say "because they're guff", but actually it does look like the modern IT-type offerings do include programming etc...

Back when I was at school we had the option to take a GCSE in "Information Systems", with no option for anything similar at A-Level, mind. Sadly most of the practical lessons involved learning to use some spreadsheet application on the Archimedes/Acorn computers the school had just purchased. (There was only one full-time IT teacher; a few science teachers would also take some lessons, and the IT teacher had seemingly insisted on buying Acorn computers rather than PCs when upgrading from the BBC Micros we used to use pre-GCSE.)

IIRC by the time I got to 6th form some Tesco Computers for Schools vouchers had been used, plus school funds, to buy a load of PCs instead.

Aside from learning to use some spreadsheet in practical lessons (granted the generic basics are obvs transferable), we had theory lessons on, essentially, what a "Systems Analyst" does...
 
Soldato
Joined
12 May 2011
Posts
6,149
Location
Southampton
When I did "ICT" as a GCSE in 2006 it was a compulsory course, but it was just presentations and spreadsheets, sometimes a basic website. The hardest thing I remember doing was an Access database. There wasn't an option for coding or anything more than using computers.

And our RE was learning about various religions up until GCSE; at GCSE it was more about philosophy and morality and was not compulsory (I didn't take it because of how boring it was pre-GCSE).
 
Soldato
Joined
13 May 2003
Posts
8,849
There does appear to be a preponderance of teaching skills rather than knowledge in school education these days, along with the naïve assumption that anything can be learnt as required. Cooking, carpentry etc. give you a good understanding of how demanding and difficult to master manual tasks are. That is underlying knowledge that can be applied in many areas: as undergraduate engineers we were made to use machine tools to build our designs so we would understand the relationship between our designs and their manufacturability. You learn that by doing. If you want a broad understanding of society you need to understand history and geography, not just regurgitate other people's dogma. IT is no different: how can you know how involved, complex and fragile IT processes can be if you have never attempted any computer studies? The focus on "skills" is misleading nonsense; we need more knowledge and experience.
 
Soldato
Joined
11 Sep 2009
Posts
13,951
Location
France, Alsace
Also it feels like (even for me) that instant gratification fix and wanting X now is stifling learning. Also, almost every problem is solved by a Google, because it's all happened before.

Really, I think I'm glad I'm not a kid now.
I agree. It's not kids' fault, we were just brought up in different times. Why would you try and fix something? The consumerist world makes things throwaway, so it's even more of a pain in the bum to fix, even if you wanted to try. Nowadays you're more likely to have kids with phones and insurance policies for their devices rather than fixing things. We've created a world where it doesn't make sense to try and fix them. It's also, like you say, this instant gratification / on-demand business: people don't have the appetite for something if it takes time to learn. Why though? I learned how to do things because it was the only way to do it. Now you can get a screen replaced for 40 quid or whatever, so there's no point in learning it. Or google something and get an answer. Google something and find someone has built a service for it. Convenience, and the way we've built things around it, hasn't led to the sort of mindset that leads to problem solving in the same way we had it.

At the same time, formal education is WAAAAY further behind all of this and doesn't support anything close to the reality of the modern world.
 
Associate
Joined
29 Jun 2004
Posts
2,260
Location
Rainham, Kent
I remember using slide rules and logarithm tables when I started secondary school - the school had only just acquired a computer in the year or so before I left, and that was reserved for a handful of maths nerds with no-one else being allowed anywhere near it.
I picked up bits of IT knowledge as time went on, and finally went to Uni to study IT in 2007 when I was in my mid-40's.
 
Soldato
Joined
27 Sep 2004
Posts
13,294
Location
Glasgow
I got my first PC when I was 17, back in the early 90s. I was obsessed with the thing and basically all I did was buy PC magazines and teach myself everything. With no internet, all I could do was open the thing, learn its hardware and tinker with Windows 3.1/95.

Sadly that's just not a common thing these days.

I did my internship about 15 years ago, working for 30 quid a week, which covered my train ticket, and I was delighted to be in with a chance. It would never happen these days; it would be all over social media as slave labour.

The expectations are too high with not enough education in the right areas. Then again, the lack of education never did us lot much harm. One could argue that it's a wider issue with society... for example, my accountants have a hard time hiring too; new grads do the minimum amount of work with high expectations.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,521
Location
Surrey
I think there are many reasons for it:

1) Too many distractions.
Back in my day (1981), when I got my VIC-20, there were very few games around. If I wanted to do something with it then I had to program it myself. There was no internet and all I had to rely on was the manual which came with it and occasional computer magazines. It taught me a hell of a lot. Nowadays it's too easy to simply load up an AAA game instead.

2) No understanding about the different arms of "IT".
Fixing a computer is completely different to writing software or managing a cluster of servers, etc. We need better education on what the different aspects of IT are.

3) Short termism.
This country is obsessed with picking the cheapest way of doing something and not building for the future. Companies, and the government, would rather employ someone overseas than build up the UK's actual skillbase. A friend started a software company and couldn't find suitable trainees, so he had to start poaching other companies' staff instead. I guess he's just perpetuating the problem too.

4) IT (all areas) has been commoditised.
IT is no longer seen as valuable. A few decades ago it gave a company a competitive advantage. Nowadays it's seen as an undesirable cost (which then leads to point 3).

My son has chosen IT as one of his GCSEs, but until now his school has considered IT to be using office productivity software. It's only in his first year of GCSE that he'll actually start to learn to program in school.
 
Soldato
Joined
22 Apr 2009
Posts
3,662
Location
North-West
People are interested in using technology like computers/tablets/phones but few are actually interested in learning how they work.

Combine that with the fact that, as commented, many entry-level IT positions pay barely any more than minimum wage. If you're going to uni/college and getting into tens of thousands of pounds of debt, you want to know there's a job that will pay enough to compensate you for that. IT is very far from the safe bet it was back in the early 90s. If you're clever enough to go to uni and study IT there are far better financially compensated disciplines out there.

I know many people who've done IT degrees and struggled to find work. Even those that have aren't earning what those who studied other sciences can. I'll wager most tradesmen earn more than IT graduates. Likewise it's a very difficult job to progress up a career ladder in, and even several years' experience often doesn't guarantee the pay will improve.

The above is why nobody is interested in studying IT/Computing any more. If you really want a job in IT, self-study and experience is the way forward. It's a waste of time paying someone to teach you, as you'll quickly realise that IT is all about being willing and able to teach yourself when new technologies emerge. What you've been taught at university will likely be out of date by the time you come to actually look for a job.

Probably part of a wider debate about how tertiary education isn't really fit for purpose in 2021 IMO. I think as a society we need to consider how we link university/college education into the job market.

Totally agree, I am one of those people you describe. I have an IT degree but I could be in the same position I'm in now, more likely better off, if I didn't have one and had self-taught instead.
 

wnb

Soldato
Joined
27 Feb 2004
Posts
3,983
Back in 1981, in the 1st year, we were learning how to program on a ZX81; it's just been a hobby ever since. I always assumed that each generation would be more knowledgeable about computers than the last, but with how common phones and tablets are, the kids don't want one.
 
Caporegime
Joined
29 Jan 2008
Posts
58,912
4) IT (all areas) has been commoditised.
IT is no longer seen as valuable. A few decades ago it gave a company a competitive advantage. Nowadays it's seen as an undesirable cost (which then leads to point 3).

That just isn't true; look at any richest-people-in-the-world list right now and look at who the top people are: IT nerds!

Look at where Prince Harry just got a job. A decade or two ago a prestigious job might have been at some big bank or consultancy; these days a C-level role in a tech firm is fit for a Prince.

There are plenty of tech-related roles in very high demand right now, from security professionals to data science... there is always demand for good developers. A good developer has a heck of a lot of freedom, generally a fat salary or daily rate, and can change employers very easily.

Even mediocre devs and BAs in London can opt to go contracting and get the standard-ish £500 a day, plus or minus £100.

Good developers (or BAs) with in-demand skills can get substantially more in the contractor market.

As for start-up employees, or indeed salaried professionals at private companies that later go public, they can end up with some serious wealth if it goes right... I know of someone who spent circa a decade in the same company as it grew (not a high-growth start-up or anything, just a regular, profitable, private IT company). He had stock options and made use of them over that decade... the company then went public, he's a multimillionaire now and never needs to work again. He wasn't a founder, he didn't join some high-risk, high-growth firm... he was a regular Joe starting in some grad role at like 30k a year and then, over the years, working his way to a middle/junior management position with responsibility for a small team.

I think the problem is people think of "IT" as people who fix the printer etc., and there are career paths within the broad umbrella of "IT" that involve working on a help desk or doing mundane jobs plus collecting vendor certificates in order to do more mundane jobs... then eventually working your way up to a position where you're an expert in fixing mundane stuff and you're only called in to fix the particularly tricky things that the layer or two of minions below you can't fix.
 
Soldato
Joined
21 Jan 2010
Posts
22,215
That just isn't true; look at any richest-people-in-the-world list right now and look at who the top people are: IT nerds!

Look at where Prince Harry just got a job. A decade or two ago a prestigious job might have been at some big bank or consultancy; these days a C-level role in a tech firm is fit for a Prince.

There are plenty of tech-related roles in very high demand right now, from security professionals to data science... there is always demand for good developers. A good developer has a heck of a lot of freedom, generally a fat salary or daily rate, and can change employers very easily.

Even mediocre devs and BAs in London can opt to go contracting and get the standard-ish £500 a day, plus or minus £100.

Good developers (or BAs) with in-demand skills can get substantially more in the contractor market.

As for start-up employees, or indeed salaried professionals at private companies that later go public, they can end up with some serious wealth if it goes right... I know of someone who spent circa a decade in the same company as it grew (not a high-growth start-up or anything, just a regular, profitable, private IT company). He had stock options and made use of them over that decade... the company then went public, he's a multimillionaire now and never needs to work again. He wasn't a founder, he didn't join some high-risk, high-growth firm... he was a regular Joe starting in some grad role at like 30k a year and then, over the years, working his way to a middle/junior management position with responsibility for a small team.

I think the problem is people think of "IT" as people who fix the printer etc., and there are career paths within the broad umbrella of "IT" that involve working on a help desk or doing mundane jobs plus collecting vendor certificates in order to do more mundane jobs... then eventually working your way up to a position where you're an expert in fixing mundane stuff and you're only called in to fix the particularly tricky things that the layer or two of minions below you can't fix.
100%. Which, ironically, is a variation of Hades' point 2.
 
Soldato
Joined
16 Aug 2009
Posts
7,747
I remember using slide rules and logarithm tables when I started secondary school - the school had only just acquired a computer in the year or so before I left, and that was reserved for a handful of maths nerds with no-one else being allowed anywhere near it.
I picked up bits of IT knowledge as time went on, and finally went to Uni to study IT in 2007 when I was in my mid-40's.

Do they still use those in schools? I didn't get my hands on a computer until I was at college, but it was essentially basic programming; almost no-one got it, but I fell in love with them. Not sure if they still do that either. I doubt it, it was too long ago lol
 
Soldato
Joined
21 Jan 2010
Posts
22,215
Back in 1981, in the 1st year, we were learning how to program on a ZX81; it's just been a hobby ever since. I always assumed that each generation would be more knowledgeable about computers than the last, but with how common phones and tablets are, the kids don't want one.
It's a plateau of maturity. I reckon the same folk who built your ZX81 for you, so you could have a great time programming, were mumbling the same thing: "kids these days don't even know how to build a computer, they just use them and don't understand how they work!"
 
Soldato
Joined
13 Apr 2013
Posts
12,406
Location
La France
Sadly that's just not a common thing these days.

I did my internship about 15 years ago, working for 30 quid a week, which covered my train ticket, and I was delighted to be in with a chance. It would never happen these days; it would be all over social media as slave labour.

The expectations are too high with not enough education in the right areas. Then again, the lack of education never did us lot much harm. One could argue that it's a wider issue with society... for example, my accountants have a hard time hiring too; new grads do the minimum amount of work with high expectations.

To be fair, £30 p/w in 2005 was taking the ****. I was taking that home every week after tax as an apprentice in 1982.
 
Soldato
Joined
21 Jan 2010
Posts
22,215
But he is aware there is more than just printer monkeys and did say "all areas"; my disagreement is that it certainly isn't badly paid in "all areas".
I was more alluding to the fact that 'all areas' must be poorly defined if the resultant answer was 'badly paid'.
 
Soldato
OP
Joined
17 Jan 2016
Posts
8,768
Location
Oldham
When I was at school we had the BBC Micro computer, then the Apple Mac, then eventually the PC. Thinking back though, although we did BASIC programming on the BBC we never did anything on the Apple or PC computers. I think the PC had QBASIC and then Visual BASIC, but I'd left school by then.

I went on two courses at college: the BTEC First (?), then the BTEC National. I remember the programming languages I did. I think there was a variation of BASIC on the Unix system. Then we did some Python, then COBOL, and tagged on the end was C++. A lot of people dropped off the course because C++ was seen as the most modern language at that point and it would only be taught in the last 6 weeks of a 2-year course.
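Just to give a flavour of the level those exercises were pitched at, they were roughly on the scale of this little Python sketch (illustrative only, a rough modern equivalent rather than the actual coursework we did):

# Rough sketch of the sort of beginner exercise those courses set
# (illustrative only, not the actual coursework)
marks = []
while True:
    entry = input("Enter a mark (blank to finish): ")
    if not entry:
        break
    marks.append(float(entry))

if marks:
    print(f"Entered {len(marks)} marks, average {sum(marks) / len(marks):.1f}")
else:
    print("No marks entered")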

During my time in the 80s and 90s computers were still considered a nerdy thing. The only non-nerds were business people using Microsoft-based programs for their businesses. But in the early 2000s I think computers became mainstream. I had always thought that more people would have become involved in areas once considered 'nerdy', but that doesn't seem to have happened. I'm always amazed at the lack of tech knowledge that a lot of the ISP customer service people have. Anyone under 40 years old should have a basic understanding of all areas of computing in my opinion.

I've never understood why there aren't more computer programmes on television. We had loads back in the day, mostly based around gaming and PCs/consoles, but some programmes were about solving computer problems. Yet apart from BBC Click there is nothing like that on TV these days.

I don't know if anyone follows Linus on here, but he had a video a couple of weeks ago watching a top Twitch streamer trying to build his own computer. He just about managed to do it with help from his chat. I just don't understand how someone can be at the top of their game (excuse the pun ;)) yet know next to nothing about the tools they are using.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,521
Location
Surrey
That just isn't true; look at any richest-people-in-the-world list right now and look at who the top people are: IT nerds!

Look at where Prince Harry just got a job. A decade or two ago a prestigious job might have been at some big bank or consultancy; these days a C-level role in a tech firm is fit for a Prince.

There are plenty of tech-related roles in very high demand right now, from security professionals to data science... there is always demand for good developers. A good developer has a heck of a lot of freedom, generally a fat salary or daily rate, and can change employers very easily.

Even mediocre devs and BAs in London can opt to go contracting and get the standard-ish £500 a day, plus or minus £100.

Good developers (or BAs) with in-demand skills can get substantially more in the contractor market.

As for start-up employees, or indeed salaried professionals at private companies that later go public, they can end up with some serious wealth if it goes right... I know of someone who spent circa a decade in the same company as it grew (not a high-growth start-up or anything, just a regular, profitable, private IT company). He had stock options and made use of them over that decade... the company then went public, he's a multimillionaire now and never needs to work again. He wasn't a founder, he didn't join some high-risk, high-growth firm... he was a regular Joe starting in some grad role at like 30k a year and then, over the years, working his way to a middle/junior management position with responsibility for a small team.

I think the problem is people think of "IT" as people who fix the printer etc., and there are career paths within the broad umbrella of "IT" that involve working on a help desk or doing mundane jobs plus collecting vendor certificates in order to do more mundane jobs... then eventually working your way up to a position where you're an expert in fixing mundane stuff and you're only called in to fix the particularly tricky things that the layer or two of minions below you can't fix.
Mediocre devs in London were getting £500 to £750 a day in the late 1990s (source: I had several people working for me earning this). Now the rate is lower; I think £750 a day would be very exceptional now. It hasn't even kept up with inflation.

Admittedly my experience is limited to banking IT (several banks), but most jobs have been offshored and are being done by people earning far less than that. Almost everyone who works for me is in India or Mexico. Nothing wrong with that, but it's an indication that companies are driven to reduce costs rather than focus on IT for its own sake. If I look at the current job market I would probably get a substantially reduced salary compared to where I currently am. That's because wages have been suppressed. If I look at the general working conditions, they have deteriorated too. That doesn't happen when the job is valued. Working hours are excessive with no overtime paid (they used to be excessive, but overtime used to be paid). People in IT tend to work in poorer desk conditions too, in my experience. Back in the 1980s and 1990s IT workers would have better conditions than the business, but nowadays it is reversed.

Is Prince Harry working in IT then?

IT is a tool in the same way a hammer or saw is a tool. It is used to make things. But it's just a tool. The main focus of most companies isn't to generate work for the IT department. Their main goal is to build tools and systems to support their non-IT business. There are some exceptions. But that's the minority.
 