Global BSOD

Soldato
Joined
6 Feb 2019
Posts
17,910
Just imagine being the programmer who wrote this update and waking up the next day to all this.......

The question is who approved it. Lone programmers don't release code at an IT firm this size; this is a massive failure that would involve multiple people, from the people who coded it to the testers and the managers who approved the code commits and patch releases.
 
Soldato
Joined
18 Oct 2002
Posts
6,763
Location
Cambridge
That's a shed load of additional work for a team, considering those updates are daily if not multiple times a day :eek:
I think they were batched, but it was a fully automated deployment / test system (rebuilding test environments etc.) anyway.
 

Caporegime
Joined
24 Oct 2012
Posts
25,234
Location
Godalming
just exposes companies who don't have hygienic software deployment schemes
... so if the genuine russian pirates had done some kind of dns intrusion to deploy a bogus release, having hacked appropriate certificates

You having a stroke again?
 
Caporegime
Joined
29 Jan 2008
Posts
58,934
It's been a few years since I did much work with Windows servers, but I don't understand why this has affected so many companies - I'd normally expect updates to be tested in staging / pre-production environments before going to a live system :confused:

Did all these companies have CrowdStrike automatic updates on in production systems? That seems pretty crazy to me!

Yeah, it's a bit odd, like obviously any software update could impact a production system so you at least want to run some basic tests, even if it's just some paint-by-numbers automated testing process.

To just blindly accept automatic updates, without testing, in any critical production system seems very reckless/dumb. Not clear if that's on the vendor or the clients? (As in who controls that? Is it a hosted solution?)

Edit - was it a Windows update that broke this (seemingly fragile) software, or was it an update to this software (the Windows version)? If the latter, there is absolutely no excuse for the vendor: if it's broken for everyone then it clearly just hasn't been tested at all, not just some edge case missed by QA.
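Even a bare-bones staged rollout would catch a crash this reproducible - push to a small canary ring first, check the machines still come back, and only then widen out. A toy sketch of the idea, with every hostname, ring and failure rate made up (this is not CrowdStrike's, or anyone's, actual pipeline):

```python
# Toy ring-based rollout gate - all names and numbers invented for illustration.
import random

RINGS = [
    ("canary", [f"canary-{i:02d}" for i in range(2)]),
    ("pilot",  [f"pilot-{i:02d}" for i in range(10)]),
    ("fleet",  [f"prod-{i:03d}" for i in range(100)]),
]

def apply_update(host: str) -> None:
    """Stand-in for pushing the update to one machine."""

def smoke_test(host: str) -> bool:
    """Stand-in for the cheapest possible post-update check:
    does the machine reboot and respond? Randomised here."""
    return random.random() > 0.01

def rollout() -> None:
    for ring_name, hosts in RINGS:
        for host in hosts:
            apply_update(host)
        failed = [h for h in hosts if not smoke_test(h)]
        if failed:
            # Stop before the update ever reaches the next, larger ring.
            print(f"halting: {len(failed)} failure(s) in ring '{ring_name}'")
            return
        print(f"ring '{ring_name}' healthy, promoting to next ring")
    print("rollout complete")

if __name__ == "__main__":
    rollout()
```

The point is the gate, not the details: a 100%-reproducible BSOD dies in the canary ring instead of taking out the fleet.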
 
Soldato
Joined
28 Jun 2013
Posts
4,031
Royal Mail "bring your label" printers are down, or so my postman says - for his office anyway. No big deal, but interesting how many were affected by this.
Perhaps all the label printers in the post offices are down today also.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,603
Location
Surrey
IT support thousand-yard stare after waking up to the CrowdStrike outage.


 
Associate
Joined
25 Oct 2013
Posts
1,017
It really can't be right that one software glitch from one company has caused so much disruption globally - Govts/companies need to take a hard look at the situation we seem to have put ourselves in.
 
Soldato
Joined
17 Nov 2007
Posts
3,190
It really can't be right that one software glitch from one company has caused so much disruption globally - Govts/companies need to take a hard look at the situation we seem to have put ourselves in.

The issue is that their update caused a BSOD, and while a machine is in that state no remote fix can be applied - so CrowdStrike have a fix, but if the machines are offline it's not much help :D
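For what it's worth, the widely reported manual workaround was to boot each affected machine into Safe Mode or the recovery environment and delete the broken channel file by hand. Written out in Python purely as an illustration - the directory and filename pattern are from CrowdStrike's published guidance, but in reality this was done by hand, machine by machine, precisely because the OS wouldn't stay up:

```python
# Illustrative only: the published workaround, expressed as a script.
# Only possible once the machine is booted into Safe Mode / WinRE.
import glob
import os

# Directory and pattern from CrowdStrike's public remediation guidance.
DRIVER_DIR = r"C:\Windows\System32\drivers\CrowdStrike"

for path in glob.glob(os.path.join(DRIVER_DIR, "C-00000291*.sys")):
    print(f"deleting {path}")
    os.remove(path)
```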
 
Soldato
Joined
27 Jun 2006
Posts
12,434
Location
Not here
Interesting how many companies are using CrowdStrike. They must have fat contracts with these government organizations.

Doubt many will be renewing now. Microsoft have got to be rubbing their hands ready to slap them all with another subscription.
 
Soldato
Joined
5 Mar 2003
Posts
10,768
Location
Nottingham
The question is who approved it. Lone programmers don't release code at an IT firm this size; this is a massive failure that would involve multiple people, from the people who coded it to the testers and the managers who approved the code commits and patch releases.
An anti-virus company will be making hundreds of releases a month across current and previous products, and the anti-virus business is very dynamic considering its prey. The number of configurations for Windows software is insane - the hardware, the software, the different versions, the different drivers / applications on each machine. I have no doubt this was thoroughly tested, but you can't have 100% coverage of the millions (billions?!) of configurations of MS platforms. It will be really interesting to know what the actual problem was (if we're ever told), but bugs like this are constantly released into the wild - it's just that this time it caused a BSOD.
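To put rough numbers on that, even a handful of axes multiplies out fast. Every count below is a pure guess for illustration, not real data:

```python
# Back-of-envelope: invented counts for a few axes of Windows variation.
windows_builds  = 40    # supported Windows 10/11/Server builds (guess)
cpu_platforms   = 3
storage_drivers = 50
gpu_drivers     = 100
other_agents    = 20    # other security/management agents that might coexist

configs = windows_builds * cpu_platforms * storage_drivers * gpu_drivers * other_agents
print(f"{configs:,} combinations")  # 12,000,000 - and that's only five axes
```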
 
Soldato
Joined
26 May 2006
Posts
6,072
Location
Edinburgh
Fairly sure we don't just push updates down as soon as they're available - they're often tested before a rollout, unless something is marked as security critical and needs pushing out ASAP due to a threat. It seems bizarre this wasn't picked up in any testing, especially when the actual issue is a complete BSOD of the system. Surely that gets picked up in the first batch of testing?
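The kind of policy I mean is nothing fancy - hold routine updates back for a soak period and only bypass the hold for genuinely critical threats. A toy sketch, with the names and thresholds invented:

```python
# Toy hold-back policy: routine updates wait out a soak period,
# security-critical ones go straight through. Purely illustrative.
from datetime import datetime, timedelta, timezone

SOAK_PERIOD = timedelta(days=2)  # invented threshold

def should_install(released_at: datetime, security_critical: bool) -> bool:
    if security_critical:
        return True  # explicit bypass for urgent threat updates
    return datetime.now(timezone.utc) - released_at >= SOAK_PERIOD

# A routine update released an hour ago gets held back:
an_hour_ago = datetime.now(timezone.utc) - timedelta(hours=1)
print(should_install(an_hour_ago, security_critical=False))  # False
print(should_install(an_hour_ago, security_critical=True))   # True
```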
 