Angry admins share the CrowdStrike outage experience
CrowdStrike? More like ClownStrike! Amirite?
IT administrators are struggling to deal with the ongoing fallout from the faulty CrowdStrike file update. One spoke to The Register to share what it is like at the coalface.
Speaking on condition of anonymity, the administrator, who is responsible for a fleet of devices, many of which are used within warehouses, told us: "It is very disturbing that a single AV update can take down more machines than a global denial of service attack. I know some businesses that have hundreds of machines down. For me, it was about 25 percent of our PCs and 10 percent of servers."
He isn't alone. An administrator on Reddit said 40 percent of their servers were affected, and 70 percent of client computers, approximately 1,000 endpoints, were stuck in a boot loop.
Other administrators reported having 250,000 clients and servers all over the world to deal with.
Being hit by the outage is one thing. Recovering is quite another. The workarounds published so far are less than ideal for administrators used to remotely managing devices. Since the failure leaves an affected device stuck in a Blue Screen Of Death (BSOD) boot loop, implementing a workaround tends to involve in-person intervention unless remote access that does not depend on the operating system is available.
In case you need it, here's the manual workaround for Windows systems broken by CrowdStrike (a sketch of the equivalent console commands follows the list):

- Boot into Safe Mode or the Windows Recovery Environment
- Go to the C:\Windows\System32\drivers\CrowdStrike directory
- Delete the file matching C-00000291*.sys
- Reboot as normal
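For admins talking someone through that over the phone, the steps boil down to a couple of commands at the recovery console. This is a rough sketch rather than an official CrowdStrike procedure; under the Windows Recovery Environment the OS volume may be mounted on a letter other than C:, so check before deleting anything:

```
REM Run from the command prompt in Safe Mode or the Windows Recovery Environment.
REM Under WinRE the OS volume is often mounted on a letter other than C:
REM (check with "dir C:\" and adjust), and a BitLocker-protected drive must be
REM unlocked first - see the manage-bde sketch further down.
cd /d C:\Windows\System32\drivers\CrowdStrike
del C-00000291*.sys
REM Reboot as normal afterwards.
```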
Sadly, for our administrator, things are less than ideal.
He told us: "The fix, while pretty simple, requires hands on the machine, which is not great when most are remote. Talking a warehouse operator through the intricacies of BitLocker recovery keys and command prompts is not for the faint-hearted!"
BitLocker is Microsoft's encryption tool, and it makes a device's storage inaccessible without a recovery key. As such, trying to work through some of the current recovery options on a modern device will likely require the use of that recovery key. Pity the administrators who dutifully kept a list of those keys on a secure server share, only to find that the server is also now showing a screen of baleful blue.
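For those who do have the key to hand, unlocking the volume from the recovery console looks roughly like the following. It's a sketch only: the drive letter will vary, and the 48-digit recovery password shown is a placeholder for the real key tied to that machine:

```
REM Check which volumes are BitLocker-protected and currently locked.
manage-bde -status

REM Unlock the OS volume with the 48-digit recovery password
REM (placeholder shown - substitute the key recorded for this device).
manage-bde -unlock C: -RecoveryPassword 111111-222222-333333-444444-555555-666666-777777-888888
```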
Another Redditor posted: "They sent us a patch but it required we boot into safe mode.
"We can't boot into safe mode because our BitLocker keys are stored inside of a service that we can't login to because our AD is down.
"Most of our comms are down, most execs' laptops are in infinite bsod boot loops, engineers can't get access to credentials to servers."
Our administrator is understandably a little bitter about the whole experience as it has unfolded, saying, "We were forced to switch from the perfectly good ESET solution which we have used for years by our central IT team last year.
"So, we are less than impressed."
The grumbling went on: "Maybe less time sponsoring every sports team and more time testing would fix the issue!" ®