
Keeping your head as an entire database goes pear-shaped

How do you spell 'backup' again? 'D' 'R' 'O' 'P'...

Who, Me? A reminder of the devastation a simple DROP can wreak, and that backups truly are a DBA's best friend, in this morning's "there but for the grace of..." edition of Who, Me?

"Stephen" is the author of today's confession and was faced with what should have been a simple case of applying an update to an Estimating and Invoicing system.

The system ran on a PostgreSQL database and was, in his words, "Software that I don't touch save when there's an issue, needs rebooting, etc."

The best sort of software, in the opinion of this writer. However, an update was required and Stephen was the man assigned to the task. He had prepared the installer and was ready to wait for the 30 or so minutes while it ran when he noticed that the backups had not been running.

In fact, the backups had not been run in the last nine months.

Wisely, he decided that running a backup before an update was A Good Idea™ and so, after making sure everybody was logged out, hit the manual backup button.

It failed.

[Expletive deleted] Stephen tried again.

It failed again.

"Surely I'm smarter than this issue," Stephen thought to himself, and decided that he could make it work. In fact, he could remember some of the steps the vendor had given him and would simply run through those. After all, he was outside of support hours and didn't want the users to come in next morning to find the system both inaccessible and not updated.

He fired up a console for the database and began to run through the steps he remembered. Drop the database, then run a manual backup. That sort of thing.
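For the record, and purely as a sketch of our own rather than anything from the vendor's runbook, the sane order is the exact reverse: dump first, verify the dump, and only then go anywhere near anything destructive. Something like the following, where the database name and paths are invented for illustration:

```python
import subprocess
from pathlib import Path

# Hypothetical database name and backup location, for illustration only.
DB_NAME = "estimating"
BACKUP_FILE = Path("/var/backups/estimating_pre_update.dump")

# Take a logical backup with pg_dump in custom format (-Fc),
# which can later be restored with pg_restore.
result = subprocess.run(
    ["pg_dump", "-Fc", "-f", str(BACKUP_FILE), DB_NAME],
    capture_output=True,
    text=True,
)

# Refuse to go any further if the dump failed or produced an empty file -
# pressing on after exactly this kind of failure is where the trouble started.
if result.returncode != 0 or not BACKUP_FILE.exists() or BACKUP_FILE.stat().st_size == 0:
    raise SystemExit(f"Backup failed, aborting update: {result.stderr.strip()}")

print(f"Backup written to {BACKUP_FILE} ({BACKUP_FILE.stat().st_size} bytes)")
# Only now would it be sane to run the vendor's installer.
# DROP DATABASE destroys the data immediately; it is not a backup step.
```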

Being a careful chap, he also took a copy of the directory of the application from the Programs Folder. Just to make sure he had a copy of the data.

At this point, pretty much every DBA is likely yelling at their screens, but Stephen was confident. He had this. He was, after all, smarter than the software.

Once he'd gone through those steps, he installed the update, which worked like a charm.

He fired up the system…

"And was immediately queried for the information needed for a new install."

Uh oh.

No problem. He was smarter. He could sort it. He had the original folder. He manually copied it back and fired things up once more.

Again, there was the new install message.

It must be the silly update. Stephen re-opened the service ticket on the vendor's website, complaining that something was horribly, horribly amiss with the update. He'd run it. It seemed to work. But the database was blank.

Fortunately for him, the service desk was open and he was soon called back by a tech who listened to his tale of woe and each of the steps he'd taken to back things up. There was a silence on the phone before the tech emitted a quiet "Oh no…"

"The next level of consulting," said Stephen, "included me asking why the software doesn't have an email switch for notifications about backup? If I'd known this was happening, I would've reached out much sooner."

The retort was to ask why Stephen hadn't noticed the failed backups earlier.

"I offered that I'm not a daily driver on the database," said Stephen, "I just make sure it's up [and] running and even if I connected, the account I'd connect with was not a privileged user."

And so it went on. However, despite the best efforts of tech support, the database was gone. Because of course it was: there had been that DROP command, after all. And, let's face it, databases are not normally stored in the same place as the executables…

This writer well remembers an occasion, several decades ago, when a DBA was so convinced that a running database meant its files would be locked that a swift DEL *.* could safely be used to remove redundant files. There were no backups then either.

"We decided in the moment to shut everything down to not make it worse than it already was," said Stephen.

His next step was to make The Call Of Shame™ to the owners of the company.

"Hey folks… have a problem running the update… lost data…"

"How much?"

"All of it."

In the end, the nine-month-old backup was restored and recovery experts enlisted in the hope that the missing records could be recovered. No joy.

"So the company was forced to recover using what printed invoices and tickets we had and to forge ahead from there," recalled Stephen.

"It seems that my candor was what saved my rear end from the genuine (and deserved) ire that should have rained down."

"I have since started making multi-layer backups, without dropping the db, mind you, and learning the lesson that if rebooting the server doesn't fix it, it's time to stop and reach out to the vendor."

Stephen sent us a longer list of lessons learned, but for some reason most also involve backing things up. That, and being thankful for understanding managers.
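If you're wondering what "multi-layer" might look like in practice, here's a minimal sketch of our own, not Stephen's, with invented paths and database name: a scheduled pg_dump that keeps dated copies rather than overwriting a single file, with a second copy ideally shipped off-box too.

```python
import subprocess
from datetime import date
from pathlib import Path

# Invented database name and backup directory, for illustration only.
DB_NAME = "estimating"
BACKUP_DIR = Path("/var/backups/estimating")
KEEP = 14  # number of daily dumps to retain

BACKUP_DIR.mkdir(parents=True, exist_ok=True)
target = BACKUP_DIR / f"{DB_NAME}_{date.today().isoformat()}.dump"

# One dated dump per day, in pg_dump's custom format so pg_restore can use it.
subprocess.run(["pg_dump", "-Fc", "-f", str(target), DB_NAME], check=True)

# Prune the oldest dumps so the directory doesn't grow without bound.
dumps = sorted(BACKUP_DIR.glob(f"{DB_NAME}_*.dump"))
for old in dumps[:-KEEP]:
    old.unlink()

print(f"Kept {min(len(dumps), KEEP)} dumps, latest: {target.name}")
```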

Backups are the things that always seem to be forgotten until they are needed to save one's career. Ever thought "sure, I am smart, I can fix this" only to be proven catastrophically wrong? Tell your story with an email to Who, Me? ®
