Website disasters and how to avoid them
It's easy to forget how much you rely on your website when it's all working nicely. But when things go wrong, the pain to your business and the impact on your time can be acute.
There have been a few instances recently that I’ve been made aware of where website owners have suffered some of that pain. So in this article I’m going to share what happened, along with ways the problems could have been mitigated, in the hope that it saves you and your organisation some future web trauma.
Outsourcing leading to a security breach
The site owner in this case contacted us because their website appeared to be working, but anyone visiting it via a Google search was redirected to a completely unrelated website. Of course that’s bad news: potential clients were being directed away, and the likely knock-on effect once Google and other search engines realised there was a problem would have been a negative impact on search rankings.
The site owner had had some projects on the website completed by outsourced freelancers found on marketplace websites. While the low rates charged by freelancers on these platforms can be tempting, there are other factors to consider to avoid work on your website becoming a cautionary tale!
In this case the freelancers had added a number of extensions to the site to try to complete their task. A couple of these were outdated and not properly supported, which points to potential security vulnerabilities, while another introduced a critical vulnerability in the site’s file system. It was that lapse in security that allowed a third party to add malicious code.
Extensions and plugins for websites aren’t a bad thing: they can be a great way to improve your site, add functionality and more. But they should always be selected and used following some basic checks, due diligence and knowledge of how they work. What might appear to be a useful shortcut can actually put your whole website at risk.
It’s also vital that websites are kept updated - outdated software is a common security weak point.
How could this have been avoided?
By making sure anyone working on your website, or with access to your CMS, understands the potential hazards of using third-party tools. Only use tools you can verify are safe and actively maintained.
Website deleted in error… with no content backups
This one is the stuff of nightmares, I’m afraid, so prepare yourself as you read on. A live website that had been active for many years, loaded with years’ worth of content and updates, one day vanished without any warning. The site owner couldn’t see the front end or the back end. The site simply wasn’t there. After getting in touch with the original agency who had built and maintained the site, they got the alarming reply: “we didn’t think you needed it any more, so we’ve deleted it!”
I’ve never heard of this happening before, but normally it would be fairly easy to rectify the mistake. In this case, however, despite a request from the client to restore the website, there was no backup available; the backups had been deleted as well.
How could this have been avoided?
By following standard backup practice this scenario could have been avoided. There should always be a full backup saved in a remote, cloud-based location, off the server the website is hosted on.
Site compromised and core files deleted
A designer I know has been working on his own portfolio site, and it was complete enough that he had it live while he found time between contracts to add some more projects and copy. The site was being hosted on a very basic package with one of the most popular places for buying domain names (and, in my opinion, one of the worst choices). I won’t say the exact name, but I’m sure you could ‘Go’ and find it yourself.
A warning sign for the website owner came when he visited his own URL and was redirected, before his browser gave a warning about too many redirects. Assuming he’d broken something, he didn’t give it much thought and decided to revisit later.
Fast forward a week, and when he tried to log in to the site’s admin dashboard, the page couldn’t be found. Trying the front-end URL had the same result.
Time to panic…
At this point he got in touch with me and asked what to do. I should remind you this isn’t a website we manage at Consider, but an example to help you avoid some of the same pain.
I suggested the very first port of call was to contact the hosts and check with them: occasionally an outage like this is just maintenance on a server, or a problem they are already aware of. He did just that, but unfortunately they confirmed the server was fine and that the core files of his site had been deleted. Perhaps more unfortunately, that’s where the support stopped!
The website was down. The CMS was gone. The content was deleted. I think that counts as a website disaster.
In this case the fix was not too painful: we cleaned the server and reinstalled a fresh copy of the CMS. Luckily there was a backup that, although a bit old, was stored in the cloud in Drive. So once the CMS was back, the content could be restored easily by the site owner.
You’re probably wondering what caused the site to be deleted in the first place. In this case it was almost certainly a malicious attack. The details aren’t as important as the prevention. At the bottom of this article I’ll summarise some essential website security tips.
A few takeaways on how to ensure your website is well protected from disasters.
Always have a backup - that sounds obvious but keeping a fully packaged backup can save a huge amount of stress. It typically reduces downtime if things do go wrong.
At Consider we employ a few best practices with backups. Here they are:
- Keep an on-server backup where possible for restoring a site quickly
- Also keep an off-server backup that’s stored securely in a cloud environment (not on an old laptop).
- We share a link to a cloud-stored version of the site with our clients regularly. They don’t need to keep it, but they’ve paid for the website; it’s theirs, not ours, so it makes sense they have access if they need it.
- We test backups frequently to make sure they are properly packaged and that we can get them back online quickly.
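To make the on-server/off-server pattern above concrete, here’s a minimal sketch in Python. The function name and paths are my own illustrative assumptions, not a reference to any particular tool your host provides:

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_site(site_dir: str, local_store: str, offsite_store: str) -> Path:
    """Package a site directory into a dated zip archive, keeping one
    copy on the server (for quick restores) and one off-server."""
    stamp = datetime.now().strftime("%Y-%m-%d")
    archive_base = Path(local_store) / f"site-backup-{stamp}"
    # Create e.g. /backups/site-backup-2024-01-31.zip on the server
    archive = Path(shutil.make_archive(str(archive_base), "zip", site_dir))
    # Copy the same archive to the off-server location (in practice a
    # mounted cloud drive, or an upload to object storage)
    shutil.copy2(archive, offsite_store)
    return archive
```

In practice you’d run something like this on a schedule (a daily cron job, for example) so the backups happen without anyone having to remember them.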
Some essential website security tips
So now you should have your backups organised and be following the best practices above. These tips will reduce the chance you’ll ever need those backups by beefing up your website’s security.
The focus here is on three main areas.
Passwords & user security
We can get a site super secure, but if you have users with insecure passwords, or users who shouldn’t have access any more, then you’re asking for trouble.
Typically we see things like old employees still having admin access to websites, or subcontractors who haven’t used secure passwords. It’s dangerous, but luckily it’s easy to fix.
- Perform regular user audits. If someone no longer needs access, remove their account.
- Force strong passwords. Many CMS platforms now have the option to require a minimum level of password security.
- Consider using two-factor authentication (2FA). It makes the user side far more robust.
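Many CMS platforms will enforce a password policy for you, but as an illustration, a minimum-strength check might look like this sketch (the length and character-class thresholds here are my own assumptions, not any official standard):

```python
import re

def is_strong_password(password: str, min_length: int = 12) -> bool:
    """Basic illustrative password policy: a minimum length plus a
    mix of character classes. Real policies may differ."""
    if len(password) < min_length:
        return False
    checks = [
        re.search(r"[a-z]", password),         # at least one lowercase letter
        re.search(r"[A-Z]", password),         # at least one uppercase letter
        re.search(r"\d", password),            # at least one digit
        re.search(r"[^A-Za-z0-9]", password),  # at least one symbol
    ]
    return all(checks)
```

Length does most of the work here; the character-class rules mainly stop people reusing short dictionary words with a capital letter bolted on.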
Site software & components
Your website is likely built from a collection of different pieces of software. Just like the software on your smartphone, newer versions are released regularly. Quite often the newer versions fix potential security flaws, but they also improve performance and add new features; they are constantly evolving.
Before you hit ‘Update all’, it’s worth noting that all these components need to work together, and because they don’t all get newer versions at the same time, there is potential for things to go wrong if you don’t know what you’re doing (and even if you do). So it’s a good idea to follow best practices here, or better still, talk to us about a site care package.
Hosting & servers
This one doesn’t get talked about enough in my opinion. Budget servers from some of the biggest providers just aren’t very secure. They don’t keep server software up to date, which can cause problems. File transfers aren’t as secure (think FTP rather than SFTP), and the servers themselves are shared across lots and lots of websites you know nothing about.
As a side note on that, the performance tends to be quite slow too. That’ll have a negative impact on user experience and search engine performance.