Cyberattacks targeting websites are no minor matter. On the contrary, you should treat them as a threat just as severe as malware, viruses, trojans, and ransomware. The issue is complicated and difficult to analyze because websites can be attacked in many ways: violations and breaches vary with each site’s specific architecture. Unfortunately, the effects of the various types of attacks are usually similar.
When your website is the target of a cyberattack, you risk losing data, your position in Google’s ranking, and even contact with your customers. In other words, you can never be too cautious when it comes to web security. Here are some basic rules for keeping a site safe and how they can help you avoid security issues.
1 – Use SSL Certificates
As we pointed out in our article on HTTPS for Google, an SSL-protected connection is one of the first things you should equip your site with. Not only does SSL provide a high level of protection, but it can also help you keep your traffic. Chrome, for example, warns users about pages that don’t use this protocol, so by serving your site over an SSL-protected connection, you avoid losing website visitors.
Moreover, an SSL-protected connection can help with SEO, as Google considers HTTPS an important ranking factor for websites. As for the type of certificate to opt for, even a free certificate from the very common Let’s Encrypt is enough to protect your site against intrusions or foreign entities spying on your online activity.
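Certificates also expire, and an expired certificate triggers the same browser warnings as having none at all. As a quick illustration, here is a minimal Python sketch (using only the standard library) that checks how many days remain before a host's certificate expires; the function names are our own for the example:

```python
import ssl
import socket
import time

def days_until_expiry(not_after: str, now_epoch: float) -> float:
    """Given a certificate's notAfter field (e.g. 'Jun  1 00:00:00 2030 GMT'),
    return how many days remain before it expires."""
    expiry_epoch = ssl.cert_time_to_seconds(not_after)
    return (expiry_epoch - now_epoch) / 86400  # seconds per day

def fetch_not_after(hostname: str, port: int = 443) -> str:
    """Connect to a host over TLS and read its certificate's expiry date
    (requires network access)."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()["notAfter"]

# Example usage (uncomment to run against a live site):
# print(days_until_expiry(fetch_not_after("example.com"), time.time()))
```

Running a check like this periodically (or letting your CA's auto-renewal handle it) keeps the certificate from silently lapsing.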
2 – Keep the Site Up-to-date
Updating your website is essential for its security, especially when you manage your content through a CMS. You can and should update modules, themes, and any other components that require it. As a guideline, it’s a good habit to check for available updates and run them periodically (for instance, once per week).
If you don’t have the time for this activity, assign the task to someone with the right skills for the job. Whether you choose an in-house expert or outsourcing, make sure the person responsible can also restore the site in case of an error. This way, you prevent attackers from exploiting possible site bugs. On top of that, you can implement a security assessment policy to check the effectiveness of your security measures and further protect your data.
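One simple way to catch an update that broke something is a smoke test: fetch a few key pages before and after updating and compare the status codes. The sketch below is a minimal, hypothetical example using only Python's standard library; the page list and function names are assumptions for illustration:

```python
import urllib.request
import urllib.error

def check_pages(base_url, paths):
    """Fetch each path under base_url and record its HTTP status code
    (requires network access). Run before and after an update round."""
    results = {}
    for path in paths:
        try:
            with urllib.request.urlopen(base_url + path, timeout=10) as resp:
                results[path] = resp.status
        except urllib.error.HTTPError as err:
            results[path] = err.code  # e.g. 404 or 500 after a bad update
    return results

def broken_pages(results):
    """Filter the check results down to pages that did not return HTTP 200."""
    return {path: code for path, code in results.items() if code != 200}

# Example usage (uncomment to run against a live site):
# results = check_pages("https://example.com", ["/", "/blog", "/contact"])
# print(broken_pages(results))
```

If `broken_pages` returns anything after an update, that is the signal to restore from the backup you prepared beforehand.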
3 – Use Secure Passwords
It’s no secret that passwords are the gateway to the administrative backend of your website, where the CMS enables you to make changes. Many cyberattacks are, in fact, based on knowledge of site passwords. That’s why it’s essential to protect these credentials to prevent unauthorized access to your website.
All users (and especially administrators) should never reuse passwords that also give them access to other portals. Moreover, you should encourage them to save passwords in a secure password manager (like the one integrated into Firefox, for example). Remind users that every password should contain at least one capital letter and one non-alphabetic character. It’s also considered good practice for users to change their passwords every three months.
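The rules above (at least one capital letter, at least one non-alphabetic character) are easy to enforce in code. Here is a small Python sketch; the minimum length of 12 is our own assumption, since the article doesn't prescribe one:

```python
def meets_policy(password: str, min_length: int = 12) -> bool:
    """Check a password against the basic rules above: a minimum length
    (assumed here to be 12), at least one capital letter, and at least
    one non-alphabetic character."""
    has_upper = any(c.isupper() for c in password)
    has_non_alpha = any(not c.isalpha() for c in password)
    return len(password) >= min_length and has_upper and has_non_alpha
```

A check like this can run in the registration or password-change form of the CMS, rejecting weak choices before they ever reach the database.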
4 – Properly Manage User Roles on the Site
Many security problems have their roots in careless account management. If you need to add new users, think twice before granting them administrator access; do so only when it’s really necessary.
Remember that every new user you create becomes a potential “key” to your website, one that malicious actors may try to use. The more ways there are to access your site, the more vulnerable it becomes to attacks.
5 – Don’t Show too Many Error Messages
Bugs on websites happen all the time, but there’s no need to show them to the world; on the contrary, doing so can benefit attackers. It’s fundamental to keep the staging version of the site separate from the live site. You should also make sure that error details, which are valuable only to your programmers, stay hidden from outsiders. It’s an extra layer of protection that can limit security problems.
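In practice, this means logging the full error internally while returning only a generic message to the visitor. Here is a minimal Python sketch of the pattern; the function and logger names are placeholders, not any particular framework's API:

```python
import logging
import traceback

logger = logging.getLogger("site")

def handle_request(handler, *args):
    """Run a request handler. On failure, log the full traceback for
    the developers, but return only a generic message to the visitor:
    no stack trace, file paths, or version numbers are leaked."""
    try:
        return handler(*args)
    except Exception:
        logger.error("Unhandled error:\n%s", traceback.format_exc())
        return "500 Internal Server Error"
```

Most CMSs and frameworks offer an equivalent switch (often called something like a debug or development mode) that must be turned off in production.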
6 – Don’t Show Details about Your CMS
Webmasters often set up a robots.txt file to keep Google from crawling specific parts of the site, for SEO purposes. Since the file is public, they end up exposing the fact that the site runs WordPress, for example. Another fingerprint programmers unintentionally leave on a website is the generator meta tag, which identifies the CMS and its version. It’s another element that can be used maliciously: if attackers know the version of the CMS, they can design cyberattacks specific to that version.
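You can audit your own pages for this fingerprint. The following Python sketch uses the standard library's `html.parser` to list any `<meta name="generator">` tags in a page's HTML; the class and function names are our own for the example:

```python
from html.parser import HTMLParser

class GeneratorFinder(HTMLParser):
    """Collect the content of any <meta name="generator"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.generators = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "generator":
            self.generators.append(attrs.get("content", ""))

def find_generator_tags(html: str) -> list:
    """Return the content of every generator meta tag found in the HTML."""
    finder = GeneratorFinder()
    finder.feed(html)
    return finder.generators
```

If the function returns anything for your homepage, your theme or CMS is advertising its version, and most platforms let you remove that tag.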
In conclusion, keeping a website safe requires a bit of attention and common sense, as well as in-depth knowledge of the technologies and the dynamics that keep sites functional.