So recently I decided to finally dig into why I have to convert my sites to static to keep them from keeling over. Yes, I run them on a very lean server, but it still should have been sufficient.

Checking the server logs, the main culprit seems to be the Semrush bot (and bots broadly). I've never set up robots.txt as I doubt it would do much good – or rather, it would only do good against honourable bots.
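For what it's worth, the polite-bot version of a fix is trivial – a robots.txt like the sketch below asks SemrushBot (and friends) to stay away, though it's purely advisory and only honoured by bots that choose to obey it:

```
# robots.txt – advisory only; ignored by any bot that doesn't care
User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /
```

The AhrefsBot entry is just an example of a second crawler; the actual list would come from whatever shows up in the logs.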

As mentioned, I hadn't set up nginx to block such bot visits; still, it's surprising just how many of the bots are nothing more than lame search-engine spiders – unless they're a front for, or are being co-opted into, reconnaissance for later, more targeted vulnerability scans.
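Blocking at the nginx level is more forceful than robots.txt, since it doesn't rely on the bot's cooperation. A minimal sketch, assuming blocking by User-Agent is good enough (a determined scanner will just spoof the header – the bot names below are examples, not a vetted list):

```nginx
# In the http {} context: flag requests whose User-Agent
# matches known crawlers (case-insensitive regex match).
map $http_user_agent $blocked_bot {
    default        0;
    ~*semrushbot   1;
    ~*ahrefsbot    1;
    ~*mj12bot      1;
}

server {
    listen 80;
    server_name example.com;  # placeholder

    # Refuse flagged bots before they touch the backend.
    if ($blocked_bot) {
        return 403;
    }

    # ... rest of the site config ...
}
```

Since the rejection happens before nginx proxies anything, the dynamic backend never sees the bot traffic at all – which is the load that was forcing the static conversion in the first place.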

I long ago designed my web server to be easy to discard and replace, though that's partly because not much changes on it. Perhaps it's time to move its DB elsewhere to make it fully replaceable.