Since moving to Hugo as my site generator I have been more concerned with backing up my data. Can Windows File History fit the bill?
Moving to a static site generator has given me food for thought about my backup procedures, because my website now relies on local files for any changes, whereas before it didn't. Since the early 2010s I have been using a CMS of some type that I could keep adding to regardless of which machine I was on, so since leaving education my need to back up just wasn't there. Any photos I take are generally backed up to the cloud on one service or another, and important files get manually copied to a USB stick and to one of the cloud services I use for my photos. Now that I am using Hugo and creating new files nearly every day, however, I had to find another solution.
My workflow before File History relied on Dropbox, and it still does for the website itself. I run Hugo in a virtual machine running Linux Mint to keep it separate from the rest of my system, while editing the Markdown files in Windows for ease of use. Both Linux and Windows have Visual Studio Code installed, so fundamentally it doesn't matter which OS I produce the files in, though I prefer to upload them to my server using FileZilla in Windows rather than from the VM. I am happy with that setup because it means my site and all its working files are stored somewhere other than just my machine. The problem is that while Dropbox takes care of syncing, and in effect backs up the working files, it isn't keeping different versions. My solution was to zip up all my working files every time I made any big changes to the site's config.
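That manual versioning step can be sketched as a small script. The paths here are assumptions (I have used `~/Dropbox/site` as a stand-in for wherever the working files live), and it uses `tar` rather than a zip tool since it runs in the Linux Mint VM:

```shell
#!/bin/sh
# Sketch of the manual versioning step: bundle the Hugo working files
# into a timestamped archive next to them, so Dropbox keeps each snapshot.
# SITE_DIR and BACKUP_DIR are assumptions; point them at your own folders.
set -eu
SITE_DIR="${SITE_DIR:-$HOME/Dropbox/site}"
BACKUP_DIR="${BACKUP_DIR:-$HOME/Dropbox/site-archives}"
STAMP="$(date +%Y%m%d-%H%M%S)"
ARCHIVE="$BACKUP_DIR/site-$STAMP.tar.gz"

mkdir -p "$SITE_DIR" "$BACKUP_DIR"
# -C changes into the parent folder so the archive stores "site/..."
# rather than absolute paths.
tar czf "$ARCHIVE" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
echo "archived: $ARCHIVE"
```

Running it before each big config change gives a dated snapshot per change, which is exactly the version history Dropbox alone wasn't providing.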
While those solutions worked, I was playing around in the Windows settings the other night when I noticed File History under Backup, which intrigued me because I had an old external HDD that I no longer used. It still had some data on it from when it served as media storage for my Raspberry Pi Plex server, which I have since discontinued. After some research I found that Windows File History might be exactly what I want: it lets me keep working as I am but automatically creates backups of file versions on the old external HDD. In the three days I have been running it I haven't had any major issues, and I now use it to back up more than my site, including files I wouldn't mind losing but would prefer to keep, like game saves, and it all works in the background. I must admit I didn't stick with the default settings; for my use case backing up every hour felt like too much, so I opted for every six hours. So far it doesn't seem to cause my PC any noticeable resource usage and has only taken 956MB of my 2TB external drive.
To test the restore process, I deliberately messed up a file so I could restore it once the newer version had been backed up, and it worked without any issue. My only complaints are that File History can't back up to more than one drive, and that the backup and restore user interfaces seem worlds apart: the backup settings have conformed to the Windows 10 look while the restore tool still has the Windows 7 one. On the positive side, it backs up only the files changed since the last backup, and if my PC is off at the next scheduled time it checks whether it needs to back up as soon as the PC next boots, at least in my limited experience. Looking at the backup's file structure in File Explorer, you can manually browse the files without needing the restore tool. While doing this I did find it odd that one file wasn't where it was supposed to be; that may have been down to the length of its file name, as the recovery tool showed it linked to a file whose name is now a string of numbers in another folder on the drive. For the time being I think this solution will suit my needs, though it leaves me wondering whether I should invest in a proper NAS that can be set up in RAID for the redundancy the current solution lacks.
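The manual browsing works because File History stores each version as an ordinary file under a `Data\<original path>` tree, with a UTC timestamp appended to the name. A minimal sketch of listing versions that way, using a mocked-up directory rather than a real backup drive (all paths here are invented for illustration):

```shell
#!/bin/sh
# Mock of File History's on-disk layout: versions live under
# .../FileHistory/<user>/<machine>/Data/<original path>, each named
# "<file> (<YYYY_MM_DD HH_MM_SS> UTC)". The paths below are invented.
set -eu
FH="$(mktemp -d)/FileHistory/user/PC/Data/C/Users/user/site/content"
mkdir -p "$FH"
touch "$FH/about.md (2019_01_20 12_00_00 UTC)"
touch "$FH/about.md (2019_01_20 18_00_00 UTC)"

# List every saved version of about.md, oldest first.
ls "$FH" | grep '^about\.md' | sort
```

Because every version is a plain file, you can copy one back by hand; the restore tool is only needed for the nicer timeline view.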