I recently started running a Minecraft server on my Synology NAS so that my daughter and I can share some quality time together whilst having fun in the process, albeit in the form of a bunch of computery ones and noughts.
Now, this precious data, like all precious data, needs to be backed up. Preferably regularly, because we both know several Minecraft commands that can erase thousands of blocks en masse, and one little mess-up whilst wielding that sort of power can cause a heck of a lot of damage to your precious little Minecraft creations in the blink of Steve’s little 8-bit eye.
I already use Synology’s backup offering to do nightly system backups, but it just wasn’t flexible enough in this instance. The same applied to Synology’s Snapshots feature: it would let me take snapshots of my entire Docker folder, but it wouldn’t let me single out one particular container and back it up more frequently than all the others.
The more I thought about it, the more I began to realise that the best (and probably only) solution would be to write a custom script to run in Synology Task Scheduler. However, that would mean learning Bash scripting and, whilst I’m always keen to learn new skills, it seemed like a lot of hassle and headaches in the making. Besides, I really needed a solution quicker than my tiny little brain could conjure one.
So, what to do?
Well, I’ve messed around with Google Gemini on a few occasions but this was literally stuff like, “Hey Gemini, make me a picture of a black and white cat, wearing sunglasses, and riding on a skateboard whilst playing a Fender Stratocaster”.
Not exactly pushing the boundaries of computer science, methinks.
That said, it did do a very impressive job of said cat picture, so maybe, just maybe, it could help me come up with a solution to my Minecraft server backup dilemma.
So, I headed over to gemini.google.com and typed in the following request…
“Write me a script for Synology Task Scheduler to backup specific folders to a schedule, zip them up, then name them with a date and time stamp. Also with an option to delete backups after a certain timeframe. Please include hashed comments.”
Fully expecting it to respond with, “Sorry, I can only really do cat pictures and questionable song lyrics, will that do?”, you can imagine my utter amazement when, after mere seconds, it came back with this…
#!/bin/bash
# ---- Synology Task Scheduler Backup Script ----
# This script backs up specified folders, zips them with a timestamp and deletes old backups to save space.

# ---- USER CONFIGURATION ----
# Enter the full paths of the folders you want to back up, separated by spaces.
# Example: /volume1/data/folder1 /volume1/photos
SOURCE_FOLDERS="/enter/your/path/here"

# Enter the full path of the destination folder for your backups.
# Example: /volume1/Backups/important_files
DESTINATION_FOLDER="/enter/your/path/here"

# Enter the number of days to keep backups.
# Backups older than this will be automatically deleted.
# Set to 0 to disable automatic deletion.
RETENTION_DAYS=7

# ---- SCRIPT LOGIC (DO NOT EDIT BELOW THIS LINE) ----
# Create a timestamp for the backup file name.
TIMESTAMP=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_FILE_NAME="backup_$TIMESTAMP.zip"
BACKUP_PATH="$DESTINATION_FOLDER/$BACKUP_FILE_NAME"

# Create the destination folder if it does not exist.
mkdir -p "$DESTINATION_FOLDER"

# Log start of the backup process.
echo "Starting backup process..."

# Create a zip archive of the specified source folders.
# The '-q' flag makes it quiet, the '-r' flag recursively includes directories.
# SOURCE_FOLDERS is deliberately unquoted so that multiple space-separated paths are passed individually.
zip -q -r "$BACKUP_PATH" $SOURCE_FOLDERS

# Check if the zip command was successful.
if [ $? -eq 0 ]; then
    echo "Backup successful! File saved to: $BACKUP_PATH"
else
    echo "Backup failed. Please check the source folders and permissions."
    exit 1
fi

# Clean up old backups based on the retention period.
if [ "$RETENTION_DAYS" -gt 0 ]; then
    echo "Deleting backups older than $RETENTION_DAYS days..."
    # Find zip archives in the destination folder that were modified more than RETENTION_DAYS days ago and delete them.
    find "$DESTINATION_FOLDER" -name "backup_*.zip" -mtime +"$RETENTION_DAYS" -exec rm {} \;
    echo "Cleanup complete."
else
    echo "Automatic cleanup is disabled."
fi

# Log end of the backup process.
echo "Backup and cleanup process finished."

# ====================================
# ---- END OF SCRIPT ----
# ====================================
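If you want a concrete idea of how the configuration section might be filled in, here's a rough sketch for a Minecraft server running in Docker on a Synology NAS. The paths below are purely illustrative, not the ones Gemini gave me: your container's volume mapping and backup share will almost certainly be different, but the pattern is the same: point SOURCE_FOLDERS at the folder(s) holding the world data, point DESTINATION_FOLDER at wherever the zips should land, and pick a retention period.

# Illustrative values only; adjust the paths to match your own volume, container and folder layout.
SOURCE_FOLDERS="/volume1/docker/minecraft/data"
DESTINATION_FOLDER="/volume1/Backups/minecraft"
RETENTION_DAYS=7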
I mean, it looks incredibly pretty and everything, right? Beautifully laid out with hashed comments, as requested, so a Bash scripting dummy like myself could hopefully sort of understand it, at least to a point… But would it actually work?
Well, the short answer is, “yes, it actually worked”.
The long answer is, “Holy sh**! Wait, what!? No it did not just do that! Surely not? I mean, WTF!”… [*hastily copies and pastes the code into Synology Task Scheduler, fully expecting it to crash and tell me I’m an idiot who should stick to memes and cat pictures, whilst proceeding to set my NAS on fire to teach me a valuable life lesson*]… “Oh my God… I mean, OH – MY – ACTUAL – F****** – GOD… IT WORKED!!!”… Or something very much along those lines.
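If, like me, you don't entirely trust your own eyes, it's easy enough to sanity-check the result by listing the contents of the newly created archive from an SSH session, assuming the unzip utility is available on your NAS (the path and timestamp below are just an example):

unzip -l /volume1/Backups/minecraft/backup_2025-01-01_03-00-00.zip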
So, there we have it… It’s 2025, AI is very much still in its infancy, yet it can already fully comprehend a complex question, and what I was trying to achieve in relation to that question, then write me a script to achieve it, in a matter of seconds, in a programming language that I’m not yet proficient in… Ady’s – Tiny – Little – Mind – Blown!
By my calculations, however, at some point in the early-to-mid 2030s, it (AI) will probably be able to CTRL-ALT-DELETE the planet to make way for a hyperspace bypass, or some other comically far-fetched reason… But at least I’ll have my Minecraft server nicely backed up when it happens.