This is a simple tool written in Bun that runs backup instructions to AWS S3 based on a `./config.toml` configuration file.
It supports backing up single files, files matched by glob patterns, and syncing folders.
> [!CAUTION]
> This tool is not the product of love, but of laziness! There are plenty of awesome self-hosted backup solutions, but I couldn't be bothered to do the research\* and preferred to write my own code in Bun. And yes, I am aware this could be done entirely in bash, but I did not want to deal with that either.
>
> ...and because of that, I don't really recommend anyone use this.
>
> \*Well, technically I did research it, and many of the projects were paid, didn't support S3, had really terrible reviews, etc.
- AWS setup:
  - Create a non-public S3 bucket
  - Add a lifecycle policy to the S3 bucket. If you want to copy-paste:
    - Name it `backup 60d90d`
    - Apply it to the entire bucket, instead of using filter tags
    - Add a `Permanently delete noncurrent versions of objects` rule:
      - Days after objects become noncurrent: 60
      - Number of newer versions to retain - Optional: 90
  - Create an IAM user and attach to it a policy based on `iam-policy.json`; make sure to edit `NAME-OF-YOUR-BUCKET`.
- Copy `.env.template` to `.env` with the correct AWS creds for the IAM user you just created.
- Install the AWS CLI: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
- Edit `config.toml` as per the instructions below
- Run `bun install`
- Run `bun run backup` once and check that the files are correctly backed up to your bucket
- Schedule auto backups:
  - Modify `s3-backup-tool.xml` to change `E:\s3-backup-tool` to wherever this tool is located on your PC
  - Open Windows' Task Scheduler and import `s3-backup-tool.xml`
  - Configure the triggers as you see fit
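If you prefer to set the lifecycle rule up from the CLI instead of the console, the rule described above maps onto S3's lifecycle configuration roughly like this (field names follow the S3 API; adjust to taste):

```json
{
  "Rules": [
    {
      "ID": "backup 60d90d",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": {
        "NoncurrentDays": 60,
        "NewerNoncurrentVersions": 90
      }
    }
  ]
}
```

Save it as `lifecycle.json` and apply it with `aws s3api put-bucket-lifecycle-configuration --bucket NAME-OF-YOUR-BUCKET --lifecycle-configuration file://lifecycle.json`.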
The `config.toml` file supports four types of backup sources:
```toml
## The storage class to be used.
## Valid choices are: STANDARD | REDUCED_REDUNDANCY | STANDARD_IA | ONEZONE_IA | INTELLIGENT_TIERING | GLACIER | DEEP_ARCHIVE | GLACIER_IR. Defaults to STANDARD_IA.
storageClass = "STANDARD_IA"
```
```toml
## Backing up single files.
## This is done via Bun's built-in S3 API.
[[sources]]
type = "single-file"
path = "C:/path/to/source/file.ext" # Absolute path to the source file
dest = "destination/path/file.ext"  # Destination path in the backup storage
```
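As a rough illustration of what a single-file upload looks like: the `write` signature below mirrors the shape of Bun's documented `S3Client.write(key, data, options)`, but the client is injected so the example runs without credentials. This is a sketch, not the tool's actual code.

```typescript
import { readFile } from "node:fs/promises";

// Minimal interface mirroring the shape of Bun's S3Client.write();
// injecting it keeps the example runnable without network access.
interface S3Like {
  write(key: string, data: Uint8Array, options?: { storageClass?: string }): Promise<number>;
}

// Hypothetical helper: read the local file and upload it under the
// destination key, resolving to the number of bytes written.
async function backupSingleFile(
  client: S3Like,
  source: { path: string; dest: string },
  storageClass = "STANDARD_IA",
): Promise<number> {
  const bytes = new Uint8Array(await readFile(source.path));
  return client.write(source.dest, bytes, { storageClass });
}
```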
```toml
## Backing up the latest file from a source that matches a pattern (useful for rotating backups).
## This is done via Bun's built-in S3 API.
[[sources]]
type = "dynamic-file"
path = "C:/path/to/source/directory/" # Directory containing the files
pattern = "backup_*.zip"              # Pattern to match (uses glob syntax)
dest = "destination/path/backup.zip"  # Destination path in the backup storage
```
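The "latest matching file" selection can be sketched in plain TypeScript; `pickLatest` and `globToRegExp` are hypothetical helpers for illustration, not the tool's actual code:

```typescript
// Convert a simple glob pattern to a RegExp: "*" matches any run of
// characters except path separators, "?" matches a single one.
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern.replace(/[.+^${}()|[\]\\]/g, "\\$&");
  return new RegExp(
    "^" + escaped.replace(/\*/g, "[^/\\\\]*").replace(/\?/g, "[^/\\\\]") + "$",
  );
}

// Given candidate file names with modification times, return the newest
// one matching the pattern (or undefined if nothing matches).
function pickLatest(
  files: { name: string; mtimeMs: number }[],
  pattern: string,
): string | undefined {
  const re = globToRegExp(pattern);
  return files
    .filter((f) => re.test(f.name))
    .sort((a, b) => b.mtimeMs - a.mtimeMs)[0]?.name;
}
```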
```toml
## Backing up entire directories.
## This is done through the AWS CLI's `s3 sync` command.
[[sources]]
type = "folder-sync"
path = "C:/path/to/source/directory" # Absolute path to the source directory
dest = "destination/path/"           # Destination path in the backup storage
```
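Since folder-sync shells out to `aws s3 sync`, the argument list it would build can be sketched like this (`buildSyncArgs` is a hypothetical helper; the tool's actual flags may differ):

```typescript
// Hypothetical sketch: build the argument list for the `aws s3 sync`
// invocation behind a folder-sync source.
function buildSyncArgs(
  source: { path: string; dest: string },
  bucket: string,
  storageClass = "STANDARD_IA",
): string[] {
  return [
    "s3",
    "sync",
    source.path,                     // local source directory
    `s3://${bucket}/${source.dest}`, // destination prefix in the bucket
    "--storage-class",
    storageClass,
  ];
}

// e.g. spawn with: Bun.spawn(["aws", ...buildSyncArgs(src, "my-bucket")])
```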
```toml
## Backing up entire directories via a 7z archive.
## Windows-only: requires the 7-Zip CLI available as `7z` in PATH.
[[sources]]
type = "folder-archive"
path = "C:/path/to/source/directory" # Absolute path to the source directory
dest = "destination/path/archive.7z" # Destination file in the backup storage
```

TODO:

- Add a `folder-archive` source, which compresses a folder's contents before uploading it as a single file.
- Add a check for env vars
- Fix `utils.s3UploadFile` not getting the output bytes.
- Fix `utils.s3SyncFolder`'s output and success detection.
- Detect the number of changed files, print it, and save it to a log.
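The env-var check from the TODO list could look something like this (the variable names follow the usual AWS convention; the actual names in `.env.template` may differ):

```typescript
// Hypothetical sketch of a startup env-var check: collect the names of
// any required AWS variables that are unset or blank, so the tool can
// fail fast with a clear message instead of a cryptic S3 error.
const REQUIRED_ENV = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION"];

function missingEnvVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_ENV.filter((name) => !env[name]?.trim());
}

// At startup the tool could then do:
// const missing = missingEnvVars(process.env);
// if (missing.length > 0) throw new Error(`Missing env vars: ${missing.join(", ")}`);
```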