Follow these steps to get the add-on installed on your system:
- Enable Advanced Mode in your Home Assistant user profile.
- Navigate in your Home Assistant frontend to Supervisor -> Add-on Store.
- Search for "Amazon S3 Backup" add-on and click on it.
- Click on the "INSTALL" button.
- Set the `aws_access_key`, `aws_secret_access_key`, and `bucket_name` configuration options.
- Optionally, or if necessary, change the `bucket_region`, `storage_class`, `delete_local_backups`, and `local_backups_to_keep` configuration options.
- Start the add-on to sync the `/backup/` directory to the configured `bucket_name` on Amazon S3. You can also automate this, of course; see the example below.

To automate backup creation and syncing to Amazon S3, add these two automations to Home Assistant's `configuration.yaml` and adjust them to your needs:

```yaml
automation:
  # Create a full backup every day at 4am
  - id: backup_create_full_backup
    alias: Create a full backup every day at 4am
    trigger:
      platform: time
      at: "04:00:00"
    action:
      service: hassio.backup_full
      data:
        # Uses the 'now' object of the trigger to create a more user-friendly name (e.g. '202101010400_automated-backup')
        name: "{{as_timestamp(trigger.now)|timestamp_custom('%Y%m%d%H%M', true)}}_automated-backup"

  # Starts the add-on 15 minutes after every hour to make sure it syncs all backups, also manual ones, as soon as possible
  - id: backup_upload_to_s3
    alias: Upload to S3
    trigger:
      platform: time_pattern
      # Matches 15 minutes past every hour
      minutes: 15
    action:
      service: hassio.addon_start
      data:
        addon: XXXXX_amazon-s3-backup
```
The automations above first create a full backup at 4am; at 4:15am the add-on then syncs it to Amazon S3 and, if configured, deletes local backups according to your settings.
Example add-on configuration:

```yaml
aws_access_key: AKXXXXXXXXXXXXXXXX
aws_secret_access_key: XXXXXXXXXXXXXXXX
bucket_name: my-bucket
bucket_region: eu-central-1
storage_class: STANDARD
delete_local_backups: true
local_backups_to_keep: 3
```
- `aws_access_key`: AWS IAM access key used to access the S3 bucket.
- `aws_secret_access_key`: AWS IAM secret access key used to access the S3 bucket.
- `bucket_name`: Amazon S3 bucket used to store backups.
- `bucket_region`: AWS region where the S3 bucket was created. See https://aws.amazon.com/about-aws/global-infrastructure/ for all available regions.
- `storage_class`: Amazon S3 storage class to use for the synced objects when uploading files to S3. One of STANDARD, REDUCED_REDUNDANCY, STANDARD_IA, ONEZONE_IA, INTELLIGENT_TIERING, GLACIER, DEEP_ARCHIVE. For more information see https://aws.amazon.com/s3/storage-classes/.
- `delete_local_backups`: Should the add-on remove the oldest local backups after syncing to your Amazon S3 bucket? You can configure how many local backups to keep with the `local_backups_to_keep` option. The oldest backups are deleted first.
- `local_backups_to_keep`: How many backups to keep locally. If you want to disable automatic local cleanup, set `delete_local_backups` to `false`.
If you also want to automatically delete old backups to keep your Amazon S3 bucket clean, or change the storage class of older backups to save some money, take a look at S3 Lifecycle Rules (https://docs.aws.amazon.com/AmazonS3/latest/userguide/how-to-set-lifecycle-configuration-intro.html).
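
As a rough starting point, a lifecycle rule like the following sketch moves backups to a cheaper storage class and eventually expires them. The bucket name, rule ID, prefix, and day counts are placeholders you would adapt to your own setup; they are not part of this add-on's configuration.

```sh
# Sketch: transition backups to GLACIER after 30 days and delete them after 365 days.
# Bucket name, rule ID, prefix, and day counts are placeholder values - adjust to your needs.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-bucket \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "manage-old-backups",
        "Filter": {"Prefix": ""},
        "Status": "Enabled",
        "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": 365}
      }
    ]
  }'
```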
I recommend creating a new IAM user that:

- cannot log in to the AWS Console
- can only access AWS programmatically
- is used by this add-on only
- uses the lowest possible IAM policy, which is this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAWSS3Sync",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::YOUR-S3-BUCKET-NAME/*",
        "arn:aws:s3:::YOUR-S3-BUCKET-NAME"
      ]
    }
  ]
}
```
Using the add-on requires some knowledge of Amazon S3 and AWS IAM. Under the hood it uses the AWS CLI version 1, specifically the `aws s3 sync` command.
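
Conceptually, what the add-on runs is roughly equivalent to the following command. The bucket name and storage class are just the values from the example configuration above, and the exact destination path inside the bucket may differ from what the add-on actually uses.

```sh
# Sketch: sync the local Home Assistant backup directory to the configured S3 bucket.
# Bucket name and storage class are the example values from above; the real
# destination prefix used by the add-on may differ.
aws s3 sync /backup/ s3://my-bucket/ --storage-class STANDARD
```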
This add-on is heavily inspired by https://github.com/gdrapp/hass-addons and https://github.com/rrostt/hassio-backup-s3.