
Merge pull request #2 from robwithhair/specify-filename

Specify filename
Rob Harrison, 7 years ago (committed by GitHub)
commit c4fc4b28cc
  1. postgres-backup-s3/README.md (6 changes)
  2. postgres-backup-s3/backup.sh (8 changes)

postgres-backup-s3/README.md (6 changes)

@@ -40,3 +40,9 @@ You can additionally set the `SCHEDULE` environment variable like `-e SCHEDULE="
 More information about the scheduling can be found [here](http://godoc.org/github.com/robfig/cron#hdr-Predefined_schedules).
+### Overwriting S3 files
+You can use the following environment variable to disable the timestamps and set a custom name for the S3 files.
+- `S3_FILENAME` a consistent filename to overwrite with your backup. If not set, a timestamp will be used.
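A minimal usage sketch for the new variable (the image name and the other values below are assumptions for illustration, not taken from this diff): setting `S3_FILENAME` makes every run overwrite the same S3 object instead of creating a new timestamped one.

```sh
# Hypothetical invocation; image name, database and bucket names are placeholders,
# and the Postgres/S3 credential variables are omitted for brevity.
docker run -d \
  -e POSTGRES_DATABASE=mydb \
  -e S3_BUCKET=my-bucket \
  -e S3_PREFIX=backups \
  -e S3_FILENAME=latest \
  -e SCHEDULE="@daily" \
  some-registry/postgres-backup-s3
```

With these example values each scheduled backup would be written to `s3://my-bucket/backups/latest.sql.gz`; leaving `S3_FILENAME` unset keeps one timestamped file per run.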

postgres-backup-s3/backup.sh (8 changes)

@@ -63,6 +63,12 @@ pg_dump $POSTGRES_HOST_OPTS $POSTGRES_DATABASE | gzip > dump.sql.gz
echo "Uploading dump to $S3_BUCKET"
cat dump.sql.gz | aws $AWS_ARGS s3 cp - s3://$S3_BUCKET/$S3_PREFIX/${POSTGRES_DATABASE}_$(date +"%Y-%m-%dT%H:%M:%SZ").sql.gz || exit 2
if [ "${S3_FILENAME}" == "**None**" ]; then
S3_UPLOAD_PATH=s3://$S3_BUCKET/$S3_PREFIX/${POSTGRES_DATABASE}_$(date +"%Y-%m-%dT%H:%M:%SZ").sql.gz
else
S3_UPLOAD_PATH=s3://$S3_BUCKET/$S3_PREFIX/${S3_FILENAME}.sql.gz
fi
cat dump.sql.gz | aws $AWS_ARGS s3 cp - "$S3_UPLOAD_PATH" || exit 2
echo "SQL backup uploaded successfully"