
Added filename but untested

pull/119/head
Rob Harrison 7 years ago
parent commit 46d458a750
1. postgres-backup-s3/Dockerfile (1 change)
2. postgres-backup-s3/README.md (6 changes)
3. postgres-backup-s3/backup.sh (8 changes)

postgres-backup-s3/Dockerfile (1 change)

@@ -15,6 +15,7 @@ ENV S3_SECRET_ACCESS_KEY **None**
 ENV S3_BUCKET **None**
 ENV S3_REGION us-west-1
 ENV S3_PATH 'backup'
+ENV S3_FILENAME **None**
 ENV S3_ENDPOINT **None**
 ENV S3_S3V4 no
 ENV SCHEDULE **None**
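For context, the new variable can be overridden at run time like the other S3 settings. A minimal sketch, assuming the image is built locally with the tag postgres-backup-s3 (the tag and all values below are placeholders, not taken from this commit; the remaining required POSTGRES_* and S3 credential variables are omitted):

    docker run \
      -e S3_BUCKET=my-bucket \
      -e S3_FILENAME=latest \
      -e POSTGRES_DATABASE=mydb \
      postgres-backup-s3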

postgres-backup-s3/README.md (6 changes)

@@ -40,3 +40,9 @@ You can additionally set the `SCHEDULE` environment variable like `-e SCHEDULE="
 More information about the scheduling can be found [here](http://godoc.org/github.com/robfig/cron#hdr-Predefined_schedules).
+### Overwriting S3 files
+You can use the following environment variable to disable the timestamps and set a custom name for the S3 files.
+- `S3_FILENAME` a consistent filename to overwrite with your backup. If not set, a timestamp will be used.
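To illustrate the naming behaviour described in the new README section (bucket, prefix, and database names are placeholders):

    # default, S3_FILENAME unset: a new timestamped object per run
    s3://my-bucket/backup/mydb_2018-01-01T00:00:00Z.sql.gz
    # with S3_FILENAME=latest: a single fixed object, overwritten on every run
    s3://my-bucket/backup/latest.sql.gz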

postgres-backup-s3/backup.sh (8 changes)

@@ -63,6 +63,12 @@ pg_dump $POSTGRES_HOST_OPTS $POSTGRES_DATABASE | gzip > dump.sql.gz
 echo "Uploading dump to $S3_BUCKET"
-cat dump.sql.gz | aws $AWS_ARGS s3 cp - s3://$S3_BUCKET/$S3_PREFIX/${POSTGRES_DATABASE}_$(date +"%Y-%m-%dT%H:%M:%SZ").sql.gz || exit 2
+if [ "${S3_FILENAME}" == "**None**" ]; then
+  S3_UPLOAD_PATH=s3://$S3_BUCKET/$S3_PREFIX/${POSTGRES_DATABASE}_$(date +"%Y-%m-%dT%H:%M:%SZ").sql.gz
+else
+  S3_UPLOAD_PATH=s3://$S3_BUCKET/$S3_PREFIX/${S3_FILENAME}.sql.gz
+fi
+cat dump.sql.gz | aws $AWS_ARGS s3 cp - "$S3_UPLOAD_PATH" || exit 2
 echo "SQL backup uploaded successfully"