	postgres-backup-s3
Backup PostgreSQL to S3 (supports periodic backups)
Usage
Docker:
$ docker run -e S3_ACCESS_KEY_ID=key -e S3_SECRET_ACCESS_KEY=secret -e S3_BUCKET=my-bucket -e S3_PREFIX=backup -e POSTGRES_DATABASE=dbname -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_HOST=localhost schickling/postgres-backup-s3
Docker Compose:
postgres:
  image: postgres
  environment:
    POSTGRES_USER: user
    POSTGRES_PASSWORD: password
pgbackups3:
  image: schickling/postgres-backup-s3
  depends_on:
    - postgres
  links:
    - postgres
  environment:
    SCHEDULE: '@daily'
    S3_REGION: region
    S3_ACCESS_KEY_ID: key
    S3_SECRET_ACCESS_KEY: secret
    S3_BUCKET: my-bucket
    S3_PREFIX: backup
    POSTGRES_BACKUP_ALL: "false"
    POSTGRES_HOST: host
    POSTGRES_DATABASE: dbname
    POSTGRES_USER: user
    POSTGRES_PASSWORD: password
    POSTGRES_EXTRA_OPTS: '--schema=public --blobs'
Automatic Periodic Backups
You can additionally set the SCHEDULE environment variable like -e SCHEDULE="@daily" to run the backup automatically; see the example below.
More information about the supported schedule expressions can be found in the scheduler's documentation.
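For instance, the one-off docker run invocation from the Usage section becomes a recurring daily backup once SCHEDULE is added (all credential and connection values are placeholders, as above):
$ docker run -e SCHEDULE="@daily" -e S3_ACCESS_KEY_ID=key -e S3_SECRET_ACCESS_KEY=secret -e S3_BUCKET=my-bucket -e S3_PREFIX=backup -e POSTGRES_DATABASE=dbname -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_HOST=localhost schickling/postgres-backup-s3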
Backup All Databases
You can back up all available databases by setting POSTGRES_BACKUP_ALL="true".
A single archive named all_<timestamp>.sql.gz will be uploaded to S3; see the example below.
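A minimal sketch of such a full-cluster backup run, reusing the placeholder values from the Usage section (this assumes POSTGRES_DATABASE can be omitted when all databases are backed up):
$ docker run -e POSTGRES_BACKUP_ALL="true" -e S3_ACCESS_KEY_ID=key -e S3_SECRET_ACCESS_KEY=secret -e S3_BUCKET=my-bucket -e S3_PREFIX=backup -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_HOST=localhost schickling/postgres-backup-s3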
Endpoints for S3
An endpoint is the URL of the entry point for an AWS web service or an S3-compatible storage provider.
You can specify an alternate endpoint by setting the S3_ENDPOINT environment variable in the form protocol://endpoint; see the example below.
Note: an S3-compatible storage provider requires the S3_ENDPOINT environment variable to be set.
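For example, to send backups to a self-hosted S3-compatible provider instead of AWS (the endpoint URL below is a hypothetical placeholder; substitute your provider's):
$ docker run -e S3_ENDPOINT=https://s3.example.com -e S3_ACCESS_KEY_ID=key -e S3_SECRET_ACCESS_KEY=secret -e S3_BUCKET=my-bucket -e S3_PREFIX=backup -e POSTGRES_DATABASE=dbname -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_HOST=localhost schickling/postgres-backup-s3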