Commit 39c172e

DO-168: resolving discussions
1 parent 5d5f562 commit 39c172e

8 files changed: +147, -38 lines

Diff for: CHANGELOG.md

+66
@@ -0,0 +1,66 @@
+<!--
+
+All notable changes to this project will be documented in this file.
+
+The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
+and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+
+Stub:
+## [Unreleased] - YYYY-MM-DD
+### Added
+### Changed
+### Deprecated
+### Removed
+### Fixed
+### Security
+
+-->
+
+# nd_postgres_backup
+
+## [0.3.0] - 2024-01-09
+
+### Added
+
+- Add publishing of the description to Docker Hub from README.md
+- Add jq to the container build
+- Add the ability to change the alias for the S3 connection via the `S3_ALIAS` variable
+- Add a docker-compose [example](compose-example/docker-compose.yml)
+- Add a `crontab` file for managing the schedule
+- Add retention functionality
+
+### Removed
+
+- Remove Go-Cron, replace it with standard cron
+- Cut notifications for hourly backups
+
+### Changed
+
+- Replace the manual backup routine logic. To start a manual backup, run the container without the S3_BUCKET variable. See the [README](README.md)
+- Change the S3 storage directory from `${S3_BUCKET}/${POSTGRES_DB}` to `${S3_BUCKET}/${backup_path}`
+- Rename the notification scripts, adding the `.sh` extension
+
+### Breaking changes
+
+- Removed:
+  - `SCHEDULE` - variable was used by Go-Cron
+  - `HEALTHCHECK_PORT` - variable was used by Go-Cron
+
+## [0.2.2] - 2023-08-25
+- Make docker-entrypoint.sh easier to run manually
+- Fix a bug with the `S3_OBJECT_PATH` var
+
+## [0.2.1] - 2023-08-25
+- Added var `S3_OBJECT_PATH` to define the path to the backup file in the bucket
+
+## [0.2.0] - 2023-08-25
+- Refactoring and verification
+
+### Breaking changes
+
+- Renamed:
+  - `S3_ACCESS_KEY_ID` -> `S3_ACCESS_KEY`
+  - `S3_SECRET_ACCESS_KEY` -> `S3_SECRET_KEY`
+
+## [0.1.0] - 2023-08-03
+- Backup the Postgres DB
+- Send the backup to S3
+- Notify users by Telegram or a private messaging system

Diff for: Dockerfile

+1-1
@@ -24,7 +24,7 @@ RUN mcli --version

 # PostgreSQL related variables
 ENV POSTGRES_DB="**None**" \
-    POSTGRES_HOST="db" \
+    POSTGRES_HOST="**None**" \
     POSTGRES_PORT=5432 \
     POSTGRES_USER="**None**" \
     POSTGRES_PASSWORD="**None**" \

Diff for: README.md

+72-26
@@ -1,14 +1,22 @@
 # PostgreSQL Backup to S3 and retention container

+# Roadmap
+
+- [X] Add support for S3
+- [X] Add CI/CD to publish image to DockerHub
+- [X] Add retention policy settings by env vars
+- [X] Notify about backup status by HTTP-request
+- [X] Add docker-compose example
+
 ## Description

-Image created to automate backing up procedure of PostgreSQL databases, store them to S3 Object storage and implement retention of stored archives with `Grandfather-father-son` backup rotation [scheme](https://en.wikipedia.org/wiki/Backup_rotation_scheme).
-It is also possible to use this container to create a single backup of specific DB. Set full S3 path (e.g `bucket_name/project_name/stage_branch/database_name.tar.gz`) as the value of variable `S3_OBJECT_PATH` to execute single backup
+Image created to automate the backup procedure for PostgreSQL databases, store backups in S3 object storage, and implement retention of the stored archives with a `Grandfather-father-son` backup rotation [scheme](https://en.wikipedia.org/wiki/Backup_rotation_scheme).
+It is also possible to use this container to create a single backup of a specific DB.

 ## Usage

 Key idea of usage was to add this container as a service to the `docker-compose.yml` manifest alongside the PostgreSQL database container. See `compose-example/docker-compose.yml`.
-To run container as a standalone backuaper, to backup cloud SaaS or bare-metal deployed PostgreSQL, for example use following command:
+To run the container as a standalone backupper, for example to back up a cloud SaaS or bare-metal PostgreSQL deployment, use the following command:

 ```shell
 docker run -d --rm \
@@ -27,6 +35,27 @@ docker run -d --rm \
     numdes/nd_postgres_backup:v0.3.0
 ```

+### Manual one-time backup without schedule
+
+Set the full S3 path (e.g. `bucket_name/project_name/stage_branch/database_name.tar.gz`) as the value of the `S3_OBJECT_PATH` variable to execute a single backup:
+
+```shell
+docker run -d --rm \
+    --env POSTGRES_HOST="DB_IP_OR_HOSTNAME" \
+    --env POSTGRES_DB="DB_NAME" \
+    --env POSTGRES_USER="DB_USERNAME" \
+    --env POSTGRES_PORT="NON_DEFAULT_PORT" \
+    --env POSTGRES_PASSWORD="DB_USERNAME_PASSWORD" \
+    --env NOTIFICATION_SERVER_URL="ONLY_SET_IF_PRIVATE_TELEGRAM_BOT_USED" \
+    --env TELEGRAM_CHAT_ID="PRIVATE_OR_TELEGRAM_BOT_ID" \
+    --env S3_ENDPOINT="S3_API_URL" \
+    --env S3_ACCESS_KEY="S3_ACCESS_KEY" \
+    --env S3_SECRET_KEY="S3_SECRET_KEY" \
+    --env S3_OBJECT_PATH="FULL_S3_PATH (e.g. `bucket_name/project_name/stage_branch/database_name.tar.gz`)" \
+    --env S3_ALIAS="S3_CONFIG_SET_ALIAS" \
+    numdes/nd_postgres_backup:v0.3.0
+```
+
 ## Backup strategy

 By default it is set to make a backup every hour, plus one separate backup a day, plus one separate backup a week
@@ -43,27 +72,44 @@ Maximum depth of storage for each type of backup can be tuned by changing values

 Schedule of retention script (`retention.sh`) execution can be edited in the `crontab` file

-## Variables list
-
-| Variable Name | Default Value | Is Mandatory? | Description |
-|---------------------------|:-------------:|:-------------:|---------------------------------------------------------------------------|
-| HOURLY_BACKUP_PATH | `hourly` | NO | Path suffix to hourly-made backups storage |
-| DAILY_BACKUP_PATH | `daily` | NO | Path suffix to daily-made backups storage |
-| WEEKLY_BACKUP_PATH | `weekly` | NO | Path suffix to weekly-made backups storage |
-| WEEKLY_BACKUP_LIMIT | `5` | NO | Max number of weekly backups |
-| DAILY_BACKUP_LIMIT | `10` | NO | Max number of daily backups |
-| HOURLY_BACKUP_LIMIT | `25` | NO | Max number of hourly backups |
-| S3_ACCESS_KEY | - | YES | ${S3_BUCKET} READ-WRITE S3 ACCESS KEY |
-| S3_SECRET_KEY | - | YES | ${S3_BUCKET} READ-WRITE S3 ACCESS SECRET |
-| S3_ENDPOINT | - | YES | S3 API URL |
-| S3_BUCKET | - | YES | Path to hourly, daily, weekly directories will be. Including bucket name |
-| S3_OBJECT_PATH | - | NO | Optional variable to use single backup [functionality](#description) |
-| POSTGRES_DB | - | YES | PostgreSQL database name |
-| POSTGRES_HOST | `db` | NO | PostgreSQL IP or host name |
+## Variables
+
+### `GitHub Actions`
+*[variables](https://docs.github.com/en/actions/security-guides/encrypted-secrets#creating-encrypted-secrets-for-a-repository)*:
+
+| Name | Description |
+|--------------------|-----------------------------|
+| DOCKERHUB_USERNAME | `Actions` Repository secret |
+| DOCKERHUB_TOKEN | `Actions` Repository secret |
+
+### Notification environment variables
+
+| Name | Description |
+|---------------------------|--------------------------------------------------------------------------------|
+| NOTIFICATION_SERVER_URL | URL of the private Telegram bot |
+| TELEGRAM_CHAT_ID | Custom bot ID, or Telegram bot ID when the bot was created using `@botfather` |
+| TELEGRAM_BOT_TOKEN | Bot security token created by `@botfather` |
+
+### Environment variables
+
+| Variable Name | Default Value | Is Mandatory? | Description |
+|---------------------------|:-------------:|:-------------:|----------------------------------------------------------------------|
+| HOURLY_BACKUP_PATH | `hourly` | NO | Path suffix to hourly-made backups storage |
+| DAILY_BACKUP_PATH | `daily` | NO | Path suffix to daily-made backups storage |
+| WEEKLY_BACKUP_PATH | `weekly` | NO | Path suffix to weekly-made backups storage |
+| HOURLY_BACKUP_LIMIT | `25` | NO | Max number of hourly backups |
+| DAILY_BACKUP_LIMIT | `10` | NO | Max number of daily backups |
+| WEEKLY_BACKUP_LIMIT | `5` | NO | Max number of weekly backups |
+| S3_ACCESS_KEY | - | YES | ${S3_BUCKET} READ-WRITE S3 ACCESS KEY |
+| S3_SECRET_KEY | - | YES | ${S3_BUCKET} READ-WRITE S3 ACCESS SECRET |
+| S3_ENDPOINT | - | YES | S3 API URL |
+| S3_BUCKET | - | YES | Path to the hourly, daily, weekly directories, including the bucket name |
+| S3_ALIAS | `backup` | NO | Name of the config set in the `mcli` command `mcli alias set` |
+| S3_OBJECT_PATH | - | NO | Optional variable to use the single backup [functionality](#manual-one-time-backup-without-schedule) |
+| POSTGRES_DB | - | YES | PostgreSQL database name |
+| POSTGRES_HOST | - | YES | PostgreSQL IP or host name |
 | POSTGRES_PORT | `5432` | NO | TCP connection port |
-| POSTGRES_USER | - | YES | DB usermane |
-| POSTGRES_PASSWORD | - | YES | DB username password |
-| POSTGRES_EXTRA_OPTS | `--blobs` | NO | `pg_dump` extra options |
-| NOTIFICATION_SERVER_URL | - | NO | URL of private telegram bot |
-| TELEGRAM_CHAT_ID | - | NO | Custom bot ID or Telegram Bot ID when bot created using `@botfather` |
-| TELEGRAM_BOT_TOKEN | - | NO | Created by `@botfather` bot security token |
+| POSTGRES_USER | - | YES | DB username |
+| POSTGRES_PASSWORD | - | YES | DB user password |
+| POSTGRES_EXTRA_OPTS | `--blobs` | NO | `pg_dump` extra options |
+
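
The `crontab` file mentioned under "Backup strategy" is not shown in this diff. A minimal sketch of the kind of schedule it describes (one hourly, one daily and one weekly backup plus a retention pass) -- the script paths, exact times and the retention invocation are illustrative assumptions, not the shipped file:

```
# Illustrative schedule only -- the crontab shipped in the image may differ.
# backup.sh receives the path suffix (hourly/, daily/, weekly/) as its argument.
0 * * * *   /backup.sh hourly/
30 0 * * *  /backup.sh daily/
45 1 * * 0  /backup.sh weekly/
# Prune each prefix down to its *_BACKUP_LIMIT
0 2 * * *   /retention.sh
```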

Diff for: backup.sh

+3-2
@@ -6,7 +6,7 @@
 set -euo pipefail
 IFS=$'\n\t'

-# Check if we have got path argument from scheduler if not set path to /
+# Check if we have gotten a path argument from the scheduler; if not, set the path to /
 if [[ -z "$1" ]]; then
     backup_path=""
 elif [[ ! "$1" == */ ]]; then
@@ -66,7 +66,8 @@ mcli cp "${ARCHIVE_FILE_NAME}" "${S3_ALIAS}"/"${relative_s3_object_path}"
 echo "Maid is here... Doing cleaning..."
 rm --force "${POSTGRES_DB}".*

-# # Do announce
+# Do announce
+# We are not going to spam the chat every hour, so hourly backups are excluded from notifications
 if [[ ! ${backup_path} =~ ^hourly.? ]]; then
     echo "Starting notification routine..."
     # Check which backup routine applied
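
As a side note, the `mcli cp` call visible in the second hunk header assumes an `mcli` alias has already been registered for the S3 endpoint (the `S3_ALIAS` variable from the README). That setup step is not part of this diff; a minimal sketch of how the pairing could look, using the variable names from the README:

```shell
# Register the S3 endpoint under the alias that backup.sh later uses.
# Assumed to happen once, e.g. at container start -- not shown in this commit.
mcli alias set "${S3_ALIAS:-backup}" "${S3_ENDPOINT}" "${S3_ACCESS_KEY}" "${S3_SECRET_KEY}"

# Upload the archive the same way the hunk above does, then list it to confirm it landed.
mcli cp "${ARCHIVE_FILE_NAME}" "${S3_ALIAS}/${relative_s3_object_path}"
mcli ls "${S3_ALIAS}/${relative_s3_object_path}"
```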

Diff for: compose-example/docker-compose.yml

+1-6
@@ -21,9 +21,4 @@ services:
     depends_on:
       - db

-  redis:
-    image: redis:7
-
-  web:
-    image: ........
-
+  # docker-compose.yml continues ...
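
For orientation, the backup container is meant to sit next to the database as an ordinary compose service (see the README's Usage section). A minimal sketch of what such a service block could look like -- the service names, database image and `env_file` wiring here are assumptions for illustration, not the exact contents of the example file:

```yaml
services:
  db:
    image: postgres:16            # any PostgreSQL image the project already uses
    environment:
      POSTGRES_PASSWORD: some_db_user_password

  nd_postgres_backup:
    image: numdes/nd_postgres_backup:v0.3.0
    env_file:
      - variables.env             # see compose-example/variables.env in this commit
    depends_on:
      - db
```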

Diff for: compose-example/variables.env

+2-2
@@ -1,13 +1,13 @@
 # Mandatory values
-S3_ENDPOINT=https://io.some-domain.com
+S3_ENDPOINT=https://some-s3-api.some-domain.com
 S3_ACCESS_KEY=${S3_ACCESS_KEY_DB_BACKUPS}
 S3_SECRET_KEY=${S3_SECRET_KEY_DB_BACKUPS}
 POSTGRES_DB=some_db_name
 POSTGRES_USER=some_db_username
 POSTGRES_PASSWORD=some_db_user_password

 # Mandatory only in case of scheduled usage
-S3_BUCKET=bucket_name/project_name/stage_branch/ # Including name of the bucket and (optional) the rest of directory structure
+S3_BUCKET=s3_path # Including name of the bucket and (optional) the rest of directory structure. E.g `bucket_name/project_name/stage_branch/`

 # Mandatory only in case of a single backup usage
 S3_OBJECT_PATH=some_s3_path # E.g `bucket_name/project_name/stage_branch/some_db_name.tar.gz`

Diff for: docker-entrypoint.sh

+1
@@ -12,6 +12,7 @@ if [[ ${S3_ACCESS_KEY} == "**None**" ]] ||
    [[ ${S3_ENDPOINT} == "**None**" ]] ||
    [[ ${POSTGRES_DB} == "**None**" ]] ||
    [[ ${POSTGRES_USER} == "**None**" ]] ||
+   [[ ${POSTGRES_HOST} == "**None**" ]] ||
    [[ ${POSTGRES_PASSWORD} == "**None**" ]]; then
     echo "One or more mandatory values is missing. Check your configuration..." >&2
     exit 1
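
With this change (together with the Dockerfile default moving from `db` to `**None**`), POSTGRES_HOST becomes a mandatory variable that the entrypoint guards against. Assuming `docker-entrypoint.sh` is the image entrypoint, as its name suggests, a quick way to see the guard fire is to run the image with no configuration at all; every mandatory value is then still `**None**`, so it should print the error and exit non-zero:

```shell
# Hypothetical smoke test -- expects the "mandatory values" error on stderr and a non-zero exit code
docker run --rm numdes/nd_postgres_backup:v0.3.0
echo "exit code: $?"
```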

Diff for: retention.sh

+1-1
@@ -19,7 +19,7 @@ retention_func() {

    local backup_path="$1"
    local backup_limit="$2"
-   # Here we are getting json-formed data from S3 and conveying to JQ
+   # Here we are getting JSON-formatted data from S3 and piping it to jq,
    # where we are sorting and selecting all backup directories except given last ones
    # How many backups should remain decides $backup_limit variable.
    # Each backup suitable for deletion conveys through `xargs` line for removal
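
The pipeline these comments describe is not itself part of the diff. A minimal sketch of the shape such a function could take, assuming `mcli ls --json` output (whose `key` and `lastModified` fields come from the MinIO client) and the variable names used elsewhere in this commit -- the actual retention.sh may differ:

```shell
# Illustrative sketch only -- not the shipped retention.sh.
retention_sketch() {
    local backup_path="$1"    # e.g. "hourly/"
    local backup_limit="$2"   # how many of the most recent backups to keep

    # List the prefix as JSON, sort newest-first, drop the $backup_limit newest
    # entries, and hand everything older to xargs for recursive removal.
    mcli ls --json "${S3_ALIAS}/${S3_BUCKET}/${backup_path}" \
        | jq -r -s --argjson keep "${backup_limit}" \
            'sort_by(.lastModified) | reverse | .[$keep:] | .[].key' \
        | xargs -r -I {} mcli rm --recursive --force "${S3_ALIAS}/${S3_BUCKET}/${backup_path}{}"
}
```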
