Collection of invoke commands used by Saritasa
- Installation
- Configuration
- Modules
## Installation

```bash
pip install saritasa-invocations
```

or if you are using poetry:

```bash
poetry add saritasa-invocations
```

## Configuration

Configuration can be set in the `tasks.py` file.

Below is an example config:
```python
import invoke

import saritasa_invocations

ns = invoke.Collection(
    saritasa_invocations.docker,
    saritasa_invocations.git,
    saritasa_invocations.github_actions,
    saritasa_invocations.pre_commit,
    saritasa_invocations.system,
)

# Configurations for run command
ns.configure(
    {
        "run": {
            "pty": True,
            "echo": True,
        },
        "saritasa_invocations": saritasa_invocations.Config(
            pre_commit=saritasa_invocations.PreCommitSettings(
                hooks=(
                    "pre-commit",
                    "pre-push",
                    "commit-msg",
                )
            ),
            git=saritasa_invocations.GitSettings(
                merge_ff="true",
                pull_ff="only",
            ),
            docker=saritasa_invocations.DockerSettings(
                main_containers=(
                    "opensearch",
                    "redis",
                ),
            ),
            system=saritasa_invocations.SystemSettings(
                vs_code_settings_template=".vscode/recommended_settings.json",
                settings_template="config/.env.local",
                save_settings_from_template_to="config/.env",
            ),
            # Default K8S settings shared between envs
            k8s_defaults=saritasa_invocations.K8SDefaultSettings(
                proxy="teleport.company.com",
                db_config=saritasa_invocations.K8SDBSettings(
                    namespace="db",
                    pod_selector="app=pod-selector-db",
                ),
            ),
        ),
    },
)

# For K8S settings you just need to create an instance of K8SSettings for each
# environment. They will all be collected automatically.
saritasa_invocations.K8SSettings(
    name="dev",
    cluster="teleport.company.somewhere.com",
    namespace="project_name",
)
saritasa_invocations.K8SSettings(
    name="prod",
    cluster="teleport.client.somewhere.com",
    namespace="project_name",
    proxy="teleport.client.com",
)
```
## Modules

### printing

While this module doesn't contain any invocations, it's used to print messages
via `rich.panel.Panel`. There are three types:

- `print_success` - print message in a green panel
- `print_warning` - print message in a yellow panel
- `print_error` - print message in a red panel
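As an illustration, here is a stdlib-only sketch of what such a panel print amounts to. The real helpers render through `rich`; the `print_panel` function, its ANSI colors, and the box drawing below are approximations, not the library's implementation.

```python
# Hedged sketch: a stdlib-only approximation of print_success / print_warning /
# print_error. The actual helpers use rich.panel.Panel; this just wraps a
# message in a colored box using ANSI escape codes.
GREEN, YELLOW, RED, RESET = "\033[32m", "\033[33m", "\033[31m", "\033[0m"

def print_panel(message: str, color: str) -> str:
    # Build a box slightly wider than the message.
    width = len(message) + 2
    panel = "\n".join((
        f"{color}+{'-' * width}+{RESET}",
        f"{color}| {message} |{RESET}",
        f"{color}+{'-' * width}+{RESET}",
    ))
    print(panel)
    return panel

print_panel("Done!", GREEN)
```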
### system

Copies the local template for settings into the specified file.

Settings:

- `settings_template` - path to settings template (Default: `config/settings/local.template.py`)
- `save_settings_from_template_to` - path where to save settings (Default: `config/settings/local.py`)

Copies the local template for VS Code settings into the `.vscode` folder.

Settings:

- `vs_code_settings_template` - path to settings template (Default: `.vscode/recommended_settings.json`)

Change ownership of files to a user (current user by default).
Shortcut for owning the apps dir by the specified user after some files were generated using docker-compose (migrations, new app, etc.).

Create a folder for temporary files (`.tmp`).
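Conceptually, the temp-folder task boils down to something like the sketch below. The function name and signature are illustrative, not the library's actual API.

```python
# Hedged sketch of a "create temp folder" helper: ensure a .tmp directory
# exists at the given root. Name and signature are illustrative.
from pathlib import Path

def create_tmp_folder(root: str = ".") -> Path:
    tmp_dir = Path(root) / ".tmp"
    tmp_dir.mkdir(exist_ok=True)  # no error if it already exists
    return tmp_dir
```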
### git

Set a git setting in the config.

Perform git setup:

- Install pre-commit hooks
- Set `merge.ff`
- Set `pull.ff`

Settings:

- `merge_ff` - setting value for `merge.ff` (Default: `false`)
- `pull_ff` - setting value for `pull.ff` (Default: `only`)

Clone a repo, or pull the latest changes into the specified repo.

Command for creating copies of a file while preserving its git blame history.
Original script written in bash here.

Usage:

```bash
inv git.blame-copy <path to original file> <path to copy>,<path to copy>...
```

If `<path to copy>` is a file, the data will be copied into it.
If `<path to copy>` is a directory, the data will be copied into that
directory under the original name.
Algorithm:

- Remember the current HEAD state
- For each copy path: move the file to the copy path, restore the file using
  `checkout`, and remember the resulting commits
- Restore the state of the branch
- Move the file to a temp file
- Merge the copy commits into the branch
- Move the file back to its original path from the temp file

Settings:

- `copy_commit_template` - template for commits created during the command workflow
- `copy_init_message_template` - template for the init message printed at command start

Template variables:

- `action` - the copy algorithm consists of several intermediate actions
  (creating temporary files, merging commits, etc.); the `action` variable
  stores the header of the intermediate action
- `original_path` - contains the value of the first argument of the command
  (the path of the original file that will be copied)
- `destination_paths` - sequence of paths to which the original file will be copied
- `project_task` - project task parsed from the current git branch; empty if no
  task is found in the branch

Default values for templates:

`copy_commit_template`:

```python
"[automated-commit]: {action}\n\n"
"copy: {original_path}\n"
"to:\n* {destination_paths}\n\n"
"{project_task}"
```

`copy_init_message_template`:

```python
"Copy {original_path} to:\n"
"* {destination_paths}\n\n"
"Count of created commits: {commits_count}"
```
### pre-commit

Install git hooks via pre-commit.

Settings:

- `hooks` - list of hooks to install (Default: `["pre-commit", "pre-push", "commit-msg"]`)

Run all hooks against all files.

Update pre-commit dependencies.
### docker

Build a service image from docker compose.

Build the project via pack-cli.

Settings:

- `buildpack_builder` - image tag of the builder (Default: `paketobuildpacks/builder:base`)
- `buildpack_runner` - image tag of the runner (Default: `paketobuildpacks/run:base`)
- `build_image_tag` - tag for the built image (Default: name of project from `project_name`)
- `buildpack_requirements_path` - path to the folder with requirements (Default: `requirements`)

Shortcut for stopping ALL running docker containers.

Bring up main containers and start them.

Settings:

- `main_containers` - list of main containers (Default: `["postgres", "redis"]`)

Stop main containers.

Settings:

- `main_containers` - list of main containers (Default: `["postgres", "redis"]`)

Stop and remove all containers defined in docker-compose. Also remove images.

Add hosts to `/etc/hosts`.

Settings:

- `hosts` - hosts to add (Default: see `docker-main-containers`)
### python

As of now we support two environments for python: `local` and `docker`.

- `local` is the python located in your current virtualenv
- `docker` is the python located inside the docker image of your service (`python_docker_service`)

This was done to make it possible to run code against an environment close to
the deployed one, or simply to test things out.

Example of usage:

```bash
PYTHON_ENV=docker inv python.run --command="--version"
```

Run a python command depending on the `PYTHON_ENV` variable (`docker` or `local`).

Settings:

- `entry` - python entry command (Default: `python`)
- `docker_service` - python service name (Default: `web`)
- `docker_service_params` - params for docker (Default: `--rm`)
### django

Run `manage.py` with the specified command.
This command also handles starting required services and waiting for the DB to be ready.

Requires `django_probes`

Settings:

- `manage_file_path` - path to the `manage.py` file (Default: `./manage.py`)

Run the `makemigrations` command and chown the created migrations (docker env only).

Check whether there are new migrations or not. The result should be checked via exit code.

Run the `migrate` command.

Settings:

- `migrate_command` - migrate command (Default: `migrate`)

Reset the database to its initial state (including the test DB).

Requires `django-extensions`

Settings:

- `settings_path` - default django settings (Default: `config.settings.local`)

Create a superuser.

Settings:

- `default_superuser_email` - default email of the superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root@localhost`)
- `default_superuser_username` - default username of the superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root`)
- `default_superuser_password` - default password of the superuser (Default: `root`)
- `verbose_email_name` - verbose name for the `email` field (Default: `Email address`)
- `verbose_username_name` - verbose name for the `username` field (Default: `Username`)
- `verbose_password_name` - verbose name for the `password` field (Default: `Password`)

Note: values for `verbose_email_name`, `verbose_username_name` and
`verbose_password_name` should match the verbose names of the model that uses
this setting.
Run the development web-server.

Settings:

- `runserver_docker_params` - params for docker (Default: `--rm --service-ports`)
- `runserver_command` - runserver command (Default: `runserver_plus`)
- `runserver_host` - host of the server (Default: `0.0.0.0`)
- `runserver_port` - port of the server (Default: `8000`)
- `runserver_params` - params for the runserver command (Default: `""`)

Shortcut for the `manage.py shell` command.

Settings:

- `shell_command` - command to start a python shell (Default: `shell_plus --ipython`)

Open a database shell with credentials from the current django settings.

Generate and recompile translation messages.

Requires `gettext`

Settings:

- `makemessages_params` - params for the makemessages command (Default: `--all --ignore venv`)
- `compilemessages_params` - params for the compilemessages command (Default: `""`)
Reset the db and load a db dump.
Uses `resetdb` and `load-db-dump`.

Settings:

- `django_settings_path` - default django settings (Default: `config.settings.local`)

Back up the local db.
Uses `backup_local_db`.

Settings:

- `settings_path` - default django settings (Default: `config.settings.local`)
Make a dump of the remote db and download it.
Uses `create_dump` and `get-dump`.

Settings:

- `settings_path` - default django settings (Default: `config.settings.local`)
- `remote_db_config_mapping` - mapping of db config. Default:

```python
{
    "dbname": "RDS_DB_NAME",
    "host": "RDS_DB_HOST",
    "port": "RDS_DB_PORT",
    "username": "RDS_DB_USER",
    "password": "RDS_DB_PASSWORD",
}
```
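The mapping values are environment variable names. As a rough sketch of how such a mapping could resolve into an actual db config (the mapping is the documented default, while the env value and the lookup code are illustrative, not the library's implementation):

```python
# Hedged sketch: resolving remote_db_config_mapping against environment
# variables. The mapping is the documented default; the env value below
# is hypothetical.
import os

remote_db_config_mapping = {
    "dbname": "RDS_DB_NAME",
    "host": "RDS_DB_HOST",
    "port": "RDS_DB_PORT",
    "username": "RDS_DB_USER",
    "password": "RDS_DB_PASSWORD",
}

os.environ["RDS_DB_NAME"] = "app_db"  # illustrative value

db_config = {
    key: os.environ.get(env_var, "")
    for key, env_var in remote_db_config_mapping.items()
}
```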
Make a dump of the remote db, download it, and apply it to the local db.
Uses `create_dump`, `get-dump` and `load-db-dump`.

Settings:

- `settings_path` - default django settings (Default: `config.settings.local`)

Create a django app from a template using cookiecutter.

Settings:

- `app_boilerplate_link` - link to the app template
- `app_template_directory` - path to the app template in the project template (Default: `.`)
- `apps_path` - path to the apps folder in the project (Default: `apps`)
### fastapi

Launch docker compose and wait for the database connection.

Run the development web-server.

Settings:

- `docker_params` - params for docker (Default: `--rm --service-ports`)
- `uvicorn_command` - uvicorn command (Default: `-m uvicorn`)
- `app` - path to the fastapi app (Default: `config:fastapi_app`)
- `host` - host of the server (Default: `0.0.0.0`)
- `port` - port of the server (Default: `8000`)
- `params` - params for uvicorn (Default: `--reload`)
### alembic

Run an alembic command.

Settings:

- `command` - alembic command (Default: `-m alembic`)
- `connect_attempts` - number of attempts to connect to the database (Default: `10`)

Generate migrations.

Settings:

- `migrations_folder` - migration files location (Default: `db/migrations/versions`)

Upgrade the database.

Downgrade the database.

Check whether there are any missing migrations to be generated.

Check migration files for adjust messages.

Settings:

- `migrations_folder` - migration files location (Default: `db/migrations/versions`)
- `adjust_messages` - list of alembic adjust messages (Default: `# ### commands auto generated by Alembic - please adjust! ###`, `# ### end Alembic commands ###`)

Reset the db and load a db dump.
Uses `downgrade` and `load-db-dump`.

Requires `python-decouple` (installed with `[env_settings]`).

Settings:

- `db_config_mapping` - mapping of db config. Default:

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
Back up the local db.
Uses `backup_local_db`.

Requires `python-decouple` (installed with `[env_settings]`).

Settings:

- `db_config_mapping` - mapping of db config. Default:

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```

Make a dump of the remote db and download it.
Uses `create_dump` and `get-dump`.

Requires `python-decouple` (installed with `[env_settings]`).

Settings:

- `db_config_mapping` - mapping of db config. Default:

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```

Make a dump of the remote db, download it, and apply it to the local db.
Uses `create-dump`, `get-dump` and `load-db-dump`.

Requires `python-decouple` (installed with `[env_settings]`).

Settings:

- `db_config_mapping` - mapping of db config. Default:

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
### celery

Launch docker compose and wait for the database connection.

Start a celery worker.

Settings:

- `app` - path to the app (Default: `config.celery.app`)
- `scheduler` - scheduler (Default: `django`)
- `loglevel` - log level for celery (Default: `info`)
- `extra_params` - extra params for the worker (Default: `("--beat",)`)
- `local_cmd` - command for celery (Default: `celery --app {app} worker --scheduler={scheduler} --loglevel={info} {extra_params}`)
- `service_name` - name of the celery service (Default: `celery`)

Send a task to the celery worker.

Settings:

- `app` - path to the app (Default: `config.celery.app`)
### open-api

Check that the generated OpenAPI spec is valid. This command uses
`drf-spectacular` and its default validator. It creates a spec file in the
`./tmp` folder and then validates it.
### db

Load a db dump into the local db.

Settings:

- `load_dump_command` - template for the load command (Default located in `_config.py > DBSettings`)
- `dump_filename` - filename for the dump (Default: `local_db_dump`)
- `load_additional_params` - additional params for the load command (Default: `--quite`)

Back up the local db.

Settings:

- `dump_command` - template for the dump command (Default located in `_config.py > DBSettings`)
- `dump_filename` - filename for the dump (Default: `local_db_dump`)
- `dump_additional_params` - additional params for the dump command (Default: `--no-owner`)
### k8s

For K8S settings you just need to create an instance of `K8SSettings` for each
environment. They will all be collected automatically.

Login into k8s via teleport.

Settings:

- `proxy` - teleport proxy (REQUIRED)
- `port` - teleport port (Default: `443`)
- `auth` - teleport auth method (Default: `github`)

Set the k8s context to the current project.

Settings:

- `namespace` - namespace for k8s (Default: name of project from `project_name`)

Get logs for a k8s pod.

Settings:

- `default_component` - default component (Default: `backend`)

Get pods from k8s.

Execute a command inside a k8s pod.

Settings:

- `default_component` - default component (Default: `backend`)
- `default_entry` - default entry cmd (Default: `/cnb/lifecycle/launcher bash`)

Enter a python shell inside a k8s pod.

Settings:

- `default_component` - default component (Default: `backend`)
- `python_shell` - shell cmd (Default: `shell_plus`)

Check the health of a component.

Settings:

- `default_component` - default component (Default: `backend`)
- `health_check` - health check cmd (Default: `health_check`)

Download a file from a pod.

Settings:

- `default_component` - default component (Default: `backend`)
While you probably won't use this module directly, commands from some other
modules use it (e.g. getting a remote db dump).

Make sure to set up these configs:

- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)

Execute the dump command in the db pod.

Settings:

- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching the db pod (Default located in `_config.py > K8SDBSettings`)
- `dump_filename` - default dump filename (Default: name of project from `project_name` plus `_db_dump`)
- `dump_command` - dump command template (Default located in `_config.py > K8SDBSettings`)
- `dump_dir` - folder where to put the dump file (Default: `tmp`)
- `dump_additional_params` - additional dump params (Default: `--no-owner`)

Download db data from the db pod if it is present.

Settings:

- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching the db pod (Default located in `_config.py > K8SDBSettings`)
- `dump_filename` - default dump filename (Default: name of project from `project_name` plus `_db_dump`)
### cruft

Cruft is a tool used to synchronize changes from cookiecutter-based boilerplates.

Check that there are no cruft files (`*.rej`).
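A minimal sketch of what such a check comes down to: look for leftover `*.rej` files, which cruft creates when a template update cannot be applied cleanly. The function name is illustrative; the real task likely also reports the offending files and fails the run.

```python
# Hedged sketch: find leftover cruft reject files (*.rej) under a root folder.
from pathlib import Path

def find_rej_files(root: str = ".") -> list[Path]:
    # Recursively collect *.rej files under root.
    return sorted(Path(root).rglob("*.rej"))
```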
Not an invocation, but a shortcut for creating cruft projects for testing boilerplates.
### poetry

Install dependencies via poetry.

Update dependencies with respect to version constraints using the poetry up plugin.
Falls back to `poetry update` in case of an error.

Update dependencies to the latest versions using the poetry up plugin.
By default falls back to the `update` task in case of an error.
Use `--no-fallback` to stop on error.
### pip

Install dependencies via pip.

Settings:

- `dependencies_folder` - path to the folder with dependency files (Default: `requirements`)

Compile dependencies via pip-compile.

Settings:

- `dependencies_folder` - path to the folder with dependency files (Default: `requirements`)
- `in_files` - sequence of `.in` files (Default: `"production.in"`, `"development.in"`)
### mypy

Run mypy in `path` with `params`.

Settings:

- `mypy_entry` - python entry command (Default: `-m mypy`)

### pytest

Run pytest in `path` with `params`.

Settings:

- `pytest_entry` - python entry command (Default: `-m pytest`)
### secrets

Fill specified credentials in your file from k8s.

This invocation downloads a `.env` file from a pod in k8s.
It will replace the specified credentials (`--credentials`) in the specified
`.env` file (`--env_file_path`, or `.env` by default).

Requires `python-decouple`

Settings for k8s:

- `secret_file_path_in_pod` - path to the secret in the pod (REQUIRED)
- `temp_secret_file_path` - path for the temporary file (Default: `.env.to_delete`)