Switch to uv for Docker pip compile and install #90

Merged · 19 commits · Feb 14, 2025
18 changes: 6 additions & 12 deletions .github/workflows/dea-intertidal-image.yml
@@ -85,9 +85,9 @@ jobs:
wget --no-verbose https://www.dropbox.com/s/uemd8ib2vfw5nad/tide_models.zip?dl=1 -O tide_models.zip
unzip -q tide_models.zip

- # Run integration tests using Docker, setting up datacube access, AWS configuration and
- # adding volumes that provide access to tide model data and allow us to export artifacts
- # from the run
+ # Run integration tests using Docker. The command sets up datacube access, AWS configuration,
+ # and adds volumes to allow access to model outputs outside the container (used to commit outputs
+ # using git). It also provides access to tide model data for the tests.
docker run \
--net=host \
--env DATACUBE_DB_URL \
@@ -96,18 +96,12 @@ jobs:
--env AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY \
--env AWS_SESSION_TOKEN \
- --volume ${GITHUB_WORKSPACE}:/code \
+ --volume ${GITHUB_WORKSPACE}:/app \
--volume ${GITHUB_WORKSPACE}/tide_models:/var/share/tide_models \
- --volume ${GITHUB_WORKSPACE}/artifacts:/mnt/artifacts \
dea_intertidal pytest -v --cov=intertidal --cov-report=xml tests

- # Copy out validation outputs produced by the integration tests and place them
- # in correct output locations so they can be committed back into the repository
- cp ./artifacts/validation.jpg ./tests/validation.jpg
- cp ./artifacts/validation.csv ./tests/validation.csv
- cp ./artifacts/README.md ./tests/README.md

# Commit validation results produced by integration tests back into repo
# (relies on having previously added a volume to access outputs outside of the container)
- name: Commit validation results into repository
uses: stefanzweifel/git-auto-commit-action@v4
if: github.event_name == 'pull_request'
@@ -116,7 +110,7 @@ jobs:
commit_message: Automatically update integration test validation results
file_pattern: 'tests/validation.jpg tests/validation.csv tests/README.md'

- # Post validation results as comment on PR
+ # Post validation results as a comment on the pull request
- name: Post validation results as comment
uses: mshick/add-pr-comment@v2
if: github.event_name == 'pull_request'
38 changes: 14 additions & 24 deletions Dockerfile
@@ -12,38 +12,28 @@ ENV DEBIAN_FRONTEND=noninteractive \
RUN apt-get update && \
apt-get install -y \
build-essential \
fish \
git \
vim \
htop \
wget \
unzip \
python3-pip \
libpq-dev \
&& apt-get autoclean && \
apt-get autoremove && \
rm -rf /var/lib/{apt,dpkg,cache,log}

- # Install pip-tools
- RUN pip install pip-tools
+ # Set up working directory
+ WORKDIR /app

- # Pip installation
- RUN mkdir -p /conf
- # COPY requirements.in /conf/
- # RUN pip-compile --extra-index-url=https://packages.dea.ga.gov.au/ --output-file=/conf/requirements.txt /conf/requirements.in
- COPY requirements.txt /conf/
- RUN pip install -r /conf/requirements.txt \
-     && pip install --no-cache-dir awscli==1.33.37
+ # Copy requirements file first
+ COPY requirements.in /app/requirements.in

- # Copy source code and install it
- RUN mkdir -p /code
- WORKDIR /code
- ADD . /code
+ # Install uv and requirements
+ RUN pip install uv && \
+     uv pip compile /app/requirements.in -o /app/requirements.txt && \
+     uv pip install -r /app/requirements.txt --system

- RUN echo "Installing dea-intertidal through the Dockerfile."
- RUN pip install --extra-index-url="https://packages.dea.ga.gov.au" .
+ # Now copy the rest of the files
+ COPY . /app

- RUN pip freeze && pip check

- # Make sure it's working
- RUN dea-intertidal --help
+ # Install DEA Intertidal and verify installation
+ RUN uv pip install . --system && \
+     uv pip check && \
+     dea-intertidal --help
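
For reference, the uv-based flow in the new Dockerfile can also be exercised by hand. The sketch below is illustrative only: it assumes a local checkout with `requirements.in` at the repository root, Docker available, and it reuses the `dea_intertidal` image name from the integration-test workflow above.

```bash
# Regenerate the lock file the same way the new Dockerfile does
pip install uv
uv pip compile requirements.in -o requirements.txt

# Build the image and confirm the CLI entry point still resolves
docker build -t dea_intertidal .
docker run --rm dea_intertidal dea-intertidal --help
```

Compiling on the host like this makes it easy to review dependency changes before rebuilding the image.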
2 changes: 1 addition & 1 deletion requirements.in
@@ -1,7 +1,7 @@
--extra-index-url=https://packages.dea.ga.gov.au/
aiohttp
botocore
click==8.1.7
dask==2024.3.1
datacube[s3,performance]==1.8.19
dea-tools==0.3.4
eodatasets3==0.30.6
8 changes: 4 additions & 4 deletions tests/README.md
@@ -10,10 +10,10 @@ Integration tests

This directory contains tests that are run to verify that DEA Intertidal code runs correctly. The ``test_intertidal.py`` file runs a small-scale full workflow analysis over an intertidal flat in the Gulf of Carpentaria using the DEA Intertidal [Command Line Interface (CLI) tools](../notebooks/Intertidal_CLI.ipynb), and compares these results against a LiDAR validation DEM to produce some simple accuracy metrics.

- The latest integration test completed at **2024-10-03 15:33**. Compared to the previous run, it had an:
- - RMSE accuracy of **0.15 m ( :heavy_minus_sign: no change)**
- - MAE accuracy of **0.12 m ( :heavy_minus_sign: no change)**
- - Bias of **0.12 m ( :heavy_minus_sign: no change)**
+ The latest integration test completed at **2025-02-14 11:12**. Compared to the previous run, it had an:
+ - RMSE accuracy of **0.14 m ( :heavy_check_mark: improved by 0.001)**
+ - MAE accuracy of **0.12 m ( :heavy_check_mark: improved by 0.002)**
+ - Bias of **0.12 m ( :heavy_check_mark: improved by 0.002)**
- Pearson correlation of **0.975 ( :heavy_minus_sign: no change)**


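The CI job above drives this test suite inside Docker. As a rough local equivalent (a sketch only: it assumes the `dea_intertidal` image has been built locally, that datacube and AWS credentials are already exported, and it substitutes `$(pwd)` for `${GITHUB_WORKSPACE}`), the run looks like:

```bash
# Download and unpack the tide model data used by the tests
wget --no-verbose "https://www.dropbox.com/s/uemd8ib2vfw5nad/tide_models.zip?dl=1" -O tide_models.zip
unzip -q tide_models.zip

# Run the integration tests, mounting the repository at /app
docker run \
  --net=host \
  --env DATACUBE_DB_URL \
  --env AWS_ACCESS_KEY_ID \
  --env AWS_SECRET_ACCESS_KEY \
  --env AWS_SESSION_TOKEN \
  --volume "$(pwd)":/app \
  --volume "$(pwd)/tide_models":/var/share/tide_models \
  dea_intertidal pytest -v --cov=intertidal --cov-report=xml tests
```
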
8 changes: 4 additions & 4 deletions tests/test_intertidal.py
@@ -65,9 +65,9 @@ def test_dem_accuracy(
val_path="tests/data/lidar_10m_tests.tif",
mod_path="data/processed/ga_s2ls_intertidal_cyear_3/0-0-1/tes/ting/2021--P1Y/ga_s2ls_intertidal_cyear_3_testing_2021--P1Y_final_elevation.tif",
input_csv="tests/validation.csv",
- output_csv="artifacts/validation.csv",
- output_plot="artifacts/validation.jpg",
- output_md="artifacts/README.md",
+ output_csv="tests/validation.csv",
+ output_plot="tests/validation.jpg",
+ output_md="tests/README.md",
):
"""
Compares elevation outputs of the previous CLI step against
@@ -195,7 +195,7 @@ def test_dem_accuracy(
ax2.set_ylabel("Metres (m)")
ax2.set_xlabel(None)

- # Write into mounted artifacts directory
+ # Write output CSV
accuracy_df.to_csv(output_csv)
plt.savefig(output_plot, dpi=100, bbox_inches="tight")

1 change: 1 addition & 0 deletions tests/validation.csv
@@ -68,3 +68,4 @@ time,Correlation,RMSE,MAE,R-squared,Bias,Regression slope
2024-09-24 00:07:47.380872+00:00,0.975,0.146,0.125,0.95,0.119,1.112
2024-09-24 05:42:35.688710+00:00,0.975,0.146,0.125,0.95,0.119,1.112
2024-10-03 05:33:20.186227+00:00,0.975,0.146,0.125,0.95,0.119,1.112
+ 2025-02-14 00:12:00.333318+00:00,0.975,0.145,0.123,0.95,0.117,1.118
Binary file modified tests/validation.jpg