Each subdirectory is scoped to run only one AI/ML integration's suite of tests.

Within each subdirectory you should expect to have:

- `run.sh` -- A script that should handle any additional library installations and the steps for executing the test suite. This script should not populate the Atlas database with any required test data.
- `config.env` -- A file that defines the following environment variables:
  - `REPO_NAME` -- The name of the AI/ML framework repository that will be cloned
  - `CLONE_URL` -- The GitHub URL to clone into the specified `DIR`
  - `DATABASE` -- The optional database where the Atlas CLI will load your index configs
- `database/` -- An optional directory used by `.evergreen/scaffold_atlas.py` to populate a MongoDB database with test data. Only provide this if your tests require pre-populated data.
- `database/{collection}.json` -- An optional JSON file containing one or more MongoDB documents that will be uploaded to `$DATABASE.{collection}` in the local Atlas instance. Only provide this if your tests require pre-populated data.
- `indexConfig.json` -- An optional file containing the configuration for a specified Atlas Search Index.
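As an illustration, a `config.env` for a hypothetical `docarray` subdirectory might look like the following sketch. All three values are placeholders, not prescribed by this repo:

```shell
# Example config.env -- placeholder values; substitute your integration's details.
REPO_NAME=docarray
CLONE_URL=https://github.com/docarray/docarray.git
DATABASE=docarray_test_db
```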
The general layout of this repo looks like this:

```
│   ├── indexConfig.json   # Creates Search Index on $DATABASE
│   ├── config.env         # Configuration file
│   └── run.sh             # Script that executes the test suite
```
Each test subdirectory will automatically have its own local Atlas deployment. As a result, database and collection names will not conflict between different AI/ML integrations. To connect to your local Atlas deployment using a connection string, `utils.sh` provides a `fetch_local_atlas_uri` function that you can call from the `run.sh` script within your subdirectory. For example:

```bash
. .evergreen/utils.sh

CONN_STRING=$(fetch_local_atlas_uri)
```
This stores the local Atlas URI in the `CONN_STRING` variable. The script can then pass `CONN_STRING` as an environment variable to the test suite.
#### Running tests locally

You can run the tests with a local checkout of the repo.

For example, to run the `docarray` tests using a local Atlas deployment:

```bash
export DIR=docarray
bash .evergreen/fetch-repo.sh
bash .evergreen/provision-atlas.sh
bash .evergreen/execute-tests.sh
```

Use `.evergreen/setup-remote.sh` instead of `.evergreen/provision-atlas.sh` to test against the remote cluster.
#### Pre-populating the Local Atlas Deployment

You can pre-populate a test's local Atlas deployment before `run.sh` executes by providing JSON files in the optional `database` directory of your subdirectory. The `.evergreen/scaffold_atlas.py` script searches for every JSON file within this `database` directory and uploads the documents to the database named by the `DATABASE` expansion set in the build variant of `.evergreen/config.yml`. The collection the script uploads to is based on the name of your JSON file:
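The file-name-to-collection rule described above can be sketched in shell (this mirrors the stated behavior; it is not the actual `scaffold_atlas.py` implementation):

```shell
# Derive the target collection from a database/{collection}.json path:
# the file's base name, minus the .json extension, is the collection name.
collection_for() {
  basename "$1" .json
}

collection_for database/movies.json   # prints: movies
```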
Test execution flow is defined in `.evergreen/config.yml`. Some key components of the configuration are:

- [`expansions`](https://docs.devprod.prod.corp.mongodb.com/evergreen/Project-Configuration/Project-Configuration-Files/#expansions) -- Build variant-specific variables. Expansions that need to be maintained as secrets should be stored in [the Evergreen project settings](https://spruce.mongodb.com/project/ai-ml-pipeline-testing/settings/variables) using [variables](https://docs.devprod.prod.corp.mongodb.com/evergreen/Project-Configuration/Project-and-Distro-Settings#variables). A common expansion needed is:
  - `DIR` -- The subdirectory where the tasks will run
- `run_on` -- The platform to run on. `rhel87-small` should be used by default; any other distro may fail Atlas CLI setup.
- `tasks` -- Tasks to run. See below for more details.
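As a rough sketch, a build variant in `.evergreen/config.yml` wiring these pieces together might look like this (the variant name, task name, and expansion values are illustrative, not copied from the actual config):

```yaml
buildvariants:
  - name: test-docarray-local
    display_name: DocArray Local Tests
    expansions:
      DIR: docarray
    run_on:
      - rhel87-small
    tasks:
      - name: test-docarray-local
```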