**`docs/ce/features/opa-policies.mdx`** (+1 −1)

```diff
@@ -17,7 +17,7 @@ With plan policies you can check `terraform plan` output for compliance with you
 
 With access policies you can control which Digger operations are allowed at any given time based on various inputs. Access policy is checked before every plan and apply and is passed the following data:
```
**`docs/ce/features/plan-persistence.mdx`** (+1 −1)

```diff
@@ -2,4 +2,4 @@
 title: "Plan Persistence"
 ---
 
-By default digger will run an apply based on the branch pull request files (no artefacts stored). In order to configure plan artefacts you can configure the inputs for storing as github artefacts or aws buckets or gcp buckets. The corresponding artefacts to be configured can be found in [storing plans in a bucket](/ce/howto/store-plans-in-a-bucket)
+By default digger will run an apply based on the branch pull request files (no artefacts stored). In order to configure plan artefacts you can configure the inputs for storing as GitHub artefacts or aws buckets or gcp buckets. The corresponding artefacts to be configured can be found in [storing plans in a bucket](/ce/howto/store-plans-in-a-bucket)
```
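As a sketch, wiring plan-artefact storage into the Digger action step looks roughly like this. The input names below are assumptions, not confirmed against the action's current schema — verify them on the store-plans-in-a-bucket page linked in the diff:

```yaml
# Hypothetical snippet from a GitHub Actions workflow step invoking Digger.
- name: digger
  uses: diggerhq/digger@vLatest
  with:
    # Assumed input name and values ("github" | "aws" | "gcp") -- confirm
    # against /ce/howto/store-plans-in-a-bucket before relying on this.
    upload-plan-destination: gcp
    # A bucket-name input is also required for the aws/gcp destinations;
    # its exact name is documented on the linked page.
```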
**`docs/ce/features/plan-preview.mdx`** (+1 −1)

```diff
@@ -17,4 +17,4 @@ You can also re-plan by commenting `digger plan` (see [CommentOps](/features/com
 
 * By performing locks on pull request we guarantee that the plan preview on the pull request is not stale. i.e. the infrastructure was not touched by another subsequent change
 
-* Code in github: [https://github.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/digger/digger.go#L228](https://github.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/digger/digger.go#L228)
+* Code in GitHub: [https://GitHub.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/digger/digger.go#L228](https://GitHub.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/digger/digger.go#L228)
```
**`docs/ce/features/pr-level-locks.mdx`** (+2 −2)

```diff
@@ -4,10 +4,10 @@ title: "PR-level locks"
 
 * For every pull request we perform a lock when the pull request is opened and unlocked when the pull request is merged, this is to avoid making a plan preview stale
 
-* For GCP locking is performed using buckets that are strongly consistent: [https://github.com/diggerhq/digger/blob/80289922227f225d887feb74749b4daef8b441f8/pkg/gcp/gcp\_lock.go#L13](https://github.com/diggerhq/digger/blob/80289922227f225d887feb74749b4daef8b441f8/pkg/gcp/gcp%5Flock.go#L13)
+* For GCP locking is performed using buckets that are strongly consistent: [https://GitHub.com/diggerhq/digger/blob/80289922227f225d887feb74749b4daef8b441f8/pkg/gcp/gcp\_lock.go#L13](https://GitHub.com/diggerhq/digger/blob/80289922227f225d887feb74749b4daef8b441f8/pkg/gcp/gcp%5Flock.go#L13)
 
 * These options are configured and the locking can be disabled entirely if it is not needed
 
-* The locking interface is very simple and is based on `Lock()` and `Unlock()` Operations [https://github.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/locking/locking.go#L40](https://github.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/locking/locking.go#L40)
+* The locking interface is very simple and is based on `Lock()` and `Unlock()` Operations [https://GitHub.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/locking/locking.go#L40](https://GitHub.com/diggerhq/digger/blob/5815775095d7380281c71c7c3aa63ca1b374365f/pkg/locking/locking.go#L40)
 
 * A pull request acquires a lock for every project impacted by this PR and all dependant projects
```
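The `Lock()`/`Unlock()` semantics this page describes — each project held by at most one PR from open to merge, so a plan preview can't go stale — can be illustrated with a minimal in-memory analogue. This is a hypothetical sketch, not Digger's actual Go implementation (which is at the linked `locking.go`):

```python
class InMemoryLock:
    """Toy analogue of a PR-level lock: each project is held by at most one PR."""

    def __init__(self):
        # project name -> PR number currently holding the lock
        self._locks = {}

    def lock(self, pr_number, project):
        """Acquire the lock for `project`.

        Returns True if acquired (or already held by the same PR),
        False if a different PR holds it.
        """
        holder = self._locks.get(project)
        if holder is not None and holder != pr_number:
            return False
        self._locks[project] = pr_number
        return True

    def unlock(self, pr_number, project):
        """Release the lock, but only if this PR actually holds it."""
        if self._locks.get(project) == pr_number:
            del self._locks[project]
            return True
        return False


# A PR locks every project it impacts when opened, and unlocks on merge.
locks = InMemoryLock()
assert locks.lock(101, "staging") is True
assert locks.lock(102, "staging") is False   # second PR blocked -> preview stays fresh
assert locks.unlock(101, "staging") is True  # merging PR 101 releases the lock
assert locks.lock(102, "staging") is True
```

In Digger itself the holder state lives in a strongly consistent backend (e.g. a GCS bucket) rather than a dict, but the acquire/release contract is the same two operations.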
**`docs/ce/features/private-runners.mdx`** (+3 −3)

````diff
@@ -4,7 +4,7 @@ title: "Private Runners"
 
 In many situations you wish to run digger with private runners. For example if you are provisioning resources in a private k8s cluster in this case you will not be able to use cloud runners.
 
-While digger does not natively support k8s agents it is very easy to do it indirectly using github actions runners.
+While digger does not natively support k8s agents it is very easy to do it indirectly using GitHub actions runners.
 In the typical digger flow you are using a workflow that looks like this:
 
 ```
@@ -16,8 +16,8 @@ jobs:
     runs-on: ubuntu-latest
 ```
 
-With github specifically there is good support for [self-hosted runners](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners)
-which means that you can create agents for github actions in your private infrastructure's VPC and github will then run the jobs there.
+With GitHub specifically there is good support for [self-hosted runners](https://docs.GitHub.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners)
+which means that you can create agents for GitHub actions in your private infrastructure's VPC and GitHub will then run the jobs there.
 
 The easiest way to achieve self-hosted runners is by running the agent in something like an EC2 instance. Alternatively if you already have a kubernetes cluster
 you could opt for using the [Actions runner controller](which will provide you with actions right in your cluster). Once you have set up and configured your controllers
````
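Concretely, moving the workflow shown in this file from cloud to self-hosted runners is a one-line change to `runs-on`. A sketch — the job name and runner labels below are illustrative and must match whatever labels you registered your runner with:

```yaml
# Hypothetical excerpt from .github/workflows/digger.yml
jobs:
  digger:
    # Was: runs-on: ubuntu-latest
    # "self-hosted" routes the job to your own agent; extra labels narrow
    # the match to runners registered with those same labels.
    runs-on: [self-hosted, linux, x64]
```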
**`docs/ce/gcp/federated-oidc-access.mdx`** (+10 −10)

````diff
@@ -10,17 +10,17 @@ If you already have configured GCP for that, skip to step 5.
 A Workload Identity Pool is an umbrella entity for managing access in GCP. The best practice is to have a dedicated pool for each non-GCP environment.
 
 ```
-gcloud iam workload-identity-pools create github-wif-pool --location="global" --project
+gcloud iam workload-identity-pools create GitHub-wif-pool --location="global" --project
 ```
 
 ## 2\. Create a Workload Identity Provider
 
 A Workload Identity Provider links an external identity like GitHub with your Google Cloud account. This lets IAM use tokens from external providers to authorize access to Google Cloud resources.
 
 ```
-gcloud iam workload-identity-pools providers create-oidc githubwif \
@@ -53,19 +53,19 @@ Create 2 secrets in your Action Secrets with the following names:
 
 ## 5\. Configure Digger workflow to use federated access
 
-Set `EXT` env var instead of the usual key pair. See [oidc-gcp-example](https://github.com/diggerhq/digger-gcp-ocid-demo) repo for more detail. Sample config below:
+Set `EXT` env var instead of the usual key pair. See [oidc-gcp-example](https://GitHub.com/diggerhq/digger-gcp-ocid-demo) repo for more detail. Sample config below:
-In this tutorial we will be using a repository in order to configure a terraform pipeline [https://github.com/diggerhq/digger-gcp-lock-demo](https://github.com/diggerhq/digger-gcp-lock-demo). In order to use GCP with Digger we follow the steps below:
+In this tutorial we will be using a repository in order to configure a terraform pipeline [https://GitHub.com/diggerhq/digger-gcp-lock-demo](https://GitHub.com/diggerhq/digger-gcp-lock-demo). In order to use GCP with Digger we follow the steps below:
 
 Let's create our first pull request with a change and see this in action:
 
-1. Fork the [demo repository](https://github.com/diggerhq/digger-gcp-lock-demo)
+1. Fork the [demo repository](https://GitHub.com/diggerhq/digger-gcp-lock-demo)
 
 2. Enable Actions (by default workflows won't trigger in a fork)
````
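For reference, the provider-creation command that this page truncates typically pairs the pool with an issuer URI and an attribute mapping along these lines. This is a sketch based on Google's workload identity federation documentation — verify flag values against the current `gcloud` reference. Note also that GCP pool IDs may only contain lowercase letters, digits and hyphens, so `github-wif-pool` (all lowercase) is the form `gcloud` will accept:

```shell
# Sketch: link GitHub's OIDC issuer to the pool created in step 1.
# PROJECT_ID and the attribute mapping are assumptions to adapt.
gcloud iam workload-identity-pools providers create-oidc githubwif \
  --project="${PROJECT_ID}" \
  --location="global" \
  --workload-identity-pool="github-wif-pool" \
  --issuer-uri="https://token.actions.githubusercontent.com" \
  --attribute-mapping="google.subject=assertion.sub,attribute.repository=assertion.repository"
```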
**`docs/ce/getting-started/azure-devops.mdx`** (+2 −2)

```diff
@@ -38,7 +38,7 @@ For tutorial purposes the token is shown with full access. Don't do that in prod
 
 ## 2\. Set up Azure Function
 
-Clone [this repository](https://github.com/diggerhq/azure-devops-webhook-handler), then follow [this guide](https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-python?pivots=python-mode-configuration) deploy the function to your azure account
+Clone [this repository](https://GitHub.com/diggerhq/azure-devops-webhook-handler), then follow [this guide](https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-python?pivots=python-mode-configuration) deploy the function to your azure account
 
 
@@ -71,7 +71,7 @@ Select the repository you want to integrate digger with. Add the following 4 eve
 
 ## 4\. Create digger.yml
 
-Follow the digger documentation to create digger.yml for your structure, it should be similar to this [demo digger.yml](https://github.com/diggerhq/digger%5Fdemo%5Fmultienv/blob/main/digger.yml)
+Follow the digger documentation to create digger.yml for your structure, it should be similar to this [demo digger.yml](https://GitHub.com/diggerhq/digger%5Fdemo%5Fmultienv/blob/main/digger.yml)
 
 The minimum you would need to define for digger is where all your terraform lives:
```
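The snippet that sentence introduces is cut off here; a minimal `digger.yml` of the kind it describes lists each terraform directory as a project, roughly like this (directory and project names are illustrative — compare with the demo digger.yml linked above):

```yaml
# Hypothetical minimal digger.yml: one project per terraform directory.
projects:
  - name: production
    dir: prod   # path to this project's terraform code, relative to repo root
  - name: development
    dir: dev
```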