Add manual E2E Jest test for workflow run lifecycle #1461
base: main

Conversation
…nfig: __tests__/config/jest.config.workflow.e2e.ts

- New test: __tests__/e2e/workflow-run.workflow.e2e.test.ts
  - Enqueues autoResolveIssue via API route
  - Polls workflow run list and events APIs
  - Awaits workflow.completed and validates persisted events
- Add npm script: test:e2e:workflow

This long-running, external-API test is skipped by default and only runs when RUN_WORKFLOW_E2E=true. It is designed to be executed manually against a fully running local stack (Redis, Neo4j, workers, GitHub/OpenAI credentials).
I wonder if we could mock the PR creation at the tool call boundary. For example, I expect the LLM agent to issue the create-PR tool call, and when that's called, we don't need to actually make a PR — we can mock the actual execution of that tool.
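One way to do this, sketched below with hypothetical names (`handleToolCall`, a `tools` registry — the repo's actual agent plumbing isn't shown in this diff), is to swap the tool's implementation in the registry the agent resolves tool calls through, so the agent still issues the call but the side effect is faked:

```typescript
// Hypothetical sketch: mock the "create PR" tool at the execution boundary.
// The agent still makes the tool call; only the execution is replaced.
type CreatePrTool = (input: { title: string }) => Promise<{ url: string }>

const realCreatePr: CreatePrTool = async () => {
  // Would call the GitHub API in production.
  throw new Error("network call not allowed in tests")
}

const mockCreatePr: CreatePrTool = async ({ title }) => ({
  url: `mock://pr-for/${encodeURIComponent(title)}`,
})

// Registry the agent resolves tool calls through (name is an assumption).
const tools: Record<string, CreatePrTool> = { createPullRequest: realCreatePr }

async function handleToolCall(name: string, input: { title: string }) {
  const tool = tools[name]
  if (!tool) throw new Error(`unknown tool: ${name}`)
  return tool(input)
}

// In the E2E test, swap the implementation while keeping the call path intact:
tools.createPullRequest = mockCreatePr
```

This keeps the agent loop and tool-dispatch code under test while isolating only the external write.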
```ts
;(shouldRun ? describe : describe.skip)("Workflow lifecycle E2E", () => {
  it("launches autoResolveIssue via queue and observes completion + events", async () => {
    const repoFullName = process.env.E2E_REPO_FULL_NAME
```
Where are these env variables saved? If they're in the tests folder (not root), then that's good. We should create a .env.example file if it doesn't already exist with default non-sensitive values.
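An example shape for that file, with only non-sensitive placeholders (`E2E_REPO_FULL_NAME` is the one variable visible in this diff; anything else here is hypothetical):

```shell
# __tests__/env.workflow.e2e.example — copy to __tests__/.env.workflow.e2e
# and fill in real values locally. Non-sensitive defaults only.
E2E_REPO_FULL_NAME=your-org/your-sandbox-repo

# Hypothetical further entries; see the test file for the authoritative list:
# E2E_ISSUE_NUMBER=1
```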
```ts
const shouldRun = process.env.RUN_WORKFLOW_E2E === "true"
```
Instead of using an env variable for "shouldRun", I was expecting this to be another class of tests that we run with something like "pnpm test:e2e", just like we do with the Neo4j tests. Then we wouldn't need this env variable.
Seems like there's a "--selectProjects" option that might be useful.
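As a sketch of that approach (the `displayName` values and test globs below are assumptions, not the repo's actual config), a root Jest config can declare the workflow E2E suite as a separate project so a plain `jest` run excludes it and `--selectProjects` can target it:

```typescript
// jest.config.ts (sketch) — split the workflow E2E suite into its own project.
const config = {
  projects: [
    {
      displayName: "unit",
      testMatch: ["<rootDir>/__tests__/**/*.test.ts"],
      // Keep E2E files out of the default project:
      testPathIgnorePatterns: ["\\.e2e\\.test\\.ts$"],
    },
    {
      displayName: "workflow-e2e",
      testMatch: ["<rootDir>/__tests__/e2e/**/*.workflow.e2e.test.ts"],
    },
  ],
}

export default config
```

The `test:e2e:workflow` script in package.json could then simply be `jest --selectProjects workflow-e2e`.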
```ts
// Step 2: Poll the Workflow Runs list API for this specific issue until a run appears
// Build a NextRequest with query params: ?repo=...&issue=...
const listUrl = new URL("http://localhost/api/workflow-runs")
```
Does this work without specifying the port for Next.js? Usually 3000. Double-check.
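For reference, a small sketch of the distinction: when the route handler is invoked directly (no real HTTP request), the origin only seeds `request.nextUrl`, so the URL parses the same with or without a port; including `:3000` just keeps it consistent with the dev server. The path below mirrors the test's usage, but the query values are placeholders.

```typescript
// The origin here is only used to build a parseable URL; nothing is fetched.
const listUrl = new URL("http://localhost:3000/api/workflow-runs")
listUrl.searchParams.set("repo", "owner/repo")
listUrl.searchParams.set("issue", "42")

// Pathname and query survive regardless of the port chosen:
// listUrl.pathname === "/api/workflow-runs"
// listUrl.searchParams.get("issue") === "42"
```

The port would only matter if the test made an actual `fetch()` against the running dev server.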
```ts
// IMPORTANT:
// - This test is intended to be run manually. It will be skipped unless
//   RUN_WORKFLOW_E2E === "true".
// - It expects your local environment to be running just like `pnpm dev`:
```
Is there a way to check if these services are running and exit early if they're not, with clear messaging and reminders that these services need to be running?
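One option, sketched below (the ports and service names are typical defaults, not taken from the repo), is a `beforeAll` preflight that probes each dependency over TCP and fails fast with an actionable message:

```typescript
import net from "node:net"

// Resolve if something is listening at host:port; reject with a setup
// reminder otherwise. Intended as a fail-fast preflight for the E2E test.
function checkTcp(host: string, port: number, name: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const socket = net.connect({ host, port, timeout: 2000 })
    const fail = () => {
      socket.destroy()
      reject(
        new Error(
          `${name} is not reachable at ${host}:${port}. ` +
            `Start the local stack (e.g. \`pnpm dev\`) before running this test.`
        )
      )
    }
    socket.once("connect", () => {
      socket.destroy()
      resolve()
    })
    socket.once("error", fail)
    socket.once("timeout", fail)
  })
}

// e.g. in beforeAll:
// await Promise.all([
//   checkTcp("localhost", 6379, "Redis"),
//   checkTcp("localhost", 7687, "Neo4j"),
// ])
```

A similar early check on required env vars (throwing with the variable name and a pointer to the example env file) would cover the credentials case.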
```ts
// - Proper GitHub App + OpenAI credentials in env
//
// Required env vars for this test:
//   RUN_WORKFLOW_E2E=true
```
Take this var out. Just use `pnpm test:e2e:workflow` and wrap these E2E workflow tests under a Jest project or something so they're not normally run during CI.
…e env file

- Drop RUN_WORKFLOW_E2E flag; rely on dedicated Jest project (pnpm test:e2e:workflow)
- Add preflight checks for required env, Redis, and Neo4j connectivity with clear messages
- Load optional __tests__/.env.workflow.e2e via dotenv
- Provide __tests__/env.workflow.e2e.example with non-sensitive defaults and guidance
- Clarify URL origin/port usage and set to http://localhost:3000
Summary

What's included
Run with pnpm run test:e2e:workflow.

How it works
…workflow.completed.

Environment and execution
Required env vars for the test:
Optional:
Run it:

Notes
Closes #1460
Updates in response to review

- The test now runs under its own Jest project via `pnpm test:e2e:workflow`, so no special env flag is required. This keeps it out of CI by default.
- If a preflight check fails, the test exits early with a clear console warning and setup reminders.
- Added `__tests__/env.workflow.e2e.example` (non-sensitive) to document the test variables. The test will auto-load `__tests__/.env.workflow.e2e` if present.
- Set the URL origin to `http://localhost:3000` and added a comment explaining that the origin is only used to construct a URL for `request.nextUrl` (no HTTP request is made).

Notes
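The optional env-file load mentioned above could look roughly like this; the hand-rolled parser stands in for the dotenv package so the sketch stays self-contained, and the file path is the one named in the summary:

```typescript
import fs from "node:fs"
import path from "node:path"

// Load KEY=value pairs from an optional env file without overriding
// variables that are already set; silently skip if the file is absent.
function loadEnvFile(file: string): void {
  if (!fs.existsSync(file)) return
  for (const line of fs.readFileSync(file, "utf8").split("\n")) {
    const match = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=\s*(.*?)\s*$/)
    if (match && process.env[match[1]] === undefined) {
      process.env[match[1]] = match[2]
    }
  }
}

loadEnvFile(path.join(process.cwd(), "__tests__", ".env.workflow.e2e"))
```

In the actual test, `dotenv.config({ path: ... })` would serve the same purpose with comment and quoting support.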