80 changes: 60 additions & 20 deletions .github/workflows/pr-preprod-tests.yaml
@@ -24,13 +24,23 @@ jobs:
run: |
cd /home/integration/git/cardano-rosetta-java

# Fetch latest from remote
git fetch origin

# Clean up any existing PR branch and switch to main first
git checkout main || git checkout develop
git branch -D pr-${{ github.event.pull_request.number }} 2>/dev/null || true
git pull origin main || git pull origin develop

# Fetch and checkout the PR
git fetch origin pull/${{ github.event.pull_request.number }}/head:pr-${{ github.event.pull_request.number }}
git checkout pr-${{ github.event.pull_request.number }}
# Checkout appropriate branch based on trigger type
if [ "${{ github.event_name }}" = "pull_request" ]; then
git branch -D pr-${{ github.event.pull_request.number }} 2>/dev/null || true
git fetch origin pull/${{ github.event.pull_request.number }}/head:pr-${{ github.event.pull_request.number }}
git checkout pr-${{ github.event.pull_request.number }}
else
# workflow_dispatch: checkout the selected branch and force sync with remote
git checkout ${{ github.ref_name }}
git reset --hard origin/${{ github.ref_name }}
fi

- name: Stop current services
run: |
@@ -94,17 +104,24 @@ jobs:
echo "Waiting for API to be ready..."

# Wait for API to respond
API_STARTED=false
for i in {1..60}; do
if curl -sf -X POST http://localhost:8082/network/list \
-H 'Content-Type: application/json' \
-d '{}' > /dev/null 2>&1; then
echo "API is responding"
API_STARTED=true
break
fi
echo "Waiting for API to start... ($i/60)"
sleep 10
done

if [ "$API_STARTED" != "true" ]; then
echo "❌ API failed to start after 10 minutes"
exit 1
fi

# Check sync status
echo "Checking sync status..."
while true; do
@@ -150,6 +167,9 @@ jobs:
# Copy to tests directory
cp .env.test tests/data-endpoints/.env

- name: Clean previous Allure results
run: rm -rf /home/integration/git/cardano-rosetta-java/tests/data-endpoints/allure-results/*

- name: Run smoke tests (validate test data)
id: smoke_tests
run: |
@@ -196,23 +216,22 @@ jobs:
ROSETTA_URL: http://localhost:8082
CARDANO_NETWORK: preprod

- name: Run construction API tests
id: construction_test
- name: Run golden example tests
id: golden_tests
run: |
export PATH="$HOME/.local/bin:$PATH"

cd /home/integration/git/cardano-rosetta-java/tests/integration

# Run construction API snapshot tests
uv run test_construction_api.py \
-v || CONSTRUCTION_RESULT=$?
# Run golden example tests (construction + data endpoints)
uv run test_golden_examples.py \
-v || GOLDEN_RESULT=$?

# Output test result
echo "construction_result=${CONSTRUCTION_RESULT:-0}" >> $GITHUB_OUTPUT
echo "golden_result=${GOLDEN_RESULT:-0}" >> $GITHUB_OUTPUT

# Don't fail the whole job if construction tests fail
# These are informational for now
exit 0
# Fail if golden example tests failed
exit ${GOLDEN_RESULT:-0}
env:
ROSETTA_URL: http://localhost:8082
CARDANO_NETWORK: preprod
@@ -247,22 +266,43 @@ jobs:
user_name: 'github-actions[bot]'
user_email: '41898282+github-actions[bot]@users.noreply.github.com'

- name: Cleanup gh-pages temp directories
if: always()
run: rm -rf ~/actions_github_pages_*

- name: Comment PR with results
if: always() && github.event_name == 'pull_request' && steps.test.outcome != 'skipped'
if: always() && github.event_name == 'pull_request'
uses: actions/github-script@v6
with:
script: |
const prNumber = context.issue.number;
const testResult = '${{ steps.test.outputs.test_result }}';
const emoji = testResult === '0' ? '✅' : '❌';
const status = testResult === '0' ? 'PASSED' : 'FAILED';

// Check if earlier steps failed (deployment, sync, etc.)
const testsSkipped = '${{ steps.test.outcome }}' === 'skipped';

let emoji, status;
if (testsSkipped) {
emoji = '💥';
status = 'DEPLOYMENT FAILED';
} else {
// Check all three test outputs
const smokeResult = '${{ steps.smoke_tests.outputs.smoke_result }}';
const testResult = '${{ steps.test.outputs.test_result }}';
const goldenResult = '${{ steps.golden_tests.outputs.golden_result }}';

const allPassed = smokeResult === '0' && testResult === '0' && goldenResult === '0';
emoji = allPassed ? '✅' : '❌';
status = allPassed ? 'PASSED' : 'FAILED';
}

// Build comment body
let comment = `## ${emoji} Preprod Tests: ${status}\n\n`;

// Add report link
const reportUrl = `https://cardano-foundation.github.io/cardano-rosetta-java/test-reports/pr-${prNumber}/`;
comment += `📊 **[View Detailed Test Report](${reportUrl})**\n\n`;
// Add report link (only if tests actually ran)
if (!testsSkipped) {
const reportUrl = `https://cardano-foundation.github.io/cardano-rosetta-java/test-reports/pr-${prNumber}/`;
comment += `📊 **[View Detailed Test Report](${reportUrl})**\n\n`;
}

comment += `🔗 [Action Run #${{ github.run_number }}](${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})\n\n`;
comment += `_Tests run against preprod network with live blockchain data_`;
1 change: 1 addition & 0 deletions .gitignore
@@ -28,6 +28,7 @@ hs_err_pid*
target
settings.xml
.env*
!.env.example

/data/
/node/
74 changes: 72 additions & 2 deletions load-tests/README.MD
@@ -78,6 +78,7 @@ usage: stability_test.py [-h] [--url BASE_URL] [--csv CSV_FILE]
[--concurrency CONCURRENCIES] [--duration TEST_DURATION]
[--sla SLA_THRESHOLD] [--error-threshold ERROR_THRESHOLD]
[--skip-header] [-v] [--cooldown COOLDOWN]
[--network NETWORK]
[--endpoints SELECTED_ENDPOINTS] [--list-endpoints]

Cardano Rosetta API Stability Testing Tool
@@ -102,6 +103,7 @@ options:
--skip-header Skip the header row in the CSV file (default: False)
-v, --verbose Enable verbose output (default: False)
--cooldown COOLDOWN Cooldown period in seconds between endpoint tests (default: 60)
--network NETWORK Network identifier for API requests: mainnet or preprod (default: mainnet)
--endpoints SELECTED_ENDPOINTS
Comma-separated list of endpoint names or paths to test (e.g. "Network Status,Block"
or "/account/balance,/block"). If not specified, all endpoints will be tested.
@@ -140,10 +142,22 @@ Test only specific endpoints by path:
./load-tests/stability_test.py --endpoints "/network/status,/block,/account/balance"
```

Test only search/transactions endpoint with stake address data:
Test search/transactions by hash lookup:

```bash
./load-tests/stability_test.py --endpoints "/search/transactions" --csv load-tests/data/mainnet-data-stake-address.csv
./load-tests/stability_test.py --endpoints "Search Transactions by Hash"
```

Test search/transactions by address (more resource-intensive):

```bash
./load-tests/stability_test.py --endpoints "Search Transactions by Address" --csv load-tests/data/mainnet-data.csv
```

Test on preprod network:

```bash
./load-tests/stability_test.py --network preprod --url http://127.0.0.1:8082 --csv data/preprod-data.csv
```

List all available endpoints without running tests:
Expand All @@ -164,6 +178,62 @@ Test with custom SLA and error thresholds:
./load-tests/stability_test.py --sla 500 --error-threshold 0.5
```

## Test Data

### CSV Format Requirements

Each CSV row must have 6 fields with specific associations:

```
address,block_index,block_hash,transaction_size,relative_ttl,transaction_hash
```

**Critical Data Associations (Implicit Rules):**

1. **Block Consistency**: The `block_hash` MUST be the hash of the block at `block_index`
2. **Transaction in Block**: The `transaction_hash` MUST exist in the specified block (`block_hash`)
3. **Address in Transaction**: The `address` MUST be involved in the transaction (i.e., it must appear in the transaction's operations as an input or output)
4. **Transaction Size**: The `transaction_size` MUST match the actual size of the transaction in bytes
5. **Valid Address**: The `address` MUST have a balance and UTXO history (for account endpoints)
6. **TTL Value**: The `relative_ttl` is used by the construction/metadata endpoint (1000 is a standard value)

These associations ensure all 8 endpoints can successfully use the same data row; a validation sketch follows this list:
- Network Status: No specific data needed
- Account Balance/Coins: Requires valid address with balance
- Block: Requires valid block_index and block_hash
- Block Transaction: Requires transaction in specified block
- Search by Hash: Requires valid transaction_hash
- Search by Address: Requires address involved in transactions
- Construction Metadata: Requires transaction_size and relative_ttl
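
As a sanity check before a long run, rules 1–3 can be verified against a running instance. The script below is a minimal, hypothetical sketch (it is not part of `stability_test.py`): it assumes `requests` is installed, a synced deployment at `http://localhost:8082`, and the standard Rosetta `/block` and `/block/transaction` request shapes. Rules 4–6 (size, balance, TTL) are not checked here.

```python
#!/usr/bin/env python3
"""Sanity-check the implicit associations in a stability-test CSV.

Hypothetical helper, not part of stability_test.py. Usage:
    python validate_csv.py load-tests/data/mainnet-data.csv
"""
import csv
import sys

import requests

ROSETTA_URL = "http://localhost:8082"  # assumption: local deployment
NETWORK = {"blockchain": "cardano", "network": "mainnet"}


def validate_row(row: dict) -> None:
    # Rules 1 and 2: block_hash must match the block at block_index,
    # and the transaction must exist in that block.
    resp = requests.post(
        f"{ROSETTA_URL}/block",
        json={
            "network_identifier": NETWORK,
            "block_identifier": {"index": int(row["block_index"])},
        },
        timeout=30,
    ).json()
    block = resp["block"]
    assert block["block_identifier"]["hash"] == row["block_hash"], "rule 1 violated"

    tx_hashes = {t["transaction_identifier"]["hash"] for t in block["transactions"]}
    tx_hashes |= {t["hash"] for t in resp.get("other_transactions", [])}
    assert row["transaction_hash"] in tx_hashes, "rule 2 violated"

    # Rule 3: the address must appear in the transaction's operations.
    tx = requests.post(
        f"{ROSETTA_URL}/block/transaction",
        json={
            "network_identifier": NETWORK,
            "block_identifier": {
                "index": int(row["block_index"]),
                "hash": row["block_hash"],
            },
            "transaction_identifier": {"hash": row["transaction_hash"]},
        },
        timeout=30,
    ).json()["transaction"]
    addresses = {op.get("account", {}).get("address") for op in tx["operations"]}
    assert row["address"] in addresses, "rule 3 violated"


if __name__ == "__main__":
    with open(sys.argv[1], newline="") as fh:
        for row in csv.DictReader(fh):
            validate_row(row)
            print(f"OK: {row['transaction_hash'][:12]}...")
```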

### Available Data Files

The `data/` directory contains pre-validated CSV files for different networks:

#### Mainnet Data (`mainnet-data.csv`)
- **Block**: 11573705
- **Transaction**: 3a954835b69ca01ff9cf3b30ce385d5d9ef0cea502bd0f2ad156684dfbaf325a
- **Address**: addr1qxw5ly68dml8ceg7eawa7we8pjw8j8hn74n2djt2upmnq9th42p6lrke4yj3e0xqg3sdqm6lzksa53wd2550vrpkedks4fttnm

#### Preprod Data (`preprod-data.csv`)
- **Block**: 4070700
- **Transaction**: bf540a825d5d40af7435801ce6adcac010f3f9f29ae102aee8cff8007f68c3d4
- **Address**: addr_test1wzn5ee2qaqvly3hx7e0nk3vhm240n5muq3plhjcnvx9ppjgf62u6a

All data has been validated to work with all 8 stability test endpoints, with proper associations between blocks, transactions, and addresses.

## Endpoint Details

### Search Transactions Endpoints

The stability test includes two variants of the `/search/transactions` endpoint:

1. **Search Transactions by Hash**: Queries transactions using `transaction_identifier`. This is a fast, direct lookup by transaction hash.

2. **Search Transactions by Address**: Queries transactions using `account_identifier` with an address. This is more resource-intensive as it requires scanning transaction operations to find all transactions involving the specified address.

Both endpoints use the same API path (`/search/transactions`) but send different request bodies, allowing the performance of each query pattern to be measured independently.
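
The difference is simply which identifier the request body carries. A minimal sketch using the standard Rosetta request fields (the exact payloads built by `stability_test.py` may differ; the hash and address below are placeholders):

```python
import requests

ROSETTA_URL = "http://localhost:8082"  # assumption: local deployment
NETWORK = {"blockchain": "cardano", "network": "mainnet"}

# Variant 1: direct lookup by transaction hash (fast, indexed).
by_hash = requests.post(
    f"{ROSETTA_URL}/search/transactions",
    json={
        "network_identifier": NETWORK,
        "transaction_identifier": {"hash": "<transaction_hash>"},
    },
    timeout=30,
)

# Variant 2: scan for every transaction involving an address (heavier).
by_address = requests.post(
    f"{ROSETTA_URL}/search/transactions",
    json={
        "network_identifier": NETWORK,
        "account_identifier": {"address": "<address>"},
    },
    timeout=30,
)
```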

## Output

The script creates a timestamped directory containing:
2 changes: 2 additions & 0 deletions load-tests/data/preprod-data.csv
@@ -0,0 +1,2 @@
address,block_index,block_hash,transaction_size,relative_ttl,transaction_hash
addr_test1wzn5ee2qaqvly3hx7e0nk3vhm240n5muq3plhjcnvx9ppjgf62u6a,4070700,6b1b29d0533a86443140a88d3758f26fa9d4a8954363e78818b3235126ba933b,683,1000,bf540a825d5d40af7435801ce6adcac010f3f9f29ae102aee8cff8007f68c3d4