# Testing Strategy & Verify Pipeline
Alloy uses a layered testing strategy that combines Rust unit/integration tests
with shell-based end-to-end scripts. Everything runs through a single entry
point: `./scripts/verify.sh`.
## The Verify Pipeline
`verify.sh` runs 10 sequential steps. If any step fails, the pipeline stops.
Steps 5–9 share a single temporary SQLite server that is started automatically
and torn down on exit.
| Step | Command / Script | What It Checks |
|---|---|---|
| 1 | `cargo fmt --all -- --check` | Code formatting (rustfmt) |
| 2 | `cargo check --workspace --all-targets` | Compilation (all crates, all targets) |
| 3 | `cargo clippy --workspace --all-targets -- -D warnings` | Lint warnings treated as errors |
| 4 | `cargo test --workspace --all-features` | All Rust unit and integration tests |
| 5 | `seed-demo.sh` | API integration — creates demo data and asserts read-back |
| 6 | `test-mcp.sh` | MCP server — JSON-RPC over stdio against live data |
| 7 | `test-permissions.sh` | RBAC — all 5 roles across every endpoint |
| 8 | `test-tui-api.sh` | TUI API paths — smoke-tests every route the TUI uses |
| 9 | `test-cli.sh` | CLI end-to-end — every CLI command with API cross-validation |
| 10 | `verify-docs.sh` | Documentation — curl examples match live API responses |
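The fail-fast behavior can be sketched as a tiny runner. This is an illustration, not the actual `verify.sh`; `true` stands in for the real commands so the sketch runs standalone.

```shell
#!/usr/bin/env bash
# Minimal fail-fast runner sketch: `set -e` aborts on the first nonzero
# exit, mirroring how verify.sh stops at a failed step.
set -euo pipefail

run_step() {
  local num="$1" label="$2"; shift 2
  echo "==> step $num: $label"
  "$@"   # any failure here terminates the whole pipeline
}

# `true` stands in for the real commands (cargo fmt, cargo check, ...).
run_step 1 "format check" true
run_step 2 "compile check" true
echo "all steps passed"
```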
## Running Individual Steps
```bash
# Format
cargo fmt --all -- --check

# Compile check
cargo check --workspace --all-targets

# Clippy
cargo clippy --workspace --all-targets -- -D warnings

# Rust tests only
cargo test --workspace --all-features

# Any integration script standalone (starts its own server if BASE_URL unset)
bash scripts/test-permissions.sh

# Doc verification (always starts its own server)
bash scripts/verify-docs.sh
```
## Shared Server for Steps 5–9
Steps 5–9 reuse a single ephemeral server to avoid repeated compilation. The pipeline:
- Creates a temporary SQLite database file.
- Picks a random free port via Python.
- Starts `alloy serve` with `ALLOY_AUTO_MIGRATE=true`, `ALLOY_REGISTRATION=open`, and high rate limits.
- Waits up to 30 seconds for `/health` to return 200.
- Runs `seed-demo.sh`, which writes a token file consumed by later steps.
- Passes the token, org ID, and user ID as environment variables to subsequent scripts.
- Cleans up the server process and temp files on exit (via `trap`).
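The lifecycle above can be sketched as follows. The `alloy serve` invocation and its environment variables come from the text, but the exact flags and wiring are assumptions, so the server start is left commented out and the sketch runs standalone.

```shell
#!/usr/bin/env bash
# Sketch of the shared-server lifecycle (steps 5-9).
set -euo pipefail

# Temporary SQLite database file, removed on exit.
DB_FILE="$(mktemp)"

# Bind port 0 and read back the kernel-assigned free port.
PORT="$(python3 -c 'import socket; s = socket.socket(); s.bind(("", 0)); print(s.getsockname()[1]); s.close()')"
BASE_URL="http://localhost:$PORT"

cleanup() {
  # Kill the server if it was started, then remove temp files.
  [ -n "${SERVER_PID:-}" ] && kill "$SERVER_PID" 2>/dev/null || true
  rm -f "$DB_FILE"
}
trap cleanup EXIT

# Real pipeline (commented out so the sketch runs without the binary):
# ALLOY_AUTO_MIGRATE=true ALLOY_REGISTRATION=open alloy serve &
# SERVER_PID=$!
# for _ in $(seq 1 30); do
#   curl -fsS "$BASE_URL/health" >/dev/null 2>&1 && break
#   sleep 1
# done

echo "server would listen on $BASE_URL (db: $DB_FILE)"
```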
## seed-demo.sh — API Integration Test
`seed-demo.sh` is both a data seeder and an integration test. It creates a
complete set of demo data and asserts every creation via read-back.
**What it creates:** A user (`demo@alloy.dev` / `demodemo1`), an org
(“Acme Corp”), a project (“Demo Project”), 4 labels, a sprint, 6 tickets,
comments, and time entries.
**How it works:**
- Helper functions `post()`, `get()`, `del()` wrap curl with auth headers and exit on non-2xx responses.
- `assert_eq()` compares expected vs actual values; any mismatch is fatal.
- `expect_status()` verifies specific HTTP status codes (used for negative tests like duplicate detection).
- After each POST, a GET reads the resource back and asserts field values match.
- On success, writes `{token, org_id, user_id}` to `$ALLOY_TOKEN_FILE` for downstream scripts.
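A sketch of two of these helpers, modeled on the behavior described above. The function bodies are assumptions, not the actual script.

```shell
# Fatal assertion helper: any mismatch aborts the whole seeder.
assert_eq() {
  local expected="$1" actual="$2" label="$3"
  if [ "$expected" = "$actual" ]; then
    echo "ok: $label"
  else
    echo "FAIL: $label (expected '$expected', got '$actual')" >&2
    exit 1
  fi
}

# post() wraps curl, adds auth headers, and exits on non-2xx responses.
post() {
  local path="$1" body="$2" status resp
  resp="$(mktemp)"
  status="$(curl -s -o "$resp" -w '%{http_code}' \
    -X POST "$BASE_URL$path" \
    -H "Authorization: Bearer $TOKEN" \
    -H 'Content-Type: application/json' \
    -d "$body")"
  case "$status" in
    2??) cat "$resp"; rm -f "$resp" ;;
    *)   echo "POST $path -> HTTP $status" >&2; rm -f "$resp"; exit 1 ;;
  esac
}
```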
**Environment variables:**
| Variable | Default | Description |
|---|---|---|
| `BASE_URL` | `http://localhost:3000` | Server address |
| `ALLOY_TOKEN_FILE` | (none) | Path to write auth context JSON for later scripts |
## test-mcp.sh — MCP Integration Test
Tests the MCP server binary (`alloy-mcp`) by sending JSON-RPC messages over
stdio and validating responses.
**How it works:**
- Launches `alloy-mcp` as a subprocess with `BASE_URL`, `TOKEN`, `SEED_ORG_ID`, and `SEED_USER_ID` environment variables.
- Sends JSON-RPC `tools/call` requests for each MCP tool (list projects, create ticket, etc.).
- Parses the JSON-RPC response and validates fields using `assert_eq`, `assert_not_empty`, and `assert_contains` helpers.
- Includes HTTP helpers (`post_json`, `get_json`) for API key creation and cross-validation against the REST API.
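The shape of one such request can be sketched as below. The `tools/call` method comes from the text; the tool name `list_projects` and the exact argument fields are illustrative assumptions.

```shell
# Build a JSON-RPC tools/call message like test-mcp.sh sends over stdio.
make_request() {
  printf '{"jsonrpc":"2.0","id":%d,"method":"tools/call","params":{"name":"%s","arguments":%s}}\n' \
    "$1" "$2" "$3"
}

REQ="$(make_request 1 list_projects '{}')"
echo "$REQ"

# In the real test this is piped into the subprocess, roughly:
# echo "$REQ" | BASE_URL="$BASE_URL" TOKEN="$TOKEN" \
#   SEED_ORG_ID="$SEED_ORG_ID" SEED_USER_ID="$SEED_USER_ID" alloy-mcp
```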
**Required environment variables:** `BASE_URL`, `TOKEN`, `SEED_ORG_ID`, `SEED_USER_ID`.
## test-permissions.sh — Role-Based Access Control Tests
Tests all 5 roles (Owner, Admin, Member, Reporter, Viewer) across every endpoint, verifying that each role gets the expected HTTP status code.
**How it works:**
- Setup: Registers 5 users, creates an org, invites each user with a different role, and logs each in to get role-specific tokens (`TOKEN_OWNER`, `TOKEN_ADMIN`, `TOKEN_MEMBER`, `TOKEN_REPORTER`, `TOKEN_VIEWER`).
- Tests: For each endpoint and HTTP method, calls `check()` with every role’s token and asserts the expected status code (200/201 for allowed, 403 for forbidden).
- Self-contained: If `BASE_URL` is not set, starts its own temporary server.
**Key helper — `check()`:**
```bash
check METHOD URL BODY EXPECTED_STATUS LABEL TOKEN
```
Fires an HTTP request and compares the response status code to the expected
value. Increments `PASS_COUNT` or `FAIL_COUNT` and records failure details.
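A sketch of what `check()` might look like. The signature matches the one documented above; the body is an assumption based on the described behavior.

```shell
PASS_COUNT=0
FAIL_COUNT=0
FAILURES=""

# check METHOD URL BODY EXPECTED_STATUS LABEL TOKEN
check() {
  local method="$1" url="$2" body="$3" expected="$4" label="$5" token="$6"
  local status
  local args=(-s -o /dev/null -w '%{http_code}' -X "$method" "$url" \
              -H "Authorization: Bearer $token")
  if [ -n "$body" ]; then
    args+=(-H 'Content-Type: application/json' -d "$body")
  fi
  status="$(curl "${args[@]}")"
  if [ "$status" = "$expected" ]; then
    PASS_COUNT=$((PASS_COUNT + 1))
  else
    FAIL_COUNT=$((FAIL_COUNT + 1))
    FAILURES="$FAILURES$label: expected $expected, got $status
"
  fi
}
```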
## test-tui-api.sh — TUI API Smoke Test
Verifies that every HTTP path used in `alloy-tui/src/api.rs` actually works
against a live server with seed data.
**How it works:**
- The `check()` helper sends a request and asserts a 2xx response code.
- Walks through every API call the TUI makes: `/health`, `/api/v1/auth/me`, project listing, ticket listing, ticket detail, comments, sprints, labels, workflows, and mutations (create/update ticket, add comment).
- Captures IDs from responses to use in subsequent requests (e.g., gets a project ID, then lists its tickets).
**Required environment variables:** `BASE_URL`, `TOKEN`, `SEED_ORG_ID`.
**Why this exists:** The TUI compiles against `api.rs` types, not against the
server’s router. A path mismatch (e.g., `/api/v1/tickets` vs
`/api/v1/projects/:id/tickets`) compiles fine but fails at runtime. This script
catches those mismatches early.
## test-cli.sh — CLI End-to-End Test
Tests every CLI command by running the real `alloy` binary and validating output.
**How it works:**
- Uses `cargo run --quiet --bin alloy --` with `--api-url` and `--format json` flags to execute CLI commands programmatically.
- Creates a fake `$HOME` directory so CLI credential storage doesn’t conflict with real user credentials.
- Preserves `$RUSTUP_HOME` and `$CARGO_HOME` so cargo still works under the fake home.
- `run_cli()` runs a command, increments pass/fail counters, and returns stdout.
- `assert_eq()` validates specific field values from JSON output.
- `api_get()` cross-validates CLI operations by reading back via the REST API.
- Tests the full lifecycle: auth login, project/ticket/sprint CRUD, comments, labels, time entries.
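The fake-home setup can be sketched as below. The variable names beyond those in the text (`FAKE_HOME`, `REAL_RUSTUP`, `REAL_CARGO`) are invented for the example, and the real `run_cli()` also does pass/fail counting.

```shell
# Isolated $HOME so CLI credential storage never touches the real user's.
FAKE_HOME="$(mktemp -d)"

# Capture the real toolchain paths BEFORE swapping HOME, so cargo keeps working.
REAL_RUSTUP="${RUSTUP_HOME:-$HOME/.rustup}"
REAL_CARGO="${CARGO_HOME:-$HOME/.cargo}"
trap 'rm -rf "$FAKE_HOME"' EXIT

run_cli() {
  HOME="$FAKE_HOME" RUSTUP_HOME="$REAL_RUSTUP" CARGO_HOME="$REAL_CARGO" \
    cargo run --quiet --bin alloy -- \
      --api-url "$BASE_URL" --format json "$@"
}
```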
**Required environment variables:** `BASE_URL`, `TOKEN`, `SEED_ORG_ID`.
## verify-docs.sh — Documentation Curl Validation
Extracts curl examples from all Markdown documentation files and validates them against a live server.
**How it works:**
- Starts its own temporary SQLite server and seeds it via `seed-demo.sh`.
- Collects all `.md` files from `docs/`, `docs/tutorials/`, and `docs/guides/`.
- A Python script extracts “bash/sh + json” block pairs from each file:
  - A `` ```bash `` block containing `curl` followed immediately by a `` ```json `` block is treated as a testable pair.
  - Script-like blocks (shebangs, loops, function definitions) are skipped.
- For each pair:
  - Substitutes environment variables (`$BASE_URL`, `$TOKEN`, `$PROJECT_ID`, etc.).
  - Runs the curl command and captures the response.
  - Validates the response JSON against the expected pattern.
- Wildcard matching: `"..."` in expected JSON values means the field must exist but any value is accepted. Exact values are compared strictly.
- Auto-capture: The validator automatically captures IDs, tokens, and other values from responses to use as variables in subsequent curl commands within the same file.
**Writing testable curl examples:**
```bash
curl -s -X GET "$BASE_URL/api/v1/projects" \
  -H "Authorization: Bearer $TOKEN"
```

```json
{
  "items": [
    {
      "id": "...",
      "name": "Acme Corp"
    }
  ]
}
```
**Rules:**
- The `` ```json `` block must immediately follow the `` ```bash `` block (blank lines between are OK).
- Use `"..."` for dynamic values (IDs, timestamps).
- Use exact values for fields you want to assert.
- Available variables: `$BASE_URL`, `$TOKEN`, `$USER_ID`, `$ORG_ID`, `$PROJECT_ID`, `$TICKET_ID`, `$SPRINT_ID`, `$LABEL_ID`, `$COMMENT_ID`, `$WORKFLOW_ID`, `$TIME_ENTRY_ID`, `$API_KEY_ID`, `$API_KEY`, `$WEBHOOK_ID`, `$LABOR_RATE_ID`, `$INVITE_ID`, `$INVITE_CODE`.
## Adding a New Test
### Rust Unit/Integration Test
- Add the test in the same file as the code it tests, or in a `tests/` module.
- Use `#[sqlx::test]` for PostgreSQL integration tests gated behind `#[cfg(feature = "postgres-tests")]`.
- Use in-memory SQLite for fast tests that don’t need PostgreSQL-specific features.
- Run `cargo test --workspace --all-features` to verify.
### New API Endpoint Test
When adding a new endpoint, add coverage in multiple layers:
- `seed-demo.sh` — Add a creation call and read-back assertion for the new resource.
- `test-permissions.sh` — Add `check()` calls for all 5 roles against the new endpoint.
- `test-tui-api.sh` — If the TUI uses the endpoint, add a `check()` call.
- `test-cli.sh` — If there’s a CLI command, add a `run_cli()` call with assertions.
- `test-mcp.sh` — If there’s an MCP tool, add a JSON-RPC test.
- Documentation — Add curl + json example pairs in the relevant docs so `verify-docs.sh` validates them.
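For the permissions layer, the per-role additions often follow a loop pattern like this sketch. The endpoint `/api/v1/widgets` and the expected status codes per role are hypothetical; `echo` stands in for the real `check()` invocation.

```shell
# All-roles coverage sketch for a hypothetical new endpoint.
for pair in "OWNER:201" "ADMIN:201" "MEMBER:201" "REPORTER:403" "VIEWER:403"; do
  role="${pair%%:*}"
  expected="${pair##*:}"
  # Real script would call:
  #   check POST "$BASE_URL/api/v1/widgets" '{}' "$expected" "create widget as $role" "${!token_var}"
  echo "check POST /api/v1/widgets '{}' $expected 'create widget as $role' TOKEN_$role"
done
```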
### New Documentation Example
Follow the curl + json block pair format described in the `verify-docs.sh`
section above. Run `bash scripts/verify-docs.sh` to validate your examples
before committing.