## Background
`IMPLEMENTATION` currently has two values:
```python
from enum import Enum

class IMPLEMENTATION(Enum):
    IN_CODE = "in-code"
    NOT_APPLICABLE = "N/A"
```
`N/A` works as a workaround for requirements fulfilled outside of application code — auto-configuration, YAML properties, platform infrastructure, dependencies that wire behavior automatically. But it is:
- Undiscoverable — no documentation explains when to use it or what it means beyond "not in code"
- Too coarse — `N/A` conflates very different situations:
  - A YAML property that sets a port number
  - An auto-configured library that wires authentication on all outbound clients
  - Platform-level redundancy or networking rules
  - A feature intentionally deferred or not implemented yet (draft lifecycle)
- Misleading in status output — "Not in Code" section doesn't communicate why it's not in code
## Proposed new values
Extend `IMPLEMENTATION` with:
| Value | Meaning | `@Requirements` needed? | Tests needed? |
|---|---|---|---|
| `in-code` | Implemented via annotatable application code | Yes | Yes (via SVCs) |
| `configuration` | Fulfilled by application config (e.g. `application.yml`, env vars) | No | Recommended (verify config is effective) |
| `platform` | Fulfilled by infrastructure, platform, or an auto-configured dependency | No | Optional (verify behavior if testable) |
| `N/A` (keep) | Not applicable / not implemented (draft, deferred, intentionally out of scope) | No | No (or MVR/review SVC) |
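If the table above were applied directly, the extended enum might look like this (a sketch: the Python member names are assumptions, only the string values come from the table):

```python
from enum import Enum

class IMPLEMENTATION(Enum):
    IN_CODE = "in-code"            # existing value, stays the default
    CONFIGURATION = "configuration"  # new: fulfilled by application config
    PLATFORM = "platform"            # new: fulfilled by infra or auto-config
    NOT_APPLICABLE = "N/A"           # existing value, kept for compatibility
```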
### `configuration` — example use cases
- "Service shall expose management endpoints on port 8079" → fulfilled by `management.server.port=8079` in config
- "Service shall use structured JSON logging" → fulfilled by a logging config file activated by an environment convention
- "Service shall retry failed requests with backoff" → fulfilled by a `resilience4j:` YAML block
The SVC for a `configuration` requirement should verify the config is effective, not just present — e.g. an integration test that confirms the endpoint actually responds on that port.
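As a sketch of what "effective, not just present" means, the following self-contained test reads a port from a config dict (standing in for `application.yml`), starts a stub server, and asserts the endpoint actually answers on that port. The `/actuator/health` path and the `CONFIG` shape are illustrative:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical config fragment, standing in for application.yml.
CONFIG = {"management": {"server": {"port": 8079}}}

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'{"status": "UP"}')

    def log_message(self, *args):  # silence per-request logging in tests
        pass

def test_management_port_is_effective():
    # A presence check would only assert CONFIG contains the key.
    # This instead verifies the endpoint responds on the configured port.
    port = CONFIG["management"]["server"]["port"]
    server = HTTPServer(("127.0.0.1", port), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        url = f"http://127.0.0.1:{port}/actuator/health"
        with urllib.request.urlopen(url) as resp:
            assert resp.status == 200
            assert json.load(resp)["status"] == "UP"
    finally:
        server.shutdown()
        server.server_close()
```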
### `platform` — example use cases
- "Service shall use service-to-service authentication for outbound calls" → fulfilled by a library that auto-wires auth on all HTTP clients (no application method to annotate)
- "Service shall have distributed tracing" → fulfilled by an OTel auto-configuration starter
- "System shall be highly available" → fulfilled by platform-level load balancing (not testable via application tests)
For testable platform requirements, the SVC can still use `verification: automated-test` with `@SVCs`-annotated tests. For non-testable ones, `verification: platform` or `verification: review` continues to work as it does today.
## Option C: SVC-only completion (related)
A requirement with `implementation: configuration` or `implementation: platform` that has at least one passing SVC should be considered complete — regardless of `@Requirements` annotation count. This is already how `N/A` behaves today; formalizing it for the new types makes the model consistent.
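The rule could be formalized along these lines (the field and function names are illustrative, not the tool's actual API):

```python
from dataclasses import dataclass

# Implementation types whose completion is driven by SVCs alone.
SVC_ONLY_TYPES = {"configuration", "platform", "N/A"}

@dataclass
class Requirement:
    implementation: str = "in-code"  # default when the field is omitted
    annotation_count: int = 0        # @Requirements annotations found in code
    passing_svcs: int = 0

def is_complete(req: Requirement) -> bool:
    # Non-code implementations: at least one passing SVC, regardless
    # of annotation count (matching how N/A behaves today).
    if req.implementation in SVC_ONLY_TYPES:
        return req.passing_svcs >= 1
    # in-code: needs both an annotation and a passing SVC.
    return req.annotation_count >= 1 and req.passing_svcs >= 1
```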
## Status display
Suggested grouping in status output:
```
In Code: 45/50 (90%)
Configuration: 3/3 (100%)
Platform: 5/5 (100%)
N/A: 0/0
─────────────────────────
Total: 53/58 (91%)
```
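One way to produce that grouping (the labels come from the proposal above; the function name and input shape are assumptions):

```python
from collections import Counter

GROUP_LABELS = {
    "in-code": "In Code",
    "configuration": "Configuration",
    "platform": "Platform",
    "N/A": "N/A",
}

def render_status(requirements):
    """requirements: iterable of (implementation, is_complete) pairs."""
    done, total = Counter(), Counter()
    for impl, complete in requirements:
        total[impl] += 1
        done[impl] += complete
    lines = []
    for impl, label in GROUP_LABELS.items():
        d, t = done[impl], total[impl]
        pct = f" ({d / t:.0%})" if t else ""  # skip % for empty groups
        lines.append(f"{label + ':':<15}{d}/{t}{pct}")
    lines.append("─" * 25)
    d, t = sum(done.values()), sum(total.values())
    lines.append(f"{'Total:':<15}{d}/{t} ({d / t:.0%})")
    return "\n".join(lines)
```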
## Migration / backwards compat
- `N/A` remains valid — no breaking change
- `in-code` remains the default when `implementation` is omitted
- New values are additive
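Parsing could preserve both guarantees in one place (a sketch, not the tool's actual parsing code; the enum members are the assumed names from above):

```python
from enum import Enum
from typing import Optional

class IMPLEMENTATION(Enum):
    IN_CODE = "in-code"
    CONFIGURATION = "configuration"
    PLATFORM = "platform"
    NOT_APPLICABLE = "N/A"

def parse_implementation(raw: Optional[str]) -> IMPLEMENTATION:
    # Omitted field keeps today's default; old "N/A" values stay valid.
    return IMPLEMENTATION(raw) if raw is not None else IMPLEMENTATION.IN_CODE
```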
## Related
- The workaround today is `implementation: N/A` — which works but is undiscoverable
- `verification: platform` on SVCs partially addresses this from the SVC side, but doesn't communicate why the requirement has no code implementation