---
description: "Test documentation template for feature implementation"
---

# Test Documentation: [FEATURE NAME]

**Feature**: [Link to spec.md]
**Created**: [DATE]
**Updated**: [DATE]
**Tester**: [Agent/User Name]

---

## Overview

[Brief description of what this feature does and why testing is important]

**Test Strategy**:

- [ ] Unit Tests (co-located in `__tests__/` directories)
- [ ] Integration Tests (if needed)
- [ ] E2E Tests (if critical user flows)
- [ ] Contract Tests (for API endpoints)

---

## Test Coverage Matrix

| Module | File | Unit Tests | Coverage % | Status |
|--------|------|------------|------------|--------|
| [Module Name] | `path/to/file.py` | [x] | [XX%] | [Pass/Fail] |
| [Module Name] | `path/to/file.svelte` | [x] | [XX%] | [Pass/Fail] |

---

## Test Cases

### [Module Name]

**Target File**: `path/to/module.py`

| ID | Test Case | Type | Expected Result | Status |
|----|-----------|------|-----------------|--------|
| TC001 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |
| TC002 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |

---

## Test Execution Reports

### Report [YYYY-MM-DD]

**Executed by**: [Tester]
**Duration**: [X] minutes
**Result**: [Pass/Fail]

**Summary**:

- Total Tests: [X]
- Passed: [X]
- Failed: [X]
- Skipped: [X]

**Failed Tests**:

| Test | Error | Resolution |
|------|-------|------------|
| [Test Name] | [Error Message] | [How Fixed] |

---

## Anti-Patterns & Rules

### ✅ DO

1. Write tests BEFORE implementation (TDD approach)
2. Use co-location: `src/module/__tests__/test_module.py`
3. Use MagicMock for external dependencies (DB, Auth, APIs)
4. Include semantic annotations: `# @RELATION: VERIFIES -> module.name`
5. Test edge cases and error conditions
6. **Test UX states** for Svelte components (@UX_STATE, @UX_FEEDBACK, @UX_RECOVERY)

### ❌ DON'T

1. Delete existing tests (only update them if they fail)
2. Duplicate tests (check for existing tests first)
3. Test implementation details instead of behavior
4. Use real external services in unit tests
5. Skip error handling tests
6. **Skip UX contract tests** for CRITICAL frontend components

---

## UX Contract Testing (Frontend)

### UX States Coverage

| Component | @UX_STATE | @UX_FEEDBACK | @UX_RECOVERY | Tests |
|-----------|-----------|--------------|--------------|-------|
| [Component] | [states] | [feedback] | [recovery] | [status] |

### UX Test Cases

| ID | Component | UX Tag | Test Action | Expected Result | Status |
|----|-----------|--------|-------------|-----------------|--------|
| UX001 | [Component] | @UX_STATE: Idle | [action] | [expected] | [Pass/Fail] |
| UX002 | [Component] | @UX_FEEDBACK | [action] | [expected] | [Pass/Fail] |
| UX003 | [Component] | @UX_RECOVERY | [action] | [expected] | [Pass/Fail] |

### UX Test Examples

```javascript
// Testing @UX_STATE transition
it('should transition from Idle to Loading on submit', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByTestId('form')).toHaveClass('loading');
});

// Testing @UX_FEEDBACK
it('should show error toast on validation failure', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByRole('alert')).toHaveTextContent('Validation error');
});

// Testing @UX_RECOVERY
it('should allow retry after error', async () => {
  render(FormComponent);
  // Trigger error state
  await fireEvent.click(screen.getByText('Submit'));
  // Click retry
  await fireEvent.click(screen.getByText('Retry'));
  expect(screen.getByTestId('form')).not.toHaveClass('error');
});
```

---

## Notes

- [Additional notes about testing approach]
- [Known issues or limitations]
- [Recommendations for future testing]

---

## Related Documents

- [spec.md](./spec.md)
- [plan.md](./plan.md)
- [tasks.md](./tasks.md)
- [contracts/](./contracts/)
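
---

## Appendix: Mocking & Annotation Example (Backend)

The DO rules on mocking (MagicMock for external dependencies) and semantic annotations (`# @RELATION: VERIFIES -> module.name`) can be sketched as a minimal pytest-style unit test. The `payments` module, the `charge_order` function, and the gateway client API below are hypothetical, invented purely for illustration:

```python
from unittest.mock import MagicMock

# Hypothetical function under test: charges an order through an
# external payment gateway client (the kind of dependency that must
# be mocked, never called for real, in a unit test).
def charge_order(gateway, order_id, amount):
    """Return True if the gateway reports a successful charge."""
    response = gateway.charge(order_id=order_id, amount=amount)
    return response["status"] == "ok"

# @RELATION: VERIFIES -> payments.charge_order
def test_charge_order_success():
    # MagicMock stands in for the real gateway client (DO rule 3),
    # so the test runs without any network or credentials.
    gateway = MagicMock()
    gateway.charge.return_value = {"status": "ok"}

    assert charge_order(gateway, order_id=42, amount=9.99) is True
    # Verify behavior at the boundary, not internal implementation details.
    gateway.charge.assert_called_once_with(order_id=42, amount=9.99)

test_charge_order_success()
```

Per the co-location rule, such a test would live next to its module, e.g. `src/payments/__tests__/test_payments.py`, with the `@RELATION` comment linking it back to the code it verifies.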