| description |
|---|
| Test documentation template for feature implementation |
# Test Documentation: [FEATURE NAME]

**Feature**: [Link to spec.md] | **Created**: [DATE] | **Updated**: [DATE] | **Tester**: [Agent/User Name]
## Overview
[Brief description of what this feature does and why testing is important]
**Test Strategy**:

- Unit Tests (co-located in `__tests__/` directories)
- Integration Tests (if needed)
- E2E Tests (if critical user flows)
- Contract Tests (for API endpoints)
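A contract test pins down the response shape an endpoint promises to its consumers. A minimal sketch, assuming a hypothetical `get_item` handler and an illustrative three-field contract (neither comes from this template):

```python
# Hypothetical contract test: verifies the handler's response matches the
# agreed contract (field names and Python types). Handler and schema are
# illustrative placeholders, not part of any real spec.
REQUIRED_FIELDS = {"id": int, "name": str, "status": str}


def get_item(item_id):
    # Stand-in for the real endpoint handler under test.
    return {"id": item_id, "name": "example", "status": "active"}


def test_get_item_contract():
    response = get_item(1)
    for field, ftype in REQUIRED_FIELDS.items():
        assert field in response, f"missing contract field: {field}"
        assert isinstance(response[field], ftype), f"wrong type for {field}"
```

In a real suite the stand-in handler would be replaced by a test-client call against the actual endpoint.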
## Test Coverage Matrix

| Module | File | Unit Tests | Coverage % | Status |
|---|---|---|---|---|
| [Module Name] | `path/to/file.py` | [x] | [XX%] | [Pass/Fail] |
| [Module Name] | `path/to/file.svelte` | [x] | [XX%] | [Pass/Fail] |
## Test Cases

### [Module Name]

**Target File**: `path/to/module.py`
| ID | Test Case | Type | Expected Result | Status |
|---|---|---|---|---|
| TC001 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |
| TC002 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |
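Each table row maps to one named test function. A minimal sketch of how two rows might be implemented, using a hypothetical `normalize_name` function as the module under test (the function, its behavior, and the test names are illustrative assumptions):

```python
# Sketch of unit tests backing two TC rows; module and function names are
# placeholders following the table's [Description] convention.
# @RELATION: VERIFIES -> module.normalize_name
def normalize_name(raw):
    # Stand-in for the real module function under test.
    return raw.strip().lower()


def test_tc001_normalizes_whitespace_and_case():
    assert normalize_name("  Alice ") == "alice"


def test_tc002_empty_input_returns_empty_string():
    assert normalize_name("") == ""
```

Naming the function after the TC ID keeps the execution report traceable back to this table.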
## Test Execution Reports

### Report [YYYY-MM-DD]

**Executed by**: [Tester] | **Duration**: [X] minutes | **Result**: [Pass/Fail]
**Summary**:
- Total Tests: [X]
- Passed: [X]
- Failed: [X]
- Skipped: [X]
**Failed Tests**:
| Test | Error | Resolution |
|---|---|---|
| [Test Name] | [Error Message] | [How Fixed] |
## Anti-Patterns & Rules

### ✅ DO
- Write tests BEFORE implementation (TDD approach)
- Use co-location: `src/module/__tests__/test_module.py`
- Use MagicMock for external dependencies (DB, Auth, APIs)
- Include semantic annotations: `# @RELATION: VERIFIES -> module.name`
- Test edge cases and error conditions
- Test UX states for Svelte components (@UX_STATE, @UX_FEEDBACK, @UX_RECOVERY)
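The MagicMock and annotation rules above can be combined in one small sketch. The `fetch_user` function, its `db.query` interface, and the annotation target are hypothetical examples, not names from this codebase:

```python
# Minimal sketch: MagicMock replaces the external DB dependency so the
# unit test never touches a real service. All names are illustrative.
from unittest.mock import MagicMock


# @RELATION: VERIFIES -> users.fetch_user
def fetch_user(db, user_id):
    # Stand-in for the real module function under test.
    row = db.query(user_id)
    return {"id": row["id"], "name": row["name"]}


def test_fetch_user_uses_mocked_db():
    db = MagicMock()
    db.query.return_value = {"id": 7, "name": "Ada"}
    assert fetch_user(db, 7) == {"id": 7, "name": "Ada"}
    db.query.assert_called_once_with(7)
```

`assert_called_once_with` also verifies the interaction with the dependency, not just the return value.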
### ❌ DON'T
- Delete existing tests (only update if they fail)
- Duplicate tests - check for existing tests first
- Test implementation details instead of behavior
- Use real external services in unit tests
- Skip error handling tests
- Skip UX contract tests for CRITICAL frontend components
## UX Contract Testing (Frontend)

### UX States Coverage
| Component | @UX_STATE | @UX_FEEDBACK | @UX_RECOVERY | Tests |
|---|---|---|---|---|
| [Component] | [states] | [feedback] | [recovery] | [status] |
### UX Test Cases
| ID | Component | UX Tag | Test Action | Expected Result | Status |
|---|---|---|---|---|---|
| UX001 | [Component] | @UX_STATE: Idle | [action] | [expected] | [Pass/Fail] |
| UX002 | [Component] | @UX_FEEDBACK | [action] | [expected] | [Pass/Fail] |
| UX003 | [Component] | @UX_RECOVERY | [action] | [expected] | [Pass/Fail] |
### UX Test Examples

```javascript
import { render, fireEvent, screen } from '@testing-library/svelte';
import { it, expect } from 'vitest';
import '@testing-library/jest-dom';
import FormComponent from './FormComponent.svelte'; // placeholder component path

// Testing @UX_STATE transition
it('should transition from Idle to Loading on submit', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByTestId('form')).toHaveClass('loading');
});

// Testing @UX_FEEDBACK
it('should show error toast on validation failure', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByRole('alert')).toHaveTextContent('Validation error');
});

// Testing @UX_RECOVERY
it('should allow retry after error', async () => {
  render(FormComponent);
  // Trigger error state
  await fireEvent.click(screen.getByText('Submit'));
  // Click retry
  await fireEvent.click(screen.getByText('Retry'));
  expect(screen.getByTestId('form')).not.toHaveClass('error');
});
```
## Notes
- [Additional notes about testing approach]
- [Known issues or limitations]
- [Recommendations for future testing]