tasks are ready

2026-02-07 12:42:32 +03:00
parent 0f16bab2b8
commit e6087bd3c1
9 changed files with 474 additions and 5 deletions


@@ -66,11 +66,14 @@
3. **TRIVIAL** (DTO/**Atoms**):
- Requirement: Only [DEF] anchors and @PURPOSE.
#### VI. LOGGING (BELIEF STATE)
Goal: Tracing for self-correction.
Python: Context Manager `with belief_scope("ID"):`.
#### VI. LOGGING (BELIEF STATE & TASK LOGS)
Goal: Tracing for self-correction and user-facing monitoring.
Python:
- System logs: Context Manager `with belief_scope("ID"):`.
- Task logs: `context.logger.info("msg", source="component")`.
Svelte: `console.log("[ID][STATE] Msg")`.
States: Entry -> Action -> Coherence:OK / Failed -> Exit.
Invariant: Every task log must carry a `source` attribute for filtering.
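The belief-state convention above can be sketched as a small context manager. This is a minimal illustration only — `BeliefTrace` and its method names are hypothetical, not part of the project; the real `belief_scope` presumably writes to the system logger rather than a list:

```python
from contextlib import contextmanager

class BeliefTrace:
    """Hypothetical sketch: collects Entry -> Action -> Coherence -> Exit transitions."""

    def __init__(self):
        self.lines = []

    def log(self, scope_id, state, msg=""):
        # Matches the "[ID][STATE] Msg" format used on the Svelte side.
        self.lines.append(f"[{scope_id}][{state}] {msg}".rstrip())

    @contextmanager
    def belief_scope(self, scope_id):
        self.log(scope_id, "Entry")
        try:
            yield
            self.log(scope_id, "Coherence:OK")
        except Exception:
            self.log(scope_id, "Coherence:Failed")
            raise
        finally:
            self.log(scope_id, "Exit")
```

A failed body would emit `Coherence:Failed` before `Exit`, giving the self-correction loop a machine-readable trace of where coherence broke.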
#### VII. GENERATION ALGORITHM
1. ANALYSIS. Assess the TIER, the layer, and the UX requirements.


@@ -0,0 +1,41 @@
# Specification Quality Checklist: Task Logging System (Separated Per-Task Logs)
**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2026-02-07
**Feature**: [specs/018-task-logging-v2/spec.md](specs/018-task-logging-v2/spec.md)
## Content Quality
- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed
## UX Consistency
- [x] Functional requirements fully support the 'Happy Path' in ux_reference.md
- [x] Error handling requirements match the 'Error Experience' in ux_reference.md
- [x] No requirements contradict the defined User Persona or Context
## Requirement Completeness
- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified
## Feature Readiness
- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification
## Notes
- The specification is based on the provided technical draft but has been abstracted to focus on functional requirements and user value.
- Implementation details like "FastAPI", "SQLAlchemy", or "SvelteKit" are excluded from the spec as per guidelines.


@@ -0,0 +1,103 @@
# Technical Plan: Task Logging System (Separated Per-Task Logs)
**Feature Branch**: `018-task-logging-v2`
**Specification**: [`specs/018-task-logging-v2/spec.md`](specs/018-task-logging-v2/spec.md)
**Created**: 2026-02-07
**Status**: Draft
## 1. Architecture Overview
The system will transition from in-memory JSON logging to a persistent, source-attributed logging architecture.
### 1.1. Component Diagram
```mermaid
graph TD
subgraph Backend
P[Plugin] -->|uses| TC[TaskContext]
TC -->|delegates| TL[TaskLogger]
TL -->|calls| TM[TaskManager]
TM -->|buffers| LB[LogBuffer]
LB -->|periodic flush| LPS[LogPersistenceService]
LPS -->|SQL| DB[(SQLite: task_logs)]
TM -->|broadcast| WS[WebSocket Server]
end
subgraph Frontend
WS -->|stream| TLP[TaskLogPanel]
API[REST API] -->|fetch history| TLP
TLP -->|render| UI[Log UI]
end
```
## 2. Database Schema
A new table `task_logs` will be added to the primary database.
| Column | Type | Constraints |
|--------|------|-------------|
| `id` | Integer | Primary Key, Autoincrement |
| `task_id` | String | Foreign Key (tasks.id), Indexed |
| `timestamp` | DateTime | Not Null, Indexed |
| `level` | String(16) | Not Null (INFO, WARNING, ERROR, DEBUG) |
| `source` | String(64) | Not Null, Default: 'system' |
| `message` | Text | Not Null |
| `metadata_json` | Text | Nullable (JSON string) |
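The table above maps to DDL along these lines. This is a sketch using the stdlib `sqlite3` module for illustration (the plan says the real migration lives in the models layer, so exact types and index names may differ):

```python
import sqlite3

# Assumed DDL derived from the schema table; index names are illustrative.
SCHEMA = """
CREATE TABLE IF NOT EXISTS task_logs (
    id            INTEGER PRIMARY KEY AUTOINCREMENT,
    task_id       TEXT    NOT NULL REFERENCES tasks(id),
    timestamp     TEXT    NOT NULL,
    level         TEXT    NOT NULL,              -- INFO, WARNING, ERROR, DEBUG
    source        TEXT    NOT NULL DEFAULT 'system',
    message       TEXT    NOT NULL,
    metadata_json TEXT                           -- nullable JSON string
);
CREATE INDEX IF NOT EXISTS ix_task_logs_task_id   ON task_logs (task_id);
CREATE INDEX IF NOT EXISTS ix_task_logs_timestamp ON task_logs (timestamp);
"""

def create_task_logs_table(conn: sqlite3.Connection) -> None:
    """Apply the task_logs schema (idempotent)."""
    conn.executescript(SCHEMA)
```

The indexes on `task_id` and `timestamp` back the two hot paths: fetching one task's history and ordering logs chronologically.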
## 3. Backend Implementation Details
### 3.1. `TaskLogger` & `TaskContext`
- `TaskLogger`: A wrapper around `TaskManager._add_log` that carries `task_id` and `source`.
- `TaskContext`: A container passed to `plugin.execute()` providing the logger and other task-specific utilities.
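A minimal sketch of the two classes, assuming the sink is whatever callable `TaskManager._add_log` exposes (the constructor arguments and method bodies here are illustrative, not the final API):

```python
from dataclasses import dataclass, field

class TaskLogger:
    """Per-task logger that attaches task_id and a default source to every entry."""

    def __init__(self, task_id, sink, source="plugin"):
        self._task_id = task_id
        self._sink = sink          # e.g. TaskManager._add_log (assumed)
        self._source = source

    def with_source(self, source):
        # New logger bound to a different component source; the original is unchanged.
        return TaskLogger(self._task_id, self._sink, source)

    def _log(self, level, message, source=None, **metadata):
        self._sink(self._task_id, level, message, source or self._source, metadata or None)

    def info(self, message, **kw):    self._log("INFO", message, **kw)
    def warning(self, message, **kw): self._log("WARNING", message, **kw)
    def error(self, message, **kw):   self._log("ERROR", message, **kw)

@dataclass
class TaskContext:
    """Container passed to plugin.execute() with task-scoped utilities."""
    task_id: str
    logger: TaskLogger
    config: dict = field(default_factory=dict)
```

This shape supports both call styles from the spec: a per-call override (`logger.info("msg", source="component")`) and a bound sub-logger (`logger.with_source("api")`).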
### 3.2. `TaskManager` Enhancements
- **Log Buffer**: A dictionary mapping `task_id` to a list of pending `LogEntry` objects.
- **Flusher Thread**: A background daemon thread that flushes the buffer to the database every 2 seconds.
- **Backward Compatibility**: Use `inspect.signature` to detect if a plugin's `execute` method accepts the new `context` parameter.
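The backward-compatibility check can be done once per dispatch with `inspect.signature`. A sketch, assuming `call_execute` is a hypothetical dispatcher inside `_run_task` and `legacy_args` stands in for whatever keyword arguments un-migrated plugins already receive:

```python
import inspect

def call_execute(plugin, context, legacy_args):
    """Pass the TaskContext only to plugins whose execute() declares a 'context' parameter."""
    params = inspect.signature(plugin.execute).parameters
    if "context" in params:
        return plugin.execute(context=context, **legacy_args)
    # Un-migrated plugin: call with the old signature, no context.
    return plugin.execute(**legacy_args)
```

Inspecting the signature once at dispatch time keeps migrated and legacy plugins on the same code path with no per-plugin configuration flag.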
### 3.3. API Endpoints
- `GET /api/tasks/{task_id}/logs`: Supports `level`, `source`, `search`, `offset`, and `limit`.
- `GET /api/tasks/{task_id}/logs/stats`: Returns counts by level and source.
- `GET /api/tasks/{task_id}/logs/sources`: Returns unique sources for the task.
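The filter parameters on the first endpoint compose naturally into a parameterized query. An illustrative helper only — the real `TaskLogPersistenceService` may build this via an ORM rather than raw SQL:

```python
def build_logs_query(task_id, level=None, source=None, search=None, offset=0, limit=100):
    """Compose the WHERE clause for GET /api/tasks/{task_id}/logs (illustrative)."""
    sql = "SELECT * FROM task_logs WHERE task_id = ?"
    args = [task_id]
    if level:
        sql += " AND level = ?"
        args.append(level)
    if source:
        sql += " AND source = ?"
        args.append(source)
    if search:
        sql += " AND message LIKE ?"
        args.append(f"%{search}%")
    sql += " ORDER BY timestamp LIMIT ? OFFSET ?"
    args += [limit, offset]
    return sql, args
```

Placeholders rather than string interpolation keep the free-text `search` parameter safe from SQL injection.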
## 4. Frontend Implementation Details
### 4.1. Components
- `TaskLogPanel`: Main container managing state and data fetching.
- `LogFilterBar`: UI for selecting filters.
- `LogEntryRow`: Optimized row rendering with color coding and progress bar support.
### 4.2. Data Fetching
- **Live Tasks**: Connect via WebSocket with filter parameters in the query string.
- **Completed Tasks**: Fetch via REST API with pagination.
## 5. Implementation Phases
### Phase 1: Foundation
- Database migration (new table).
- `LogPersistenceService` implementation.
- `TaskLogger` and `TaskContext` classes.
### Phase 2: Core Integration
- `TaskManager` buffer and flusher logic.
- API endpoint updates.
- WebSocket server-side filtering.
### Phase 3: Plugin Migration
- Update core plugins (Backup, Migration, Git) to use the new logger.
- Verify backward compatibility with un-migrated plugins.
### Phase 4: Frontend
- Implement new Svelte components.
- Integrate with updated API and WebSocket.
## 6. Verification Plan
- **Unit Tests**:
- `TaskLogger` correctly routes to `TaskManager`.
- `LogPersistenceService` handles batch inserts and complex filters.
- `TaskManager` flushes logs on task completion.
- **Integration Tests**:
- End-to-end flow from plugin log to frontend display.
- Verification of log persistence after server restart.
- **Performance Tests**:
- Measure overhead of batch logging under high load (1000+ logs/sec).


@@ -0,0 +1,99 @@
# Feature Specification: Task Logging System (Separated Per-Task Logs)
**Feature Branch**: `018-task-logging-v2`
**Reference UX**: `specs/018-task-logging-v2/ux_reference.md`
**Created**: 2026-02-07
**Status**: Draft
**Input**: User description: "Implement a separated per-task logging system with source attribution, persistence, and frontend filtering."
## User Scenarios & Testing *(mandatory)*
### User Story 1 - Real-time Filtered Logging (Priority: P1)
As a user, I want to see logs from a running task filtered by their source so that I can focus on specific component behavior without noise.
**Why this priority**: This is the core value proposition—isolating component logs during execution.
**Independent Test**: Start a task that emits logs from multiple sources (e.g., `plugin` and `system`), apply a source filter in the UI, and verify only matching logs are displayed in real-time.
**Acceptance Scenarios**:
1. **Given** a task is running and emitting logs from "plugin" and "superset_api", **When** I select "superset_api" in the Source filter, **Then** only logs with the "superset_api" tag are visible.
2. **Given** a filter is active, **When** a new log matching the filter arrives via WebSocket, **Then** it is immediately appended to the view.
---
### User Story 2 - Persistent Log History (Priority: P1)
As a user, I want my task logs to be saved permanently so that I can review them even after the server restarts.
**Why this priority**: Current logs are in-memory and lost on restart, which is a major pain point for debugging past failures.
**Independent Test**: Run a task, restart the backend server, and verify that the logs for that task are still accessible via the UI.
**Acceptance Scenarios**:
1. **Given** a task has completed, **When** I restart the server and navigate to the task details, **Then** all logs are loaded from the database.
2. **Given** a task was deleted via the cleanup service, **When** I check the database, **Then** all associated logs are also removed.
---
### User Story 3 - Component Attribution (Priority: P2)
As a developer, I want to use a dedicated logger that automatically tags my logs with the correct source.
**Why this priority**: Simplifies plugin development and ensures consistent data for filtering.
**Independent Test**: Update a plugin to use `context.logger.with_source("custom")` and verify that logs appear with the "custom" tag.
**Acceptance Scenarios**:
1. **Given** a plugin uses the new `TaskContext`, **When** it calls `logger.info()`, **Then** the log is recorded with `source="plugin"` by default.
2. **Given** a plugin creates a sub-logger with `with_source("api")`, **When** it logs a message, **Then** the log is recorded with `source="api"`.
---
### Edge Cases
- **High Volume Logs**: How does the system handle a task generating 10,000+ logs in a few seconds? (Requirement: Batching and virtual scrolling).
- **Database Connection Loss**: What happens if the log flusher cannot write to the DB? (Requirement: Logs should remain in-memory as a fallback until the next flush).
- **Concurrent Tasks**: Ensure logs from Task A never leak into the view or database records of Task B.
## Requirements *(mandatory)*
### Functional Requirements
- **FR-001**: System MUST persist task logs in a dedicated `task_logs` database table.
- **FR-002**: Each log entry MUST include: `task_id`, `timestamp`, `level`, `message`, `source`, and optional `metadata`.
- **FR-003**: System MUST provide a `TaskLogger` via a `TaskContext` to all plugins.
- **FR-004**: System MUST support batch-writing logs to the database (e.g., every 2 seconds) to maintain performance.
- **FR-005**: Backend MUST support server-side filtering of logs by `level`, `source`, and text `search` via REST and WebSocket.
- **FR-006**: Frontend MUST provide a UI for filtering and searching logs within the task details view.
- **FR-007**: System MUST maintain backward compatibility for plugins using the old `execute` signature.
- **FR-008**: System MUST automatically delete logs when their associated task is deleted.
- **FR-009**: Task logs MUST follow the same retention policy as task records (logs are kept as long as the task record exists).
- **FR-010**: The default log level filter in the UI MUST be set to INFO and above (hiding DEBUG logs by default).
- **FR-011**: System MUST support filtering logs by specific keys within the `metadata` JSON (e.g., `dashboard_id`, `database_uuid`).
## Clarifications
### Session 2026-02-07
- Q: Should task logs follow the same retention period as the task records themselves? → A: Same as Task Records.
- Q: What should be the default log level filter when a user opens the task log panel? → A: INFO and above.
- Q: Do we need to support searching or filtering inside the JSON metadata? → A: Searchable keys (e.g., `dashboard_id` should be filterable).
### Key Entities *(include if feature involves data)*
- **TaskLog**: Represents a single log message.
- Attributes: `id` (PK), `task_id` (FK), `timestamp`, `level` (Enum/String), `source` (String), `message` (Text), `metadata` (JSON).
- **TaskContext**: Execution context provided to plugins.
- Attributes: `task_id`, `logger`, `config`.
## Success Criteria *(mandatory)*
### Measurable Outcomes
- **SC-001**: Log retrieval for a task with 1,000 entries takes less than 200ms.
- **SC-002**: Database write overhead for logging does not increase task execution time by more than 5%.
- **SC-003**: 100% of logs generated by a task are available after a server restart.
- **SC-004**: Users can filter logs by source and see results in under 100ms on the frontend.


@@ -0,0 +1,142 @@
# Tasks: Task Logging System (Separated Per-Task Logs)
**Input**: Design documents from `/specs/018-task-logging-v2/`
**Prerequisites**: plan.md (required), spec.md (required for user stories), ux_reference.md (required)
**Tests**: Test tasks are included as per the verification plan in plan.md.
**Organization**: Tasks are grouped by user story to enable independent implementation and testing of each story.
## Format: `[ID] [P?] [Story] Description`
- **[P]**: Can run in parallel (different files, no dependencies)
- **[Story]**: Which user story this task belongs to (e.g., US1, US2, US3)
- Include exact file paths in descriptions
## Path Conventions
- **Web app**: `backend/src/`, `frontend/src/`
---
## Phase 1: Setup (Shared Infrastructure)
**Purpose**: Project initialization and basic structure
- [ ] T001 Create database migration for `task_logs` table in `backend/src/models/task.py`
- [ ] T002 [P] Define `LogEntry` and `TaskLog` schemas in `backend/src/core/task_manager/models.py`
- [ ] T003 [P] Create `TaskLogger` class in `backend/src/core/task_manager/task_logger.py`
- [ ] T004 [P] Create `TaskContext` class in `backend/src/core/task_manager/context.py`
---
## Phase 2: Foundational (Blocking Prerequisites)
**Purpose**: Core infrastructure that MUST be complete before ANY user story can be implemented
**⚠️ CRITICAL**: No user story work can begin until this phase is complete
- [ ] T005 Implement `TaskLogPersistenceService` in `backend/src/core/task_manager/persistence.py`
- [ ] T006 Update `TaskManager` to include log buffer and flusher thread in `backend/src/core/task_manager/manager.py`
- [ ] T007 Implement `_flush_logs` and `_add_log` (new signature) in `backend/src/core/task_manager/manager.py`
- [ ] T008 Update `_run_task` to support `TaskContext` and backward compatibility in `backend/src/core/task_manager/manager.py`
- [ ] T009 [P] Update `TaskCleanupService` to delete logs in `backend/src/core/task_manager/cleanup.py`
**Checkpoint**: Foundation ready - user story implementation can now begin in parallel
---
## Phase 3: User Story 2 - Persistent Log History (Priority: P1) 🎯 MVP
**Goal**: Ensure logs are saved to the database and accessible after server restart.
**Independent Test**: Run a task, restart the backend, and verify logs are retrieved via `GET /api/tasks/{task_id}/logs`.
### Implementation for User Story 2
- [ ] T010 [P] [US2] Implement `GET /api/tasks/{task_id}/logs` endpoint in `backend/src/api/routes/tasks.py`
- [ ] T011 [P] [US2] Implement `GET /api/tasks/{task_id}/logs/stats` and `/sources` in `backend/src/api/routes/tasks.py`
- [ ] T012 [US2] Update `get_task_logs` in `TaskManager` to fetch from persistence for completed tasks in `backend/src/core/task_manager/manager.py`
- [ ] T013 [US2] Verify implementation matches ux_reference.md (Happy Path & Errors)
**Checkpoint**: User Story 2 complete - logs are persistent and accessible via API.
---
## Phase 4: User Story 1 - Real-time Filtered Logging (Priority: P1)
**Goal**: Real-time log streaming with server-side filtering.
**Independent Test**: Connect to WebSocket with `?source=plugin` and verify only plugin logs are received.
### Implementation for User Story 1
- [ ] T014 [US1] Update WebSocket endpoint in `backend/src/app.py` to support `source` and `level` query parameters
- [ ] T015 [US1] Implement server-side filtering logic for WebSocket broadcast in `backend/src/core/task_manager/manager.py`
- [ ] T016 [P] [US1] Create `LogFilterBar` component in `frontend/src/components/tasks/LogFilterBar.svelte`
- [ ] T017 [P] [US1] Create `LogEntryRow` component in `frontend/src/components/tasks/LogEntryRow.svelte`
- [ ] T018 [US1] Create `TaskLogPanel` component in `frontend/src/components/tasks/TaskLogPanel.svelte`
- [ ] T019 [US1] Refactor `TaskLogViewer` to use `TaskLogPanel` in `frontend/src/components/TaskLogViewer.svelte`
- [ ] T020 [US1] Update `TaskRunner` to pass filter parameters to WebSocket in `frontend/src/components/TaskRunner.svelte`
- [ ] T021 [US1] Verify implementation matches ux_reference.md (Happy Path & Errors)
**Checkpoint**: User Story 1 complete - real-time filtered logging is functional.
---
## Phase 5: User Story 3 - Component Attribution (Priority: P2)
**Goal**: Migrate plugins to use the new `TaskContext` and `TaskLogger`.
**Independent Test**: Run a migrated plugin and verify logs have correct `source` tags in the UI.
### Implementation for User Story 3
- [ ] T022 [P] [US3] Migrate `BackupPlugin` to use `TaskContext` in `backend/src/plugins/backup.py`
- [ ] T023 [P] [US3] Migrate `MigrationPlugin` to use `TaskContext` in `backend/src/plugins/migration.py`
- [ ] T024 [P] [US3] Migrate `GitPlugin` to use `TaskContext` in `backend/src/plugins/git_plugin.py`
- [ ] T025 [US3] Verify implementation matches ux_reference.md (Happy Path & Errors)
---
## Phase 6: Polish & Cross-Cutting Concerns
**Purpose**: Improvements that affect multiple user stories
- [ ] T026 [P] Add unit tests for `LogPersistenceService` in `backend/tests/test_log_persistence.py`
- [ ] T027 [P] Add unit tests for `TaskLogger` and `TaskContext` in `backend/tests/test_task_logger.py`
- [ ] T028 [P] Update `docs/plugin_dev.md` with new logging instructions
- [ ] T029 Final verification of all success criteria (SC-001 to SC-004)
---
## Dependencies & Execution Order
### Phase Dependencies
- **Setup (Phase 1)**: No dependencies.
- **Foundational (Phase 2)**: Depends on Phase 1.
- **User Stories (Phase 3-5)**: Depend on Phase 2. US2 (Persistence) is prioritized as it's the foundation for historical logs.
- **Polish (Phase 6)**: Depends on all user stories.
### Parallel Opportunities
- T002, T003, T004 can run in parallel.
- T010, T011 can run in parallel.
- T016, T017 can run in parallel.
- Plugin migrations (T022-T024) can run in parallel.
---
## Implementation Strategy
### MVP First (User Story 2)
1. Complete Setup & Foundational phases.
2. Implement US2 (Persistence & API).
3. **STOP and VALIDATE**: Verify logs are saved and retrieved after restart.
### Incremental Delivery
1. Add US1 (Real-time & UI) → Test filtering.
2. Add US3 (Plugin Migration) → Test source attribution.


@@ -0,0 +1,51 @@
# UX Reference: Task Logging System (Separated Per-Task Logs)
**Feature Branch**: `018-task-logging-v2`
**Created**: 2026-02-07
**Status**: Draft
## 1. User Persona & Context
* **Who is the user?**: System Administrator / DevOps Engineer / Developer.
* **What is their goal?**: Monitor task execution in real-time, debug failures by isolating logs from specific components (e.g., Superset API vs. Git operations), and review historical logs of completed tasks.
* **Context**: Using the web dashboard to run migrations, backups, or LLM analysis.
## 2. The "Happy Path" Narrative
The user starts a migration task and immediately sees the log panel. As the task progresses, logs appear with clear color-coded tags indicating their source (e.g., `[superset_api]`, `[git]`). The user notices a warning from the `superset_api` source, clicks the "Source" filter, and selects "superset_api" to isolate those messages. The view instantly updates to show only API-related logs, allowing the user to quickly identify a rate-limiting warning. Once the task finishes, the user refreshes the page, and the logs are still there, fully searchable and filterable.
## 3. Interface Mockups
### UI Layout & Flow
**Screen/Component**: TaskLogPanel (within Task Details)
* **Layout**: A header with filter controls, followed by a scrollable log area.
* **Key Elements**:
* **Level Filter**: Dropdown (All, Info, Warning, Error, Debug).
* **Source Filter**: Dropdown (All, plugin, superset_api, storage, etc.).
* **Search Input**: Text field with "Search logs..." placeholder.
* **Log Area**: Monospace font, alternating row backgrounds.
* **Progress Bar**: Inline bar appearing within a log row when a component reports progress.
* **States**:
* **Default**: Shows all logs for the current task.
* **Filtering**: Log list updates as filters are changed (live for running tasks via WS).
* **Auto-scroll**: Enabled by default; a "Jump to Bottom" button appears if the user scrolls up.
## 4. The "Error" Experience
### Scenario A: Log Loading Failure
* **User Action**: Opens a historical task with thousands of logs.
* **System Response**: A small inline notification: "Failed to load full log history. [Retry]".
* **Recovery**: Clicking "Retry" re-triggers the REST API call.
### Scenario B: WebSocket Disconnect
* **System Response**: The log panel shows a "Reconnecting..." status indicator.
* **Recovery**: System automatically reconnects; once back, it fetches missing logs since the last received timestamp.
## 5. Tone & Voice
* **Style**: Technical, precise, and informative.
* **Terminology**: Use "Source" for component identification, "Level" for severity, and "Persistence" for saved logs.


@@ -1331,9 +1331,15 @@
- ƒ **get_tasks** (`Function`)
- 📝 Retrieves tasks with pagination and optional status filter.
- ƒ **get_task_logs** (`Function`)
- 📝 Retrieves logs for a specific task.
- 📝 Retrieves logs for a specific task with filtering and pagination.
- ƒ **get_task_log_stats** (`Function`)
- 📝 Returns log statistics (counts by level/source) for a task.
- ƒ **get_task_log_sources** (`Function`)
- 📝 Returns unique log sources for a task.
- ƒ **_add_log** (`Function`)
- 📝 Adds a log entry to a task and notifies subscribers.
- 📝 Adds a log entry to a task, buffers for persistence, and notifies subscribers.
- ƒ **_flush_logs** (`Function`)
- 📝 Flushes buffered logs to the database.
- ƒ **subscribe_logs** (`Function`)
- 📝 Subscribes to real-time logs for a task.
- ƒ **unsubscribe_logs** (`Function`)
@@ -1346,6 +1352,30 @@
- 📝 Resume a task that is awaiting input with provided passwords.
- ƒ **clear_tasks** (`Function`)
- 📝 Clears tasks based on status filter.
- 📦 **TaskLogPersistenceModule** (`Module`)
- 📝 Handles the persistence of task logs in a dedicated database table.
- 🏗️ Layer: Core
- **TaskLogPersistenceService** (`Class`)
- 📝 Provides CRUD operations for task logs.
- ƒ **save_log** (`Function`)
- 📝 Saves a single log entry.
- ƒ **bulk_save** (`Function`)
- 📝 Performs batch insertion of log entries.
- ƒ **get_logs** (`Function`)
- 📝 Retrieves logs with filtering and pagination.
- ƒ **get_stats** (`Function`)
- 📝 Returns log counts by level and source.
- ƒ **delete_logs** (`Function`)
- 📝 Deletes all logs for a specific task.
- 📦 **TaskContextModule** (`Module`)
- 📝 Provides execution context and logging utilities to plugins.
- 🏗️ Layer: Core
- **TaskLogger** (`Class`)
- 📝 Per-task logger with source attribution.
- ƒ **with_source** (`Function`)
- 📝 Returns a new logger instance with a different source.
- **TaskContext** (`Class`)
- 📝 Container for task-specific resources (logger, config, etc.).
- 📦 **TaskManagerModels** (`Module`)
- 📝 Defines the data models and enumerations used by the Task Manager.
- 🏗️ Layer: Core