Compare commits

**Range**: `012-remove...76b98fcf8f` — 35 commits:

76b98fcf8f, 794cc55fe7, 235b0e3c9f, e6087bd3c1, 0f16bab2b8, 7de96c17c4, f018b97ed2, 72846aa835, 994c0c3e5d, 252a8601a9, 8044f85ea4, d4109e5a03, b2bbd73439, 0e0e26e2f7, 18b42f8dd0, e7b31accd6, d3c3a80ed2, cc244c2d86, d10c23e658, 1042b35d1b, 16ffeb1ed6, da34deac02, 51e9ee3fcc, edf9286071, a542e7d2df, a863807cf2, e2bc68683f, 43cb82697b, 4ba28cf93e, 343f2e29f5, c9a53578fd, 07ec2d9797, e9d3f3c827, 26ba015b75, 49129d3e86

**.gitignore** (vendored) — 6 changes

```diff
@@ -59,9 +59,13 @@ Thumbs.db
 *.ps1
 keyring passwords.py
 *github*
-*git*
 *tech_spec*
 dashboards
 backend/mappings.db
 
+backend/tasks.db
+backend/logs
+backend/auth.db
+semantics/reports
```

*(auto-generated agent context file)*

```diff
@@ -22,6 +22,19 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
 - SQLite (for job history/results, connection configs), Filesystem (for temporary file uploads) (010-refactor-cli-to-web)
 - Python 3.9+ + FastAPI, Pydantic, requests, pyyaml (migrated from superset_tool) (012-remove-superset-tool)
 - SQLite (tasks.db, migrations.db), Filesystem (012-remove-superset-tool)
+- Filesystem (local git repo), SQLite (for GitServerConfig, Environment) (011-git-integration-dashboard)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API (011-git-integration-dashboard)
+- SQLite (for config/history), Filesystem (local Git repositories) (011-git-integration-dashboard)
+- Node.js 18+ (Frontend Build), Svelte 5.x + SvelteKit, Tailwind CSS, `date-fns` (existing) (013-unify-frontend-css)
+- LocalStorage (for language preference) (013-unify-frontend-css)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit (Frontend) (014-file-storage-ui)
+- Local Filesystem (for artifacts), Config (for storage path) (014-file-storage-ui)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend) (015-frontend-nav-redesign)
+- N/A (UI reorganization and API integration) (015-frontend-nav-redesign)
+- SQLite (`auth.db`) for Users, Roles, Permissions, and Mappings. (016-multi-user-auth)
+- SQLite (existing `tasks.db` for results, `auth.db` for permissions, `mappings.db` or new `plugins.db` for provider config/metadata) (017-llm-analysis-plugin)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, SQLAlchemy, WebSocket (existing) (019-superset-ux-redesign)
+- SQLite (tasks.db, auth.db, migrations.db) - no new database tables required (019-superset-ux-redesign)
 - Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
 
@@ -42,9 +55,9 @@ cd src; pytest; ruff check .
 Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
 
 ## Recent Changes
-- 012-remove-superset-tool: Added Python 3.9+ + FastAPI, Pydantic, requests, pyyaml (migrated from superset_tool)
-- 010-refactor-cli-to-web: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, `superset_tool` (internal lib)
-- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
+- 019-superset-ux-redesign: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, SQLAlchemy, WebSocket (existing)
+- 017-llm-analysis-plugin: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
+- 016-multi-user-auth: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
 
 
 <!-- MANUAL ADDITIONS START -->
```

**.kilocode/workflows/read_semantic.md** — new file (4 lines)

```diff
@@ -0,0 +1,4 @@
+---
+description: USE SEMANTIC
+---
+Read semantic_protocol.md. It is MANDATORY to use it during development.
```

*(workflow file: context loading rules)*

```diff
@@ -63,6 +63,7 @@ Load only the minimal necessary context from each artifact:
 **From constitution:**
 
 - Load `.specify/memory/constitution.md` for principle validation
+- Load `semantic_protocol.md` for technical standard validation
 
 ### 3. Build Semantic Models
 
```

*(workflow file: implementation execution)*

```diff
@@ -1,5 +1,10 @@
 ---
 description: Execute the implementation plan by processing and executing all tasks defined in tasks.md
+handoffs:
+  - label: Verify Changes
+    agent: speckit.test
+    prompt: Verify the implementation of...
+    send: true
 ---
 
 ## User Input
@@ -46,6 +51,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 - Automatically proceed to step 3
 
 3. Load and analyze the implementation context:
+   - **REQUIRED**: Read `semantic_protocol.md` for strict coding standards and contract requirements
    - **REQUIRED**: Read tasks.md for the complete task list and execution plan
    - **REQUIRED**: Read plan.md for tech stack, architecture, and file structure
    - **IF EXISTS**: Read data-model.md for entities and relationships
@@ -111,6 +117,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 - **Validation checkpoints**: Verify each phase completion before proceeding
 
 7. Implementation execution rules:
+   - **Strict Adherence**: Apply `semantic_protocol.md` rules - every file must start with [DEF] header, include @TIER, and define contracts
    - **Setup first**: Initialize project structure, dependencies, configuration
    - **Tests before code**: If you need to write tests for contracts, entities, and integration scenarios
    - **Core development**: Implement models, services, CLI commands, endpoints
```

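The rule added above requires every source file to open with a `[DEF]` header, a `@TIER` tag, and contracts. As a rough illustration only — the authoritative syntax lives in `semantic_protocol.md`, which is not part of this diff, and the module name, tier value, and contract wording below are invented for the example — such a header might look like:

```python
# [DEF:backup_service:Module]
# @TIER: core                 # tier label assumed for the example; the protocol defines the real vocabulary
# @PURPOSE: Export dashboards from a configured Superset environment to local storage.
# @RELATION: USES -> ConfigManager
# @RELATION: USES -> TaskManager
# [/DEF]

# [DEF:run_backup:Function]
# @PURPOSE: Run one backup cycle for a single environment.
# @PRE: environment_id resolves to an environment known to ConfigManager.
# @POST: a backup archive exists on disk and its path is returned.
# [/DEF]
def run_backup(environment_id: str) -> str:
    ...
```
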
*(workflow file: planning)*

```diff
@@ -22,7 +22,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 
 1. **Setup**: Run `.specify/scripts/bash/setup-plan.sh --json` from repo root and parse JSON for FEATURE_SPEC, IMPL_PLAN, SPECS_DIR, BRANCH. For single quotes in args like "I'm Groot", use escape syntax: e.g 'I'\''m Groot' (or double-quote if possible: "I'm Groot").
 
-2. **Load context**: Read FEATURE_SPEC and `.specify/memory/constitution.md`. Load IMPL_PLAN template (already copied).
+2. **Load context**: Read FEATURE_SPEC, `ux_reference.md`, `semantic_protocol.md` and `.specify/memory/constitution.md`. Load IMPL_PLAN template (already copied).
 
 3. **Execute plan workflow**: Follow the structure in IMPL_PLAN template to:
    - Fill Technical Context (mark unknowns as "NEEDS CLARIFICATION")
@@ -64,17 +64,30 @@ You **MUST** consider the user input before proceeding (if not empty).
 
 **Prerequisites:** `research.md` complete
 
+0. **Validate Design against UX Reference**:
+   - Check if the proposed architecture supports the latency, interactivity, and flow defined in `ux_reference.md`.
+   - **Linkage**: Ensure key UI states from `ux_reference.md` map to Component Contracts (`@UX_STATE`).
+   - **CRITICAL**: If the technical plan compromises the UX (e.g. "We can't do real-time validation"), you **MUST STOP** and warn the user.
+
 1. **Extract entities from feature spec** → `data-model.md`:
-   - Entity name, fields, relationships
-   - Validation rules from requirements
-   - State transitions if applicable
+   - Entity name, fields, relationships, validation rules.
 
-2. **Generate API contracts** from functional requirements:
-   - For each user action → endpoint
-   - Use standard REST/GraphQL patterns
-   - Output OpenAPI/GraphQL schema to `/contracts/`
+2. **Design & Verify Contracts (Semantic Protocol)**:
+   - **Drafting**: Define [DEF] Headers and Contracts for all new modules based on `semantic_protocol.md`.
+   - **Self-Review**:
+     - *Completeness*: Do `@PRE`/`@POST` cover edge cases identified in Research?
+     - *Connectivity*: Do `@RELATION` tags form a coherent graph?
+     - *Compliance*: Does syntax match `[DEF:id:Type]` exactly?
+   - **Output**: Write verified contracts to `contracts/modules.md`.
 
-3. **Agent context update**:
+3. **Simulate Contract Usage**:
+   - Trace one key user scenario through the defined contracts to ensure data flow continuity.
+   - If a contract interface mismatch is found, fix it immediately.
+
+4. **Generate API contracts**:
+   - Output OpenAPI/GraphQL schema to `/contracts/` for backend-frontend sync.
+
+5. **Agent context update**:
    - Run `.specify/scripts/bash/update-agent-context.sh kilocode`
    - These scripts detect which AI agent is in use
    - Update the appropriate agent-specific context file
```

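The self-review step above asks whether `@PRE`/`@POST` cover the researched edge cases and whether `@RELATION` tags form a coherent graph. A minimal sketch of what that could look like, with invented function names and under the assumption that contracts are written as tagged comments in the style used elsewhere in this changeset:

```python
# [DEF:map_dataset:Function]
# @PURPOSE: Map a source dataset id to its counterpart in the target environment.
# @PRE: source_id > 0 and the mapping table has been loaded (covers the "unknown dataset" edge case).
# @POST: returns the target dataset id, or raises KeyError if no mapping exists.
# @RELATION: CALLS -> load_mappings
# [/DEF]
def map_dataset(source_id: int, mappings: dict[int, int]) -> int:
    return mappings[source_id]

# [DEF:load_mappings:Function]
# @PURPOSE: Load the dataset mapping table.
# @POST: returns a possibly empty dict of source id -> target id.
# [/DEF]
def load_mappings() -> dict[int, int]:
    return {}
```

Here the `@RELATION: CALLS` edge connects the two definitions; that is the kind of connectivity the self-review checks for.
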
*(workflow file: specification)*

```diff
@@ -70,7 +70,22 @@ Given that feature description, do this:
 
 3. Load `.specify/templates/spec-template.md` to understand required sections.
 
-4. Follow this execution flow:
+4. **Generate UX Reference**:
+
+   a. Load `.specify/templates/ux-reference-template.md`.
+
+   b. **Design the User Experience**:
+      - **Imagine you are the user**: Visualize the interface and interaction.
+      - **Persona**: Define who is using this.
+      - **Happy Path**: Write the story of the perfect interaction.
+      - **Mockups**: Create concrete CLI text blocks or UI descriptions.
+      - **Errors**: Define how the system guides the user out of failure.
+
+   c. Write the `ux_reference.md` file in the feature directory.
+
+   d. **CRITICAL**: This UX Reference is now the source of truth for the "feel" of the feature. The technical spec MUST support this experience.
+
+5. Follow this execution flow:
 
     1. Parse user description from Input
        If empty: ERROR "No feature description provided"
@@ -116,6 +131,12 @@ Given that feature description, do this:
 - [ ] Written for non-technical stakeholders
 - [ ] All mandatory sections completed
 
+## UX Consistency
+
+- [ ] Functional requirements fully support the 'Happy Path' in ux_reference.md
+- [ ] Error handling requirements match the 'Error Experience' in ux_reference.md
+- [ ] No requirements contradict the defined User Persona or Context
+
 ## Requirement Completeness
 
 - [ ] No [NEEDS CLARIFICATION] markers remain
@@ -190,7 +211,7 @@ Given that feature description, do this:
 
    d. **Update Checklist**: After each validation iteration, update the checklist file with current pass/fail status
 
-7. Report completion with branch name, spec file path, checklist results, and readiness for the next phase (`/speckit.clarify` or `/speckit.plan`).
+7. Report completion with branch name, spec file path, ux_reference file path, checklist results, and readiness for the next phase (`/speckit.clarify` or `/speckit.plan`).
 
 **NOTE:** The script creates and checks out the new branch and initializes the spec file before writing.
 
```

*(workflow file: task generation)*

```diff
@@ -24,7 +24,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 1. **Setup**: Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse FEATURE_DIR and AVAILABLE_DOCS list. All paths must be absolute. For single quotes in args like "I'm Groot", use escape syntax: e.g 'I'\''m Groot' (or double-quote if possible: "I'm Groot").
 
 2. **Load design documents**: Read from FEATURE_DIR:
-   - **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities)
+   - **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities), ux_reference.md (experience source of truth)
    - **Optional**: data-model.md (entities), contracts/ (API endpoints), research.md (decisions), quickstart.md (test scenarios)
    - Note: Not all projects have all documents. Generate tasks based on what's available.
 
@@ -70,6 +70,12 @@ The tasks.md should be immediately executable - each task must be specific enough
 
 **Tests are OPTIONAL**: Only generate test tasks if explicitly requested in the feature specification or if user requests TDD approach.
 
+### UX Preservation (CRITICAL)
+
+- **Source of Truth**: `ux_reference.md` is the absolute standard for the "feel" of the feature.
+- **Violation Warning**: If any task would inherently violate the UX (e.g. "Remove progress bar to simplify code"), you **MUST** flag this to the user immediately.
+- **Verification Task**: You **MUST** add a specific task at the end of each User Story phase: `- [ ] Txxx [USx] Verify implementation matches ux_reference.md (Happy Path & Errors)`
+
 ### Checklist Format (REQUIRED)
 
 Every task MUST strictly follow this format:
```

**.kilocode/workflows/speckit.test.md** — new file (116 lines)

````markdown
№ **speckit.tasks.md**
### Modified Workflow

```markdown
description: Generate an actionable, dependency-ordered tasks.md for the feature based on available design artifacts.
handoffs:
  - label: Analyze For Consistency
    agent: speckit.analyze
    prompt: Run a project analysis for consistency
    send: true
  - label: Implement Project
    agent: speckit.implement
    prompt: Start the implementation in phases
    send: true
---

## User Input

```text
$ARGUMENTS
```

You **MUST** consider the user input before proceeding (if not empty).

## Outline

1. **Setup**: Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse FEATURE_DIR and AVAILABLE_DOCS list. All paths must be absolute.

2. **Load design documents**: Read from FEATURE_DIR:
   - **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities), ux_reference.md (experience source of truth)
   - **Optional**: data-model.md (entities), contracts/ (API endpoints), research.md (decisions)

3. **Execute task generation workflow**:
   - **Architecture Analysis (CRITICAL)**: Scan existing codebase for patterns (DI, Auth, ORM).
   - Load plan.md/spec.md.
   - Generate tasks organized by user story.
   - **Apply Fractal Co-location**: Ensure all unit tests are mapped to `__tests__` subdirectories relative to the code.
   - Validate task completeness.

4. **Generate tasks.md**: Use `.specify/templates/tasks-template.md` as structure.
   - Phase 1: Context & Setup.
   - Phase 2: Foundational tasks.
   - Phase 3+: User Stories (Priority order).
   - Final Phase: Polish.
   - **Strict Constraint**: Ensure tasks follow the Co-location and Mocking rules below.

5. **Report**: Output path to generated tasks.md and summary.

Context for task generation: $ARGUMENTS

## Task Generation Rules

**CRITICAL**: Tasks MUST be actionable, specific, architecture-aware, and context-local.

### Implementation & Testing Constraints (ANTI-LOOP & CO-LOCATION)

To prevent infinite debugging loops and context fragmentation, apply these rules:

1. **Fractal Co-location Strategy (MANDATORY)**:
   - **Rule**: Unit tests MUST live next to the code they verify.
   - **Forbidden**: Do NOT create unit tests in root `tests/` or `backend/tests/`. Those are for E2E/Integration only.
   - **Pattern (Python)**:
     - Source: `src/domain/order/processing.py`
     - Test Task: `Create tests in src/domain/order/__tests__/test_processing.py`
   - **Pattern (Frontend)**:
     - Source: `src/lib/components/UserCard.svelte`
     - Test Task: `Create tests in src/lib/components/__tests__/UserCard.test.ts`

2. **Semantic Relations**:
   - Test generation tasks must explicitly instruct to add the relation header: `# @RELATION: VERIFIES -> [TargetComponent]`

3. **Strict Mocking for Unit Tests**:
   - Any task creating Unit Tests MUST specify: *"Use `unittest.mock.MagicMock` for heavy dependencies (DB sessions, Auth). Do NOT instantiate real service classes."*

4. **Schema/Model Separation**:
   - Explicitly separate tasks for ORM Models (SQLAlchemy) and Pydantic Schemas.

### UX Preservation (CRITICAL)

- **Source of Truth**: `ux_reference.md` is the absolute standard.
- **Verification Task**: You **MUST** add a specific task at the end of each User Story phase: `- [ ] Txxx [USx] Verify implementation matches ux_reference.md (Happy Path & Errors)`

### Checklist Format (REQUIRED)

Every task MUST strictly follow this format:

```text
- [ ] [TaskID] [P?] [Story?] Description with file path
```

**Examples**:

- ✅ `- [ ] T005 [US1] Create unit tests for OrderService in src/services/__tests__/test_order.py (Mock DB)`
- ✅ `- [ ] T006 [US1] Implement OrderService in src/services/order.py`
- ❌ `- [ ] T005 [US1] Create tests in backend/tests/test_order.py` (VIOLATION: Wrong location)

### Task Organization & Phase Structure

**Phase 1: Context & Setup**
- **Goal**: Prepare environment and understand existing patterns.
- **Mandatory Task**: `- [ ] T001 Analyze existing project structure, auth patterns, and `conftest.py` location`

**Phase 2: Foundational (Data & Core)**
- Database Models (ORM).
- Pydantic Schemas (DTOs).
- Core Service interfaces.

**Phase 3+: User Stories (Iterative)**
- **Step 1: Isolation Tests (Co-located)**:
  - `- [ ] Txxx [USx] Create unit tests for [Component] in [Path]/__tests__/test_[name].py`
  - *Note: Specify using MagicMock for external deps.*
- **Step 2: Implementation**: Services -> Endpoints.
- **Step 3: Integration**: Wire up real dependencies (if E2E tests requested).
- **Step 4: UX Verification**.

**Final Phase: Polish**
- Linting, formatting, final manual verify.
````

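The co-location and mocking rules above ask unit tests to sit next to the code they verify, carry a `VERIFIES` relation header, and mock heavy dependencies rather than instantiate real services. A minimal sketch of such a test — the `OrderService` class and its `create_order` method are built around the examples above, not taken from real project code:

```python
# src/services/__tests__/test_order.py
# @RELATION: VERIFIES -> OrderService
from unittest.mock import MagicMock

from src.services.order import OrderService  # hypothetical module from the examples above


def test_create_order_persists_via_session():
    # Heavy dependencies (DB session, auth) are mocked, per the strict-mocking rule.
    fake_session = MagicMock()
    service = OrderService(session=fake_session)

    service.create_order({"id": 1})

    # The unit test only asserts the interaction with the mocked dependency.
    fake_session.add.assert_called_once()
```
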
*(custom modes configuration)*

```diff
@@ -13,20 +13,6 @@ customModes:
       - browser
       - mcp
     customInstructions: 1. Always begin by loading the relevant plan or task list from the `specs/` directory. 2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit. 3. When updating task lists, ensure you only mark items as complete if you have verified them.
-  - slug: product-manager
-    name: Product Manager
-    description: Executes SpecKit workflows for feature management
-    roleDefinition: |-
-      You are Kilo Code, acting as a Product Manager. Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.
-      You act as the orchestrator for: - Specification (`speckit.specify`, `speckit.clarify`) - Planning (`speckit.plan`) - Task Management (`speckit.tasks`, `speckit.taskstoissues`) - Quality Assurance (`speckit.analyze`, `speckit.checklist`) - Governance (`speckit.constitution`) - Implementation Oversight (`speckit.implement`)
-      For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
-    whenToUse: Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
-    groups:
-      - read
-      - edit
-      - command
-      - mcp
-    customInstructions: 1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command. 2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
   - slug: semantic
     name: Semantic Agent
     roleDefinition: |-
@@ -43,3 +29,18 @@ customModes:
       - browser
       - mcp
     source: project
+  - slug: product-manager
+    name: Product Manager
+    roleDefinition: |-
+      Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.
+      You act as the orchestrator for: - Specification (`speckit.specify`, `speckit.clarify`) - Planning (`speckit.plan`) - Task Management (`speckit.tasks`, `speckit.taskstoissues`) - Quality Assurance (`speckit.analyze`, `speckit.checklist`) - Governance (`speckit.constitution`) - Implementation Oversight (`speckit.implement`)
+      For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
+    whenToUse: Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
+    description: Executes SpecKit workflows for feature management
+    customInstructions: 1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command. 2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
+    groups:
+      - read
+      - edit
+      - command
+      - mcp
+    source: project
```

*(project constitution)*

```diff
@@ -1,11 +1,11 @@
 <!--
 SYNC IMPACT REPORT
-Version: 1.7.1 (Simplified Workflow)
+Version: 2.2.0 (ConfigManager Discipline)
 Changes:
-- Simplified Generation Workflow to a single phase: Code Generation from `tasks.md`.
-- Removed multi-phase Architecture/Implementation split to streamline development.
+- Updated Principle II: Added mandatory requirement for using `ConfigManager` (via dependency injection) for all configuration access to ensure consistent environment handling and avoid hardcoded values.
+- Updated Principle III: Refined `requestApi` requirement.
 Templates Status:
-- .specify/templates/plan-template.md: ✅ Aligned (Dynamic check).
+- .specify/templates/plan-template.md: ✅ Aligned.
 - .specify/templates/spec-template.md: ✅ Aligned.
 - .specify/templates/tasks-template.md: ✅ Aligned.
 -->
@@ -14,54 +14,42 @@ Templates Status:
 ## Core Principles
 
 ### I. Semantic Protocol Compliance
-The file `semantic_protocol.md` is the **authoritative technical standard** for this project. All code generation, refactoring, and architecture must strictly adhere to the standards, syntax, and workflows defined therein.
-- **Syntax**: `[DEF]` anchors, `@RELATION` tags, and metadata must match the Protocol specification.
-- **Structure**: File layouts and headers must follow the "File Structure Standard".
-- **Workflow**: The technical steps for generating code must align with the Protocol.
+The file `semantic_protocol.md` is the **sole and authoritative technical standard** for this project.
+- **Law**: All code must adhere to the Axioms (Meaning First, Contract First, etc.) defined in the Protocol.
+- **Syntax & Structure**: Anchors (`[DEF]`), Tags (`@KEY`), and File Structures must strictly match the Protocol.
+- **Compliance**: Any deviation from `semantic_protocol.md` constitutes a build failure.
 
-### II. Causal Validity (Contracts First)
-As defined in the Protocol, Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic.
+### II. Everything is a Plugin & Centralized Config
+All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`.
+- **Modularity**: Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`.
+- **Configuration Discipline**: All configuration access (environments, settings, paths) MUST use the `ConfigManager`. In the backend, the singleton instance MUST be obtained via dependency injection (`get_config_manager()`). Hardcoding environment IDs (e.g., "1") or paths is STRICTLY FORBIDDEN.
 
-### III. Immutability of Architecture
-Architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.
+### III. Unified Frontend Experience
+To ensure a consistent and accessible user experience, all frontend implementations must strictly adhere to the unified design and localization standards.
+- **Component Reusability**: All UI elements MUST utilize the standardized Svelte component library (`src/lib/ui`) and centralized design tokens.
+- **Internationalization (i18n)**: All user-facing text MUST be extracted to the translation system (`src/lib/i18n`).
+- **Backend Communication**: All API requests MUST use the `requestApi` wrapper (or its derivatives like `fetchApi`, `postApi`) from `src/lib/api.js`. Direct use of the native `fetch` API for backend communication is FORBIDDEN to ensure consistent authentication (JWT) and error handling.
 
-### IV. Design by Contract (DbC)
-Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints in the metadata block before implementation, strictly following the **Contracts (Section IV)** standard in `semantic_protocol.md`.
+### IV. Security & Access Control
+To support the Role-Based Access Control (RBAC) system, all functional components must define explicit permissions.
+- **Granular Permissions**: Every Plugin MUST define a unique permission string (e.g., `plugin:name:execute`) required for its operation.
+- **Registration**: These permissions MUST be registered in the system database (`auth.db`) during initialization.
 
-### V. Belief State Logging
-Agents must maintain belief state logs for debugging and coherence checks, strictly following the **Logging Standard (Section V)** defined in `semantic_protocol.md`.
+### V. Independent Testability
+Every feature specification MUST define "Independent Tests" that allow the feature to be verified in isolation.
+- **Decoupling**: Features should be designed such that they can be tested without requiring the full application state or external dependencies where possible.
+- **Verification**: A feature is not complete until its Independent Test scenarios pass.
 
-### VI. Fractal Complexity Limit
-To maintain semantic coherence, code must adhere to the complexity limits (Module/Function size) defined in the **Fractal Complexity Limit (Section VI)** of `semantic_protocol.md`.
+### VI. Asynchronous Execution
+All long-running or resource-intensive operations (migrations, analysis, backups, external API calls) MUST be executed as asynchronous tasks via the `TaskManager`.
+- **Non-Blocking**: HTTP API endpoints MUST NOT block on these operations; they should spawn a task and return a Task ID.
+- **Observability**: Tasks MUST emit real-time status updates via the WebSocket infrastructure.
 
-### VII. Everything is a Plugin
-All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`. Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`, consistent logging, and modularity.
-
-## File Structure Standards
-Refer to **Section III (File Structure Standard)** in `semantic_protocol.md` for the authoritative definitions of:
-- Python Module Headers (`.py`)
-- Svelte Component Headers (`.svelte`)
-
-## Generation Workflow
-The development process follows a streamlined single-phase workflow:
-
-### 1. Code Generation Phase (Mode: `code`)
-**Input**: `tasks.md`
-**Responsibility**:
-- Select task from `tasks.md`.
-- Generate Scaffolding (`[DEF]` anchors, Headers, Contracts) AND Implementation in one pass.
-- Ensure strict adherence to Protocol Section IV (Contracts) and Section VII (Generation Workflow).
-- **Output**: Working code with passing tests.
-
-### 2. Validation
-If logic conflicts with Contract -> Stop -> Report Error.
 
 ## Governance
 This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.
 
-- **Authoritative Source**: `semantic_protocol.md` defines the specific implementation rules for these Principles.
-- **Automated Enforcement**: Tools must validate adherence to the `semantic_protocol.md` syntax.
+- **Authoritative Source**: `semantic_protocol.md` defines the specific implementation rules for Principle I.
 - **Amendments**: Changes to core principles require a Constitution amendment. Changes to technical syntax require a Protocol update.
 - **Compliance**: Failure to adhere to the Protocol constitutes a build failure.
 
-**Version**: 1.7.1 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-13
+**Version**: 2.2.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-29
```

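Principle II above requires all configuration access to go through `ConfigManager`, obtained via dependency injection through `get_config_manager()`, with hardcoded environment IDs forbidden. A rough sketch of how a FastAPI endpoint could honour that rule — the import path, the `get_environment` accessor, and the router wiring are assumptions for illustration, not the project's actual code:

```python
from fastapi import APIRouter, Depends

# Assumed import path; the real module layout may differ.
from src.core.config_manager import ConfigManager, get_config_manager

router = APIRouter()


@router.get("/environments/{env_id}")
def read_environment(env_id: str, config: ConfigManager = Depends(get_config_manager)):
    # The environment ID comes from the request, never from a hardcoded literal like "1".
    return config.get_environment(env_id)  # hypothetical accessor on ConfigManager
```
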
*(spec template)*

```diff
@@ -1,6 +1,7 @@
 # Feature Specification: [FEATURE NAME]
 
 **Feature Branch**: `[###-feature-name]`
+**Reference UX**: `[ux_reference.md]` (See specific folder)
 **Created**: [DATE]
 **Status**: Draft
 **Input**: User description: "$ARGUMENTS"
```

*(deleted file: architecture task list template, 35 lines)*

```markdown
---
description: "Architecture task list template (Contracts & Scaffolding)"
---

# Architecture Tasks: [FEATURE NAME]

**Role**: Architect Agent
**Goal**: Define the "What" and "Why" (Contracts, Scaffolding, Models) before implementation.
**Input**: Design documents from `/specs/[###-feature-name]/`
**Output**: Files with `[DEF]` anchors, `@PRE`/`@POST` contracts, and `@RELATION` mappings. No business logic.

## Phase 1: Setup & Models

- [ ] A001 Create/Update data models in [path] with `[DEF]` and contracts
- [ ] A002 Define API route structure/contracts in [path]
- [ ] A003 Define shared utilities/interfaces

## Phase 2: User Story 1 - [Title]

- [ ] A004 [US1] Define contracts for [Component/Service] in [path]
- [ ] A005 [US1] Define contracts for [Endpoint] in [path]
- [ ] A006 [US1] Define contracts for [Frontend Component] in [path]

## Phase 3: User Story 2 - [Title]

- [ ] A007 [US2] Define contracts for [Component/Service] in [path]
- [ ] A008 [US2] Define contracts for [Endpoint] in [path]

## Handover Checklist

- [ ] All new files created with `[DEF]` anchors
- [ ] All functions/classes have `@PURPOSE`, `@PRE`, `@POST` tags
- [ ] No "naked code" (logic outside of anchors)
- [ ] `tasks-dev.md` is ready for the Developer Agent
```

*(deleted file: developer task list template, 35 lines)*

```markdown
---
description: "Developer task list template (Implementation Logic)"
---

# Developer Tasks: [FEATURE NAME]

**Role**: Developer Agent
**Goal**: Implement the "How" (Logic, State, Error Handling) inside the defined contracts.
**Input**: `tasks-arch.md` (completed), Scaffolding files with `[DEF]` anchors.
**Output**: Working code that satisfies `@PRE`/`@POST` conditions.

## Phase 1: Setup & Models

- [ ] D001 Implement logic for [Model] in [path]
- [ ] D002 Implement logic for [API Route] in [path]
- [ ] D003 Implement shared utilities

## Phase 2: User Story 1 - [Title]

- [ ] D004 [US1] Implement logic for [Component/Service] in [path]
- [ ] D005 [US1] Implement logic for [Endpoint] in [path]
- [ ] D006 [US1] Implement logic for [Frontend Component] in [path]
- [ ] D007 [US1] Verify semantic compliance and belief state logging

## Phase 3: User Story 2 - [Title]

- [ ] D008 [US2] Implement logic for [Component/Service] in [path]
- [ ] D009 [US2] Implement logic for [Endpoint] in [path]

## Polish & Quality Assurance

- [ ] DXXX Verify all tests pass
- [ ] DXXX Check error handling and edge cases
- [ ] DXXX Ensure code style compliance
```

**.specify/templates/ux-reference-template.md** — new file (67 lines)

````markdown
# UX Reference: [FEATURE NAME]

**Feature Branch**: `[###-feature-name]`
**Created**: [DATE]
**Status**: Draft

## 1. User Persona & Context

* **Who is the user?**: [e.g. Junior Developer, System Administrator, End User]
* **What is their goal?**: [e.g. Quickly deploy a hotfix, Visualize complex data]
* **Context**: [e.g. Running a command in a terminal on a remote server, Browsing the dashboard on a mobile device]

## 2. The "Happy Path" Narrative

[Write a short story (3-5 sentences) describing the perfect interaction from the user's perspective. Focus on how it *feels* - is it instant? Does it guide them?]

## 3. Interface Mockups

### CLI Interaction (if applicable)

```bash
# User runs this command:
$ command --flag value

# System responds immediately with:
[ spinner ] specific loading message...

# Success output:
✅ Operation completed successfully in 1.2s
- Created file: /path/to/file
- Updated config: /path/to/config
```

### UI Layout & Flow (if applicable)

**Screen/Component**: [Name]

* **Layout**: [Description of structure, e.g., "Two-column layout, left sidebar navigation..."]
* **Key Elements**:
  * **[Button Name]**: Primary action. Color: Blue.
  * **[Input Field]**: Placeholder text: "Enter your name...". Validation: Real-time.
* **States**:
  * **Default**: Clean state, waiting for input.
  * **Loading**: Skeleton loader replaces content area.
  * **Success**: Toast notification appears top-right: "Saved!" (Green).

## 4. The "Error" Experience

**Philosophy**: Don't just report the error; guide the user to the fix.

### Scenario A: [Common Error, e.g. Invalid Input]

* **User Action**: Enters "123" in a text-only field.
* **System Response**:
  * (UI) Input border turns Red. Message below input: "Please enter text only."
  * (CLI) `❌ Error: Invalid input '123'. Expected text format.`
* **Recovery**: User can immediately re-type without refreshing/re-running.

### Scenario B: [System Failure, e.g. Network Timeout]

* **System Response**: "Unable to connect. Retrying in 3s... (Press C to cancel)"
* **Recovery**: Automatic retry or explicit "Retry Now" button.

## 5. Tone & Voice

* **Style**: [e.g. Concise, Technical, Friendly, Verbose]
* **Terminology**: [e.g. Use "Repository" not "Repo", "Directory" not "Folder"]
````

**README.md** — 158 lines changed

````diff
@@ -1,119 +1,77 @@
-# Superset automation tools
+# Superset automation tools (ss-tools)
 
 ## Overview
-This repository contains Python scripts and a library (`superset_tool`) for automating Apache Superset tasks, such as:
-- **Backup**: Exporting all dashboards from a Superset instance to local storage.
-- **Migration**: Transferring and transforming dashboards between different Superset environments (e.g. Development, Sandbox, Production).
+**ss-tools** is a modern platform for automating and managing the Apache Superset ecosystem. The project has evolved from a set of CLI scripts into a full web application with a Backend (FastAPI) + Frontend (SvelteKit) architecture, providing a convenient interface for complex operations.
+
+## Key features
+
+### 🚀 Dashboard migration and management
+- **Dashboard Grid**: Convenient view of all dashboards across all environments (Dev, Sandbox, Prod) in a single interface.
+- **Intelligent mapping**: Automatic and manual matching of datasets, tables, and schemas when transferring between environments.
+- **Dependency checks**: Validation that all required components exist before migration.
+
+### 📦 Backup
+- **Scheduler**: Automatic scheduled backups of dashboards and datasets.
+- **Storage**: Local artifact storage, manageable through the UI.
+
+### 🛠 Git integration
+- **Version Control**: Versioning of Superset assets.
+- **Git Dashboard**: Managing branches, commits, and deployment of changes directly from the interface.
+- **Conflict Resolution**: Built-in tools for resolving conflicts in YAML configurations.
+
+### 🤖 LLM analysis (AI Plugin)
+- **Automatic audit**: Analysis of dashboard state based on screenshots and metadata.
+- **Documentation generation**: Automatic descriptions of datasets and columns using an LLM (OpenAI, OpenRouter, etc.).
+- **Smart Validation**: Detection of anomalies and errors in visualizations.
+
+### 🔐 Security and administration
+- **Multi-user Auth**: Multi-user access with a role-based model (RBAC).
+- **Connection management**: Centralized configuration of access to different Superset instances.
+- **Logging**: Detailed execution history for all background tasks.
+
+## Technology stack
+- **Backend**: Python 3.9+, FastAPI, SQLAlchemy, APScheduler, Pydantic.
+- **Frontend**: Node.js 18+, SvelteKit, Tailwind CSS.
+- **Database**: SQLite (for storing metadata, tasks, and access settings).
 
 ## Project structure
-- `backup_script.py`: Main script for performing scheduled backups of Superset dashboards.
-- `migration_script.py`: Main script for transferring specific dashboards between environments, including overriding database connections.
-- `search_script.py`: Script for searching data across all datasets available on the server
-- `run_mapper.py`: CLI script for mapping dataset metadata.
-- `superset_tool/`:
-  - `client.py`: Python client for interacting with the Superset API.
-  - `exceptions.py`: Custom exception classes for structured error handling.
-  - `models.py`: Pydantic models for validating configuration data.
-  - `utils/`:
-    - `fileio.py`: Filesystem utilities (archive handling, YAML parsing).
-    - `logger.py`: Logger configuration for uniform logging across the project.
-    - `network.py`: HTTP client for network requests with authentication and retry handling.
-    - `init_clients.py`: Utility for initializing Superset clients for different environments.
-    - `dataset_mapper.py`: Dataset metadata mapping logic.
+- `backend/` — server side, API, and plugin logic.
+- `frontend/` — client side (SvelteKit application).
+- `specs/` — feature specifications and implementation plans.
+- `docs/` — additional documentation on mapping and plugin development.
 
-## Setup
+## Quick start
 
 ### Requirements
 - Python 3.9+
-- `pip` for package management.
-- `keyring` for secure password storage.
+- Node.js 18+
+- Configured access to the Superset API
 
-### Installation
-1. **Clone the repository:**
-   ```bash
-   git clone https://prod.gitlab.dwh.rusal.com/dwh_bi/superset-tools.git
-   cd superset-tools
-   ```
-2. **Install dependencies:**
-   ```bash
-   pip install -r requirements.txt
-   ```
-   (You may need to create `requirements.txt` with `pydantic`, `requests`, `keyring`, `PyYAML`, `urllib3`)
-3. **Configure passwords:**
-   Use `keyring` to store the passwords of the Superset API users.
-   ```python
-   import keyring
-   keyring.set_password("system", "dev migrate", "migrate_user password")
-   keyring.set_password("system", "prod migrate", "migrate_user password")
-   keyring.set_password("system", "sandbox migrate", "migrate_user password")
-   ```
-
-## Usage
-
-### Running the project (Web UI)
-To start the backend and frontend servers with a single command:
+### Running
+To set up the environments automatically and start both servers (Backend & Frontend), use the script:
 ```bash
 ./run.sh
 ```
+*The script creates a Python virtual environment, installs the `pip` and `npm` dependencies, and starts the services.*
 
 Options:
-- `--skip-install`: Skip the dependency check and installation.
+- `--skip-install`: Skip dependency installation.
 - `--help`: Show help.
 
 Environment variables:
-- `BACKEND_PORT`: Port for the backend (default 8000).
-- `FRONTEND_PORT`: Port for the frontend (default 5173).
+- `BACKEND_PORT`: API port (default 8000).
+- `FRONTEND_PORT`: UI port (default 5173).
 
-### Backup script (`backup_script.py`)
-To back up dashboards from the configured Superset environments:
-```bash
-python backup_script.py
-```
-Backups are saved to `P:\Superset\010 Бекапы\` by default. Logs are stored in `P:\Superset\010 Бекапы\Logs`.
+## Development
+The project follows strict development rules:
+1. **Semantic Code Generation**: The `semantic_protocol.md` protocol is used to ensure code reliability.
+2. **Design by Contract (DbC)**: Preconditions and postconditions are defined for key functions.
+3. **Constitution**: The rules described in the project constitution in the `.specify/` folder are followed.
 
-### Migration script (`migration_script.py`)
-To transfer a specific dashboard:
-```bash
-python migration_script.py
-```
+### Useful commands
+- **Backend**: `cd backend && .venv/bin/python3 -m uvicorn src.app:app --reload`
+- **Frontend**: `cd frontend && npm run dev`
+- **Tests**: `cd backend && .venv/bin/pytest`
 
-### Search script (`search_script.py`)
-To search for text patterns in Superset dataset metadata:
-```bash
-python search_script.py
-```
-The script uses regular expressions to search dataset fields such as SQL queries. Results are written to the log and printed to the console.
+## Contacts and contributing
+To add new features or fix bugs, please read `docs/plugin_dev.md` and create a corresponding specification in `specs/`.
 
-### Metadata mapping script (`run_mapper.py`)
-To update dataset metadata (e.g. verbose names) in Superset:
-```bash
-python run_mapper.py --source <source_type> --dataset-id <dataset_id> [--table-name <table_name>] [--table-schema <table_schema>] [--excel-path <path_to_excel>] [--env <environment>]
-```
-If you use XLSX, the file must contain two columns: column_name | verbose_name
-
-Parameters:
-- `--source`: Data source ('postgres', 'excel' or 'both').
-- `--dataset-id`: ID of the dataset to update.
-- `--table-name`: Table name for PostgreSQL.
-- `--table-schema`: Table schema for PostgreSQL.
-- `--excel-path`: Path to the Excel file.
-- `--env`: Superset environment ('dev', 'prod', etc.).
-
-Usage example:
-```bash
-python run_mapper.py --source postgres --dataset-id 123 --table-name account_debt --table-schema dm_view --env dev
-python run_mapper.py --source=excel --dataset-id=286 --excel-path=H:\dev\ss-tools\286_map.xlsx --env=dev
-```
-
-## Logging
-Logs are written to a file in the `Logs` directory (e.g. `P:\Superset\010 Бекапы\Logs` for backups) and printed to the console. The default log level is `INFO`.
-
-## Development and contributing
-- Follow the **Semantic Code Generation Protocol** (see `semantic_protocol.md`):
-  - All definitions are wrapped in `[DEF]...[/DEF]`.
-  - Contracts (`@PRE`, `@POST`) are defined BEFORE implementation.
-  - Strict typing and immutability of architectural decisions.
-- Follow the project Constitution (`.specify/memory/constitution.md`).
-- Use `Pydantic` models for data validation.
-- Implement comprehensive error handling with custom exceptions.
````

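The README above describes a plugin-based architecture in which features run as background tasks, and the constitution earlier in this changeset requires each plugin to inherit from `PluginBase` and declare a permission string such as `plugin:name:execute`. As a very rough sketch only — the actual `PluginBase` interface is not shown anywhere in this diff, so the import path, class layout, and method names below are assumptions:

```python
# Hypothetical plugin skeleton; PluginBase's real interface lives in the backend and may differ.
from backend.src.plugins.base import PluginBase  # assumed import path


class ReportPlugin(PluginBase):
    """Example plugin: generates a report as an asynchronous task."""

    # Permission string pattern taken from the constitution (plugin:name:execute).
    permission = "plugin:report:execute"

    def run(self, task_context):
        # Long-running work happens here; the TaskManager is expected to execute it
        # off the request path and stream status updates over WebSocket.
        task_context.log("Generating report...")
        return {"status": "ok"}
```
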
*(deleted file: committed backup log, 269 lines; listing truncated)*

```text
2025-12-20 19:55:11,325 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 19:55:11,325 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 19:55:11,327 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 43, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 21:01:49,905 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 21:01:49,906 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 21:01:49,988 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 21:01:49,990 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:42:32,538 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 22:42:32,538 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:42:32,583 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:42:32,587 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:54:29,770 - INFO - [BackupPlugin][Entry] Starting backup for .
2025-12-20 22:54:29,771 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:54:29,831 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:54:29,833 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:54:34,078 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 22:54:34,078 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:54:34,079 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:54:34,079 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
  Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
    For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
```
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 22:59:25,060 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 22:59:25,060 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 22:59:25,114 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 22:59:25,117 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:00:31,156 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:00:31,156 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:00:31,157 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:00:31,162 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:00:34,710 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:00:34,710 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:00:34,710 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:00:34,711 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:01:43,894 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:01:43,894 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:01:43,895 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:01:43,895 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:04:07,731 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:04:07,731 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:04:07,732 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:04:07,732 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:06:39,641 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:06:39,642 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:06:39,687 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:06:39,689 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
Traceback (most recent call last):
|
|
||||||
File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
|
|
||||||
config = SupersetConfig(
|
|
||||||
^^^^^^^^^^^^^^^
|
|
||||||
File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
|
|
||||||
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
|
|
||||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
|
||||||
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
|
|
||||||
base_url
|
|
||||||
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
|
|
||||||
For further information visit https://errors.pydantic.dev/2.12/v/value_error
|
|
||||||
2025-12-20 23:30:36,090 - INFO - [BackupPlugin][Entry] Starting backup for superset.
|
|
||||||
2025-12-20 23:30:36,093 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
|
|
||||||
2025-12-20 23:30:36,128 - INFO - [setup_clients][Action] Loading environments from ConfigManager
|
|
||||||
2025-12-20 23:30:36,129 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
|
|
||||||
2025-12-20 23:30:36,129 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
|
|
||||||
2025-12-20 23:30:36,130 - WARNING - [_init_session][State] SSL verification disabled.
|
|
||||||
2025-12-20 23:30:36,130 - INFO - [APIClient.__init__][Exit] APIClient initialized.
|
|
||||||
2025-12-20 23:30:36,130 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
|
|
||||||
2025-12-20 23:30:36,130 - INFO - [get_dashboards][Enter] Fetching dashboards.
|
|
||||||
2025-12-20 23:30:36,131 - INFO - [authenticate][Enter] Authenticating to https://superset.bebesh.ru/api/v1
|
|
||||||
2025-12-20 23:30:36,897 - INFO - [authenticate][Exit] Authenticated successfully.
|
|
||||||
2025-12-20 23:30:37,527 - INFO - [get_dashboards][Exit] Found 11 dashboards.
|
|
||||||
2025-12-20 23:30:37,527 - INFO - [BackupPlugin][Progress] Found 11 dashboards to export in superset.
|
|
||||||
2025-12-20 23:30:37,529 - INFO - [export_dashboard][Enter] Exporting dashboard 11.
|
|
||||||
2025-12-20 23:30:38,224 - INFO - [export_dashboard][Exit] Exported dashboard 11 to dashboard_export_20251220T203037.zip.
|
|
||||||
2025-12-20 23:30:38,225 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:38,226 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/FCC New Coder Survey 2018/dashboard_export_20251220T203037.zip
|
|
||||||
2025-12-20 23:30:38,227 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/FCC New Coder Survey 2018
|
|
||||||
2025-12-20 23:30:38,230 - INFO - [export_dashboard][Enter] Exporting dashboard 10.
|
|
||||||
2025-12-20 23:30:38,438 - INFO - [export_dashboard][Exit] Exported dashboard 10 to dashboard_export_20251220T203038.zip.
|
|
||||||
2025-12-20 23:30:38,438 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:38,439 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/COVID Vaccine Dashboard/dashboard_export_20251220T203038.zip
|
|
||||||
2025-12-20 23:30:38,439 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/COVID Vaccine Dashboard
|
|
||||||
2025-12-20 23:30:38,440 - INFO - [export_dashboard][Enter] Exporting dashboard 9.
|
|
||||||
2025-12-20 23:30:38,853 - INFO - [export_dashboard][Exit] Exported dashboard 9 to dashboard_export_20251220T203038.zip.
|
|
||||||
2025-12-20 23:30:38,853 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:38,856 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Sales Dashboard/dashboard_export_20251220T203038.zip
|
|
||||||
2025-12-20 23:30:38,856 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Sales Dashboard
|
|
||||||
2025-12-20 23:30:38,858 - INFO - [export_dashboard][Enter] Exporting dashboard 8.
|
|
||||||
2025-12-20 23:30:38,939 - INFO - [export_dashboard][Exit] Exported dashboard 8 to dashboard_export_20251220T203038.zip.
|
|
||||||
2025-12-20 23:30:38,940 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:38,941 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Unicode Test/dashboard_export_20251220T203038.zip
|
|
||||||
2025-12-20 23:30:38,941 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Unicode Test
|
|
||||||
2025-12-20 23:30:38,942 - INFO - [export_dashboard][Enter] Exporting dashboard 7.
|
|
||||||
2025-12-20 23:30:39,148 - INFO - [export_dashboard][Exit] Exported dashboard 7 to dashboard_export_20251220T203038.zip.
|
|
||||||
2025-12-20 23:30:39,148 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:39,149 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Video Game Sales/dashboard_export_20251220T203038.zip
|
|
||||||
2025-12-20 23:30:39,149 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Video Game Sales
|
|
||||||
2025-12-20 23:30:39,150 - INFO - [export_dashboard][Enter] Exporting dashboard 6.
|
|
||||||
2025-12-20 23:30:39,689 - INFO - [export_dashboard][Exit] Exported dashboard 6 to dashboard_export_20251220T203039.zip.
|
|
||||||
2025-12-20 23:30:39,689 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:39,690 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Featured Charts/dashboard_export_20251220T203039.zip
|
|
||||||
2025-12-20 23:30:39,691 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Featured Charts
|
|
||||||
2025-12-20 23:30:39,692 - INFO - [export_dashboard][Enter] Exporting dashboard 5.
|
|
||||||
2025-12-20 23:30:39,960 - INFO - [export_dashboard][Exit] Exported dashboard 5 to dashboard_export_20251220T203039.zip.
|
|
||||||
2025-12-20 23:30:39,960 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:39,961 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Slack Dashboard/dashboard_export_20251220T203039.zip
|
|
||||||
2025-12-20 23:30:39,961 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Slack Dashboard
|
|
||||||
2025-12-20 23:30:39,962 - INFO - [export_dashboard][Enter] Exporting dashboard 4.
|
|
||||||
2025-12-20 23:30:40,196 - INFO - [export_dashboard][Exit] Exported dashboard 4 to dashboard_export_20251220T203039.zip.
|
|
||||||
2025-12-20 23:30:40,196 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:40,197 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/deck.gl Demo/dashboard_export_20251220T203039.zip
|
|
||||||
2025-12-20 23:30:40,197 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/deck.gl Demo
|
|
||||||
2025-12-20 23:30:40,198 - INFO - [export_dashboard][Enter] Exporting dashboard 3.
|
|
||||||
2025-12-20 23:30:40,745 - INFO - [export_dashboard][Exit] Exported dashboard 3 to dashboard_export_20251220T203040.zip.
|
|
||||||
2025-12-20 23:30:40,746 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:40,760 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Misc Charts/dashboard_export_20251220T203040.zip
|
|
||||||
2025-12-20 23:30:40,761 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Misc Charts
|
|
||||||
2025-12-20 23:30:40,762 - INFO - [export_dashboard][Enter] Exporting dashboard 2.
|
|
||||||
2025-12-20 23:30:40,928 - INFO - [export_dashboard][Exit] Exported dashboard 2 to dashboard_export_20251220T203040.zip.
|
|
||||||
2025-12-20 23:30:40,929 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:40,930 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/USA Births Names/dashboard_export_20251220T203040.zip
|
|
||||||
2025-12-20 23:30:40,931 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/USA Births Names
|
|
||||||
2025-12-20 23:30:40,932 - INFO - [export_dashboard][Enter] Exporting dashboard 1.
|
|
||||||
2025-12-20 23:30:41,582 - INFO - [export_dashboard][Exit] Exported dashboard 1 to dashboard_export_20251220T203040.zip.
|
|
||||||
2025-12-20 23:30:41,582 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
|
|
||||||
2025-12-20 23:30:41,749 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/World Bank's Data/dashboard_export_20251220T203040.zip
|
|
||||||
2025-12-20 23:30:41,750 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/World Bank's Data
|
|
||||||
2025-12-20 23:30:41,752 - INFO - [consolidate_archive_folders][Enter] Consolidating archives in backups/SUPERSET
|
|
||||||
2025-12-20 23:30:41,753 - INFO - [remove_empty_directories][Enter] Starting cleanup of empty directories in backups/SUPERSET
|
|
||||||
2025-12-20 23:30:41,758 - INFO - [remove_empty_directories][Exit] Removed 0 empty directories.
|
|
||||||
2025-12-20 23:30:41,758 - INFO - [BackupPlugin][CoherenceCheck:Passed] Backup logic completed for superset.
|
|
||||||
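Note on the failures above: every CRITICAL entry is the same SupersetConfig rejection of a base URL missing the '/api/v1' suffix, and once the environment URL was corrected (the successful 23:30 run authenticates against https://superset.bebesh.ru/api/v1) initialization passes. A validator along the following lines would produce exactly this error; this is a minimal sketch, not the project's actual superset_tool code:

from pydantic import BaseModel, field_validator

class SupersetConfigSketch(BaseModel):
    # Sketch only: the real SupersetConfig carries more fields.
    base_url: str

    @field_validator("base_url")
    @classmethod
    def require_api_v1(cls, v: str) -> str:
        # Reject bare hosts such as 'https://superset.bebesh.ru'.
        if "/api/v1" not in v:
            raise ValueError(f"Invalid URL format: {v}. Must include '/api/v1'.")
        return v

SupersetConfigSketch(base_url="https://superset.bebesh.ru/api/v1")   # passes
# SupersetConfigSketch(base_url="https://superset.bebesh.ru")        # raises ValidationError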
Binary file not shown. (11 files)
@@ -1,10 +1,17 @@
 #!/usr/bin/env python3
-"""Script to delete tasks with RUNNING status from the database."""
+# [DEF:backend.delete_running_tasks:Module]
+# @PURPOSE: Script to delete tasks with RUNNING status from the database.
+# @LAYER: Utility
+# @SEMANTICS: maintenance, database, cleanup
 
 from sqlalchemy.orm import Session
 from src.core.database import TasksSessionLocal
 from src.models.task import TaskRecord
 
+
+# [DEF:delete_running_tasks:Function]
+# @PURPOSE: Delete all tasks with RUNNING status from the database.
+# @PRE: Database is accessible and TaskRecord model is defined.
+# @POST: All tasks with status 'RUNNING' are removed from the database.
 def delete_running_tasks():
     """Delete all tasks with RUNNING status from the database."""
     session: Session = TasksSessionLocal()
@@ -30,6 +37,8 @@ def delete_running_tasks():
         print(f"Error deleting tasks: {e}")
     finally:
         session.close()
+# [/DEF:delete_running_tasks:Function]
 
 if __name__ == "__main__":
     delete_running_tasks()
+# [/DEF:backend.delete_running_tasks:Module]
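The two hunks above skip the unchanged middle of delete_running_tasks. For orientation, here is a hypothetical sketch of what that elided section plausibly does, with error handling matching the except/finally branches visible in the second hunk; the real code between the hunks may differ:

from sqlalchemy.orm import Session

def _delete_running_sketch(session: Session) -> None:
    # Hypothetical body, assuming TaskRecord.status stores the string "RUNNING".
    try:
        deleted = (
            session.query(TaskRecord)
            .filter(TaskRecord.status == "RUNNING")
            .delete(synchronize_session=False)
        )
        session.commit()
        print(f"Deleted {deleted} running tasks")
    except Exception as e:
        session.rollback()
        print(f"Error deleting tasks: {e}")
    finally:
        session.close()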
1 backend/get_full_key.py Normal file
@@ -0,0 +1 @@
+{"print(f'Length": {"else": "print('Provider not found')\ndb.close()"}}
1 backend/git_repos/12 Submodule
Submodule backend/git_repos/12 added at 57ab7e8679
58903 backend/logs/app.log.1 Normal file
File diff suppressed because it is too large
Binary file not shown.
@@ -25,9 +25,13 @@ keyring==25.7.0
 more-itertools==10.8.0
 pycparser==2.23
 pydantic==2.12.5
+pydantic-settings
 pydantic_core==2.41.5
 python-multipart==0.0.21
 PyYAML==6.0.3
+passlib[bcrypt]
+python-jose[cryptography]
+PyJWT
 RapidFuzz==3.14.3
 referencing==0.37.0
 requests==2.32.5
@@ -44,3 +48,9 @@ websockets==15.0.1
 pandas
 psycopg2-binary
 openpyxl
+GitPython==3.1.44
+itsdangerous
+email-validator
+openai
+playwright
+tenacity
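Several of the new pins back the multi-user auth work: passlib[bcrypt] for password hashing, python-jose[cryptography] and PyJWT for tokens, itsdangerous for signed values, and email-validator for Pydantic EmailStr fields. As a rough sketch of how the hashing and token pieces usually fit together (the key and claim names here are assumptions, not the project's settings):

from datetime import datetime, timedelta, timezone
from passlib.context import CryptContext
from jose import jwt

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
SECRET_KEY = "change-me"  # assumption: the app sources this from real settings
ALGORITHM = "HS256"

def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)

def verify_password(plain: str, hashed: str) -> bool:
    return pwd_context.verify(plain, hashed)

def create_access_token(subject: str, minutes: int = 30) -> str:
    # Standard short-lived JWT with subject and expiry claims.
    expire = datetime.now(timezone.utc) + timedelta(minutes=minutes)
    return jwt.encode({"sub": subject, "exp": expire}, SECRET_KEY, algorithm=ALGORITHM)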
@@ -1,59 +1,118 @@
-# [DEF:AuthModule:Module]
-# @SEMANTICS: auth, authentication, adfs, oauth, middleware
-# @PURPOSE: Implements ADFS authentication using Authlib for FastAPI. It provides a dependency to protect endpoints.
-# @LAYER: UI (API)
-# @RELATION: Used by API routers to protect endpoints that require authentication.
-
-from fastapi import Depends, HTTPException, status
-from fastapi.security import OAuth2AuthorizationCodeBearer
-from authlib.integrations.starlette_client import OAuth
-from starlette.config import Config
-
-# Placeholder for ADFS configuration. In a real app, this would come from a secure source.
-# Create an in-memory .env file
-from io import StringIO
-config_data = StringIO("""
-ADFS_CLIENT_ID=your-client-id
-ADFS_CLIENT_SECRET=your-client-secret
-ADFS_SERVER_METADATA_URL=https://your-adfs-server/.well-known/openid-configuration
-""")
-config = Config(config_data)
-oauth = OAuth(config)
-
-oauth.register(
-    name='adfs',
-    server_metadata_url=config('ADFS_SERVER_METADATA_URL'),
-    client_kwargs={'scope': 'openid profile email'}
-)
-
-oauth2_scheme = OAuth2AuthorizationCodeBearer(
-    authorizationUrl="https://your-adfs-server/adfs/oauth2/authorize",
-    tokenUrl="https://your-adfs-server/adfs/oauth2/token",
-)
-
-# [DEF:get_current_user:Function]
-# @PURPOSE: Dependency to get the current user from the ADFS token.
-# @PARAM: token (str) - The OAuth2 bearer token.
-# @PRE: token should be provided via Authorization header.
-# @POST: Returns user details if authenticated, else raises 401.
-# @RETURN: Dict[str, str] - User information.
-async def get_current_user(token: str = Depends(oauth2_scheme)):
-    """
-    Dependency to get the current user from the ADFS token.
-    This is a placeholder and needs to be fully implemented.
-    """
-    # In a real implementation, you would:
-    # 1. Validate the token with ADFS.
-    # 2. Fetch user information.
-    # 3. Create a user object.
-    # For now, we'll just check if a token exists.
-    if not token:
-        raise HTTPException(
-            status_code=status.HTTP_401_UNAUTHORIZED,
-            detail="Not authenticated",
-            headers={"WWW-Authenticate": "Bearer"},
-        )
-    # A real implementation would return a user object.
-    return {"placeholder_user": "user@example.com"}
-# [/DEF:get_current_user:Function]
-# [/DEF:AuthModule:Module]
+# [DEF:backend.src.api.auth:Module]
+#
+# @SEMANTICS: api, auth, routes, login, logout
+# @PURPOSE: Authentication API endpoints.
+# @LAYER: API
+# @RELATION: USES -> backend.src.services.auth_service.AuthService
+# @RELATION: USES -> backend.src.core.database.get_auth_db
+#
+# @INVARIANT: All auth endpoints must return consistent error codes.
+
+# [SECTION: IMPORTS]
+from fastapi import APIRouter, Depends, HTTPException, status
+from fastapi.security import OAuth2PasswordRequestForm
+from sqlalchemy.orm import Session
+from ..core.database import get_auth_db
+from ..services.auth_service import AuthService
+from ..schemas.auth import Token, User as UserSchema
+from ..dependencies import get_current_user
+from ..core.auth.oauth import oauth, is_adfs_configured
+from ..core.auth.logger import log_security_event
+from ..core.logger import belief_scope
+import starlette.requests
+# [/SECTION]
+
+# [DEF:router:Variable]
+# @PURPOSE: APIRouter instance for authentication routes.
+router = APIRouter(prefix="/api/auth", tags=["auth"])
+# [/DEF:router:Variable]
+
+# [DEF:login_for_access_token:Function]
+# @PURPOSE: Authenticates a user and returns a JWT access token.
+# @PRE: form_data contains username and password.
+# @POST: Returns a Token object on success.
+# @THROW: HTTPException 401 if authentication fails.
+# @PARAM: form_data (OAuth2PasswordRequestForm) - Login credentials.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: Token - The generated JWT token.
+@router.post("/login", response_model=Token)
+async def login_for_access_token(
+    form_data: OAuth2PasswordRequestForm = Depends(),
+    db: Session = Depends(get_auth_db)
+):
+    with belief_scope("api.auth.login"):
+        auth_service = AuthService(db)
+        user = auth_service.authenticate_user(form_data.username, form_data.password)
+        if not user:
+            log_security_event("LOGIN_FAILED", form_data.username, {"reason": "Invalid credentials"})
+            raise HTTPException(
+                status_code=status.HTTP_401_UNAUTHORIZED,
+                detail="Incorrect username or password",
+                headers={"WWW-Authenticate": "Bearer"},
+            )
+        log_security_event("LOGIN_SUCCESS", user.username, {"source": "LOCAL"})
+        return auth_service.create_session(user)
+# [/DEF:login_for_access_token:Function]
+
+# [DEF:read_users_me:Function]
+# @PURPOSE: Retrieves the profile of the currently authenticated user.
+# @PRE: Valid JWT token provided.
+# @POST: Returns the current user's data.
+# @PARAM: current_user (UserSchema) - The user extracted from the token.
+# @RETURN: UserSchema - The current user profile.
+@router.get("/me", response_model=UserSchema)
+async def read_users_me(current_user: UserSchema = Depends(get_current_user)):
+    with belief_scope("api.auth.me"):
+        return current_user
+# [/DEF:read_users_me:Function]
+
+# [DEF:logout:Function]
+# @PURPOSE: Logs out the current user (placeholder for session revocation).
+# @PRE: Valid JWT token provided.
+# @POST: Returns success message.
+@router.post("/logout")
+async def logout(current_user: UserSchema = Depends(get_current_user)):
+    with belief_scope("api.auth.logout"):
+        log_security_event("LOGOUT", current_user.username)
+        # In a stateless JWT setup, client-side token deletion is primary.
+        # Server-side revocation (blacklisting) can be added here if needed.
+        return {"message": "Successfully logged out"}
+# [/DEF:logout:Function]
+
+# [DEF:login_adfs:Function]
+# @PURPOSE: Initiates the ADFS OIDC login flow.
+# @POST: Redirects the user to ADFS.
+@router.get("/login/adfs")
+async def login_adfs(request: starlette.requests.Request):
+    with belief_scope("api.auth.login_adfs"):
+        if not is_adfs_configured():
+            raise HTTPException(
+                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
+                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
+            )
+        redirect_uri = request.url_for('auth_callback_adfs')
+        return await oauth.adfs.authorize_redirect(request, str(redirect_uri))
+# [/DEF:login_adfs:Function]
+
+# [DEF:auth_callback_adfs:Function]
+# @PURPOSE: Handles the callback from ADFS after successful authentication.
+# @POST: Provisions user JIT and returns session token.
+@router.get("/callback/adfs", name="auth_callback_adfs")
+async def auth_callback_adfs(request: starlette.requests.Request, db: Session = Depends(get_auth_db)):
+    with belief_scope("api.auth.callback_adfs"):
+        if not is_adfs_configured():
+            raise HTTPException(
+                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
+                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
+            )
+        token = await oauth.adfs.authorize_access_token(request)
+        user_info = token.get('userinfo')
+        if not user_info:
+            raise HTTPException(status_code=400, detail="Failed to retrieve user info from ADFS")
+
+        auth_service = AuthService(db)
+        user = auth_service.provision_adfs_user(user_info)
+        return auth_service.create_session(user)
+# [/DEF:auth_callback_adfs:Function]
+
+# [/DEF:backend.src.api.auth:Module]
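End to end, the local login flow can be exercised like this (the base URL and credentials are placeholders, and the access_token field follows the OAuth2 convention that the Token schema presumably implements):

import requests

BASE = "http://localhost:8000"  # assumption: local dev server

# Obtain a token via the OAuth2 password form, then call a protected route.
resp = requests.post(f"{BASE}/api/auth/login",
                     data={"username": "admin", "password": "secret"})
resp.raise_for_status()
token = resp.json()["access_token"]

me = requests.get(f"{BASE}/api/auth/me",
                  headers={"Authorization": f"Bearer {token}"})
print(me.json())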
@@ -1 +1,3 @@
-from . import plugins, tasks, settings, connections
+from . import plugins, tasks, settings, connections, environments, mappings, migration, git, storage, admin
+
+__all__ = ['plugins', 'tasks', 'settings', 'connections', 'environments', 'mappings', 'migration', 'git', 'storage', 'admin']
310 backend/src/api/routes/admin.py Normal file
@@ -0,0 +1,310 @@
+# [DEF:backend.src.api.routes.admin:Module]
+#
+# @TIER: STANDARD
+# @SEMANTICS: api, admin, users, roles, permissions
+# @PURPOSE: Admin API endpoints for user and role management.
+# @LAYER: API
+# @RELATION: USES -> backend.src.core.auth.repository.AuthRepository
+# @RELATION: USES -> backend.src.dependencies.has_permission
+#
+# @INVARIANT: All endpoints in this module require 'Admin' role or 'admin' scope.
+
+# [SECTION: IMPORTS]
+from typing import List
+from fastapi import APIRouter, Depends, HTTPException, status
+from sqlalchemy.orm import Session
+from ...core.database import get_auth_db
+from ...core.auth.repository import AuthRepository
+from ...core.auth.security import get_password_hash
+from ...schemas.auth import (
+    User as UserSchema, UserCreate, UserUpdate,
+    RoleSchema, RoleCreate, RoleUpdate, PermissionSchema,
+    ADGroupMappingSchema, ADGroupMappingCreate
+)
+from ...models.auth import User, Role, ADGroupMapping
+from ...dependencies import has_permission
+from ...core.logger import logger, belief_scope
+# [/SECTION]
+
+# [DEF:router:Variable]
+# @PURPOSE: APIRouter instance for admin routes.
+router = APIRouter(prefix="/api/admin", tags=["admin"])
+# [/DEF:router:Variable]
+
+# [DEF:list_users:Function]
+# @PURPOSE: Lists all registered users.
+# @PRE: Current user has 'Admin' role.
+# @POST: Returns a list of UserSchema objects.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: List[UserSchema] - List of users.
+@router.get("/users", response_model=List[UserSchema])
+async def list_users(
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:users", "READ"))
+):
+    with belief_scope("api.admin.list_users"):
+        users = db.query(User).all()
+        return users
+# [/DEF:list_users:Function]
+
+# [DEF:create_user:Function]
+# @PURPOSE: Creates a new local user.
+# @PRE: Current user has 'Admin' role.
+# @POST: New user is created in the database.
+# @PARAM: user_in (UserCreate) - New user data.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: UserSchema - The created user.
+@router.post("/users", response_model=UserSchema, status_code=status.HTTP_201_CREATED)
+async def create_user(
+    user_in: UserCreate,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:users", "WRITE"))
+):
+    with belief_scope("api.admin.create_user"):
+        repo = AuthRepository(db)
+        if repo.get_user_by_username(user_in.username):
+            raise HTTPException(status_code=400, detail="Username already exists")
+
+        new_user = User(
+            username=user_in.username,
+            email=user_in.email,
+            password_hash=get_password_hash(user_in.password),
+            auth_source="LOCAL",
+            is_active=user_in.is_active
+        )
+
+        for role_name in user_in.roles:
+            role = repo.get_role_by_name(role_name)
+            if role:
+                new_user.roles.append(role)
+
+        db.add(new_user)
+        db.commit()
+        db.refresh(new_user)
+        return new_user
+# [/DEF:create_user:Function]
+
+# [DEF:update_user:Function]
+# @PURPOSE: Updates an existing user.
+@router.put("/users/{user_id}", response_model=UserSchema)
+async def update_user(
+    user_id: str,
+    user_in: UserUpdate,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:users", "WRITE"))
+):
+    with belief_scope("api.admin.update_user"):
+        repo = AuthRepository(db)
+        user = repo.get_user_by_id(user_id)
+        if not user:
+            raise HTTPException(status_code=404, detail="User not found")
+
+        if user_in.email is not None:
+            user.email = user_in.email
+        if user_in.is_active is not None:
+            user.is_active = user_in.is_active
+        if user_in.password is not None:
+            user.password_hash = get_password_hash(user_in.password)
+
+        if user_in.roles is not None:
+            user.roles = []
+            for role_name in user_in.roles:
+                role = repo.get_role_by_name(role_name)
+                if role:
+                    user.roles.append(role)
+
+        db.commit()
+        db.refresh(user)
+        return user
+# [/DEF:update_user:Function]
+
+# [DEF:delete_user:Function]
+# @PURPOSE: Deletes a user.
+@router.delete("/users/{user_id}", status_code=status.HTTP_204_NO_CONTENT)
+async def delete_user(
+    user_id: str,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:users", "WRITE"))
+):
+    with belief_scope("api.admin.delete_user"):
+        logger.info(f"[DEBUG] Attempting to delete user context={{'user_id': '{user_id}'}}")
+        repo = AuthRepository(db)
+        user = repo.get_user_by_id(user_id)
+        if not user:
+            logger.warning(f"[DEBUG] User not found for deletion context={{'user_id': '{user_id}'}}")
+            raise HTTPException(status_code=404, detail="User not found")
+
+        logger.info(f"[DEBUG] Found user to delete context={{'username': '{user.username}'}}")
+        db.delete(user)
+        db.commit()
+        logger.info(f"[DEBUG] Successfully deleted user context={{'user_id': '{user_id}'}}")
+        return None
+# [/DEF:delete_user:Function]
+
+# [DEF:list_roles:Function]
+# @PURPOSE: Lists all available roles.
+# @RETURN: List[RoleSchema] - List of roles.
+# @RELATION: CALLS -> backend.src.models.auth.Role
+@router.get("/roles", response_model=List[RoleSchema])
+async def list_roles(
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:roles", "READ"))
+):
+    with belief_scope("api.admin.list_roles"):
+        return db.query(Role).all()
+# [/DEF:list_roles:Function]
+
+# [DEF:create_role:Function]
+# @PURPOSE: Creates a new system role with associated permissions.
+# @PRE: Role name must be unique.
+# @POST: New Role record is created in auth.db.
+# @PARAM: role_in (RoleCreate) - New role data.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: RoleSchema - The created role.
+# @SIDE_EFFECT: Commits new role and associations to auth.db.
+# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_permission_by_id
+@router.post("/roles", response_model=RoleSchema, status_code=status.HTTP_201_CREATED)
+async def create_role(
+    role_in: RoleCreate,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:roles", "WRITE"))
+):
+    with belief_scope("api.admin.create_role"):
+        if db.query(Role).filter(Role.name == role_in.name).first():
+            raise HTTPException(status_code=400, detail="Role already exists")
+
+        new_role = Role(name=role_in.name, description=role_in.description)
+        repo = AuthRepository(db)
+
+        for perm_id_or_str in role_in.permissions:
+            perm = repo.get_permission_by_id(perm_id_or_str)
+            if not perm and ":" in perm_id_or_str:
+                res, act = perm_id_or_str.split(":", 1)
+                perm = repo.get_permission_by_resource_action(res, act)
+
+            if perm:
+                new_role.permissions.append(perm)
+
+        db.add(new_role)
+        db.commit()
+        db.refresh(new_role)
+        return new_role
+# [/DEF:create_role:Function]
+
+# [DEF:update_role:Function]
+# @PURPOSE: Updates an existing role's metadata and permissions.
+# @PRE: role_id must be a valid existing role UUID.
+# @POST: Role record is updated in auth.db.
+# @PARAM: role_id (str) - Target role identifier.
+# @PARAM: role_in (RoleUpdate) - Updated role data.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: RoleSchema - The updated role.
+# @SIDE_EFFECT: Commits updates to auth.db.
+# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
+@router.put("/roles/{role_id}", response_model=RoleSchema)
+async def update_role(
+    role_id: str,
+    role_in: RoleUpdate,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:roles", "WRITE"))
+):
+    with belief_scope("api.admin.update_role"):
+        repo = AuthRepository(db)
+        role = repo.get_role_by_id(role_id)
+        if not role:
+            raise HTTPException(status_code=404, detail="Role not found")
+
+        if role_in.name is not None:
+            role.name = role_in.name
+        if role_in.description is not None:
+            role.description = role_in.description
+
+        if role_in.permissions is not None:
+            role.permissions = []
+            for perm_id_or_str in role_in.permissions:
+                perm = repo.get_permission_by_id(perm_id_or_str)
+                if not perm and ":" in perm_id_or_str:
+                    res, act = perm_id_or_str.split(":", 1)
+                    perm = repo.get_permission_by_resource_action(res, act)
+
+                if perm:
+                    role.permissions.append(perm)
+
+        db.commit()
+        db.refresh(role)
+        return role
+# [/DEF:update_role:Function]
+
+# [DEF:delete_role:Function]
+# @PURPOSE: Removes a role from the system.
+# @PRE: role_id must be a valid existing role UUID.
+# @POST: Role record is removed from auth.db.
+# @PARAM: role_id (str) - Target role identifier.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: None
+# @SIDE_EFFECT: Deletes record from auth.db and commits.
+# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
+@router.delete("/roles/{role_id}", status_code=status.HTTP_204_NO_CONTENT)
+async def delete_role(
+    role_id: str,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:roles", "WRITE"))
+):
+    with belief_scope("api.admin.delete_role"):
+        repo = AuthRepository(db)
+        role = repo.get_role_by_id(role_id)
+        if not role:
+            raise HTTPException(status_code=404, detail="Role not found")
+
+        db.delete(role)
+        db.commit()
+        return None
+# [/DEF:delete_role:Function]
+
+# [DEF:list_permissions:Function]
+# @PURPOSE: Lists all available system permissions for assignment.
+# @POST: Returns a list of all PermissionSchema objects.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: List[PermissionSchema] - List of permissions.
+# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.list_permissions
+@router.get("/permissions", response_model=List[PermissionSchema])
+async def list_permissions(
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:roles", "READ"))
+):
+    with belief_scope("api.admin.list_permissions"):
+        repo = AuthRepository(db)
+        return repo.list_permissions()
+# [/DEF:list_permissions:Function]
+
+# [DEF:list_ad_mappings:Function]
+# @PURPOSE: Lists all AD Group to Role mappings.
+@router.get("/ad-mappings", response_model=List[ADGroupMappingSchema])
+async def list_ad_mappings(
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
+    with belief_scope("api.admin.list_ad_mappings"):
+        return db.query(ADGroupMapping).all()
+# [/DEF:list_ad_mappings:Function]
+
+# [DEF:create_ad_mapping:Function]
+# @PURPOSE: Creates a new AD Group mapping.
+@router.post("/ad-mappings", response_model=ADGroupMappingSchema)
+async def create_ad_mapping(
+    mapping_in: ADGroupMappingCreate,
+    db: Session = Depends(get_auth_db),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
+    with belief_scope("api.admin.create_ad_mapping"):
+        new_mapping = ADGroupMapping(
+            ad_group=mapping_in.ad_group,
+            role_id=mapping_in.role_id
+        )
+        db.add(new_mapping)
+        db.commit()
+        db.refresh(new_mapping)
+        return new_mapping
+# [/DEF:create_ad_mapping:Function]
+
+# [/DEF:backend.src.api.routes.admin:Module]
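Every route above gates on has_permission(resource, action) from backend.src.dependencies, which this diff does not include. A plausible shape for that dependency factory, inferred from the call sites (the permission walk itself is an assumption):

from fastapi import Depends, HTTPException, status

def has_permission(resource: str, action: str):
    # Returns a FastAPI dependency that rejects the request with 403
    # unless one of the current user's roles grants (resource, action).
    def checker(current_user=Depends(get_current_user)):  # project's own dependency
        for role in getattr(current_user, "roles", []):
            if any(p.resource == resource and p.action == action
                   for p in role.permissions):
                return current_user
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail=f"Missing permission {resource}:{action}",
        )
    return checker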
@@ -11,7 +11,7 @@ from fastapi import APIRouter, Depends, HTTPException, status
 from sqlalchemy.orm import Session
 from ...core.database import get_db
 from ...models.connection import ConnectionConfig
-from pydantic import BaseModel, Field
+from pydantic import BaseModel
 from datetime import datetime
 from ...core.logger import logger, belief_scope
 # [/SECTION]
105 backend/src/api/routes/dashboards.py Normal file
@@ -0,0 +1,105 @@
+# [DEF:backend.src.api.routes.dashboards:Module]
+#
+# @TIER: STANDARD
+# @SEMANTICS: api, dashboards, resources, hub
+# @PURPOSE: API endpoints for the Dashboard Hub - listing dashboards with Git and task status
+# @LAYER: API
+# @RELATION: DEPENDS_ON -> backend.src.dependencies
+# @RELATION: DEPENDS_ON -> backend.src.services.resource_service
+# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
+#
+# @INVARIANT: All dashboard responses include git_status and last_task metadata
+
+# [SECTION: IMPORTS]
+from fastapi import APIRouter, Depends, HTTPException
+from typing import List, Optional
+from pydantic import BaseModel, Field
+from ...dependencies import get_config_manager, get_task_manager, get_resource_service, has_permission
+from ...core.logger import logger, belief_scope
+# [/SECTION]
+
+router = APIRouter()
+
+# [DEF:GitStatus:DataClass]
+class GitStatus(BaseModel):
+    branch: Optional[str] = None
+    sync_status: Optional[str] = Field(None, pattern="^OK|DIFF$")
+# [/DEF:GitStatus:DataClass]
+
+# [DEF:LastTask:DataClass]
+class LastTask(BaseModel):
+    task_id: Optional[str] = None
+    status: Optional[str] = Field(None, pattern="^RUNNING|SUCCESS|ERROR|WAITING_INPUT$")
+# [/DEF:LastTask:DataClass]
+
+# [DEF:DashboardItem:DataClass]
+class DashboardItem(BaseModel):
+    id: int
+    title: str
+    slug: Optional[str] = None
+    url: Optional[str] = None
+    last_modified: Optional[str] = None
+    git_status: Optional[GitStatus] = None
+    last_task: Optional[LastTask] = None
+# [/DEF:DashboardItem:DataClass]
+
+# [DEF:DashboardsResponse:DataClass]
+class DashboardsResponse(BaseModel):
+    dashboards: List[DashboardItem]
+    total: int
+# [/DEF:DashboardsResponse:DataClass]
+
+# [DEF:get_dashboards:Function]
+# @PURPOSE: Fetch list of dashboards from a specific environment with Git status and last task status
+# @PRE: env_id must be a valid environment ID
+# @POST: Returns a list of dashboards with enhanced metadata
+# @PARAM: env_id (str) - The environment ID to fetch dashboards from
+# @PARAM: search (Optional[str]) - Filter by title/slug
+# @RETURN: DashboardsResponse - List of dashboards with status metadata
+# @RELATION: CALLS -> ResourceService.get_dashboards_with_status
+@router.get("/api/dashboards", response_model=DashboardsResponse)
+async def get_dashboards(
+    env_id: str,
+    search: Optional[str] = None,
+    config_manager=Depends(get_config_manager),
+    task_manager=Depends(get_task_manager),
+    resource_service=Depends(get_resource_service),
+    _ = Depends(has_permission("plugin:migration", "READ"))
+):
+    with belief_scope("get_dashboards", f"env_id={env_id}, search={search}"):
+        # Validate environment exists
+        environments = config_manager.get_environments()
+        env = next((e for e in environments if e.id == env_id), None)
+        if not env:
+            logger.error(f"[get_dashboards][Coherence:Failed] Environment not found: {env_id}")
+            raise HTTPException(status_code=404, detail="Environment not found")
+
+        try:
+            # Get all tasks for status lookup
+            all_tasks = task_manager.get_all_tasks()
+
+            # Fetch dashboards with status using ResourceService
+            dashboards = await resource_service.get_dashboards_with_status(env, all_tasks)
+
+            # Apply search filter if provided
+            if search:
+                search_lower = search.lower()
+                dashboards = [
+                    d for d in dashboards
+                    if search_lower in d.get('title', '').lower()
+                    or search_lower in d.get('slug', '').lower()
+                ]
+
+            logger.info(f"[get_dashboards][Coherence:OK] Returning {len(dashboards)} dashboards")
+
+            return DashboardsResponse(
+                dashboards=dashboards,
+                total=len(dashboards)
+            )
+
+        except Exception as e:
+            logger.error(f"[get_dashboards][Coherence:Failed] Failed to fetch dashboards: {e}")
+            raise HTTPException(status_code=503, detail=f"Failed to fetch dashboards: {str(e)}")
+# [/DEF:get_dashboards:Function]
+
+# [/DEF:backend.src.api.routes.dashboards:Module]
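A quick smoke test of the new endpoint (URL, env id, and token are placeholders):

import requests

r = requests.get("http://localhost:8000/api/dashboards",
                 params={"env_id": "superset", "search": "sales"},
                 headers={"Authorization": "Bearer <token>"})
r.raise_for_status()
for d in r.json()["dashboards"]:
    print(d["id"], d["title"], d.get("git_status"), d.get("last_task"))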
103
backend/src/api/routes/datasets.py
Normal file
103
backend/src/api/routes/datasets.py
Normal file
@@ -0,0 +1,103 @@
# [DEF:backend.src.api.routes.datasets:Module]
#
# @TIER: STANDARD
# @SEMANTICS: api, datasets, resources, hub
# @PURPOSE: API endpoints for the Dataset Hub - listing datasets with mapping progress
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.dependencies
# @RELATION: DEPENDS_ON -> backend.src.services.resource_service
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
#
# @INVARIANT: All dataset responses include last_task metadata

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Optional
from pydantic import BaseModel, Field
from ...dependencies import get_config_manager, get_task_manager, get_resource_service, has_permission
from ...core.logger import logger, belief_scope
# [/SECTION]

router = APIRouter()

# [DEF:MappedFields:DataClass]
class MappedFields(BaseModel):
    total: int
    mapped: int
# [/DEF:MappedFields:DataClass]

# [DEF:LastTask:DataClass]
class LastTask(BaseModel):
    task_id: Optional[str] = None
    status: Optional[str] = Field(None, pattern="^RUNNING|SUCCESS|ERROR|WAITING_INPUT$")
# [/DEF:LastTask:DataClass]

# [DEF:DatasetItem:DataClass]
class DatasetItem(BaseModel):
    id: int
    table_name: str
    schema: str
    database: str
    mapped_fields: Optional[MappedFields] = None
    last_task: Optional[LastTask] = None
# [/DEF:DatasetItem:DataClass]

# [DEF:DatasetsResponse:DataClass]
class DatasetsResponse(BaseModel):
    datasets: List[DatasetItem]
    total: int
# [/DEF:DatasetsResponse:DataClass]

# [DEF:get_datasets:Function]
# @PURPOSE: Fetch list of datasets from a specific environment with mapping progress
# @PRE: env_id must be a valid environment ID
# @POST: Returns a list of datasets with enhanced metadata
# @PARAM: env_id (str) - The environment ID to fetch datasets from
# @PARAM: search (Optional[str]) - Filter by table name
# @RETURN: DatasetsResponse - List of datasets with status metadata
# @RELATION: CALLS -> ResourceService.get_datasets_with_status
@router.get("/api/datasets", response_model=DatasetsResponse)
async def get_datasets(
    env_id: str,
    search: Optional[str] = None,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    resource_service=Depends(get_resource_service),
    _ = Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_datasets", f"env_id={env_id}, search={search}"):
        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            logger.error(f"[get_datasets][Coherence:Failed] Environment not found: {env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Get all tasks for status lookup
            all_tasks = task_manager.get_all_tasks()

            # Fetch datasets with status using ResourceService
            datasets = await resource_service.get_datasets_with_status(env, all_tasks)

            # Apply search filter if provided
            if search:
                search_lower = search.lower()
                datasets = [
                    d for d in datasets
                    if search_lower in d.get('table_name', '').lower()
                ]

            logger.info(f"[get_datasets][Coherence:OK] Returning {len(datasets)} datasets")

            return DatasetsResponse(
                datasets=datasets,
                total=len(datasets)
            )

        except Exception as e:
            logger.error(f"[get_datasets][Coherence:Failed] Failed to fetch datasets: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to fetch datasets: {str(e)}")
# [/DEF:get_datasets:Function]

# [/DEF:backend.src.api.routes.datasets:Module]
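
For reviewers who want to poke at the new Dataset Hub endpoint, a minimal client sketch follows. It is illustrative only: the base URL and bearer token are assumptions, not part of this changeset.

# Minimal sketch (not part of the diff): querying GET /api/datasets.
# Assumes a local backend at http://localhost:8000 and a token that passes
# the has_permission("plugin:migration", "READ") guard -- both hypothetical.
import asyncio
from typing import Optional

import httpx

async def list_datasets(env_id: str, search: Optional[str] = None) -> None:
    params = {"env_id": env_id}
    if search:
        params["search"] = search
    async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
        resp = await client.get(
            "/api/datasets",
            params=params,
            headers={"Authorization": "Bearer <token>"},  # placeholder
        )
        resp.raise_for_status()
        body = resp.json()  # DatasetsResponse: {"datasets": [...], "total": N}
        print(f"{body['total']} dataset(s) returned")

asyncio.run(list_datasets("dev", search="sales"))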
@@ -1,5 +1,6 @@
 # [DEF:backend.src.api.routes.environments:Module]
 #
+# @TIER: STANDARD
 # @SEMANTICS: api, environments, superset, databases
 # @PURPOSE: API endpoints for listing environments and their databases.
 # @LAYER: API
@@ -10,11 +11,10 @@

 # [SECTION: IMPORTS]
 from fastapi import APIRouter, Depends, HTTPException
-from typing import List, Dict, Optional
+from typing import List, Optional
-from ...dependencies import get_config_manager, get_scheduler_service
+from ...dependencies import get_config_manager, get_scheduler_service, has_permission
 from ...core.superset_client import SupersetClient
 from pydantic import BaseModel, Field
-from ...core.config_models import Environment as EnvModel
 from ...core.logger import belief_scope
 # [/SECTION]

@@ -23,7 +23,7 @@ router = APIRouter()
 # [DEF:ScheduleSchema:DataClass]
 class ScheduleSchema(BaseModel):
     enabled: bool = False
-    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
+    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){4,6})$')
 # [/DEF:ScheduleSchema:DataClass]

 # [DEF:EnvironmentResponse:DataClass]
@@ -47,7 +47,10 @@ class DatabaseResponse(BaseModel):
 # @POST: Returns a list of EnvironmentResponse objects.
 # @RETURN: List[EnvironmentResponse]
 @router.get("", response_model=List[EnvironmentResponse])
-async def get_environments(config_manager=Depends(get_config_manager)):
+async def get_environments(
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("environments", "READ"))
+):
     with belief_scope("get_environments"):
         envs = config_manager.get_environments()
         # Ensure envs is a list
@@ -61,7 +64,7 @@
                 backup_schedule=ScheduleSchema(
                     enabled=e.backup_schedule.enabled,
                     cron_expression=e.backup_schedule.cron_expression
-                ) if e.backup_schedule else None
+                ) if getattr(e, 'backup_schedule', None) else None
             ) for e in envs
         ]
 # [/DEF:get_environments:Function]
@@ -77,7 +80,8 @@ async def update_environment_schedule(
     id: str,
     schedule: ScheduleSchema,
     config_manager=Depends(get_config_manager),
-    scheduler_service=Depends(get_scheduler_service)
+    scheduler_service=Depends(get_scheduler_service),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
 ):
     with belief_scope("update_environment_schedule", f"id={id}"):
         envs = config_manager.get_environments()
@@ -104,7 +108,11 @@ async def update_environment_schedule(
 # @PARAM: id (str) - The environment ID.
 # @RETURN: List[Dict] - List of databases.
 @router.get("/{id}/databases")
-async def get_environment_databases(id: str, config_manager=Depends(get_config_manager)):
+async def get_environment_databases(
+    id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_environment_databases", f"id={id}"):
         envs = config_manager.get_environments()
         env = next((e for e in envs if e.id == id), None)
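
The quantifier change in ScheduleSchema ({5,7} to {4,6}) is easiest to see with a quick check against the new pattern. A minimal sketch, assuming only Python's standard re module; it demonstrates the observable effect of the new quantifier, not the stated intent of the change (the diff gives none):

# Illustrative check of the tightened ScheduleSchema quantifier ({5,7} -> {4,6}).
import re

CRON_PATTERN = (
    r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))'
    r'|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){4,6})$'
)

for expr in ("@daily", "0 2 * * *", "1-5 * * * *", "0 0 12 1 1 1 2025"):
    print(expr, "->", bool(re.fullmatch(CRON_PATTERN, expr)))
# Standard 5-field crons still match; the 7-field form no longer does under {4,6}.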

backend/src/api/routes/git.py (new file, 456 lines)
@@ -0,0 +1,456 @@
# [DEF:backend.src.api.routes.git:Module]
#
# @TIER: STANDARD
# @SEMANTICS: git, routes, api, fastapi, repository, deployment
# @PURPOSE: Provides FastAPI endpoints for Git integration operations.
# @LAYER: API
# @RELATION: USES -> src.services.git_service.GitService
# @RELATION: USES -> src.api.routes.git_schemas
# @RELATION: USES -> src.models.git
#
# @INVARIANT: All Git operations must be routed through GitService.

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List, Optional
import typing
from src.dependencies import get_config_manager, has_permission
from src.core.database import get_db
from src.models.git import GitServerConfig, GitRepository
from src.api.routes.git_schemas import (
    GitServerConfigSchema, GitServerConfigCreate,
    BranchSchema, BranchCreate,
    BranchCheckout, CommitSchema, CommitCreate,
    DeploymentEnvironmentSchema, DeployRequest, RepoInitRequest
)
from src.services.git_service import GitService
from src.core.logger import logger, belief_scope

router = APIRouter(prefix="/api/git", tags=["git"])
git_service = GitService()

# [DEF:get_git_configs:Function]
# @PURPOSE: List all configured Git servers.
# @PRE: Database session `db` is available.
# @POST: Returns a list of all GitServerConfig objects from the database.
# @RETURN: List[GitServerConfigSchema]
@router.get("/config", response_model=List[GitServerConfigSchema])
async def get_git_configs(
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("get_git_configs"):
        return db.query(GitServerConfig).all()
# [/DEF:get_git_configs:Function]

# [DEF:create_git_config:Function]
# @PURPOSE: Register a new Git server configuration.
# @PRE: `config` contains valid GitServerConfigCreate data.
# @POST: A new GitServerConfig record is created in the database.
# @PARAM: config (GitServerConfigCreate)
# @RETURN: GitServerConfigSchema
@router.post("/config", response_model=GitServerConfigSchema)
async def create_git_config(
    config: GitServerConfigCreate,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("create_git_config"):
        db_config = GitServerConfig(**config.dict())
        db.add(db_config)
        db.commit()
        db.refresh(db_config)
        return db_config
# [/DEF:create_git_config:Function]

# [DEF:delete_git_config:Function]
# @PURPOSE: Remove a Git server configuration.
# @PRE: `config_id` corresponds to an existing configuration.
# @POST: The configuration record is removed from the database.
# @PARAM: config_id (str)
@router.delete("/config/{config_id}")
async def delete_git_config(
    config_id: str,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("delete_git_config"):
        db_config = db.query(GitServerConfig).filter(GitServerConfig.id == config_id).first()
        if not db_config:
            raise HTTPException(status_code=404, detail="Configuration not found")

        db.delete(db_config)
        db.commit()
        return {"status": "success", "message": "Configuration deleted"}
# [/DEF:delete_git_config:Function]

# [DEF:test_git_config:Function]
# @PURPOSE: Validate connection to a Git server using provided credentials.
# @PRE: `config` contains provider, url, and pat.
# @POST: Returns success if the connection is validated via GitService.
# @PARAM: config (GitServerConfigCreate)
@router.post("/config/test")
async def test_git_config(
    config: GitServerConfigCreate,
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("test_git_config"):
        success = await git_service.test_connection(config.provider, config.url, config.pat)
        if success:
            return {"status": "success", "message": "Connection successful"}
        else:
            raise HTTPException(status_code=400, detail="Connection failed")
# [/DEF:test_git_config:Function]

# [DEF:init_repository:Function]
# @PURPOSE: Link a dashboard to a Git repository and perform initial clone/init.
# @PRE: `dashboard_id` exists and `init_data` contains valid config_id and remote_url.
# @POST: Repository is initialized on disk and a GitRepository record is saved in DB.
# @PARAM: dashboard_id (int)
# @PARAM: init_data (RepoInitRequest)
@router.post("/repositories/{dashboard_id}/init")
async def init_repository(
    dashboard_id: int,
    init_data: RepoInitRequest,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("init_repository"):
        # 1. Get config
        config = db.query(GitServerConfig).filter(GitServerConfig.id == init_data.config_id).first()
        if not config:
            raise HTTPException(status_code=404, detail="Git configuration not found")

        try:
            # 2. Perform Git clone/init
            logger.info(f"[init_repository][Action] Initializing repo for dashboard {dashboard_id}")
            git_service.init_repo(dashboard_id, init_data.remote_url, config.pat)

            # 3. Save to DB
            repo_path = git_service._get_repo_path(dashboard_id)
            db_repo = db.query(GitRepository).filter(GitRepository.dashboard_id == dashboard_id).first()
            if not db_repo:
                db_repo = GitRepository(
                    dashboard_id=dashboard_id,
                    config_id=config.id,
                    remote_url=init_data.remote_url,
                    local_path=repo_path
                )
                db.add(db_repo)
            else:
                db_repo.config_id = config.id
                db_repo.remote_url = init_data.remote_url
                db_repo.local_path = repo_path

            db.commit()
            logger.info(f"[init_repository][Coherence:OK] Repository initialized for dashboard {dashboard_id}")
            return {"status": "success", "message": "Repository initialized"}
        except Exception as e:
            db.rollback()
            logger.error(f"[init_repository][Coherence:Failed] Failed to init repository: {e}")
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:init_repository:Function]

# [DEF:get_branches:Function]
# @PURPOSE: List all branches for a dashboard's repository.
# @PRE: Repository for `dashboard_id` is initialized.
# @POST: Returns a list of branches from the local repository.
# @PARAM: dashboard_id (int)
# @RETURN: List[BranchSchema]
@router.get("/repositories/{dashboard_id}/branches", response_model=List[BranchSchema])
async def get_branches(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_branches"):
        try:
            return git_service.list_branches(dashboard_id)
        except Exception as e:
            raise HTTPException(status_code=404, detail=str(e))
# [/DEF:get_branches:Function]

# [DEF:create_branch:Function]
# @PURPOSE: Create a new branch in the dashboard's repository.
# @PRE: `dashboard_id` repository exists and `branch_data` has name and from_branch.
# @POST: A new branch is created in the local repository.
# @PARAM: dashboard_id (int)
# @PARAM: branch_data (BranchCreate)
@router.post("/repositories/{dashboard_id}/branches")
async def create_branch(
    dashboard_id: int,
    branch_data: BranchCreate,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("create_branch"):
        try:
            git_service.create_branch(dashboard_id, branch_data.name, branch_data.from_branch)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:create_branch:Function]

# [DEF:checkout_branch:Function]
# @PURPOSE: Switch the dashboard's repository to a specific branch.
# @PRE: `dashboard_id` repository exists and branch `checkout_data.name` exists.
# @POST: The local repository HEAD is moved to the specified branch.
# @PARAM: dashboard_id (int)
# @PARAM: checkout_data (BranchCheckout)
@router.post("/repositories/{dashboard_id}/checkout")
async def checkout_branch(
    dashboard_id: int,
    checkout_data: BranchCheckout,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("checkout_branch"):
        try:
            git_service.checkout_branch(dashboard_id, checkout_data.name)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:checkout_branch:Function]

# [DEF:commit_changes:Function]
# @PURPOSE: Stage and commit changes in the dashboard's repository.
# @PRE: `dashboard_id` repository exists and `commit_data` has message and files.
# @POST: Specified files are staged and a new commit is created.
# @PARAM: dashboard_id (int)
# @PARAM: commit_data (CommitCreate)
@router.post("/repositories/{dashboard_id}/commit")
async def commit_changes(
    dashboard_id: int,
    commit_data: CommitCreate,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("commit_changes"):
        try:
            git_service.commit_changes(dashboard_id, commit_data.message, commit_data.files)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:commit_changes:Function]

# [DEF:push_changes:Function]
# @PURPOSE: Push local commits to the remote repository.
# @PRE: `dashboard_id` repository exists and has a remote configured.
# @POST: Local commits are pushed to the remote repository.
# @PARAM: dashboard_id (int)
@router.post("/repositories/{dashboard_id}/push")
async def push_changes(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("push_changes"):
        try:
            git_service.push_changes(dashboard_id)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:push_changes:Function]

# [DEF:pull_changes:Function]
# @PURPOSE: Pull changes from the remote repository.
# @PRE: `dashboard_id` repository exists and has a remote configured.
# @POST: Remote changes are fetched and merged into the local branch.
# @PARAM: dashboard_id (int)
@router.post("/repositories/{dashboard_id}/pull")
async def pull_changes(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("pull_changes"):
        try:
            git_service.pull_changes(dashboard_id)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:pull_changes:Function]

# [DEF:sync_dashboard:Function]
# @PURPOSE: Sync dashboard state from Superset to Git using the GitPlugin.
# @PRE: `dashboard_id` is valid; GitPlugin is available.
# @POST: Dashboard YAMLs are exported from Superset and committed to Git.
# @PARAM: dashboard_id (int)
# @PARAM: source_env_id (Optional[str])
@router.post("/repositories/{dashboard_id}/sync")
async def sync_dashboard(
    dashboard_id: int,
    source_env_id: typing.Optional[str] = None,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("sync_dashboard"):
        try:
            from src.plugins.git_plugin import GitPlugin
            plugin = GitPlugin()
            return await plugin.execute({
                "operation": "sync",
                "dashboard_id": dashboard_id,
                "source_env_id": source_env_id
            })
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:sync_dashboard:Function]

# [DEF:get_environments:Function]
# @PURPOSE: List all deployment environments.
# @PRE: Config manager is accessible.
# @POST: Returns a list of DeploymentEnvironmentSchema objects.
# @RETURN: List[DeploymentEnvironmentSchema]
@router.get("/environments", response_model=List[DeploymentEnvironmentSchema])
async def get_environments(
    config_manager=Depends(get_config_manager),
    _ = Depends(has_permission("environments", "READ"))
):
    with belief_scope("get_environments"):
        envs = config_manager.get_environments()
        return [
            DeploymentEnvironmentSchema(
                id=e.id,
                name=e.name,
                superset_url=e.url,
                is_active=True
            ) for e in envs
        ]
# [/DEF:get_environments:Function]

# [DEF:deploy_dashboard:Function]
# @PURPOSE: Deploy dashboard from Git to a target environment.
# @PRE: `dashboard_id` and `deploy_data.environment_id` are valid.
# @POST: Dashboard YAMLs are read from Git and imported into the target Superset.
# @PARAM: dashboard_id (int)
# @PARAM: deploy_data (DeployRequest)
@router.post("/repositories/{dashboard_id}/deploy")
async def deploy_dashboard(
    dashboard_id: int,
    deploy_data: DeployRequest,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("deploy_dashboard"):
        try:
            from src.plugins.git_plugin import GitPlugin
            plugin = GitPlugin()
            return await plugin.execute({
                "operation": "deploy",
                "dashboard_id": dashboard_id,
                "environment_id": deploy_data.environment_id
            })
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:deploy_dashboard:Function]

# [DEF:get_history:Function]
# @PURPOSE: View commit history for a dashboard's repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns a list of recent commits from the repository.
# @PARAM: dashboard_id (int)
# @PARAM: limit (int)
# @RETURN: List[CommitSchema]
@router.get("/repositories/{dashboard_id}/history", response_model=List[CommitSchema])
async def get_history(
    dashboard_id: int,
    limit: int = 50,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_history"):
        try:
            return git_service.get_commit_history(dashboard_id, limit)
        except Exception as e:
            raise HTTPException(status_code=404, detail=str(e))
# [/DEF:get_history:Function]

# [DEF:get_repository_status:Function]
# @PURPOSE: Get current Git status for a dashboard repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns the status of the working directory (staged, unstaged, untracked).
# @PARAM: dashboard_id (int)
# @RETURN: dict
@router.get("/repositories/{dashboard_id}/status")
async def get_repository_status(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_repository_status"):
        try:
            return git_service.get_status(dashboard_id)
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:get_repository_status:Function]

# [DEF:get_repository_diff:Function]
# @PURPOSE: Get Git diff for a dashboard repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns the diff text for the specified file or all changes.
# @PARAM: dashboard_id (int)
# @PARAM: file_path (Optional[str])
# @PARAM: staged (bool)
# @RETURN: str
@router.get("/repositories/{dashboard_id}/diff")
async def get_repository_diff(
    dashboard_id: int,
    file_path: Optional[str] = None,
    staged: bool = False,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_repository_diff"):
        try:
            diff_text = git_service.get_diff(dashboard_id, file_path, staged)
            return diff_text
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:get_repository_diff:Function]

# [DEF:generate_commit_message:Function]
# @PURPOSE: Generate a suggested commit message using LLM.
# @PRE: Repository for `dashboard_id` is initialized.
# @POST: Returns a suggested commit message string.
@router.post("/repositories/{dashboard_id}/generate-message")
async def generate_commit_message(
    dashboard_id: int,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("generate_commit_message"):
        try:
            # 1. Get Diff
            diff = git_service.get_diff(dashboard_id, staged=True)
            if not diff:
                diff = git_service.get_diff(dashboard_id, staged=False)

            if not diff:
                return {"message": "No changes detected"}

            # 2. Get History
            history_objs = git_service.get_commit_history(dashboard_id, limit=5)
            history = [h.message for h in history_objs if hasattr(h, 'message')]

            # 3. Get LLM Client
            from ...services.llm_provider import LLMProviderService
            from ...plugins.llm_analysis.service import LLMClient
            from ...plugins.llm_analysis.models import LLMProviderType

            llm_service = LLMProviderService(db)
            providers = llm_service.get_all_providers()
            provider = next((p for p in providers if p.is_active), None)

            if not provider:
                raise HTTPException(status_code=400, detail="No active LLM provider found")

            api_key = llm_service.get_decrypted_api_key(provider.id)
            client = LLMClient(
                provider_type=LLMProviderType(provider.provider_type),
                api_key=api_key,
                base_url=provider.base_url,
                default_model=provider.default_model
            )

            # 4. Generate Message
            from ...plugins.git.llm_extension import GitLLMExtension
            extension = GitLLMExtension(client)
            message = await extension.suggest_commit_message(diff, history)

            return {"message": message}
        except Exception as e:
            logger.error(f"Failed to generate commit message: {e}")
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:generate_commit_message:Function]

# [/DEF:backend.src.api.routes.git:Module]
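
An end-to-end walk of the new /api/git endpoints might look like the sketch below. It is illustrative only: the base URL, token, dashboard id 42, config id, and remote URL are all assumptions, not values from this changeset.

# Illustrative roundtrip (not part of the diff): init -> commit -> push -> status.
import asyncio

import httpx

async def git_roundtrip() -> None:
    headers = {"Authorization": "Bearer <token>"}  # placeholder
    async with httpx.AsyncClient(
        base_url="http://localhost:8000/api/git", headers=headers
    ) as client:
        # Link dashboard 42 to a repo; config_id must reference a saved server config.
        await client.post("/repositories/42/init", json={
            "config_id": "<config-id>",  # placeholder
            "remote_url": "https://git.example.com/org/dashboards.git",  # hypothetical
        })
        # Stage and commit one tracked file, then push.
        await client.post("/repositories/42/commit", json={
            "message": "Update dashboard YAML",
            "files": ["dashboard.yaml"],
        })
        await client.post("/repositories/42/push")
        status = await client.get("/repositories/42/status")
        print(status.json())  # staged / unstaged / untracked summary

asyncio.run(git_roundtrip())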

backend/src/api/routes/git_schemas.py (new file, 144 lines)
@@ -0,0 +1,144 @@
# [DEF:backend.src.api.routes.git_schemas:Module]
#
# @TIER: STANDARD
# @SEMANTICS: git, schemas, pydantic, api, contracts
# @PURPOSE: Defines Pydantic models for the Git integration API layer.
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.models.git
#
# @INVARIANT: All schemas must be compatible with the FastAPI router.

from pydantic import BaseModel, Field
from typing import List, Optional
from datetime import datetime
from src.models.git import GitProvider, GitStatus, SyncStatus

# [DEF:GitServerConfigBase:Class]
# @TIER: TRIVIAL
# @PURPOSE: Base schema for Git server configuration attributes.
class GitServerConfigBase(BaseModel):
    name: str = Field(..., description="Display name for the Git server")
    provider: GitProvider = Field(..., description="Git provider (GITHUB, GITLAB, GITEA)")
    url: str = Field(..., description="Server base URL")
    pat: str = Field(..., description="Personal Access Token")
    default_repository: Optional[str] = Field(None, description="Default repository path (org/repo)")
# [/DEF:GitServerConfigBase:Class]

# [DEF:GitServerConfigCreate:Class]
# @PURPOSE: Schema for creating a new Git server configuration.
class GitServerConfigCreate(GitServerConfigBase):
    """Schema for creating a new Git server configuration."""
    pass
# [/DEF:GitServerConfigCreate:Class]

# [DEF:GitServerConfigSchema:Class]
# @PURPOSE: Schema for representing a Git server configuration with metadata.
class GitServerConfigSchema(GitServerConfigBase):
    """Schema for representing a Git server configuration with metadata."""
    id: str
    status: GitStatus
    last_validated: datetime

    class Config:
        from_attributes = True
# [/DEF:GitServerConfigSchema:Class]

# [DEF:GitRepositorySchema:Class]
# @PURPOSE: Schema for tracking a local Git repository linked to a dashboard.
class GitRepositorySchema(BaseModel):
    """Schema for tracking a local Git repository linked to a dashboard."""
    id: str
    dashboard_id: int
    config_id: str
    remote_url: str
    local_path: str
    current_branch: str
    sync_status: SyncStatus

    class Config:
        from_attributes = True
# [/DEF:GitRepositorySchema:Class]

# [DEF:BranchSchema:Class]
# @PURPOSE: Schema for representing Git branch metadata.
class BranchSchema(BaseModel):
    """Schema for representing a Git branch."""
    name: str
    commit_hash: str
    is_remote: bool
    last_updated: datetime
# [/DEF:BranchSchema:Class]

# [DEF:CommitSchema:Class]
# @PURPOSE: Schema for representing Git commit details.
class CommitSchema(BaseModel):
    """Schema for representing a Git commit."""
    hash: str
    author: str
    email: str
    timestamp: datetime
    message: str
    files_changed: List[str]
# [/DEF:CommitSchema:Class]

# [DEF:BranchCreate:Class]
# @PURPOSE: Schema for branch creation requests.
class BranchCreate(BaseModel):
    """Schema for branch creation requests."""
    name: str
    from_branch: str
# [/DEF:BranchCreate:Class]

# [DEF:BranchCheckout:Class]
# @PURPOSE: Schema for branch checkout requests.
class BranchCheckout(BaseModel):
    """Schema for branch checkout requests."""
    name: str
# [/DEF:BranchCheckout:Class]

# [DEF:CommitCreate:Class]
# @PURPOSE: Schema for staging and committing changes.
class CommitCreate(BaseModel):
    """Schema for staging and committing changes."""
    message: str
    files: List[str]
# [/DEF:CommitCreate:Class]

# [DEF:ConflictResolution:Class]
# @PURPOSE: Schema for resolving merge conflicts.
class ConflictResolution(BaseModel):
    """Schema for resolving merge conflicts."""
    file_path: str
    resolution: str = Field(pattern="^(mine|theirs|manual)$")
    content: Optional[str] = None
# [/DEF:ConflictResolution:Class]

# [DEF:DeploymentEnvironmentSchema:Class]
# @PURPOSE: Schema for representing a target deployment environment.
class DeploymentEnvironmentSchema(BaseModel):
    """Schema for representing a target deployment environment."""
    id: str
    name: str
    superset_url: str
    is_active: bool

    class Config:
        from_attributes = True
# [/DEF:DeploymentEnvironmentSchema:Class]

# [DEF:DeployRequest:Class]
# @PURPOSE: Schema for dashboard deployment requests.
class DeployRequest(BaseModel):
    """Schema for deployment requests."""
    environment_id: str
# [/DEF:DeployRequest:Class]

# [DEF:RepoInitRequest:Class]
# @PURPOSE: Schema for repository initialization requests.
class RepoInitRequest(BaseModel):
    """Schema for repository initialization requests."""
    config_id: str
    remote_url: str
# [/DEF:RepoInitRequest:Class]

# [/DEF:backend.src.api.routes.git_schemas:Module]
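
A quick validation sketch for these schemas follows; it only exercises behavior visible in the definitions above and assumes it is run from a context where the project's `src` package is importable.

# Illustrative sketch (not part of the diff): exercising two of the new schemas.
from pydantic import ValidationError

from src.api.routes.git_schemas import CommitCreate, ConflictResolution

payload = CommitCreate(message="Update chart queries", files=["charts/sales.yaml"])
print(payload)

try:
    # "ours" is not in the ^(mine|theirs|manual)$ pattern, so validation fails.
    ConflictResolution(file_path="dashboard.yaml", resolution="ours")
except ValidationError as exc:
    print(f"rejected as expected: {exc.errors()[0]['msg']}")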

backend/src/api/routes/llm.py (new file, 207 lines)
@@ -0,0 +1,207 @@
# [DEF:backend/src/api/routes/llm.py:Module]
# @TIER: STANDARD
# @SEMANTICS: api, routes, llm
# @PURPOSE: API routes for LLM provider configuration and management.
# @LAYER: UI (API)

from fastapi import APIRouter, Depends, HTTPException, status
from typing import List
from ...core.logger import logger
from ...schemas.auth import User
from ...dependencies import get_current_user as get_current_active_user
from ...plugins.llm_analysis.models import LLMProviderConfig, LLMProviderType
from ...services.llm_provider import LLMProviderService
from ...core.database import get_db
from sqlalchemy.orm import Session

# [DEF:router:Global]
# @PURPOSE: APIRouter instance for LLM routes.
router = APIRouter(prefix="/api/llm", tags=["LLM"])
# [/DEF:router:Global]

# [DEF:get_providers:Function]
# @PURPOSE: Retrieve all LLM provider configurations.
# @PRE: User is authenticated.
# @POST: Returns list of LLMProviderConfig.
@router.get("/providers", response_model=List[LLMProviderConfig])
async def get_providers(
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """Get all LLM provider configurations."""
    logger.info(f"[llm_routes][get_providers][Action] Fetching providers for user: {current_user.username}")
    service = LLMProviderService(db)
    providers = service.get_all_providers()
    return [
        LLMProviderConfig(
            id=p.id,
            provider_type=LLMProviderType(p.provider_type),
            name=p.name,
            base_url=p.base_url,
            api_key="********",
            default_model=p.default_model,
            is_active=p.is_active
        ) for p in providers
    ]
# [/DEF:get_providers:Function]

# [DEF:create_provider:Function]
# @PURPOSE: Create a new LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns the created LLMProviderConfig.
@router.post("/providers", response_model=LLMProviderConfig, status_code=status.HTTP_201_CREATED)
async def create_provider(
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """Create a new LLM provider configuration."""
    service = LLMProviderService(db)
    provider = service.create_provider(config)
    return LLMProviderConfig(
        id=provider.id,
        provider_type=LLMProviderType(provider.provider_type),
        name=provider.name,
        base_url=provider.base_url,
        api_key="********",
        default_model=provider.default_model,
        is_active=provider.is_active
    )
# [/DEF:create_provider:Function]

# [DEF:update_provider:Function]
# @PURPOSE: Update an existing LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns the updated LLMProviderConfig.
@router.put("/providers/{provider_id}", response_model=LLMProviderConfig)
async def update_provider(
    provider_id: str,
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """Update an existing LLM provider configuration."""
    service = LLMProviderService(db)
    provider = service.update_provider(provider_id, config)
    if not provider:
        raise HTTPException(status_code=404, detail="Provider not found")

    return LLMProviderConfig(
        id=provider.id,
        provider_type=LLMProviderType(provider.provider_type),
        name=provider.name,
        base_url=provider.base_url,
        api_key="********",
        default_model=provider.default_model,
        is_active=provider.is_active
    )
# [/DEF:update_provider:Function]

# [DEF:delete_provider:Function]
# @PURPOSE: Delete an LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns success status.
@router.delete("/providers/{provider_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_provider(
    provider_id: str,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """Delete an LLM provider configuration."""
    service = LLMProviderService(db)
    if not service.delete_provider(provider_id):
        raise HTTPException(status_code=404, detail="Provider not found")
    return
# [/DEF:delete_provider:Function]

# [DEF:test_connection:Function]
# @PURPOSE: Test connection to an LLM provider.
# @PRE: User is authenticated.
# @POST: Returns success status and message.
@router.post("/providers/{provider_id}/test")
async def test_connection(
    provider_id: str,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """Test connection to an LLM provider."""
    logger.info(f"[llm_routes][test_connection][Action] Testing connection for provider_id: {provider_id}")
    from ...plugins.llm_analysis.service import LLMClient
    service = LLMProviderService(db)
    db_provider = service.get_provider(provider_id)
    if not db_provider:
        raise HTTPException(status_code=404, detail="Provider not found")

    api_key = service.get_decrypted_api_key(provider_id)

    # Check if API key was successfully decrypted
    if not api_key:
        logger.error(f"[llm_routes][test_connection] Failed to decrypt API key for provider {provider_id}")
        raise HTTPException(
            status_code=500,
            detail="Failed to decrypt API key. The provider may have been encrypted with a different encryption key. Please update the provider with a new API key."
        )

    client = LLMClient(
        provider_type=LLMProviderType(db_provider.provider_type),
        api_key=api_key,
        base_url=db_provider.base_url,
        default_model=db_provider.default_model
    )

    try:
        # Simple test call
        await client.client.models.list()
        return {"success": True, "message": "Connection successful"}
    except Exception as e:
        return {"success": False, "error": str(e)}
# [/DEF:test_connection:Function]

# [DEF:test_provider_config:Function]
# @PURPOSE: Test connection with a provided configuration (not yet saved).
# @PRE: User is authenticated.
# @POST: Returns success status and message.
@router.post("/providers/test")
async def test_provider_config(
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user)
):
    """Test connection with a provided configuration."""
    from ...plugins.llm_analysis.service import LLMClient
    logger.info(f"[llm_routes][test_provider_config][Action] Testing config for {config.name}")

    # Check if API key is provided
    if not config.api_key or config.api_key == "********":
        raise HTTPException(
            status_code=400,
            detail="API key is required for testing connection"
        )

    client = LLMClient(
        provider_type=config.provider_type,
        api_key=config.api_key,
        base_url=config.base_url,
        default_model=config.default_model
    )

    try:
        # Simple test call
        await client.client.models.list()
        return {"success": True, "message": "Connection successful"}
    except Exception as e:
        return {"success": False, "error": str(e)}
# [/DEF:test_provider_config:Function]

# [/DEF:backend/src/api/routes/llm.py:Module]
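
Testing an unsaved provider config through the new endpoint might look like the sketch below. It is illustrative only: the base URL, token, "openai" provider type, and model name are assumptions; the field names mirror the LLMProviderConfig usage visible above, and the route rejects the "********" mask, so a real key must be supplied.

# Illustrative sketch (not part of the diff): POST /api/llm/providers/test.
import asyncio

import httpx

async def test_openai_config() -> None:
    config = {
        "id": "tmp",                      # hypothetical placeholder id
        "provider_type": "openai",        # assumed LLMProviderType value
        "name": "OpenAI (trial)",
        "base_url": "https://api.openai.com/v1",
        "api_key": "sk-<real-key-here>",  # the "********" mask is rejected
        "default_model": "gpt-4o-mini",   # hypothetical model name
        "is_active": True,
    }
    async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
        resp = await client.post(
            "/api/llm/providers/test",
            json=config,
            headers={"Authorization": "Bearer <token>"},  # placeholder
        )
        print(resp.json())  # {"success": true/false, ...}

asyncio.run(test_openai_config())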
@@ -1,5 +1,6 @@
 # [DEF:backend.src.api.routes.mappings:Module]
 #
+# @TIER: STANDARD
 # @SEMANTICS: api, mappings, database, fuzzy-matching
 # @PURPOSE: API endpoints for managing database mappings and getting suggestions.
 # @LAYER: API
@@ -13,7 +14,8 @@
 from fastapi import APIRouter, Depends, HTTPException
 from sqlalchemy.orm import Session
 from typing import List, Optional
-from ...dependencies import get_config_manager
+from ...core.logger import belief_scope
+from ...dependencies import get_config_manager, has_permission
 from ...core.database import get_db
 from ...models.mapping import DatabaseMapping
 from pydantic import BaseModel
@@ -59,7 +61,8 @@ class SuggestRequest(BaseModel):
 async def get_mappings(
     source_env_id: Optional[str] = None,
     target_env_id: Optional[str] = None,
-    db: Session = Depends(get_db)
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
 ):
     with belief_scope("get_mappings"):
         query = db.query(DatabaseMapping)
@@ -75,7 +78,11 @@ async def get_mappings(
 # @PRE: mapping is valid MappingCreate, db session is injected.
 # @POST: DatabaseMapping created or updated in database.
 @router.post("", response_model=MappingResponse)
-async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
+async def create_mapping(
+    mapping: MappingCreate,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
+):
     with belief_scope("create_mapping"):
         # Check if mapping already exists
         existing = db.query(DatabaseMapping).filter(
@@ -105,10 +112,11 @@ async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
 @router.post("/suggest")
 async def suggest_mappings_api(
     request: SuggestRequest,
-    config_manager=Depends(get_config_manager)
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
 ):
     with belief_scope("suggest_mappings_api"):
-        from backend.src.services.mapping_service import MappingService
+        from ...services.mapping_service import MappingService
         service = MappingService(config_manager)
         try:
             return await service.get_suggestions(request.source_env_id, request.target_env_id)
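
Requesting fuzzy-match suggestions from this router might look like the sketch below. It is illustrative only: the /api/mappings mount point is an assumption (only the relative paths "" and "/suggest" are visible in this file), as are the base URL and token.

# Illustrative sketch (not part of the diff): POST .../suggest for mapping hints.
import asyncio

import httpx

async def suggest(source_env_id: str, target_env_id: str) -> None:
    async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
        resp = await client.post(
            "/api/mappings/suggest",  # assumed mount point
            json={"source_env_id": source_env_id, "target_env_id": target_env_id},
            headers={"Authorization": "Bearer <token>"},  # placeholder
        )
        print(resp.json())

asyncio.run(suggest("dev", "prod"))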
@@ -1,4 +1,5 @@
 # [DEF:backend.src.api.routes.migration:Module]
+# @TIER: STANDARD
 # @SEMANTICS: api, migration, dashboards
 # @PURPOSE: API endpoints for migration operations.
 # @LAYER: API
@@ -6,10 +7,11 @@
 # @RELATION: DEPENDS_ON -> backend.src.models.dashboard

 from fastapi import APIRouter, Depends, HTTPException
-from typing import List, Dict
+from typing import List
-from ...dependencies import get_config_manager, get_task_manager
+from ...dependencies import get_config_manager, get_task_manager, has_permission
 from ...models.dashboard import DashboardMetadata, DashboardSelection
 from ...core.superset_client import SupersetClient
+from ...core.logger import belief_scope

 router = APIRouter(prefix="/api", tags=["migration"])

@@ -20,8 +22,13 @@ router = APIRouter(prefix="/api", tags=["migration"])
 # @PARAM: env_id (str) - The ID of the environment to fetch from.
 # @RETURN: List[DashboardMetadata]
 @router.get("/environments/{env_id}/dashboards", response_model=List[DashboardMetadata])
-async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)):
-    environments = config_manager.get_environments()
+async def get_dashboards(
+    env_id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
+    with belief_scope("get_dashboards", f"env_id={env_id}"):
+        environments = config_manager.get_environments()
         env = next((e for e in environments if e.id == env_id), None)
         if not env:
             raise HTTPException(status_code=404, detail="Environment not found")
@@ -38,9 +45,15 @@ async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)
 # @PARAM: selection (DashboardSelection) - The dashboards to migrate.
 # @RETURN: Dict - {"task_id": str, "message": str}
 @router.post("/migration/execute")
-async def execute_migration(selection: DashboardSelection, config_manager=Depends(get_config_manager), task_manager=Depends(get_task_manager)):
-    # Validate environments exist
-    environments = config_manager.get_environments()
+async def execute_migration(
+    selection: DashboardSelection,
+    config_manager=Depends(get_config_manager),
+    task_manager=Depends(get_task_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
+    with belief_scope("execute_migration"):
+        # Validate environments exist
+        environments = config_manager.get_environments()
         env_ids = {e.id for e in environments}
         if selection.source_env_id not in env_ids or selection.target_env_id not in env_ids:
             raise HTTPException(status_code=400, detail="Invalid source or target environment")
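
Kicking off a migration through this router might look like the sketch below. It is illustrative only: the base URL and token are assumptions, and the DashboardSelection payload fields beyond source_env_id and target_env_id (the "dashboard_ids" list here) are hypothetical, since the model body is not shown in this diff.

# Illustrative sketch (not part of the diff): list dashboards, then migrate one.
import asyncio

import httpx

async def migrate() -> None:
    headers = {"Authorization": "Bearer <token>"}  # placeholder
    async with httpx.AsyncClient(base_url="http://localhost:8000/api") as client:
        boards = (await client.get("/environments/dev/dashboards", headers=headers)).json()
        resp = await client.post("/migration/execute", headers=headers, json={
            "source_env_id": "dev",
            "target_env_id": "prod",
            "dashboard_ids": [d["id"] for d in boards[:1]],  # hypothetical field name
        })
        print(resp.json())  # expected shape: {"task_id": ..., "message": ...}

asyncio.run(migrate())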
@@ -1,4 +1,5 @@
 # [DEF:PluginsRouter:Module]
+# @TIER: STANDARD
 # @SEMANTICS: api, router, plugins, list
 # @PURPOSE: Defines the FastAPI router for plugin-related endpoints, allowing clients to list available plugins.
 # @LAYER: UI (API)
@@ -7,7 +8,7 @@ from typing import List
 from fastapi import APIRouter, Depends

 from ...core.plugin_base import PluginConfig
-from ...dependencies import get_plugin_loader
+from ...dependencies import get_plugin_loader, has_permission
 from ...core.logger import belief_scope

 router = APIRouter()
@@ -19,7 +20,8 @@ router = APIRouter()
 # @RETURN: List[PluginConfig] - List of registered plugins.
 @router.get("", response_model=List[PluginConfig])
 async def list_plugins(
-    plugin_loader = Depends(get_plugin_loader)
+    plugin_loader = Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugins", "READ"))
 ):
     with belief_scope("list_plugins"):
         """
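
Nearly every route in this changeset gains a `_ = Depends(has_permission(resource, action))` guard. The factory itself lives in backend.src.dependencies and does not appear in this diff, so the following is only a hedged sketch of the usual shape of such a FastAPI dependency factory, with stub helpers standing in for the real auth plumbing:

# Hedged sketch of a has_permission-style dependency factory. The real
# implementation is NOT shown in this diff; the helpers below are stand-ins.
from fastapi import Depends, HTTPException, status

async def get_current_user() -> dict:
    # Stub (assumption): the project resolves this from the auth.db session.
    return {"username": "demo", "permissions": {("plugins", "READ")}}

def has_permission(resource: str, action: str):
    """Dependency factory: returns a per-request guard for one resource/action."""
    async def checker(current_user: dict = Depends(get_current_user)) -> bool:
        if (resource, action) not in current_user["permissions"]:
            raise HTTPException(
                status_code=status.HTTP_403_FORBIDDEN,
                detail=f"Missing {action} permission on {resource}",
            )
        return True
    return checker

FastAPI resolves the returned `checker` on each request and the route discards its value, which is why the call sites bind it to `_`.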
@@ -12,14 +12,24 @@
 # [SECTION: IMPORTS]
 from fastapi import APIRouter, Depends, HTTPException
 from typing import List
-from ...core.config_models import AppConfig, Environment, GlobalSettings
-from ...dependencies import get_config_manager
+from pydantic import BaseModel
+from ...core.config_models import AppConfig, Environment, GlobalSettings, LoggingConfig
+from ...models.storage import StorageConfig
+from ...dependencies import get_config_manager, has_permission
 from ...core.config_manager import ConfigManager
 from ...core.logger import logger, belief_scope
 from ...core.superset_client import SupersetClient
-import os
 # [/SECTION]

+# [DEF:LoggingConfigResponse:Class]
+# @PURPOSE: Response model for logging configuration with current task log level.
+# @SEMANTICS: logging, config, response
+class LoggingConfigResponse(BaseModel):
+    level: str
+    task_log_level: str
+    enable_belief_state: bool
+# [/DEF:LoggingConfigResponse:Class]
+
 router = APIRouter()

 # [DEF:get_settings:Function]
@@ -28,7 +38,10 @@ router = APIRouter()
 # @POST: Returns masked AppConfig.
 # @RETURN: AppConfig - The current configuration.
 @router.get("", response_model=AppConfig)
-async def get_settings(config_manager: ConfigManager = Depends(get_config_manager)):
+async def get_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_settings"):
         logger.info("[get_settings][Entry] Fetching all settings")
         config = config_manager.get_config().copy(deep=True)
@@ -48,21 +61,60 @@ async def get_settings(config_manager: ConfigManager = Depends(get_config_manage
 @router.patch("/global", response_model=GlobalSettings)
 async def update_global_settings(
     settings: GlobalSettings,
-    config_manager: ConfigManager = Depends(get_config_manager)
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
 ):
     with belief_scope("update_global_settings"):
         logger.info("[update_global_settings][Entry] Updating global settings")

         config_manager.update_global_settings(settings)
         return settings
 # [/DEF:update_global_settings:Function]

+# [DEF:get_storage_settings:Function]
+# @PURPOSE: Retrieves storage-specific settings.
+# @RETURN: StorageConfig - The storage configuration.
+@router.get("/storage", response_model=StorageConfig)
+async def get_storage_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
+    with belief_scope("get_storage_settings"):
+        return config_manager.get_config().settings.storage
+# [/DEF:get_storage_settings:Function]
+
+# [DEF:update_storage_settings:Function]
+# @PURPOSE: Updates storage-specific settings.
+# @PARAM: storage (StorageConfig) - The new storage settings.
+# @POST: Storage settings are updated and saved.
+# @RETURN: StorageConfig - The updated storage settings.
+@router.put("/storage", response_model=StorageConfig)
+async def update_storage_settings(
+    storage: StorageConfig,
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
+    with belief_scope("update_storage_settings"):
+        is_valid, message = config_manager.validate_path(storage.root_path)
+        if not is_valid:
+            raise HTTPException(status_code=400, detail=message)
+
+        settings = config_manager.get_config().settings
+        settings.storage = storage
+        config_manager.update_global_settings(settings)
+        return config_manager.get_config().settings.storage
+# [/DEF:update_storage_settings:Function]
+
# [DEF:get_environments:Function]
|
# [DEF:get_environments:Function]
|
||||||
# @PURPOSE: Lists all configured Superset environments.
|
# @PURPOSE: Lists all configured Superset environments.
|
||||||
# @PRE: Config manager is available.
|
# @PRE: Config manager is available.
|
||||||
# @POST: Returns list of environments.
|
# @POST: Returns list of environments.
|
||||||
# @RETURN: List[Environment] - List of environments.
|
# @RETURN: List[Environment] - List of environments.
|
||||||
@router.get("/environments", response_model=List[Environment])
|
@router.get("/environments", response_model=List[Environment])
|
||||||
async def get_environments(config_manager: ConfigManager = Depends(get_config_manager)):
|
async def get_environments(
|
||||||
|
config_manager: ConfigManager = Depends(get_config_manager),
|
||||||
|
_ = Depends(has_permission("admin:settings", "READ"))
|
||||||
|
):
|
||||||
with belief_scope("get_environments"):
|
with belief_scope("get_environments"):
|
||||||
logger.info("[get_environments][Entry] Fetching environments")
|
logger.info("[get_environments][Entry] Fetching environments")
|
||||||
return config_manager.get_environments()
|
return config_manager.get_environments()
|
||||||
@@ -77,7 +129,8 @@ async def get_environments(config_manager: ConfigManager = Depends(get_config_ma
|
|||||||
@router.post("/environments", response_model=Environment)
|
@router.post("/environments", response_model=Environment)
|
||||||
async def add_environment(
|
async def add_environment(
|
||||||
env: Environment,
|
env: Environment,
|
||||||
config_manager: ConfigManager = Depends(get_config_manager)
|
config_manager: ConfigManager = Depends(get_config_manager),
|
||||||
|
_ = Depends(has_permission("admin:settings", "WRITE"))
|
||||||
):
|
):
|
||||||
with belief_scope("add_environment"):
|
with belief_scope("add_environment"):
|
||||||
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
logger.info(f"[add_environment][Entry] Adding environment {env.id}")
|
||||||
@@ -179,30 +232,83 @@ async def test_environment_connection(
|
|||||||
return {"status": "error", "message": str(e)}
|
return {"status": "error", "message": str(e)}
|
||||||
# [/DEF:test_environment_connection:Function]
|
# [/DEF:test_environment_connection:Function]
|
||||||
|
|
||||||
# [DEF:validate_backup_path:Function]
|
# [DEF:get_logging_config:Function]
|
||||||
# @PURPOSE: Validates if a backup path exists and is writable.
|
# @PURPOSE: Retrieves current logging configuration.
|
||||||
# @PRE: Path is provided in path_data.
|
# @PRE: Config manager is available.
|
||||||
# @POST: Returns success or error status.
|
# @POST: Returns logging configuration.
|
||||||
# @PARAM: path (str) - The path to validate.
|
# @RETURN: LoggingConfigResponse - The current logging config.
|
||||||
# @RETURN: dict - Validation result.
|
@router.get("/logging", response_model=LoggingConfigResponse)
|
||||||
@router.post("/validate-path")
|
async def get_logging_config(
|
||||||
async def validate_backup_path(
|
config_manager: ConfigManager = Depends(get_config_manager),
|
||||||
path_data: dict,
|
_ = Depends(has_permission("admin:settings", "READ"))
|
||||||
config_manager: ConfigManager = Depends(get_config_manager)
|
|
||||||
):
|
):
|
||||||
with belief_scope("validate_backup_path"):
|
with belief_scope("get_logging_config"):
|
||||||
path = path_data.get("path")
|
logging_config = config_manager.get_config().settings.logging
|
||||||
if not path:
|
return LoggingConfigResponse(
|
||||||
raise HTTPException(status_code=400, detail="Path is required")
|
level=logging_config.level,
|
||||||
|
task_log_level=logging_config.task_log_level,
|
||||||
|
enable_belief_state=logging_config.enable_belief_state
|
||||||
|
)
|
||||||
|
# [/DEF:get_logging_config:Function]
|
||||||
|
|
||||||
logger.info(f"[validate_backup_path][Entry] Validating path: {path}")
|
# [DEF:update_logging_config:Function]
|
||||||
|
# @PURPOSE: Updates logging configuration.
|
||||||
|
# @PRE: New logging config is provided.
|
||||||
|
# @POST: Logging configuration is updated and saved.
|
||||||
|
# @PARAM: config (LoggingConfig) - The new logging configuration.
|
||||||
|
# @RETURN: LoggingConfigResponse - The updated logging config.
|
||||||
|
@router.patch("/logging", response_model=LoggingConfigResponse)
|
||||||
|
async def update_logging_config(
|
||||||
|
config: LoggingConfig,
|
||||||
|
config_manager: ConfigManager = Depends(get_config_manager),
|
||||||
|
_ = Depends(has_permission("admin:settings", "WRITE"))
|
||||||
|
):
|
||||||
|
with belief_scope("update_logging_config"):
|
||||||
|
logger.info(f"[update_logging_config][Entry] Updating logging config: level={config.level}, task_log_level={config.task_log_level}")
|
||||||
|
|
||||||
valid, message = config_manager.validate_path(path)
|
# Get current settings and update logging config
|
||||||
|
settings = config_manager.get_config().settings
|
||||||
|
settings.logging = config
|
||||||
|
config_manager.update_global_settings(settings)
|
||||||
|
|
||||||
if not valid:
|
return LoggingConfigResponse(
|
||||||
return {"status": "error", "message": message}
|
level=config.level,
|
||||||
|
task_log_level=config.task_log_level,
|
||||||
|
enable_belief_state=config.enable_belief_state
|
||||||
|
)
|
||||||
|
# [/DEF:update_logging_config:Function]
|
||||||
|
|
||||||
return {"status": "success", "message": message}
|
# [DEF:ConsolidatedSettingsResponse:Class]
|
||||||
# [/DEF:validate_backup_path:Function]
|
class ConsolidatedSettingsResponse(BaseModel):
|
||||||
|
environments: List[dict]
|
||||||
|
connections: List[dict]
|
||||||
|
llm: dict
|
||||||
|
logging: dict
|
||||||
|
storage: dict
|
||||||
|
# [/DEF:ConsolidatedSettingsResponse:Class]
|
||||||
|
|
||||||
|
# [DEF:get_consolidated_settings:Function]
|
||||||
|
# @PURPOSE: Retrieves all settings categories in a single call
|
||||||
|
# @PRE: Config manager is available.
|
||||||
|
# @POST: Returns all consolidated settings.
|
||||||
|
# @RETURN: ConsolidatedSettingsResponse - All settings categories.
|
||||||
|
@router.get("/consolidated", response_model=ConsolidatedSettingsResponse)
|
||||||
|
async def get_consolidated_settings(
|
||||||
|
config_manager: ConfigManager = Depends(get_config_manager),
|
||||||
|
_ = Depends(has_permission("admin:settings", "READ"))
|
||||||
|
):
|
||||||
|
with belief_scope("get_consolidated_settings"):
|
||||||
|
logger.info("[get_consolidated_settings][Entry] Fetching all consolidated settings")
|
||||||
|
|
||||||
|
config = config_manager.get_config()
|
||||||
|
|
||||||
|
return ConsolidatedSettingsResponse(
|
||||||
|
environments=config.environments,
|
||||||
|
connections=config.settings.connections,
|
||||||
|
llm=config.settings.llm,
|
||||||
|
logging=config.settings.logging,
|
||||||
|
storage=config.settings.storage
|
||||||
|
)
|
||||||
|
# [/DEF:get_consolidated_settings:Function]
|
||||||
|
|
||||||
# [/DEF:SettingsRouter:Module]
|
# [/DEF:SettingsRouter:Module]
|
||||||
|
|||||||
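A minimal client-side sketch of the new consolidated settings endpoint (assumptions, not part of this commit: the backend runs at http://localhost:8000, the `requests` package is available, the API accepts a Bearer JWT, and TOKEN belongs to a user holding the admin:settings READ permission):

import requests

BASE = "http://localhost:8000"                      # assumed local dev address
TOKEN = "<jwt-from-the-auth-endpoint>"              # hypothetical admin token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}      # assumed Bearer-token scheme

# One call replaces separate requests to /environments, /logging, /storage, etc.
resp = requests.get(f"{BASE}/api/settings/consolidated", headers=HEADERS)
resp.raise_for_status()
data = resp.json()
print(data["logging"], data["storage"])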
backend/src/api/routes/storage.py (new file, 146 lines)
@@ -0,0 +1,146 @@
# [DEF:storage_routes:Module]
#
# @TIER: STANDARD
# @SEMANTICS: storage, files, upload, download, backup, repository
# @PURPOSE: API endpoints for file storage management (backups and repositories).
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.models.storage
#
# @INVARIANT: All paths must be validated against path traversal.

# [SECTION: IMPORTS]
from pathlib import Path
from fastapi import APIRouter, Depends, UploadFile, File, Form, HTTPException
from fastapi.responses import FileResponse
from typing import List, Optional
from ...models.storage import StoredFile, FileCategory
from ...dependencies import get_plugin_loader, has_permission
from ...plugins.storage.plugin import StoragePlugin
from ...core.logger import belief_scope
# [/SECTION]

router = APIRouter(tags=["storage"])

# [DEF:list_files:Function]
# @PURPOSE: List all files and directories in the storage system.
#
# @PRE: None.
# @POST: Returns a list of StoredFile objects.
#
# @PARAM: category (Optional[FileCategory]) - Filter by category.
# @PARAM: path (Optional[str]) - Subpath within the category.
# @RETURN: List[StoredFile] - List of files/directories.
#
# @RELATION: CALLS -> StoragePlugin.list_files
@router.get("/files", response_model=List[StoredFile])
async def list_files(
    category: Optional[FileCategory] = None,
    path: Optional[str] = None,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "READ"))
):
    with belief_scope("list_files"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        return storage_plugin.list_files(category, path)
# [/DEF:list_files:Function]

# [DEF:upload_file:Function]
# @PURPOSE: Upload a file to the storage system.
#
# @PRE: category must be a valid FileCategory.
# @PRE: file must be a valid UploadFile.
# @POST: Returns the StoredFile object of the uploaded file.
#
# @PARAM: category (FileCategory) - Target category.
# @PARAM: path (Optional[str]) - Target subpath.
# @PARAM: file (UploadFile) - The file content.
# @RETURN: StoredFile - Metadata of the uploaded file.
#
# @SIDE_EFFECT: Writes file to the filesystem.
#
# @RELATION: CALLS -> StoragePlugin.save_file
@router.post("/upload", response_model=StoredFile, status_code=201)
async def upload_file(
    category: FileCategory = Form(...),
    path: Optional[str] = Form(None),
    file: UploadFile = File(...),
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "WRITE"))
):
    with belief_scope("upload_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            return await storage_plugin.save_file(file, category, path)
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:upload_file:Function]

# [DEF:delete_file:Function]
# @PURPOSE: Delete a specific file or directory.
#
# @PRE: category must be a valid FileCategory.
# @POST: Item is removed from storage.
#
# @PARAM: category (FileCategory) - File category.
# @PARAM: path (str) - Relative path of the item.
# @RETURN: None
#
# @SIDE_EFFECT: Deletes item from the filesystem.
#
# @RELATION: CALLS -> StoragePlugin.delete_file
@router.delete("/files/{category}/{path:path}", status_code=204)
async def delete_file(
    category: FileCategory,
    path: str,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "WRITE"))
):
    with belief_scope("delete_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            storage_plugin.delete_file(category, path)
        except FileNotFoundError:
            raise HTTPException(status_code=404, detail="File not found")
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:delete_file:Function]

# [DEF:download_file:Function]
# @PURPOSE: Retrieve a file for download.
#
# @PRE: category must be a valid FileCategory.
# @POST: Returns a FileResponse.
#
# @PARAM: category (FileCategory) - File category.
# @PARAM: path (str) - Relative path of the file.
# @RETURN: FileResponse - The file content.
#
# @RELATION: CALLS -> StoragePlugin.get_file_path
@router.get("/download/{category}/{path:path}")
async def download_file(
    category: FileCategory,
    path: str,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "READ"))
):
    with belief_scope("download_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            abs_path = storage_plugin.get_file_path(category, path)
            filename = Path(path).name
            return FileResponse(path=abs_path, filename=filename)
        except FileNotFoundError:
            raise HTTPException(status_code=404, detail="File not found")
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:download_file:Function]

# [/DEF:storage_routes:Module]
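A hedged usage sketch for the storage routes above (assumptions, not confirmed by this diff: local dev server at http://localhost:8000, `requests` installed, a Bearer token with plugin:storage permissions, and "backup" being a valid FileCategory member — the enum values are not shown here):

import requests

BASE = "http://localhost:8000"                      # assumed local dev address
TOKEN = "<jwt-with-plugin:storage-permissions>"     # hypothetical token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Upload a local file into the (assumed) "backup" category
with open("dashboard_export.zip", "rb") as fh:      # hypothetical local artifact
    resp = requests.post(
        f"{BASE}/api/storage/upload",
        headers=HEADERS,
        data={"category": "backup", "path": "2025-12"},
        files={"file": fh},
    )
resp.raise_for_status()
print(resp.json())  # StoredFile metadata of the uploaded item

# List what is stored under that category/path
listing = requests.get(
    f"{BASE}/api/storage/files",
    headers=HEADERS,
    params={"category": "backup", "path": "2025-12"},
)
print(listing.json())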
@@ -1,15 +1,17 @@
 # [DEF:TasksRouter:Module]
-# @SEMANTICS: api, router, tasks, create, list, get
+# @TIER: STANDARD
+# @SEMANTICS: api, router, tasks, create, list, get, logs
 # @PURPOSE: Defines the FastAPI router for task-related endpoints, allowing clients to create, list, and get the status of tasks.
 # @LAYER: UI (API)
 # @RELATION: Depends on the TaskManager. It is included by the main app.
 from typing import List, Dict, Any, Optional
-from fastapi import APIRouter, Depends, HTTPException, status
+from fastapi import APIRouter, Depends, HTTPException, status, Query
 from pydantic import BaseModel
 from ...core.logger import belief_scope
+
 from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
-from ...dependencies import get_task_manager
+from ...core.task_manager.models import LogFilter, LogStats
+from ...dependencies import get_task_manager, has_permission, get_current_user

 router = APIRouter()

@@ -33,13 +35,31 @@ class ResumeTaskRequest(BaseModel):
 # @RETURN: Task - The created task instance.
 async def create_task(
     request: CreateTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    current_user = Depends(get_current_user)
 ):
+    # Dynamic permission check based on plugin_id
+    has_permission(f"plugin:{request.plugin_id}", "EXECUTE")(current_user)
     """
     Create and start a new task for a given plugin.
     """
     with belief_scope("create_task"):
         try:
+            # Special handling for validation task to include provider config
+            if request.plugin_id == "llm_dashboard_validation":
+                from ...core.database import SessionLocal
+                from ...services.llm_provider import LLMProviderService
+                db = SessionLocal()
+                try:
+                    llm_service = LLMProviderService(db)
+                    provider_id = request.params.get("provider_id")
+                    if provider_id:
+                        db_provider = llm_service.get_provider(provider_id)
+                        if not db_provider:
+                            raise ValueError(f"LLM Provider {provider_id} not found")
+                finally:
+                    db.close()
+
             task = await task_manager.create_task(
                 plugin_id=request.plugin_id,
                 params=request.params
@@ -63,7 +83,8 @@ async def list_tasks(
     limit: int = 10,
     offset: int = 0,
     status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
     Retrieve a list of tasks with pagination and optional status filter.
@@ -82,7 +103,8 @@ async def list_tasks(
 # @RETURN: Task - The task details.
 async def get_task(
     task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
     Retrieve the details of a specific task.
@@ -96,26 +118,93 @@ async def get_task(

 @router.get("/{task_id}/logs", response_model=List[LogEntry])
 # [DEF:get_task_logs:Function]
-# @PURPOSE: Retrieve logs for a specific task.
+# @PURPOSE: Retrieve logs for a specific task with optional filtering.
 # @PARAM: task_id (str) - The unique identifier of the task.
+# @PARAM: level (Optional[str]) - Filter by log level (DEBUG, INFO, WARNING, ERROR).
+# @PARAM: source (Optional[str]) - Filter by source component.
+# @PARAM: search (Optional[str]) - Text search in message.
+# @PARAM: offset (int) - Number of logs to skip.
+# @PARAM: limit (int) - Maximum number of logs to return.
 # @PARAM: task_manager (TaskManager) - The task manager instance.
 # @PRE: task_id must exist.
 # @POST: Returns a list of log entries or raises 404.
 # @RETURN: List[LogEntry] - List of log entries.
+# @TIER: CRITICAL
 async def get_task_logs(
     task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    level: Optional[str] = Query(None, description="Filter by log level (DEBUG, INFO, WARNING, ERROR)"),
+    source: Optional[str] = Query(None, description="Filter by source component"),
+    search: Optional[str] = Query(None, description="Text search in message"),
+    offset: int = Query(0, ge=0, description="Number of logs to skip"),
+    limit: int = Query(100, ge=1, le=1000, description="Maximum number of logs to return"),
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
-    Retrieve logs for a specific task.
+    Retrieve logs for a specific task with optional filtering.
+    Supports filtering by level, source, and text search.
     """
     with belief_scope("get_task_logs"):
         task = task_manager.get_task(task_id)
         if not task:
             raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
-        return task_manager.get_task_logs(task_id)
+
+        log_filter = LogFilter(
+            level=level.upper() if level else None,
+            source=source,
+            search=search,
+            offset=offset,
+            limit=limit
+        )
+        return task_manager.get_task_logs(task_id, log_filter)
 # [/DEF:get_task_logs:Function]

+@router.get("/{task_id}/logs/stats", response_model=LogStats)
+# [DEF:get_task_log_stats:Function]
+# @PURPOSE: Get statistics about logs for a task (counts by level and source).
+# @PARAM: task_id (str) - The unique identifier of the task.
+# @PARAM: task_manager (TaskManager) - The task manager instance.
+# @PRE: task_id must exist.
+# @POST: Returns log statistics or raises 404.
+# @RETURN: LogStats - Statistics about task logs.
+async def get_task_log_stats(
+    task_id: str,
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
+):
+    """
+    Get statistics about logs for a task (counts by level and source).
+    """
+    with belief_scope("get_task_log_stats"):
+        task = task_manager.get_task(task_id)
+        if not task:
+            raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
+        return task_manager.get_task_log_stats(task_id)
+# [/DEF:get_task_log_stats:Function]
+
+@router.get("/{task_id}/logs/sources", response_model=List[str])
+# [DEF:get_task_log_sources:Function]
+# @PURPOSE: Get unique sources for a task's logs.
+# @PARAM: task_id (str) - The unique identifier of the task.
+# @PARAM: task_manager (TaskManager) - The task manager instance.
+# @PRE: task_id must exist.
+# @POST: Returns list of unique source names or raises 404.
+# @RETURN: List[str] - Unique source names.
+async def get_task_log_sources(
+    task_id: str,
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
+):
+    """
+    Get unique sources for a task's logs.
+    """
+    with belief_scope("get_task_log_sources"):
+        task = task_manager.get_task(task_id)
+        if not task:
+            raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Task not found")
+        return task_manager.get_task_log_sources(task_id)
+# [/DEF:get_task_log_sources:Function]
+
 @router.post("/{task_id}/resolve", response_model=Task)
 # [DEF:resolve_task:Function]
 # @PURPOSE: Resolve a task that is awaiting mapping.
@@ -128,7 +217,8 @@ async def get_task_logs(
 async def resolve_task(
     task_id: str,
     request: ResolveTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Resolve a task that is awaiting mapping.
@@ -153,7 +243,8 @@ async def resolve_task(
 async def resume_task(
     task_id: str,
     request: ResumeTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Resume a task that is awaiting input (e.g., passwords).
@@ -175,7 +266,8 @@ async def resume_task(
 # @POST: Tasks are removed from memory/persistence.
 async def clear_tasks(
     status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Clear tasks matching the status filter. If no filter, clears all non-running tasks.
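A sketch of how the extended log endpoints could be queried (assumptions, not part of this commit: local dev server, `requests`, a Bearer token with tasks READ, and an existing task id):

import requests

BASE = "http://localhost:8000"
TOKEN = "<jwt-with-tasks-read>"        # hypothetical token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
task_id = "<existing-task-id>"         # hypothetical task id

# Filtered, paginated logs: only ERROR entries mentioning "timeout"
logs = requests.get(
    f"{BASE}/api/tasks/{task_id}/logs",
    headers=HEADERS,
    params={"level": "ERROR", "search": "timeout", "offset": 0, "limit": 50},
).json()

# Aggregate counts by level/source, plus the list of distinct sources
stats = requests.get(f"{BASE}/api/tasks/{task_id}/logs/stats", headers=HEADERS).json()
sources = requests.get(f"{BASE}/api/tasks/{task_id}/logs/sources", headers=HEADERS).json()
print(len(logs), stats, sources)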
@@ -1,25 +1,28 @@
 # [DEF:AppModule:Module]
+# @TIER: CRITICAL
 # @SEMANTICS: app, main, entrypoint, fastapi
 # @PURPOSE: The main entry point for the FastAPI application. It initializes the app, configures CORS, sets up dependencies, includes API routers, and defines the WebSocket endpoint for log streaming.
 # @LAYER: UI (API)
 # @RELATION: Depends on the dependency module and API route modules.
-import sys
+# @INVARIANT: Only one FastAPI app instance exists per process.
+# @INVARIANT: All WebSocket connections must be properly cleaned up on disconnect.
 from pathlib import Path

 # project_root is used for static files mounting
 project_root = Path(__file__).resolve().parent.parent.parent

-from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request, HTTPException
+from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Request, HTTPException
+from starlette.middleware.sessions import SessionMiddleware
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
 import asyncio
-import os

 from .dependencies import get_task_manager, get_scheduler_service
+from .core.utils.network import NetworkError
 from .core.logger import logger, belief_scope
-from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections
-from .core.database import init_db
+from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage, admin, llm, dashboards, datasets
+from .api import auth

 # [DEF:App:Global]
 # @SEMANTICS: app, fastapi, instance
@@ -55,6 +58,10 @@ async def shutdown_event():
     scheduler.stop()
 # [/DEF:shutdown_event:Function]

+# Configure Session Middleware (required by Authlib for OAuth2 flow)
+from .core.auth.config import auth_config
+app.add_middleware(SessionMiddleware, secret_key=auth_config.SECRET_KEY)
+
 # Configure CORS
 app.add_middleware(
     CORSMiddleware,
@@ -71,16 +78,39 @@ app.add_middleware(
 # @POST: Logs request and response details.
 # @PARAM: request (Request) - The incoming request object.
 # @PARAM: call_next (Callable) - The next middleware or route handler.
+@app.exception_handler(NetworkError)
+async def network_error_handler(request: Request, exc: NetworkError):
+    with belief_scope("network_error_handler"):
+        logger.error(f"Network error: {exc}")
+        return HTTPException(
+            status_code=503,
+            detail="Environment unavailable. Please check if the Superset instance is running."
+        )
+
 @app.middleware("http")
 async def log_requests(request: Request, call_next):
-    with belief_scope("log_requests", f"{request.method} {request.url.path}"):
-        logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
+    # Avoid spamming logs for polling endpoints
+    is_polling = request.url.path.endswith("/api/tasks") and request.method == "GET"
+
+    if not is_polling:
+        logger.info(f"Incoming request: {request.method} {request.url.path}")
+
+    try:
         response = await call_next(request)
-        logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
+        if not is_polling:
+            logger.info(f"Response status: {response.status_code} for {request.url.path}")
         return response
+    except NetworkError as e:
+        logger.error(f"Network error caught in middleware: {e}")
+        raise HTTPException(
+            status_code=503,
+            detail="Environment unavailable. Please check if the Superset instance is running."
+        )
 # [/DEF:log_requests:Function]

 # Include API routes
+app.include_router(auth.router)
+app.include_router(admin.router)
 app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
@@ -88,28 +118,72 @@ app.include_router(connections.router, prefix="/api/settings/connections", tags=
 app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
 app.include_router(mappings.router)
 app.include_router(migration.router)
+app.include_router(git.router)
+app.include_router(llm.router)
+app.include_router(storage.router, prefix="/api/storage", tags=["Storage"])
+app.include_router(dashboards.router, tags=["Dashboards"])
+app.include_router(datasets.router, tags=["Datasets"])

 # [DEF:websocket_endpoint:Function]
-# @PURPOSE: Provides a WebSocket endpoint for real-time log streaming of a task.
+# @PURPOSE: Provides a WebSocket endpoint for real-time log streaming of a task with server-side filtering.
 # @PRE: task_id must be a valid task ID.
 # @POST: WebSocket connection is managed and logs are streamed until disconnect.
+# @TIER: CRITICAL
+# @UX_STATE: Connecting -> Streaming -> (Disconnected)
 @app.websocket("/ws/logs/{task_id}")
-async def websocket_endpoint(websocket: WebSocket, task_id: str):
+async def websocket_endpoint(
+    websocket: WebSocket,
+    task_id: str,
+    source: str = None,
+    level: str = None
+):
+    """
+    WebSocket endpoint for real-time log streaming with optional server-side filtering.
+
+    Query Parameters:
+        source: Filter logs by source component (e.g., "plugin", "superset_api")
+        level: Filter logs by minimum level (DEBUG, INFO, WARNING, ERROR)
+    """
     with belief_scope("websocket_endpoint", f"task_id={task_id}"):
         await websocket.accept()
-        logger.info(f"WebSocket connection accepted for task {task_id}")
+
+        # Normalize filter parameters
+        source_filter = source.lower() if source else None
+        level_filter = level.upper() if level else None
+
+        # Level hierarchy for filtering
+        level_hierarchy = {"DEBUG": 0, "INFO": 1, "WARNING": 2, "ERROR": 3}
+        min_level = level_hierarchy.get(level_filter, 0) if level_filter else 0
+
+        logger.info(f"WebSocket connection accepted for task {task_id} (source={source_filter}, level={level_filter})")
         task_manager = get_task_manager()
         queue = await task_manager.subscribe_logs(task_id)
+
+        def matches_filters(log_entry) -> bool:
+            """Check if log entry matches the filter criteria."""
+            # Check source filter
+            if source_filter and log_entry.source.lower() != source_filter:
+                return False
+
+            # Check level filter
+            if level_filter:
+                log_level = level_hierarchy.get(log_entry.level.upper(), 0)
+                if log_level < min_level:
+                    return False
+
+            return True
+
         try:
             # Stream new logs
             logger.info(f"Starting log stream for task {task_id}")

-            # Send initial logs first to build context
+            # Send initial logs first to build context (apply filters)
             initial_logs = task_manager.get_task_logs(task_id)
             for log_entry in initial_logs:
-                log_dict = log_entry.dict()
-                log_dict['timestamp'] = log_dict['timestamp'].isoformat()
-                await websocket.send_json(log_dict)
+                if matches_filters(log_entry):
+                    log_dict = log_entry.dict()
+                    log_dict['timestamp'] = log_dict['timestamp'].isoformat()
+                    await websocket.send_json(log_dict)

             # Force a check for AWAITING_INPUT status immediately upon connection
             # This ensures that if the task is already waiting when the user connects, they get the prompt.
@@ -127,6 +201,11 @@ async def websocket_endpoint(websocket: WebSocket, task_id: str):

             while True:
                 log_entry = await queue.get()
+
+                # Apply server-side filtering
+                if not matches_filters(log_entry):
+                    continue
+
                 log_dict = log_entry.dict()
                 log_dict['timestamp'] = log_dict['timestamp'].isoformat()
                 await websocket.send_json(log_dict)
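A minimal client sketch for the filtered log stream (assumptions, not part of this commit: the `websockets` client package is installed, the server runs locally, and the filters are passed as plain query parameters exactly as declared above; log entry fields other than timestamp/level/source are not shown in this diff):

import asyncio
import json
import websockets  # assumed third-party client library, not part of this repo

async def follow_logs(task_id: str):
    # Only WARNING and above from the "plugin" source, filtered server-side
    uri = f"ws://localhost:8000/ws/logs/{task_id}?source=plugin&level=WARNING"
    async with websockets.connect(uri) as ws:
        async for message in ws:
            entry = json.loads(message)
            print(entry)  # dict serialized from LogEntry, with an ISO timestamp

asyncio.run(follow_logs("<existing-task-id>"))  # hypothetical task id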
backend/src/core/auth/config.py (new file, 44 lines)
@@ -0,0 +1,44 @@
# [DEF:backend.src.core.auth.config:Module]
#
# @SEMANTICS: auth, config, settings, jwt, adfs
# @PURPOSE: Centralized configuration for authentication and authorization.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> pydantic
#
# @INVARIANT: All sensitive configuration must have defaults or be loaded from environment.

# [SECTION: IMPORTS]
from pydantic import Field
from pydantic_settings import BaseSettings
# [/SECTION]

# [DEF:AuthConfig:Class]
# @PURPOSE: Holds authentication-related settings.
# @PRE: Environment variables may be provided via .env file.
# @POST: Returns a configuration object with validated settings.
class AuthConfig(BaseSettings):
    # JWT Settings
    SECRET_KEY: str = Field(default="super-secret-key-change-in-production", env="AUTH_SECRET_KEY")
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 480
    REFRESH_TOKEN_EXPIRE_DAYS: int = 7

    # Database Settings
    AUTH_DATABASE_URL: str = Field(default="sqlite:///./backend/auth.db", env="AUTH_DATABASE_URL")

    # ADFS Settings
    ADFS_CLIENT_ID: str = Field(default="", env="ADFS_CLIENT_ID")
    ADFS_CLIENT_SECRET: str = Field(default="", env="ADFS_CLIENT_SECRET")
    ADFS_METADATA_URL: str = Field(default="", env="ADFS_METADATA_URL")

    class Config:
        env_file = ".env"
        extra = "ignore"
# [/DEF:AuthConfig:Class]

# [DEF:auth_config:Variable]
# @PURPOSE: Singleton instance of AuthConfig.
auth_config = AuthConfig()
# [/DEF:auth_config:Variable]

# [/DEF:backend.src.core.auth.config:Module]
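A small deployment-side sketch around these settings (assumptions: the process is started from the repository root so the package import resolves; the guard below is illustrative, not project code):

from backend.src.core.auth.config import auth_config

# Fail fast if the development default secret is still in place; override it via
# AUTH_SECRET_KEY in the environment or the .env file described above.
if auth_config.SECRET_KEY == "super-secret-key-change-in-production":
    raise RuntimeError("Set AUTH_SECRET_KEY before running in production")

print(auth_config.ACCESS_TOKEN_EXPIRE_MINUTES, auth_config.AUTH_DATABASE_URL)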
backend/src/core/auth/jwt.py (new file, 55 lines)
@@ -0,0 +1,55 @@
# [DEF:backend.src.core.auth.jwt:Module]
#
# @TIER: STANDARD
# @SEMANTICS: jwt, token, session, auth
# @PURPOSE: JWT token generation and validation logic.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> jose
# @RELATION: USES -> backend.src.core.auth.config.auth_config
#
# @INVARIANT: Tokens must include expiration time and user identifier.

# [SECTION: IMPORTS]
from datetime import datetime, timedelta
from typing import Optional
from jose import jwt
from .config import auth_config
from ..logger import belief_scope
# [/SECTION]

# [DEF:create_access_token:Function]
# @PURPOSE: Generates a new JWT access token.
# @PRE: data dict contains 'sub' (user_id) and optional 'scopes' (roles).
# @POST: Returns a signed JWT string.
#
# @PARAM: data (dict) - Payload data for the token.
# @PARAM: expires_delta (Optional[timedelta]) - Custom expiration time.
# @RETURN: str - The encoded JWT.
def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
    with belief_scope("create_access_token"):
        to_encode = data.copy()
        if expires_delta:
            expire = datetime.utcnow() + expires_delta
        else:
            expire = datetime.utcnow() + timedelta(minutes=auth_config.ACCESS_TOKEN_EXPIRE_MINUTES)

        to_encode.update({"exp": expire})
        encoded_jwt = jwt.encode(to_encode, auth_config.SECRET_KEY, algorithm=auth_config.ALGORITHM)
        return encoded_jwt
# [/DEF:create_access_token:Function]

# [DEF:decode_token:Function]
# @PURPOSE: Decodes and validates a JWT token.
# @PRE: token is a signed JWT string.
# @POST: Returns the decoded payload if valid.
#
# @PARAM: token (str) - The JWT to decode.
# @RETURN: dict - The decoded payload.
# @THROW: jose.JWTError - If token is invalid or expired.
def decode_token(token: str) -> dict:
    with belief_scope("decode_token"):
        payload = jwt.decode(token, auth_config.SECRET_KEY, algorithms=[auth_config.ALGORITHM])
        return payload
# [/DEF:decode_token:Function]

# [/DEF:backend.src.core.auth.jwt:Module]
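A round-trip sketch for the helpers above (assumptions: python-jose installed and the default auth_config in effect; the subject and roles are placeholders):

from datetime import timedelta
from jose import JWTError

from backend.src.core.auth.jwt import create_access_token, decode_token

token = create_access_token(
    {"sub": "user-123", "scopes": ["Admin"]},   # placeholder subject and roles
    expires_delta=timedelta(minutes=5),
)

try:
    payload = decode_token(token)
    print(payload["sub"], payload["exp"])       # "user-123" plus the expiry timestamp
except JWTError:
    print("token invalid or expired")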
backend/src/core/auth/logger.py (new file, 32 lines)
@@ -0,0 +1,32 @@
# [DEF:backend.src.core.auth.logger:Module]
#
# @TIER: STANDARD
# @SEMANTICS: auth, logger, audit, security
# @PURPOSE: Audit logging for security-related events.
# @LAYER: Core
# @RELATION: USES -> backend.src.core.logger.belief_scope
#
# @INVARIANT: Must not log sensitive data like passwords or full tokens.

# [SECTION: IMPORTS]
from ..logger import logger, belief_scope
from datetime import datetime
# [/SECTION]

# [DEF:log_security_event:Function]
# @PURPOSE: Logs a security-related event for audit trails.
# @PRE: event_type and username are strings.
# @POST: Security event is written to the application log.
# @PARAM: event_type (str) - Type of event (e.g., LOGIN_SUCCESS, PERMISSION_DENIED).
# @PARAM: username (str) - The user involved in the event.
# @PARAM: details (dict) - Additional non-sensitive metadata.
def log_security_event(event_type: str, username: str, details: dict = None):
    with belief_scope("log_security_event", f"{event_type}:{username}"):
        timestamp = datetime.utcnow().isoformat()
        msg = f"[AUDIT][{timestamp}][{event_type}] User: {username}"
        if details:
            msg += f" Details: {details}"
        logger.info(msg)
# [/DEF:log_security_event:Function]

# [/DEF:backend.src.core.auth.logger:Module]
backend/src/core/auth/oauth.py (new file, 51 lines)
@@ -0,0 +1,51 @@
# [DEF:backend.src.core.auth.oauth:Module]
#
# @SEMANTICS: auth, oauth, oidc, adfs
# @PURPOSE: ADFS OIDC configuration and client using Authlib.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> authlib
# @RELATION: USES -> backend.src.core.auth.config.auth_config
#
# @INVARIANT: Must use secure OIDC flows.

# [SECTION: IMPORTS]
from authlib.integrations.starlette_client import OAuth
from .config import auth_config
# [/SECTION]

# [DEF:oauth:Variable]
# @PURPOSE: Global Authlib OAuth registry.
oauth = OAuth()
# [/DEF:oauth:Variable]

# [DEF:register_adfs:Function]
# @PURPOSE: Registers the ADFS OIDC client.
# @PRE: ADFS configuration is provided in auth_config.
# @POST: ADFS client is registered in oauth registry.
def register_adfs():
    if auth_config.ADFS_CLIENT_ID:
        oauth.register(
            name='adfs',
            client_id=auth_config.ADFS_CLIENT_ID,
            client_secret=auth_config.ADFS_CLIENT_SECRET,
            server_metadata_url=auth_config.ADFS_METADATA_URL,
            client_kwargs={
                'scope': 'openid email profile groups'
            }
        )
# [/DEF:register_adfs:Function]

# [DEF:is_adfs_configured:Function]
# @PURPOSE: Checks if ADFS is properly configured.
# @PRE: None.
# @POST: Returns True if ADFS client is registered, False otherwise.
# @RETURN: bool - Configuration status.
def is_adfs_configured() -> bool:
    """Check if ADFS OAuth client is registered."""
    return 'adfs' in oauth._registry
# [/DEF:is_adfs_configured:Function]

# Initial registration
register_adfs()

# [/DEF:backend.src.core.auth.oauth:Module]
backend/src/core/auth/repository.py (new file, 123 lines)
@@ -0,0 +1,123 @@
# [DEF:backend.src.core.auth.repository:Module]
#
# @SEMANTICS: auth, repository, database, user, role
# @PURPOSE: Data access layer for authentication-related entities.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> sqlalchemy
# @RELATION: USES -> backend.src.models.auth
#
# @INVARIANT: All database operations must be performed within a session.

# [SECTION: IMPORTS]
from typing import Optional, List
from sqlalchemy.orm import Session
from ...models.auth import User, Role, Permission
from ..logger import belief_scope
# [/SECTION]

# [DEF:AuthRepository:Class]
# @PURPOSE: Encapsulates database operations for authentication.
class AuthRepository:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the repository with a database session.
    # @PARAM: db (Session) - SQLAlchemy session.
    def __init__(self, db: Session):
        self.db = db
    # [/DEF:__init__:Function]

    # [DEF:get_user_by_username:Function]
    # @PURPOSE: Retrieves a user by their username.
    # @PRE: username is a string.
    # @POST: Returns User object if found, else None.
    # @PARAM: username (str) - The username to search for.
    # @RETURN: Optional[User] - The found user or None.
    def get_user_by_username(self, username: str) -> Optional[User]:
        with belief_scope("AuthRepository.get_user_by_username"):
            return self.db.query(User).filter(User.username == username).first()
    # [/DEF:get_user_by_username:Function]

    # [DEF:get_user_by_id:Function]
    # @PURPOSE: Retrieves a user by their unique ID.
    # @PRE: user_id is a valid UUID string.
    # @POST: Returns User object if found, else None.
    # @PARAM: user_id (str) - The user's unique identifier.
    # @RETURN: Optional[User] - The found user or None.
    def get_user_by_id(self, user_id: str) -> Optional[User]:
        with belief_scope("AuthRepository.get_user_by_id"):
            return self.db.query(User).filter(User.id == user_id).first()
    # [/DEF:get_user_by_id:Function]

    # [DEF:get_role_by_name:Function]
    # @PURPOSE: Retrieves a role by its name.
    # @PRE: name is a string.
    # @POST: Returns Role object if found, else None.
    # @PARAM: name (str) - The role name to search for.
    # @RETURN: Optional[Role] - The found role or None.
    def get_role_by_name(self, name: str) -> Optional[Role]:
        with belief_scope("AuthRepository.get_role_by_name"):
            return self.db.query(Role).filter(Role.name == name).first()
    # [/DEF:get_role_by_name:Function]

    # [DEF:update_last_login:Function]
    # @PURPOSE: Updates the last_login timestamp for a user.
    # @PRE: user object is a valid User instance.
    # @POST: User's last_login is updated in the database.
    # @SIDE_EFFECT: Commits the transaction.
    # @PARAM: user (User) - The user to update.
    def update_last_login(self, user: User):
        with belief_scope("AuthRepository.update_last_login"):
            from datetime import datetime
            user.last_login = datetime.utcnow()
            self.db.add(user)
            self.db.commit()
    # [/DEF:update_last_login:Function]

    # [DEF:get_role_by_id:Function]
    # @PURPOSE: Retrieves a role by its unique ID.
    # @PRE: role_id is a string.
    # @POST: Returns Role object if found, else None.
    # @PARAM: role_id (str) - The role's unique identifier.
    # @RETURN: Optional[Role] - The found role or None.
    def get_role_by_id(self, role_id: str) -> Optional[Role]:
        with belief_scope("AuthRepository.get_role_by_id"):
            return self.db.query(Role).filter(Role.id == role_id).first()
    # [/DEF:get_role_by_id:Function]

    # [DEF:get_permission_by_id:Function]
    # @PURPOSE: Retrieves a permission by its unique ID.
    # @PRE: perm_id is a string.
    # @POST: Returns Permission object if found, else None.
    # @PARAM: perm_id (str) - The permission's unique identifier.
    # @RETURN: Optional[Permission] - The found permission or None.
    def get_permission_by_id(self, perm_id: str) -> Optional[Permission]:
        with belief_scope("AuthRepository.get_permission_by_id"):
            return self.db.query(Permission).filter(Permission.id == perm_id).first()
    # [/DEF:get_permission_by_id:Function]

    # [DEF:get_permission_by_resource_action:Function]
    # @PURPOSE: Retrieves a permission by resource and action.
    # @PRE: resource and action are strings.
    # @POST: Returns Permission object if found, else None.
    # @PARAM: resource (str) - The resource name.
    # @PARAM: action (str) - The action name.
    # @RETURN: Optional[Permission] - The found permission or None.
    def get_permission_by_resource_action(self, resource: str, action: str) -> Optional[Permission]:
        with belief_scope("AuthRepository.get_permission_by_resource_action"):
            return self.db.query(Permission).filter(
                Permission.resource == resource,
                Permission.action == action
            ).first()
    # [/DEF:get_permission_by_resource_action:Function]

    # [DEF:list_permissions:Function]
    # @PURPOSE: Lists all available permissions.
    # @POST: Returns a list of all Permission objects.
    # @RETURN: List[Permission] - List of permissions.
    def list_permissions(self) -> List[Permission]:
        with belief_scope("AuthRepository.list_permissions"):
            return self.db.query(Permission).all()
    # [/DEF:list_permissions:Function]

# [/DEF:AuthRepository:Class]

# [/DEF:backend.src.core.auth.repository:Module]
backend/src/core/auth/security.py (new file, 42 lines)
@@ -0,0 +1,42 @@
# [DEF:backend.src.core.auth.security:Module]
#
# @SEMANTICS: security, password, hashing, bcrypt
# @PURPOSE: Utility for password hashing and verification using Passlib.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> passlib
#
# @INVARIANT: Uses bcrypt for hashing with standard work factor.

# [SECTION: IMPORTS]
from passlib.context import CryptContext
# [/SECTION]

# [DEF:pwd_context:Variable]
# @PURPOSE: Passlib CryptContext for password management.
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
# [/DEF:pwd_context:Variable]

# [DEF:verify_password:Function]
# @PURPOSE: Verifies a plain password against a hashed password.
# @PRE: plain_password is a string, hashed_password is a bcrypt hash.
# @POST: Returns True if password matches, False otherwise.
#
# @PARAM: plain_password (str) - The unhashed password.
# @PARAM: hashed_password (str) - The stored hash.
# @RETURN: bool - Verification result.
def verify_password(plain_password: str, hashed_password: str) -> bool:
    return pwd_context.verify(plain_password, hashed_password)
# [/DEF:verify_password:Function]

# [DEF:get_password_hash:Function]
# @PURPOSE: Generates a bcrypt hash for a plain password.
# @PRE: password is a string.
# @POST: Returns a secure bcrypt hash string.
#
# @PARAM: password (str) - The password to hash.
# @RETURN: str - The generated hash.
def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)
# [/DEF:get_password_hash:Function]

# [/DEF:backend.src.core.auth.security:Module]
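A quick verification sketch for the password helpers (assumption: passlib with the bcrypt backend is installed; the password string is a placeholder):

from backend.src.core.auth.security import get_password_hash, verify_password

hashed = get_password_hash("example-password")        # placeholder password
assert verify_password("example-password", hashed)    # True: matching password
assert not verify_password("wrong-password", hashed)  # False: mismatch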
@@ -15,7 +15,7 @@ import json
 import os
 from pathlib import Path
 from typing import Optional, List
-from .config_models import AppConfig, Environment, GlobalSettings
+from .config_models import AppConfig, Environment, GlobalSettings, StorageConfig
 from .logger import logger, configure_logger, belief_scope
 # [/SECTION]

@@ -46,7 +46,7 @@ class ConfigManager:
         # 3. Runtime check of @POST
         assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"

-        logger.info(f"[ConfigManager][Exit] Initialized")
+        logger.info("[ConfigManager][Exit] Initialized")
     # [/DEF:__init__:Function]

     # [DEF:_load_config:Function]
@@ -59,19 +59,23 @@ class ConfigManager:
         logger.debug(f"[_load_config][Entry] Loading from {self.config_path}")

         if not self.config_path.exists():
-            logger.info(f"[_load_config][Action] Config file not found. Creating default.")
+            logger.info("[_load_config][Action] Config file not found. Creating default.")
             default_config = AppConfig(
                 environments=[],
-                settings=GlobalSettings(backup_path="backups")
+                settings=GlobalSettings()
             )
             self._save_config_to_disk(default_config)
             return default_config

         try:
             with open(self.config_path, "r") as f:
                 data = json.load(f)

+            # Check for deprecated field
+            if "settings" in data and "backup_path" in data["settings"]:
+                del data["settings"]["backup_path"]
+
             config = AppConfig(**data)
-            logger.info(f"[_load_config][Coherence:OK] Configuration loaded")
+            logger.info("[_load_config][Coherence:OK] Configuration loaded")
             return config
         except Exception as e:
             logger.error(f"[_load_config][Coherence:Failed] Error loading config: {e}")
@@ -79,7 +83,7 @@ class ConfigManager:
             # For now, return default to be safe, but log the error prominently.
             return AppConfig(
                 environments=[],
-                settings=GlobalSettings(backup_path="backups")
+                settings=GlobalSettings(storage=StorageConfig())
             )
     # [/DEF:_load_config:Function]

@@ -99,7 +103,7 @@ class ConfigManager:
         try:
             with open(self.config_path, "w") as f:
                 json.dump(config.dict(), f, indent=4)
-            logger.info(f"[_save_config_to_disk][Action] Configuration saved")
+            logger.info("[_save_config_to_disk][Action] Configuration saved")
         except Exception as e:
             logger.error(f"[_save_config_to_disk][Coherence:Failed] Failed to save: {e}")
     # [/DEF:_save_config_to_disk:Function]
@@ -130,7 +134,7 @@ class ConfigManager:
     # @PARAM: settings (GlobalSettings) - The new global settings.
     def update_global_settings(self, settings: GlobalSettings):
         with belief_scope("update_global_settings"):
-            logger.info(f"[update_global_settings][Entry] Updating settings")
+            logger.info("[update_global_settings][Entry] Updating settings")

            # 1. Runtime check of @PRE
            assert isinstance(settings, GlobalSettings), "settings must be an instance of GlobalSettings"
@@ -142,7 +146,7 @@ class ConfigManager:
            # Reconfigure logger with new settings
            configure_logger(settings.logging)

-            logger.info(f"[update_global_settings][Exit] Settings updated")
+            logger.info("[update_global_settings][Exit] Settings updated")
     # [/DEF:update_global_settings:Function]

     # [DEF:validate_path:Function]
@@ -218,7 +222,7 @@ class ConfigManager:
             self.config.environments.append(env)
             self.save()

-            logger.info(f"[add_environment][Exit] Environment added")
+            logger.info("[add_environment][Exit] Environment added")
     # [/DEF:add_environment:Function]

     # [DEF:update_environment:Function]
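The deprecated-field strip in _load_config means a legacy config.json that still carries settings.backup_path loads without a Pydantic validation error; a standalone sketch of the same logic (the legacy dict shown is illustrative):

    # Legacy on-disk shape: backup_path has been superseded by settings.storage
    data = {"environments": [], "settings": {"backup_path": "backups"}}

    # Same strip as _load_config performs before AppConfig(**data)
    if "settings" in data and "backup_path" in data["settings"]:
        del data["settings"]["backup_path"]

    assert data == {"environments": [], "settings": {}}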
@@ -1,4 +1,5 @@
 # [DEF:ConfigModels:Module]
+# @TIER: STANDARD
 # @SEMANTICS: config, models, pydantic
 # @PURPOSE: Defines the data models for application configuration using Pydantic.
 # @LAYER: Core
@@ -7,6 +8,7 @@

 from pydantic import BaseModel, Field
 from typing import List, Optional
+from ..models.storage import StorageConfig

 # [DEF:Schedule:DataClass]
 # @PURPOSE: Represents a backup schedule configuration.
@@ -33,6 +35,7 @@ class Environment(BaseModel):
 # @PURPOSE: Defines the configuration for the application's logging system.
 class LoggingConfig(BaseModel):
     level: str = "INFO"
+    task_log_level: str = "INFO"  # Minimum level for task-specific logs (DEBUG, INFO, WARNING, ERROR)
     file_path: Optional[str] = "logs/app.log"
     max_bytes: int = 10 * 1024 * 1024
     backup_count: int = 5
@@ -42,7 +45,7 @@ class LoggingConfig(BaseModel):
 # [DEF:GlobalSettings:DataClass]
 # @PURPOSE: Represents global application settings.
 class GlobalSettings(BaseModel):
-    backup_path: str
+    storage: StorageConfig = Field(default_factory=StorageConfig)
     default_environment_id: Optional[str] = None
     logging: LoggingConfig = Field(default_factory=LoggingConfig)
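Using default_factory here means each GlobalSettings() gets its own StorageConfig instance instead of one shared mutable default; a self-contained sketch (the StorageConfig field shown is a hypothetical stand-in for ..models.storage):

    from pydantic import BaseModel, Field

    class StorageConfig(BaseModel):       # illustrative stand-in
        base_path: str = "storage"        # hypothetical field

    class GlobalSettings(BaseModel):
        storage: StorageConfig = Field(default_factory=StorageConfig)

    a, b = GlobalSettings(), GlobalSettings()
    assert a.storage is not b.storage     # each instance owns its own config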
@@ -5,54 +5,90 @@
 # @LAYER: Core
 # @RELATION: DEPENDS_ON -> sqlalchemy
 # @RELATION: USES -> backend.src.models.mapping
+# @RELATION: USES -> backend.src.core.auth.config
 #
 # @INVARIANT: A single engine instance is used for the entire application.

 # [SECTION: IMPORTS]
 from sqlalchemy import create_engine
-from sqlalchemy.orm import sessionmaker, Session
+from sqlalchemy.orm import sessionmaker
 from ..models.mapping import Base
 # Import models to ensure they're registered with Base
-from ..models.task import TaskRecord
-from ..models.connection import ConnectionConfig
 from .logger import belief_scope
+from .auth.config import auth_config
 import os
+from pathlib import Path
 # [/SECTION]

+# [DEF:BASE_DIR:Variable]
+# @PURPOSE: Base directory for the backend (where .db files should reside).
+BASE_DIR = Path(__file__).resolve().parent.parent.parent
+# [/DEF:BASE_DIR:Variable]
+
 # [DEF:DATABASE_URL:Constant]
-DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
+# @PURPOSE: URL for the main mappings database.
+DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite:///{BASE_DIR}/mappings.db")
 # [/DEF:DATABASE_URL:Constant]

 # [DEF:TASKS_DATABASE_URL:Constant]
-TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", "sqlite:///./tasks.db")
+# @PURPOSE: URL for the tasks execution database.
+TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", f"sqlite:///{BASE_DIR}/tasks.db")
 # [/DEF:TASKS_DATABASE_URL:Constant]

+# [DEF:AUTH_DATABASE_URL:Constant]
+# @PURPOSE: URL for the authentication database.
+AUTH_DATABASE_URL = os.getenv("AUTH_DATABASE_URL", auth_config.AUTH_DATABASE_URL)
+# If it's a relative sqlite path starting with ./backend/, fix it to be absolute or relative to BASE_DIR
+if AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
+    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./backend/", f"sqlite:///{BASE_DIR}/")
+elif AUTH_DATABASE_URL.startswith("sqlite:///./") and not AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
+    # If it's just ./ but we are in backend, it's fine, but let's make it absolute for robustness
+    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./", f"sqlite:///{BASE_DIR}/")
+# [/DEF:AUTH_DATABASE_URL:Constant]
+
 # [DEF:engine:Variable]
+# @PURPOSE: SQLAlchemy engine for mappings database.
 engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
 # [/DEF:engine:Variable]

 # [DEF:tasks_engine:Variable]
+# @PURPOSE: SQLAlchemy engine for tasks database.
 tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
 # [/DEF:tasks_engine:Variable]

+# [DEF:auth_engine:Variable]
+# @PURPOSE: SQLAlchemy engine for authentication database.
+auth_engine = create_engine(AUTH_DATABASE_URL, connect_args={"check_same_thread": False})
+# [/DEF:auth_engine:Variable]
+
 # [DEF:SessionLocal:Class]
 # @PURPOSE: A session factory for the main mappings database.
+# @PRE: engine is initialized.
 SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
 # [/DEF:SessionLocal:Class]

 # [DEF:TasksSessionLocal:Class]
 # @PURPOSE: A session factory for the tasks execution database.
+# @PRE: tasks_engine is initialized.
 TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)
 # [/DEF:TasksSessionLocal:Class]

+# [DEF:AuthSessionLocal:Class]
+# @PURPOSE: A session factory for the authentication database.
+# @PRE: auth_engine is initialized.
+AuthSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=auth_engine)
+# [/DEF:AuthSessionLocal:Class]
+
 # [DEF:init_db:Function]
 # @PURPOSE: Initializes the database by creating all tables.
-# @PRE: engine and tasks_engine are initialized.
-# @POST: Database tables created.
+# @PRE: engine, tasks_engine and auth_engine are initialized.
+# @POST: Database tables created in all databases.
+# @SIDE_EFFECT: Creates physical database files if they don't exist.
 def init_db():
     with belief_scope("init_db"):
         Base.metadata.create_all(bind=engine)
         Base.metadata.create_all(bind=tasks_engine)
+        Base.metadata.create_all(bind=auth_engine)
 # [/DEF:init_db:Function]

 # [DEF:get_db:Function]
@@ -83,4 +119,18 @@ def get_tasks_db():
         db.close()
 # [/DEF:get_tasks_db:Function]

+# [DEF:get_auth_db:Function]
+# @PURPOSE: Dependency for getting an authentication database session.
+# @PRE: AuthSessionLocal is initialized.
+# @POST: Session is closed after use.
+# @RETURN: Generator[Session, None, None]
+def get_auth_db():
+    with belief_scope("get_auth_db"):
+        db = AuthSessionLocal()
+        try:
+            yield db
+        finally:
+            db.close()
+# [/DEF:get_auth_db:Function]
+
 # [/DEF:backend.src.core.database:Module]
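A sketch of how the new get_auth_db generator plugs into a FastAPI route (the route itself is illustrative); FastAPI drives the generator, so the session is closed by its finally block after the response:

    from fastapi import APIRouter, Depends
    from sqlalchemy import text
    from sqlalchemy.orm import Session

    from backend.src.core.database import get_auth_db

    router = APIRouter()

    @router.get("/auth/health")
    def auth_db_health(db: Session = Depends(get_auth_db)):
        db.execute(text("SELECT 1"))   # touches auth.db through the injected session
        return {"ok": True}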
@@ -19,6 +19,9 @@ _belief_state = threading.local()
 # Global flag for belief state logging
 _enable_belief_state = True

+# Global task log level filter
+_task_log_level = "INFO"
+
 # [DEF:BeliefFormatter:Class]
 # @PURPOSE: Custom logging formatter that adds belief state prefixes to log messages.
 class BeliefFormatter(logging.Formatter):
@@ -28,6 +31,7 @@ class BeliefFormatter(logging.Formatter):
     # @POST: Returns formatted string.
     # @PARAM: record (logging.LogRecord) - The log record to format.
     # @RETURN: str - The formatted log message.
+    # @SEMANTICS: logging, formatter, context
     def format(self, record):
         anchor_id = getattr(_belief_state, 'anchor_id', None)
         if anchor_id:
@@ -54,14 +58,15 @@ class LogEntry(BaseModel):
 # @PARAM: message (str) - Optional entry message.
 # @PRE: anchor_id must be provided.
 # @POST: Thread-local belief state is updated and entry/exit logs are generated.
+# @SEMANTICS: logging, context, belief_state
 @contextmanager
 def belief_scope(anchor_id: str, message: str = ""):
-    # Log Entry if enabled
+    # Log Entry if enabled (DEBUG level to reduce noise)
     if _enable_belief_state:
         entry_msg = f"[{anchor_id}][Entry]"
         if message:
             entry_msg += f" {message}"
-        logger.info(entry_msg)
+        logger.debug(entry_msg)

     # Set thread-local anchor_id
     old_anchor = getattr(_belief_state, 'anchor_id', None)
@@ -69,13 +74,13 @@ def belief_scope(anchor_id: str, message: str = ""):

     try:
         yield
-        # Log Coherence OK and Exit
-        logger.info(f"[{anchor_id}][Coherence:OK]")
+        # Log Coherence OK and Exit (DEBUG level to reduce noise)
+        logger.debug(f"[{anchor_id}][Coherence:OK]")
         if _enable_belief_state:
-            logger.info(f"[{anchor_id}][Exit]")
+            logger.debug(f"[{anchor_id}][Exit]")
     except Exception as e:
-        # Log Coherence Failed
-        logger.info(f"[{anchor_id}][Coherence:Failed] {str(e)}")
+        # Log Coherence Failed (DEBUG level to reduce noise)
+        logger.debug(f"[{anchor_id}][Coherence:Failed] {str(e)}")
         raise
     finally:
         # Restore old anchor
@@ -86,11 +91,13 @@ def belief_scope(anchor_id: str, message: str = ""):
 # [DEF:configure_logger:Function]
 # @PURPOSE: Configures the logger with the provided logging settings.
 # @PRE: config is a valid LoggingConfig instance.
-# @POST: Logger level, handlers, and belief state flag are updated.
+# @POST: Logger level, handlers, belief state flag, and task log level are updated.
 # @PARAM: config (LoggingConfig) - The logging configuration.
+# @SEMANTICS: logging, configuration, initialization
 def configure_logger(config):
-    global _enable_belief_state
+    global _enable_belief_state, _task_log_level
     _enable_belief_state = config.enable_belief_state
+    _task_log_level = config.task_log_level.upper()

     # Set logger level
     level = getattr(logging, config.level.upper(), logging.INFO)
@@ -104,7 +111,6 @@ def configure_logger(config):

     # Add file handler if file_path is set
     if config.file_path:
-        import os
         from pathlib import Path
         log_file = Path(config.file_path)
         log_file.parent.mkdir(parents=True, exist_ok=True)
@@ -127,6 +133,36 @@ def configure_logger(config):
     ))
 # [/DEF:configure_logger:Function]

+# [DEF:get_task_log_level:Function]
+# @PURPOSE: Returns the current task log level filter.
+# @PRE: None.
+# @POST: Returns the task log level string.
+# @RETURN: str - The current task log level (DEBUG, INFO, WARNING, ERROR).
+# @SEMANTICS: logging, configuration, getter
+def get_task_log_level() -> str:
+    """Returns the current task log level filter."""
+    return _task_log_level
+# [/DEF:get_task_log_level:Function]
+
+# [DEF:should_log_task_level:Function]
+# @PURPOSE: Checks if a log level should be recorded based on task_log_level setting.
+# @PRE: level is a valid log level string.
+# @POST: Returns True if level meets or exceeds task_log_level threshold.
+# @PARAM: level (str) - The log level to check.
+# @RETURN: bool - True if the level should be logged.
+# @SEMANTICS: logging, filter, level
+def should_log_task_level(level: str) -> bool:
+    """Checks if a log level should be recorded based on task_log_level setting."""
+    level_order = {"DEBUG": 0, "INFO": 1, "WARNING": 2, "ERROR": 3}
+    current_level = _task_log_level.upper()
+    check_level = level.upper()
+
+    current_order = level_order.get(current_level, 1)  # Default to INFO
+    check_order = level_order.get(check_level, 1)
+
+    return check_order >= current_order
+# [/DEF:should_log_task_level:Function]
+
 # [DEF:WebSocketLogHandler:Class]
 # @SEMANTICS: logging, handler, websocket, buffer
 # @PURPOSE: A custom logging handler that captures log records into a buffer. It is designed to be extended for real-time log streaming over WebSockets.
@@ -140,6 +176,7 @@ class WebSocketLogHandler(logging.Handler):
     # @PRE: capacity is an integer.
     # @POST: Instance initialized with empty deque.
     # @PARAM: capacity (int) - Maximum number of logs to keep in memory.
+    # @SEMANTICS: logging, initialization, buffer
     def __init__(self, capacity: int = 1000):
         super().__init__()
         self.log_buffer: deque[LogEntry] = deque(maxlen=capacity)
@@ -152,6 +189,7 @@ class WebSocketLogHandler(logging.Handler):
     # @PRE: record is a logging.LogRecord.
     # @POST: Log is added to the log_buffer.
     # @PARAM: record (logging.LogRecord) - The log record to emit.
+    # @SEMANTICS: logging, handler, buffer
     def emit(self, record: logging.LogRecord):
         try:
             log_entry = LogEntry(
@@ -179,6 +217,7 @@ class WebSocketLogHandler(logging.Handler):
     # @PRE: None.
     # @POST: Returns list of LogEntry objects.
     # @RETURN: List[LogEntry] - List of buffered log entries.
+    # @SEMANTICS: logging, buffer, retrieval
     def get_recent_logs(self) -> List[LogEntry]:
         """
         Returns a list of recent log entries from the buffer.
@@ -196,12 +235,24 @@ logger = logging.getLogger("superset_tools_app")
 # [DEF:believed:Function]
 # @PURPOSE: A decorator that wraps a function in a belief scope.
 # @PARAM: anchor_id (str) - The identifier for the semantic block.
+# @PRE: anchor_id must be a string.
+# @POST: Returns a decorator function.
 def believed(anchor_id: str):
+    # [DEF:decorator:Function]
+    # @PURPOSE: Internal decorator for belief scope.
+    # @PRE: func must be a callable.
+    # @POST: Returns the wrapped function.
     def decorator(func):
+        # [DEF:wrapper:Function]
+        # @PURPOSE: Internal wrapper that enters belief scope.
+        # @PRE: None.
+        # @POST: Executes the function within a belief scope.
         def wrapper(*args, **kwargs):
             with belief_scope(anchor_id):
                 return func(*args, **kwargs)
+        # [/DEF:wrapper:Function]
         return wrapper
+    # [/DEF:decorator:Function]
     return decorator
 # [/DEF:believed:Function]
 logger.setLevel(logging.INFO)
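should_log_task_level reduces to an ordinal comparison over the four levels; a standalone sketch of the same threshold logic, assuming task_log_level has been set to WARNING:

    level_order = {"DEBUG": 0, "INFO": 1, "WARNING": 2, "ERROR": 3}
    current = level_order["WARNING"]

    for lvl in ("DEBUG", "INFO", "WARNING", "ERROR"):
        print(lvl, level_order[lvl] >= current)   # False, False, True, True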
@@ -11,12 +11,10 @@
 import zipfile
 import yaml
 import os
-import shutil
 import tempfile
 from pathlib import Path
 from typing import Dict
 from .logger import logger, belief_scope
-import yaml
 # [/SECTION]

 # [DEF:MigrationEngine:Class]
@@ -1,5 +1,5 @@
 from abc import ABC, abstractmethod
-from typing import Dict, Any
+from typing import Dict, Any, Optional
 from .logger import belief_scope

 from pydantic import BaseModel, Field
@@ -68,6 +68,33 @@ class PluginBase(ABC):
         pass
     # [/DEF:version:Function]

+    @property
+    # [DEF:required_permission:Function]
+    # @PURPOSE: Returns the required permission string to execute this plugin.
+    # @PRE: Plugin instance exists.
+    # @POST: Returns string permission.
+    # @RETURN: str - Required permission (e.g., "plugin:backup:execute").
+    def required_permission(self) -> str:
+        """The permission string required to execute this plugin."""
+        with belief_scope("required_permission"):
+            return f"plugin:{self.id}:execute"
+    # [/DEF:required_permission:Function]
+
+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the plugin's UI, if applicable.
+    # @PRE: Plugin instance exists.
+    # @POST: Returns string route or None.
+    # @RETURN: Optional[str] - Frontend route.
+    def ui_route(self) -> Optional[str]:
+        """
+        The frontend route for the plugin's UI.
+        Returns None if the plugin does not have a dedicated UI page.
+        """
+        with belief_scope("ui_route"):
+            return None
+    # [/DEF:ui_route:Function]
+
     @abstractmethod
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for the plugin's input parameters.
@@ -111,5 +138,6 @@ class PluginConfig(BaseModel):
     name: str = Field(..., description="Human-readable name for the plugin")
     description: str = Field(..., description="Brief description of what the plugin does")
     version: str = Field(..., description="Version of the plugin")
+    ui_route: Optional[str] = Field(None, description="Frontend route for the plugin UI")
     input_schema: Dict[str, Any] = Field(..., description="JSON schema for input parameters", alias="schema")
 # [/DEF:PluginConfig:Class]
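A skeletal subclass showing how the two new defaults compose (the plugin itself is hypothetical, and any remaining abstract members of PluginBase, such as an execute method, are omitted here for brevity):

    from typing import Dict, Any, Optional

    from backend.src.core.plugin_base import PluginBase

    class BackupPlugin(PluginBase):
        id = "backup"                      # assumed to satisfy PluginBase's id contract
        name = "Backup"
        description = "Exports dashboards to ZIP"
        version = "1.0.0"

        @property
        def ui_route(self) -> Optional[str]:
            return "/plugins/backup"       # overrides the None default

        def get_schema(self) -> Dict[str, Any]:
            return {"type": "object", "properties": {}}

    # required_permission is inherited and resolves to "plugin:backup:execute"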
@@ -1,12 +1,12 @@
 import importlib.util
 import os
 import sys  # Added this line
-from typing import Dict, Type, List, Optional
+from typing import Dict, List, Optional
 from .plugin_base import PluginBase, PluginConfig
-from jsonschema import validate
 from .logger import belief_scope

 # [DEF:PluginLoader:Class]
+# @TIER: STANDARD
 # @SEMANTICS: plugin, loader, dynamic, import
 # @PURPOSE: Scans a specified directory for Python modules, dynamically loads them, and registers any classes that are valid implementations of the PluginBase interface.
 # @LAYER: Core
@@ -50,9 +50,18 @@ class PluginLoader:
         sys.path.insert(0, plugin_parent_dir)

         for filename in os.listdir(self.plugin_dir):
+            file_path = os.path.join(self.plugin_dir, filename)
+
+            # Handle directory-based plugins (packages)
+            if os.path.isdir(file_path):
+                init_file = os.path.join(file_path, "__init__.py")
+                if os.path.exists(init_file):
+                    self._load_module(filename, init_file)
+                continue
+
+            # Handle single-file plugins
             if filename.endswith(".py") and filename != "__init__.py":
                 module_name = filename[:-3]
-                file_path = os.path.join(self.plugin_dir, filename)
                 self._load_module(module_name, file_path)
 # [/DEF:_load_plugins:Function]

@@ -132,6 +141,7 @@ class PluginLoader:
             name=plugin_instance.name,
             description=plugin_instance.description,
             version=plugin_instance.version,
+            ui_route=plugin_instance.ui_route,
             schema=schema,
         )
         # The following line is commented out because it requires a schema to be passed to validate against.
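With the directory branch above, package-style plugins are discovered through their __init__.py while single-file plugins keep working as before; a hypothetical plugins/ layout:

    plugins/
        llm_analysis/        # package plugin: loaded via _load_module("llm_analysis", ".../llm_analysis/__init__.py")
            __init__.py      # must expose the PluginBase implementation
            helpers.py       # internal modules stay private to the package
        backup.py            # single-file plugin: loaded exactly as before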
@@ -1,4 +1,5 @@
|
|||||||
# [DEF:SchedulerModule:Module]
|
# [DEF:SchedulerModule:Module]
|
||||||
|
# @TIER: STANDARD
|
||||||
# @SEMANTICS: scheduler, apscheduler, cron, backup
|
# @SEMANTICS: scheduler, apscheduler, cron, backup
|
||||||
# @PURPOSE: Manages scheduled tasks using APScheduler.
|
# @PURPOSE: Manages scheduled tasks using APScheduler.
|
||||||
# @LAYER: Core
|
# @LAYER: Core
|
||||||
@@ -9,11 +10,11 @@ from apscheduler.schedulers.background import BackgroundScheduler
|
|||||||
from apscheduler.triggers.cron import CronTrigger
|
from apscheduler.triggers.cron import CronTrigger
|
||||||
from .logger import logger, belief_scope
|
from .logger import logger, belief_scope
|
||||||
from .config_manager import ConfigManager
|
from .config_manager import ConfigManager
|
||||||
from typing import Optional
|
|
||||||
import asyncio
|
import asyncio
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:SchedulerService:Class]
|
# [DEF:SchedulerService:Class]
|
||||||
|
# @TIER: STANDARD
|
||||||
# @SEMANTICS: scheduler, service, apscheduler
|
# @SEMANTICS: scheduler, service, apscheduler
|
||||||
# @PURPOSE: Provides a service to manage scheduled backup tasks.
|
# @PURPOSE: Provides a service to manage scheduled backup tasks.
|
||||||
class SchedulerService:
|
class SchedulerService:
|
||||||
|
@@ -13,10 +13,10 @@
 import json
 import zipfile
 from pathlib import Path
-from typing import Any, Dict, List, Optional, Tuple, Union, cast
+from typing import Dict, List, Optional, Tuple, Union, cast
 from requests import Response
 from .logger import logger as app_logger, belief_scope
-from .utils.network import APIClient, SupersetAPIError, AuthenticationError, DashboardNotFoundError, NetworkError
+from .utils.network import APIClient, SupersetAPIError
 from .utils.fileio import get_filename_from_headers
 from .config_models import Environment
 # [/SECTION]
@@ -65,6 +65,8 @@ class SupersetClient:
     @property
     # [DEF:headers:Function]
     # @PURPOSE: Returns the base HTTP headers used by the network client.
+    # @PRE: APIClient is initialized and authenticated.
+    # @POST: Returns a dictionary of HTTP headers.
     def headers(self) -> dict:
         with belief_scope("headers"):
             return self.network.headers
@@ -75,6 +77,8 @@ class SupersetClient:
     # [DEF:get_dashboards:Function]
     # @PURPOSE: Fetches the full list of dashboards, handling pagination automatically.
     # @PARAM: query (Optional[Dict]) - Additional query parameters for the API.
+    # @PRE: Client is authenticated.
+    # @POST: Returns a tuple with total count and list of dashboards.
     # @RETURN: Tuple[int, List[Dict]] - A (total count, list of dashboards) tuple.
     def get_dashboards(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
         with belief_scope("get_dashboards"):
@@ -94,6 +98,8 @@ class SupersetClient:

     # [DEF:get_dashboards_summary:Function]
     # @PURPOSE: Fetches dashboard metadata optimized for the grid.
+    # @PRE: Client is authenticated.
+    # @POST: Returns a list of dashboard metadata summaries.
     # @RETURN: List[Dict]
     def get_dashboards_summary(self) -> List[Dict]:
         with belief_scope("SupersetClient.get_dashboards_summary"):
@@ -117,6 +123,8 @@ class SupersetClient:
     # [DEF:export_dashboard:Function]
     # @PURPOSE: Exports a dashboard as a ZIP archive.
     # @PARAM: dashboard_id (int) - ID of the dashboard to export.
+    # @PRE: dashboard_id must exist in Superset.
+    # @POST: Returns ZIP content and filename.
     # @RETURN: Tuple[bytes, str] - Binary ZIP archive content and the filename.
     def export_dashboard(self, dashboard_id: int) -> Tuple[bytes, str]:
         with belief_scope("export_dashboard"):
@@ -140,6 +148,8 @@ class SupersetClient:
     # @PARAM: file_name (Union[str, Path]) - Path to the ZIP archive.
     # @PARAM: dash_id (Optional[int]) - Dashboard ID to delete on failure.
     # @PARAM: dash_slug (Optional[str]) - Dashboard slug used to resolve the ID.
+    # @PRE: file_name must be a valid ZIP dashboard export.
+    # @POST: Dashboard is imported or re-imported after deletion.
     # @RETURN: Dict - API response on success.
     def import_dashboard(self, file_name: Union[str, Path], dash_id: Optional[int] = None, dash_slug: Optional[str] = None) -> Dict:
         with belief_scope("import_dashboard"):
@@ -165,6 +175,8 @@ class SupersetClient:
     # [DEF:delete_dashboard:Function]
     # @PURPOSE: Deletes a dashboard by its ID or slug.
     # @PARAM: dashboard_id (Union[int, str]) - Dashboard ID or slug.
+    # @PRE: dashboard_id must exist.
+    # @POST: Dashboard is removed from Superset.
     def delete_dashboard(self, dashboard_id: Union[int, str]) -> None:
         with belief_scope("delete_dashboard"):
             app_logger.info("[delete_dashboard][Enter] Deleting dashboard %s.", dashboard_id)
@@ -183,6 +195,8 @@ class SupersetClient:
     # [DEF:get_datasets:Function]
     # @PURPOSE: Fetches the full list of datasets, handling pagination automatically.
     # @PARAM: query (Optional[Dict]) - Additional query parameters.
+    # @PRE: Client is authenticated.
+    # @POST: Returns total count and list of datasets.
     # @RETURN: Tuple[int, List[Dict]] - A (total count, list of datasets) tuple.
     def get_datasets(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
         with belief_scope("get_datasets"):
@@ -198,9 +212,35 @@ class SupersetClient:
             return total_count, paginated_data
     # [/DEF:get_datasets:Function]

+    # [DEF:get_datasets_summary:Function]
+    # @PURPOSE: Fetches dataset metadata optimized for the Dataset Hub grid.
+    # @PRE: Client is authenticated.
+    # @POST: Returns a list of dataset metadata summaries.
+    # @RETURN: List[Dict]
+    def get_datasets_summary(self) -> List[Dict]:
+        with belief_scope("SupersetClient.get_datasets_summary"):
+            query = {
+                "columns": ["id", "table_name", "schema", "database"]
+            }
+            _, datasets = self.get_datasets(query=query)
+
+            # Map fields to match the contracts
+            result = []
+            for ds in datasets:
+                result.append({
+                    "id": ds.get("id"),
+                    "table_name": ds.get("table_name"),
+                    "schema": ds.get("schema"),
+                    "database": ds.get("database", {}).get("database_name", "Unknown")
+                })
+            return result
+    # [/DEF:get_datasets_summary:Function]
+
     # [DEF:get_dataset:Function]
     # @PURPOSE: Fetches details of a specific dataset by its ID.
     # @PARAM: dataset_id (int) - Dataset ID.
+    # @PRE: dataset_id must exist.
+    # @POST: Returns dataset details.
     # @RETURN: Dict - Dataset details.
     def get_dataset(self, dataset_id: int) -> Dict:
         with belief_scope("SupersetClient.get_dataset", f"id={dataset_id}"):
@@ -215,6 +255,8 @@ class SupersetClient:
     # @PURPOSE: Updates a dataset by its ID.
     # @PARAM: dataset_id (int) - Dataset ID.
     # @PARAM: data (Dict) - Data to update.
+    # @PRE: dataset_id must exist.
+    # @POST: Dataset is updated in Superset.
     # @RETURN: Dict - API response.
     def update_dataset(self, dataset_id: int, data: Dict) -> Dict:
         with belief_scope("SupersetClient.update_dataset", f"id={dataset_id}"):
@@ -237,6 +279,8 @@ class SupersetClient:
     # [DEF:get_databases:Function]
     # @PURPOSE: Fetches the full list of databases.
     # @PARAM: query (Optional[Dict]) - Additional query parameters.
+    # @PRE: Client is authenticated.
+    # @POST: Returns total count and list of databases.
     # @RETURN: Tuple[int, List[Dict]] - A (total count, list of databases) tuple.
     def get_databases(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
         with belief_scope("get_databases"):
@@ -256,6 +300,8 @@ class SupersetClient:
     # [DEF:get_database:Function]
     # @PURPOSE: Fetches details of a specific database by its ID.
     # @PARAM: database_id (int) - Database ID.
+    # @PRE: database_id must exist.
+    # @POST: Returns database details.
     # @RETURN: Dict - Database details.
     def get_database(self, database_id: int) -> Dict:
         with belief_scope("get_database"):
@@ -268,6 +314,8 @@ class SupersetClient:

     # [DEF:get_databases_summary:Function]
     # @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
+    # @PRE: Client is authenticated.
+    # @POST: Returns list of database summaries.
     # @RETURN: List[Dict] - Summary of databases.
     def get_databases_summary(self) -> List[Dict]:
         with belief_scope("SupersetClient.get_databases_summary"):
@@ -286,6 +334,8 @@ class SupersetClient:
     # [DEF:get_database_by_uuid:Function]
     # @PURPOSE: Find a database by its UUID.
     # @PARAM: db_uuid (str) - The UUID of the database.
+    # @PRE: db_uuid must be a valid UUID string.
+    # @POST: Returns database info or None.
     # @RETURN: Optional[Dict] - Database info if found, else None.
     def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
         with belief_scope("SupersetClient.get_database_by_uuid", f"uuid={db_uuid}"):
@@ -301,6 +351,9 @@ class SupersetClient:
     # [SECTION: HELPERS]

     # [DEF:_resolve_target_id_for_delete:Function]
+    # @PURPOSE: Resolves a dashboard ID from either an ID or a slug.
+    # @PRE: Either dash_id or dash_slug should be provided.
+    # @POST: Returns the resolved ID or None.
     def _resolve_target_id_for_delete(self, dash_id: Optional[int], dash_slug: Optional[str]) -> Optional[int]:
         with belief_scope("_resolve_target_id_for_delete"):
             if dash_id is not None:
@@ -319,6 +372,9 @@ class SupersetClient:
     # [/DEF:_resolve_target_id_for_delete:Function]

     # [DEF:_do_import:Function]
+    # @PURPOSE: Performs the actual multipart upload for import.
+    # @PRE: file_name must be a path to an existing ZIP file.
+    # @POST: Returns the API response from the upload.
     def _do_import(self, file_name: Union[str, Path]) -> Dict:
         with belief_scope("_do_import"):
             app_logger.debug(f"[_do_import][State] Uploading file: {file_name}")
@@ -336,6 +392,9 @@ class SupersetClient:
     # [/DEF:_do_import:Function]

     # [DEF:_validate_export_response:Function]
+    # @PURPOSE: Validates that the export response is a non-empty ZIP archive.
+    # @PRE: response must be a valid requests.Response object.
+    # @POST: Raises SupersetAPIError if validation fails.
     def _validate_export_response(self, response: Response, dashboard_id: int) -> None:
         with belief_scope("_validate_export_response"):
             content_type = response.headers.get("Content-Type", "")
@@ -346,6 +405,9 @@ class SupersetClient:
     # [/DEF:_validate_export_response:Function]

     # [DEF:_resolve_export_filename:Function]
+    # @PURPOSE: Determines the filename for an exported dashboard.
+    # @PRE: response must contain Content-Disposition header or dashboard_id must be provided.
+    # @POST: Returns a sanitized filename string.
     def _resolve_export_filename(self, response: Response, dashboard_id: int) -> str:
         with belief_scope("_resolve_export_filename"):
             filename = get_filename_from_headers(dict(response.headers))
@@ -358,6 +420,9 @@ class SupersetClient:
     # [/DEF:_resolve_export_filename:Function]

     # [DEF:_validate_query_params:Function]
+    # @PURPOSE: Ensures query parameters have default page and page_size.
+    # @PRE: query can be None or a dictionary.
+    # @POST: Returns a dictionary with at least page and page_size.
     def _validate_query_params(self, query: Optional[Dict]) -> Dict:
         with belief_scope("_validate_query_params"):
             base_query = {"page": 0, "page_size": 1000}
@@ -365,6 +430,9 @@ class SupersetClient:
     # [/DEF:_validate_query_params:Function]

     # [DEF:_fetch_total_object_count:Function]
+    # @PURPOSE: Fetches the total number of items for a given endpoint.
+    # @PRE: endpoint must be a valid Superset API path.
+    # @POST: Returns the total count as an integer.
     def _fetch_total_object_count(self, endpoint: str) -> int:
         with belief_scope("_fetch_total_object_count"):
             return self.network.fetch_paginated_count(
@@ -375,12 +443,18 @@ class SupersetClient:
     # [/DEF:_fetch_total_object_count:Function]

     # [DEF:_fetch_all_pages:Function]
+    # @PURPOSE: Iterates through all pages to collect all data items.
+    # @PRE: pagination_options must contain base_query, total_count, and results_field.
+    # @POST: Returns a combined list of all items.
     def _fetch_all_pages(self, endpoint: str, pagination_options: Dict) -> List[Dict]:
         with belief_scope("_fetch_all_pages"):
             return self.network.fetch_paginated_data(endpoint=endpoint, pagination_options=pagination_options)
     # [/DEF:_fetch_all_pages:Function]

     # [DEF:_validate_import_file:Function]
+    # @PURPOSE: Validates that the file to be imported is a valid ZIP with metadata.yaml.
+    # @PRE: zip_path must be a path to a file.
+    # @POST: Raises error if file is missing, not a ZIP, or missing metadata.
     def _validate_import_file(self, zip_path: Union[str, Path]) -> None:
         with belief_scope("_validate_import_file"):
             path = Path(zip_path)
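A usage sketch for the new summary call (construction of the client from an Environment is elided); each dict carries exactly the mapped keys, with database flattened to its display name:

    summaries = client.get_datasets_summary()
    # e.g. [{"id": 12, "table_name": "sales", "schema": "public", "database": "examples"}, ...]

    by_db: dict = {}
    for ds in summaries:
        by_db.setdefault(ds["database"], []).append(ds["table_name"])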
@@ -1,4 +1,5 @@
 # [DEF:TaskManagerPackage:Module]
+# @TIER: TRIVIAL
 # @SEMANTICS: task, manager, package, exports
 # @PURPOSE: Exports the public API of the task manager package.
 # @LAYER: Core
@@ -1,47 +1,75 @@
|
|||||||
# [DEF:TaskCleanupModule:Module]
|
# [DEF:TaskCleanupModule:Module]
|
||||||
# @SEMANTICS: task, cleanup, retention
|
# @TIER: STANDARD
|
||||||
# @PURPOSE: Implements task cleanup and retention policies.
|
# @SEMANTICS: task, cleanup, retention, logs
|
||||||
|
# @PURPOSE: Implements task cleanup and retention policies, including associated logs.
|
||||||
# @LAYER: Core
|
# @LAYER: Core
|
||||||
# @RELATION: Uses TaskPersistenceService to delete old tasks.
|
# @RELATION: Uses TaskPersistenceService and TaskLogPersistenceService to delete old tasks and logs.
|
||||||
|
|
||||||
from datetime import datetime, timedelta
|
from typing import List
|
||||||
from .persistence import TaskPersistenceService
|
from .persistence import TaskPersistenceService, TaskLogPersistenceService
|
||||||
from ..logger import logger, belief_scope
|
from ..logger import logger, belief_scope
|
||||||
from ..config_manager import ConfigManager
|
from ..config_manager import ConfigManager
|
||||||
|
|
||||||
# [DEF:TaskCleanupService:Class]
|
# [DEF:TaskCleanupService:Class]
|
||||||
# @PURPOSE: Provides methods to clean up old task records.
|
# @PURPOSE: Provides methods to clean up old task records and their associated logs.
|
||||||
|
# @TIER: STANDARD
|
||||||
class TaskCleanupService:
|
class TaskCleanupService:
|
||||||
# [DEF:__init__:Function]
|
# [DEF:__init__:Function]
|
||||||
# @PURPOSE: Initializes the cleanup service with dependencies.
|
# @PURPOSE: Initializes the cleanup service with dependencies.
|
||||||
# @PRE: persistence_service and config_manager are valid.
|
# @PRE: persistence_service and config_manager are valid.
|
||||||
# @POST: Cleanup service is ready.
|
# @POST: Cleanup service is ready.
|
||||||
def __init__(self, persistence_service: TaskPersistenceService, config_manager: ConfigManager):
|
def __init__(
|
||||||
|
self,
|
||||||
|
persistence_service: TaskPersistenceService,
|
||||||
|
log_persistence_service: TaskLogPersistenceService,
|
||||||
|
config_manager: ConfigManager
|
||||||
|
):
|
||||||
self.persistence_service = persistence_service
|
self.persistence_service = persistence_service
|
||||||
|
self.log_persistence_service = log_persistence_service
|
||||||
self.config_manager = config_manager
|
self.config_manager = config_manager
|
||||||
# [/DEF:__init__:Function]
|
# [/DEF:__init__:Function]
|
||||||
|
|
||||||
# [DEF:run_cleanup:Function]
|
# [DEF:run_cleanup:Function]
|
||||||
# @PURPOSE: Deletes tasks older than the configured retention period.
|
# @PURPOSE: Deletes tasks older than the configured retention period and their logs.
|
||||||
# @PRE: Config manager has valid settings.
|
# @PRE: Config manager has valid settings.
|
||||||
# @POST: Old tasks are deleted from persistence.
|
# @POST: Old tasks and their logs are deleted from persistence.
|
||||||
def run_cleanup(self):
|
def run_cleanup(self):
|
||||||
with belief_scope("TaskCleanupService.run_cleanup"):
|
with belief_scope("TaskCleanupService.run_cleanup"):
|
||||||
settings = self.config_manager.get_config().settings
|
settings = self.config_manager.get_config().settings
|
||||||
retention_days = settings.task_retention_days
|
retention_days = settings.task_retention_days
|
||||||
|
|
||||||
# This is a simplified implementation.
|
|
||||||
# In a real scenario, we would query IDs of tasks older than retention_days.
|
|
||||||
# For now, we'll log the action.
|
|
||||||
logger.info(f"Cleaning up tasks older than {retention_days} days.")
|
logger.info(f"Cleaning up tasks older than {retention_days} days.")
|
||||||
|
|
||||||
# Re-loading tasks to check for limit
|
# Load tasks to check for limit
|
||||||
tasks = self.persistence_service.load_tasks(limit=1000)
|
tasks = self.persistence_service.load_tasks(limit=1000)
|
||||||
if len(tasks) > settings.task_retention_limit:
|
if len(tasks) > settings.task_retention_limit:
|
||||||
to_delete = [t.id for t in tasks[settings.task_retention_limit:]]
|
to_delete: List[str] = [t.id for t in tasks[settings.task_retention_limit:]]
|
||||||
|
|
||||||
|
# Delete logs first (before task records)
|
||||||
|
self.log_persistence_service.delete_logs_for_tasks(to_delete)
|
||||||
|
|
||||||
|
# Then delete task records
|
||||||
self.persistence_service.delete_tasks(to_delete)
|
self.persistence_service.delete_tasks(to_delete)
|
||||||
logger.info(f"Deleted {len(to_delete)} tasks exceeding limit of {settings.task_retention_limit}")
|
|
||||||
|
logger.info(f"Deleted {len(to_delete)} tasks and their logs exceeding limit of {settings.task_retention_limit}")
|
||||||
# [/DEF:run_cleanup:Function]
|
# [/DEF:run_cleanup:Function]
|
||||||
|
|
||||||
|
# [DEF:delete_task_with_logs:Function]
|
||||||
|
# @PURPOSE: Delete a single task and all its associated logs.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Task and all its logs are deleted.
|
||||||
|
# @PARAM: task_id (str) - The task ID to delete.
|
||||||
|
def delete_task_with_logs(self, task_id: str) -> None:
|
||||||
|
"""Delete a single task and all its associated logs."""
|
||||||
|
with belief_scope("TaskCleanupService.delete_task_with_logs", f"task_id={task_id}"):
|
||||||
|
# Delete logs first
|
||||||
|
self.log_persistence_service.delete_logs_for_task(task_id)
|
||||||
|
|
||||||
|
# Then delete task record
|
||||||
|
self.persistence_service.delete_tasks([task_id])
|
||||||
|
|
||||||
|
logger.info(f"Deleted task {task_id} and its associated logs")
|
||||||
|
# [/DEF:delete_task_with_logs:Function]
|
||||||
|
|
||||||
# [/DEF:TaskCleanupService:Class]
|
# [/DEF:TaskCleanupService:Class]
|
||||||
# [/DEF:TaskCleanupModule:Module]
|
# [/DEF:TaskCleanupModule:Module]
|
||||||
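For orientation, a minimal sketch of how the cleanup service above could be wired and invoked on a schedule. The import paths, the ConfigManager construction, and the hourly interval are assumptions for illustration, not part of this changeset.

# Hypothetical wiring of TaskCleanupService (paths and constructor usage assumed, not taken from this diff).
import asyncio

from .persistence import TaskPersistenceService, TaskLogPersistenceService
from .cleanup import TaskCleanupService          # module name assumed
from ..config_manager import ConfigManager       # constructed with no arguments here by assumption

async def cleanup_loop(interval_seconds: int = 3600):
    service = TaskCleanupService(
        persistence_service=TaskPersistenceService(),
        log_persistence_service=TaskLogPersistenceService(),
        config_manager=ConfigManager(),
    )
    while True:
        # Deletes tasks beyond the retention limit and their logs, per run_cleanup above.
        service.run_cleanup()
        await asyncio.sleep(interval_seconds)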
115
backend/src/core/task_manager/context.py
Normal file
@@ -0,0 +1,115 @@
|
|||||||
|
# [DEF:TaskContextModule:Module]
|
||||||
|
# @SEMANTICS: task, context, plugin, execution, logger
|
||||||
|
# @PURPOSE: Provides execution context passed to plugins during task execution.
|
||||||
|
# @LAYER: Core
|
||||||
|
# @RELATION: DEPENDS_ON -> TaskLogger, USED_BY -> plugins
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: Each TaskContext is bound to a single task execution.
|
||||||
|
|
||||||
|
# [SECTION: IMPORTS]
|
||||||
|
from typing import Dict, Any, Callable
|
||||||
|
from .task_logger import TaskLogger
|
||||||
|
# [/SECTION]
|
||||||
|
|
||||||
|
# [DEF:TaskContext:Class]
|
||||||
|
# @SEMANTICS: context, task, execution, plugin
|
||||||
|
# @PURPOSE: A container passed to plugin.execute() providing the logger and other task-specific utilities.
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: logger is always a valid TaskLogger instance.
|
||||||
|
# @UX_STATE: Idle -> Active -> Complete
|
||||||
|
class TaskContext:
|
||||||
|
"""
|
||||||
|
Execution context provided to plugins during task execution.
|
||||||
|
|
||||||
|
Usage:
|
||||||
|
def execute(params: dict, context: TaskContext = None):
|
||||||
|
if context:
|
||||||
|
context.logger.info("Starting process")
|
||||||
|
context.logger.progress("Processing items", percent=50)
|
||||||
|
# ... plugin logic
|
||||||
|
"""
|
||||||
|
|
||||||
|
# [DEF:__init__:Function]
|
||||||
|
# @PURPOSE: Initialize the TaskContext with task-specific resources.
|
||||||
|
# @PRE: task_id is a valid task identifier, add_log_fn is callable.
|
||||||
|
# @POST: TaskContext is ready to be passed to plugin.execute().
|
||||||
|
# @PARAM: task_id (str) - The ID of the task.
|
||||||
|
# @PARAM: add_log_fn (Callable) - Function to add log to TaskManager.
|
||||||
|
# @PARAM: params (Dict) - Task parameters.
|
||||||
|
# @PARAM: default_source (str) - Default source for logs (default: "plugin").
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
task_id: str,
|
||||||
|
add_log_fn: Callable,
|
||||||
|
params: Dict[str, Any],
|
||||||
|
default_source: str = "plugin"
|
||||||
|
):
|
||||||
|
self._task_id = task_id
|
||||||
|
self._params = params
|
||||||
|
self._logger = TaskLogger(
|
||||||
|
task_id=task_id,
|
||||||
|
add_log_fn=add_log_fn,
|
||||||
|
source=default_source
|
||||||
|
)
|
||||||
|
# [/DEF:__init__:Function]
|
||||||
|
|
||||||
|
# [DEF:task_id:Function]
|
||||||
|
# @PURPOSE: Get the task ID.
|
||||||
|
# @PRE: TaskContext must be initialized.
|
||||||
|
# @POST: Returns the task ID string.
|
||||||
|
# @RETURN: str - The task ID.
|
||||||
|
@property
|
||||||
|
def task_id(self) -> str:
|
||||||
|
return self._task_id
|
||||||
|
# [/DEF:task_id:Function]
|
||||||
|
|
||||||
|
# [DEF:logger:Function]
|
||||||
|
# @PURPOSE: Get the TaskLogger instance for this context.
|
||||||
|
# @PRE: TaskContext must be initialized.
|
||||||
|
# @POST: Returns the TaskLogger instance.
|
||||||
|
# @RETURN: TaskLogger - The logger instance.
|
||||||
|
@property
|
||||||
|
def logger(self) -> TaskLogger:
|
||||||
|
return self._logger
|
||||||
|
# [/DEF:logger:Function]
|
||||||
|
|
||||||
|
# [DEF:params:Function]
|
||||||
|
# @PURPOSE: Get the task parameters.
|
||||||
|
# @PRE: TaskContext must be initialized.
|
||||||
|
# @POST: Returns the parameters dictionary.
|
||||||
|
# @RETURN: Dict[str, Any] - The task parameters.
|
||||||
|
@property
|
||||||
|
def params(self) -> Dict[str, Any]:
|
||||||
|
return self._params
|
||||||
|
# [/DEF:params:Function]
|
||||||
|
|
||||||
|
# [DEF:get_param:Function]
|
||||||
|
# @PURPOSE: Get a specific parameter value with optional default.
|
||||||
|
# @PRE: TaskContext must be initialized.
|
||||||
|
# @POST: Returns parameter value or default.
|
||||||
|
# @PARAM: key (str) - Parameter key.
|
||||||
|
# @PARAM: default (Any) - Default value if key not found.
|
||||||
|
# @RETURN: Any - Parameter value or default.
|
||||||
|
def get_param(self, key: str, default: Any = None) -> Any:
|
||||||
|
return self._params.get(key, default)
|
||||||
|
# [/DEF:get_param:Function]
|
||||||
|
|
||||||
|
# [DEF:create_sub_context:Function]
|
||||||
|
# @PURPOSE: Create a sub-context with a different default source.
|
||||||
|
# @PRE: source is a non-empty string.
|
||||||
|
# @POST: Returns new TaskContext with different logger source.
|
||||||
|
# @PARAM: source (str) - New default source for logging.
|
||||||
|
# @RETURN: TaskContext - New context with different source.
|
||||||
|
def create_sub_context(self, source: str) -> "TaskContext":
|
||||||
|
"""Create a sub-context with a different default source for logging."""
|
||||||
|
return TaskContext(
|
||||||
|
task_id=self._task_id,
|
||||||
|
add_log_fn=self._logger._add_log,
|
||||||
|
params=self._params,
|
||||||
|
default_source=source
|
||||||
|
)
|
||||||
|
# [/DEF:create_sub_context:Function]
|
||||||
|
|
||||||
|
# [/DEF:TaskContext:Class]
|
||||||
|
|
||||||
|
# [/DEF:TaskContextModule:Module]
|
||||||
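As a usage sketch for the context defined above, this is how a plugin's execute() might consume it. The plugin logic, the parameter name dashboard_id, and the item list are illustrative assumptions.

# Illustrative plugin using the new-style TaskContext (all values are placeholders).
def execute(params: dict, context: "TaskContext" = None):
    if context is None:
        # Old-style invocation without a context still works.
        return {"status": "ok"}

    context.logger.info(
        "Starting export",
        metadata={"dashboard_id": context.get_param("dashboard_id")},
    )

    # Logs produced while talking to the Superset API get their own source tag.
    api_ctx = context.create_sub_context("superset_api")
    items = ["a", "b", "c"]  # placeholder work items
    for i, item in enumerate(items, start=1):
        api_ctx.logger.progress(f"Processing {item}", percent=100 * i / len(items))

    return {"status": "ok", "processed": len(items)}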
@@ -8,23 +8,33 @@
|
|||||||
|
|
||||||
# [SECTION: IMPORTS]
|
# [SECTION: IMPORTS]
|
||||||
import asyncio
|
import asyncio
|
||||||
|
import threading
|
||||||
|
import inspect
|
||||||
|
from concurrent.futures import ThreadPoolExecutor
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from typing import Dict, Any, List, Optional
|
from typing import Dict, Any, List, Optional
|
||||||
from concurrent.futures import ThreadPoolExecutor
|
|
||||||
|
|
||||||
from .models import Task, TaskStatus, LogEntry
|
from .models import Task, TaskStatus, LogEntry, LogFilter, LogStats
|
||||||
from .persistence import TaskPersistenceService
|
from .persistence import TaskPersistenceService, TaskLogPersistenceService
|
||||||
from ..logger import logger, belief_scope
|
from .context import TaskContext
|
||||||
|
from ..logger import logger, belief_scope, should_log_task_level
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:TaskManager:Class]
|
# [DEF:TaskManager:Class]
|
||||||
# @SEMANTICS: task, manager, lifecycle, execution, state
|
# @SEMANTICS: task, manager, lifecycle, execution, state
|
||||||
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking.
|
# @PURPOSE: Manages the lifecycle of tasks, including their creation, execution, and state tracking.
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: Task IDs are unique within the registry.
|
||||||
|
# @INVARIANT: Each task has exactly one status at any time.
|
||||||
|
# @INVARIANT: Log entries are never deleted after being added to a task.
|
||||||
class TaskManager:
|
class TaskManager:
|
||||||
"""
|
"""
|
||||||
Manages the lifecycle of tasks, including their creation, execution, and state tracking.
|
Manages the lifecycle of tasks, including their creation, execution, and state tracking.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
|
# Log flush interval in seconds
|
||||||
|
LOG_FLUSH_INTERVAL = 2.0
|
||||||
|
|
||||||
# [DEF:__init__:Function]
|
# [DEF:__init__:Function]
|
||||||
# @PURPOSE: Initialize the TaskManager with dependencies.
|
# @PURPOSE: Initialize the TaskManager with dependencies.
|
||||||
# @PRE: plugin_loader is initialized.
|
# @PRE: plugin_loader is initialized.
|
||||||
@@ -35,8 +45,18 @@ class TaskManager:
|
|||||||
self.plugin_loader = plugin_loader
|
self.plugin_loader = plugin_loader
|
||||||
self.tasks: Dict[str, Task] = {}
|
self.tasks: Dict[str, Task] = {}
|
||||||
self.subscribers: Dict[str, List[asyncio.Queue]] = {}
|
self.subscribers: Dict[str, List[asyncio.Queue]] = {}
|
||||||
self.executor = ThreadPoolExecutor(max_workers=5) # For CPU-bound plugin execution
|
self.executor = ThreadPoolExecutor(max_workers=5) # For CPU-bound plugin execution
|
||||||
self.persistence_service = TaskPersistenceService()
|
self.persistence_service = TaskPersistenceService()
|
||||||
|
self.log_persistence_service = TaskLogPersistenceService()
|
||||||
|
|
||||||
|
# Log buffer: task_id -> List[LogEntry]
|
||||||
|
self._log_buffer: Dict[str, List[LogEntry]] = {}
|
||||||
|
self._log_buffer_lock = threading.Lock()
|
||||||
|
|
||||||
|
# Flusher thread for batch writing logs
|
||||||
|
self._flusher_stop_event = threading.Event()
|
||||||
|
self._flusher_thread = threading.Thread(target=self._flusher_loop, daemon=True)
|
||||||
|
self._flusher_thread.start()
|
||||||
|
|
||||||
try:
|
try:
|
||||||
self.loop = asyncio.get_running_loop()
|
self.loop = asyncio.get_running_loop()
|
||||||
@@ -48,6 +68,59 @@ class TaskManager:
|
|||||||
self.load_persisted_tasks()
|
self.load_persisted_tasks()
|
||||||
# [/DEF:__init__:Function]
|
# [/DEF:__init__:Function]
|
||||||
|
|
||||||
|
# [DEF:_flusher_loop:Function]
|
||||||
|
# @PURPOSE: Background thread that periodically flushes log buffer to database.
|
||||||
|
# @PRE: TaskManager is initialized.
|
||||||
|
# @POST: Logs are batch-written to database every LOG_FLUSH_INTERVAL seconds.
|
||||||
|
def _flusher_loop(self):
|
||||||
|
"""Background thread that flushes log buffer to database."""
|
||||||
|
while not self._flusher_stop_event.is_set():
|
||||||
|
self._flush_logs()
|
||||||
|
self._flusher_stop_event.wait(self.LOG_FLUSH_INTERVAL)
|
||||||
|
# [/DEF:_flusher_loop:Function]
|
||||||
|
|
||||||
|
# [DEF:_flush_logs:Function]
|
||||||
|
# @PURPOSE: Flush all buffered logs to the database.
|
||||||
|
# @PRE: None.
|
||||||
|
# @POST: All buffered logs are written to task_logs table.
|
||||||
|
def _flush_logs(self):
|
||||||
|
"""Flush all buffered logs to the database."""
|
||||||
|
with self._log_buffer_lock:
|
||||||
|
task_ids = list(self._log_buffer.keys())
|
||||||
|
|
||||||
|
for task_id in task_ids:
|
||||||
|
with self._log_buffer_lock:
|
||||||
|
logs = self._log_buffer.pop(task_id, [])
|
||||||
|
|
||||||
|
if logs:
|
||||||
|
try:
|
||||||
|
self.log_persistence_service.add_logs(task_id, logs)
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to flush logs for task {task_id}: {e}")
|
||||||
|
# Re-add logs to buffer on failure
|
||||||
|
with self._log_buffer_lock:
|
||||||
|
if task_id not in self._log_buffer:
|
||||||
|
self._log_buffer[task_id] = []
|
||||||
|
self._log_buffer[task_id].extend(logs)
|
||||||
|
# [/DEF:_flush_logs:Function]
|
||||||
|
|
||||||
|
# [DEF:_flush_task_logs:Function]
|
||||||
|
# @PURPOSE: Flush logs for a specific task immediately.
|
||||||
|
# @PRE: task_id exists.
|
||||||
|
# @POST: Task's buffered logs are written to database.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
def _flush_task_logs(self, task_id: str):
|
||||||
|
"""Flush logs for a specific task immediately."""
|
||||||
|
with self._log_buffer_lock:
|
||||||
|
logs = self._log_buffer.pop(task_id, [])
|
||||||
|
|
||||||
|
if logs:
|
||||||
|
try:
|
||||||
|
self.log_persistence_service.add_logs(task_id, logs)
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to flush logs for task {task_id}: {e}")
|
||||||
|
# [/DEF:_flush_task_logs:Function]
|
||||||
|
|
||||||
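The two functions above implement a buffer-and-flush pattern; as a stand-alone illustration of the same idea, a compact generic version follows (the class name and the print sink are assumptions, not the real service).

# Generic buffer-and-flush sketch mirroring the flusher above (illustrative only).
import threading
from typing import Dict, List

class BufferedWriter:
    def __init__(self, flush_interval: float = 2.0):
        self._buffer: Dict[str, List[str]] = {}
        self._lock = threading.Lock()
        self._interval = flush_interval
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._loop, daemon=True)
        self._thread.start()

    def add(self, key: str, item: str) -> None:
        with self._lock:
            self._buffer.setdefault(key, []).append(item)

    def flush(self) -> None:
        with self._lock:
            pending, self._buffer = self._buffer, {}
        for key, items in pending.items():
            print(f"writing {len(items)} items for {key}")  # stand-in for the batched DB write

    def _loop(self) -> None:
        while not self._stop.is_set():
            self.flush()
            self._stop.wait(self._interval)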
# [DEF:create_task:Function]
|
# [DEF:create_task:Function]
|
||||||
# @PURPOSE: Creates and queues a new task for execution.
|
# @PURPOSE: Creates and queues a new task for execution.
|
||||||
# @PRE: Plugin with plugin_id exists. Params are valid.
|
# @PRE: Plugin with plugin_id exists. Params are valid.
|
||||||
@@ -63,7 +136,7 @@ class TaskManager:
|
|||||||
logger.error(f"Plugin with ID '{plugin_id}' not found.")
|
logger.error(f"Plugin with ID '{plugin_id}' not found.")
|
||||||
raise ValueError(f"Plugin with ID '{plugin_id}' not found.")
|
raise ValueError(f"Plugin with ID '{plugin_id}' not found.")
|
||||||
|
|
||||||
plugin = self.plugin_loader.get_plugin(plugin_id)
|
self.plugin_loader.get_plugin(plugin_id)
|
||||||
|
|
||||||
if not isinstance(params, dict):
|
if not isinstance(params, dict):
|
||||||
logger.error("Task parameters must be a dictionary.")
|
logger.error("Task parameters must be a dictionary.")
|
||||||
@@ -78,7 +151,7 @@ class TaskManager:
|
|||||||
# [/DEF:create_task:Function]
|
# [/DEF:create_task:Function]
|
||||||
|
|
||||||
# [DEF:_run_task:Function]
|
# [DEF:_run_task:Function]
|
||||||
# @PURPOSE: Internal method to execute a task.
|
# @PURPOSE: Internal method to execute a task with TaskContext support.
|
||||||
# @PRE: Task exists in registry.
|
# @PRE: Task exists in registry.
|
||||||
# @POST: Task is executed, status updated to SUCCESS or FAILED.
|
# @POST: Task is executed, status updated to SUCCESS or FAILED.
|
||||||
# @PARAM: task_id (str) - The ID of the task to run.
|
# @PARAM: task_id (str) - The ID of the task to run.
|
||||||
@@ -91,30 +164,54 @@ class TaskManager:
|
|||||||
task.status = TaskStatus.RUNNING
|
task.status = TaskStatus.RUNNING
|
||||||
task.started_at = datetime.utcnow()
|
task.started_at = datetime.utcnow()
|
||||||
self.persistence_service.persist_task(task)
|
self.persistence_service.persist_task(task)
|
||||||
self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'")
|
self._add_log(task_id, "INFO", f"Task started for plugin '{plugin.name}'", source="system")
|
||||||
|
|
||||||
try:
|
try:
|
||||||
# Execute plugin
|
# Prepare params and check if plugin supports new TaskContext
|
||||||
params = {**task.params, "_task_id": task_id}
|
params = {**task.params, "_task_id": task_id}
|
||||||
|
|
||||||
if asyncio.iscoroutinefunction(plugin.execute):
|
# Check if plugin's execute method accepts 'context' parameter
|
||||||
task.result = await plugin.execute(params)
|
sig = inspect.signature(plugin.execute)
|
||||||
else:
|
accepts_context = 'context' in sig.parameters
|
||||||
task.result = await self.loop.run_in_executor(
|
|
||||||
self.executor,
|
if accepts_context:
|
||||||
plugin.execute,
|
# Create TaskContext for new-style plugins
|
||||||
params
|
context = TaskContext(
|
||||||
|
task_id=task_id,
|
||||||
|
add_log_fn=self._add_log,
|
||||||
|
params=params,
|
||||||
|
default_source="plugin"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
if asyncio.iscoroutinefunction(plugin.execute):
|
||||||
|
task.result = await plugin.execute(params, context=context)
|
||||||
|
else:
|
||||||
|
task.result = await self.loop.run_in_executor(
|
||||||
|
self.executor,
|
||||||
|
lambda: plugin.execute(params, context=context)
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
# Backward compatibility: old-style plugins without context
|
||||||
|
if asyncio.iscoroutinefunction(plugin.execute):
|
||||||
|
task.result = await plugin.execute(params)
|
||||||
|
else:
|
||||||
|
task.result = await self.loop.run_in_executor(
|
||||||
|
self.executor,
|
||||||
|
plugin.execute,
|
||||||
|
params
|
||||||
|
)
|
||||||
|
|
||||||
logger.info(f"Task {task_id} completed successfully")
|
logger.info(f"Task {task_id} completed successfully")
|
||||||
task.status = TaskStatus.SUCCESS
|
task.status = TaskStatus.SUCCESS
|
||||||
self._add_log(task_id, "INFO", f"Task completed successfully for plugin '{plugin.name}'")
|
self._add_log(task_id, "INFO", f"Task completed successfully for plugin '{plugin.name}'", source="system")
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.error(f"Task {task_id} failed: {e}")
|
logger.error(f"Task {task_id} failed: {e}")
|
||||||
task.status = TaskStatus.FAILED
|
task.status = TaskStatus.FAILED
|
||||||
self._add_log(task_id, "ERROR", f"Task failed: {e}", {"error_type": type(e).__name__})
|
self._add_log(task_id, "ERROR", f"Task failed: {e}", source="system", metadata={"error_type": type(e).__name__})
|
||||||
finally:
|
finally:
|
||||||
task.finished_at = datetime.utcnow()
|
task.finished_at = datetime.utcnow()
|
||||||
|
# Flush any remaining buffered logs before persisting task
|
||||||
|
self._flush_task_logs(task_id)
|
||||||
self.persistence_service.persist_task(task)
|
self.persistence_service.persist_task(task)
|
||||||
logger.info(f"Task {task_id} execution finished with status: {task.status}")
|
logger.info(f"Task {task_id} execution finished with status: {task.status}")
|
||||||
# [/DEF:_run_task:Function]
|
# [/DEF:_run_task:Function]
|
||||||
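The dispatch above keys off whether the plugin's execute() declares a context parameter; a self-contained illustration of that check follows (the two sample functions are hypothetical).

# Detecting an optional 'context' parameter with inspect (sample functions are made up).
import inspect

def old_style(params):
    return params

def new_style(params, context=None):
    return params

for fn in (old_style, new_style):
    accepts_context = "context" in inspect.signature(fn).parameters
    print(fn.__name__, "accepts context:", accepts_context)
# old_style accepts context: False
# new_style accepts context: True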
@@ -151,7 +248,8 @@ class TaskManager:
|
|||||||
async def wait_for_resolution(self, task_id: str):
|
async def wait_for_resolution(self, task_id: str):
|
||||||
with belief_scope("TaskManager.wait_for_resolution", f"task_id={task_id}"):
|
with belief_scope("TaskManager.wait_for_resolution", f"task_id={task_id}"):
|
||||||
task = self.tasks.get(task_id)
|
task = self.tasks.get(task_id)
|
||||||
if not task: return
|
if not task:
|
||||||
|
return
|
||||||
|
|
||||||
task.status = TaskStatus.AWAITING_MAPPING
|
task.status = TaskStatus.AWAITING_MAPPING
|
||||||
self.persistence_service.persist_task(task)
|
self.persistence_service.persist_task(task)
|
||||||
@@ -172,7 +270,8 @@ class TaskManager:
|
|||||||
async def wait_for_input(self, task_id: str):
|
async def wait_for_input(self, task_id: str):
|
||||||
with belief_scope("TaskManager.wait_for_input", f"task_id={task_id}"):
|
with belief_scope("TaskManager.wait_for_input", f"task_id={task_id}"):
|
||||||
task = self.tasks.get(task_id)
|
task = self.tasks.get(task_id)
|
||||||
if not task: return
|
if not task:
|
||||||
|
return
|
||||||
|
|
||||||
# Status is already set to AWAITING_INPUT by await_input()
|
# Status is already set to AWAITING_INPUT by await_input()
|
||||||
self.task_futures[task_id] = self.loop.create_future()
|
self.task_futures[task_id] = self.loop.create_future()
|
||||||
@@ -224,36 +323,106 @@ class TaskManager:
|
|||||||
# [/DEF:get_tasks:Function]
|
# [/DEF:get_tasks:Function]
|
||||||
|
|
||||||
# [DEF:get_task_logs:Function]
|
# [DEF:get_task_logs:Function]
|
||||||
# @PURPOSE: Retrieves logs for a specific task.
|
# @PURPOSE: Retrieves logs for a specific task (from memory for running tasks, from persistence for completed ones).
|
||||||
# @PRE: task_id is a string.
|
# @PRE: task_id is a string.
|
||||||
# @POST: Returns list of LogEntry objects.
|
# @POST: Returns list of LogEntry or TaskLog objects.
|
||||||
# @PARAM: task_id (str) - ID of the task.
|
# @PARAM: task_id (str) - ID of the task.
|
||||||
|
# @PARAM: log_filter (Optional[LogFilter]) - Filter parameters.
|
||||||
# @RETURN: List[LogEntry] - List of log entries.
|
# @RETURN: List[LogEntry] - List of log entries.
|
||||||
def get_task_logs(self, task_id: str) -> List[LogEntry]:
|
def get_task_logs(self, task_id: str, log_filter: Optional[LogFilter] = None) -> List[LogEntry]:
|
||||||
with belief_scope("TaskManager.get_task_logs", f"task_id={task_id}"):
|
with belief_scope("TaskManager.get_task_logs", f"task_id={task_id}"):
|
||||||
task = self.tasks.get(task_id)
|
task = self.tasks.get(task_id)
|
||||||
|
|
||||||
|
# For completed tasks, fetch from persistence
|
||||||
|
if task and task.status in [TaskStatus.SUCCESS, TaskStatus.FAILED]:
|
||||||
|
if log_filter is None:
|
||||||
|
log_filter = LogFilter()
|
||||||
|
task_logs = self.log_persistence_service.get_logs(task_id, log_filter)
|
||||||
|
# Convert TaskLog to LogEntry for backward compatibility
|
||||||
|
return [
|
||||||
|
LogEntry(
|
||||||
|
timestamp=log.timestamp,
|
||||||
|
level=log.level,
|
||||||
|
message=log.message,
|
||||||
|
source=log.source,
|
||||||
|
metadata=log.metadata
|
||||||
|
)
|
||||||
|
for log in task_logs
|
||||||
|
]
|
||||||
|
|
||||||
|
# For running/pending tasks, return from memory
|
||||||
return task.logs if task else []
|
return task.logs if task else []
|
||||||
# [/DEF:get_task_logs:Function]
|
# [/DEF:get_task_logs:Function]
|
||||||
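A hedged sketch of querying logs for a finished task through the method above; the task_manager instance, the task id, and the import path are assumed.

# Illustrative query of persisted logs for a completed task (names and paths assumed).
from .models import LogFilter  # import path assumed

errors = task_manager.get_task_logs(
    "abc123",
    log_filter=LogFilter(level="ERROR", source="superset_api", limit=50),
)
for entry in errors:
    print(entry.timestamp, entry.source, entry.message)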
|
|
||||||
|
# [DEF:get_task_log_stats:Function]
|
||||||
|
# @PURPOSE: Get statistics about logs for a task.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Returns LogStats with counts by level and source.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @RETURN: LogStats - Statistics about task logs.
|
||||||
|
def get_task_log_stats(self, task_id: str) -> LogStats:
|
||||||
|
with belief_scope("TaskManager.get_task_log_stats", f"task_id={task_id}"):
|
||||||
|
return self.log_persistence_service.get_log_stats(task_id)
|
||||||
|
# [/DEF:get_task_log_stats:Function]
|
||||||
|
|
||||||
|
# [DEF:get_task_log_sources:Function]
|
||||||
|
# @PURPOSE: Get unique sources for a task's logs.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Returns list of unique source strings.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @RETURN: List[str] - Unique source names.
|
||||||
|
def get_task_log_sources(self, task_id: str) -> List[str]:
|
||||||
|
with belief_scope("TaskManager.get_task_log_sources", f"task_id={task_id}"):
|
||||||
|
return self.log_persistence_service.get_sources(task_id)
|
||||||
|
# [/DEF:get_task_log_sources:Function]
|
||||||
|
|
||||||
# [DEF:_add_log:Function]
|
# [DEF:_add_log:Function]
|
||||||
# @PURPOSE: Adds a log entry to a task and notifies subscribers.
|
# @PURPOSE: Adds a log entry to a task buffer and notifies subscribers.
|
||||||
# @PRE: Task exists.
|
# @PRE: Task exists.
|
||||||
# @POST: Log added to task and pushed to queues.
|
# @POST: Log added to buffer and pushed to queues (if level meets task_log_level filter).
|
||||||
# @PARAM: task_id (str) - ID of the task.
|
# @PARAM: task_id (str) - ID of the task.
|
||||||
# @PARAM: level (str) - Log level.
|
# @PARAM: level (str) - Log level.
|
||||||
# @PARAM: message (str) - Log message.
|
# @PARAM: message (str) - Log message.
|
||||||
# @PARAM: context (Optional[Dict]) - Log context.
|
# @PARAM: source (str) - Source component (default: "system").
|
||||||
def _add_log(self, task_id: str, level: str, message: str, context: Optional[Dict[str, Any]] = None):
|
# @PARAM: metadata (Optional[Dict]) - Additional structured data.
|
||||||
|
# @PARAM: context (Optional[Dict]) - Legacy context (for backward compatibility).
|
||||||
|
def _add_log(
|
||||||
|
self,
|
||||||
|
task_id: str,
|
||||||
|
level: str,
|
||||||
|
message: str,
|
||||||
|
source: str = "system",
|
||||||
|
metadata: Optional[Dict[str, Any]] = None,
|
||||||
|
context: Optional[Dict[str, Any]] = None
|
||||||
|
):
|
||||||
with belief_scope("TaskManager._add_log", f"task_id={task_id}"):
|
with belief_scope("TaskManager._add_log", f"task_id={task_id}"):
|
||||||
task = self.tasks.get(task_id)
|
task = self.tasks.get(task_id)
|
||||||
if not task:
|
if not task:
|
||||||
return
|
return
|
||||||
|
|
||||||
log_entry = LogEntry(level=level, message=message, context=context)
|
# Filter logs based on task_log_level configuration
|
||||||
task.logs.append(log_entry)
|
if not should_log_task_level(level):
|
||||||
self.persistence_service.persist_task(task)
|
return
|
||||||
|
|
||||||
# Notify subscribers
|
# Create log entry with new fields
|
||||||
|
log_entry = LogEntry(
|
||||||
|
level=level,
|
||||||
|
message=message,
|
||||||
|
source=source,
|
||||||
|
metadata=metadata,
|
||||||
|
context=context # Keep for backward compatibility
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add to in-memory logs (for backward compatibility with legacy JSON field)
|
||||||
|
task.logs.append(log_entry)
|
||||||
|
|
||||||
|
# Add to buffer for batch persistence
|
||||||
|
with self._log_buffer_lock:
|
||||||
|
if task_id not in self._log_buffer:
|
||||||
|
self._log_buffer[task_id] = []
|
||||||
|
self._log_buffer[task_id].append(log_entry)
|
||||||
|
|
||||||
|
# Notify subscribers (for real-time WebSocket updates)
|
||||||
if task_id in self.subscribers:
|
if task_id in self.subscribers:
|
||||||
for queue in self.subscribers[task_id]:
|
for queue in self.subscribers[task_id]:
|
||||||
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
|
self.loop.call_soon_threadsafe(queue.put_nowait, log_entry)
|
||||||
@@ -353,7 +522,7 @@ class TaskManager:
|
|||||||
# [/DEF:resume_task_with_password:Function]
|
# [/DEF:resume_task_with_password:Function]
|
||||||
|
|
||||||
# [DEF:clear_tasks:Function]
|
# [DEF:clear_tasks:Function]
|
||||||
# @PURPOSE: Clears tasks based on status filter.
|
# @PURPOSE: Clears tasks based on status filter (also deletes associated logs).
|
||||||
# @PRE: status is Optional[TaskStatus].
|
# @PRE: status is Optional[TaskStatus].
|
||||||
# @POST: Tasks matching filter (or all non-active) cleared from registry and database.
|
# @POST: Tasks matching filter (or all non-active) cleared from registry and database.
|
||||||
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
# @PARAM: status (Optional[TaskStatus]) - Filter by task status.
|
||||||
@@ -387,9 +556,13 @@ class TaskManager:
|
|||||||
|
|
||||||
del self.tasks[tid]
|
del self.tasks[tid]
|
||||||
|
|
||||||
# Remove from persistence
|
# Remove from persistence (task_records and task_logs via CASCADE)
|
||||||
self.persistence_service.delete_tasks(tasks_to_remove)
|
self.persistence_service.delete_tasks(tasks_to_remove)
|
||||||
|
|
||||||
|
# Also explicitly delete logs (in case CASCADE is not set up)
|
||||||
|
if tasks_to_remove:
|
||||||
|
self.log_persistence_service.delete_logs_for_tasks(tasks_to_remove)
|
||||||
|
|
||||||
logger.info(f"Cleared {len(tasks_to_remove)} tasks.")
|
logger.info(f"Cleared {len(tasks_to_remove)} tasks.")
|
||||||
return len(tasks_to_remove)
|
return len(tasks_to_remove)
|
||||||
# [/DEF:clear_tasks:Function]
|
# [/DEF:clear_tasks:Function]
|
||||||
|
|||||||
@@ -1,4 +1,5 @@
|
|||||||
# [DEF:TaskManagerModels:Module]
|
# [DEF:TaskManagerModels:Module]
|
||||||
|
# @TIER: STANDARD
|
||||||
# @SEMANTICS: task, models, pydantic, enum, state
|
# @SEMANTICS: task, models, pydantic, enum, state
|
||||||
# @PURPOSE: Defines the data models and enumerations used by the Task Manager.
|
# @PURPOSE: Defines the data models and enumerations used by the Task Manager.
|
||||||
# @LAYER: Core
|
# @LAYER: Core
|
||||||
@@ -16,6 +17,7 @@ from pydantic import BaseModel, Field
|
|||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:TaskStatus:Enum]
|
# [DEF:TaskStatus:Enum]
|
||||||
|
# @TIER: TRIVIAL
|
||||||
# @SEMANTICS: task, status, state, enum
|
# @SEMANTICS: task, status, state, enum
|
||||||
# @PURPOSE: Defines the possible states a task can be in during its lifecycle.
|
# @PURPOSE: Defines the possible states a task can be in during its lifecycle.
|
||||||
class TaskStatus(str, Enum):
|
class TaskStatus(str, Enum):
|
||||||
@@ -27,17 +29,73 @@ class TaskStatus(str, Enum):
|
|||||||
AWAITING_INPUT = "AWAITING_INPUT"
|
AWAITING_INPUT = "AWAITING_INPUT"
|
||||||
# [/DEF:TaskStatus:Enum]
|
# [/DEF:TaskStatus:Enum]
|
||||||
|
|
||||||
|
# [DEF:LogLevel:Enum]
|
||||||
|
# @SEMANTICS: log, level, severity, enum
|
||||||
|
# @PURPOSE: Defines the possible log levels for task logging.
|
||||||
|
# @TIER: STANDARD
|
||||||
|
class LogLevel(str, Enum):
|
||||||
|
DEBUG = "DEBUG"
|
||||||
|
INFO = "INFO"
|
||||||
|
WARNING = "WARNING"
|
||||||
|
ERROR = "ERROR"
|
||||||
|
# [/DEF:LogLevel:Enum]
|
||||||
|
|
||||||
# [DEF:LogEntry:Class]
|
# [DEF:LogEntry:Class]
|
||||||
# @SEMANTICS: log, entry, record, pydantic
|
# @SEMANTICS: log, entry, record, pydantic
|
||||||
# @PURPOSE: A Pydantic model representing a single, structured log entry associated with a task.
|
# @PURPOSE: A Pydantic model representing a single, structured log entry associated with a task.
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: Each log entry has a unique timestamp and source.
|
||||||
class LogEntry(BaseModel):
|
class LogEntry(BaseModel):
|
||||||
timestamp: datetime = Field(default_factory=datetime.utcnow)
|
timestamp: datetime = Field(default_factory=datetime.utcnow)
|
||||||
level: str
|
level: str = Field(default="INFO")
|
||||||
message: str
|
message: str
|
||||||
context: Optional[Dict[str, Any]] = None
|
source: str = Field(default="system") # Component attribution: plugin, superset_api, git, etc.
|
||||||
|
context: Optional[Dict[str, Any]] = None # Legacy field, kept for backward compatibility
|
||||||
|
metadata: Optional[Dict[str, Any]] = None # Structured metadata (e.g., dashboard_id, progress)
|
||||||
# [/DEF:LogEntry:Class]
|
# [/DEF:LogEntry:Class]
|
||||||
|
|
||||||
|
# [DEF:TaskLog:Class]
|
||||||
|
# @SEMANTICS: task, log, persistent, pydantic
|
||||||
|
# @PURPOSE: A Pydantic model representing a persisted log entry from the database.
|
||||||
|
# @TIER: STANDARD
|
||||||
|
# @RELATION: MAPS_TO -> TaskLogRecord
|
||||||
|
class TaskLog(BaseModel):
|
||||||
|
id: int
|
||||||
|
task_id: str
|
||||||
|
timestamp: datetime
|
||||||
|
level: str
|
||||||
|
source: str
|
||||||
|
message: str
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
|
||||||
|
class Config:
|
||||||
|
from_attributes = True
|
||||||
|
# [/DEF:TaskLog:Class]
|
||||||
|
|
||||||
|
# [DEF:LogFilter:Class]
|
||||||
|
# @SEMANTICS: log, filter, query, pydantic
|
||||||
|
# @PURPOSE: Filter parameters for querying task logs.
|
||||||
|
# @TIER: STANDARD
|
||||||
|
class LogFilter(BaseModel):
|
||||||
|
level: Optional[str] = None # Filter by log level
|
||||||
|
source: Optional[str] = None # Filter by source component
|
||||||
|
search: Optional[str] = None # Text search in message
|
||||||
|
offset: int = Field(default=0, ge=0)
|
||||||
|
limit: int = Field(default=100, ge=1, le=1000)
|
||||||
|
# [/DEF:LogFilter:Class]
|
||||||
|
|
||||||
|
# [DEF:LogStats:Class]
|
||||||
|
# @SEMANTICS: log, stats, aggregation, pydantic
|
||||||
|
# @PURPOSE: Statistics about log entries for a task.
|
||||||
|
# @TIER: STANDARD
|
||||||
|
class LogStats(BaseModel):
|
||||||
|
total_count: int
|
||||||
|
by_level: Dict[str, int] # {"INFO": 10, "ERROR": 2}
|
||||||
|
by_source: Dict[str, int] # {"plugin": 5, "superset_api": 7}
|
||||||
|
# [/DEF:LogStats:Class]
|
||||||
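To make the new fields concrete, a small sketch constructing a LogEntry with the added source and metadata fields; the values are illustrative.

# Constructing a LogEntry with the new source/metadata fields (values are placeholders).
entry = LogEntry(
    level="INFO",
    message="Exported dashboard",
    source="plugin",
    metadata={"dashboard_id": 42, "progress": 100},
)
print(entry.dict())  # timestamp defaults to utcnow; the legacy context field stays None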
|
|
||||||
# [DEF:Task:Class]
|
# [DEF:Task:Class]
|
||||||
|
# @TIER: STANDARD
|
||||||
# @SEMANTICS: task, job, execution, state, pydantic
|
# @SEMANTICS: task, job, execution, state, pydantic
|
||||||
# @PURPOSE: A Pydantic model representing a single execution instance of a plugin, including its status, parameters, and logs.
|
# @PURPOSE: A Pydantic model representing a single execution instance of a plugin, including its status, parameters, and logs.
|
||||||
class Task(BaseModel):
|
class Task(BaseModel):
|
||||||
|
|||||||
@@ -7,13 +7,13 @@
|
|||||||
|
|
||||||
# [SECTION: IMPORTS]
|
# [SECTION: IMPORTS]
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from typing import List, Optional, Dict, Any
|
from typing import List, Optional
|
||||||
import json
|
import json
|
||||||
|
|
||||||
from sqlalchemy.orm import Session
|
from sqlalchemy.orm import Session
|
||||||
from ...models.task import TaskRecord
|
from ...models.task import TaskRecord, TaskLogRecord
|
||||||
from ..database import TasksSessionLocal
|
from ..database import TasksSessionLocal
|
||||||
from .models import Task, TaskStatus, LogEntry
|
from .models import Task, TaskStatus, LogEntry, TaskLog, LogFilter, LogStats
|
||||||
from ..logger import logger, belief_scope
|
from ..logger import logger, belief_scope
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
@@ -36,6 +36,7 @@ class TaskPersistenceService:
|
|||||||
# @PRE: isinstance(task, Task)
|
# @PRE: isinstance(task, Task)
|
||||||
# @POST: Task record created or updated in database.
|
# @POST: Task record created or updated in database.
|
||||||
# @PARAM: task (Task) - The task object to persist.
|
# @PARAM: task (Task) - The task object to persist.
|
||||||
|
# @SIDE_EFFECT: Writes to task_records table in tasks.db
|
||||||
def persist_task(self, task: Task) -> None:
|
def persist_task(self, task: Task) -> None:
|
||||||
with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
|
with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
|
||||||
session: Session = TasksSessionLocal()
|
session: Session = TasksSessionLocal()
|
||||||
@@ -50,8 +51,19 @@ class TaskPersistenceService:
|
|||||||
record.environment_id = task.params.get("environment_id") or task.params.get("source_env_id")
|
record.environment_id = task.params.get("environment_id") or task.params.get("source_env_id")
|
||||||
record.started_at = task.started_at
|
record.started_at = task.started_at
|
||||||
record.finished_at = task.finished_at
|
record.finished_at = task.finished_at
|
||||||
record.params = task.params
|
|
||||||
record.result = task.result
|
# Ensure params and result are JSON serializable
|
||||||
|
def json_serializable(obj):
|
||||||
|
if isinstance(obj, dict):
|
||||||
|
return {k: json_serializable(v) for k, v in obj.items()}
|
||||||
|
elif isinstance(obj, list):
|
||||||
|
return [json_serializable(v) for v in obj]
|
||||||
|
elif isinstance(obj, datetime):
|
||||||
|
return obj.isoformat()
|
||||||
|
return obj
|
||||||
|
|
||||||
|
record.params = json_serializable(task.params)
|
||||||
|
record.result = json_serializable(task.result)
|
||||||
|
|
||||||
# Store logs as JSON, converting datetime to string
|
# Store logs as JSON, converting datetime to string
|
||||||
record.logs = []
|
record.logs = []
|
||||||
@@ -59,6 +71,9 @@ class TaskPersistenceService:
|
|||||||
log_dict = log.dict()
|
log_dict = log.dict()
|
||||||
if isinstance(log_dict.get('timestamp'), datetime):
|
if isinstance(log_dict.get('timestamp'), datetime):
|
||||||
log_dict['timestamp'] = log_dict['timestamp'].isoformat()
|
log_dict['timestamp'] = log_dict['timestamp'].isoformat()
|
||||||
|
# Also clean up any datetimes in context
|
||||||
|
if log_dict.get('context'):
|
||||||
|
log_dict['context'] = json_serializable(log_dict['context'])
|
||||||
record.logs.append(log_dict)
|
record.logs.append(log_dict)
|
||||||
|
|
||||||
# Extract error if failed
|
# Extract error if failed
|
||||||
@@ -155,4 +170,215 @@ class TaskPersistenceService:
|
|||||||
# [/DEF:delete_tasks:Function]
|
# [/DEF:delete_tasks:Function]
|
||||||
|
|
||||||
# [/DEF:TaskPersistenceService:Class]
|
# [/DEF:TaskPersistenceService:Class]
|
||||||
|
|
||||||
|
# [DEF:TaskLogPersistenceService:Class]
|
||||||
|
# @SEMANTICS: persistence, service, database, log, sqlalchemy
|
||||||
|
# @PURPOSE: Provides methods to save and query task logs from the task_logs table.
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @RELATION: DEPENDS_ON -> TaskLogRecord
|
||||||
|
# @INVARIANT: Log entries are batch-inserted for performance.
|
||||||
|
class TaskLogPersistenceService:
|
||||||
|
"""
|
||||||
|
Service for persisting and querying task logs.
|
||||||
|
Supports batch inserts, filtering, and statistics.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# [DEF:__init__:Function]
|
||||||
|
# @PURPOSE: Initialize the log persistence service.
|
||||||
|
# @POST: Service is ready.
|
||||||
|
def __init__(self):
|
||||||
|
pass
|
||||||
|
# [/DEF:__init__:Function]
|
||||||
|
|
||||||
|
# [DEF:add_logs:Function]
|
||||||
|
# @PURPOSE: Batch insert log entries for a task.
|
||||||
|
# @PRE: logs is a list of LogEntry objects.
|
||||||
|
# @POST: All logs inserted into task_logs table.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @PARAM: logs (List[LogEntry]) - Log entries to insert.
|
||||||
|
# @SIDE_EFFECT: Writes to task_logs table.
|
||||||
|
def add_logs(self, task_id: str, logs: List[LogEntry]) -> None:
|
||||||
|
if not logs:
|
||||||
|
return
|
||||||
|
with belief_scope("TaskLogPersistenceService.add_logs", f"task_id={task_id}"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
for log in logs:
|
||||||
|
record = TaskLogRecord(
|
||||||
|
task_id=task_id,
|
||||||
|
timestamp=log.timestamp,
|
||||||
|
level=log.level,
|
||||||
|
source=log.source or "system",
|
||||||
|
message=log.message,
|
||||||
|
metadata_json=json.dumps(log.metadata) if log.metadata else None
|
||||||
|
)
|
||||||
|
session.add(record)
|
||||||
|
session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
session.rollback()
|
||||||
|
logger.error(f"Failed to add logs for task {task_id}: {e}")
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:add_logs:Function]
|
||||||
|
|
||||||
|
# [DEF:get_logs:Function]
|
||||||
|
# @PURPOSE: Query logs for a task with filtering and pagination.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Returns list of TaskLog objects matching filters.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @PARAM: log_filter (LogFilter) - Filter parameters.
|
||||||
|
# @RETURN: List[TaskLog] - Filtered log entries.
|
||||||
|
def get_logs(self, task_id: str, log_filter: LogFilter) -> List[TaskLog]:
|
||||||
|
with belief_scope("TaskLogPersistenceService.get_logs", f"task_id={task_id}"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
query = session.query(TaskLogRecord).filter(TaskLogRecord.task_id == task_id)
|
||||||
|
|
||||||
|
# Apply filters
|
||||||
|
if log_filter.level:
|
||||||
|
query = query.filter(TaskLogRecord.level == log_filter.level.upper())
|
||||||
|
if log_filter.source:
|
||||||
|
query = query.filter(TaskLogRecord.source == log_filter.source)
|
||||||
|
if log_filter.search:
|
||||||
|
search_pattern = f"%{log_filter.search}%"
|
||||||
|
query = query.filter(TaskLogRecord.message.ilike(search_pattern))
|
||||||
|
|
||||||
|
# Order by timestamp ascending (oldest first)
|
||||||
|
query = query.order_by(TaskLogRecord.timestamp.asc())
|
||||||
|
|
||||||
|
# Apply pagination
|
||||||
|
records = query.offset(log_filter.offset).limit(log_filter.limit).all()
|
||||||
|
|
||||||
|
logs = []
|
||||||
|
for record in records:
|
||||||
|
metadata = None
|
||||||
|
if record.metadata_json:
|
||||||
|
try:
|
||||||
|
metadata = json.loads(record.metadata_json)
|
||||||
|
except json.JSONDecodeError:
|
||||||
|
metadata = None
|
||||||
|
|
||||||
|
logs.append(TaskLog(
|
||||||
|
id=record.id,
|
||||||
|
task_id=record.task_id,
|
||||||
|
timestamp=record.timestamp,
|
||||||
|
level=record.level,
|
||||||
|
source=record.source,
|
||||||
|
message=record.message,
|
||||||
|
metadata=metadata
|
||||||
|
))
|
||||||
|
|
||||||
|
return logs
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:get_logs:Function]
|
||||||
|
|
||||||
|
# [DEF:get_log_stats:Function]
|
||||||
|
# @PURPOSE: Get statistics about logs for a task.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Returns LogStats with counts by level and source.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @RETURN: LogStats - Statistics about task logs.
|
||||||
|
def get_log_stats(self, task_id: str) -> LogStats:
|
||||||
|
with belief_scope("TaskLogPersistenceService.get_log_stats", f"task_id={task_id}"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
# Get total count
|
||||||
|
total_count = session.query(TaskLogRecord).filter(
|
||||||
|
TaskLogRecord.task_id == task_id
|
||||||
|
).count()
|
||||||
|
|
||||||
|
# Get counts by level
|
||||||
|
from sqlalchemy import func
|
||||||
|
level_counts = session.query(
|
||||||
|
TaskLogRecord.level,
|
||||||
|
func.count(TaskLogRecord.id)
|
||||||
|
).filter(
|
||||||
|
TaskLogRecord.task_id == task_id
|
||||||
|
).group_by(TaskLogRecord.level).all()
|
||||||
|
|
||||||
|
by_level = {level: count for level, count in level_counts}
|
||||||
|
|
||||||
|
# Get counts by source
|
||||||
|
source_counts = session.query(
|
||||||
|
TaskLogRecord.source,
|
||||||
|
func.count(TaskLogRecord.id)
|
||||||
|
).filter(
|
||||||
|
TaskLogRecord.task_id == task_id
|
||||||
|
).group_by(TaskLogRecord.source).all()
|
||||||
|
|
||||||
|
by_source = {source: count for source, count in source_counts}
|
||||||
|
|
||||||
|
return LogStats(
|
||||||
|
total_count=total_count,
|
||||||
|
by_level=by_level,
|
||||||
|
by_source=by_source
|
||||||
|
)
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:get_log_stats:Function]
|
||||||
|
|
||||||
|
# [DEF:get_sources:Function]
|
||||||
|
# @PURPOSE: Get unique sources for a task's logs.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: Returns list of unique source strings.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @RETURN: List[str] - Unique source names.
|
||||||
|
def get_sources(self, task_id: str) -> List[str]:
|
||||||
|
with belief_scope("TaskLogPersistenceService.get_sources", f"task_id={task_id}"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
from sqlalchemy import distinct
|
||||||
|
sources = session.query(distinct(TaskLogRecord.source)).filter(
|
||||||
|
TaskLogRecord.task_id == task_id
|
||||||
|
).all()
|
||||||
|
return [s[0] for s in sources]
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:get_sources:Function]
|
||||||
|
|
||||||
|
# [DEF:delete_logs_for_task:Function]
|
||||||
|
# @PURPOSE: Delete all logs for a specific task.
|
||||||
|
# @PRE: task_id is a valid task ID.
|
||||||
|
# @POST: All logs for the task are deleted.
|
||||||
|
# @PARAM: task_id (str) - The task ID.
|
||||||
|
# @SIDE_EFFECT: Deletes from task_logs table.
|
||||||
|
def delete_logs_for_task(self, task_id: str) -> None:
|
||||||
|
with belief_scope("TaskLogPersistenceService.delete_logs_for_task", f"task_id={task_id}"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
session.query(TaskLogRecord).filter(
|
||||||
|
TaskLogRecord.task_id == task_id
|
||||||
|
).delete(synchronize_session=False)
|
||||||
|
session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
session.rollback()
|
||||||
|
logger.error(f"Failed to delete logs for task {task_id}: {e}")
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:delete_logs_for_task:Function]
|
||||||
|
|
||||||
|
# [DEF:delete_logs_for_tasks:Function]
|
||||||
|
# @PURPOSE: Delete all logs for multiple tasks.
|
||||||
|
# @PRE: task_ids is a list of task IDs.
|
||||||
|
# @POST: All logs for the tasks are deleted.
|
||||||
|
# @PARAM: task_ids (List[str]) - List of task IDs.
|
||||||
|
def delete_logs_for_tasks(self, task_ids: List[str]) -> None:
|
||||||
|
if not task_ids:
|
||||||
|
return
|
||||||
|
with belief_scope("TaskLogPersistenceService.delete_logs_for_tasks"):
|
||||||
|
session: Session = TasksSessionLocal()
|
||||||
|
try:
|
||||||
|
session.query(TaskLogRecord).filter(
|
||||||
|
TaskLogRecord.task_id.in_(task_ids)
|
||||||
|
).delete(synchronize_session=False)
|
||||||
|
session.commit()
|
||||||
|
except Exception as e:
|
||||||
|
session.rollback()
|
||||||
|
logger.error(f"Failed to delete logs for tasks: {e}")
|
||||||
|
finally:
|
||||||
|
session.close()
|
||||||
|
# [/DEF:delete_logs_for_tasks:Function]
|
||||||
|
|
||||||
|
# [/DEF:TaskLogPersistenceService:Class]
|
||||||
# [/DEF:TaskPersistenceModule:Module]
|
# [/DEF:TaskPersistenceModule:Module]
|
||||||
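The service above reads and writes a TaskLogRecord ORM model imported from ...models.task, which this diff does not include. The sketch below is an inference from the columns referenced here (id, task_id, timestamp, level, source, message, metadata_json), not the actual definition.

# Plausible shape of TaskLogRecord, inferred from the queries above (assumption, not shown in this diff).
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class TaskLogRecord(Base):
    __tablename__ = "task_logs"

    id = Column(Integer, primary_key=True, autoincrement=True)
    task_id = Column(String, index=True, nullable=False)
    timestamp = Column(DateTime, default=datetime.utcnow, nullable=False)
    level = Column(String, nullable=False)
    source = Column(String, nullable=False, default="system")
    message = Column(Text, nullable=False)
    metadata_json = Column(Text, nullable=True)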
167
backend/src/core/task_manager/task_logger.py
Normal file
@@ -0,0 +1,167 @@
|
|||||||
|
# [DEF:TaskLoggerModule:Module]
|
||||||
|
# @SEMANTICS: task, logger, context, plugin, attribution
|
||||||
|
# @PURPOSE: Provides a dedicated logger for tasks with automatic source attribution.
|
||||||
|
# @LAYER: Core
|
||||||
|
# @RELATION: DEPENDS_ON -> TaskManager, CALLS -> TaskManager._add_log
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: Each TaskLogger instance is bound to a specific task_id and default source.
|
||||||
|
|
||||||
|
# [SECTION: IMPORTS]
|
||||||
|
from typing import Dict, Any, Optional, Callable
|
||||||
|
# [/SECTION]
|
||||||
|
|
||||||
|
# [DEF:TaskLogger:Class]
|
||||||
|
# @SEMANTICS: logger, task, source, attribution
|
||||||
|
# @PURPOSE: A wrapper around TaskManager._add_log that carries task_id and source context.
|
||||||
|
# @TIER: CRITICAL
|
||||||
|
# @INVARIANT: All log calls include the task_id and source.
|
||||||
|
# @UX_STATE: Idle -> Logging -> (system records log)
|
||||||
|
class TaskLogger:
|
||||||
|
"""
|
||||||
|
A dedicated logger for tasks that automatically tags logs with source attribution.
|
||||||
|
|
||||||
|
Usage:
|
||||||
|
logger = TaskLogger(task_id="abc123", add_log_fn=task_manager._add_log, source="plugin")
|
||||||
|
logger.info("Starting backup process")
|
||||||
|
logger.error("Failed to connect", metadata={"error_code": 500})
|
||||||
|
|
||||||
|
# Create sub-logger with different source
|
||||||
|
api_logger = logger.with_source("superset_api")
|
||||||
|
api_logger.info("Fetching dashboards")
|
||||||
|
"""
|
||||||
|
|
||||||
|
# [DEF:__init__:Function]
|
||||||
|
# @PURPOSE: Initialize the TaskLogger with task context.
|
||||||
|
# @PRE: add_log_fn is a callable that accepts (task_id, level, message, context, source, metadata).
|
||||||
|
# @POST: TaskLogger is ready to log messages.
|
||||||
|
# @PARAM: task_id (str) - The ID of the task.
|
||||||
|
# @PARAM: add_log_fn (Callable) - Function to add log to TaskManager.
|
||||||
|
# @PARAM: source (str) - Default source for logs (default: "plugin").
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
task_id: str,
|
||||||
|
add_log_fn: Callable,
|
||||||
|
source: str = "plugin"
|
||||||
|
):
|
||||||
|
self._task_id = task_id
|
||||||
|
self._add_log = add_log_fn
|
||||||
|
self._default_source = source
|
||||||
|
# [/DEF:__init__:Function]
|
||||||
|
|
||||||
|
# [DEF:with_source:Function]
|
||||||
|
# @PURPOSE: Create a sub-logger with a different default source.
|
||||||
|
# @PRE: source is a non-empty string.
|
||||||
|
# @POST: Returns new TaskLogger with the same task_id but different source.
|
||||||
|
# @PARAM: source (str) - New default source.
|
||||||
|
# @RETURN: TaskLogger - New logger instance.
|
||||||
|
def with_source(self, source: str) -> "TaskLogger":
|
||||||
|
"""Create a sub-logger with a different source context."""
|
||||||
|
return TaskLogger(
|
||||||
|
task_id=self._task_id,
|
||||||
|
add_log_fn=self._add_log,
|
||||||
|
source=source
|
||||||
|
)
|
||||||
|
# [/DEF:with_source:Function]
|
||||||
|
|
||||||
|
# [DEF:_log:Function]
|
||||||
|
# @PURPOSE: Internal method to log a message at a given level.
|
||||||
|
# @PRE: level is a valid log level string.
|
||||||
|
# @POST: Log entry added via add_log_fn.
|
||||||
|
# @PARAM: level (str) - Log level (DEBUG, INFO, WARNING, ERROR).
|
||||||
|
# @PARAM: message (str) - Log message.
|
||||||
|
# @PARAM: source (Optional[str]) - Override source for this log entry.
|
||||||
|
# @PARAM: metadata (Optional[Dict]) - Additional structured data.
|
||||||
|
def _log(
|
||||||
|
self,
|
||||||
|
level: str,
|
||||||
|
message: str,
|
||||||
|
source: Optional[str] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
) -> None:
|
||||||
|
"""Internal logging method."""
|
||||||
|
self._add_log(
|
||||||
|
task_id=self._task_id,
|
||||||
|
level=level,
|
||||||
|
message=message,
|
||||||
|
source=source or self._default_source,
|
||||||
|
metadata=metadata
|
||||||
|
)
|
||||||
|
# [/DEF:_log:Function]
|
||||||
|
|
||||||
|
# [DEF:debug:Function]
|
||||||
|
# @PURPOSE: Log a DEBUG level message.
|
||||||
|
# @PARAM: message (str) - Log message.
|
||||||
|
# @PARAM: source (Optional[str]) - Override source.
|
||||||
|
# @PARAM: metadata (Optional[Dict]) - Additional data.
|
||||||
|
def debug(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
source: Optional[str] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
) -> None:
|
||||||
|
self._log("DEBUG", message, source, metadata)
|
||||||
|
# [/DEF:debug:Function]
|
||||||
|
|
||||||
|
# [DEF:info:Function]
|
||||||
|
# @PURPOSE: Log an INFO level message.
|
||||||
|
# @PARAM: message (str) - Log message.
|
||||||
|
# @PARAM: source (Optional[str]) - Override source.
|
||||||
|
# @PARAM: metadata (Optional[Dict]) - Additional data.
|
||||||
|
def info(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
source: Optional[str] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
) -> None:
|
||||||
|
self._log("INFO", message, source, metadata)
|
||||||
|
# [/DEF:info:Function]
|
||||||
|
|
||||||
|
# [DEF:warning:Function]
|
||||||
|
# @PURPOSE: Log a WARNING level message.
|
||||||
|
# @PARAM: message (str) - Log message.
|
||||||
|
# @PARAM: source (Optional[str]) - Override source.
|
||||||
|
# @PARAM: metadata (Optional[Dict]) - Additional data.
|
||||||
|
def warning(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
source: Optional[str] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
) -> None:
|
||||||
|
self._log("WARNING", message, source, metadata)
|
||||||
|
# [/DEF:warning:Function]
|
||||||
|
|
||||||
|
# [DEF:error:Function]
|
||||||
|
# @PURPOSE: Log an ERROR level message.
|
||||||
|
# @PARAM: message (str) - Log message.
|
||||||
|
# @PARAM: source (Optional[str]) - Override source.
|
||||||
|
# @PARAM: metadata (Optional[Dict]) - Additional data.
|
||||||
|
def error(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
source: Optional[str] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None
|
||||||
|
) -> None:
|
||||||
|
self._log("ERROR", message, source, metadata)
|
||||||
|
# [/DEF:error:Function]
|
||||||
|
|
||||||
|
# [DEF:progress:Function]
|
||||||
|
# @PURPOSE: Log a progress update with percentage.
|
||||||
|
# @PRE: percent is between 0 and 100.
|
||||||
|
# @POST: Log entry with progress metadata added.
|
||||||
|
# @PARAM: message (str) - Progress message.
|
||||||
|
# @PARAM: percent (float) - Progress percentage (0-100).
|
||||||
|
# @PARAM: source (Optional[str]) - Override source.
|
||||||
|
def progress(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
percent: float,
|
||||||
|
source: Optional[str] = None
|
||||||
|
) -> None:
|
||||||
|
"""Log a progress update with percentage."""
|
||||||
|
metadata = {"progress": min(100, max(0, percent))}
|
||||||
|
self._log("INFO", message, source, metadata)
|
||||||
|
# [/DEF:progress:Function]
|
||||||
|
|
||||||
|
# [/DEF:TaskLogger:Class]
|
||||||
|
|
||||||
|
# [/DEF:TaskLoggerModule:Module]
|
||||||
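Beyond the docstring example above, the logger only needs a callable matching TaskManager._add_log's keyword signature; below is a minimal stand-alone demo with a plain function as the sink (everything here is illustrative).

# Stand-alone demo: TaskLogger writing through a plain callable instead of TaskManager._add_log.
def collect(task_id, level, message, source="system", metadata=None, context=None):
    print(f"[{task_id}] {level} ({source}): {message} {metadata or ''}")

log = TaskLogger(task_id="demo-1", add_log_fn=collect, source="plugin")
log.info("starting")
log.with_source("git").warning("detached HEAD", metadata={"branch": None})
log.progress("halfway there", percent=50)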
@@ -11,7 +11,7 @@
|
|||||||
# [SECTION: IMPORTS]
|
# [SECTION: IMPORTS]
|
||||||
import pandas as pd # type: ignore
|
import pandas as pd # type: ignore
|
||||||
import psycopg2 # type: ignore
|
import psycopg2 # type: ignore
|
||||||
from typing import Dict, List, Optional, Any
|
from typing import Dict, Optional, Any
|
||||||
from ..logger import logger as app_logger, belief_scope
|
from ..logger import logger as app_logger, belief_scope
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
|
|||||||
@@ -19,13 +19,14 @@ from datetime import date, datetime
|
|||||||
import shutil
|
import shutil
|
||||||
import zlib
|
import zlib
|
||||||
from dataclasses import dataclass
|
from dataclasses import dataclass
|
||||||
import yaml
|
|
||||||
from ..logger import logger as app_logger, belief_scope
|
from ..logger import logger as app_logger, belief_scope
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:InvalidZipFormatError:Class]
|
# [DEF:InvalidZipFormatError:Class]
|
||||||
|
# @PURPOSE: Exception raised when a file is not a valid ZIP archive.
|
||||||
class InvalidZipFormatError(Exception):
|
class InvalidZipFormatError(Exception):
|
||||||
pass
|
pass
|
||||||
|
# [/DEF:InvalidZipFormatError:Class]
|
||||||
|
|
||||||
# [DEF:create_temp_file:Function]
|
# [DEF:create_temp_file:Function]
|
||||||
# @PURPOSE: Context manager that creates a temporary file or directory with guaranteed cleanup.
|
# @PURPOSE: Context manager that creates a temporary file or directory with guaranteed cleanup.
|
||||||
|
|||||||
@@ -20,31 +20,71 @@ from ..logger import logger as app_logger, belief_scope
 # [/SECTION]

 # [DEF:SupersetAPIError:Class]
+# @PURPOSE: Base exception for all Superset API related errors.
 class SupersetAPIError(Exception):
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the exception with a message and context.
+    # @PRE: message is a string, context is a dict.
+    # @POST: Exception is initialized with context.
     def __init__(self, message: str = "Superset API error", **context: Any):
-        self.context = context
-        super().__init__(f"[API_FAILURE] {message} | Context: {self.context}")
+        with belief_scope("SupersetAPIError.__init__"):
+            self.context = context
+            super().__init__(f"[API_FAILURE] {message} | Context: {self.context}")
+    # [/DEF:__init__:Function]
+# [/DEF:SupersetAPIError:Class]

 # [DEF:AuthenticationError:Class]
+# @PURPOSE: Exception raised when authentication fails.
 class AuthenticationError(SupersetAPIError):
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the authentication error.
+    # @PRE: message is a string, context is a dict.
+    # @POST: AuthenticationError is initialized.
     def __init__(self, message: str = "Authentication failed", **context: Any):
-        super().__init__(message, type="authentication", **context)
+        with belief_scope("AuthenticationError.__init__"):
+            super().__init__(message, type="authentication", **context)
+    # [/DEF:__init__:Function]
+# [/DEF:AuthenticationError:Class]

 # [DEF:PermissionDeniedError:Class]
+# @PURPOSE: Exception raised when access is denied.
 class PermissionDeniedError(AuthenticationError):
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the permission denied error.
+    # @PRE: message is a string, context is a dict.
+    # @POST: PermissionDeniedError is initialized.
     def __init__(self, message: str = "Permission denied", **context: Any):
-        super().__init__(message, **context)
+        with belief_scope("PermissionDeniedError.__init__"):
+            super().__init__(message, **context)
+    # [/DEF:__init__:Function]
+# [/DEF:PermissionDeniedError:Class]

 # [DEF:DashboardNotFoundError:Class]
+# @PURPOSE: Exception raised when a dashboard cannot be found.
 class DashboardNotFoundError(SupersetAPIError):
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the not found error with resource ID.
+    # @PRE: resource_id is provided.
+    # @POST: DashboardNotFoundError is initialized.
     def __init__(self, resource_id: Union[int, str], message: str = "Dashboard not found", **context: Any):
-        super().__init__(f"Dashboard '{resource_id}' {message}", subtype="not_found", resource_id=resource_id, **context)
+        with belief_scope("DashboardNotFoundError.__init__"):
+            super().__init__(f"Dashboard '{resource_id}' {message}", subtype="not_found", resource_id=resource_id, **context)
+    # [/DEF:__init__:Function]
+# [/DEF:DashboardNotFoundError:Class]

 # [DEF:NetworkError:Class]
+# @PURPOSE: Exception raised when a network level error occurs.
 class NetworkError(Exception):
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the network error.
+    # @PRE: message is a string.
+    # @POST: NetworkError is initialized.
     def __init__(self, message: str = "Network connection failed", **context: Any):
-        self.context = context
-        super().__init__(f"[NETWORK_FAILURE] {message} | Context: {self.context}")
+        with belief_scope("NetworkError.__init__"):
+            self.context = context
+            super().__init__(f"[NETWORK_FAILURE] {message} | Context: {self.context}")
+    # [/DEF:__init__:Function]
+# [/DEF:NetworkError:Class]

 # [DEF:APIClient:Class]
 # @PURPOSE: Encapsulates the HTTP logic for working with the API, including sessions, authentication, and request handling.
@@ -100,7 +140,16 @@ class APIClient:
             app_logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
             try:
                 login_url = f"{self.base_url}/security/login"
+                # Log the payload keys and values (masking password)
+                masked_auth = {k: ("******" if k == "password" else v) for k, v in self.auth.items()}
+                app_logger.info(f"[authenticate][Debug] Login URL: {login_url}")
+                app_logger.info(f"[authenticate][Debug] Auth payload: {masked_auth}")
+
                 response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
+
+                if response.status_code != 200:
+                    app_logger.error(f"[authenticate][Error] Status: {response.status_code}, Response: {response.text}")
+
                 response.raise_for_status()
                 access_token = response.json()["access_token"]

@@ -113,6 +162,9 @@ class APIClient:
                 app_logger.info("[authenticate][Exit] Authenticated successfully.")
                 return self._tokens
             except requests.exceptions.HTTPError as e:
+                status_code = e.response.status_code if e.response is not None else None
+                if status_code in [502, 503, 504]:
+                    raise NetworkError(f"Environment unavailable during authentication (Status {status_code})", status_code=status_code) from e
                 raise AuthenticationError(f"Authentication failed: {e}") from e
             except (requests.exceptions.RequestException, KeyError) as e:
                 raise NetworkError(f"Network or parsing error during authentication: {e}") from e
@@ -125,7 +177,8 @@ class APIClient:
     # @POST: Returns headers including auth tokens.
     def headers(self) -> Dict[str, str]:
         with belief_scope("headers"):
-            if not self._authenticated: self.authenticate()
+            if not self._authenticated:
+                self.authenticate()
             return {
                 "Authorization": f"Bearer {self._tokens['access_token']}",
                 "X-CSRFToken": self._tokens.get("csrf_token", ""),
@@ -148,7 +201,8 @@ class APIClient:
         with belief_scope("request"):
             full_url = f"{self.base_url}{endpoint}"
             _headers = self.headers.copy()
-            if headers: _headers.update(headers)
+            if headers:
+                _headers.update(headers)

             try:
                 response = self.session.request(method, full_url, headers=_headers, **kwargs)
@@ -169,9 +223,14 @@ class APIClient:
     def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
         with belief_scope("_handle_http_error"):
             status_code = e.response.status_code
-            if status_code == 404: raise DashboardNotFoundError(endpoint) from e
-            if status_code == 403: raise PermissionDeniedError() from e
-            if status_code == 401: raise AuthenticationError() from e
+            if status_code == 502 or status_code == 503 or status_code == 504:
+                raise NetworkError(f"Environment unavailable (Status {status_code})", status_code=status_code) from e
+            if status_code == 404:
+                raise DashboardNotFoundError(endpoint) from e
+            if status_code == 403:
+                raise PermissionDeniedError() from e
+            if status_code == 401:
+                raise AuthenticationError() from e
             raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
     # [/DEF:_handle_http_error:Function]

@@ -183,9 +242,12 @@ class APIClient:
     # @POST: Raises a NetworkError.
     def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
         with belief_scope("_handle_network_error"):
-            if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
-            elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
-            else: msg = f"Unknown network error: {e}"
+            if isinstance(e, requests.exceptions.Timeout):
+                msg = "Request timeout"
+            elif isinstance(e, requests.exceptions.ConnectionError):
+                msg = "Connection error"
+            else:
+                msg = f"Unknown network error: {e}"
             raise NetworkError(msg, url=url) from e
     # [/DEF:_handle_network_error:Function]

@@ -202,7 +264,9 @@ class APIClient:
     def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
         with belief_scope("upload_file"):
             full_url = f"{self.base_url}{endpoint}"
-            _headers = self.headers.copy(); _headers.pop('Content-Type', None)
+            _headers = self.headers.copy()
+            _headers.pop('Content-Type', None)

             file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")
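Callers of APIClient can branch on the exception hierarchy above. A minimal sketch of a call site, assuming the module path used elsewhere in this diff (backend/src/core/utils/network.py) and an APIClient.request(method, endpoint) method as suggested by the hunk above; the fetch_dashboard helper itself is hypothetical:

# Hypothetical call site; 'client' is an already-configured APIClient instance.
from backend.src.core.utils.network import (
    AuthenticationError,
    DashboardNotFoundError,
    NetworkError,
    SupersetAPIError,
)

def fetch_dashboard(client, dashboard_id: int):
    try:
        return client.request("GET", f"/dashboard/{dashboard_id}")
    except DashboardNotFoundError:
        return None        # 404 -> treat as missing
    except AuthenticationError:
        raise              # 401/403 -> surface to the user
    except NetworkError:
        raise              # 502/503/504 or transport failure -> retryable
    except SupersetAPIError:
        raise              # any other API error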
@@ -1,16 +1,23 @@
 # [DEF:Dependencies:Module]
-# @SEMANTICS: dependency, injection, singleton, factory
+# @SEMANTICS: dependency, injection, singleton, factory, auth, jwt
 # @PURPOSE: Manages the creation and provision of shared application dependencies, such as the PluginLoader and TaskManager, to avoid circular imports.
 # @LAYER: Core
 # @RELATION: Used by the main app and API routers to get access to shared instances.

 from pathlib import Path
+from fastapi import Depends, HTTPException, status
+from fastapi.security import OAuth2PasswordBearer
+from jose import JWTError
 from .core.plugin_loader import PluginLoader
 from .core.task_manager import TaskManager
 from .core.config_manager import ConfigManager
 from .core.scheduler import SchedulerService
-from .core.database import init_db
-from .core.logger import logger, belief_scope
+from .services.resource_service import ResourceService
+from .core.database import init_db, get_auth_db
+from .core.logger import logger
+from .core.auth.jwt import decode_token
+from .core.auth.repository import AuthRepository
+from .models.auth import User

 # Initialize singletons
 # Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -28,8 +35,7 @@ init_db()
 # @RETURN: ConfigManager - The shared config manager instance.
 def get_config_manager() -> ConfigManager:
     """Dependency injector for the ConfigManager."""
-    with belief_scope("get_config_manager"):
-        return config_manager
+    return config_manager
 # [/DEF:get_config_manager:Function]

 plugin_dir = Path(__file__).parent / "plugins"
@@ -44,6 +50,9 @@ logger.info("TaskManager initialized")
 scheduler_service = SchedulerService(task_manager, config_manager)
 logger.info("SchedulerService initialized")

+resource_service = ResourceService()
+logger.info("ResourceService initialized")
+
 # [DEF:get_plugin_loader:Function]
 # @PURPOSE: Dependency injector for the PluginLoader.
 # @PRE: Global plugin_loader must be initialized.
@@ -51,8 +60,7 @@ logger.info("SchedulerService initialized")
 # @RETURN: PluginLoader - The shared plugin loader instance.
 def get_plugin_loader() -> PluginLoader:
     """Dependency injector for the PluginLoader."""
-    with belief_scope("get_plugin_loader"):
-        return plugin_loader
+    return plugin_loader
 # [/DEF:get_plugin_loader:Function]

 # [DEF:get_task_manager:Function]
@@ -62,8 +70,7 @@ def get_plugin_loader() -> PluginLoader:
 # @RETURN: TaskManager - The shared task manager instance.
 def get_task_manager() -> TaskManager:
     """Dependency injector for the TaskManager."""
-    with belief_scope("get_task_manager"):
-        return task_manager
+    return task_manager
 # [/DEF:get_task_manager:Function]

 # [DEF:get_scheduler_service:Function]
@@ -73,8 +80,81 @@ def get_task_manager() -> TaskManager:
 # @RETURN: SchedulerService - The shared scheduler service instance.
 def get_scheduler_service() -> SchedulerService:
     """Dependency injector for the SchedulerService."""
-    with belief_scope("get_scheduler_service"):
-        return scheduler_service
+    return scheduler_service
 # [/DEF:get_scheduler_service:Function]

+# [DEF:get_resource_service:Function]
+# @PURPOSE: Dependency injector for the ResourceService.
+# @PRE: Global resource_service must be initialized.
+# @POST: Returns shared ResourceService instance.
+# @RETURN: ResourceService - The shared resource service instance.
+def get_resource_service() -> ResourceService:
+    """Dependency injector for the ResourceService."""
+    return resource_service
+# [/DEF:get_resource_service:Function]
+
+# [DEF:oauth2_scheme:Variable]
+# @PURPOSE: OAuth2 password bearer scheme for token extraction.
+oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/auth/login")
+# [/DEF:oauth2_scheme:Variable]
+
+# [DEF:get_current_user:Function]
+# @PURPOSE: Dependency for retrieving the currently authenticated user from a JWT.
+# @PRE: JWT token provided in Authorization header.
+# @POST: Returns the User object if token is valid.
+# @THROW: HTTPException 401 if token is invalid or user not found.
+# @PARAM: token (str) - Extracted JWT token.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: User - The authenticated user.
+def get_current_user(token: str = Depends(oauth2_scheme), db = Depends(get_auth_db)):
+    credentials_exception = HTTPException(
+        status_code=status.HTTP_401_UNAUTHORIZED,
+        detail="Could not validate credentials",
+        headers={"WWW-Authenticate": "Bearer"},
+    )
+    try:
+        payload = decode_token(token)
+        username: str = payload.get("sub")
+        if username is None:
+            raise credentials_exception
+    except JWTError:
+        raise credentials_exception
+
+    repo = AuthRepository(db)
+    user = repo.get_user_by_username(username)
+    if user is None:
+        raise credentials_exception
+    return user
+# [/DEF:get_current_user:Function]
+
+# [DEF:has_permission:Function]
+# @PURPOSE: Dependency for checking if the current user has a specific permission.
+# @PRE: User is authenticated.
+# @POST: Returns True if user has permission.
+# @THROW: HTTPException 403 if permission is denied.
+# @PARAM: resource (str) - The resource identifier.
+# @PARAM: action (str) - The action identifier (READ, EXECUTE, WRITE).
+# @RETURN: User - The authenticated user if permission granted.
+def has_permission(resource: str, action: str):
+    def permission_checker(current_user: User = Depends(get_current_user)):
+        # Union of all permissions across all roles
+        for role in current_user.roles:
+            for perm in role.permissions:
+                if perm.resource == resource and perm.action == action:
+                    return current_user
+
+        # Special case for Admin role (full access)
+        if any(role.name == "Admin" for role in current_user.roles):
+            return current_user
+
+        from .core.auth.logger import log_security_event
+        log_security_event("PERMISSION_DENIED", current_user.username, {"resource": resource, "action": action})
+
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=f"Permission denied for {resource}:{action}"
+        )
+    return permission_checker
+# [/DEF:has_permission:Function]
+
 # [/DEF:Dependencies:Module]
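Because has_permission returns a dependency, route handlers can guard themselves declaratively. A minimal sketch of how a router might wire it up, assuming the module layout shown in this diff; the router prefix and endpoint are hypothetical:

# Hypothetical router module; imports assume backend/src/dependencies.py as shown above.
from fastapi import APIRouter, Depends

from ..dependencies import has_permission, get_task_manager
from ..models.auth import User

router = APIRouter(prefix="/api/plugins")

@router.post("/backup/run")
async def run_backup(
    params: dict,
    # Resolves get_current_user first, then checks the (resource, action) pair;
    # a 403 is raised before the handler body runs if the check fails.
    user: User = Depends(has_permission("plugin:backup", "EXECUTE")),
    task_manager = Depends(get_task_manager),
):
    return {"started_by": user.username}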
105 backend/src/models/auth.py Normal file
@@ -0,0 +1,105 @@
# [DEF:backend.src.models.auth:Module]
#
# @TIER: STANDARD
# @SEMANTICS: auth, models, user, role, permission, sqlalchemy
# @PURPOSE: SQLAlchemy models for multi-user authentication and authorization.
# @LAYER: Domain
# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base
#
# @INVARIANT: Usernames and emails must be unique.

# [SECTION: IMPORTS]
import uuid
from datetime import datetime
from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Table
from sqlalchemy.orm import relationship
from .mapping import Base
# [/SECTION]

# [DEF:generate_uuid:Function]
# @PURPOSE: Generates a unique UUID string.
# @POST: Returns a string representation of a new UUID.
def generate_uuid():
    return str(uuid.uuid4())
# [/DEF:generate_uuid:Function]

# [DEF:user_roles:Table]
# @PURPOSE: Association table for many-to-many relationship between Users and Roles.
user_roles = Table(
    "user_roles",
    Base.metadata,
    Column("user_id", String, ForeignKey("users.id"), primary_key=True),
    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
)
# [/DEF:user_roles:Table]

# [DEF:role_permissions:Table]
# @PURPOSE: Association table for many-to-many relationship between Roles and Permissions.
role_permissions = Table(
    "role_permissions",
    Base.metadata,
    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
    Column("permission_id", String, ForeignKey("permissions.id"), primary_key=True),
)
# [/DEF:role_permissions:Table]

# [DEF:User:Class]
# @PURPOSE: Represents an identity that can authenticate to the system.
# @RELATION: HAS_MANY -> Role (via user_roles)
class User(Base):
    __tablename__ = "users"

    id = Column(String, primary_key=True, default=generate_uuid)
    username = Column(String, unique=True, index=True, nullable=False)
    email = Column(String, unique=True, index=True, nullable=True)
    password_hash = Column(String, nullable=True)
    auth_source = Column(String, default="LOCAL")  # LOCAL or ADFS
    is_active = Column(Boolean, default=True)
    created_at = Column(DateTime, default=datetime.utcnow)
    last_login = Column(DateTime, nullable=True)

    roles = relationship("Role", secondary=user_roles, back_populates="users")
# [/DEF:User:Class]

# [DEF:Role:Class]
# @PURPOSE: Represents a collection of permissions.
# @RELATION: HAS_MANY -> User (via user_roles)
# @RELATION: HAS_MANY -> Permission (via role_permissions)
class Role(Base):
    __tablename__ = "roles"

    id = Column(String, primary_key=True, default=generate_uuid)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(String, nullable=True)

    users = relationship("User", secondary=user_roles, back_populates="roles")
    permissions = relationship("Permission", secondary=role_permissions, back_populates="roles")
# [/DEF:Role:Class]

# [DEF:Permission:Class]
# @PURPOSE: Represents a specific capability within the system.
# @RELATION: HAS_MANY -> Role (via role_permissions)
class Permission(Base):
    __tablename__ = "permissions"

    id = Column(String, primary_key=True, default=generate_uuid)
    resource = Column(String, nullable=False)  # e.g. "plugin:backup"
    action = Column(String, nullable=False)  # e.g. "READ", "EXECUTE", "WRITE"

    roles = relationship("Role", secondary=role_permissions, back_populates="permissions")
# [/DEF:Permission:Class]

# [DEF:ADGroupMapping:Class]
# @PURPOSE: Maps an Active Directory group to a local System Role.
# @RELATION: DEPENDS_ON -> Role
class ADGroupMapping(Base):
    __tablename__ = "ad_group_mappings"

    id = Column(String, primary_key=True, default=generate_uuid)
    ad_group = Column(String, unique=True, index=True, nullable=False)
    role_id = Column(String, ForeignKey("roles.id"), nullable=False)

    role = relationship("Role")
# [/DEF:ADGroupMapping:Class]

# [/DEF:backend.src.models.auth:Module]
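A minimal sketch of how these models compose, assuming a SQLAlchemy sessionmaker bound to auth.db (the SessionLocal name and the seed helper are hypothetical, not part of this changeset):

# Hypothetical bootstrap snippet; SessionLocal is assumed to be a sessionmaker bound to auth.db.
from backend.src.models.auth import User, Role, Permission

def seed_admin(SessionLocal):
    db = SessionLocal()
    try:
        execute_backup = Permission(resource="plugin:backup", action="EXECUTE")
        admin_role = Role(name="Admin", description="Full access", permissions=[execute_backup])
        admin_user = User(username="admin", auth_source="LOCAL", roles=[admin_role])
        db.add(admin_user)  # cascades through the user_roles / role_permissions association tables
        db.commit()
    finally:
        db.close()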
@@ -1,5 +1,6 @@
 # [DEF:backend.src.models.connection:Module]
 #
+# @TIER: TRIVIAL
 # @SEMANTICS: database, connection, configuration, sqlalchemy, sqlite
 # @PURPOSE: Defines the database schema for external database connection configurations.
 # @LAYER: Domain
@@ -15,6 +16,7 @@ import uuid
 # [/SECTION]

 # [DEF:ConnectionConfig:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Stores credentials for external databases used for column mapping.
 class ConnectionConfig(Base):
     __tablename__ = "connection_configs"
@@ -1,4 +1,5 @@
 # [DEF:backend.src.models.dashboard:Module]
+# @TIER: STANDARD
 # @SEMANTICS: dashboard, model, metadata, migration
 # @PURPOSE: Defines data models for dashboard metadata and selection.
 # @LAYER: Model
@@ -8,6 +9,7 @@ from pydantic import BaseModel
 from typing import List

 # [DEF:DashboardMetadata:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents a dashboard available for migration.
 class DashboardMetadata(BaseModel):
     id: int
@@ -17,6 +19,7 @@ class DashboardMetadata(BaseModel):
 # [/DEF:DashboardMetadata:Class]

 # [DEF:DashboardSelection:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents the user's selection of dashboards to migrate.
 class DashboardSelection(BaseModel):
     selected_ids: List[int]
73 backend/src/models/git.py Normal file
@@ -0,0 +1,73 @@
# [DEF:GitModels:Module]
# @TIER: TRIVIAL
# @SEMANTICS: git, models, sqlalchemy, database, schema
# @PURPOSE: Git-specific SQLAlchemy models for configuration and repository tracking.
# @LAYER: Model
# @RELATION: specs/011-git-integration-dashboard/data-model.md

import enum
from datetime import datetime
from sqlalchemy import Column, String, Integer, DateTime, Enum, ForeignKey, Boolean
import uuid
from src.core.database import Base

class GitProvider(str, enum.Enum):
    GITHUB = "GITHUB"
    GITLAB = "GITLAB"
    GITEA = "GITEA"

class GitStatus(str, enum.Enum):
    CONNECTED = "CONNECTED"
    FAILED = "FAILED"
    UNKNOWN = "UNKNOWN"

class SyncStatus(str, enum.Enum):
    CLEAN = "CLEAN"
    DIRTY = "DIRTY"
    CONFLICT = "CONFLICT"

# [DEF:GitServerConfig:Class]
# @TIER: TRIVIAL
# @PURPOSE: Configuration for a Git server connection.
class GitServerConfig(Base):
    __tablename__ = "git_server_configs"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    name = Column(String(255), nullable=False)
    provider = Column(Enum(GitProvider), nullable=False)
    url = Column(String(255), nullable=False)
    pat = Column(String(255), nullable=False)  # PERSONAL ACCESS TOKEN
    default_repository = Column(String(255), nullable=True)
    status = Column(Enum(GitStatus), default=GitStatus.UNKNOWN)
    last_validated = Column(DateTime, default=datetime.utcnow)
# [/DEF:GitServerConfig:Class]

# [DEF:GitRepository:Class]
# @TIER: TRIVIAL
# @PURPOSE: Tracking for a local Git repository linked to a dashboard.
class GitRepository(Base):
    __tablename__ = "git_repositories"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    dashboard_id = Column(Integer, nullable=False, unique=True)
    config_id = Column(String(36), ForeignKey("git_server_configs.id"), nullable=False)
    remote_url = Column(String(255), nullable=False)
    local_path = Column(String(255), nullable=False)
    current_branch = Column(String(255), default="main")
    sync_status = Column(Enum(SyncStatus), default=SyncStatus.CLEAN)
# [/DEF:GitRepository:Class]

# [DEF:DeploymentEnvironment:Class]
# @TIER: TRIVIAL
# @PURPOSE: Target Superset environments for dashboard deployment.
class DeploymentEnvironment(Base):
    __tablename__ = "deployment_environments"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    name = Column(String(255), nullable=False)
    superset_url = Column(String(255), nullable=False)
    superset_token = Column(String(255), nullable=False)
    is_active = Column(Boolean, default=True)
# [/DEF:DeploymentEnvironment:Class]

# [/DEF:GitModels:Module]
46 backend/src/models/llm.py Normal file
@@ -0,0 +1,46 @@
# [DEF:backend.src.models.llm:Module]
# @TIER: STANDARD
# @SEMANTICS: llm, models, sqlalchemy, persistence
# @PURPOSE: SQLAlchemy models for LLM provider configuration and validation results.
# @LAYER: Domain
# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base

from sqlalchemy import Column, String, Boolean, DateTime, JSON, Text
from datetime import datetime
import uuid
from .mapping import Base

def generate_uuid():
    return str(uuid.uuid4())

# [DEF:LLMProvider:Class]
# @PURPOSE: SQLAlchemy model for LLM provider configuration.
class LLMProvider(Base):
    __tablename__ = "llm_providers"

    id = Column(String, primary_key=True, default=generate_uuid)
    provider_type = Column(String, nullable=False)  # openai, openrouter, kilo
    name = Column(String, nullable=False)
    base_url = Column(String, nullable=False)
    api_key = Column(String, nullable=False)  # Should be encrypted
    default_model = Column(String, nullable=False)
    is_active = Column(Boolean, default=True)
    created_at = Column(DateTime, default=datetime.utcnow)
# [/DEF:LLMProvider:Class]

# [DEF:ValidationRecord:Class]
# @PURPOSE: SQLAlchemy model for dashboard validation history.
class ValidationRecord(Base):
    __tablename__ = "llm_validation_results"

    id = Column(String, primary_key=True, default=generate_uuid)
    dashboard_id = Column(String, nullable=False, index=True)
    timestamp = Column(DateTime, default=datetime.utcnow)
    status = Column(String, nullable=False)  # PASS, WARN, FAIL
    screenshot_path = Column(String, nullable=True)
    issues = Column(JSON, nullable=False)
    summary = Column(Text, nullable=False)
    raw_response = Column(Text, nullable=True)
# [/DEF:ValidationRecord:Class]

# [/DEF:backend.src.models.llm:Module]
@@ -1,5 +1,6 @@
 # [DEF:backend.src.models.mapping:Module]
 #
+# @TIER: STANDARD
 # @SEMANTICS: database, mapping, environment, migration, sqlalchemy, sqlite
 # @PURPOSE: Defines the database schema for environment metadata and database mappings using SQLAlchemy.
 # @LAYER: Domain
@@ -19,6 +20,7 @@ import enum
 Base = declarative_base()

 # [DEF:MigrationStatus:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Enumeration of possible migration job statuses.
 class MigrationStatus(enum.Enum):
     PENDING = "PENDING"
@@ -29,6 +31,7 @@ class MigrationStatus(enum.Enum):
 # [/DEF:MigrationStatus:Class]

 # [DEF:Environment:Class]
+# @TIER: STANDARD
 # @PURPOSE: Represents a Superset instance environment.
 class Environment(Base):
     __tablename__ = "environments"
@@ -40,6 +43,7 @@ class Environment(Base):
 # [/DEF:Environment:Class]

 # [DEF:DatabaseMapping:Class]
+# @TIER: STANDARD
 # @PURPOSE: Represents a mapping between source and target databases.
 class DatabaseMapping(Base):
     __tablename__ = "database_mappings"
@@ -55,6 +59,7 @@ class DatabaseMapping(Base):
 # [/DEF:DatabaseMapping:Class]

 # [DEF:MigrationJob:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents a single migration execution job.
 class MigrationJob(Base):
     __tablename__ = "migration_jobs"
42 backend/src/models/storage.py Normal file
@@ -0,0 +1,42 @@
# [DEF:backend.src.models.storage:Module]
# @TIER: TRIVIAL
# @SEMANTICS: storage, file, model, pydantic
# @PURPOSE: Data models for the storage system.
# @LAYER: Domain

from datetime import datetime
from enum import Enum
from typing import Optional
from pydantic import BaseModel, Field

# [DEF:FileCategory:Class]
# @TIER: TRIVIAL
# @PURPOSE: Enumeration of supported file categories in the storage system.
class FileCategory(str, Enum):
    BACKUP = "backups"
    REPOSITORY = "repositorys"
# [/DEF:FileCategory:Class]

# [DEF:StorageConfig:Class]
# @TIER: TRIVIAL
# @PURPOSE: Configuration model for the storage system, defining paths and naming patterns.
class StorageConfig(BaseModel):
    root_path: str = Field(default="backups", description="Absolute path to the storage root directory.")
    backup_structure_pattern: str = Field(default="{category}/", description="Pattern for backup directory structure.")
    repo_structure_pattern: str = Field(default="{category}/", description="Pattern for repository directory structure.")
    filename_pattern: str = Field(default="{name}_{timestamp}", description="Pattern for filenames.")
# [/DEF:StorageConfig:Class]

# [DEF:StoredFile:Class]
# @TIER: TRIVIAL
# @PURPOSE: Data model representing metadata for a file stored in the system.
class StoredFile(BaseModel):
    name: str = Field(..., description="Name of the file (including extension).")
    path: str = Field(..., description="Relative path from storage root.")
    size: int = Field(..., ge=0, description="Size of the file in bytes.")
    created_at: datetime = Field(..., description="Creation timestamp.")
    category: FileCategory = Field(..., description="Category of the file.")
    mime_type: Optional[str] = Field(None, description="MIME type of the file.")
# [/DEF:StoredFile:Class]

# [/DEF:backend.src.models.storage:Module]
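A quick sketch of how StoredFile metadata might be assembled from a file on disk, assuming the {root}/{category}/... layout implied by the default patterns above; the describe_file helper is hypothetical:

# Hypothetical helper; uses only the standard library plus the models above.
import mimetypes
from datetime import datetime
from pathlib import Path

from backend.src.models.storage import FileCategory, StoredFile

def describe_file(root: Path, file_path: Path) -> StoredFile:
    stat = file_path.stat()
    return StoredFile(
        name=file_path.name,
        path=str(file_path.relative_to(root)),
        size=stat.st_size,
        created_at=datetime.fromtimestamp(stat.st_ctime),
        # Assumes the first path component under the root is the category directory.
        category=FileCategory(file_path.relative_to(root).parts[0]),
        mime_type=mimetypes.guess_type(file_path.name)[0],
    )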
@@ -1,5 +1,6 @@
 # [DEF:backend.src.models.task:Module]
 #
+# @TIER: TRIVIAL
 # @SEMANTICS: database, task, record, sqlalchemy, sqlite
 # @PURPOSE: Defines the database schema for task execution records.
 # @LAYER: Domain
@@ -8,13 +9,14 @@
 # @INVARIANT: All primary keys are UUID strings.

 # [SECTION: IMPORTS]
-from sqlalchemy import Column, String, DateTime, JSON, ForeignKey
+from sqlalchemy import Column, String, DateTime, JSON, ForeignKey, Text, Integer, Index
 from sqlalchemy.sql import func
 from .mapping import Base
 import uuid
 # [/SECTION]

 # [DEF:TaskRecord:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents a persistent record of a task execution.
 class TaskRecord(Base):
     __tablename__ = "task_records"
@@ -25,11 +27,35 @@ class TaskRecord(Base):
     environment_id = Column(String, ForeignKey("environments.id"), nullable=True)
     started_at = Column(DateTime(timezone=True), nullable=True)
     finished_at = Column(DateTime(timezone=True), nullable=True)
-    logs = Column(JSON, nullable=True)  # Store structured logs as JSON
+    logs = Column(JSON, nullable=True)  # Store structured logs as JSON (legacy, kept for backward compatibility)
     error = Column(String, nullable=True)
     result = Column(JSON, nullable=True)
     created_at = Column(DateTime(timezone=True), server_default=func.now())
     params = Column(JSON, nullable=True)
 # [/DEF:TaskRecord:Class]

+# [DEF:TaskLogRecord:Class]
+# @PURPOSE: Represents a single persistent log entry for a task.
+# @TIER: CRITICAL
+# @RELATION: DEPENDS_ON -> TaskRecord
+# @INVARIANT: Each log entry belongs to exactly one task.
+class TaskLogRecord(Base):
+    __tablename__ = "task_logs"
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    task_id = Column(String, ForeignKey("task_records.id", ondelete="CASCADE"), nullable=False, index=True)
+    timestamp = Column(DateTime(timezone=True), nullable=False, index=True)
+    level = Column(String(16), nullable=False)  # INFO, WARNING, ERROR, DEBUG
+    source = Column(String(64), nullable=False, default="system")  # plugin, superset_api, git, etc.
+    message = Column(Text, nullable=False)
+    metadata_json = Column(Text, nullable=True)  # JSON string for additional metadata
+
+    # Composite indexes for efficient filtering
+    __table_args__ = (
+        Index('ix_task_logs_task_timestamp', 'task_id', 'timestamp'),
+        Index('ix_task_logs_task_level', 'task_id', 'level'),
+        Index('ix_task_logs_task_source', 'task_id', 'source'),
+    )
+# [/DEF:TaskLogRecord:Class]
+
 # [/DEF:backend.src.models.task:Module]
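The composite indexes above exist so per-task log queries filtered by level or source stay cheap. A minimal sketch of such a read path, assuming a SQLAlchemy session bound to tasks.db (the recent_errors helper is hypothetical):

# Hypothetical read path; 'session' is assumed to be a SQLAlchemy session on tasks.db.
from backend.src.models.task import TaskLogRecord

def recent_errors(session, task_id: str, limit: int = 100):
    # Filter matches ix_task_logs_task_level; ordering uses the indexed timestamp column.
    return (
        session.query(TaskLogRecord)
        .filter(TaskLogRecord.task_id == task_id, TaskLogRecord.level == "ERROR")
        .order_by(TaskLogRecord.timestamp.desc())
        .limit(limit)
        .all()
    )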
@@ -5,13 +5,14 @@
 # @RELATION: IMPLEMENTS -> PluginBase
 # @RELATION: DEPENDS_ON -> superset_tool.client
 # @RELATION: DEPENDS_ON -> superset_tool.utils
+# @RELATION: USES -> TaskContext

-from typing import Dict, Any
+from typing import Dict, Any, Optional
 from pathlib import Path
 from requests.exceptions import RequestException

 from ..core.plugin_base import PluginBase
-from ..core.logger import belief_scope
+from ..core.logger import belief_scope, logger as app_logger
 from ..core.superset_client import SupersetClient
 from ..core.utils.network import SupersetAPIError
 from ..core.utils.fileio import (
@@ -23,6 +24,7 @@ from ..core.utils.fileio import (
     RetentionPolicy
 )
 from ..dependencies import get_config_manager
+from ..core.task_manager.context import TaskContext

 # [DEF:BackupPlugin:Class]
 # @PURPOSE: Implementation of the backup plugin logic.
@@ -75,6 +77,15 @@ class BackupPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the backup plugin.
+    # @RETURN: str - "/tools/backups"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/tools/backups"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for backup plugin parameters.
     # @PRE: Plugin instance exists.
@@ -84,7 +95,7 @@ class BackupPlugin(PluginBase):
         with belief_scope("get_schema"):
             config_manager = get_config_manager()
             envs = [e.name for e in config_manager.get_environments()]
-            default_path = config_manager.get_config().settings.backup_path
+            config_manager.get_config().settings.storage.root_path

             return {
                 "type": "object",
@@ -95,23 +106,18 @@ class BackupPlugin(PluginBase):
                         "description": "The Superset environment to back up.",
                         "enum": envs if envs else [],
                     },
-                    "backup_path": {
-                        "type": "string",
-                        "title": "Backup Path",
-                        "description": "The root directory to save backups to.",
-                        "default": default_path
-                    }
                 },
-                "required": ["env", "backup_path"],
+                "required": ["env"],
             }
     # [/DEF:get_schema:Function]

     # [DEF:execute:Function]
-    # @PURPOSE: Executes the dashboard backup logic.
+    # @PURPOSE: Executes the dashboard backup logic with TaskContext support.
     # @PARAM: params (Dict[str, Any]) - Backup parameters (env, backup_path).
+    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
     # @PRE: Target environment must be configured. params must be a dictionary.
     # @POST: All dashboards are exported and archived.
-    async def execute(self, params: Dict[str, Any]):
+    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
         with belief_scope("execute"):
             config_manager = get_config_manager()
             env_id = params.get("environment_id")
@@ -126,11 +132,18 @@ class BackupPlugin(PluginBase):
             if not env:
                 raise KeyError("env")

-            backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
-            backup_path = Path(backup_path_str)
+            storage_settings = config_manager.get_config().settings.storage
+            # Use 'backups' subfolder within the storage root
+            backup_path = Path(storage_settings.root_path) / "backups"

-            from ..core.logger import logger as app_logger
-            app_logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")
+            # Use TaskContext logger if available, otherwise fall back to app_logger
+            log = context.logger if context else app_logger
+
+            # Create sub-loggers for different components
+            superset_log = log.with_source("superset_api") if context else log
+            storage_log = log.with_source("storage") if context else log
+
+            log.info(f"Starting backup for environment: {env}")

             try:
                 config_manager = get_config_manager()
@@ -144,24 +157,30 @@ class BackupPlugin(PluginBase):
                 client = SupersetClient(env_config)

                 dashboard_count, dashboard_meta = client.get_dashboards()
-                app_logger.info(f"[BackupPlugin][Progress] Found {dashboard_count} dashboards to export in {env}.")
+                superset_log.info(f"Found {dashboard_count} dashboards to export")

                 if dashboard_count == 0:
-                    app_logger.info("[BackupPlugin][Exit] No dashboards to back up.")
+                    log.info("No dashboards to back up")
                     return

-                for db in dashboard_meta:
+                total = len(dashboard_meta)
+                for idx, db in enumerate(dashboard_meta, 1):
                     dashboard_id = db.get('id')
                     dashboard_title = db.get('dashboard_title', 'Unknown Dashboard')
                     if not dashboard_id:
                         continue

+                    # Report progress
+                    progress_pct = (idx / total) * 100
+                    log.progress(f"Backing up dashboard: {dashboard_title}", percent=progress_pct)
+
                     try:
                         dashboard_base_dir_name = sanitize_filename(f"{dashboard_title}")
                         dashboard_dir = backup_path / env.upper() / dashboard_base_dir_name
                         dashboard_dir.mkdir(parents=True, exist_ok=True)

                         zip_content, filename = client.export_dashboard(dashboard_id)
+                        superset_log.debug(f"Exported dashboard: {dashboard_title}")

                         save_and_unpack_dashboard(
                             zip_content=zip_content,
@@ -171,18 +190,19 @@ class BackupPlugin(PluginBase):
                         )

                         archive_exports(str(dashboard_dir), policy=RetentionPolicy())
+                        storage_log.debug(f"Archived dashboard: {dashboard_title}")

                     except (SupersetAPIError, RequestException, IOError, OSError) as db_error:
-                        app_logger.error(f"[BackupPlugin][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
+                        log.error(f"Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}")
                         continue

                 consolidate_archive_folders(backup_path / env.upper())
                 remove_empty_directories(str(backup_path / env.upper()))

-                app_logger.info(f"[BackupPlugin][CoherenceCheck:Passed] Backup logic completed for {env}.")
+                log.info(f"Backup completed successfully for {env}")

             except (RequestException, IOError, KeyError) as e:
-                app_logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
+                log.error(f"Fatal error during backup for {env}: {e}")
                 raise e
     # [/DEF:execute:Function]
 # [/DEF:BackupPlugin:Class]
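The calls above assume the TaskContext logger exposes info/debug/error, with_source, and progress. A throwaway stand-in with that surface can be handy for exercising the plugin outside the task manager; this is purely illustrative and not the project's implementation (the real TaskContext lives in backend/src/core/task_manager/context.py):

# Purely illustrative stand-in; do not confuse with the real TaskContext logger.
class _PrintLogger:
    def __init__(self, source="system"):
        self.source = source

    def with_source(self, source):
        return _PrintLogger(source)

    def _emit(self, level, msg):
        print(f"[{level}][{self.source}] {msg}")

    def info(self, msg): self._emit("INFO", msg)
    def debug(self, msg): self._emit("DEBUG", msg)
    def error(self, msg): self._emit("ERROR", msg)

    def progress(self, msg, percent=None):
        suffix = f" ({percent:.0f}%)" if percent is not None else ""
        self._emit("PROGRESS", msg + suffix)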
@@ -3,6 +3,7 @@
|
|||||||
# @PURPOSE: Implements a plugin for system diagnostics and debugging Superset API responses.
|
# @PURPOSE: Implements a plugin for system diagnostics and debugging Superset API responses.
|
||||||
# @LAYER: Plugins
|
# @LAYER: Plugins
|
||||||
# @RELATION: Inherits from PluginBase. Uses SupersetClient from core.
|
# @RELATION: Inherits from PluginBase. Uses SupersetClient from core.
|
||||||
|
# @RELATION: USES -> TaskContext
|
||||||
# @CONSTRAINT: Must use belief_scope for logging.
|
# @CONSTRAINT: Must use belief_scope for logging.
|
||||||
|
|
||||||
# [SECTION: IMPORTS]
|
# [SECTION: IMPORTS]
|
||||||
@@ -10,6 +11,7 @@ from typing import Dict, Any, Optional
|
|||||||
from ..core.plugin_base import PluginBase
|
from ..core.plugin_base import PluginBase
|
||||||
from ..core.superset_client import SupersetClient
|
from ..core.superset_client import SupersetClient
|
||||||
from ..core.logger import logger, belief_scope
|
from ..core.logger import logger, belief_scope
|
||||||
|
from ..core.task_manager.context import TaskContext
|
||||||
# [/SECTION]
|
# [/SECTION]
|
||||||
|
|
||||||
# [DEF:DebugPlugin:Class]
|
# [DEF:DebugPlugin:Class]
|
||||||
@@ -63,6 +65,15 @@ class DebugPlugin(PluginBase):
|
|||||||
return "1.0.0"
|
return "1.0.0"
|
||||||
# [/DEF:version:Function]
|
# [/DEF:version:Function]
|
||||||
|
|
||||||
|
@property
|
||||||
|
# [DEF:ui_route:Function]
|
||||||
|
# @PURPOSE: Returns the frontend route for the debug plugin.
|
||||||
|
# @RETURN: str - "/tools/debug"
|
||||||
|
def ui_route(self) -> str:
|
||||||
|
with belief_scope("ui_route"):
|
||||||
|
return "/tools/debug"
|
||||||
|
# [/DEF:ui_route:Function]
|
||||||
|
|
||||||
# [DEF:get_schema:Function]
|
# [DEF:get_schema:Function]
|
||||||
# @PURPOSE: Returns the JSON schema for the debug plugin parameters.
|
# @PURPOSE: Returns the JSON schema for the debug plugin parameters.
|
||||||
# @PRE: Plugin instance exists.
|
# @PRE: Plugin instance exists.
|
||||||
@@ -105,20 +116,29 @@ class DebugPlugin(PluginBase):
|
|||||||
# [/DEF:get_schema:Function]
|
# [/DEF:get_schema:Function]
|
||||||
|
|
||||||
# [DEF:execute:Function]
|
# [DEF:execute:Function]
|
||||||
# @PURPOSE: Executes the debug logic.
|
# @PURPOSE: Executes the debug logic with TaskContext support.
|
||||||
# @PARAM: params (Dict[str, Any]) - Debug parameters.
|
# @PARAM: params (Dict[str, Any]) - Debug parameters.
|
||||||
|
# @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
|
||||||
# @PRE: action must be provided in params.
|
# @PRE: action must be provided in params.
|
||||||
# @POST: Debug action is executed and results returned.
|
# @POST: Debug action is executed and results returned.
|
||||||
# @RETURN: Dict[str, Any] - Execution results.
|
# @RETURN: Dict[str, Any] - Execution results.
|
||||||
async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
|
async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None) -> Dict[str, Any]:
|
||||||
with belief_scope("execute"):
|
with belief_scope("execute"):
|
||||||
action = params.get("action")
|
action = params.get("action")
|
||||||
|
|
||||||
|
+        # Use TaskContext logger if available, otherwise fall back to app logger
+        log = context.logger if context else logger
+        debug_log = log.with_source("debug") if context else log
+        superset_log = log.with_source("superset_api") if context else log
+
+        debug_log.info(f"Executing debug action: {action}")
+
         if action == "test-db-api":
-            return await self._test_db_api(params)
+            return await self._test_db_api(params, superset_log)
         elif action == "get-dataset-structure":
-            return await self._get_dataset_structure(params)
+            return await self._get_dataset_structure(params, superset_log)
         else:
+            debug_log.error(f"Unknown action: {action}")
             raise ValueError(f"Unknown action: {action}")
     # [/DEF:execute:Function]

@@ -127,33 +147,37 @@ class DebugPlugin(PluginBase):
     # @PRE: source_env and target_env params exist in params.
     # @POST: Returns DB counts for both envs.
     # @PARAM: params (Dict) - Plugin parameters.
+    # @PARAM: log - Logger instance for superset_api source.
     # @RETURN: Dict - Comparison results.
-    async def _test_db_api(self, params: Dict[str, Any]) -> Dict[str, Any]:
+    async def _test_db_api(self, params: Dict[str, Any], log) -> Dict[str, Any]:
         with belief_scope("_test_db_api"):
             source_env_name = params.get("source_env")
             target_env_name = params.get("target_env")

             if not source_env_name or not target_env_name:
                 raise ValueError("source_env and target_env are required for test-db-api")

             from ..dependencies import get_config_manager
             config_manager = get_config_manager()

             results = {}
             for name in [source_env_name, target_env_name]:
+                log.info(f"Testing database API for environment: {name}")
                 env_config = config_manager.get_environment(name)
                 if not env_config:
+                    log.error(f"Environment '{name}' not found.")
                     raise ValueError(f"Environment '{name}' not found.")

                 client = SupersetClient(env_config)
                 client.authenticate()
                 count, dbs = client.get_databases()
+                log.debug(f"Found {count} databases in {name}")
                 results[name] = {
                     "count": count,
                     "databases": dbs
                 }

             return results
     # [/DEF:_test_db_api:Function]

     # [DEF:_get_dataset_structure:Function]
@@ -161,26 +185,31 @@ class DebugPlugin(PluginBase):
     # @PRE: env and dataset_id params exist in params.
     # @POST: Returns dataset JSON structure.
     # @PARAM: params (Dict) - Plugin parameters.
+    # @PARAM: log - Logger instance for superset_api source.
     # @RETURN: Dict - Dataset structure.
-    async def _get_dataset_structure(self, params: Dict[str, Any]) -> Dict[str, Any]:
+    async def _get_dataset_structure(self, params: Dict[str, Any], log) -> Dict[str, Any]:
         with belief_scope("_get_dataset_structure"):
             env_name = params.get("env")
             dataset_id = params.get("dataset_id")

             if not env_name or dataset_id is None:
                 raise ValueError("env and dataset_id are required for get-dataset-structure")

+            log.info(f"Fetching structure for dataset {dataset_id} in {env_name}")
+
             from ..dependencies import get_config_manager
             config_manager = get_config_manager()
             env_config = config_manager.get_environment(env_name)
             if not env_config:
+                log.error(f"Environment '{env_name}' not found.")
                 raise ValueError(f"Environment '{env_name}' not found.")

             client = SupersetClient(env_config)
             client.authenticate()

             dataset_response = client.get_dataset(dataset_id)
+            log.debug(f"Retrieved dataset structure for {dataset_id}")
             return dataset_response.get('result') or {}
     # [/DEF:_get_dataset_structure:Function]

 # [/DEF:DebugPlugin:Class]
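The refactored debug actions now receive a source-tagged logger instead of reaching for the module logger. A minimal sketch of how a caller might exercise the new (params, log) signatures; the stub logger and its with_source behaviour are illustrative assumptions, not part of this changeset:

# Hypothetical illustration only: exercises the new (params, log) signatures.
import asyncio

class _StubLogger:
    def with_source(self, source: str) -> "_StubLogger":
        # The real TaskContext logger tags records with `source`; this stub does not.
        return self
    def info(self, msg): print(msg)
    def error(self, msg): print(msg)
    def debug(self, msg): print(msg)

async def run_debug_action(plugin, params):
    log = _StubLogger().with_source("superset_api")
    return await plugin._test_db_api(params, log)

# asyncio.run(run_debug_action(DebugPlugin(), {"source_env": "dev", "target_env": "prod"}))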
66 backend/src/plugins/git/llm_extension.py Normal file
@@ -0,0 +1,66 @@
# [DEF:backend/src/plugins/git/llm_extension:Module]
# @TIER: STANDARD
# @SEMANTICS: git, llm, commit
# @PURPOSE: LLM-based extensions for the Git plugin, specifically for commit message generation.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.plugins.llm_analysis.service.LLMClient

from typing import List
from tenacity import retry, stop_after_attempt, wait_exponential
from ..llm_analysis.service import LLMClient
from ...core.logger import belief_scope, logger

# [DEF:GitLLMExtension:Class]
# @PURPOSE: Provides LLM capabilities to the Git plugin.
class GitLLMExtension:
    def __init__(self, client: LLMClient):
        self.client = client

    # [DEF:suggest_commit_message:Function]
    # @PURPOSE: Generates a suggested commit message based on a diff and history.
    # @PARAM: diff (str) - The git diff of staged changes.
    # @PARAM: history (List[str]) - Recent commit messages for context.
    # @RETURN: str - The suggested commit message.
    @retry(
        stop=stop_after_attempt(2),
        wait=wait_exponential(multiplier=1, min=2, max=10),
        reraise=True
    )
    async def suggest_commit_message(self, diff: str, history: List[str]) -> str:
        with belief_scope("suggest_commit_message"):
            history_text = "\n".join(history)
            prompt = f"""
Generate a concise and professional git commit message based on the following diff and recent history.
Use Conventional Commits format (e.g., feat: ..., fix: ..., docs: ...).

Recent History:
{history_text}

Diff:
{diff}

Commit Message:
"""

            logger.debug(f"[suggest_commit_message] Calling LLM with model: {self.client.default_model}")
            response = await self.client.client.chat.completions.create(
                model=self.client.default_model,
                messages=[{"role": "user", "content": prompt}],
                temperature=0.7
            )

            logger.debug(f"[suggest_commit_message] LLM Response: {response}")

            if not response or not hasattr(response, 'choices') or not response.choices:
                error_info = getattr(response, 'error', 'No choices in response')
                logger.error(f"[suggest_commit_message] Invalid LLM response. Error info: {error_info}")

                # If it's a timeout/provider error, we might want to throw to trigger retry if decorated
                # but for now we return a safe fallback to avoid UI crash
                return "Update dashboard configurations (LLM generation failed)"

            return response.choices[0].message.content.strip()
    # [/DEF:suggest_commit_message:Function]
# [/DEF:GitLLMExtension:Class]

# [/DEF:backend/src/plugins/git/llm_extension:Module]
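For context, a minimal usage sketch of this extension, assuming an LLMClient configured the same way the llm_analysis plugin builds it; the provider values and absolute import paths are placeholders:

# Hypothetical usage sketch; provider values are placeholders.
import asyncio
from backend.src.plugins.llm_analysis.service import LLMClient
from backend.src.plugins.llm_analysis.models import LLMProviderType
from backend.src.plugins.git.llm_extension import GitLLMExtension

async def main():
    client = LLMClient(
        provider_type=LLMProviderType.OPENROUTER,
        api_key="sk-...",                          # placeholder
        base_url="https://openrouter.ai/api/v1",   # placeholder
        default_model="openai/gpt-4o-mini",        # placeholder
    )
    ext = GitLLMExtension(client)
    message = await ext.suggest_commit_message(
        diff="--- a/metadata.yaml\n+++ b/metadata.yaml\n...",
        history=["feat: add sales dashboard", "fix: correct dataset refs"],
    )
    print(message)

# asyncio.run(main())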
399 backend/src/plugins/git_plugin.py Normal file
@@ -0,0 +1,399 @@
# [DEF:backend.src.plugins.git_plugin:Module]
#
# @SEMANTICS: git, plugin, dashboard, version_control, sync, deploy
# @PURPOSE: Provides a plugin for versioning and deploying Superset dashboards.
# @LAYER: Plugin
# @RELATION: INHERITS_FROM -> src.core.plugin_base.PluginBase
# @RELATION: USES -> src.services.git_service.GitService
# @RELATION: USES -> src.core.superset_client.SupersetClient
# @RELATION: USES -> src.core.config_manager.ConfigManager
# @RELATION: USES -> TaskContext
#
# @INVARIANT: All Git operations must go through GitService.
# @CONSTRAINT: The plugin only works with unpacked Superset YAML exports.

# [SECTION: IMPORTS]
import os
import io
import shutil
import zipfile
from pathlib import Path
from typing import Dict, Any, Optional
from src.core.plugin_base import PluginBase
from src.services.git_service import GitService
from src.core.logger import logger as app_logger, belief_scope
from src.core.config_manager import ConfigManager
from src.core.superset_client import SupersetClient
from src.core.task_manager.context import TaskContext
# [/SECTION]

# [DEF:GitPlugin:Class]
# @PURPOSE: Implementation of the Git Integration plugin for dashboard version control.
class GitPlugin(PluginBase):

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the plugin and its dependencies.
    # @PRE: config.json exists or shared config_manager is available.
    # @POST: git_service and config_manager are initialized.
    def __init__(self):
        with belief_scope("GitPlugin.__init__"):
            app_logger.info("Initializing GitPlugin.")
            self.git_service = GitService()

            # Robust config path resolution:
            # 1. Try absolute path from src/dependencies.py style if possible
            # 2. Try relative paths based on common execution patterns
            if os.path.exists("../config.json"):
                config_path = "../config.json"
            elif os.path.exists("config.json"):
                config_path = "config.json"
            else:
                # Fallback to the one initialized in dependencies if we can import it
                try:
                    from src.dependencies import config_manager
                    self.config_manager = config_manager
                    app_logger.info("GitPlugin initialized using shared config_manager.")
                    return
                except Exception:
                    config_path = "config.json"

            self.config_manager = ConfigManager(config_path)
            app_logger.info(f"GitPlugin initialized with {config_path}")
    # [/DEF:__init__:Function]

    @property
    # [DEF:id:Function]
    # @PURPOSE: Returns the plugin identifier.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns 'git-integration'.
    def id(self) -> str:
        with belief_scope("GitPlugin.id"):
            return "git-integration"
    # [/DEF:id:Function]

    @property
    # [DEF:name:Function]
    # @PURPOSE: Returns the plugin name.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the human-readable name.
    def name(self) -> str:
        with belief_scope("GitPlugin.name"):
            return "Git Integration"
    # [/DEF:name:Function]

    @property
    # [DEF:description:Function]
    # @PURPOSE: Returns the plugin description.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the plugin's purpose description.
    def description(self) -> str:
        with belief_scope("GitPlugin.description"):
            return "Version control for Superset dashboards"
    # [/DEF:description:Function]

    @property
    # [DEF:version:Function]
    # @PURPOSE: Returns the plugin version.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the version string.
    def version(self) -> str:
        with belief_scope("GitPlugin.version"):
            return "0.1.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the git plugin.
    # @RETURN: str - "/git"
    def ui_route(self) -> str:
        with belief_scope("GitPlugin.ui_route"):
            return "/git"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema of parameters for executing plugin tasks.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns a JSON schema dictionary.
    # @RETURN: Dict[str, Any] - Parameter schema.
    def get_schema(self) -> Dict[str, Any]:
        with belief_scope("GitPlugin.get_schema"):
            return {
                "type": "object",
                "properties": {
                    "operation": {"type": "string", "enum": ["sync", "deploy", "history"]},
                    "dashboard_id": {"type": "integer"},
                    "environment_id": {"type": "string"},
                    "source_env_id": {"type": "string"}
                },
                "required": ["operation", "dashboard_id"]
            }
    # [/DEF:get_schema:Function]

    # [DEF:initialize:Function]
    # @PURPOSE: Performs initial plugin setup.
    # @PRE: GitPlugin is initialized.
    # @POST: The plugin is ready to execute tasks.
    async def initialize(self):
        with belief_scope("GitPlugin.initialize"):
            app_logger.info("[GitPlugin.initialize][Action] Initializing Git Integration Plugin logic.")

    # [DEF:execute:Function]
    # @PURPOSE: Main entry point for executing plugin tasks, with TaskContext support.
    # @PRE: task_data contains 'operation' and 'dashboard_id'.
    # @POST: Returns the result of the operation.
    # @PARAM: task_data (Dict[str, Any]) - Task data.
    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
    # @RETURN: Dict[str, Any] - Status and message.
    # @RELATION: CALLS -> self._handle_sync
    # @RELATION: CALLS -> self._handle_deploy
    async def execute(self, task_data: Dict[str, Any], context: Optional[TaskContext] = None) -> Dict[str, Any]:
        with belief_scope("GitPlugin.execute"):
            operation = task_data.get("operation")
            dashboard_id = task_data.get("dashboard_id")

            # Use TaskContext logger if available, otherwise fall back to app_logger
            log = context.logger if context else app_logger

            # Create sub-loggers for different components
            git_log = log.with_source("git") if context else log
            superset_log = log.with_source("superset_api") if context else log

            log.info(f"Executing operation: {operation} for dashboard {dashboard_id}")

            if operation == "sync":
                source_env_id = task_data.get("source_env_id")
                result = await self._handle_sync(dashboard_id, source_env_id, log, git_log, superset_log)
            elif operation == "deploy":
                env_id = task_data.get("environment_id")
                result = await self._handle_deploy(dashboard_id, env_id, log, git_log, superset_log)
            elif operation == "history":
                result = {"status": "success", "message": "History available via API"}
            else:
                log.error(f"Unknown operation: {operation}")
                raise ValueError(f"Unknown operation: {operation}")

            log.info(f"Operation {operation} completed.")
            return result
    # [/DEF:execute:Function]

    # [DEF:_handle_sync:Function]
    # @PURPOSE: Exports a dashboard from Superset and unpacks it into the Git repository.
    # @PRE: The repository for the dashboard must exist.
    # @POST: Files in the repository are updated to the current state in Superset.
    # @PARAM: dashboard_id (int) - Dashboard ID.
    # @PARAM: source_env_id (Optional[str]) - Source environment ID.
    # @RETURN: Dict[str, str] - Sync result.
    # @SIDE_EFFECT: Modifies files in the repository's local working directory.
    # @RELATION: CALLS -> src.services.git_service.GitService.get_repo
    # @RELATION: CALLS -> src.core.superset_client.SupersetClient.export_dashboard
    async def _handle_sync(self, dashboard_id: int, source_env_id: Optional[str] = None, log=None, git_log=None, superset_log=None) -> Dict[str, str]:
        with belief_scope("GitPlugin._handle_sync"):
            try:
                # 1. Get the repository
                repo = self.git_service.get_repo(dashboard_id)
                repo_path = Path(repo.working_dir)
                git_log.info(f"Target repo path: {repo_path}")

                # 2. Set up the Superset client
                env = self._get_env(source_env_id)
                client = SupersetClient(env)
                client.authenticate()

                # 3. Export the dashboard
                superset_log.info(f"Exporting dashboard {dashboard_id} from {env.name}")
                zip_bytes, _ = client.export_dashboard(dashboard_id)

                # 4. Unpack and flatten the structure
                git_log.info(f"Unpacking export to {repo_path}")

                # Folders/files we expect from Superset
                managed_dirs = ["dashboards", "charts", "datasets", "databases"]
                managed_files = ["metadata.yaml"]

                # Clean up old data before unpacking so no stale "ghost" files remain
                for d in managed_dirs:
                    d_path = repo_path / d
                    if d_path.exists() and d_path.is_dir():
                        shutil.rmtree(d_path)
                for f in managed_files:
                    f_path = repo_path / f
                    if f_path.exists():
                        f_path.unlink()

                with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
                    # Superset exports everything into a dashboard_export_timestamp/ subfolder;
                    # we need to find that folder name
                    namelist = zf.namelist()
                    if not namelist:
                        raise ValueError("Export ZIP is empty")

                    root_folder = namelist[0].split('/')[0]
                    git_log.info(f"Detected root folder in ZIP: {root_folder}")

                    for member in zf.infolist():
                        if member.filename.startswith(root_folder + "/") and len(member.filename) > len(root_folder) + 1:
                            # Strip the folder prefix
                            relative_path = member.filename[len(root_folder)+1:]
                            target_path = repo_path / relative_path

                            if member.is_dir():
                                target_path.mkdir(parents=True, exist_ok=True)
                            else:
                                target_path.parent.mkdir(parents=True, exist_ok=True)
                                with zf.open(member) as source, open(target_path, "wb") as target:
                                    shutil.copyfileobj(source, target)

                # 5. Automatically stage changes (no commit, so the user can review the diff)
                try:
                    repo.git.add(A=True)
                    app_logger.info("[_handle_sync][Action] Changes staged in git")
                except Exception as ge:
                    app_logger.warning(f"[_handle_sync][Action] Failed to stage changes: {ge}")

                app_logger.info(f"[_handle_sync][Coherence:OK] Dashboard {dashboard_id} synced successfully.")
                return {"status": "success", "message": "Dashboard synced and flattened in local repository"}

            except Exception as e:
                app_logger.error(f"[_handle_sync][Coherence:Failed] Sync failed: {e}")
                raise
    # [/DEF:_handle_sync:Function]

    # [DEF:_handle_deploy:Function]
    # @PURPOSE: Packs the repository into a ZIP and imports it into the target Superset environment.
    # @PRE: environment_id must correspond to a configured environment.
    # @POST: The dashboard is imported into the target Superset.
    # @PARAM: dashboard_id (int) - Dashboard ID.
    # @PARAM: env_id (str) - Target environment ID.
    # @PARAM: log - Main logger instance.
    # @PARAM: git_log - Git-specific logger instance.
    # @PARAM: superset_log - Superset API-specific logger instance.
    # @RETURN: Dict[str, Any] - Deployment result.
    # @SIDE_EFFECT: Creates and deletes a temporary ZIP file.
    # @RELATION: CALLS -> src.core.superset_client.SupersetClient.import_dashboard
    async def _handle_deploy(self, dashboard_id: int, env_id: str, log=None, git_log=None, superset_log=None) -> Dict[str, Any]:
        with belief_scope("GitPlugin._handle_deploy"):
            try:
                if not env_id:
                    raise ValueError("Target environment ID required for deployment")

                # 1. Get the repository
                repo = self.git_service.get_repo(dashboard_id)
                repo_path = Path(repo.working_dir)

                # 2. Pack into a ZIP
                git_log.info(f"Packing repository {repo_path} for deployment.")
                zip_buffer = io.BytesIO()

                # Superset expects a root directory in the ZIP (e.g., dashboard_export_20240101T000000/)
                root_dir_name = f"dashboard_export_{dashboard_id}"

                with zipfile.ZipFile(zip_buffer, "w", zipfile.ZIP_DEFLATED) as zf:
                    for root, dirs, files in os.walk(repo_path):
                        if ".git" in dirs:
                            dirs.remove(".git")
                        for file in files:
                            if file == ".git" or file.endswith(".zip"):
                                continue
                            file_path = Path(root) / file
                            # Prepend the root directory name to the archive path
                            arcname = Path(root_dir_name) / file_path.relative_to(repo_path)
                            zf.write(file_path, arcname)

                zip_buffer.seek(0)

                # 3. Set up the Superset client
                env = self.config_manager.get_environment(env_id)
                if not env:
                    raise ValueError(f"Environment {env_id} not found")

                client = SupersetClient(env)
                client.authenticate()

                # 4. Import
                temp_zip_path = repo_path / f"deploy_{dashboard_id}.zip"
                git_log.info(f"Saving temporary zip to {temp_zip_path}")
                with open(temp_zip_path, "wb") as f:
                    f.write(zip_buffer.getvalue())

                try:
                    app_logger.info(f"[_handle_deploy][Action] Importing dashboard to {env.name}")
                    result = client.import_dashboard(temp_zip_path)
                    app_logger.info(f"[_handle_deploy][Coherence:OK] Deployment successful for dashboard {dashboard_id}.")
                    return {"status": "success", "message": f"Dashboard deployed to {env.name}", "details": result}
                finally:
                    if temp_zip_path.exists():
                        os.remove(temp_zip_path)

            except Exception as e:
                app_logger.error(f"[_handle_deploy][Coherence:Failed] Deployment failed: {e}")
                raise
    # [/DEF:_handle_deploy:Function]

    # [DEF:_get_env:Function]
    # @PURPOSE: Helper method for retrieving the environment configuration.
    # @PARAM: env_id (Optional[str]) - Environment ID.
    # @PRE: env_id is a string or None.
    # @POST: Returns an Environment object from config or DB.
    # @RETURN: Environment - Environment configuration object.
    def _get_env(self, env_id: Optional[str] = None):
        with belief_scope("GitPlugin._get_env"):
            app_logger.info(f"[_get_env][Entry] Fetching environment for ID: {env_id}")

            # Priority 1: ConfigManager (config.json)
            if env_id:
                env = self.config_manager.get_environment(env_id)
                if env:
                    app_logger.info(f"[_get_env][Exit] Found environment by ID in ConfigManager: {env.name}")
                    return env

            # Priority 2: Database (DeploymentEnvironment)
            from src.core.database import SessionLocal
            from src.models.git import DeploymentEnvironment

            db = SessionLocal()
            try:
                if env_id:
                    db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.id == env_id).first()
                else:
                    # If no ID, try to find active or any environment in DB
                    db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.is_active).first()
                    if not db_env:
                        db_env = db.query(DeploymentEnvironment).first()

                if db_env:
                    app_logger.info(f"[_get_env][Exit] Found environment in DB: {db_env.name}")
                    from src.core.config_models import Environment
                    # Use token as password for SupersetClient
                    return Environment(
                        id=db_env.id,
                        name=db_env.name,
                        url=db_env.superset_url,
                        username="admin",
                        password=db_env.superset_token,
                        verify_ssl=True
                    )
            finally:
                db.close()

            # Priority 3: ConfigManager Default (if no env_id provided)
            envs = self.config_manager.get_environments()
            if envs:
                if env_id:
                    # If env_id was provided but not found in DB or specifically by ID in config,
                    # but we have other envs, maybe it's one of them?
                    env = next((e for e in envs if e.id == env_id), None)
                    if env:
                        app_logger.info(f"[_get_env][Exit] Found environment {env_id} in ConfigManager list")
                        return env

                if not env_id:
                    app_logger.info(f"[_get_env][Exit] Using first environment from ConfigManager: {envs[0].name}")
                    return envs[0]

            app_logger.error(f"[_get_env][Coherence:Failed] No environments configured (searched config.json and DB). env_id={env_id}")
            raise ValueError("No environments configured. Please add a Superset Environment in Settings.")
    # [/DEF:_get_env:Function]

    # [/DEF:initialize:Function]
# [/DEF:GitPlugin:Class]
# [/DEF:backend.src.plugins.git_plugin:Module]
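The sync path strips the dashboard_export_* root folder so that the repository holds dashboards/, charts/, datasets/, databases/ and metadata.yaml at its top level. A standalone sketch of just that flattening step, under the same assumptions about the export layout:

# Standalone sketch of the flattening logic used in _handle_sync (paths are illustrative).
import io
import shutil
import zipfile
from pathlib import Path

def flatten_export(zip_bytes: bytes, repo_path: Path) -> None:
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        names = zf.namelist()
        if not names:
            raise ValueError("Export ZIP is empty")
        root = names[0].split("/")[0]  # e.g. dashboard_export_20240101T000000
        for member in zf.infolist():
            if not member.filename.startswith(root + "/"):
                continue
            rel = member.filename[len(root) + 1:]
            if not rel:
                continue
            target = repo_path / rel
            if member.is_dir():
                target.mkdir(parents=True, exist_ok=True)
            else:
                target.parent.mkdir(parents=True, exist_ok=True)
                with zf.open(member) as src, open(target, "wb") as dst:
                    shutil.copyfileobj(src, dst)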
14 backend/src/plugins/llm_analysis/__init__.py Normal file
@@ -0,0 +1,14 @@
# [DEF:backend/src/plugins/llm_analysis/__init__.py:Module]
# @TIER: TRIVIAL
# @PURPOSE: Initialize the LLM Analysis plugin package.
# @LAYER: Domain

"""
LLM Analysis Plugin for automated dashboard validation and dataset documentation.
"""

from .plugin import DashboardValidationPlugin, DocumentationPlugin

__all__ = ['DashboardValidationPlugin', 'DocumentationPlugin']

# [/DEF:backend/src/plugins/llm_analysis/__init__.py:Module]
61 backend/src/plugins/llm_analysis/models.py Normal file
@@ -0,0 +1,61 @@
# [DEF:backend/src/plugins/llm_analysis/models.py:Module]
# @TIER: STANDARD
# @SEMANTICS: pydantic, models, llm
# @PURPOSE: Define Pydantic models for LLM Analysis plugin.
# @LAYER: Domain

from typing import List, Optional
from pydantic import BaseModel, Field
from datetime import datetime
from enum import Enum

# [DEF:LLMProviderType:Class]
# @PURPOSE: Enum for supported LLM providers.
class LLMProviderType(str, Enum):
    OPENAI = "openai"
    OPENROUTER = "openrouter"
    KILO = "kilo"
# [/DEF:LLMProviderType:Class]

# [DEF:LLMProviderConfig:Class]
# @PURPOSE: Configuration for an LLM provider.
class LLMProviderConfig(BaseModel):
    id: Optional[str] = None
    provider_type: LLMProviderType
    name: str
    base_url: str
    api_key: Optional[str] = None
    default_model: str
    is_active: bool = True
# [/DEF:LLMProviderConfig:Class]

# [DEF:ValidationStatus:Class]
# @PURPOSE: Enum for dashboard validation status.
class ValidationStatus(str, Enum):
    PASS = "PASS"
    WARN = "WARN"
    FAIL = "FAIL"
# [/DEF:ValidationStatus:Class]

# [DEF:DetectedIssue:Class]
# @PURPOSE: Model for a single issue detected during validation.
class DetectedIssue(BaseModel):
    severity: ValidationStatus
    message: str
    location: Optional[str] = None
# [/DEF:DetectedIssue:Class]

# [DEF:ValidationResult:Class]
# @PURPOSE: Model for dashboard validation result.
class ValidationResult(BaseModel):
    id: Optional[str] = None
    dashboard_id: str
    timestamp: datetime = Field(default_factory=datetime.utcnow)
    status: ValidationStatus
    screenshot_path: Optional[str] = None
    issues: List[DetectedIssue]
    summary: str
    raw_response: Optional[str] = None
# [/DEF:ValidationResult:Class]

# [/DEF:backend/src/plugins/llm_analysis/models.py:Module]
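A small sketch of how these models compose when a validation run is recorded; the values are invented for illustration and the absolute import path is an assumption:

# Illustrative only: composing the validation models with invented values.
from backend.src.plugins.llm_analysis.models import (
    DetectedIssue, ValidationResult, ValidationStatus,
)

result = ValidationResult(
    dashboard_id="42",
    status=ValidationStatus.WARN,
    summary="One chart rendered without data.",
    issues=[
        DetectedIssue(
            severity=ValidationStatus.WARN,
            message="Empty result set in 'Revenue by Region'",
            location="row 2, column 1",
        )
    ],
    screenshot_path="/storage/screenshots/42_20251219_120000.png",
)
print(result.status.value, len(result.issues))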
391 backend/src/plugins/llm_analysis/plugin.py Normal file
@@ -0,0 +1,391 @@
# [DEF:backend/src/plugins/llm_analysis/plugin.py:Module]
# @TIER: STANDARD
# @SEMANTICS: plugin, llm, analysis, documentation
# @PURPOSE: Implements DashboardValidationPlugin and DocumentationPlugin.
# @LAYER: Domain
# @RELATION: INHERITS -> backend.src.core.plugin_base.PluginBase
# @RELATION: CALLS -> backend.src.plugins.llm_analysis.service.ScreenshotService
# @RELATION: CALLS -> backend.src.plugins.llm_analysis.service.LLMClient
# @RELATION: CALLS -> backend.src.services.llm_provider.LLMProviderService
# @RELATION: USES -> TaskContext
# @INVARIANT: All LLM interactions must be executed as asynchronous tasks.

from typing import Dict, Any, Optional
import os
import json
from datetime import datetime, timedelta
from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...core.database import SessionLocal
from ...services.llm_provider import LLMProviderService
from ...core.superset_client import SupersetClient
from .service import ScreenshotService, LLMClient
from .models import LLMProviderType, ValidationStatus, ValidationResult, DetectedIssue
from ...models.llm import ValidationRecord
from ...core.task_manager.context import TaskContext

# [DEF:DashboardValidationPlugin:Class]
# @PURPOSE: Plugin for automated dashboard health analysis using LLMs.
# @RELATION: IMPLEMENTS -> backend.src.core.plugin_base.PluginBase
class DashboardValidationPlugin(PluginBase):
    @property
    def id(self) -> str:
        return "llm_dashboard_validation"

    @property
    def name(self) -> str:
        return "Dashboard LLM Validation"

    @property
    def description(self) -> str:
        return "Automated dashboard health analysis using multimodal LLMs."

    @property
    def version(self) -> str:
        return "1.0.0"

    def get_schema(self) -> Dict[str, Any]:
        return {
            "type": "object",
            "properties": {
                "dashboard_id": {"type": "string", "title": "Dashboard ID"},
                "environment_id": {"type": "string", "title": "Environment ID"},
                "provider_id": {"type": "string", "title": "LLM Provider ID"}
            },
            "required": ["dashboard_id", "environment_id", "provider_id"]
        }

    # [DEF:DashboardValidationPlugin.execute:Function]
    # @PURPOSE: Executes the dashboard validation task with TaskContext support.
    # @PARAM: params (Dict[str, Any]) - Validation parameters.
    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
    # @PRE: params contains dashboard_id, environment_id, and provider_id.
    # @POST: Returns a dictionary with validation results and persists them to the database.
    # @SIDE_EFFECT: Captures a screenshot, calls LLM API, and writes to the database.
    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
        with belief_scope("execute", f"plugin_id={self.id}"):
            # Use TaskContext logger if available, otherwise fall back to app logger
            log = context.logger if context else logger

            # Create sub-loggers for different components
            llm_log = log.with_source("llm") if context else log
            screenshot_log = log.with_source("screenshot") if context else log
            superset_log = log.with_source("superset_api") if context else log

            log.info(f"Executing {self.name} with params: {params}")

            dashboard_id = params.get("dashboard_id")
            env_id = params.get("environment_id")
            provider_id = params.get("provider_id")

            db = SessionLocal()
            try:
                # 1. Get Environment
                from ...dependencies import get_config_manager
                config_mgr = get_config_manager()
                env = config_mgr.get_environment(env_id)
                if not env:
                    log.error(f"Environment {env_id} not found")
                    raise ValueError(f"Environment {env_id} not found")

                # 2. Get LLM Provider
                llm_service = LLMProviderService(db)
                db_provider = llm_service.get_provider(provider_id)
                if not db_provider:
                    log.error(f"LLM Provider {provider_id} not found")
                    raise ValueError(f"LLM Provider {provider_id} not found")

                llm_log.debug("Retrieved provider config:")
                llm_log.debug(f"  Provider ID: {db_provider.id}")
                llm_log.debug(f"  Provider Name: {db_provider.name}")
                llm_log.debug(f"  Provider Type: {db_provider.provider_type}")
                llm_log.debug(f"  Base URL: {db_provider.base_url}")
                llm_log.debug(f"  Default Model: {db_provider.default_model}")
                llm_log.debug(f"  Is Active: {db_provider.is_active}")

                api_key = llm_service.get_decrypted_api_key(provider_id)
                llm_log.debug(f"API Key decrypted (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")

                # Check if API key was successfully decrypted
                if not api_key:
                    raise ValueError(
                        f"Failed to decrypt API key for provider {provider_id}. "
                        f"The provider may have been encrypted with a different encryption key. "
                        f"Please update the provider with a new API key through the UI."
                    )

                # 3. Capture Screenshot
                screenshot_service = ScreenshotService(env)

                storage_root = config_mgr.get_config().settings.storage.root_path
                screenshots_dir = os.path.join(storage_root, "screenshots")
                os.makedirs(screenshots_dir, exist_ok=True)

                filename = f"{dashboard_id}_{datetime.now().strftime('%Y%m%d_%H%M%S')}.png"
                screenshot_path = os.path.join(screenshots_dir, filename)

                screenshot_log.info(f"Capturing screenshot for dashboard {dashboard_id}")
                await screenshot_service.capture_dashboard(dashboard_id, screenshot_path)
                screenshot_log.debug(f"Screenshot saved to: {screenshot_path}")

                # 4. Fetch Logs (from Environment /api/v1/log/)
                logs = []
                try:
                    client = SupersetClient(env)

                    # Calculate time window (last 24 hours)
                    start_time = (datetime.now() - timedelta(hours=24)).isoformat()

                    # Construct filter for logs
                    # Note: We filter by dashboard_id matching the object
                    query_params = {
                        "filters": [
                            {"col": "dashboard_id", "opr": "eq", "value": dashboard_id},
                            {"col": "dttm", "opr": "gt", "value": start_time}
                        ],
                        "order_column": "dttm",
                        "order_direction": "desc",
                        "page": 0,
                        "page_size": 100
                    }

                    superset_log.debug(f"Fetching logs for dashboard {dashboard_id}")
                    response = client.network.request(
                        method="GET",
                        endpoint="/log/",
                        params={"q": json.dumps(query_params)}
                    )

                    if isinstance(response, dict) and "result" in response:
                        for item in response["result"]:
                            action = item.get("action", "unknown")
                            dttm = item.get("dttm", "")
                            details = item.get("json", "")
                            logs.append(f"[{dttm}] {action}: {details}")

                    if not logs:
                        logs = ["No recent logs found for this dashboard."]
                        superset_log.debug("No recent logs found for this dashboard")

                except Exception as e:
                    superset_log.warning(f"Failed to fetch logs from environment: {e}")
                    logs = [f"Error fetching remote logs: {str(e)}"]

                # 5. Analyze with LLM
                llm_client = LLMClient(
                    provider_type=LLMProviderType(db_provider.provider_type),
                    api_key=api_key,
                    base_url=db_provider.base_url,
                    default_model=db_provider.default_model
                )

                llm_log.info(f"Analyzing dashboard {dashboard_id} with LLM")
                analysis = await llm_client.analyze_dashboard(screenshot_path, logs)

                # Log analysis summary to task logs for better visibility
                llm_log.info(f"[ANALYSIS_SUMMARY] Status: {analysis['status']}")
                llm_log.info(f"[ANALYSIS_SUMMARY] Summary: {analysis['summary']}")
                if analysis.get("issues"):
                    for i, issue in enumerate(analysis["issues"]):
                        llm_log.info(f"[ANALYSIS_ISSUE][{i+1}] {issue.get('severity')}: {issue.get('message')} (Location: {issue.get('location', 'N/A')})")

                # 6. Persist Result
                validation_result = ValidationResult(
                    dashboard_id=dashboard_id,
                    status=ValidationStatus(analysis["status"]),
                    summary=analysis["summary"],
                    issues=[DetectedIssue(**issue) for issue in analysis["issues"]],
                    screenshot_path=screenshot_path,
                    raw_response=str(analysis)
                )

                db_record = ValidationRecord(
                    dashboard_id=validation_result.dashboard_id,
                    status=validation_result.status.value,
                    summary=validation_result.summary,
                    issues=[issue.dict() for issue in validation_result.issues],
                    screenshot_path=validation_result.screenshot_path,
                    raw_response=validation_result.raw_response
                )
                db.add(db_record)
                db.commit()

                # 7. Notification on failure (US1 / FR-015)
                if validation_result.status == ValidationStatus.FAIL:
                    log.warning(f"Dashboard {dashboard_id} validation FAILED. Summary: {validation_result.summary}")
                    # Placeholder for Email/Pulse notification dispatch
                    # In a real implementation, we would call a NotificationService here
                    # with a payload containing the summary and a link to the report.

                # Final log to ensure all analysis is visible in task logs
                log.info(f"Validation completed for dashboard {dashboard_id}. Status: {validation_result.status.value}")

                return validation_result.dict()

            finally:
                db.close()
    # [/DEF:DashboardValidationPlugin.execute:Function]
# [/DEF:DashboardValidationPlugin:Class]

# [DEF:DocumentationPlugin:Class]
# @PURPOSE: Plugin for automated dataset documentation using LLMs.
# @RELATION: IMPLEMENTS -> backend.src.core.plugin_base.PluginBase
class DocumentationPlugin(PluginBase):
    @property
    def id(self) -> str:
        return "llm_documentation"

    @property
    def name(self) -> str:
        return "Dataset LLM Documentation"

    @property
    def description(self) -> str:
        return "Automated dataset and column documentation using LLMs."

    @property
    def version(self) -> str:
        return "1.0.0"

    def get_schema(self) -> Dict[str, Any]:
        return {
            "type": "object",
            "properties": {
                "dataset_id": {"type": "string", "title": "Dataset ID"},
                "environment_id": {"type": "string", "title": "Environment ID"},
                "provider_id": {"type": "string", "title": "LLM Provider ID"}
            },
            "required": ["dataset_id", "environment_id", "provider_id"]
        }

    # [DEF:DocumentationPlugin.execute:Function]
    # @PURPOSE: Executes the dataset documentation task with TaskContext support.
    # @PARAM: params (Dict[str, Any]) - Documentation parameters.
    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
    # @PRE: params contains dataset_id, environment_id, and provider_id.
    # @POST: Returns generated documentation and updates the dataset in Superset.
    # @SIDE_EFFECT: Calls LLM API and updates dataset metadata in Superset.
    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
        with belief_scope("execute", f"plugin_id={self.id}"):
            # Use TaskContext logger if available, otherwise fall back to app logger
            log = context.logger if context else logger

            # Create sub-loggers for different components
            llm_log = log.with_source("llm") if context else log
            superset_log = log.with_source("superset_api") if context else log

            log.info(f"Executing {self.name} with params: {params}")

            dataset_id = params.get("dataset_id")
            env_id = params.get("environment_id")
            provider_id = params.get("provider_id")

            db = SessionLocal()
            try:
                # 1. Get Environment
                from ...dependencies import get_config_manager
                config_mgr = get_config_manager()
                env = config_mgr.get_environment(env_id)
                if not env:
                    log.error(f"Environment {env_id} not found")
                    raise ValueError(f"Environment {env_id} not found")

                # 2. Get LLM Provider
                llm_service = LLMProviderService(db)
                db_provider = llm_service.get_provider(provider_id)
                if not db_provider:
                    log.error(f"LLM Provider {provider_id} not found")
                    raise ValueError(f"LLM Provider {provider_id} not found")

                llm_log.debug("Retrieved provider config:")
                llm_log.debug(f"  Provider ID: {db_provider.id}")
                llm_log.debug(f"  Provider Name: {db_provider.name}")
                llm_log.debug(f"  Provider Type: {db_provider.provider_type}")
                llm_log.debug(f"  Base URL: {db_provider.base_url}")
                llm_log.debug(f"  Default Model: {db_provider.default_model}")

                api_key = llm_service.get_decrypted_api_key(provider_id)
                llm_log.debug(f"API Key decrypted (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")

                # Check if API key was successfully decrypted
                if not api_key:
                    raise ValueError(
                        f"Failed to decrypt API key for provider {provider_id}. "
                        f"The provider may have been encrypted with a different encryption key. "
                        f"Please update the provider with a new API key through the UI."
                    )

                # 3. Fetch Metadata (US2 / T024)
                from ...core.superset_client import SupersetClient
                client = SupersetClient(env)

                superset_log.debug(f"Fetching dataset {dataset_id}")
                dataset = client.get_dataset(int(dataset_id))

                # Extract columns and existing descriptions
                columns_data = []
                for col in dataset.get("columns", []):
                    columns_data.append({
                        "name": col.get("column_name"),
                        "type": col.get("type"),
                        "description": col.get("description")
                    })
                superset_log.debug(f"Extracted {len(columns_data)} columns from dataset")

                # 4. Construct Prompt & Analyze (US2 / T025)
                llm_client = LLMClient(
                    provider_type=LLMProviderType(db_provider.provider_type),
                    api_key=api_key,
                    base_url=db_provider.base_url,
                    default_model=db_provider.default_model
                )

                prompt = f"""
Generate professional documentation for the following dataset and its columns.
Dataset: {dataset.get('table_name')}
Columns: {columns_data}

Provide the documentation in JSON format:
{{
  "dataset_description": "General description of the dataset",
  "column_descriptions": [
    {{
      "name": "column_name",
      "description": "Generated description"
    }}
  ]
}}
"""

                # Using a generic chat completion for text-only US2
                llm_log.info(f"Generating documentation for dataset {dataset_id}")
                doc_result = await llm_client.get_json_completion([{"role": "user", "content": prompt}])

                # 5. Update Metadata (US2 / T026)
                update_payload = {
                    "description": doc_result["dataset_description"],
                    "columns": []
                }

                # Map generated descriptions back to column IDs
                for col_doc in doc_result["column_descriptions"]:
                    for col in dataset.get("columns", []):
                        if col.get("column_name") == col_doc["name"]:
                            update_payload["columns"].append({
                                "id": col.get("id"),
                                "description": col_doc["description"]
                            })

                superset_log.info(f"Updating dataset {dataset_id} with generated documentation")
                client.update_dataset(int(dataset_id), update_payload)

                log.info(f"Documentation completed for dataset {dataset_id}")

                return doc_result

            finally:
                db.close()
    # [/DEF:DocumentationPlugin.execute:Function]
# [/DEF:DocumentationPlugin:Class]

# [/DEF:backend/src/plugins/llm_analysis/plugin.py:Module]
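The log fetch in the validation plugin relies on Superset's /api/v1/log/ list endpoint, passing the filter object JSON-encoded in the q query parameter. A minimal sketch of just that request using requests directly; the base URL, token handling, and the exact filter columns are assumptions that depend on the target Superset version:

# Hedged sketch: querying recent dashboard activity from Superset's log API.
import json
from datetime import datetime, timedelta
import requests

def fetch_recent_dashboard_logs(base_url: str, token: str, dashboard_id: str) -> list:
    start_time = (datetime.now() - timedelta(hours=24)).isoformat()
    q = {
        "filters": [
            {"col": "dashboard_id", "opr": "eq", "value": dashboard_id},
            {"col": "dttm", "opr": "gt", "value": start_time},
        ],
        "order_column": "dttm",
        "order_direction": "desc",
        "page": 0,
        "page_size": 100,
    }
    resp = requests.get(
        f"{base_url}/api/v1/log/",
        params={"q": json.dumps(q)},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("result", [])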
62 backend/src/plugins/llm_analysis/scheduler.py Normal file
@@ -0,0 +1,62 @@
# [DEF:backend/src/plugins/llm_analysis/scheduler.py:Module]
# @TIER: STANDARD
# @SEMANTICS: scheduler, task, automation
# @PURPOSE: Provides helper functions to schedule LLM-based validation tasks.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.core.scheduler

from typing import Dict, Any
from ...dependencies import get_task_manager, get_scheduler_service
from ...core.logger import belief_scope, logger

# [DEF:schedule_dashboard_validation:Function]
# @PURPOSE: Schedules a recurring dashboard validation task.
# @PARAM: dashboard_id (str) - ID of the dashboard to validate.
# @PARAM: cron_expression (str) - Standard cron expression for scheduling.
# @PARAM: params (Dict[str, Any]) - Task parameters (environment_id, provider_id).
# @SIDE_EFFECT: Adds a job to the scheduler service.
def schedule_dashboard_validation(dashboard_id: str, cron_expression: str, params: Dict[str, Any]):
    with belief_scope("schedule_dashboard_validation", f"dashboard_id={dashboard_id}"):
        scheduler = get_scheduler_service()
        task_manager = get_task_manager()

        job_id = f"llm_val_{dashboard_id}"

        async def job_func():
            await task_manager.create_task(
                plugin_id="llm_dashboard_validation",
                params={
                    "dashboard_id": dashboard_id,
                    **params
                }
            )

        scheduler.add_job(
            job_func,
            "cron",
            id=job_id,
            replace_existing=True,
            **_parse_cron(cron_expression)
        )
        logger.info(f"Scheduled validation for dashboard {dashboard_id} with cron {cron_expression}")
# [/DEF:schedule_dashboard_validation:Function]

# [DEF:_parse_cron:Function]
# @PURPOSE: Basic cron parser placeholder.
# @PARAM: cron (str) - Cron expression.
# @RETURN: Dict[str, str] - Parsed cron parts.
def _parse_cron(cron: str) -> Dict[str, str]:
    # Basic cron parser placeholder
    parts = cron.split()
    if len(parts) != 5:
        return {}
    return {
        "minute": parts[0],
        "hour": parts[1],
        "day": parts[2],
        "month": parts[3],
        "day_of_week": parts[4]
    }
# [/DEF:_parse_cron:Function]

# [/DEF:backend/src/plugins/llm_analysis/scheduler.py:Module]
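As a concrete example of the cron mapping, scheduling a validation every Monday at 06:00 would pass "0 6 * * 1", which _parse_cron turns into keyword arguments for the scheduler. The environment and provider IDs below are placeholders, and the absolute import path is an assumption:

# Illustrative call; environment and provider IDs are placeholders.
from backend.src.plugins.llm_analysis.scheduler import schedule_dashboard_validation, _parse_cron

assert _parse_cron("0 6 * * 1") == {
    "minute": "0", "hour": "6", "day": "*", "month": "*", "day_of_week": "1",
}

schedule_dashboard_validation(
    dashboard_id="42",
    cron_expression="0 6 * * 1",
    params={"environment_id": "prod", "provider_id": "openrouter-default"},
)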
632 backend/src/plugins/llm_analysis/service.py Normal file
@@ -0,0 +1,632 @@
|
|||||||
|
# [DEF:backend/src/plugins/llm_analysis/service.py:Module]
|
||||||
|
# @TIER: STANDARD
|
||||||
|
# @SEMANTICS: service, llm, screenshot, playwright, openai
|
||||||
|
# @PURPOSE: Services for LLM interaction and dashboard screenshots.
|
||||||
|
# @LAYER: Domain
|
||||||
|
# @RELATION: DEPENDS_ON -> playwright
|
||||||
|
# @RELATION: DEPENDS_ON -> openai
|
||||||
|
# @RELATION: DEPENDS_ON -> tenacity
|
||||||
|
# @INVARIANT: Screenshots must be 1920px width and capture full page height.
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import base64
|
||||||
|
import json
|
||||||
|
import io
|
||||||
|
from typing import List, Dict, Any
|
||||||
|
from PIL import Image
|
||||||
|
from playwright.async_api import async_playwright
|
||||||
|
from openai import AsyncOpenAI, RateLimitError, AuthenticationError as OpenAIAuthenticationError
|
||||||
|
from tenacity import retry, stop_after_attempt, wait_exponential, retry_if_exception
|
||||||
|
from .models import LLMProviderType
|
||||||
|
from ...core.logger import belief_scope, logger
|
||||||
|
from ...core.config_models import Environment
|
||||||
|
|
||||||
|
# [DEF:ScreenshotService:Class]
|
||||||
|
# @PURPOSE: Handles capturing screenshots of Superset dashboards.
|
||||||
|
class ScreenshotService:
|
||||||
|
# [DEF:ScreenshotService.__init__:Function]
|
||||||
|
# @PURPOSE: Initializes the ScreenshotService with environment configuration.
|
||||||
|
# @PRE: env is a valid Environment object.
|
||||||
|
def __init__(self, env: Environment):
|
||||||
|
self.env = env
|
||||||
|
# [/DEF:ScreenshotService.__init__:Function]
|
||||||
|
|
||||||
|
# [DEF:ScreenshotService.capture_dashboard:Function]
|
||||||
|
# @PURPOSE: Captures a full-page screenshot of a dashboard using Playwright and CDP.
|
||||||
|
# @PRE: dashboard_id is a valid string, output_path is a writable path.
|
||||||
|
# @POST: Returns True if screenshot is saved successfully.
|
||||||
|
# @SIDE_EFFECT: Launches a browser, performs UI login, switches tabs, and writes a PNG file.
|
||||||
|
# @UX_STATE: [Navigating] -> Loading dashboard UI
|
||||||
|
# @UX_STATE: [TabSwitching] -> Iterating through dashboard tabs to trigger lazy loading
|
||||||
|
# @UX_STATE: [CalculatingHeight] -> Determining dashboard dimensions
|
||||||
|
# @UX_STATE: [Capturing] -> Executing CDP screenshot
|
||||||
|
async def capture_dashboard(self, dashboard_id: str, output_path: str) -> bool:
|
||||||
|
with belief_scope("capture_dashboard", f"dashboard_id={dashboard_id}"):
|
||||||
|
logger.info(f"Capturing screenshot for dashboard {dashboard_id}")
|
||||||
|
async with async_playwright() as p:
|
||||||
|
browser = await p.chromium.launch(
|
||||||
|
headless=True,
|
||||||
|
args=[
|
||||||
|
"--disable-blink-features=AutomationControlled",
|
||||||
|
"--disable-infobars",
|
||||||
|
"--no-sandbox"
|
||||||
|
]
|
||||||
|
)
|
||||||
|
# Set a realistic user agent to avoid 403 Forbidden from OpenResty/WAF
|
||||||
|
user_agent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
|
||||||
|
# Construct base UI URL from environment (strip /api/v1 suffix)
|
||||||
|
base_ui_url = self.env.url.rstrip("/")
|
||||||
|
if base_ui_url.endswith("/api/v1"):
|
||||||
|
base_ui_url = base_ui_url[:-len("/api/v1")]
|
||||||
|
|
||||||
|
# Create browser context with realistic headers
|
||||||
|
context = await browser.new_context(
|
||||||
|
viewport={'width': 1280, 'height': 720},
|
||||||
|
user_agent=user_agent,
|
||||||
|
extra_http_headers={
|
||||||
|
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
|
||||||
|
"Accept-Language": "ru-RU,ru;q=0.9,en-US;q=0.8,en;q=0.7",
|
||||||
|
"Upgrade-Insecure-Requests": "1",
|
||||||
|
"Sec-Fetch-Dest": "document",
|
||||||
|
"Sec-Fetch-Mode": "navigate",
|
||||||
|
"Sec-Fetch-Site": "none",
|
||||||
|
"Sec-Fetch-User": "?1"
|
||||||
|
}
|
||||||
|
)
|
||||||
|
logger.info("Browser context created successfully")
|
||||||
|
|
||||||
|
page = await context.new_page()
|
||||||
|
# Bypass navigator.webdriver detection
|
||||||
|
await page.add_init_script("delete Object.getPrototypeOf(navigator).webdriver")
|
||||||
|
|
||||||
|
# 1. Navigate to login page and authenticate
login_url = f"{base_ui_url.rstrip('/')}/login/"
logger.info(f"[DEBUG] Navigating to login page: {login_url}")

response = await page.goto(login_url, wait_until="networkidle", timeout=60000)
if response:
    logger.info(f"[DEBUG] Login page response status: {response.status}")

# Wait for login form to be ready
await page.wait_for_load_state("domcontentloaded")

# More exhaustive list of selectors for various Superset versions/themes
selectors = {
    "username": ['input[name="username"]', 'input#username', 'input[placeholder*="Username"]', 'input[type="text"]'],
    "password": ['input[name="password"]', 'input#password', 'input[placeholder*="Password"]', 'input[type="password"]'],
    "submit": ['button[type="submit"]', 'button#submit', '.btn-primary', 'input[type="submit"]']
}
logger.info("[DEBUG] Attempting to find login form elements...")

try:
    # Find and fill username
    u_selector = None
    for s in selectors["username"]:
        count = await page.locator(s).count()
        logger.info(f"[DEBUG] Selector '{s}': {count} elements found")
        if count > 0:
            u_selector = s
            break

    if not u_selector:
        # Log all input fields on the page for debugging
        all_inputs = await page.locator('input').all()
        logger.info(f"[DEBUG] Found {len(all_inputs)} input fields on page")
        for i, inp in enumerate(all_inputs[:5]):  # Log first 5
            inp_type = await inp.get_attribute('type')
            inp_name = await inp.get_attribute('name')
            inp_id = await inp.get_attribute('id')
            logger.info(f"[DEBUG] Input {i}: type={inp_type}, name={inp_name}, id={inp_id}")
        raise RuntimeError("Could not find username input field on login page")

    logger.info(f"[DEBUG] Filling username field with selector: {u_selector}")
    await page.fill(u_selector, self.env.username)

    # Find and fill password
    p_selector = None
    for s in selectors["password"]:
        if await page.locator(s).count() > 0:
            p_selector = s
            break

    if not p_selector:
        raise RuntimeError("Could not find password input field on login page")

    logger.info(f"[DEBUG] Filling password field with selector: {p_selector}")
    await page.fill(p_selector, self.env.password)

    # Click submit
    s_selector = selectors["submit"][0]
    for s in selectors["submit"]:
        if await page.locator(s).count() > 0:
            s_selector = s
            break

    logger.info(f"[DEBUG] Clicking submit button with selector: {s_selector}")
    await page.click(s_selector)

    # Wait for navigation after login
    await page.wait_for_load_state("networkidle", timeout=30000)

    # Check if login was successful
    if "/login" in page.url:
        # Check for error messages on page
        error_msg = await page.locator(".alert-danger, .error-message").text_content() if await page.locator(".alert-danger, .error-message").count() > 0 else "Unknown error"
        logger.error(f"[DEBUG] Login failed. Still on login page. Error: {error_msg}")
        debug_path = output_path.replace(".png", "_debug_failed_login.png")
        await page.screenshot(path=debug_path)
        raise RuntimeError(f"Login failed: {error_msg}. Debug screenshot saved to {debug_path}")

    logger.info(f"[DEBUG] Login successful. Current URL: {page.url}")

    # Check cookies after successful login
    page_cookies = await context.cookies()
    logger.info(f"[DEBUG] Cookies after login: {len(page_cookies)}")
    for c in page_cookies:
        logger.info(f"[DEBUG] Cookie: name={c['name']}, domain={c['domain']}, value={c.get('value', '')[:20]}...")

except Exception as e:
    page_title = await page.title()
    logger.error(f"UI Login failed. Page title: {page_title}, URL: {page.url}, Error: {str(e)}")
    debug_path = output_path.replace(".png", "_debug_failed_login.png")
    await page.screenshot(path=debug_path)
    raise RuntimeError(f"Login failed: {str(e)}. Debug screenshot saved to {debug_path}")
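# --- Illustrative aside (not part of this module) ----------------------------
# The selector-probing loops above share one pattern: try a list of candidate
# CSS selectors and keep the first one that matches. As a sketch, that could be
# factored into a helper (name and placement are hypothetical):
#
#     async def find_first_selector(page, candidates):
#         for sel in candidates:
#             if await page.locator(sel).count() > 0:
#                 return sel
#         return None
#
# with the username lookup becoming:
#     u_selector = await find_first_selector(page, selectors["username"])
# ------------------------------------------------------------------------------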
# 2. Navigate to dashboard
# @UX_STATE: [Navigating] -> Loading dashboard UI
dashboard_url = f"{base_ui_url.rstrip('/')}/superset/dashboard/{dashboard_id}/?standalone=true"

if base_ui_url.startswith("https://") and dashboard_url.startswith("http://"):
    dashboard_url = dashboard_url.replace("http://", "https://")

logger.info(f"[DEBUG] Navigating to dashboard: {dashboard_url}")

# Use networkidle to ensure all initial assets are loaded
response = await page.goto(dashboard_url, wait_until="networkidle", timeout=60000)

if response:
    logger.info(f"[DEBUG] Dashboard navigation response status: {response.status}, URL: {response.url}")

try:
    # Wait for the dashboard grid to be present
    await page.wait_for_selector('.dashboard-component, .dashboard-header, [data-test="dashboard-grid"]', timeout=30000)
    logger.info("[DEBUG] Dashboard container loaded")

    # Wait for charts to finish loading (Superset uses loading spinners/skeletons)
    # We wait until loading indicators disappear or a timeout occurs
    try:
        # Wait for loading indicators to disappear
        await page.wait_for_selector('.loading, .ant-skeleton, .spinner', state="hidden", timeout=60000)
        logger.info("[DEBUG] Loading indicators hidden")
    except Exception:
        logger.warning("[DEBUG] Timeout waiting for loading indicators to hide")

    # Wait for charts to actually render their content (e.g., ECharts, NVD3)
    # We look for common chart containers that should have content
    try:
        await page.wait_for_selector('.chart-container canvas, .slice_container svg, .superset-chart-canvas, .grid-content .chart-container', timeout=60000)
        logger.info("[DEBUG] Chart content detected")
    except Exception:
        logger.warning("[DEBUG] Timeout waiting for chart content")

    # Additional check: wait for all chart containers to have non-empty content
    logger.info("[DEBUG] Waiting for all charts to have rendered content...")
    await page.wait_for_function("""() => {
        const charts = document.querySelectorAll('.chart-container, .slice_container');
        if (charts.length === 0) return true; // No charts to wait for

        // Check if all charts have rendered content (canvas, svg, or non-empty div)
        return Array.from(charts).every(chart => {
            const hasCanvas = chart.querySelector('canvas') !== null;
            const hasSvg = chart.querySelector('svg') !== null;
            const hasContent = chart.innerText.trim().length > 0 || chart.children.length > 0;
            return hasCanvas || hasSvg || hasContent;
        });
    }""", timeout=60000)
    logger.info("[DEBUG] All charts have rendered content")

    # Scroll to bottom and back to top to trigger lazy loading of all charts
    logger.info("[DEBUG] Scrolling to trigger lazy loading...")
    await page.evaluate("""async () => {
        const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
        for (let i = 0; i < document.body.scrollHeight; i += 500) {
            window.scrollTo(0, i);
            await delay(100);
        }
        window.scrollTo(0, 0);
        await delay(500);
    }""")

except Exception as e:
    logger.warning(f"[DEBUG] Dashboard content wait failed: {e}, proceeding anyway after delay")

# Final stabilization delay - increased for complex dashboards
logger.info("[DEBUG] Final stabilization delay...")
await asyncio.sleep(15)

# Logic to handle tabs and full-page capture
try:
    # 1. Handle Tabs (Recursive switching)
    # @UX_STATE: [TabSwitching] -> Iterating through dashboard tabs to trigger lazy loading
    processed_tabs = set()

    async def switch_tabs(depth=0):
        if depth > 3:
            return  # Limit recursion depth

        tab_selectors = [
            '.ant-tabs-nav-list .ant-tabs-tab',
            '.dashboard-component-tabs .ant-tabs-tab',
            '[data-test="dashboard-component-tabs"] .ant-tabs-tab'
        ]

        found_tabs = []
        for selector in tab_selectors:
            found_tabs = await page.locator(selector).all()
            if found_tabs:
                break

        if found_tabs:
            logger.info(f"[DEBUG][TabSwitching] Found {len(found_tabs)} tabs at depth {depth}")
            for i, tab in enumerate(found_tabs):
                try:
                    tab_text = (await tab.inner_text()).strip()
                    tab_id = f"{depth}_{i}_{tab_text}"

                    if tab_id in processed_tabs:
                        continue

                    if await tab.is_visible():
                        logger.info(f"[DEBUG][TabSwitching] Switching to tab: {tab_text}")
                        processed_tabs.add(tab_id)

                        is_active = "ant-tabs-tab-active" in (await tab.get_attribute("class") or "")
                        if not is_active:
                            await tab.click()
                            await asyncio.sleep(2)  # Wait for content to render

                        await switch_tabs(depth + 1)
                except Exception as tab_e:
                    logger.warning(f"[DEBUG][TabSwitching] Failed to process tab {i}: {tab_e}")

            try:
                first_tab = found_tabs[0]
                if "ant-tabs-tab-active" not in (await first_tab.get_attribute("class") or ""):
                    await first_tab.click()
                    await asyncio.sleep(1)
            except Exception:
                pass

    await switch_tabs()

    # 2. Calculate full height for screenshot
    # @UX_STATE: [CalculatingHeight] -> Determining dashboard dimensions
    full_height = await page.evaluate("""() => {
        const body = document.body;
        const html = document.documentElement;
        const dashboardContent = document.querySelector('.dashboard-content');

        return Math.max(
            body.scrollHeight, body.offsetHeight,
            html.clientHeight, html.scrollHeight, html.offsetHeight,
            dashboardContent ? dashboardContent.scrollHeight + 100 : 0
        );
    }""")
    logger.info(f"[DEBUG] Calculated full height: {full_height}")

    # DIAGNOSTIC: Count chart elements before resize
    chart_count_before = await page.evaluate("""() => {
        return {
            chartContainers: document.querySelectorAll('.chart-container, .slice_container').length,
            canvasElements: document.querySelectorAll('canvas').length,
            svgElements: document.querySelectorAll('.chart-container svg, .slice_container svg').length,
            visibleCharts: document.querySelectorAll('.chart-container:visible, .slice_container:visible').length
        };
    }""")
    logger.info(f"[DIAGNOSTIC] Chart elements BEFORE viewport resize: {chart_count_before}")

    # DIAGNOSTIC: Capture pre-resize screenshot for comparison
    pre_resize_path = output_path.replace(".png", "_preresize.png")
    try:
        await page.screenshot(path=pre_resize_path, full_page=False, timeout=10000)
        import os
        pre_resize_size = os.path.getsize(pre_resize_path) if os.path.exists(pre_resize_path) else 0
        logger.info(f"[DIAGNOSTIC] Pre-resize screenshot saved: {pre_resize_path} ({pre_resize_size} bytes)")
    except Exception as pre_e:
        logger.warning(f"[DIAGNOSTIC] Failed to capture pre-resize screenshot: {pre_e}")

    logger.info(f"[DIAGNOSTIC] Resizing viewport from current to 1920x{int(full_height)}")
    await page.set_viewport_size({"width": 1920, "height": int(full_height)})

    # DIAGNOSTIC: Increased wait time and log timing
    logger.info("[DIAGNOSTIC] Waiting 10 seconds after viewport resize for re-render...")
    await asyncio.sleep(10)
    logger.info("[DIAGNOSTIC] Wait completed")

    # DIAGNOSTIC: Count chart elements after resize and wait
    chart_count_after = await page.evaluate("""() => {
        return {
            chartContainers: document.querySelectorAll('.chart-container, .slice_container').length,
            canvasElements: document.querySelectorAll('canvas').length,
            svgElements: document.querySelectorAll('.chart-container svg, .slice_container svg').length,
            visibleCharts: document.querySelectorAll('.chart-container:visible, .slice_container:visible').length
        };
    }""")
    logger.info(f"[DIAGNOSTIC] Chart elements AFTER viewport resize + wait: {chart_count_after}")

    # DIAGNOSTIC: Check if any charts have error states
    chart_errors = await page.evaluate("""() => {
        const errors = [];
        document.querySelectorAll('.chart-container, .slice_container').forEach((chart, i) => {
            const errorEl = chart.querySelector('.error, .alert-danger, .ant-alert-error');
            if (errorEl) {
                errors.push({index: i, text: errorEl.innerText.substring(0, 100)});
            }
        });
        return errors;
    }""")
    if chart_errors:
        logger.warning(f"[DIAGNOSTIC] Charts with error states detected: {chart_errors}")
    else:
        logger.info("[DIAGNOSTIC] No chart error states detected")

    # 3. Take screenshot using CDP to bypass Playwright's font loading wait
    # @UX_STATE: [Capturing] -> Executing CDP screenshot
    logger.info("[DEBUG] Attempting full-page screenshot via CDP...")
    cdp = await page.context.new_cdp_session(page)

    screenshot_data = await cdp.send("Page.captureScreenshot", {
        "format": "png",
        "fromSurface": True,
        "captureBeyondViewport": True
    })

    image_data = base64.b64decode(screenshot_data["data"])

    with open(output_path, 'wb') as f:
        f.write(image_data)

    # DIAGNOSTIC: Verify screenshot file
    import os
    final_size = os.path.getsize(output_path) if os.path.exists(output_path) else 0
    logger.info(f"[DIAGNOSTIC] Final screenshot saved: {output_path}")
    logger.info(f"[DIAGNOSTIC] Final screenshot size: {final_size} bytes ({final_size / 1024:.2f} KB)")

    # DIAGNOSTIC: Get image dimensions
    try:
        with Image.open(output_path) as final_img:
            logger.info(f"[DIAGNOSTIC] Final screenshot dimensions: {final_img.width}x{final_img.height}")
    except Exception as img_err:
        logger.warning(f"[DIAGNOSTIC] Could not read final image dimensions: {img_err}")

    logger.info(f"Full-page screenshot saved to {output_path} (via CDP)")
except Exception as e:
    logger.error(f"[DEBUG] Full-page/Tab capture failed: {e}")
    try:
        await page.screenshot(path=output_path, full_page=True, timeout=10000)
    except Exception as e2:
        logger.error(f"[DEBUG] Fallback screenshot also failed: {e2}")
        await page.screenshot(path=output_path, timeout=5000)

await browser.close()
return True
# [/DEF:ScreenshotService.capture_dashboard:Function]
# [/DEF:ScreenshotService:Class]
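# --- Illustrative aside (not part of this module) ----------------------------
# The retry policy used by LLMClient below combines tenacity's exponential
# backoff with a custom predicate so that authentication failures are never
# retried. A condensed, standalone version of that wiring (the exception type
# and the decorated function are stand-ins, not this project's API):
#
#     from tenacity import retry, stop_after_attempt, wait_exponential, retry_if_exception
#
#     class AuthError(Exception):
#         pass
#
#     def _should_retry(exc: Exception) -> bool:
#         # Retry everything except authentication failures.
#         return not isinstance(exc, AuthError)
#
#     @retry(stop=stop_after_attempt(5),
#            wait=wait_exponential(multiplier=2, min=5, max=60),
#            retry=retry_if_exception(_should_retry),
#            reraise=True)
#     def call_provider():
#         ...
# ------------------------------------------------------------------------------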

# [DEF:LLMClient:Class]
# @PURPOSE: Wrapper for LLM provider APIs.
class LLMClient:
    # [DEF:LLMClient.__init__:Function]
    # @PURPOSE: Initializes the LLMClient with provider settings.
    # @PRE: api_key, base_url, and default_model are non-empty strings.
    def __init__(self, provider_type: LLMProviderType, api_key: str, base_url: str, default_model: str):
        self.provider_type = provider_type
        self.api_key = api_key
        self.base_url = base_url
        self.default_model = default_model

        # DEBUG: Log initialization parameters (without exposing full API key)
        logger.info("[LLMClient.__init__] Initializing LLM client:")
        logger.info(f"[LLMClient.__init__] Provider Type: {provider_type}")
        logger.info(f"[LLMClient.__init__] Base URL: {base_url}")
        logger.info(f"[LLMClient.__init__] Default Model: {default_model}")
        logger.info(f"[LLMClient.__init__] API Key (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")
        logger.info(f"[LLMClient.__init__] API Key Length: {len(api_key) if api_key else 0}")

        self.client = AsyncOpenAI(api_key=api_key, base_url=base_url)
    # [/DEF:LLMClient.__init__:Function]

    # [DEF:LLMClient.get_json_completion:Function]
    # @PURPOSE: Helper to handle LLM calls with JSON mode and fallback parsing.
    # @PRE: messages is a list of valid message dictionaries.
    # @POST: Returns a parsed JSON dictionary.
    # @SIDE_EFFECT: Calls external LLM API.
    def _should_retry(exception: Exception) -> bool:
        """Custom retry predicate that excludes authentication errors."""
        # Don't retry on authentication errors
        if isinstance(exception, OpenAIAuthenticationError):
            return False
        # Retry on rate limit errors and other exceptions
        return isinstance(exception, (RateLimitError, Exception))

    @retry(
        stop=stop_after_attempt(5),
        wait=wait_exponential(multiplier=2, min=5, max=60),
        retry=retry_if_exception(_should_retry),
        reraise=True
    )
    async def get_json_completion(self, messages: List[Dict[str, Any]]) -> Dict[str, Any]:
        with belief_scope("get_json_completion"):
            response = None
            try:
                try:
                    logger.info(f"[get_json_completion] Attempting LLM call with JSON mode for model: {self.default_model}")
                    logger.info(f"[get_json_completion] Base URL being used: {self.base_url}")
                    logger.info(f"[get_json_completion] Number of messages: {len(messages)}")
                    logger.info(f"[get_json_completion] API Key present: {bool(self.api_key and len(self.api_key) > 0)}")

                    response = await self.client.chat.completions.create(
                        model=self.default_model,
                        messages=messages,
                        response_format={"type": "json_object"}
                    )
                except Exception as e:
                    if "JSON mode is not enabled" in str(e) or "400" in str(e):
                        logger.warning(f"[get_json_completion] JSON mode failed or not supported: {str(e)}. Falling back to plain text response.")
                        response = await self.client.chat.completions.create(
                            model=self.default_model,
                            messages=messages
                        )
                    else:
                        raise e

                logger.debug(f"[get_json_completion] LLM Response: {response}")
            except OpenAIAuthenticationError as e:
                logger.error(f"[get_json_completion] Authentication error: {str(e)}")
                # Do not retry on auth errors - re-raise to stop retry
                raise
            except RateLimitError as e:
                logger.warning(f"[get_json_completion] Rate limit hit: {str(e)}")

                # Extract retry_delay from error metadata if available
                retry_delay = 5.0  # Default fallback
                try:
                    # Based on logs, the raw response is in e.body or e.response.json()
                    # The logs show 'metadata': {'raw': '...'} which suggests a proxy or specific client wrapper
                    # Try to find the 'retryDelay' in the error message or response
                    import re

                    # Try to find "retryDelay": "XXs" in the string representation of the error
                    error_str = str(e)
                    match = re.search(r'"retryDelay":\s*"(\d+)s"', error_str)
                    if match:
                        retry_delay = float(match.group(1))
                    else:
                        # Try to parse from response if it's a standard OpenAI-like error with body
                        if hasattr(e, 'body') and isinstance(e.body, dict):
                            # Some providers put it in details
                            details = e.body.get('error', {}).get('details', [])
                            for detail in details:
                                if detail.get('@type') == 'type.googleapis.com/google.rpc.RetryInfo':
                                    delay_str = detail.get('retryDelay', '5s')
                                    retry_delay = float(delay_str.rstrip('s'))
                                    break
                except Exception as parse_e:
                    logger.debug(f"[get_json_completion] Failed to parse retry delay: {parse_e}")

                # Add a small safety margin (0.5s) as requested
                wait_time = retry_delay + 0.5
                logger.info(f"[get_json_completion] Waiting for {wait_time}s before retry...")
                await asyncio.sleep(wait_time)
                raise
            except Exception as e:
                logger.error(f"[get_json_completion] LLM call failed: {str(e)}")
                raise

            if not response or not hasattr(response, 'choices') or not response.choices:
                raise RuntimeError(f"Invalid LLM response: {response}")

            content = response.choices[0].message.content
            logger.debug(f"[get_json_completion] Raw content to parse: {content}")

            try:
                return json.loads(content)
            except json.JSONDecodeError:
                logger.warning("[get_json_completion] Failed to parse JSON directly, attempting to extract from code blocks")
                if "```json" in content:
                    json_str = content.split("```json")[1].split("```")[0].strip()
                    return json.loads(json_str)
                elif "```" in content:
                    json_str = content.split("```")[1].split("```")[0].strip()
                    return json.loads(json_str)
                else:
                    raise
    # [/DEF:LLMClient.get_json_completion:Function]

    # [DEF:LLMClient.analyze_dashboard:Function]
    # @PURPOSE: Sends dashboard data (screenshot + logs) to LLM for health analysis.
    # @PRE: screenshot_path exists, logs is a list of strings.
    # @POST: Returns a structured analysis dictionary (status, summary, issues).
    # @SIDE_EFFECT: Reads screenshot file and calls external LLM API.
    async def analyze_dashboard(self, screenshot_path: str, logs: List[str]) -> Dict[str, Any]:
        with belief_scope("analyze_dashboard"):
            # Optimize image to reduce token count (US1 / T023)
            # Gemini/Gemma models have limits on input tokens, and large images contribute significantly.
            try:
                with Image.open(screenshot_path) as img:
                    # Convert to RGB if necessary
                    if img.mode in ("RGBA", "P"):
                        img = img.convert("RGB")

                    # Resize if too large (max 1024px width while maintaining aspect ratio)
                    # We reduce width further to 1024px to stay within token limits for long dashboards
                    max_width = 1024
                    if img.width > max_width or img.height > 2048:
                        # Calculate scaling factor to fit within 1024x2048
                        scale = min(max_width / img.width, 2048 / img.height)
                        if scale < 1.0:
                            original_size = f"{img.width}x{img.height}"
                            new_width = int(img.width * scale)
                            new_height = int(img.height * scale)
                            img = img.resize((new_width, new_height), Image.Resampling.LANCZOS)
                            logger.info(f"[analyze_dashboard] Resized image from {original_size} to {new_width}x{new_height}")

                    # Compress and convert to base64
                    buffer = io.BytesIO()
                    # Lower quality to 60% to further reduce payload size
                    img.save(buffer, format="JPEG", quality=60, optimize=True)
                    base_64_image = base64.b64encode(buffer.getvalue()).decode('utf-8')
                    logger.info(f"[analyze_dashboard] Optimized image size: {len(buffer.getvalue()) / 1024:.2f} KB")
            except Exception as img_e:
                logger.warning(f"[analyze_dashboard] Image optimization failed: {img_e}. Using raw image.")
                with open(screenshot_path, "rb") as image_file:
                    base_64_image = base64.b64encode(image_file.read()).decode('utf-8')

            log_text = "\n".join(logs)
            prompt = f"""
Analyze the attached dashboard screenshot and the following execution logs for health and visual issues.

Logs:
{log_text}

Provide the analysis in JSON format with the following structure:
{{
    "status": "PASS" | "WARN" | "FAIL",
    "summary": "Short summary of findings",
    "issues": [
        {{
            "severity": "WARN" | "FAIL",
            "message": "Description of the issue",
            "location": "Optional location info (e.g. chart name)"
        }}
    ]
}}
"""

            messages = [
                {
                    "role": "user",
                    "content": [
                        {"type": "text", "text": prompt},
                        {
                            "type": "image_url",
                            "image_url": {
                                "url": f"data:image/jpeg;base64,{base_64_image}"
                            }
                        }
                    ]
                }
            ]

            try:
                return await self.get_json_completion(messages)
            except Exception as e:
                logger.error(f"[analyze_dashboard] Failed to get analysis: {str(e)}")
                return {
                    "status": "FAIL",
                    "summary": f"Failed to get response from LLM: {str(e)}",
                    "issues": [{"severity": "FAIL", "message": "LLM provider returned empty or invalid response"}]
                }
    # [/DEF:LLMClient.analyze_dashboard:Function]
# [/DEF:LLMClient:Class]

# [/DEF:backend/src/plugins/llm_analysis/service.py:Module]
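The capture path above ultimately relies on a raw CDP "Page.captureScreenshot" call so the whole scrollable dashboard is rendered in one image, regardless of viewport height. A minimal, self-contained sketch of that technique, assuming Playwright with Chromium and that authentication is already handled; the function name and example URL are illustrative, not part of this codebase:

import asyncio
import base64

from playwright.async_api import async_playwright


async def capture_full_page(url: str, output_path: str) -> None:
    # Launch Chromium, load the page, then capture beyond the viewport via CDP.
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)
        page = await browser.new_page(viewport={"width": 1920, "height": 1080})
        await page.goto(url, wait_until="networkidle", timeout=60000)
        cdp = await page.context.new_cdp_session(page)
        shot = await cdp.send("Page.captureScreenshot", {
            "format": "png",
            "fromSurface": True,
            "captureBeyondViewport": True,
        })
        with open(output_path, "wb") as f:
            f.write(base64.b64decode(shot["data"]))
        await browser.close()


asyncio.run(capture_full_page("http://localhost:8088/superset/dashboard/1/?standalone=true", "dashboard.png"))

Capturing via the CDP session sidesteps Playwright's own screenshot pre-checks (such as waiting for web fonts), which is why the service above prefers it and only falls back to page.screenshot() on failure.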
@@ -3,6 +3,7 @@
 # @PURPOSE: Implements a plugin for mapping dataset columns using external database connections or Excel files.
 # @LAYER: Plugins
 # @RELATION: Inherits from PluginBase. Uses DatasetMapper from superset_tool.
+# @RELATION: USES -> TaskContext
 # @CONSTRAINT: Must use belief_scope for logging.

 # [SECTION: IMPORTS]
@@ -13,6 +14,7 @@ from ..core.logger import logger, belief_scope
 from ..core.database import SessionLocal
 from ..models.connection import ConnectionConfig
 from ..core.utils.dataset_mapper import DatasetMapper
+from ..core.task_manager.context import TaskContext
 # [/SECTION]

 # [DEF:MapperPlugin:Class]
@@ -66,6 +68,15 @@ class MapperPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the mapper plugin.
+    # @RETURN: str - "/tools/mapper"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/tools/mapper"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for the mapper plugin parameters.
     # @PRE: Plugin instance exists.
@@ -119,19 +130,27 @@ class MapperPlugin(PluginBase):
     # [/DEF:get_schema:Function]

     # [DEF:execute:Function]
-    # @PURPOSE: Executes the dataset mapping logic.
+    # @PURPOSE: Executes the dataset mapping logic with TaskContext support.
     # @PARAM: params (Dict[str, Any]) - Mapping parameters.
+    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
     # @PRE: Params contain valid 'env', 'dataset_id', and 'source'. params must be a dictionary.
     # @POST: Updates the dataset in Superset.
     # @RETURN: Dict[str, Any] - Execution status.
-    async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
+    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None) -> Dict[str, Any]:
         with belief_scope("execute"):
             env_name = params.get("env")
             dataset_id = params.get("dataset_id")
             source = params.get("source")

+            # Use TaskContext logger if available, otherwise fall back to app logger
+            log = context.logger if context else logger
+
+            # Create sub-loggers for different components
+            superset_log = log.with_source("superset_api") if context else log
+            db_log = log.with_source("postgres") if context else log
+
             if not env_name or dataset_id is None or not source:
-                logger.error("[MapperPlugin.execute][State] Missing required parameters.")
+                log.error("Missing required parameters: env, dataset_id, source")
                 raise ValueError("Missing required parameters: env, dataset_id, source")

             # Get config and initialize client
@@ -139,7 +158,7 @@ class MapperPlugin(PluginBase):
             config_manager = get_config_manager()
             env_config = config_manager.get_environment(env_name)
             if not env_config:
-                logger.error(f"[MapperPlugin.execute][State] Environment '{env_name}' not found.")
+                log.error(f"Environment '{env_name}' not found in configuration.")
                 raise ValueError(f"Environment '{env_name}' not found in configuration.")

             client = SupersetClient(env_config)
@@ -149,7 +168,7 @@ class MapperPlugin(PluginBase):
             if source == "postgres":
                 connection_id = params.get("connection_id")
                 if not connection_id:
-                    logger.error("[MapperPlugin.execute][State] connection_id is required for postgres source.")
+                    log.error("connection_id is required for postgres source.")
                     raise ValueError("connection_id is required for postgres source.")

                 # Load connection from DB
@@ -157,7 +176,7 @@ class MapperPlugin(PluginBase):
                 try:
                     conn_config = db.query(ConnectionConfig).filter(ConnectionConfig.id == connection_id).first()
                     if not conn_config:
-                        logger.error(f"[MapperPlugin.execute][State] Connection {connection_id} not found.")
+                        db_log.error(f"Connection {connection_id} not found.")
                         raise ValueError(f"Connection {connection_id} not found.")

                     postgres_config = {
@@ -167,10 +186,11 @@ class MapperPlugin(PluginBase):
                         'host': conn_config.host,
                         'port': str(conn_config.port) if conn_config.port else '5432'
                     }
+                    db_log.debug(f"Loaded connection config for {conn_config.host}:{conn_config.port}/{conn_config.database}")
                 finally:
                     db.close()

-            logger.info(f"[MapperPlugin.execute][Action] Starting mapping for dataset {dataset_id} in {env_name}")
+            log.info(f"Starting mapping for dataset {dataset_id} in {env_name}")

             mapper = DatasetMapper()

@@ -184,10 +204,10 @@ class MapperPlugin(PluginBase):
                     table_name=params.get("table_name"),
                     table_schema=params.get("table_schema") or "public"
                 )
-                logger.info(f"[MapperPlugin.execute][Success] Mapping completed for dataset {dataset_id}")
+                superset_log.info(f"Mapping completed for dataset {dataset_id}")
                 return {"status": "success", "dataset_id": dataset_id}
             except Exception as e:
-                logger.error(f"[MapperPlugin.execute][Failure] Mapping failed: {e}")
+                log.error(f"Mapping failed: {e}")
                 raise
     # [/DEF:execute:Function]

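The recurring change across these plugin diffs is the same logger-resolution pattern: prefer the task-scoped logger from TaskContext when one is supplied, fall back to the application logger otherwise, and derive per-component sub-loggers via with_source(). A minimal sketch of that resolution step, using stand-in stubs rather than this project's real TaskContext and logger types:

import logging

app_logger = logging.getLogger("app")  # stand-in for the application logger


class StubTaskLogger:
    """Illustrative stand-in for the TaskContext logger used by the plugins."""

    def __init__(self, source: str = "task"):
        self._log = logging.getLogger(f"task.{source}")

    def with_source(self, source: str) -> "StubTaskLogger":
        # Return a child logger tagged with the given component source.
        return StubTaskLogger(source)

    def info(self, msg: str) -> None:
        self._log.info(msg)

    def error(self, msg: str) -> None:
        self._log.error(msg)


class StubTaskContext:
    logger = StubTaskLogger()


def resolve_loggers(context=None):
    # Use TaskContext logger if available, otherwise fall back to the app logger.
    log = context.logger if context else app_logger
    superset_log = log.with_source("superset_api") if context else log
    db_log = log.with_source("postgres") if context else log
    return log, superset_log, db_log


log, superset_log, db_log = resolve_loggers(StubTaskContext())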
@@ -5,20 +5,20 @@
 # @RELATION: IMPLEMENTS -> PluginBase
 # @RELATION: DEPENDS_ON -> superset_tool.client
 # @RELATION: DEPENDS_ON -> superset_tool.utils
+# @RELATION: USES -> TaskContext

-from typing import Dict, Any, List
-from pathlib import Path
-import zipfile
+from typing import Dict, Any, Optional
 import re

 from ..core.plugin_base import PluginBase
-from ..core.logger import belief_scope
+from ..core.logger import belief_scope, logger as app_logger
 from ..core.superset_client import SupersetClient
-from ..core.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
+from ..core.utils.fileio import create_temp_file
 from ..dependencies import get_config_manager
 from ..core.migration_engine import MigrationEngine
 from ..core.database import SessionLocal
 from ..models.mapping import DatabaseMapping, Environment
+from ..core.task_manager.context import TaskContext

 # [DEF:MigrationPlugin:Class]
 # @PURPOSE: Implementation of the migration plugin logic.
@@ -71,6 +71,15 @@ class MigrationPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the migration plugin.
+    # @RETURN: str - "/migration"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/migration"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for migration plugin parameters.
     # @PRE: Config manager is available.
@@ -123,11 +132,12 @@ class MigrationPlugin(PluginBase):
     # [/DEF:get_schema:Function]

     # [DEF:execute:Function]
-    # @PURPOSE: Executes the dashboard migration logic.
+    # @PURPOSE: Executes the dashboard migration logic with TaskContext support.
     # @PARAM: params (Dict[str, Any]) - Migration parameters.
+    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
     # @PRE: Source and target environments must be configured.
     # @POST: Selected dashboards are migrated.
-    async def execute(self, params: Dict[str, Any]):
+    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
         with belief_scope("MigrationPlugin.execute"):
             source_env_id = params.get("source_env_id")
             target_env_id = params.get("target_env_id")
@@ -139,8 +149,8 @@ class MigrationPlugin(PluginBase):
             dashboard_regex = params.get("dashboard_regex")

             replace_db_config = params.get("replace_db_config", False)
-            from_db_id = params.get("from_db_id")
-            to_db_id = params.get("to_db_id")
+            params.get("from_db_id")
+            params.get("to_db_id")

             # [DEF:MigrationPlugin.execute:Action]
             # @PURPOSE: Execute the migration logic with proper task logging.
@@ -148,74 +158,15 @@ class MigrationPlugin(PluginBase):
             from ..dependencies import get_task_manager
             tm = get_task_manager()

-            class TaskLoggerProxy:
-                # [DEF:__init__:Function]
-                # @PURPOSE: Initializes the proxy logger.
-                # @PRE: None.
-                # @POST: Instance is initialized.
-                def __init__(self):
-                    with belief_scope("__init__"):
-                        # Initialize parent with dummy values since we override methods
-                        pass
-                # [/DEF:__init__:Function]
-
-                # [DEF:debug:Function]
-                # @PURPOSE: Logs a debug message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def debug(self, msg, *args, extra=None, **kwargs):
-                    with belief_scope("debug"):
-                        if task_id: tm._add_log(task_id, "DEBUG", msg, extra or {})
-                # [/DEF:debug:Function]
-
-                # [DEF:info:Function]
-                # @PURPOSE: Logs an info message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def info(self, msg, *args, extra=None, **kwargs):
-                    with belief_scope("info"):
-                        if task_id: tm._add_log(task_id, "INFO", msg, extra or {})
-                # [/DEF:info:Function]
-
-                # [DEF:warning:Function]
-                # @PURPOSE: Logs a warning message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def warning(self, msg, *args, extra=None, **kwargs):
-                    with belief_scope("warning"):
-                        if task_id: tm._add_log(task_id, "WARNING", msg, extra or {})
-                # [/DEF:warning:Function]
-
-                # [DEF:error:Function]
-                # @PURPOSE: Logs an error message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def error(self, msg, *args, extra=None, **kwargs):
-                    with belief_scope("error"):
-                        if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
-                # [/DEF:error:Function]
-
-                # [DEF:critical:Function]
-                # @PURPOSE: Logs a critical message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def critical(self, msg, *args, extra=None, **kwargs):
-                    with belief_scope("critical"):
-                        if task_id: tm._add_log(task_id, "ERROR", msg, extra or {})
-                # [/DEF:critical:Function]
-
-                # [DEF:exception:Function]
-                # @PURPOSE: Logs an exception message to the task manager.
-                # @PRE: msg is a string.
-                # @POST: Log is added to task manager if task_id exists.
-                def exception(self, msg, *args, **kwargs):
-                    with belief_scope("exception"):
-                        if task_id: tm._add_log(task_id, "ERROR", msg, {"exception": True})
-                # [/DEF:exception:Function]
-
-            logger = TaskLoggerProxy()
-            logger.info(f"[MigrationPlugin][Entry] Starting migration task.")
-            logger.info(f"[MigrationPlugin][Action] Params: {params}")
+            # Use TaskContext logger if available, otherwise fall back to app_logger
+            log = context.logger if context else app_logger
+
+            # Create sub-loggers for different components
+            superset_log = log.with_source("superset_api") if context else log
+            migration_log = log.with_source("migration") if context else log
+
+            log.info("Starting migration task.")
+            log.debug(f"Params: {params}")

             try:
                 with belief_scope("execute"):
@@ -242,7 +193,7 @@ class MigrationPlugin(PluginBase):
                     from_env_name = src_env.name
                     to_env_name = tgt_env.name

-                    logger.info(f"[MigrationPlugin][State] Resolved environments: {from_env_name} -> {to_env_name}")
+                    log.info(f"Resolved environments: {from_env_name} -> {to_env_name}")

                     from_c = SupersetClient(src_env)
                     to_c = SupersetClient(tgt_env)
@@ -261,11 +212,11 @@ class MigrationPlugin(PluginBase):
                             d for d in all_dashboards if re.search(regex_str, d["dashboard_title"], re.IGNORECASE)
                         ]
                     else:
-                        logger.warning("[MigrationPlugin][State] No selection criteria provided (selected_ids or dashboard_regex).")
+                        log.warning("No selection criteria provided (selected_ids or dashboard_regex).")
                         return

                     if not dashboards_to_migrate:
-                        logger.warning("[MigrationPlugin][State] No dashboards found matching criteria.")
+                        log.warning("No dashboards found matching criteria.")
                         return

                     # Fetch mappings from database
@@ -283,7 +234,7 @@ class MigrationPlugin(PluginBase):
                             DatabaseMapping.target_env_id == tgt_env.id
                         ).all()
                         db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
-                        logger.info(f"[MigrationPlugin][State] Loaded {len(db_mapping)} database mappings.")
+                        log.info(f"Loaded {len(db_mapping)} database mappings.")
                     finally:
                         db.close()

@@ -294,15 +245,15 @@ class MigrationPlugin(PluginBase):

                         try:
                             exported_content, _ = from_c.export_dashboard(dash_id)
-                            with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=logger) as tmp_zip_path:
+                            with create_temp_file(content=exported_content, dry_run=True, suffix=".zip") as tmp_zip_path:
                                 # Always transform to strip databases to avoid password errors
-                                with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
+                                with create_temp_file(suffix=".zip", dry_run=True) as tmp_new_zip:
                                     success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping, strip_databases=False)

                                     if not success and replace_db_config:
                                         # Signal missing mapping and wait (only if we care about mappings)
                                         if task_id:
-                                            logger.info(f"[MigrationPlugin][Action] Pausing for missing mapping in task {task_id}")
+                                            log.info(f"Pausing for missing mapping in task {task_id}")
                                             # In a real scenario, we'd pass the missing DB info to the frontend
                                             # For this task, we'll just simulate the wait
                                             await tm.wait_for_resolution(task_id)
@@ -324,9 +275,9 @@ class MigrationPlugin(PluginBase):
                                     if success:
                                         to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
                                     else:
-                                        logger.error(f"[MigrationPlugin][Failure] Failed to transform ZIP for dashboard {title}")
+                                        migration_log.error(f"Failed to transform ZIP for dashboard {title}")

-                                    logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported.")
+                                    superset_log.info(f"Dashboard {title} imported.")
                         except Exception as exc:
                             # Check for password error
                             error_msg = str(exc)
@@ -348,7 +299,7 @@ class MigrationPlugin(PluginBase):
                             if match_alt:
                                 db_name = match_alt.group(1)

-                                logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")
+                                app_logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")

                                 if task_id:
                                     input_request = {
@@ -367,19 +318,19 @@ class MigrationPlugin(PluginBase):

                                     # Retry import with password
                                     if passwords:
-                                        logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
+                                        app_logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
                                         to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug, passwords=passwords)
-                                        logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
+                                        app_logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
                                         # Clear passwords from params after use for security
                                         if "passwords" in task.params:
                                             del task.params["passwords"]
                                         continue

-                            logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)
+                            app_logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)

-                    logger.info("[MigrationPlugin][Exit] Migration finished.")
+                    app_logger.info("[MigrationPlugin][Exit] Migration finished.")
             except Exception as e:
-                logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
+                app_logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
                 raise e
             # [/DEF:MigrationPlugin.execute:Action]
     # [/DEF:execute:Function]
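Dashboard selection in this plugin is either by explicit IDs or by a case-insensitive regex over dashboard titles (the re.search(..., re.IGNORECASE) filter kept as context above). A standalone sketch of that selection step; the dashboard payloads and the selected_ids branch are assumptions for illustration, only the title-regex filter mirrors the diff:

import re

# Hypothetical dashboard payloads shaped like the Superset API "dashboard_title" field.
all_dashboards = [
    {"id": 1, "dashboard_title": "Sales Overview"},
    {"id": 2, "dashboard_title": "Ops Health"},
]


def select_dashboards(dashboards, selected_ids=None, dashboard_regex=None):
    # Explicit IDs win; otherwise fall back to a case-insensitive title regex.
    if selected_ids:
        return [d for d in dashboards if d["id"] in selected_ids]
    if dashboard_regex:
        return [
            d for d in dashboards
            if re.search(dashboard_regex, d["dashboard_title"], re.IGNORECASE)
        ]
    return []


print(select_dashboards(all_dashboards, dashboard_regex="sales"))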
@@ -3,14 +3,16 @@
 # @PURPOSE: Implements a plugin for searching text patterns across all datasets in a specific Superset environment.
 # @LAYER: Plugins
 # @RELATION: Inherits from PluginBase. Uses SupersetClient from core.
+# @RELATION: USES -> TaskContext
 # @CONSTRAINT: Must use belief_scope for logging.

 # [SECTION: IMPORTS]
 import re
-from typing import Dict, Any, List, Optional
+from typing import Dict, Any, Optional
 from ..core.plugin_base import PluginBase
 from ..core.superset_client import SupersetClient
 from ..core.logger import logger, belief_scope
+from ..core.task_manager.context import TaskContext
 # [/SECTION]

 # [DEF:SearchPlugin:Class]
@@ -64,6 +66,15 @@ class SearchPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the search plugin.
+    # @RETURN: str - "/tools/search"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/tools/search"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for the search plugin parameters.
     # @PRE: Plugin instance exists.
@@ -90,18 +101,26 @@ class SearchPlugin(PluginBase):
     # [/DEF:get_schema:Function]

     # [DEF:execute:Function]
-    # @PURPOSE: Executes the dataset search logic.
+    # @PURPOSE: Executes the dataset search logic with TaskContext support.
     # @PARAM: params (Dict[str, Any]) - Search parameters.
+    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
     # @PRE: Params contain valid 'env' and 'query'.
     # @POST: Returns a dictionary with count and results list.
     # @RETURN: Dict[str, Any] - Search results.
-    async def execute(self, params: Dict[str, Any]) -> Dict[str, Any]:
+    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None) -> Dict[str, Any]:
         with belief_scope("SearchPlugin.execute", f"params={params}"):
             env_name = params.get("env")
             search_query = params.get("query")

+            # Use TaskContext logger if available, otherwise fall back to app logger
+            log = context.logger if context else logger
+
+            # Create sub-loggers for different components
+            log.with_source("superset_api") if context else log
+            search_log = log.with_source("search") if context else log
+
             if not env_name or not search_query:
-                logger.error("[SearchPlugin.execute][State] Missing required parameters.")
+                log.error("Missing required parameters: env, query")
                 raise ValueError("Missing required parameters: env, query")

             # Get config and initialize client
@@ -109,20 +128,20 @@ class SearchPlugin(PluginBase):
             config_manager = get_config_manager()
             env_config = config_manager.get_environment(env_name)
             if not env_config:
-                logger.error(f"[SearchPlugin.execute][State] Environment '{env_name}' not found.")
+                log.error(f"Environment '{env_name}' not found in configuration.")
                 raise ValueError(f"Environment '{env_name}' not found in configuration.")

             client = SupersetClient(env_config)
             client.authenticate()

-            logger.info(f"[SearchPlugin.execute][Action] Searching for pattern: '{search_query}' in environment: {env_name}")
+            log.info(f"Searching for pattern: '{search_query}' in environment: {env_name}")

             try:
                 # Ported logic from search_script.py
                 _, datasets = client.get_datasets(query={"columns": ["id", "table_name", "sql", "database", "columns"]})

                 if not datasets:
-                    logger.warning("[SearchPlugin.execute][State] No datasets found.")
+                    search_log.warning("No datasets found.")
                     return {"count": 0, "results": []}

                 pattern = re.compile(search_query, re.IGNORECASE)
@@ -146,17 +165,17 @@ class SearchPlugin(PluginBase):
                             "full_value": value_str
                         })

-                logger.info(f"[SearchPlugin.execute][Success] Found matches in {len(results)} locations.")
+                search_log.info(f"Found matches in {len(results)} locations.")
                 return {
                     "count": len(results),
                     "results": results
                 }

             except re.error as e:
-                logger.error(f"[SearchPlugin.execute][Failure] Invalid regex pattern: {e}")
+                search_log.error(f"Invalid regex pattern: {e}")
                 raise ValueError(f"Invalid regex pattern: {e}")
             except Exception as e:
-                logger.error(f"[SearchPlugin.execute][Failure] Error during search: {e}")
+                log.error(f"Error during search: {e}")
                 raise
     # [/DEF:execute:Function]

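The search plugin compiles the query once with re.IGNORECASE and scans dataset fields fetched from the Superset API, returning a {"count", "results"} envelope. A minimal sketch of that matching loop over hypothetical dataset payloads; the exact fields scanned here are an assumption, while the pattern handling, "full_value" key, and result envelope mirror the diff:

import re

# Hypothetical dataset payloads shaped like the columns requested from the API
# ("id", "table_name", "sql", ...).
datasets = [
    {"id": 10, "table_name": "orders", "sql": "SELECT * FROM orders WHERE status = 'paid'"},
    {"id": 11, "table_name": "users", "sql": None},
]


def search_datasets(items, search_query):
    pattern = re.compile(search_query, re.IGNORECASE)
    results = []
    for ds in items:
        for field in ("table_name", "sql"):
            value_str = str(ds.get(field) or "")
            if pattern.search(value_str):
                results.append({
                    "dataset_id": ds["id"],
                    "field": field,
                    "full_value": value_str,
                })
    return {"count": len(results), "results": results}


print(search_datasets(datasets, "status"))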
3
backend/src/plugins/storage/__init__.py
Normal file
3
backend/src/plugins/storage/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
|||||||
|
from .plugin import StoragePlugin
|
||||||
|
|
||||||
|
__all__ = ["StoragePlugin"]
|
||||||
344 backend/src/plugins/storage/plugin.py Normal file
@@ -0,0 +1,344 @@
# [DEF:StoragePlugin:Module]
#
# @SEMANTICS: storage, files, filesystem, plugin
# @PURPOSE: Provides core filesystem operations for managing backups and repositories.
# @LAYER: App
# @RELATION: IMPLEMENTS -> PluginBase
# @RELATION: DEPENDS_ON -> backend.src.models.storage
# @RELATION: USES -> TaskContext
#
# @INVARIANT: All file operations must be restricted to the configured storage root.

# [SECTION: IMPORTS]
import os
import shutil
from pathlib import Path
from datetime import datetime
from typing import Dict, Any, List, Optional
from fastapi import UploadFile

from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...models.storage import StoredFile, FileCategory
from ...dependencies import get_config_manager
from ...core.task_manager.context import TaskContext
# [/SECTION]

# [DEF:StoragePlugin:Class]
# @PURPOSE: Implementation of the storage management plugin.
class StoragePlugin(PluginBase):
    """
    Plugin for managing local file storage for backups and repositories.
    """

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the StoragePlugin and ensures required directories exist.
    # @PRE: Configuration manager must be accessible.
    # @POST: Storage root and category directories are created on disk.
    def __init__(self):
        with belief_scope("StoragePlugin:init"):
            self.ensure_directories()
    # [/DEF:__init__:Function]

    @property
    # [DEF:id:Function]
    # @PURPOSE: Returns the unique identifier for the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin ID string.
    # @RETURN: str - "storage-manager"
    def id(self) -> str:
        with belief_scope("StoragePlugin:id"):
            return "storage-manager"
    # [/DEF:id:Function]

    @property
    # [DEF:name:Function]
    # @PURPOSE: Returns the human-readable name of the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin name string.
    # @RETURN: str - "Storage Manager"
    def name(self) -> str:
        with belief_scope("StoragePlugin:name"):
            return "Storage Manager"
    # [/DEF:name:Function]

    @property
    # [DEF:description:Function]
    # @PURPOSE: Returns a description of the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin description string.
    # @RETURN: str - Plugin description.
    def description(self) -> str:
        with belief_scope("StoragePlugin:description"):
            return "Manages local file storage for backups and repositories."
    # [/DEF:description:Function]

    @property
    # [DEF:version:Function]
    # @PURPOSE: Returns the version of the storage plugin.
    # @PRE: None.
    # @POST: Returns the version string.
    # @RETURN: str - "1.0.0"
    def version(self) -> str:
        with belief_scope("StoragePlugin:version"):
            return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the storage plugin.
    # @RETURN: str - "/tools/storage"
    def ui_route(self) -> str:
        with belief_scope("StoragePlugin:ui_route"):
            return "/tools/storage"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for storage plugin parameters.
    # @PRE: None.
    # @POST: Returns a dictionary representing the JSON schema.
    # @RETURN: Dict[str, Any] - JSON schema.
    def get_schema(self) -> Dict[str, Any]:
        with belief_scope("StoragePlugin:get_schema"):
            return {
                "type": "object",
                "properties": {
                    "category": {
                        "type": "string",
                        "enum": [c.value for c in FileCategory],
                        "title": "Category"
                    }
                },
                "required": ["category"]
            }
    # [/DEF:get_schema:Function]

    # [DEF:execute:Function]
    # @PURPOSE: Executes storage-related tasks with TaskContext support.
    # @PARAM: params (Dict[str, Any]) - Storage parameters.
    # @PARAM: context (Optional[TaskContext]) - Task context for logging with source attribution.
    # @PRE: params must match the plugin schema.
    # @POST: Task is executed and logged.
    async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
        with belief_scope("StoragePlugin:execute"):
            # Use TaskContext logger if available, otherwise fall back to app logger
            log = context.logger if context else logger

            # Create sub-loggers for different components
            storage_log = log.with_source("storage") if context else log
            log.with_source("filesystem") if context else log

            storage_log.info(f"Executing with params: {params}")
    # [/DEF:execute:Function]

    # [DEF:get_storage_root:Function]
    # @PURPOSE: Resolves the absolute path to the storage root.
    # @PRE: Settings must define a storage root path.
    # @POST: Returns a Path object representing the storage root.
    def get_storage_root(self) -> Path:
        with belief_scope("StoragePlugin:get_storage_root"):
            config_manager = get_config_manager()
            global_settings = config_manager.get_config().settings

            # Use storage.root_path as the source of truth for storage UI
            root = Path(global_settings.storage.root_path)

            if not root.is_absolute():
                # Resolve relative to the backend directory
                # Path(__file__) is backend/src/plugins/storage/plugin.py
                # parents[3] is the project root (ss-tools)
                # We need to ensure it's relative to where backend/ is
                project_root = Path(__file__).parents[3]
                root = (project_root / root).resolve()
            return root
    # [/DEF:get_storage_root:Function]

    # [DEF:resolve_path:Function]
    # @PURPOSE: Resolves a dynamic path pattern using provided variables.
    # @PARAM: pattern (str) - The path pattern to resolve.
    # @PARAM: variables (Dict[str, str]) - Variables to substitute in the pattern.
    # @PRE: pattern must be a valid format string.
    # @POST: Returns the resolved path string.
    # @RETURN: str - The resolved path.
    def resolve_path(self, pattern: str, variables: Dict[str, str]) -> str:
        with belief_scope("StoragePlugin:resolve_path"):
            # Add common variables
            vars_with_defaults = {
                "timestamp": datetime.now().strftime("%Y%m%dT%H%M%S"),
                **variables
            }
            try:
                resolved = pattern.format(**vars_with_defaults)
                # Clean up any double slashes or leading/trailing slashes for relative path
                return os.path.normpath(resolved).strip("/")
            except KeyError as e:
                logger.warning(f"[StoragePlugin][Coherence:Failed] Missing variable for path resolution: {e}")
                # Fallback to literal pattern if formatting fails partially (or handle as needed)
                return pattern.replace("{", "").replace("}", "")
    # [/DEF:resolve_path:Function]

    # [DEF:ensure_directories:Function]
    # @PURPOSE: Creates the storage root and category subdirectories if they don't exist.
    # @PRE: Storage root must be resolvable.
    # @POST: Directories are created on the filesystem.
    # @SIDE_EFFECT: Creates directories on the filesystem.
    def ensure_directories(self):
        with belief_scope("StoragePlugin:ensure_directories"):
            root = self.get_storage_root()
            for category in FileCategory:
                # Use singular name for consistency with BackupPlugin and GitService
                path = root / category.value
                path.mkdir(parents=True, exist_ok=True)
                logger.debug(f"[StoragePlugin][Action] Ensured directory: {path}")
    # [/DEF:ensure_directories:Function]

    # [DEF:validate_path:Function]
    # @PURPOSE: Prevents path traversal attacks by ensuring the path is within the storage root.
    # @PRE: path must be a Path object.
    # @POST: Returns the resolved absolute path if valid, otherwise raises ValueError.
    def validate_path(self, path: Path) -> Path:
        with belief_scope("StoragePlugin:validate_path"):
            root = self.get_storage_root().resolve()
            resolved = path.resolve()
            try:
                resolved.relative_to(root)
            except ValueError:
                logger.error(f"[StoragePlugin][Coherence:Failed] Path traversal detected: {resolved} is not under {root}")
                raise ValueError("Access denied: Path is outside of storage root.")
            return resolved
    # [/DEF:validate_path:Function]

    # [DEF:list_files:Function]
    # @PURPOSE: Lists all files and directories in a specific category and subpath.
    # @PARAM: category (Optional[FileCategory]) - The category to list.
    # @PARAM: subpath (Optional[str]) - Nested path within the category.
    # @PRE: Storage root must exist.
    # @POST: Returns a list of StoredFile objects.
    # @RETURN: List[StoredFile] - List of file and directory metadata objects.
    def list_files(self, category: Optional[FileCategory] = None, subpath: Optional[str] = None) -> List[StoredFile]:
        with belief_scope("StoragePlugin:list_files"):
            root = self.get_storage_root()
            logger.info(f"[StoragePlugin][Action] Listing files in root: {root}, category: {category}, subpath: {subpath}")
            files = []

            categories = [category] if category else list(FileCategory)

            for cat in categories:
                # Scan the category subfolder + optional subpath
                base_dir = root / cat.value
                if subpath:
                    target_dir = self.validate_path(base_dir / subpath)
                else:
                    target_dir = base_dir

                if not target_dir.exists():
                    continue

                logger.debug(f"[StoragePlugin][Action] Scanning directory: {target_dir}")

                # Use os.scandir for better performance and to distinguish files vs dirs
                with os.scandir(target_dir) as it:
                    for entry in it:
                        # Skip logs
                        if "Logs" in entry.path:
                            continue

                        stat = entry.stat()
                        is_dir = entry.is_dir()

                        files.append(StoredFile(
                            name=entry.name,
                            path=str(Path(entry.path).relative_to(root)),
                            size=stat.st_size if not is_dir else 0,
                            created_at=datetime.fromtimestamp(stat.st_ctime),
                            category=cat,
                            mime_type="directory" if is_dir else None
                        ))

            # Sort: directories first, then by name
            return sorted(files, key=lambda x: (x.mime_type != "directory", x.name))
    # [/DEF:list_files:Function]

    # [DEF:save_file:Function]
    # @PURPOSE: Saves an uploaded file to the specified category and optional subpath.
    # @PARAM: file (UploadFile) - The uploaded file.
    # @PARAM: category (FileCategory) - The target category.
    # @PARAM: subpath (Optional[str]) - The target subpath.
    # @PRE: file must be a valid UploadFile; category must be valid.
    # @POST: File is written to disk and metadata is returned.
    # @RETURN: StoredFile - Metadata of the saved file.
    # @SIDE_EFFECT: Writes file to disk.
    async def save_file(self, file: UploadFile, category: FileCategory, subpath: Optional[str] = None) -> StoredFile:
        with belief_scope("StoragePlugin:save_file"):
            root = self.get_storage_root()
            dest_dir = root / category.value
            if subpath:
                dest_dir = dest_dir / subpath

            dest_dir.mkdir(parents=True, exist_ok=True)

            dest_path = self.validate_path(dest_dir / file.filename)

            with dest_path.open("wb") as buffer:
                shutil.copyfileobj(file.file, buffer)

            stat = dest_path.stat()
            return StoredFile(
                name=dest_path.name,
                path=str(dest_path.relative_to(root)),
                size=stat.st_size,
                created_at=datetime.fromtimestamp(stat.st_ctime),
                category=category,
                mime_type=file.content_type
            )
    # [/DEF:save_file:Function]

    # [DEF:delete_file:Function]
    # @PURPOSE: Deletes a file or directory from the specified category and path.
    # @PARAM: category (FileCategory) - The category.
    # @PARAM: path (str) - The relative path of the file or directory.
    # @PRE: path must belong to the specified category and exist on disk.
    # @POST: The file or directory is removed from disk.
    # @SIDE_EFFECT: Removes item from disk.
    def delete_file(self, category: FileCategory, path: str):
        with belief_scope("StoragePlugin:delete_file"):
            root = self.get_storage_root()
            # path is relative to root, but we ensure it starts with category
            full_path = self.validate_path(root / path)

            if not str(Path(path)).startswith(category.value):
                raise ValueError(f"Path {path} does not belong to category {category}")

            if full_path.exists():
                if full_path.is_dir():
                    shutil.rmtree(full_path)
                else:
                    full_path.unlink()
                logger.info(f"[StoragePlugin][Action] Deleted: {full_path}")
            else:
                raise FileNotFoundError(f"Item {path} not found")
    # [/DEF:delete_file:Function]

    # [DEF:get_file_path:Function]
    # @PURPOSE: Returns the absolute path of a file for download.
    # @PARAM: category (FileCategory) - The category.
    # @PARAM: path (str) - The relative path of the file.
    # @PRE: path must belong to the specified category and be a file.
    # @POST: Returns the absolute Path to the file.
    # @RETURN: Path - Absolute path to the file.
    def get_file_path(self, category: FileCategory, path: str) -> Path:
        with belief_scope("StoragePlugin:get_file_path"):
            root = self.get_storage_root()
            file_path = self.validate_path(root / path)

            if not str(Path(path)).startswith(category.value):
                raise ValueError(f"Path {path} does not belong to category {category}")

            if not file_path.exists() or file_path.is_dir():
                raise FileNotFoundError(f"File {path} not found")

            return file_path
    # [/DEF:get_file_path:Function]

# [/DEF:StoragePlugin:Class]
# [/DEF:StoragePlugin:Module]
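For orientation, a rough sketch of how an API layer might drive this plugin; the router, route paths, and parameter names are illustrative and not taken from this diff:

    from typing import Optional
    from fastapi import APIRouter, UploadFile

    from backend.src.models.storage import FileCategory
    from backend.src.plugins.storage import StoragePlugin

    router = APIRouter(prefix="/storage")  # hypothetical router
    plugin = StoragePlugin()  # __init__ ensures the category directories exist

    @router.post("/{category}")
    async def upload_file(category: str, file: UploadFile, subpath: Optional[str] = None):
        # FileCategory(category) raises ValueError for unknown category names.
        return await plugin.save_file(file, FileCategory(category), subpath)

    @router.get("/{category}")
    def list_category(category: str, subpath: Optional[str] = None):
        # Subpaths are checked by validate_path(), so the scan stays inside the storage root.
        return plugin.list_files(FileCategory(category), subpath)

Since save_file, delete_file, and get_file_path all route through validate_path, any path that resolves outside the configured storage root is rejected, which is the module's stated invariant.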
128 backend/src/schemas/auth.py Normal file
@@ -0,0 +1,128 @@
# [DEF:backend.src.schemas.auth:Module]
#
# @TIER: STANDARD
# @SEMANTICS: auth, schemas, pydantic, user, token
# @PURPOSE: Pydantic schemas for authentication requests and responses.
# @LAYER: API
# @RELATION: DEPENDS_ON -> pydantic
#
# @INVARIANT: Sensitive fields like password must not be included in response schemas.

# [SECTION: IMPORTS]
from typing import List, Optional
from pydantic import BaseModel, EmailStr
from datetime import datetime
# [/SECTION]

# [DEF:Token:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents a JWT access token response.
class Token(BaseModel):
    access_token: str
    token_type: str
# [/DEF:Token:Class]

# [DEF:TokenData:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents the data encoded in a JWT token.
class TokenData(BaseModel):
    username: Optional[str] = None
    scopes: List[str] = []
# [/DEF:TokenData:Class]

# [DEF:PermissionSchema:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents a permission in API responses.
class PermissionSchema(BaseModel):
    id: Optional[str] = None
    resource: str
    action: str

    class Config:
        from_attributes = True
# [/DEF:PermissionSchema:Class]

# [DEF:RoleSchema:Class]
# @PURPOSE: Represents a role in API responses.
class RoleSchema(BaseModel):
    id: str
    name: str
    description: Optional[str] = None
    permissions: List[PermissionSchema] = []

    class Config:
        from_attributes = True
# [/DEF:RoleSchema:Class]

# [DEF:RoleCreate:Class]
# @PURPOSE: Schema for creating a new role.
class RoleCreate(BaseModel):
    name: str
    description: Optional[str] = None
    permissions: List[str] = []  # List of permission IDs or "resource:action" strings
# [/DEF:RoleCreate:Class]

# [DEF:RoleUpdate:Class]
# @PURPOSE: Schema for updating an existing role.
class RoleUpdate(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    permissions: Optional[List[str]] = None
# [/DEF:RoleUpdate:Class]

# [DEF:ADGroupMappingSchema:Class]
# @PURPOSE: Represents an AD Group to Role mapping in API responses.
class ADGroupMappingSchema(BaseModel):
    id: str
    ad_group: str
    role_id: str

    class Config:
        from_attributes = True
# [/DEF:ADGroupMappingSchema:Class]

# [DEF:ADGroupMappingCreate:Class]
# @PURPOSE: Schema for creating an AD Group mapping.
class ADGroupMappingCreate(BaseModel):
    ad_group: str
    role_id: str
# [/DEF:ADGroupMappingCreate:Class]

# [DEF:UserBase:Class]
# @PURPOSE: Base schema for user data.
class UserBase(BaseModel):
    username: str
    email: Optional[EmailStr] = None
    is_active: bool = True
# [/DEF:UserBase:Class]

# [DEF:UserCreate:Class]
# @PURPOSE: Schema for creating a new user.
class UserCreate(UserBase):
    password: str
    roles: List[str] = []
# [/DEF:UserCreate:Class]

# [DEF:UserUpdate:Class]
# @PURPOSE: Schema for updating an existing user.
class UserUpdate(BaseModel):
    email: Optional[EmailStr] = None
    password: Optional[str] = None
    is_active: Optional[bool] = None
    roles: Optional[List[str]] = None
# [/DEF:UserUpdate:Class]

# [DEF:User:Class]
# @PURPOSE: Schema for user data in API responses.
class User(UserBase):
    id: str
    auth_source: str
    created_at: datetime
    last_login: Optional[datetime] = None
    roles: List[RoleSchema] = []

    class Config:
        from_attributes = True
# [/DEF:User:Class]

# [/DEF:backend.src.schemas.auth:Module]
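Because the response schemas set `from_attributes = True`, they can be hydrated directly from ORM rows. A minimal sketch, assuming Pydantic v2 (`model_validate` / `model_dump`); the SimpleNamespace stands in for a SQLAlchemy user row and all field values are made up:

    from datetime import datetime
    from types import SimpleNamespace

    from backend.src.schemas.auth import User, UserCreate

    # Request-body example for user creation (password is accepted on input only).
    payload = UserCreate(username="jdoe", password="s3cret", roles=["admin"])

    # Stand-in for an ORM row; from_attributes reads plain attributes the same way.
    db_user = SimpleNamespace(
        id="u-1", username="jdoe", email="jdoe@example.com", is_active=True,
        auth_source="local", created_at=datetime.utcnow(), last_login=None, roles=[],
    )
    api_user = User.model_validate(db_user)
    print(api_user.model_dump(exclude_none=True))  # User has no password field, so none can leak into the response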
Some files were not shown because too many files have changed in this diff.