Compare commits
18 Commits: 51e9ee3fcc...017-llm-an

0f16bab2b8
7de96c17c4
f018b97ed2
72846aa835
994c0c3e5d
252a8601a9
8044f85ea4
d4109e5a03
b2bbd73439
0e0e26e2f7
18b42f8dd0
e7b31accd6
d3c3a80ed2
cc244c2d86
d10c23e658
1042b35d1b
16ffeb1ed6
da34deac02
.gitignore (vendored, 7 changes)
@@ -66,7 +66,6 @@ backend/mappings.db
 backend/tasks.db
+backend/logs
-# Git Integration repositories
+backend/auth.db
-backend/git_repos/
+semantics/reports
-backend/backend/git_repos
@@ -29,6 +29,10 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
 - LocalStorage (for language preference) (013-unify-frontend-css)
 - Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit (Frontend) (014-file-storage-ui)
 - Local Filesystem (for artifacts), Config (for storage path) (014-file-storage-ui)
+- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend) (015-frontend-nav-redesign)
+- N/A (UI reorganization and API integration) (015-frontend-nav-redesign)
+- SQLite (`auth.db`) for Users, Roles, Permissions, and Mappings. (016-multi-user-auth)
+- SQLite (existing `tasks.db` for results, `auth.db` for permissions, `mappings.db` or new `plugins.db` for provider config/metadata) (017-llm-analysis-plugin)
 - Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)
@@ -49,9 +53,9 @@ cd src; pytest; ruff check .
 Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions

 ## Recent Changes
-- 014-file-storage-ui: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit (Frontend)
-- 013-unify-frontend-css: Added Node.js 18+ (Frontend Build), Svelte 5.x + SvelteKit, Tailwind CSS, `date-fns` (existing)
-- 011-git-integration-dashboard: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API
+- 017-llm-analysis-plugin: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
+- 016-multi-user-auth: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
+- 015-frontend-nav-redesign: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend)

 <!-- MANUAL ADDITIONS START -->
.kilocode/workflows/read_semantic.md (new file, 4 lines)
@@ -0,0 +1,4 @@
+---
+description: USE SEMANTIC
+---
+Read semantic_protocol.md. You MUST use it during development.
@@ -63,6 +63,7 @@ Load only the minimal necessary context from each artifact:
 **From constitution:**

 - Load `.specify/memory/constitution.md` for principle validation
+- Load `semantic_protocol.md` for technical standard validation

 ### 3. Build Semantic Models
@@ -1,5 +1,10 @@
 ---
 description: Execute the implementation plan by processing and executing all tasks defined in tasks.md
+handoffs:
+  - label: Verify Changes
+    agent: speckit.test
+    prompt: Verify the implementation of...
+    send: true
 ---

 ## User Input
@@ -46,6 +51,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 - Automatically proceed to step 3

 3. Load and analyze the implementation context:
+- **REQUIRED**: Read `semantic_protocol.md` for strict coding standards and contract requirements
 - **REQUIRED**: Read tasks.md for the complete task list and execution plan
 - **REQUIRED**: Read plan.md for tech stack, architecture, and file structure
 - **IF EXISTS**: Read data-model.md for entities and relationships
@@ -111,6 +117,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 - **Validation checkpoints**: Verify each phase completion before proceeding

 7. Implementation execution rules:
+- **Strict Adherence**: Apply `semantic_protocol.md` rules: every file must start with a [DEF] header, include @TIER, and define contracts
 - **Setup first**: Initialize project structure, dependencies, configuration
 - **Tests before code**: If tests are needed, write them for contracts, entities, and integration scenarios before implementing
 - **Core development**: Implement models, services, CLI commands, endpoints
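As a minimal sketch of the rule added above (the exact tag set and layout are defined by `semantic_protocol.md`; the module name and tag values here are illustrative, not taken from the repository), a compliant Python file might open like this:

```python
# [DEF:backend.plugins.example_plugin]
# @TIER: 2
# @PURPOSE: Illustrate the header-then-contracts-then-imports layout.
# @INVARIANT: normalize() never returns surrounding whitespace.

# Per the protocol, imports come only after the contract block.
import re


def normalize(raw: str) -> str:
    """Collapse internal whitespace and trim the ends.

    @PRE: raw is a str.
    @POST: result has no leading/trailing whitespace and no double spaces.
    """
    return re.sub(r"\s+", " ", raw).strip()
```

A validator can then check mechanically that the `[DEF:...]` line is first and that `@TIER`/`@PURPOSE` appear before the first import.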
@@ -22,7 +22,7 @@ You **MUST** consider the user input before proceeding (if not empty).

 1. **Setup**: Run `.specify/scripts/bash/setup-plan.sh --json` from repo root and parse JSON for FEATURE_SPEC, IMPL_PLAN, SPECS_DIR, BRANCH. For single quotes in args like "I'm Groot", use escape syntax: e.g. 'I'\''m Groot' (or double-quote if possible: "I'm Groot").

-2. **Load context**: Read FEATURE_SPEC and `.specify/memory/constitution.md`. Load IMPL_PLAN template (already copied).
+2. **Load context**: Read FEATURE_SPEC, `ux_reference.md`, `semantic_protocol.md`, and `.specify/memory/constitution.md`. Load IMPL_PLAN template (already copied).

 3. **Execute plan workflow**: Follow the structure in IMPL_PLAN template to:
 - Fill Technical Context (mark unknowns as "NEEDS CLARIFICATION")
@@ -64,12 +64,22 @@ You **MUST** consider the user input before proceeding (if not empty).

 **Prerequisites:** `research.md` complete

+0. **Validate Design against UX Reference**:
+- Check if the proposed architecture supports the latency, interactivity, and flow defined in `ux_reference.md`.
+- **CRITICAL**: If the technical plan requires compromising the UX defined in `ux_reference.md` (e.g. "We can't do real-time validation because X"), you **MUST STOP** and warn the user. Do not proceed until resolved.

 1. **Extract entities from feature spec** → `data-model.md`:
 - Entity name, fields, relationships
 - Validation rules from requirements
 - State transitions if applicable

-2. **Generate API contracts** from functional requirements:
+2. **Define Module & Function Contracts (Semantic Protocol)**:
+- **MANDATORY**: For every new module, define the [DEF] Header and Module-level Contract (@TIER, @PURPOSE, @INVARIANT) as per `semantic_protocol.md`.
+- **REQUIRED**: Define Function Contracts (@PRE, @POST) for critical logic.
+- Output specific contract definitions to `contracts/modules.md` or append to `data-model.md` to guide implementation.
+- Ensure strict adherence to `semantic_protocol.md` syntax.
+
+3. **Generate API contracts** from functional requirements:
 - For each user action → endpoint
 - Use standard REST/GraphQL patterns
 - Output OpenAPI/GraphQL schema to `/contracts/`
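One way to make @PRE/@POST contracts on critical logic executable, rather than comment-only, is a small decorator that checks them at runtime. This is a sketch under the assumption that contracts are plain predicates; `semantic_protocol.md` may prescribe a different mechanism:

```python
from functools import wraps


def contract(pre=None, post=None):
    """Wrap a function with runtime @PRE/@POST checks."""
    def decorate(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                # @PRE is checked against the incoming arguments.
                assert pre(*args, **kwargs), f"@PRE violated in {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                # @POST is checked against the returned value.
                assert post(result), f"@POST violated in {fn.__name__}"
            return result
        return wrapper
    return decorate


@contract(pre=lambda items: len(items) > 0, post=lambda r: r >= 0)
def average_latency(items):
    """@PRE: items is non-empty. @POST: result is non-negative."""
    return sum(items) / len(items)
```

Calling `average_latency([])` fails fast at the @PRE check instead of raising a ZeroDivisionError deep inside the body.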
@@ -70,7 +70,22 @@ Given that feature description, do this:

 3. Load `.specify/templates/spec-template.md` to understand required sections.

-4. Follow this execution flow:
+4. **Generate UX Reference**:
+
+   a. Load `.specify/templates/ux-reference-template.md`.
+
+   b. **Design the User Experience**:
+   - **Imagine you are the user**: Visualize the interface and interaction.
+   - **Persona**: Define who is using this.
+   - **Happy Path**: Write the story of the perfect interaction.
+   - **Mockups**: Create concrete CLI text blocks or UI descriptions.
+   - **Errors**: Define how the system guides the user out of failure.
+
+   c. Write the `ux_reference.md` file in the feature directory.
+
+   d. **CRITICAL**: This UX Reference is now the source of truth for the "feel" of the feature. The technical spec MUST support this experience.
+
+5. Follow this execution flow:

 1. Parse user description from Input
 If empty: ERROR "No feature description provided"
@@ -115,6 +130,12 @@ Given that feature description, do this:
 - [ ] Focused on user value and business needs
 - [ ] Written for non-technical stakeholders
 - [ ] All mandatory sections completed
+
+## UX Consistency
+
+- [ ] Functional requirements fully support the 'Happy Path' in ux_reference.md
+- [ ] Error handling requirements match the 'Error Experience' in ux_reference.md
+- [ ] No requirements contradict the defined User Persona or Context

 ## Requirement Completeness
@@ -190,7 +211,7 @@ Given that feature description, do this:

 d. **Update Checklist**: After each validation iteration, update the checklist file with current pass/fail status

-7. Report completion with branch name, spec file path, checklist results, and readiness for the next phase (`/speckit.clarify` or `/speckit.plan`).
+7. Report completion with branch name, spec file path, ux_reference file path, checklist results, and readiness for the next phase (`/speckit.clarify` or `/speckit.plan`).

 **NOTE:** The script creates and checks out the new branch and initializes the spec file before writing.
@@ -24,7 +24,7 @@ You **MUST** consider the user input before proceeding (if not empty).
 1. **Setup**: Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse FEATURE_DIR and AVAILABLE_DOCS list. All paths must be absolute. For single quotes in args like "I'm Groot", use escape syntax: e.g. 'I'\''m Groot' (or double-quote if possible: "I'm Groot").

 2. **Load design documents**: Read from FEATURE_DIR:
-- **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities)
+- **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities), ux_reference.md (experience source of truth)
 - **Optional**: data-model.md (entities), contracts/ (API endpoints), research.md (decisions), quickstart.md (test scenarios)
 - Note: Not all projects have all documents. Generate tasks based on what's available.
@@ -70,6 +70,12 @@ The tasks.md should be immediately executable - each task must be specific enough

 **Tests are OPTIONAL**: Only generate test tasks if explicitly requested in the feature specification or if user requests TDD approach.

+### UX Preservation (CRITICAL)
+
+- **Source of Truth**: `ux_reference.md` is the absolute standard for the "feel" of the feature.
+- **Violation Warning**: If any task would inherently violate the UX (e.g. "Remove progress bar to simplify code"), you **MUST** flag this to the user immediately.
+- **Verification Task**: You **MUST** add a specific task at the end of each User Story phase: `- [ ] Txxx [USx] Verify implementation matches ux_reference.md (Happy Path & Errors)`

 ### Checklist Format (REQUIRED)

 Every task MUST strictly follow this format:
.kilocode/workflows/speckit.test.md (new file, 66 lines)
@@ -0,0 +1,66 @@
+---
+description: Run semantic validation and functional tests for a specific feature, module, or file.
+handoffs:
+  - label: Fix Implementation
+    agent: speckit.implement
+    prompt: Fix the issues found during testing...
+    send: true
+---
+
+## User Input
+
+```text
+$ARGUMENTS
+```
+
+**Input format:** Can be a file path, a directory, or a feature name.
+
+## Outline
+
+1. **Context Analysis**:
+   - Determine the target scope (Backend vs Frontend vs Full Feature).
+   - Read `semantic_protocol.md` to load validation rules.
+
+2. **Phase 1: Semantic Static Analysis (The "Compiler" Check)**
+   - **Command:** Use `grep` or a script to verify Protocol compliance before running code.
+   - **Check:**
+     - Does the file start with a `[DEF:...]` header?
+     - Are `@TIER` and `@PURPOSE` defined?
+     - Are imports located *after* the contracts?
+     - Do functions marked "Critical" have `@PRE`/`@POST` tags?
+   - **Action:** If this phase fails, **STOP** and report "Semantic Compilation Failed". Do not run runtime tests.
+
+3. **Phase 2: Environment Prep**
+   - Detect project type:
+     - **Python**: Check if `.venv` is active.
+     - **Svelte**: Check if `node_modules` exists.
+   - **Command:** Run a linter (e.g., `ruff check`, `eslint`) to catch syntax errors immediately.
+
+4. **Phase 3: Test Execution (Runtime)**
+   - Select the test runner based on the file path:
+     - **Backend (`*.py`)**:
+       - Command: `pytest <path_to_test_file> -v`
+       - If no specific test file exists, try to find it by convention: `tests/test_<module_name>.py`.
+     - **Frontend (`*.svelte`, `*.ts`)**:
+       - Command: `npm run test -- <path_to_component>`
+   - **Verification**:
+     - Analyze output logs.
+     - If tests fail, summarize the failure (AssertionError, Timeout, etc.).
+
+5. **Phase 4: Contract Coverage Check (Manual/LLM verify)**
+   - Review the test cases executed.
+   - **Question**: Do the tests explicitly verify the `@POST` guarantees defined in the module header?
+   - **Report**: Mark as "Weak Coverage" if contracts exist but aren't tested.
+
+## Execution Rules
+
+- **Fail Fast**: If semantic headers are missing, don't waste time running pytest.
+- **No Silent Failures**: Always output the full error log if a command fails.
+- **Auto-Correction Hint**: If a test fails, suggest the specific `speckit.implement` command to fix it.
+
+## Example Commands
+
+- **Python**: `pytest backend/tests/test_auth.py`
+- **Svelte**: `npm run test:unit -- src/components/Button.svelte`
+- **Lint**: `ruff check backend/src/api/`
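Phase 1 of the workflow above amounts to a few textual checks that can be scripted instead of run as ad-hoc `grep` calls. A minimal sketch (the header conventions are illustrative; the authoritative rules live in `semantic_protocol.md`):

```python
import re

REQUIRED_TAGS = ("@TIER", "@PURPOSE")


def check_semantic_header(source: str) -> list[str]:
    """Return a list of protocol violations for one file's source text."""
    problems = []
    first_line = source.lstrip().splitlines()[0] if source.strip() else ""
    # The file must open with a [DEF:...] anchor.
    if not re.search(r"\[DEF:[^\]]+\]", first_line):
        problems.append("missing [DEF:...] header on first line")
    # Required tags must appear before the first import.
    header = source.split("import", 1)[0]
    for tag in REQUIRED_TAGS:
        if tag not in header:
            problems.append(f"missing {tag} in header")
    return problems
```

An empty result means the file passes the "compiler" check; any entries mean the run should stop before pytest, per the Fail Fast rule.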
@@ -13,20 +13,6 @@ customModes:
 - browser
 - mcp
 customInstructions: 1. Always begin by loading the relevant plan or task list from the `specs/` directory. 2. Do not assume a task is done just because it is checked; verify the code or functionality first if asked to audit. 3. When updating task lists, ensure you only mark items as complete if you have verified them.
-- slug: product-manager
-  name: Product Manager
-  description: Executes SpecKit workflows for feature management
-  roleDefinition: |-
-    You are Kilo Code, acting as a Product Manager. Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.
-    You act as the orchestrator for: - Specification (`speckit.specify`, `speckit.clarify`) - Planning (`speckit.plan`) - Task Management (`speckit.tasks`, `speckit.taskstoissues`) - Quality Assurance (`speckit.analyze`, `speckit.checklist`) - Governance (`speckit.constitution`) - Implementation Oversight (`speckit.implement`)
-    For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
-  whenToUse: Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
-  groups:
-  - read
-  - edit
-  - command
-  - mcp
-  customInstructions: 1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command. 2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
 - slug: semantic
   name: Semantic Agent
   roleDefinition: |-

@@ -43,3 +29,18 @@ customModes:
 - browser
 - mcp
 source: project
+- slug: product-manager
+  name: Product Manager
+  roleDefinition: |-
+    Your purpose is to rigorously execute the workflows defined in `.kilocode/workflows/`.
+    You act as the orchestrator for: - Specification (`speckit.specify`, `speckit.clarify`) - Planning (`speckit.plan`) - Task Management (`speckit.tasks`, `speckit.taskstoissues`) - Quality Assurance (`speckit.analyze`, `speckit.checklist`) - Governance (`speckit.constitution`) - Implementation Oversight (`speckit.implement`)
+    For each task, you must read the relevant workflow file from `.kilocode/workflows/` and follow its Execution Steps precisely.
+  whenToUse: Use this mode when you need to run any /speckit.* command or when dealing with high-level feature planning, specification writing, or project management tasks.
+  description: Executes SpecKit workflows for feature management
+  customInstructions: 1. Always read the specific workflow file in `.kilocode/workflows/` before executing a command. 2. Adhere strictly to the "Operating Constraints" and "Execution Steps" in the workflow files.
+  groups:
+  - read
+  - edit
+  - command
+  - mcp
+  source: project
@@ -1,11 +1,11 @@
 <!--
 SYNC IMPACT REPORT
-Version: 1.7.1 (Simplified Workflow)
+Version: 2.2.0 (ConfigManager Discipline)
 Changes:
-- Simplified Generation Workflow to a single phase: Code Generation from `tasks.md`.
-- Removed multi-phase Architecture/Implementation split to streamline development.
+- Updated Principle II: Added mandatory requirement for using `ConfigManager` (via dependency injection) for all configuration access to ensure consistent environment handling and avoid hardcoded values.
+- Updated Principle III: Refined `requestApi` requirement.
 Templates Status:
-- .specify/templates/plan-template.md: ✅ Aligned (Dynamic check).
+- .specify/templates/plan-template.md: ✅ Aligned.
 - .specify/templates/spec-template.md: ✅ Aligned.
 - .specify/templates/tasks-template.md: ✅ Aligned.
 -->
@@ -14,54 +14,42 @@ Templates Status:
 ## Core Principles

 ### I. Semantic Protocol Compliance
-The file `semantic_protocol.md` is the **authoritative technical standard** for this project. All code generation, refactoring, and architecture must strictly adhere to the standards, syntax, and workflows defined therein.
+The file `semantic_protocol.md` is the **sole and authoritative technical standard** for this project.
-- **Syntax**: `[DEF]` anchors, `@RELATION` tags, and metadata must match the Protocol specification.
+- **Law**: All code must adhere to the Axioms (Meaning First, Contract First, etc.) defined in the Protocol.
-- **Structure**: File layouts and headers must follow the "File Structure Standard".
+- **Syntax & Structure**: Anchors (`[DEF]`), Tags (`@KEY`), and File Structures must strictly match the Protocol.
-- **Workflow**: The technical steps for generating code must align with the Protocol.
+- **Compliance**: Any deviation from `semantic_protocol.md` constitutes a build failure.

-### II. Causal Validity (Contracts First)
+### II. Everything is a Plugin & Centralized Config
-As defined in the Protocol, Semantic definitions (Contracts) must ALWAYS precede implementation code. Logic is downstream of definition. We define the structure and constraints (`[DEF]`, `@PRE`, `@POST`) before writing the executable logic.
+All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`.
+- **Modularity**: Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`.
+- **Configuration Discipline**: All configuration access (environments, settings, paths) MUST use the `ConfigManager`. In the backend, the singleton instance MUST be obtained via dependency injection (`get_config_manager()`). Hardcoding environment IDs (e.g., "1") or paths is STRICTLY FORBIDDEN.
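The new Configuration Discipline can be sketched as follows. `ConfigManager` and `get_config_manager()` are named in the constitution text; their internals below (and the example keys) are hypothetical stand-ins for the real implementation:

```python
from functools import lru_cache


class ConfigManager:
    """Hypothetical stand-in: real settings would come from files or env."""

    def __init__(self, settings: dict):
        self._settings = settings

    def get(self, key: str, default=None):
        return self._settings.get(key, default)


@lru_cache(maxsize=1)
def get_config_manager() -> ConfigManager:
    """Singleton accessor, usable as a FastAPI dependency."""
    return ConfigManager({"storage_path": "/data/artifacts"})


def list_artifacts(config: ConfigManager) -> str:
    # The storage path is injected, never hardcoded in the handler.
    return config.get("storage_path")
```

In a FastAPI route this would typically appear as `config: ConfigManager = Depends(get_config_manager)`, so tests can override the dependency instead of patching globals.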

-### III. Immutability of Architecture
+### III. Unified Frontend Experience
-Architectural decisions in the Module Header (`@LAYER`, `@INVARIANT`, `@CONSTRAINT`) are treated as immutable constraints. Changes to these require an explicit refactoring step, not ad-hoc modification during implementation.
+To ensure a consistent and accessible user experience, all frontend implementations must strictly adhere to the unified design and localization standards.
+- **Component Reusability**: All UI elements MUST utilize the standardized Svelte component library (`src/lib/ui`) and centralized design tokens.
+- **Internationalization (i18n)**: All user-facing text MUST be extracted to the translation system (`src/lib/i18n`).
+- **Backend Communication**: All API requests MUST use the `requestApi` wrapper (or its derivatives like `fetchApi`, `postApi`) from `src/lib/api.js`. Direct use of the native `fetch` API for backend communication is FORBIDDEN, to ensure consistent authentication (JWT) and error handling.

-### IV. Design by Contract (DbC)
+### IV. Security & Access Control
-Contracts are the Source of Truth. Functions and Classes must define their purpose, specifications, and constraints in the metadata block before implementation, strictly following the **Contracts (Section IV)** standard in `semantic_protocol.md`.
+To support the Role-Based Access Control (RBAC) system, all functional components must define explicit permissions.
+- **Granular Permissions**: Every Plugin MUST define a unique permission string (e.g., `plugin:name:execute`) required for its operation.
+- **Registration**: These permissions MUST be registered in the system database (`auth.db`) during initialization.
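The permission requirement in Principle IV can be sketched like this. `PluginBase`, the permission-string scheme, and `auth.db` are from the text; the class body, the in-memory "database", and the method names are illustrative assumptions:

```python
class PermissionDenied(Exception):
    pass


class ExamplePlugin:
    """Hypothetical plugin; the real base class and registration API may differ."""

    # One unique permission string per plugin, following plugin:name:execute.
    permission = "plugin:example:execute"

    def register(self, auth_db: set) -> None:
        # In the real system this would be persisted to auth.db at init time.
        auth_db.add(self.permission)

    def execute(self, user_permissions: set) -> str:
        if self.permission not in user_permissions:
            raise PermissionDenied(self.permission)
        return "ok"
```

The check happens at the plugin boundary, so RBAC decisions never leak into individual task functions.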

-### V. Belief State Logging
+### V. Independent Testability
-Agents must maintain belief state logs for debugging and coherence checks, strictly following the **Logging Standard (Section V)** defined in `semantic_protocol.md`.
+Every feature specification MUST define "Independent Tests" that allow the feature to be verified in isolation.
+- **Decoupling**: Features should be designed such that they can be tested without requiring the full application state or external dependencies where possible.
+- **Verification**: A feature is not complete until its Independent Test scenarios pass.

-### VI. Fractal Complexity Limit
+### VI. Asynchronous Execution
-To maintain semantic coherence, code must adhere to the complexity limits (Module/Function size) defined in the **Fractal Complexity Limit (Section VI)** of `semantic_protocol.md`.
+All long-running or resource-intensive operations (migrations, analysis, backups, external API calls) MUST be executed as asynchronous tasks via the `TaskManager`.
+- **Non-Blocking**: HTTP API endpoints MUST NOT block on these operations; they should spawn a task and return a Task ID.
-### VII. Everything is a Plugin
+- **Observability**: Tasks MUST emit real-time status updates via the WebSocket infrastructure.
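Principle VI's spawn-and-return pattern can be sketched minimally. `TaskManager` is named in the constitution; everything else here (thread-based execution, the status dict, the migration stub) is an illustrative assumption, and the real manager would also push status over WebSocket:

```python
import threading
import time
import uuid


class TaskManager:
    """Minimal sketch: the real TaskManager also streams status via WebSocket."""

    def __init__(self):
        self.status = {}

    def spawn(self, fn, *args) -> str:
        task_id = str(uuid.uuid4())
        self.status[task_id] = "running"

        def run():
            fn(*args)
            self.status[task_id] = "done"

        threading.Thread(target=run, daemon=True).start()
        return task_id


def migrate():
    time.sleep(0.05)  # stand-in for a long-running migration


# An HTTP handler would return this ID immediately instead of blocking.
manager = TaskManager()
task_id = manager.spawn(migrate)
```

The endpoint stays non-blocking: clients poll (or subscribe) with the Task ID while the work proceeds in the background.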
-All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`. Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`, consistent logging, and modularity.
-
-## File Structure Standards
-Refer to **Section III (File Structure Standard)** in `semantic_protocol.md` for the authoritative definitions of:
-- Python Module Headers (`.py`)
-- Svelte Component Headers (`.svelte`)
-
-## Generation Workflow
-The development process follows a streamlined single-phase workflow:
-
-### 1. Code Generation Phase (Mode: `code`)
-**Input**: `tasks.md`
-**Responsibility**:
-- Select task from `tasks.md`.
-- Generate Scaffolding (`[DEF]` anchors, Headers, Contracts) AND Implementation in one pass.
-- Ensure strict adherence to Protocol Section IV (Contracts) and Section VII (Generation Workflow).
-- **Output**: Working code with passing tests.
-
-### 2. Validation
-If logic conflicts with Contract -> Stop -> Report Error.

 ## Governance
 This Constitution establishes the "Semantic Code Generation Protocol" as the supreme law of this repository.

-- **Authoritative Source**: `semantic_protocol.md` defines the specific implementation rules for these Principles.
+- **Authoritative Source**: `semantic_protocol.md` defines the specific implementation rules for Principle I.
-- **Automated Enforcement**: Tools must validate adherence to the `semantic_protocol.md` syntax.
 - **Amendments**: Changes to core principles require a Constitution amendment. Changes to technical syntax require a Protocol update.
 - **Compliance**: Failure to adhere to the Protocol constitutes a build failure.

-**Version**: 1.7.1 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-13
+**Version**: 2.2.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-29
@@ -1,6 +1,7 @@
 # Feature Specification: [FEATURE NAME]

 **Feature Branch**: `[###-feature-name]`
+**Reference UX**: `[ux_reference.md]` (See specific folder)
 **Created**: [DATE]
 **Status**: Draft
 **Input**: User description: "$ARGUMENTS"
@@ -1,35 +0,0 @@
----
-description: "Architecture task list template (Contracts & Scaffolding)"
----
-
-# Architecture Tasks: [FEATURE NAME]
-
-**Role**: Architect Agent
-**Goal**: Define the "What" and "Why" (Contracts, Scaffolding, Models) before implementation.
-**Input**: Design documents from `/specs/[###-feature-name]/`
-**Output**: Files with `[DEF]` anchors, `@PRE`/`@POST` contracts, and `@RELATION` mappings. No business logic.
-
-## Phase 1: Setup & Models
-
-- [ ] A001 Create/Update data models in [path] with `[DEF]` and contracts
-- [ ] A002 Define API route structure/contracts in [path]
-- [ ] A003 Define shared utilities/interfaces
-
-## Phase 2: User Story 1 - [Title]
-
-- [ ] A004 [US1] Define contracts for [Component/Service] in [path]
-- [ ] A005 [US1] Define contracts for [Endpoint] in [path]
-- [ ] A006 [US1] Define contracts for [Frontend Component] in [path]
-
-## Phase 3: User Story 2 - [Title]
-
-- [ ] A007 [US2] Define contracts for [Component/Service] in [path]
-- [ ] A008 [US2] Define contracts for [Endpoint] in [path]
-
-## Handover Checklist
-
-- [ ] All new files created with `[DEF]` anchors
-- [ ] All functions/classes have `@PURPOSE`, `@PRE`, `@POST` tags
-- [ ] No "naked code" (logic outside of anchors)
-- [ ] `tasks-dev.md` is ready for the Developer Agent
@@ -1,35 +0,0 @@
---
description: "Developer task list template (Implementation Logic)"
---

# Developer Tasks: [FEATURE NAME]

**Role**: Developer Agent
**Goal**: Implement the "How" (Logic, State, Error Handling) inside the defined contracts.
**Input**: `tasks-arch.md` (completed), Scaffolding files with `[DEF]` anchors.
**Output**: Working code that satisfies `@PRE`/`@POST` conditions.

## Phase 1: Setup & Models

- [ ] D001 Implement logic for [Model] in [path]
- [ ] D002 Implement logic for [API Route] in [path]
- [ ] D003 Implement shared utilities

## Phase 2: User Story 1 - [Title]

- [ ] D004 [US1] Implement logic for [Component/Service] in [path]
- [ ] D005 [US1] Implement logic for [Endpoint] in [path]
- [ ] D006 [US1] Implement logic for [Frontend Component] in [path]
- [ ] D007 [US1] Verify semantic compliance and belief state logging

## Phase 3: User Story 2 - [Title]

- [ ] D008 [US2] Implement logic for [Component/Service] in [path]
- [ ] D009 [US2] Implement logic for [Endpoint] in [path]

## Polish & Quality Assurance

- [ ] DXXX Verify all tests pass
- [ ] DXXX Check error handling and edge cases
- [ ] DXXX Ensure code style compliance
.specify/templates/ux-reference-template.md (new file, 67 lines)
@@ -0,0 +1,67 @@
# UX Reference: [FEATURE NAME]

**Feature Branch**: `[###-feature-name]`
**Created**: [DATE]
**Status**: Draft

## 1. User Persona & Context

* **Who is the user?**: [e.g. Junior Developer, System Administrator, End User]
* **What is their goal?**: [e.g. Quickly deploy a hotfix, Visualize complex data]
* **Context**: [e.g. Running a command in a terminal on a remote server, Browsing the dashboard on a mobile device]

## 2. The "Happy Path" Narrative

[Write a short story (3-5 sentences) describing the perfect interaction from the user's perspective. Focus on how it *feels* - is it instant? Does it guide them?]

## 3. Interface Mockups

### CLI Interaction (if applicable)

```bash
# User runs this command:
$ command --flag value

# System responds immediately with:
[ spinner ] specific loading message...

# Success output:
✅ Operation completed successfully in 1.2s
- Created file: /path/to/file
- Updated config: /path/to/config
```

### UI Layout & Flow (if applicable)

**Screen/Component**: [Name]

* **Layout**: [Description of structure, e.g., "Two-column layout, left sidebar navigation..."]
* **Key Elements**:
  * **[Button Name]**: Primary action. Color: Blue.
  * **[Input Field]**: Placeholder text: "Enter your name...". Validation: Real-time.
* **States**:
  * **Default**: Clean state, waiting for input.
  * **Loading**: Skeleton loader replaces content area.
  * **Success**: Toast notification appears top-right: "Saved!" (Green).

## 4. The "Error" Experience

**Philosophy**: Don't just report the error; guide the user to the fix.

### Scenario A: [Common Error, e.g. Invalid Input]

* **User Action**: Enters "123" in a text-only field.
* **System Response**:
  * (UI) Input border turns Red. Message below input: "Please enter text only."
  * (CLI) `❌ Error: Invalid input '123'. Expected text format.`
* **Recovery**: User can immediately re-type without refreshing/re-running.

### Scenario B: [System Failure, e.g. Network Timeout]

* **System Response**: "Unable to connect. Retrying in 3s... (Press C to cancel)"
* **Recovery**: Automatic retry or explicit "Retry Now" button.

## 5. Tone & Voice

* **Style**: [e.g. Concise, Technical, Friendly, Verbose]
* **Terminology**: [e.g. Use "Repository" not "Repo", "Directory" not "Folder"]
README.md (158 lines)
@@ -1,119 +1,77 @@
-# Superset automation tools
+# Superset automation tools (ss-tools)

 ## Overview
-This repository contains Python scripts and a library (`superset_tool`) for automating Apache Superset tasks, such as:
+**ss-tools** is a modern platform for automating and managing the Apache Superset ecosystem. The project has evolved from a set of CLI scripts into a full web application with a Backend (FastAPI) + Frontend (SvelteKit) architecture, providing a convenient interface for complex operations.

-- **Backup**: Export of all dashboards from a Superset instance to local storage.
-- **Migration**: Transfer and transformation of dashboards between different Superset environments (e.g. Development, Sandbox, Production).
+## Key features
+
+### 🚀 Dashboard migration and management
+- **Dashboard Grid**: Convenient view of all dashboards across all environments (Dev, Sandbox, Prod) in a single interface.
+- **Intelligent mapping**: Automatic and manual matching of datasets, tables, and schemas when transferring between environments.
+- **Dependency checks**: Validation that all required components exist before migration.
+
+### 📦 Backup
+- **Scheduler**: Automatic scheduled backups of dashboards and datasets.
+- **Storage**: Local artifact storage, manageable through the UI.
+
+### 🛠 Git integration
+- **Version Control**: Versioning of Superset assets.
+- **Git Dashboard**: Managing branches, commits, and deployment of changes directly from the interface.
+- **Conflict Resolution**: Built-in tools for resolving conflicts in YAML configurations.
+
+### 🤖 LLM analysis (AI Plugin)
+- **Automatic audit**: Analysis of dashboard state based on screenshots and metadata.
+- **Documentation generation**: Automatic description of datasets and columns using an LLM (OpenAI, OpenRouter, etc.).
+- **Smart Validation**: Detection of anomalies and errors in visualizations.
+
+### 🔐 Security and administration
+- **Multi-user Auth**: Multi-user access with a role-based model (RBAC).
+- **Connection management**: Centralized configuration of access to different Superset instances.
+- **Logging**: Detailed execution history of all background tasks.
+
+## Technology stack
+- **Backend**: Python 3.9+, FastAPI, SQLAlchemy, APScheduler, Pydantic.
+- **Frontend**: Node.js 18+, SvelteKit, Tailwind CSS.
+- **Database**: SQLite (for storing metadata, tasks, and access settings).

 ## Project structure
-- `backup_script.py`: Main script for scheduled backups of Superset dashboards.
-- `migration_script.py`: Main script for transferring specific dashboards between environments, including overriding database connections.
-- `search_script.py`: Script for searching data across all datasets available on the server.
-- `run_mapper.py`: CLI script for mapping dataset metadata.
-- `superset_tool/`:
-  - `client.py`: Python client for interacting with the Superset API.
-  - `exceptions.py`: Custom exception classes for structured error handling.
-  - `models.py`: Pydantic models for validating configuration data.
-  - `utils/`:
-    - `fileio.py`: Filesystem utilities (archive handling, YAML parsing).
-    - `logger.py`: Logger configuration for consistent logging across the project.
-    - `network.py`: HTTP client for network requests with authentication and retry handling.
-    - `init_clients.py`: Utility for initializing Superset clients for different environments.
-    - `dataset_mapper.py`: Dataset metadata mapping logic.
+- `backend/` — Server side, API, and plugin logic.
+- `frontend/` — Client side (SvelteKit application).
+- `specs/` — Feature specifications and implementation plans.
+- `docs/` — Additional documentation on mapping and plugin development.

-## Setup
+## Quick start

 ### Requirements
 - Python 3.9+
-- `pip` for package management.
-- `keyring` for secure password storage.
+- Node.js 18+
+- Configured access to the Superset API

-### Installation
-1. **Clone the repository:**
-   ```bash
-   git clone https://prod.gitlab.dwh.rusal.com/dwh_bi/superset-tools.git
-   cd superset-tools
-   ```
-2. **Install dependencies:**
-   ```bash
-   pip install -r requirements.txt
-   ```
-   (You may need to create `requirements.txt` with `pydantic`, `requests`, `keyring`, `PyYAML`, `urllib3`.)
-3. **Configure passwords:**
-   Use `keyring` to store the passwords of the Superset API users.
-   ```python
-   import keyring
-   keyring.set_password("system", "dev migrate", "migrate_user password")
-   keyring.set_password("system", "prod migrate", "migrate_user password")
-   keyring.set_password("system", "sandbox migrate", "migrate_user password")
-   ```
-
-## Usage
-
-### Running the project (Web UI)
-To start the backend and frontend servers with a single command:
+### Running
+To automatically set up the environments and start both servers (Backend & Frontend), use the script:

 ```bash
 ./run.sh
 ```
+*The script creates a Python virtual environment, installs the `pip` and `npm` dependencies, and starts the services.*

 Options:
-- `--skip-install`: Skip the dependency check and installation.
+- `--skip-install`: Skip dependency installation.
 - `--help`: Show help.

 Environment variables:
-- `BACKEND_PORT`: Backend port (default 8000).
-- `FRONTEND_PORT`: Frontend port (default 5173).
+- `BACKEND_PORT`: API port (default 8000).
+- `FRONTEND_PORT`: UI port (default 5173).

-### Backup script (`backup_script.py`)
-To back up dashboards from the configured Superset environments:
-```bash
-python backup_script.py
-```
-Backups are saved to `P:\Superset\010 Бекапы\` by default. Logs are stored in `P:\Superset\010 Бекапы\Logs`.
+## Development
+The project follows strict development rules:
+1. **Semantic Code Generation**: The `semantic_protocol.md` protocol is used to ensure code reliability.
+2. **Design by Contract (DbC)**: Preconditions and postconditions are defined for key functions.
+3. **Constitution**: The rules described in the project constitution in the `.specify/` folder are observed.

-### Migration script (`migration_script.py`)
-To transfer a specific dashboard:
-```bash
-python migration_script.py
-```
+### Useful commands
+- **Backend**: `cd backend && .venv/bin/python3 -m uvicorn src.app:app --reload`
+- **Frontend**: `cd frontend && npm run dev`
+- **Tests**: `cd backend && .venv/bin/pytest`

-### Search script (`search_script.py`)
-To search for text patterns in Superset dataset metadata:
-```bash
-python search_script.py
-```
-The script uses regular expressions to search dataset fields such as SQL queries. Search results are written to the log and to the console.
+## Contact and contributing
+To add new features or fix bugs, please read `docs/plugin_dev.md` and create a corresponding specification in `specs/`.

-### Metadata mapping script (`run_mapper.py`)
-To update dataset metadata (e.g. verbose names) in Superset:
-```bash
-python run_mapper.py --source <source_type> --dataset-id <dataset_id> [--table-name <table_name>] [--table-schema <table_schema>] [--excel-path <path_to_excel>] [--env <environment>]
-```
-If you use XLSX, the file must contain two columns: column_name | verbose_name.
-
-Parameters:
-- `--source`: Data source ('postgres', 'excel', or 'both').
-- `--dataset-id`: ID of the dataset to update.
-- `--table-name`: Table name for PostgreSQL.
-- `--table-schema`: Table schema for PostgreSQL.
-- `--excel-path`: Path to the Excel file.
-- `--env`: Superset environment ('dev', 'prod', etc.).
-
-Usage examples:
-```bash
-python run_mapper.py --source postgres --dataset-id 123 --table-name account_debt --table-schema dm_view --env dev
-python run_mapper.py --source=excel --dataset-id=286 --excel-path=H:\dev\ss-tools\286_map.xlsx --env=dev
-```
-
-## Logging
-Logs are written to a file in the `Logs` directory (e.g. `P:\Superset\010 Бекапы\Logs` for backups) and to the console. The default log level is `INFO`.
-
-## Development and contributing
-- Follow the **Semantic Code Generation Protocol** (see `semantic_protocol.md`):
-  - All definitions are wrapped in `[DEF]...[/DEF]`.
-  - Contracts (`@PRE`, `@POST`) are defined BEFORE implementation.
-  - Strict typing and immutability of architectural decisions.
-- Follow the project Constitution (`.specify/memory/constitution.md`).
-- Use `Pydantic` models for data validation.
-- Implement comprehensive error handling with custom exceptions.
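The Semantic Code Generation rules listed in the README diff above (`[DEF]` anchors, `@PRE`/`@POST` contracts defined before implementation) can be sketched as follows. This is a hedged illustration only: the function name and its contract are hypothetical, and the authoritative tag syntax lives in `semantic_protocol.md`, not here.

```python
# [DEF:normalize_dataset_name:Function]
# @PURPOSE: Normalizes a dataset name for cross-environment comparison.
# @PRE: name is a non-empty string.
# @POST: Returns the name lowercased with surrounding whitespace removed.
def normalize_dataset_name(name: str) -> str:
    if not name:
        # Violating @PRE is a contract error, not a recoverable state.
        raise ValueError("name must be a non-empty string")
    return name.strip().lower()
# [/DEF:normalize_dataset_name:Function]

print(normalize_dataset_name("  Sales_DM "))  # -> sales_dm
```

The point of the convention is that a validator can check mechanically that every function carries its contract tags before any logic is written.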
Submodule backend/backend/git_repos/12 deleted from f46772443a
backend/get_full_key.py (new file, 1 line)
@@ -0,0 +1 @@
+{"print(f'Length": {"else": "print('Provider not found')\ndb.close()"}}
backend/git_repos/12 (Submodule)
Submodule backend/git_repos/12 added at 57ab7e8679
backend/logs/app.log.1 (137813 lines)
File diff suppressed because it is too large
Binary file not shown.
@@ -25,9 +25,13 @@ keyring==25.7.0
 more-itertools==10.8.0
 pycparser==2.23
 pydantic==2.12.5
+pydantic-settings
 pydantic_core==2.41.5
 python-multipart==0.0.21
 PyYAML==6.0.3
+passlib[bcrypt]
+python-jose[cryptography]
+PyJWT
 RapidFuzz==3.14.3
 referencing==0.37.0
 requests==2.32.5
@@ -44,4 +48,9 @@ websockets==15.0.1
 pandas
 psycopg2-binary
 openpyxl
 GitPython==3.1.44
+itsdangerous
+email-validator
+openai
+playwright
+tenacity
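The new `passlib[bcrypt]`, `python-jose[cryptography]`, and `PyJWT` dependencies above support the multi-user auth feature: password hashing plus signed JWT session tokens. As a stdlib-only sketch of what those token libraries produce (not the project's implementation, and the secret and claims here are hypothetical), an HS256 JWT is just three base64url segments:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    # header.payload.signature — the same shape jose/PyJWT emit for HS256
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

token = sign_hs256({"sub": "admin"}, "change-me")
print(token.count("."))  # -> 2
```

In the real stack you would call `jwt.encode(...)`/`jwt.decode(...)` from one of the listed libraries rather than hand-rolling the signature; the sketch only shows the token structure being verified.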
@@ -1,59 +1,118 @@
--- removed (old module: ADFS placeholder)
# [DEF:AuthModule:Module]
# @SEMANTICS: auth, authentication, adfs, oauth, middleware
# @PURPOSE: Implements ADFS authentication using Authlib for FastAPI. It provides a dependency to protect endpoints.
# @LAYER: UI (API)
# @RELATION: Used by API routers to protect endpoints that require authentication.

from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2AuthorizationCodeBearer
from authlib.integrations.starlette_client import OAuth
from starlette.config import Config

# Placeholder for ADFS configuration. In a real app, this would come from a secure source.
# Create an in-memory .env file
from io import StringIO
config_data = StringIO("""
ADFS_CLIENT_ID=your-client-id
ADFS_CLIENT_SECRET=your-client-secret
ADFS_SERVER_METADATA_URL=https://your-adfs-server/.well-known/openid-configuration
""")
config = Config(config_data)
oauth = OAuth(config)

oauth.register(
    name='adfs',
    server_metadata_url=config('ADFS_SERVER_METADATA_URL'),
    client_kwargs={'scope': 'openid profile email'}
)

oauth2_scheme = OAuth2AuthorizationCodeBearer(
    authorizationUrl="https://your-adfs-server/adfs/oauth2/authorize",
    tokenUrl="https://your-adfs-server/adfs/oauth2/token",
)

# [DEF:get_current_user:Function]
# @PURPOSE: Dependency to get the current user from the ADFS token.
# @PARAM: token (str) - The OAuth2 bearer token.
# @PRE: token should be provided via Authorization header.
# @POST: Returns user details if authenticated, else raises 401.
# @RETURN: Dict[str, str] - User information.
async def get_current_user(token: str = Depends(oauth2_scheme)):
    """
    Dependency to get the current user from the ADFS token.
    This is a placeholder and needs to be fully implemented.
    """
    # In a real implementation, you would:
    # 1. Validate the token with ADFS.
    # 2. Fetch user information.
    # 3. Create a user object.
    # For now, we'll just check if a token exists.
    if not token:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Not authenticated",
            headers={"WWW-Authenticate": "Bearer"},
        )
    # A real implementation would return a user object.
    return {"placeholder_user": "user@example.com"}
# [/DEF:get_current_user:Function]
# [/DEF:AuthModule:Module]

+++ added (new module: auth API routes)
# [DEF:backend.src.api.auth:Module]
#
# @SEMANTICS: api, auth, routes, login, logout
# @PURPOSE: Authentication API endpoints.
# @LAYER: API
# @RELATION: USES -> backend.src.services.auth_service.AuthService
# @RELATION: USES -> backend.src.core.database.get_auth_db
#
# @INVARIANT: All auth endpoints must return consistent error codes.

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException, status
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.orm import Session
from ..core.database import get_auth_db
from ..services.auth_service import AuthService
from ..schemas.auth import Token, User as UserSchema
from ..dependencies import get_current_user
from ..core.auth.oauth import oauth, is_adfs_configured
from ..core.auth.logger import log_security_event
from ..core.logger import belief_scope
import starlette.requests
# [/SECTION]

# [DEF:router:Variable]
# @PURPOSE: APIRouter instance for authentication routes.
router = APIRouter(prefix="/api/auth", tags=["auth"])
# [/DEF:router:Variable]

# [DEF:login_for_access_token:Function]
# @PURPOSE: Authenticates a user and returns a JWT access token.
# @PRE: form_data contains username and password.
# @POST: Returns a Token object on success.
# @THROW: HTTPException 401 if authentication fails.
# @PARAM: form_data (OAuth2PasswordRequestForm) - Login credentials.
# @PARAM: db (Session) - Auth database session.
# @RETURN: Token - The generated JWT token.
@router.post("/login", response_model=Token)
async def login_for_access_token(
    form_data: OAuth2PasswordRequestForm = Depends(),
    db: Session = Depends(get_auth_db)
):
    with belief_scope("api.auth.login"):
        auth_service = AuthService(db)
        user = auth_service.authenticate_user(form_data.username, form_data.password)
        if not user:
            log_security_event("LOGIN_FAILED", form_data.username, {"reason": "Invalid credentials"})
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Incorrect username or password",
                headers={"WWW-Authenticate": "Bearer"},
            )
        log_security_event("LOGIN_SUCCESS", user.username, {"source": "LOCAL"})
        return auth_service.create_session(user)
# [/DEF:login_for_access_token:Function]

# [DEF:read_users_me:Function]
# @PURPOSE: Retrieves the profile of the currently authenticated user.
# @PRE: Valid JWT token provided.
# @POST: Returns the current user's data.
# @PARAM: current_user (UserSchema) - The user extracted from the token.
# @RETURN: UserSchema - The current user profile.
@router.get("/me", response_model=UserSchema)
async def read_users_me(current_user: UserSchema = Depends(get_current_user)):
    with belief_scope("api.auth.me"):
        return current_user
# [/DEF:read_users_me:Function]

# [DEF:logout:Function]
# @PURPOSE: Logs out the current user (placeholder for session revocation).
# @PRE: Valid JWT token provided.
# @POST: Returns success message.
@router.post("/logout")
async def logout(current_user: UserSchema = Depends(get_current_user)):
    with belief_scope("api.auth.logout"):
        log_security_event("LOGOUT", current_user.username)
        # In a stateless JWT setup, client-side token deletion is primary.
        # Server-side revocation (blacklisting) can be added here if needed.
        return {"message": "Successfully logged out"}
# [/DEF:logout:Function]

# [DEF:login_adfs:Function]
# @PURPOSE: Initiates the ADFS OIDC login flow.
# @POST: Redirects the user to ADFS.
@router.get("/login/adfs")
async def login_adfs(request: starlette.requests.Request):
    with belief_scope("api.auth.login_adfs"):
        if not is_adfs_configured():
            raise HTTPException(
                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
            )
        redirect_uri = request.url_for('auth_callback_adfs')
        return await oauth.adfs.authorize_redirect(request, str(redirect_uri))
# [/DEF:login_adfs:Function]

# [DEF:auth_callback_adfs:Function]
# @PURPOSE: Handles the callback from ADFS after successful authentication.
# @POST: Provisions user JIT and returns session token.
@router.get("/callback/adfs", name="auth_callback_adfs")
async def auth_callback_adfs(request: starlette.requests.Request, db: Session = Depends(get_auth_db)):
    with belief_scope("api.auth.callback_adfs"):
        if not is_adfs_configured():
            raise HTTPException(
                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
            )
        token = await oauth.adfs.authorize_access_token(request)
        user_info = token.get('userinfo')
        if not user_info:
            raise HTTPException(status_code=400, detail="Failed to retrieve user info from ADFS")

        auth_service = AuthService(db)
        user = auth_service.provision_adfs_user(user_info)
        return auth_service.create_session(user)
# [/DEF:auth_callback_adfs:Function]

# [/DEF:backend.src.api.auth:Module]
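The new `/api/auth/login` route above takes an `OAuth2PasswordRequestForm`, i.e. a form-encoded body with `username` and `password` fields, and returns a token that later requests present as a Bearer header. A minimal client-side sketch (host and credentials are hypothetical; no network call is made here):

```python
from urllib.parse import urlencode

# POST /api/auth/login expects application/x-www-form-urlencoded,
# matching FastAPI's OAuth2PasswordRequestForm.
form_body = urlencode({"username": "admin", "password": "secret"})
login_headers = {"Content-Type": "application/x-www-form-urlencoded"}

def bearer_header(access_token: str) -> dict:
    # Subsequent calls (e.g. GET /api/auth/me) authenticate with this header.
    return {"Authorization": f"Bearer {access_token}"}

print(form_body)                    # -> username=admin&password=secret
print(bearer_header("abc123"))
```

Sending JSON instead of a form body is the most common way to get an unexpected 422 from this style of endpoint.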
@@ -1 +1 @@
-from . import plugins, tasks, settings, connections, environments, mappings, migration, git, storage
+from . import plugins, tasks, settings, connections, environments, mappings, migration, git, storage, admin
backend/src/api/routes/admin.py (new file, 310 lines)
@@ -0,0 +1,310 @@
|
|||||||
|
# [DEF:backend.src.api.routes.admin:Module]
|
||||||
|
#
|
||||||
|
# @TIER: STANDARD
|
||||||
|
# @SEMANTICS: api, admin, users, roles, permissions
|
||||||
|
# @PURPOSE: Admin API endpoints for user and role management.
|
||||||
|
# @LAYER: API
|
||||||
|
# @RELATION: USES -> backend.src.core.auth.repository.AuthRepository
|
||||||
|
# @RELATION: USES -> backend.src.dependencies.has_permission
|
||||||
|
#
|
||||||
|
# @INVARIANT: All endpoints in this module require 'Admin' role or 'admin' scope.
|
||||||
|
|
||||||
|
# [SECTION: IMPORTS]
|
||||||
|
from typing import List
|
||||||
|
from fastapi import APIRouter, Depends, HTTPException, status
|
||||||
|
from sqlalchemy.orm import Session
|
||||||
|
from ...core.database import get_auth_db
|
||||||
|
from ...core.auth.repository import AuthRepository
|
||||||
|
from ...core.auth.security import get_password_hash
|
||||||
|
from ...schemas.auth import (
|
||||||
|
User as UserSchema, UserCreate, UserUpdate,
|
||||||
|
RoleSchema, RoleCreate, RoleUpdate, PermissionSchema,
|
||||||
|
ADGroupMappingSchema, ADGroupMappingCreate
|
||||||
|
)
|
||||||
|
from ...models.auth import User, Role, Permission, ADGroupMapping
|
||||||
|
from ...dependencies import has_permission, get_current_user
|
||||||
|
from ...core.logger import logger, belief_scope
|
||||||
|
# [/SECTION]
|
||||||
|
|
||||||
|
# [DEF:router:Variable]
|
||||||
|
# @PURPOSE: APIRouter instance for admin routes.
|
||||||
|
router = APIRouter(prefix="/api/admin", tags=["admin"])
|
||||||
|
# [/DEF:router:Variable]
|
||||||
|
|
||||||
|
# [DEF:list_users:Function]
|
||||||
|
# @PURPOSE: Lists all registered users.
|
||||||
|
# @PRE: Current user has 'Admin' role.
|
||||||
|
# @POST: Returns a list of UserSchema objects.
|
||||||
|
# @PARAM: db (Session) - Auth database session.
|
||||||
|
# @RETURN: List[UserSchema] - List of users.
|
||||||
|
@router.get("/users", response_model=List[UserSchema])
|
||||||
|
async def list_users(
|
||||||
|
db: Session = Depends(get_auth_db),
|
||||||
|
_ = Depends(has_permission("admin:users", "READ"))
|
||||||
|
):
|
||||||
|
with belief_scope("api.admin.list_users"):
|
||||||
|
users = db.query(User).all()
|
||||||
|
return users
|
||||||
|
# [/DEF:list_users:Function]
|
||||||
|
|
||||||
|
# [DEF:create_user:Function]
|
||||||
|
# @PURPOSE: Creates a new local user.
|
||||||
|
# @PRE: Current user has 'Admin' role.
|
||||||
|
# @POST: New user is created in the database.
|
||||||
|
# @PARAM: user_in (UserCreate) - New user data.
|
||||||
|
# @PARAM: db (Session) - Auth database session.
|
||||||
|
# @RETURN: UserSchema - The created user.
|
||||||
|
@router.post("/users", response_model=UserSchema, status_code=status.HTTP_201_CREATED)
|
||||||
|
async def create_user(
|
||||||
|
user_in: UserCreate,
|
||||||
|
db: Session = Depends(get_auth_db),
|
||||||
|
_ = Depends(has_permission("admin:users", "WRITE"))
|
||||||
|
):
|
||||||
|
with belief_scope("api.admin.create_user"):
|
||||||
|
repo = AuthRepository(db)
|
||||||
|
if repo.get_user_by_username(user_in.username):
|
||||||
|
raise HTTPException(status_code=400, detail="Username already exists")
|
||||||
|
|
||||||
|
new_user = User(
|
||||||
|
username=user_in.username,
|
||||||
|
email=user_in.email,
|
||||||
|
password_hash=get_password_hash(user_in.password),
|
||||||
|
auth_source="LOCAL",
|
||||||
|
is_active=user_in.is_active
|
||||||
|
)
|
||||||
|
|
||||||
|
for role_name in user_in.roles:
|
||||||
|
role = repo.get_role_by_name(role_name)
|
||||||
|
if role:
|
||||||
|
new_user.roles.append(role)
|
||||||
|
|
||||||
|
db.add(new_user)
|
||||||
|
db.commit()
|
||||||
|
db.refresh(new_user)
|
||||||
|
return new_user
|
||||||
|
# [/DEF:create_user:Function]
|
||||||
|
|
||||||
|
# [DEF:update_user:Function]
|
||||||
|
# @PURPOSE: Updates an existing user.
|
||||||
|
@router.put("/users/{user_id}", response_model=UserSchema)
|
||||||
|
async def update_user(
|
||||||
|
user_id: str,
|
||||||
|
user_in: UserUpdate,
|
||||||
|
db: Session = Depends(get_auth_db),
|
||||||
|
_ = Depends(has_permission("admin:users", "WRITE"))
|
||||||
|
):
|
||||||
|
with belief_scope("api.admin.update_user"):
|
||||||
|
repo = AuthRepository(db)
|
||||||
|
user = repo.get_user_by_id(user_id)
|
||||||
|
if not user:
|
||||||
|
raise HTTPException(status_code=404, detail="User not found")
|
||||||
|
|
||||||
|
if user_in.email is not None:
|
||||||
|
user.email = user_in.email
|
||||||
|
if user_in.is_active is not None:
|
||||||
|
user.is_active = user_in.is_active
|
||||||
|
if user_in.password is not None:
|
||||||
|
user.password_hash = get_password_hash(user_in.password)
|
||||||
|
|
||||||
|
if user_in.roles is not None:
|
||||||
|
user.roles = []
|
||||||
|
for role_name in user_in.roles:
|
||||||
|
role = repo.get_role_by_name(role_name)
|
||||||
|
if role:
|
||||||
|
user.roles.append(role)
|
||||||
|
|
||||||
|
db.commit()
|
||||||
|
db.refresh(user)
|
||||||
|
return user
|
||||||
|
# [/DEF:update_user:Function]
|
||||||
|
|
||||||
|
# [DEF:delete_user:Function]
# @PURPOSE: Deletes a user.
@router.delete("/users/{user_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_user(
    user_id: str,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:users", "WRITE"))
):
    with belief_scope("api.admin.delete_user"):
        logger.info(f"[DEBUG] Attempting to delete user context={{'user_id': '{user_id}'}}")
        repo = AuthRepository(db)
        user = repo.get_user_by_id(user_id)
        if not user:
            logger.warning(f"[DEBUG] User not found for deletion context={{'user_id': '{user_id}'}}")
            raise HTTPException(status_code=404, detail="User not found")

        logger.info(f"[DEBUG] Found user to delete context={{'username': '{user.username}'}}")
        db.delete(user)
        db.commit()
        logger.info(f"[DEBUG] Successfully deleted user context={{'user_id': '{user_id}'}}")
        return None
# [/DEF:delete_user:Function]

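The `[DEBUG]` log calls above double the braces inside the f-string so the rendered message carries a literal, dict-style context while only the identifier is interpolated. A minimal illustration of that formatting:

```python
# Doubled braces {{ }} render as literal braces in an f-string;
# only {user_id} is substituted.
user_id = "42"
msg = f"[DEBUG] Attempting to delete user context={{'user_id': '{user_id}'}}"
print(msg)  # [DEBUG] Attempting to delete user context={'user_id': '42'}
```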
# [DEF:list_roles:Function]
# @PURPOSE: Lists all available roles.
# @RETURN: List[RoleSchema] - List of roles.
# @RELATION: CALLS -> backend.src.models.auth.Role
@router.get("/roles", response_model=List[RoleSchema])
async def list_roles(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "READ"))
):
    with belief_scope("api.admin.list_roles"):
        return db.query(Role).all()
# [/DEF:list_roles:Function]

# [DEF:create_role:Function]
# @PURPOSE: Creates a new system role with associated permissions.
# @PRE: Role name must be unique.
# @POST: New Role record is created in auth.db.
# @PARAM: role_in (RoleCreate) - New role data.
# @PARAM: db (Session) - Auth database session.
# @RETURN: RoleSchema - The created role.
# @SIDE_EFFECT: Commits new role and associations to auth.db.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_permission_by_id
@router.post("/roles", response_model=RoleSchema, status_code=status.HTTP_201_CREATED)
async def create_role(
    role_in: RoleCreate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.create_role"):
        if db.query(Role).filter(Role.name == role_in.name).first():
            raise HTTPException(status_code=400, detail="Role already exists")

        new_role = Role(name=role_in.name, description=role_in.description)
        repo = AuthRepository(db)

        for perm_id_or_str in role_in.permissions:
            perm = repo.get_permission_by_id(perm_id_or_str)
            if not perm and ":" in perm_id_or_str:
                res, act = perm_id_or_str.split(":", 1)
                perm = repo.get_permission_by_resource_action(res, act)

            if perm:
                new_role.permissions.append(perm)

        db.add(new_role)
        db.commit()
        db.refresh(new_role)
        return new_role
# [/DEF:create_role:Function]

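`create_role` accepts either a permission primary key or a `resource:action` string, falling back to a single split on the first colon. That split behaviour in isolation (the helper name here is purely illustrative, not part of the repository API):

```python
# Hypothetical helper mirroring the fallback in create_role/update_role:
# a key without ':' is treated as a primary-key lookup; otherwise split
# once, so any further colons stay in the action part.
def parse_permission_key(perm_id_or_str: str):
    if ":" not in perm_id_or_str:
        return None  # looked up as a primary key instead
    res, act = perm_id_or_str.split(":", 1)
    return res, act

print(parse_permission_key("admin:roles"))  # ('admin', 'roles')
print(parse_permission_key("abc123"))       # None
```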
# [DEF:update_role:Function]
# @PURPOSE: Updates an existing role's metadata and permissions.
# @PRE: role_id must be a valid existing role UUID.
# @POST: Role record is updated in auth.db.
# @PARAM: role_id (str) - Target role identifier.
# @PARAM: role_in (RoleUpdate) - Updated role data.
# @PARAM: db (Session) - Auth database session.
# @RETURN: RoleSchema - The updated role.
# @SIDE_EFFECT: Commits updates to auth.db.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
@router.put("/roles/{role_id}", response_model=RoleSchema)
async def update_role(
    role_id: str,
    role_in: RoleUpdate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.update_role"):
        repo = AuthRepository(db)
        role = repo.get_role_by_id(role_id)
        if not role:
            raise HTTPException(status_code=404, detail="Role not found")

        if role_in.name is not None:
            role.name = role_in.name
        if role_in.description is not None:
            role.description = role_in.description

        if role_in.permissions is not None:
            role.permissions = []
            for perm_id_or_str in role_in.permissions:
                perm = repo.get_permission_by_id(perm_id_or_str)
                if not perm and ":" in perm_id_or_str:
                    res, act = perm_id_or_str.split(":", 1)
                    perm = repo.get_permission_by_resource_action(res, act)

                if perm:
                    role.permissions.append(perm)

        db.commit()
        db.refresh(role)
        return role
# [/DEF:update_role:Function]

# [DEF:delete_role:Function]
# @PURPOSE: Removes a role from the system.
# @PRE: role_id must be a valid existing role UUID.
# @POST: Role record is removed from auth.db.
# @PARAM: role_id (str) - Target role identifier.
# @PARAM: db (Session) - Auth database session.
# @RETURN: None
# @SIDE_EFFECT: Deletes record from auth.db and commits.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
@router.delete("/roles/{role_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_role(
    role_id: str,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.delete_role"):
        repo = AuthRepository(db)
        role = repo.get_role_by_id(role_id)
        if not role:
            raise HTTPException(status_code=404, detail="Role not found")

        db.delete(role)
        db.commit()
        return None
# [/DEF:delete_role:Function]

# [DEF:list_permissions:Function]
# @PURPOSE: Lists all available system permissions for assignment.
# @POST: Returns a list of all PermissionSchema objects.
# @PARAM: db (Session) - Auth database session.
# @RETURN: List[PermissionSchema] - List of permissions.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.list_permissions
@router.get("/permissions", response_model=List[PermissionSchema])
async def list_permissions(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "READ"))
):
    with belief_scope("api.admin.list_permissions"):
        repo = AuthRepository(db)
        return repo.list_permissions()
# [/DEF:list_permissions:Function]

# [DEF:list_ad_mappings:Function]
# @PURPOSE: Lists all AD Group to Role mappings.
@router.get("/ad-mappings", response_model=List[ADGroupMappingSchema])
async def list_ad_mappings(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("api.admin.list_ad_mappings"):
        return db.query(ADGroupMapping).all()
# [/DEF:list_ad_mappings:Function]

# [DEF:create_ad_mapping:Function]
# @PURPOSE: Creates a new AD Group mapping.
@router.post("/ad-mappings", response_model=ADGroupMappingSchema)
async def create_ad_mapping(
    mapping_in: ADGroupMappingCreate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("api.admin.create_ad_mapping"):
        new_mapping = ADGroupMapping(
            ad_group=mapping_in.ad_group,
            role_id=mapping_in.role_id
        )
        db.add(new_mapping)
        db.commit()
        db.refresh(new_mapping)
        return new_mapping
# [/DEF:create_ad_mapping:Function]

# [/DEF:backend.src.api.routes.admin:Module]
@@ -11,7 +11,7 @@
 # [SECTION: IMPORTS]
 from fastapi import APIRouter, Depends, HTTPException
 from typing import List, Dict, Optional
-from ...dependencies import get_config_manager, get_scheduler_service
+from ...dependencies import get_config_manager, get_scheduler_service, has_permission
 from ...core.superset_client import SupersetClient
 from pydantic import BaseModel, Field
 from ...core.config_models import Environment as EnvModel
@@ -23,7 +23,7 @@ router = APIRouter()
 # [DEF:ScheduleSchema:DataClass]
 class ScheduleSchema(BaseModel):
     enabled: bool = False
-    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
+    cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){4,6})$')
 # [/DEF:ScheduleSchema:DataClass]

 # [DEF:EnvironmentResponse:DataClass]
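The hunk above relaxes the field quantifier from `{5,7}` to `{4,6}`. A quick check of just the field portion of the new pattern (the `@`-alias branch is omitted here for brevity):

```python
import re

# Field portion of the updated cron pattern: 4-6 repetitions of
# "list, range/step, number, or wildcard", each with an optional space.
CRON_FIELDS = r"(((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){4,6}"

print(bool(re.fullmatch(CRON_FIELDS, "0 0 * * *")))    # True  (classic 5 fields)
print(bool(re.fullmatch(CRON_FIELDS, "0 0 0 * * *")))  # True  (6 fields, with seconds)
print(bool(re.fullmatch(CRON_FIELDS, "every day")))    # False
```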
@@ -47,7 +47,10 @@ class DatabaseResponse(BaseModel):
 # @POST: Returns a list of EnvironmentResponse objects.
 # @RETURN: List[EnvironmentResponse]
 @router.get("", response_model=List[EnvironmentResponse])
-async def get_environments(config_manager=Depends(get_config_manager)):
+async def get_environments(
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("environments", "READ"))
+):
     with belief_scope("get_environments"):
         envs = config_manager.get_environments()
         # Ensure envs is a list
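`has_permission` comes from `src.dependencies` and its body is not part of this diff. The dependency-factory shape these routes rely on can be sketched as below; the permission model (a set of `(resource, action)` pairs on the user) is an assumption for illustration only, and the real dependency would raise an `HTTPException` (403) for the request's current user rather than a plain exception.

```python
# Sketch of a FastAPI-style permission dependency factory, assuming
# user.permissions is a set of (resource, action) tuples. The actual
# has_permission in src.dependencies may differ.
class Forbidden(Exception):
    pass

def has_permission(resource: str, action: str):
    def checker(user):
        if (resource, action) not in user.permissions:
            raise Forbidden(f"{resource}:{action} denied")
        return user
    return checker

class FakeUser:
    def __init__(self, permissions):
        self.permissions = permissions

reader = has_permission("environments", "READ")
user = FakeUser({("environments", "READ")})
print(reader(user) is user)  # True
```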
@@ -77,7 +80,8 @@ async def update_environment_schedule(
     id: str,
     schedule: ScheduleSchema,
     config_manager=Depends(get_config_manager),
-    scheduler_service=Depends(get_scheduler_service)
+    scheduler_service=Depends(get_scheduler_service),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
 ):
     with belief_scope("update_environment_schedule", f"id={id}"):
         envs = config_manager.get_environments()
@@ -104,7 +108,11 @@ async def update_environment_schedule(
 # @PARAM: id (str) - The environment ID.
 # @RETURN: List[Dict] - List of databases.
 @router.get("/{id}/databases")
-async def get_environment_databases(id: str, config_manager=Depends(get_config_manager)):
+async def get_environment_databases(
+    id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_environment_databases", f"id={id}"):
         envs = config_manager.get_environments()
         env = next((e for e in envs if e.id == id), None)
@@ -13,7 +13,7 @@ from fastapi import APIRouter, Depends, HTTPException
 from sqlalchemy.orm import Session
 from typing import List, Optional
 import typing
-from src.dependencies import get_config_manager
+from src.dependencies import get_config_manager, has_permission
 from src.core.database import get_db
 from src.models.git import GitServerConfig, GitStatus, DeploymentEnvironment, GitRepository
 from src.api.routes.git_schemas import (
@@ -34,7 +34,10 @@ git_service = GitService()
 # @POST: Returns a list of all GitServerConfig objects from the database.
 # @RETURN: List[GitServerConfigSchema]
 @router.get("/config", response_model=List[GitServerConfigSchema])
-async def get_git_configs(db: Session = Depends(get_db)):
+async def get_git_configs(
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_git_configs"):
         return db.query(GitServerConfig).all()
 # [/DEF:get_git_configs:Function]
@@ -46,7 +49,11 @@ async def get_git_configs(db: Session = Depends(get_db)):
 # @PARAM: config (GitServerConfigCreate)
 # @RETURN: GitServerConfigSchema
 @router.post("/config", response_model=GitServerConfigSchema)
-async def create_git_config(config: GitServerConfigCreate, db: Session = Depends(get_db)):
+async def create_git_config(
+    config: GitServerConfigCreate,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
     with belief_scope("create_git_config"):
         db_config = GitServerConfig(**config.dict())
         db.add(db_config)
@@ -61,7 +68,11 @@ async def create_git_config(config: GitServerConfigCreate, db: Session = Depends
 # @POST: The configuration record is removed from the database.
 # @PARAM: config_id (str)
 @router.delete("/config/{config_id}")
-async def delete_git_config(config_id: str, db: Session = Depends(get_db)):
+async def delete_git_config(
+    config_id: str,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
     with belief_scope("delete_git_config"):
         db_config = db.query(GitServerConfig).filter(GitServerConfig.id == config_id).first()
         if not db_config:
@@ -78,7 +89,10 @@ async def delete_git_config(config_id: str, db: Session = Depends(get_db)):
 # @POST: Returns success if the connection is validated via GitService.
 # @PARAM: config (GitServerConfigCreate)
 @router.post("/config/test")
-async def test_git_config(config: GitServerConfigCreate):
+async def test_git_config(
+    config: GitServerConfigCreate,
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("test_git_config"):
         success = await git_service.test_connection(config.provider, config.url, config.pat)
         if success:
@@ -94,7 +108,12 @@ async def test_git_config(config: GitServerConfigCreate):
 # @PARAM: dashboard_id (int)
 # @PARAM: init_data (RepoInitRequest)
 @router.post("/repositories/{dashboard_id}/init")
-async def init_repository(dashboard_id: int, init_data: RepoInitRequest, db: Session = Depends(get_db)):
+async def init_repository(
+    dashboard_id: int,
+    init_data: RepoInitRequest,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("init_repository"):
         # 1. Get config
         config = db.query(GitServerConfig).filter(GitServerConfig.id == init_data.config_id).first()
@@ -138,7 +157,10 @@ async def init_repository(dashboard_id: int, init_data: RepoInitRequest, db: Ses
 # @PARAM: dashboard_id (int)
 # @RETURN: List[BranchSchema]
 @router.get("/repositories/{dashboard_id}/branches", response_model=List[BranchSchema])
-async def get_branches(dashboard_id: int):
+async def get_branches(
+    dashboard_id: int,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("get_branches"):
         try:
             return git_service.list_branches(dashboard_id)
@@ -153,7 +175,11 @@ async def get_branches(dashboard_id: int):
 # @PARAM: dashboard_id (int)
 # @PARAM: branch_data (BranchCreate)
 @router.post("/repositories/{dashboard_id}/branches")
-async def create_branch(dashboard_id: int, branch_data: BranchCreate):
+async def create_branch(
+    dashboard_id: int,
+    branch_data: BranchCreate,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("create_branch"):
         try:
             git_service.create_branch(dashboard_id, branch_data.name, branch_data.from_branch)
@@ -169,7 +195,11 @@ async def create_branch(dashboard_id: int, branch_data: BranchCreate):
 # @PARAM: dashboard_id (int)
 # @PARAM: checkout_data (BranchCheckout)
 @router.post("/repositories/{dashboard_id}/checkout")
-async def checkout_branch(dashboard_id: int, checkout_data: BranchCheckout):
+async def checkout_branch(
+    dashboard_id: int,
+    checkout_data: BranchCheckout,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("checkout_branch"):
         try:
             git_service.checkout_branch(dashboard_id, checkout_data.name)
@@ -185,7 +215,11 @@ async def checkout_branch(dashboard_id: int, checkout_data: BranchCheckout):
 # @PARAM: dashboard_id (int)
 # @PARAM: commit_data (CommitCreate)
 @router.post("/repositories/{dashboard_id}/commit")
-async def commit_changes(dashboard_id: int, commit_data: CommitCreate):
+async def commit_changes(
+    dashboard_id: int,
+    commit_data: CommitCreate,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("commit_changes"):
         try:
             git_service.commit_changes(dashboard_id, commit_data.message, commit_data.files)
@@ -200,7 +234,10 @@ async def commit_changes(dashboard_id: int, commit_data: CommitCreate):
 # @POST: Local commits are pushed to the remote repository.
 # @PARAM: dashboard_id (int)
 @router.post("/repositories/{dashboard_id}/push")
-async def push_changes(dashboard_id: int):
+async def push_changes(
+    dashboard_id: int,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("push_changes"):
         try:
             git_service.push_changes(dashboard_id)
@@ -215,7 +252,10 @@ async def push_changes(dashboard_id: int):
 # @POST: Remote changes are fetched and merged into the local branch.
 # @PARAM: dashboard_id (int)
 @router.post("/repositories/{dashboard_id}/pull")
-async def pull_changes(dashboard_id: int):
+async def pull_changes(
+    dashboard_id: int,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("pull_changes"):
         try:
             git_service.pull_changes(dashboard_id)
@@ -231,7 +271,11 @@ async def pull_changes(dashboard_id: int):
 # @PARAM: dashboard_id (int)
 # @PARAM: source_env_id (Optional[str])
 @router.post("/repositories/{dashboard_id}/sync")
-async def sync_dashboard(dashboard_id: int, source_env_id: typing.Optional[str] = None):
+async def sync_dashboard(
+    dashboard_id: int,
+    source_env_id: typing.Optional[str] = None,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("sync_dashboard"):
         try:
             from src.plugins.git_plugin import GitPlugin
@@ -251,7 +295,10 @@ async def sync_dashboard(dashboard_id: int, source_env_id: typing.Optional[str]
 # @POST: Returns a list of DeploymentEnvironmentSchema objects.
 # @RETURN: List[DeploymentEnvironmentSchema]
 @router.get("/environments", response_model=List[DeploymentEnvironmentSchema])
-async def get_environments(config_manager=Depends(get_config_manager)):
+async def get_environments(
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("environments", "READ"))
+):
     with belief_scope("get_environments"):
         envs = config_manager.get_environments()
         return [
@@ -271,7 +318,11 @@ async def get_environments(config_manager=Depends(get_config_manager)):
 # @PARAM: dashboard_id (int)
 # @PARAM: deploy_data (DeployRequest)
 @router.post("/repositories/{dashboard_id}/deploy")
-async def deploy_dashboard(dashboard_id: int, deploy_data: DeployRequest):
+async def deploy_dashboard(
+    dashboard_id: int,
+    deploy_data: DeployRequest,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("deploy_dashboard"):
         try:
             from src.plugins.git_plugin import GitPlugin
@@ -293,7 +344,11 @@ async def deploy_dashboard(dashboard_id: int, deploy_data: DeployRequest):
 # @PARAM: limit (int)
 # @RETURN: List[CommitSchema]
 @router.get("/repositories/{dashboard_id}/history", response_model=List[CommitSchema])
-async def get_history(dashboard_id: int, limit: int = 50):
+async def get_history(
+    dashboard_id: int,
+    limit: int = 50,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("get_history"):
         try:
             return git_service.get_commit_history(dashboard_id, limit)
@@ -308,7 +363,10 @@ async def get_history(dashboard_id: int, limit: int = 50):
 # @PARAM: dashboard_id (int)
 # @RETURN: dict
 @router.get("/repositories/{dashboard_id}/status")
-async def get_repository_status(dashboard_id: int):
+async def get_repository_status(
+    dashboard_id: int,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("get_repository_status"):
         try:
             return git_service.get_status(dashboard_id)
@@ -325,7 +383,12 @@ async def get_repository_status(dashboard_id: int):
 # @PARAM: staged (bool)
 # @RETURN: str
 @router.get("/repositories/{dashboard_id}/diff")
-async def get_repository_diff(dashboard_id: int, file_path: Optional[str] = None, staged: bool = False):
+async def get_repository_diff(
+    dashboard_id: int,
+    file_path: Optional[str] = None,
+    staged: bool = False,
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
     with belief_scope("get_repository_diff"):
         try:
             diff_text = git_service.get_diff(dashboard_id, file_path, staged)
@@ -334,4 +397,59 @@ async def get_repository_diff(dashboard_id: int, file_path: Optional[str] = None
             raise HTTPException(status_code=400, detail=str(e))
 # [/DEF:get_repository_diff:Function]

+# [DEF:generate_commit_message:Function]
+# @PURPOSE: Generate a suggested commit message using LLM.
+# @PRE: Repository for `dashboard_id` is initialized.
+# @POST: Returns a suggested commit message string.
+@router.post("/repositories/{dashboard_id}/generate-message")
+async def generate_commit_message(
+    dashboard_id: int,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:git", "EXECUTE"))
+):
+    with belief_scope("generate_commit_message"):
+        try:
+            # 1. Get Diff
+            diff = git_service.get_diff(dashboard_id, staged=True)
+            if not diff:
+                diff = git_service.get_diff(dashboard_id, staged=False)
+
+            if not diff:
+                return {"message": "No changes detected"}
+
+            # 2. Get History
+            history_objs = git_service.get_commit_history(dashboard_id, limit=5)
+            history = [h.message for h in history_objs if hasattr(h, 'message')]
+
+            # 3. Get LLM Client
+            from ...services.llm_provider import LLMProviderService
+            from ...plugins.llm_analysis.service import LLMClient
+            from ...plugins.llm_analysis.models import LLMProviderType
+
+            llm_service = LLMProviderService(db)
+            providers = llm_service.get_all_providers()
+            provider = next((p for p in providers if p.is_active), None)
+
+            if not provider:
+                raise HTTPException(status_code=400, detail="No active LLM provider found")
+
+            api_key = llm_service.get_decrypted_api_key(provider.id)
+            client = LLMClient(
+                provider_type=LLMProviderType(provider.provider_type),
+                api_key=api_key,
+                base_url=provider.base_url,
+                default_model=provider.default_model
+            )
+
+            # 4. Generate Message
+            from ...plugins.git.llm_extension import GitLLMExtension
+            extension = GitLLMExtension(client)
+            message = await extension.suggest_commit_message(diff, history)
+
+            return {"message": message}
+        except Exception as e:
+            logger.error(f"Failed to generate commit message: {e}")
+            raise HTTPException(status_code=400, detail=str(e))
+# [/DEF:generate_commit_message:Function]
+
 # [/DEF:backend.src.api.routes.git:Module]
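`GitLLMExtension.suggest_commit_message` is imported from `src.plugins.git.llm_extension`, but its body is not in this diff. Purely as an illustration of what such an extension might do with the `diff` and `history` arguments, here is a prompt-assembly sketch; every name below is an assumption about the unseen module:

```python
# Illustrative only: assemble an LLM prompt from a diff plus recent
# commit messages. The real GitLLMExtension may build its prompt
# differently and will also invoke the configured LLM client.
def build_commit_prompt(diff: str, history: list) -> str:
    recent = "\n".join(f"- {m}" for m in history[:5])
    return (
        "Suggest a concise commit message for the following diff.\n"
        f"Recent commit messages for style reference:\n{recent}\n"
        f"Diff:\n{diff}"
    )

prompt = build_commit_prompt("+ added auth check", ["fix: typo", "feat: git sync"])
print("fix: typo" in prompt)  # True
```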
@@ -1,5 +1,6 @@
 # [DEF:backend.src.api.routes.git_schemas:Module]
 #
+# @TIER: STANDARD
 # @SEMANTICS: git, schemas, pydantic, api, contracts
 # @PURPOSE: Defines Pydantic models for the Git integration API layer.
 # @LAYER: API
@@ -14,6 +15,7 @@ from uuid import UUID
 from src.models.git import GitProvider, GitStatus, SyncStatus

 # [DEF:GitServerConfigBase:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Base schema for Git server configuration attributes.
 class GitServerConfigBase(BaseModel):
     name: str = Field(..., description="Display name for the Git server")
backend/src/api/routes/llm.py (new file, 207 lines)
@@ -0,0 +1,207 @@
# [DEF:backend/src/api/routes/llm.py:Module]
# @TIER: STANDARD
# @SEMANTICS: api, routes, llm
# @PURPOSE: API routes for LLM provider configuration and management.
# @LAYER: UI (API)

from fastapi import APIRouter, Depends, HTTPException, status
from typing import List
from ...core.logger import logger
from ...schemas.auth import User
from ...dependencies import get_current_user as get_current_active_user
from ...plugins.llm_analysis.models import LLMProviderConfig, LLMProviderType
from ...services.llm_provider import LLMProviderService
from ...core.database import get_db
from sqlalchemy.orm import Session

# [DEF:router:Global]
# @PURPOSE: APIRouter instance for LLM routes.
router = APIRouter(prefix="/api/llm", tags=["LLM"])
# [/DEF:router:Global]

# [DEF:get_providers:Function]
# @PURPOSE: Retrieve all LLM provider configurations.
# @PRE: User is authenticated.
# @POST: Returns list of LLMProviderConfig.
@router.get("/providers", response_model=List[LLMProviderConfig])
async def get_providers(
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """
    Get all LLM provider configurations.
    """
    logger.info(f"[llm_routes][get_providers][Action] Fetching providers for user: {current_user.username}")
    service = LLMProviderService(db)
    providers = service.get_all_providers()
    return [
        LLMProviderConfig(
            id=p.id,
            provider_type=LLMProviderType(p.provider_type),
            name=p.name,
            base_url=p.base_url,
            api_key="********",
            default_model=p.default_model,
            is_active=p.is_active
        ) for p in providers
    ]
# [/DEF:get_providers:Function]

# [DEF:create_provider:Function]
# @PURPOSE: Create a new LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns the created LLMProviderConfig.
@router.post("/providers", response_model=LLMProviderConfig, status_code=status.HTTP_201_CREATED)
async def create_provider(
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """
    Create a new LLM provider configuration.
    """
    service = LLMProviderService(db)
    provider = service.create_provider(config)
    return LLMProviderConfig(
        id=provider.id,
        provider_type=LLMProviderType(provider.provider_type),
        name=provider.name,
        base_url=provider.base_url,
        api_key="********",
        default_model=provider.default_model,
        is_active=provider.is_active
    )
# [/DEF:create_provider:Function]

# [DEF:update_provider:Function]
# @PURPOSE: Update an existing LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns the updated LLMProviderConfig.
@router.put("/providers/{provider_id}", response_model=LLMProviderConfig)
async def update_provider(
    provider_id: str,
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """
    Update an existing LLM provider configuration.
    """
    service = LLMProviderService(db)
    provider = service.update_provider(provider_id, config)
    if not provider:
        raise HTTPException(status_code=404, detail="Provider not found")

    return LLMProviderConfig(
        id=provider.id,
        provider_type=LLMProviderType(provider.provider_type),
        name=provider.name,
        base_url=provider.base_url,
        api_key="********",
        default_model=provider.default_model,
        is_active=provider.is_active
    )
# [/DEF:update_provider:Function]

# [DEF:delete_provider:Function]
# @PURPOSE: Delete an LLM provider configuration.
# @PRE: User is authenticated and has admin permissions.
# @POST: Returns success status.
@router.delete("/providers/{provider_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_provider(
    provider_id: str,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """
    Delete an LLM provider configuration.
    """
    service = LLMProviderService(db)
    if not service.delete_provider(provider_id):
        raise HTTPException(status_code=404, detail="Provider not found")
    return
# [/DEF:delete_provider:Function]

# [DEF:test_connection:Function]
# @PURPOSE: Test connection to an LLM provider.
# @PRE: User is authenticated.
# @POST: Returns success status and message.
@router.post("/providers/{provider_id}/test")
async def test_connection(
    provider_id: str,
    current_user: User = Depends(get_current_active_user),
    db: Session = Depends(get_db)
):
    """
    Test connection to an LLM provider.
    """
    logger.info(f"[llm_routes][test_connection][Action] Testing connection for provider_id: {provider_id}")
    from ...plugins.llm_analysis.service import LLMClient
    service = LLMProviderService(db)
    db_provider = service.get_provider(provider_id)
    if not db_provider:
        raise HTTPException(status_code=404, detail="Provider not found")

    api_key = service.get_decrypted_api_key(provider_id)

    # Check if API key was successfully decrypted
    if not api_key:
        logger.error(f"[llm_routes][test_connection] Failed to decrypt API key for provider {provider_id}")
        raise HTTPException(
            status_code=500,
            detail="Failed to decrypt API key. The provider may have been encrypted with a different encryption key. Please update the provider with a new API key."
        )

    client = LLMClient(
        provider_type=LLMProviderType(db_provider.provider_type),
        api_key=api_key,
        base_url=db_provider.base_url,
        default_model=db_provider.default_model
    )

    try:
        # Simple test call
        await client.client.models.list()
        return {"success": True, "message": "Connection successful"}
    except Exception as e:
        return {"success": False, "error": str(e)}
# [/DEF:test_connection:Function]

# [DEF:test_provider_config:Function]
# @PURPOSE: Test connection with a provided configuration (not yet saved).
# @PRE: User is authenticated.
# @POST: Returns success status and message.
@router.post("/providers/test")
async def test_provider_config(
    config: LLMProviderConfig,
    current_user: User = Depends(get_current_active_user)
):
    """
    Test connection with a provided configuration.
    """
    from ...plugins.llm_analysis.service import LLMClient
    logger.info(f"[llm_routes][test_provider_config][Action] Testing config for {config.name}")

    # Check if API key is provided
    if not config.api_key or config.api_key == "********":
        raise HTTPException(
            status_code=400,
            detail="API key is required for testing connection"
        )

    client = LLMClient(
        provider_type=config.provider_type,
        api_key=config.api_key,
        base_url=config.base_url,
        default_model=config.default_model
    )

    try:
        # Simple test call
        await client.client.models.list()
        return {"success": True, "message": "Connection successful"}
    except Exception as e:
        return {"success": False, "error": str(e)}
# [/DEF:test_provider_config:Function]

# [/DEF:backend/src/api/routes/llm.py]
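The routes above never echo a stored secret back to the client: every response substitutes `api_key="********"`, and `test_provider_config` rejects that same sentinel when it comes back in a request. A minimal sketch of that masking convention (helper names are illustrative, not the project's actual code):

```python
from typing import Optional

MASK = "********"  # the sentinel the routes above return in place of a secret

def mask_api_key(key: Optional[str]) -> str:
    """Return the fixed placeholder shown to clients instead of the secret."""
    return MASK if key else ""

def is_usable_api_key(key: Optional[str]) -> bool:
    """A key submitted back from a form is only usable if it is not the mask."""
    return bool(key) and key != MASK

print(mask_api_key("sk-live-123"))       # ********
print(is_usable_api_key(MASK))           # False
print(is_usable_api_key("sk-live-123"))  # True
```

This is why `test_provider_config` returns HTTP 400 when it receives `"********"`: the mask is indistinguishable from a real key only to the client, never to the server.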
@@ -14,7 +14,7 @@ from fastapi import APIRouter, Depends, HTTPException
 from sqlalchemy.orm import Session
 from typing import List, Optional
 from ...core.logger import belief_scope
-from ...dependencies import get_config_manager
+from ...dependencies import get_config_manager, has_permission
 from ...core.database import get_db
 from ...models.mapping import DatabaseMapping
 from pydantic import BaseModel
@@ -60,7 +60,8 @@ class SuggestRequest(BaseModel):
 async def get_mappings(
     source_env_id: Optional[str] = None,
     target_env_id: Optional[str] = None,
-    db: Session = Depends(get_db)
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
 ):
     with belief_scope("get_mappings"):
         query = db.query(DatabaseMapping)
@@ -76,7 +77,11 @@ async def get_mappings(
 # @PRE: mapping is valid MappingCreate, db session is injected.
 # @POST: DatabaseMapping created or updated in database.
 @router.post("", response_model=MappingResponse)
-async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
+async def create_mapping(
+    mapping: MappingCreate,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
+):
     with belief_scope("create_mapping"):
         # Check if mapping already exists
         existing = db.query(DatabaseMapping).filter(
@@ -106,10 +111,11 @@ async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
 @router.post("/suggest")
 async def suggest_mappings_api(
     request: SuggestRequest,
-    config_manager=Depends(get_config_manager)
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
 ):
     with belief_scope("suggest_mappings_api"):
-        from backend.src.services.mapping_service import MappingService
+        from ...services.mapping_service import MappingService
         service = MappingService(config_manager)
         try:
             return await service.get_suggestions(request.source_env_id, request.target_env_id)
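Every route in this changeset gains a `_ = Depends(has_permission(resource, action))` parameter. The implementation of `has_permission` is not part of this diff; a plausible sketch of such a dependency factory (shapes of `User` and the exception are assumptions, not the project's actual code) is a closure over one `(resource, action)` pair:

```python
class Forbidden(Exception):
    """Stand-in for FastAPI's HTTPException(status_code=403)."""

def has_permission(resource, action):
    """Factory: returns a checker bound to one (resource, action) pair."""
    def checker(user):
        granted = user.permissions  # assumed: a set of (resource, action) tuples
        if (resource, action) not in granted and ("*", "*") not in granted:
            raise Forbidden(f"missing {action} on {resource}")
        return user
    return checker

class User:
    def __init__(self, permissions):
        self.permissions = set(permissions)

admin = User([("*", "*")])                       # wildcard grant
mapper = User([("plugin:mapper", "EXECUTE")])    # single grant

has_permission("plugin:mapper", "EXECUTE")(mapper)   # passes
has_permission("admin:settings", "WRITE")(admin)     # wildcard passes
try:
    has_permission("admin:settings", "WRITE")(mapper)
except Forbidden as exc:
    print(exc)  # missing WRITE on admin:settings
```

Because the factory returns a callable, FastAPI can inject it with `Depends(...)` at declaration time, while `tasks.py` (later in this diff) can also call it directly when the resource name is only known at request time.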
@@ -7,7 +7,7 @@

 from fastapi import APIRouter, Depends, HTTPException
 from typing import List, Dict
-from ...dependencies import get_config_manager, get_task_manager
+from ...dependencies import get_config_manager, get_task_manager, has_permission
 from ...models.dashboard import DashboardMetadata, DashboardSelection
 from ...core.superset_client import SupersetClient
 from ...core.logger import belief_scope
@@ -21,7 +21,11 @@ router = APIRouter(prefix="/api", tags=["migration"])
 # @PARAM: env_id (str) - The ID of the environment to fetch from.
 # @RETURN: List[DashboardMetadata]
 @router.get("/environments/{env_id}/dashboards", response_model=List[DashboardMetadata])
-async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)):
+async def get_dashboards(
+    env_id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
     with belief_scope("get_dashboards", f"env_id={env_id}"):
         environments = config_manager.get_environments()
         env = next((e for e in environments if e.id == env_id), None)
@@ -40,7 +44,12 @@ async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)
 # @PARAM: selection (DashboardSelection) - The dashboards to migrate.
 # @RETURN: Dict - {"task_id": str, "message": str}
 @router.post("/migration/execute")
-async def execute_migration(selection: DashboardSelection, config_manager=Depends(get_config_manager), task_manager=Depends(get_task_manager)):
+async def execute_migration(
+    selection: DashboardSelection,
+    config_manager=Depends(get_config_manager),
+    task_manager=Depends(get_task_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
     with belief_scope("execute_migration"):
         # Validate environments exist
         environments = config_manager.get_environments()
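`get_dashboards` resolves the target environment with a `next(...)` scan over the configured list and raises when it is absent. The same lookup-or-fail pattern, standalone (`Env` and the sample list are illustrative stand-ins; the route raises `HTTPException(404)` rather than `LookupError`):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Env:
    id: str
    name: str

def find_env(environments: List[Env], env_id: str) -> Env:
    # Same next(...) lookup as the route: first match or None.
    env = next((e for e in environments if e.id == env_id), None)
    if env is None:
        raise LookupError(f"Environment {env_id} not found")
    return env

envs = [Env("dev", "Development"), Env("prod", "Production")]
print(find_env(envs, "prod").name)  # Production
```

The generator expression stops at the first match, so the scan is O(n) worst case but cheap for the short environment lists a config file holds.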
@@ -7,7 +7,7 @@ from typing import List
 from fastapi import APIRouter, Depends

 from ...core.plugin_base import PluginConfig
-from ...dependencies import get_plugin_loader
+from ...dependencies import get_plugin_loader, has_permission
 from ...core.logger import belief_scope

 router = APIRouter()
@@ -19,7 +19,8 @@ router = APIRouter()
 # @RETURN: List[PluginConfig] - List of registered plugins.
 @router.get("", response_model=List[PluginConfig])
 async def list_plugins(
-    plugin_loader = Depends(get_plugin_loader)
+    plugin_loader = Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugins", "READ"))
 ):
     with belief_scope("list_plugins"):
         """
@@ -14,7 +14,7 @@ from fastapi import APIRouter, Depends, HTTPException
 from typing import List
 from ...core.config_models import AppConfig, Environment, GlobalSettings
 from ...models.storage import StorageConfig
-from ...dependencies import get_config_manager
+from ...dependencies import get_config_manager, has_permission
 from ...core.config_manager import ConfigManager
 from ...core.logger import logger, belief_scope
 from ...core.superset_client import SupersetClient
@@ -29,7 +29,10 @@ router = APIRouter()
 # @POST: Returns masked AppConfig.
 # @RETURN: AppConfig - The current configuration.
 @router.get("", response_model=AppConfig)
-async def get_settings(config_manager: ConfigManager = Depends(get_config_manager)):
+async def get_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_settings"):
         logger.info("[get_settings][Entry] Fetching all settings")
         config = config_manager.get_config().copy(deep=True)
@@ -49,7 +52,8 @@ async def get_settings(config_manager: ConfigManager = Depends(get_config_manage
 @router.patch("/global", response_model=GlobalSettings)
 async def update_global_settings(
     settings: GlobalSettings,
-    config_manager: ConfigManager = Depends(get_config_manager)
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
 ):
     with belief_scope("update_global_settings"):
         logger.info("[update_global_settings][Entry] Updating global settings")
@@ -62,7 +66,10 @@ async def update_global_settings(
 # @PURPOSE: Retrieves storage-specific settings.
 # @RETURN: StorageConfig - The storage configuration.
 @router.get("/storage", response_model=StorageConfig)
-async def get_storage_settings(config_manager: ConfigManager = Depends(get_config_manager)):
+async def get_storage_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_storage_settings"):
         return config_manager.get_config().settings.storage
 # [/DEF:get_storage_settings:Function]
@@ -73,7 +80,11 @@ async def get_storage_settings(config_manager: ConfigManager = Depends(get_confi
 # @POST: Storage settings are updated and saved.
 # @RETURN: StorageConfig - The updated storage settings.
 @router.put("/storage", response_model=StorageConfig)
-async def update_storage_settings(storage: StorageConfig, config_manager: ConfigManager = Depends(get_config_manager)):
+async def update_storage_settings(
+    storage: StorageConfig,
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
     with belief_scope("update_storage_settings"):
         is_valid, message = config_manager.validate_path(storage.root_path)
         if not is_valid:
@@ -91,7 +102,10 @@ async def update_storage_settings(storage: StorageConfig, config_manager: Config
 # @POST: Returns list of environments.
 # @RETURN: List[Environment] - List of environments.
 @router.get("/environments", response_model=List[Environment])
-async def get_environments(config_manager: ConfigManager = Depends(get_config_manager)):
+async def get_environments(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
     with belief_scope("get_environments"):
         logger.info("[get_environments][Entry] Fetching environments")
         return config_manager.get_environments()
@@ -106,7 +120,8 @@ async def get_environments(config_manager: ConfigManager = Depends(get_config_ma
 @router.post("/environments", response_model=Environment)
 async def add_environment(
     env: Environment,
-    config_manager: ConfigManager = Depends(get_config_manager)
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
 ):
     with belief_scope("add_environment"):
         logger.info(f"[add_environment][Entry] Adding environment {env.id}")
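`get_settings` copies the configuration with `copy(deep=True)` before masking it for the response. The deep copy matters: masking a shallow copy would overwrite secrets in the live, shared config object. A miniature of the same pattern using `copy.deepcopy` on plain dicts (the config shape here is illustrative only):

```python
import copy

# Illustrative stand-in for the live AppConfig held by the config manager.
live_config = {"environments": [{"id": "prod", "password": "s3cret"}]}

def masked_view(cfg):
    """Deep-copy first, then mask, so the live config is never mutated."""
    view = copy.deepcopy(cfg)
    for env in view["environments"]:
        env["password"] = "********"
    return view

print(masked_view(live_config)["environments"][0]["password"])  # ********
print(live_config["environments"][0]["password"])               # s3cret
```

With a shallow copy the inner environment dicts would be shared, and the second print would show `********` — the live secret destroyed by a read-only endpoint.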
@@ -8,11 +8,12 @@
 # @INVARIANT: All paths must be validated against path traversal.

 # [SECTION: IMPORTS]
+from pathlib import Path
 from fastapi import APIRouter, Depends, UploadFile, File, Form, HTTPException
 from fastapi.responses import FileResponse
 from typing import List, Optional
 from ...models.storage import StoredFile, FileCategory
-from ...dependencies import get_plugin_loader
+from ...dependencies import get_plugin_loader, has_permission
 from ...plugins.storage.plugin import StoragePlugin
 from ...core.logger import belief_scope
 # [/SECTION]
@@ -34,7 +35,8 @@ router = APIRouter(tags=["storage"])
 async def list_files(
     category: Optional[FileCategory] = None,
     path: Optional[str] = None,
-    plugin_loader=Depends(get_plugin_loader)
+    plugin_loader=Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugin:storage", "READ"))
 ):
     with belief_scope("list_files"):
         storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
@@ -63,7 +65,8 @@ async def upload_file(
     category: FileCategory = Form(...),
     path: Optional[str] = Form(None),
     file: UploadFile = File(...),
-    plugin_loader=Depends(get_plugin_loader)
+    plugin_loader=Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugin:storage", "WRITE"))
 ):
     with belief_scope("upload_file"):
         storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
@@ -89,7 +92,12 @@ async def upload_file(
 #
 # @RELATION: CALLS -> StoragePlugin.delete_file
 @router.delete("/files/{category}/{path:path}", status_code=204)
-async def delete_file(category: FileCategory, path: str, plugin_loader=Depends(get_plugin_loader)):
+async def delete_file(
+    category: FileCategory,
+    path: str,
+    plugin_loader=Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugin:storage", "WRITE"))
+):
     with belief_scope("delete_file"):
         storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
         if not storage_plugin:
@@ -114,7 +122,12 @@ async def delete_file(category: FileCategory, path: str, plugin_loader=Depends(g
 #
 # @RELATION: CALLS -> StoragePlugin.get_file_path
 @router.get("/download/{category}/{path:path}")
-async def download_file(category: FileCategory, path: str, plugin_loader=Depends(get_plugin_loader)):
+async def download_file(
+    category: FileCategory,
+    path: str,
+    plugin_loader=Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugin:storage", "READ"))
+):
     with belief_scope("download_file"):
         storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
         if not storage_plugin:
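The storage module's stated invariant is that all paths must be validated against path traversal, which matters here because `delete_file` and `download_file` accept a client-supplied `{path:path}` segment. One common way to enforce it with `pathlib` (a sketch of the approach, not necessarily the `StoragePlugin`'s actual check):

```python
from pathlib import Path

def is_safe_path(root: Path, user_path: str) -> bool:
    """True only if user_path still lies inside root after resolving '..' etc."""
    base = root.resolve()
    candidate = (root / user_path).resolve()
    # Either the root itself, or strictly below it.
    return candidate == base or base in candidate.parents

root = Path("/srv/storage")  # hypothetical storage root
print(is_safe_path(root, "exports/report.csv"))  # True
print(is_safe_path(root, "../../etc/passwd"))    # False
```

Resolving both sides before comparing is the key step: a naive string prefix check would accept `/srv/storage/../../etc/passwd`, since the raw string does start with the root.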
@@ -9,7 +9,7 @@ from pydantic import BaseModel
|
|||||||
from ...core.logger import belief_scope
|
from ...core.logger import belief_scope
|
||||||
|
|
||||||
from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
|
from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
|
||||||
from ...dependencies import get_task_manager
|
from ...dependencies import get_task_manager, has_permission, get_current_user
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
|
|
||||||
@@ -33,13 +33,31 @@ class ResumeTaskRequest(BaseModel):
|
|||||||
# @RETURN: Task - The created task instance.
|
# @RETURN: Task - The created task instance.
|
||||||
async def create_task(
|
async def create_task(
|
||||||
request: CreateTaskRequest,
|
request: CreateTaskRequest,
|
||||||
task_manager: TaskManager = Depends(get_task_manager)
|
task_manager: TaskManager = Depends(get_task_manager),
|
||||||
|
current_user = Depends(get_current_user)
|
||||||
):
|
):
|
||||||
|
# Dynamic permission check based on plugin_id
|
||||||
|
has_permission(f"plugin:{request.plugin_id}", "EXECUTE")(current_user)
|
||||||
"""
|
"""
|
||||||
Create and start a new task for a given plugin.
|
Create and start a new task for a given plugin.
|
||||||
"""
|
"""
|
||||||
with belief_scope("create_task"):
|
with belief_scope("create_task"):
|
||||||
try:
|
try:
|
||||||
|
# Special handling for validation task to include provider config
|
||||||
|
if request.plugin_id == "llm_dashboard_validation":
|
||||||
|
from ...core.database import SessionLocal
|
||||||
|
from ...services.llm_provider import LLMProviderService
|
||||||
|
db = SessionLocal()
|
||||||
|
try:
|
||||||
|
llm_service = LLMProviderService(db)
|
||||||
|
provider_id = request.params.get("provider_id")
|
||||||
|
if provider_id:
|
||||||
|
db_provider = llm_service.get_provider(provider_id)
|
||||||
|
if not db_provider:
|
||||||
|
raise ValueError(f"LLM Provider {provider_id} not found")
|
||||||
|
finally:
|
||||||
|
db.close()
|
||||||
|
|
||||||
task = await task_manager.create_task(
|
task = await task_manager.create_task(
|
||||||
plugin_id=request.plugin_id,
|
plugin_id=request.plugin_id,
|
||||||
params=request.params
|
params=request.params
|
||||||
@@ -63,7 +81,8 @@ async def list_tasks(
|
|||||||
limit: int = 10,
|
limit: int = 10,
|
||||||
offset: int = 0,
|
offset: int = 0,
|
||||||
status: Optional[TaskStatus] = None,
|
     status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
     Retrieve a list of tasks with pagination and optional status filter.
@@ -82,7 +101,8 @@ async def list_tasks(
     # @RETURN: Task - The task details.
 async def get_task(
     task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
     Retrieve the details of a specific task.
@@ -104,7 +124,8 @@ async def get_task(
     # @RETURN: List[LogEntry] - List of log entries.
 async def get_task_logs(
     task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
 ):
     """
     Retrieve logs for a specific task.
@@ -128,7 +149,8 @@ async def get_task_logs(
 async def resolve_task(
     task_id: str,
     request: ResolveTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Resolve a task that is awaiting mapping.
@@ -153,7 +175,8 @@ async def resolve_task(
 async def resume_task(
     task_id: str,
     request: ResumeTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Resume a task that is awaiting input (e.g., passwords).
@@ -175,7 +198,8 @@ async def resume_task(
     # @POST: Tasks are removed from memory/persistence.
 async def clear_tasks(
     status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
 ):
     """
     Clear tasks matching the status filter. If no filter, clears all non-running tasks.
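The `_ = Depends(has_permission("tasks", "READ"))` pattern added above is a dependency factory: calling `has_permission(resource, action)` returns a checker that the framework resolves on every request. A minimal framework-free sketch of the idea follows; the names `granted` and `PermissionDenied` are illustrative stand-ins, not part of this changeset:

```python
# Sketch of the dependency-factory pattern used by the routes above.
# `granted` and `PermissionDenied` are illustrative stand-ins.

class PermissionDenied(Exception):
    """Raised when the current user lacks a required permission."""

def has_permission(resource: str, action: str):
    """Return a per-request checker; FastAPI would invoke it via Depends()."""
    required = f"{resource}:{action}"

    def checker(granted: set):
        # In the real route, `granted` would come from the decoded JWT scopes.
        if required not in granted:
            raise PermissionDenied(f"missing permission {required}")
        return True

    return checker

check = has_permission("tasks", "READ")
print(check({"tasks:READ", "tasks:WRITE"}))  # True
```

Because the factory is evaluated once at import time and the inner checker once per request, each route declares its requirement inline without extra boilerplate.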
@@ -10,6 +10,7 @@ from pathlib import Path
 project_root = Path(__file__).resolve().parent.parent.parent
 
 from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request, HTTPException
+from starlette.middleware.sessions import SessionMiddleware
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
@@ -17,8 +18,10 @@ import asyncio
 import os
 
 from .dependencies import get_task_manager, get_scheduler_service
+from .core.utils.network import NetworkError
 from .core.logger import logger, belief_scope
-from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage
+from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage, admin, llm
+from .api import auth
 from .core.database import init_db
 
 # [DEF:App:Global]
@@ -55,6 +58,10 @@ async def shutdown_event():
     scheduler.stop()
 # [/DEF:shutdown_event:Function]
 
+# Configure Session Middleware (required by Authlib for OAuth2 flow)
+from .core.auth.config import auth_config
+app.add_middleware(SessionMiddleware, secret_key=auth_config.SECRET_KEY)
+
 # Configure CORS
 app.add_middleware(
     CORSMiddleware,
@@ -71,16 +78,39 @@ app.add_middleware(
 # @POST: Logs request and response details.
 # @PARAM: request (Request) - The incoming request object.
 # @PARAM: call_next (Callable) - The next middleware or route handler.
+@app.exception_handler(NetworkError)
+async def network_error_handler(request: Request, exc: NetworkError):
+    with belief_scope("network_error_handler"):
+        logger.error(f"Network error: {exc}")
+        return HTTPException(
+            status_code=503,
+            detail="Environment unavailable. Please check if the Superset instance is running."
+        )
+
 @app.middleware("http")
 async def log_requests(request: Request, call_next):
-    with belief_scope("log_requests", f"{request.method} {request.url.path}"):
-        logger.info(f"[DEBUG] Incoming request: {request.method} {request.url.path}")
+    # Avoid spamming logs for polling endpoints
+    is_polling = request.url.path.endswith("/api/tasks") and request.method == "GET"
+
+    if not is_polling:
+        logger.info(f"Incoming request: {request.method} {request.url.path}")
+
+    try:
         response = await call_next(request)
-        logger.info(f"[DEBUG] Response status: {response.status_code} for {request.url.path}")
+        if not is_polling:
+            logger.info(f"Response status: {response.status_code} for {request.url.path}")
         return response
+    except NetworkError as e:
+        logger.error(f"Network error caught in middleware: {e}")
+        raise HTTPException(
+            status_code=503,
+            detail="Environment unavailable. Please check if the Superset instance is running."
+        )
 # [/DEF:log_requests:Function]
 
 # Include API routes
+app.include_router(auth.router)
+app.include_router(admin.router)
 app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
@@ -89,6 +119,7 @@ app.include_router(environments.router, prefix="/api/environments", tags=["Envir
 app.include_router(mappings.router)
 app.include_router(migration.router)
 app.include_router(git.router)
+app.include_router(llm.router)
 app.include_router(storage.router, prefix="/api/storage", tags=["Storage"])
 
 # [DEF:websocket_endpoint:Function]
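The reworked `log_requests` middleware gates logging on an `is_polling` flag so that the frontend's frequent `GET /api/tasks` polling does not flood the log. The filter itself reduces to a small path/method predicate, sketched standalone here:

```python
# Standalone sketch of the polling-detection predicate used by log_requests.
def is_polling_request(method: str, path: str) -> bool:
    """GET /api/tasks is polled frequently by the frontend, so skip logging it."""
    return path.endswith("/api/tasks") and method == "GET"

print(is_polling_request("GET", "/api/tasks"))   # True
print(is_polling_request("POST", "/api/tasks"))  # False
```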
45 backend/src/core/auth/config.py Normal file
@@ -0,0 +1,45 @@
+# [DEF:backend.src.core.auth.config:Module]
+#
+# @SEMANTICS: auth, config, settings, jwt, adfs
+# @PURPOSE: Centralized configuration for authentication and authorization.
+# @LAYER: Core
+# @RELATION: DEPENDS_ON -> pydantic
+#
+# @INVARIANT: All sensitive configuration must have defaults or be loaded from environment.
+
+# [SECTION: IMPORTS]
+from pydantic import Field
+from pydantic_settings import BaseSettings
+import os
+# [/SECTION]
+
+# [DEF:AuthConfig:Class]
+# @PURPOSE: Holds authentication-related settings.
+# @PRE: Environment variables may be provided via .env file.
+# @POST: Returns a configuration object with validated settings.
+class AuthConfig(BaseSettings):
+    # JWT Settings
+    SECRET_KEY: str = Field(default="super-secret-key-change-in-production", env="AUTH_SECRET_KEY")
+    ALGORITHM: str = "HS256"
+    ACCESS_TOKEN_EXPIRE_MINUTES: int = 480
+    REFRESH_TOKEN_EXPIRE_DAYS: int = 7
+
+    # Database Settings
+    AUTH_DATABASE_URL: str = Field(default="sqlite:///./backend/auth.db", env="AUTH_DATABASE_URL")
+
+    # ADFS Settings
+    ADFS_CLIENT_ID: str = Field(default="", env="ADFS_CLIENT_ID")
+    ADFS_CLIENT_SECRET: str = Field(default="", env="ADFS_CLIENT_SECRET")
+    ADFS_METADATA_URL: str = Field(default="", env="ADFS_METADATA_URL")
+
+    class Config:
+        env_file = ".env"
+        extra = "ignore"
+# [/DEF:AuthConfig:Class]
+
+# [DEF:auth_config:Variable]
+# @PURPOSE: Singleton instance of AuthConfig.
+auth_config = AuthConfig()
+# [/DEF:auth_config:Variable]
+
+# [/DEF:backend.src.core.auth.config:Module]
54 backend/src/core/auth/jwt.py Normal file
@@ -0,0 +1,54 @@
+# [DEF:backend.src.core.auth.jwt:Module]
+#
+# @SEMANTICS: jwt, token, session, auth
+# @PURPOSE: JWT token generation and validation logic.
+# @LAYER: Core
+# @RELATION: DEPENDS_ON -> jose
+# @RELATION: USES -> backend.src.core.auth.config.auth_config
+#
+# @INVARIANT: Tokens must include expiration time and user identifier.
+
+# [SECTION: IMPORTS]
+from datetime import datetime, timedelta
+from typing import Optional, List
+from jose import JWTError, jwt
+from .config import auth_config
+from ..logger import belief_scope
+# [/SECTION]
+
+# [DEF:create_access_token:Function]
+# @PURPOSE: Generates a new JWT access token.
+# @PRE: data dict contains 'sub' (user_id) and optional 'scopes' (roles).
+# @POST: Returns a signed JWT string.
+#
+# @PARAM: data (dict) - Payload data for the token.
+# @PARAM: expires_delta (Optional[timedelta]) - Custom expiration time.
+# @RETURN: str - The encoded JWT.
+def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
+    with belief_scope("create_access_token"):
+        to_encode = data.copy()
+        if expires_delta:
+            expire = datetime.utcnow() + expires_delta
+        else:
+            expire = datetime.utcnow() + timedelta(minutes=auth_config.ACCESS_TOKEN_EXPIRE_MINUTES)
+
+        to_encode.update({"exp": expire})
+        encoded_jwt = jwt.encode(to_encode, auth_config.SECRET_KEY, algorithm=auth_config.ALGORITHM)
+        return encoded_jwt
+# [/DEF:create_access_token:Function]
+
+# [DEF:decode_token:Function]
+# @PURPOSE: Decodes and validates a JWT token.
+# @PRE: token is a signed JWT string.
+# @POST: Returns the decoded payload if valid.
+#
+# @PARAM: token (str) - The JWT to decode.
+# @RETURN: dict - The decoded payload.
+# @THROW: jose.JWTError - If token is invalid or expired.
+def decode_token(token: str) -> dict:
+    with belief_scope("decode_token"):
+        payload = jwt.decode(token, auth_config.SECRET_KEY, algorithms=[auth_config.ALGORITHM])
+        return payload
+# [/DEF:decode_token:Function]
+
+# [/DEF:backend.src.core.auth.jwt:Module]
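`create_access_token` and `decode_token` delegate the HS256 mechanics to python-jose, but those mechanics can be shown with the standard library alone. The following is an illustrative re-implementation for understanding only (no expiry handling, and a real service should keep using `jose.jwt`):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> bytes:
    """Base64url without padding, as required by the JWT spec."""
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def hs256_encode(payload: dict, secret: str) -> str:
    """Build header.payload.signature the way jwt.encode(..., algorithm='HS256') does."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def hs256_decode(token: str, secret: str) -> dict:
    """Verify the signature, then return the payload (expiry checks omitted)."""
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()).decode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    body = signing_input.split(".")[1]
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = hs256_encode({"sub": "admin", "scopes": ["Admin"]}, "secret")
print(hs256_decode(token, "secret")["sub"])  # admin
```

This also makes the module's invariant concrete: whatever goes into the payload (here `sub` and `scopes`, plus the `exp` claim the real code adds) is integrity-protected but not encrypted, so it must never contain secrets.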
31 backend/src/core/auth/logger.py Normal file
@@ -0,0 +1,31 @@
+# [DEF:backend.src.core.auth.logger:Module]
+#
+# @SEMANTICS: auth, logger, audit, security
+# @PURPOSE: Audit logging for security-related events.
+# @LAYER: Core
+# @RELATION: USES -> backend.src.core.logger.belief_scope
+#
+# @INVARIANT: Must not log sensitive data like passwords or full tokens.
+
+# [SECTION: IMPORTS]
+from ..logger import logger, belief_scope
+from datetime import datetime
+# [/SECTION]
+
+# [DEF:log_security_event:Function]
+# @PURPOSE: Logs a security-related event for audit trails.
+# @PRE: event_type and username are strings.
+# @POST: Security event is written to the application log.
+# @PARAM: event_type (str) - Type of event (e.g., LOGIN_SUCCESS, PERMISSION_DENIED).
+# @PARAM: username (str) - The user involved in the event.
+# @PARAM: details (dict) - Additional non-sensitive metadata.
+def log_security_event(event_type: str, username: str, details: dict = None):
+    with belief_scope("log_security_event", f"{event_type}:{username}"):
+        timestamp = datetime.utcnow().isoformat()
+        msg = f"[AUDIT][{timestamp}][{event_type}] User: {username}"
+        if details:
+            msg += f" Details: {details}"
+        logger.info(msg)
+# [/DEF:log_security_event:Function]
+
+# [/DEF:backend.src.core.auth.logger:Module]
51 backend/src/core/auth/oauth.py Normal file
@@ -0,0 +1,51 @@
+# [DEF:backend.src.core.auth.oauth:Module]
+#
+# @SEMANTICS: auth, oauth, oidc, adfs
+# @PURPOSE: ADFS OIDC configuration and client using Authlib.
+# @LAYER: Core
+# @RELATION: DEPENDS_ON -> authlib
+# @RELATION: USES -> backend.src.core.auth.config.auth_config
+#
+# @INVARIANT: Must use secure OIDC flows.
+
+# [SECTION: IMPORTS]
+from authlib.integrations.starlette_client import OAuth
+from .config import auth_config
+# [/SECTION]
+
+# [DEF:oauth:Variable]
+# @PURPOSE: Global Authlib OAuth registry.
+oauth = OAuth()
+# [/DEF:oauth:Variable]
+
+# [DEF:register_adfs:Function]
+# @PURPOSE: Registers the ADFS OIDC client.
+# @PRE: ADFS configuration is provided in auth_config.
+# @POST: ADFS client is registered in oauth registry.
+def register_adfs():
+    if auth_config.ADFS_CLIENT_ID:
+        oauth.register(
+            name='adfs',
+            client_id=auth_config.ADFS_CLIENT_ID,
+            client_secret=auth_config.ADFS_CLIENT_SECRET,
+            server_metadata_url=auth_config.ADFS_METADATA_URL,
+            client_kwargs={
+                'scope': 'openid email profile groups'
+            }
+        )
+# [/DEF:register_adfs:Function]
+
+# [DEF:is_adfs_configured:Function]
+# @PURPOSE: Checks if ADFS is properly configured.
+# @PRE: None.
+# @POST: Returns True if ADFS client is registered, False otherwise.
+# @RETURN: bool - Configuration status.
+def is_adfs_configured() -> bool:
+    """Check if ADFS OAuth client is registered."""
+    return 'adfs' in oauth._registry
+# [/DEF:is_adfs_configured:Function]
+
+# Initial registration
+register_adfs()
+
+# [/DEF:backend.src.core.auth.oauth:Module]
123 backend/src/core/auth/repository.py Normal file
@@ -0,0 +1,123 @@
+# [DEF:backend.src.core.auth.repository:Module]
+#
+# @SEMANTICS: auth, repository, database, user, role
+# @PURPOSE: Data access layer for authentication-related entities.
+# @LAYER: Core
+# @RELATION: DEPENDS_ON -> sqlalchemy
+# @RELATION: USES -> backend.src.models.auth
+#
+# @INVARIANT: All database operations must be performed within a session.
+
+# [SECTION: IMPORTS]
+from typing import Optional, List
+from sqlalchemy.orm import Session
+from ...models.auth import User, Role, Permission, ADGroupMapping
+from ..logger import belief_scope
+# [/SECTION]
+
+# [DEF:AuthRepository:Class]
+# @PURPOSE: Encapsulates database operations for authentication.
+class AuthRepository:
+    # [DEF:__init__:Function]
+    # @PURPOSE: Initializes the repository with a database session.
+    # @PARAM: db (Session) - SQLAlchemy session.
+    def __init__(self, db: Session):
+        self.db = db
+    # [/DEF:__init__:Function]
+
+    # [DEF:get_user_by_username:Function]
+    # @PURPOSE: Retrieves a user by their username.
+    # @PRE: username is a string.
+    # @POST: Returns User object if found, else None.
+    # @PARAM: username (str) - The username to search for.
+    # @RETURN: Optional[User] - The found user or None.
+    def get_user_by_username(self, username: str) -> Optional[User]:
+        with belief_scope("AuthRepository.get_user_by_username"):
+            return self.db.query(User).filter(User.username == username).first()
+    # [/DEF:get_user_by_username:Function]
+
+    # [DEF:get_user_by_id:Function]
+    # @PURPOSE: Retrieves a user by their unique ID.
+    # @PRE: user_id is a valid UUID string.
+    # @POST: Returns User object if found, else None.
+    # @PARAM: user_id (str) - The user's unique identifier.
+    # @RETURN: Optional[User] - The found user or None.
+    def get_user_by_id(self, user_id: str) -> Optional[User]:
+        with belief_scope("AuthRepository.get_user_by_id"):
+            return self.db.query(User).filter(User.id == user_id).first()
+    # [/DEF:get_user_by_id:Function]
+
+    # [DEF:get_role_by_name:Function]
+    # @PURPOSE: Retrieves a role by its name.
+    # @PRE: name is a string.
+    # @POST: Returns Role object if found, else None.
+    # @PARAM: name (str) - The role name to search for.
+    # @RETURN: Optional[Role] - The found role or None.
+    def get_role_by_name(self, name: str) -> Optional[Role]:
+        with belief_scope("AuthRepository.get_role_by_name"):
+            return self.db.query(Role).filter(Role.name == name).first()
+    # [/DEF:get_role_by_name:Function]
+
+    # [DEF:update_last_login:Function]
+    # @PURPOSE: Updates the last_login timestamp for a user.
+    # @PRE: user object is a valid User instance.
+    # @POST: User's last_login is updated in the database.
+    # @SIDE_EFFECT: Commits the transaction.
+    # @PARAM: user (User) - The user to update.
+    def update_last_login(self, user: User):
+        with belief_scope("AuthRepository.update_last_login"):
+            from datetime import datetime
+            user.last_login = datetime.utcnow()
+            self.db.add(user)
+            self.db.commit()
+    # [/DEF:update_last_login:Function]
+
+    # [DEF:get_role_by_id:Function]
+    # @PURPOSE: Retrieves a role by its unique ID.
+    # @PRE: role_id is a string.
+    # @POST: Returns Role object if found, else None.
+    # @PARAM: role_id (str) - The role's unique identifier.
+    # @RETURN: Optional[Role] - The found role or None.
+    def get_role_by_id(self, role_id: str) -> Optional[Role]:
+        with belief_scope("AuthRepository.get_role_by_id"):
+            return self.db.query(Role).filter(Role.id == role_id).first()
+    # [/DEF:get_role_by_id:Function]
+
+    # [DEF:get_permission_by_id:Function]
+    # @PURPOSE: Retrieves a permission by its unique ID.
+    # @PRE: perm_id is a string.
+    # @POST: Returns Permission object if found, else None.
+    # @PARAM: perm_id (str) - The permission's unique identifier.
+    # @RETURN: Optional[Permission] - The found permission or None.
+    def get_permission_by_id(self, perm_id: str) -> Optional[Permission]:
+        with belief_scope("AuthRepository.get_permission_by_id"):
+            return self.db.query(Permission).filter(Permission.id == perm_id).first()
+    # [/DEF:get_permission_by_id:Function]
+
+    # [DEF:get_permission_by_resource_action:Function]
+    # @PURPOSE: Retrieves a permission by resource and action.
+    # @PRE: resource and action are strings.
+    # @POST: Returns Permission object if found, else None.
+    # @PARAM: resource (str) - The resource name.
+    # @PARAM: action (str) - The action name.
+    # @RETURN: Optional[Permission] - The found permission or None.
+    def get_permission_by_resource_action(self, resource: str, action: str) -> Optional[Permission]:
+        with belief_scope("AuthRepository.get_permission_by_resource_action"):
+            return self.db.query(Permission).filter(
+                Permission.resource == resource,
+                Permission.action == action
+            ).first()
+    # [/DEF:get_permission_by_resource_action:Function]
+
+    # [DEF:list_permissions:Function]
+    # @PURPOSE: Lists all available permissions.
+    # @POST: Returns a list of all Permission objects.
+    # @RETURN: List[Permission] - List of permissions.
+    def list_permissions(self) -> List[Permission]:
+        with belief_scope("AuthRepository.list_permissions"):
+            return self.db.query(Permission).all()
+    # [/DEF:list_permissions:Function]
+
+# [/DEF:AuthRepository:Class]
+
+# [/DEF:backend.src.core.auth.repository:Module]
42 backend/src/core/auth/security.py Normal file
@@ -0,0 +1,42 @@
+# [DEF:backend.src.core.auth.security:Module]
+#
+# @SEMANTICS: security, password, hashing, bcrypt
+# @PURPOSE: Utility for password hashing and verification using Passlib.
+# @LAYER: Core
+# @RELATION: DEPENDS_ON -> passlib
+#
+# @INVARIANT: Uses bcrypt for hashing with standard work factor.
+
+# [SECTION: IMPORTS]
+from passlib.context import CryptContext
+# [/SECTION]
+
+# [DEF:pwd_context:Variable]
+# @PURPOSE: Passlib CryptContext for password management.
+pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
+# [/DEF:pwd_context:Variable]
+
+# [DEF:verify_password:Function]
+# @PURPOSE: Verifies a plain password against a hashed password.
+# @PRE: plain_password is a string, hashed_password is a bcrypt hash.
+# @POST: Returns True if password matches, False otherwise.
+#
+# @PARAM: plain_password (str) - The unhashed password.
+# @PARAM: hashed_password (str) - The stored hash.
+# @RETURN: bool - Verification result.
+def verify_password(plain_password: str, hashed_password: str) -> bool:
+    return pwd_context.verify(plain_password, hashed_password)
+# [/DEF:verify_password:Function]
+
+# [DEF:get_password_hash:Function]
+# @PURPOSE: Generates a bcrypt hash for a plain password.
+# @PRE: password is a string.
+# @POST: Returns a secure bcrypt hash string.
+#
+# @PARAM: password (str) - The password to hash.
+# @RETURN: str - The generated hash.
+def get_password_hash(password: str) -> str:
+    return pwd_context.hash(password)
+# [/DEF:get_password_hash:Function]
+
+# [/DEF:backend.src.core.auth.security:Module]
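`get_password_hash`/`verify_password` wrap Passlib's bcrypt context. The same salted hash-then-verify contract can be sketched with the standard library's PBKDF2; this is a stand-in to illustrate the pattern, not a suggestion to replace bcrypt in the module above:

```python
import hashlib
import hmac
import os

def get_password_hash(password: str) -> str:
    """Salted PBKDF2 hash, encoded as salt$hexdigest (stdlib stand-in for bcrypt)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + "$" + digest.hex()

def verify_password(plain_password: str, hashed_password: str) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    salt_hex, digest_hex = hashed_password.split("$")
    candidate = hashlib.pbkdf2_hmac("sha256", plain_password.encode(), bytes.fromhex(salt_hex), 100_000)
    return hmac.compare_digest(candidate.hex(), digest_hex)

stored = get_password_hash("s3cret")
print(verify_password("s3cret", stored))  # True
print(verify_password("wrong", stored))   # False
```

Note the shape shared with the Passlib version: hashing embeds the salt (and, for bcrypt, the work factor) in the output string, so verification needs only the stored hash and the candidate password.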
@@ -5,6 +5,7 @@
 # @LAYER: Core
 # @RELATION: DEPENDS_ON -> sqlalchemy
 # @RELATION: USES -> backend.src.models.mapping
+# @RELATION: USES -> backend.src.core.auth.config
 #
 # @INVARIANT: A single engine instance is used for the entire application.
 
@@ -16,44 +17,83 @@ from ..models.mapping import Base
 from ..models.task import TaskRecord
 from ..models.connection import ConnectionConfig
 from ..models.git import GitServerConfig, GitRepository, DeploymentEnvironment
+from ..models.auth import User, Role, Permission, ADGroupMapping
+from ..models.llm import LLMProvider, ValidationRecord
 from .logger import belief_scope
+from .auth.config import auth_config
 import os
+from pathlib import Path
 # [/SECTION]
 
+# [DEF:BASE_DIR:Variable]
+# @PURPOSE: Base directory for the backend (where .db files should reside).
+BASE_DIR = Path(__file__).resolve().parent.parent.parent
+# [/DEF:BASE_DIR:Variable]
+
 # [DEF:DATABASE_URL:Constant]
-DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
+# @PURPOSE: URL for the main mappings database.
+DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite:///{BASE_DIR}/mappings.db")
 # [/DEF:DATABASE_URL:Constant]
 
 # [DEF:TASKS_DATABASE_URL:Constant]
-TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", "sqlite:///./tasks.db")
+# @PURPOSE: URL for the tasks execution database.
+TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", f"sqlite:///{BASE_DIR}/tasks.db")
 # [/DEF:TASKS_DATABASE_URL:Constant]
 
+# [DEF:AUTH_DATABASE_URL:Constant]
+# @PURPOSE: URL for the authentication database.
+AUTH_DATABASE_URL = os.getenv("AUTH_DATABASE_URL", auth_config.AUTH_DATABASE_URL)
+# If it's a relative sqlite path starting with ./backend/, fix it to be absolute or relative to BASE_DIR
+if AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
+    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./backend/", f"sqlite:///{BASE_DIR}/")
+elif AUTH_DATABASE_URL.startswith("sqlite:///./") and not AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
+    # If it's just ./ but we are in backend, it's fine, but let's make it absolute for robustness
+    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./", f"sqlite:///{BASE_DIR}/")
+# [/DEF:AUTH_DATABASE_URL:Constant]
+
 # [DEF:engine:Variable]
+# @PURPOSE: SQLAlchemy engine for mappings database.
 engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
 # [/DEF:engine:Variable]
 
 # [DEF:tasks_engine:Variable]
+# @PURPOSE: SQLAlchemy engine for tasks database.
 tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
 # [/DEF:tasks_engine:Variable]
 
+# [DEF:auth_engine:Variable]
+# @PURPOSE: SQLAlchemy engine for authentication database.
+auth_engine = create_engine(AUTH_DATABASE_URL, connect_args={"check_same_thread": False})
+# [/DEF:auth_engine:Variable]
+
 # [DEF:SessionLocal:Class]
 # @PURPOSE: A session factory for the main mappings database.
+# @PRE: engine is initialized.
 SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
 # [/DEF:SessionLocal:Class]
 
 # [DEF:TasksSessionLocal:Class]
 # @PURPOSE: A session factory for the tasks execution database.
+# @PRE: tasks_engine is initialized.
 TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)
 # [/DEF:TasksSessionLocal:Class]
 
+# [DEF:AuthSessionLocal:Class]
+# @PURPOSE: A session factory for the authentication database.
+# @PRE: auth_engine is initialized.
+AuthSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=auth_engine)
+# [/DEF:AuthSessionLocal:Class]
+
 # [DEF:init_db:Function]
 # @PURPOSE: Initializes the database by creating all tables.
-# @PRE: engine and tasks_engine are initialized.
-# @POST: Database tables created.
+# @PRE: engine, tasks_engine and auth_engine are initialized.
+# @POST: Database tables created in all databases.
+# @SIDE_EFFECT: Creates physical database files if they don't exist.
 def init_db():
     with belief_scope("init_db"):
         Base.metadata.create_all(bind=engine)
         Base.metadata.create_all(bind=tasks_engine)
+        Base.metadata.create_all(bind=auth_engine)
 # [/DEF:init_db:Function]
 
 # [DEF:get_db:Function]
@@ -84,4 +124,18 @@ def get_tasks_db():
         db.close()
 # [/DEF:get_tasks_db:Function]
 
+# [DEF:get_auth_db:Function]
+# @PURPOSE: Dependency for getting an authentication database session.
+# @PRE: AuthSessionLocal is initialized.
+# @POST: Session is closed after use.
+# @RETURN: Generator[Session, None, None]
+def get_auth_db():
+    with belief_scope("get_auth_db"):
+        db = AuthSessionLocal()
+        try:
+            yield db
+        finally:
+            db.close()
+# [/DEF:get_auth_db:Function]
+
 # [/DEF:backend.src.core.database:Module]
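The new `get_auth_db` follows the standard SQLAlchemy-with-FastAPI pattern: a generator dependency that yields a session and guarantees `close()` in a `finally` block. Stripped of SQLAlchemy, the lifecycle looks like this (the `Session` class here is a dummy stand-in):

```python
class Session:
    """Dummy stand-in for a SQLAlchemy session."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def get_auth_db():
    # FastAPI drives this generator: `yield` hands the session to the route
    # handler, and the finally block runs when the request finishes,
    # even if the handler raised.
    db = Session()
    try:
        yield db
    finally:
        db.close()

gen = get_auth_db()
db = next(gen)    # the route handler receives the open session
gen.close()       # request teardown throws GeneratorExit, running finally
print(db.closed)  # True
```

The same shape backs `get_db` and `get_tasks_db`; only the session factory differs, which is why each database gets its own `*SessionLocal`.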
@@ -1,5 +1,5 @@
|
|||||||
from abc import ABC, abstractmethod
|
from abc import ABC, abstractmethod
|
||||||
from typing import Dict, Any
|
from typing import Dict, Any, Optional
|
||||||
from .logger import belief_scope
|
from .logger import belief_scope
|
||||||
|
|
||||||
from pydantic import BaseModel, Field
|
from pydantic import BaseModel, Field
|
||||||
@@ -68,6 +68,33 @@ class PluginBase(ABC):
         pass
     # [/DEF:version:Function]

+    @property
+    # [DEF:required_permission:Function]
+    # @PURPOSE: Returns the required permission string to execute this plugin.
+    # @PRE: Plugin instance exists.
+    # @POST: Returns string permission.
+    # @RETURN: str - Required permission (e.g., "plugin:backup:execute").
+    def required_permission(self) -> str:
+        """The permission string required to execute this plugin."""
+        with belief_scope("required_permission"):
+            return f"plugin:{self.id}:execute"
+    # [/DEF:required_permission:Function]
+
+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the plugin's UI, if applicable.
+    # @PRE: Plugin instance exists.
+    # @POST: Returns string route or None.
+    # @RETURN: Optional[str] - Frontend route.
+    def ui_route(self) -> Optional[str]:
+        """
+        The frontend route for the plugin's UI.
+        Returns None if the plugin does not have a dedicated UI page.
+        """
+        with belief_scope("ui_route"):
+            return None
+    # [/DEF:ui_route:Function]
+
     @abstractmethod
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for the plugin's input parameters.
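The base class gives every plugin a default permission string derived from its `id` and a `ui_route` of `None`, which concrete plugins override. A self-contained sketch of that pattern, with a simplified stand-in for `PluginBase` (illustrative, without the pydantic and logger dependencies):

```python
from typing import Optional

# Simplified stand-in for PluginBase's new defaults (illustrative only).
class MiniPluginBase:
    id = "base"

    @property
    def required_permission(self) -> str:
        # Derived from the plugin id, as in the diff above.
        return f"plugin:{self.id}:execute"

    @property
    def ui_route(self) -> Optional[str]:
        return None  # no dedicated UI page by default

class BackupLike(MiniPluginBase):
    id = "backup"

    @property
    def ui_route(self) -> str:
        return "/tools/backups"

print(BackupLike().required_permission)  # plugin:backup:execute
print(BackupLike().ui_route)             # /tools/backups
```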
@@ -111,5 +138,6 @@ class PluginConfig(BaseModel):
     name: str = Field(..., description="Human-readable name for the plugin")
     description: str = Field(..., description="Brief description of what the plugin does")
     version: str = Field(..., description="Version of the plugin")
+    ui_route: Optional[str] = Field(None, description="Frontend route for the plugin UI")
     input_schema: Dict[str, Any] = Field(..., description="JSON schema for input parameters", alias="schema")
 # [/DEF:PluginConfig:Class]
@@ -141,6 +141,7 @@ class PluginLoader:
                 name=plugin_instance.name,
                 description=plugin_instance.description,
                 version=plugin_instance.version,
+                ui_route=plugin_instance.ui_route,
                 schema=schema,
             )
             # The following line is commented out because it requires a schema to be passed to validate against.
@@ -36,6 +36,7 @@ class TaskPersistenceService:
     # @PRE: isinstance(task, Task)
     # @POST: Task record created or updated in database.
     # @PARAM: task (Task) - The task object to persist.
+    # @SIDE_EFFECT: Writes to task_records table in tasks.db
     def persist_task(self, task: Task) -> None:
         with belief_scope("TaskPersistenceService.persist_task", f"task_id={task.id}"):
             session: Session = TasksSessionLocal()
@@ -50,8 +51,19 @@ class TaskPersistenceService:
             record.environment_id = task.params.get("environment_id") or task.params.get("source_env_id")
             record.started_at = task.started_at
             record.finished_at = task.finished_at
-            record.params = task.params
-            record.result = task.result
+            # Ensure params and result are JSON serializable
+            def json_serializable(obj):
+                if isinstance(obj, dict):
+                    return {k: json_serializable(v) for k, v in obj.items()}
+                elif isinstance(obj, list):
+                    return [json_serializable(v) for v in obj]
+                elif isinstance(obj, datetime):
+                    return obj.isoformat()
+                return obj
+
+            record.params = json_serializable(task.params)
+            record.result = json_serializable(task.result)
+
             # Store logs as JSON, converting datetime to string
             record.logs = []
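The new `json_serializable` helper walks dicts and lists recursively and converts `datetime` values to ISO-8601 strings so `task.params` and `task.result` survive JSON column storage. A standalone copy of the same logic, exercised on a nested payload:

```python
import json
from datetime import datetime

# Same recursion as the diff's helper: dicts and lists are walked,
# datetimes become ISO-8601 strings, everything else passes through.
def json_serializable(obj):
    if isinstance(obj, dict):
        return {k: json_serializable(v) for k, v in obj.items()}
    elif isinstance(obj, list):
        return [json_serializable(v) for v in obj]
    elif isinstance(obj, datetime):
        return obj.isoformat()
    return obj

params = {"started": datetime(2025, 1, 2, 3, 4, 5), "ids": [1, 2]}
print(json.dumps(json_serializable(params)))
```

Without the conversion, `json.dumps` would raise `TypeError: Object of type datetime is not JSON serializable`, which is exactly the failure this commit avoids.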
@@ -59,6 +71,9 @@ class TaskPersistenceService:
                 log_dict = log.dict()
                 if isinstance(log_dict.get('timestamp'), datetime):
                     log_dict['timestamp'] = log_dict['timestamp'].isoformat()
+                # Also clean up any datetimes in context
+                if log_dict.get('context'):
+                    log_dict['context'] = json_serializable(log_dict['context'])
                 record.logs.append(log_dict)

             # Extract error if failed
@@ -140,7 +140,16 @@ class APIClient:
         app_logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
         try:
             login_url = f"{self.base_url}/security/login"
+            # Log the payload keys and values (masking password)
+            masked_auth = {k: ("******" if k == "password" else v) for k, v in self.auth.items()}
+            app_logger.info(f"[authenticate][Debug] Login URL: {login_url}")
+            app_logger.info(f"[authenticate][Debug] Auth payload: {masked_auth}")
+
             response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
+
+            if response.status_code != 200:
+                app_logger.error(f"[authenticate][Error] Status: {response.status_code}, Response: {response.text}")
+
             response.raise_for_status()
             access_token = response.json()["access_token"]
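The debug logging added here masks only the password value before the payload is logged; every other key is passed through unchanged. A minimal demonstration of that dict comprehension (the payload keys are illustrative):

```python
# Mirrors the diff's logging change: only the "password" value is
# replaced with a mask, other keys are logged as-is.
auth = {"username": "admin", "password": "s3cret", "provider": "db"}
masked_auth = {k: ("******" if k == "password" else v) for k, v in auth.items()}
print(masked_auth)
```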
@@ -153,6 +162,9 @@ class APIClient:
             app_logger.info("[authenticate][Exit] Authenticated successfully.")
             return self._tokens
         except requests.exceptions.HTTPError as e:
+            status_code = e.response.status_code if e.response is not None else None
+            if status_code in [502, 503, 504]:
+                raise NetworkError(f"Environment unavailable during authentication (Status {status_code})", status_code=status_code) from e
             raise AuthenticationError(f"Authentication failed: {e}") from e
         except (requests.exceptions.RequestException, KeyError) as e:
             raise NetworkError(f"Network or parsing error during authentication: {e}") from e
@@ -209,6 +221,8 @@ class APIClient:
     def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
         with belief_scope("_handle_http_error"):
             status_code = e.response.status_code
+            if status_code == 502 or status_code == 503 or status_code == 504:
+                raise NetworkError(f"Environment unavailable (Status {status_code})", status_code=status_code) from e
             if status_code == 404: raise DashboardNotFoundError(endpoint) from e
             if status_code == 403: raise PermissionDeniedError() from e
             if status_code == 401: raise AuthenticationError() from e
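`_handle_http_error` now distinguishes gateway errors (502/503/504, treated as an unavailable environment) from the semantic 4xx cases. The mapping can be sketched as a pure function with local exception classes standing in for the project's real ones (names illustrative):

```python
# Status-code-to-exception mapping used by _handle_http_error,
# sketched with local exception classes (illustrative stand-ins).
class NetworkError(Exception): ...
class DashboardNotFoundError(Exception): ...
class PermissionDeniedError(Exception): ...
class AuthenticationError(Exception): ...

def classify(status_code):
    if status_code in (502, 503, 504):
        return NetworkError          # environment unavailable
    if status_code == 404:
        return DashboardNotFoundError
    if status_code == 403:
        return PermissionDeniedError
    if status_code == 401:
        return AuthenticationError
    return None                      # fall through to generic handling

print(classify(503).__name__)  # NetworkError
```

Checking the gateway codes first matters: a 503 would otherwise fall through to generic handling instead of being surfaced as an availability problem.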
@@ -1,16 +1,23 @@
 # [DEF:Dependencies:Module]
-# @SEMANTICS: dependency, injection, singleton, factory
+# @SEMANTICS: dependency, injection, singleton, factory, auth, jwt
 # @PURPOSE: Manages the creation and provision of shared application dependencies, such as the PluginLoader and TaskManager, to avoid circular imports.
 # @LAYER: Core
 # @RELATION: Used by the main app and API routers to get access to shared instances.

 from pathlib import Path
+from typing import Optional
+from fastapi import Depends, HTTPException, status
+from fastapi.security import OAuth2PasswordBearer
+from jose import JWTError
 from .core.plugin_loader import PluginLoader
 from .core.task_manager import TaskManager
 from .core.config_manager import ConfigManager
 from .core.scheduler import SchedulerService
-from .core.database import init_db
+from .core.database import init_db, get_auth_db
 from .core.logger import logger, belief_scope
+from .core.auth.jwt import decode_token
+from .core.auth.repository import AuthRepository
+from .models.auth import User

 # Initialize singletons
 # Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -28,8 +35,7 @@ init_db()
 # @RETURN: ConfigManager - The shared config manager instance.
 def get_config_manager() -> ConfigManager:
     """Dependency injector for the ConfigManager."""
-    with belief_scope("get_config_manager"):
-        return config_manager
+    return config_manager
 # [/DEF:get_config_manager:Function]

 plugin_dir = Path(__file__).parent / "plugins"
@@ -51,8 +57,7 @@ logger.info("SchedulerService initialized")
 # @RETURN: PluginLoader - The shared plugin loader instance.
 def get_plugin_loader() -> PluginLoader:
     """Dependency injector for the PluginLoader."""
-    with belief_scope("get_plugin_loader"):
-        return plugin_loader
+    return plugin_loader
 # [/DEF:get_plugin_loader:Function]

 # [DEF:get_task_manager:Function]
@@ -62,8 +67,7 @@ def get_plugin_loader() -> PluginLoader:
 # @RETURN: TaskManager - The shared task manager instance.
 def get_task_manager() -> TaskManager:
     """Dependency injector for the TaskManager."""
-    with belief_scope("get_task_manager"):
-        return task_manager
+    return task_manager
 # [/DEF:get_task_manager:Function]

 # [DEF:get_scheduler_service:Function]
@@ -73,8 +77,71 @@ def get_task_manager() -> TaskManager:
 # @RETURN: SchedulerService - The shared scheduler service instance.
 def get_scheduler_service() -> SchedulerService:
     """Dependency injector for the SchedulerService."""
-    with belief_scope("get_scheduler_service"):
-        return scheduler_service
+    return scheduler_service
 # [/DEF:get_scheduler_service:Function]

+# [DEF:oauth2_scheme:Variable]
+# @PURPOSE: OAuth2 password bearer scheme for token extraction.
+oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/auth/login")
+# [/DEF:oauth2_scheme:Variable]
+
+# [DEF:get_current_user:Function]
+# @PURPOSE: Dependency for retrieving the currently authenticated user from a JWT.
+# @PRE: JWT token provided in Authorization header.
+# @POST: Returns the User object if token is valid.
+# @THROW: HTTPException 401 if token is invalid or user not found.
+# @PARAM: token (str) - Extracted JWT token.
+# @PARAM: db (Session) - Auth database session.
+# @RETURN: User - The authenticated user.
+def get_current_user(token: str = Depends(oauth2_scheme), db = Depends(get_auth_db)):
+    credentials_exception = HTTPException(
+        status_code=status.HTTP_401_UNAUTHORIZED,
+        detail="Could not validate credentials",
+        headers={"WWW-Authenticate": "Bearer"},
+    )
+    try:
+        payload = decode_token(token)
+        username: str = payload.get("sub")
+        if username is None:
+            raise credentials_exception
+    except JWTError:
+        raise credentials_exception
+
+    repo = AuthRepository(db)
+    user = repo.get_user_by_username(username)
+    if user is None:
+        raise credentials_exception
+    return user
+# [/DEF:get_current_user:Function]
+
+# [DEF:has_permission:Function]
+# @PURPOSE: Dependency for checking if the current user has a specific permission.
+# @PRE: User is authenticated.
+# @POST: Returns True if user has permission.
+# @THROW: HTTPException 403 if permission is denied.
+# @PARAM: resource (str) - The resource identifier.
+# @PARAM: action (str) - The action identifier (READ, EXECUTE, WRITE).
+# @RETURN: User - The authenticated user if permission granted.
+def has_permission(resource: str, action: str):
+    def permission_checker(current_user: User = Depends(get_current_user)):
+        # Union of all permissions across all roles
+        for role in current_user.roles:
+            for perm in role.permissions:
+                if perm.resource == resource and perm.action == action:
+                    return current_user
+
+        # Special case for Admin role (full access)
+        if any(role.name == "Admin" for role in current_user.roles):
+            return current_user
+
+        from .core.auth.logger import log_security_event
+        log_security_event("PERMISSION_DENIED", current_user.username, {"resource": resource, "action": action})
+
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=f"Permission denied for {resource}:{action}"
+        )
+    return permission_checker
+# [/DEF:has_permission:Function]
+
 # [/DEF:Dependencies:Module]
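`has_permission` is a dependency factory: it closes over `resource` and `action` and returns a checker that grants access either via an explicit role permission or via the Admin wildcard. The closure pattern can be sketched without FastAPI or SQLAlchemy, using plain stand-ins for the models (all names here are illustrative):

```python
# Factory/closure pattern from has_permission, with plain stand-ins
# for User/Role/Permission (no FastAPI or SQLAlchemy involved).
class Perm:
    def __init__(self, resource, action):
        self.resource, self.action = resource, action

class Role:
    def __init__(self, name, perms):
        self.name, self.permissions = name, perms

class User:
    def __init__(self, roles):
        self.roles = roles

def has_permission(resource, action):
    def permission_checker(user):
        # Union of all permissions across all roles.
        for role in user.roles:
            for perm in role.permissions:
                if perm.resource == resource and perm.action == action:
                    return user
        # Admin role grants full access.
        if any(role.name == "Admin" for role in user.roles):
            return user
        raise PermissionError(f"Permission denied for {resource}:{action}")
    return permission_checker

checker = has_permission("plugin:backup", "EXECUTE")
operator = User([Role("Operator", [Perm("plugin:backup", "EXECUTE")])])
admin = User([Role("Admin", [])])
checker(operator)  # allowed via explicit permission
checker(admin)     # allowed via Admin wildcard
```

In the real module the returned checker is wired into routes with `Depends(has_permission(..., ...))`, so each endpoint declares its required resource/action pair declaratively.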
backend/src/models/auth.py (new file, 105 lines)
@@ -0,0 +1,105 @@
+# [DEF:backend.src.models.auth:Module]
+#
+# @TIER: STANDARD
+# @SEMANTICS: auth, models, user, role, permission, sqlalchemy
+# @PURPOSE: SQLAlchemy models for multi-user authentication and authorization.
+# @LAYER: Domain
+# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base
+#
+# @INVARIANT: Usernames and emails must be unique.
+
+# [SECTION: IMPORTS]
+import uuid
+from datetime import datetime
+from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Table, Enum
+from sqlalchemy.orm import relationship
+from .mapping import Base
+# [/SECTION]
+
+# [DEF:generate_uuid:Function]
+# @PURPOSE: Generates a unique UUID string.
+# @POST: Returns a string representation of a new UUID.
+def generate_uuid():
+    return str(uuid.uuid4())
+# [/DEF:generate_uuid:Function]
+
+# [DEF:user_roles:Table]
+# @PURPOSE: Association table for many-to-many relationship between Users and Roles.
+user_roles = Table(
+    "user_roles",
+    Base.metadata,
+    Column("user_id", String, ForeignKey("users.id"), primary_key=True),
+    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
+)
+# [/DEF:user_roles:Table]
+
+# [DEF:role_permissions:Table]
+# @PURPOSE: Association table for many-to-many relationship between Roles and Permissions.
+role_permissions = Table(
+    "role_permissions",
+    Base.metadata,
+    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
+    Column("permission_id", String, ForeignKey("permissions.id"), primary_key=True),
+)
+# [/DEF:role_permissions:Table]
+
+# [DEF:User:Class]
+# @PURPOSE: Represents an identity that can authenticate to the system.
+# @RELATION: HAS_MANY -> Role (via user_roles)
+class User(Base):
+    __tablename__ = "users"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    username = Column(String, unique=True, index=True, nullable=False)
+    email = Column(String, unique=True, index=True, nullable=True)
+    password_hash = Column(String, nullable=True)
+    auth_source = Column(String, default="LOCAL")  # LOCAL or ADFS
+    is_active = Column(Boolean, default=True)
+    created_at = Column(DateTime, default=datetime.utcnow)
+    last_login = Column(DateTime, nullable=True)
+
+    roles = relationship("Role", secondary=user_roles, back_populates="users")
+# [/DEF:User:Class]
+
+# [DEF:Role:Class]
+# @PURPOSE: Represents a collection of permissions.
+# @RELATION: HAS_MANY -> User (via user_roles)
+# @RELATION: HAS_MANY -> Permission (via role_permissions)
+class Role(Base):
+    __tablename__ = "roles"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    name = Column(String, unique=True, index=True, nullable=False)
+    description = Column(String, nullable=True)
+
+    users = relationship("User", secondary=user_roles, back_populates="roles")
+    permissions = relationship("Permission", secondary=role_permissions, back_populates="roles")
+# [/DEF:Role:Class]
+
+# [DEF:Permission:Class]
+# @PURPOSE: Represents a specific capability within the system.
+# @RELATION: HAS_MANY -> Role (via role_permissions)
+class Permission(Base):
+    __tablename__ = "permissions"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    resource = Column(String, nullable=False)  # e.g. "plugin:backup"
+    action = Column(String, nullable=False)  # e.g. "READ", "EXECUTE", "WRITE"
+
+    roles = relationship("Role", secondary=role_permissions, back_populates="permissions")
+# [/DEF:Permission:Class]
+
+# [DEF:ADGroupMapping:Class]
+# @PURPOSE: Maps an Active Directory group to a local System Role.
+# @RELATION: DEPENDS_ON -> Role
+class ADGroupMapping(Base):
+    __tablename__ = "ad_group_mappings"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    ad_group = Column(String, unique=True, index=True, nullable=False)
+    role_id = Column(String, ForeignKey("roles.id"), nullable=False)
+
+    role = relationship("Role")
+# [/DEF:ADGroupMapping:Class]
+
+# [/DEF:backend.src.models.auth:Module]
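The schema above links Users to Roles (and Roles to Permissions) through association tables. A minimal in-memory exercise of the user/role side of that pattern, with a self-contained `Base` rather than the project's `.mapping` one (column set trimmed for brevity):

```python
# Minimal in-memory exercise of the user_roles association pattern.
from sqlalchemy import create_engine, Column, String, ForeignKey, Table
from sqlalchemy.orm import declarative_base, relationship, Session

Base = declarative_base()

user_roles = Table(
    "user_roles", Base.metadata,
    Column("user_id", String, ForeignKey("users.id"), primary_key=True),
    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
)

class User(Base):
    __tablename__ = "users"
    id = Column(String, primary_key=True)
    username = Column(String, unique=True, nullable=False)
    roles = relationship("Role", secondary=user_roles, back_populates="users")

class Role(Base):
    __tablename__ = "roles"
    id = Column(String, primary_key=True)
    name = Column(String, unique=True, nullable=False)
    users = relationship("User", secondary=user_roles, back_populates="roles")

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
with Session(engine) as session:
    admin = Role(id="r1", name="Admin")
    alice = User(id="u1", username="alice", roles=[admin])
    session.add(alice)
    session.commit()
    names = [r.name for r in session.get(User, "u1").roles]
    print(names)
```

`back_populates` keeps both sides of the relationship in sync, which is what lets `has_permission` in `dependencies.py` iterate `current_user.roles` directly.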
@@ -1,4 +1,5 @@
 # [DEF:backend.src.models.dashboard:Module]
+# @TIER: STANDARD
 # @SEMANTICS: dashboard, model, metadata, migration
 # @PURPOSE: Defines data models for dashboard metadata and selection.
 # @LAYER: Model
@@ -8,6 +9,7 @@ from pydantic import BaseModel
 from typing import List

 # [DEF:DashboardMetadata:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents a dashboard available for migration.
 class DashboardMetadata(BaseModel):
     id: int
@@ -17,6 +19,7 @@ class DashboardMetadata(BaseModel):
 # [/DEF:DashboardMetadata:Class]

 # [DEF:DashboardSelection:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Represents the user's selection of dashboards to migrate.
 class DashboardSelection(BaseModel):
     selected_ids: List[int]
backend/src/models/llm.py (new file, 46 lines)
@@ -0,0 +1,46 @@
+# [DEF:backend.src.models.llm:Module]
+# @TIER: STANDARD
+# @SEMANTICS: llm, models, sqlalchemy, persistence
+# @PURPOSE: SQLAlchemy models for LLM provider configuration and validation results.
+# @LAYER: Domain
+# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base
+
+from sqlalchemy import Column, String, Boolean, DateTime, JSON, Enum, Text
+from datetime import datetime
+import uuid
+from .mapping import Base
+
+def generate_uuid():
+    return str(uuid.uuid4())
+
+# [DEF:LLMProvider:Class]
+# @PURPOSE: SQLAlchemy model for LLM provider configuration.
+class LLMProvider(Base):
+    __tablename__ = "llm_providers"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    provider_type = Column(String, nullable=False)  # openai, openrouter, kilo
+    name = Column(String, nullable=False)
+    base_url = Column(String, nullable=False)
+    api_key = Column(String, nullable=False)  # Should be encrypted
+    default_model = Column(String, nullable=False)
+    is_active = Column(Boolean, default=True)
+    created_at = Column(DateTime, default=datetime.utcnow)
+# [/DEF:LLMProvider:Class]
+
+# [DEF:ValidationRecord:Class]
+# @PURPOSE: SQLAlchemy model for dashboard validation history.
+class ValidationRecord(Base):
+    __tablename__ = "llm_validation_results"
+
+    id = Column(String, primary_key=True, default=generate_uuid)
+    dashboard_id = Column(String, nullable=False, index=True)
+    timestamp = Column(DateTime, default=datetime.utcnow)
+    status = Column(String, nullable=False)  # PASS, WARN, FAIL
+    screenshot_path = Column(String, nullable=True)
+    issues = Column(JSON, nullable=False)
+    summary = Column(Text, nullable=False)
+    raw_response = Column(Text, nullable=True)
+# [/DEF:ValidationRecord:Class]
+
+# [/DEF:backend.src.models.llm:Module]
@@ -1,5 +1,6 @@
 # [DEF:backend.src.models.mapping:Module]
 #
+# @TIER: STANDARD
 # @SEMANTICS: database, mapping, environment, migration, sqlalchemy, sqlite
 # @PURPOSE: Defines the database schema for environment metadata and database mappings using SQLAlchemy.
 # @LAYER: Domain
@@ -19,6 +20,7 @@ import enum
 Base = declarative_base()

 # [DEF:MigrationStatus:Class]
+# @TIER: TRIVIAL
 # @PURPOSE: Enumeration of possible migration job statuses.
 class MigrationStatus(enum.Enum):
     PENDING = "PENDING"
@@ -29,6 +31,7 @@ class MigrationStatus(enum.Enum):
 # [/DEF:MigrationStatus:Class]

 # [DEF:Environment:Class]
+# @TIER: STANDARD
 # @PURPOSE: Represents a Superset instance environment.
 class Environment(Base):
     __tablename__ = "environments"
@@ -40,6 +43,7 @@ class Environment(Base):
 # [/DEF:Environment:Class]

 # [DEF:DatabaseMapping:Class]
+# @TIER: STANDARD
 # @PURPOSE: Represents a mapping between source and target databases.
 class DatabaseMapping(Base):
     __tablename__ = "database_mappings"
@@ -75,6 +75,15 @@ class BackupPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the backup plugin.
+    # @RETURN: str - "/tools/backups"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/tools/backups"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for backup plugin parameters.
     # @PRE: Plugin instance exists.
@@ -63,6 +63,15 @@ class DebugPlugin(PluginBase):
         return "1.0.0"
     # [/DEF:version:Function]

+    @property
+    # [DEF:ui_route:Function]
+    # @PURPOSE: Returns the frontend route for the debug plugin.
+    # @RETURN: str - "/tools/debug"
+    def ui_route(self) -> str:
+        with belief_scope("ui_route"):
+            return "/tools/debug"
+    # [/DEF:ui_route:Function]
+
     # [DEF:get_schema:Function]
     # @PURPOSE: Returns the JSON schema for the debug plugin parameters.
     # @PRE: Plugin instance exists.
backend/src/plugins/git/llm_extension.py (new file, 66 lines)
@@ -0,0 +1,66 @@
|
|||||||
|
# [DEF:backend/src/plugins/git/llm_extension:Module]
|
||||||
|
# @TIER: STANDARD
|
||||||
|
# @SEMANTICS: git, llm, commit
|
||||||
|
# @PURPOSE: LLM-based extensions for the Git plugin, specifically for commit message generation.
|
||||||
|
# @LAYER: Domain
|
||||||
|
# @RELATION: DEPENDS_ON -> backend.src.plugins.llm_analysis.service.LLMClient
|
||||||
|
|
||||||
|
from typing import List, Optional
|
||||||
|
from tenacity import retry, stop_after_attempt, wait_exponential
|
||||||
|
from ..llm_analysis.service import LLMClient
|
||||||
|
from ..llm_analysis.models import LLMProviderType
|
||||||
|
from ...core.logger import belief_scope, logger
|
||||||
|
|
||||||
|
# [DEF:GitLLMExtension:Class]
|
||||||
|
# @PURPOSE: Provides LLM capabilities to the Git plugin.
|
||||||
class GitLLMExtension:
    def __init__(self, client: LLMClient):
        self.client = client

    # [DEF:suggest_commit_message:Function]
    # @PURPOSE: Generates a suggested commit message based on a diff and history.
    # @PARAM: diff (str) - The git diff of staged changes.
    # @PARAM: history (List[str]) - Recent commit messages for context.
    # @RETURN: str - The suggested commit message.
    @retry(
        stop=stop_after_attempt(2),
        wait=wait_exponential(multiplier=1, min=2, max=10),
        reraise=True
    )
    async def suggest_commit_message(self, diff: str, history: List[str]) -> str:
        with belief_scope("suggest_commit_message"):
            history_text = "\n".join(history)
            prompt = f"""
Generate a concise and professional git commit message based on the following diff and recent history.
Use Conventional Commits format (e.g., feat: ..., fix: ..., docs: ...).

Recent History:
{history_text}

Diff:
{diff}

Commit Message:
"""

            logger.debug(f"[suggest_commit_message] Calling LLM with model: {self.client.default_model}")
            response = await self.client.client.chat.completions.create(
                model=self.client.default_model,
                messages=[{"role": "user", "content": prompt}],
                temperature=0.7
            )

            logger.debug(f"[suggest_commit_message] LLM Response: {response}")

            if not response or not hasattr(response, 'choices') or not response.choices:
                error_info = getattr(response, 'error', 'No choices in response')
                logger.error(f"[suggest_commit_message] Invalid LLM response. Error info: {error_info}")

                # A timeout/provider error could be re-raised here to trigger the retry
                # decorator, but for now we return a safe fallback to avoid a UI crash.
                return "Update dashboard configurations (LLM generation failed)"

            return response.choices[0].message.content.strip()
    # [/DEF:suggest_commit_message:Function]
# [/DEF:GitLLMExtension:Class]

# [/DEF:backend/src/plugins/git/llm_extension:Module]
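As a side note on the retry policy above: tenacity's `wait_exponential(multiplier=1, min=2, max=10)` roughly doubles the wait on each attempt and clamps it into the `[min, max]` window. A minimal sketch of that schedule (an approximation; tenacity's exact attempt indexing may differ between versions):

```python
# Approximate sketch of tenacity's wait_exponential(multiplier=1, min=2, max=10).
# Assumption: wait = clamp(multiplier * 2**attempt, min_s, max_s), attempt starting at 1.
def backoff_schedule(attempts: int, multiplier: float = 1, min_s: float = 2, max_s: float = 10):
    waits = []
    for attempt in range(1, attempts + 1):
        raw = multiplier * (2 ** attempt)
        waits.append(max(min_s, min(raw, max_s)))  # clamp into [min_s, max_s]
    return waits
```

For four attempts this yields `[2, 4, 8, 10]`: doubling until the 10-second ceiling cuts it off, which is why a second attempt in the decorator above waits at most a few seconds.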
@@ -99,6 +99,15 @@ class GitPlugin(PluginBase):
        return "0.1.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the git plugin.
    # @RETURN: str - "/git"
    def ui_route(self) -> str:
        with belief_scope("GitPlugin.ui_route"):
            return "/git"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema of parameters for executing the plugin's tasks.
    # @PRE: GitPlugin is initialized.
backend/src/plugins/llm_analysis/__init__.py (Normal file, 12 lines)
@@ -0,0 +1,12 @@
# [DEF:backend/src/plugins/llm_analysis/__init__.py:Module]
# @TIER: TRIVIAL
# @PURPOSE: Initialize the LLM Analysis plugin package.
# @LAYER: Domain

"""
LLM Analysis Plugin for automated dashboard validation and dataset documentation.
"""

from .plugin import DashboardValidationPlugin, DocumentationPlugin

# [/DEF:backend/src/plugins/llm_analysis/__init__.py:Module]
backend/src/plugins/llm_analysis/models.py (Normal file, 61 lines)
@@ -0,0 +1,61 @@
# [DEF:backend/src/plugins/llm_analysis/models.py:Module]
# @TIER: STANDARD
# @SEMANTICS: pydantic, models, llm
# @PURPOSE: Define Pydantic models for LLM Analysis plugin.
# @LAYER: Domain

from typing import List, Optional
from pydantic import BaseModel, Field
from datetime import datetime
from enum import Enum

# [DEF:LLMProviderType:Class]
# @PURPOSE: Enum for supported LLM providers.
class LLMProviderType(str, Enum):
    OPENAI = "openai"
    OPENROUTER = "openrouter"
    KILO = "kilo"
# [/DEF:LLMProviderType:Class]

# [DEF:LLMProviderConfig:Class]
# @PURPOSE: Configuration for an LLM provider.
class LLMProviderConfig(BaseModel):
    id: Optional[str] = None
    provider_type: LLMProviderType
    name: str
    base_url: str
    api_key: Optional[str] = None
    default_model: str
    is_active: bool = True
# [/DEF:LLMProviderConfig:Class]

# [DEF:ValidationStatus:Class]
# @PURPOSE: Enum for dashboard validation status.
class ValidationStatus(str, Enum):
    PASS = "PASS"
    WARN = "WARN"
    FAIL = "FAIL"
# [/DEF:ValidationStatus:Class]

# [DEF:DetectedIssue:Class]
# @PURPOSE: Model for a single issue detected during validation.
class DetectedIssue(BaseModel):
    severity: ValidationStatus
    message: str
    location: Optional[str] = None
# [/DEF:DetectedIssue:Class]

# [DEF:ValidationResult:Class]
# @PURPOSE: Model for dashboard validation result.
class ValidationResult(BaseModel):
    id: Optional[str] = None
    dashboard_id: str
    timestamp: datetime = Field(default_factory=datetime.utcnow)
    status: ValidationStatus
    screenshot_path: Optional[str] = None
    issues: List[DetectedIssue]
    summary: str
    raw_response: Optional[str] = None
# [/DEF:ValidationResult:Class]

# [/DEF:backend/src/plugins/llm_analysis/models.py:Module]
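How these models compose can be sketched without pydantic. The helper below is hypothetical (it is not part of the plugin): it derives an overall `ValidationStatus` from the worst `DetectedIssue` severity, mirroring the PASS/WARN/FAIL enum above with only the stdlib; the severity ordering is an assumption for illustration.

```python
from enum import Enum

# Stdlib mirror of ValidationStatus from models.py.
class ValidationStatus(str, Enum):
    PASS = "PASS"
    WARN = "WARN"
    FAIL = "FAIL"

# Hypothetical severity ordering (assumption, not defined by the models above).
_SEVERITY_RANK = {ValidationStatus.PASS: 0, ValidationStatus.WARN: 1, ValidationStatus.FAIL: 2}

# Hypothetical helper: pick the worst severity among the detected issues,
# defaulting to PASS when no issues were found.
def overall_status(issue_severities):
    if not issue_severities:
        return ValidationStatus.PASS
    return max(issue_severities, key=_SEVERITY_RANK.__getitem__)
```

For example, `overall_status([ValidationStatus.WARN, ValidationStatus.FAIL])` resolves to `FAIL`, matching the intuition that one failing issue fails the dashboard.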
backend/src/plugins/llm_analysis/plugin.py (Normal file, 377 lines)
@@ -0,0 +1,377 @@
# [DEF:backend/src/plugins/llm_analysis/plugin.py:Module]
# @TIER: STANDARD
# @SEMANTICS: plugin, llm, analysis, documentation
# @PURPOSE: Implements DashboardValidationPlugin and DocumentationPlugin.
# @LAYER: Domain
# @RELATION: INHERITS -> backend.src.core.plugin_base.PluginBase
# @RELATION: CALLS -> backend.src.plugins.llm_analysis.service.ScreenshotService
# @RELATION: CALLS -> backend.src.plugins.llm_analysis.service.LLMClient
# @RELATION: CALLS -> backend.src.services.llm_provider.LLMProviderService
# @INVARIANT: All LLM interactions must be executed as asynchronous tasks.

from typing import Dict, Any, Optional, List
import os
import json
import logging
from datetime import datetime, timedelta
from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...core.database import SessionLocal
from ...core.config_manager import ConfigManager
from ...services.llm_provider import LLMProviderService
from ...core.superset_client import SupersetClient
from .service import ScreenshotService, LLMClient
from .models import LLMProviderType, ValidationStatus, ValidationResult, DetectedIssue
from ...models.llm import ValidationRecord

# [DEF:DashboardValidationPlugin:Class]
# @PURPOSE: Plugin for automated dashboard health analysis using LLMs.
# @RELATION: IMPLEMENTS -> backend.src.core.plugin_base.PluginBase
class DashboardValidationPlugin(PluginBase):
    @property
    def id(self) -> str:
        return "llm_dashboard_validation"

    @property
    def name(self) -> str:
        return "Dashboard LLM Validation"

    @property
    def description(self) -> str:
        return "Automated dashboard health analysis using multimodal LLMs."

    @property
    def version(self) -> str:
        return "1.0.0"

    def get_schema(self) -> Dict[str, Any]:
        return {
            "type": "object",
            "properties": {
                "dashboard_id": {"type": "string", "title": "Dashboard ID"},
                "environment_id": {"type": "string", "title": "Environment ID"},
                "provider_id": {"type": "string", "title": "LLM Provider ID"}
            },
            "required": ["dashboard_id", "environment_id", "provider_id"]
        }

    # [DEF:DashboardValidationPlugin.execute:Function]
    # @PURPOSE: Executes the dashboard validation task.
    # @PRE: params contains dashboard_id, environment_id, and provider_id.
    # @POST: Returns a dictionary with validation results and persists them to the database.
    # @SIDE_EFFECT: Captures a screenshot, calls LLM API, and writes to the database.
    async def execute(self, params: Dict[str, Any]):
        with belief_scope("execute", f"plugin_id={self.id}"):
            logger.info(f"Executing {self.name} with params: {params}")

            dashboard_id = params.get("dashboard_id")
            env_id = params.get("environment_id")
            provider_id = params.get("provider_id")
            task_id = params.get("_task_id")

            # Helper to log to both the app logger and the task manager logs
            def task_log(level: str, message: str, context: Optional[Dict] = None):
                logger.log(getattr(logging, level.upper()), message)
                if task_id:
                    from ...dependencies import get_task_manager
                    try:
                        tm = get_task_manager()
                        tm._add_log(task_id, level.upper(), message, context)
                    except Exception:
                        pass

            db = SessionLocal()
            try:
                # 1. Get Environment
                from ...dependencies import get_config_manager
                config_mgr = get_config_manager()
                env = config_mgr.get_environment(env_id)
                if not env:
                    raise ValueError(f"Environment {env_id} not found")

                # 2. Get LLM Provider
                llm_service = LLMProviderService(db)
                db_provider = llm_service.get_provider(provider_id)
                if not db_provider:
                    raise ValueError(f"LLM Provider {provider_id} not found")

                logger.info("[DashboardValidationPlugin.execute] Retrieved provider config:")
                logger.info(f"[DashboardValidationPlugin.execute] Provider ID: {db_provider.id}")
                logger.info(f"[DashboardValidationPlugin.execute] Provider Name: {db_provider.name}")
                logger.info(f"[DashboardValidationPlugin.execute] Provider Type: {db_provider.provider_type}")
                logger.info(f"[DashboardValidationPlugin.execute] Base URL: {db_provider.base_url}")
                logger.info(f"[DashboardValidationPlugin.execute] Default Model: {db_provider.default_model}")
                logger.info(f"[DashboardValidationPlugin.execute] Is Active: {db_provider.is_active}")

                api_key = llm_service.get_decrypted_api_key(provider_id)
                logger.info(f"[DashboardValidationPlugin.execute] API Key decrypted (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")
                logger.info(f"[DashboardValidationPlugin.execute] API Key Length: {len(api_key) if api_key else 0}")

                # Check whether the API key was successfully decrypted
                if not api_key:
                    raise ValueError(
                        f"Failed to decrypt API key for provider {provider_id}. "
                        f"The provider may have been encrypted with a different encryption key. "
                        f"Please update the provider with a new API key through the UI."
                    )

                # 3. Capture Screenshot
                screenshot_service = ScreenshotService(env)

                storage_root = config_mgr.get_config().settings.storage.root_path
                screenshots_dir = os.path.join(storage_root, "screenshots")
                os.makedirs(screenshots_dir, exist_ok=True)

                filename = f"{dashboard_id}_{datetime.now().strftime('%Y%m%d_%H%M%S')}.png"
                screenshot_path = os.path.join(screenshots_dir, filename)

                await screenshot_service.capture_dashboard(dashboard_id, screenshot_path)

                # 4. Fetch Logs (from Environment /api/v1/log/)
                logs = []
                try:
                    client = SupersetClient(env)

                    # Calculate the time window (last 24 hours)
                    start_time = (datetime.now() - timedelta(hours=24)).isoformat()

                    # Construct the filter for logs.
                    # Note: we filter by dashboard_id matching the object.
                    query_params = {
                        "filters": [
                            {"col": "dashboard_id", "opr": "eq", "value": dashboard_id},
                            {"col": "dttm", "opr": "gt", "value": start_time}
                        ],
                        "order_column": "dttm",
                        "order_direction": "desc",
                        "page": 0,
                        "page_size": 100
                    }

                    response = client.network.request(
                        method="GET",
                        endpoint="/log/",
                        params={"q": json.dumps(query_params)}
                    )

                    if isinstance(response, dict) and "result" in response:
                        for item in response["result"]:
                            action = item.get("action", "unknown")
                            dttm = item.get("dttm", "")
                            details = item.get("json", "")
                            logs.append(f"[{dttm}] {action}: {details}")

                    if not logs:
                        logs = ["No recent logs found for this dashboard."]

                except Exception as e:
                    logger.warning(f"Failed to fetch logs from environment: {e}")
                    logs = [f"Error fetching remote logs: {str(e)}"]

                # 5. Analyze with LLM
                llm_client = LLMClient(
                    provider_type=LLMProviderType(db_provider.provider_type),
                    api_key=api_key,
                    base_url=db_provider.base_url,
                    default_model=db_provider.default_model
                )

                analysis = await llm_client.analyze_dashboard(screenshot_path, logs)

                # Log the analysis summary to task logs for better visibility
                task_log("INFO", f"[ANALYSIS_SUMMARY] Status: {analysis['status']}")
                task_log("INFO", f"[ANALYSIS_SUMMARY] Summary: {analysis['summary']}")
                if analysis.get("issues"):
                    for i, issue in enumerate(analysis["issues"]):
                        task_log("INFO", f"[ANALYSIS_ISSUE][{i+1}] {issue.get('severity')}: {issue.get('message')} (Location: {issue.get('location', 'N/A')})")

                # 6. Persist Result
                validation_result = ValidationResult(
                    dashboard_id=dashboard_id,
                    status=ValidationStatus(analysis["status"]),
                    summary=analysis["summary"],
                    issues=[DetectedIssue(**issue) for issue in analysis["issues"]],
                    screenshot_path=screenshot_path,
                    raw_response=str(analysis)
                )

                db_record = ValidationRecord(
                    dashboard_id=validation_result.dashboard_id,
                    status=validation_result.status.value,
                    summary=validation_result.summary,
                    issues=[issue.dict() for issue in validation_result.issues],
                    screenshot_path=validation_result.screenshot_path,
                    raw_response=validation_result.raw_response
                )
                db.add(db_record)
                db.commit()

                # 7. Notification on failure (US1 / FR-015)
                if validation_result.status == ValidationStatus.FAIL:
                    task_log("WARNING", f"Dashboard {dashboard_id} validation FAILED. Summary: {validation_result.summary}")
                    # Placeholder for Email/Pulse notification dispatch.
                    # In a real implementation, a NotificationService would be called here
                    # with a payload containing the summary and a link to the report.

                # Final log to ensure all analysis is visible in task logs
                task_log("INFO", f"Validation completed for dashboard {dashboard_id}. Status: {validation_result.status.value}")

                return validation_result.dict()

            finally:
                db.close()
    # [/DEF:DashboardValidationPlugin.execute:Function]
# [/DEF:DashboardValidationPlugin:Class]

# [DEF:DocumentationPlugin:Class]
# @PURPOSE: Plugin for automated dataset documentation using LLMs.
# @RELATION: IMPLEMENTS -> backend.src.core.plugin_base.PluginBase
class DocumentationPlugin(PluginBase):
    @property
    def id(self) -> str:
        return "llm_documentation"

    @property
    def name(self) -> str:
        return "Dataset LLM Documentation"

    @property
    def description(self) -> str:
        return "Automated dataset and column documentation using LLMs."

    @property
    def version(self) -> str:
        return "1.0.0"

    def get_schema(self) -> Dict[str, Any]:
        return {
            "type": "object",
            "properties": {
                "dataset_id": {"type": "string", "title": "Dataset ID"},
                "environment_id": {"type": "string", "title": "Environment ID"},
                "provider_id": {"type": "string", "title": "LLM Provider ID"}
            },
            "required": ["dataset_id", "environment_id", "provider_id"]
        }

    # [DEF:DocumentationPlugin.execute:Function]
    # @PURPOSE: Executes the dataset documentation task.
    # @PRE: params contains dataset_id, environment_id, and provider_id.
    # @POST: Returns generated documentation and updates the dataset in Superset.
    # @SIDE_EFFECT: Calls LLM API and updates dataset metadata in Superset.
    async def execute(self, params: Dict[str, Any]):
        with belief_scope("execute", f"plugin_id={self.id}"):
            logger.info(f"Executing {self.name} with params: {params}")

            dataset_id = params.get("dataset_id")
            env_id = params.get("environment_id")
            provider_id = params.get("provider_id")

            db = SessionLocal()
            try:
                # 1. Get Environment
                from ...dependencies import get_config_manager
                config_mgr = get_config_manager()
                env = config_mgr.get_environment(env_id)
                if not env:
                    raise ValueError(f"Environment {env_id} not found")

                # 2. Get LLM Provider
                llm_service = LLMProviderService(db)
                db_provider = llm_service.get_provider(provider_id)
                if not db_provider:
                    raise ValueError(f"LLM Provider {provider_id} not found")

                logger.info("[DocumentationPlugin.execute] Retrieved provider config:")
                logger.info(f"[DocumentationPlugin.execute] Provider ID: {db_provider.id}")
                logger.info(f"[DocumentationPlugin.execute] Provider Name: {db_provider.name}")
                logger.info(f"[DocumentationPlugin.execute] Provider Type: {db_provider.provider_type}")
                logger.info(f"[DocumentationPlugin.execute] Base URL: {db_provider.base_url}")
                logger.info(f"[DocumentationPlugin.execute] Default Model: {db_provider.default_model}")
                logger.info(f"[DocumentationPlugin.execute] Is Active: {db_provider.is_active}")

                api_key = llm_service.get_decrypted_api_key(provider_id)
                logger.info(f"[DocumentationPlugin.execute] API Key decrypted (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")
                logger.info(f"[DocumentationPlugin.execute] API Key Length: {len(api_key) if api_key else 0}")

                # Check whether the API key was successfully decrypted
                if not api_key:
                    raise ValueError(
                        f"Failed to decrypt API key for provider {provider_id}. "
                        f"The provider may have been encrypted with a different encryption key. "
                        f"Please update the provider with a new API key through the UI."
                    )

                # 3. Fetch Metadata (US2 / T024)
                from ...core.superset_client import SupersetClient
                client = SupersetClient(env)

                # Optimistic locking check (T045)
                dataset = client.get_dataset(int(dataset_id))
                # The dataset structure might vary; ensure we read the right field
                original_changed_on = dataset.get("changed_on_utc") or dataset.get("result", {}).get("changed_on_utc")

                # Extract columns and existing descriptions
                columns_data = []
                for col in dataset.get("columns", []):
                    columns_data.append({
                        "name": col.get("column_name"),
                        "type": col.get("type"),
                        "description": col.get("description")
                    })

                # 4. Construct Prompt & Analyze (US2 / T025)
                llm_client = LLMClient(
                    provider_type=LLMProviderType(db_provider.provider_type),
                    api_key=api_key,
                    base_url=db_provider.base_url,
                    default_model=db_provider.default_model
                )

                prompt = f"""
Generate professional documentation for the following dataset and its columns.
Dataset: {dataset.get('table_name')}
Columns: {columns_data}

Provide the documentation in JSON format:
{{
    "dataset_description": "General description of the dataset",
    "column_descriptions": [
        {{
            "name": "column_name",
            "description": "Generated description"
        }}
    ]
}}
"""

                # Using a generic chat completion for the text-only US2 flow;
                # we use the shared get_json_completion method from LLMClient.
                doc_result = await llm_client.get_json_completion([{"role": "user", "content": prompt}])

                # 5. Update Metadata (US2 / T026)
                # This would normally live in mapping_service, but the logic is implemented
                # here for the plugin flow. We update the dataset in Superset directly.
                update_payload = {
                    "description": doc_result["dataset_description"],
                    "columns": []
                }

                # Map generated descriptions back to column IDs
                for col_doc in doc_result["column_descriptions"]:
                    for col in dataset.get("columns", []):
                        if col.get("column_name") == col_doc["name"]:
                            update_payload["columns"].append({
                                "id": col.get("id"),
                                "description": col_doc["description"]
                            })

                client.update_dataset(int(dataset_id), update_payload)

                return doc_result

            finally:
                db.close()
    # [/DEF:DocumentationPlugin.execute:Function]
# [/DEF:DocumentationPlugin:Class]

# [/DEF:backend/src/plugins/llm_analysis/plugin.py:Module]
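The name-to-ID mapping step at the end of `DocumentationPlugin.execute` is easy to isolate as a pure function. A sketch (`map_descriptions` is an illustrative name, not part of the plugin):

```python
# Hypothetical helper isolating the mapping step from DocumentationPlugin.execute:
# match LLM-generated descriptions back to Superset column IDs by column name.
def map_descriptions(columns, column_descriptions):
    payload_columns = []
    for col_doc in column_descriptions:
        for col in columns:
            if col.get("column_name") == col_doc["name"]:
                payload_columns.append({
                    "id": col.get("id"),
                    "description": col_doc["description"]
                })
    return payload_columns
```

Columns the LLM did not document are simply omitted from the payload, so an incomplete LLM response degrades gracefully instead of overwriting existing descriptions with blanks.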
backend/src/plugins/llm_analysis/scheduler.py (Normal file, 60 lines)
@@ -0,0 +1,60 @@
# [DEF:backend/src/plugins/llm_analysis/scheduler.py:Module]
# @TIER: STANDARD
# @SEMANTICS: scheduler, task, automation
# @PURPOSE: Provides helper functions to schedule LLM-based validation tasks.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.core.scheduler

from typing import Dict, Any
from ...dependencies import get_task_manager, get_scheduler_service
from ...core.logger import belief_scope, logger

# [DEF:schedule_dashboard_validation:Function]
# @PURPOSE: Schedules a recurring dashboard validation task.
# @PARAM: dashboard_id (str) - ID of the dashboard to validate.
# @PARAM: cron_expression (str) - Standard cron expression for scheduling.
# @PARAM: params (Dict[str, Any]) - Task parameters (environment_id, provider_id).
# @SIDE_EFFECT: Adds a job to the scheduler service.
def schedule_dashboard_validation(dashboard_id: str, cron_expression: str, params: Dict[str, Any]):
    with belief_scope("schedule_dashboard_validation", f"dashboard_id={dashboard_id}"):
        scheduler = get_scheduler_service()
        task_manager = get_task_manager()

        job_id = f"llm_val_{dashboard_id}"

        async def job_func():
            await task_manager.create_task(
                plugin_id="llm_dashboard_validation",
                params={
                    "dashboard_id": dashboard_id,
                    **params
                }
            )

        scheduler.add_job(
            job_func,
            "cron",
            id=job_id,
            replace_existing=True,
            **_parse_cron(cron_expression)
        )
        logger.info(f"Scheduled validation for dashboard {dashboard_id} with cron {cron_expression}")
# [/DEF:schedule_dashboard_validation:Function]

# [DEF:_parse_cron:Function]
# @PURPOSE: Basic cron parser placeholder.
# @PARAM: cron (str) - Cron expression.
# @RETURN: Dict[str, str] - Parsed cron parts.
def _parse_cron(cron: str) -> Dict[str, str]:
    # Basic cron parser placeholder
    parts = cron.split()
    if len(parts) != 5:
        return {}
    return {
        "minute": parts[0],
        "hour": parts[1],
        "day": parts[2],
        "month": parts[3],
        "day_of_week": parts[4]
    }

# [/DEF:backend/src/plugins/llm_analysis/scheduler.py:Module]
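`_parse_cron` simply splits a five-field cron string into APScheduler-style keyword arguments. A standalone copy for illustration:

```python
# Standalone copy of _parse_cron for illustration: split a 5-field cron
# expression into APScheduler-style keyword arguments.
def parse_cron(cron: str) -> dict:
    parts = cron.split()
    if len(parts) != 5:
        return {}  # malformed expression -> no kwargs, add_job gets no cron fields
    return dict(zip(["minute", "hour", "day", "month", "day_of_week"], parts))
```

For example, `parse_cron("0 6 * * 1")` yields `{"minute": "0", "hour": "6", "day": "*", "month": "*", "day_of_week": "1"}` (every Monday at 06:00). Note the empty-dict fallback is silent: a malformed expression schedules the job with scheduler defaults rather than raising, which may be worth hardening.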
backend/src/plugins/llm_analysis/service.py (Normal file, 629 lines)
@@ -0,0 +1,629 @@
# [DEF:backend/src/plugins/llm_analysis/service.py:Module]
# @TIER: STANDARD
# @SEMANTICS: service, llm, screenshot, playwright, openai
# @PURPOSE: Services for LLM interaction and dashboard screenshots.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> playwright
# @RELATION: DEPENDS_ON -> openai
# @RELATION: DEPENDS_ON -> tenacity
# @INVARIANT: Screenshots must be 1920px width and capture full page height.

import asyncio
import base64
import json
import io
from typing import List, Optional, Dict, Any
from PIL import Image
from playwright.async_api import async_playwright
from openai import AsyncOpenAI, RateLimitError, AuthenticationError as OpenAIAuthenticationError
from tenacity import retry, stop_after_attempt, wait_exponential, retry_if_exception
from .models import LLMProviderType, ValidationResult, ValidationStatus, DetectedIssue
from ...core.logger import belief_scope, logger
from ...core.config_models import Environment

# [DEF:ScreenshotService:Class]
# @PURPOSE: Handles capturing screenshots of Superset dashboards.
class ScreenshotService:
    # [DEF:ScreenshotService.__init__:Function]
    # @PURPOSE: Initializes the ScreenshotService with environment configuration.
    # @PRE: env is a valid Environment object.
    def __init__(self, env: Environment):
        self.env = env
    # [/DEF:ScreenshotService.__init__:Function]

    # [DEF:ScreenshotService.capture_dashboard:Function]
    # @PURPOSE: Captures a full-page screenshot of a dashboard using Playwright and CDP.
    # @PRE: dashboard_id is a valid string, output_path is a writable path.
    # @POST: Returns True if the screenshot is saved successfully.
    # @SIDE_EFFECT: Launches a browser, performs UI login, switches tabs, and writes a PNG file.
    # @UX_STATE: [Navigating] -> Loading dashboard UI
    # @UX_STATE: [TabSwitching] -> Iterating through dashboard tabs to trigger lazy loading
    # @UX_STATE: [CalculatingHeight] -> Determining dashboard dimensions
    # @UX_STATE: [Capturing] -> Executing CDP screenshot
    async def capture_dashboard(self, dashboard_id: str, output_path: str) -> bool:
        with belief_scope("capture_dashboard", f"dashboard_id={dashboard_id}"):
            logger.info(f"Capturing screenshot for dashboard {dashboard_id}")
            async with async_playwright() as p:
                browser = await p.chromium.launch(
                    headless=True,
                    args=[
                        "--disable-blink-features=AutomationControlled",
                        "--disable-infobars",
                        "--no-sandbox"
                    ]
                )
                # Set a realistic user agent to avoid 403 Forbidden from OpenResty/WAF
                user_agent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
                # Construct the base UI URL from the environment (strip the /api/v1 suffix)
                base_ui_url = self.env.url.rstrip("/")
                if base_ui_url.endswith("/api/v1"):
                    base_ui_url = base_ui_url[:-len("/api/v1")]

                # Create a browser context with realistic headers
                context = await browser.new_context(
                    viewport={'width': 1280, 'height': 720},
                    user_agent=user_agent,
                    extra_http_headers={
                        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
                        "Accept-Language": "ru-RU,ru;q=0.9,en-US;q=0.8,en;q=0.7",
                        "Upgrade-Insecure-Requests": "1",
                        "Sec-Fetch-Dest": "document",
                        "Sec-Fetch-Mode": "navigate",
                        "Sec-Fetch-Site": "none",
                        "Sec-Fetch-User": "?1"
                    }
                )
                logger.info("Browser context created successfully")

                page = await context.new_page()
                # Bypass navigator.webdriver detection
                await page.add_init_script("delete Object.getPrototypeOf(navigator).webdriver")

                # 1. Navigate to the login page and authenticate
                login_url = f"{base_ui_url.rstrip('/')}/login/"
                logger.info(f"[DEBUG] Navigating to login page: {login_url}")

                response = await page.goto(login_url, wait_until="networkidle", timeout=60000)
                if response:
                    logger.info(f"[DEBUG] Login page response status: {response.status}")

                # Wait for the login form to be ready
                await page.wait_for_load_state("domcontentloaded")

                # More exhaustive list of selectors for various Superset versions/themes
                selectors = {
                    "username": ['input[name="username"]', 'input#username', 'input[placeholder*="Username"]', 'input[type="text"]'],
                    "password": ['input[name="password"]', 'input#password', 'input[placeholder*="Password"]', 'input[type="password"]'],
                    "submit": ['button[type="submit"]', 'button#submit', '.btn-primary', 'input[type="submit"]']
                }
                logger.info("[DEBUG] Attempting to find login form elements...")

                try:
                    # Find and fill username
                    u_selector = None
                    for s in selectors["username"]:
                        count = await page.locator(s).count()
                        logger.info(f"[DEBUG] Selector '{s}': {count} elements found")
                        if count > 0:
                            u_selector = s
                            break

                    if not u_selector:
                        # Log all input fields on the page for debugging
                        all_inputs = await page.locator('input').all()
                        logger.info(f"[DEBUG] Found {len(all_inputs)} input fields on page")
                        for i, inp in enumerate(all_inputs[:5]):  # Log first 5
                            inp_type = await inp.get_attribute('type')
                            inp_name = await inp.get_attribute('name')
                            inp_id = await inp.get_attribute('id')
                            logger.info(f"[DEBUG] Input {i}: type={inp_type}, name={inp_name}, id={inp_id}")
                        raise RuntimeError("Could not find username input field on login page")

                    logger.info(f"[DEBUG] Filling username field with selector: {u_selector}")
                    await page.fill(u_selector, self.env.username)

                    # Find and fill password
                    p_selector = None
                    for s in selectors["password"]:
                        if await page.locator(s).count() > 0:
                            p_selector = s
                            break

                    if not p_selector:
                        raise RuntimeError("Could not find password input field on login page")

                    logger.info(f"[DEBUG] Filling password field with selector: {p_selector}")
                    await page.fill(p_selector, self.env.password)

                    # Click submit
                    s_selector = selectors["submit"][0]
                    for s in selectors["submit"]:
                        if await page.locator(s).count() > 0:
                            s_selector = s
                            break

                    logger.info(f"[DEBUG] Clicking submit button with selector: {s_selector}")
                    await page.click(s_selector)

                    # Wait for navigation after login
                    await page.wait_for_load_state("networkidle", timeout=30000)

                    # Check if login was successful
                    if "/login" in page.url:
                        # Check for error messages on the page
                        error_msg = await page.locator(".alert-danger, .error-message").text_content() if await page.locator(".alert-danger, .error-message").count() > 0 else "Unknown error"
                        logger.error(f"[DEBUG] Login failed. Still on login page. Error: {error_msg}")
                        debug_path = output_path.replace(".png", "_debug_failed_login.png")
                        await page.screenshot(path=debug_path)
                        raise RuntimeError(f"Login failed: {error_msg}. Debug screenshot saved to {debug_path}")

                    logger.info(f"[DEBUG] Login successful. Current URL: {page.url}")

                    # Check cookies after successful login
                    page_cookies = await context.cookies()
                    logger.info(f"[DEBUG] Cookies after login: {len(page_cookies)}")
                    for c in page_cookies:
                        logger.info(f"[DEBUG] Cookie: name={c['name']}, domain={c['domain']}, value={c.get('value', '')[:20]}...")

                except Exception as e:
                    page_title = await page.title()
                    logger.error(f"UI Login failed. Page title: {page_title}, URL: {page.url}, Error: {str(e)}")
                    debug_path = output_path.replace(".png", "_debug_failed_login.png")
                    await page.screenshot(path=debug_path)
                    raise RuntimeError(f"Login failed: {str(e)}. Debug screenshot saved to {debug_path}")

                # 2. Navigate to the dashboard
                # @UX_STATE: [Navigating] -> Loading dashboard UI
                dashboard_url = f"{base_ui_url.rstrip('/')}/superset/dashboard/{dashboard_id}/?standalone=true"

                if base_ui_url.startswith("https://") and dashboard_url.startswith("http://"):
                    dashboard_url = dashboard_url.replace("http://", "https://")
|
||||||
|
|
||||||
|
logger.info(f"[DEBUG] Navigating to dashboard: {dashboard_url}")
|
||||||
|
|
||||||
|
# Use networkidle to ensure all initial assets are loaded
|
||||||
|
response = await page.goto(dashboard_url, wait_until="networkidle", timeout=60000)
|
||||||
|
|
||||||
|
if response:
|
||||||
|
logger.info(f"[DEBUG] Dashboard navigation response status: {response.status}, URL: {response.url}")
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Wait for the dashboard grid to be present
|
||||||
|
await page.wait_for_selector('.dashboard-component, .dashboard-header, [data-test="dashboard-grid"]', timeout=30000)
|
||||||
|
logger.info(f"[DEBUG] Dashboard container loaded")
|
||||||
|
|
||||||
|
# Wait for charts to finish loading (Superset uses loading spinners/skeletons)
|
||||||
|
# We wait until loading indicators disappear or a timeout occurs
|
||||||
|
try:
|
||||||
|
# Wait for loading indicators to disappear
|
||||||
|
await page.wait_for_selector('.loading, .ant-skeleton, .spinner', state="hidden", timeout=60000)
|
||||||
|
logger.info(f"[DEBUG] Loading indicators hidden")
|
||||||
|
except:
|
||||||
|
logger.warning(f"[DEBUG] Timeout waiting for loading indicators to hide")
|
||||||
|
|
||||||
|
# Wait for charts to actually render their content (e.g., ECharts, NVD3)
|
||||||
|
# We look for common chart containers that should have content
|
||||||
|
try:
|
||||||
|
await page.wait_for_selector('.chart-container canvas, .slice_container svg, .superset-chart-canvas, .grid-content .chart-container', timeout=60000)
|
||||||
|
logger.info(f"[DEBUG] Chart content detected")
|
||||||
|
except:
|
||||||
|
logger.warning(f"[DEBUG] Timeout waiting for chart content")
|
||||||
|
|
||||||
|
# Additional check: wait for all chart containers to have non-empty content
|
||||||
|
logger.info(f"[DEBUG] Waiting for all charts to have rendered content...")
|
||||||
|
await page.wait_for_function("""() => {
|
||||||
|
const charts = document.querySelectorAll('.chart-container, .slice_container');
|
||||||
|
if (charts.length === 0) return true; // No charts to wait for
|
||||||
|
|
||||||
|
// Check if all charts have rendered content (canvas, svg, or non-empty div)
|
||||||
|
return Array.from(charts).every(chart => {
|
||||||
|
const hasCanvas = chart.querySelector('canvas') !== null;
|
||||||
|
const hasSvg = chart.querySelector('svg') !== null;
|
||||||
|
const hasContent = chart.innerText.trim().length > 0 || chart.children.length > 0;
|
||||||
|
return hasCanvas || hasSvg || hasContent;
|
||||||
|
});
|
||||||
|
}""", timeout=60000)
|
||||||
|
logger.info(f"[DEBUG] All charts have rendered content")
|
||||||
|
|
||||||
|
# Scroll to bottom and back to top to trigger lazy loading of all charts
|
||||||
|
logger.info(f"[DEBUG] Scrolling to trigger lazy loading...")
|
||||||
|
await page.evaluate("""async () => {
|
||||||
|
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
|
||||||
|
for (let i = 0; i < document.body.scrollHeight; i += 500) {
|
||||||
|
window.scrollTo(0, i);
|
||||||
|
await delay(100);
|
||||||
|
}
|
||||||
|
window.scrollTo(0, 0);
|
||||||
|
await delay(500);
|
||||||
|
}""")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"[DEBUG] Dashboard content wait failed: {e}, proceeding anyway after delay")
|
||||||
|
|
||||||
|
# Final stabilization delay - increased for complex dashboards
|
||||||
|
logger.info(f"[DEBUG] Final stabilization delay...")
|
||||||
|
await asyncio.sleep(15)
|
||||||
|
|
||||||
|
            # Logic to handle tabs and full-page capture
            try:
                # 1. Handle Tabs (Recursive switching)
                # @UX_STATE: [TabSwitching] -> Iterating through dashboard tabs to trigger lazy loading
                processed_tabs = set()

                async def switch_tabs(depth=0):
                    if depth > 3:
                        return  # Limit recursion depth

                    tab_selectors = [
                        '.ant-tabs-nav-list .ant-tabs-tab',
                        '.dashboard-component-tabs .ant-tabs-tab',
                        '[data-test="dashboard-component-tabs"] .ant-tabs-tab'
                    ]

                    found_tabs = []
                    for selector in tab_selectors:
                        found_tabs = await page.locator(selector).all()
                        if found_tabs:
                            break

                    if found_tabs:
                        logger.info(f"[DEBUG][TabSwitching] Found {len(found_tabs)} tabs at depth {depth}")
                        for i, tab in enumerate(found_tabs):
                            try:
                                tab_text = (await tab.inner_text()).strip()
                                tab_id = f"{depth}_{i}_{tab_text}"

                                if tab_id in processed_tabs:
                                    continue

                                if await tab.is_visible():
                                    logger.info(f"[DEBUG][TabSwitching] Switching to tab: {tab_text}")
                                    processed_tabs.add(tab_id)

                                    is_active = "ant-tabs-tab-active" in (await tab.get_attribute("class") or "")
                                    if not is_active:
                                        await tab.click()
                                        await asyncio.sleep(2)  # Wait for content to render

                                    await switch_tabs(depth + 1)
                            except Exception as tab_e:
                                logger.warning(f"[DEBUG][TabSwitching] Failed to process tab {i}: {tab_e}")

                        try:
                            first_tab = found_tabs[0]
                            if "ant-tabs-tab-active" not in (await first_tab.get_attribute("class") or ""):
                                await first_tab.click()
                                await asyncio.sleep(1)
                        except Exception:
                            pass

                await switch_tabs()

                # 2. Calculate full height for screenshot
                # @UX_STATE: [CalculatingHeight] -> Determining dashboard dimensions
                full_height = await page.evaluate("""() => {
                    const body = document.body;
                    const html = document.documentElement;
                    const dashboardContent = document.querySelector('.dashboard-content');

                    return Math.max(
                        body.scrollHeight, body.offsetHeight,
                        html.clientHeight, html.scrollHeight, html.offsetHeight,
                        dashboardContent ? dashboardContent.scrollHeight + 100 : 0
                    );
                }""")
                logger.info(f"[DEBUG] Calculated full height: {full_height}")

                # DIAGNOSTIC: Count chart elements before resize
                chart_count_before = await page.evaluate("""() => {
                    return {
                        chartContainers: document.querySelectorAll('.chart-container, .slice_container').length,
                        canvasElements: document.querySelectorAll('canvas').length,
                        svgElements: document.querySelectorAll('.chart-container svg, .slice_container svg').length,
                        visibleCharts: document.querySelectorAll('.chart-container:visible, .slice_container:visible').length
                    };
                }""")
                logger.info(f"[DIAGNOSTIC] Chart elements BEFORE viewport resize: {chart_count_before}")

                # DIAGNOSTIC: Capture pre-resize screenshot for comparison
                pre_resize_path = output_path.replace(".png", "_preresize.png")
                try:
                    await page.screenshot(path=pre_resize_path, full_page=False, timeout=10000)
                    import os
                    pre_resize_size = os.path.getsize(pre_resize_path) if os.path.exists(pre_resize_path) else 0
                    logger.info(f"[DIAGNOSTIC] Pre-resize screenshot saved: {pre_resize_path} ({pre_resize_size} bytes)")
                except Exception as pre_e:
                    logger.warning(f"[DIAGNOSTIC] Failed to capture pre-resize screenshot: {pre_e}")

                logger.info(f"[DIAGNOSTIC] Resizing viewport from current to 1920x{int(full_height)}")
                await page.set_viewport_size({"width": 1920, "height": int(full_height)})

                # DIAGNOSTIC: Increased wait time and log timing
                logger.info("[DIAGNOSTIC] Waiting 10 seconds after viewport resize for re-render...")
                await asyncio.sleep(10)
                logger.info("[DIAGNOSTIC] Wait completed")

                # DIAGNOSTIC: Count chart elements after resize and wait
                chart_count_after = await page.evaluate("""() => {
                    return {
                        chartContainers: document.querySelectorAll('.chart-container, .slice_container').length,
                        canvasElements: document.querySelectorAll('canvas').length,
                        svgElements: document.querySelectorAll('.chart-container svg, .slice_container svg').length,
                        visibleCharts: document.querySelectorAll('.chart-container:visible, .slice_container:visible').length
                    };
                }""")
                logger.info(f"[DIAGNOSTIC] Chart elements AFTER viewport resize + wait: {chart_count_after}")

                # DIAGNOSTIC: Check if any charts have error states
                chart_errors = await page.evaluate("""() => {
                    const errors = [];
                    document.querySelectorAll('.chart-container, .slice_container').forEach((chart, i) => {
                        const errorEl = chart.querySelector('.error, .alert-danger, .ant-alert-error');
                        if (errorEl) {
                            errors.push({index: i, text: errorEl.innerText.substring(0, 100)});
                        }
                    });
                    return errors;
                }""")
                if chart_errors:
                    logger.warning(f"[DIAGNOSTIC] Charts with error states detected: {chart_errors}")
                else:
                    logger.info("[DIAGNOSTIC] No chart error states detected")

                # 3. Take screenshot using CDP to bypass Playwright's font loading wait
                # @UX_STATE: [Capturing] -> Executing CDP screenshot
                logger.info("[DEBUG] Attempting full-page screenshot via CDP...")
                cdp = await page.context.new_cdp_session(page)

                screenshot_data = await cdp.send("Page.captureScreenshot", {
                    "format": "png",
                    "fromSurface": True,
                    "captureBeyondViewport": True
                })

                image_data = base64.b64decode(screenshot_data["data"])

                with open(output_path, 'wb') as f:
                    f.write(image_data)

                # DIAGNOSTIC: Verify screenshot file
                import os
                final_size = os.path.getsize(output_path) if os.path.exists(output_path) else 0
                logger.info(f"[DIAGNOSTIC] Final screenshot saved: {output_path}")
                logger.info(f"[DIAGNOSTIC] Final screenshot size: {final_size} bytes ({final_size / 1024:.2f} KB)")

                # DIAGNOSTIC: Get image dimensions
                try:
                    with Image.open(output_path) as final_img:
                        logger.info(f"[DIAGNOSTIC] Final screenshot dimensions: {final_img.width}x{final_img.height}")
                except Exception as img_err:
                    logger.warning(f"[DIAGNOSTIC] Could not read final image dimensions: {img_err}")

                logger.info(f"Full-page screenshot saved to {output_path} (via CDP)")
            except Exception as e:
                logger.error(f"[DEBUG] Full-page/Tab capture failed: {e}")
                try:
                    await page.screenshot(path=output_path, full_page=True, timeout=10000)
                except Exception as e2:
                    logger.error(f"[DEBUG] Fallback screenshot also failed: {e2}")
                    await page.screenshot(path=output_path, timeout=5000)

            await browser.close()
            return True
    # [/DEF:ScreenshotService.capture_dashboard:Function]
# [/DEF:ScreenshotService:Class]

# [DEF:LLMClient:Class]
# @PURPOSE: Wrapper for LLM provider APIs.
class LLMClient:
    # [DEF:LLMClient.__init__:Function]
    # @PURPOSE: Initializes the LLMClient with provider settings.
    # @PRE: api_key, base_url, and default_model are non-empty strings.
    def __init__(self, provider_type: LLMProviderType, api_key: str, base_url: str, default_model: str):
        self.provider_type = provider_type
        self.api_key = api_key
        self.base_url = base_url
        self.default_model = default_model

        # DEBUG: Log initialization parameters (without exposing full API key)
        logger.info(f"[LLMClient.__init__] Initializing LLM client:")
        logger.info(f"[LLMClient.__init__] Provider Type: {provider_type}")
        logger.info(f"[LLMClient.__init__] Base URL: {base_url}")
        logger.info(f"[LLMClient.__init__] Default Model: {default_model}")
        logger.info(f"[LLMClient.__init__] API Key (first 8 chars): {api_key[:8] if api_key and len(api_key) > 8 else 'EMPTY_OR_NONE'}...")
        logger.info(f"[LLMClient.__init__] API Key Length: {len(api_key) if api_key else 0}")

        self.client = AsyncOpenAI(api_key=api_key, base_url=base_url)
    # [/DEF:LLMClient.__init__:Function]

    # [DEF:LLMClient.get_json_completion:Function]
    # @PURPOSE: Helper to handle LLM calls with JSON mode and fallback parsing.
    # @PRE: messages is a list of valid message dictionaries.
    # @POST: Returns a parsed JSON dictionary.
    # @SIDE_EFFECT: Calls external LLM API.
    def _should_retry(exception: Exception) -> bool:
        """Custom retry predicate that excludes authentication errors."""
        # Don't retry on authentication errors
        if isinstance(exception, OpenAIAuthenticationError):
            return False
        # Retry on rate limit errors and other exceptions
        return isinstance(exception, (RateLimitError, Exception))

    @retry(
        stop=stop_after_attempt(5),
        wait=wait_exponential(multiplier=2, min=5, max=60),
        retry=retry_if_exception(_should_retry),
        reraise=True
    )
    async def get_json_completion(self, messages: List[Dict[str, Any]]) -> Dict[str, Any]:
        with belief_scope("get_json_completion"):
            response = None
            try:
                try:
                    logger.info(f"[get_json_completion] Attempting LLM call with JSON mode for model: {self.default_model}")
                    logger.info(f"[get_json_completion] Base URL being used: {self.base_url}")
                    logger.info(f"[get_json_completion] Number of messages: {len(messages)}")
                    logger.info(f"[get_json_completion] API Key present: {bool(self.api_key and len(self.api_key) > 0)}")

                    response = await self.client.chat.completions.create(
                        model=self.default_model,
                        messages=messages,
                        response_format={"type": "json_object"}
                    )
                except Exception as e:
                    if "JSON mode is not enabled" in str(e) or "400" in str(e):
                        logger.warning(f"[get_json_completion] JSON mode failed or not supported: {str(e)}. Falling back to plain text response.")
                        response = await self.client.chat.completions.create(
                            model=self.default_model,
                            messages=messages
                        )
                    else:
                        raise e

                logger.debug(f"[get_json_completion] LLM Response: {response}")
            except OpenAIAuthenticationError as e:
                logger.error(f"[get_json_completion] Authentication error: {str(e)}")
                # Do not retry on auth errors - re-raise to stop retry
                raise
            except RateLimitError as e:
                logger.warning(f"[get_json_completion] Rate limit hit: {str(e)}")

                # Extract retry_delay from error metadata if available
                retry_delay = 5.0  # Default fallback
                try:
                    # Based on logs, the raw response is in e.body or e.response.json()
                    # The logs show 'metadata': {'raw': '...'} which suggests a proxy or specific client wrapper
                    # Let's try to find the 'retryDelay' in the error message or response
                    import re

                    # Try to find "retryDelay": "XXs" in the string representation of the error
                    error_str = str(e)
                    match = re.search(r'"retryDelay":\s*"(\d+)s"', error_str)
                    if match:
                        retry_delay = float(match.group(1))
                    else:
                        # Try to parse from response if it's a standard OpenAI-like error with body
                        if hasattr(e, 'body') and isinstance(e.body, dict):
                            # Some providers put it in details
                            details = e.body.get('error', {}).get('details', [])
                            for detail in details:
                                if detail.get('@type') == 'type.googleapis.com/google.rpc.RetryInfo':
                                    delay_str = detail.get('retryDelay', '5s')
                                    retry_delay = float(delay_str.rstrip('s'))
                                    break
                except Exception as parse_e:
                    logger.debug(f"[get_json_completion] Failed to parse retry delay: {parse_e}")

                # Add a small safety margin (0.5s) as requested
                wait_time = retry_delay + 0.5
                logger.info(f"[get_json_completion] Waiting for {wait_time}s before retry...")
                await asyncio.sleep(wait_time)
                raise
            except Exception as e:
                logger.error(f"[get_json_completion] LLM call failed: {str(e)}")
                raise

            if not response or not hasattr(response, 'choices') or not response.choices:
                raise RuntimeError(f"Invalid LLM response: {response}")

            content = response.choices[0].message.content
            logger.debug(f"[get_json_completion] Raw content to parse: {content}")

            try:
                return json.loads(content)
            except json.JSONDecodeError:
                logger.warning("[get_json_completion] Failed to parse JSON directly, attempting to extract from code blocks")
                if "```json" in content:
                    json_str = content.split("```json")[1].split("```")[0].strip()
                    return json.loads(json_str)
                elif "```" in content:
                    json_str = content.split("```")[1].split("```")[0].strip()
                    return json.loads(json_str)
                else:
                    raise
    # [/DEF:LLMClient.get_json_completion:Function]
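The fenced-block fallback in `get_json_completion` can be exercised in isolation. A minimal sketch of the same parse-then-extract logic (the `extract_json` helper name is illustrative, not part of the source; the fence markers are built at runtime to keep this example self-contained):

```python
import json

FENCE = "`" * 3  # literal code-fence marker, built at runtime


def extract_json(content: str):
    """Parse JSON directly; fall back to the first fenced code block."""
    try:
        return json.loads(content)
    except json.JSONDecodeError:
        if FENCE + "json" in content:
            json_str = content.split(FENCE + "json")[1].split(FENCE)[0].strip()
        elif FENCE in content:
            json_str = content.split(FENCE)[1].split(FENCE)[0].strip()
        else:
            raise
        return json.loads(json_str)


# A typical LLM reply that wraps its JSON in a fenced block
wrapped = f"Here is the result:\n{FENCE}json\n{{\"status\": \"PASS\"}}\n{FENCE}\n"
print(extract_json(wrapped))
```

Note this takes only the first fenced block; replies containing several blocks would need a stricter extraction.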

    # [DEF:LLMClient.analyze_dashboard:Function]
    # @PURPOSE: Sends dashboard data (screenshot + logs) to LLM for health analysis.
    # @PRE: screenshot_path exists, logs is a list of strings.
    # @POST: Returns a structured analysis dictionary (status, summary, issues).
    # @SIDE_EFFECT: Reads screenshot file and calls external LLM API.
    async def analyze_dashboard(self, screenshot_path: str, logs: List[str]) -> Dict[str, Any]:
        with belief_scope("analyze_dashboard"):
            # Optimize image to reduce token count (US1 / T023)
            # Gemini/Gemma models have limits on input tokens, and large images contribute significantly.
            try:
                with Image.open(screenshot_path) as img:
                    # Convert to RGB if necessary
                    if img.mode in ("RGBA", "P"):
                        img = img.convert("RGB")

                    # Resize if too large (max 1024px width while maintaining aspect ratio)
                    # We reduce width further to 1024px to stay within token limits for long dashboards
                    max_width = 1024
                    if img.width > max_width or img.height > 2048:
                        # Calculate scaling factor to fit within 1024x2048
                        scale = min(max_width / img.width, 2048 / img.height)
                        if scale < 1.0:
                            orig_width, orig_height = img.width, img.height
                            new_width = int(img.width * scale)
                            new_height = int(img.height * scale)
                            img = img.resize((new_width, new_height), Image.Resampling.LANCZOS)
                            logger.info(f"[analyze_dashboard] Resized image from {orig_width}x{orig_height} to {new_width}x{new_height}")

                    # Compress and convert to base64
                    buffer = io.BytesIO()
                    # Lower quality to 60% to further reduce payload size
                    img.save(buffer, format="JPEG", quality=60, optimize=True)
                    base_64_image = base64.b64encode(buffer.getvalue()).decode('utf-8')
                    logger.info(f"[analyze_dashboard] Optimized image size: {len(buffer.getvalue()) / 1024:.2f} KB")
            except Exception as img_e:
                logger.warning(f"[analyze_dashboard] Image optimization failed: {img_e}. Using raw image.")
                with open(screenshot_path, "rb") as image_file:
                    base_64_image = base64.b64encode(image_file.read()).decode('utf-8')

            log_text = "\n".join(logs)
            prompt = f"""
            Analyze the attached dashboard screenshot and the following execution logs for health and visual issues.

            Logs:
            {log_text}

            Provide the analysis in JSON format with the following structure:
            {{
                "status": "PASS" | "WARN" | "FAIL",
                "summary": "Short summary of findings",
                "issues": [
                    {{
                        "severity": "WARN" | "FAIL",
                        "message": "Description of the issue",
                        "location": "Optional location info (e.g. chart name)"
                    }}
                ]
            }}
            """

            messages = [
                {
                    "role": "user",
                    "content": [
                        {"type": "text", "text": prompt},
                        {
                            "type": "image_url",
                            "image_url": {
                                "url": f"data:image/jpeg;base64,{base_64_image}"
                            }
                        }
                    ]
                }
            ]

            try:
                return await self.get_json_completion(messages)
            except Exception as e:
                logger.error(f"[analyze_dashboard] Failed to get analysis: {str(e)}")
                return {
                    "status": "FAIL",
                    "summary": f"Failed to get response from LLM: {str(e)}",
                    "issues": [{"severity": "FAIL", "message": "LLM provider returned empty or invalid response"}]
                }
    # [/DEF:LLMClient.analyze_dashboard:Function]
# [/DEF:LLMClient:Class]

# [/DEF:backend/src/plugins/llm_analysis/service.py:Module]
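The downscaling rule in `analyze_dashboard` (fit within 1024x2048, never upscale) reduces to a small pure function. A sketch under those assumptions, with `fit_within` as a hypothetical name not present in the source:

```python
def fit_within(width: int, height: int, max_w: int = 1024, max_h: int = 2048):
    """Return (width, height) scaled to fit within max_w x max_h, never upscaling."""
    scale = min(max_w / width, max_h / height)
    if scale >= 1.0:
        return width, height  # already small enough, keep as-is
    return int(width * scale), int(height * scale)


# A tall 1920x6000 dashboard screenshot is limited by the height cap here.
print(fit_within(1920, 6000))
```

For long dashboards the height cap dominates, which is why the service reduces width well below 1024 before JPEG compression.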
@@ -66,6 +66,15 @@ class MapperPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the mapper plugin.
    # @RETURN: str - "/tools/mapper"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/mapper"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the mapper plugin parameters.
    # @PRE: Plugin instance exists.
@@ -71,6 +71,15 @@ class MigrationPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the migration plugin.
    # @RETURN: str - "/migration"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/migration"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for migration plugin parameters.
    # @PRE: Config manager is available.
@@ -294,9 +303,9 @@ class MigrationPlugin(PluginBase):

        try:
            exported_content, _ = from_c.export_dashboard(dash_id)
-           with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=logger) as tmp_zip_path:
+           with create_temp_file(content=exported_content, dry_run=True, suffix=".zip") as tmp_zip_path:
                # Always transform to strip databases to avoid password errors
-               with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
+               with create_temp_file(suffix=".zip", dry_run=True) as tmp_new_zip:
                    success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping, strip_databases=False)

                    if not success and replace_db_config:
@@ -64,6 +64,15 @@ class SearchPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the search plugin.
    # @RETURN: str - "/tools/search"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/search"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the search plugin parameters.
    # @PRE: Plugin instance exists.
@@ -82,6 +82,15 @@ class StoragePlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the storage plugin.
    # @RETURN: str - "/tools/storage"
    def ui_route(self) -> str:
        with belief_scope("StoragePlugin:ui_route"):
            return "/tools/storage"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for storage plugin parameters.
    # @PRE: None.
128
backend/src/schemas/auth.py
Normal file
128
backend/src/schemas/auth.py
Normal file
@@ -0,0 +1,128 @@
|
|||||||
|
# [DEF:backend.src.schemas.auth:Module]
|
||||||
|
#
|
||||||
|
# @TIER: STANDARD
|
||||||
|
# @SEMANTICS: auth, schemas, pydantic, user, token
|
||||||
|
# @PURPOSE: Pydantic schemas for authentication requests and responses.
|
||||||
|
# @LAYER: API
|
||||||
|
# @RELATION: DEPENDS_ON -> pydantic
|
||||||
|
#
|
||||||
|
# @INVARIANT: Sensitive fields like password must not be included in response schemas.
|
||||||
|
|
||||||
|
# [SECTION: IMPORTS]
|
||||||
|
from typing import List, Optional
|
||||||
|
from pydantic import BaseModel, EmailStr, Field
|
||||||
|
from datetime import datetime
|
||||||
|
# [/SECTION]
|
||||||
|
|
||||||
|
# [DEF:Token:Class]
|
||||||
|
# @TIER: TRIVIAL
|
||||||
|
# @PURPOSE: Represents a JWT access token response.
|
||||||
|
class Token(BaseModel):
|
||||||
|
access_token: str
|
||||||
|
token_type: str
|
||||||
|
# [/DEF:Token:Class]
|
||||||
|
|
||||||
|
# [DEF:TokenData:Class]
|
||||||
|
# @TIER: TRIVIAL
|
||||||
|
# @PURPOSE: Represents the data encoded in a JWT token.
|
||||||
|
class TokenData(BaseModel):
|
||||||
|
username: Optional[str] = None
|
||||||
|
scopes: List[str] = []
|
||||||
|
# [/DEF:TokenData:Class]
|
||||||
|
|
||||||
|
# [DEF:PermissionSchema:Class]
|
||||||
|
# @TIER: TRIVIAL
|
||||||
|
# @PURPOSE: Represents a permission in API responses.
|
||||||
|
class PermissionSchema(BaseModel):
|
||||||
|
id: Optional[str] = None
|
||||||
|
resource: str
|
||||||
|
action: str
|
||||||
|
|
||||||
|
class Config:
|
||||||
|
from_attributes = True
|
||||||
|
# [/DEF:PermissionSchema:Class]
|
||||||
|
|
||||||
|
# [DEF:RoleSchema:Class]
|
||||||
|
# @PURPOSE: Represents a role in API responses.
class RoleSchema(BaseModel):
    id: str
    name: str
    description: Optional[str] = None
    permissions: List[PermissionSchema] = []

    class Config:
        from_attributes = True
# [/DEF:RoleSchema:Class]


# [DEF:RoleCreate:Class]
# @PURPOSE: Schema for creating a new role.
class RoleCreate(BaseModel):
    name: str
    description: Optional[str] = None
    permissions: List[str] = []  # List of permission IDs or "resource:action" strings
# [/DEF:RoleCreate:Class]


# [DEF:RoleUpdate:Class]
# @PURPOSE: Schema for updating an existing role.
class RoleUpdate(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    permissions: Optional[List[str]] = None
# [/DEF:RoleUpdate:Class]


# [DEF:ADGroupMappingSchema:Class]
# @PURPOSE: Represents an AD Group to Role mapping in API responses.
class ADGroupMappingSchema(BaseModel):
    id: str
    ad_group: str
    role_id: str

    class Config:
        from_attributes = True
# [/DEF:ADGroupMappingSchema:Class]


# [DEF:ADGroupMappingCreate:Class]
# @PURPOSE: Schema for creating an AD Group mapping.
class ADGroupMappingCreate(BaseModel):
    ad_group: str
    role_id: str
# [/DEF:ADGroupMappingCreate:Class]


# [DEF:UserBase:Class]
# @PURPOSE: Base schema for user data.
class UserBase(BaseModel):
    username: str
    email: Optional[EmailStr] = None
    is_active: bool = True
# [/DEF:UserBase:Class]


# [DEF:UserCreate:Class]
# @PURPOSE: Schema for creating a new user.
class UserCreate(UserBase):
    password: str
    roles: List[str] = []
# [/DEF:UserCreate:Class]


# [DEF:UserUpdate:Class]
# @PURPOSE: Schema for updating an existing user.
class UserUpdate(BaseModel):
    email: Optional[EmailStr] = None
    password: Optional[str] = None
    is_active: Optional[bool] = None
    roles: Optional[List[str]] = None
# [/DEF:UserUpdate:Class]


# [DEF:User:Class]
# @PURPOSE: Schema for user data in API responses.
class User(UserBase):
    id: str
    auth_source: str
    created_at: datetime
    last_login: Optional[datetime] = None
    roles: List[RoleSchema] = []

    class Config:
        from_attributes = True
# [/DEF:User:Class]


# [/DEF:backend.src.schemas.auth:Module]
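RoleCreate above accepts its `permissions` either as permission IDs or as `"resource:action"` strings. A minimal sketch of how such strings might be split and validated follows; the helper name, the action set, and the "last segment is the action" rule are assumptions for illustration, not the project's actual parser:

```python
from typing import Tuple

# Assumed action vocabulary, mirroring the seed data used elsewhere in this change set
VALID_ACTIONS = {"READ", "WRITE", "EXECUTE"}

def parse_permission(spec: str) -> Tuple[str, str]:
    """Split a 'resource:action' spec. The action is the last colon-separated
    segment, so resources that themselves contain a colon (e.g. 'admin:users')
    are preserved intact."""
    resource, sep, action = spec.rpartition(":")
    if not sep or not resource or action not in VALID_ACTIONS:
        raise ValueError(f"invalid permission spec: {spec!r}")
    return resource, action

print(parse_permission("admin:users:WRITE"))  # ('admin:users', 'WRITE')
print(parse_permission("tasks:READ"))         # ('tasks', 'READ')
```

Using `str.rpartition` rather than `str.split(":")` is what lets two-part resource names like `admin:users` survive the parse.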
82  backend/src/scripts/create_admin.py  Normal file
@@ -0,0 +1,82 @@
# [DEF:backend.src.scripts.create_admin:Module]
#
# @SEMANTICS: admin, setup, user, auth, cli
# @PURPOSE: CLI tool for creating the initial admin user.
# @LAYER: Scripts
# @RELATION: USES -> backend.src.core.auth.security
# @RELATION: USES -> backend.src.core.database
# @RELATION: USES -> backend.src.models.auth
#
# @INVARIANT: Admin user must have the "Admin" role.

# [SECTION: IMPORTS]
import sys
import argparse
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import AuthSessionLocal, init_db
from src.core.auth.security import get_password_hash
from src.models.auth import User, Role, Permission
from src.core.logger import logger, belief_scope
# [/SECTION]


# [DEF:create_admin:Function]
# @PURPOSE: Creates an admin user and the necessary roles/permissions.
# @PRE: username and password provided via CLI.
# @POST: Admin user exists in auth.db.
#
# @PARAM: username (str) - Admin username.
# @PARAM: password (str) - Admin password.
def create_admin(username, password):
    with belief_scope("create_admin"):
        db = AuthSessionLocal()
        try:
            # 1. Ensure Admin role exists
            admin_role = db.query(Role).filter(Role.name == "Admin").first()
            if not admin_role:
                logger.info("Creating Admin role...")
                admin_role = Role(name="Admin", description="System Administrator")
                db.add(admin_role)
                db.commit()
                db.refresh(admin_role)

            # 2. Check if user already exists
            existing_user = db.query(User).filter(User.username == username).first()
            if existing_user:
                logger.warning(f"User {username} already exists.")
                return

            # 3. Create Admin user
            logger.info(f"Creating admin user: {username}")
            new_user = User(
                username=username,
                password_hash=get_password_hash(password),
                auth_source="LOCAL",
                is_active=True
            )
            new_user.roles.append(admin_role)
            db.add(new_user)
            db.commit()
            logger.info(f"Admin user {username} created successfully.")

        except Exception as e:
            logger.error(f"Failed to create admin user: {e}")
            db.rollback()
        finally:
            db.close()
# [/DEF:create_admin:Function]


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Create initial admin user")
    parser.add_argument("--username", required=True, help="Admin username")
    parser.add_argument("--password", required=True, help="Admin password")
    args = parser.parse_args()

    # Ensure DB is initialized before creating admin
    init_db()
    create_admin(args.username, args.password)


# [/DEF:backend.src.scripts.create_admin:Module]
44  backend/src/scripts/init_auth_db.py  Normal file
@@ -0,0 +1,44 @@
# [DEF:backend.src.scripts.init_auth_db:Module]
#
# @SEMANTICS: setup, database, auth, migration
# @PURPOSE: Initializes the auth database and creates the necessary tables.
# @LAYER: Scripts
# @RELATION: CALLS -> backend.src.core.database.init_db
#
# @INVARIANT: Safe to run multiple times (idempotent).

# [SECTION: IMPORTS]
import sys
import os
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import init_db, auth_engine
from src.core.logger import logger, belief_scope
from src.scripts.seed_permissions import seed_permissions
# [/SECTION]


# [DEF:run_init:Function]
# @PURPOSE: Main entry point for the initialization script.
# @POST: auth.db is initialized with the correct schema and seeded permissions.
def run_init():
    with belief_scope("init_auth_db"):
        logger.info("Initializing authentication database...")
        try:
            init_db()
            logger.info("Authentication database initialized successfully.")

            # Seed permissions
            seed_permissions()

        except Exception as e:
            logger.error(f"Failed to initialize authentication database: {e}")
            sys.exit(1)
# [/DEF:run_init:Function]


if __name__ == "__main__":
    run_init()


# [/DEF:backend.src.scripts.init_auth_db:Module]
116  backend/src/scripts/seed_permissions.py  Normal file
@@ -0,0 +1,116 @@
# [DEF:backend.src.scripts.seed_permissions:Module]
#
# @SEMANTICS: setup, database, auth, permissions, seeding
# @PURPOSE: Populates the auth database with initial system permissions.
# @LAYER: Scripts
# @RELATION: USES -> backend.src.core.database.get_auth_db
# @RELATION: USES -> backend.src.models.auth.Permission
#
# @INVARIANT: Safe to run multiple times (idempotent).

# [SECTION: IMPORTS]
import sys
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import AuthSessionLocal
from src.models.auth import Permission, Role
from src.core.auth.repository import AuthRepository
from src.core.logger import logger, belief_scope
# [/SECTION]


# [DEF:INITIAL_PERMISSIONS:Constant]
INITIAL_PERMISSIONS = [
    # Admin Permissions
    {"resource": "admin:users", "action": "READ"},
    {"resource": "admin:users", "action": "WRITE"},
    {"resource": "admin:roles", "action": "READ"},
    {"resource": "admin:roles", "action": "WRITE"},
    {"resource": "admin:settings", "action": "READ"},
    {"resource": "admin:settings", "action": "WRITE"},
    {"resource": "environments", "action": "READ"},
    {"resource": "plugins", "action": "READ"},
    {"resource": "tasks", "action": "READ"},
    {"resource": "tasks", "action": "WRITE"},

    # Plugin Permissions
    {"resource": "plugin:backup", "action": "EXECUTE"},
    {"resource": "plugin:migration", "action": "EXECUTE"},
    {"resource": "plugin:mapper", "action": "EXECUTE"},
    {"resource": "plugin:search", "action": "EXECUTE"},
    {"resource": "plugin:git", "action": "EXECUTE"},
    {"resource": "plugin:storage", "action": "EXECUTE"},
    {"resource": "plugin:storage", "action": "READ"},
    {"resource": "plugin:storage", "action": "WRITE"},
    {"resource": "plugin:debug", "action": "EXECUTE"},
]
# [/DEF:INITIAL_PERMISSIONS:Constant]


# [DEF:seed_permissions:Function]
# @PURPOSE: Inserts missing permissions into the database.
# @POST: All INITIAL_PERMISSIONS exist in the DB.
def seed_permissions():
    with belief_scope("seed_permissions"):
        db = AuthSessionLocal()
        try:
            logger.info("Seeding permissions...")
            count = 0
            for perm_data in INITIAL_PERMISSIONS:
                exists = db.query(Permission).filter(
                    Permission.resource == perm_data["resource"],
                    Permission.action == perm_data["action"]
                ).first()

                if not exists:
                    new_perm = Permission(
                        resource=perm_data["resource"],
                        action=perm_data["action"]
                    )
                    db.add(new_perm)
                    count += 1

            db.commit()
            logger.info(f"Seeding completed. Added {count} new permissions.")

            # Assign permissions to User role
            repo = AuthRepository(db)
            user_role = repo.get_role_by_name("User")
            if not user_role:
                user_role = Role(name="User", description="Standard user with plugin access")
                db.add(user_role)
                db.flush()

            user_permissions = [
                ("plugin:mapper", "EXECUTE"),
                ("plugin:migration", "EXECUTE"),
                ("plugin:backup", "EXECUTE"),
                ("plugin:git", "EXECUTE"),
                ("plugin:storage", "READ"),
                ("plugin:storage", "WRITE"),
                ("environments", "READ"),
                ("plugins", "READ"),
                ("tasks", "READ"),
                ("tasks", "WRITE"),
            ]

            for res, act in user_permissions:
                perm = repo.get_permission_by_resource_action(res, act)
                if perm and perm not in user_role.permissions:
                    user_role.permissions.append(perm)

            db.commit()
            logger.info("User role permissions updated.")

        except Exception as e:
            logger.error(f"Failed to seed permissions: {e}")
            db.rollback()
        finally:
            db.close()
# [/DEF:seed_permissions:Function]


if __name__ == "__main__":
    seed_permissions()


# [/DEF:backend.src.scripts.seed_permissions:Module]
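The `@INVARIANT: Safe to run multiple times (idempotent)` claim of `seed_permissions` rests on the check-before-insert loop: each `(resource, action)` pair is inserted only if absent. That core logic can be sketched with an in-memory set standing in for the `Permission` table (the helper below is illustrative, not the project's code):

```python
def seed(table: set, initial: list) -> int:
    """Insert each (resource, action) pair only if missing; return the insert count."""
    added = 0
    for perm in initial:
        key = (perm["resource"], perm["action"])
        if key not in table:  # mirrors the db.query(...).first() existence check
            table.add(key)
            added += 1
    return added

table = set()
initial = [
    {"resource": "tasks", "action": "READ"},
    {"resource": "tasks", "action": "WRITE"},
]
print(seed(table, initial))  # 2 on the first run
print(seed(table, initial))  # 0 on every later run
```

Running the seeder twice adds nothing the second time, which is exactly what lets `init_auth_db.py` call it unconditionally on every startup.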
115  backend/src/services/auth_service.py  Normal file
@@ -0,0 +1,115 @@
# [DEF:backend.src.services.auth_service:Module]
#
# @SEMANTICS: auth, service, business-logic, login, jwt
# @PURPOSE: Orchestrates authentication business logic.
# @LAYER: Service
# @RELATION: USES -> backend.src.core.auth.repository.AuthRepository
# @RELATION: USES -> backend.src.core.auth.security
# @RELATION: USES -> backend.src.core.auth.jwt
#
# @INVARIANT: Authentication must verify both credentials and account status.

# [SECTION: IMPORTS]
from typing import Optional, Dict, Any, List
from sqlalchemy.orm import Session
from ..models.auth import User, Role
from ..core.auth.repository import AuthRepository
from ..core.auth.security import verify_password, get_password_hash
from ..core.auth.jwt import create_access_token
from ..core.logger import belief_scope
# [/SECTION]


# [DEF:AuthService:Class]
# @PURPOSE: Provides high-level authentication services.
class AuthService:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the service with a database session.
    # @PARAM: db (Session) - SQLAlchemy session.
    def __init__(self, db: Session):
        self.repo = AuthRepository(db)
    # [/DEF:__init__:Function]

    # [DEF:authenticate_user:Function]
    # @PURPOSE: Authenticates a user with username and password.
    # @PRE: username and password are provided.
    # @POST: Returns User object if authentication succeeds, else None.
    # @SIDE_EFFECT: Updates last_login timestamp on success.
    # @PARAM: username (str) - The username.
    # @PARAM: password (str) - The plain password.
    # @RETURN: Optional[User] - The authenticated user or None.
    def authenticate_user(self, username: str, password: str):
        with belief_scope("AuthService.authenticate_user"):
            user = self.repo.get_user_by_username(username)
            if not user:
                return None

            if not user.is_active:
                return None

            if not user.password_hash or not verify_password(password, user.password_hash):
                return None

            self.repo.update_last_login(user)
            return user
    # [/DEF:authenticate_user:Function]

    # [DEF:create_session:Function]
    # @PURPOSE: Creates a JWT session for an authenticated user.
    # @PRE: user is a valid User object.
    # @POST: Returns a dictionary with access_token and token_type.
    # @PARAM: user (User) - The authenticated user.
    # @RETURN: Dict[str, str] - Session data.
    def create_session(self, user) -> Dict[str, str]:
        with belief_scope("AuthService.create_session"):
            # Collect role names for scopes
            scopes = [role.name for role in user.roles]

            token_data = {
                "sub": user.username,
                "scopes": scopes
            }

            access_token = create_access_token(data=token_data)
            return {
                "access_token": access_token,
                "token_type": "bearer"
            }
    # [/DEF:create_session:Function]

    # [DEF:provision_adfs_user:Function]
    # @PURPOSE: Just-In-Time (JIT) provisioning for ADFS users based on group mappings.
    # @PRE: user_info contains 'upn' (username), 'email', and 'groups'.
    # @POST: User is created/updated and assigned roles based on groups.
    # @PARAM: user_info (Dict[str, Any]) - Claims from ADFS token.
    # @RETURN: User - The provisioned user.
    def provision_adfs_user(self, user_info: Dict[str, Any]) -> User:
        with belief_scope("AuthService.provision_adfs_user"):
            username = user_info.get("upn") or user_info.get("email")
            email = user_info.get("email")
            ad_groups = user_info.get("groups", [])

            user = self.repo.get_user_by_username(username)
            if not user:
                user = User(
                    username=username,
                    email=email,
                    auth_source="ADFS",
                    is_active=True
                )
                self.repo.db.add(user)

            # Update roles based on group mappings
            from ..models.auth import ADGroupMapping
            mapped_roles = self.repo.db.query(Role).join(ADGroupMapping).filter(
                ADGroupMapping.ad_group.in_(ad_groups)
            ).all()

            user.roles = mapped_roles
            self.repo.db.commit()
            self.repo.db.refresh(user)
            return user
    # [/DEF:provision_adfs_user:Function]

# [/DEF:AuthService:Class]


# [/DEF:backend.src.services.auth_service:Module]
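`authenticate_user` gates in a fixed order: unknown user, then inactive account, then password check, each returning `None` so the caller cannot tell which check failed. That ordering can be sketched standalone; the hash helpers below are simplified stand-ins for illustration only, not the project's `src.core.auth.security` module (which should use a proper password hash such as bcrypt):

```python
import hashlib
import secrets

def hash_password(password: str, salt: str) -> str:
    # Stand-in for get_password_hash; illustrative only, NOT production-grade hashing
    return hashlib.sha256((salt + password).encode()).hexdigest()

users = {
    "alice": {"hash": hash_password("s3cret", "salt1"), "salt": "salt1", "is_active": True},
    "bob":   {"hash": hash_password("pw", "salt2"),     "salt": "salt2", "is_active": False},
}

def authenticate(username: str, password: str):
    user = users.get(username)
    if not user:                      # unknown user
        return None
    if not user["is_active"]:         # disabled account
        return None
    # constant-time comparison, mirroring what verify_password should provide
    if not secrets.compare_digest(user["hash"], hash_password(password, user["salt"])):
        return None                   # wrong password
    return username

print(authenticate("alice", "s3cret"))  # alice
print(authenticate("bob", "pw"))        # None (account inactive)
```

Returning the same `None` for all three failure modes keeps login responses from leaking whether a username exists.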
129  backend/src/services/llm_provider.py  Normal file
@@ -0,0 +1,129 @@
# [DEF:backend.src.services.llm_provider:Module]
# @TIER: STANDARD
# @SEMANTICS: service, llm, provider, encryption
# @PURPOSE: Service for managing LLM provider configurations with encrypted API keys.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.core.database
# @RELATION: DEPENDS_ON -> backend.src.models.llm

from typing import List, Optional
from sqlalchemy.orm import Session
from ..models.llm import LLMProvider
from ..plugins.llm_analysis.models import LLMProviderConfig, LLMProviderType
from ..core.logger import belief_scope, logger
from cryptography.fernet import Fernet
import os

# [DEF:EncryptionManager:Class]
# @PURPOSE: Handles encryption and decryption of sensitive data like API keys.
class EncryptionManager:
    # @INVARIANT: Uses a secret key from environment or a default one (fallback only for dev).
    def __init__(self):
        self.key = os.getenv("ENCRYPTION_KEY", "ZcytYzi0iHIl4Ttr-GdAEk117aGRogkGvN3wiTxrPpE=").encode()
        self.fernet = Fernet(self.key)

    def encrypt(self, data: str) -> str:
        return self.fernet.encrypt(data.encode()).decode()

    def decrypt(self, encrypted_data: str) -> str:
        return self.fernet.decrypt(encrypted_data.encode()).decode()
# [/DEF:EncryptionManager:Class]


# [DEF:LLMProviderService:Class]
# @PURPOSE: Service to manage LLM provider lifecycle.
class LLMProviderService:
    def __init__(self, db: Session):
        self.db = db
        self.encryption = EncryptionManager()

    # [DEF:get_all_providers:Function]
    # @PURPOSE: Returns all configured LLM providers.
    def get_all_providers(self) -> List[LLMProvider]:
        with belief_scope("get_all_providers"):
            return self.db.query(LLMProvider).all()
    # [/DEF:get_all_providers:Function]

    # [DEF:get_provider:Function]
    # @PURPOSE: Returns a single LLM provider by ID.
    def get_provider(self, provider_id: str) -> Optional[LLMProvider]:
        with belief_scope("get_provider"):
            return self.db.query(LLMProvider).filter(LLMProvider.id == provider_id).first()
    # [/DEF:get_provider:Function]

    # [DEF:create_provider:Function]
    # @PURPOSE: Creates a new LLM provider with an encrypted API key.
    def create_provider(self, config: LLMProviderConfig) -> LLMProvider:
        with belief_scope("create_provider"):
            encrypted_key = self.encryption.encrypt(config.api_key)
            db_provider = LLMProvider(
                provider_type=config.provider_type.value,
                name=config.name,
                base_url=config.base_url,
                api_key=encrypted_key,
                default_model=config.default_model,
                is_active=config.is_active
            )
            self.db.add(db_provider)
            self.db.commit()
            self.db.refresh(db_provider)
            return db_provider
    # [/DEF:create_provider:Function]

    # [DEF:update_provider:Function]
    # @PURPOSE: Updates an existing LLM provider.
    def update_provider(self, provider_id: str, config: LLMProviderConfig) -> Optional[LLMProvider]:
        with belief_scope("update_provider"):
            db_provider = self.get_provider(provider_id)
            if not db_provider:
                return None

            db_provider.provider_type = config.provider_type.value
            db_provider.name = config.name
            db_provider.base_url = config.base_url
            # Only update the API key if provided (not None and not empty)
            if config.api_key is not None and config.api_key != "":
                db_provider.api_key = self.encryption.encrypt(config.api_key)
            db_provider.default_model = config.default_model
            db_provider.is_active = config.is_active

            self.db.commit()
            self.db.refresh(db_provider)
            return db_provider
    # [/DEF:update_provider:Function]

    # [DEF:delete_provider:Function]
    # @PURPOSE: Deletes an LLM provider.
    def delete_provider(self, provider_id: str) -> bool:
        with belief_scope("delete_provider"):
            db_provider = self.get_provider(provider_id)
            if not db_provider:
                return False
            self.db.delete(db_provider)
            self.db.commit()
            return True
    # [/DEF:delete_provider:Function]

    # [DEF:get_decrypted_api_key:Function]
    # @PURPOSE: Returns the decrypted API key for a provider.
    def get_decrypted_api_key(self, provider_id: str) -> Optional[str]:
        with belief_scope("get_decrypted_api_key"):
            db_provider = self.get_provider(provider_id)
            if not db_provider:
                logger.warning(f"[get_decrypted_api_key] Provider {provider_id} not found in database")
                return None

            logger.info(f"[get_decrypted_api_key] Provider found: {db_provider.id}")
            logger.info(f"[get_decrypted_api_key] Encrypted API key length: {len(db_provider.api_key) if db_provider.api_key else 0}")

            try:
                decrypted_key = self.encryption.decrypt(db_provider.api_key)
                logger.info(f"[get_decrypted_api_key] Decryption successful, key length: {len(decrypted_key) if decrypted_key else 0}")
                return decrypted_key
            except Exception as e:
                logger.error(f"[get_decrypted_api_key] Decryption failed: {str(e)}")
                return None
    # [/DEF:get_decrypted_api_key:Function]

# [/DEF:LLMProviderService:Class]


# [/DEF:backend.src.services.llm_provider:Module]
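The subtle part of `update_provider` is the key-preservation rule: every field is overwritten from the incoming config except `api_key`, which is kept when the client sends `None` or an empty string (so an edit form that never echoes the secret back does not wipe it). A minimal dict-based sketch of that rule, with hypothetical field names for illustration:

```python
def apply_update(record: dict, patch: dict) -> dict:
    """Overwrite fields from the patch, but keep the stored (encrypted)
    api_key when the patch omits it or sends an empty string."""
    updated = dict(record)
    for field, value in patch.items():
        if field == "api_key" and (value is None or value == ""):
            continue  # preserve the existing secret
        updated[field] = value
    return updated

stored = {"name": "openai-prod", "api_key": "gAAAA...encrypted"}
patched = apply_update(stored, {"name": "openai-main", "api_key": ""})
print(patched["api_key"])  # gAAAA...encrypted  (unchanged)
print(patched["name"])     # openai-main
```

A truly new key would arrive as a non-empty string and be re-encrypted before storage, as in `update_provider` above.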
@@ -10,9 +10,9 @@
 # [SECTION: IMPORTS]
 from typing import List, Dict
-from backend.src.core.logger import belief_scope
-from backend.src.core.superset_client import SupersetClient
-from backend.src.core.utils.matching import suggest_mappings
+from ..core.logger import belief_scope
+from ..core.superset_client import SupersetClient
+from ..core.utils.matching import suggest_mappings
 # [/SECTION]

 # [DEF:MappingService:Class]
BIN  backend/tasks.db  Binary file not shown.
78  backend/test_auth_debug.py  Normal file
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
"""Debug script to test Superset API authentication."""

import json
import requests
from pprint import pprint
from src.core.superset_client import SupersetClient
from src.core.config_manager import ConfigManager


def main():
    print("Debugging Superset API authentication...")

    config = ConfigManager()

    # Select the first available environment
    environments = config.get_environments()

    if not environments:
        print("No environments configured")
        return

    env = environments[0]
    print(f"\nTesting environment: {env.name}")
    print(f"URL: {env.url}")

    try:
        # Test API client authentication
        print("\n--- Testing API Authentication ---")
        client = SupersetClient(env)
        tokens = client.authenticate()

        print("\nAPI Auth Success!")
        print(f"Access Token: {tokens.get('access_token', 'N/A')}")
        print(f"CSRF Token: {tokens.get('csrf_token', 'N/A')}")

        # Debug cookies from the session
        print("\n--- Session Cookies ---")
        for cookie in client.network.session.cookies:
            print(f"{cookie.name}={cookie.value}")

        # Test accessing the UI via requests
        print("\n--- Testing UI Access ---")
        ui_url = env.url.rstrip('/').replace('/api/v1', '')
        print(f"UI URL: {ui_url}")

        # Try to access the UI home page
        ui_response = client.network.session.get(ui_url, timeout=30, allow_redirects=True)
        print(f"Status Code: {ui_response.status_code}")
        print(f"URL: {ui_response.url}")

        # Check response headers
        print("\n--- Response Headers ---")
        pprint(dict(ui_response.headers))

        print("\n--- Response Content Preview (200 chars) ---")
        print(repr(ui_response.text[:200]))

        if ui_response.status_code == 200:
            print("\nUI Access: Success")

        # Try to access a dashboard
        # For testing, just use the home page
        print("\n--- Checking if login is required ---")
        if "login" in ui_response.url.lower() or "login" in ui_response.text.lower():
            print("❌ Not logged in to UI")
        else:
            print("✅ Logged in to UI")

    except Exception as e:
        print(f"\n❌ Error: {type(e).__name__}: {e}")
        import traceback
        print("\nStack Trace:")
        print(traceback.format_exc())


if __name__ == "__main__":
    main()
44  backend/test_decryption.py  Normal file
@@ -0,0 +1,44 @@
#!/usr/bin/env python3
"""Test script to debug the API key decryption issue."""

from src.core.database import SessionLocal
from src.models.llm import LLMProvider
from cryptography.fernet import Fernet
import os

# Get the encryption key
key = os.getenv("ENCRYPTION_KEY", "ZcytYzi0iHIl4Ttr-GdAEk117aGRogkGvN3wiTxrPpE=").encode()
print(f"Encryption key (first 20 chars): {key[:20]}")
print(f"Encryption key length: {len(key)}")

# Create Fernet instance
fernet = Fernet(key)

# Get the provider from the database
db = SessionLocal()
provider = db.query(LLMProvider).filter(LLMProvider.id == '6c899741-4108-4196-aea4-f38ad2f0150e').first()

if provider:
    print("\nProvider found:")
    print(f"  ID: {provider.id}")
    print(f"  Name: {provider.name}")
    print(f"  Encrypted API Key (first 50 chars): {provider.api_key[:50]}")
    print(f"  Encrypted API Key Length: {len(provider.api_key)}")

    # Test decryption
    print("\nAttempting decryption...")
    try:
        decrypted = fernet.decrypt(provider.api_key.encode()).decode()
        print("Decryption successful!")
        print(f"  Decrypted key length: {len(decrypted)}")
        print(f"  Decrypted key (first 8 chars): {decrypted[:8]}")
        print(f"  Decrypted key is empty: {len(decrypted) == 0}")
    except Exception as e:
        print(f"Decryption failed with error: {e}")
        print(f"Error type: {type(e).__name__}")
        import traceback
        traceback.print_exc()
else:
    print("Provider not found")

db.close()
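The failure mode this debug script probes is worth pinning down: a Fernet token can only be decrypted by the exact key that produced it, so a mismatched `ENCRYPTION_KEY` between the process that stored the key and the process reading it surfaces as a decryption error, not as garbled output. A small self-contained demonstration (keys generated on the spot, not the project's configured key):

```python
from cryptography.fernet import Fernet, InvalidToken

key_a = Fernet.generate_key()
key_b = Fernet.generate_key()

token = Fernet(key_a).encrypt(b"test-api-key-12345")

# Same key: the roundtrip succeeds
assert Fernet(key_a).decrypt(token) == b"test-api-key-12345"

# Different key: Fernet raises InvalidToken, which is the exception
# a caller like get_decrypted_api_key would catch and log as a failure
try:
    Fernet(key_b).decrypt(token)
    print("unexpected success")
except InvalidToken:
    print("InvalidToken")
```

This is why the env-var fallback in `EncryptionManager` matters: if dev and prod processes resolve different keys, every stored secret becomes unreadable rather than silently wrong.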
1  backend/test_encryption.py  Normal file
@@ -0,0 +1 @@
[{"key[": 20, ")\n\n# Create Fernet instance\nfernet = Fernet(key)\n\n# Test encrypting an empty string\nempty_encrypted = fernet.encrypt(b\"": ".", "print(f": "nEncrypted empty string: {empty_encrypted"}, {"test-api-key-12345\"\ntest_encrypted = fernet.encrypt(test_key.encode()).decode()\nprint(f": "nEncrypted test key: {test_encrypted"}, {"gAAAAABphhwSZie0OwXjJ78Fk-c4Uo6doNJXipX49AX7Bypzp4ohiRX3hXPXKb45R1vhNUOqbm6Ke3-eRwu_KdWMZ9chFBKmqw==\"\nprint(f": "nStored encrypted key: {stored_key"}, {"len(stored_key)}": "Check if stored key matches empty string encryption\nif stored_key == empty_encrypted:\n print(", "string!": "else:\n print(", "print(f": "mpty string encryption: {empty_encrypted"}, {"stored_key}": "Try to decrypt the stored key\ntry:\n decrypted = fernet.decrypt(stored_key.encode()).decode()\n print(f", "print(f": "ecrypted key length: {len(decrypted)"}, {")\nexcept Exception as e:\n print(f": "nDecryption failed with error: {e"}]
162  backend/tests/test_auth.py  Normal file
@@ -0,0 +1,162 @@
import sys
import os
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent / "src"))

import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from src.core.database import Base, get_auth_db
from src.models.auth import User, Role, Permission, ADGroupMapping
from src.services.auth_service import AuthService
from src.core.auth.repository import AuthRepository
from src.core.auth.security import verify_password, get_password_hash

# Create an in-memory SQLite database for testing
SQLALCHEMY_DATABASE_URL = "sqlite:///:memory:"

engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

# Create all tables
Base.metadata.create_all(bind=engine)


@pytest.fixture
def db_session():
    """Create a new database session within a transaction; roll back after the test."""
    connection = engine.connect()
    transaction = connection.begin()
    session = TestingSessionLocal(bind=connection)

    yield session

    session.close()
    transaction.rollback()
    connection.close()


@pytest.fixture
def auth_service(db_session):
    return AuthService(db_session)


@pytest.fixture
def auth_repo(db_session):
    return AuthRepository(db_session)


def test_create_user(auth_repo):
    """Test user creation"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    retrieved_user = auth_repo.get_user_by_username("testuser")
    assert retrieved_user is not None
    assert retrieved_user.username == "testuser"
    assert retrieved_user.email == "test@example.com"
    assert verify_password("testpassword123", retrieved_user.password_hash)


def test_authenticate_user(auth_service, auth_repo):
    """Test user authentication with valid and invalid credentials"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    # Test valid credentials
    authenticated_user = auth_service.authenticate_user("testuser", "testpassword123")
    assert authenticated_user is not None
    assert authenticated_user.username == "testuser"

    # Test invalid password
    invalid_user = auth_service.authenticate_user("testuser", "wrongpassword")
    assert invalid_user is None

    # Test invalid username
    invalid_user = auth_service.authenticate_user("nonexistent", "testpassword123")
    assert invalid_user is None


def test_create_session(auth_service, auth_repo):
    """Test session token creation"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    session = auth_service.create_session(user)
    assert "access_token" in session
    assert "token_type" in session
    assert session["token_type"] == "bearer"
    assert len(session["access_token"]) > 0


def test_role_permission_association(auth_repo):
    """Test role and permission association"""
    role = Role(name="Admin", description="System administrator")
    perm1 = Permission(resource="admin:users", action="READ")
    perm2 = Permission(resource="admin:users", action="WRITE")

    role.permissions.extend([perm1, perm2])

    auth_repo.db.add(role)
    auth_repo.db.commit()

    retrieved_role = auth_repo.get_role_by_name("Admin")
    assert retrieved_role is not None
|
||||||
|
assert len(retrieved_role.permissions) == 2
|
||||||
|
|
||||||
|
permissions = [f"{p.resource}:{p.action}" for p in retrieved_role.permissions]
|
||||||
|
assert "admin:users:READ" in permissions
|
||||||
|
assert "admin:users:WRITE" in permissions
|
||||||
|
|
||||||
|
def test_user_role_association(auth_repo):
|
||||||
|
"""Test user and role association"""
|
||||||
|
role = Role(name="Admin", description="System administrator")
|
||||||
|
user = User(
|
||||||
|
username="adminuser",
|
||||||
|
email="admin@example.com",
|
||||||
|
password_hash=get_password_hash("adminpass123"),
|
||||||
|
auth_source="LOCAL"
|
||||||
|
)
|
||||||
|
|
||||||
|
user.roles.append(role)
|
||||||
|
|
||||||
|
auth_repo.db.add(role)
|
||||||
|
auth_repo.db.add(user)
|
||||||
|
auth_repo.db.commit()
|
||||||
|
|
||||||
|
retrieved_user = auth_repo.get_user_by_username("adminuser")
|
||||||
|
assert retrieved_user is not None
|
||||||
|
assert len(retrieved_user.roles) == 1
|
||||||
|
assert retrieved_user.roles[0].name == "Admin"
|
||||||
|
|
||||||
|
def test_ad_group_mapping(auth_repo):
|
||||||
|
"""Test AD group mapping"""
|
||||||
|
role = Role(name="ADFS_Admin", description="ADFS administrators")
|
||||||
|
|
||||||
|
auth_repo.db.add(role)
|
||||||
|
auth_repo.db.commit()
|
||||||
|
|
||||||
|
mapping = ADGroupMapping(ad_group="DOMAIN\\ADFS_Admins", role_id=role.id)
|
||||||
|
|
||||||
|
auth_repo.db.add(mapping)
|
||||||
|
auth_repo.db.commit()
|
||||||
|
|
||||||
|
retrieved_mapping = auth_repo.db.query(ADGroupMapping).filter_by(ad_group="DOMAIN\\ADFS_Admins").first()
|
||||||
|
assert retrieved_mapping is not None
|
||||||
|
assert retrieved_mapping.role_id == role.id
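The tests above exercise `get_password_hash` and `verify_password` without showing them. A minimal stdlib sketch of what such helpers typically do (hypothetical; the project's actual implementation may use passlib or bcrypt instead of PBKDF2):

```python
# Illustrative stand-ins for the project's password helpers (hypothetical names
# reused for clarity; the real helpers may be backed by passlib/bcrypt).
import hashlib
import hmac
import os


def get_password_hash(password: str, *, iterations: int = 100_000) -> str:
    """Hash a password with PBKDF2-HMAC-SHA256 and a random per-password salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"{iterations}${salt.hex()}${digest.hex()}"


def verify_password(password: str, password_hash: str) -> bool:
    """Recompute the digest with the stored salt and compare in constant time."""
    iterations, salt_hex, digest_hex = password_hash.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```

Because the salt is random, hashing the same password twice yields different strings, which is why the tests compare via `verify_password` rather than by equality.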
docs/architecture_decision_superset_migration.md (new file, 38 lines)
@@ -0,0 +1,38 @@
# Architecture Decision: Standalone Service vs. Integration into Superset

## Recommendation: Keep it a separate service

While integrating directly into Superset may look attractive as a way to get a "single interface", it is **strongly recommended to keep `ss-tools` a separate service**, for the following architectural and technical reasons.

### 1. Architectural pattern: orchestrator vs. instance
* **The problem:** `ss-tools` acts as a **"manager of managers"**. Its core value is connecting to and moving data *between* different Superset environments (Dev → Stage → Prod).
* **Why separation wins:** Embedding this logic into Superset would force one specific instance (e.g. "Dev") to become the "master controller" for the rest. If that instance goes down, you lose your management capabilities. A standalone tool sits *above* the environments and treats them all as equal endpoints.

### 2. Technology stack mismatch
* **Superset:** **Flask** (Python) backend with a **React** frontend.
* **ss-tools:** **FastAPI** (Python) backend with a **SvelteKit** frontend.
* **Migration cost:** Moving into Superset would require a **complete rewrite** of the entire frontend from Svelte to React so it could run as a Superset plugin or extension. The FastAPI dependency injection would also need refactoring to fit Superset's Flask-AppBuilder structure.

### 3. Release cycle and flexibility
* **Superset:** Has a complex, slow release cycle. Landing custom features upstream is difficult, and maintaining your own Superset fork is a maintenance nightmare (merge conflicts on every upgrade).
* **ss-tools:** As a standalone microservice, you can ship new features (e.g. new LLM plugins or Git workflows) immediately, without worrying about breaking the core BI platform.

### 4. Security and isolation
* **Permissions:** `ss-tools` has its own admin pages (`AdminRolesPage` and `AdminUsersPage`) managing permissions specifically for *deployment* and *backup* tasks. These are DevOps concerns, not BI concerns. Mixing deployment privileges with data-viewing privileges in one system widens the blast radius of any security incident.

### Technical feasibility (if you do it anyway)
It is technically possible, but the process would look like this:
1. **Frontend:** Rewrite all Svelte components (`DashboardGrid`, `MappingTable`, etc.) as React components so they can live inside `superset-frontend`.
2. **Backend:** Most likely implement it as a server-side **Superset extension** (using Flask Blueprints), adding new API endpoints for the React frontend to call.
3. **Database:** Add the project's tables (`task_records`, `connection_configs`) to Superset's metadata database via Alembic migrations.

### Summary table

| Aspect | Standalone service (current) | Integrated into Superset |
| :--- | :--- | :--- |
| **Development velocity** | 🚀 High (independent) | 🐢 Low (tied to the monolith) |
| **Technology stack** | Modern (FastAPI + SvelteKit) | Legacy/constrained (Flask + React) |
| **Multi-environment management** | ✅ Native (designed for it) | ⚠️ Awkward (one instance controls all) |
| **Maintenance** | ✅ Low (upgrade any time) | ❌ High (rebase on every Superset update) |

**Conclusion:** The project is already well designed as a dedicated DevOps tool (the `DevOps/Tooling` layer in your semantic map). Merging it into Superset would most likely result in technical debt and functional regression.
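The "manager of managers" argument from section 1 can be sketched as an orchestrator that treats every environment as an equal endpoint, with no instance acting as master. All names here (`Environment`, `promote`, the stubbed export/import callables) are illustrative, not the actual ss-tools API:

```python
# Sketch of the orchestrator pattern: promotion between equal endpoints.
# The transport callables are stubs; a real client would call each
# environment's Superset REST API.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Environment:
    name: str
    base_url: str


def promote(asset_id: str, source: Environment, target: Environment,
            export: Callable[[Environment, str], bytes],
            do_import: Callable[[Environment, bytes], str]) -> str:
    """Move an asset between two environments; neither side is a 'master'."""
    bundle = export(source, asset_id)
    return do_import(target, bundle)


envs: Dict[str, Environment] = {
    "dev": Environment("dev", "https://dev.example.com"),
    "prod": Environment("prod", "https://prod.example.com"),
}
result = promote(
    "dash-42", envs["dev"], envs["prod"],
    export=lambda env, aid: f"bundle:{aid}@{env.name}".encode(),
    do_import=lambda env, data: f"imported {data.decode()} into {env.name}",
)
# result == "imported bundle:dash-42@dev into prod"
```

If this logic lived inside one Superset instance, losing that instance would take the whole promotion pipeline down with it; as a standalone service, any pair of environments can be wired together.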
@@ -15,11 +15,14 @@
 import { t } from '../lib/i18n';
 import { Button, Input } from '../lib/ui';
 import GitManager from './git/GitManager.svelte';
+import { api } from '../lib/api';
+import { addToast as toast } from '../lib/toasts.js';
 // [/SECTION]

 // [SECTION: PROPS]
 export let dashboards: DashboardMetadata[] = [];
 export let selectedIds: number[] = [];
+export let environmentId: string = "ss1";
 // [/SECTION]

 // [SECTION: STATE]
@@ -34,8 +37,54 @@
 let showGitManager = false;
 let gitDashboardId: number | null = null;
 let gitDashboardTitle = "";
+let validatingIds: Set<number> = new Set();
 // [/SECTION]
+
+// [DEF:handleValidate:Function]
+/**
+ * @purpose Triggers dashboard validation task.
+ */
+async function handleValidate(dashboard: DashboardMetadata) {
+  if (validatingIds.has(dashboard.id)) return;
+
+  validatingIds.add(dashboard.id);
+  validatingIds = validatingIds; // Trigger reactivity
+
+  try {
+    // TODO: Get provider_id from settings or prompt user
+    // For now, we assume a default provider or let the backend handle it if possible,
+    // but the plugin requires provider_id.
+    // In a real implementation, we might open a modal to select provider if not configured globally.
+    // Or we pick the first active one.
+
+    // Fetch active provider first
+    const providers = await api.fetchApi('/llm/providers');
+    const activeProvider = providers.find((p: any) => p.is_active);
+
+    if (!activeProvider) {
+      toast('No active LLM provider found. Please configure one in settings.', 'error');
+      return;
+    }
+
+    await api.postApi('/tasks', {
+      plugin_id: 'llm_dashboard_validation',
+      params: {
+        dashboard_id: dashboard.id.toString(),
+        environment_id: environmentId,
+        provider_id: activeProvider.id
+      }
+    });
+
+    toast('Validation task started', 'success');
+  } catch (e: any) {
+    toast(e.message || 'Validation failed to start', 'error');
+  } finally {
+    validatingIds.delete(dashboard.id);
+    validatingIds = validatingIds;
+  }
+}
+// [/DEF:handleValidate:Function]
+
 // [SECTION: DERIVED]
 $: filteredDashboards = dashboards.filter(d =>
   d.title.toLowerCase().includes(filterText.toLowerCase())
@@ -175,6 +224,7 @@
 <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider cursor-pointer hover:text-gray-700 transition-colors" on:click={() => handleSort('status')}>
   {$t.dashboard.status} {sortColumn === 'status' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
 </th>
+<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.dashboard.validation}</th>
 <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.dashboard.git}</th>
 </tr>
 </thead>
@@ -196,6 +246,17 @@
 {dashboard.status}
 </span>
 </td>
+<td class="px-6 py-4 whitespace-nowrap text-sm font-medium">
+  <Button
+    variant="secondary"
+    size="sm"
+    on:click={() => handleValidate(dashboard)}
+    disabled={validatingIds.has(dashboard.id)}
+    class="text-purple-600 hover:text-purple-900"
+  >
+    {validatingIds.has(dashboard.id) ? 'Validating...' : 'Validate'}
+  </Button>
+</td>
 <td class="px-6 py-4 whitespace-nowrap text-sm font-medium">
 <Button
 variant="ghost"
@@ -57,4 +57,4 @@
 /* Component specific styles */
 </style>

 <!-- [/DEF:EnvSelector:Component] -->
@@ -9,6 +9,13 @@
 import { page } from '$app/stores';
 import { t } from '$lib/i18n';
 import { LanguageSwitcher } from '$lib/ui';
+import { auth } from '$lib/auth/store';
+import { goto } from '$app/navigation';
+
+function handleLogout() {
+  auth.logout();
+  goto('/login');
+}
 </script>

 <header class="bg-white shadow-md p-4 flex justify-between items-center">
@@ -25,35 +32,12 @@
 >
 {$t.nav.dashboard}
 </a>
-<a
-  href="/migration"
-  class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/migration') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
->
-  {$t.nav.migration}
-</a>
-<a
-  href="/git"
-  class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/git') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
->
-  {$t.nav.git}
-</a>
 <a
 href="/tasks"
 class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/tasks') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
 >
 {$t.nav.tasks}
 </a>
-<div class="relative inline-block group">
-  <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/tools') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
-    {$t.nav.tools}
-  </button>
-  <div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
-    <a href="/tools/search" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.tools_search}</a>
-    <a href="/tools/mapper" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.tools_mapper}</a>
-    <a href="/tools/debug" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.tools_debug}</a>
-    <a href="/tools/storage" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.tools_storage}</a>
-  </div>
-</div>
 <div class="relative inline-block group">
 <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/settings') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
 {$t.nav.settings}
@@ -62,10 +46,35 @@
 <a href="/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_general}</a>
 <a href="/settings/connections" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_connections}</a>
 <a href="/settings/git" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_git}</a>
-<a href="/settings/environments" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_environments}</a>
 </div>
 </div>
+
+{#if $auth.isAuthenticated && $auth.user?.roles?.some(r => r.name === 'Admin')}
+<div class="relative inline-block group">
+  <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/admin') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
+    {$t.nav.admin}
+  </button>
+  <div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
+    <a href="/admin/users" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_users}</a>
+    <a href="/admin/roles" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_roles}</a>
+    <a href="/admin/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_settings}</a>
+    <a href="/admin/settings/llm" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_llm}</a>
+  </div>
+</div>
+{/if}
 <LanguageSwitcher />
+
+{#if $auth.isAuthenticated}
+<div class="flex items-center space-x-2 border-l pl-4 ml-4">
+  <span class="text-sm text-gray-600">{$auth.user?.username}</span>
+  <button
+    on:click={handleLogout}
+    class="text-sm text-red-600 hover:text-red-800 font-medium"
+  >
+    Logout
+  </button>
+</div>
+{/if}
 </nav>
 </header>
 <!-- [/DEF:Navbar:Component] -->
@@ -9,6 +9,7 @@
 <script>
 import { onMount, onDestroy } from 'svelte';
 import { selectedTask } from '../lib/stores.js';
+import { api } from '../lib/api.js';

 let tasks = [];
 let loading = true;
@@ -21,9 +22,7 @@
 // @POST: tasks array is updated and selectedTask status synchronized.
 async function fetchTasks() {
   try {
-    const res = await fetch('/api/tasks?limit=10');
-    if (!res.ok) throw new Error('Failed to fetch tasks');
-    tasks = await res.json();
+    tasks = await api.fetchApi('/tasks?limit=10');

     // [DEBUG] Check for tasks requiring attention
     tasks.forEach(t => {
@@ -54,13 +53,10 @@
 async function clearTasks(status = null) {
   if (!confirm('Are you sure you want to clear tasks?')) return;
   try {
-    let url = '/api/tasks';
-    const params = new URLSearchParams();
-    if (status) params.append('status', status);
-
-    const res = await fetch(`${url}?${params.toString()}`, { method: 'DELETE' });
-    if (!res.ok) throw new Error('Failed to clear tasks');
+    let endpoint = '/tasks';
+    if (status) endpoint += `?status=${status}`;
+
+    await api.requestApi(endpoint, 'DELETE');
     await fetchTasks();
   } catch (e) {
     error = e.message;
@@ -75,14 +71,8 @@
 async function selectTask(task) {
   try {
     // Fetch the full task details (including logs) before setting it as selected
-    const res = await fetch(`/api/tasks/${task.id}`);
-    if (res.ok) {
-      const fullTask = await res.json();
-      selectedTask.set(fullTask);
-    } else {
-      // Fallback to the list version if fetch fails
-      selectedTask.set(task);
-    }
+    const fullTask = await api.getTask(task.id);
+    selectedTask.set(fullTask);
   } catch (e) {
     console.error("Failed to fetch full task details:", e);
     selectedTask.set(task);
@@ -13,7 +13,7 @@
 import { onMount, onDestroy } from 'svelte';
 import { get } from 'svelte/store';
 import { selectedTask, taskLogs } from '../lib/stores.js';
-import { getWsUrl } from '../lib/api.js';
+import { getWsUrl, api } from '../lib/api.js';
 import { addToast } from '../lib/toasts.js';
 import MissingMappingModal from './MissingMappingModal.svelte';
 import PasswordPrompt from './PasswordPrompt.svelte';
@@ -141,13 +141,11 @@

 try {
   // We need to find the environment ID by name first
-  const envsRes = await fetch('/api/environments');
-  const envs = await envsRes.json();
+  const envs = await api.fetchApi('/environments');
   const targetEnv = envs.find(e => e.name === task.params.to_env);

   if (targetEnv) {
-    const res = await fetch(`/api/environments/${targetEnv.id}/databases`);
-    targetDatabases = await res.json();
+    targetDatabases = await api.fetchApi(`/environments/${targetEnv.id}/databases`);
   }
 } catch (e) {
   console.error('Failed to fetch target databases', e);
@@ -165,31 +163,22 @@

 try {
   // 1. Save mapping to backend
-  const envsRes = await fetch('/api/environments');
-  const envs = await envsRes.json();
+  const envs = await api.fetchApi('/environments');
   const srcEnv = envs.find(e => e.name === task.params.from_env);
   const tgtEnv = envs.find(e => e.name === task.params.to_env);

-  await fetch('/api/mappings', {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({
-      source_env_id: srcEnv.id,
-      target_env_id: tgtEnv.id,
-      source_db_uuid: sourceDbUuid,
-      target_db_uuid: targetDbUuid,
-      source_db_name: missingDbInfo.name,
-      target_db_name: targetDbName
-    })
+  await api.postApi('/mappings', {
+    source_env_id: srcEnv.id,
+    target_env_id: tgtEnv.id,
+    source_db_uuid: sourceDbUuid,
+    target_db_uuid: targetDbUuid,
+    source_db_name: missingDbInfo.name,
+    target_db_name: targetDbName
   });

   // 2. Resolve task
-  await fetch(`/api/tasks/${task.id}/resolve`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({
-      resolution_params: { resolved_mapping: { [sourceDbUuid]: targetDbUuid } }
-    })
+  await api.postApi(`/tasks/${task.id}/resolve`, {
+    resolution_params: { resolved_mapping: { [sourceDbUuid]: targetDbUuid } }
   });

 connectionStatus = 'connected';
@@ -209,11 +198,7 @@
 const { passwords } = event.detail;

 try {
-  await fetch(`/api/tasks/${task.id}/resume`, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/json' },
-    body: JSON.stringify({ passwords })
-  });
+  await api.postApi(`/tasks/${task.id}/resume`, { passwords });

   showPasswordPrompt = false;
   connectionStatus = 'connected';
frontend/src/components/auth/ProtectedRoute.svelte (new file, 50 lines)
@@ -0,0 +1,50 @@
<!-- [DEF:ProtectedRoute:Component] -->
<!--
@SEMANTICS: auth, guard, route, protection
@PURPOSE: Wraps content to ensure only authenticated users can access it.
@LAYER: Component
@RELATION: USES -> authStore
@RELATION: CALLS -> goto

@INVARIANT: Redirects to /login if user is not authenticated.
-->

<script lang="ts">
  import { onMount } from 'svelte';
  import { auth } from '../../lib/auth/store';
  import { api } from '../../lib/api';
  import { goto } from '$app/navigation';

  // [SECTION: TEMPLATE]
  // Only render slot if authenticated
  // [/SECTION: TEMPLATE]

  onMount(async () => {
    // Check if we have a token but no user profile yet
    if ($auth.token && !$auth.user) {
      auth.setLoading(true);
      try {
        const user = await api.fetchApi('/auth/me');
        auth.setUser(user);
      } catch (e) {
        console.error('Failed to verify session:', e);
        auth.logout();
        goto('/login');
      } finally {
        auth.setLoading(false);
      }
    } else if (!$auth.token) {
      goto('/login');
    }
  });
</script>

{#if $auth.loading}
  <div class="flex items-center justify-center min-h-screen">
    <div class="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-600"></div>
  </div>
{:else if $auth.isAuthenticated}
  <slot />
{/if}

<!-- [/DEF:ProtectedRoute:Component] -->
frontend/src/components/backups/BackupList.svelte (new file, 84 lines)
@@ -0,0 +1,84 @@
<!-- [DEF:BackupList:Component] -->
<!--
@SEMANTICS: backup, list, table
@PURPOSE: Displays a list of existing backups.
@LAYER: Component
@RELATION: USED_BY -> frontend/src/components/backups/BackupManager.svelte
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { t } from '../../lib/i18n';
  import { Button } from '../../lib/ui';
  import type { Backup } from '../../types/backup';
  import { goto } from '$app/navigation';
  // [/SECTION]

  // [SECTION: PROPS]
  /**
   * @type {Backup[]}
   * @description Array of backup objects to display.
   */
  export let backups: Backup[] = [];
  // [/SECTION]
</script>

<!-- [SECTION: TEMPLATE] -->
<div class="overflow-x-auto rounded-lg border border-gray-200">
  <table class="min-w-full divide-y divide-gray-200">
    <thead class="bg-gray-50">
      <tr>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.name}
        </th>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.tasks.target_env}
        </th>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.created_at}
        </th>
        <th class="px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.actions}
        </th>
      </tr>
    </thead>
    <tbody class="bg-white divide-y divide-gray-200">
      {#each backups as backup}
        <tr class="hover:bg-gray-50 transition-colors">
          <td class="px-6 py-4 whitespace-nowrap text-sm font-medium text-gray-900">
            {backup.name}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
            {backup.environment}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
            {new Date(backup.created_at).toLocaleString()}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
            <Button
              variant="ghost"
              size="sm"
              class="text-blue-600 hover:text-blue-900"
              on:click={() => goto(`/tools/storage?path=backups/${backup.name}`)}
            >
              {$t.storage.table.go_to_storage}
            </Button>
          </td>
        </tr>
      {:else}
        <tr>
          <td colspan="4" class="px-6 py-10 text-center text-gray-500">
            {$t.storage.no_files}
          </td>
        </tr>
      {/each}
    </tbody>
  </table>
</div>
<!-- [/SECTION] -->

<style>
</style>

<!-- [/DEF:BackupList:Component] -->
frontend/src/components/backups/BackupManager.svelte (new file, 241 lines)
@@ -0,0 +1,241 @@
<!-- [DEF:BackupManager:Component] -->
<!--
@SEMANTICS: backup, manager, orchestrator
@PURPOSE: Main container for backup management, handling creation and listing.
@LAYER: Feature
@RELATION: USES -> BackupList
@RELATION: USES -> api

@INVARIANT: Only one backup task can be triggered at a time from the UI.
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { onMount } from 'svelte';
  import { t } from '../../lib/i18n';
  import { api, requestApi } from '../../lib/api';
  import { addToast } from '../../lib/toasts';
  import { Button, Card, Select, Input } from '../../lib/ui';
  import BackupList from './BackupList.svelte';
  import type { Backup } from '../../types/backup';
  // [/SECTION]

  // [SECTION: STATE]
  let backups: Backup[] = [];
  let environments: any[] = [];
  let selectedEnvId = '';
  let loading = true;
  let creating = false;
  let savingSchedule = false;

  // Schedule state for selected environment
  let scheduleEnabled = false;
  let cronExpression = '0 0 * * *';

  $: selectedEnv = environments.find(e => e.id === selectedEnvId);
  $: if (selectedEnv) {
    scheduleEnabled = selectedEnv.backup_schedule?.enabled ?? false;
    cronExpression = selectedEnv.backup_schedule?.cron_expression ?? '0 0 * * *';
  }
  // [/SECTION]

  // [DEF:loadData:Function]
  /**
   * @purpose Loads backups and environments from the backend.
   *
   * @pre API must be reachable.
   * @post environments and backups stores are populated.
   *
   * @returns {Promise<void>}
   * @side_effect Updates local state variables.
   */
  // @RELATION: CALLS -> api.getEnvironmentsList
|
||||||
|
// @RELATION: CALLS -> api.requestApi
|
||||||
|
async function loadData() {
|
||||||
|
console.log("[BackupManager][Entry] Loading data.");
|
||||||
|
loading = true;
|
||||||
|
try {
|
||||||
|
const [envsData, storageData] = await Promise.all([
|
||||||
|
api.getEnvironmentsList(),
|
||||||
|
requestApi('/storage/files?category=backups')
|
||||||
|
]);
|
||||||
|
environments = envsData;
|
||||||
|
|
||||||
|
// Map storage files to Backup type
|
||||||
|
backups = (storageData || []).map((file: any) => ({
|
||||||
|
id: file.name,
|
||||||
|
name: file.name,
|
||||||
|
environment: file.path.split('/')[0] || 'Unknown',
|
||||||
|
created_at: file.created_at,
|
||||||
|
size_bytes: file.size,
|
||||||
|
status: 'success'
|
||||||
|
}));
|
||||||
|
console.log("[BackupManager][Action] Data loaded successfully.");
|
||||||
|
} catch (error) {
|
||||||
|
console.error("[BackupManager][Coherence:Failed] Load failed", error);
|
||||||
|
} finally {
|
||||||
|
loading = false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// [/DEF:loadData:Function]
|
||||||
|
|
||||||
|
// [DEF:handleCreateBackup:Function]
|
||||||
|
/**
|
||||||
|
* @purpose Triggers a new backup task for the selected environment.
|
||||||
|
*
|
||||||
|
* @pre selectedEnvId must be a valid environment ID.
|
||||||
|
* @post A new task is created on the backend.
|
||||||
|
*
|
||||||
|
* @returns {Promise<void>}
|
||||||
|
* @side_effect Dispatches a toast notification.
|
||||||
|
*/
|
||||||
|
// @RELATION: CALLS -> api.createTask
|
||||||
|
// [DEF:handleUpdateSchedule:Function]
|
||||||
|
/**
|
||||||
|
* @purpose Updates the backup schedule for the selected environment.
|
||||||
|
* @pre selectedEnvId must be set.
|
||||||
|
* @post Environment config is updated on the backend.
|
||||||
|
*/
|
||||||
|
async function handleUpdateSchedule() {
|
||||||
|
if (!selectedEnvId) return;
|
||||||
|
|
||||||
|
console.log(`[BackupManager][Action] Updating schedule for env: ${selectedEnvId}`);
|
||||||
|
savingSchedule = true;
|
||||||
|
try {
|
||||||
|
await api.updateEnvironmentSchedule(selectedEnvId, {
|
||||||
|
enabled: scheduleEnabled,
|
||||||
|
cron_expression: cronExpression
|
||||||
|
});
|
||||||
|
addToast($t.common.success, 'success');
|
||||||
|
|
||||||
|
// Update local state
|
||||||
|
environments = environments.map(e =>
|
||||||
|
e.id === selectedEnvId
|
||||||
|
? { ...e, backup_schedule: { enabled: scheduleEnabled, cron_expression: cronExpression } }
|
||||||
|
: e
|
||||||
|
);
|
||||||
|
} catch (error) {
|
||||||
|
console.error("[BackupManager][Coherence:Failed] Schedule update failed", error);
|
||||||
|
} finally {
|
||||||
|
savingSchedule = false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// [/DEF:handleUpdateSchedule:Function]
|
||||||
|
|
||||||
|
async function handleCreateBackup() {
|
||||||
|
if (!selectedEnvId) {
|
||||||
|
addToast($t.tasks.select_env, 'error');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(`[BackupManager][Action] Triggering backup for env: ${selectedEnvId}`);
|
||||||
|
creating = true;
|
||||||
|
try {
|
||||||
|
await api.createTask('superset-backup', { environment_id: selectedEnvId });
|
||||||
|
addToast($t.common.success, 'success');
|
||||||
|
console.log("[BackupManager][Coherence:OK] Backup task triggered.");
|
||||||
|
} catch (error) {
|
||||||
|
console.error("[BackupManager][Coherence:Failed] Create failed", error);
|
||||||
|
} finally {
|
||||||
|
creating = false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// [/DEF:handleCreateBackup:Function]
|
||||||
|
|
||||||
|
onMount(loadData);
|
||||||
|
</script>
|
||||||
|
|
||||||
|
<!-- [SECTION: TEMPLATE] -->
|
||||||
|
<div class="space-y-6">
|
||||||
|
<Card title={$t.tasks.manual_backup}>
|
||||||
|
<div class="space-y-4">
|
||||||
|
<div class="flex items-end gap-4">
|
||||||
|
<div class="flex-1">
|
||||||
|
<Select
|
||||||
|
label={$t.tasks.target_env}
|
||||||
|
bind:value={selectedEnvId}
|
||||||
|
options={[
|
||||||
|
{ value: '', label: $t.tasks.select_env },
|
||||||
|
...environments.map(e => ({ value: e.id, label: e.name }))
|
||||||
|
]}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<Button
|
||||||
|
variant="primary"
|
||||||
|
on:click={handleCreateBackup}
|
||||||
|
disabled={creating || !selectedEnvId}
|
||||||
|
>
|
||||||
|
{creating ? $t.common.loading : $t.tasks.start_backup}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{#if selectedEnvId}
|
||||||
|
<div class="pt-6 border-t border-gray-100 mt-4">
|
||||||
|
<h3 class="text-sm font-semibold text-gray-800 mb-4 flex items-center gap-2">
|
||||||
|
<svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 text-blue-500" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||||
|
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z" />
|
||||||
|
</svg>
|
||||||
|
{$t.tasks.backup_schedule}
|
||||||
|
</h3>
|
||||||
|
|
||||||
|
<div class="bg-gray-50 rounded-lg p-4 border border-gray-200">
|
||||||
|
<div class="flex flex-col md:flex-row md:items-start gap-6">
|
||||||
|
<div class="pt-8">
|
||||||
|
<label class="flex items-center gap-3 cursor-pointer group">
|
||||||
|
<div class="relative inline-flex items-center">
|
||||||
|
<input type="checkbox" bind:checked={scheduleEnabled} class="sr-only peer" />
|
||||||
|
<div class="w-11 h-6 bg-gray-200 peer-focus:outline-none peer-focus:ring-4 peer-focus:ring-blue-300 rounded-full peer peer-checked:after:translate-x-full peer-checked:after:border-white after:content-[''] after:absolute after:top-[2px] after:left-[2px] after:bg-white after:border-gray-300 after:border after:rounded-full after:h-5 after:w-5 after:transition-all peer-checked:bg-blue-600"></div>
|
||||||
|
<span class="ml-3 text-sm font-medium text-gray-700 group-hover:text-gray-900 transition-colors">{$t.tasks.schedule_enabled}</span>
|
||||||
|
</div>
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="flex-1 space-y-2">
|
||||||
|
<Input
|
||||||
|
label={$t.tasks.cron_label}
|
||||||
|
placeholder="0 0 * * *"
|
||||||
|
bind:value={cronExpression}
|
||||||
|
disabled={!scheduleEnabled}
|
||||||
|
/>
|
||||||
|
<p class="text-xs text-gray-500 italic">{$t.tasks.cron_hint}</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="pt-8">
|
||||||
|
<Button
|
||||||
|
variant="secondary"
|
||||||
|
on:click={handleUpdateSchedule}
|
||||||
|
disabled={savingSchedule}
|
||||||
|
class="min-w-[100px]"
|
||||||
|
>
|
||||||
|
{#if savingSchedule}
|
||||||
|
<span class="flex items-center gap-2">
|
||||||
|
<svg class="animate-spin h-4 w-4" viewBox="0 0 24 24">
|
||||||
|
<circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4" fill="none"></circle>
|
||||||
|
<path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
|
||||||
|
</svg>
|
||||||
|
{$t.common.loading}
|
||||||
|
</span>
|
||||||
|
{:else}
|
||||||
|
{$t.common.save}
|
||||||
|
{/if}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{/if}
|
||||||
|
</div>
|
||||||
|
</Card>
|
||||||
|
|
||||||
|
<div class="space-y-3">
|
||||||
|
<h2 class="text-lg font-semibold text-gray-700">{$t.storage.backups}</h2>
|
||||||
|
{#if loading}
|
||||||
|
<div class="py-10 text-center text-gray-500">{$t.common.loading}</div>
|
||||||
|
{:else}
|
||||||
|
<BackupList {backups} />
|
||||||
|
{/if}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<!-- [/SECTION] -->
|
||||||
|
|
||||||
|
<!-- [/DEF:BackupManager:Component] -->
|
||||||
@@ -14,6 +14,7 @@
 import { createEventDispatcher, onMount } from 'svelte';
 import { gitService } from '../../services/gitService';
 import { addToast as toast } from '../../lib/toasts.js';
+import { api } from '../../lib/api';
 // [/SECTION]
 
 // [SECTION: PROPS]
@@ -27,10 +28,32 @@
 let status = null;
 let diff = '';
 let loading = false;
+let generatingMessage = false;
 // [/SECTION]
 
 const dispatch = createEventDispatcher();
 
+// [DEF:handleGenerateMessage:Function]
+/**
+ * @purpose Generates a commit message using LLM.
+ */
+async function handleGenerateMessage() {
+  generatingMessage = true;
+  try {
+    console.log(`[CommitModal][Action] Generating commit message for dashboard ${dashboardId}`);
+    // postApi returns the JSON data directly or throws an error
+    const data = await api.postApi(`/git/repositories/${dashboardId}/generate-message`);
+    message = data.message;
+    toast('Commit message generated', 'success');
+  } catch (e) {
+    console.error(`[CommitModal][Coherence:Failed] ${e.message}`);
+    toast(e.message || 'Failed to generate message', 'error');
+  } finally {
+    generatingMessage = false;
+  }
+}
+// [/DEF:handleGenerateMessage:Function]
+
 // [DEF:loadStatus:Function]
 /**
  * @purpose Loads the current repository status and diff.
@@ -99,8 +122,21 @@
 <!-- Left: Message and Files -->
 <div class="w-full md:w-1/3 flex flex-col">
 <div class="mb-4">
-<label class="block text-sm font-medium text-gray-700 mb-1">Commit Message</label>
-<textarea
+<div class="flex justify-between items-center mb-1">
+<label class="block text-sm font-medium text-gray-700">Commit Message</label>
+<button
+  on:click={handleGenerateMessage}
+  disabled={generatingMessage || loading}
+  class="text-xs text-blue-600 hover:text-blue-800 disabled:opacity-50 flex items-center"
+>
+  {#if generatingMessage}
+    <span class="animate-spin mr-1">↻</span> Generating...
+  {:else}
+    <span class="mr-1">✨</span> Generate with AI
+  {/if}
+</button>
+</div>
+<textarea
 bind:value={message}
 class="w-full border rounded p-2 h-32 focus:ring-2 focus:ring-blue-500 outline-none resize-none"
 placeholder="Describe your changes..."
79  frontend/src/components/llm/DocPreview.svelte  Normal file
@@ -0,0 +1,79 @@
<!-- [DEF:DocPreview:Component] -->
<!--
@TIER: STANDARD
@PURPOSE: UI component for previewing generated dataset documentation before saving.
@LAYER: UI
@RELATION: DEPENDS_ON -> backend/src/plugins/llm_analysis/plugin.py
-->

<script>
import { t } from '../../lib/i18n';

/** @type {Object} */
export let documentation = null;
export let onSave = async (doc) => {};
export let onCancel = () => {};

let isSaving = false;

async function handleSave() {
  isSaving = true;
  try {
    await onSave(documentation);
  } catch (err) {
    console.error("Save failed", err);
  } finally {
    isSaving = false;
  }
}
</script>

{#if documentation}
  <div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
    <div class="bg-white p-6 rounded-lg shadow-xl w-full max-w-2xl max-h-[90vh] flex flex-col">
      <h3 class="text-lg font-semibold mb-4">{$t.llm.doc_preview_title}</h3>

      <div class="flex-1 overflow-y-auto mb-6 prose prose-sm max-w-none border rounded p-4 bg-gray-50">
        <h4 class="text-md font-bold text-gray-800 mb-2">{$t.llm.dataset_desc}</h4>
        <p class="text-gray-700 mb-4 whitespace-pre-wrap">{documentation.description || 'No description generated.'}</p>

        <h4 class="text-md font-bold text-gray-800 mb-2">{$t.llm.column_doc}</h4>
        <table class="min-w-full divide-y divide-gray-200">
          <thead class="bg-gray-100">
            <tr>
              <th class="px-3 py-2 text-left text-xs font-medium text-gray-500 uppercase">Column</th>
              <th class="px-3 py-2 text-left text-xs font-medium text-gray-500 uppercase">Description</th>
            </tr>
          </thead>
          <tbody class="divide-y divide-gray-200">
            {#each Object.entries(documentation.columns || {}) as [name, desc]}
              <tr>
                <td class="px-3 py-2 text-sm font-mono text-gray-900">{name}</td>
                <td class="px-3 py-2 text-sm text-gray-700">{desc}</td>
              </tr>
            {/each}
          </tbody>
        </table>
      </div>

      <div class="flex justify-end gap-3">
        <button
          class="px-4 py-2 border rounded hover:bg-gray-50"
          on:click={onCancel}
          disabled={isSaving}
        >
          {$t.llm.cancel}
        </button>
        <button
          class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 disabled:opacity-50"
          on:click={handleSave}
          disabled={isSaving}
        >
          {isSaving ? $t.llm.applying : $t.llm.apply_doc}
        </button>
      </div>
    </div>
  </div>
{/if}

<!-- [/DEF:DocPreview:Component] -->
226  frontend/src/components/llm/ProviderConfig.svelte  Normal file
@@ -0,0 +1,226 @@
<!-- [DEF:ProviderConfig:Component] -->
<!--
@TIER: STANDARD
@PURPOSE: UI form for managing LLM provider configurations.
@LAYER: UI
@RELATION: DEPENDS_ON -> backend/src/api/routes/llm.py
-->

<script>
import { onMount } from 'svelte';
import { t } from '../../lib/i18n';
import { requestApi } from '../../lib/api';

/** @type {Array} */
export let providers = [];
export let onSave = () => {};

let editingProvider = null;
let showForm = false;

let formData = {
  name: '',
  provider_type: 'openai',
  base_url: 'https://api.openai.com/v1',
  api_key: '',
  default_model: 'gpt-4o',
  is_active: true
};

let testStatus = { type: '', message: '' };
let isTesting = false;

function resetForm() {
  formData = {
    name: '',
    provider_type: 'openai',
    base_url: 'https://api.openai.com/v1',
    api_key: '',
    default_model: 'gpt-4o',
    is_active: true
  };
  editingProvider = null;
  testStatus = { type: '', message: '' };
}

function handleEdit(provider) {
  editingProvider = provider;
  formData = { ...provider, api_key: '' }; // Don't populate key for security
  showForm = true;
}

async function testConnection() {
  console.log("[ProviderConfig][Action] Testing connection", formData);
  isTesting = true;
  testStatus = { type: 'info', message: $t.llm.testing };

  try {
    const endpoint = editingProvider ? `/llm/providers/${editingProvider.id}/test` : '/llm/providers/test';
    const result = await requestApi(endpoint, 'POST', formData);

    if (result.success) {
      testStatus = { type: 'success', message: $t.llm.connection_success };
    } else {
      testStatus = { type: 'error', message: $t.llm.connection_failed.replace('{error}', result.error || 'Unknown error') };
    }
  } catch (err) {
    testStatus = { type: 'error', message: $t.llm.connection_failed.replace('{error}', err.message) };
  } finally {
    isTesting = false;
  }
}

async function handleSubmit() {
  console.log("[ProviderConfig][Action] Submitting provider config");
  const method = editingProvider ? 'PUT' : 'POST';
  const endpoint = editingProvider ? `/llm/providers/${editingProvider.id}` : '/llm/providers';

  // When editing, only include api_key if user entered a new one
  const submitData = { ...formData };
  if (editingProvider && !submitData.api_key) {
    // If editing and api_key is empty, don't send it (backend will keep existing)
    delete submitData.api_key;
  }

  try {
    await requestApi(endpoint, method, submitData);
    showForm = false;
    resetForm();
    onSave();
  } catch (err) {
    alert(`Error: ${err.message}`);
  }
}

async function toggleActive(provider) {
  try {
    await requestApi(`/llm/providers/${provider.id}`, 'PUT', {
      ...provider,
      is_active: !provider.is_active
    });
    onSave();
  } catch (err) {
    console.error("Failed to toggle status", err);
  }
}
</script>

<div class="p-4">
  <div class="flex justify-between items-center mb-6">
    <h2 class="text-xl font-bold">{$t.llm.providers_title}</h2>
    <button
      class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700 transition"
      on:click={() => { resetForm(); showForm = true; }}
    >
      {$t.llm.add_provider}
    </button>
  </div>

  {#if showForm}
    <div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
      <div class="bg-white p-6 rounded-lg shadow-xl w-full max-w-md">
        <h3 class="text-lg font-semibold mb-4">{editingProvider ? $t.llm.edit_provider : $t.llm.new_provider}</h3>

        <div class="space-y-4">
          <div>
            <label for="provider-name" class="block text-sm font-medium text-gray-700">{$t.llm.name}</label>
            <input id="provider-name" type="text" bind:value={formData.name} class="mt-1 block w-full border rounded-md p-2" placeholder="e.g. My OpenAI" />
          </div>

          <div>
            <label for="provider-type" class="block text-sm font-medium text-gray-700">{$t.llm.type}</label>
            <select id="provider-type" bind:value={formData.provider_type} class="mt-1 block w-full border rounded-md p-2">
              <option value="openai">OpenAI</option>
              <option value="openrouter">OpenRouter</option>
              <option value="kilo">Kilo</option>
            </select>
          </div>

          <div>
            <label for="provider-base-url" class="block text-sm font-medium text-gray-700">{$t.llm.base_url}</label>
            <input id="provider-base-url" type="text" bind:value={formData.base_url} class="mt-1 block w-full border rounded-md p-2" />
          </div>

          <div>
            <label for="provider-api-key" class="block text-sm font-medium text-gray-700">{$t.llm.api_key}</label>
            <input id="provider-api-key" type="password" bind:value={formData.api_key} class="mt-1 block w-full border rounded-md p-2" placeholder={editingProvider ? "••••••••" : "sk-..."} />
          </div>

          <div>
            <label for="provider-default-model" class="block text-sm font-medium text-gray-700">{$t.llm.default_model}</label>
            <input id="provider-default-model" type="text" bind:value={formData.default_model} class="mt-1 block w-full border rounded-md p-2" placeholder="gpt-4o" />
          </div>

          <div class="flex items-center">
            <input id="provider-active" type="checkbox" bind:checked={formData.is_active} class="mr-2" />
            <label for="provider-active" class="text-sm font-medium text-gray-700">{$t.llm.active}</label>
          </div>
        </div>

        {#if testStatus.message}
          <div class={`mt-4 p-2 rounded text-sm ${testStatus.type === 'success' ? 'bg-green-100 text-green-800' : testStatus.type === 'error' ? 'bg-red-100 text-red-800' : 'bg-blue-100 text-blue-800'}`}>
            {testStatus.message}
          </div>
        {/if}

        <div class="mt-6 flex justify-between gap-2">
          <button
            class="px-4 py-2 border rounded hover:bg-gray-50 flex-1"
            on:click={() => { showForm = false; }}
          >
            {$t.llm.cancel}
          </button>
          <button
            class="px-4 py-2 bg-gray-600 text-white rounded hover:bg-gray-700 flex-1"
            disabled={isTesting}
            on:click={testConnection}
          >
            {isTesting ? $t.llm.testing : $t.llm.test}
          </button>
          <button
            class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 flex-1"
            on:click={handleSubmit}
          >
            {$t.llm.save}
          </button>
        </div>
      </div>
    </div>
  {/if}

  <div class="grid gap-4">
    {#each providers as provider}
      <div class="border rounded-lg p-4 flex justify-between items-center bg-white shadow-sm">
        <div>
          <div class="font-bold flex items-center gap-2">
            {provider.name}
            <span class={`text-xs px-2 py-0.5 rounded-full ${provider.is_active ? 'bg-green-100 text-green-800' : 'bg-gray-100 text-gray-800'}`}>
              {provider.is_active ? $t.llm.active : 'Inactive'}
            </span>
          </div>
          <div class="text-sm text-gray-500">{provider.provider_type} • {provider.default_model}</div>
        </div>
        <div class="flex gap-2">
          <button
            class="text-sm text-blue-600 hover:underline"
            on:click={() => handleEdit(provider)}
          >
            {$t.common.edit}
          </button>
          <button
            class={`text-sm ${provider.is_active ? 'text-orange-600' : 'text-green-600'} hover:underline`}
            on:click={() => toggleActive(provider)}
          >
            {provider.is_active ? 'Deactivate' : 'Activate'}
          </button>
        </div>
      </div>
    {:else}
      <div class="text-center py-8 text-gray-500 border-2 border-dashed rounded-lg">
        {$t.llm.no_providers}
      </div>
    {/each}
  </div>
</div>

<!-- [/DEF:ProviderConfig:Component] -->
75  frontend/src/components/llm/ValidationReport.svelte  Normal file
@@ -0,0 +1,75 @@
<!-- [DEF:frontend/src/components/llm/ValidationReport.svelte:Component] -->
<!-- @TIER: STANDARD -->
<!-- @PURPOSE: Displays the results of an LLM-based dashboard validation task. -->

<script>
export let result = null;

function getStatusColor(status) {
  switch (status) {
    case 'PASS': return 'text-green-600 bg-green-100';
    case 'WARN': return 'text-yellow-600 bg-yellow-100';
    case 'FAIL': return 'text-red-600 bg-red-100';
    default: return 'text-gray-600 bg-gray-100';
  }
}
</script>

{#if result}
  <div class="bg-white shadow rounded-lg p-6 space-y-6">
    <div class="flex items-center justify-between">
      <h2 class="text-xl font-bold text-gray-900">Validation Report</h2>
      <span class={`px-3 py-1 rounded-full text-sm font-medium ${getStatusColor(result.status)}`}>
        {result.status}
      </span>
    </div>

    <div class="prose max-w-none">
      <h3 class="text-lg font-semibold">Summary</h3>
      <p>{result.summary}</p>
    </div>

    {#if result.issues && result.issues.length > 0}
      <div class="space-y-4">
        <h3 class="text-lg font-semibold">Detected Issues</h3>
        <div class="overflow-hidden shadow ring-1 ring-black ring-opacity-5 md:rounded-lg">
          <table class="min-w-full divide-y divide-gray-300">
            <thead class="bg-gray-50">
              <tr>
                <th class="py-3.5 pl-4 pr-3 text-left text-sm font-semibold text-gray-900">Severity</th>
                <th class="px-3 py-3.5 text-left text-sm font-semibold text-gray-900">Message</th>
                <th class="px-3 py-3.5 text-left text-sm font-semibold text-gray-900">Location</th>
              </tr>
            </thead>
            <tbody class="divide-y divide-gray-200 bg-white">
              {#each result.issues as issue}
                <tr>
                  <td class="whitespace-nowrap py-4 pl-4 pr-3 text-sm">
                    <span class={`px-2 py-1 rounded text-xs font-medium ${getStatusColor(issue.severity)}`}>
                      {issue.severity}
                    </span>
                  </td>
                  <td class="px-3 py-4 text-sm text-gray-500">{issue.message}</td>
                  <td class="px-3 py-4 text-sm text-gray-500">{issue.location || 'N/A'}</td>
                </tr>
              {/each}
            </tbody>
          </table>
        </div>
      </div>
    {/if}

    {#if result.screenshot_path}
      <div class="space-y-4">
        <h3 class="text-lg font-semibold">Screenshot</h3>
        <img src={`/api/storage/file?path=${encodeURIComponent(result.screenshot_path)}`} alt="Dashboard Screenshot" class="rounded-lg border border-gray-200 shadow-sm max-w-full" />
      </div>
    {/if}
  </div>
{:else}
  <div class="text-center py-12 bg-gray-50 rounded-lg border-2 border-dashed border-gray-300">
    <p class="text-gray-500">No validation result available.</p>
  </div>
{/if}

<!-- [/DEF:frontend/src/components/llm/ValidationReport.svelte:Component] -->
@@ -1,8 +1,9 @@
|
|||||||
<!-- [DEF:FileList:Component] -->
|
<!-- [DEF:FileList:Component] -->
|
||||||
<!--
|
<!--
|
||||||
|
@TIER: STANDARD
|
||||||
@SEMANTICS: storage, files, list, table
|
@SEMANTICS: storage, files, list, table
|
||||||
@PURPOSE: Displays a table of files with metadata and actions.
|
@PURPOSE: Displays a table of files with metadata and actions.
|
||||||
@LAYER: Component
|
@LAYER: UI
|
||||||
@RELATION: DEPENDS_ON -> storageService
|
@RELATION: DEPENDS_ON -> storageService
|
||||||
|
|
||||||
@PROPS: files (Array) - List of StoredFile objects.
|
@PROPS: files (Array) - List of StoredFile objects.
|
||||||
@@ -22,10 +23,13 @@
|
|||||||
// [DEF:isDirectory:Function]
|
// [DEF:isDirectory:Function]
|
||||||
/**
|
/**
|
||||||
* @purpose Checks if a file object represents a directory.
|
* @purpose Checks if a file object represents a directory.
|
||||||
|
* @pre file object has mime_type property.
|
||||||
|
* @post Returns boolean.
|
||||||
* @param {Object} file - The file object to check.
|
* @param {Object} file - The file object to check.
|
||||||
* @return {boolean} True if it's a directory, false otherwise.
|
* @return {boolean} True if it's a directory, false otherwise.
|
||||||
*/
|
*/
|
||||||
function isDirectory(file) {
|
function isDirectory(file) {
|
||||||
|
console.log("[isDirectory][Action] Checking file type");
|
||||||
return file.mime_type === 'directory';
|
return file.mime_type === 'directory';
|
||||||
}
|
}
|
||||||
// [/DEF:isDirectory:Function]
|
// [/DEF:isDirectory:Function]
|
||||||
@@ -33,10 +37,13 @@
 // [DEF:formatSize:Function]
 /**
  * @purpose Formats file size in bytes into a human-readable string.
+ * @pre bytes is a number.
+ * @post Returns formatted string.
  * @param {number} bytes - The size in bytes.
  * @return {string} Formatted size (e.g., "1.2 MB").
  */
 function formatSize(bytes) {
+  console.log(`[formatSize][Action] Formatting ${bytes} bytes`);
   if (bytes === 0) return '0 B';
   const k = 1024;
   const sizes = ['B', 'KB', 'MB', 'GB', 'TB'];
@@ -48,10 +55,13 @@
 // [DEF:formatDate:Function]
 /**
  * @purpose Formats an ISO date string into a localized readable format.
+ * @pre dateStr is a valid date string.
+ * @post Returns localized string.
  * @param {string} dateStr - The date string to format.
  * @return {string} Localized date and time.
  */
 function formatDate(dateStr) {
+  console.log("[formatDate][Action] Formatting date string");
   return new Date(dateStr).toLocaleString();
 }
 // [/DEF:formatDate:Function]
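The formatSize hunk above is truncated before the unit computation. As a minimal sketch, assuming a logarithm-based unit index (only the `'0 B'` guard, `k = 1024`, and the `sizes` array appear in the diff; the index calculation is hypothetical), the formatter could be completed as:

```javascript
// Hypothetical completion of the formatSize helper from the diff above.
// The log-based index selection is an assumption; the diff only shows the
// zero guard, k = 1024, and the sizes array.
function formatSize(bytes) {
  if (bytes === 0) return '0 B';
  const k = 1024;
  const sizes = ['B', 'KB', 'MB', 'GB', 'TB'];
  // Pick the largest unit that keeps the value >= 1, capped at TB.
  const i = Math.min(Math.floor(Math.log(bytes) / Math.log(k)), sizes.length - 1);
  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
}

console.log(formatSize(1258291)); // "1.2 MB"
```

`parseFloat` on the `toFixed(1)` result drops a trailing `.0`, so 1024 bytes renders as "1 KB" rather than "1.0 KB".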
@@ -1,8 +1,9 @@
 <!-- [DEF:FileUpload:Component] -->
 <!--
+@TIER: STANDARD
 @SEMANTICS: storage, upload, files
 @PURPOSE: Provides a form for uploading files to a specific category.
-@LAYER: Component
+@LAYER: UI
 @RELATION: DEPENDS_ON -> storageService
 
 @PROPS: None
@@ -8,6 +8,7 @@
 <script>
   // [SECTION: IMPORTS]
   import { onMount } from 'svelte';
+  import { api } from '../../lib/api.js';
   import { runTask, getTaskStatus } from '../../services/toolsService.js';
   import { selectedTask } from '../../lib/stores.js';
   import { addToast } from '../../lib/toasts.js';
@@ -32,8 +33,7 @@
   */
   async function fetchEnvironments() {
     try {
-      const res = await fetch('/api/environments');
-      envs = await res.json();
+      envs = await api.getEnvironmentsList();
     } catch (e) {
       addToast('Failed to fetch environments', 'error');
     }
@@ -11,8 +11,12 @@
 import { onMount } from 'svelte';
 import { runTask } from '../../services/toolsService.js';
 import { getConnections } from '../../services/connectionService.js';
+import { api } from '../../lib/api';
 import { selectedTask } from '../../lib/stores.js';
 import { addToast } from '../../lib/toasts.js';
+import { t } from '../../lib/i18n';
+import { Button, Card, Select, Input } from '../../lib/ui';
+import DocPreview from '../llm/DocPreview.svelte';
 // [/SECTION]
 
 let envs = [];
@@ -25,6 +29,8 @@
 let tableSchema = 'public';
 let excelPath = '';
 let isRunning = false;
+let isGeneratingDocs = false;
+let generatedDoc = null;
 
 // [DEF:fetchData:Function]
 // @PURPOSE: Fetches environments and saved connections.
@@ -32,11 +38,10 @@
 // @POST: envs and connections arrays are populated.
 async function fetchData() {
   try {
-    const envsRes = await fetch('/api/environments');
-    envs = await envsRes.json();
+    envs = await api.fetchApi('/environments');
     connections = await getConnections();
   } catch (e) {
-    addToast('Failed to fetch data', 'error');
+    addToast($t.mapper.errors.fetch_failed, 'error');
   }
 }
 // [/DEF:fetchData:Function]
@@ -47,17 +52,17 @@
 // @POST: Mapper task is started and selectedTask is updated.
 async function handleRunMapper() {
   if (!selectedEnv || !datasetId) {
-    addToast('Please fill in required fields', 'warning');
+    addToast($t.mapper.errors.required_fields, 'warning');
     return;
   }
 
   if (source === 'postgres' && (!selectedConnection || !tableName)) {
-    addToast('Connection and Table Name are required for postgres source', 'warning');
+    addToast($t.mapper.errors.postgres_required, 'warning');
     return;
   }
 
   if (source === 'excel' && !excelPath) {
-    addToast('Excel path is required for excel source', 'warning');
+    addToast($t.mapper.errors.excel_required, 'warning');
     return;
   }
 
@@ -75,7 +80,7 @@
   });
 
   selectedTask.set(task);
-  addToast('Mapper task started', 'success');
+  addToast($t.mapper.success.started, 'success');
 } catch (e) {
   addToast(e.message, 'error');
 } finally {
@@ -84,82 +89,162 @@
 }
 // [/DEF:handleRunMapper:Function]
 
+// [DEF:handleGenerateDocs:Function]
+// @PURPOSE: Triggers the LLM Documentation task.
+// @PRE: selectedEnv and datasetId are set.
+// @POST: Documentation task is started.
+async function handleGenerateDocs() {
+  if (!selectedEnv || !datasetId) {
+    addToast($t.mapper.errors.required_fields, 'warning');
+    return;
+  }
+
+  isGeneratingDocs = true;
+  try {
+    // Fetch active provider first
+    const providers = await api.fetchApi('/llm/providers');
+    const activeProvider = providers.find(p => p.is_active);
+
+    if (!activeProvider) {
+      addToast('No active LLM provider found', 'error');
+      return;
+    }
+
+    const task = await runTask('llm_documentation', {
+      dataset_id: datasetId,
+      environment_id: selectedEnv,
+      provider_id: activeProvider.id
+    });
+
+    selectedTask.set(task);
+    addToast('Documentation generation started', 'success');
+  } catch (e) {
+    addToast(e.message || 'Failed to start documentation generation', 'error');
+  } finally {
+    isGeneratingDocs = false;
+  }
+}
+// [/DEF:handleGenerateDocs:Function]
+
+async function handleApplyDoc(doc) {
+  try {
+    await api.put(`/mappings/datasets/${datasetId}/metadata`, doc);
+    generatedDoc = null;
+    addToast('Documentation applied successfully', 'success');
+  } catch (err) {
+    addToast(err.message || 'Failed to apply documentation', 'error');
+  }
+}
+
 onMount(fetchData);
 </script>
 
 <!-- [SECTION: TEMPLATE] -->
-<div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
-  <h3 class="text-lg font-medium text-gray-900 mb-4">Dataset Column Mapper</h3>
+<div class="space-y-6">
+  <Card title={$t.mapper.title}>
   <div class="space-y-4">
     <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
-      <div>
-        <label for="mapper-env" class="block text-sm font-medium text-gray-700">Environment</label>
-        <select id="mapper-env" bind:value={selectedEnv} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
-          <option value="" disabled>-- Select Environment --</option>
-          {#each envs as env}
-            <option value={env.id}>{env.name}</option>
-          {/each}
-        </select>
-      </div>
-      <div>
-        <label for="mapper-ds-id" class="block text-sm font-medium text-gray-700">Dataset ID</label>
-        <input type="number" id="mapper-ds-id" bind:value={datasetId} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
-      </div>
-    </div>
-
-    <div>
-      <label class="block text-sm font-medium text-gray-700">Mapping Source</label>
-      <div class="mt-2 flex space-x-4">
-        <label class="inline-flex items-center">
-          <input type="radio" bind:group={source} value="postgres" class="focus:ring-indigo-500 h-4 w-4 text-indigo-600 border-gray-300" />
-          <span class="ml-2 text-sm text-gray-700">PostgreSQL</span>
-        </label>
-        <label class="inline-flex items-center">
-          <input type="radio" bind:group={source} value="excel" class="focus:ring-indigo-500 h-4 w-4 text-indigo-600 border-gray-300" />
-          <span class="ml-2 text-sm text-gray-700">Excel</span>
-        </label>
-      </div>
-    </div>
-
-    {#if source === 'postgres'}
-      <div class="space-y-4 p-4 bg-gray-50 rounded-md border border-gray-100">
       <div>
-        <label for="mapper-conn" class="block text-sm font-medium text-gray-700">Saved Connection</label>
-        <select id="mapper-conn" bind:value={selectedConnection} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm">
-          <option value="" disabled>-- Select Connection --</option>
-          {#each connections as conn}
-            <option value={conn.id}>{conn.name}</option>
-          {/each}
-        </select>
+        <Select
+          label={$t.mapper.environment}
+          bind:value={selectedEnv}
+          options={[
+            { value: '', label: $t.mapper.select_env },
+            ...envs.map(e => ({ value: e.id, label: e.name }))
+          ]}
+        />
       </div>
-      <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
-        <div>
-          <label for="mapper-table" class="block text-sm font-medium text-gray-700">Table Name</label>
-          <input type="text" id="mapper-table" bind:value={tableName} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
-        </div>
-        <div>
-          <label for="mapper-schema" class="block text-sm font-medium text-gray-700">Table Schema</label>
-          <input type="text" id="mapper-schema" bind:value={tableSchema} class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
-        </div>
+      <div>
+        <Input
+          label={$t.mapper.dataset_id}
+          type="number"
+          bind:value={datasetId}
+        />
       </div>
     </div>
-    {:else}
-      <div class="p-4 bg-gray-50 rounded-md border border-gray-100">
-        <label for="mapper-excel" class="block text-sm font-medium text-gray-700">Excel File Path</label>
-        <input type="text" id="mapper-excel" bind:value={excelPath} placeholder="/path/to/mapping.xlsx" class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm" />
-      </div>
-    {/if}
-
-    <div class="flex justify-end">
-      <button
-        on:click={handleRunMapper}
-        disabled={isRunning}
-        class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50"
-      >
-        {isRunning ? 'Starting...' : 'Run Mapper'}
-      </button>
+    <div>
+      <label class="block text-sm font-medium text-gray-700 mb-2">{$t.mapper.source}</label>
+      <div class="flex space-x-4">
+        <label class="inline-flex items-center">
+          <input type="radio" bind:group={source} value="postgres" class="focus:ring-blue-500 h-4 w-4 text-blue-600 border-gray-300" />
+          <span class="ml-2 text-sm text-gray-700">{$t.mapper.source_postgres}</span>
+        </label>
+        <label class="inline-flex items-center">
+          <input type="radio" bind:group={source} value="excel" class="focus:ring-blue-500 h-4 w-4 text-blue-600 border-gray-300" />
+          <span class="ml-2 text-sm text-gray-700">{$t.mapper.source_excel}</span>
+        </label>
+      </div>
+    </div>
+
+    {#if source === 'postgres'}
+      <div class="space-y-4 p-4 bg-gray-50 rounded-md border border-gray-100">
+        <div>
+          <Select
+            label={$t.mapper.connection}
+            bind:value={selectedConnection}
+            options={[
+              { value: '', label: $t.mapper.select_connection },
+              ...connections.map(c => ({ value: c.id, label: c.name }))
+            ]}
+          />
+        </div>
+        <div class="grid grid-cols-1 md:grid-cols-2 gap-4">
+          <div>
+            <Input
+              label={$t.mapper.table_name}
+              type="text"
+              bind:value={tableName}
+            />
+          </div>
+          <div>
+            <Input
+              label={$t.mapper.table_schema}
+              type="text"
+              bind:value={tableSchema}
+            />
+          </div>
+        </div>
+      </div>
+    {:else}
+      <div class="p-4 bg-gray-50 rounded-md border border-gray-100">
+        <Input
+          label={$t.mapper.excel_path}
+          type="text"
+          bind:value={excelPath}
+          placeholder="/path/to/mapping.xlsx"
+        />
+      </div>
+    {/if}
+
+    <div class="flex justify-end pt-2 space-x-3">
+      <Button
+        variant="secondary"
+        on:click={handleGenerateDocs}
+        disabled={isGeneratingDocs || isRunning}
+      >
+        {#if isGeneratingDocs}
+          <span class="animate-spin mr-1">↻</span> Generating...
+        {:else}
+          <span class="mr-1">✨</span> Generate Docs
+        {/if}
+      </Button>
+      <Button
+        variant="primary"
+        on:click={handleRunMapper}
+        disabled={isRunning}
+      >
+        {isRunning ? $t.mapper.starting : $t.mapper.run}
+      </Button>
     </div>
   </div>
-</div>
+  </Card>
+
+  <DocPreview
+    documentation={generatedDoc}
+    onCancel={() => generatedDoc = null}
+    onSave={handleApplyDoc}
+  />
 </div>
 <!-- [/SECTION] -->
 <!-- [/DEF:MapperTool:Component] -->
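The handleGenerateDocs handler added above picks the first provider flagged `is_active` and fails loudly when none exists. That selection step can be isolated as a small sketch (the provider shape `{ id, is_active }` mirrors the diff; the helper name and sample data are illustrative):

```javascript
// Sketch of the active-provider selection used by handleGenerateDocs.
// pickActiveProvider and the sample objects are hypothetical; only the
// find(p => p.is_active) step and the error text come from the diff.
function pickActiveProvider(providers) {
  const active = providers.find(p => p.is_active);
  if (!active) {
    throw new Error('No active LLM provider found');
  }
  return active;
}

const providers = [
  { id: 'provider-a', is_active: false },
  { id: 'provider-b', is_active: true },
];
console.log(pickActiveProvider(providers).id); // "provider-b"
```

Throwing (or, in the component, toasting and returning) before the task is queued keeps the backend from receiving a request with no usable provider.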
@@ -1,186 +0,0 @@
-<!-- [DEF:SearchTool:Component] -->
-<!--
-@SEMANTICS: search, tool, dataset, regex
-@PURPOSE: UI component for searching datasets using the SearchPlugin.
-@LAYER: UI
-@RELATION: USES -> frontend/src/services/toolsService.js
--->
-<script>
-  // [SECTION: IMPORTS]
-  import { onMount } from 'svelte';
-  import { runTask, getTaskStatus } from '../../services/toolsService.js';
-  import { selectedTask } from '../../lib/stores.js';
-  import { addToast } from '../../lib/toasts.js';
-  // [/SECTION]
-
-  let envs = [];
-  let selectedEnv = '';
-  let searchQuery = '';
-  let isRunning = false;
-  let results = null;
-  let pollInterval;
-
-  // [DEF:fetchEnvironments:Function]
-  // @PURPOSE: Fetches the list of available environments.
-  // @PRE: None.
-  // @POST: envs array is populated.
-  async function fetchEnvironments() {
-    try {
-      const res = await fetch('/api/environments');
-      envs = await res.json();
-    } catch (e) {
-      addToast('Failed to fetch environments', 'error');
-    }
-  }
-  // [/DEF:fetchEnvironments:Function]
-
-  // [DEF:handleSearch:Function]
-  // @PURPOSE: Triggers the SearchPlugin task.
-  // @PRE: selectedEnv and searchQuery must be set.
-  // @POST: Task is started and polling begins.
-  async function handleSearch() {
-    if (!selectedEnv || !searchQuery) {
-      addToast('Please select environment and enter query', 'warning');
-      return;
-    }
-
-    isRunning = true;
-    results = null;
-    try {
-      // Find the environment name from ID
-      const env = envs.find(e => e.id === selectedEnv);
-      const task = await runTask('search-datasets', {
-        env: env.name,
-        query: searchQuery
-      });
-
-      selectedTask.set(task);
-      startPolling(task.id);
-    } catch (e) {
-      isRunning = false;
-      addToast(e.message, 'error');
-    }
-  }
-  // [/DEF:handleSearch:Function]
-
-  // [DEF:startPolling:Function]
-  // @PURPOSE: Polls for task completion and results.
-  // @PRE: taskId is provided.
-  // @POST: pollInterval is set and results are updated on success.
-  function startPolling(taskId) {
-    if (pollInterval) clearInterval(pollInterval);
-
-    pollInterval = setInterval(async () => {
-      try {
-        const task = await getTaskStatus(taskId);
-        selectedTask.set(task);
-
-        if (task.status === 'SUCCESS') {
-          clearInterval(pollInterval);
-          isRunning = false;
-          results = task.result;
-          addToast('Search completed', 'success');
-        } else if (task.status === 'FAILED') {
-          clearInterval(pollInterval);
-          isRunning = false;
-          addToast('Search failed', 'error');
-        }
-      } catch (e) {
-        clearInterval(pollInterval);
-        isRunning = false;
-        addToast('Error polling task status', 'error');
-      }
-    }, 2000);
-  }
-  // [/DEF:startPolling:Function]
-
-  onMount(fetchEnvironments);
-</script>
-
-<!-- [SECTION: TEMPLATE] -->
-<div class="space-y-6">
-  <div class="bg-white p-6 rounded-lg shadow-sm border border-gray-200">
-    <h3 class="text-lg font-medium text-gray-900 mb-4">Search Dataset Metadata</h3>
-    <div class="grid grid-cols-1 md:grid-cols-2 gap-4 items-end">
-      <div>
-        <label for="env-select" class="block text-sm font-medium text-gray-700">Environment</label>
-        <select
-          id="env-select"
-          bind:value={selectedEnv}
-          class="mt-1 block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm rounded-md"
-        >
-          <option value="" disabled>-- Select Environment --</option>
-          {#each envs as env}
-            <option value={env.id}>{env.name}</option>
-          {/each}
-        </select>
-      </div>
-      <div>
-        <label for="search-query" class="block text-sm font-medium text-gray-700">Regex Pattern</label>
-        <input
-          type="text"
-          id="search-query"
-          bind:value={searchQuery}
-          placeholder="e.g. from dm.*\.account"
-          class="mt-1 block w-full border-gray-300 rounded-md shadow-sm focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm"
-        />
-      </div>
-    </div>
-    <div class="mt-4 flex justify-end">
-      <button
-        on:click={handleSearch}
-        disabled={isRunning}
-        class="inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white bg-indigo-600 hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 disabled:opacity-50"
-      >
-        {#if isRunning}
-          <svg class="animate-spin -ml-1 mr-3 h-5 w-5 text-white" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
-            <circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
-            <path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
-          </svg>
-          Searching...
-        {:else}
-          Search
-        {/if}
-      </button>
-    </div>
-  </div>
-
-  {#if results}
-    <div class="bg-white shadow overflow-hidden sm:rounded-md border border-gray-200">
-      <div class="px-4 py-5 sm:px-6 flex justify-between items-center bg-gray-50 border-b border-gray-200">
-        <h3 class="text-lg leading-6 font-medium text-gray-900">
-          Search Results
-        </h3>
-        <span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800">
-          {results.count} matches
-        </span>
-      </div>
-      <ul class="divide-y divide-gray-200">
-        {#each results.results as item}
-          <li class="p-4 hover:bg-gray-50">
-            <div class="flex items-center justify-between">
-              <div class="text-sm font-medium text-indigo-600 truncate">
-                {item.dataset_name} (ID: {item.dataset_id})
-              </div>
-              <div class="ml-2 flex-shrink-0 flex">
-                <p class="px-2 inline-flex text-xs leading-5 font-semibold rounded-full bg-green-100 text-green-800">
-                  Field: {item.field}
-                </p>
-              </div>
-            </div>
-            <div class="mt-2">
-              <pre class="text-xs text-gray-500 bg-gray-50 p-2 rounded border border-gray-100 overflow-x-auto">{item.match_context}</pre>
-            </div>
-          </li>
-        {/each}
-        {#if results.count === 0}
-          <li class="p-8 text-center text-gray-500 italic">
-            No matches found for the given pattern.
-          </li>
-        {/if}
-      </ul>
-    </div>
-  {/if}
-</div>
-<!-- [/SECTION] -->
-<!-- [/DEF:SearchTool:Component] -->
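The deleted SearchTool polled task status every two seconds and stopped on a terminal state. The generic pattern from the removed startPolling function can be sketched like this (getTaskStatus is stubbed here; the real one lives in toolsService.js):

```javascript
// Generic polling loop modeled on the removed startPolling function:
// poll until SUCCESS or FAILED, then clear the interval and report the task.
function pollTask(getTaskStatus, taskId, onDone, intervalMs = 2000) {
  const timer = setInterval(async () => {
    try {
      const task = await getTaskStatus(taskId);
      if (task.status === 'SUCCESS' || task.status === 'FAILED') {
        clearInterval(timer);
        onDone(task);
      }
    } catch (e) {
      // Treat a polling error as terminal, as the component did.
      clearInterval(timer);
      onDone({ status: 'FAILED', error: e });
    }
  }, intervalMs);
  return timer;
}

// Stubbed status endpoint: reports RUNNING twice, then SUCCESS.
let calls = 0;
const stubStatus = async () => (++calls < 3 ? { status: 'RUNNING' } : { status: 'SUCCESS' });
pollTask(stubStatus, 42, task => console.log(task.status), 10); // logs "SUCCESS"
```

Clearing the interval on every terminal branch (success, failure, and error) is what prevents the leaked timers the original guarded against with `if (pollInterval) clearInterval(pollInterval)`.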
@@ -25,6 +25,22 @@ export const getWsUrl = (taskId) => {
 };
 // [/DEF:getWsUrl:Function]
 
+// [DEF:getAuthHeaders:Function]
+// @PURPOSE: Returns headers with Authorization if token exists.
+function getAuthHeaders() {
+  const headers = {
+    'Content-Type': 'application/json',
+  };
+  if (typeof window !== 'undefined') {
+    const token = localStorage.getItem('auth_token');
+    if (token) {
+      headers['Authorization'] = `Bearer ${token}`;
+    }
+  }
+  return headers;
+}
+// [/DEF:getAuthHeaders:Function]
+
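The getAuthHeaders helper added above reads the token from localStorage, guarded by a `typeof window` check so it degrades gracefully during server-side rendering. It can be exercised outside the browser by stubbing the globals (the stubs below are test scaffolding, not part of the diff):

```javascript
// Usage sketch for the getAuthHeaders helper from the diff above,
// with window/localStorage stubbed to simulate a logged-in browser session.
globalThis.window = {};
globalThis.localStorage = {
  _store: { auth_token: 'abc123' },
  getItem(key) { return this._store[key] ?? null; },
};

function getAuthHeaders() {
  const headers = { 'Content-Type': 'application/json' };
  if (typeof window !== 'undefined') {
    const token = localStorage.getItem('auth_token');
    if (token) {
      headers['Authorization'] = `Bearer ${token}`;
    }
  }
  return headers;
}

console.log(getAuthHeaders().Authorization); // "Bearer abc123"
```

Without a stored token (or without `window`), the helper simply returns the Content-Type header, so anonymous requests keep working.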
 // [DEF:fetchApi:Function]
 // @PURPOSE: Generic GET request wrapper.
 // @PRE: endpoint string is provided.
@@ -34,10 +50,18 @@ export const getWsUrl = (taskId) => {
 async function fetchApi(endpoint) {
   try {
     console.log(`[api.fetchApi][Action] Fetching from context={{'endpoint': '${endpoint}'}}`);
-    const response = await fetch(`${API_BASE_URL}${endpoint}`);
+    const response = await fetch(`${API_BASE_URL}${endpoint}`, {
+      headers: getAuthHeaders()
+    });
+    console.log(`[api.fetchApi][Action] Received response context={{'status': ${response.status}, 'ok': ${response.ok}}}`);
     if (!response.ok) {
-      throw new Error(`API request failed with status ${response.status}`);
+      const errorData = await response.json().catch(() => ({}));
+      const message = errorData.detail
+        ? (typeof errorData.detail === 'string' ? errorData.detail : JSON.stringify(errorData.detail))
+        : `API request failed with status ${response.status}`;
+      throw new Error(message);
     }
+    if (response.status === 204) return null;
     return await response.json();
   } catch (error) {
     console.error(`[api.fetchApi][Coherence:Failed] Error fetching from ${endpoint}:`, error);
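The updated error path normalizes FastAPI-style `detail` payloads, which may be a plain string or structured validation data. The branch can be lifted out and checked in isolation (`errorMessage` is a hypothetical name for illustration; the ternary itself is verbatim from the diff):

```javascript
// Sketch of the error-message normalization added to fetchApi/postApi/requestApi:
// `detail` may be a string or structured data (e.g. a list of validation errors).
function errorMessage(errorData, status) {
  return errorData.detail
    ? (typeof errorData.detail === 'string' ? errorData.detail : JSON.stringify(errorData.detail))
    : `API request failed with status ${status}`;
}

console.log(errorMessage({ detail: 'Not found' }, 404));         // "Not found"
console.log(errorMessage({ detail: [{ loc: ['body'] }] }, 422)); // '[{"loc":["body"]}]'
console.log(errorMessage({}, 500)); // "API request failed with status 500"
```

Pairing this with `response.json().catch(() => ({}))` means non-JSON error bodies fall through to the generic status message instead of throwing a second error inside the handler.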
@@ -59,14 +83,18 @@ async function postApi(endpoint, body) {
     console.log(`[api.postApi][Action] Posting to context={{'endpoint': '${endpoint}'}}`);
     const response = await fetch(`${API_BASE_URL}${endpoint}`, {
       method: 'POST',
-      headers: {
-        'Content-Type': 'application/json',
-      },
+      headers: getAuthHeaders(),
       body: JSON.stringify(body),
     });
+    console.log(`[api.postApi][Action] Received response context={{'status': ${response.status}, 'ok': ${response.ok}}}`);
     if (!response.ok) {
-      throw new Error(`API request failed with status ${response.status}`);
+      const errorData = await response.json().catch(() => ({}));
+      const message = errorData.detail
+        ? (typeof errorData.detail === 'string' ? errorData.detail : JSON.stringify(errorData.detail))
+        : `API request failed with status ${response.status}`;
+      throw new Error(message);
     }
+    if (response.status === 204) return null;
     return await response.json();
   } catch (error) {
     console.error(`[api.postApi][Coherence:Failed] Error posting to ${endpoint}:`, error);
@@ -85,17 +113,24 @@ async function requestApi(endpoint, method = 'GET', body = null) {
     console.log(`[api.requestApi][Action] ${method} to context={{'endpoint': '${endpoint}'}}`);
     const options = {
       method,
-      headers: {
-        'Content-Type': 'application/json',
-      },
+      headers: getAuthHeaders(),
     };
     if (body) {
       options.body = JSON.stringify(body);
     }
     const response = await fetch(`${API_BASE_URL}${endpoint}`, options);
+    console.log(`[api.requestApi][Action] Received response context={{'status': ${response.status}, 'ok': ${response.ok}}}`);
     if (!response.ok) {
       const errorData = await response.json().catch(() => ({}));
-      throw new Error(errorData.detail || `API request failed with status ${response.status}`);
+      const message = errorData.detail
+        ? (typeof errorData.detail === 'string' ? errorData.detail : JSON.stringify(errorData.detail))
+        : `API request failed with status ${response.status}`;
+      console.error(`[api.requestApi][Action] Request failed context={{'status': ${response.status}, 'message': '${message}'}}`);
+      throw new Error(message);
+    }
+    if (response.status === 204) {
+      console.log('[api.requestApi][Action] 204 No Content received');
+      return null;
     }
     return await response.json();
   } catch (error) {
@@ -109,6 +144,9 @@ async function requestApi(endpoint, method = 'GET', body = null) {
 // [DEF:api:Data]
 // @PURPOSE: API client object with specific methods.
 export const api = {
+  fetchApi,
+  postApi,
+  requestApi,
   getPlugins: () => fetchApi('/plugins'),
   getTasks: () => fetchApi('/tasks'),
   getTask: (taskId) => fetchApi(`/tasks/${taskId}`),
@@ -132,6 +170,7 @@ export const api = {
 // [/DEF:api_module:Module]
 
 // Export individual functions for easier use in components
+export { requestApi };
 export const getPlugins = api.getPlugins;
 export const getTasks = api.getTasks;
 export const getTask = api.getTask;
@@ -3,16 +3,28 @@
 import Navbar from '../components/Navbar.svelte';
 import Footer from '../components/Footer.svelte';
 import Toast from '../components/Toast.svelte';
+import ProtectedRoute from '../components/auth/ProtectedRoute.svelte';
+import { page } from '$app/stores';
+
+$: isLoginPage = $page.url.pathname === '/login';
 </script>
 
 <Toast />
 
 <main class="bg-gray-50 min-h-screen flex flex-col">
-  <Navbar />
+  {#if isLoginPage}
+    <div class="p-4 flex-grow">
+      <slot />
+    </div>
+  {:else}
+    <ProtectedRoute>
+      <Navbar />
 
   <div class="p-4 flex-grow">
     <slot />
   </div>
 
   <Footer />
+    </ProtectedRoute>
+  {/if}
 </main>
@@ -23,10 +23,8 @@
    */
   function selectPlugin(plugin) {
     console.log(`[Dashboard][Action] Selecting plugin: ${plugin.id}`);
-    if (plugin.id === 'superset-migration') {
-      goto('/migration');
-    } else if (plugin.id === 'git-integration') {
-      goto('/git');
+    if (plugin.ui_route) {
+      goto(plugin.ui_route);
     } else {
       selectedPlugin.set(plugin);
     }
@@ -82,7 +80,7 @@
 {/if}
 
 <div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
-  {#each data.plugins as plugin}
+  {#each data.plugins.filter(p => p.id !== 'superset-search') as plugin}
     <div
       on:click={() => selectPlugin(plugin)}
       role="button"
frontend/src/routes/admin/roles/+page.svelte (new file, 236 lines)
@@ -0,0 +1,236 @@
<!-- [DEF:AdminRolesPage:Component] -->
<!--
@TIER: STANDARD
@SEMANTICS: admin, role-management, rbac
@PURPOSE: UI for managing system roles and their permissions.
@LAYER: Domain
@RELATION: DEPENDS_ON -> frontend.src.services.adminService
@RELATION: DEPENDS_ON -> frontend.src.components.auth.ProtectedRoute

@INVARIANT: Only accessible by users with Admin role.
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { onMount } from 'svelte';
  import { t } from '$lib/i18n';
  import ProtectedRoute from '../../../components/auth/ProtectedRoute.svelte';
  import { adminService } from '../../../services/adminService';
  // [/SECTION: IMPORTS]

  let roles = [];
  let permissions = [];
  let loading = true;
  let error = null;

  let showModal = false;
  let isEditing = false;
  let currentRoleId = null;
  let roleForm = {
    name: '',
    description: '',
    permissions: []
  };

  // [DEF:loadData:Function]
  /**
   * @purpose Fetches roles and available permissions.
   * @pre Component mounted.
   * @post roles and permissions arrays populated.
   */
  async function loadData() {
    console.log('[AdminRolesPage][loadData][Entry]');
    loading = true;
    try {
      [roles, permissions] = await Promise.all([
        adminService.getRoles(),
        adminService.getPermissions()
      ]);
      console.log('[AdminRolesPage][loadData][Coherence:OK]');
    } catch (e) {
      error = "Failed to load roles data.";
      console.error('[AdminRolesPage][loadData][Coherence:Failed]', e);
    } finally {
      loading = false;
    }
  }
  // [/DEF:loadData:Function]

  // [DEF:openCreateModal:Function]
  /**
   * @purpose Initializes state for creating a new role.
   * @pre None.
   * @post showModal is true, roleForm is reset.
   */
  function openCreateModal() {
    console.log("[openCreateModal][Action] Opening create modal");
    isEditing = false;
    currentRoleId = null;
    roleForm = { name: '', description: '', permissions: [] };
    showModal = true;
  }
  // [/DEF:openCreateModal:Function]

  // [DEF:openEditModal:Function]
  /**
   * @purpose Initializes state for editing an existing role.
   * @pre role object is provided.
   * @post showModal is true, roleForm is populated.
   */
  function openEditModal(role) {
    console.log(`[openEditModal][Action] Opening edit modal for role ${role.id}`);
    isEditing = true;
    currentRoleId = role.id;
    roleForm = {
      name: role.name,
      description: role.description || '',
      permissions: role.permissions.map(p => p.id)
    };
    showModal = true;
  }
  // [/DEF:openEditModal:Function]

  // [DEF:handleSaveRole:Function]
  /**
   * @purpose Submits role data (create or update).
   * @pre roleForm contains valid data.
   * @post Role is saved, modal closed, data reloaded.
   */
  async function handleSaveRole() {
    console.log('[AdminRolesPage][handleSaveRole][Entry]');
    try {
      if (isEditing) {
        await adminService.updateRole(currentRoleId, roleForm);
      } else {
        await adminService.createRole(roleForm);
      }
      showModal = false;
      await loadData();
      console.log('[AdminRolesPage][handleSaveRole][Coherence:OK]');
    } catch (e) {
      alert("Failed to save role: " + e.message);
      console.error('[AdminRolesPage][handleSaveRole][Coherence:Failed]', e);
    }
  }
  // [/DEF:handleSaveRole:Function]

  // [DEF:handleDeleteRole:Function]
  /**
   * @purpose Deletes a role after confirmation.
   * @pre role object is provided.
   * @post Role is deleted if confirmed, data reloaded.
   */
  async function handleDeleteRole(role) {
    if (!confirm($t.admin.roles.confirm_delete.replace('{name}', role.name))) return;

    console.log('[AdminRolesPage][handleDeleteRole][Entry]');
    try {
      await adminService.deleteRole(role.id);
      await loadData();
      console.log('[AdminRolesPage][handleDeleteRole][Coherence:OK]');
    } catch (e) {
      alert("Failed to delete role: " + e.message);
      console.error('[AdminRolesPage][handleDeleteRole][Coherence:Failed]', e);
    }
  }
  // [/DEF:handleDeleteRole:Function]

  onMount(loadData);
</script>

<ProtectedRoute requiredPermission="admin:roles">
  <!-- [SECTION: TEMPLATE] -->
  <div class="container mx-auto p-4">
    <div class="flex justify-between items-center mb-6">
      <h1 class="text-2xl font-bold">{$t.admin.roles.title}</h1>
      <button
        class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700"
        on:click={openCreateModal}
      >
        {$t.admin.roles.create}
      </button>
    </div>

    {#if loading}
      <p>{$t.admin.roles.loading}</p>
    {:else if error}
      <div class="bg-red-100 text-red-700 p-4 rounded">{error}</div>
    {:else}
      <div class="bg-white shadow rounded-lg overflow-hidden">
        <table class="min-w-full divide-y divide-gray-200">
          <thead class="bg-gray-50">
            <tr>
              <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.admin.roles.name}</th>
              <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.admin.roles.description}</th>
              <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.admin.roles.permissions}</th>
              <th class="px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.common.actions}</th>
            </tr>
          </thead>
          <tbody class="bg-white divide-y divide-gray-200">
            {#each roles as role}
              <tr>
                <td class="px-6 py-4 whitespace-nowrap font-medium">{role.name}</td>
                <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">{role.description || '-'}</td>
                <td class="px-6 py-4">
                  <div class="flex flex-wrap gap-1">
                    {#each role.permissions as perm}
                      <span class="px-2 py-0.5 bg-blue-50 text-blue-700 text-xs rounded border border-blue-100">
                        {perm.resource}:{perm.action}
                      </span>
                    {/each}
                  </div>
                </td>
                <td class="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
                  <button on:click={() => openEditModal(role)} class="text-blue-600 hover:text-blue-900 mr-3">{$t.common.edit}</button>
                  <button on:click={() => handleDeleteRole(role)} class="text-red-600 hover:text-red-900">{$t.common.delete}</button>
                </td>
              </tr>
            {/each}
          </tbody>
        </table>
      </div>
    {/if}

    {#if showModal}
      <div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center p-4 z-50">
        <div class="bg-white rounded-lg p-6 max-w-2xl w-full max-h-[90vh] overflow-y-auto">
          <h2 class="text-xl font-bold mb-4">
            {isEditing ? $t.admin.roles.modal_edit_title : $t.admin.roles.modal_create_title}
          </h2>
          <form on:submit|preventDefault={handleSaveRole}>
            <div class="mb-4">
              <label class="block text-sm font-medium mb-1">{$t.admin.roles.name}</label>
              <input type="text" bind:value={roleForm.name} class="w-full border p-2 rounded" required readonly={isEditing} />
            </div>
            <div class="mb-4">
              <label class="block text-sm font-medium mb-1">{$t.admin.roles.description}</label>
              <textarea bind:value={roleForm.description} class="w-full border p-2 rounded h-20"></textarea>
            </div>
            <div class="mb-6">
              <label class="block text-sm font-medium mb-2">{$t.admin.roles.permissions}</label>
              <div class="grid grid-cols-2 md:grid-cols-3 gap-2 border p-3 rounded bg-gray-50">
                {#each permissions as perm}
                  <label class="flex items-center space-x-2 p-1 hover:bg-white rounded cursor-pointer">
                    <input type="checkbox" value={perm.id} bind:group={roleForm.permissions} class="rounded text-blue-600" />
                    <span class="text-xs">{perm.resource}:{perm.action}</span>
                  </label>
                {/each}
              </div>
              <p class="text-xs text-gray-500 mt-2">{$t.admin.roles.permissions_hint}</p>
            </div>
            <div class="flex justify-end gap-2 pt-4 border-t">
              <button type="button" class="px-4 py-2 text-gray-600" on:click={() => showModal = false}>{$t.common.cancel}</button>
              <button type="submit" class="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700">{$t.common.save}</button>
            </div>
          </form>
        </div>
      </div>
    {/if}
  </div>
  <!-- [/SECTION: TEMPLATE] -->
</ProtectedRoute>

<style>
</style>

<!-- [/DEF:AdminRolesPage:Component] -->
frontend/src/routes/admin/settings/+page.svelte (new file, 213 lines)
@@ -0,0 +1,213 @@
<!-- [DEF:AdminSettingsPage:Component] -->
<!--
@SEMANTICS: admin, adfs, mappings, configuration
@PURPOSE: UI for configuring Active Directory Group to local Role mappings for ADFS SSO.
@LAYER: Feature
@RELATION: DEPENDS_ON -> frontend.src.services.adminService
@RELATION: DEPENDS_ON -> frontend.src.components.auth.ProtectedRoute

@INVARIANT: Only accessible by users with "admin:settings" permission.
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { onMount } from 'svelte';
  import { t } from '$lib/i18n';
  import ProtectedRoute from '../../../components/auth/ProtectedRoute.svelte';
  import { adminService } from '../../../services/adminService';
  // [/SECTION: IMPORTS]

  let mappings = [];
  let roles = [];
  let loading = true;
  let error = null;

  let showCreateModal = false;
  let newMapping = {
    ad_group: '',
    role_id: ''
  };

  // [DEF:loadData:Function]
  /**
   * @purpose Fetches AD mappings and roles from the backend to populate the UI.
   * @pre Component is mounted and user has active session.
   * @post mappings and roles variables are updated with backend data.
   * @returns {Promise<void>}
   * @side_effect Updates local 'mappings', 'roles', 'loading', and 'error' states.
   * @relation CALLS -> adminService.getRoles
   * @relation CALLS -> adminService.getADGroupMappings
   */
  async function loadData() {
    console.log('[AdminSettingsPage][loadData][Entry]');
    loading = true;
    try {
      // Fetch roles first as they are required for displaying mapping labels
      roles = await adminService.getRoles();

      try {
        mappings = await adminService.getADGroupMappings();
      } catch (e) {
        console.warn("[AdminSettingsPage][loadData] AD Mappings endpoint potentially unavailable.");
      }

      console.log('[AdminSettingsPage][loadData][Coherence:OK]');
    } catch (e) {
      error = "Failed to load roles or configuration.";
      console.error('[AdminSettingsPage][loadData][Coherence:Failed]', e);
    } finally {
      loading = false;
    }
  }
  // [/DEF:loadData:Function]

  // [DEF:handleCreateMapping:Function]
  /**
   * @purpose Submits a new AD Group to Role mapping to the backend.
   * @pre 'newMapping' object contains valid 'ad_group' and 'role_id'.
   * @post A new mapping is created in the database and the table is refreshed.
   * @returns {Promise<void>}
   * @side_effect Closes the modal on success, shows alert on failure.
   * @relation CALLS -> adminService.createADGroupMapping
   */
  async function handleCreateMapping() {
    console.log('[AdminSettingsPage][handleCreateMapping][Entry]');

    // Guard Clause (@PRE)
    if (!newMapping.ad_group || !newMapping.role_id) {
      alert("Please fill in all fields.");
      return;
    }

    try {
      await adminService.createADGroupMapping(newMapping);
      showCreateModal = false;
      // Reset form
      newMapping = { ad_group: '', role_id: '' };
      await loadData();
      console.log('[AdminSettingsPage][handleCreateMapping][Coherence:OK]');
    } catch (e) {
      alert("Failed to create mapping: " + (e.message || "Unknown error"));
      console.error('[AdminSettingsPage][handleCreateMapping][Coherence:Failed]', e);
    }
  }
  // [/DEF:handleCreateMapping:Function]

  onMount(loadData);
</script>

<ProtectedRoute requiredPermission="admin:settings">
  <!-- [SECTION: TEMPLATE] -->
  <div class="container mx-auto p-4">
    <div class="flex justify-between items-center mb-6">
      <h1 class="text-2xl font-bold">{$t.admin.settings.title}</h1>
      <button
        class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700 transition-colors"
        on:click={() => showCreateModal = true}
      >
        {$t.admin.settings.add_mapping}
      </button>
    </div>

    {#if loading}
      <div class="flex justify-center py-8">
        <p class="text-gray-500 animate-pulse">{$t.common.loading}</p>
      </div>
    {:else if error}
      <div class="bg-red-100 border-l-4 border-red-500 text-red-700 p-4 rounded mb-4" role="alert">
        <p class="font-bold">{$t.common.error}</p>
        <p>{error}</p>
      </div>
    {:else}
      <div class="bg-white shadow rounded-lg overflow-hidden border border-gray-200">
        <table class="min-w-full divide-y divide-gray-200">
          <thead class="bg-gray-50">
            <tr>
              <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.admin.settings.ad_group}</th>
              <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.admin.settings.local_role}</th>
            </tr>
          </thead>
          <tbody class="bg-white divide-y divide-gray-200">
            {#each mappings as mapping}
              <tr class="hover:bg-gray-50 transition-colors">
                <td class="px-6 py-4 whitespace-nowrap font-mono text-sm text-gray-600">{mapping.ad_group}</td>
                <td class="px-6 py-4 whitespace-nowrap">
                  <span class="px-2 py-1 bg-blue-100 text-blue-800 text-xs font-semibold rounded-full">
                    {roles.find(r => r.id === mapping.role_id)?.name || mapping.role_id}
                  </span>
                </td>
              </tr>
            {/each}
            {#if mappings.length === 0}
              <tr>
                <td colspan="2" class="px-6 py-12 text-center text-gray-500">
                  <div class="flex flex-col items-center">
                    <svg class="w-12 h-12 text-gray-300 mb-2" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                      <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"></path>
                    </svg>
                    <p>{$t.admin.settings.no_mappings}</p>
                  </div>
                </td>
              </tr>
            {/if}
          </tbody>
        </table>
      </div>
    {/if}

    {#if showCreateModal}
      <div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center p-4 z-50">
        <div class="bg-white rounded-lg shadow-xl p-6 max-w-md w-full">
          <h2 class="text-xl font-bold mb-4 border-b pb-2">{$t.admin.settings.modal_title}</h2>
          <form on:submit|preventDefault={handleCreateMapping}>
            <div class="mb-4">
              <label class="block text-sm font-medium text-gray-700 mb-1">{$t.admin.settings.ad_group_dn}</label>
              <input
                type="text"
                bind:value={newMapping.ad_group}
                class="w-full border border-gray-300 p-2 rounded focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
                placeholder="e.g. CN=SS_ADMINS,OU=Groups,DC=org"
                required
              />
              <p class="text-xs text-gray-500 mt-1">{$t.admin.settings.ad_group_hint}</p>
            </div>
            <div class="mb-6">
              <label class="block text-sm font-medium text-gray-700 mb-1">{$t.admin.settings.local_role_select}</label>
              <select
                bind:value={newMapping.role_id}
                class="w-full border border-gray-300 p-2 rounded focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
                required
              >
                <option value="" disabled>{$t.admin.settings.select_role}</option>
                {#each roles as role}
                  <option value={role.id}>{role.name}</option>
                {/each}
              </select>
            </div>
            <div class="flex justify-end gap-3">
              <button
                type="button"
                class="px-4 py-2 text-gray-600 hover:text-gray-800 font-medium"
                on:click={() => showCreateModal = false}
              >
                {$t.common.cancel}
              </button>
              <button
                type="submit"
                class="px-6 py-2 bg-blue-600 text-white rounded font-bold hover:bg-blue-700 shadow-md"
              >
                {$t.common.save}
              </button>
            </div>
          </form>
        </div>
      </div>
    {/if}
  </div>
  <!-- [/SECTION: TEMPLATE] -->
</ProtectedRoute>

<style>
</style>

<!-- [/DEF:AdminSettingsPage:Component] -->
Some files were not shown because too many files have changed in this diff.