Compare commits
23 Commits
010-refactor-cli-to-web ... 016-multi-user-auth
| Author | SHA1 | Date |
|---|---|---|
| | 0e0e26e2f7 | |
| | 18b42f8dd0 | |
| | e7b31accd6 | |
| | d3c3a80ed2 | |
| | cc244c2d86 | |
| | d10c23e658 | |
| | 1042b35d1b | |
| | 16ffeb1ed6 | |
| | da34deac02 | |
| | 51e9ee3fcc | |
| | edf9286071 | |
| | a542e7d2df | |
| | a863807cf2 | |
| | e2bc68683f | |
| | 43cb82697b | |
| | 4ba28cf93e | |
| | 343f2e29f5 | |
| | c9a53578fd | |
| | 07ec2d9797 | |
| | e9d3f3c827 | |
| | 26ba015b75 | |
| | 49129d3e86 | |
| | d99a13d91f | |
6
.gitignore
vendored
@@ -59,9 +59,13 @@ Thumbs.db
*.ps1
keyring passwords.py
*github*
*git*

*tech_spec*
dashboards
backend/mappings.db


backend/tasks.db
backend/logs
backend/auth.db
semantics/reports
@@ -20,6 +20,18 @@ Auto-generated from all feature plans. Last updated: 2025-12-19
- SQLite (`tasks.db`), JSON (`config.json`) (009-backup-scheduler)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, `superset_tool` (internal lib) (010-refactor-cli-to-web)
- SQLite (for job history/results, connection configs), Filesystem (for temporary file uploads) (010-refactor-cli-to-web)
- Python 3.9+ + FastAPI, Pydantic, requests, pyyaml (migrated from superset_tool) (012-remove-superset-tool)
- SQLite (tasks.db, migrations.db), Filesystem (012-remove-superset-tool)
- Filesystem (local git repo), SQLite (for GitServerConfig, Environment) (011-git-integration-dashboard)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, GitPython (or CLI git), Pydantic, SQLAlchemy, Superset API (011-git-integration-dashboard)
- SQLite (for config/history), Filesystem (local Git repositories) (011-git-integration-dashboard)
- Node.js 18+ (Frontend Build), Svelte 5.x + SvelteKit, Tailwind CSS, `date-fns` (existing) (013-unify-frontend-css)
- LocalStorage (for language preference) (013-unify-frontend-css)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit (Frontend) (014-file-storage-ui)
- Local Filesystem (for artifacts), Config (for storage path) (014-file-storage-ui)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend) (015-frontend-nav-redesign)
- N/A (UI reorganization and API integration) (015-frontend-nav-redesign)
- SQLite (`auth.db`) for Users, Roles, Permissions, and Mappings. (016-multi-user-auth)

- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)

@@ -40,9 +52,9 @@ cd src; pytest; ruff check .
Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions

## Recent Changes
- 010-refactor-cli-to-web: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, Pydantic, SQLAlchemy, `superset_tool` (internal lib)
- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
- 009-backup-scheduler: Added Python 3.9+, Node.js 18+ + FastAPI, APScheduler, SQLAlchemy, SvelteKit, Tailwind CSS
- 016-multi-user-auth: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
- 015-frontend-nav-redesign: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend)
- 014-file-storage-ui: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit (Frontend)

<!-- MANUAL ADDITIONS START -->
4
.kilocode/workflows/read_semantic.md
Normal file
@@ -0,0 +1,4 @@
---
description: USE SEMANTIC
---
Read semantic_protocol.md. You MUST use it during development.
@@ -1,11 +1,10 @@
<!--
SYNC IMPACT REPORT
Version: 1.7.1 (Simplified Workflow)
Version: 1.9.0 (Security & RBAC Mandate)
Changes:
- Simplified Generation Workflow to a single phase: Code Generation from `tasks.md`.
- Removed multi-phase Architecture/Implementation split to streamline development.
- Added Principle IX: Security & Access Control (Mandating granular permissions for plugins).
Templates Status:
- .specify/templates/plan-template.md: ✅ Aligned (Dynamic check).
- .specify/templates/plan-template.md: ✅ Aligned.
- .specify/templates/spec-template.md: ✅ Aligned.
- .specify/templates/tasks-template.md: ✅ Aligned.
-->
@@ -37,6 +36,16 @@ To maintain semantic coherence, code must adhere to the complexity limits (Modul
### VII. Everything is a Plugin
All functional extensions, tools, or major features must be implemented as modular Plugins inheriting from `PluginBase`. Logic should not reside in standalone services or scripts unless strictly necessary for core infrastructure. This ensures a unified execution model via the `TaskManager`, consistent logging, and modularity.

### VIII. Unified Frontend Experience
To ensure a consistent and accessible user experience, all frontend implementations must strictly adhere to the unified design and localization standards.
- **Component Reusability**: All UI elements MUST utilize the standardized Svelte component library (`src/lib/ui`) and centralized design tokens. Ad-hoc styling and hardcoded values are prohibited.
- **Internationalization (i18n)**: All user-facing text MUST be extracted to the translation system (`src/lib/i18n`). Hardcoded strings in the UI are prohibited.

### IX. Security & Access Control
To support the Role-Based Access Control (RBAC) system, all functional components must define explicit permissions.
- **Granular Permissions**: Every Plugin MUST define a unique permission string (e.g., `plugin:name:execute`) required for its operation.
- **Registration**: These permissions MUST be registered in the system database during initialization or plugin loading to ensure they are available for role assignment in the Admin UI.

## File Structure Standards
Refer to **Section III (File Structure Standard)** in `semantic_protocol.md` for the authoritative definitions of:
- Python Module Headers (`.py`)
@@ -64,4 +73,4 @@ This Constitution establishes the "Semantic Code Generation Protocol" as the sup
- **Amendments**: Changes to core principles require a Constitution amendment. Changes to technical syntax require a Protocol update.
- **Compliance**: Failure to adhere to the Protocol constitutes a build failure.

**Version**: 1.7.1 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-13
**Version**: 1.9.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-27
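To make Principles VII and IX concrete, here is a minimal sketch of a plugin declaring its permission string and a loader hook registering it. `PluginBase` and the `plugin:name:execute` format come from the constitution text above; the repository method names echo `AuthRepository` as used in `admin.py` later in this diff, but the loader itself and `create_permission` are hypothetical, not the project's confirmed code.

```python
# Minimal sketch, assuming a PluginBase with a class-level permission string.

class PluginBase:
    """Stand-in for the project's real plugin base class."""
    permission: str = ""

class BackupPlugin(PluginBase):
    # Principle IX: every plugin defines a unique permission string.
    permission = "plugin:backup:execute"

def register_plugin_permissions(plugins, repo):
    """Hypothetical initialization hook: ensure each plugin's permission
    exists in auth.db so it can be assigned to roles in the Admin UI."""
    for plugin in plugins:
        resource, action = plugin.permission.rsplit(":", 1)
        if not repo.get_permission_by_resource_action(resource, action):
            repo.create_permission(resource, action)  # assumed repository method
```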
BIN
backend/backend/auth.db
Normal file
Binary file not shown.
1
backend/backend/git_repos/12
Submodule
Submodule backend/backend/git_repos/12 added at f46772443a
@@ -1,269 +0,0 @@
2025-12-20 19:55:11,325 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 19:55:11,325 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 19:55:11,327 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 43, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 21:01:49,905 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 21:01:49,906 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 21:01:49,988 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 21:01:49,990 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:42:32,538 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 22:42:32,538 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:42:32,583 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:42:32,587 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:54:29,770 - INFO - [BackupPlugin][Entry] Starting backup for .
2025-12-20 22:54:29,771 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:54:29,831 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:54:29,833 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:54:34,078 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 22:54:34,078 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:54:34,079 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:54:34,079 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 22:59:25,060 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 22:59:25,060 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 22:59:25,114 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 22:59:25,117 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:00:31,156 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:00:31,156 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:00:31,157 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:00:31,162 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:00:34,710 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:00:34,710 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:00:34,710 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:00:34,711 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:01:43,894 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:01:43,894 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:01:43,895 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:01:43,895 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:04:07,731 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:04:07,731 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:04:07,732 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:04:07,732 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:06:39,641 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:06:39,642 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:06:39,687 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:06:39,689 - CRITICAL - [setup_clients][Failure] Critical error during client initialization: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
Traceback (most recent call last):
  File "/home/user/ss-tools/superset_tool/utils/init_clients.py", line 66, in setup_clients
    config = SupersetConfig(
             ^^^^^^^^^^^^^^^
  File "/home/user/ss-tools/backend/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for SupersetConfig
base_url
Value error, Invalid URL format: https://superset.bebesh.ru. Must include '/api/v1'. [type=value_error, input_value='https://superset.bebesh.ru', input_type=str]
For further information visit https://errors.pydantic.dev/2.12/v/value_error
2025-12-20 23:30:36,090 - INFO - [BackupPlugin][Entry] Starting backup for superset.
2025-12-20 23:30:36,093 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-20 23:30:36,128 - INFO - [setup_clients][Action] Loading environments from ConfigManager
2025-12-20 23:30:36,129 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-20 23:30:36,129 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-20 23:30:36,130 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-20 23:30:36,130 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-20 23:30:36,130 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-20 23:30:36,130 - INFO - [get_dashboards][Enter] Fetching dashboards.
2025-12-20 23:30:36,131 - INFO - [authenticate][Enter] Authenticating to https://superset.bebesh.ru/api/v1
2025-12-20 23:30:36,897 - INFO - [authenticate][Exit] Authenticated successfully.
2025-12-20 23:30:37,527 - INFO - [get_dashboards][Exit] Found 11 dashboards.
2025-12-20 23:30:37,527 - INFO - [BackupPlugin][Progress] Found 11 dashboards to export in superset.
2025-12-20 23:30:37,529 - INFO - [export_dashboard][Enter] Exporting dashboard 11.
2025-12-20 23:30:38,224 - INFO - [export_dashboard][Exit] Exported dashboard 11 to dashboard_export_20251220T203037.zip.
2025-12-20 23:30:38,225 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:38,226 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/FCC New Coder Survey 2018/dashboard_export_20251220T203037.zip
2025-12-20 23:30:38,227 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/FCC New Coder Survey 2018
2025-12-20 23:30:38,230 - INFO - [export_dashboard][Enter] Exporting dashboard 10.
2025-12-20 23:30:38,438 - INFO - [export_dashboard][Exit] Exported dashboard 10 to dashboard_export_20251220T203038.zip.
2025-12-20 23:30:38,438 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:38,439 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/COVID Vaccine Dashboard/dashboard_export_20251220T203038.zip
2025-12-20 23:30:38,439 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/COVID Vaccine Dashboard
2025-12-20 23:30:38,440 - INFO - [export_dashboard][Enter] Exporting dashboard 9.
2025-12-20 23:30:38,853 - INFO - [export_dashboard][Exit] Exported dashboard 9 to dashboard_export_20251220T203038.zip.
2025-12-20 23:30:38,853 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:38,856 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Sales Dashboard/dashboard_export_20251220T203038.zip
2025-12-20 23:30:38,856 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Sales Dashboard
2025-12-20 23:30:38,858 - INFO - [export_dashboard][Enter] Exporting dashboard 8.
2025-12-20 23:30:38,939 - INFO - [export_dashboard][Exit] Exported dashboard 8 to dashboard_export_20251220T203038.zip.
2025-12-20 23:30:38,940 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:38,941 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Unicode Test/dashboard_export_20251220T203038.zip
2025-12-20 23:30:38,941 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Unicode Test
2025-12-20 23:30:38,942 - INFO - [export_dashboard][Enter] Exporting dashboard 7.
2025-12-20 23:30:39,148 - INFO - [export_dashboard][Exit] Exported dashboard 7 to dashboard_export_20251220T203038.zip.
2025-12-20 23:30:39,148 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:39,149 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Video Game Sales/dashboard_export_20251220T203038.zip
2025-12-20 23:30:39,149 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Video Game Sales
2025-12-20 23:30:39,150 - INFO - [export_dashboard][Enter] Exporting dashboard 6.
2025-12-20 23:30:39,689 - INFO - [export_dashboard][Exit] Exported dashboard 6 to dashboard_export_20251220T203039.zip.
2025-12-20 23:30:39,689 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:39,690 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Featured Charts/dashboard_export_20251220T203039.zip
2025-12-20 23:30:39,691 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Featured Charts
2025-12-20 23:30:39,692 - INFO - [export_dashboard][Enter] Exporting dashboard 5.
2025-12-20 23:30:39,960 - INFO - [export_dashboard][Exit] Exported dashboard 5 to dashboard_export_20251220T203039.zip.
2025-12-20 23:30:39,960 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:39,961 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Slack Dashboard/dashboard_export_20251220T203039.zip
2025-12-20 23:30:39,961 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Slack Dashboard
2025-12-20 23:30:39,962 - INFO - [export_dashboard][Enter] Exporting dashboard 4.
2025-12-20 23:30:40,196 - INFO - [export_dashboard][Exit] Exported dashboard 4 to dashboard_export_20251220T203039.zip.
2025-12-20 23:30:40,196 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:40,197 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/deck.gl Demo/dashboard_export_20251220T203039.zip
2025-12-20 23:30:40,197 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/deck.gl Demo
2025-12-20 23:30:40,198 - INFO - [export_dashboard][Enter] Exporting dashboard 3.
2025-12-20 23:30:40,745 - INFO - [export_dashboard][Exit] Exported dashboard 3 to dashboard_export_20251220T203040.zip.
2025-12-20 23:30:40,746 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:40,760 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/Misc Charts/dashboard_export_20251220T203040.zip
2025-12-20 23:30:40,761 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/Misc Charts
2025-12-20 23:30:40,762 - INFO - [export_dashboard][Enter] Exporting dashboard 2.
2025-12-20 23:30:40,928 - INFO - [export_dashboard][Exit] Exported dashboard 2 to dashboard_export_20251220T203040.zip.
2025-12-20 23:30:40,929 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:40,930 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/USA Births Names/dashboard_export_20251220T203040.zip
2025-12-20 23:30:40,931 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/USA Births Names
2025-12-20 23:30:40,932 - INFO - [export_dashboard][Enter] Exporting dashboard 1.
2025-12-20 23:30:41,582 - INFO - [export_dashboard][Exit] Exported dashboard 1 to dashboard_export_20251220T203040.zip.
2025-12-20 23:30:41,582 - INFO - [save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: False
2025-12-20 23:30:41,749 - INFO - [save_and_unpack_dashboard][State] Dashboard saved to: backups/SUPERSET/World Bank's Data/dashboard_export_20251220T203040.zip
2025-12-20 23:30:41,750 - INFO - [archive_exports][Enter] Managing archive in backups/SUPERSET/World Bank's Data
2025-12-20 23:30:41,752 - INFO - [consolidate_archive_folders][Enter] Consolidating archives in backups/SUPERSET
2025-12-20 23:30:41,753 - INFO - [remove_empty_directories][Enter] Starting cleanup of empty directories in backups/SUPERSET
2025-12-20 23:30:41,758 - INFO - [remove_empty_directories][Exit] Removed 0 empty directories.
2025-12-20 23:30:41,758 - INFO - [BackupPlugin][CoherenceCheck:Passed] Backup logic completed for superset.
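Every failure above traces to the same Pydantic check: `SupersetConfig` rejects any `base_url` that lacks the `/api/v1` suffix. A minimal sketch of such a validator (Pydantic 2.x style, matching `pydantic==2.12.5` in requirements; every field beyond `base_url` and the class body itself are assumptions, not the project's confirmed model):

```python
from pydantic import BaseModel, field_validator

class SupersetConfig(BaseModel):
    base_url: str  # sketch: the real model carries more fields

    @field_validator("base_url")
    @classmethod
    def require_api_suffix(cls, v: str) -> str:
        # Reproduces the error seen in the log for
        # "https://superset.bebesh.ru" (missing the /api/v1 suffix).
        if "/api/v1" not in v:
            raise ValueError(f"Invalid URL format: {v}. Must include '/api/v1'.")
        return v
```

The successful run at 23:30:36 authenticates against `https://superset.bebesh.ru/api/v1`, consistent with the URL simply being corrected in the environment config.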
Binary file not shown.
@@ -1,10 +1,17 @@
#!/usr/bin/env python3
"""Script to delete tasks with RUNNING status from the database."""
# [DEF:backend.delete_running_tasks:Module]
# @PURPOSE: Script to delete tasks with RUNNING status from the database.
# @LAYER: Utility
# @SEMANTICS: maintenance, database, cleanup

from sqlalchemy.orm import Session
from src.core.database import TasksSessionLocal
from src.models.task import TaskRecord

# [DEF:delete_running_tasks:Function]
# @PURPOSE: Delete all tasks with RUNNING status from the database.
# @PRE: Database is accessible and TaskRecord model is defined.
# @POST: All tasks with status 'RUNNING' are removed from the database.
def delete_running_tasks():
    """Delete all tasks with RUNNING status from the database."""
    session: Session = TasksSessionLocal()
@@ -30,6 +37,8 @@ def delete_running_tasks():
        print(f"Error deleting tasks: {e}")
    finally:
        session.close()
# [/DEF:delete_running_tasks:Function]

if __name__ == "__main__":
    delete_running_tasks()
# [/DEF:backend.delete_running_tasks:Module]
58244
backend/logs/app.log.1
Normal file
File diff suppressed because it is too large
Binary file not shown.
@@ -25,9 +25,13 @@ keyring==25.7.0
more-itertools==10.8.0
pycparser==2.23
pydantic==2.12.5
pydantic-settings
pydantic_core==2.41.5
python-multipart==0.0.21
PyYAML==6.0.3
passlib[bcrypt]
python-jose[cryptography]
PyJWT
RapidFuzz==3.14.3
referencing==0.37.0
requests==2.32.5
@@ -42,4 +46,8 @@ urllib3==2.6.2
uvicorn==0.38.0
websockets==15.0.1
pandas
psycopg2-binary
psycopg2-binary
openpyxl
GitPython==3.1.44
itsdangerous
email-validator
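The additions above (`passlib[bcrypt]`, `python-jose[cryptography]`, `PyJWT`, `itsdangerous`, `email-validator`) back the new auth stack. A plausible sketch of how `get_password_hash` (imported by `admin.py` below) and token creation could be wired; the secret handling and every name other than `get_password_hash` are assumptions, not the project's confirmed code:

```python
from datetime import datetime, timedelta, timezone
from jose import jwt
from passlib.context import CryptContext

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
SECRET_KEY = "change-me"  # assumption: loaded from env/config in the real app
ALGORITHM = "HS256"

def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)

def verify_password(plain: str, hashed: str) -> bool:
    return pwd_context.verify(plain, hashed)

def create_access_token(subject: str, expires_minutes: int = 30) -> str:
    # Standard JWT claims: subject plus an absolute expiry.
    claims = {
        "sub": subject,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=expires_minutes),
    }
    return jwt.encode(claims, SECRET_KEY, algorithm=ALGORITHM)
```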
@@ -1,59 +1,118 @@
# [DEF:AuthModule:Module]
# @SEMANTICS: auth, authentication, adfs, oauth, middleware
# @PURPOSE: Implements ADFS authentication using Authlib for FastAPI. It provides a dependency to protect endpoints.
# @LAYER: UI (API)
# @RELATION: Used by API routers to protect endpoints that require authentication.
# [DEF:backend.src.api.auth:Module]
#
# @SEMANTICS: api, auth, routes, login, logout
# @PURPOSE: Authentication API endpoints.
# @LAYER: API
# @RELATION: USES -> backend.src.services.auth_service.AuthService
# @RELATION: USES -> backend.src.core.database.get_auth_db
#
# @INVARIANT: All auth endpoints must return consistent error codes.

from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2AuthorizationCodeBearer
from authlib.integrations.starlette_client import OAuth
from starlette.config import Config
# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException, status
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.orm import Session
from ..core.database import get_auth_db
from ..services.auth_service import AuthService
from ..schemas.auth import Token, User as UserSchema
from ..dependencies import get_current_user
from ..core.auth.oauth import oauth, is_adfs_configured
from ..core.auth.logger import log_security_event
from ..core.logger import belief_scope
import starlette.requests
# [/SECTION]

# Placeholder for ADFS configuration. In a real app, this would come from a secure source.
# Create an in-memory .env file
from io import StringIO
config_data = StringIO("""
ADFS_CLIENT_ID=your-client-id
ADFS_CLIENT_SECRET=your-client-secret
ADFS_SERVER_METADATA_URL=https://your-adfs-server/.well-known/openid-configuration
""")
config = Config(config_data)
oauth = OAuth(config)
# [DEF:router:Variable]
# @PURPOSE: APIRouter instance for authentication routes.
router = APIRouter(prefix="/api/auth", tags=["auth"])
# [/DEF:router:Variable]

oauth.register(
    name='adfs',
    server_metadata_url=config('ADFS_SERVER_METADATA_URL'),
    client_kwargs={'scope': 'openid profile email'}
)
# [DEF:login_for_access_token:Function]
# @PURPOSE: Authenticates a user and returns a JWT access token.
# @PRE: form_data contains username and password.
# @POST: Returns a Token object on success.
# @THROW: HTTPException 401 if authentication fails.
# @PARAM: form_data (OAuth2PasswordRequestForm) - Login credentials.
# @PARAM: db (Session) - Auth database session.
# @RETURN: Token - The generated JWT token.
@router.post("/login", response_model=Token)
async def login_for_access_token(
    form_data: OAuth2PasswordRequestForm = Depends(),
    db: Session = Depends(get_auth_db)
):
    with belief_scope("api.auth.login"):
        auth_service = AuthService(db)
        user = auth_service.authenticate_user(form_data.username, form_data.password)
        if not user:
            log_security_event("LOGIN_FAILED", form_data.username, {"reason": "Invalid credentials"})
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Incorrect username or password",
                headers={"WWW-Authenticate": "Bearer"},
            )
        log_security_event("LOGIN_SUCCESS", user.username, {"source": "LOCAL"})
        return auth_service.create_session(user)
# [/DEF:login_for_access_token:Function]

oauth2_scheme = OAuth2AuthorizationCodeBearer(
    authorizationUrl="https://your-adfs-server/adfs/oauth2/authorize",
    tokenUrl="https://your-adfs-server/adfs/oauth2/token",
)
# [DEF:read_users_me:Function]
# @PURPOSE: Retrieves the profile of the currently authenticated user.
# @PRE: Valid JWT token provided.
# @POST: Returns the current user's data.
# @PARAM: current_user (UserSchema) - The user extracted from the token.
# @RETURN: UserSchema - The current user profile.
@router.get("/me", response_model=UserSchema)
async def read_users_me(current_user: UserSchema = Depends(get_current_user)):
    with belief_scope("api.auth.me"):
        return current_user
# [/DEF:read_users_me:Function]

# [DEF:get_current_user:Function]
# @PURPOSE: Dependency to get the current user from the ADFS token.
# @PARAM: token (str) - The OAuth2 bearer token.
# @PRE: token should be provided via Authorization header.
# @POST: Returns user details if authenticated, else raises 401.
# @RETURN: Dict[str, str] - User information.
async def get_current_user(token: str = Depends(oauth2_scheme)):
    """
    Dependency to get the current user from the ADFS token.
    This is a placeholder and needs to be fully implemented.
    """
    # In a real implementation, you would:
    # 1. Validate the token with ADFS.
    # 2. Fetch user information.
    # 3. Create a user object.
    # For now, we'll just check if a token exists.
    if not token:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Not authenticated",
            headers={"WWW-Authenticate": "Bearer"},
        )
    # A real implementation would return a user object.
    return {"placeholder_user": "user@example.com"}
# [/DEF:get_current_user:Function]
# [/DEF:AuthModule:Module]
# [DEF:logout:Function]
# @PURPOSE: Logs out the current user (placeholder for session revocation).
# @PRE: Valid JWT token provided.
# @POST: Returns success message.
@router.post("/logout")
async def logout(current_user: UserSchema = Depends(get_current_user)):
    with belief_scope("api.auth.logout"):
        log_security_event("LOGOUT", current_user.username)
        # In a stateless JWT setup, client-side token deletion is primary.
        # Server-side revocation (blacklisting) can be added here if needed.
        return {"message": "Successfully logged out"}
# [/DEF:logout:Function]

# [DEF:login_adfs:Function]
# @PURPOSE: Initiates the ADFS OIDC login flow.
# @POST: Redirects the user to ADFS.
@router.get("/login/adfs")
async def login_adfs(request: starlette.requests.Request):
    with belief_scope("api.auth.login_adfs"):
        if not is_adfs_configured():
            raise HTTPException(
                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
            )
        redirect_uri = request.url_for('auth_callback_adfs')
        return await oauth.adfs.authorize_redirect(request, str(redirect_uri))
# [/DEF:login_adfs:Function]

# [DEF:auth_callback_adfs:Function]
# @PURPOSE: Handles the callback from ADFS after successful authentication.
# @POST: Provisions user JIT and returns session token.
@router.get("/callback/adfs", name="auth_callback_adfs")
async def auth_callback_adfs(request: starlette.requests.Request, db: Session = Depends(get_auth_db)):
    with belief_scope("api.auth.callback_adfs"):
        if not is_adfs_configured():
            raise HTTPException(
                status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
                detail="ADFS is not configured. Please set ADFS_CLIENT_ID, ADFS_CLIENT_SECRET, and ADFS_METADATA_URL environment variables."
            )
        token = await oauth.adfs.authorize_access_token(request)
        user_info = token.get('userinfo')
        if not user_info:
            raise HTTPException(status_code=400, detail="Failed to retrieve user info from ADFS")

        auth_service = AuthService(db)
        user = auth_service.provision_adfs_user(user_info)
        return auth_service.create_session(user)
# [/DEF:auth_callback_adfs:Function]

# [/DEF:backend.src.api.auth:Module]
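The `logout` handler above notes that server-side revocation (blacklisting) can be added. A minimal sketch of what that could look like, assuming tokens carry a unique `jti` claim; everything here is hypothetical, not the project's code:

```python
# Hypothetical in-memory revocation list; a shared store (e.g. SQLite or
# Redis) would be needed for multi-process deployments.
revoked_jtis: set[str] = set()

def revoke_token(jti: str) -> None:
    revoked_jtis.add(jti)  # call from the logout handler

def is_revoked(jti: str) -> bool:
    return jti in revoked_jtis  # check inside get_current_user
```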
@@ -1 +1 @@
from . import plugins, tasks, settings, connections
from . import plugins, tasks, settings, connections, environments, mappings, migration, git, storage, admin
310
backend/src/api/routes/admin.py
Normal file
@@ -0,0 +1,310 @@
# [DEF:backend.src.api.routes.admin:Module]
#
# @TIER: STANDARD
# @SEMANTICS: api, admin, users, roles, permissions
# @PURPOSE: Admin API endpoints for user and role management.
# @LAYER: API
# @RELATION: USES -> backend.src.core.auth.repository.AuthRepository
# @RELATION: USES -> backend.src.dependencies.has_permission
#
# @INVARIANT: All endpoints in this module require 'Admin' role or 'admin' scope.

# [SECTION: IMPORTS]
from typing import List
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from ...core.database import get_auth_db
from ...core.auth.repository import AuthRepository
from ...core.auth.security import get_password_hash
from ...schemas.auth import (
    User as UserSchema, UserCreate, UserUpdate,
    RoleSchema, RoleCreate, RoleUpdate, PermissionSchema,
    ADGroupMappingSchema, ADGroupMappingCreate
)
from ...models.auth import User, Role, Permission, ADGroupMapping
from ...dependencies import has_permission, get_current_user
from ...core.logger import logger, belief_scope
# [/SECTION]

# [DEF:router:Variable]
# @PURPOSE: APIRouter instance for admin routes.
router = APIRouter(prefix="/api/admin", tags=["admin"])
# [/DEF:router:Variable]

# [DEF:list_users:Function]
# @PURPOSE: Lists all registered users.
# @PRE: Current user has 'Admin' role.
# @POST: Returns a list of UserSchema objects.
# @PARAM: db (Session) - Auth database session.
# @RETURN: List[UserSchema] - List of users.
@router.get("/users", response_model=List[UserSchema])
async def list_users(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:users", "READ"))
):
    with belief_scope("api.admin.list_users"):
        users = db.query(User).all()
        return users
# [/DEF:list_users:Function]

# [DEF:create_user:Function]
# @PURPOSE: Creates a new local user.
# @PRE: Current user has 'Admin' role.
# @POST: New user is created in the database.
# @PARAM: user_in (UserCreate) - New user data.
# @PARAM: db (Session) - Auth database session.
# @RETURN: UserSchema - The created user.
@router.post("/users", response_model=UserSchema, status_code=status.HTTP_201_CREATED)
async def create_user(
    user_in: UserCreate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:users", "WRITE"))
):
    with belief_scope("api.admin.create_user"):
        repo = AuthRepository(db)
        if repo.get_user_by_username(user_in.username):
            raise HTTPException(status_code=400, detail="Username already exists")

        new_user = User(
            username=user_in.username,
            email=user_in.email,
            password_hash=get_password_hash(user_in.password),
            auth_source="LOCAL",
            is_active=user_in.is_active
        )

        for role_name in user_in.roles:
            role = repo.get_role_by_name(role_name)
            if role:
                new_user.roles.append(role)

        db.add(new_user)
        db.commit()
        db.refresh(new_user)
        return new_user
# [/DEF:create_user:Function]

# [DEF:update_user:Function]
# @PURPOSE: Updates an existing user.
@router.put("/users/{user_id}", response_model=UserSchema)
async def update_user(
    user_id: str,
    user_in: UserUpdate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:users", "WRITE"))
):
    with belief_scope("api.admin.update_user"):
        repo = AuthRepository(db)
        user = repo.get_user_by_id(user_id)
        if not user:
            raise HTTPException(status_code=404, detail="User not found")

        if user_in.email is not None:
            user.email = user_in.email
        if user_in.is_active is not None:
            user.is_active = user_in.is_active
        if user_in.password is not None:
            user.password_hash = get_password_hash(user_in.password)

        if user_in.roles is not None:
            user.roles = []
            for role_name in user_in.roles:
                role = repo.get_role_by_name(role_name)
                if role:
                    user.roles.append(role)

        db.commit()
        db.refresh(user)
        return user
# [/DEF:update_user:Function]

# [DEF:delete_user:Function]
# @PURPOSE: Deletes a user.
@router.delete("/users/{user_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_user(
    user_id: str,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:users", "WRITE"))
):
    with belief_scope("api.admin.delete_user"):
        logger.info(f"[DEBUG] Attempting to delete user context={{'user_id': '{user_id}'}}")
        repo = AuthRepository(db)
        user = repo.get_user_by_id(user_id)
        if not user:
            logger.warning(f"[DEBUG] User not found for deletion context={{'user_id': '{user_id}'}}")
            raise HTTPException(status_code=404, detail="User not found")

        logger.info(f"[DEBUG] Found user to delete context={{'username': '{user.username}'}}")
        db.delete(user)
        db.commit()
        logger.info(f"[DEBUG] Successfully deleted user context={{'user_id': '{user_id}'}}")
        return None
# [/DEF:delete_user:Function]

# [DEF:list_roles:Function]
# @PURPOSE: Lists all available roles.
# @RETURN: List[RoleSchema] - List of roles.
# @RELATION: CALLS -> backend.src.models.auth.Role
@router.get("/roles", response_model=List[RoleSchema])
async def list_roles(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "READ"))
):
    with belief_scope("api.admin.list_roles"):
        return db.query(Role).all()
# [/DEF:list_roles:Function]

# [DEF:create_role:Function]
# @PURPOSE: Creates a new system role with associated permissions.
# @PRE: Role name must be unique.
# @POST: New Role record is created in auth.db.
# @PARAM: role_in (RoleCreate) - New role data.
# @PARAM: db (Session) - Auth database session.
# @RETURN: RoleSchema - The created role.
# @SIDE_EFFECT: Commits new role and associations to auth.db.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_permission_by_id
@router.post("/roles", response_model=RoleSchema, status_code=status.HTTP_201_CREATED)
async def create_role(
    role_in: RoleCreate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.create_role"):
        if db.query(Role).filter(Role.name == role_in.name).first():
            raise HTTPException(status_code=400, detail="Role already exists")

        new_role = Role(name=role_in.name, description=role_in.description)
        repo = AuthRepository(db)

        for perm_id_or_str in role_in.permissions:
            perm = repo.get_permission_by_id(perm_id_or_str)
            if not perm and ":" in perm_id_or_str:
                res, act = perm_id_or_str.split(":", 1)
                perm = repo.get_permission_by_resource_action(res, act)

            if perm:
                new_role.permissions.append(perm)

        db.add(new_role)
        db.commit()
        db.refresh(new_role)
        return new_role
# [/DEF:create_role:Function]

# [DEF:update_role:Function]
# @PURPOSE: Updates an existing role's metadata and permissions.
# @PRE: role_id must be a valid existing role UUID.
# @POST: Role record is updated in auth.db.
# @PARAM: role_id (str) - Target role identifier.
# @PARAM: role_in (RoleUpdate) - Updated role data.
# @PARAM: db (Session) - Auth database session.
# @RETURN: RoleSchema - The updated role.
# @SIDE_EFFECT: Commits updates to auth.db.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
@router.put("/roles/{role_id}", response_model=RoleSchema)
async def update_role(
    role_id: str,
    role_in: RoleUpdate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.update_role"):
        repo = AuthRepository(db)
        role = repo.get_role_by_id(role_id)
        if not role:
            raise HTTPException(status_code=404, detail="Role not found")

        if role_in.name is not None:
            role.name = role_in.name
        if role_in.description is not None:
            role.description = role_in.description

        if role_in.permissions is not None:
            role.permissions = []
            for perm_id_or_str in role_in.permissions:
                perm = repo.get_permission_by_id(perm_id_or_str)
                if not perm and ":" in perm_id_or_str:
                    res, act = perm_id_or_str.split(":", 1)
                    perm = repo.get_permission_by_resource_action(res, act)

                if perm:
                    role.permissions.append(perm)

        db.commit()
        db.refresh(role)
        return role
# [/DEF:update_role:Function]

# [DEF:delete_role:Function]
# @PURPOSE: Removes a role from the system.
# @PRE: role_id must be a valid existing role UUID.
# @POST: Role record is removed from auth.db.
# @PARAM: role_id (str) - Target role identifier.
# @PARAM: db (Session) - Auth database session.
# @RETURN: None
# @SIDE_EFFECT: Deletes record from auth.db and commits.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.get_role_by_id
@router.delete("/roles/{role_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_role(
    role_id: str,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "WRITE"))
):
    with belief_scope("api.admin.delete_role"):
        repo = AuthRepository(db)
        role = repo.get_role_by_id(role_id)
        if not role:
            raise HTTPException(status_code=404, detail="Role not found")

        db.delete(role)
        db.commit()
        return None
# [/DEF:delete_role:Function]

# [DEF:list_permissions:Function]
# @PURPOSE: Lists all available system permissions for assignment.
# @POST: Returns a list of all PermissionSchema objects.
# @PARAM: db (Session) - Auth database session.
# @RETURN: List[PermissionSchema] - List of permissions.
# @RELATION: CALLS -> backend.src.core.auth.repository.AuthRepository.list_permissions
@router.get("/permissions", response_model=List[PermissionSchema])
async def list_permissions(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:roles", "READ"))
):
    with belief_scope("api.admin.list_permissions"):
        repo = AuthRepository(db)
        return repo.list_permissions()
# [/DEF:list_permissions:Function]

# [DEF:list_ad_mappings:Function]
# @PURPOSE: Lists all AD Group to Role mappings.
@router.get("/ad-mappings", response_model=List[ADGroupMappingSchema])
async def list_ad_mappings(
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("api.admin.list_ad_mappings"):
        return db.query(ADGroupMapping).all()
# [/DEF:list_ad_mappings:Function]

# [DEF:create_ad_mapping:Function]
# @PURPOSE: Creates a new AD Group mapping.
@router.post("/ad-mappings", response_model=ADGroupMappingSchema)
async def create_ad_mapping(
    mapping_in: ADGroupMappingCreate,
    db: Session = Depends(get_auth_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("api.admin.create_ad_mapping"):
        new_mapping = ADGroupMapping(
            ad_group=mapping_in.ad_group,
            role_id=mapping_in.role_id
        )
        db.add(new_mapping)
        db.commit()
        db.refresh(new_mapping)
        return new_mapping
# [/DEF:create_ad_mapping:Function]

# [/DEF:backend.src.api.routes.admin:Module]
@@ -11,12 +11,11 @@
|
||||
# [SECTION: IMPORTS]
|
||||
from fastapi import APIRouter, Depends, HTTPException
|
||||
from typing import List, Dict, Optional
|
||||
from backend.src.dependencies import get_config_manager, get_scheduler_service
|
||||
from backend.src.core.superset_client import SupersetClient
|
||||
from superset_tool.models import SupersetConfig
|
||||
from ...dependencies import get_config_manager, get_scheduler_service, has_permission
|
||||
from ...core.superset_client import SupersetClient
|
||||
from pydantic import BaseModel, Field
|
||||
from backend.src.core.config_models import Environment as EnvModel
|
||||
from backend.src.core.logger import belief_scope
|
||||
from ...core.config_models import Environment as EnvModel
|
||||
from ...core.logger import belief_scope
|
||||
# [/SECTION]
|
||||
|
||||
router = APIRouter()
|
||||
@@ -24,7 +23,7 @@ router = APIRouter()
|
||||
# [DEF:ScheduleSchema:DataClass]
|
||||
class ScheduleSchema(BaseModel):
|
||||
enabled: bool = False
|
||||
cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
|
||||
cron_expression: str = Field(..., pattern=r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){4,6})$')
|
||||
# [/DEF:ScheduleSchema:DataClass]
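The cron pattern's quantifier drops from `{5,7}` to `{4,6}` repetitions of the field sub-expression. A quick comparison (assuming search-style matching, which is how Pydantic v2 applies `pattern`; the expression carries its own anchors) shows the practical effect — the minimum number of accepted fields falls from five to four:

```python
# Before/after comparison of the cron field quantifier; re.search mirrors how
# Pydantic v2 applies `pattern` (un-anchored), so the regex's own ^/$ matter.
import re

OLD = (r'^(@(annually|yearly|monthly|weekly|daily|hourly|reboot))'
       r'|((((\d+,)*\d+|(\d+(\/|-)\d+)|\d+|\*) ?){5,7})$')
NEW = OLD.replace('{5,7}', '{4,6}')

for expr in ["@daily", "0 3 * * *", "0 0 3 * * *", "0 12 * *"]:
    print(f"{expr!r:16} old={bool(re.search(OLD, expr))}  new={bool(re.search(NEW, expr))}")
# '0 12 * *' (four fields) is the visible change: rejected before, accepted now.
```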

# [DEF:EnvironmentResponse:DataClass]

@@ -48,7 +47,10 @@ class DatabaseResponse(BaseModel):
# @POST: Returns a list of EnvironmentResponse objects.
# @RETURN: List[EnvironmentResponse]
@router.get("", response_model=List[EnvironmentResponse])
-async def get_environments(config_manager=Depends(get_config_manager)):
+async def get_environments(
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("environments", "READ"))
+):
    with belief_scope("get_environments"):
        envs = config_manager.get_environments()
        # Ensure envs is a list

@@ -62,7 +64,7 @@ async def get_environments(config_manager=Depends(get_config_manager)):
                backup_schedule=ScheduleSchema(
                    enabled=e.backup_schedule.enabled,
                    cron_expression=e.backup_schedule.cron_expression
-                ) if e.backup_schedule else None
+                ) if getattr(e, 'backup_schedule', None) else None
            ) for e in envs
        ]
# [/DEF:get_environments:Function]

@@ -78,7 +80,8 @@ async def update_environment_schedule(
    id: str,
    schedule: ScheduleSchema,
    config_manager=Depends(get_config_manager),
-    scheduler_service=Depends(get_scheduler_service)
+    scheduler_service=Depends(get_scheduler_service),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("update_environment_schedule", f"id={id}"):
        envs = config_manager.get_environments()

@@ -105,7 +108,11 @@ async def update_environment_schedule(
# @PARAM: id (str) - The environment ID.
# @RETURN: List[Dict] - List of databases.
@router.get("/{id}/databases")
-async def get_environment_databases(id: str, config_manager=Depends(get_config_manager)):
+async def get_environment_databases(
+    id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
+    with belief_scope("get_environment_databases", f"id={id}"):
        envs = config_manager.get_environments()
        env = next((e for e in envs if e.id == id), None)

@@ -114,18 +121,7 @@ async def get_environment_databases(id: str, config_manager=Depends(get_config_m

        try:
-            # Initialize SupersetClient from environment config
-            # Note: We need to map Environment model to SupersetConfig
-            superset_config = SupersetConfig(
-                env=env.name,
-                base_url=env.url,
-                auth={
-                    "provider": "db",  # Defaulting to db provider
-                    "username": env.username,
-                    "password": env.password,
-                    "refresh": "false"
-                }
-            )
-            client = SupersetClient(superset_config)
+            client = SupersetClient(env)
            return client.get_databases_summary()
        except Exception as e:
            raise HTTPException(status_code=500, detail=f"Failed to fetch databases: {str(e)}")

400 backend/src/api/routes/git.py Normal file
@@ -0,0 +1,400 @@

# [DEF:backend.src.api.routes.git:Module]
#
# @SEMANTICS: git, routes, api, fastapi, repository, deployment
# @PURPOSE: Provides FastAPI endpoints for Git integration operations.
# @LAYER: API
# @RELATION: USES -> src.services.git_service.GitService
# @RELATION: USES -> src.api.routes.git_schemas
# @RELATION: USES -> src.models.git
#
# @INVARIANT: All Git operations must be routed through GitService.

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List, Optional
import typing
from src.dependencies import get_config_manager, has_permission
from src.core.database import get_db
from src.models.git import GitServerConfig, GitStatus, DeploymentEnvironment, GitRepository
from src.api.routes.git_schemas import (
    GitServerConfigSchema, GitServerConfigCreate,
    GitRepositorySchema, BranchSchema, BranchCreate,
    BranchCheckout, CommitSchema, CommitCreate,
    DeploymentEnvironmentSchema, DeployRequest, RepoInitRequest
)
from src.services.git_service import GitService
from src.core.logger import logger, belief_scope

router = APIRouter(prefix="/api/git", tags=["git"])
git_service = GitService()

# [DEF:get_git_configs:Function]
# @PURPOSE: List all configured Git servers.
# @PRE: Database session `db` is available.
# @POST: Returns a list of all GitServerConfig objects from the database.
# @RETURN: List[GitServerConfigSchema]
@router.get("/config", response_model=List[GitServerConfigSchema])
async def get_git_configs(
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("get_git_configs"):
        return db.query(GitServerConfig).all()
# [/DEF:get_git_configs:Function]

# [DEF:create_git_config:Function]
# @PURPOSE: Register a new Git server configuration.
# @PRE: `config` contains valid GitServerConfigCreate data.
# @POST: A new GitServerConfig record is created in the database.
# @PARAM: config (GitServerConfigCreate)
# @RETURN: GitServerConfigSchema
@router.post("/config", response_model=GitServerConfigSchema)
async def create_git_config(
    config: GitServerConfigCreate,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("create_git_config"):
        db_config = GitServerConfig(**config.dict())
        db.add(db_config)
        db.commit()
        db.refresh(db_config)
        return db_config
# [/DEF:create_git_config:Function]

# [DEF:delete_git_config:Function]
# @PURPOSE: Remove a Git server configuration.
# @PRE: `config_id` corresponds to an existing configuration.
# @POST: The configuration record is removed from the database.
# @PARAM: config_id (str)
@router.delete("/config/{config_id}")
async def delete_git_config(
    config_id: str,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("delete_git_config"):
        db_config = db.query(GitServerConfig).filter(GitServerConfig.id == config_id).first()
        if not db_config:
            raise HTTPException(status_code=404, detail="Configuration not found")

        db.delete(db_config)
        db.commit()
        return {"status": "success", "message": "Configuration deleted"}
# [/DEF:delete_git_config:Function]

# [DEF:test_git_config:Function]
# @PURPOSE: Validate connection to a Git server using provided credentials.
# @PRE: `config` contains provider, url, and pat.
# @POST: Returns success if the connection is validated via GitService.
# @PARAM: config (GitServerConfigCreate)
@router.post("/config/test")
async def test_git_config(
    config: GitServerConfigCreate,
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("test_git_config"):
        success = await git_service.test_connection(config.provider, config.url, config.pat)
        if success:
            return {"status": "success", "message": "Connection successful"}
        else:
            raise HTTPException(status_code=400, detail="Connection failed")
# [/DEF:test_git_config:Function]

# [DEF:init_repository:Function]
# @PURPOSE: Link a dashboard to a Git repository and perform initial clone/init.
# @PRE: `dashboard_id` exists and `init_data` contains valid config_id and remote_url.
# @POST: Repository is initialized on disk and a GitRepository record is saved in DB.
# @PARAM: dashboard_id (int)
# @PARAM: init_data (RepoInitRequest)
@router.post("/repositories/{dashboard_id}/init")
async def init_repository(
    dashboard_id: int,
    init_data: RepoInitRequest,
    db: Session = Depends(get_db),
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("init_repository"):
        # 1. Get config
        config = db.query(GitServerConfig).filter(GitServerConfig.id == init_data.config_id).first()
        if not config:
            raise HTTPException(status_code=404, detail="Git configuration not found")

        try:
            # 2. Perform Git clone/init
            logger.info(f"[init_repository][Action] Initializing repo for dashboard {dashboard_id}")
            git_service.init_repo(dashboard_id, init_data.remote_url, config.pat)

            # 3. Save to DB
            repo_path = git_service._get_repo_path(dashboard_id)
            db_repo = db.query(GitRepository).filter(GitRepository.dashboard_id == dashboard_id).first()
            if not db_repo:
                db_repo = GitRepository(
                    dashboard_id=dashboard_id,
                    config_id=config.id,
                    remote_url=init_data.remote_url,
                    local_path=repo_path
                )
                db.add(db_repo)
            else:
                db_repo.config_id = config.id
                db_repo.remote_url = init_data.remote_url
                db_repo.local_path = repo_path

            db.commit()
            logger.info(f"[init_repository][Coherence:OK] Repository initialized for dashboard {dashboard_id}")
            return {"status": "success", "message": "Repository initialized"}
        except Exception as e:
            db.rollback()
            logger.error(f"[init_repository][Coherence:Failed] Failed to init repository: {e}")
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:init_repository:Function]

# [DEF:get_branches:Function]
# @PURPOSE: List all branches for a dashboard's repository.
# @PRE: Repository for `dashboard_id` is initialized.
# @POST: Returns a list of branches from the local repository.
# @PARAM: dashboard_id (int)
# @RETURN: List[BranchSchema]
@router.get("/repositories/{dashboard_id}/branches", response_model=List[BranchSchema])
async def get_branches(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_branches"):
        try:
            return git_service.list_branches(dashboard_id)
        except Exception as e:
            raise HTTPException(status_code=404, detail=str(e))
# [/DEF:get_branches:Function]

# [DEF:create_branch:Function]
# @PURPOSE: Create a new branch in the dashboard's repository.
# @PRE: `dashboard_id` repository exists and `branch_data` has name and from_branch.
# @POST: A new branch is created in the local repository.
# @PARAM: dashboard_id (int)
# @PARAM: branch_data (BranchCreate)
@router.post("/repositories/{dashboard_id}/branches")
async def create_branch(
    dashboard_id: int,
    branch_data: BranchCreate,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("create_branch"):
        try:
            git_service.create_branch(dashboard_id, branch_data.name, branch_data.from_branch)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:create_branch:Function]

# [DEF:checkout_branch:Function]
# @PURPOSE: Switch the dashboard's repository to a specific branch.
# @PRE: `dashboard_id` repository exists and branch `checkout_data.name` exists.
# @POST: The local repository HEAD is moved to the specified branch.
# @PARAM: dashboard_id (int)
# @PARAM: checkout_data (BranchCheckout)
@router.post("/repositories/{dashboard_id}/checkout")
async def checkout_branch(
    dashboard_id: int,
    checkout_data: BranchCheckout,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("checkout_branch"):
        try:
            git_service.checkout_branch(dashboard_id, checkout_data.name)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:checkout_branch:Function]

# [DEF:commit_changes:Function]
# @PURPOSE: Stage and commit changes in the dashboard's repository.
# @PRE: `dashboard_id` repository exists and `commit_data` has message and files.
# @POST: Specified files are staged and a new commit is created.
# @PARAM: dashboard_id (int)
# @PARAM: commit_data (CommitCreate)
@router.post("/repositories/{dashboard_id}/commit")
async def commit_changes(
    dashboard_id: int,
    commit_data: CommitCreate,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("commit_changes"):
        try:
            git_service.commit_changes(dashboard_id, commit_data.message, commit_data.files)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:commit_changes:Function]

# [DEF:push_changes:Function]
# @PURPOSE: Push local commits to the remote repository.
# @PRE: `dashboard_id` repository exists and has a remote configured.
# @POST: Local commits are pushed to the remote repository.
# @PARAM: dashboard_id (int)
@router.post("/repositories/{dashboard_id}/push")
async def push_changes(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("push_changes"):
        try:
            git_service.push_changes(dashboard_id)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:push_changes:Function]

# [DEF:pull_changes:Function]
# @PURPOSE: Pull changes from the remote repository.
# @PRE: `dashboard_id` repository exists and has a remote configured.
# @POST: Remote changes are fetched and merged into the local branch.
# @PARAM: dashboard_id (int)
@router.post("/repositories/{dashboard_id}/pull")
async def pull_changes(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("pull_changes"):
        try:
            git_service.pull_changes(dashboard_id)
            return {"status": "success"}
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:pull_changes:Function]

# [DEF:sync_dashboard:Function]
# @PURPOSE: Sync dashboard state from Superset to Git using the GitPlugin.
# @PRE: `dashboard_id` is valid; GitPlugin is available.
# @POST: Dashboard YAMLs are exported from Superset and committed to Git.
# @PARAM: dashboard_id (int)
# @PARAM: source_env_id (Optional[str])
@router.post("/repositories/{dashboard_id}/sync")
async def sync_dashboard(
    dashboard_id: int,
    source_env_id: typing.Optional[str] = None,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("sync_dashboard"):
        try:
            from src.plugins.git_plugin import GitPlugin
            plugin = GitPlugin()
            return await plugin.execute({
                "operation": "sync",
                "dashboard_id": dashboard_id,
                "source_env_id": source_env_id
            })
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:sync_dashboard:Function]

# [DEF:get_environments:Function]
# @PURPOSE: List all deployment environments.
# @PRE: Config manager is accessible.
# @POST: Returns a list of DeploymentEnvironmentSchema objects.
# @RETURN: List[DeploymentEnvironmentSchema]
@router.get("/environments", response_model=List[DeploymentEnvironmentSchema])
async def get_environments(
    config_manager=Depends(get_config_manager),
    _ = Depends(has_permission("environments", "READ"))
):
    with belief_scope("get_environments"):
        envs = config_manager.get_environments()
        return [
            DeploymentEnvironmentSchema(
                id=e.id,
                name=e.name,
                superset_url=e.url,
                is_active=True
            ) for e in envs
        ]
# [/DEF:get_environments:Function]

# [DEF:deploy_dashboard:Function]
# @PURPOSE: Deploy dashboard from Git to a target environment.
# @PRE: `dashboard_id` and `deploy_data.environment_id` are valid.
# @POST: Dashboard YAMLs are read from Git and imported into the target Superset.
# @PARAM: dashboard_id (int)
# @PARAM: deploy_data (DeployRequest)
@router.post("/repositories/{dashboard_id}/deploy")
async def deploy_dashboard(
    dashboard_id: int,
    deploy_data: DeployRequest,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("deploy_dashboard"):
        try:
            from src.plugins.git_plugin import GitPlugin
            plugin = GitPlugin()
            return await plugin.execute({
                "operation": "deploy",
                "dashboard_id": dashboard_id,
                "environment_id": deploy_data.environment_id
            })
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:deploy_dashboard:Function]

# [DEF:get_history:Function]
# @PURPOSE: View commit history for a dashboard's repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns a list of recent commits from the repository.
# @PARAM: dashboard_id (int)
# @PARAM: limit (int)
# @RETURN: List[CommitSchema]
@router.get("/repositories/{dashboard_id}/history", response_model=List[CommitSchema])
async def get_history(
    dashboard_id: int,
    limit: int = 50,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_history"):
        try:
            return git_service.get_commit_history(dashboard_id, limit)
        except Exception as e:
            raise HTTPException(status_code=404, detail=str(e))
# [/DEF:get_history:Function]

# [DEF:get_repository_status:Function]
# @PURPOSE: Get current Git status for a dashboard repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns the status of the working directory (staged, unstaged, untracked).
# @PARAM: dashboard_id (int)
# @RETURN: dict
@router.get("/repositories/{dashboard_id}/status")
async def get_repository_status(
    dashboard_id: int,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_repository_status"):
        try:
            return git_service.get_status(dashboard_id)
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:get_repository_status:Function]

# [DEF:get_repository_diff:Function]
# @PURPOSE: Get Git diff for a dashboard repository.
# @PRE: `dashboard_id` repository exists.
# @POST: Returns the diff text for the specified file or all changes.
# @PARAM: dashboard_id (int)
# @PARAM: file_path (Optional[str])
# @PARAM: staged (bool)
# @RETURN: str
@router.get("/repositories/{dashboard_id}/diff")
async def get_repository_diff(
    dashboard_id: int,
    file_path: Optional[str] = None,
    staged: bool = False,
    _ = Depends(has_permission("plugin:git", "EXECUTE"))
):
    with belief_scope("get_repository_diff"):
        try:
            diff_text = git_service.get_diff(dashboard_id, file_path, staged)
            return diff_text
        except Exception as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:get_repository_diff:Function]

# [/DEF:backend.src.api.routes.git:Module]
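Taken together, these routes support an init → sync → status → push flow. A hypothetical walk-through with `httpx`; the host, port, token, IDs, and remote URL are placeholders, not values from this changeset:

```python
# Hypothetical client-side walk-through of the git routes above.
import httpx

BASE = "http://localhost:8000/api/git"
headers = {"Authorization": "Bearer <token with plugin:git EXECUTE>"}

with httpx.Client(base_url=BASE, headers=headers) as client:
    # Link dashboard 42 to a remote and clone/init it locally
    client.post("/repositories/42/init", json={
        "config_id": "cfg-1",
        "remote_url": "https://git.example.com/org/dash42.git",
    })
    # Export dashboard YAMLs from Superset and commit them via the GitPlugin
    client.post("/repositories/42/sync")
    # Inspect working-tree state, then publish
    print(client.get("/repositories/42/status").json())
    client.post("/repositories/42/push")
```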
145 backend/src/api/routes/git_schemas.py Normal file
@@ -0,0 +1,145 @@

# [DEF:backend.src.api.routes.git_schemas:Module]
#
# @TIER: STANDARD
# @SEMANTICS: git, schemas, pydantic, api, contracts
# @PURPOSE: Defines Pydantic models for the Git integration API layer.
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.models.git
#
# @INVARIANT: All schemas must be compatible with the FastAPI router.

from pydantic import BaseModel, Field
from typing import List, Optional
from datetime import datetime
from uuid import UUID
from src.models.git import GitProvider, GitStatus, SyncStatus

# [DEF:GitServerConfigBase:Class]
# @TIER: TRIVIAL
# @PURPOSE: Base schema for Git server configuration attributes.
class GitServerConfigBase(BaseModel):
    name: str = Field(..., description="Display name for the Git server")
    provider: GitProvider = Field(..., description="Git provider (GITHUB, GITLAB, GITEA)")
    url: str = Field(..., description="Server base URL")
    pat: str = Field(..., description="Personal Access Token")
    default_repository: Optional[str] = Field(None, description="Default repository path (org/repo)")
# [/DEF:GitServerConfigBase:Class]

# [DEF:GitServerConfigCreate:Class]
# @PURPOSE: Schema for creating a new Git server configuration.
class GitServerConfigCreate(GitServerConfigBase):
    """Schema for creating a new Git server configuration."""
    pass
# [/DEF:GitServerConfigCreate:Class]

# [DEF:GitServerConfigSchema:Class]
# @PURPOSE: Schema for representing a Git server configuration with metadata.
class GitServerConfigSchema(GitServerConfigBase):
    """Schema for representing a Git server configuration with metadata."""
    id: str
    status: GitStatus
    last_validated: datetime

    class Config:
        from_attributes = True
# [/DEF:GitServerConfigSchema:Class]

# [DEF:GitRepositorySchema:Class]
# @PURPOSE: Schema for tracking a local Git repository linked to a dashboard.
class GitRepositorySchema(BaseModel):
    """Schema for tracking a local Git repository linked to a dashboard."""
    id: str
    dashboard_id: int
    config_id: str
    remote_url: str
    local_path: str
    current_branch: str
    sync_status: SyncStatus

    class Config:
        from_attributes = True
# [/DEF:GitRepositorySchema:Class]

# [DEF:BranchSchema:Class]
# @PURPOSE: Schema for representing Git branch metadata.
class BranchSchema(BaseModel):
    """Schema for representing a Git branch."""
    name: str
    commit_hash: str
    is_remote: bool
    last_updated: datetime
# [/DEF:BranchSchema:Class]

# [DEF:CommitSchema:Class]
# @PURPOSE: Schema for representing Git commit details.
class CommitSchema(BaseModel):
    """Schema for representing a Git commit."""
    hash: str
    author: str
    email: str
    timestamp: datetime
    message: str
    files_changed: List[str]
# [/DEF:CommitSchema:Class]

# [DEF:BranchCreate:Class]
# @PURPOSE: Schema for branch creation requests.
class BranchCreate(BaseModel):
    """Schema for branch creation requests."""
    name: str
    from_branch: str
# [/DEF:BranchCreate:Class]

# [DEF:BranchCheckout:Class]
# @PURPOSE: Schema for branch checkout requests.
class BranchCheckout(BaseModel):
    """Schema for branch checkout requests."""
    name: str
# [/DEF:BranchCheckout:Class]

# [DEF:CommitCreate:Class]
# @PURPOSE: Schema for staging and committing changes.
class CommitCreate(BaseModel):
    """Schema for staging and committing changes."""
    message: str
    files: List[str]
# [/DEF:CommitCreate:Class]

# [DEF:ConflictResolution:Class]
# @PURPOSE: Schema for resolving merge conflicts.
class ConflictResolution(BaseModel):
    """Schema for resolving merge conflicts."""
    file_path: str
    resolution: str = Field(pattern="^(mine|theirs|manual)$")
    content: Optional[str] = None
# [/DEF:ConflictResolution:Class]

# [DEF:DeploymentEnvironmentSchema:Class]
# @PURPOSE: Schema for representing a target deployment environment.
class DeploymentEnvironmentSchema(BaseModel):
    """Schema for representing a target deployment environment."""
    id: str
    name: str
    superset_url: str
    is_active: bool

    class Config:
        from_attributes = True
# [/DEF:DeploymentEnvironmentSchema:Class]

# [DEF:DeployRequest:Class]
# @PURPOSE: Schema for dashboard deployment requests.
class DeployRequest(BaseModel):
    """Schema for deployment requests."""
    environment_id: str
# [/DEF:DeployRequest:Class]

# [DEF:RepoInitRequest:Class]
# @PURPOSE: Schema for repository initialization requests.
class RepoInitRequest(BaseModel):
    """Schema for repository initialization requests."""
    config_id: str
    remote_url: str
# [/DEF:RepoInitRequest:Class]

# [/DEF:backend.src.api.routes.git_schemas:Module]
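The `resolution` field on `ConflictResolution` is the only constrained string in this module; a small self-contained check of how that `pattern=` constraint behaves under Pydantic v2:

```python
# Demonstration of the ConflictResolution pattern constraint, reproduced
# verbatim from the schema above (Pydantic v2 `pattern=` keyword).
from typing import Optional
from pydantic import BaseModel, Field, ValidationError

class ConflictResolution(BaseModel):
    file_path: str
    resolution: str = Field(pattern="^(mine|theirs|manual)$")
    content: Optional[str] = None

ok = ConflictResolution(file_path="dashboards/sales.yaml", resolution="theirs")
print(ok.resolution)                                   # -> theirs

try:
    ConflictResolution(file_path="x.yaml", resolution="ours")  # not in the set
except ValidationError as e:
    print("rejected:", e.errors()[0]["type"])          # -> string_pattern_mismatch
```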
@@ -13,9 +13,10 @@

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List, Optional
-from backend.src.dependencies import get_config_manager
-from backend.src.core.database import get_db
-from backend.src.models.mapping import DatabaseMapping
+from ...core.logger import belief_scope
+from ...dependencies import get_config_manager, has_permission
+from ...core.database import get_db
+from ...models.mapping import DatabaseMapping
from pydantic import BaseModel
# [/SECTION]

@@ -59,7 +60,8 @@ class SuggestRequest(BaseModel):
async def get_mappings(
    source_env_id: Optional[str] = None,
    target_env_id: Optional[str] = None,
-    db: Session = Depends(get_db)
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
):
    with belief_scope("get_mappings"):
        query = db.query(DatabaseMapping)

@@ -75,7 +77,11 @@ async def get_mappings(
# @PRE: mapping is valid MappingCreate, db session is injected.
# @POST: DatabaseMapping created or updated in database.
@router.post("", response_model=MappingResponse)
-async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
+async def create_mapping(
+    mapping: MappingCreate,
+    db: Session = Depends(get_db),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
+):
    with belief_scope("create_mapping"):
        # Check if mapping already exists
        existing = db.query(DatabaseMapping).filter(

@@ -105,10 +111,11 @@ async def create_mapping(mapping: MappingCreate, db: Session = Depends(get_db)):
@router.post("/suggest")
async def suggest_mappings_api(
    request: SuggestRequest,
-    config_manager=Depends(get_config_manager)
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
):
    with belief_scope("suggest_mappings_api"):
-        from backend.src.services.mapping_service import MappingService
+        from ...services.mapping_service import MappingService
        service = MappingService(config_manager)
        try:
            return await service.get_suggestions(request.source_env_id, request.target_env_id)

@@ -7,10 +7,10 @@

from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict
-from backend.src.dependencies import get_config_manager, get_task_manager
-from backend.src.models.dashboard import DashboardMetadata, DashboardSelection
-from backend.src.core.superset_client import SupersetClient
-from superset_tool.models import SupersetConfig
+from ...dependencies import get_config_manager, get_task_manager, has_permission
+from ...models.dashboard import DashboardMetadata, DashboardSelection
+from ...core.superset_client import SupersetClient
+from ...core.logger import belief_scope

router = APIRouter(prefix="/api", tags=["migration"])

@@ -21,20 +21,18 @@ router = APIRouter(prefix="/api", tags=["migration"])
# @PARAM: env_id (str) - The ID of the environment to fetch from.
# @RETURN: List[DashboardMetadata]
@router.get("/environments/{env_id}/dashboards", response_model=List[DashboardMetadata])
-async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)):
-    environments = config_manager.get_environments()
+async def get_dashboards(
+    env_id: str,
+    config_manager=Depends(get_config_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
+    with belief_scope("get_dashboards", f"env_id={env_id}"):
+        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            raise HTTPException(status_code=404, detail="Environment not found")

-        config = SupersetConfig(
-            env=env.name,
-            base_url=env.url,
-            auth={'provider': 'db', 'username': env.username, 'password': env.password, 'refresh': False},
-            verify_ssl=True,
-            timeout=30
-        )
-        client = SupersetClient(config)
+        client = SupersetClient(env)
        dashboards = client.get_dashboards_summary()
        return dashboards
# [/DEF:get_dashboards:Function]

@@ -46,9 +44,15 @@ async def get_dashboards(env_id: str, config_manager=Depends(get_config_manager)
# @PARAM: selection (DashboardSelection) - The dashboards to migrate.
# @RETURN: Dict - {"task_id": str, "message": str}
@router.post("/migration/execute")
-async def execute_migration(selection: DashboardSelection, config_manager=Depends(get_config_manager), task_manager=Depends(get_task_manager)):
-    # Validate environments exist
-    environments = config_manager.get_environments()
+async def execute_migration(
+    selection: DashboardSelection,
+    config_manager=Depends(get_config_manager),
+    task_manager=Depends(get_task_manager),
+    _ = Depends(has_permission("plugin:migration", "EXECUTE"))
+):
+    with belief_scope("execute_migration"):
+        # Validate environments exist
+        environments = config_manager.get_environments()
        env_ids = {e.id for e in environments}
        if selection.source_env_id not in env_ids or selection.target_env_id not in env_ids:
            raise HTTPException(status_code=400, detail="Invalid source or target environment")

@@ -7,7 +7,7 @@ from typing import List
from fastapi import APIRouter, Depends

from ...core.plugin_base import PluginConfig
-from ...dependencies import get_plugin_loader
+from ...dependencies import get_plugin_loader, has_permission
+from ...core.logger import belief_scope

router = APIRouter()

@@ -19,7 +19,8 @@ router = APIRouter()
# @RETURN: List[PluginConfig] - List of registered plugins.
@router.get("", response_model=List[PluginConfig])
async def list_plugins(
-    plugin_loader = Depends(get_plugin_loader)
+    plugin_loader = Depends(get_plugin_loader),
+    _ = Depends(has_permission("plugins", "READ"))
):
+    with belief_scope("list_plugins"):
    """

@@ -13,11 +13,11 @@

from fastapi import APIRouter, Depends, HTTPException
from typing import List
from ...core.config_models import AppConfig, Environment, GlobalSettings
-from ...dependencies import get_config_manager
+from ...models.storage import StorageConfig
+from ...dependencies import get_config_manager, has_permission
from ...core.config_manager import ConfigManager
from ...core.logger import logger, belief_scope
from ...core.superset_client import SupersetClient
-from superset_tool.models import SupersetConfig
import os
# [/SECTION]

@@ -28,8 +28,11 @@ router = APIRouter()
# @PRE: Config manager is available.
# @POST: Returns masked AppConfig.
# @RETURN: AppConfig - The current configuration.
-@router.get("/", response_model=AppConfig)
-async def get_settings(config_manager: ConfigManager = Depends(get_config_manager)):
+@router.get("", response_model=AppConfig)
+async def get_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
    with belief_scope("get_settings"):
        logger.info("[get_settings][Entry] Fetching all settings")
        config = config_manager.get_config().copy(deep=True)

@@ -49,21 +52,60 @@ async def get_settings(config_manager: ConfigManager = Depends(get_config_manage
@router.patch("/global", response_model=GlobalSettings)
async def update_global_settings(
    settings: GlobalSettings,
-    config_manager: ConfigManager = Depends(get_config_manager)
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("update_global_settings"):
        logger.info("[update_global_settings][Entry] Updating global settings")

        config_manager.update_global_settings(settings)
        return settings
# [/DEF:update_global_settings:Function]

+# [DEF:get_storage_settings:Function]
+# @PURPOSE: Retrieves storage-specific settings.
+# @RETURN: StorageConfig - The storage configuration.
+@router.get("/storage", response_model=StorageConfig)
+async def get_storage_settings(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
+    with belief_scope("get_storage_settings"):
+        return config_manager.get_config().settings.storage
+# [/DEF:get_storage_settings:Function]
+
+# [DEF:update_storage_settings:Function]
+# @PURPOSE: Updates storage-specific settings.
+# @PARAM: storage (StorageConfig) - The new storage settings.
+# @POST: Storage settings are updated and saved.
+# @RETURN: StorageConfig - The updated storage settings.
+@router.put("/storage", response_model=StorageConfig)
+async def update_storage_settings(
+    storage: StorageConfig,
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
+):
+    with belief_scope("update_storage_settings"):
+        is_valid, message = config_manager.validate_path(storage.root_path)
+        if not is_valid:
+            raise HTTPException(status_code=400, detail=message)
+
+        settings = config_manager.get_config().settings
+        settings.storage = storage
+        config_manager.update_global_settings(settings)
+        return config_manager.get_config().settings.storage
+# [/DEF:update_storage_settings:Function]
+
# [DEF:get_environments:Function]
# @PURPOSE: Lists all configured Superset environments.
# @PRE: Config manager is available.
# @POST: Returns list of environments.
# @RETURN: List[Environment] - List of environments.
@router.get("/environments", response_model=List[Environment])
-async def get_environments(config_manager: ConfigManager = Depends(get_config_manager)):
+async def get_environments(
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "READ"))
+):
    with belief_scope("get_environments"):
        logger.info("[get_environments][Entry] Fetching environments")
        return config_manager.get_environments()

@@ -78,24 +120,15 @@ async def get_environments(config_manager: ConfigManager = Depends(get_config_ma
@router.post("/environments", response_model=Environment)
async def add_environment(
    env: Environment,
-    config_manager: ConfigManager = Depends(get_config_manager)
+    config_manager: ConfigManager = Depends(get_config_manager),
+    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("add_environment"):
        logger.info(f"[add_environment][Entry] Adding environment {env.id}")

        # Validate connection before adding
        try:
-            superset_config = SupersetConfig(
-                env=env.name,
-                base_url=env.url,
-                auth={
-                    "provider": "db",
-                    "username": env.username,
-                    "password": env.password,
-                    "refresh": "true"
-                }
-            )
-            client = SupersetClient(config=superset_config)
+            client = SupersetClient(env)
            client.get_dashboards(query={"page_size": 1})
        except Exception as e:
            logger.error(f"[add_environment][Coherence:Failed] Connection validation failed: {e}")

@@ -130,17 +163,7 @@ async def update_environment(

        # Validate connection before updating
        try:
-            superset_config = SupersetConfig(
-                env=env_to_validate.name,
-                base_url=env_to_validate.url,
-                auth={
-                    "provider": "db",
-                    "username": env_to_validate.username,
-                    "password": env_to_validate.password,
-                    "refresh": "true"
-                }
-            )
-            client = SupersetClient(config=superset_config)
+            client = SupersetClient(env_to_validate)
            client.get_dashboards(query={"page_size": 1})
        except Exception as e:
            logger.error(f"[update_environment][Coherence:Failed] Connection validation failed: {e}")

@@ -187,21 +210,8 @@ async def test_environment_connection(
        raise HTTPException(status_code=404, detail=f"Environment {id} not found")

    try:
-        # Create SupersetConfig
-        # Note: SupersetConfig expects 'auth' dict with specific keys
-        superset_config = SupersetConfig(
-            env=env.name,
-            base_url=env.url,
-            auth={
-                "provider": "db",  # Defaulting to db for now
-                "username": env.username,
-                "password": env.password,
-                "refresh": "true"
-            }
-        )
-
        # Initialize client (this will trigger authentication)
-        client = SupersetClient(config=superset_config)
+        client = SupersetClient(env)

        # Try a simple request to verify
        client.get_dashboards(query={"page_size": 1})

@@ -213,30 +223,5 @@
        return {"status": "error", "message": str(e)}
# [/DEF:test_environment_connection:Function]

-# [DEF:validate_backup_path:Function]
-# @PURPOSE: Validates if a backup path exists and is writable.
-# @PRE: Path is provided in path_data.
-# @POST: Returns success or error status.
-# @PARAM: path (str) - The path to validate.
-# @RETURN: dict - Validation result.
-@router.post("/validate-path")
-async def validate_backup_path(
-    path_data: dict,
-    config_manager: ConfigManager = Depends(get_config_manager)
-):
-    with belief_scope("validate_backup_path"):
-        path = path_data.get("path")
-        if not path:
-            raise HTTPException(status_code=400, detail="Path is required")
-
-        logger.info(f"[validate_backup_path][Entry] Validating path: {path}")
-
-        valid, message = config_manager.validate_path(path)
-
-        if not valid:
-            return {"status": "error", "message": message}
-
-        return {"status": "success", "message": message}
-# [/DEF:validate_backup_path:Function]

# [/DEF:SettingsRouter:Module]
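The new `/storage` pair mirrors the usual GET/PUT settings shape, with `validate_path` guarding the PUT. A hypothetical call, assuming `root_path` is the only required `StorageConfig` field; host, token, and path are placeholders:

```python
# Hypothetical call against the new storage-settings endpoint above.
import httpx

resp = httpx.put(
    "http://localhost:8000/api/settings/storage",
    headers={"Authorization": "Bearer <token with admin:settings WRITE>"},
    json={"root_path": "/var/lib/superset-manager/artifacts"},
)
# 200 with the saved StorageConfig on success; 400 with the reason string
# from validate_path if the path does not exist or is not writable.
print(resp.status_code, resp.json())
```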
145 backend/src/api/routes/storage.py Normal file
@@ -0,0 +1,145 @@

# [DEF:storage_routes:Module]
#
# @SEMANTICS: storage, files, upload, download, backup, repository
# @PURPOSE: API endpoints for file storage management (backups and repositories).
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.models.storage
#
# @INVARIANT: All paths must be validated against path traversal.

# [SECTION: IMPORTS]
from pathlib import Path
from fastapi import APIRouter, Depends, UploadFile, File, Form, HTTPException
from fastapi.responses import FileResponse
from typing import List, Optional
from ...models.storage import StoredFile, FileCategory
from ...dependencies import get_plugin_loader, has_permission
from ...plugins.storage.plugin import StoragePlugin
from ...core.logger import belief_scope
# [/SECTION]

router = APIRouter(tags=["storage"])

# [DEF:list_files:Function]
# @PURPOSE: List all files and directories in the storage system.
#
# @PRE: None.
# @POST: Returns a list of StoredFile objects.
#
# @PARAM: category (Optional[FileCategory]) - Filter by category.
# @PARAM: path (Optional[str]) - Subpath within the category.
# @RETURN: List[StoredFile] - List of files/directories.
#
# @RELATION: CALLS -> StoragePlugin.list_files
@router.get("/files", response_model=List[StoredFile])
async def list_files(
    category: Optional[FileCategory] = None,
    path: Optional[str] = None,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "READ"))
):
    with belief_scope("list_files"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        return storage_plugin.list_files(category, path)
# [/DEF:list_files:Function]

# [DEF:upload_file:Function]
# @PURPOSE: Upload a file to the storage system.
#
# @PRE: category must be a valid FileCategory.
# @PRE: file must be a valid UploadFile.
# @POST: Returns the StoredFile object of the uploaded file.
#
# @PARAM: category (FileCategory) - Target category.
# @PARAM: path (Optional[str]) - Target subpath.
# @PARAM: file (UploadFile) - The file content.
# @RETURN: StoredFile - Metadata of the uploaded file.
#
# @SIDE_EFFECT: Writes file to the filesystem.
#
# @RELATION: CALLS -> StoragePlugin.save_file
@router.post("/upload", response_model=StoredFile, status_code=201)
async def upload_file(
    category: FileCategory = Form(...),
    path: Optional[str] = Form(None),
    file: UploadFile = File(...),
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "WRITE"))
):
    with belief_scope("upload_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            return await storage_plugin.save_file(file, category, path)
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:upload_file:Function]

# [DEF:delete_file:Function]
# @PURPOSE: Delete a specific file or directory.
#
# @PRE: category must be a valid FileCategory.
# @POST: Item is removed from storage.
#
# @PARAM: category (FileCategory) - File category.
# @PARAM: path (str) - Relative path of the item.
# @RETURN: None
#
# @SIDE_EFFECT: Deletes item from the filesystem.
#
# @RELATION: CALLS -> StoragePlugin.delete_file
@router.delete("/files/{category}/{path:path}", status_code=204)
async def delete_file(
    category: FileCategory,
    path: str,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "WRITE"))
):
    with belief_scope("delete_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            storage_plugin.delete_file(category, path)
        except FileNotFoundError:
            raise HTTPException(status_code=404, detail="File not found")
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:delete_file:Function]

# [DEF:download_file:Function]
# @PURPOSE: Retrieve a file for download.
#
# @PRE: category must be a valid FileCategory.
# @POST: Returns a FileResponse.
#
# @PARAM: category (FileCategory) - File category.
# @PARAM: path (str) - Relative path of the file.
# @RETURN: FileResponse - The file content.
#
# @RELATION: CALLS -> StoragePlugin.get_file_path
@router.get("/download/{category}/{path:path}")
async def download_file(
    category: FileCategory,
    path: str,
    plugin_loader=Depends(get_plugin_loader),
    _ = Depends(has_permission("plugin:storage", "READ"))
):
    with belief_scope("download_file"):
        storage_plugin: StoragePlugin = plugin_loader.get_plugin("storage-manager")
        if not storage_plugin:
            raise HTTPException(status_code=500, detail="Storage plugin not loaded")
        try:
            abs_path = storage_plugin.get_file_path(category, path)
            filename = Path(path).name
            return FileResponse(path=abs_path, filename=filename)
        except FileNotFoundError:
            raise HTTPException(status_code=404, detail="File not found")
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))
# [/DEF:download_file:Function]

# [/DEF:storage_routes:Module]
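Upload is the only multipart endpoint in this module. A hypothetical client call; the host, token, category value (guessed from the module's @SEMANTICS), and file name are placeholders:

```python
# Hypothetical multipart upload against the /api/storage/upload route above.
import httpx

with open("dashboard_export.zip", "rb") as f:
    resp = httpx.post(
        "http://localhost:8000/api/storage/upload",
        headers={"Authorization": "Bearer <token with plugin:storage WRITE>"},
        data={"category": "backups", "path": "2025/12"},   # Form fields
        files={"file": ("dashboard_export.zip", f, "application/zip")},
    )
print(resp.status_code)   # 201 with StoredFile metadata on success
```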
@@ -9,7 +9,7 @@ from pydantic import BaseModel
from ...core.logger import belief_scope

from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
-from ...dependencies import get_task_manager
+from ...dependencies import get_task_manager, has_permission

router = APIRouter()

@@ -33,7 +33,8 @@ class ResumeTaskRequest(BaseModel):
# @RETURN: Task - The created task instance.
async def create_task(
    request: CreateTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(lambda req: has_permission(f"plugin:{req.plugin_id}", "EXECUTE"))
):
    """
    Create and start a new task for a given plugin.

@@ -63,7 +64,8 @@ async def list_tasks(
    limit: int = 10,
    offset: int = 0,
    status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
):
    """
    Retrieve a list of tasks with pagination and optional status filter.

@@ -82,7 +84,8 @@ async def list_tasks(
# @RETURN: Task - The task details.
async def get_task(
    task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
):
    """
    Retrieve the details of a specific task.

@@ -104,7 +107,8 @@ async def get_task(
# @RETURN: List[LogEntry] - List of log entries.
async def get_task_logs(
    task_id: str,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "READ"))
):
    """
    Retrieve logs for a specific task.

@@ -128,7 +132,8 @@ async def get_task_logs(
async def resolve_task(
    task_id: str,
    request: ResolveTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
):
    """
    Resolve a task that is awaiting mapping.

@@ -153,7 +158,8 @@ async def resolve_task(
async def resume_task(
    task_id: str,
    request: ResumeTaskRequest,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
):
    """
    Resume a task that is awaiting input (e.g., passwords).

@@ -175,7 +181,8 @@ async def resume_task(
# @POST: Tasks are removed from memory/persistence.
async def clear_tasks(
    status: Optional[TaskStatus] = None,
-    task_manager: TaskManager = Depends(get_task_manager)
+    task_manager: TaskManager = Depends(get_task_manager),
+    _ = Depends(has_permission("tasks", "WRITE"))
):
    """
    Clear tasks matching the status filter. If no filter, clears all non-running tasks.
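Unlike the static guards elsewhere, `create_task` wraps `has_permission` in a lambda keyed on the request body; whether FastAPI injects `req` into that lambda as intended is not shown by this diff. A common alternative is a dedicated dependency that parses the body first and then performs the dynamic check — a sketch only, with `CreateTaskRequest`, `get_current_user`, and the permission-set shape assumed:

```python
# Sketch of a body-aware permission dependency; everything except the
# (resource, action) idea is an assumption layered on top of this diff.
from pydantic import BaseModel
from fastapi import Depends, HTTPException, status

class CreateTaskRequest(BaseModel):      # stand-in for the real schema
    plugin_id: str
    params: dict = {}

def get_current_user():                  # stand-in for the real auth dependency
    class User:
        permissions = {("plugin:migration", "EXECUTE")}
    return User()

async def require_plugin_execute(
    request: CreateTaskRequest,          # FastAPI parses the body here first
    user = Depends(get_current_user),
) -> CreateTaskRequest:
    resource = f"plugin:{request.plugin_id}"
    if (resource, "EXECUTE") not in user.permissions:
        raise HTTPException(status.HTTP_403_FORBIDDEN, f"Missing {resource}:EXECUTE")
    return request

# The endpoint would then declare:
#     request: CreateTaskRequest = Depends(require_plugin_execute)
```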

@@ -6,12 +6,11 @@
import sys
from pathlib import Path

-# Add project root to sys.path to allow importing superset_tool
-# Assuming app.py is in backend/src/
+# project_root is used for static files mounting
project_root = Path(__file__).resolve().parent.parent.parent
sys.path.append(str(project_root))

from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request, HTTPException
from starlette.middleware.sessions import SessionMiddleware
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import FileResponse

@@ -20,7 +19,8 @@ import os

from .dependencies import get_task_manager, get_scheduler_service
from .core.logger import logger, belief_scope
-from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections
+from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage, admin
+from .api import auth
from .core.database import init_db

# [DEF:App:Global]

@@ -57,6 +57,10 @@ async def shutdown_event():
    scheduler.stop()
# [/DEF:shutdown_event:Function]

+# Configure Session Middleware (required by Authlib for OAuth2 flow)
+from .core.auth.config import auth_config
+app.add_middleware(SessionMiddleware, secret_key=auth_config.SECRET_KEY)
+
# Configure CORS
app.add_middleware(
    CORSMiddleware,

@@ -83,13 +87,17 @@ async def log_requests(request: Request, call_next):
# [/DEF:log_requests:Function]

# Include API routes
+app.include_router(auth.router)
+app.include_router(admin.router)
app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
-app.include_router(connections.router, prefix="/api/connections", tags=["Connections"])
+app.include_router(connections.router, prefix="/api/settings/connections", tags=["Connections"])
app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
app.include_router(mappings.router)
app.include_router(migration.router)
+app.include_router(git.router)
+app.include_router(storage.router, prefix="/api/storage", tags=["Storage"])

# [DEF:websocket_endpoint:Function]
# @PURPOSE: Provides a WebSocket endpoint for real-time log streaming of a task.

45 backend/src/core/auth/config.py Normal file
@@ -0,0 +1,45 @@

# [DEF:backend.src.core.auth.config:Module]
#
# @SEMANTICS: auth, config, settings, jwt, adfs
# @PURPOSE: Centralized configuration for authentication and authorization.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> pydantic
#
# @INVARIANT: All sensitive configuration must have defaults or be loaded from environment.

# [SECTION: IMPORTS]
from pydantic import Field
from pydantic_settings import BaseSettings
import os
# [/SECTION]

# [DEF:AuthConfig:Class]
# @PURPOSE: Holds authentication-related settings.
# @PRE: Environment variables may be provided via .env file.
# @POST: Returns a configuration object with validated settings.
class AuthConfig(BaseSettings):
    # JWT Settings
    SECRET_KEY: str = Field(default="super-secret-key-change-in-production", env="AUTH_SECRET_KEY")
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 30
    REFRESH_TOKEN_EXPIRE_DAYS: int = 7

    # Database Settings
    AUTH_DATABASE_URL: str = Field(default="sqlite:///./backend/auth.db", env="AUTH_DATABASE_URL")

    # ADFS Settings
    ADFS_CLIENT_ID: str = Field(default="", env="ADFS_CLIENT_ID")
    ADFS_CLIENT_SECRET: str = Field(default="", env="ADFS_CLIENT_SECRET")
    ADFS_METADATA_URL: str = Field(default="", env="ADFS_METADATA_URL")

    class Config:
        env_file = ".env"
        extra = "ignore"
# [/DEF:AuthConfig:Class]

# [DEF:auth_config:Variable]
# @PURPOSE: Singleton instance of AuthConfig.
auth_config = AuthConfig()
# [/DEF:auth_config:Variable]

# [/DEF:backend.src.core.auth.config:Module]
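Because `AuthConfig` extends `BaseSettings`, every field can be overridden from the process environment (or `.env`). A quick check using variables whose names match the fields directly, which sidesteps any version-specific `env=` alias behavior in pydantic-settings; the import path assumes the repo root is on `sys.path`:

```python
# Hypothetical override check: pydantic-settings populates AuthConfig from
# the process environment. Values here are illustrative only.
import os
os.environ["ADFS_CLIENT_ID"] = "client-123"
os.environ["AUTH_DATABASE_URL"] = "sqlite:////tmp/auth-test.db"

from backend.src.core.auth.config import AuthConfig  # import path per this diff

cfg = AuthConfig()
print(cfg.ADFS_CLIENT_ID)      # -> client-123
print(cfg.AUTH_DATABASE_URL)   # -> sqlite:////tmp/auth-test.db
```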
54 backend/src/core/auth/jwt.py Normal file
@@ -0,0 +1,54 @@

# [DEF:backend.src.core.auth.jwt:Module]
#
# @SEMANTICS: jwt, token, session, auth
# @PURPOSE: JWT token generation and validation logic.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> jose
# @RELATION: USES -> backend.src.core.auth.config.auth_config
#
# @INVARIANT: Tokens must include expiration time and user identifier.

# [SECTION: IMPORTS]
from datetime import datetime, timedelta
from typing import Optional, List
from jose import JWTError, jwt
from .config import auth_config
from ..logger import belief_scope
# [/SECTION]

# [DEF:create_access_token:Function]
# @PURPOSE: Generates a new JWT access token.
# @PRE: data dict contains 'sub' (user_id) and optional 'scopes' (roles).
# @POST: Returns a signed JWT string.
#
# @PARAM: data (dict) - Payload data for the token.
# @PARAM: expires_delta (Optional[timedelta]) - Custom expiration time.
# @RETURN: str - The encoded JWT.
def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
    with belief_scope("create_access_token"):
        to_encode = data.copy()
        if expires_delta:
            expire = datetime.utcnow() + expires_delta
        else:
            expire = datetime.utcnow() + timedelta(minutes=auth_config.ACCESS_TOKEN_EXPIRE_MINUTES)

        to_encode.update({"exp": expire})
        encoded_jwt = jwt.encode(to_encode, auth_config.SECRET_KEY, algorithm=auth_config.ALGORITHM)
        return encoded_jwt
# [/DEF:create_access_token:Function]

# [DEF:decode_token:Function]
# @PURPOSE: Decodes and validates a JWT token.
# @PRE: token is a signed JWT string.
# @POST: Returns the decoded payload if valid.
#
# @PARAM: token (str) - The JWT to decode.
# @RETURN: dict - The decoded payload.
# @THROW: jose.JWTError - If token is invalid or expired.
def decode_token(token: str) -> dict:
    with belief_scope("decode_token"):
        payload = jwt.decode(token, auth_config.SECRET_KEY, algorithms=[auth_config.ALGORITHM])
        return payload
# [/DEF:decode_token:Function]

# [/DEF:backend.src.core.auth.jwt:Module]
|
||||
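The two functions form a round trip: whatever claims go into `create_access_token` come back from `decode_token`, plus the injected `exp`. A short sketch of the expected usage with `JWTError` handling (the user id and scopes below are made-up values):

```python
from datetime import timedelta
from jose import JWTError

from backend.src.core.auth.jwt import create_access_token, decode_token

# 'sub' and 'scopes' follow the precondition documented on create_access_token
token = create_access_token(
    data={"sub": "user-123", "scopes": ["admin"]},
    expires_delta=timedelta(minutes=5),
)

try:
    payload = decode_token(token)
    print(payload["sub"], payload["scopes"])  # user-123 ['admin']
except JWTError:
    # Raised for tampered, malformed, or expired tokens
    print("token rejected")
```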
31 backend/src/core/auth/logger.py Normal file
@@ -0,0 +1,31 @@
# [DEF:backend.src.core.auth.logger:Module]
#
# @SEMANTICS: auth, logger, audit, security
# @PURPOSE: Audit logging for security-related events.
# @LAYER: Core
# @RELATION: USES -> backend.src.core.logger.belief_scope
#
# @INVARIANT: Must not log sensitive data like passwords or full tokens.

# [SECTION: IMPORTS]
from typing import Optional
from ..logger import logger, belief_scope
from datetime import datetime
# [/SECTION]

# [DEF:log_security_event:Function]
# @PURPOSE: Logs a security-related event for audit trails.
# @PRE: event_type and username are strings.
# @POST: Security event is written to the application log.
# @PARAM: event_type (str) - Type of event (e.g., LOGIN_SUCCESS, PERMISSION_DENIED).
# @PARAM: username (str) - The user involved in the event.
# @PARAM: details (Optional[dict]) - Additional non-sensitive metadata.
def log_security_event(event_type: str, username: str, details: Optional[dict] = None):
    with belief_scope("log_security_event", f"{event_type}:{username}"):
        timestamp = datetime.utcnow().isoformat()
        msg = f"[AUDIT][{timestamp}][{event_type}] User: {username}"
        if details:
            msg += f" Details: {details}"
        logger.info(msg)
# [/DEF:log_security_event:Function]

# [/DEF:backend.src.core.auth.logger:Module]
51 backend/src/core/auth/oauth.py Normal file
@@ -0,0 +1,51 @@
# [DEF:backend.src.core.auth.oauth:Module]
#
# @SEMANTICS: auth, oauth, oidc, adfs
# @PURPOSE: ADFS OIDC configuration and client using Authlib.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> authlib
# @RELATION: USES -> backend.src.core.auth.config.auth_config
#
# @INVARIANT: Must use secure OIDC flows.

# [SECTION: IMPORTS]
from authlib.integrations.starlette_client import OAuth
from .config import auth_config
# [/SECTION]

# [DEF:oauth:Variable]
# @PURPOSE: Global Authlib OAuth registry.
oauth = OAuth()
# [/DEF:oauth:Variable]

# [DEF:register_adfs:Function]
# @PURPOSE: Registers the ADFS OIDC client.
# @PRE: ADFS configuration is provided in auth_config.
# @POST: ADFS client is registered in oauth registry.
def register_adfs():
    if auth_config.ADFS_CLIENT_ID:
        oauth.register(
            name='adfs',
            client_id=auth_config.ADFS_CLIENT_ID,
            client_secret=auth_config.ADFS_CLIENT_SECRET,
            server_metadata_url=auth_config.ADFS_METADATA_URL,
            client_kwargs={
                'scope': 'openid email profile groups'
            }
        )
# [/DEF:register_adfs:Function]

# [DEF:is_adfs_configured:Function]
# @PURPOSE: Checks if ADFS is properly configured.
# @PRE: None.
# @POST: Returns True if ADFS client is registered, False otherwise.
# @RETURN: bool - Configuration status.
def is_adfs_configured() -> bool:
    """Check if ADFS OAuth client is registered."""
    return 'adfs' in oauth._registry
# [/DEF:is_adfs_configured:Function]

# Initial registration
register_adfs()

# [/DEF:backend.src.core.auth.oauth:Module]
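Once `register_adfs()` has run, route handlers consume the registered client. A hedged sketch of a login/callback pair (the route paths and the `SessionMiddleware` secret are assumptions; `authorize_redirect` and `authorize_access_token` are standard Authlib Starlette APIs):

```python
# Hypothetical wiring; not part of the diff above.
from fastapi import FastAPI, Request
from starlette.middleware.sessions import SessionMiddleware

from backend.src.core.auth.oauth import oauth, is_adfs_configured

app = FastAPI()
# Authlib's Starlette integration stores OAuth state in the session
app.add_middleware(SessionMiddleware, secret_key="change-me")  # assumption

@app.get("/auth/adfs/login")
async def adfs_login(request: Request):
    redirect_uri = str(request.url_for("adfs_callback"))
    return await oauth.adfs.authorize_redirect(request, redirect_uri)

@app.get("/auth/adfs/callback")
async def adfs_callback(request: Request):
    token = await oauth.adfs.authorize_access_token(request)
    # 'userinfo' is populated from the OIDC id_token claims
    return token.get("userinfo", {})
```

Guard such routes with `is_adfs_configured()` so deployments without ADFS settings degrade gracefully.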
123 backend/src/core/auth/repository.py Normal file
@@ -0,0 +1,123 @@
# [DEF:backend.src.core.auth.repository:Module]
#
# @SEMANTICS: auth, repository, database, user, role
# @PURPOSE: Data access layer for authentication-related entities.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> sqlalchemy
# @RELATION: USES -> backend.src.models.auth
#
# @INVARIANT: All database operations must be performed within a session.

# [SECTION: IMPORTS]
from typing import Optional, List
from sqlalchemy.orm import Session
from ...models.auth import User, Role, Permission, ADGroupMapping
from ..logger import belief_scope
# [/SECTION]

# [DEF:AuthRepository:Class]
# @PURPOSE: Encapsulates database operations for authentication.
class AuthRepository:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the repository with a database session.
    # @PARAM: db (Session) - SQLAlchemy session.
    def __init__(self, db: Session):
        self.db = db
    # [/DEF:__init__:Function]

    # [DEF:get_user_by_username:Function]
    # @PURPOSE: Retrieves a user by their username.
    # @PRE: username is a string.
    # @POST: Returns User object if found, else None.
    # @PARAM: username (str) - The username to search for.
    # @RETURN: Optional[User] - The found user or None.
    def get_user_by_username(self, username: str) -> Optional[User]:
        with belief_scope("AuthRepository.get_user_by_username"):
            return self.db.query(User).filter(User.username == username).first()
    # [/DEF:get_user_by_username:Function]

    # [DEF:get_user_by_id:Function]
    # @PURPOSE: Retrieves a user by their unique ID.
    # @PRE: user_id is a valid UUID string.
    # @POST: Returns User object if found, else None.
    # @PARAM: user_id (str) - The user's unique identifier.
    # @RETURN: Optional[User] - The found user or None.
    def get_user_by_id(self, user_id: str) -> Optional[User]:
        with belief_scope("AuthRepository.get_user_by_id"):
            return self.db.query(User).filter(User.id == user_id).first()
    # [/DEF:get_user_by_id:Function]

    # [DEF:get_role_by_name:Function]
    # @PURPOSE: Retrieves a role by its name.
    # @PRE: name is a string.
    # @POST: Returns Role object if found, else None.
    # @PARAM: name (str) - The role name to search for.
    # @RETURN: Optional[Role] - The found role or None.
    def get_role_by_name(self, name: str) -> Optional[Role]:
        with belief_scope("AuthRepository.get_role_by_name"):
            return self.db.query(Role).filter(Role.name == name).first()
    # [/DEF:get_role_by_name:Function]

    # [DEF:update_last_login:Function]
    # @PURPOSE: Updates the last_login timestamp for a user.
    # @PRE: user object is a valid User instance.
    # @POST: User's last_login is updated in the database.
    # @SIDE_EFFECT: Commits the transaction.
    # @PARAM: user (User) - The user to update.
    def update_last_login(self, user: User):
        with belief_scope("AuthRepository.update_last_login"):
            from datetime import datetime
            user.last_login = datetime.utcnow()
            self.db.add(user)
            self.db.commit()
    # [/DEF:update_last_login:Function]

    # [DEF:get_role_by_id:Function]
    # @PURPOSE: Retrieves a role by its unique ID.
    # @PRE: role_id is a string.
    # @POST: Returns Role object if found, else None.
    # @PARAM: role_id (str) - The role's unique identifier.
    # @RETURN: Optional[Role] - The found role or None.
    def get_role_by_id(self, role_id: str) -> Optional[Role]:
        with belief_scope("AuthRepository.get_role_by_id"):
            return self.db.query(Role).filter(Role.id == role_id).first()
    # [/DEF:get_role_by_id:Function]

    # [DEF:get_permission_by_id:Function]
    # @PURPOSE: Retrieves a permission by its unique ID.
    # @PRE: perm_id is a string.
    # @POST: Returns Permission object if found, else None.
    # @PARAM: perm_id (str) - The permission's unique identifier.
    # @RETURN: Optional[Permission] - The found permission or None.
    def get_permission_by_id(self, perm_id: str) -> Optional[Permission]:
        with belief_scope("AuthRepository.get_permission_by_id"):
            return self.db.query(Permission).filter(Permission.id == perm_id).first()
    # [/DEF:get_permission_by_id:Function]

    # [DEF:get_permission_by_resource_action:Function]
    # @PURPOSE: Retrieves a permission by resource and action.
    # @PRE: resource and action are strings.
    # @POST: Returns Permission object if found, else None.
    # @PARAM: resource (str) - The resource name.
    # @PARAM: action (str) - The action name.
    # @RETURN: Optional[Permission] - The found permission or None.
    def get_permission_by_resource_action(self, resource: str, action: str) -> Optional[Permission]:
        with belief_scope("AuthRepository.get_permission_by_resource_action"):
            return self.db.query(Permission).filter(
                Permission.resource == resource,
                Permission.action == action
            ).first()
    # [/DEF:get_permission_by_resource_action:Function]

    # [DEF:list_permissions:Function]
    # @PURPOSE: Lists all available permissions.
    # @POST: Returns a list of all Permission objects.
    # @RETURN: List[Permission] - List of permissions.
    def list_permissions(self) -> List[Permission]:
        with belief_scope("AuthRepository.list_permissions"):
            return self.db.query(Permission).all()
    # [/DEF:list_permissions:Function]

# [/DEF:AuthRepository:Class]

# [/DEF:backend.src.core.auth.repository:Module]
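The repository expects its session to be supplied by the caller, which keeps it testable. A usage sketch, assuming the `AuthSessionLocal` factory introduced in `backend/src/core/database.py` further down this diff:

```python
# Hypothetical usage sketch; not part of the diff above.
from backend.src.core.database import AuthSessionLocal
from backend.src.core.auth.repository import AuthRepository

db = AuthSessionLocal()
try:
    repo = AuthRepository(db)
    user = repo.get_user_by_username("alice")  # made-up username
    if user is not None:
        repo.update_last_login(user)  # commits the transaction
finally:
    db.close()
```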
42 backend/src/core/auth/security.py Normal file
@@ -0,0 +1,42 @@
# [DEF:backend.src.core.auth.security:Module]
#
# @SEMANTICS: security, password, hashing, bcrypt
# @PURPOSE: Utility for password hashing and verification using Passlib.
# @LAYER: Core
# @RELATION: DEPENDS_ON -> passlib
#
# @INVARIANT: Uses bcrypt for hashing with standard work factor.

# [SECTION: IMPORTS]
from passlib.context import CryptContext
# [/SECTION]

# [DEF:pwd_context:Variable]
# @PURPOSE: Passlib CryptContext for password management.
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
# [/DEF:pwd_context:Variable]

# [DEF:verify_password:Function]
# @PURPOSE: Verifies a plain password against a hashed password.
# @PRE: plain_password is a string, hashed_password is a bcrypt hash.
# @POST: Returns True if password matches, False otherwise.
#
# @PARAM: plain_password (str) - The unhashed password.
# @PARAM: hashed_password (str) - The stored hash.
# @RETURN: bool - Verification result.
def verify_password(plain_password: str, hashed_password: str) -> bool:
    return pwd_context.verify(plain_password, hashed_password)
# [/DEF:verify_password:Function]

# [DEF:get_password_hash:Function]
# @PURPOSE: Generates a bcrypt hash for a plain password.
# @PRE: password is a string.
# @POST: Returns a secure bcrypt hash string.
#
# @PARAM: password (str) - The password to hash.
# @RETURN: str - The generated hash.
def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)
# [/DEF:get_password_hash:Function]

# [/DEF:backend.src.core.auth.security:Module]
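Because bcrypt embeds a random salt in each hash, hashing the same password twice yields different strings; `verify_password` is the only correct way to compare. A quick round trip:

```python
from backend.src.core.auth.security import get_password_hash, verify_password

h1 = get_password_hash("s3cret")
h2 = get_password_hash("s3cret")
assert h1 != h2                       # salts differ per call
assert verify_password("s3cret", h1)  # both hashes still verify
assert verify_password("s3cret", h2)
assert not verify_password("wrong", h1)
```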
@@ -62,14 +62,18 @@ class ConfigManager:
            logger.info(f"[_load_config][Action] Config file not found. Creating default.")
            default_config = AppConfig(
                environments=[],
-               settings=GlobalSettings(backup_path="backups")
+               settings=GlobalSettings()
            )
            self._save_config_to_disk(default_config)
            return default_config

        try:
            with open(self.config_path, "r") as f:
                data = json.load(f)

            # Check for deprecated field
            if "settings" in data and "backup_path" in data["settings"]:
                del data["settings"]["backup_path"]

            config = AppConfig(**data)
            logger.info(f"[_load_config][Coherence:OK] Configuration loaded")
            return config
@@ -79,7 +83,7 @@ class ConfigManager:
            # For now, return default to be safe, but log the error prominently.
            return AppConfig(
                environments=[],
-               settings=GlobalSettings(backup_path="backups")
+               settings=GlobalSettings(storage=StorageConfig())
            )
    # [/DEF:_load_config:Function]
@@ -7,6 +7,7 @@

from pydantic import BaseModel, Field
from typing import List, Optional
from ..models.storage import StorageConfig

# [DEF:Schedule:DataClass]
# @PURPOSE: Represents a backup schedule configuration.
@@ -23,6 +24,8 @@ class Environment(BaseModel):
    url: str
    username: str
    password: str  # Will be masked in UI
    verify_ssl: bool = True
    timeout: int = 30
    is_default: bool = False
    backup_schedule: Schedule = Field(default_factory=Schedule)
# [/DEF:Environment:DataClass]
@@ -40,7 +43,7 @@ class LoggingConfig(BaseModel):
# [DEF:GlobalSettings:DataClass]
# @PURPOSE: Represents global application settings.
class GlobalSettings(BaseModel):
-   backup_path: str
+   storage: StorageConfig = Field(default_factory=StorageConfig)
    default_environment_id: Optional[str] = None
    logging: LoggingConfig = Field(default_factory=LoggingConfig)
@@ -5,6 +5,7 @@
# @LAYER: Core
# @RELATION: DEPENDS_ON -> sqlalchemy
# @RELATION: USES -> backend.src.models.mapping
# @RELATION: USES -> backend.src.core.auth.config
#
# @INVARIANT: A single engine instance is used for the entire application.

@@ -15,44 +16,71 @@ from ..models.mapping import Base
# Import models to ensure they're registered with Base
from ..models.task import TaskRecord
from ..models.connection import ConnectionConfig
from ..models.git import GitServerConfig, GitRepository, DeploymentEnvironment
from ..models.auth import User, Role, Permission, ADGroupMapping
from .logger import belief_scope
from .auth.config import auth_config
import os
# [/SECTION]

# [DEF:DATABASE_URL:Constant]
# @PURPOSE: URL for the main mappings database.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./mappings.db")
# [/DEF:DATABASE_URL:Constant]

# [DEF:TASKS_DATABASE_URL:Constant]
# @PURPOSE: URL for the tasks execution database.
TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", "sqlite:///./tasks.db")
# [/DEF:TASKS_DATABASE_URL:Constant]

# [DEF:AUTH_DATABASE_URL:Constant]
# @PURPOSE: URL for the authentication database.
AUTH_DATABASE_URL = auth_config.AUTH_DATABASE_URL
# [/DEF:AUTH_DATABASE_URL:Constant]

# [DEF:engine:Variable]
# @PURPOSE: SQLAlchemy engine for mappings database.
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
# [/DEF:engine:Variable]

# [DEF:tasks_engine:Variable]
# @PURPOSE: SQLAlchemy engine for tasks database.
tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
# [/DEF:tasks_engine:Variable]

# [DEF:auth_engine:Variable]
# @PURPOSE: SQLAlchemy engine for authentication database.
auth_engine = create_engine(AUTH_DATABASE_URL, connect_args={"check_same_thread": False})
# [/DEF:auth_engine:Variable]

# [DEF:SessionLocal:Class]
# @PURPOSE: A session factory for the main mappings database.
# @PRE: engine is initialized.
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
# [/DEF:SessionLocal:Class]

# [DEF:TasksSessionLocal:Class]
# @PURPOSE: A session factory for the tasks execution database.
# @PRE: tasks_engine is initialized.
TasksSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=tasks_engine)
# [/DEF:TasksSessionLocal:Class]

# [DEF:AuthSessionLocal:Class]
# @PURPOSE: A session factory for the authentication database.
# @PRE: auth_engine is initialized.
AuthSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=auth_engine)
# [/DEF:AuthSessionLocal:Class]

# [DEF:init_db:Function]
# @PURPOSE: Initializes the database by creating all tables.
-# @PRE: engine and tasks_engine are initialized.
-# @POST: Database tables created.
+# @PRE: engine, tasks_engine and auth_engine are initialized.
+# @POST: Database tables created in all databases.
+# @SIDE_EFFECT: Creates physical database files if they don't exist.
def init_db():
    with belief_scope("init_db"):
        Base.metadata.create_all(bind=engine)
        Base.metadata.create_all(bind=tasks_engine)
        Base.metadata.create_all(bind=auth_engine)
# [/DEF:init_db:Function]

# [DEF:get_db:Function]
@@ -83,4 +111,18 @@ def get_tasks_db():
        db.close()
# [/DEF:get_tasks_db:Function]

# [DEF:get_auth_db:Function]
# @PURPOSE: Dependency for getting an authentication database session.
# @PRE: AuthSessionLocal is initialized.
# @POST: Session is closed after use.
# @RETURN: Generator[Session, None, None]
def get_auth_db():
    with belief_scope("get_auth_db"):
        db = AuthSessionLocal()
        try:
            yield db
        finally:
            db.close()
# [/DEF:get_auth_db:Function]

# [/DEF:backend.src.core.database:Module]
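`get_auth_db` follows the standard FastAPI generator-dependency pattern, so it plugs straight into `Depends`. A hedged sketch of an endpoint built on it together with `AuthRepository` (the route path and response shape are assumptions):

```python
# Hypothetical endpoint; not part of the diff above.
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

from backend.src.core.database import get_auth_db
from backend.src.core.auth.repository import AuthRepository

router = APIRouter()

@router.get("/users/{username}")
def read_user(username: str, db: Session = Depends(get_auth_db)):
    repo = AuthRepository(db)
    user = repo.get_user_by_username(username)
    if user is None:
        raise HTTPException(status_code=404, detail="User not found")
    return {"id": user.id, "username": user.username}
```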
@@ -28,12 +28,12 @@ class BeliefFormatter(logging.Formatter):
    # @POST: Returns formatted string.
    # @PARAM: record (logging.LogRecord) - The log record to format.
    # @RETURN: str - The formatted log message.
    # @SEMANTICS: logging, formatter, context
    def format(self, record):
-       msg = super().format(record)
        anchor_id = getattr(_belief_state, 'anchor_id', None)
        if anchor_id:
-           msg = f"[{anchor_id}][Action] {msg}"
-       return msg
+           record.msg = f"[{anchor_id}][Action] {record.msg}"
+       return super().format(record)
    # [/DEF:format:Function]
# [/DEF:BeliefFormatter:Class]

@@ -55,6 +55,7 @@ class LogEntry(BaseModel):
# @PARAM: message (str) - Optional entry message.
# @PRE: anchor_id must be provided.
# @POST: Thread-local belief state is updated and entry/exit logs are generated.
# @SEMANTICS: logging, context, belief_state
@contextmanager
def belief_scope(anchor_id: str, message: str = ""):
    # Log Entry if enabled
@@ -89,6 +90,7 @@ def belief_scope(anchor_id: str, message: str = ""):
# @PRE: config is a valid LoggingConfig instance.
# @POST: Logger level, handlers, and belief state flag are updated.
# @PARAM: config (LoggingConfig) - The logging configuration.
# @SEMANTICS: logging, configuration, initialization
def configure_logger(config):
    global _enable_belief_state
    _enable_belief_state = config.enable_belief_state
@@ -141,6 +143,7 @@ class WebSocketLogHandler(logging.Handler):
    # @PRE: capacity is an integer.
    # @POST: Instance initialized with empty deque.
    # @PARAM: capacity (int) - Maximum number of logs to keep in memory.
    # @SEMANTICS: logging, initialization, buffer
    def __init__(self, capacity: int = 1000):
        super().__init__()
        self.log_buffer: deque[LogEntry] = deque(maxlen=capacity)
@@ -153,6 +156,7 @@ class WebSocketLogHandler(logging.Handler):
    # @PRE: record is a logging.LogRecord.
    # @POST: Log is added to the log_buffer.
    # @PARAM: record (logging.LogRecord) - The log record to emit.
    # @SEMANTICS: logging, handler, buffer
    def emit(self, record: logging.LogRecord):
        try:
            log_entry = LogEntry(
@@ -180,6 +184,7 @@ class WebSocketLogHandler(logging.Handler):
    # @PRE: None.
    # @POST: Returns list of LogEntry objects.
    # @RETURN: List[LogEntry] - List of buffered log entries.
    # @SEMANTICS: logging, buffer, retrieval
    def get_recent_logs(self) -> List[LogEntry]:
        """
        Returns a list of recent log entries from the buffer.
@@ -193,6 +198,30 @@ class WebSocketLogHandler(logging.Handler):
# @SEMANTICS: logger, global, instance
# @PURPOSE: The global logger instance for the application, configured with both a console handler and the custom WebSocket handler.
logger = logging.getLogger("superset_tools_app")

# [DEF:believed:Function]
# @PURPOSE: A decorator that wraps a function in a belief scope.
# @PARAM: anchor_id (str) - The identifier for the semantic block.
# @PRE: anchor_id must be a string.
# @POST: Returns a decorator function.
def believed(anchor_id: str):
    # [DEF:decorator:Function]
    # @PURPOSE: Internal decorator for belief scope.
    # @PRE: func must be a callable.
    # @POST: Returns the wrapped function.
    def decorator(func):
        # [DEF:wrapper:Function]
        # @PURPOSE: Internal wrapper that enters belief scope.
        # @PRE: None.
        # @POST: Executes the function within a belief scope.
        def wrapper(*args, **kwargs):
            with belief_scope(anchor_id):
                return func(*args, **kwargs)
        # [/DEF:wrapper:Function]
        return wrapper
    # [/DEF:decorator:Function]
    return decorator
# [/DEF:believed:Function]
logger.setLevel(logging.INFO)

# Create a formatter
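The `believed` decorator is sugar for wrapping an entire function body in `belief_scope`, so entry/exit logging happens without an explicit `with` block. A short sketch, assuming the module path `backend.src.core.logger`:

```python
from backend.src.core.logger import believed

@believed("sync_dashboards")
def sync_dashboards(env_name: str) -> int:
    # Runs inside belief_scope("sync_dashboards"); log lines emitted
    # while this executes are prefixed with the anchor id.
    return 0

sync_dashboards("dev")
```

One design note: the wrapper does not use `functools.wraps`, so decorated functions lose their `__name__` and docstring; callers relying on introspection should keep that in mind.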
@@ -1,5 +1,5 @@
from abc import ABC, abstractmethod
-from typing import Dict, Any
+from typing import Dict, Any, Optional
from .logger import belief_scope

from pydantic import BaseModel, Field
@@ -68,6 +68,33 @@ class PluginBase(ABC):
        pass
    # [/DEF:version:Function]

    @property
    # [DEF:required_permission:Function]
    # @PURPOSE: Returns the required permission string to execute this plugin.
    # @PRE: Plugin instance exists.
    # @POST: Returns string permission.
    # @RETURN: str - Required permission (e.g., "plugin:backup:execute").
    def required_permission(self) -> str:
        """The permission string required to execute this plugin."""
        with belief_scope("required_permission"):
            return f"plugin:{self.id}:execute"
    # [/DEF:required_permission:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the plugin's UI, if applicable.
    # @PRE: Plugin instance exists.
    # @POST: Returns string route or None.
    # @RETURN: Optional[str] - Frontend route.
    def ui_route(self) -> Optional[str]:
        """
        The frontend route for the plugin's UI.
        Returns None if the plugin does not have a dedicated UI page.
        """
        with belief_scope("ui_route"):
            return None
    # [/DEF:ui_route:Function]

    @abstractmethod
    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the plugin's input parameters.
@@ -111,5 +138,6 @@ class PluginConfig(BaseModel):
    name: str = Field(..., description="Human-readable name for the plugin")
    description: str = Field(..., description="Brief description of what the plugin does")
    version: str = Field(..., description="Version of the plugin")
    ui_route: Optional[str] = Field(None, description="Frontend route for the plugin UI")
    input_schema: Dict[str, Any] = Field(..., description="JSON schema for input parameters", alias="schema")
# [/DEF:PluginConfig:Class]
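A plugin opts into a dedicated page simply by overriding `ui_route`; `required_permission` already derives a sensible default from the plugin `id`. A sketch of a minimal subclass, assuming `PluginBase` also exposes `id`, `name`, and `description` (only `version` and `get_schema` are visible in this hunk, so the exact abstract surface is an assumption):

```python
# Hypothetical plugin; the exact abstract surface of PluginBase is assumed.
from typing import Any, Dict, Optional

class BackupPlugin(PluginBase):
    id = "backup"
    name = "Backup"
    description = "Exports dashboards on a schedule"

    @property
    def version(self) -> str:
        return "1.0.0"

    @property
    def ui_route(self) -> Optional[str]:
        return "/plugins/backup"  # made-up route

    def get_schema(self) -> Dict[str, Any]:
        return {"type": "object", "properties": {}}

# BackupPlugin().required_permission -> "plugin:backup:execute"
```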
@@ -50,9 +50,18 @@ class PluginLoader:
        sys.path.insert(0, plugin_parent_dir)

        for filename in os.listdir(self.plugin_dir):
            file_path = os.path.join(self.plugin_dir, filename)

            # Handle directory-based plugins (packages)
            if os.path.isdir(file_path):
                init_file = os.path.join(file_path, "__init__.py")
                if os.path.exists(init_file):
                    self._load_module(filename, init_file)
                continue

            # Handle single-file plugins
            if filename.endswith(".py") and filename != "__init__.py":
                module_name = filename[:-3]
                file_path = os.path.join(self.plugin_dir, filename)
                self._load_module(module_name, file_path)
    # [/DEF:_load_plugins:Function]

@@ -132,6 +141,7 @@ class PluginLoader:
                name=plugin_instance.name,
                description=plugin_instance.description,
                version=plugin_instance.version,
                ui_route=plugin_instance.ui_route,
                schema=schema,
            )
            # The following line is commented out because it requires a schema to be passed to validate against.
@@ -1,82 +1,108 @@
# [DEF:backend.src.core.superset_client:Module]
#
-# @SEMANTICS: superset, api, client, database, metadata
-# @PURPOSE: Extends the base SupersetClient with database-specific metadata fetching.
+# @SEMANTICS: superset, api, client, rest, http, dashboard, dataset, import, export
+# @PURPOSE: Provides a high-level client for the Superset REST API, encapsulating request logic, error handling, and pagination.
# @LAYER: Core
-# @RELATION: INHERITS_FROM -> superset_tool.client.SupersetClient
+# @RELATION: USES -> backend.src.core.utils.network.APIClient
+# @RELATION: USES -> backend.src.core.config_models.Environment
#
-# @INVARIANT: All database metadata requests must include UUID and name.
+# @INVARIANT: All network operations must use the internal APIClient instance.
+# @PUBLIC_API: SupersetClient

# [SECTION: IMPORTS]
-from typing import List, Dict, Optional, Tuple
-from .logger import belief_scope
-from superset_tool.client import SupersetClient as BaseSupersetClient
-from superset_tool.models import SupersetConfig
+import json
+import zipfile
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple, Union, cast
+from requests import Response
+from .logger import logger as app_logger, belief_scope
+from .utils.network import APIClient, SupersetAPIError, AuthenticationError, DashboardNotFoundError, NetworkError
+from .utils.fileio import get_filename_from_headers
+from .config_models import Environment
# [/SECTION]

# [DEF:SupersetClient:Class]
-# @PURPOSE: Extended SupersetClient for migration-specific operations.
-class SupersetClient(BaseSupersetClient):
+# @PURPOSE: A wrapper class over the Superset REST API that provides methods for working with dashboards and datasets.
+class SupersetClient:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the client, validates the configuration, and creates the network client.
    # @PRE: `env` must be a valid Environment object.
    # @POST: The `env` and `network` attributes are created and ready for use.
    # @PARAM: env (Environment) - The environment configuration.
    def __init__(self, env: Environment):
        with belief_scope("__init__"):
            app_logger.info("[SupersetClient.__init__][Enter] Initializing SupersetClient for env %s.", env.name)
            self.env = env
            # Construct auth payload expected by Superset API
            auth_payload = {
                "username": env.username,
                "password": env.password,
                "provider": "db",
                "refresh": "true"
            }
            self.network = APIClient(
                config={
                    "base_url": env.url,
                    "auth": auth_payload
                },
                verify_ssl=env.verify_ssl,
                timeout=env.timeout
            )
            self.delete_before_reimport: bool = False
            app_logger.info("[SupersetClient.__init__][Exit] SupersetClient initialized.")
    # [/DEF:__init__:Function]

    # [DEF:authenticate:Function]
    # @PURPOSE: Authenticates the client using the configured credentials.
    # @PRE: self.network must be initialized with valid auth configuration.
    # @POST: Client is authenticated and tokens are stored.
    # @RETURN: Dict[str, str] - Authentication tokens.
-   def authenticate(self):
+   def authenticate(self) -> Dict[str, str]:
        with belief_scope("SupersetClient.authenticate"):
            return self.network.authenticate()
-
-   # [DEF:get_databases_summary:Function]
-   # @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
-   # @PRE: self.network must be initialized and authenticated.
-   # @POST: Returns a list of database dictionaries with 'engine' field.
-   # @RETURN: List[Dict] - Summary of databases.
-   def get_databases_summary(self) -> List[Dict]:
-       with belief_scope("SupersetClient.get_databases_summary"):
-           """
-           Fetch a summary of databases including uuid, name, and engine.
-           """
-           query = {
-               "columns": ["uuid", "database_name", "backend"]
-           }
-           _, databases = self.get_databases(query=query)
-
-           # Map 'backend' to 'engine' for consistency with contracts
-           for db in databases:
-               db['engine'] = db.pop('backend', None)
-
-           return databases
-   # [/DEF:get_databases_summary:Function]
    # [/DEF:authenticate:Function]

-   # [DEF:get_database_by_uuid:Function]
-   # @PURPOSE: Find a database by its UUID.
-   # @PRE: db_uuid must be a string.
-   # @POST: Returns database metadata if found.
-   # @PARAM: db_uuid (str) - The UUID of the database.
-   # @RETURN: Optional[Dict] - Database info if found, else None.
-   def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
-       with belief_scope("SupersetClient.get_database_by_uuid", f"uuid={db_uuid}"):
-           """
-           Find a database by its UUID.
-           """
-           query = {
-               "filters": [{"col": "uuid", "op": "eq", "value": db_uuid}]
-           }
-           _, databases = self.get_databases(query=query)
-           return databases[0] if databases else None
-   # [/DEF:get_database_by_uuid:Function]

    @property
    # [DEF:headers:Function]
    # @PURPOSE: Returns the base HTTP headers used by the network client.
    # @PRE: APIClient is initialized and authenticated.
    # @POST: Returns a dictionary of HTTP headers.
    def headers(self) -> dict:
        with belief_scope("headers"):
            return self.network.headers
    # [/DEF:headers:Function]

    # [SECTION: DASHBOARD OPERATIONS]

    # [DEF:get_dashboards:Function]
    # @PURPOSE: Retrieves the full list of dashboards, handling pagination automatically.
    # @PARAM: query (Optional[Dict]) - Additional query parameters for the API.
    # @PRE: Client is authenticated.
    # @POST: Returns a tuple with total count and list of dashboards.
    # @RETURN: Tuple[int, List[Dict]] - A (total count, list of dashboards) tuple.
    def get_dashboards(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
        with belief_scope("get_dashboards"):
            app_logger.info("[get_dashboards][Enter] Fetching dashboards.")
            validated_query = self._validate_query_params(query or {})
            if 'columns' not in validated_query:
                validated_query['columns'] = ["slug", "id", "changed_on_utc", "dashboard_title", "published"]

            total_count = self._fetch_total_object_count(endpoint="/dashboard/")
            paginated_data = self._fetch_all_pages(
                endpoint="/dashboard/",
                pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
            )
            app_logger.info("[get_dashboards][Exit] Found %d dashboards.", total_count)
            return total_count, paginated_data
    # [/DEF:get_dashboards:Function]

    # [DEF:get_dashboards_summary:Function]
    # @PURPOSE: Fetches dashboard metadata optimized for the grid.
-   # @PRE: self.network must be authenticated.
-   # @POST: Returns a list of dashboard dictionaries mapped to the grid schema.
+   # @PRE: Client is authenticated.
+   # @POST: Returns a list of dashboard metadata summaries.
+   # @RETURN: List[Dict]
    def get_dashboards_summary(self) -> List[Dict]:
        with belief_scope("SupersetClient.get_dashboards_summary"):
            """
            Fetches dashboard metadata optimized for the grid.
            Returns a list of dictionaries mapped to DashboardMetadata fields.
            """
            query = {
                "columns": ["id", "dashboard_title", "changed_on_utc", "published"]
            }
@@ -94,34 +120,331 @@ class SupersetClient(BaseSupersetClient):
            return result
    # [/DEF:get_dashboards_summary:Function]

    # [DEF:export_dashboard:Function]
    # @PURPOSE: Exports a dashboard as a ZIP archive.
    # @PARAM: dashboard_id (int) - The ID of the dashboard to export.
    # @PRE: dashboard_id must exist in Superset.
    # @POST: Returns ZIP content and filename.
    # @RETURN: Tuple[bytes, str] - The binary ZIP content and the file name.
    def export_dashboard(self, dashboard_id: int) -> Tuple[bytes, str]:
        with belief_scope("export_dashboard"):
            app_logger.info("[export_dashboard][Enter] Exporting dashboard %s.", dashboard_id)
            response = self.network.request(
                method="GET",
                endpoint="/dashboard/export/",
                params={"q": json.dumps([dashboard_id])},
                stream=True,
                raw_response=True,
            )
            response = cast(Response, response)
            self._validate_export_response(response, dashboard_id)
            filename = self._resolve_export_filename(response, dashboard_id)
            app_logger.info("[export_dashboard][Exit] Exported dashboard %s to %s.", dashboard_id, filename)
            return response.content, filename
    # [/DEF:export_dashboard:Function]

    # [DEF:import_dashboard:Function]
    # @PURPOSE: Imports a dashboard from a ZIP file.
    # @PARAM: file_name (Union[str, Path]) - Path to the ZIP archive.
    # @PARAM: dash_id (Optional[int]) - Dashboard ID to delete on failure.
    # @PARAM: dash_slug (Optional[str]) - Dashboard slug used to look up the ID.
    # @PRE: file_name must be a valid ZIP dashboard export.
    # @POST: Dashboard is imported or re-imported after deletion.
    # @RETURN: Dict - The API response on success.
    def import_dashboard(self, file_name: Union[str, Path], dash_id: Optional[int] = None, dash_slug: Optional[str] = None) -> Dict:
        with belief_scope("import_dashboard"):
            file_path = str(file_name)
            self._validate_import_file(file_path)
            try:
                return self._do_import(file_path)
            except Exception as exc:
                app_logger.error("[import_dashboard][Failure] First import attempt failed: %s", exc, exc_info=True)
                if not self.delete_before_reimport:
                    raise

                target_id = self._resolve_target_id_for_delete(dash_id, dash_slug)
                if target_id is None:
                    app_logger.error("[import_dashboard][Failure] No ID available for delete-retry.")
                    raise

                self.delete_dashboard(target_id)
                app_logger.info("[import_dashboard][State] Deleted dashboard ID %s, retrying import.", target_id)
                return self._do_import(file_path)
    # [/DEF:import_dashboard:Function]

    # [DEF:delete_dashboard:Function]
    # @PURPOSE: Deletes a dashboard by its ID or slug.
    # @PARAM: dashboard_id (Union[int, str]) - The dashboard ID or slug.
    # @PRE: dashboard_id must exist.
    # @POST: Dashboard is removed from Superset.
    def delete_dashboard(self, dashboard_id: Union[int, str]) -> None:
        with belief_scope("delete_dashboard"):
            app_logger.info("[delete_dashboard][Enter] Deleting dashboard %s.", dashboard_id)
            response = self.network.request(method="DELETE", endpoint=f"/dashboard/{dashboard_id}")
            response = cast(Dict, response)
            if response.get("result", True) is not False:
                app_logger.info("[delete_dashboard][Success] Dashboard %s deleted.", dashboard_id)
            else:
                app_logger.warning("[delete_dashboard][Warning] Unexpected response while deleting %s: %s", dashboard_id, response)
    # [/DEF:delete_dashboard:Function]

    # [/SECTION]

    # [SECTION: DATASET OPERATIONS]

    # [DEF:get_datasets:Function]
    # @PURPOSE: Retrieves the full list of datasets, handling pagination automatically.
    # @PARAM: query (Optional[Dict]) - Additional query parameters.
    # @PRE: Client is authenticated.
    # @POST: Returns total count and list of datasets.
    # @RETURN: Tuple[int, List[Dict]] - A (total count, list of datasets) tuple.
    def get_datasets(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
        with belief_scope("get_datasets"):
            app_logger.info("[get_datasets][Enter] Fetching datasets.")
            validated_query = self._validate_query_params(query)

            total_count = self._fetch_total_object_count(endpoint="/dataset/")
            paginated_data = self._fetch_all_pages(
                endpoint="/dataset/",
                pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
            )
            app_logger.info("[get_datasets][Exit] Found %d datasets.", total_count)
            return total_count, paginated_data
    # [/DEF:get_datasets:Function]

    # [DEF:get_dataset:Function]
-   # @PURPOSE: Fetch full dataset structure including columns and metrics.
-   # @PRE: dataset_id must be a valid integer.
-   # @POST: Returns full dataset metadata from Superset API.
-   # @PARAM: dataset_id (int) - The ID of the dataset.
-   # @RETURN: Dict - The dataset metadata.
+   # @PURPOSE: Retrieves information about a specific dataset by its ID.
+   # @PARAM: dataset_id (int) - The dataset ID.
+   # @PRE: dataset_id must exist.
+   # @POST: Returns dataset details.
+   # @RETURN: Dict - Dataset details.
    def get_dataset(self, dataset_id: int) -> Dict:
        with belief_scope("SupersetClient.get_dataset", f"id={dataset_id}"):
-           """
-           Fetch full dataset structure.
-           """
-           return self.network.get(f"/api/v1/dataset/{dataset_id}").json()
+           app_logger.info("[get_dataset][Enter] Fetching dataset %s.", dataset_id)
+           response = self.network.request(method="GET", endpoint=f"/dataset/{dataset_id}")
+           response = cast(Dict, response)
+           app_logger.info("[get_dataset][Exit] Got dataset %s.", dataset_id)
+           return response
    # [/DEF:get_dataset:Function]

    # [DEF:update_dataset:Function]
-   # @PURPOSE: Update dataset metadata.
-   # @PRE: dataset_id must be valid, data must be a valid Superset dataset payload.
+   # @PURPOSE: Updates a dataset's data by its ID.
+   # @PARAM: dataset_id (int) - The dataset ID.
+   # @PARAM: data (Dict) - The data to update.
+   # @PRE: dataset_id must exist.
+   # @POST: Dataset is updated in Superset.
-   # @PARAM: dataset_id (int) - The ID of the dataset.
-   # @PARAM: data (Dict) - The payload for update.
-   def update_dataset(self, dataset_id: int, data: Dict):
+   # @RETURN: Dict - The API response.
+   def update_dataset(self, dataset_id: int, data: Dict) -> Dict:
        with belief_scope("SupersetClient.update_dataset", f"id={dataset_id}"):
-           """
-           Update dataset metadata.
-           """
-           self.network.put(f"/api/v1/dataset/{dataset_id}", json=data)
+           app_logger.info("[update_dataset][Enter] Updating dataset %s.", dataset_id)
+           response = self.network.request(
+               method="PUT",
+               endpoint=f"/dataset/{dataset_id}",
+               data=json.dumps(data),
+               headers={'Content-Type': 'application/json'}
+           )
+           response = cast(Dict, response)
+           app_logger.info("[update_dataset][Exit] Updated dataset %s.", dataset_id)
+           return response
    # [/DEF:update_dataset:Function]

    # [/SECTION]

    # [SECTION: DATABASE OPERATIONS]

    # [DEF:get_databases:Function]
    # @PURPOSE: Retrieves the full list of databases.
    # @PARAM: query (Optional[Dict]) - Additional query parameters.
    # @PRE: Client is authenticated.
    # @POST: Returns total count and list of databases.
    # @RETURN: Tuple[int, List[Dict]] - A (total count, list of databases) tuple.
    def get_databases(self, query: Optional[Dict] = None) -> Tuple[int, List[Dict]]:
        with belief_scope("get_databases"):
            app_logger.info("[get_databases][Enter] Fetching databases.")
            validated_query = self._validate_query_params(query or {})
            if 'columns' not in validated_query:
                validated_query['columns'] = []
            total_count = self._fetch_total_object_count(endpoint="/database/")
            paginated_data = self._fetch_all_pages(
                endpoint="/database/",
                pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
            )
            app_logger.info("[get_databases][Exit] Found %d databases.", total_count)
            return total_count, paginated_data
    # [/DEF:get_databases:Function]

    # [DEF:get_database:Function]
    # @PURPOSE: Retrieves information about a specific database by its ID.
    # @PARAM: database_id (int) - The database ID.
    # @PRE: database_id must exist.
    # @POST: Returns database details.
    # @RETURN: Dict - Database details.
    def get_database(self, database_id: int) -> Dict:
        with belief_scope("get_database"):
            app_logger.info("[get_database][Enter] Fetching database %s.", database_id)
            response = self.network.request(method="GET", endpoint=f"/database/{database_id}")
            response = cast(Dict, response)
            app_logger.info("[get_database][Exit] Got database %s.", database_id)
            return response
    # [/DEF:get_database:Function]

    # [DEF:get_databases_summary:Function]
    # @PURPOSE: Fetch a summary of databases including uuid, name, and engine.
    # @PRE: Client is authenticated.
    # @POST: Returns list of database summaries.
    # @RETURN: List[Dict] - Summary of databases.
    def get_databases_summary(self) -> List[Dict]:
        with belief_scope("SupersetClient.get_databases_summary"):
            query = {
                "columns": ["uuid", "database_name", "backend"]
            }
            _, databases = self.get_databases(query=query)

            # Map 'backend' to 'engine' for consistency with contracts
            for db in databases:
                db['engine'] = db.pop('backend', None)

            return databases
    # [/DEF:get_databases_summary:Function]

    # [DEF:get_database_by_uuid:Function]
    # @PURPOSE: Find a database by its UUID.
    # @PARAM: db_uuid (str) - The UUID of the database.
    # @PRE: db_uuid must be a valid UUID string.
    # @POST: Returns database info or None.
    # @RETURN: Optional[Dict] - Database info if found, else None.
    def get_database_by_uuid(self, db_uuid: str) -> Optional[Dict]:
        with belief_scope("SupersetClient.get_database_by_uuid", f"uuid={db_uuid}"):
            query = {
                "filters": [{"col": "uuid", "op": "eq", "value": db_uuid}]
            }
            _, databases = self.get_databases(query=query)
            return databases[0] if databases else None
    # [/DEF:get_database_by_uuid:Function]

    # [/SECTION]

    # [SECTION: HELPERS]

    # [DEF:_resolve_target_id_for_delete:Function]
    # @PURPOSE: Resolves a dashboard ID from either an ID or a slug.
    # @PRE: Either dash_id or dash_slug should be provided.
    # @POST: Returns the resolved ID or None.
    def _resolve_target_id_for_delete(self, dash_id: Optional[int], dash_slug: Optional[str]) -> Optional[int]:
        with belief_scope("_resolve_target_id_for_delete"):
            if dash_id is not None:
                return dash_id
            if dash_slug is not None:
                app_logger.debug("[_resolve_target_id_for_delete][State] Resolving ID by slug '%s'.", dash_slug)
                try:
                    _, candidates = self.get_dashboards(query={"filters": [{"col": "slug", "op": "eq", "value": dash_slug}]})
                    if candidates:
                        target_id = candidates[0]["id"]
                        app_logger.debug("[_resolve_target_id_for_delete][Success] Resolved slug to ID %s.", target_id)
                        return target_id
                except Exception as e:
                    app_logger.warning("[_resolve_target_id_for_delete][Warning] Could not resolve slug '%s' to ID: %s", dash_slug, e)
            return None
    # [/DEF:_resolve_target_id_for_delete:Function]

    # [DEF:_do_import:Function]
    # @PURPOSE: Performs the actual multipart upload for import.
    # @PRE: file_name must be a path to an existing ZIP file.
    # @POST: Returns the API response from the upload.
    def _do_import(self, file_name: Union[str, Path]) -> Dict:
        with belief_scope("_do_import"):
            app_logger.debug(f"[_do_import][State] Uploading file: {file_name}")
            file_path = Path(file_name)
            if not file_path.exists():
                app_logger.error(f"[_do_import][Failure] File does not exist: {file_name}")
                raise FileNotFoundError(f"File does not exist: {file_name}")

            return self.network.upload_file(
                endpoint="/dashboard/import/",
                file_info={"file_obj": file_path, "file_name": file_path.name, "form_field": "formData"},
                extra_data={"overwrite": "true"},
                timeout=self.env.timeout * 2,
            )
    # [/DEF:_do_import:Function]

    # [DEF:_validate_export_response:Function]
    # @PURPOSE: Validates that the export response is a non-empty ZIP archive.
    # @PRE: response must be a valid requests.Response object.
    # @POST: Raises SupersetAPIError if validation fails.
    def _validate_export_response(self, response: Response, dashboard_id: int) -> None:
        with belief_scope("_validate_export_response"):
            content_type = response.headers.get("Content-Type", "")
            if "application/zip" not in content_type:
                raise SupersetAPIError(f"Received non-ZIP content during export (Content-Type: {content_type})")
            if not response.content:
                raise SupersetAPIError("Received empty data during export")
    # [/DEF:_validate_export_response:Function]

    # [DEF:_resolve_export_filename:Function]
    # @PURPOSE: Determines the filename for an exported dashboard.
    # @PRE: response must contain Content-Disposition header or dashboard_id must be provided.
    # @POST: Returns a sanitized filename string.
    def _resolve_export_filename(self, response: Response, dashboard_id: int) -> str:
        with belief_scope("_resolve_export_filename"):
            filename = get_filename_from_headers(dict(response.headers))
            if not filename:
                from datetime import datetime
                timestamp = datetime.now().strftime("%Y%m%dT%H%M%S")
                filename = f"dashboard_export_{dashboard_id}_{timestamp}.zip"
                app_logger.warning("[_resolve_export_filename][Warning] Generated filename: %s", filename)
            return filename
    # [/DEF:_resolve_export_filename:Function]

    # [DEF:_validate_query_params:Function]
    # @PURPOSE: Ensures query parameters have default page and page_size.
    # @PRE: query can be None or a dictionary.
    # @POST: Returns a dictionary with at least page and page_size.
    def _validate_query_params(self, query: Optional[Dict]) -> Dict:
        with belief_scope("_validate_query_params"):
            base_query = {"page": 0, "page_size": 1000}
            return {**base_query, **(query or {})}
    # [/DEF:_validate_query_params:Function]

    # [DEF:_fetch_total_object_count:Function]
    # @PURPOSE: Fetches the total number of items for a given endpoint.
    # @PRE: endpoint must be a valid Superset API path.
    # @POST: Returns the total count as an integer.
    def _fetch_total_object_count(self, endpoint: str) -> int:
        with belief_scope("_fetch_total_object_count"):
            return self.network.fetch_paginated_count(
                endpoint=endpoint,
                query_params={"page": 0, "page_size": 1},
                count_field="count",
            )
    # [/DEF:_fetch_total_object_count:Function]

    # [DEF:_fetch_all_pages:Function]
    # @PURPOSE: Iterates through all pages to collect all data items.
    # @PRE: pagination_options must contain base_query, total_count, and results_field.
    # @POST: Returns a combined list of all items.
    def _fetch_all_pages(self, endpoint: str, pagination_options: Dict) -> List[Dict]:
        with belief_scope("_fetch_all_pages"):
            return self.network.fetch_paginated_data(endpoint=endpoint, pagination_options=pagination_options)
    # [/DEF:_fetch_all_pages:Function]

    # [DEF:_validate_import_file:Function]
    # @PURPOSE: Validates that the file to be imported is a valid ZIP with metadata.yaml.
    # @PRE: zip_path must be a path to a file.
    # @POST: Raises error if file is missing, not a ZIP, or missing metadata.
    def _validate_import_file(self, zip_path: Union[str, Path]) -> None:
        with belief_scope("_validate_import_file"):
            path = Path(zip_path)
            if not path.exists():
                raise FileNotFoundError(f"File {zip_path} does not exist")
            if not zipfile.is_zipfile(path):
                raise SupersetAPIError(f"File {zip_path} is not a ZIP archive")
            with zipfile.ZipFile(path, "r") as zf:
                if not any(n.endswith("metadata.yaml") for n in zf.namelist()):
                    raise SupersetAPIError(f"Archive {zip_path} does not contain 'metadata.yaml'")
    # [/DEF:_validate_import_file:Function]

    # [/SECTION]

# [/DEF:SupersetClient:Class]

# [/DEF:backend.src.core.superset_client:Module]
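End to end, the new client covers export, import, and deletion without the `superset_tool` dependency. A hedged usage sketch for a one-off export, assuming an `Environment` instance with valid credentials (all connection values below are made up):

```python
# Hypothetical usage sketch; not part of the diff above.
from backend.src.core.config_models import Environment
from backend.src.core.superset_client import SupersetClient

env = Environment(
    name="dev",
    url="https://superset.example.com",
    username="admin",
    password="secret",
)
client = SupersetClient(env)
client.authenticate()

content, filename = client.export_dashboard(dashboard_id=42)
with open(filename, "wb") as f:
    f.write(content)
```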
477 superset_tool/utils/dataset_mapper.py → backend/src/core/utils/dataset_mapper.py (Executable file → Normal file)
@@ -1,240 +1,237 @@
# [DEF:superset_tool.utils.dataset_mapper:Module]
#
# @SEMANTICS: dataset, mapping, postgresql, xlsx, superset
# @PURPOSE: This module is responsible for updating metadata (verbose_map) in Superset datasets, extracting it from PostgreSQL or XLSX files.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> superset_tool.client
# @RELATION: DEPENDS_ON -> pandas
# @RELATION: DEPENDS_ON -> psycopg2
# @PUBLIC_API: DatasetMapper

# [SECTION: IMPORTS]
import pandas as pd  # type: ignore
import psycopg2  # type: ignore
from superset_tool.client import SupersetClient
from superset_tool.utils.init_clients import setup_clients
from superset_tool.utils.logger import SupersetLogger
from typing import Dict, List, Optional, Any
# [/SECTION]

# [DEF:DatasetMapper:Class]
# @PURPOSE: Class for mapping and updating verbose_map in Superset datasets.
class DatasetMapper:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the mapper.
    # @PRE: logger must be an instance of SupersetLogger.
    # @POST: The DatasetMapper object is initialized.
    def __init__(self, logger: SupersetLogger):
        self.logger = logger
    # [/DEF:__init__:Function]

    # [DEF:get_postgres_comments:Function]
    # @PURPOSE: Extracts column comments from the PostgreSQL system catalog.
    # @PRE: db_config must contain valid connection parameters (host, port, user, password, dbname).
    # @PRE: table_name and table_schema must be strings.
    # @POST: Returns a dictionary keyed by column name whose values are the comments from the database.
    # @THROW: Exception - On connection errors or errors while executing the query.
    # @PARAM: db_config (Dict) - Database connection configuration.
    # @PARAM: table_name (str) - Table name.
    # @PARAM: table_schema (str) - Table schema.
    # @RETURN: Dict[str, str] - Dictionary of column comments.
    def get_postgres_comments(self, db_config: Dict, table_name: str, table_schema: str) -> Dict[str, str]:
        with self.logger.belief_scope("Fetch comments from PostgreSQL"):
            self.logger.info("[get_postgres_comments][Enter] Fetching comments from PostgreSQL for %s.%s.", table_schema, table_name)
            # Parameterized to avoid SQL injection; the literal '%' in LIKE is escaped as '%%'.
            query = """
                SELECT
                    cols.column_name,
                    CASE
                        WHEN pg_catalog.col_description(
                            (SELECT c.oid
                             FROM pg_catalog.pg_class c
                             JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                             WHERE c.relname = cols.table_name
                               AND n.nspname = cols.table_schema),
                            cols.ordinal_position::int
                        ) LIKE '%%|%%' THEN
                            split_part(
                                pg_catalog.col_description(
                                    (SELECT c.oid
                                     FROM pg_catalog.pg_class c
                                     JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                                     WHERE c.relname = cols.table_name
                                       AND n.nspname = cols.table_schema),
                                    cols.ordinal_position::int
                                ),
                                '|',
                                1
                            )
                        ELSE
                            pg_catalog.col_description(
                                (SELECT c.oid
                                 FROM pg_catalog.pg_class c
                                 JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                                 WHERE c.relname = cols.table_name
                                   AND n.nspname = cols.table_schema),
                                cols.ordinal_position::int
                            )
                    END AS column_comment
                FROM
                    information_schema.columns cols
                WHERE cols.table_catalog = %(dbname)s AND cols.table_name = %(table_name)s AND cols.table_schema = %(table_schema)s;
            """
            comments = {}
            try:
                with psycopg2.connect(**db_config) as conn, conn.cursor() as cursor:
                    cursor.execute(query, {
                        "dbname": db_config.get("dbname"),
                        "table_name": table_name,
                        "table_schema": table_schema,
                    })
                    for row in cursor.fetchall():
                        if row[1]:
                            comments[row[0]] = row[1]
                self.logger.info("[get_postgres_comments][Success] Fetched %d comments.", len(comments))
            except Exception as e:
                self.logger.error("[get_postgres_comments][Failure] %s", e, exc_info=True)
                raise
            return comments
    # [/DEF:get_postgres_comments:Function]

    # [DEF:load_excel_mappings:Function]
    # @PURPOSE: Loads 'column_name' -> 'verbose_name' mappings from an XLSX file.
    # @PRE: file_path must point to an existing XLSX file.
    # @POST: Returns a dictionary with the mappings from the file.
    # @THROW: Exception - On file read or parsing errors.
    # @PARAM: file_path (str) - Path to the XLSX file.
    # @RETURN: Dict[str, str] - Dictionary with the mappings.
    def load_excel_mappings(self, file_path: str) -> Dict[str, str]:
        with self.logger.belief_scope("Load mappings from Excel"):
            self.logger.info("[load_excel_mappings][Enter] Loading mappings from %s.", file_path)
            try:
                df = pd.read_excel(file_path)
                mappings = df.set_index('column_name')['verbose_name'].to_dict()
                self.logger.info("[load_excel_mappings][Success] Loaded %d mappings.", len(mappings))
                return mappings
            except Exception as e:
                self.logger.error("[load_excel_mappings][Failure] %s", e, exc_info=True)
                raise
    # [/DEF:load_excel_mappings:Function]

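`load_excel_mappings` expects a sheet whose first row contains at least the headers `column_name` and `verbose_name`. A small sketch that builds such a file and reads it back (the file name is made up):

```python
# Hypothetical round trip; not part of the diff above.
import pandas as pd

pd.DataFrame(
    {"column_name": ["order_id", "created_at"],
     "verbose_name": ["Order ID", "Created At"]}
).to_excel("mappings.xlsx", index=False)

# mapper = DatasetMapper(logger)  # assumes a configured SupersetLogger
# mapper.load_excel_mappings("mappings.xlsx")
# -> {"order_id": "Order ID", "created_at": "Created At"}
```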
    # [DEF:run_mapping:Function]
    # @PURPOSE: Main entry point that performs the mapping and updates a dataset's verbose_map in Superset.
    # @PRE: superset_client must be authenticated.
    # @PRE: dataset_id must be an existing ID in Superset.
    # @POST: If changes are found, the dataset in Superset is updated via the API.
    # @RELATION: CALLS -> self.get_postgres_comments
    # @RELATION: CALLS -> self.load_excel_mappings
    # @RELATION: CALLS -> superset_client.get_dataset
    # @RELATION: CALLS -> superset_client.update_dataset
    # @PARAM: superset_client (SupersetClient) - The Superset client.
    # @PARAM: dataset_id (int) - The ID of the dataset to update.
    # @PARAM: source (str) - Data source ('postgres', 'excel', 'both').
    # @PARAM: postgres_config (Optional[Dict]) - PostgreSQL connection configuration.
    # @PARAM: excel_path (Optional[str]) - Path to the XLSX file.
    # @PARAM: table_name (Optional[str]) - Table name in PostgreSQL.
    # @PARAM: table_schema (Optional[str]) - Table schema in PostgreSQL.
    def run_mapping(self, superset_client: SupersetClient, dataset_id: int, source: str, postgres_config: Optional[Dict] = None, excel_path: Optional[str] = None, table_name: Optional[str] = None, table_schema: Optional[str] = None):
        with self.logger.belief_scope(f"Run dataset mapping for ID {dataset_id}"):
            self.logger.info("[run_mapping][Enter] Starting dataset mapping for ID %d from source '%s'.", dataset_id, source)
            mappings: Dict[str, str] = {}

            try:
                if source in ['postgres', 'both']:
                    assert postgres_config and table_name and table_schema, "Postgres config is required."
                    mappings.update(self.get_postgres_comments(postgres_config, table_name, table_schema))
                if source in ['excel', 'both']:
                    assert excel_path, "Excel path is required."
                    mappings.update(self.load_excel_mappings(excel_path))
                if source not in ['postgres', 'excel', 'both']:
                    self.logger.error("[run_mapping][Failure] Invalid source: %s.", source)
                    return

                dataset_response = superset_client.get_dataset(dataset_id)
                dataset_data = dataset_response['result']

                original_columns = dataset_data.get('columns', [])
                updated_columns = []
                changes_made = False

                for column in original_columns:
                    col_name = column.get('column_name')

                    new_column = {
                        "column_name": col_name,
                        "id": column.get("id"),
                        "advanced_data_type": column.get("advanced_data_type"),
                        "description": column.get("description"),
                        "expression": column.get("expression"),
                        "extra": column.get("extra"),
                        "filterable": column.get("filterable"),
                        "groupby": column.get("groupby"),
                        "is_active": column.get("is_active"),
                        "is_dttm": column.get("is_dttm"),
                        "python_date_format": column.get("python_date_format"),
                        "type": column.get("type"),
                        "uuid": column.get("uuid"),
                        "verbose_name": column.get("verbose_name"),
                    }

                    new_column = {k: v for k, v in new_column.items() if v is not None}

                    if col_name in mappings:
                        mapping_value = mappings[col_name]
                        if isinstance(mapping_value, str) and new_column.get('verbose_name') != mapping_value:
                            new_column['verbose_name'] = mapping_value
                            changes_made = True

                    updated_columns.append(new_column)

                updated_metrics = []
                for metric in dataset_data.get("metrics", []):
                    new_metric = {
                        "id": metric.get("id"),
                        "metric_name": metric.get("metric_name"),
                        "expression": metric.get("expression"),
                        "verbose_name": metric.get("verbose_name"),
                        "description": metric.get("description"),
                        "d3format": metric.get("d3format"),
                        "currency": metric.get("currency"),
                        "extra": metric.get("extra"),
                        "warning_text": metric.get("warning_text"),
                        "metric_type": metric.get("metric_type"),
                        "uuid": metric.get("uuid"),
                    }
                    updated_metrics.append({k: v for k, v in new_metric.items() if v is not None})

                if changes_made:
                    payload_for_update = {
                        "database_id": dataset_data.get("database", {}).get("id"),
|
||||
"table_name": dataset_data.get("table_name"),
|
||||
"schema": dataset_data.get("schema"),
|
||||
"columns": updated_columns,
|
||||
"owners": [owner["id"] for owner in dataset_data.get("owners", [])],
|
||||
"metrics": updated_metrics,
|
||||
"extra": dataset_data.get("extra"),
|
||||
"description": dataset_data.get("description"),
|
||||
"sql": dataset_data.get("sql"),
|
||||
"cache_timeout": dataset_data.get("cache_timeout"),
|
||||
"catalog": dataset_data.get("catalog"),
|
||||
"default_endpoint": dataset_data.get("default_endpoint"),
|
||||
"external_url": dataset_data.get("external_url"),
|
||||
"fetch_values_predicate": dataset_data.get("fetch_values_predicate"),
|
||||
"filter_select_enabled": dataset_data.get("filter_select_enabled"),
|
||||
"is_managed_externally": dataset_data.get("is_managed_externally"),
|
||||
"is_sqllab_view": dataset_data.get("is_sqllab_view"),
|
||||
"main_dttm_col": dataset_data.get("main_dttm_col"),
|
||||
"normalize_columns": dataset_data.get("normalize_columns"),
|
||||
"offset": dataset_data.get("offset"),
|
||||
"template_params": dataset_data.get("template_params"),
|
||||
}
|
||||
|
||||
payload_for_update = {k: v for k, v in payload_for_update.items() if v is not None}
|
||||
|
||||
superset_client.update_dataset(dataset_id, payload_for_update)
|
||||
self.logger.info("[run_mapping][Success] Dataset %d columns' verbose_name updated.", dataset_id)
|
||||
else:
|
||||
self.logger.info("[run_mapping][State] No changes in columns' verbose_name, skipping update.")
|
||||
|
||||
except (AssertionError, FileNotFoundError, Exception) as e:
|
||||
self.logger.error("[run_mapping][Failure] %s", e, exc_info=True)
|
||||
return
|
||||
# [/DEF:run_mapping:Function]
|
||||
# [/DEF:DatasetMapper:Class]
|
||||
|
||||
# [/DEF:superset_tool.utils.dataset_mapper:Module]

# [DEF:backend.core.utils.dataset_mapper:Module]
#
# @SEMANTICS: dataset, mapping, postgresql, xlsx, superset
# @PURPOSE: This module is responsible for updating metadata (verbose_map) in Superset datasets, sourcing it from PostgreSQL or XLSX files.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.core.superset_client
# @RELATION: DEPENDS_ON -> pandas
# @RELATION: DEPENDS_ON -> psycopg2
# @PUBLIC_API: DatasetMapper

# [SECTION: IMPORTS]
import pandas as pd  # type: ignore
import psycopg2  # type: ignore
from typing import Dict, List, Optional, Any
from ..logger import logger as app_logger, belief_scope
# [/SECTION]

# [DEF:DatasetMapper:Class]
# @PURPOSE: Class for mapping and updating verbose_map in Superset datasets.
class DatasetMapper:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the mapper.
    # @POST: The DatasetMapper object is initialized.
    def __init__(self):
        pass
    # [/DEF:__init__:Function]

    # [DEF:get_postgres_comments:Function]
    # @PURPOSE: Extracts column comments from the PostgreSQL system catalog.
    # @PRE: db_config must contain valid connection parameters (host, port, user, password, dbname).
    # @PRE: table_name and table_schema must be strings.
    # @POST: Returns a dictionary mapping column names to the comments from the database.
    # @THROW: Exception - On connection or query execution errors.
    # @PARAM: db_config (Dict) - Database connection configuration.
    # @PARAM: table_name (str) - Table name.
    # @PARAM: table_schema (str) - Table schema.
    # @RETURN: Dict[str, str] - Dictionary of column comments.
    def get_postgres_comments(self, db_config: Dict, table_name: str, table_schema: str) -> Dict[str, str]:
        with belief_scope("Fetch comments from PostgreSQL"):
            app_logger.info("[get_postgres_comments][Enter] Fetching comments from PostgreSQL for %s.%s.", table_schema, table_name)
            query = f"""
                SELECT
                    cols.column_name,
                    CASE
                        WHEN pg_catalog.col_description(
                            (SELECT c.oid
                             FROM pg_catalog.pg_class c
                             JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                             WHERE c.relname = cols.table_name
                               AND n.nspname = cols.table_schema),
                            cols.ordinal_position::int
                        ) LIKE '%|%' THEN
                            split_part(
                                pg_catalog.col_description(
                                    (SELECT c.oid
                                     FROM pg_catalog.pg_class c
                                     JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                                     WHERE c.relname = cols.table_name
                                       AND n.nspname = cols.table_schema),
                                    cols.ordinal_position::int
                                ),
                                '|',
                                1
                            )
                        ELSE
                            pg_catalog.col_description(
                                (SELECT c.oid
                                 FROM pg_catalog.pg_class c
                                 JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
                                 WHERE c.relname = cols.table_name
                                   AND n.nspname = cols.table_schema),
                                cols.ordinal_position::int
                            )
                    END AS column_comment
                FROM
                    information_schema.columns cols
                WHERE cols.table_catalog = '{db_config.get('dbname')}' AND cols.table_name = '{table_name}' AND cols.table_schema = '{table_schema}';
            """
            comments = {}
            try:
                with psycopg2.connect(**db_config) as conn, conn.cursor() as cursor:
                    cursor.execute(query)
                    for row in cursor.fetchall():
                        if row[1]:
                            comments[row[0]] = row[1]
                app_logger.info("[get_postgres_comments][Success] Fetched %d comments.", len(comments))
            except Exception as e:
                app_logger.error("[get_postgres_comments][Failure] %s", e, exc_info=True)
                raise
            return comments
    # [/DEF:get_postgres_comments:Function]
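The CASE/split_part branch above implements a small convention: when a column comment contains a '|', only the part before the first '|' is used as the display name, so comments can carry extra metadata after the delimiter. A minimal usage sketch, assuming a reachable PostgreSQL instance; the credentials and table are hypothetical, and the import path is taken from the module marker above:

from backend.core.utils.dataset_mapper import DatasetMapper

db_config = {
    "host": "localhost",   # hypothetical connection details
    "port": 5432,
    "user": "analytics",
    "password": "secret",
    "dbname": "warehouse",
}
mapper = DatasetMapper()
comments = mapper.get_postgres_comments(db_config, table_name="orders", table_schema="public")
# A comment stored as 'Order amount|internal note' comes back as 'Order amount'.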

    # [DEF:load_excel_mappings:Function]
    # @PURPOSE: Loads 'column_name' -> 'column_comment' mappings from an XLSX file.
    # @PRE: file_path must point to an existing XLSX file.
    # @POST: Returns a dictionary with the mappings from the file.
    # @THROW: Exception - On file read or parsing errors.
    # @PARAM: file_path (str) - Path to the XLSX file.
    # @RETURN: Dict[str, str] - Dictionary with the mappings.
    def load_excel_mappings(self, file_path: str) -> Dict[str, str]:
        with belief_scope("Load mappings from Excel"):
            app_logger.info("[load_excel_mappings][Enter] Loading mappings from %s.", file_path)
            try:
                df = pd.read_excel(file_path)
                mappings = df.set_index('column_name')['verbose_name'].to_dict()
                app_logger.info("[load_excel_mappings][Success] Loaded %d mappings.", len(mappings))
                return mappings
            except Exception as e:
                app_logger.error("[load_excel_mappings][Failure] %s", e, exc_info=True)
                raise
    # [/DEF:load_excel_mappings:Function]
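The loader above expects exactly two columns in the sheet: 'column_name' and 'verbose_name'. A minimal sketch of producing and reading such a file; the path is hypothetical, and writing XLSX requires an engine such as openpyxl to be installed:

import pandas as pd
from backend.core.utils.dataset_mapper import DatasetMapper

pd.DataFrame({
    "column_name": ["amount", "created_at"],
    "verbose_name": ["Order amount", "Created at"],
}).to_excel("mappings.xlsx", index=False)

mappings = DatasetMapper().load_excel_mappings("mappings.xlsx")
# mappings == {"amount": "Order amount", "created_at": "Created at"}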

    # [DEF:run_mapping:Function]
    # @PURPOSE: Main entry point that performs the mapping and updates a Superset dataset's verbose_map.
    # @PRE: superset_client must be authenticated.
    # @PRE: dataset_id must be an existing ID in Superset.
    # @POST: If changes are found, the dataset in Superset is updated via the API.
    # @RELATION: CALLS -> self.get_postgres_comments
    # @RELATION: CALLS -> self.load_excel_mappings
    # @RELATION: CALLS -> superset_client.get_dataset
    # @RELATION: CALLS -> superset_client.update_dataset
    # @PARAM: superset_client (Any) - Superset client.
    # @PARAM: dataset_id (int) - ID of the dataset to update.
    # @PARAM: source (str) - Data source ('postgres', 'excel', 'both').
    # @PARAM: postgres_config (Optional[Dict]) - PostgreSQL connection configuration.
    # @PARAM: excel_path (Optional[str]) - Path to the XLSX file.
    # @PARAM: table_name (Optional[str]) - Table name in PostgreSQL.
    # @PARAM: table_schema (Optional[str]) - Table schema in PostgreSQL.
    def run_mapping(self, superset_client: Any, dataset_id: int, source: str, postgres_config: Optional[Dict] = None, excel_path: Optional[str] = None, table_name: Optional[str] = None, table_schema: Optional[str] = None):
        with belief_scope(f"Run dataset mapping for ID {dataset_id}"):
            app_logger.info("[run_mapping][Enter] Starting dataset mapping for ID %d from source '%s'.", dataset_id, source)
            mappings: Dict[str, str] = {}

            try:
                if source in ['postgres', 'both']:
                    assert postgres_config and table_name and table_schema, "Postgres config is required."
                    mappings.update(self.get_postgres_comments(postgres_config, table_name, table_schema))
                if source in ['excel', 'both']:
                    assert excel_path, "Excel path is required."
                    mappings.update(self.load_excel_mappings(excel_path))
                if source not in ['postgres', 'excel', 'both']:
                    app_logger.error("[run_mapping][Failure] Invalid source: %s.", source)
                    return

                dataset_response = superset_client.get_dataset(dataset_id)
                dataset_data = dataset_response['result']

                original_columns = dataset_data.get('columns', [])
                updated_columns = []
                changes_made = False

                for column in original_columns:
                    col_name = column.get('column_name')

                    new_column = {
                        "column_name": col_name,
                        "id": column.get("id"),
                        "advanced_data_type": column.get("advanced_data_type"),
                        "description": column.get("description"),
                        "expression": column.get("expression"),
                        "extra": column.get("extra"),
                        "filterable": column.get("filterable"),
                        "groupby": column.get("groupby"),
                        "is_active": column.get("is_active"),
                        "is_dttm": column.get("is_dttm"),
                        "python_date_format": column.get("python_date_format"),
                        "type": column.get("type"),
                        "uuid": column.get("uuid"),
                        "verbose_name": column.get("verbose_name"),
                    }

                    new_column = {k: v for k, v in new_column.items() if v is not None}

                    if col_name in mappings:
                        mapping_value = mappings[col_name]
                        if isinstance(mapping_value, str) and new_column.get('verbose_name') != mapping_value:
                            new_column['verbose_name'] = mapping_value
                            changes_made = True

                    updated_columns.append(new_column)

                updated_metrics = []
                for metric in dataset_data.get("metrics", []):
                    new_metric = {
                        "id": metric.get("id"),
                        "metric_name": metric.get("metric_name"),
                        "expression": metric.get("expression"),
                        "verbose_name": metric.get("verbose_name"),
                        "description": metric.get("description"),
                        "d3format": metric.get("d3format"),
                        "currency": metric.get("currency"),
                        "extra": metric.get("extra"),
                        "warning_text": metric.get("warning_text"),
                        "metric_type": metric.get("metric_type"),
                        "uuid": metric.get("uuid"),
                    }
                    updated_metrics.append({k: v for k, v in new_metric.items() if v is not None})

                if changes_made:
                    payload_for_update = {
                        "database_id": dataset_data.get("database", {}).get("id"),
                        "table_name": dataset_data.get("table_name"),
                        "schema": dataset_data.get("schema"),
                        "columns": updated_columns,
                        "owners": [owner["id"] for owner in dataset_data.get("owners", [])],
                        "metrics": updated_metrics,
                        "extra": dataset_data.get("extra"),
                        "description": dataset_data.get("description"),
                        "sql": dataset_data.get("sql"),
                        "cache_timeout": dataset_data.get("cache_timeout"),
                        "catalog": dataset_data.get("catalog"),
                        "default_endpoint": dataset_data.get("default_endpoint"),
                        "external_url": dataset_data.get("external_url"),
                        "fetch_values_predicate": dataset_data.get("fetch_values_predicate"),
                        "filter_select_enabled": dataset_data.get("filter_select_enabled"),
                        "is_managed_externally": dataset_data.get("is_managed_externally"),
                        "is_sqllab_view": dataset_data.get("is_sqllab_view"),
                        "main_dttm_col": dataset_data.get("main_dttm_col"),
                        "normalize_columns": dataset_data.get("normalize_columns"),
                        "offset": dataset_data.get("offset"),
                        "template_params": dataset_data.get("template_params"),
                    }

                    payload_for_update = {k: v for k, v in payload_for_update.items() if v is not None}

                    superset_client.update_dataset(dataset_id, payload_for_update)
                    app_logger.info("[run_mapping][Success] Dataset %d columns' verbose_name updated.", dataset_id)
                else:
                    app_logger.info("[run_mapping][State] No changes in columns' verbose_name, skipping update.")

            except Exception as e:
                app_logger.error("[run_mapping][Failure] %s", e, exc_info=True)
                return
    # [/DEF:run_mapping:Function]
# [/DEF:DatasetMapper:Class]

# [/DEF:backend.core.utils.dataset_mapper:Module]
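A minimal end-to-end sketch of run_mapping, assuming `client` is an authenticated object exposing the get_dataset/update_dataset calls used above; the IDs and paths are hypothetical:

from backend.core.utils.dataset_mapper import DatasetMapper

mapper = DatasetMapper()
mapper.run_mapping(
    superset_client=client,     # assumed: any client with get_dataset/update_dataset
    dataset_id=42,              # hypothetical dataset ID
    source="both",
    postgres_config=db_config,  # as in the get_postgres_comments sketch
    excel_path="mappings.xlsx",
    table_name="orders",
    table_schema="public",
)

With source="both", the Excel mappings are applied after the PostgreSQL comments, so spreadsheet values win on key collisions.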

995
superset_tool/utils/fileio.py → backend/src/core/utils/fileio.py
Executable file → Normal file
@@ -1,507 +1,488 @@

# [DEF:superset_tool.utils.fileio:Module]
#
# @SEMANTICS: file, io, zip, yaml, temp, archive, utility
# @PURPOSE: Provides a set of utilities for file operations, including temporary files, ZIP archives, YAML files, and directory cleanup.
# @LAYER: Infra
# @RELATION: DEPENDS_ON -> superset_tool.exceptions
# @RELATION: DEPENDS_ON -> superset_tool.utils.logger
# @RELATION: DEPENDS_ON -> pyyaml
# @PUBLIC_API: create_temp_file, remove_empty_directories, read_dashboard_from_disk, calculate_crc32, RetentionPolicy, archive_exports, save_and_unpack_dashboard, update_yamls, create_dashboard_export, sanitize_filename, get_filename_from_headers, consolidate_archive_folders

# [SECTION: IMPORTS]
import os
import re
import zipfile
from pathlib import Path
from typing import Any, Optional, Tuple, Dict, List, Union, LiteralString, Generator  # note: LiteralString requires Python 3.11+
from contextlib import contextmanager
import tempfile
from datetime import date, datetime
import glob
import shutil
import zlib
from dataclasses import dataclass
import yaml
from superset_tool.exceptions import InvalidZipFormatError
from superset_tool.utils.logger import SupersetLogger
# [/SECTION]

# [DEF:create_temp_file:Function]
# @PURPOSE: Context manager that creates a temporary file or directory with guaranteed removal.
# @PRE: suffix must be a string identifying the resource type.
# @POST: The temporary resource is created and its path is yielded; the resource is removed on context exit.
# @PARAM: content (Optional[bytes]) - Binary content to write to the temporary file.
# @PARAM: suffix (str) - Resource suffix. If `.dir`, a directory is created.
# @PARAM: mode (str) - File write mode (e.g., 'wb').
# @PARAM: dry_run (bool) - If True, the resource is not removed on exit.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
# @YIELDS: Path - Path to the temporary resource.
# @THROW: IOError - On resource creation errors.
@contextmanager
def create_temp_file(content: Optional[bytes] = None, suffix: str = ".zip", mode: str = 'wb', dry_run: bool = False, logger: Optional[SupersetLogger] = None) -> Generator[Path, None, None]:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope("Create temporary resource"):
        resource_path = None
        is_dir = suffix.startswith('.dir')
        try:
            if is_dir:
                with tempfile.TemporaryDirectory(suffix=suffix) as temp_dir:
                    resource_path = Path(temp_dir)
                    logger.debug("[create_temp_file][State] Created temporary directory: %s", resource_path)
                    yield resource_path
            else:
                fd, temp_path_str = tempfile.mkstemp(suffix=suffix)
                resource_path = Path(temp_path_str)
                os.close(fd)
                if content:
                    resource_path.write_bytes(content)
                logger.debug("[create_temp_file][State] Created temporary file: %s", resource_path)
                yield resource_path
        finally:
            if resource_path and resource_path.exists() and not dry_run:
                try:
                    if resource_path.is_dir():
                        shutil.rmtree(resource_path)
                        logger.debug("[create_temp_file][Cleanup] Removed temporary directory: %s", resource_path)
                    else:
                        resource_path.unlink()
                        logger.debug("[create_temp_file][Cleanup] Removed temporary file: %s", resource_path)
                except OSError as e:
                    logger.error("[create_temp_file][Failure] Error during cleanup of %s: %s", resource_path, e)
# [/DEF:create_temp_file:Function]

# [DEF:remove_empty_directories:Function]
# @PURPOSE: Recursively removes all empty subdirectories starting from the given path.
# @PRE: root_dir must be a path to an existing directory.
# @POST: All empty subdirectories are removed and their count is returned.
# @PARAM: root_dir (str) - Path to the root directory to clean.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
# @RETURN: int - Number of removed directories.
def remove_empty_directories(root_dir: str, logger: Optional[SupersetLogger] = None) -> int:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope(f"Remove empty directories in {root_dir}"):
        logger.info("[remove_empty_directories][Enter] Starting cleanup of empty directories in %s", root_dir)
        removed_count = 0
        if not os.path.isdir(root_dir):
            logger.error("[remove_empty_directories][Failure] Directory not found: %s", root_dir)
            return 0
        for current_dir, _, _ in os.walk(root_dir, topdown=False):
            if not os.listdir(current_dir):
                try:
                    os.rmdir(current_dir)
                    removed_count += 1
                    logger.info("[remove_empty_directories][State] Removed empty directory: %s", current_dir)
                except OSError as e:
                    logger.error("[remove_empty_directories][Failure] Failed to remove %s: %s", current_dir, e)
        logger.info("[remove_empty_directories][Exit] Removed %d empty directories.", removed_count)
        return removed_count
# [/DEF:remove_empty_directories:Function]

# [DEF:read_dashboard_from_disk:Function]
# @PURPOSE: Reads a file's binary content from disk.
# @PRE: file_path must point to an existing file.
# @POST: Returns the content bytes and the file name.
# @PARAM: file_path (str) - Path to the file.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
# @RETURN: Tuple[bytes, str] - Tuple of (content, file name).
# @THROW: FileNotFoundError - If the file is not found.
def read_dashboard_from_disk(file_path: str, logger: Optional[SupersetLogger] = None) -> Tuple[bytes, str]:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope(f"Read dashboard from {file_path}"):
        path = Path(file_path)
        assert path.is_file(), f"Dashboard file not found: {file_path}"
        logger.info("[read_dashboard_from_disk][Enter] Reading file: %s", file_path)
        content = path.read_bytes()
        if not content:
            logger.warning("[read_dashboard_from_disk][Warning] File is empty: %s", file_path)
        return content, path.name
# [/DEF:read_dashboard_from_disk:Function]

# [DEF:calculate_crc32:Function]
# @PURPOSE: Computes the CRC32 checksum of a file.
# @PRE: file_path must be a Path object pointing to an existing file.
# @POST: Returns an 8-digit hex CRC32 string.
# @PARAM: file_path (Path) - Path to the file.
# @RETURN: str - 8-digit hexadecimal representation of the CRC32.
# @THROW: IOError - On file read errors.
def calculate_crc32(file_path: Path) -> str:
    logger = SupersetLogger(name="fileio")
    with logger.belief_scope(f"Calculate CRC32 for {file_path}"):
        with open(file_path, 'rb') as f:
            crc32_value = zlib.crc32(f.read())
        return f"{crc32_value:08x}"
# [/DEF:calculate_crc32:Function]

# [SECTION: DATA_CLASSES]
# [DEF:RetentionPolicy:DataClass]
# @PURPOSE: Defines the retention policy for archives (daily, weekly, monthly).
@dataclass
class RetentionPolicy:
    daily: int = 7
    weekly: int = 4
    monthly: int = 12
# [/DEF:RetentionPolicy:DataClass]
# [/SECTION]

# [DEF:archive_exports:Function]
# @PURPOSE: Manages the archive of exported files, applying the retention policy and deduplication.
# @PRE: output_dir must be a path to an existing directory.
# @POST: Old or duplicate archives are removed according to the policy.
# @RELATION: CALLS -> apply_retention_policy
# @RELATION: CALLS -> calculate_crc32
# @PARAM: output_dir (str) - Directory containing the archives.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @PARAM: deduplicate (bool) - Flag enabling removal of duplicates by CRC32.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
def archive_exports(output_dir: str, policy: RetentionPolicy, deduplicate: bool = False, logger: Optional[SupersetLogger] = None) -> None:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope(f"Archive exports in {output_dir}"):
        output_path = Path(output_dir)
        if not output_path.is_dir():
            logger.warning("[archive_exports][Skip] Archive directory not found: %s", output_dir)
            return

        logger.info("[archive_exports][Enter] Managing archive in %s", output_dir)

        # 1. Collect all zip files
        zip_files = list(output_path.glob("*.zip"))
        if not zip_files:
            logger.info("[archive_exports][State] No zip files found in %s", output_dir)
            return

        # 2. Deduplication
        if deduplicate:
            logger.info("[archive_exports][State] Starting deduplication...")
            checksums = {}
            files_to_remove = []

            # Sort by modification time (newest first) to keep the latest version
            zip_files.sort(key=lambda f: f.stat().st_mtime, reverse=True)

            for file_path in zip_files:
                try:
                    crc = calculate_crc32(file_path)
                    if crc in checksums:
                        files_to_remove.append(file_path)
                        logger.debug("[archive_exports][State] Duplicate found: %s (same as %s)", file_path.name, checksums[crc].name)
                    else:
                        checksums[crc] = file_path
                except Exception as e:
                    logger.error("[archive_exports][Failure] Failed to calculate CRC32 for %s: %s", file_path, e)

            for f in files_to_remove:
                try:
                    f.unlink()
                    zip_files.remove(f)
                    logger.info("[archive_exports][State] Removed duplicate: %s", f.name)
                except OSError as e:
                    logger.error("[archive_exports][Failure] Failed to remove duplicate %s: %s", f, e)

        # 3. Retention Policy
        files_with_dates = []
        for file_path in zip_files:
            # Try to extract date from filename
            # Pattern: ..._YYYYMMDD_HHMMSS.zip or ..._YYYYMMDD.zip
            match = re.search(r'_(\d{8})_', file_path.name)
            file_date = None
            if match:
                try:
                    date_str = match.group(1)
                    file_date = datetime.strptime(date_str, "%Y%m%d").date()
                except ValueError:
                    pass

            if not file_date:
                # Fallback to modification time
                file_date = datetime.fromtimestamp(file_path.stat().st_mtime).date()

            files_with_dates.append((file_path, file_date))

        files_to_keep = apply_retention_policy(files_with_dates, policy, logger)

        for file_path, _ in files_with_dates:
            if file_path not in files_to_keep:
                try:
                    file_path.unlink()
                    logger.info("[archive_exports][State] Removed by retention policy: %s", file_path.name)
                except OSError as e:
                    logger.error("[archive_exports][Failure] Failed to remove %s: %s", file_path, e)
# [/DEF:archive_exports:Function]

# [DEF:apply_retention_policy:Function]
# @PURPOSE: (Helper) Applies the retention policy to a list of files, returning those that should be kept.
# @PRE: files_with_dates is a list of (Path, date) tuples.
# @POST: Returns a set of files to keep.
# @PARAM: files_with_dates (List[Tuple[Path, date]]) - List of files with their dates.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @PARAM: logger (SupersetLogger) - Logger.
# @RETURN: set - Set of file paths that must be kept.
def apply_retention_policy(files_with_dates: List[Tuple[Path, date]], policy: RetentionPolicy, logger: SupersetLogger) -> set:
    with logger.belief_scope("Apply retention policy"):
        # Sort by date (newest first)
        sorted_files = sorted(files_with_dates, key=lambda x: x[1], reverse=True)
        # Buckets for files by category
        daily_files = []
        weekly_files = []
        monthly_files = []
        today = date.today()
        for file_path, file_date in sorted_files:
            # Daily
            if (today - file_date).days < policy.daily:
                daily_files.append(file_path)
            # Weekly
            elif (today - file_date).days < policy.weekly * 7:
                weekly_files.append(file_path)
            # Monthly
            elif (today - file_date).days < policy.monthly * 30:
                monthly_files.append(file_path)
        # Return the set of files to keep
        files_to_keep = set()
        files_to_keep.update(daily_files)
        files_to_keep.update(weekly_files[:policy.weekly])
        files_to_keep.update(monthly_files[:policy.monthly])
        logger.debug("[apply_retention_policy][State] Keeping %d files according to retention policy", len(files_to_keep))
        return files_to_keep
# [/DEF:apply_retention_policy:Function]

# [DEF:save_and_unpack_dashboard:Function]
# @PURPOSE: Saves binary ZIP archive content to disk and optionally unpacks it.
# @PRE: zip_content must be the bytes of a valid ZIP archive.
# @POST: The ZIP file is saved and, if unpack=True, extracted into output_dir.
# @PARAM: zip_content (bytes) - ZIP archive content.
# @PARAM: output_dir (Union[str, Path]) - Directory to save into.
# @PARAM: unpack (bool) - Whether to unpack the archive.
# @PARAM: original_filename (Optional[str]) - Original file name to save under.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
# @RETURN: Tuple[Path, Optional[Path]] - Path to the ZIP file and, if applicable, the extraction directory.
# @THROW: InvalidZipFormatError - On ZIP format errors.
def save_and_unpack_dashboard(zip_content: bytes, output_dir: Union[str, Path], unpack: bool = False, original_filename: Optional[str] = None, logger: Optional[SupersetLogger] = None) -> Tuple[Path, Optional[Path]]:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope("Save and unpack dashboard"):
        logger.info("[save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: %s", unpack)
        try:
            output_path = Path(output_dir)
            output_path.mkdir(parents=True, exist_ok=True)
            zip_name = sanitize_filename(original_filename) if original_filename else f"dashboard_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
            zip_path = output_path / zip_name
            zip_path.write_bytes(zip_content)
            logger.info("[save_and_unpack_dashboard][State] Dashboard saved to: %s", zip_path)
            if unpack:
                with zipfile.ZipFile(zip_path, 'r') as zip_ref:
                    zip_ref.extractall(output_path)
                logger.info("[save_and_unpack_dashboard][State] Dashboard unpacked to: %s", output_path)
                return zip_path, output_path
            return zip_path, None
        except zipfile.BadZipFile as e:
            logger.error("[save_and_unpack_dashboard][Failure] Invalid ZIP archive: %s", e)
            raise InvalidZipFormatError(f"Invalid ZIP file: {e}") from e
# [/DEF:save_and_unpack_dashboard:Function]

# [DEF:update_yamls:Function]
# @PURPOSE: Updates configurations in YAML files by replacing values or applying a regex.
# @PRE: path must be an existing directory.
# @POST: All YAML files in the directory are updated according to the given parameters.
# @RELATION: CALLS -> _update_yaml_file
# @THROW: FileNotFoundError - If `path` does not exist.
# @PARAM: db_configs (Optional[List[Dict]]) - List of configurations to substitute.
# @PARAM: path (str) - Path to the directory with YAML files.
# @PARAM: regexp_pattern (Optional[LiteralString]) - Search pattern.
# @PARAM: replace_string (Optional[LiteralString]) - Replacement string.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
def update_yamls(db_configs: Optional[List[Dict[str, Any]]] = None, path: str = "dashboards", regexp_pattern: Optional[LiteralString] = None, replace_string: Optional[LiteralString] = None, logger: Optional[SupersetLogger] = None) -> None:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope("Update YAML configurations"):
        logger.info("[update_yamls][Enter] Starting YAML configuration update.")
        dir_path = Path(path)
        assert dir_path.is_dir(), f"Path {path} does not exist or is not a directory"

        configs: List[Dict[str, Any]] = db_configs or []

        for file_path in dir_path.rglob("*.yaml"):
            _update_yaml_file(file_path, configs, regexp_pattern, replace_string, logger)
# [/DEF:update_yamls:Function]

# [DEF:_update_yaml_file:Function]
# @PURPOSE: (Helper) Updates a single YAML file.
# @PRE: file_path must be a Path object pointing to an existing YAML file.
# @POST: The file is updated according to the given configurations or regular expression.
# @PARAM: file_path (Path) - Path to the file.
# @PARAM: db_configs (List[Dict]) - Configurations.
# @PARAM: regexp_pattern (Optional[str]) - Pattern.
# @PARAM: replace_string (Optional[str]) - Replacement.
# @PARAM: logger (SupersetLogger) - Logger.
def _update_yaml_file(file_path: Path, db_configs: List[Dict[str, Any]], regexp_pattern: Optional[str], replace_string: Optional[str], logger: SupersetLogger) -> None:
    with logger.belief_scope(f"Update YAML file: {file_path}"):
        # Read the file content
        try:
            with open(file_path, 'r', encoding='utf-8') as f:
                content = f.read()
        except Exception as e:
            logger.error("[_update_yaml_file][Failure] Failed to read %s: %s", file_path, e)
            return
        # If a pattern and replace_string are given, apply the regex substitution
        if regexp_pattern and replace_string:
            try:
                new_content = re.sub(regexp_pattern, replace_string, content)
                if new_content != content:
                    with open(file_path, 'w', encoding='utf-8') as f:
                        f.write(new_content)
                    logger.info("[_update_yaml_file][State] Updated %s using regex pattern", file_path)
            except Exception as e:
                logger.error("[_update_yaml_file][Failure] Error applying regex to %s: %s", file_path, e)
        # If configurations are given, replace values (old/new supported)
        if db_configs:
            try:
                # Direct text substitution of old/new values to preserve the file structure
                modified_content = content
                for cfg in db_configs:
                    # Expected structure: {'old': {...}, 'new': {...}}
                    old_cfg = cfg.get('old', {})
                    new_cfg = cfg.get('new', {})
                    for key, old_val in old_cfg.items():
                        if key in new_cfg:
                            new_val = new_cfg[key]
                            # Replace only exact matches of the old value in the YAML text, using the key for context
                            if isinstance(old_val, str):
                                # Look for the pattern: key: "value" or key: value
                                key_pattern = re.escape(key)
                                val_pattern = re.escape(old_val)
                                # Groups: 1=key+separator, 2=opening quote (opt), 3=value, 4=closing quote (opt)
                                pattern = rf'({key_pattern}\s*:\s*)(["\']?)({val_pattern})(["\']?)'

                                # [DEF:replacer:Function]
                                # @PURPOSE: Replacement function that preserves quotes if present.
                                # @PRE: match must be a regex match object.
                                # @POST: Returns a string with the new value, preserving the prefix and quotes.
                                def replacer(match):
                                    with logger.belief_scope("replacer"):
                                        prefix = match.group(1)
                                        quote_open = match.group(2)
                                        quote_close = match.group(4)
                                        return f"{prefix}{quote_open}{new_val}{quote_close}"
                                # [/DEF:replacer:Function]

                                modified_content = re.sub(pattern, replacer, modified_content)
                                logger.info("[_update_yaml_file][State] Replaced '%s' with '%s' for key %s in %s", old_val, new_val, key, file_path)
                # Write back the modified content without parsing the YAML, preserving the original formatting
                with open(file_path, 'w', encoding='utf-8') as f:
                    f.write(modified_content)
            except Exception as e:
                logger.error("[_update_yaml_file][Failure] Error performing raw replacement in %s: %s", file_path, e)
# [/DEF:_update_yaml_file:Function]

# [DEF:create_dashboard_export:Function]
# @PURPOSE: Creates a ZIP archive from the given source paths.
# @PRE: source_paths must contain existing paths.
# @POST: The ZIP archive is created at zip_path.
# @PARAM: zip_path (Union[str, Path]) - Path to save the ZIP archive.
# @PARAM: source_paths (List[Union[str, Path]]) - List of source paths to archive.
# @PARAM: exclude_extensions (Optional[List[str]]) - List of file extensions to exclude.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
# @RETURN: bool - `True` on success, `False` on error.
def create_dashboard_export(zip_path: Union[str, Path], source_paths: List[Union[str, Path]], exclude_extensions: Optional[List[str]] = None, logger: Optional[SupersetLogger] = None) -> bool:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope(f"Create dashboard export: {zip_path}"):
        logger.info("[create_dashboard_export][Enter] Packing dashboard: %s -> %s", source_paths, zip_path)
        try:
            exclude_ext = [ext.lower() for ext in exclude_extensions or []]
            with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
                for src_path_str in source_paths:
                    src_path = Path(src_path_str)
                    assert src_path.exists(), f"Path not found: {src_path}"
                    for item in src_path.rglob('*'):
                        if item.is_file() and item.suffix.lower() not in exclude_ext:
                            arcname = item.relative_to(src_path.parent)
                            zipf.write(item, arcname)
            logger.info("[create_dashboard_export][Exit] Archive created: %s", zip_path)
            return True
        except (IOError, zipfile.BadZipFile, AssertionError) as e:
            logger.error("[create_dashboard_export][Failure] Error: %s", e, exc_info=True)
            return False
# [/DEF:create_dashboard_export:Function]
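A minimal usage sketch with hypothetical paths. Note that arcname is computed relative to src_path.parent, so each source keeps its own top-level folder name inside the archive:

ok = create_dashboard_export(
    zip_path="exports/bundle.zip",
    source_paths=["dashboards/sales"],
    exclude_extensions=[".log", ".tmp"],
)
# ok is True on success; IO/format errors are logged and return False.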

# [DEF:sanitize_filename:Function]
# @PURPOSE: Strips characters that are not allowed in file names.
# @PRE: filename must be a string.
# @POST: Returns the string without special characters.
# @PARAM: filename (str) - Original file name.
# @RETURN: str - Sanitized string.
def sanitize_filename(filename: str) -> str:
    logger = SupersetLogger(name="fileio")
    with logger.belief_scope(f"Sanitize filename: {filename}"):
        return re.sub(r'[\\/*?:"<>|]', "_", filename).strip()
# [/DEF:sanitize_filename:Function]

# [DEF:get_filename_from_headers:Function]
# @PURPOSE: Extracts the file name from the 'Content-Disposition' HTTP header.
# @PRE: headers must be a dictionary of headers.
# @POST: Returns the file name, or None if the header is missing.
# @PARAM: headers (dict) - Dictionary of HTTP headers.
# @RETURN: Optional[str] - File name or `None`.
def get_filename_from_headers(headers: dict) -> Optional[str]:
    logger = SupersetLogger(name="fileio")
    with logger.belief_scope("Get filename from headers"):
        content_disposition = headers.get("Content-Disposition", "")
        if match := re.search(r'filename="?([^"]+)"?', content_disposition):
            return match.group(1).strip()
        return None
# [/DEF:get_filename_from_headers:Function]
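A quick check of the parsing above, with hypothetical header values; quoted and unquoted filenames both match, and a missing header yields None:

headers = {"Content-Disposition": 'attachment; filename="dashboard_export_20251219.zip"'}
assert get_filename_from_headers(headers) == "dashboard_export_20251219.zip"
assert get_filename_from_headers({"Content-Disposition": "attachment; filename=export.zip"}) == "export.zip"
assert get_filename_from_headers({}) is None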

# [DEF:consolidate_archive_folders:Function]
# @PURPOSE: Consolidates archive directories based on a shared slug in the name.
# @PRE: root_directory must be a Path object pointing to an existing directory.
# @POST: Directories with the same prefix are merged into one.
# @THROW: AssertionError - If `root_directory` is invalid.
# @PARAM: root_directory (Path) - Root directory to consolidate.
# @PARAM: logger (Optional[SupersetLogger]) - Logger instance.
def consolidate_archive_folders(root_directory: Path, logger: Optional[SupersetLogger] = None) -> None:
    logger = logger or SupersetLogger(name="fileio")
    with logger.belief_scope(f"Consolidate archives in {root_directory}"):
        assert isinstance(root_directory, Path), "root_directory must be a Path object."
        assert root_directory.is_dir(), "root_directory must be an existing directory."

        logger.info("[consolidate_archive_folders][Enter] Consolidating archives in %s", root_directory)
        # Collect all directories containing archives
        archive_dirs = []
        for item in root_directory.iterdir():
            if item.is_dir():
                # Check whether the directory contains ZIP archives
                if any(item.glob("*.zip")):
                    archive_dirs.append(item)
        # Group by slug (the part of the name before the first '_')
        slug_groups = {}
        for dir_path in archive_dirs:
            dir_name = dir_path.name
            slug = dir_name.split('_')[0] if '_' in dir_name else dir_name
            if slug not in slug_groups:
                slug_groups[slug] = []
            slug_groups[slug].append(dir_path)
        # Consolidate each group
        for slug, dirs in slug_groups.items():
            if len(dirs) <= 1:
                continue
            # Create the target directory
            target_dir = root_directory / slug
            target_dir.mkdir(exist_ok=True)
            logger.info("[consolidate_archive_folders][State] Consolidating %d directories under %s", len(dirs), target_dir)
            # Move the contents; shutil.move handles both files and directories
            for source_dir in dirs:
                if source_dir == target_dir:
                    continue
                for item in source_dir.iterdir():
                    dest_item = target_dir / item.name
                    try:
                        shutil.move(str(item), str(dest_item))
                    except Exception as e:
                        logger.error("[consolidate_archive_folders][Failure] Failed to move %s to %s: %s", item, dest_item, e)
                # Remove the source directory
                try:
                    source_dir.rmdir()
                    logger.info("[consolidate_archive_folders][State] Removed source directory: %s", source_dir)
                except Exception as e:
                    logger.error("[consolidate_archive_folders][Failure] Failed to remove source directory %s: %s", source_dir, e)
# [/DEF:consolidate_archive_folders:Function]
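A sketch of the grouping rule alone, with hypothetical directory names; everything before the first '_' is treated as the slug:

names = ["sales_20251201", "sales_20251208", "finance"]
slugs = {}
for name in names:
    slug = name.split("_")[0] if "_" in name else name
    slugs.setdefault(slug, []).append(name)
# slugs == {"sales": ["sales_20251201", "sales_20251208"], "finance": ["finance"]}
# The two 'sales_*' folders would be merged into a single 'sales' directory.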

# [/DEF:superset_tool.utils.fileio:Module]

# [DEF:backend.core.utils.fileio:Module]
#
# @SEMANTICS: file, io, zip, yaml, temp, archive, utility
# @PURPOSE: Provides a set of utilities for file operations, including temporary files, ZIP archives, YAML files, and directory cleanup.
# @LAYER: Infra
# @RELATION: DEPENDS_ON -> backend.src.core.logger
# @RELATION: DEPENDS_ON -> pyyaml
# @PUBLIC_API: create_temp_file, remove_empty_directories, read_dashboard_from_disk, calculate_crc32, RetentionPolicy, archive_exports, save_and_unpack_dashboard, update_yamls, create_dashboard_export, sanitize_filename, get_filename_from_headers, consolidate_archive_folders

# [SECTION: IMPORTS]
import os
import re
import zipfile
from pathlib import Path
from typing import Any, Optional, Tuple, Dict, List, Union, LiteralString, Generator  # note: LiteralString requires Python 3.11+
from contextlib import contextmanager
import tempfile
from datetime import date, datetime
import shutil
import zlib
from dataclasses import dataclass
import yaml
from ..logger import logger as app_logger, belief_scope
# [/SECTION]

# [DEF:InvalidZipFormatError:Class]
# @PURPOSE: Exception raised when a file is not a valid ZIP archive.
class InvalidZipFormatError(Exception):
    pass
# [/DEF:InvalidZipFormatError:Class]

# [DEF:create_temp_file:Function]
# @PURPOSE: Context manager that creates a temporary file or directory with guaranteed removal.
# @PRE: suffix must be a string identifying the resource type.
# @POST: The temporary resource is created and its path is yielded; the resource is removed on context exit.
# @PARAM: content (Optional[bytes]) - Binary content to write to the temporary file.
# @PARAM: suffix (str) - Resource suffix. If `.dir`, a directory is created.
# @PARAM: mode (str) - File write mode (e.g., 'wb').
# @PARAM: dry_run (bool) - If True, the resource is not removed on exit.
# @YIELDS: Path - Path to the temporary resource.
# @THROW: IOError - On resource creation errors.
@contextmanager
def create_temp_file(content: Optional[bytes] = None, suffix: str = ".zip", mode: str = 'wb', dry_run: bool = False) -> Generator[Path, None, None]:
    with belief_scope("Create temporary resource"):
        resource_path = None
        is_dir = suffix.startswith('.dir')
        try:
            if is_dir:
                with tempfile.TemporaryDirectory(suffix=suffix) as temp_dir:
                    resource_path = Path(temp_dir)
                    app_logger.debug("[create_temp_file][State] Created temporary directory: %s", resource_path)
                    yield resource_path
            else:
                fd, temp_path_str = tempfile.mkstemp(suffix=suffix)
                resource_path = Path(temp_path_str)
                os.close(fd)
                if content:
                    resource_path.write_bytes(content)
                app_logger.debug("[create_temp_file][State] Created temporary file: %s", resource_path)
                yield resource_path
        finally:
            if resource_path and resource_path.exists() and not dry_run:
                try:
                    if resource_path.is_dir():
                        shutil.rmtree(resource_path)
                        app_logger.debug("[create_temp_file][Cleanup] Removed temporary directory: %s", resource_path)
                    else:
                        resource_path.unlink()
                        app_logger.debug("[create_temp_file][Cleanup] Removed temporary file: %s", resource_path)
                except OSError as e:
                    app_logger.error("[create_temp_file][Failure] Error during cleanup of %s: %s", resource_path, e)
# [/DEF:create_temp_file:Function]
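A minimal usage sketch of the context manager above. A suffix starting with '.dir' switches it to directory mode; both variants are cleaned up on exit unless dry_run=True:

with create_temp_file(content=b"PK\x05\x06", suffix=".zip") as tmp_zip:
    assert tmp_zip.read_bytes().startswith(b"PK")

with create_temp_file(suffix=".dir") as tmp_dir:
    (tmp_dir / "export.yaml").write_text("version: 1.0.0\n")
# Both paths are gone at this point: the file was unlinked, the directory removed.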

# [DEF:remove_empty_directories:Function]
# @PURPOSE: Recursively removes all empty subdirectories starting from the given path.
# @PRE: root_dir must be a path to an existing directory.
# @POST: All empty subdirectories are removed and their count is returned.
# @PARAM: root_dir (str) - Path to the root directory to clean.
# @RETURN: int - Number of removed directories.
def remove_empty_directories(root_dir: str) -> int:
    with belief_scope(f"Remove empty directories in {root_dir}"):
        app_logger.info("[remove_empty_directories][Enter] Starting cleanup of empty directories in %s", root_dir)
        removed_count = 0
        if not os.path.isdir(root_dir):
            app_logger.error("[remove_empty_directories][Failure] Directory not found: %s", root_dir)
            return 0
        for current_dir, _, _ in os.walk(root_dir, topdown=False):
            if not os.listdir(current_dir):
                try:
                    os.rmdir(current_dir)
                    removed_count += 1
                    app_logger.info("[remove_empty_directories][State] Removed empty directory: %s", current_dir)
                except OSError as e:
                    app_logger.error("[remove_empty_directories][Failure] Failed to remove %s: %s", current_dir, e)
        app_logger.info("[remove_empty_directories][Exit] Removed %d empty directories.", removed_count)
        return removed_count
# [/DEF:remove_empty_directories:Function]

# [DEF:read_dashboard_from_disk:Function]
# @PURPOSE: Reads a file's binary content from disk.
# @PRE: file_path must point to an existing file.
# @POST: Returns the content bytes and the file name.
# @PARAM: file_path (str) - Path to the file.
# @RETURN: Tuple[bytes, str] - Tuple of (content, file name).
# @THROW: FileNotFoundError - If the file is not found.
def read_dashboard_from_disk(file_path: str) -> Tuple[bytes, str]:
    with belief_scope(f"Read dashboard from {file_path}"):
        path = Path(file_path)
        assert path.is_file(), f"Dashboard file not found: {file_path}"
        app_logger.info("[read_dashboard_from_disk][Enter] Reading file: %s", file_path)
        content = path.read_bytes()
        if not content:
            app_logger.warning("[read_dashboard_from_disk][Warning] File is empty: %s", file_path)
        return content, path.name
# [/DEF:read_dashboard_from_disk:Function]

# [DEF:calculate_crc32:Function]
# @PURPOSE: Computes the CRC32 checksum of a file.
# @PRE: file_path must be a Path object pointing to an existing file.
# @POST: Returns an 8-digit hex CRC32 string.
# @PARAM: file_path (Path) - Path to the file.
# @RETURN: str - 8-digit hexadecimal representation of the CRC32.
# @THROW: IOError - On file read errors.
def calculate_crc32(file_path: Path) -> str:
    with belief_scope(f"Calculate CRC32 for {file_path}"):
        with open(file_path, 'rb') as f:
            crc32_value = zlib.crc32(f.read())
        return f"{crc32_value:08x}"
# [/DEF:calculate_crc32:Function]

# [SECTION: DATA_CLASSES]
# [DEF:RetentionPolicy:DataClass]
# @PURPOSE: Defines the retention policy for archives (daily, weekly, monthly).
@dataclass
class RetentionPolicy:
    daily: int = 7
    weekly: int = 4
    monthly: int = 12
# [/DEF:RetentionPolicy:DataClass]
# [/SECTION]

# [DEF:archive_exports:Function]
# @PURPOSE: Manages the archive of exported files, applying the retention policy and deduplication.
# @PRE: output_dir must be a path to an existing directory.
# @POST: Old or duplicate archives are removed according to the policy.
# @RELATION: CALLS -> apply_retention_policy
# @RELATION: CALLS -> calculate_crc32
# @PARAM: output_dir (str) - Directory containing the archives.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @PARAM: deduplicate (bool) - Flag enabling removal of duplicates by CRC32.
def archive_exports(output_dir: str, policy: RetentionPolicy, deduplicate: bool = False) -> None:
    with belief_scope(f"Archive exports in {output_dir}"):
        output_path = Path(output_dir)
        if not output_path.is_dir():
            app_logger.warning("[archive_exports][Skip] Archive directory not found: %s", output_dir)
            return

        app_logger.info("[archive_exports][Enter] Managing archive in %s", output_dir)

        # 1. Collect all zip files
        zip_files = list(output_path.glob("*.zip"))
        if not zip_files:
            app_logger.info("[archive_exports][State] No zip files found in %s", output_dir)
            return

        # 2. Deduplication
        if deduplicate:
            app_logger.info("[archive_exports][State] Starting deduplication...")
            checksums = {}
            files_to_remove = []

            # Sort by modification time (newest first) to keep the latest version
            zip_files.sort(key=lambda f: f.stat().st_mtime, reverse=True)

            for file_path in zip_files:
                try:
                    crc = calculate_crc32(file_path)
                    if crc in checksums:
                        files_to_remove.append(file_path)
                        app_logger.debug("[archive_exports][State] Duplicate found: %s (same as %s)", file_path.name, checksums[crc].name)
                    else:
                        checksums[crc] = file_path
                except Exception as e:
                    app_logger.error("[archive_exports][Failure] Failed to calculate CRC32 for %s: %s", file_path, e)

            for f in files_to_remove:
                try:
                    f.unlink()
                    zip_files.remove(f)
                    app_logger.info("[archive_exports][State] Removed duplicate: %s", f.name)
                except OSError as e:
                    app_logger.error("[archive_exports][Failure] Failed to remove duplicate %s: %s", f, e)

        # 3. Retention Policy
        files_with_dates = []
        for file_path in zip_files:
            # Try to extract date from filename
            # Pattern: ..._YYYYMMDD_HHMMSS.zip or ..._YYYYMMDD.zip
            match = re.search(r'_(\d{8})_', file_path.name)
            file_date = None
            if match:
                try:
                    date_str = match.group(1)
                    file_date = datetime.strptime(date_str, "%Y%m%d").date()
                except ValueError:
                    pass

            if not file_date:
                # Fallback to modification time
                file_date = datetime.fromtimestamp(file_path.stat().st_mtime).date()

            files_with_dates.append((file_path, file_date))

        files_to_keep = apply_retention_policy(files_with_dates, policy)

        for file_path, _ in files_with_dates:
            if file_path not in files_to_keep:
                try:
                    file_path.unlink()
                    app_logger.info("[archive_exports][State] Removed by retention policy: %s", file_path.name)
                except OSError as e:
                    app_logger.error("[archive_exports][Failure] Failed to remove %s: %s", file_path, e)
# [/DEF:archive_exports:Function]

# [DEF:apply_retention_policy:Function]
# @PURPOSE: (Helper) Applies the retention policy to a list of files, returning those that should be kept.
# @PRE: files_with_dates is a list of (Path, date) tuples.
# @POST: Returns a set of files to keep.
# @PARAM: files_with_dates (List[Tuple[Path, date]]) - List of files with their dates.
# @PARAM: policy (RetentionPolicy) - Retention policy.
# @RETURN: set - Set of file paths that must be kept.
def apply_retention_policy(files_with_dates: List[Tuple[Path, date]], policy: RetentionPolicy) -> set:
    with belief_scope("Apply retention policy"):
        # Sort by date (newest first)
        sorted_files = sorted(files_with_dates, key=lambda x: x[1], reverse=True)
        # Buckets for files by category
        daily_files = []
        weekly_files = []
        monthly_files = []
        today = date.today()
        for file_path, file_date in sorted_files:
            # Daily
            if (today - file_date).days < policy.daily:
                daily_files.append(file_path)
            # Weekly
            elif (today - file_date).days < policy.weekly * 7:
                weekly_files.append(file_path)
            # Monthly
            elif (today - file_date).days < policy.monthly * 30:
                monthly_files.append(file_path)
        # Return the set of files to keep
        files_to_keep = set()
        files_to_keep.update(daily_files)
        files_to_keep.update(weekly_files[:policy.weekly])
        files_to_keep.update(monthly_files[:policy.monthly])
        app_logger.debug("[apply_retention_policy][State] Keeping %d files according to retention policy", len(files_to_keep))
        return files_to_keep
# [/DEF:apply_retention_policy:Function]
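A worked example of the bucketing above, assuming today is 2025-12-19 and the default policy (daily=7, weekly=4, monthly=12). Ages resolve as: under 7 days goes to the daily bucket (all kept); 7-27 days to the weekly bucket (newest 4 kept); 28-359 days to the monthly bucket (newest 12 kept); anything older falls into no bucket and is dropped:

from datetime import date
from pathlib import Path

files = [
    (Path("a.zip"), date(2025, 12, 18)),  # 1 day old   -> daily bucket
    (Path("b.zip"), date(2025, 12, 1)),   # 18 days old -> weekly bucket
    (Path("c.zip"), date(2024, 11, 1)),   # >360 days   -> dropped
]
keep = apply_retention_policy(files, RetentionPolicy())
# On 2025-12-19 this keeps {a.zip, b.zip} and drops c.zip.

Note that the weekly and monthly buckets keep the newest N entries overall, not one archive per calendar week or month.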

# [DEF:save_and_unpack_dashboard:Function]
# @PURPOSE: Saves the binary content of a ZIP archive to disk and optionally unpacks it.
# @PRE: zip_content must be the bytes of a valid ZIP archive.
# @POST: The ZIP file is saved and, if unpack=True, extracted into output_dir.
# @PARAM: zip_content (bytes) - Content of the ZIP archive.
# @PARAM: output_dir (Union[str, Path]) - Directory to save into.
# @PARAM: unpack (bool) - Whether to unpack the archive.
# @PARAM: original_filename (Optional[str]) - Original file name to save under.
# @RETURN: Tuple[Path, Optional[Path]] - Path to the ZIP file and, if applicable, path to the unpacked directory.
# @THROW: InvalidZipFormatError - On a malformed ZIP archive.
def save_and_unpack_dashboard(zip_content: bytes, output_dir: Union[str, Path], unpack: bool = False, original_filename: Optional[str] = None) -> Tuple[Path, Optional[Path]]:
    with belief_scope("Save and unpack dashboard"):
        app_logger.info("[save_and_unpack_dashboard][Enter] Processing dashboard. Unpack: %s", unpack)
        try:
            output_path = Path(output_dir)
            output_path.mkdir(parents=True, exist_ok=True)
            zip_name = sanitize_filename(original_filename) if original_filename else f"dashboard_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
            zip_path = output_path / zip_name
            zip_path.write_bytes(zip_content)
            app_logger.info("[save_and_unpack_dashboard][State] Dashboard saved to: %s", zip_path)
            if unpack:
                with zipfile.ZipFile(zip_path, 'r') as zip_ref:
                    zip_ref.extractall(output_path)
                app_logger.info("[save_and_unpack_dashboard][State] Dashboard unpacked to: %s", output_path)
                return zip_path, output_path
            return zip_path, None
        except zipfile.BadZipFile as e:
            app_logger.error("[save_and_unpack_dashboard][Failure] Invalid ZIP archive: %s", e)
            raise InvalidZipFormatError(f"Invalid ZIP file: {e}") from e
# [/DEF:save_and_unpack_dashboard:Function]
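A caller-side sketch; the paths are illustrative:

zip_bytes = Path("incoming/dashboard.zip").read_bytes()  # stand-in for an API response body
zip_path, unpacked_dir = save_and_unpack_dashboard(zip_bytes, "dashboards/acme", unpack=True)
# zip_path points at dashboards/acme/<name>.zip; unpacked_dir is the same
# output directory, or None when unpack=False.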

# [DEF:update_yamls:Function]
# @PURPOSE: Updates configuration values in YAML files, replacing values or applying a regex.
# @PRE: path must be an existing directory.
# @POST: All YAML files in the directory are updated according to the given parameters.
# @RELATION: CALLS -> _update_yaml_file
# @THROW: AssertionError - If `path` is not an existing directory.
# @PARAM: db_configs (Optional[List[Dict]]) - List of configurations to substitute.
# @PARAM: path (str) - Path to the directory with YAML files.
# @PARAM: regexp_pattern (Optional[LiteralString]) - Search pattern.
# @PARAM: replace_string (Optional[LiteralString]) - Replacement string.
def update_yamls(db_configs: Optional[List[Dict[str, Any]]] = None, path: str = "dashboards", regexp_pattern: Optional[LiteralString] = None, replace_string: Optional[LiteralString] = None) -> None:
    with belief_scope("Update YAML configurations"):
        app_logger.info("[update_yamls][Enter] Starting YAML configuration update.")
        dir_path = Path(path)
        assert dir_path.is_dir(), f"Path {path} does not exist or is not a directory"

        configs: List[Dict[str, Any]] = db_configs or []

        for file_path in dir_path.rglob("*.yaml"):
            _update_yaml_file(file_path, configs, regexp_pattern, replace_string)
# [/DEF:update_yamls:Function]

# [DEF:_update_yaml_file:Function]
# @PURPOSE: (Helper) Updates a single YAML file.
# @PRE: file_path must be a Path object pointing to an existing YAML file.
# @POST: The file is updated according to the given configurations or regular expression.
# @PARAM: file_path (Path) - Path to the file.
# @PARAM: db_configs (List[Dict]) - Configurations.
# @PARAM: regexp_pattern (Optional[str]) - Pattern.
# @PARAM: replace_string (Optional[str]) - Replacement.
def _update_yaml_file(file_path: Path, db_configs: List[Dict[str, Any]], regexp_pattern: Optional[str], replace_string: Optional[str]) -> None:
    with belief_scope(f"Update YAML file: {file_path}"):
        # Read the file content
        try:
            with open(file_path, 'r', encoding='utf-8') as f:
                content = f.read()
        except Exception as e:
            app_logger.error("[_update_yaml_file][Failure] Failed to read %s: %s", file_path, e)
            return

        # If a pattern and replacement string are given, apply the regex substitution
        if regexp_pattern and replace_string:
            try:
                new_content = re.sub(regexp_pattern, replace_string, content)
                if new_content != content:
                    with open(file_path, 'w', encoding='utf-8') as f:
                        f.write(new_content)
                    app_logger.info("[_update_yaml_file][State] Updated %s using regex pattern", file_path)
                    # Carry the regex result forward so the config substitutions
                    # below do not silently undo it.
                    content = new_content
            except Exception as e:
                app_logger.error("[_update_yaml_file][Failure] Error applying regex to %s: %s", file_path, e)

        # If configurations are given, substitute values (old/new support)
        if db_configs:
            try:
                # Plain-text substitution of old/new values to preserve the file structure
                modified_content = content
                for cfg in db_configs:
                    # Expected structure: {'old': {...}, 'new': {...}}
                    old_cfg = cfg.get('old', {})
                    new_cfg = cfg.get('new', {})
                    for key, old_val in old_cfg.items():
                        if key in new_cfg:
                            new_val = new_cfg[key]
                            # Replace only exact matches of the old value in the YAML text, using the key for context
                            if isinstance(old_val, str):
                                # Match the pattern: key: "value" or key: value
                                key_pattern = re.escape(key)
                                val_pattern = re.escape(old_val)
                                # Groups: 1=key+separator, 2=opening quote (opt.), 3=value, 4=closing quote (opt.)
                                pattern = rf'({key_pattern}\s*:\s*)(["\']?)({val_pattern})(["\']?)'

                                # [DEF:replacer:Function]
                                # @PURPOSE: Replacement callback that preserves quotes if they were present.
                                # @PRE: match must be a regex match object.
                                # @POST: Returns the string with the new value, keeping the prefix and quotes.
                                def replacer(match):
                                    prefix = match.group(1)
                                    quote_open = match.group(2)
                                    quote_close = match.group(4)
                                    return f"{prefix}{quote_open}{new_val}{quote_close}"
                                # [/DEF:replacer:Function]

                                modified_content = re.sub(pattern, replacer, modified_content)
                                app_logger.info("[_update_yaml_file][State] Replaced '%s' with '%s' for key %s in %s", old_val, new_val, key, file_path)
                # Write the modified content back without parsing the YAML, preserving the original formatting
                with open(file_path, 'w', encoding='utf-8') as f:
                    f.write(modified_content)
            except Exception as e:
                app_logger.error("[_update_yaml_file][Failure] Error performing raw replacement in %s: %s", file_path, e)
# [/DEF:_update_yaml_file:Function]
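An input sketch for the old/new structure expected above; the keys and URIs are illustrative:

update_yamls(
    db_configs=[{
        "old": {"database_name": "analytics_dev", "sqlalchemy_uri": "postgresql://dev-host/analytics"},
        "new": {"database_name": "analytics_prod", "sqlalchemy_uri": "postgresql://prod-host/analytics"},
    }],
    path="dashboards",
)
# Every occurrence of `database_name: analytics_dev` (quoted or bare) becomes
# `database_name: analytics_prod`; quoting and surrounding formatting survive
# because the substitution is plain text, not a YAML round-trip.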

# [DEF:create_dashboard_export:Function]
# @PURPOSE: Creates a ZIP archive from the given source paths.
# @PRE: source_paths must contain existing paths.
# @POST: A ZIP archive is created at zip_path.
# @PARAM: zip_path (Union[str, Path]) - Path to save the ZIP archive to.
# @PARAM: source_paths (List[Union[str, Path]]) - List of source paths to archive.
# @PARAM: exclude_extensions (Optional[List[str]]) - List of file extensions to exclude.
# @RETURN: bool - `True` on success, `False` on error.
def create_dashboard_export(zip_path: Union[str, Path], source_paths: List[Union[str, Path]], exclude_extensions: Optional[List[str]] = None) -> bool:
    with belief_scope(f"Create dashboard export: {zip_path}"):
        app_logger.info("[create_dashboard_export][Enter] Packing dashboard: %s -> %s", source_paths, zip_path)
        try:
            exclude_ext = [ext.lower() for ext in exclude_extensions or []]
            with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
                for src_path_str in source_paths:
                    src_path = Path(src_path_str)
                    assert src_path.exists(), f"Path not found: {src_path}"
                    for item in src_path.rglob('*'):
                        if item.is_file() and item.suffix.lower() not in exclude_ext:
                            arcname = item.relative_to(src_path.parent)
                            zipf.write(item, arcname)
            app_logger.info("[create_dashboard_export][Exit] Archive created: %s", zip_path)
            return True
        except (IOError, zipfile.BadZipFile, AssertionError) as e:
            app_logger.error("[create_dashboard_export][Failure] Error: %s", e, exc_info=True)
            return False
# [/DEF:create_dashboard_export:Function]
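A repack sketch with illustrative paths:

ok = create_dashboard_export(
    "exports/acme_repacked.zip",
    ["dashboards/acme"],
    exclude_extensions=[".bak", ".tmp"],  # compared against Path.suffix, so keep the leading dot
)
if not ok:
    raise RuntimeError("packing failed; see [create_dashboard_export][Failure] in the log")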

# [DEF:sanitize_filename:Function]
# @PURPOSE: Strips characters that are not allowed in file names.
# @PRE: filename must be a string.
# @POST: Returns the string without special characters.
# @PARAM: filename (str) - Original file name.
# @RETURN: str - Sanitized string.
def sanitize_filename(filename: str) -> str:
    with belief_scope(f"Sanitize filename: {filename}"):
        return re.sub(r'[\\/*?:"<>|]', "_", filename).strip()
# [/DEF:sanitize_filename:Function]

# [DEF:get_filename_from_headers:Function]
# @PURPOSE: Extracts the file name from the 'Content-Disposition' HTTP header.
# @PRE: headers must be a dictionary of headers.
# @POST: Returns the file name, or None if the header is missing.
# @PARAM: headers (dict) - Dictionary of HTTP headers.
# @RETURN: Optional[str] - File name or `None`.
def get_filename_from_headers(headers: dict) -> Optional[str]:
    with belief_scope("Get filename from headers"):
        content_disposition = headers.get("Content-Disposition", "")
        if match := re.search(r'filename="?([^"]+)"?', content_disposition):
            return match.group(1).strip()
        return None
# [/DEF:get_filename_from_headers:Function]
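For reference, a typical export response carries a header like the one below (the value is illustrative):

headers = {"Content-Disposition": 'attachment; filename="dashboard_export_20251219.zip"'}
assert get_filename_from_headers(headers) == "dashboard_export_20251219.zip"
assert get_filename_from_headers({}) is None  # caller then falls back to a generated name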

# [DEF:consolidate_archive_folders:Function]
# @PURPOSE: Consolidates archive directories based on a shared slug in their names.
# @PRE: root_directory must be a Path object pointing to an existing directory.
# @POST: Directories sharing the same prefix are merged into one.
# @THROW: AssertionError - If `root_directory` is invalid.
# @PARAM: root_directory (Path) - Root directory to consolidate.
def consolidate_archive_folders(root_directory: Path) -> None:
    with belief_scope(f"Consolidate archives in {root_directory}"):
        assert isinstance(root_directory, Path), "root_directory must be a Path object."
        assert root_directory.is_dir(), "root_directory must be an existing directory."

        app_logger.info("[consolidate_archive_folders][Enter] Consolidating archives in %s", root_directory)
        # Collect all directories that contain archives
        archive_dirs = []
        for item in root_directory.iterdir():
            if item.is_dir():
                # Check whether the directory contains ZIP archives
                if any(item.glob("*.zip")):
                    archive_dirs.append(item)
        # Group by slug (the part of the name before the first '_')
        slug_groups = {}
        for dir_path in archive_dirs:
            dir_name = dir_path.name
            slug = dir_name.split('_')[0] if '_' in dir_name else dir_name
            if slug not in slug_groups:
                slug_groups[slug] = []
            slug_groups[slug].append(dir_path)
        # Consolidate each group
        for slug, dirs in slug_groups.items():
            if len(dirs) <= 1:
                continue
            # Create the target directory
            target_dir = root_directory / slug
            target_dir.mkdir(exist_ok=True)
            app_logger.info("[consolidate_archive_folders][State] Consolidating %d directories under %s", len(dirs), target_dir)
            # Move the contents
            for source_dir in dirs:
                if source_dir == target_dir:
                    continue
                for item in source_dir.iterdir():
                    dest_item = target_dir / item.name
                    try:
                        # shutil.move handles files and directories alike
                        shutil.move(str(item), str(dest_item))
                    except Exception as e:
                        app_logger.error("[consolidate_archive_folders][Failure] Failed to move %s to %s: %s", item, dest_item, e)
                # Remove the now-empty source directory
                try:
                    source_dir.rmdir()
                    app_logger.info("[consolidate_archive_folders][State] Removed source directory: %s", source_dir)
                except Exception as e:
                    app_logger.error("[consolidate_archive_folders][Failure] Failed to remove source directory %s: %s", source_dir, e)
# [/DEF:consolidate_archive_folders:Function]
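A before/after sketch with illustrative directory names:

# archives/sales_2025-01/ and archives/sales_2025-02/ -> merged into archives/sales/
# archives/ops_2025-01/ keeps its name (a one-member group is skipped)
consolidate_archive_folders(Path("archives"))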

# [/DEF:backend.core.utils.fileio:Module]
591
superset_tool/utils/network.py → backend/src/core/utils/network.py
Executable file → Normal file
@@ -1,265 +1,326 @@
# [DEF:superset_tool.utils.network:Module]
#
# @SEMANTICS: network, http, client, api, requests, session, authentication
# @PURPOSE: Encapsulates the low-level HTTP logic for talking to the Superset API, including authentication, session management, retry logic and error handling.
# @LAYER: Infra
# @RELATION: DEPENDS_ON -> superset_tool.exceptions
# @RELATION: DEPENDS_ON -> superset_tool.utils.logger
# @RELATION: DEPENDS_ON -> requests
# @PUBLIC_API: APIClient

# [SECTION: IMPORTS]
from typing import Optional, Dict, Any, List, Union, cast
import json
import io
from pathlib import Path
import requests
from requests.adapters import HTTPAdapter
import urllib3
from superset_tool.utils.logger import belief_scope
from urllib3.util.retry import Retry
from superset_tool.exceptions import AuthenticationError, NetworkError, DashboardNotFoundError, SupersetAPIError, PermissionDeniedError
from superset_tool.utils.logger import SupersetLogger
# [/SECTION]

# [DEF:APIClient:Class]
# @PURPOSE: Encapsulates the HTTP logic for working with the API, including sessions, authentication and request handling.
class APIClient:
    DEFAULT_TIMEOUT = 30

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the API client with its configuration, session and logger.
    # @PARAM: config (Dict[str, Any]) - Configuration.
    # @PARAM: verify_ssl (bool) - Whether to verify SSL certificates.
    # @PARAM: timeout (int) - Request timeout.
    # @PARAM: logger (Optional[SupersetLogger]) - Logger.
    # @PRE: config must contain 'base_url' and 'auth'.
    # @POST: APIClient instance is initialized with a session.
    def __init__(self, config: Dict[str, Any], verify_ssl: bool = True, timeout: int = DEFAULT_TIMEOUT, logger: Optional[SupersetLogger] = None):
        with belief_scope("__init__"):
            self.logger = logger or SupersetLogger(name="APIClient")
            self.logger.info("[APIClient.__init__][Entry] Initializing APIClient.")
            self.base_url: str = config.get("base_url", "")
            self.auth = config.get("auth")
            self.request_settings = {"verify_ssl": verify_ssl, "timeout": timeout}
            self.session = self._init_session()
            self._tokens: Dict[str, str] = {}
            self._authenticated = False
            self.logger.info("[APIClient.__init__][Exit] APIClient initialized.")
    # [/DEF:__init__:Function]

    # [DEF:_init_session:Function]
    # @PURPOSE: Creates and configures a `requests.Session` with retry logic.
    # @PRE: self.request_settings must be initialized.
    # @POST: Returns a configured requests.Session instance.
    # @RETURN: requests.Session - The configured session.
    def _init_session(self) -> requests.Session:
        with belief_scope("_init_session"):
            session = requests.Session()
            retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
            adapter = HTTPAdapter(max_retries=retries)
            session.mount('http://', adapter)
            session.mount('https://', adapter)
            if not self.request_settings["verify_ssl"]:
                urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
                self.logger.warning("[_init_session][State] SSL verification disabled.")
            session.verify = self.request_settings["verify_ssl"]
            return session
    # [/DEF:_init_session:Function]

    # [DEF:authenticate:Function]
    # @PURPOSE: Authenticates against the Superset API and obtains the access and CSRF tokens.
    # @PRE: self.auth and self.base_url must be valid.
    # @POST: `self._tokens` is populated, `self._authenticated` is set to `True`.
    # @RETURN: Dict[str, str] - Dictionary with the tokens.
    # @THROW: AuthenticationError, NetworkError - on failure.
    def authenticate(self) -> Dict[str, str]:
        with belief_scope("authenticate"):
            self.logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
            try:
                login_url = f"{self.base_url}/security/login"
                response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
                response.raise_for_status()
                access_token = response.json()["access_token"]

                csrf_url = f"{self.base_url}/security/csrf_token/"
                csrf_response = self.session.get(csrf_url, headers={"Authorization": f"Bearer {access_token}"}, timeout=self.request_settings["timeout"])
                csrf_response.raise_for_status()

                self._tokens = {"access_token": access_token, "csrf_token": csrf_response.json()["result"]}
                self._authenticated = True
                self.logger.info("[authenticate][Exit] Authenticated successfully.")
                return self._tokens
            except requests.exceptions.HTTPError as e:
                raise AuthenticationError(f"Authentication failed: {e}") from e
            except (requests.exceptions.RequestException, KeyError) as e:
                raise NetworkError(f"Network or parsing error during authentication: {e}") from e
    # [/DEF:authenticate:Function]

    @property
    # [DEF:headers:Function]
    # @PURPOSE: Returns the HTTP headers for authenticated requests.
    # @PRE: APIClient is initialized and authenticated or can be authenticated.
    # @POST: Returns headers including auth tokens.
    def headers(self) -> Dict[str, str]:
        with belief_scope("headers"):
            if not self._authenticated: self.authenticate()
            return {
                "Authorization": f"Bearer {self._tokens['access_token']}",
                "X-CSRFToken": self._tokens.get("csrf_token", ""),
                "Referer": self.base_url,
                "Content-Type": "application/json"
            }
    # [/DEF:headers:Function]

    # [DEF:request:Function]
    # @PURPOSE: Performs a generic HTTP request against the API.
    # @PARAM: method (str) - HTTP method.
    # @PARAM: endpoint (str) - API endpoint.
    # @PARAM: headers (Optional[Dict]) - Additional headers.
    # @PARAM: raw_response (bool) - Whether to return the raw response.
    # @PRE: method and endpoint must be strings.
    # @POST: Returns response content or raw Response object.
    # @RETURN: `requests.Response` if `raw_response=True`, otherwise `dict`.
    # @THROW: SupersetAPIError, NetworkError and their subclasses.
    def request(self, method: str, endpoint: str, headers: Optional[Dict] = None, raw_response: bool = False, **kwargs) -> Union[requests.Response, Dict[str, Any]]:
        with belief_scope("request"):
            full_url = f"{self.base_url}{endpoint}"
            _headers = self.headers.copy()
            if headers: _headers.update(headers)

            try:
                response = self.session.request(method, full_url, headers=_headers, **kwargs)
                response.raise_for_status()
                return response if raw_response else response.json()
            except requests.exceptions.HTTPError as e:
                self._handle_http_error(e, endpoint)
            except requests.exceptions.RequestException as e:
                self._handle_network_error(e, full_url)
    # [/DEF:request:Function]

    # [DEF:_handle_http_error:Function]
    # @PURPOSE: (Helper) Converts HTTP errors into custom exceptions.
    # @PARAM: e (requests.exceptions.HTTPError) - The error.
    # @PARAM: endpoint (str) - The endpoint.
    # @PRE: e must be a valid HTTPError with a response.
    # @POST: Raises a specific SupersetAPIError or subclass.
    def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
        with belief_scope("_handle_http_error"):
            status_code = e.response.status_code
            if status_code == 404: raise DashboardNotFoundError(endpoint) from e
            if status_code == 403: raise PermissionDeniedError() from e
            if status_code == 401: raise AuthenticationError() from e
            raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
    # [/DEF:_handle_http_error:Function]

    # [DEF:_handle_network_error:Function]
    # @PURPOSE: (Helper) Converts network errors into `NetworkError`.
    # @PARAM: e (requests.exceptions.RequestException) - The error.
    # @PARAM: url (str) - The URL.
    # @PRE: e must be a RequestException.
    # @POST: Raises a NetworkError.
    def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
        with belief_scope("_handle_network_error"):
            if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
            elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
            else: msg = f"Unknown network error: {e}"
            raise NetworkError(msg, url=url) from e
    # [/DEF:_handle_network_error:Function]

    # [DEF:upload_file:Function]
    # @PURPOSE: Uploads a file to the server via multipart/form-data.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: file_info (Dict[str, Any]) - File information.
    # @PARAM: extra_data (Optional[Dict]) - Additional form data.
    # @PARAM: timeout (Optional[int]) - Timeout.
    # @PRE: file_info must contain 'file_obj' and 'file_name'.
    # @POST: File is uploaded and response returned.
    # @RETURN: API response as a dictionary.
    # @THROW: SupersetAPIError, NetworkError, TypeError.
    def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
        with belief_scope("upload_file"):
            full_url = f"{self.base_url}{endpoint}"
            _headers = self.headers.copy(); _headers.pop('Content-Type', None)

            file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")

            files_payload = {}
            if isinstance(file_obj, (str, Path)):
                with open(file_obj, 'rb') as f:
                    files_payload = {form_field: (file_name, f.read(), 'application/x-zip-compressed')}
            elif isinstance(file_obj, io.BytesIO):
                files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')}
            else:
                raise TypeError(f"Unsupported file_obj type: {type(file_obj)}")

            return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
    # [/DEF:upload_file:Function]

    # [DEF:_perform_upload:Function]
    # @PURPOSE: (Helper) Performs the POST request with the file.
    # @PARAM: url (str) - URL.
    # @PARAM: files (Dict) - Files.
    # @PARAM: data (Optional[Dict]) - Data.
    # @PARAM: headers (Dict) - Headers.
    # @PARAM: timeout (Optional[int]) - Timeout.
    # @PRE: url, files, and headers must be provided.
    # @POST: POST request is performed and JSON response returned.
    # @RETURN: Dict - The response.
    def _perform_upload(self, url: str, files: Dict, data: Optional[Dict], headers: Dict, timeout: Optional[int]) -> Dict:
        with belief_scope("_perform_upload"):
            try:
                response = self.session.post(url, files=files, data=data or {}, headers=headers, timeout=timeout or self.request_settings["timeout"])
                response.raise_for_status()
                # Log non-JSON payloads for debugging
                if response.status_code == 200:
                    try:
                        return response.json()
                    except Exception as json_e:
                        self.logger.debug(f"[_perform_upload][Debug] Response is not valid JSON: {response.text[:200]}...")
                        raise SupersetAPIError(f"API error during upload: Response is not valid JSON: {json_e}") from json_e
                return response.json()
            except requests.exceptions.HTTPError as e:
                raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
            except requests.exceptions.RequestException as e:
                raise NetworkError(f"Network error during upload: {e}", url=url) from e
    # [/DEF:_perform_upload:Function]

    # [DEF:fetch_paginated_count:Function]
    # @PURPOSE: Fetches the total number of items for pagination.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: query_params (Dict) - Query parameters.
    # @PARAM: count_field (str) - Field holding the count.
    # @PRE: query_params must be a dictionary.
    # @POST: Returns total count of items.
    # @RETURN: int - The count.
    def fetch_paginated_count(self, endpoint: str, query_params: Dict, count_field: str = "count") -> int:
        with belief_scope("fetch_paginated_count"):
            response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query_params)}))
            return response_json.get(count_field, 0)
    # [/DEF:fetch_paginated_count:Function]

    # [DEF:fetch_paginated_data:Function]
    # @PURPOSE: Automatically collects data from every page of a paginated endpoint.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: pagination_options (Dict[str, Any]) - Pagination options.
    # @PRE: pagination_options must contain 'base_query', 'total_count', 'results_field'.
    # @POST: Returns all items across all pages.
    # @RETURN: List[Any] - Collected data.
    def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
        with belief_scope("fetch_paginated_data"):
            base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
            results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
            assert page_size and page_size > 0, "'page_size' must be a positive number."

            results = []
            for page in range((total_count + page_size - 1) // page_size):
                query = {**base_query, 'page': page}
                response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
                results.extend(response_json.get(results_field, []))
            return results
    # [/DEF:fetch_paginated_data:Function]

# [/DEF:APIClient:Class]

# [/DEF:superset_tool.utils.network:Module]

# [DEF:backend.core.utils.network:Module]
#
# @SEMANTICS: network, http, client, api, requests, session, authentication
# @PURPOSE: Encapsulates the low-level HTTP logic for talking to the Superset API, including authentication, session management, retry logic and error handling.
# @LAYER: Infra
# @RELATION: DEPENDS_ON -> backend.src.core.logger
# @RELATION: DEPENDS_ON -> requests
# @PUBLIC_API: APIClient

# [SECTION: IMPORTS]
from typing import Optional, Dict, Any, List, Union, cast
import json
import io
from pathlib import Path
import requests
from requests.adapters import HTTPAdapter
import urllib3
from urllib3.util.retry import Retry
from ..logger import logger as app_logger, belief_scope
# [/SECTION]

# [DEF:SupersetAPIError:Class]
# @PURPOSE: Base exception for all Superset API related errors.
class SupersetAPIError(Exception):
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the exception with a message and context.
    # @PRE: message is a string, context is a dict.
    # @POST: Exception is initialized with context.
    def __init__(self, message: str = "Superset API error", **context: Any):
        with belief_scope("SupersetAPIError.__init__"):
            self.context = context
            super().__init__(f"[API_FAILURE] {message} | Context: {self.context}")
    # [/DEF:__init__:Function]
# [/DEF:SupersetAPIError:Class]

# [DEF:AuthenticationError:Class]
# @PURPOSE: Exception raised when authentication fails.
class AuthenticationError(SupersetAPIError):
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the authentication error.
    # @PRE: message is a string, context is a dict.
    # @POST: AuthenticationError is initialized.
    def __init__(self, message: str = "Authentication failed", **context: Any):
        with belief_scope("AuthenticationError.__init__"):
            super().__init__(message, type="authentication", **context)
    # [/DEF:__init__:Function]
# [/DEF:AuthenticationError:Class]

# [DEF:PermissionDeniedError:Class]
# @PURPOSE: Exception raised when access is denied.
class PermissionDeniedError(AuthenticationError):
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the permission denied error.
    # @PRE: message is a string, context is a dict.
    # @POST: PermissionDeniedError is initialized.
    def __init__(self, message: str = "Permission denied", **context: Any):
        with belief_scope("PermissionDeniedError.__init__"):
            super().__init__(message, **context)
    # [/DEF:__init__:Function]
# [/DEF:PermissionDeniedError:Class]

# [DEF:DashboardNotFoundError:Class]
# @PURPOSE: Exception raised when a dashboard cannot be found.
class DashboardNotFoundError(SupersetAPIError):
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the not found error with resource ID.
    # @PRE: resource_id is provided.
    # @POST: DashboardNotFoundError is initialized.
    def __init__(self, resource_id: Union[int, str], message: str = "Dashboard not found", **context: Any):
        with belief_scope("DashboardNotFoundError.__init__"):
            super().__init__(f"Dashboard '{resource_id}' {message}", subtype="not_found", resource_id=resource_id, **context)
    # [/DEF:__init__:Function]
# [/DEF:DashboardNotFoundError:Class]

# [DEF:NetworkError:Class]
# @PURPOSE: Exception raised when a network level error occurs.
class NetworkError(Exception):
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the network error.
    # @PRE: message is a string.
    # @POST: NetworkError is initialized.
    def __init__(self, message: str = "Network connection failed", **context: Any):
        with belief_scope("NetworkError.__init__"):
            self.context = context
            super().__init__(f"[NETWORK_FAILURE] {message} | Context: {self.context}")
    # [/DEF:__init__:Function]
# [/DEF:NetworkError:Class]

# [DEF:APIClient:Class]
# @PURPOSE: Encapsulates the HTTP logic for working with the API, including sessions, authentication and request handling.
class APIClient:
    DEFAULT_TIMEOUT = 30

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the API client with its configuration, session and logger.
    # @PARAM: config (Dict[str, Any]) - Configuration.
    # @PARAM: verify_ssl (bool) - Whether to verify SSL certificates.
    # @PARAM: timeout (int) - Request timeout.
    # @PRE: config must contain 'base_url' and 'auth'.
    # @POST: APIClient instance is initialized with a session.
    def __init__(self, config: Dict[str, Any], verify_ssl: bool = True, timeout: int = DEFAULT_TIMEOUT):
        with belief_scope("__init__"):
            app_logger.info("[APIClient.__init__][Entry] Initializing APIClient.")
            self.base_url: str = config.get("base_url", "")
            self.auth = config.get("auth")
            self.request_settings = {"verify_ssl": verify_ssl, "timeout": timeout}
            self.session = self._init_session()
            self._tokens: Dict[str, str] = {}
            self._authenticated = False
            app_logger.info("[APIClient.__init__][Exit] APIClient initialized.")
    # [/DEF:__init__:Function]

    # [DEF:_init_session:Function]
    # @PURPOSE: Creates and configures a `requests.Session` with retry logic.
    # @PRE: self.request_settings must be initialized.
    # @POST: Returns a configured requests.Session instance.
    # @RETURN: requests.Session - The configured session.
    def _init_session(self) -> requests.Session:
        with belief_scope("_init_session"):
            session = requests.Session()
            retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
            adapter = HTTPAdapter(max_retries=retries)
            session.mount('http://', adapter)
            session.mount('https://', adapter)
            if not self.request_settings["verify_ssl"]:
                urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
                app_logger.warning("[_init_session][State] SSL verification disabled.")
            session.verify = self.request_settings["verify_ssl"]
            return session
    # [/DEF:_init_session:Function]

    # [DEF:authenticate:Function]
    # @PURPOSE: Authenticates against the Superset API and obtains the access and CSRF tokens.
    # @PRE: self.auth and self.base_url must be valid.
    # @POST: `self._tokens` is populated, `self._authenticated` is set to `True`.
    # @RETURN: Dict[str, str] - Dictionary with the tokens.
    # @THROW: AuthenticationError, NetworkError - on failure.
    def authenticate(self) -> Dict[str, str]:
        with belief_scope("authenticate"):
            app_logger.info("[authenticate][Enter] Authenticating to %s", self.base_url)
            try:
                login_url = f"{self.base_url}/security/login"
                response = self.session.post(login_url, json=self.auth, timeout=self.request_settings["timeout"])
                response.raise_for_status()
                access_token = response.json()["access_token"]

                csrf_url = f"{self.base_url}/security/csrf_token/"
                csrf_response = self.session.get(csrf_url, headers={"Authorization": f"Bearer {access_token}"}, timeout=self.request_settings["timeout"])
                csrf_response.raise_for_status()

                self._tokens = {"access_token": access_token, "csrf_token": csrf_response.json()["result"]}
                self._authenticated = True
                app_logger.info("[authenticate][Exit] Authenticated successfully.")
                return self._tokens
            except requests.exceptions.HTTPError as e:
                raise AuthenticationError(f"Authentication failed: {e}") from e
            except (requests.exceptions.RequestException, KeyError) as e:
                raise NetworkError(f"Network or parsing error during authentication: {e}") from e
    # [/DEF:authenticate:Function]

    @property
    # [DEF:headers:Function]
    # @PURPOSE: Returns the HTTP headers for authenticated requests.
    # @PRE: APIClient is initialized and authenticated or can be authenticated.
    # @POST: Returns headers including auth tokens.
    def headers(self) -> Dict[str, str]:
        with belief_scope("headers"):
            if not self._authenticated: self.authenticate()
            return {
                "Authorization": f"Bearer {self._tokens['access_token']}",
                "X-CSRFToken": self._tokens.get("csrf_token", ""),
                "Referer": self.base_url,
                "Content-Type": "application/json"
            }
    # [/DEF:headers:Function]

    # [DEF:request:Function]
    # @PURPOSE: Performs a generic HTTP request against the API.
    # @PARAM: method (str) - HTTP method.
    # @PARAM: endpoint (str) - API endpoint.
    # @PARAM: headers (Optional[Dict]) - Additional headers.
    # @PARAM: raw_response (bool) - Whether to return the raw response.
    # @PRE: method and endpoint must be strings.
    # @POST: Returns response content or raw Response object.
    # @RETURN: `requests.Response` if `raw_response=True`, otherwise `dict`.
    # @THROW: SupersetAPIError, NetworkError and their subclasses.
    def request(self, method: str, endpoint: str, headers: Optional[Dict] = None, raw_response: bool = False, **kwargs) -> Union[requests.Response, Dict[str, Any]]:
        with belief_scope("request"):
            full_url = f"{self.base_url}{endpoint}"
            _headers = self.headers.copy()
            if headers: _headers.update(headers)

            try:
                response = self.session.request(method, full_url, headers=_headers, **kwargs)
                response.raise_for_status()
                return response if raw_response else response.json()
            except requests.exceptions.HTTPError as e:
                self._handle_http_error(e, endpoint)
            except requests.exceptions.RequestException as e:
                self._handle_network_error(e, full_url)
    # [/DEF:request:Function]

    # [DEF:_handle_http_error:Function]
    # @PURPOSE: (Helper) Converts HTTP errors into custom exceptions.
    # @PARAM: e (requests.exceptions.HTTPError) - The error.
    # @PARAM: endpoint (str) - The endpoint.
    # @PRE: e must be a valid HTTPError with a response.
    # @POST: Raises a specific SupersetAPIError or subclass.
    def _handle_http_error(self, e: requests.exceptions.HTTPError, endpoint: str):
        with belief_scope("_handle_http_error"):
            status_code = e.response.status_code
            if status_code == 404: raise DashboardNotFoundError(endpoint) from e
            if status_code == 403: raise PermissionDeniedError() from e
            if status_code == 401: raise AuthenticationError() from e
            raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
    # [/DEF:_handle_http_error:Function]

    # [DEF:_handle_network_error:Function]
    # @PURPOSE: (Helper) Converts network errors into `NetworkError`.
    # @PARAM: e (requests.exceptions.RequestException) - The error.
    # @PARAM: url (str) - The URL.
    # @PRE: e must be a RequestException.
    # @POST: Raises a NetworkError.
    def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
        with belief_scope("_handle_network_error"):
            if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
            elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
            else: msg = f"Unknown network error: {e}"
            raise NetworkError(msg, url=url) from e
    # [/DEF:_handle_network_error:Function]

    # [DEF:upload_file:Function]
    # @PURPOSE: Uploads a file to the server via multipart/form-data.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: file_info (Dict[str, Any]) - File information.
    # @PARAM: extra_data (Optional[Dict]) - Additional form data.
    # @PARAM: timeout (Optional[int]) - Timeout.
    # @PRE: file_info must contain 'file_obj' and 'file_name'.
    # @POST: File is uploaded and response returned.
    # @RETURN: API response as a dictionary.
    # @THROW: SupersetAPIError, NetworkError, TypeError.
    def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
        with belief_scope("upload_file"):
            full_url = f"{self.base_url}{endpoint}"
            _headers = self.headers.copy(); _headers.pop('Content-Type', None)

            file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")

            files_payload = {}
            if isinstance(file_obj, (str, Path)):
                with open(file_obj, 'rb') as f:
                    files_payload = {form_field: (file_name, f.read(), 'application/x-zip-compressed')}
            elif isinstance(file_obj, io.BytesIO):
                files_payload = {form_field: (file_name, file_obj.getvalue(), 'application/x-zip-compressed')}
            else:
                raise TypeError(f"Unsupported file_obj type: {type(file_obj)}")

            return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
    # [/DEF:upload_file:Function]

    # [DEF:_perform_upload:Function]
    # @PURPOSE: (Helper) Performs the POST request with the file.
    # @PARAM: url (str) - URL.
    # @PARAM: files (Dict) - Files.
    # @PARAM: data (Optional[Dict]) - Data.
    # @PARAM: headers (Dict) - Headers.
    # @PARAM: timeout (Optional[int]) - Timeout.
    # @PRE: url, files, and headers must be provided.
    # @POST: POST request is performed and JSON response returned.
    # @RETURN: Dict - The response.
    def _perform_upload(self, url: str, files: Dict, data: Optional[Dict], headers: Dict, timeout: Optional[int]) -> Dict:
        with belief_scope("_perform_upload"):
            try:
                response = self.session.post(url, files=files, data=data or {}, headers=headers, timeout=timeout or self.request_settings["timeout"])
                response.raise_for_status()
                if response.status_code == 200:
                    try:
                        return response.json()
                    except Exception as json_e:
                        app_logger.debug(f"[_perform_upload][Debug] Response is not valid JSON: {response.text[:200]}...")
                        raise SupersetAPIError(f"API error during upload: Response is not valid JSON: {json_e}") from json_e
                return response.json()
            except requests.exceptions.HTTPError as e:
                raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
            except requests.exceptions.RequestException as e:
                raise NetworkError(f"Network error during upload: {e}", url=url) from e
    # [/DEF:_perform_upload:Function]

    # [DEF:fetch_paginated_count:Function]
    # @PURPOSE: Fetches the total number of items for pagination.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: query_params (Dict) - Query parameters.
    # @PARAM: count_field (str) - Field holding the count.
    # @PRE: query_params must be a dictionary.
    # @POST: Returns total count of items.
    # @RETURN: int - The count.
    def fetch_paginated_count(self, endpoint: str, query_params: Dict, count_field: str = "count") -> int:
        with belief_scope("fetch_paginated_count"):
            response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query_params)}))
            return response_json.get(count_field, 0)
    # [/DEF:fetch_paginated_count:Function]

    # [DEF:fetch_paginated_data:Function]
    # @PURPOSE: Automatically collects data from every page of a paginated endpoint.
    # @PARAM: endpoint (str) - Endpoint.
    # @PARAM: pagination_options (Dict[str, Any]) - Pagination options.
    # @PRE: pagination_options must contain 'base_query', 'total_count', 'results_field'.
    # @POST: Returns all items across all pages.
    # @RETURN: List[Any] - Collected data.
    def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
        with belief_scope("fetch_paginated_data"):
            base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
            results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
            assert page_size and page_size > 0, "'page_size' must be a positive number."

            results = []
            for page in range((total_count + page_size - 1) // page_size):
                query = {**base_query, 'page': page}
                response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
                results.extend(response_json.get(results_field, []))
            return results
    # [/DEF:fetch_paginated_data:Function]

# [/DEF:APIClient:Class]

# [/DEF:backend.core.utils.network:Module]
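A hedged end-to-end sketch of driving the client above. The config shape is inferred from __init__/authenticate; the auth payload, the /dashboard/ endpoints, the "formData" form field and the "overwrite" flag mirror Superset's v1 API but should be treated as assumptions here:

client = APIClient(config={
    "base_url": "https://superset.example.com/api/v1",
    "auth": {"username": "admin", "password": "secret", "provider": "db", "refresh": True},
})
client.authenticate()  # populates the access/CSRF tokens; the headers property re-authenticates lazily otherwise

# Collect every dashboard title across pages; the "q"/page/page_size shape
# matches the request() calls above.
base_query = {"columns": ["dashboard_title"], "page_size": 100}
total = client.fetch_paginated_count("/dashboard/", base_query)
dashboards = client.fetch_paginated_data(
    "/dashboard/",
    {"base_query": base_query, "total_count": total, "results_field": "result"},
)

# Re-import a previously exported archive.
client.upload_file(
    "/dashboard/import/",
    file_info={"file_obj": Path("dashboards/acme.zip"), "file_name": "acme.zip", "form_field": "formData"},
    extra_data={"overwrite": "true"},
)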

@@ -1,16 +1,23 @@
# [DEF:Dependencies:Module]
# @SEMANTICS: dependency, injection, singleton, factory
# @SEMANTICS: dependency, injection, singleton, factory, auth, jwt
# @PURPOSE: Manages the creation and provision of shared application dependencies, such as the PluginLoader and TaskManager, to avoid circular imports.
# @LAYER: Core
# @RELATION: Used by the main app and API routers to get access to shared instances.

from pathlib import Path
from typing import Optional
from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from jose import JWTError
from .core.plugin_loader import PluginLoader
from .core.task_manager import TaskManager
from .core.config_manager import ConfigManager
from .core.scheduler import SchedulerService
from .core.database import init_db
from .core.database import init_db, get_auth_db
from .core.logger import logger, belief_scope
from .core.auth.jwt import decode_token
from .core.auth.repository import AuthRepository
from .models.auth import User

# Initialize singletons
# Use absolute path relative to this file to ensure plugins are found regardless of CWD
@@ -77,4 +84,70 @@ def get_scheduler_service() -> SchedulerService:
    return scheduler_service
# [/DEF:get_scheduler_service:Function]

# [DEF:oauth2_scheme:Variable]
# @PURPOSE: OAuth2 password bearer scheme for token extraction.
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/auth/login")
# [/DEF:oauth2_scheme:Variable]

# [DEF:get_current_user:Function]
# @PURPOSE: Dependency for retrieving the currently authenticated user from a JWT.
# @PRE: JWT token provided in Authorization header.
# @POST: Returns the User object if token is valid.
# @THROW: HTTPException 401 if token is invalid or user not found.
# @PARAM: token (str) - Extracted JWT token.
# @PARAM: db (Session) - Auth database session.
# @RETURN: User - The authenticated user.
def get_current_user(token: str = Depends(oauth2_scheme), db = Depends(get_auth_db)):
    with belief_scope("get_current_user"):
        credentials_exception = HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Could not validate credentials",
            headers={"WWW-Authenticate": "Bearer"},
        )
        try:
            payload = decode_token(token)
            username: str = payload.get("sub")
            if username is None:
                raise credentials_exception
        except JWTError:
            raise credentials_exception

        repo = AuthRepository(db)
        user = repo.get_user_by_username(username)
        if user is None:
            raise credentials_exception
        return user
# [/DEF:get_current_user:Function]
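A hypothetical router sketch showing the dependency in use; the route path is illustrative:

from fastapi import APIRouter

router = APIRouter()

@router.get("/api/me")
def read_me(current_user: User = Depends(get_current_user)):
    # the dependency has already returned 401 for bad or missing tokens
    return {"username": current_user.username, "roles": [r.name for r in current_user.roles]}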

# [DEF:has_permission:Function]
# @PURPOSE: Dependency factory for checking whether the current user holds a specific permission.
# @PRE: User is authenticated.
# @POST: Returns the user if the permission is granted.
# @THROW: HTTPException 403 if permission is denied.
# @PARAM: resource (str) - The resource identifier.
# @PARAM: action (str) - The action identifier (READ, EXECUTE, WRITE).
# @RETURN: User - The authenticated user if permission granted.
def has_permission(resource: str, action: str):
    def permission_checker(current_user: User = Depends(get_current_user)):
        with belief_scope("has_permission", f"{resource}:{action}"):
            # Union of all permissions across all roles
            for role in current_user.roles:
                for perm in role.permissions:
                    if perm.resource == resource and perm.action == action:
                        return current_user

            # Special case for Admin role (full access)
            if any(role.name == "Admin" for role in current_user.roles):
                return current_user

            from .core.auth.logger import log_security_event
            log_security_event("PERMISSION_DENIED", current_user.username, {"resource": resource, "action": action})

            raise HTTPException(
                status_code=status.HTTP_403_FORBIDDEN,
                detail=f"Permission denied for {resource}:{action}"
            )
    return permission_checker
# [/DEF:has_permission:Function]
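Continuing the router sketch above, an endpoint gated on a permission; the resource/action strings follow the "plugin:backup" / "EXECUTE" convention from the Permission model but are illustrative:

@router.post("/api/backups/run")
def run_backup(user: User = Depends(has_permission("plugin:backup", "EXECUTE"))):
    # non-admin users without the permission receive a 403 and a
    # PERMISSION_DENIED security-log entry before this body runs
    return {"started_by": user.username}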

# [/DEF:Dependencies:Module]
105
backend/src/models/auth.py
Normal file
@@ -0,0 +1,105 @@
# [DEF:backend.src.models.auth:Module]
#
# @TIER: STANDARD
# @SEMANTICS: auth, models, user, role, permission, sqlalchemy
# @PURPOSE: SQLAlchemy models for multi-user authentication and authorization.
# @LAYER: Domain
# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base
#
# @INVARIANT: Usernames and emails must be unique.

# [SECTION: IMPORTS]
import uuid
from datetime import datetime
from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Table, Enum
from sqlalchemy.orm import relationship
from .mapping import Base
# [/SECTION]

# [DEF:generate_uuid:Function]
# @PURPOSE: Generates a unique UUID string.
# @POST: Returns a string representation of a new UUID.
def generate_uuid():
    return str(uuid.uuid4())
# [/DEF:generate_uuid:Function]

# [DEF:user_roles:Table]
# @PURPOSE: Association table for many-to-many relationship between Users and Roles.
user_roles = Table(
    "user_roles",
    Base.metadata,
    Column("user_id", String, ForeignKey("users.id"), primary_key=True),
    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
)
# [/DEF:user_roles:Table]

# [DEF:role_permissions:Table]
# @PURPOSE: Association table for many-to-many relationship between Roles and Permissions.
role_permissions = Table(
    "role_permissions",
    Base.metadata,
    Column("role_id", String, ForeignKey("roles.id"), primary_key=True),
    Column("permission_id", String, ForeignKey("permissions.id"), primary_key=True),
)
# [/DEF:role_permissions:Table]

# [DEF:User:Class]
# @PURPOSE: Represents an identity that can authenticate to the system.
# @RELATION: HAS_MANY -> Role (via user_roles)
class User(Base):
    __tablename__ = "users"

    id = Column(String, primary_key=True, default=generate_uuid)
    username = Column(String, unique=True, index=True, nullable=False)
    email = Column(String, unique=True, index=True, nullable=True)
    password_hash = Column(String, nullable=True)
    auth_source = Column(String, default="LOCAL")  # LOCAL or ADFS
    is_active = Column(Boolean, default=True)
    created_at = Column(DateTime, default=datetime.utcnow)
    last_login = Column(DateTime, nullable=True)

    roles = relationship("Role", secondary=user_roles, back_populates="users")
# [/DEF:User:Class]

# [DEF:Role:Class]
# @PURPOSE: Represents a collection of permissions.
# @RELATION: HAS_MANY -> User (via user_roles)
# @RELATION: HAS_MANY -> Permission (via role_permissions)
class Role(Base):
    __tablename__ = "roles"

    id = Column(String, primary_key=True, default=generate_uuid)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(String, nullable=True)

    users = relationship("User", secondary=user_roles, back_populates="roles")
    permissions = relationship("Permission", secondary=role_permissions, back_populates="roles")
# [/DEF:Role:Class]

# [DEF:Permission:Class]
# @PURPOSE: Represents a specific capability within the system.
# @RELATION: HAS_MANY -> Role (via role_permissions)
class Permission(Base):
    __tablename__ = "permissions"

    id = Column(String, primary_key=True, default=generate_uuid)
    resource = Column(String, nullable=False)  # e.g. "plugin:backup"
    action = Column(String, nullable=False)  # e.g. "READ", "EXECUTE", "WRITE"

    roles = relationship("Role", secondary=role_permissions, back_populates="permissions")
# [/DEF:Permission:Class]

# [DEF:ADGroupMapping:Class]
# @PURPOSE: Maps an Active Directory group to a local System Role.
# @RELATION: DEPENDS_ON -> Role
class ADGroupMapping(Base):
    __tablename__ = "ad_group_mappings"

    id = Column(String, primary_key=True, default=generate_uuid)
    ad_group = Column(String, unique=True, index=True, nullable=False)
    role_id = Column(String, ForeignKey("roles.id"), nullable=False)

    role = relationship("Role")
# [/DEF:ADGroupMapping:Class]

# [/DEF:backend.src.models.auth:Module]
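A seeding sketch for the models above; `Session` stands in for whatever sessionmaker the app binds to auth.db, and the names are placeholders:

from sqlalchemy.orm import Session  # placeholder for the app's session factory

def seed_admin(db: Session) -> None:
    # Create an Admin role with one permission and attach it to a user.
    perm = Permission(resource="plugin:backup", action="EXECUTE")
    role = Role(name="Admin", description="Full access", permissions=[perm])
    user = User(username="admin", auth_source="LOCAL", roles=[role])
    db.add(user)  # cascades through the user_roles/role_permissions tables
    db.commit()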

@@ -1,4 +1,5 @@
# [DEF:backend.src.models.dashboard:Module]
# @TIER: STANDARD
# @SEMANTICS: dashboard, model, metadata, migration
# @PURPOSE: Defines data models for dashboard metadata and selection.
# @LAYER: Model
@@ -8,6 +9,7 @@ from pydantic import BaseModel
from typing import List

# [DEF:DashboardMetadata:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents a dashboard available for migration.
class DashboardMetadata(BaseModel):
    id: int
@@ -17,6 +19,7 @@ class DashboardMetadata(BaseModel):
# [/DEF:DashboardMetadata:Class]

# [DEF:DashboardSelection:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents the user's selection of dashboards to migrate.
class DashboardSelection(BaseModel):
    selected_ids: List[int]
73
backend/src/models/git.py
Normal file
@@ -0,0 +1,73 @@
# [DEF:GitModels:Module]
# @SEMANTICS: git, models, sqlalchemy, database, schema
# @PURPOSE: Git-specific SQLAlchemy models for configuration and repository tracking.
# @LAYER: Model
# @RELATION: specs/011-git-integration-dashboard/data-model.md

import enum
from datetime import datetime
from sqlalchemy import Column, String, Integer, DateTime, Enum, ForeignKey, Boolean
from sqlalchemy.dialects.postgresql import UUID
import uuid
from src.core.database import Base

class GitProvider(str, enum.Enum):
    GITHUB = "GITHUB"
    GITLAB = "GITLAB"
    GITEA = "GITEA"

class GitStatus(str, enum.Enum):
    CONNECTED = "CONNECTED"
    FAILED = "FAILED"
    UNKNOWN = "UNKNOWN"

class SyncStatus(str, enum.Enum):
    CLEAN = "CLEAN"
    DIRTY = "DIRTY"
    CONFLICT = "CONFLICT"

class GitServerConfig(Base):
    """
    [DEF:GitServerConfig:Class]
    Configuration for a Git server connection.
    """
    __tablename__ = "git_server_configs"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    name = Column(String(255), nullable=False)
    provider = Column(Enum(GitProvider), nullable=False)
    url = Column(String(255), nullable=False)
    pat = Column(String(255), nullable=False)  # PERSONAL ACCESS TOKEN
    default_repository = Column(String(255), nullable=True)
    status = Column(Enum(GitStatus), default=GitStatus.UNKNOWN)
    last_validated = Column(DateTime, default=datetime.utcnow)

class GitRepository(Base):
    """
    [DEF:GitRepository:Class]
    Tracking for a local Git repository linked to a dashboard.
    """
    __tablename__ = "git_repositories"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    dashboard_id = Column(Integer, nullable=False, unique=True)
    config_id = Column(String(36), ForeignKey("git_server_configs.id"), nullable=False)
    remote_url = Column(String(255), nullable=False)
    local_path = Column(String(255), nullable=False)
    current_branch = Column(String(255), default="main")
    sync_status = Column(Enum(SyncStatus), default=SyncStatus.CLEAN)

class DeploymentEnvironment(Base):
    """
    [DEF:DeploymentEnvironment:Class]
    Target Superset environments for dashboard deployment.
    """
    __tablename__ = "deployment_environments"

    id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
    name = Column(String(255), nullable=False)
    superset_url = Column(String(255), nullable=False)
    superset_token = Column(String(255), nullable=False)
    is_active = Column(Boolean, default=True)

# [/DEF:GitModels:Module]
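Illustrative row construction for the models above; the values are placeholders, and persistence goes through the app's normal session:

config = GitServerConfig(
    name="internal-gitea",
    provider=GitProvider.GITEA,
    url="https://git.example.com",
    pat="<personal-access-token>",
)
repo = GitRepository(
    dashboard_id=42,
    config_id=config.id,  # note: the default id is generated on INSERT, so flush the config first in real code
    remote_url="https://git.example.com/bi/dashboard-42.git",
    local_path="/var/lib/app/repos/dashboard-42",
)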

@@ -1,5 +1,6 @@
# [DEF:backend.src.models.mapping:Module]
#
# @TIER: STANDARD
# @SEMANTICS: database, mapping, environment, migration, sqlalchemy, sqlite
# @PURPOSE: Defines the database schema for environment metadata and database mappings using SQLAlchemy.
# @LAYER: Domain
@@ -19,6 +20,7 @@ import enum
Base = declarative_base()

# [DEF:MigrationStatus:Class]
# @TIER: TRIVIAL
# @PURPOSE: Enumeration of possible migration job statuses.
class MigrationStatus(enum.Enum):
    PENDING = "PENDING"
@@ -29,6 +31,7 @@ class MigrationStatus(enum.Enum):
# [/DEF:MigrationStatus:Class]

# [DEF:Environment:Class]
# @TIER: STANDARD
# @PURPOSE: Represents a Superset instance environment.
class Environment(Base):
    __tablename__ = "environments"
@@ -40,6 +43,7 @@ class Environment(Base):
# [/DEF:Environment:Class]

# [DEF:DatabaseMapping:Class]
# @TIER: STANDARD
# @PURPOSE: Represents a mapping between source and target databases.
class DatabaseMapping(Base):
    __tablename__ = "database_mappings"
31
backend/src/models/storage.py
Normal file
@@ -0,0 +1,31 @@
from datetime import datetime
from enum import Enum
from typing import Optional
from pydantic import BaseModel, Field


# [DEF:FileCategory:Class]
# @PURPOSE: Enumeration of supported file categories in the storage system.
class FileCategory(str, Enum):
    BACKUP = "backups"
    REPOSITORY = "repositorys"
# [/DEF:FileCategory:Class]


# [DEF:StorageConfig:Class]
# @PURPOSE: Configuration model for the storage system, defining paths and naming patterns.
class StorageConfig(BaseModel):
    root_path: str = Field(default="backups", description="Absolute path to the storage root directory.")
    backup_structure_pattern: str = Field(default="{category}/", description="Pattern for backup directory structure.")
    repo_structure_pattern: str = Field(default="{category}/", description="Pattern for repository directory structure.")
    filename_pattern: str = Field(default="{name}_{timestamp}", description="Pattern for filenames.")
# [/DEF:StorageConfig:Class]


# [DEF:StoredFile:Class]
# @PURPOSE: Data model representing metadata for a file stored in the system.
class StoredFile(BaseModel):
    name: str = Field(..., description="Name of the file (including extension).")
    path: str = Field(..., description="Relative path from storage root.")
    size: int = Field(..., ge=0, description="Size of the file in bytes.")
    created_at: datetime = Field(..., description="Creation timestamp.")
    category: FileCategory = Field(..., description="Category of the file.")
    mime_type: Optional[str] = Field(None, description="MIME type of the file.")
# [/DEF:StoredFile:Class]
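
A quick construction sketch for StoredFile (hypothetical values; the pydantic major version is assumed):

# Hedged sketch: build and serialize a StoredFile record.
from datetime import datetime

f = StoredFile(
    name="dashboard_42_20251219T120000.zip",
    path="backups/PROD/dashboard_42_20251219T120000.zip",
    size=10_240,
    created_at=datetime.now(),
    category=FileCategory.BACKUP,
    mime_type="application/zip",
)
print(f.model_dump_json())  # pydantic v2; use f.json() on pydantic v1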

@@ -12,10 +12,9 @@ from requests.exceptions import RequestException

from ..core.plugin_base import PluginBase
from ..core.logger import belief_scope
from superset_tool.client import SupersetClient
from superset_tool.exceptions import SupersetAPIError
from superset_tool.utils.logger import SupersetLogger
from superset_tool.utils.fileio import (
from ..core.superset_client import SupersetClient
from ..core.utils.network import SupersetAPIError
from ..core.utils.fileio import (
    save_and_unpack_dashboard,
    archive_exports,
    sanitize_filename,
@@ -23,7 +22,6 @@ from superset_tool.utils.fileio import (
    remove_empty_directories,
    RetentionPolicy
)
from superset_tool.utils.init_clients import setup_clients
from ..dependencies import get_config_manager

# [DEF:BackupPlugin:Class]
@@ -77,6 +75,15 @@ class BackupPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the backup plugin.
    # @RETURN: str - "/tools/backups"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/backups"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for backup plugin parameters.
    # @PRE: Plugin instance exists.
@@ -86,7 +93,7 @@ class BackupPlugin(PluginBase):
        with belief_scope("get_schema"):
            config_manager = get_config_manager()
            envs = [e.name for e in config_manager.get_environments()]
            default_path = config_manager.get_config().settings.backup_path
            default_path = config_manager.get_config().settings.storage.root_path

            return {
                "type": "object",
@@ -97,14 +104,8 @@ class BackupPlugin(PluginBase):
                        "description": "The Superset environment to back up.",
                        "enum": envs if envs else [],
                    },
                    "backup_path": {
                        "type": "string",
                        "title": "Backup Path",
                        "description": "The root directory to save backups to.",
                        "default": default_path
                    }
                },
                "required": ["env", "backup_path"],
                "required": ["env"],
            }
    # [/DEF:get_schema:Function]
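
With backup_path dropped from the schema, a task submission reduces to the environment name; a hedged sketch (the task wiring around the plugin is assumed):

# Hedged sketch: the storage root now comes from settings.storage.root_path,
# so only the environment is required.
params = {"env": "PROD"}  # "backup_path" is no longer part of the schema
# await BackupPlugin().execute(params)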

@@ -128,28 +129,29 @@ class BackupPlugin(PluginBase):
        if not env:
            raise KeyError("env")

        backup_path_str = params.get("backup_path") or config_manager.get_config().settings.backup_path
        backup_path = Path(backup_path_str)
        storage_settings = config_manager.get_config().settings.storage
        # Use the 'backups' subfolder within the storage root
        backup_path = Path(storage_settings.root_path) / "backups"

        logger = SupersetLogger(log_dir=backup_path / "Logs", console=True)
        logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")
        from ..core.logger import logger as app_logger
        app_logger.info(f"[BackupPlugin][Entry] Starting backup for {env}.")

        try:
            config_manager = get_config_manager()
            if not config_manager.has_environments():
                raise ValueError("No Superset environments configured. Please add an environment in Settings.")

            clients = setup_clients(logger, custom_envs=config_manager.get_environments())
            client = clients.get(env)

            if not client:
            env_config = config_manager.get_environment(env)
            if not env_config:
                raise ValueError(f"Environment '{env}' not found in configuration.")

            client = SupersetClient(env_config)

            dashboard_count, dashboard_meta = client.get_dashboards()
            logger.info(f"[BackupPlugin][Progress] Found {dashboard_count} dashboards to export in {env}.")
            app_logger.info(f"[BackupPlugin][Progress] Found {dashboard_count} dashboards to export in {env}.")

            if dashboard_count == 0:
                logger.info("[BackupPlugin][Exit] No dashboards to back up.")
                app_logger.info("[BackupPlugin][Exit] No dashboards to back up.")
                return

            for db in dashboard_meta:
@@ -169,23 +171,22 @@ class BackupPlugin(PluginBase):
                        zip_content=zip_content,
                        original_filename=filename,
                        output_dir=dashboard_dir,
                        unpack=False,
                        logger=logger
                        unpack=False
                    )

                    archive_exports(str(dashboard_dir), policy=RetentionPolicy(), logger=logger)
                    archive_exports(str(dashboard_dir), policy=RetentionPolicy())

                except (SupersetAPIError, RequestException, IOError, OSError) as db_error:
                    logger.error(f"[BackupPlugin][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
                    app_logger.error(f"[BackupPlugin][Failure] Failed to export dashboard {dashboard_title} (ID: {dashboard_id}): {db_error}", exc_info=True)
                    continue

            consolidate_archive_folders(backup_path / env.upper(), logger=logger)
            remove_empty_directories(str(backup_path / env.upper()), logger=logger)
            consolidate_archive_folders(backup_path / env.upper())
            remove_empty_directories(str(backup_path / env.upper()))

            logger.info(f"[BackupPlugin][CoherenceCheck:Passed] Backup logic completed for {env}.")
            app_logger.info(f"[BackupPlugin][CoherenceCheck:Passed] Backup logic completed for {env}.")

        except (RequestException, IOError, KeyError) as e:
            logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
            app_logger.critical(f"[BackupPlugin][Failure] Fatal error during backup for {env}: {e}", exc_info=True)
            raise e
    # [/DEF:execute:Function]
# [/DEF:BackupPlugin:Class]
@@ -63,6 +63,15 @@ class DebugPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the debug plugin.
    # @RETURN: str - "/tools/debug"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/debug"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the debug plugin parameters.
    # @PRE: Plugin instance exists.
@@ -145,19 +154,7 @@ class DebugPlugin(PluginBase):
                    if not env_config:
                        raise ValueError(f"Environment '{name}' not found.")

                    # Map Environment model to SupersetConfig
                    from superset_tool.models import SupersetConfig
                    superset_config = SupersetConfig(
                        env=env_config.name,
                        base_url=env_config.url,
                        auth={
                            "provider": "db",  # Defaulting to db provider
                            "username": env_config.username,
                            "password": env_config.password,
                            "refresh": "false"
                        }
                    )
                    client = SupersetClient(superset_config)
                    client = SupersetClient(env_config)
                    client.authenticate()
                    count, dbs = client.get_databases()
                    results[name] = {
@@ -188,19 +185,7 @@ class DebugPlugin(PluginBase):
                    if not env_config:
                        raise ValueError(f"Environment '{env_name}' not found.")

                    # Map Environment model to SupersetConfig
                    from superset_tool.models import SupersetConfig
                    superset_config = SupersetConfig(
                        env=env_config.name,
                        base_url=env_config.url,
                        auth={
                            "provider": "db",  # Defaulting to db provider
                            "username": env_config.username,
                            "password": env_config.password,
                            "refresh": "false"
                        }
                    )
                    client = SupersetClient(superset_config)
                    client = SupersetClient(env_config)
                    client.authenticate()

                    dataset_response = client.get_dataset(dataset_id)
385
backend/src/plugins/git_plugin.py
Normal file
@@ -0,0 +1,385 @@
# [DEF:backend.src.plugins.git_plugin:Module]
#
# @SEMANTICS: git, plugin, dashboard, version_control, sync, deploy
# @PURPOSE: Provides a plugin for versioning and deploying Superset dashboards.
# @LAYER: Plugin
# @RELATION: INHERITS_FROM -> src.core.plugin_base.PluginBase
# @RELATION: USES -> src.services.git_service.GitService
# @RELATION: USES -> src.core.superset_client.SupersetClient
# @RELATION: USES -> src.core.config_manager.ConfigManager
#
# @INVARIANT: All Git operations must go through GitService.
# @CONSTRAINT: The plugin only works with unpacked Superset YAML exports.

# [SECTION: IMPORTS]
import os
import io
import shutil
import zipfile
from pathlib import Path
from typing import Dict, Any, Optional
from src.core.plugin_base import PluginBase
from src.services.git_service import GitService
from src.core.logger import logger, belief_scope
from src.core.config_manager import ConfigManager
from src.core.superset_client import SupersetClient
# [/SECTION]

# [DEF:GitPlugin:Class]
# @PURPOSE: Implementation of the Git Integration plugin for dashboard version control.
class GitPlugin(PluginBase):

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the plugin and its dependencies.
    # @PRE: config.json exists or the shared config_manager is available.
    # @POST: git_service and config_manager are initialized.
    def __init__(self):
        with belief_scope("GitPlugin.__init__"):
            logger.info("[GitPlugin.__init__][Entry] Initializing GitPlugin.")
            self.git_service = GitService()

            # Robust config path resolution:
            # 1. Try absolute path from src/dependencies.py style if possible
            # 2. Try relative paths based on common execution patterns
            if os.path.exists("../config.json"):
                config_path = "../config.json"
            elif os.path.exists("config.json"):
                config_path = "config.json"
            else:
                # Fall back to the instance initialized in dependencies, if importable
                try:
                    from src.dependencies import config_manager
                    self.config_manager = config_manager
                    logger.info("[GitPlugin.__init__][Exit] GitPlugin initialized using shared config_manager.")
                    return
                except Exception:
                    config_path = "config.json"

            self.config_manager = ConfigManager(config_path)
            logger.info(f"[GitPlugin.__init__][Exit] GitPlugin initialized with {config_path}")
    # [/DEF:__init__:Function]

    @property
    # [DEF:id:Function]
    # @PURPOSE: Returns the plugin identifier.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns 'git-integration'.
    def id(self) -> str:
        with belief_scope("GitPlugin.id"):
            return "git-integration"
    # [/DEF:id:Function]

    @property
    # [DEF:name:Function]
    # @PURPOSE: Returns the plugin name.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the human-readable name.
    def name(self) -> str:
        with belief_scope("GitPlugin.name"):
            return "Git Integration"
    # [/DEF:name:Function]

    @property
    # [DEF:description:Function]
    # @PURPOSE: Returns the plugin description.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the plugin's purpose description.
    def description(self) -> str:
        with belief_scope("GitPlugin.description"):
            return "Version control for Superset dashboards"
    # [/DEF:description:Function]

    @property
    # [DEF:version:Function]
    # @PURPOSE: Returns the plugin version.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns the version string.
    def version(self) -> str:
        with belief_scope("GitPlugin.version"):
            return "0.1.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the git plugin.
    # @RETURN: str - "/git"
    def ui_route(self) -> str:
        with belief_scope("GitPlugin.ui_route"):
            return "/git"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema of parameters for executing plugin tasks.
    # @PRE: GitPlugin is initialized.
    # @POST: Returns a JSON schema dictionary.
    # @RETURN: Dict[str, Any] - Parameter schema.
    def get_schema(self) -> Dict[str, Any]:
        with belief_scope("GitPlugin.get_schema"):
            return {
                "type": "object",
                "properties": {
                    "operation": {"type": "string", "enum": ["sync", "deploy", "history"]},
                    "dashboard_id": {"type": "integer"},
                    "environment_id": {"type": "string"},
                    "source_env_id": {"type": "string"}
                },
                "required": ["operation", "dashboard_id"]
            }
    # [/DEF:get_schema:Function]

    # [DEF:initialize:Function]
    # @PURPOSE: Performs initial plugin setup.
    # @PRE: GitPlugin is initialized.
    # @POST: The plugin is ready to execute tasks.
    async def initialize(self):
        with belief_scope("GitPlugin.initialize"):
            logger.info("[GitPlugin.initialize][Action] Initializing Git Integration Plugin logic.")
    # [/DEF:initialize:Function]

    # [DEF:execute:Function]
    # @PURPOSE: Main entry point for executing plugin tasks.
    # @PRE: task_data contains 'operation' and 'dashboard_id'.
    # @POST: Returns the result of the requested operation.
    # @PARAM: task_data (Dict[str, Any]) - Task data.
    # @RETURN: Dict[str, Any] - Status and message.
    # @RELATION: CALLS -> self._handle_sync
    # @RELATION: CALLS -> self._handle_deploy
    async def execute(self, task_data: Dict[str, Any]) -> Dict[str, Any]:
        with belief_scope("GitPlugin.execute"):
            operation = task_data.get("operation")
            dashboard_id = task_data.get("dashboard_id")

            logger.info(f"[GitPlugin.execute][Entry] Executing operation: {operation} for dashboard {dashboard_id}")

            if operation == "sync":
                source_env_id = task_data.get("source_env_id")
                result = await self._handle_sync(dashboard_id, source_env_id)
            elif operation == "deploy":
                env_id = task_data.get("environment_id")
                result = await self._handle_deploy(dashboard_id, env_id)
            elif operation == "history":
                result = {"status": "success", "message": "History available via API"}
            else:
                logger.error(f"[GitPlugin.execute][Coherence:Failed] Unknown operation: {operation}")
                raise ValueError(f"Unknown operation: {operation}")

            logger.info(f"[GitPlugin.execute][Exit] Operation {operation} completed.")
            return result
    # [/DEF:execute:Function]
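
Example task_data payloads matching get_schema above (a sketch; IDs are hypothetical):

# Hedged sketch: the three supported operations.
sync_task = {"operation": "sync", "dashboard_id": 42, "source_env_id": "dev"}
deploy_task = {"operation": "deploy", "dashboard_id": 42, "environment_id": "prod"}
history_task = {"operation": "history", "dashboard_id": 42}
# result = await GitPlugin().execute(sync_task)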

    # [DEF:_handle_sync:Function]
    # @PURPOSE: Exports a dashboard from Superset and unpacks it into the Git repository.
    # @PRE: A repository for the dashboard must exist.
    # @POST: Repository files are updated to the current state in Superset.
    # @PARAM: dashboard_id (int) - Dashboard ID.
    # @PARAM: source_env_id (Optional[str]) - Source environment ID.
    # @RETURN: Dict[str, str] - Sync result.
    # @SIDE_EFFECT: Modifies files in the repository's local working directory.
    # @RELATION: CALLS -> src.services.git_service.GitService.get_repo
    # @RELATION: CALLS -> src.core.superset_client.SupersetClient.export_dashboard
    async def _handle_sync(self, dashboard_id: int, source_env_id: Optional[str] = None) -> Dict[str, str]:
        with belief_scope("GitPlugin._handle_sync"):
            try:
                # 1. Fetch the repository
                repo = self.git_service.get_repo(dashboard_id)
                repo_path = Path(repo.working_dir)
                logger.info(f"[_handle_sync][Action] Target repo path: {repo_path}")

                # 2. Set up the Superset client
                env = self._get_env(source_env_id)
                client = SupersetClient(env)
                client.authenticate()

                # 3. Export the dashboard
                logger.info(f"[_handle_sync][Action] Exporting dashboard {dashboard_id} from {env.name}")
                zip_bytes, _ = client.export_dashboard(dashboard_id)

                # 4. Unpack and flatten the directory structure
                logger.info(f"[_handle_sync][Action] Unpacking export to {repo_path}")

                # Folders/files we expect from a Superset export
                managed_dirs = ["dashboards", "charts", "datasets", "databases"]
                managed_files = ["metadata.yaml"]

                # Remove stale data before unpacking so no orphaned files remain
                for d in managed_dirs:
                    d_path = repo_path / d
                    if d_path.exists() and d_path.is_dir():
                        shutil.rmtree(d_path)
                for f in managed_files:
                    f_path = repo_path / f
                    if f_path.exists():
                        f_path.unlink()

                with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
                    # Superset exports everything into a dashboard_export_<timestamp>/
                    # subfolder; detect that folder name
                    namelist = zf.namelist()
                    if not namelist:
                        raise ValueError("Export ZIP is empty")

                    root_folder = namelist[0].split('/')[0]
                    logger.info(f"[_handle_sync][Action] Detected root folder in ZIP: {root_folder}")

                    for member in zf.infolist():
                        if member.filename.startswith(root_folder + "/") and len(member.filename) > len(root_folder) + 1:
                            # Strip the folder prefix
                            relative_path = member.filename[len(root_folder)+1:]
                            target_path = repo_path / relative_path

                            if member.is_dir():
                                target_path.mkdir(parents=True, exist_ok=True)
                            else:
                                target_path.parent.mkdir(parents=True, exist_ok=True)
                                with zf.open(member) as source, open(target_path, "wb") as target:
                                    shutil.copyfileobj(source, target)

                # 5. Automatically stage the changes (no commit, so the user can review the diff)
                try:
                    repo.git.add(A=True)
                    logger.info("[_handle_sync][Action] Changes staged in git")
                except Exception as ge:
                    logger.warning(f"[_handle_sync][Action] Failed to stage changes: {ge}")

                logger.info(f"[_handle_sync][Coherence:OK] Dashboard {dashboard_id} synced successfully.")
                return {"status": "success", "message": "Dashboard synced and flattened in local repository"}

            except Exception as e:
                logger.error(f"[_handle_sync][Coherence:Failed] Sync failed: {e}")
                raise
    # [/DEF:_handle_sync:Function]
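
After a sync, the working tree should mirror the flattened export; a sketch of the expected layout (directory names per managed_dirs above; exact files depend on the dashboard):

# <local_path>/
# ├── metadata.yaml
# ├── dashboards/      # one YAML per dashboard
# ├── charts/          # one YAML per chart
# ├── datasets/        # one YAML per dataset
# └── databases/       # connection YAMLs (stripped/mapped at deploy time)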

    # [DEF:_handle_deploy:Function]
    # @PURPOSE: Packs the repository into a ZIP and imports it into the target Superset environment.
    # @PRE: environment_id must correspond to a configured environment.
    # @POST: The dashboard is imported into the target Superset.
    # @PARAM: dashboard_id (int) - Dashboard ID.
    # @PARAM: env_id (str) - Target environment ID.
    # @RETURN: Dict[str, Any] - Deployment result.
    # @SIDE_EFFECT: Creates and removes a temporary ZIP file.
    # @RELATION: CALLS -> src.core.superset_client.SupersetClient.import_dashboard
    async def _handle_deploy(self, dashboard_id: int, env_id: str) -> Dict[str, Any]:
        with belief_scope("GitPlugin._handle_deploy"):
            try:
                if not env_id:
                    raise ValueError("Target environment ID required for deployment")

                # 1. Fetch the repository
                repo = self.git_service.get_repo(dashboard_id)
                repo_path = Path(repo.working_dir)

                # 2. Pack into a ZIP
                logger.info(f"[_handle_deploy][Action] Packing repository {repo_path} for deployment.")
                zip_buffer = io.BytesIO()

                # Superset expects a root directory in the ZIP (e.g., dashboard_export_20240101T000000/)
                root_dir_name = f"dashboard_export_{dashboard_id}"

                with zipfile.ZipFile(zip_buffer, "w", zipfile.ZIP_DEFLATED) as zf:
                    for root, dirs, files in os.walk(repo_path):
                        if ".git" in dirs:
                            dirs.remove(".git")
                        for file in files:
                            if file == ".git" or file.endswith(".zip"):
                                continue
                            file_path = Path(root) / file
                            # Prepend the root directory name to the archive path
                            arcname = Path(root_dir_name) / file_path.relative_to(repo_path)
                            zf.write(file_path, arcname)

                zip_buffer.seek(0)

                # 3. Set up the Superset client
                env = self.config_manager.get_environment(env_id)
                if not env:
                    raise ValueError(f"Environment {env_id} not found")

                client = SupersetClient(env)
                client.authenticate()

                # 4. Import
                temp_zip_path = repo_path / f"deploy_{dashboard_id}.zip"
                logger.info(f"[_handle_deploy][Action] Saving temporary zip to {temp_zip_path}")
                with open(temp_zip_path, "wb") as f:
                    f.write(zip_buffer.getvalue())

                try:
                    logger.info(f"[_handle_deploy][Action] Importing dashboard to {env.name}")
                    result = client.import_dashboard(temp_zip_path)
                    logger.info(f"[_handle_deploy][Coherence:OK] Deployment successful for dashboard {dashboard_id}.")
                    return {"status": "success", "message": f"Dashboard deployed to {env.name}", "details": result}
                finally:
                    if temp_zip_path.exists():
                        os.remove(temp_zip_path)

            except Exception as e:
                logger.error(f"[_handle_deploy][Coherence:Failed] Deployment failed: {e}")
                raise
    # [/DEF:_handle_deploy:Function]

    # [DEF:_get_env:Function]
    # @PURPOSE: Helper method for retrieving an environment configuration.
    # @PARAM: env_id (Optional[str]) - Environment ID.
    # @PRE: env_id is a string or None.
    # @POST: Returns an Environment object from config or DB.
    # @RETURN: Environment - Environment configuration object.
    def _get_env(self, env_id: Optional[str] = None):
        with belief_scope("GitPlugin._get_env"):
            logger.info(f"[_get_env][Entry] Fetching environment for ID: {env_id}")

            # Priority 1: ConfigManager (config.json)
            if env_id:
                env = self.config_manager.get_environment(env_id)
                if env:
                    logger.info(f"[_get_env][Exit] Found environment by ID in ConfigManager: {env.name}")
                    return env

            # Priority 2: Database (DeploymentEnvironment)
            from src.core.database import SessionLocal
            from src.models.git import DeploymentEnvironment

            db = SessionLocal()
            try:
                if env_id:
                    db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.id == env_id).first()
                else:
                    # If no ID, try to find an active (or any) environment in the DB
                    db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.is_active == True).first()
                    if not db_env:
                        db_env = db.query(DeploymentEnvironment).first()

                if db_env:
                    logger.info(f"[_get_env][Exit] Found environment in DB: {db_env.name}")
                    from src.core.config_models import Environment
                    # Use token as password for SupersetClient
                    return Environment(
                        id=db_env.id,
                        name=db_env.name,
                        url=db_env.superset_url,
                        username="admin",
                        password=db_env.superset_token,
                        verify_ssl=True
                    )
            finally:
                db.close()

            # Priority 3: ConfigManager default (if no env_id provided)
            envs = self.config_manager.get_environments()
            if envs:
                if env_id:
                    # env_id was provided but not found in the DB or by ID in config;
                    # it may still match one of the configured environments
                    env = next((e for e in envs if e.id == env_id), None)
                    if env:
                        logger.info(f"[_get_env][Exit] Found environment {env_id} in ConfigManager list")
                        return env

                if not env_id:
                    logger.info(f"[_get_env][Exit] Using first environment from ConfigManager: {envs[0].name}")
                    return envs[0]

            logger.error(f"[_get_env][Coherence:Failed] No environments configured (searched config.json and DB). env_id={env_id}")
            raise ValueError("No environments configured. Please add a Superset Environment in Settings.")
    # [/DEF:_get_env:Function]

# [/DEF:GitPlugin:Class]
# [/DEF:backend.src.plugins.git_plugin:Module]
@@ -12,8 +12,7 @@ from ..core.superset_client import SupersetClient
from ..core.logger import logger, belief_scope
from ..core.database import SessionLocal
from ..models.connection import ConnectionConfig
from superset_tool.utils.dataset_mapper import DatasetMapper
from superset_tool.utils.logger import SupersetLogger
from ..core.utils.dataset_mapper import DatasetMapper
# [/SECTION]

# [DEF:MapperPlugin:Class]
@@ -67,6 +66,15 @@ class MapperPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the mapper plugin.
    # @RETURN: str - "/tools/mapper"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/mapper"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the mapper plugin parameters.
    # @PRE: Plugin instance exists.
@@ -137,25 +145,13 @@ class MapperPlugin(PluginBase):

            # Get config and initialize client
            from ..dependencies import get_config_manager
            from superset_tool.models import SupersetConfig
            config_manager = get_config_manager()
            env_config = config_manager.get_environment(env_name)
            if not env_config:
                logger.error(f"[MapperPlugin.execute][State] Environment '{env_name}' not found.")
                raise ValueError(f"Environment '{env_name}' not found in configuration.")

            # Map Environment model to SupersetConfig
            superset_config = SupersetConfig(
                env=env_config.name,
                base_url=env_config.url,
                auth={
                    "provider": "db",  # Defaulting to db provider
                    "username": env_config.username,
                    "password": env_config.password,
                    "refresh": "false"
                }
            )
            client = SupersetClient(superset_config)
            client = SupersetClient(env_config)
            client.authenticate()

            postgres_config = None
@@ -185,9 +181,7 @@ class MapperPlugin(PluginBase):

            logger.info(f"[MapperPlugin.execute][Action] Starting mapping for dataset {dataset_id} in {env_name}")

            # Use internal SupersetLogger for DatasetMapper
            s_logger = SupersetLogger(name="dataset_mapper_plugin")
            mapper = DatasetMapper(s_logger)
            mapper = DatasetMapper()

            try:
                mapper.run_mapping(
@@ -13,11 +13,9 @@ import re

from ..core.plugin_base import PluginBase
from ..core.logger import belief_scope
from superset_tool.client import SupersetClient
from superset_tool.utils.init_clients import setup_clients
from superset_tool.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
from ..core.superset_client import SupersetClient
from ..core.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
from ..dependencies import get_config_manager
from superset_tool.utils.logger import SupersetLogger
from ..core.migration_engine import MigrationEngine
from ..core.database import SessionLocal
from ..models.mapping import DatabaseMapping, Environment
@@ -73,6 +71,15 @@ class MigrationPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the migration plugin.
    # @RETURN: str - "/migration"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/migration"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for migration plugin parameters.
    # @PRE: Config manager is available.
@@ -150,7 +157,7 @@ class MigrationPlugin(PluginBase):
        from ..dependencies import get_task_manager
        tm = get_task_manager()

        class TaskLoggerProxy(SupersetLogger):
        class TaskLoggerProxy:
            # [DEF:__init__:Function]
            # @PURPOSE: Initializes the proxy logger.
            # @PRE: None.
@@ -158,7 +165,7 @@ class MigrationPlugin(PluginBase):
            def __init__(self):
                with belief_scope("__init__"):
                    # Initialize parent with dummy values since we override methods
                    super().__init__(console=False)
                    pass
            # [/DEF:__init__:Function]

            # [DEF:debug:Function]
@@ -246,9 +253,8 @@ class MigrationPlugin(PluginBase):

        logger.info(f"[MigrationPlugin][State] Resolved environments: {from_env_name} -> {to_env_name}")

        all_clients = setup_clients(logger, custom_envs=environments)
        from_c = all_clients.get(from_env_name)
        to_c = all_clients.get(to_env_name)
        from_c = SupersetClient(src_env)
        to_c = SupersetClient(tgt_env)

        if not from_c or not to_c:
            raise ValueError(f"Clients not initialized for environments: {from_env_name}, {to_env_name}")
@@ -297,9 +303,9 @@ class MigrationPlugin(PluginBase):

        try:
            exported_content, _ = from_c.export_dashboard(dash_id)
            with create_temp_file(content=exported_content, dry_run=True, suffix=".zip", logger=logger) as tmp_zip_path:
            with create_temp_file(content=exported_content, dry_run=True, suffix=".zip") as tmp_zip_path:
                # Always transform to strip databases to avoid password errors
                with create_temp_file(suffix=".zip", dry_run=True, logger=logger) as tmp_new_zip:
                with create_temp_file(suffix=".zip", dry_run=True) as tmp_new_zip:
                    success = engine.transform_zip(str(tmp_zip_path), str(tmp_new_zip), db_mapping, strip_databases=False)

                    if not success and replace_db_config:
@@ -64,6 +64,15 @@ class SearchPlugin(PluginBase):
        return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the search plugin.
    # @RETURN: str - "/tools/search"
    def ui_route(self) -> str:
        with belief_scope("ui_route"):
            return "/tools/search"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for the search plugin parameters.
    # @PRE: Plugin instance exists.
@@ -106,25 +115,13 @@ class SearchPlugin(PluginBase):

            # Get config and initialize client
            from ..dependencies import get_config_manager
            from superset_tool.models import SupersetConfig
            config_manager = get_config_manager()
            env_config = config_manager.get_environment(env_name)
            if not env_config:
                logger.error(f"[SearchPlugin.execute][State] Environment '{env_name}' not found.")
                raise ValueError(f"Environment '{env_name}' not found in configuration.")

            # Map Environment model to SupersetConfig
            superset_config = SupersetConfig(
                env=env_config.name,
                base_url=env_config.url,
                auth={
                    "provider": "db",  # Defaulting to db provider
                    "username": env_config.username,
                    "password": env_config.password,
                    "refresh": "false"
                }
            )
            client = SupersetClient(superset_config)
            client = SupersetClient(env_config)
            client.authenticate()

            logger.info(f"[SearchPlugin.execute][Action] Searching for pattern: '{search_query}' in environment: {env_name}")
3
backend/src/plugins/storage/__init__.py
Normal file
@@ -0,0 +1,3 @@
from .plugin import StoragePlugin

__all__ = ["StoragePlugin"]
333
backend/src/plugins/storage/plugin.py
Normal file
@@ -0,0 +1,333 @@
# [DEF:StoragePlugin:Module]
#
# @SEMANTICS: storage, files, filesystem, plugin
# @PURPOSE: Provides core filesystem operations for managing backups and repositories.
# @LAYER: App
# @RELATION: IMPLEMENTS -> PluginBase
# @RELATION: DEPENDS_ON -> backend.src.models.storage
#
# @INVARIANT: All file operations must be restricted to the configured storage root.

# [SECTION: IMPORTS]
import os
import shutil
from pathlib import Path
from datetime import datetime
from typing import Dict, Any, List, Optional
from fastapi import UploadFile

from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...models.storage import StoredFile, FileCategory, StorageConfig
from ...dependencies import get_config_manager
# [/SECTION]

# [DEF:StoragePlugin:Class]
# @PURPOSE: Implementation of the storage management plugin.
class StoragePlugin(PluginBase):
    """
    Plugin for managing local file storage for backups and repositories.
    """

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the StoragePlugin and ensures required directories exist.
    # @PRE: Configuration manager must be accessible.
    # @POST: Storage root and category directories are created on disk.
    def __init__(self):
        with belief_scope("StoragePlugin:init"):
            self.ensure_directories()
    # [/DEF:__init__:Function]

    @property
    # [DEF:id:Function]
    # @PURPOSE: Returns the unique identifier for the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin ID string.
    # @RETURN: str - "storage-manager"
    def id(self) -> str:
        with belief_scope("StoragePlugin:id"):
            return "storage-manager"
    # [/DEF:id:Function]

    @property
    # [DEF:name:Function]
    # @PURPOSE: Returns the human-readable name of the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin name string.
    # @RETURN: str - "Storage Manager"
    def name(self) -> str:
        with belief_scope("StoragePlugin:name"):
            return "Storage Manager"
    # [/DEF:name:Function]

    @property
    # [DEF:description:Function]
    # @PURPOSE: Returns a description of the storage plugin.
    # @PRE: None.
    # @POST: Returns the plugin description string.
    # @RETURN: str - Plugin description.
    def description(self) -> str:
        with belief_scope("StoragePlugin:description"):
            return "Manages local file storage for backups and repositories."
    # [/DEF:description:Function]

    @property
    # [DEF:version:Function]
    # @PURPOSE: Returns the version of the storage plugin.
    # @PRE: None.
    # @POST: Returns the version string.
    # @RETURN: str - "1.0.0"
    def version(self) -> str:
        with belief_scope("StoragePlugin:version"):
            return "1.0.0"
    # [/DEF:version:Function]

    @property
    # [DEF:ui_route:Function]
    # @PURPOSE: Returns the frontend route for the storage plugin.
    # @RETURN: str - "/tools/storage"
    def ui_route(self) -> str:
        with belief_scope("StoragePlugin:ui_route"):
            return "/tools/storage"
    # [/DEF:ui_route:Function]

    # [DEF:get_schema:Function]
    # @PURPOSE: Returns the JSON schema for storage plugin parameters.
    # @PRE: None.
    # @POST: Returns a dictionary representing the JSON schema.
    # @RETURN: Dict[str, Any] - JSON schema.
    def get_schema(self) -> Dict[str, Any]:
        with belief_scope("StoragePlugin:get_schema"):
            return {
                "type": "object",
                "properties": {
                    "category": {
                        "type": "string",
                        "enum": [c.value for c in FileCategory],
                        "title": "Category"
                    }
                },
                "required": ["category"]
            }
    # [/DEF:get_schema:Function]

    # [DEF:execute:Function]
    # @PURPOSE: Executes storage-related tasks (placeholder for PluginBase compliance).
    # @PRE: params must match the plugin schema.
    # @POST: Task is executed and logged.
    async def execute(self, params: Dict[str, Any]):
        with belief_scope("StoragePlugin:execute"):
            logger.info(f"[StoragePlugin][Action] Executing with params: {params}")
    # [/DEF:execute:Function]

    # [DEF:get_storage_root:Function]
    # @PURPOSE: Resolves the absolute path to the storage root.
    # @PRE: Settings must define a storage root path.
    # @POST: Returns a Path object representing the storage root.
    def get_storage_root(self) -> Path:
        with belief_scope("StoragePlugin:get_storage_root"):
            config_manager = get_config_manager()
            global_settings = config_manager.get_config().settings

            # Use storage.root_path as the source of truth for the storage UI
            root = Path(global_settings.storage.root_path)

            if not root.is_absolute():
                # Resolve relative paths against the backend/ directory:
                # Path(__file__) is backend/src/plugins/storage/plugin.py,
                # so parents[3] is backend/ (the directory the app runs from)
                project_root = Path(__file__).parents[3]
                root = (project_root / root).resolve()
            return root
    # [/DEF:get_storage_root:Function]

    # [DEF:resolve_path:Function]
    # @PURPOSE: Resolves a dynamic path pattern using provided variables.
    # @PARAM: pattern (str) - The path pattern to resolve.
    # @PARAM: variables (Dict[str, str]) - Variables to substitute in the pattern.
    # @PRE: pattern must be a valid format string.
    # @POST: Returns the resolved path string.
    # @RETURN: str - The resolved path.
    def resolve_path(self, pattern: str, variables: Dict[str, str]) -> str:
        with belief_scope("StoragePlugin:resolve_path"):
            # Add common variables
            vars_with_defaults = {
                "timestamp": datetime.now().strftime("%Y%m%dT%H%M%S"),
                **variables
            }
            try:
                resolved = pattern.format(**vars_with_defaults)
                # Clean up any double slashes or leading/trailing slashes for relative path
                return os.path.normpath(resolved).strip("/")
            except KeyError as e:
                logger.warning(f"[StoragePlugin][Coherence:Failed] Missing variable for path resolution: {e}")
                # Fall back to the literal pattern if formatting fails partially (or handle as needed)
                return pattern.replace("{", "").replace("}", "")
    # [/DEF:resolve_path:Function]
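
A usage sketch for resolve_path (values hypothetical; the timestamp default comes from the method itself):

# Hedged sketch: known variables are substituted; a missing key falls back
# to the pattern with braces stripped.
plugin = StoragePlugin()
print(plugin.resolve_path("{category}/{name}_{timestamp}",
                          {"category": "backups", "name": "prod"}))
# -> e.g. "backups/prod_20251219T120000"
print(plugin.resolve_path("{category}/{missing}", {"category": "backups"}))
# -> "category/missing" (fallback path)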

    # [DEF:ensure_directories:Function]
    # @PURPOSE: Creates the storage root and category subdirectories if they don't exist.
    # @PRE: Storage root must be resolvable.
    # @POST: Directories are created on the filesystem.
    # @SIDE_EFFECT: Creates directories on the filesystem.
    def ensure_directories(self):
        with belief_scope("StoragePlugin:ensure_directories"):
            root = self.get_storage_root()
            for category in FileCategory:
                # The category value is used directly as the subdirectory name,
                # matching the layout produced by BackupPlugin and GitService
                path = root / category.value
                path.mkdir(parents=True, exist_ok=True)
                logger.debug(f"[StoragePlugin][Action] Ensured directory: {path}")
    # [/DEF:ensure_directories:Function]

    # [DEF:validate_path:Function]
    # @PURPOSE: Prevents path traversal attacks by ensuring the path is within the storage root.
    # @PRE: path must be a Path object.
    # @POST: Returns the resolved absolute path if valid, otherwise raises ValueError.
    def validate_path(self, path: Path) -> Path:
        with belief_scope("StoragePlugin:validate_path"):
            root = self.get_storage_root().resolve()
            resolved = path.resolve()
            try:
                resolved.relative_to(root)
            except ValueError:
                logger.error(f"[StoragePlugin][Coherence:Failed] Path traversal detected: {resolved} is not under {root}")
                raise ValueError("Access denied: Path is outside of storage root.")
            return resolved
    # [/DEF:validate_path:Function]
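
A traversal-rejection sketch for validate_path (paths hypothetical):

# Hedged sketch: anything resolving outside the storage root raises ValueError.
plugin = StoragePlugin()
root = plugin.get_storage_root()
ok = plugin.validate_path(root / "backups" / "a.zip")   # returns the absolute path
try:
    plugin.validate_path(root / ".." / "etc" / "passwd")
except ValueError as e:
    print(e)  # "Access denied: Path is outside of storage root."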

    # [DEF:list_files:Function]
    # @PURPOSE: Lists all files and directories in a specific category and subpath.
    # @PARAM: category (Optional[FileCategory]) - The category to list.
    # @PARAM: subpath (Optional[str]) - Nested path within the category.
    # @PRE: Storage root must exist.
    # @POST: Returns a list of StoredFile objects.
    # @RETURN: List[StoredFile] - List of file and directory metadata objects.
    def list_files(self, category: Optional[FileCategory] = None, subpath: Optional[str] = None) -> List[StoredFile]:
        with belief_scope("StoragePlugin:list_files"):
            root = self.get_storage_root()
            logger.info(f"[StoragePlugin][Action] Listing files in root: {root}, category: {category}, subpath: {subpath}")
            files = []

            categories = [category] if category else list(FileCategory)

            for cat in categories:
                # Scan the category subfolder + optional subpath
                base_dir = root / cat.value
                if subpath:
                    target_dir = self.validate_path(base_dir / subpath)
                else:
                    target_dir = base_dir

                if not target_dir.exists():
                    continue

                logger.debug(f"[StoragePlugin][Action] Scanning directory: {target_dir}")

                # Use os.scandir for better performance and to distinguish files vs dirs
                with os.scandir(target_dir) as it:
                    for entry in it:
                        # Skip logs
                        if "Logs" in entry.path:
                            continue

                        stat = entry.stat()
                        is_dir = entry.is_dir()

                        files.append(StoredFile(
                            name=entry.name,
                            path=str(Path(entry.path).relative_to(root)),
                            size=stat.st_size if not is_dir else 0,
                            created_at=datetime.fromtimestamp(stat.st_ctime),
                            category=cat,
                            mime_type="directory" if is_dir else None
                        ))

            # Sort: directories first, then by name
            return sorted(files, key=lambda x: (x.mime_type != "directory", x.name))
    # [/DEF:list_files:Function]

    # [DEF:save_file:Function]
    # @PURPOSE: Saves an uploaded file to the specified category and optional subpath.
    # @PARAM: file (UploadFile) - The uploaded file.
    # @PARAM: category (FileCategory) - The target category.
    # @PARAM: subpath (Optional[str]) - The target subpath.
    # @PRE: file must be a valid UploadFile; category must be valid.
    # @POST: File is written to disk and metadata is returned.
    # @RETURN: StoredFile - Metadata of the saved file.
    # @SIDE_EFFECT: Writes file to disk.
    async def save_file(self, file: UploadFile, category: FileCategory, subpath: Optional[str] = None) -> StoredFile:
        with belief_scope("StoragePlugin:save_file"):
            root = self.get_storage_root()
            dest_dir = root / category.value
            if subpath:
                dest_dir = dest_dir / subpath

            dest_dir.mkdir(parents=True, exist_ok=True)

            dest_path = self.validate_path(dest_dir / file.filename)

            with dest_path.open("wb") as buffer:
                shutil.copyfileobj(file.file, buffer)

            stat = dest_path.stat()
            return StoredFile(
                name=dest_path.name,
                path=str(dest_path.relative_to(root)),
                size=stat.st_size,
                created_at=datetime.fromtimestamp(stat.st_ctime),
                category=category,
                mime_type=file.content_type
            )
    # [/DEF:save_file:Function]

    # [DEF:delete_file:Function]
    # @PURPOSE: Deletes a file or directory from the specified category and path.
    # @PARAM: category (FileCategory) - The category.
    # @PARAM: path (str) - The relative path of the file or directory.
    # @PRE: path must belong to the specified category and exist on disk.
    # @POST: The file or directory is removed from disk.
    # @SIDE_EFFECT: Removes item from disk.
    def delete_file(self, category: FileCategory, path: str):
        with belief_scope("StoragePlugin:delete_file"):
            root = self.get_storage_root()
            # path is relative to root, but we ensure it starts with the category
            full_path = self.validate_path(root / path)

            if not str(Path(path)).startswith(category.value):
                raise ValueError(f"Path {path} does not belong to category {category}")

            if full_path.exists():
                if full_path.is_dir():
                    shutil.rmtree(full_path)
                else:
                    full_path.unlink()
                logger.info(f"[StoragePlugin][Action] Deleted: {full_path}")
            else:
                raise FileNotFoundError(f"Item {path} not found")
    # [/DEF:delete_file:Function]

    # [DEF:get_file_path:Function]
    # @PURPOSE: Returns the absolute path of a file for download.
    # @PARAM: category (FileCategory) - The category.
    # @PARAM: path (str) - The relative path of the file.
    # @PRE: path must belong to the specified category and be a file.
    # @POST: Returns the absolute Path to the file.
    # @RETURN: Path - Absolute path to the file.
    def get_file_path(self, category: FileCategory, path: str) -> Path:
        with belief_scope("StoragePlugin:get_file_path"):
            root = self.get_storage_root()
            file_path = self.validate_path(root / path)

            if not str(Path(path)).startswith(category.value):
                raise ValueError(f"Path {path} does not belong to category {category}")

            if not file_path.exists() or file_path.is_dir():
                raise FileNotFoundError(f"File {path} not found")

            return file_path
    # [/DEF:get_file_path:Function]

# [/DEF:StoragePlugin:Class]
# [/DEF:StoragePlugin:Module]
128
backend/src/schemas/auth.py
Normal file
@@ -0,0 +1,128 @@
# [DEF:backend.src.schemas.auth:Module]
#
# @TIER: STANDARD
# @SEMANTICS: auth, schemas, pydantic, user, token
# @PURPOSE: Pydantic schemas for authentication requests and responses.
# @LAYER: API
# @RELATION: DEPENDS_ON -> pydantic
#
# @INVARIANT: Sensitive fields like password must not be included in response schemas.

# [SECTION: IMPORTS]
from typing import List, Optional
from pydantic import BaseModel, EmailStr, Field
from datetime import datetime
# [/SECTION]

# [DEF:Token:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents a JWT access token response.
class Token(BaseModel):
    access_token: str
    token_type: str
# [/DEF:Token:Class]

# [DEF:TokenData:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents the data encoded in a JWT token.
class TokenData(BaseModel):
    username: Optional[str] = None
    scopes: List[str] = []
# [/DEF:TokenData:Class]

# [DEF:PermissionSchema:Class]
# @TIER: TRIVIAL
# @PURPOSE: Represents a permission in API responses.
class PermissionSchema(BaseModel):
    id: Optional[str] = None
    resource: str
    action: str

    class Config:
        from_attributes = True
# [/DEF:PermissionSchema:Class]

# [DEF:RoleSchema:Class]
# @PURPOSE: Represents a role in API responses.
class RoleSchema(BaseModel):
    id: str
    name: str
    description: Optional[str] = None
    permissions: List[PermissionSchema] = []

    class Config:
        from_attributes = True
# [/DEF:RoleSchema:Class]

# [DEF:RoleCreate:Class]
# @PURPOSE: Schema for creating a new role.
class RoleCreate(BaseModel):
    name: str
    description: Optional[str] = None
    permissions: List[str] = []  # List of permission IDs or "resource:action" strings
# [/DEF:RoleCreate:Class]

# [DEF:RoleUpdate:Class]
# @PURPOSE: Schema for updating an existing role.
class RoleUpdate(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    permissions: Optional[List[str]] = None
# [/DEF:RoleUpdate:Class]

# [DEF:ADGroupMappingSchema:Class]
# @PURPOSE: Represents an AD Group to Role mapping in API responses.
class ADGroupMappingSchema(BaseModel):
    id: str
    ad_group: str
    role_id: str

    class Config:
        from_attributes = True
# [/DEF:ADGroupMappingSchema:Class]

# [DEF:ADGroupMappingCreate:Class]
# @PURPOSE: Schema for creating an AD Group mapping.
class ADGroupMappingCreate(BaseModel):
    ad_group: str
    role_id: str
# [/DEF:ADGroupMappingCreate:Class]

# [DEF:UserBase:Class]
# @PURPOSE: Base schema for user data.
class UserBase(BaseModel):
    username: str
    email: Optional[EmailStr] = None
    is_active: bool = True
# [/DEF:UserBase:Class]

# [DEF:UserCreate:Class]
# @PURPOSE: Schema for creating a new user.
class UserCreate(UserBase):
    password: str
    roles: List[str] = []
# [/DEF:UserCreate:Class]

# [DEF:UserUpdate:Class]
# @PURPOSE: Schema for updating an existing user.
class UserUpdate(BaseModel):
    email: Optional[EmailStr] = None
    password: Optional[str] = None
    is_active: Optional[bool] = None
    roles: Optional[List[str]] = None
# [/DEF:UserUpdate:Class]

# [DEF:User:Class]
# @PURPOSE: Schema for user data in API responses.
class User(UserBase):
    id: str
    auth_source: str
    created_at: datetime
    last_login: Optional[datetime] = None
    roles: List[RoleSchema] = []

    class Config:
        from_attributes = True
# [/DEF:User:Class]

# [/DEF:backend.src.schemas.auth:Module]
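
A validation sketch for the user schemas (values hypothetical; EmailStr requires the optional email-validator package):

# Hedged sketch: pydantic enforces the payload shape at the API boundary.
payload = {"username": "jdoe", "email": "jdoe@example.com",
           "password": "s3cret", "roles": ["Viewer"]}
user_in = UserCreate(**payload)
assert user_in.is_active  # defaults to True; password never appears in User responses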
82
backend/src/scripts/create_admin.py
Normal file
@@ -0,0 +1,82 @@
# [DEF:backend.src.scripts.create_admin:Module]
#
# @SEMANTICS: admin, setup, user, auth, cli
# @PURPOSE: CLI tool for creating the initial admin user.
# @LAYER: Scripts
# @RELATION: USES -> backend.src.core.auth.security
# @RELATION: USES -> backend.src.core.database
# @RELATION: USES -> backend.src.models.auth
#
# @INVARIANT: Admin user must have the "Admin" role.

# [SECTION: IMPORTS]
import sys
import argparse
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import AuthSessionLocal, init_db
from src.core.auth.security import get_password_hash
from src.models.auth import User, Role, Permission
from src.core.logger import logger, belief_scope
# [/SECTION]

# [DEF:create_admin:Function]
# @PURPOSE: Creates an admin user and necessary roles/permissions.
# @PRE: username and password provided via CLI.
# @POST: Admin user exists in auth.db.
#
# @PARAM: username (str) - Admin username.
# @PARAM: password (str) - Admin password.
def create_admin(username, password):
    with belief_scope("create_admin"):
        db = AuthSessionLocal()
        try:
            # 1. Ensure Admin role exists
            admin_role = db.query(Role).filter(Role.name == "Admin").first()
            if not admin_role:
                logger.info("Creating Admin role...")
                admin_role = Role(name="Admin", description="System Administrator")
                db.add(admin_role)
                db.commit()
                db.refresh(admin_role)

            # 2. Check if user already exists
            existing_user = db.query(User).filter(User.username == username).first()
            if existing_user:
                logger.warning(f"User {username} already exists.")
                return

            # 3. Create Admin user
            logger.info(f"Creating admin user: {username}")
            new_user = User(
                username=username,
                password_hash=get_password_hash(password),
                auth_source="LOCAL",
                is_active=True
            )
            new_user.roles.append(admin_role)
            db.add(new_user)
            db.commit()
            logger.info(f"Admin user {username} created successfully.")

        except Exception as e:
            logger.error(f"Failed to create admin user: {e}")
            db.rollback()
        finally:
            db.close()
# [/DEF:create_admin:Function]

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Create initial admin user")
    parser.add_argument("--username", required=True, help="Admin username")
    parser.add_argument("--password", required=True, help="Admin password")
    args = parser.parse_args()

    # Ensure DB is initialized before creating admin
    init_db()
    create_admin(args.username, args.password)

# [/DEF:backend.src.scripts.create_admin:Module]
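
Typical invocation, run from the backend/ directory: `python src/scripts/create_admin.py --username admin --password 'change-me'`. Re-running is safe; existing users are skipped with a warning.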
44
backend/src/scripts/init_auth_db.py
Normal file
@@ -0,0 +1,44 @@
# [DEF:backend.src.scripts.init_auth_db:Module]
#
# @SEMANTICS: setup, database, auth, migration
# @PURPOSE: Initializes the auth database and creates the necessary tables.
# @LAYER: Scripts
# @RELATION: CALLS -> backend.src.core.database.init_db
#
# @INVARIANT: Safe to run multiple times (idempotent).

# [SECTION: IMPORTS]
import sys
import os
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import init_db, auth_engine
from src.core.logger import logger, belief_scope
from src.scripts.seed_permissions import seed_permissions
# [/SECTION]

# [DEF:run_init:Function]
# @PURPOSE: Main entry point for the initialization script.
# @POST: auth.db is initialized with the correct schema and seeded permissions.
def run_init():
    with belief_scope("init_auth_db"):
        logger.info("Initializing authentication database...")
        try:
            init_db()
            logger.info("Authentication database initialized successfully.")

            # Seed permissions
            seed_permissions()

        except Exception as e:
            logger.error(f"Failed to initialize authentication database: {e}")
            sys.exit(1)
# [/DEF:run_init:Function]

if __name__ == "__main__":
    run_init()

# [/DEF:backend.src.scripts.init_auth_db:Module]
|
||||
backend/src/scripts/seed_permissions.py (new file, 116 lines)
@@ -0,0 +1,116 @@
# [DEF:backend.src.scripts.seed_permissions:Module]
#
# @SEMANTICS: setup, database, auth, permissions, seeding
# @PURPOSE: Populates the auth database with initial system permissions.
# @LAYER: Scripts
# @RELATION: USES -> backend.src.core.database.get_auth_db
# @RELATION: USES -> backend.src.models.auth.Permission
#
# @INVARIANT: Safe to run multiple times (idempotent).

# [SECTION: IMPORTS]
import sys
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import AuthSessionLocal
from src.models.auth import Permission, Role
from src.core.auth.repository import AuthRepository
from src.core.logger import logger, belief_scope
# [/SECTION]

# [DEF:INITIAL_PERMISSIONS:Constant]
INITIAL_PERMISSIONS = [
    # Admin Permissions
    {"resource": "admin:users", "action": "READ"},
    {"resource": "admin:users", "action": "WRITE"},
    {"resource": "admin:roles", "action": "READ"},
    {"resource": "admin:roles", "action": "WRITE"},
    {"resource": "admin:settings", "action": "READ"},
    {"resource": "admin:settings", "action": "WRITE"},
    {"resource": "environments", "action": "READ"},
    {"resource": "plugins", "action": "READ"},
    {"resource": "tasks", "action": "READ"},
    {"resource": "tasks", "action": "WRITE"},

    # Plugin Permissions
    {"resource": "plugin:backup", "action": "EXECUTE"},
    {"resource": "plugin:migration", "action": "EXECUTE"},
    {"resource": "plugin:mapper", "action": "EXECUTE"},
    {"resource": "plugin:search", "action": "EXECUTE"},
    {"resource": "plugin:git", "action": "EXECUTE"},
    {"resource": "plugin:storage", "action": "EXECUTE"},
    {"resource": "plugin:storage", "action": "READ"},
    {"resource": "plugin:storage", "action": "WRITE"},
    {"resource": "plugin:debug", "action": "EXECUTE"},
]
# [/DEF:INITIAL_PERMISSIONS:Constant]

# [DEF:seed_permissions:Function]
# @PURPOSE: Inserts missing permissions into the database.
# @POST: All INITIAL_PERMISSIONS exist in the DB.
def seed_permissions():
    with belief_scope("seed_permissions"):
        db = AuthSessionLocal()
        try:
            logger.info("Seeding permissions...")
            count = 0
            for perm_data in INITIAL_PERMISSIONS:
                exists = db.query(Permission).filter(
                    Permission.resource == perm_data["resource"],
                    Permission.action == perm_data["action"]
                ).first()

                if not exists:
                    new_perm = Permission(
                        resource=perm_data["resource"],
                        action=perm_data["action"]
                    )
                    db.add(new_perm)
                    count += 1

            db.commit()
            logger.info(f"Seeding completed. Added {count} new permissions.")

            # Assign permissions to User role
            repo = AuthRepository(db)
            user_role = repo.get_role_by_name("User")
            if not user_role:
                user_role = Role(name="User", description="Standard user with plugin access")
                db.add(user_role)
                db.flush()

            user_permissions = [
                ("plugin:mapper", "EXECUTE"),
                ("plugin:migration", "EXECUTE"),
                ("plugin:backup", "EXECUTE"),
                ("plugin:git", "EXECUTE"),
                ("plugin:storage", "READ"),
                ("plugin:storage", "WRITE"),
                ("environments", "READ"),
                ("plugins", "READ"),
                ("tasks", "READ"),
                ("tasks", "WRITE"),
            ]

            for res, act in user_permissions:
                perm = repo.get_permission_by_resource_action(res, act)
                if perm and perm not in user_role.permissions:
                    user_role.permissions.append(perm)

            db.commit()
            logger.info("User role permissions updated.")

        except Exception as e:
            logger.error(f"Failed to seed permissions: {e}")
            db.rollback()
        finally:
            db.close()
# [/DEF:seed_permissions:Function]

if __name__ == "__main__":
    seed_permissions()

# [/DEF:backend.src.scripts.seed_permissions:Module]
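The (resource, action) pairs seeded above imply a role-based check at request time. A minimal sketch of such a check, assuming a User with the roles -> permissions relationships used in this module (the helper name has_permission is hypothetical, not part of the commit):

def has_permission(user, resource: str, action: str) -> bool:
    # True if any role attached to the user carries the (resource, action) pair.
    return any(
        perm.resource == resource and perm.action == action
        for role in user.roles
        for perm in role.permissions
    )

# e.g. has_permission(current_user, "plugin:backup", "EXECUTE")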
backend/src/services/auth_service.py (new file, 115 lines)
@@ -0,0 +1,115 @@
# [DEF:backend.src.services.auth_service:Module]
#
# @SEMANTICS: auth, service, business-logic, login, jwt
# @PURPOSE: Orchestrates authentication business logic.
# @LAYER: Service
# @RELATION: USES -> backend.src.core.auth.repository.AuthRepository
# @RELATION: USES -> backend.src.core.auth.security
# @RELATION: USES -> backend.src.core.auth.jwt
#
# @INVARIANT: Authentication must verify both credentials and account status.

# [SECTION: IMPORTS]
from typing import Optional, Dict, Any, List
from sqlalchemy.orm import Session
from ..models.auth import User, Role
from ..core.auth.repository import AuthRepository
from ..core.auth.security import verify_password, get_password_hash
from ..core.auth.jwt import create_access_token
from ..core.logger import belief_scope
# [/SECTION]

# [DEF:AuthService:Class]
# @PURPOSE: Provides high-level authentication services.
class AuthService:
    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the service with a database session.
    # @PARAM: db (Session) - SQLAlchemy session.
    def __init__(self, db: Session):
        self.repo = AuthRepository(db)
    # [/DEF:__init__:Function]

    # [DEF:authenticate_user:Function]
    # @PURPOSE: Authenticates a user with username and password.
    # @PRE: username and password are provided.
    # @POST: Returns User object if authentication succeeds, else None.
    # @SIDE_EFFECT: Updates last_login timestamp on success.
    # @PARAM: username (str) - The username.
    # @PARAM: password (str) - The plain password.
    # @RETURN: Optional[User] - The authenticated user or None.
    def authenticate_user(self, username: str, password: str) -> Optional[User]:
        with belief_scope("AuthService.authenticate_user"):
            user = self.repo.get_user_by_username(username)
            if not user:
                return None

            if not user.is_active:
                return None

            if not user.password_hash or not verify_password(password, user.password_hash):
                return None

            self.repo.update_last_login(user)
            return user
    # [/DEF:authenticate_user:Function]

    # [DEF:create_session:Function]
    # @PURPOSE: Creates a JWT session for an authenticated user.
    # @PRE: user is a valid User object.
    # @POST: Returns a dictionary with access_token and token_type.
    # @PARAM: user (User) - The authenticated user.
    # @RETURN: Dict[str, str] - Session data.
    def create_session(self, user: User) -> Dict[str, str]:
        with belief_scope("AuthService.create_session"):
            # Collect role names for scopes
            scopes = [role.name for role in user.roles]

            token_data = {
                "sub": user.username,
                "scopes": scopes
            }

            access_token = create_access_token(data=token_data)
            return {
                "access_token": access_token,
                "token_type": "bearer"
            }
    # [/DEF:create_session:Function]

    # [DEF:provision_adfs_user:Function]
    # @PURPOSE: Just-In-Time (JIT) provisioning for ADFS users based on group mappings.
    # @PRE: user_info contains 'upn' (username), 'email', and 'groups'.
    # @POST: User is created/updated and assigned roles based on groups.
    # @PARAM: user_info (Dict[str, Any]) - Claims from ADFS token.
    # @RETURN: User - The provisioned user.
    def provision_adfs_user(self, user_info: Dict[str, Any]) -> User:
        with belief_scope("AuthService.provision_adfs_user"):
            username = user_info.get("upn") or user_info.get("email")
            email = user_info.get("email")
            ad_groups = user_info.get("groups", [])

            user = self.repo.get_user_by_username(username)
            if not user:
                user = User(
                    username=username,
                    email=email,
                    auth_source="ADFS",
                    is_active=True
                )
                self.repo.db.add(user)

            # Update roles based on group mappings
            from ..models.auth import ADGroupMapping
            mapped_roles = self.repo.db.query(Role).join(ADGroupMapping).filter(
                ADGroupMapping.ad_group.in_(ad_groups)
            ).all()

            user.roles = mapped_roles
            self.repo.db.commit()
            self.repo.db.refresh(user)
            return user
    # [/DEF:provision_adfs_user:Function]

# [/DEF:AuthService:Class]

# [/DEF:backend.src.services.auth_service:Module]
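A minimal sketch of how authenticate_user and create_session compose into a login endpoint; the route path and request shape are assumptions for illustration, only AuthService and get_auth_db come from this codebase:

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from src.core.database import get_auth_db
from src.services.auth_service import AuthService

router = APIRouter()

@router.post("/auth/login")
def login(username: str, password: str, db: Session = Depends(get_auth_db)):
    service = AuthService(db)
    user = service.authenticate_user(username, password)
    if user is None:
        # Unknown user, inactive account, and bad password all return None above.
        raise HTTPException(status_code=401, detail="Invalid credentials")
    return service.create_session(user)  # {"access_token": ..., "token_type": "bearer"}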
backend/src/services/git_service.py (new file, 413 lines)
@@ -0,0 +1,413 @@
# [DEF:backend.src.services.git_service:Module]
#
# @SEMANTICS: git, service, gitpython, repository, version_control
# @PURPOSE: Core Git logic using GitPython to manage dashboard repositories.
# @LAYER: Service
# @RELATION: INHERITS_FROM -> None
# @RELATION: USED_BY -> src.api.routes.git
# @RELATION: USED_BY -> src.plugins.git_plugin
#
# @INVARIANT: All Git operations must be performed on a valid local directory.

import os
import httpx
from git import Repo
from fastapi import HTTPException
from typing import List, Optional
from datetime import datetime
from src.core.logger import logger, belief_scope
from src.models.git import GitProvider

# [DEF:GitService:Class]
# @PURPOSE: Wrapper for GitPython operations with semantic logging and error handling.
class GitService:
    """
    Wrapper for GitPython operations.
    """

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the GitService with a base path for repositories.
    # @PARAM: base_path (str) - Root directory for all Git clones.
    # @PRE: base_path is a valid string path.
    # @POST: GitService is initialized; base_path directory exists.
    def __init__(self, base_path: str = "git_repos"):
        with belief_scope("GitService.__init__"):
            # Resolve relative to the backend directory
            # Path(__file__) is backend/src/services/git_service.py
            # parents[2] is backend/
            from pathlib import Path
            backend_root = Path(__file__).parents[2]

            self.base_path = str((backend_root / base_path).resolve())
            if not os.path.exists(self.base_path):
                os.makedirs(self.base_path)
    # [/DEF:__init__:Function]

    # [DEF:_get_repo_path:Function]
    # @PURPOSE: Resolves the local filesystem path for a dashboard's repository.
    # @PARAM: dashboard_id (int)
    # @PRE: dashboard_id is an integer.
    # @POST: Returns the absolute or relative path to the dashboard's repo.
    # @RETURN: str
    def _get_repo_path(self, dashboard_id: int) -> str:
        with belief_scope("GitService._get_repo_path"):
            return os.path.join(self.base_path, str(dashboard_id))
    # [/DEF:_get_repo_path:Function]

    # [DEF:init_repo:Function]
    # @PURPOSE: Initialize or clone a repository for a dashboard.
    # @PARAM: dashboard_id (int)
    # @PARAM: remote_url (str)
    # @PARAM: pat (str) - Personal Access Token for authentication.
    # @PRE: dashboard_id is int, remote_url is valid Git URL, pat is provided.
    # @POST: Repository is cloned or opened at the local path.
    # @RETURN: Repo - GitPython Repo object.
    def init_repo(self, dashboard_id: int, remote_url: str, pat: str) -> Repo:
        with belief_scope("GitService.init_repo"):
            repo_path = self._get_repo_path(dashboard_id)

            # Inject PAT into remote URL if needed
            if pat and "://" in remote_url:
                proto, rest = remote_url.split("://", 1)
                auth_url = f"{proto}://oauth2:{pat}@{rest}"
            else:
                auth_url = remote_url

            if os.path.exists(repo_path):
                logger.info(f"[init_repo][Action] Opening existing repo at {repo_path}")
                return Repo(repo_path)

            logger.info(f"[init_repo][Action] Cloning {remote_url} to {repo_path}")
            return Repo.clone_from(auth_url, repo_path)
    # [/DEF:init_repo:Function]

    # [DEF:get_repo:Function]
    # @PURPOSE: Get Repo object for a dashboard.
    # @PRE: Repository must exist on disk for the given dashboard_id.
    # @POST: Returns a GitPython Repo instance for the dashboard.
    # @RETURN: Repo
    def get_repo(self, dashboard_id: int) -> Repo:
        with belief_scope("GitService.get_repo"):
            repo_path = self._get_repo_path(dashboard_id)
            if not os.path.exists(repo_path):
                logger.error(f"[get_repo][Coherence:Failed] Repository for dashboard {dashboard_id} does not exist")
                raise HTTPException(status_code=404, detail=f"Repository for dashboard {dashboard_id} not found")
            try:
                return Repo(repo_path)
            except Exception as e:
                logger.error(f"[get_repo][Coherence:Failed] Failed to open repository at {repo_path}: {e}")
                raise HTTPException(status_code=500, detail="Failed to open local Git repository")
    # [/DEF:get_repo:Function]

    # [DEF:list_branches:Function]
    # @PURPOSE: List all branches for a dashboard's repository.
    # @PRE: Repository for dashboard_id exists.
    # @POST: Returns a list of branch metadata dictionaries.
    # @RETURN: List[dict]
    def list_branches(self, dashboard_id: int) -> List[dict]:
        with belief_scope("GitService.list_branches"):
            repo = self.get_repo(dashboard_id)
            logger.info(f"[list_branches][Action] Listing branches for {dashboard_id}. Refs: {repo.refs}")
            branches = []

            # Add existing refs
            for ref in repo.refs:
                try:
                    # Strip prefixes for UI
                    name = ref.name.replace('refs/heads/', '').replace('refs/remotes/origin/', '')

                    # Avoid duplicates (e.g. local and remote with same name)
                    if any(b['name'] == name for b in branches):
                        continue

                    branches.append({
                        "name": name,
                        "commit_hash": ref.commit.hexsha if hasattr(ref, 'commit') else "0000000",
                        "is_remote": ref.is_remote() if hasattr(ref, 'is_remote') else False,
                        "last_updated": datetime.fromtimestamp(ref.commit.committed_date) if hasattr(ref, 'commit') else datetime.utcnow()
                    })
                except Exception as e:
                    logger.warning(f"[list_branches][Action] Skipping ref {ref}: {e}")

            # Ensure the current active branch is in the list even if it has no commits or refs
            try:
                active_name = repo.active_branch.name
                if not any(b['name'] == active_name for b in branches):
                    branches.append({
                        "name": active_name,
                        "commit_hash": "0000000",
                        "is_remote": False,
                        "last_updated": datetime.utcnow()
                    })
            except Exception as e:
                logger.warning(f"[list_branches][Action] Could not determine active branch: {e}")
                # If everything else failed and list is still empty, add default
                if not branches:
                    branches.append({
                        "name": "main",
                        "commit_hash": "0000000",
                        "is_remote": False,
                        "last_updated": datetime.utcnow()
                    })

            return branches
    # [/DEF:list_branches:Function]

    # [DEF:create_branch:Function]
    # @PURPOSE: Create a new branch from an existing one.
    # @PARAM: name (str) - New branch name.
    # @PARAM: from_branch (str) - Source branch.
    # @PRE: Repository exists; name is valid; from_branch exists or repo is empty.
    # @POST: A new branch is created in the repository.
    def create_branch(self, dashboard_id: int, name: str, from_branch: str = "main"):
        with belief_scope("GitService.create_branch"):
            repo = self.get_repo(dashboard_id)
            logger.info(f"[create_branch][Action] Creating branch {name} from {from_branch}")

            # Handle empty repository case (no commits)
            if not repo.heads and not repo.remotes:
                logger.warning("[create_branch][Action] Repository is empty. Creating initial commit to enable branching.")
                readme_path = os.path.join(repo.working_dir, "README.md")
                if not os.path.exists(readme_path):
                    with open(readme_path, "w") as f:
                        f.write(f"# Dashboard {dashboard_id}\nGit repository for Superset dashboard integration.")
                repo.index.add(["README.md"])
                repo.index.commit("Initial commit")

            # Verify source branch exists
            try:
                repo.commit(from_branch)
            except Exception:
                logger.warning(f"[create_branch][Action] Source branch {from_branch} not found, using HEAD")
                from_branch = "HEAD"

            try:
                new_branch = repo.create_head(name, from_branch)
                return new_branch
            except Exception as e:
                logger.error(f"[create_branch][Coherence:Failed] {e}")
                raise
    # [/DEF:create_branch:Function]

    # [DEF:checkout_branch:Function]
    # @PURPOSE: Switch to a specific branch.
    # @PRE: Repository exists and the specified branch name exists.
    # @POST: The repository working directory is updated to the specified branch.
    def checkout_branch(self, dashboard_id: int, name: str):
        with belief_scope("GitService.checkout_branch"):
            repo = self.get_repo(dashboard_id)
            logger.info(f"[checkout_branch][Action] Checking out branch {name}")
            repo.git.checkout(name)
    # [/DEF:checkout_branch:Function]

    # [DEF:commit_changes:Function]
    # @PURPOSE: Stage and commit changes.
    # @PARAM: message (str) - Commit message.
    # @PARAM: files (List[str]) - Optional list of specific files to stage.
    # @PRE: Repository exists and has changes (dirty) or files are specified.
    # @POST: Changes are staged and a new commit is created.
    def commit_changes(self, dashboard_id: int, message: str, files: Optional[List[str]] = None):
        with belief_scope("GitService.commit_changes"):
            repo = self.get_repo(dashboard_id)

            # Check if there are any changes to commit
            if not repo.is_dirty(untracked_files=True) and not files:
                logger.info(f"[commit_changes][Action] No changes to commit for dashboard {dashboard_id}")
                return

            if files:
                logger.info(f"[commit_changes][Action] Staging files: {files}")
                repo.index.add(files)
            else:
                logger.info("[commit_changes][Action] Staging all changes")
                repo.git.add(A=True)

            repo.index.commit(message)
            logger.info(f"[commit_changes][Coherence:OK] Committed changes with message: {message}")
    # [/DEF:commit_changes:Function]

    # [DEF:push_changes:Function]
    # @PURPOSE: Push local commits to remote.
    # @PRE: Repository exists and has an 'origin' remote.
    # @POST: Local branch commits are pushed to origin.
    def push_changes(self, dashboard_id: int):
        with belief_scope("GitService.push_changes"):
            repo = self.get_repo(dashboard_id)

            # Ensure we have something to push
            if not repo.heads:
                logger.warning(f"[push_changes][Coherence:Failed] No local branches to push for dashboard {dashboard_id}")
                return

            try:
                origin = repo.remote(name='origin')
            except ValueError:
                logger.error(f"[push_changes][Coherence:Failed] Remote 'origin' not found for dashboard {dashboard_id}")
                raise HTTPException(status_code=400, detail="Remote 'origin' not configured")

            # Check if current branch has an upstream
            try:
                current_branch = repo.active_branch
                logger.info(f"[push_changes][Action] Pushing branch {current_branch.name} to origin")
                # Using a timeout for network operations
                push_info = origin.push(refspec=f'{current_branch.name}:{current_branch.name}')
                for info in push_info:
                    if info.flags & info.ERROR:
                        logger.error(f"[push_changes][Coherence:Failed] Error pushing ref {info.remote_ref_string}: {info.summary}")
                        raise Exception(f"Git push error for {info.remote_ref_string}: {info.summary}")
            except Exception as e:
                logger.error(f"[push_changes][Coherence:Failed] Failed to push changes: {e}")
                raise HTTPException(status_code=500, detail=f"Git push failed: {str(e)}")
    # [/DEF:push_changes:Function]

    # [DEF:pull_changes:Function]
    # @PURPOSE: Pull changes from remote.
    # @PRE: Repository exists and has an 'origin' remote.
    # @POST: Changes from origin are pulled and merged into the active branch.
    def pull_changes(self, dashboard_id: int):
        with belief_scope("GitService.pull_changes"):
            repo = self.get_repo(dashboard_id)
            try:
                origin = repo.remote(name='origin')
                logger.info("[pull_changes][Action] Pulling changes from origin")
                fetch_info = origin.pull()
                for info in fetch_info:
                    if info.flags & info.ERROR:
                        logger.error(f"[pull_changes][Coherence:Failed] Error pulling ref {info.ref}: {info.note}")
                        raise Exception(f"Git pull error for {info.ref}: {info.note}")
            except ValueError:
                logger.error(f"[pull_changes][Coherence:Failed] Remote 'origin' not found for dashboard {dashboard_id}")
                raise HTTPException(status_code=400, detail="Remote 'origin' not configured")
            except Exception as e:
                logger.error(f"[pull_changes][Coherence:Failed] Failed to pull changes: {e}")
                raise HTTPException(status_code=500, detail=f"Git pull failed: {str(e)}")
    # [/DEF:pull_changes:Function]

    # [DEF:get_status:Function]
    # @PURPOSE: Get current repository status (dirty files, untracked, etc.)
    # @PRE: Repository for dashboard_id exists.
    # @POST: Returns a dictionary representing the Git status.
    # @RETURN: dict
    def get_status(self, dashboard_id: int) -> dict:
        with belief_scope("GitService.get_status"):
            repo = self.get_repo(dashboard_id)

            # Handle empty repository (no commits)
            has_commits = False
            try:
                repo.head.commit
                has_commits = True
            except Exception:
                has_commits = False

            return {
                "is_dirty": repo.is_dirty(untracked_files=True),
                "untracked_files": repo.untracked_files,
                "modified_files": [item.a_path for item in repo.index.diff(None)],
                "staged_files": [item.a_path for item in repo.index.diff("HEAD")] if has_commits else [],
                "current_branch": repo.active_branch.name
            }
    # [/DEF:get_status:Function]

    # [DEF:get_diff:Function]
    # @PURPOSE: Generate diff for a file or the whole repository.
    # @PARAM: file_path (str) - Optional specific file.
    # @PARAM: staged (bool) - Whether to show staged changes.
    # @PRE: Repository for dashboard_id exists.
    # @POST: Returns the diff text as a string.
    # @RETURN: str
    def get_diff(self, dashboard_id: int, file_path: Optional[str] = None, staged: bool = False) -> str:
        with belief_scope("GitService.get_diff"):
            repo = self.get_repo(dashboard_id)
            diff_args = []
            if staged:
                diff_args.append("--staged")

            if file_path:
                return repo.git.diff(*diff_args, "--", file_path)
            return repo.git.diff(*diff_args)
    # [/DEF:get_diff:Function]

    # [DEF:get_commit_history:Function]
    # @PURPOSE: Retrieve commit history for a repository.
    # @PARAM: limit (int) - Max number of commits to return.
    # @PRE: Repository for dashboard_id exists.
    # @POST: Returns a list of dictionaries for each commit in history.
    # @RETURN: List[dict]
    def get_commit_history(self, dashboard_id: int, limit: int = 50) -> List[dict]:
        with belief_scope("GitService.get_commit_history"):
            repo = self.get_repo(dashboard_id)
            commits = []
            try:
                # Check if there are any commits at all
                if not repo.heads and not repo.remotes:
                    return []

                for commit in repo.iter_commits(max_count=limit):
                    commits.append({
                        "hash": commit.hexsha,
                        "author": commit.author.name,
                        "email": commit.author.email,
                        "timestamp": datetime.fromtimestamp(commit.committed_date),
                        "message": commit.message.strip(),
                        "files_changed": list(commit.stats.files.keys())
                    })
            except Exception as e:
                logger.warning(f"[get_commit_history][Action] Could not retrieve commit history for dashboard {dashboard_id}: {e}")
                return []
            return commits
    # [/DEF:get_commit_history:Function]

    # [DEF:test_connection:Function]
    # @PURPOSE: Test connection to Git provider using PAT.
    # @PARAM: provider (GitProvider)
    # @PARAM: url (str)
    # @PARAM: pat (str)
    # @PRE: provider is valid; url is a valid HTTP(S) URL; pat is provided.
    # @POST: Returns True if connection to the provider's API succeeds.
    # @RETURN: bool
    async def test_connection(self, provider: GitProvider, url: str, pat: str) -> bool:
        with belief_scope("GitService.test_connection"):
            # Check for offline mode or local-only URLs
            if ".local" in url or "localhost" in url:
                logger.info("[test_connection][Action] Local/Offline mode detected for URL")
                return True

            if not url.startswith(('http://', 'https://')):
                logger.error(f"[test_connection][Coherence:Failed] Invalid URL protocol: {url}")
                return False

            if not pat or not pat.strip():
                logger.error("[test_connection][Coherence:Failed] Git PAT is missing or empty")
                return False

            pat = pat.strip()

            try:
                async with httpx.AsyncClient() as client:
                    if provider == GitProvider.GITHUB:
                        headers = {"Authorization": f"token {pat}"}
                        api_url = "https://api.github.com/user" if "github.com" in url else f"{url.rstrip('/')}/api/v3/user"
                        resp = await client.get(api_url, headers=headers)
                    elif provider == GitProvider.GITLAB:
                        headers = {"PRIVATE-TOKEN": pat}
                        api_url = f"{url.rstrip('/')}/api/v4/user"
                        resp = await client.get(api_url, headers=headers)
                    elif provider == GitProvider.GITEA:
                        headers = {"Authorization": f"token {pat}"}
                        api_url = f"{url.rstrip('/')}/api/v1/user"
                        resp = await client.get(api_url, headers=headers)
                    else:
                        return False

                if resp.status_code != 200:
                    logger.error(f"[test_connection][Coherence:Failed] Git connection test failed for {provider} at {api_url}. Status: {resp.status_code}")
                return resp.status_code == 200
            except Exception as e:
                logger.error(f"[test_connection][Coherence:Failed] Error testing git connection: {e}")
                return False
    # [/DEF:test_connection:Function]

# [/DEF:GitService:Class]
# [/DEF:backend.src.services.git_service:Module]
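A minimal usage sketch for GitService; the dashboard id, remote URL, and token below are placeholders, not values from this repository:

from src.services.git_service import GitService

svc = GitService()  # repositories live under backend/git_repos by default
svc.init_repo(42, "https://git.example.com/org/dashboard.git", pat="<PAT>")
svc.create_branch(42, "feature/update", from_branch="main")
svc.checkout_branch(42, "feature/update")
# ... write exported dashboard files into the repo's working directory ...
svc.commit_changes(42, "Update dashboard export")
svc.push_changes(42)
print(svc.get_status(42)["current_branch"])  # -> "feature/update"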
@@ -10,10 +10,9 @@

 # [SECTION: IMPORTS]
 from typing import List, Dict
-from backend.src.core.logger import belief_scope
-from backend.src.core.superset_client import SupersetClient
-from backend.src.core.utils.matching import suggest_mappings
-from superset_tool.models import SupersetConfig
+from ..core.logger import belief_scope
+from ..core.superset_client import SupersetClient
+from ..core.utils.matching import suggest_mappings
 # [/SECTION]

 # [DEF:MappingService:Class]
@@ -43,17 +42,7 @@ class MappingService:
         if not env:
             raise ValueError(f"Environment {env_id} not found")

-        superset_config = SupersetConfig(
-            env=env.name,
-            base_url=env.url,
-            auth={
-                "provider": "db",
-                "username": env.username,
-                "password": env.password,
-                "refresh": "false"
-            }
-        )
-        return SupersetClient(superset_config)
+        return SupersetClient(env)
     # [/DEF:_get_client:Function]

     # [DEF:get_suggestions:Function]
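A brief note on the second hunk above: SupersetClient now accepts the Environment model directly, so the intermediate SupersetConfig construction is gone. A minimal sketch of the simplified call, assuming env is resolved as in the surrounding code:

client = SupersetClient(env)  # env: Environment, already validated above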
BIN backend/tasks.db (binary file not shown)
@@ -1,99 +0,0 @@
#!/usr/bin/env python3
"""Test script to verify the fixes for SupersetClient initialization."""

import sys
sys.path.insert(0, '.')

from src.core.config_manager import ConfigManager
from src.core.config_models import Environment
from src.plugins.search import SearchPlugin
from src.plugins.mapper import MapperPlugin
from src.plugins.debug import DebugPlugin

def test_config_manager():
    """Test ConfigManager methods."""
    print("Testing ConfigManager...")
    try:
        config_manager = ConfigManager()
        print("  ConfigManager initialized")

        # Test get_environment method
        if hasattr(config_manager, 'get_environment'):
            print("  get_environment method exists")

        # Add a test environment if none exists
        if not config_manager.has_environments():
            test_env = Environment(
                id="test-env",
                name="Test Environment",
                url="http://localhost:8088",
                username="admin",
                password="admin"
            )
            config_manager.add_environment(test_env)
            print(f"  Added test environment: {test_env.name}")

        # Test retrieving environment
        envs = config_manager.get_environments()
        if envs:
            test_env_id = envs[0].id
            env_config = config_manager.get_environment(test_env_id)
            print(f"  Successfully retrieved environment: {env_config.name}")
            return True
        else:
            print("  No environments available (add one in settings)")
            return False

    except Exception as e:
        print(f"  Error: {e}")
        return False

def test_plugins():
    """Test plugin initialization."""
    print("\nTesting plugins...")

    plugins = [
        ("Search Plugin", SearchPlugin()),
        ("Mapper Plugin", MapperPlugin()),
        ("Debug Plugin", DebugPlugin())
    ]

    all_ok = True

    for name, plugin in plugins:
        print(f"\nTesting {name}...")
        try:
            plugin_id = plugin.id
            plugin_name = plugin.name
            plugin_version = plugin.version
            schema = plugin.get_schema()

            print(f"  ✓ ID: {plugin_id}")
            print(f"  ✓ Name: {plugin_name}")
            print(f"  ✓ Version: {plugin_version}")
            print(f"  ✓ Schema: {schema}")

        except Exception as e:
            print(f"  ✗ Error: {e}")
            all_ok = False

    return all_ok

def main():
    """Main test function."""
    print("=" * 50)
    print("Superset Tools Fix Verification")
    print("=" * 50)

    config_ok = test_config_manager()
    plugins_ok = test_plugins()

    print("\n" + "=" * 50)
    if config_ok and plugins_ok:
        print("✅ All fixes verified successfully!")
    else:
        print("❌ Some tests failed")
        sys.exit(1)

if __name__ == "__main__":
    main()
backend/tests/test_auth.py (new file, 162 lines)
@@ -0,0 +1,162 @@
import sys
import os
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent / "src"))

import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from src.core.database import Base, get_auth_db
from src.models.auth import User, Role, Permission, ADGroupMapping
from src.services.auth_service import AuthService
from src.core.auth.repository import AuthRepository
from src.core.auth.security import verify_password, get_password_hash

# Create in-memory SQLite database for testing
SQLALCHEMY_DATABASE_URL = "sqlite:///:memory:"

engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False})
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

# Create all tables
Base.metadata.create_all(bind=engine)

@pytest.fixture
def db_session():
    """Create a new database session with a transaction, rollback after test"""
    connection = engine.connect()
    transaction = connection.begin()
    session = TestingSessionLocal(bind=connection)

    yield session

    session.close()
    transaction.rollback()
    connection.close()

@pytest.fixture
def auth_service(db_session):
    return AuthService(db_session)

@pytest.fixture
def auth_repo(db_session):
    return AuthRepository(db_session)

def test_create_user(auth_repo):
    """Test user creation"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    retrieved_user = auth_repo.get_user_by_username("testuser")
    assert retrieved_user is not None
    assert retrieved_user.username == "testuser"
    assert retrieved_user.email == "test@example.com"
    assert verify_password("testpassword123", retrieved_user.password_hash)

def test_authenticate_user(auth_service, auth_repo):
    """Test user authentication with valid and invalid credentials"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    # Test valid credentials
    authenticated_user = auth_service.authenticate_user("testuser", "testpassword123")
    assert authenticated_user is not None
    assert authenticated_user.username == "testuser"

    # Test invalid password
    invalid_user = auth_service.authenticate_user("testuser", "wrongpassword")
    assert invalid_user is None

    # Test invalid username
    invalid_user = auth_service.authenticate_user("nonexistent", "testpassword123")
    assert invalid_user is None

def test_create_session(auth_service, auth_repo):
    """Test session token creation"""
    user = User(
        username="testuser",
        email="test@example.com",
        password_hash=get_password_hash("testpassword123"),
        auth_source="LOCAL"
    )

    auth_repo.db.add(user)
    auth_repo.db.commit()

    session = auth_service.create_session(user)
    assert "access_token" in session
    assert "token_type" in session
    assert session["token_type"] == "bearer"
    assert len(session["access_token"]) > 0

def test_role_permission_association(auth_repo):
    """Test role and permission association"""
    role = Role(name="Admin", description="System administrator")
    perm1 = Permission(resource="admin:users", action="READ")
    perm2 = Permission(resource="admin:users", action="WRITE")

    role.permissions.extend([perm1, perm2])

    auth_repo.db.add(role)
    auth_repo.db.commit()

    retrieved_role = auth_repo.get_role_by_name("Admin")
    assert retrieved_role is not None
    assert len(retrieved_role.permissions) == 2

    permissions = [f"{p.resource}:{p.action}" for p in retrieved_role.permissions]
    assert "admin:users:READ" in permissions
    assert "admin:users:WRITE" in permissions

def test_user_role_association(auth_repo):
    """Test user and role association"""
    role = Role(name="Admin", description="System administrator")
    user = User(
        username="adminuser",
        email="admin@example.com",
        password_hash=get_password_hash("adminpass123"),
        auth_source="LOCAL"
    )

    user.roles.append(role)

    auth_repo.db.add(role)
    auth_repo.db.add(user)
    auth_repo.db.commit()

    retrieved_user = auth_repo.get_user_by_username("adminuser")
    assert retrieved_user is not None
    assert len(retrieved_user.roles) == 1
    assert retrieved_user.roles[0].name == "Admin"

def test_ad_group_mapping(auth_repo):
    """Test AD group mapping"""
    role = Role(name="ADFS_Admin", description="ADFS administrators")

    auth_repo.db.add(role)
    auth_repo.db.commit()

    mapping = ADGroupMapping(ad_group="DOMAIN\\ADFS_Admins", role_id=role.id)

    auth_repo.db.add(mapping)
    auth_repo.db.commit()

    retrieved_mapping = auth_repo.db.query(ADGroupMapping).filter_by(ad_group="DOMAIN\\ADFS_Admins").first()
    assert retrieved_mapping is not None
    assert retrieved_mapping.role_id == role.id
@@ -1,5 +1,5 @@
 import pytest
-from backend.src.core.logger import belief_scope, logger
+from src.core.logger import belief_scope, logger


 # [DEF:test_belief_scope_logs_entry_action_exit:Function]
@@ -1,62 +1,21 @@
 import pytest
-from superset_tool.models import SupersetConfig
-from superset_tool.utils.logger import belief_scope
+from src.core.config_models import Environment
+from src.core.logger import belief_scope

-# [DEF:test_superset_config_url_normalization:Function]
-# @PURPOSE: Tests that SupersetConfig correctly normalizes the base URL.
-# @PRE: SupersetConfig class is available.
-# @POST: URL normalization is verified.
-def test_superset_config_url_normalization():
-    with belief_scope("test_superset_config_url_normalization"):
-        auth = {
-            "provider": "db",
-            "username": "admin",
-            "password": "password",
-            "refresh": "token"
-        }
-
-        # Test with /api/v1 already present
-        config = SupersetConfig(
-            env="dev",
-            base_url="http://localhost:8088/api/v1",
-            auth=auth
-        )
-        assert config.base_url == "http://localhost:8088/api/v1"
-
-        # Test without /api/v1
-        config = SupersetConfig(
-            env="dev",
-            base_url="http://localhost:8088",
-            auth=auth
-        )
-        assert config.base_url == "http://localhost:8088/api/v1"
-
-        # Test with trailing slash
-        config = SupersetConfig(
-            env="dev",
-            base_url="http://localhost:8088/",
-            auth=auth
-        )
-        assert config.base_url == "http://localhost:8088/api/v1"
-# [/DEF:test_superset_config_url_normalization:Function]
-
-# [DEF:test_superset_config_invalid_url:Function]
-# @PURPOSE: Tests that SupersetConfig raises ValueError for invalid URLs.
-# @PRE: SupersetConfig class is available.
-# @POST: ValueError is raised for invalid URLs.
-def test_superset_config_invalid_url():
-    with belief_scope("test_superset_config_invalid_url"):
-        auth = {
-            "provider": "db",
-            "username": "admin",
-            "password": "password",
-            "refresh": "token"
-        }
-
-        with pytest.raises(ValueError, match="Must start with http:// or https://"):
-            SupersetConfig(
-                env="dev",
-                base_url="localhost:8088",
-                auth=auth
-            )
-# [/DEF:test_superset_config_invalid_url:Function]
+# [DEF:test_environment_model:Function]
+# @PURPOSE: Tests that Environment model correctly stores values.
+# @PRE: Environment class is available.
+# @POST: Values are verified.
+def test_environment_model():
+    with belief_scope("test_environment_model"):
+        env = Environment(
+            id="test-id",
+            name="test-env",
+            url="http://localhost:8088/api/v1",
+            username="admin",
+            password="password"
+        )
+        assert env.id == "test-id"
+        assert env.name == "test-env"
+        assert env.url == "http://localhost:8088/api/v1"
+# [/DEF:test_environment_model:Function]
@@ -0,0 +1,55 @@
slice_name: "FI-0083 \u0421\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u043A\u0430\
  \ \u043F\u043E \u0414\u0417/\u041F\u0414\u0417"
description: null
certified_by: null
certification_details: null
viz_type: pivot_table_v2
params:
  datasource: 859__table
  viz_type: pivot_table_v2
  slice_id: 4019
  groupbyColumns:
  - dt
  groupbyRows:
  - counterparty_search_name
  - attribute
  time_grain_sqla: P1M
  temporal_columns_lookup:
    dt: true
  metrics:
  - m_debt_amount
  - m_overdue_amount
  metricsLayout: COLUMNS
  adhoc_filters:
  - clause: WHERE
    comparator: No filter
    expressionType: SIMPLE
    operator: TEMPORAL_RANGE
    subject: dt
  row_limit: '90000'
  order_desc: false
  aggregateFunction: Sum
  combineMetric: true
  valueFormat: SMART_NUMBER
  date_format: smart_date
  rowOrder: key_a_to_z
  colOrder: key_a_to_z
  value_font_size: 12
  header_font_size: 12
  label_align: left
  column_config:
    m_debt_amount:
      d3NumberFormat: ',d'
    m_overdue_amount:
      d3NumberFormat: ',d'
  conditional_formatting: []
  extra_form_data: {}
  dashboards:
  - 184
query_context: '{"datasource":{"id":859,"type":"table"},"force":false,"queries":[{"filters":[{"col":"dt","op":"TEMPORAL_RANGE","val":"No
  filter"}],"extras":{"having":"","where":""},"applied_time_extras":{},"columns":[{"timeGrain":"P1M","columnType":"BASE_AXIS","sqlExpression":"dt","label":"dt","expressionType":"SQL"},"counterparty_search_name","attribute"],"metrics":["m_debt_amount","m_overdue_amount"],"orderby":[["m_debt_amount",true]],"annotation_layers":[],"row_limit":90000,"series_limit":0,"order_desc":false,"url_params":{},"custom_params":{},"custom_form_data":{}}],"form_data":{"datasource":"859__table","viz_type":"pivot_table_v2","slice_id":4019,"groupbyColumns":["dt"],"groupbyRows":["counterparty_search_name","attribute"],"time_grain_sqla":"P1M","temporal_columns_lookup":{"dt":true},"metrics":["m_debt_amount","m_overdue_amount"],"metricsLayout":"COLUMNS","adhoc_filters":[{"clause":"WHERE","comparator":"No
  filter","expressionType":"SIMPLE","operator":"TEMPORAL_RANGE","subject":"dt"}],"row_limit":"90000","order_desc":false,"aggregateFunction":"Sum","combineMetric":true,"valueFormat":"SMART_NUMBER","date_format":"smart_date","rowOrder":"key_a_to_z","colOrder":"key_a_to_z","value_font_size":12,"header_font_size":12,"label_align":"left","column_config":{"m_debt_amount":{"d3NumberFormat":",d"},"m_overdue_amount":{"d3NumberFormat":",d"}},"conditional_formatting":[],"extra_form_data":{},"dashboards":[184],"force":false,"result_format":"json","result_type":"full"},"result_format":"json","result_type":"full"}'
cache_timeout: null
uuid: 9c293065-73e2-4d9b-a175-d188ff8ef575
version: 1.0.0
dataset_uuid: 9e645dc0-da25-4f61-9465-6e649b0bc4b1
@@ -0,0 +1,13 @@
database_name: Prod Clickhouse
sqlalchemy_uri: clickhousedb+connect://viz_superset_click_prod:XXXXXXXXXX@rgm-s-khclk.hq.root.ad:443/dm
cache_timeout: null
expose_in_sqllab: true
allow_run_async: false
allow_ctas: false
allow_cvas: false
allow_dml: true
allow_file_upload: false
extra:
  allows_virtual_table_explore: true
uuid: 97aced68-326a-4094-b381-27980560efa9
version: 1.0.0
@@ -0,0 +1,119 @@
table_name: "FI-0080-06 \u041A\u0430\u043B\u0435\u043D\u0434\u0430\u0440\u044C (\u041E\
  \u0431\u0449\u0438\u0439 \u0441\u043F\u0440\u0430\u0432\u043E\u0447\u043D\u0438\u043A\
  )"
main_dttm_col: null
description: null
default_endpoint: null
offset: 0
cache_timeout: null
schema: dm_view
sql: "-- [HEADER]\r\n-- [\u041D\u0410\u0417\u041D\u0410\u0427\u0415\u041D\u0418\u0415\
  ]: \u041F\u043E\u043B\u0443\u0447\u0435\u043D\u0438\u0435 \u0434\u0438\u0430\u043F\
  \u0430\u0437\u043E\u043D\u0430 \u0434\u0430\u0442 \u0434\u043B\u044F \u043E\u0442\
  \u0447\u0435\u0442\u0430 \u043E \u0437\u0430\u0434\u043E\u043B\u0436\u0435\u043D\
  \u043D\u043E\u0441\u0442\u044F\u0445 \u043F\u043E \u043E\u0431\u043E\u0440\u043E\
  \u0442\u043D\u044B\u043C \u0441\u0440\u0435\u0434\u0441\u0442\u0432\u0430\u043C\r\
  \n-- [\u041A\u041B\u042E\u0427\u0415\u0412\u042B\u0415 \u041A\u041E\u041B\u041E\u041D\
  \u041A\u0418]:\r\n-- - from_dt_txt: \u041D\u0430\u0447\u0430\u043B\u044C\u043D\
  \u0430\u044F \u0434\u0430\u0442\u0430 \u0432 \u0444\u043E\u0440\u043C\u0430\u0442\
  \u0435 DD.MM.YYYY\r\n-- - to_dt_txt: \u041A\u043E\u043D\u0435\u0447\u043D\u0430\
  \u044F \u0434\u0430\u0442\u0430 \u0432 \u0444\u043E\u0440\u043C\u0430\u0442\u0435\
  \ DD.MM.YYYY\r\n-- [JINJA \u041F\u0410\u0420\u0410\u041C\u0415\u0422\u0420\u042B\
  ]:\r\n-- - {{ filter_values(\"yes_no_check\") }}: \u0424\u0438\u043B\u044C\u0442\
  \u0440 \"\u0414\u0430/\u041D\u0435\u0442\" \u0434\u043B\u044F \u043E\u0433\u0440\
  \u0430\u043D\u0438\u0447\u0435\u043D\u0438\u044F \u0432\u044B\u0431\u043E\u0440\u043A\
  \u0438 \u043F\u043E \u0434\u0430\u0442\u0435\r\n-- [\u041B\u041E\u0413\u0418\u041A\
  \u0410]: \u041E\u043F\u0440\u0435\u0434\u0435\u043B\u044F\u0435\u0442 \u043F\u043E\
  \u0440\u043E\u0433\u043E\u0432\u0443\u044E \u0434\u0430\u0442\u0443 \u0432 \u0437\
  \u0430\u0432\u0438\u0441\u0438\u043C\u043E\u0441\u0442\u0438 \u043E\u0442 \u0442\
  \u0435\u043A\u0443\u0449\u0435\u0433\u043E \u0434\u043D\u044F \u043C\u0435\u0441\
  \u044F\u0446\u0430 \u0438 \u0444\u0438\u043B\u044C\u0442\u0440\u0443\u0435\u0442\
  \ \u0434\u0430\u043D\u043D\u044B\u0435\r\n\r\nWITH date_threshold AS (\r\n SELECT\
  \ \r\n -- \u041E\u043F\u0440\u0435\u0434\u0435\u043B\u044F\u0435\u043C \u043F\
  \u043E\u0440\u043E\u0433\u043E\u0432\u0443\u044E \u0434\u0430\u0442\u0443 \u0432\
  \ \u0437\u0430\u0432\u0438\u0441\u0438\u043C\u043E\u0441\u0442\u0438 \u043E\u0442\
  \ \u0442\u0435\u043A\u0443\u0449\u0435\u0433\u043E \u0434\u043D\u044F \r\n \
  \ CASE \r\n WHEN toDayOfMonth(now()) <= 10 THEN \r\n \
  \ toStartOfMonth(dateSub(MONTH, 1, now())) \r\n ELSE \r\n \
  \ toStartOfMonth(now()) \r\n END AS cutoff_date \r\n),\r\nfiltered_dates\
  \ AS (\r\n SELECT \r\n dt,\r\n formatDateTime(dt, '%d.%m.%Y') AS\
  \ from_dt_txt,\r\n formatDateTime(dt, '%d.%m.%Y') AS to_dt_txt\r\n \
  \ --dt as from_dt_txt,\r\n -- dt as to_dt_txt\r\n FROM dm_view.account_debt_for_working_capital_final\r\
  \n WHERE 1=1\r\n -- \u0411\u0435\u0437\u043E\u043F\u0430\u0441\u043D\u0430\
  \u044F \u043F\u0440\u043E\u0432\u0435\u0440\u043A\u0430 \u0444\u0438\u043B\u044C\
  \u0442\u0440\u0430\r\n {% if filter_values(\"yes_no_check\") | length !=\
  \ 0 %}\r\n {% if filter_values(\"yes_no_check\")[0] == \"\u0414\u0430\
  \" %}\r\n AND dt < (SELECT cutoff_date FROM date_threshold)\r\n \
  \ {% endif %}\r\n {% endif %}\r\n)\r\nSELECT \r\ndt,\r\n from_dt_txt,\r\
  \n to_dt_txt,\r\n formatDateTime(toLastDayOfMonth(dt), '%d.%m.%Y') as last_day_of_month_dt_txt\r\
  \nFROM \r\n filtered_dates\r\nGROUP BY \r\n dt, from_dt_txt, to_dt_txt\r\n\
  ORDER BY \r\n dt DESC"
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
uuid: fca62707-6947-4440-a16b-70cb6a5cea5b
metrics:
- metric_name: max_date
  verbose_name: max_date
  metric_type: count
  expression: max(dt)
  description: null
  d3format: null
  currency: null
  extra:
    warning_markdown: ''
  warning_text: null
columns:
- column_name: from_dt_txt
  verbose_name: null
  is_dttm: true
  is_active: true
  type: String
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: '%Y'
  extra: {}
- column_name: dt
  verbose_name: null
  is_dttm: true
  is_active: true
  type: Date
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra: {}
- column_name: last_day_of_month_dt_txt
  verbose_name: null
  is_dttm: false
  is_active: true
  type: String
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra: {}
- column_name: to_dt_txt
  verbose_name: null
  is_dttm: true
  is_active: true
  type: String
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra: {}
version: 1.0.0
database_uuid: 97aced68-326a-4094-b381-27980560efa9
@@ -0,0 +1,190 @@
table_name: "FI-0090 \u0421\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u043A\u0430\
  \ \u043F\u043E \u0414\u0417/\u041F\u0414\u0417"
main_dttm_col: dt
description: null
default_endpoint: null
offset: 0
cache_timeout: null
schema: dm_view
sql: "-- [JINJA_BLOCK] \u0426\u0435\u043D\u0442\u0440\u0430\u043B\u0438\u0437\u043E\
  \u0432\u0430\u043D\u043D\u043E\u0435 \u043E\u043F\u0440\u0435\u0434\u0435\u043B\u0435\
  \u043D\u0438\u0435 \u0432\u0441\u0435\u0445 Jinja \u043F\u0435\u0440\u0435\u043C\
  \u0435\u043D\u043D\u044B\u0445\r\n{% set raw_to = filter_values('last_day_of_month_dt_txt')[0]\
  \ \r\n if filter_values('last_day_of_month_dt_txt') else '01.05.2025'\
  \ %}\r\n\r\n{# \u0440\u0430\u0437\u0431\u0438\u0432\u0430\u0435\u043C \xABDD.MM.YYYY\xBB\
  \ \u043D\u0430 \u0447\u0430\u0441\u0442\u0438 #}\r\n{% set to_parts = raw_to.split('.')\
  \ %}\r\n\r\n{# \u0441\u043E\u0431\u0438\u0440\u0430\u0435\u043C ISO\u2011\u0441\u0442\
  \u0440\u043E\u043A\u0443 \xABYYYY-MM-DD\xBB #}\r\n{% set to_dt = to_parts[2] \
  \ ~ '-' ~ to_parts[1] ~ '-' ~ to_parts[0] %}\r\n\r\nwith \r\ncp_relations_type\
  \ AS (\r\n select * from ( SELECT \r\n ctd.counterparty_code AS counterparty_code,\r\
  \n min(dt_from) as dt_from,\r\n max(dt_to) as dt_to,\r\n crt.relation_type_code\
  \ || ' ' || crt.relation_type_name AS relation_type_code_name\r\n FROM\r\n \
  \ dm_view.counterparty_td ctd\r\n JOIN dm_view.counterparty_relation_type_texts\
  \ crt \r\n ON ctd.relation_type_code = crt.relation_type_code\r\n GROUP\
  \ BY\r\n ctd.counterparty_code, ctd.counterparty_full_name,\r\n crt.relation_type_code,crt.relation_type_name)\r\
  \n WHERE \r\n dt_from <= toDate('{{to_dt }}') AND \r\n \
  \ dt_to >= toDate('{{to_dt }}')\r\n ),\r\nt_debt as \r\n(SELECT dt, \r\n\
  counterparty_search_name,\r\ncp_relations_type.relation_type_code_name as relation_type_code_name,\r\
  \nunit_balance_code || ' ' || unit_balance_name as unit_balance_code_name,\r\n'1.\
  \ \u0421\u0443\u043C\u043C\u0430' as attribute,\r\nsum(debt_balance_subposition_no_revaluation_usd_amount)\
  \ as debt_amount,\r\nsumIf(debt_balance_subposition_no_revaluation_usd_amount,dt_overdue\
  \ < dt) as overdue_amount\r\nfrom dm_view.account_debt_for_working_capital t_debt\r\
  \njoin cp_relations_type ON\r\ncp_relations_type.counterparty_code = t_debt.counterparty_code\r\
  \nwhere dt = toLastDayOfMonth(dt)\r\nand match(general_ledger_account_code,'((62)|(60)|(76))')\r\
  \nand debit_or_credit = 'S'\r\nand account_type = 'D'\r\nand dt between addMonths(toDate('{{to_dt\
  \ }}'),-12) and toDate('{{to_dt }}')\r\ngroup by dt, counterparty_search_name,unit_balance_code_name,relation_type_code_name\r\
  \n),\r\n\r\nt_transaction_count_base as \r\n(\r\nselect *,\r\ncp_relations_type.relation_type_code_name\
  \ as relation_type_code_name,\r\nunit_balance_code || ' ' || unit_balance_name as\
  \ unit_balance_code_name,\r\n case when dt_overdue<dt_clearing then\r\n \
  \ dateDiff(day, dt_overdue, dt_clearing) \r\n else 0\r\n end\
  \ as overdue_days\r\nfrom dm_view.accounting_documents_leading_to_debt t_docs\r\n\
  join cp_relations_type ON\r\ncp_relations_type.counterparty_code = t_docs.counterparty_code\r\
  \nwhere 1=1\r\n\r\nand match(general_ledger_account_code,'((62)|(60)|(76))')\r\n\
  and debit_or_credit = 'S'\r\nand account_type = 'D'\r\n)\r\n\r\nselect * from t_debt\r\
  \n\r\nunion all \r\n\r\nselect toLastDayOfMonth(dt_debt) as dt, \r\ncounterparty_search_name,\r\
  \nrelation_type_code_name,\r\nunit_balance_code_name,\r\n'2. \u043A\u043E\u043B\u0438\
  \u0447\u0435\u0441\u0442\u0432\u043E \u0442\u0440\u0430\u043D\u0437\u0430\u043A\u0446\
  \u0438\u0439 \u0432 \u043C\u0435\u0441\u044F\u0446' as attribute,\r\ncount(1) as\
  \ debt_amount,\r\nnull as overdue_amount\r\nfrom t_transaction_count_base\r\nwhere\
  \ dt_debt between addMonths(toDate('{{to_dt }}'),-12) and toDate('{{to_dt }}')\r\
  \ngroup by toLastDayOfMonth(dt_debt), \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\
  \nunit_balance_code_name,attribute\r\n\r\nunion all \r\n\r\nselect toLastDayOfMonth(dt_clearing)\
  \ as dt, \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\nunit_balance_code_name,\r\
  \n'2. \u043A\u043E\u043B\u0438\u0447\u0435\u0441\u0442\u0432\u043E \u0442\u0440\u0430\
  \u043D\u0437\u0430\u043A\u0446\u0438\u0439 \u0432 \u043C\u0435\u0441\u044F\u0446\
  ' as attribute,\r\nnull as debt_amount,\r\ncount(1) as overdue_amount\r\nfrom t_transaction_count_base\r\
  \nwhere dt_clearing between addMonths(toDate('{{to_dt }}'),-12) and toDate('{{to_dt\
  \ }}')\r\nand overdue_days > 0\r\ngroup by toLastDayOfMonth(dt_clearing), \r\ncounterparty_search_name,\r\
  \nrelation_type_code_name,\r\nunit_balance_code_name,attribute\r\n\r\nunion all\
  \ \r\n\r\nselect toLastDayOfMonth(dt_clearing) as dt, \r\ncounterparty_search_name,\r\
  \nrelation_type_code_name,\r\nunit_balance_code_name,\r\nmultiIf(\r\noverdue_days\
  \ < 30,'3. \u0434\u043E 30',\r\noverdue_days between 30 and 60, '4. \u043E\u0442\
  \ 30 \u0434\u043E 60',\r\noverdue_days between 61 and 90, '5. \u043E\u0442 61 \u0434\
  \u043E 90',\r\noverdue_days>90,'6. \u0431\u043E\u043B\u0435\u0435 90 \u0434\u043D\
  ',\r\nnull\r\n)\r\n as attribute,\r\nnull as debt_amount,\r\ncount(1) as overdue_amount\r\
  \nfrom t_transaction_count_base\r\nwhere dt_clearing between addMonths(toDate('{{to_dt\
  \ }}'),-12) and toDate('{{to_dt }}')\r\nand overdue_days > 0\r\ngroup by toLastDayOfMonth(dt_clearing),\
  \ \r\ncounterparty_search_name,\r\nrelation_type_code_name,\r\nattribute,unit_balance_code_name,attribute\r\
  \n"
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
uuid: 9e645dc0-da25-4f61-9465-6e649b0bc4b1
metrics:
- metric_name: m_debt_amount
  verbose_name: "\u0414\u0417, $"
  metric_type: count
  expression: sum(debt_amount)
  description: null
  d3format: null
  currency: null
  extra:
    warning_markdown: ''
  warning_text: null
- metric_name: m_overdue_amount
  verbose_name: "\u041F\u0414\u0417, $"
  metric_type: null
  expression: sum(overdue_amount)
  description: null
  d3format: null
  currency: null
  extra:
    warning_markdown: ''
  warning_text: null
columns:
- column_name: debt_amount
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(Decimal(38, 2))
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: overdue_amount
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(Decimal(38, 2))
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: dt
  verbose_name: null
  is_dttm: true
  is_active: true
  type: Nullable(Date)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: unit_balance_code_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: relation_type_code_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: counterparty_search_name
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
- column_name: attribute
  verbose_name: null
  is_dttm: false
  is_active: true
  type: Nullable(String)
  advanced_data_type: null
  groupby: true
  filterable: true
  expression: null
  description: null
  python_date_format: null
  extra:
    warning_markdown: null
version: 1.0.0
database_uuid: 97aced68-326a-4094-b381-27980560efa9
@@ -0,0 +1,3 @@
version: 1.0.0
type: Dashboard
timestamp: '2026-01-14T11:21:08.078620+00:00'
@@ -13,7 +13,7 @@ The settings mechanism allows users to configure multiple Superset environments
Configuration is structured using Pydantic models in `backend/src/core/config_models.py`:

- `Environment`: Represents a Superset instance (URL, credentials). The `base_url` is automatically normalized to include the `/api/v1` suffix if missing.
- `GlobalSettings`: Global application parameters (e.g., `backup_path`).
- `GlobalSettings`: Global application parameters (e.g., `storage.root_path`).
- `AppConfig`: The root configuration object.

### Configuration Manager
@@ -43,4 +43,4 @@ The settings page is located at `frontend/src/pages/Settings.svelte`. It provide

Existing plugins and utilities use the `ConfigManager` to fetch configuration:
- `superset_tool/utils/init_clients.py`: Dynamically initializes Superset clients from the configured environments.
- `BackupPlugin`: Uses the configured `backup_path` as the default storage location.
- `BackupPlugin`: Uses the configured `storage.root_path` as the default storage location.
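The `base_url` normalization mentioned above is the main non-obvious behavior here. A minimal sketch of how such a model could look, assuming Pydantic v2; every field besides `base_url` is illustrative rather than the project's actual schema:

```python
# Hedged sketch of base_url normalization in an Environment-style model
# (assumes Pydantic v2; fields other than base_url are illustrative).
from pydantic import BaseModel, field_validator


class Environment(BaseModel):
    name: str
    base_url: str

    @field_validator("base_url")
    @classmethod
    def ensure_api_suffix(cls, value: str) -> str:
        # Append the /api/v1 suffix when the user supplies a bare host URL.
        value = value.rstrip("/")
        if not value.endswith("/api/v1"):
            value = f"{value}/api/v1"
        return value


print(Environment(name="dev", base_url="https://superset.local").base_url)
# -> https://superset.local/api/v1
```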
7
frontend/.eslintignore
Normal file
@@ -0,0 +1,7 @@
node_modules/
dist/
build/
.svelte-kit/
.vite/
coverage/
*.min.js
26
frontend/.gitignore
vendored
Executable file
@@ -0,0 +1,26 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*

node_modules
dist
dist-ssr
*.local
.svelte-kit
build

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
9
frontend/.prettierignore
Normal file
@@ -0,0 +1,9 @@
node_modules/
dist/
build/
.svelte-kit/
.vite/
coverage/
package-lock.json
yarn.lock
pnpm-lock.yaml
@@ -1,13 +0,0 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>frontend</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.js"></script>
  </body>
</html>
@@ -1,4 +1,5 @@
{
  "extends": "./.svelte-kit/tsconfig.json",
  "compilerOptions": {
    "moduleResolution": "bundler",
    "target": "ESNext",

@@ -1,117 +0,0 @@
<!-- [DEF:App:Component] -->
<!--
@SEMANTICS: main, entrypoint, layout, navigation
@PURPOSE: The root component of the frontend application. Manages navigation and layout.
@LAYER: UI
@RELATION: DEPENDS_ON -> frontend/src/pages/Dashboard.svelte
@RELATION: DEPENDS_ON -> frontend/src/pages/Settings.svelte
@RELATION: DEPENDS_ON -> frontend/src/lib/stores.js

@INVARIANT: Navigation state must be persisted in the currentPage store.
-->
<script>
  // [SECTION: IMPORTS]
  import { get } from 'svelte/store';
  import Dashboard from './pages/Dashboard.svelte';
  import Settings from './pages/Settings.svelte';
  import { selectedPlugin, selectedTask, currentPage } from './lib/stores.js';
  import TaskRunner from './components/TaskRunner.svelte';
  import DynamicForm from './components/DynamicForm.svelte';
  import { api } from './lib/api.js';
  import Toast from './components/Toast.svelte';
  // [/SECTION]

  // [DEF:handleFormSubmit:Function]
  /**
   * @purpose Handles form submission for task creation.
   * @pre event.detail contains form parameters.
   * @post Task is created and selectedTask is updated.
   * @param {CustomEvent} event - The submit event from DynamicForm.
   */
  async function handleFormSubmit(event) {
    console.log("[App.handleFormSubmit][Action] Handling form submission for task creation.");
    const params = event.detail;
    try {
      const plugin = get(selectedPlugin);
      const task = await api.createTask(plugin.id, params);
      selectedTask.set(task);
      selectedPlugin.set(null);
      console.log(`[App.handleFormSubmit][Coherence:OK] Task created id=${task.id}`);
    } catch (error) {
      console.error(`[App.handleFormSubmit][Coherence:Failed] Task creation failed error=${error}`);
    }
  }
  // [/DEF:handleFormSubmit:Function]

  // [DEF:navigate:Function]
  /**
   * @purpose Changes the current page and resets state.
   * @pre Target page name is provided.
   * @post currentPage store is updated and selection state is reset.
   * @param {string} page - Target page name.
   */
  function navigate(page) {
    console.log(`[App.navigate][Action] Navigating to ${page}.`);
    // Reset selection first
    if (page !== get(currentPage)) {
      selectedPlugin.set(null);
      selectedTask.set(null);
    }
    // Then set page
    currentPage.set(page);
  }
  // [/DEF:navigate:Function]
</script>

<!-- [SECTION: TEMPLATE] -->
<Toast />

<main class="bg-gray-50 min-h-screen">
  <header class="bg-white shadow-md p-4 flex justify-between items-center">
    <button
      type="button"
      class="text-3xl font-bold text-gray-800 focus:outline-none"
      on:click={() => navigate('dashboard')}
    >
      Superset Tools
    </button>
    <nav class="space-x-4">
      <button
        type="button"
        on:click={() => navigate('dashboard')}
        class="text-gray-600 hover:text-blue-600 font-medium {$currentPage === 'dashboard' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
      >
        Dashboard
      </button>
      <button
        type="button"
        on:click={() => navigate('settings')}
        class="text-gray-600 hover:text-blue-600 font-medium {$currentPage === 'settings' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
      >
        Settings
      </button>
    </nav>
  </header>

  <div class="p-4">
    {#if $currentPage === 'settings'}
      <Settings />
    {:else if $selectedTask}
      <TaskRunner />
      <button on:click={() => selectedTask.set(null)} class="mt-4 bg-blue-500 text-white p-2 rounded">
        Back to Task List
      </button>
    {:else if $selectedPlugin}
      <h2 class="text-2xl font-bold mb-4">{$selectedPlugin.name}</h2>
      <DynamicForm schema={$selectedPlugin.schema} on:submit={handleFormSubmit} />
      <button on:click={() => selectedPlugin.set(null)} class="mt-4 bg-gray-500 text-white p-2 rounded">
        Back to Dashboard
      </button>
    {:else}
      <Dashboard />
    {/if}
  </div>
</main>
<!-- [/SECTION] -->

<!-- [/DEF:App:Component] -->
@@ -12,6 +12,9 @@
// [SECTION: IMPORTS]
import { createEventDispatcher } from 'svelte';
import type { DashboardMetadata } from '../types/dashboard';
import { t } from '../lib/i18n';
import { Button, Input } from '../lib/ui';
import GitManager from './git/GitManager.svelte';
// [/SECTION]

// [SECTION: PROPS]
@@ -27,6 +30,12 @@
let sortDirection: "asc" | "desc" = "asc";
// [/SECTION]

// [SECTION: UI STATE]
let showGitManager = false;
let gitDashboardId: number | null = null;
let gitDashboardTitle = "";
// [/SECTION]

// [SECTION: DERIVED]
$: filteredDashboards = dashboards.filter(d =>
  d.title.toLowerCase().includes(filterText.toLowerCase())
@@ -120,61 +129,83 @@
}
// [/DEF:goToPage:Function]

// [DEF:openGit:Function]
/**
 * @purpose Opens the Git management modal for a dashboard.
 */
function openGit(dashboard: DashboardMetadata) {
  gitDashboardId = dashboard.id;
  gitDashboardTitle = dashboard.title;
  showGitManager = true;
}
// [/DEF:openGit:Function]

</script>

<!-- [SECTION: TEMPLATE] -->
<div class="dashboard-grid">
  <!-- Filter Input -->
  <div class="mb-4">
    <input
      type="text"
  <div class="mb-6">
    <Input
      bind:value={filterText}
      placeholder="Search dashboards..."
      class="w-full px-3 py-2 border border-gray-300 rounded-md focus:outline-none focus:ring-2 focus:ring-blue-500"
      placeholder={$t.dashboard.search}
    />
  </div>

  <!-- Grid/Table -->
  <div class="overflow-x-auto">
    <table class="min-w-full bg-white border border-gray-300">
  <div class="overflow-x-auto rounded-lg border border-gray-200">
    <table class="min-w-full divide-y divide-gray-200">
      <thead class="bg-gray-50">
        <tr>
          <th class="px-4 py-2 border-b">
          <th class="px-6 py-3 text-left">
            <input
              type="checkbox"
              checked={allSelected}
              indeterminate={someSelected && !allSelected}
              on:change={(e) => handleSelectAll((e.target as HTMLInputElement).checked)}
              class="h-4 w-4 text-blue-600 border-gray-300 rounded focus:ring-blue-500"
            />
          </th>
          <th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('title')}>
            Title {sortColumn === 'title' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider cursor-pointer hover:text-gray-700 transition-colors" on:click={() => handleSort('title')}>
            {$t.dashboard.title} {sortColumn === 'title' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          </th>
          <th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('last_modified')}>
            Last Modified {sortColumn === 'last_modified' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider cursor-pointer hover:text-gray-700 transition-colors" on:click={() => handleSort('last_modified')}>
            {$t.dashboard.last_modified} {sortColumn === 'last_modified' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          </th>
          <th class="px-4 py-2 border-b cursor-pointer" on:click={() => handleSort('status')}>
            Status {sortColumn === 'status' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider cursor-pointer hover:text-gray-700 transition-colors" on:click={() => handleSort('status')}>
            {$t.dashboard.status} {sortColumn === 'status' ? (sortDirection === 'asc' ? '↑' : '↓') : ''}
          </th>
          <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.dashboard.git}</th>
        </tr>
      </thead>
      <tbody>
      <tbody class="bg-white divide-y divide-gray-200">
        {#each paginatedDashboards as dashboard (dashboard.id)}
          <tr class="hover:bg-gray-50">
            <td class="px-4 py-2 border-b">
          <tr class="hover:bg-gray-50 transition-colors">
            <td class="px-6 py-4 whitespace-nowrap">
              <input
                type="checkbox"
                checked={selectedIds.includes(dashboard.id)}
                on:change={(e) => handleSelectionChange(dashboard.id, (e.target as HTMLInputElement).checked)}
                class="h-4 w-4 text-blue-600 border-gray-300 rounded focus:ring-blue-500"
              />
            </td>
            <td class="px-4 py-2 border-b">{dashboard.title}</td>
            <td class="px-4 py-2 border-b">{new Date(dashboard.last_modified).toLocaleDateString()}</td>
            <td class="px-4 py-2 border-b">
            <td class="px-6 py-4 whitespace-nowrap text-sm font-medium text-gray-900">{dashboard.title}</td>
            <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">{new Date(dashboard.last_modified).toLocaleDateString()}</td>
            <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
              <span class="px-2 py-1 text-xs font-medium rounded-full {dashboard.status === 'published' ? 'bg-green-100 text-green-800' : 'bg-gray-100 text-gray-800'}">
                {dashboard.status}
              </span>
            </td>
            <td class="px-6 py-4 whitespace-nowrap text-sm font-medium">
              <Button
                variant="ghost"
                size="sm"
                on:click={() => openGit(dashboard)}
                class="text-blue-600 hover:text-blue-900"
              >
                {$t.git.manage}
              </Button>
            </td>
          </tr>
        {/each}
      </tbody>
@@ -182,28 +213,42 @@
  </div>

  <!-- Pagination Controls -->
  <div class="flex items-center justify-between mt-4">
    <div class="text-sm text-gray-700">
      Showing {currentPage * pageSize + 1} to {Math.min((currentPage + 1) * pageSize, sortedDashboards.length)} of {sortedDashboards.length} dashboards
  <div class="flex items-center justify-between mt-6">
    <div class="text-sm text-gray-500">
      {($t.dashboard?.showing || "")
        .replace('{start}', (currentPage * pageSize + 1).toString())
        .replace('{end}', Math.min((currentPage + 1) * pageSize, sortedDashboards.length).toString())
        .replace('{total}', sortedDashboards.length.toString())}
    </div>
    <div class="flex space-x-2">
      <button
        class="px-3 py-1 text-sm border border-gray-300 rounded-md hover:bg-gray-50 disabled:opacity-50 disabled:cursor-not-allowed"
    <div class="flex gap-2">
      <Button
        variant="secondary"
        size="sm"
        disabled={currentPage === 0}
        on:click={() => goToPage(currentPage - 1)}
      >
        Previous
      </button>
      <button
        class="px-3 py-1 text-sm border border-gray-300 rounded-md hover:bg-gray-50 disabled:opacity-50 disabled:cursor-not-allowed"
        {$t.dashboard.previous}
      </Button>
      <Button
        variant="secondary"
        size="sm"
        disabled={currentPage >= totalPages - 1}
        on:click={() => goToPage(currentPage + 1)}
      >
        Next
      </button>
        {$t.dashboard.next}
      </Button>
    </div>
  </div>
</div>

{#if showGitManager && gitDashboardId}
  <GitManager
    dashboardId={gitDashboardId}
    dashboardTitle={gitDashboardTitle}
    bind:show={showGitManager}
  />
{/if}

<!-- [/SECTION] -->

<style>

@@ -57,4 +57,4 @@
/* Component specific styles */
</style>

<!-- [/DEF:EnvSelector:Component] -->
<!-- [/DEF:EnvSelector:Component] -->
@@ -7,53 +7,73 @@
-->
<script>
  import { page } from '$app/stores';
  import { t } from '$lib/i18n';
  import { LanguageSwitcher } from '$lib/ui';
  import { auth } from '../lib/auth/store';
  import { goto } from '$app/navigation';

  function handleLogout() {
    auth.logout();
    goto('/login');
  }
</script>

<header class="bg-white shadow-md p-4 flex justify-between items-center">
  <a
    href="/"
    class="text-3xl font-bold text-gray-800 focus:outline-none"
    class="text-2xl font-bold text-gray-800 focus:outline-none"
  >
    Superset Tools
  </a>
  <nav class="space-x-4">
  <nav class="flex items-center space-x-4">
    <a
      href="/"
      class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname === '/' ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
    >
      Dashboard
    </a>
    <a
      href="/migration"
      class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/migration') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
    >
      Migration
      {$t.nav.dashboard}
    </a>
    <a
      href="/tasks"
      class="text-gray-600 hover:text-blue-600 font-medium {$page.url.pathname.startsWith('/tasks') ? 'text-blue-600 border-b-2 border-blue-600' : ''}"
    >
      Tasks
      {$t.nav.tasks}
    </a>
    <div class="relative inline-block group">
      <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/tools') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
        Tools
      </button>
      <div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
        <a href="/tools/search" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Dataset Search</a>
        <a href="/tools/mapper" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Dataset Mapper</a>
        <a href="/tools/debug" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">System Debug</a>
      </div>
    </div>
    <div class="relative inline-block group">
      <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/settings') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
        Settings
        {$t.nav.settings}
      </button>
      <div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
        <a href="/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">General Settings</a>
        <a href="/settings/connections" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">Connections</a>
        <a href="/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_general}</a>
        <a href="/settings/connections" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_connections}</a>
        <a href="/settings/git" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.settings_git}</a>
      </div>
    </div>

    {#if $auth.isAuthenticated && $auth.user?.roles?.some(r => r.name === 'Admin')}
      <div class="relative inline-block group">
        <button class="text-gray-600 hover:text-blue-600 font-medium pb-1 {$page.url.pathname.startsWith('/admin') ? 'text-blue-600 border-b-2 border-blue-600' : ''}">
          {$t.nav.admin}
        </button>
        <div class="absolute hidden group-hover:block bg-white shadow-lg rounded-md mt-1 py-2 w-48 z-10 border border-gray-100 before:absolute before:-top-2 before:left-0 before:right-0 before:h-2 before:content-[''] right-0">
          <a href="/admin/users" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_users}</a>
          <a href="/admin/roles" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_roles}</a>
          <a href="/admin/settings" class="block px-4 py-2 text-sm text-gray-700 hover:bg-blue-50 hover:text-blue-600">{$t.nav.admin_settings}</a>
        </div>
      </div>
    {/if}
    <LanguageSwitcher />

    {#if $auth.isAuthenticated}
      <div class="flex items-center space-x-2 border-l pl-4 ml-4">
        <span class="text-sm text-gray-600">{$auth.user?.username}</span>
        <button
          on:click={handleLogout}
          class="text-sm text-red-600 hover:text-red-800 font-medium"
        >
          Logout
        </button>
      </div>
    {/if}
  </nav>
</header>
<!-- [/DEF:Navbar:Component] -->
@@ -21,7 +21,9 @@
// @POST: tasks array is updated and selectedTask status synchronized.
async function fetchTasks() {
  try {
    const res = await fetch('/api/tasks?limit=10');
    const token = localStorage.getItem('auth_token');
    const headers = token ? { 'Authorization': `Bearer ${token}` } : {};
    const res = await fetch('/api/tasks?limit=10', { headers });
    if (!res.ok) throw new Error('Failed to fetch tasks');
    tasks = await res.json();

@@ -58,7 +60,9 @@
    const params = new URLSearchParams();
    if (status) params.append('status', status);

    const res = await fetch(`${url}?${params.toString()}`, { method: 'DELETE' });
    const token = localStorage.getItem('auth_token');
    const headers = token ? { 'Authorization': `Bearer ${token}` } : {};
    const res = await fetch(`${url}?${params.toString()}`, { method: 'DELETE', headers });
    if (!res.ok) throw new Error('Failed to clear tasks');

    await fetchTasks();
@@ -75,7 +79,9 @@
async function selectTask(task) {
  try {
    // Fetch the full task details (including logs) before setting it as selected
    const res = await fetch(`/api/tasks/${task.id}`);
    const token = localStorage.getItem('auth_token');
    const headers = token ? { 'Authorization': `Bearer ${token}` } : {};
    const res = await fetch(`/api/tasks/${task.id}`, { headers });
    if (res.ok) {
      const fullTask = await res.json();
      selectedTask.set(fullTask);
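All three fetch sites above implement the same contract: read `auth_token` from localStorage and attach it as a Bearer header, falling back to no header when the token is absent. The same contract, exercised from a Python script for reference (a hedged sketch; the host and port are assumptions):

```python
# Hedged sketch of the Bearer-token contract introduced above:
# GET /api/tasks?limit=10 with an optional Authorization header.
# The base URL is an assumption, not taken from the project.
from typing import Optional

import requests


def fetch_tasks(token: Optional[str], base: str = "http://localhost:8000", limit: int = 10):
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    resp = requests.get(f"{base}/api/tasks", params={"limit": limit},
                        headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()
```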
@@ -9,6 +9,7 @@
<script lang="ts">
  import { createEventDispatcher } from 'svelte';
  import { formatDistanceToNow } from 'date-fns';
  import { t } from '../lib/i18n';

  export let tasks: Array<any> = [];
  export let loading: boolean = false;
@@ -58,9 +59,9 @@

<div class="bg-white shadow overflow-hidden sm:rounded-md">
  {#if loading && tasks.length === 0}
    <div class="p-4 text-center text-gray-500">Loading tasks...</div>
    <div class="p-4 text-center text-gray-500">{$t.tasks?.loading || 'Loading...'}</div>
  {:else if tasks.length === 0}
    <div class="p-4 text-center text-gray-500">No tasks found.</div>
    <div class="p-4 text-center text-gray-500">{$t.tasks?.no_tasks || 'No tasks found.'}</div>
  {:else}
    <ul class="divide-y divide-gray-200">
      {#each tasks as task (task.id)}
@@ -94,7 +95,7 @@
        <path fill-rule="evenodd" d="M6 2a1 1 0 00-1 1v1H4a2 2 0 00-2 2v10a2 2 0 002 2h12a2 2 0 002-2V6a2 2 0 00-2-2h-1V3a1 1 0 10-2 0v1H7V3a1 1 0 00-1-1zm0 5a1 1 0 000 2h8a1 1 0 100-2H6z" clip-rule="evenodd" />
      </svg>
      <p>
        Started {formatTime(task.started_at)}
        {($t.tasks?.started || "").replace('{time}', formatTime(task.started_at))}
      </p>
    </div>
  </div>
@@ -8,6 +8,8 @@
<script>
  import { createEventDispatcher, onMount, onDestroy } from 'svelte';
  import { getTaskLogs } from '../services/taskService.js';
  import { t } from '../lib/i18n';
  import { Button } from '../lib/ui';

  export let show = false;
  export let inline = false;
@@ -143,20 +145,20 @@
<div class="flex flex-col h-full w-full p-4">
  <div class="flex justify-between items-center mb-4">
    <h3 class="text-lg font-medium text-gray-900">
      Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span>
      {$t.tasks?.logs_title} <span class="text-sm text-gray-500 font-normal">({taskId})</span>
    </h3>
    <button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
    <Button variant="ghost" size="sm" on:click={fetchLogs} class="text-blue-600">{$t.tasks?.refresh}</Button>
  </div>

  <div class="flex-1 border rounded-md bg-gray-50 p-4 overflow-y-auto font-mono text-sm"
  <div class="flex-1 border rounded-md bg-gray-50 p-4 overflow-y-auto font-mono text-sm"
    bind:this={logContainer}
    on:scroll={handleScroll}>
    {#if loading && logs.length === 0}
      <p class="text-gray-500 text-center">Loading logs...</p>
      <p class="text-gray-500 text-center">{$t.tasks?.loading}</p>
    {:else if error}
      <p class="text-red-500 text-center">{error}</p>
    {:else if logs.length === 0}
      <p class="text-gray-500 text-center">No logs available.</p>
      <p class="text-gray-500 text-center">{$t.tasks?.no_logs}</p>
    {:else}
      {#each logs as log}
        <div class="mb-1 hover:bg-gray-100 p-1 rounded">
@@ -192,19 +194,19 @@
<div class="sm:flex sm:items-start">
  <div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
    <h3 class="text-lg leading-6 font-medium text-gray-900 flex justify-between items-center" id="modal-title">
      <span>Task Logs <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
      <button on:click={fetchLogs} class="text-sm text-indigo-600 hover:text-indigo-900">Refresh</button>
      <span>{$t.tasks.logs_title} <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
      <Button variant="ghost" size="sm" on:click={fetchLogs} class="text-blue-600">{$t.tasks.refresh}</Button>
    </h3>

    <div class="mt-4 border rounded-md bg-gray-50 p-4 h-96 overflow-y-auto font-mono text-sm"
    <div class="mt-4 border rounded-md bg-gray-50 p-4 h-96 overflow-y-auto font-mono text-sm"
      bind:this={logContainer}
      on:scroll={handleScroll}>
      {#if loading && logs.length === 0}
        <p class="text-gray-500 text-center">Loading logs...</p>
        <p class="text-gray-500 text-center">{$t.tasks.loading}</p>
      {:else if error}
        <p class="text-red-500 text-center">{error}</p>
      {:else if logs.length === 0}
        <p class="text-gray-500 text-center">No logs available.</p>
        <p class="text-gray-500 text-center">{$t.tasks.no_logs}</p>
      {:else}
        {#each logs as log}
          <div class="mb-1 hover:bg-gray-100 p-1 rounded">
@@ -230,13 +232,9 @@
    </div>
  </div>
  <div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
    <button
      type="button"
      class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
      on:click={close}
    >
      Close
    </button>
    <Button variant="secondary" on:click={close}>
      {$t.common.cancel}
    </Button>
  </div>
</div>
</div>
61
frontend/src/components/auth/ProtectedRoute.svelte
Normal file
@@ -0,0 +1,61 @@
<!-- [DEF:ProtectedRoute:Component] -->
<!--
@SEMANTICS: auth, guard, route, protection
@PURPOSE: Wraps content to ensure only authenticated users can access it.
@LAYER: Component
@RELATION: USES -> authStore
@RELATION: CALLS -> goto

@INVARIANT: Redirects to /login if user is not authenticated.
-->

<script lang="ts">
  import { onMount } from 'svelte';
  import { auth } from '../../lib/auth/store';
  import { goto } from '$app/navigation';

  // [SECTION: TEMPLATE]
  // Only render slot if authenticated
  // [/SECTION: TEMPLATE]

  onMount(async () => {
    // Check if we have a token but no user profile yet
    if ($auth.token && !$auth.user) {
      auth.setLoading(true);
      try {
        const response = await fetch('/api/auth/me', {
          headers: {
            'Authorization': `Bearer ${$auth.token}`
          }
        });

        if (response.ok) {
          const user = await response.json();
          auth.setUser(user);
        } else {
          // Token invalid or expired
          auth.logout();
          goto('/login');
        }
      } catch (e) {
        console.error('Failed to verify session:', e);
        auth.logout();
        goto('/login');
      } finally {
        auth.setLoading(false);
      }
    } else if (!$auth.token) {
      goto('/login');
    }
  });
</script>

{#if $auth.loading}
  <div class="flex items-center justify-center min-h-screen">
    <div class="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-600"></div>
  </div>
{:else if $auth.isAuthenticated}
  <slot />
{/if}

<!-- [/DEF:ProtectedRoute:Component] -->
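The guard above assumes a `/api/auth/me` endpoint that returns the user profile for a valid Bearer token and a 401 otherwise. A hedged FastAPI sketch of that counterpart; `decode_token` is a hypothetical placeholder, not the project's actual validation code:

```python
# Hedged sketch of the /api/auth/me counterpart ProtectedRoute calls.
# decode_token is a hypothetical stand-in for real token validation.
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI()
bearer = HTTPBearer()


def decode_token(token: str):
    # Placeholder: verify the token and return a user dict, or None if invalid.
    return None


def current_user(creds: HTTPAuthorizationCredentials = Depends(bearer)):
    user = decode_token(creds.credentials)
    if user is None:
        # The frontend treats any non-200 response as "log out and go to /login".
        raise HTTPException(status_code=401, detail="Invalid or expired token")
    return user


@app.get("/api/auth/me")
def me(user: dict = Depends(current_user)):
    return user
```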
84
frontend/src/components/backups/BackupList.svelte
Normal file
@@ -0,0 +1,84 @@
<!-- [DEF:BackupList:Component] -->
<!--
@SEMANTICS: backup, list, table
@PURPOSE: Displays a list of existing backups.
@LAYER: Component
@RELATION: USED_BY -> frontend/src/components/backups/BackupManager.svelte
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { t } from '../../lib/i18n';
  import { Button } from '../../lib/ui';
  import type { Backup } from '../../types/backup';
  import { goto } from '$app/navigation';
  // [/SECTION]

  // [SECTION: PROPS]
  /**
   * @type {Backup[]}
   * @description Array of backup objects to display.
   */
  export let backups: Backup[] = [];
  // [/SECTION]

</script>

<!-- [SECTION: TEMPLATE] -->
<div class="overflow-x-auto rounded-lg border border-gray-200">
  <table class="min-w-full divide-y divide-gray-200">
    <thead class="bg-gray-50">
      <tr>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.name}
        </th>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.tasks.target_env}
        </th>
        <th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.created_at}
        </th>
        <th class="px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase tracking-wider">
          {$t.storage.table.actions}
        </th>
      </tr>
    </thead>
    <tbody class="bg-white divide-y divide-gray-200">
      {#each backups as backup}
        <tr class="hover:bg-gray-50 transition-colors">
          <td class="px-6 py-4 whitespace-nowrap text-sm font-medium text-gray-900">
            {backup.name}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
            {backup.environment}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
            {new Date(backup.created_at).toLocaleString()}
          </td>
          <td class="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
            <Button
              variant="ghost"
              size="sm"
              class="text-blue-600 hover:text-blue-900"
              on:click={() => goto(`/tools/storage?path=backups/${backup.name}`)}
            >
              {$t.storage.table.go_to_storage}
            </Button>
          </td>
        </tr>
      {:else}
        <tr>
          <td colspan="4" class="px-6 py-10 text-center text-gray-500">
            {$t.storage.no_files}
          </td>
        </tr>
      {/each}
    </tbody>
  </table>
</div>
<!-- [/SECTION] -->

<style>
</style>

<!-- [/DEF:BackupList:Component] -->
241
frontend/src/components/backups/BackupManager.svelte
Normal file
@@ -0,0 +1,241 @@
<!-- [DEF:BackupManager:Component] -->
<!--
@SEMANTICS: backup, manager, orchestrator
@PURPOSE: Main container for backup management, handling creation and listing.
@LAYER: Feature
@RELATION: USES -> BackupList
@RELATION: USES -> api

@INVARIANT: Only one backup task can be triggered at a time from the UI.
-->

<script lang="ts">
  // [SECTION: IMPORTS]
  import { onMount } from 'svelte';
  import { t } from '../../lib/i18n';
  import { api, requestApi } from '../../lib/api';
  import { addToast } from '../../lib/toasts';
  import { Button, Card, Select, Input } from '../../lib/ui';
  import BackupList from './BackupList.svelte';
  import type { Backup } from '../../types/backup';
  // [/SECTION]

  // [SECTION: STATE]
  let backups: Backup[] = [];
  let environments: any[] = [];
  let selectedEnvId = '';
  let loading = true;
  let creating = false;
  let savingSchedule = false;

  // Schedule state for selected environment
  let scheduleEnabled = false;
  let cronExpression = '0 0 * * *';

  $: selectedEnv = environments.find(e => e.id === selectedEnvId);
  $: if (selectedEnv) {
    scheduleEnabled = selectedEnv.backup_schedule?.enabled ?? false;
    cronExpression = selectedEnv.backup_schedule?.cron_expression ?? '0 0 * * *';
  }
  // [/SECTION]

  // [DEF:loadData:Function]
  /**
   * @purpose Loads backups and environments from the backend.
   *
   * @pre API must be reachable.
   * @post environments and backups stores are populated.
   *
   * @returns {Promise<void>}
   * @side_effect Updates local state variables.
   */
  // @RELATION: CALLS -> api.getEnvironmentsList
  // @RELATION: CALLS -> api.requestApi
  async function loadData() {
    console.log("[BackupManager][Entry] Loading data.");
    loading = true;
    try {
      const [envsData, storageData] = await Promise.all([
        api.getEnvironmentsList(),
        requestApi('/storage/files?category=backups')
      ]);
      environments = envsData;

      // Map storage files to Backup type
      backups = (storageData || []).map((file: any) => ({
        id: file.name,
        name: file.name,
        environment: file.path.split('/')[0] || 'Unknown',
        created_at: file.created_at,
        size_bytes: file.size,
        status: 'success'
      }));
      console.log("[BackupManager][Action] Data loaded successfully.");
    } catch (error) {
      console.error("[BackupManager][Coherence:Failed] Load failed", error);
    } finally {
      loading = false;
    }
  }
  // [/DEF:loadData:Function]

  // [DEF:handleCreateBackup:Function]
  /**
   * @purpose Triggers a new backup task for the selected environment.
   *
   * @pre selectedEnvId must be a valid environment ID.
   * @post A new task is created on the backend.
   *
   * @returns {Promise<void>}
   * @side_effect Dispatches a toast notification.
   */
  // @RELATION: CALLS -> api.createTask
  // [DEF:handleUpdateSchedule:Function]
  /**
   * @purpose Updates the backup schedule for the selected environment.
   * @pre selectedEnvId must be set.
   * @post Environment config is updated on the backend.
   */
  async function handleUpdateSchedule() {
    if (!selectedEnvId) return;

    console.log(`[BackupManager][Action] Updating schedule for env: ${selectedEnvId}`);
    savingSchedule = true;
    try {
      await api.updateEnvironmentSchedule(selectedEnvId, {
        enabled: scheduleEnabled,
        cron_expression: cronExpression
      });
      addToast($t.common.success, 'success');

      // Update local state
      environments = environments.map(e =>
        e.id === selectedEnvId
          ? { ...e, backup_schedule: { enabled: scheduleEnabled, cron_expression: cronExpression } }
          : e
      );
    } catch (error) {
      console.error("[BackupManager][Coherence:Failed] Schedule update failed", error);
    } finally {
      savingSchedule = false;
    }
  }
  // [/DEF:handleUpdateSchedule:Function]

  async function handleCreateBackup() {
    if (!selectedEnvId) {
      addToast($t.tasks.select_env, 'error');
      return;
    }

    console.log(`[BackupManager][Action] Triggering backup for env: ${selectedEnvId}`);
    creating = true;
    try {
      await api.createTask('superset-backup', { environment_id: selectedEnvId });
      addToast($t.common.success, 'success');
      console.log("[BackupManager][Coherence:OK] Backup task triggered.");
    } catch (error) {
      console.error("[BackupManager][Coherence:Failed] Create failed", error);
    } finally {
      creating = false;
    }
  }
  // [/DEF:handleCreateBackup:Function]

  onMount(loadData);
</script>

<!-- [SECTION: TEMPLATE] -->
<div class="space-y-6">
  <Card title={$t.tasks.manual_backup}>
    <div class="space-y-4">
      <div class="flex items-end gap-4">
        <div class="flex-1">
          <Select
            label={$t.tasks.target_env}
            bind:value={selectedEnvId}
            options={[
              { value: '', label: $t.tasks.select_env },
              ...environments.map(e => ({ value: e.id, label: e.name }))
            ]}
          />
        </div>
        <Button
          variant="primary"
          on:click={handleCreateBackup}
          disabled={creating || !selectedEnvId}
        >
          {creating ? $t.common.loading : $t.tasks.start_backup}
        </Button>
      </div>

      {#if selectedEnvId}
        <div class="pt-6 border-t border-gray-100 mt-4">
          <h3 class="text-sm font-semibold text-gray-800 mb-4 flex items-center gap-2">
            <svg xmlns="http://www.w3.org/2000/svg" class="h-4 w-4 text-blue-500" fill="none" viewBox="0 0 24 24" stroke="currentColor">
              <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 8v4l3 3m6-3a9 9 0 11-18 0 9 9 0 0118 0z" />
            </svg>
            {$t.tasks.backup_schedule}
          </h3>

          <div class="bg-gray-50 rounded-lg p-4 border border-gray-200">
            <div class="flex flex-col md:flex-row md:items-start gap-6">
              <div class="pt-8">
                <label class="flex items-center gap-3 cursor-pointer group">
                  <div class="relative inline-flex items-center">
                    <input type="checkbox" bind:checked={scheduleEnabled} class="sr-only peer" />
                    <div class="w-11 h-6 bg-gray-200 peer-focus:outline-none peer-focus:ring-4 peer-focus:ring-blue-300 rounded-full peer peer-checked:after:translate-x-full peer-checked:after:border-white after:content-[''] after:absolute after:top-[2px] after:left-[2px] after:bg-white after:border-gray-300 after:border after:rounded-full after:h-5 after:w-5 after:transition-all peer-checked:bg-blue-600"></div>
                    <span class="ml-3 text-sm font-medium text-gray-700 group-hover:text-gray-900 transition-colors">{$t.tasks.schedule_enabled}</span>
                  </div>
                </label>
              </div>

              <div class="flex-1 space-y-2">
                <Input
                  label={$t.tasks.cron_label}
                  placeholder="0 0 * * *"
                  bind:value={cronExpression}
                  disabled={!scheduleEnabled}
                />
                <p class="text-xs text-gray-500 italic">{$t.tasks.cron_hint}</p>
              </div>

              <div class="pt-8">
                <Button
                  variant="secondary"
                  on:click={handleUpdateSchedule}
                  disabled={savingSchedule}
                  class="min-w-[100px]"
                >
                  {#if savingSchedule}
                    <span class="flex items-center gap-2">
                      <svg class="animate-spin h-4 w-4" viewBox="0 0 24 24">
                        <circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4" fill="none"></circle>
                        <path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
                      </svg>
                      {$t.common.loading}
                    </span>
                  {:else}
                    {$t.common.save}
                  {/if}
                </Button>
              </div>
            </div>
          </div>
        </div>
      {/if}
    </div>
  </Card>

  <div class="space-y-3">
    <h2 class="text-lg font-semibold text-gray-700">{$t.storage.backups}</h2>
    {#if loading}
      <div class="py-10 text-center text-gray-500">{$t.common.loading}</div>
    {:else}
      <BackupList {backups} />
    {/if}
  </div>
</div>
<!-- [/SECTION] -->

<!-- [/DEF:BackupManager:Component] -->
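The schedule form above persists only a boolean and a five-field cron string (the default `0 0 * * *` fires daily at midnight). A hedged sketch of how the backend might turn that saved value into a job, assuming APScheduler; `run_backup` and the job-id scheme are assumptions:

```python
# Hedged sketch: applying a saved backup schedule with APScheduler.
# run_backup and the "backup-<env>" job-id scheme are assumptions.
from apscheduler.jobstores.base import JobLookupError
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger

scheduler = BackgroundScheduler()
scheduler.start()


def run_backup(env_id: str) -> None:
    print(f"backing up environment {env_id}")


def apply_schedule(env_id: str, enabled: bool, cron_expression: str) -> None:
    job_id = f"backup-{env_id}"
    if not enabled:
        try:
            scheduler.remove_job(job_id)  # drop the job when the toggle is off
        except JobLookupError:
            pass  # nothing was scheduled for this environment
        return
    # CronTrigger.from_crontab parses standard 5-field cron strings.
    trigger = CronTrigger.from_crontab(cron_expression)
    scheduler.add_job(run_backup, trigger, args=[env_id],
                      id=job_id, replace_existing=True)
```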
178
frontend/src/components/git/BranchSelector.svelte
Normal file
@@ -0,0 +1,178 @@
<!-- [DEF:BranchSelector:Component] -->
<!--
@SEMANTICS: git, branch, selection, checkout
@PURPOSE: UI for selecting and creating Git branches.
@LAYER: Component
@RELATION: CALLS -> gitService.getBranches
@RELATION: CALLS -> gitService.checkoutBranch
@RELATION: CALLS -> gitService.createBranch
@RELATION: DISPATCHES -> change
-->

<script>
  // [SECTION: IMPORTS]
  import { onMount, createEventDispatcher } from 'svelte';
  import { gitService } from '../../services/gitService';
  import { addToast as toast } from '../../lib/toasts.js';
  import { t } from '../../lib/i18n';
  import { Button, Select, Input } from '../../lib/ui';
  // [/SECTION]

  // [SECTION: PROPS]
  export let dashboardId;
  export let currentBranch = 'main';
  // [/SECTION]

  // [SECTION: STATE]
  let branches = [];
  let loading = false;
  let showCreate = false;
  let newBranchName = '';
  // [/SECTION]

  const dispatch = createEventDispatcher();

  // [DEF:onMount:Function]
  /**
   * @purpose Load branches when component is mounted.
   * @pre Component is initialized.
   * @post loadBranches is called.
   */
  onMount(async () => {
    await loadBranches();
  });
  // [/DEF:onMount:Function]

  // [DEF:loadBranches:Function]
  /**
   * @purpose Loads the list of branches for the dashboard.
   * @pre dashboardId is provided.
   * @post branches is updated.
   */
  async function loadBranches() {
    console.log(`[BranchSelector][Action] Loading branches for dashboard ${dashboardId}`);
    loading = true;
    try {
      branches = await gitService.getBranches(dashboardId);
      console.log(`[BranchSelector][Coherence:OK] Loaded ${branches.length} branches`);
    } catch (e) {
      console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
      toast('Failed to load branches', 'error');
    } finally {
      loading = false;
    }
  }
  // [/DEF:loadBranches:Function]

  // [DEF:handleSelect:Function]
  /**
   * @purpose Handles branch selection from dropdown.
   * @pre event contains branch name.
   * @post handleCheckout is called with selected branch.
   */
  function handleSelect(event) {
    handleCheckout(event.target.value);
  }
  // [/DEF:handleSelect:Function]

  // [DEF:handleCheckout:Function]
  /**
   * @purpose Switches the current branch.
   * @param {string} branchName - The branch name.
   * @post currentBranch is updated and a change event is dispatched.
   */
  async function handleCheckout(branchName) {
    console.log(`[BranchSelector][Action] Checking out branch ${branchName}`);
    try {
      await gitService.checkoutBranch(dashboardId, branchName);
      currentBranch = branchName;
      dispatch('change', { branch: branchName });
      toast(`Switched to ${branchName}`, 'success');
      console.log(`[BranchSelector][Coherence:OK] Checked out ${branchName}`);
    } catch (e) {
      console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
      toast(e.message, 'error');
    }
  }
  // [/DEF:handleCheckout:Function]

  // [DEF:handleCreate:Function]
  /**
   * @purpose Creates a new branch.
   * @pre newBranchName is not empty.
   * @post The new branch is created and loaded; showCreate is reset.
   */
  async function handleCreate() {
    if (!newBranchName) return;
    console.log(`[BranchSelector][Action] Creating branch ${newBranchName} from ${currentBranch}`);
    try {
      await gitService.createBranch(dashboardId, newBranchName, currentBranch);
      toast(`Created branch ${newBranchName}`, 'success');
      showCreate = false;
      newBranchName = '';
      await loadBranches();
      console.log(`[BranchSelector][Coherence:OK] Branch created`);
    } catch (e) {
      console.error(`[BranchSelector][Coherence:Failed] ${e.message}`);
      toast(e.message, 'error');
    }
  }
  // [/DEF:handleCreate:Function]
</script>

<!-- [SECTION: TEMPLATE] -->
<div class="space-y-3">
  <div class="flex items-center gap-3">
    <div class="flex-grow">
      <Select
        bind:value={currentBranch}
        on:change={handleSelect}
        disabled={loading}
        options={branches.map(b => ({ value: b.name, label: b.name }))}
      />
    </div>

    <Button
      variant="ghost"
      size="sm"
      on:click={() => showCreate = !showCreate}
      disabled={loading}
      class="text-blue-600"
    >
      + {$t.git.new_branch}
    </Button>
  </div>

  {#if showCreate}
    <div class="flex items-end gap-2 bg-gray-50 p-3 rounded-lg border border-dashed border-gray-200">
      <div class="flex-grow">
        <Input
          bind:value={newBranchName}
          placeholder="branch-name"
          disabled={loading}
        />
      </div>
      <Button
        variant="primary"
        size="sm"
        on:click={handleCreate}
        disabled={loading || !newBranchName}
        isLoading={loading}
        class="bg-green-600 hover:bg-green-700"
      >
        {$t.git.create}
      </Button>
      <Button
        variant="ghost"
        size="sm"
        on:click={() => showCreate = false}
        disabled={loading}
      >
        {$t.common.cancel}
      </Button>
    </div>
  {/if}
</div>
<!-- [/SECTION] -->

<!-- [/DEF:BranchSelector:Component] -->
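On the server side, `gitService.checkoutBranch` and `gitService.createBranch` presumably reduce to plain Git operations. A hedged sketch with GitPython, assuming one local repository per dashboard (the path scheme is an assumption):

```python
# Hedged sketch of branch checkout/creation with GitPython; the
# one-repository-per-dashboard layout is an assumption.
from git import Repo


def checkout_branch(repo_path: str, name: str) -> None:
    Repo(repo_path).git.checkout(name)


def create_branch(repo_path: str, name: str, base: str = "main") -> None:
    repo = Repo(repo_path)
    # Create the new head at the base branch's tip, then switch to it.
    new_head = repo.create_head(name, commit=base)
    new_head.checkout()
```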
95
frontend/src/components/git/CommitHistory.svelte
Normal file
@@ -0,0 +1,95 @@
<!-- [DEF:CommitHistory:Component] -->
<!--
@SEMANTICS: git, history, commits, audit
@PURPOSE: Displays the commit history for a specific dashboard.
@LAYER: Component
@RELATION: CALLS -> gitService.getHistory
-->

<script>
  // [SECTION: IMPORTS]
  import { onMount } from 'svelte';
  import { gitService } from '../../services/gitService';
  import { addToast as toast } from '../../lib/toasts.js';
  import { t } from '../../lib/i18n';
  import { Button } from '../../lib/ui';
  // [/SECTION]

  // [SECTION: PROPS]
  export let dashboardId;
  // [/SECTION]

  // [SECTION: STATE]
  let history = [];
  let loading = false;
  // [/SECTION]

  // [DEF:onMount:Function]
  /**
   * @purpose Load history when component is mounted.
   * @pre Component is initialized with dashboardId.
   * @post loadHistory is called.
   */
  onMount(async () => {
    await loadHistory();
  });
  // [/DEF:onMount:Function]

  // [DEF:loadHistory:Function]
  /**
   * @purpose Fetch commit history from the backend.
   * @pre dashboardId is valid.
   * @post history state is updated.
   */
  async function loadHistory() {
    console.log(`[CommitHistory][Action] Loading history for dashboard ${dashboardId}`);
    loading = true;
    try {
      history = await gitService.getHistory(dashboardId);
      console.log(`[CommitHistory][Coherence:OK] Loaded ${history.length} commits`);
    } catch (e) {
      console.error(`[CommitHistory][Coherence:Failed] ${e.message}`);
      toast('Failed to load commit history', 'error');
    } finally {
      loading = false;
    }
  }
  // [/DEF:loadHistory:Function]
</script>

<!-- [SECTION: TEMPLATE] -->
<div class="mt-2">
  <div class="flex justify-between items-center mb-6">
    <h3 class="text-sm font-semibold text-gray-400 uppercase tracking-wider">
      {$t.git.history}
    </h3>
    <Button variant="ghost" size="sm" on:click={loadHistory} class="text-blue-600">
      {$t.git.refresh}
    </Button>
  </div>

  {#if loading}
    <div class="flex items-center justify-center py-12">
      <div class="animate-spin rounded-full h-6 w-6 border-b-2 border-blue-600"></div>
    </div>
  {:else if history.length === 0}
    <p class="text-gray-500 italic text-center py-12">{$t.git.no_commits}</p>
  {:else}
    <div class="space-y-3 max-h-96 overflow-y-auto pr-2">
      {#each history as commit}
        <div class="border-l-2 border-blue-500 pl-4 py-1">
          <div class="flex justify-between items-start">
            <span class="font-medium text-sm">{commit.message}</span>
            <span class="text-xs text-gray-400 font-mono">{commit.hash.substring(0, 7)}</span>
          </div>
          <div class="text-xs text-gray-500 mt-1">
            {commit.author} • {new Date(commit.timestamp).toLocaleString()}
          </div>
        </div>
      {/each}
    </div>
  {/if}
</div>
<!-- [/SECTION] -->

<!-- [/DEF:CommitHistory:Component] -->
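The fields rendered above (`hash`, `message`, `author`, `timestamp`) suggest `gitService.getHistory` is a thin wrapper over the commit log. A hedged GitPython sketch shaped to return exactly those fields, under the same per-dashboard-repository assumption:

```python
# Hedged sketch of a commit-history lookup with GitPython, shaped to match
# the fields CommitHistory renders: hash, message, author, timestamp.
from git import Repo


def get_history(repo_path: str, limit: int = 50) -> list:
    repo = Repo(repo_path)
    return [
        {
            "hash": commit.hexsha,
            "message": commit.message.strip(),
            "author": commit.author.name,
            "timestamp": commit.committed_datetime.isoformat(),
        }
        for commit in repo.iter_commits(max_count=limit)
    ]
```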
Some files were not shown because too many files have changed in this diff.