Compare commits
8 Commits: 018-task-l ... d29bc511a2

| SHA1 |
|---|
| d29bc511a2 |
| a3a9f0788d |
| 77147dc95b |
| 026239e3bf |
| 4a0273a604 |
| edb2dd5263 |
| 76b98fcf8f |
| 794cc55fe7 |
@@ -33,6 +33,8 @@ Auto-generated from all feature plans. Last updated: 2025-12-19

- N/A (UI reorganization and API integration) (015-frontend-nav-redesign)
- SQLite (`auth.db`) for Users, Roles, Permissions, and Mappings. (016-multi-user-auth)
- SQLite (existing `tasks.db` for results, `auth.db` for permissions, `mappings.db` or new `plugins.db` for provider config/metadata) (017-llm-analysis-plugin)
- Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, SQLAlchemy, WebSocket (existing) (019-superset-ux-redesign)
- SQLite (tasks.db, auth.db, migrations.db) - no new database tables required (019-superset-ux-redesign)
- Python 3.9+ (Backend), Node.js 18+ (Frontend Build) (001-plugin-arch-svelte-ui)

@@ -53,9 +55,9 @@ cd src; pytest; ruff check .

Python 3.9+ (Backend), Node.js 18+ (Frontend Build): Follow standard conventions
## Recent Changes

- 019-superset-ux-redesign: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI, SvelteKit, Tailwind CSS, SQLAlchemy, WebSocket (existing)
- 017-llm-analysis-plugin: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
- 016-multi-user-auth: Added Python 3.9+ (Backend), Node.js 18+ (Frontend)
- 015-frontend-nav-redesign: Added Python 3.9+ (Backend), Node.js 18+ (Frontend) + FastAPI (Backend), SvelteKit + Tailwind CSS (Frontend)

<!-- MANUAL ADDITIONS START -->
@@ -117,7 +117,8 @@ You **MUST** consider the user input before proceeding (if not empty).

- **Validation checkpoints**: Verify each phase completion before proceeding

7. Implementation execution rules:
   - **Strict Adherence**: Apply `semantic_protocol.md` rules - every file must start with [DEF] header, include @TIER, and define contracts
   - **Strict Adherence**: Apply `semantic_protocol.md` rules - every file must start with [DEF] header, include @TIER, and define contracts.
   - **CRITICAL Contracts**: If a task description contains a contract summary (e.g., `CRITICAL: PRE: ..., POST: ...`), these constraints are **MANDATORY** and must be strictly implemented in the code using guards/assertions (if applicable per protocol).
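The guards/assertions requirement can be sketched concretely. Below is a minimal hedged illustration of enforcing the example contract from the rule above (`PRE: token exists, POST: returns User`); the `User` class and `resolve_user` function are hypothetical names for illustration, not code from the repository:

```python
# Hypothetical sketch: enforcing a CRITICAL contract (PRE: token exists,
# POST: returns a User) with explicit runtime guards, per the rule above.
from dataclasses import dataclass


@dataclass
class User:
    name: str


# [DEF:resolve_user:Function]
# @PRE: token is a non-empty string
# @POST: returns a User instance
def resolve_user(token: str) -> User:
    # @PRE guard: fail loudly if the precondition is violated
    if not token:
        raise ValueError("PRE violated: token must exist")
    user = User(name=f"user-{token}")
    # @POST guard: assert the guarantee before returning
    assert isinstance(user, User), "POST violated: must return a User"
    return user
# [/DEF:resolve_user:Function]
```

The point is that the contract summary embedded in the task description becomes executable checks, not just documentation.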
   - **Setup first**: Initialize project structure, dependencies, configuration
   - **Tests before code**: If tests are requested, write them for contracts, entities, and integration scenarios
   - **Core development**: Implement models, services, CLI commands, endpoints
@@ -66,25 +66,30 @@ You **MUST** consider the user input before proceeding (if not empty).

0. **Validate Design against UX Reference**:
   - Check if the proposed architecture supports the latency, interactivity, and flow defined in `ux_reference.md`.
   - **CRITICAL**: If the technical plan requires compromising the UX defined in `ux_reference.md` (e.g. "We can't do real-time validation because X"), you **MUST STOP** and warn the user. Do not proceed until resolved.
   - **Linkage**: Ensure key UI states from `ux_reference.md` map to Component Contracts (`@UX_STATE`).
   - **CRITICAL**: If the technical plan compromises the UX (e.g. "We can't do real-time validation"), you **MUST STOP** and warn the user.

1. **Extract entities from feature spec** → `data-model.md`:
   - Entity name, fields, relationships
   - Validation rules from requirements
   - State transitions if applicable
   - Entity name, fields, relationships, validation rules.

2. **Define Module & Function Contracts (Semantic Protocol)**:
   - **MANDATORY**: For every new module, define the [DEF] Header and Module-level Contract (@TIER, @PURPOSE, @INVARIANT) as per `semantic_protocol.md`.
   - **REQUIRED**: Define Function Contracts (@PRE, @POST) for critical logic.
   - Output specific contract definitions to `contracts/modules.md` or append to `data-model.md` to guide implementation.
   - Ensure strict adherence to `semantic_protocol.md` syntax.

2. **Design & Verify Contracts (Semantic Protocol)**:
   - **Drafting**: Define [DEF] Headers and Contracts for all new modules based on `semantic_protocol.md`.
   - **TIER Classification**: Explicitly assign `@TIER: [CRITICAL|STANDARD|TRIVIAL]` to each module.
   - **CRITICAL Requirements**: For all CRITICAL modules, define full `@PRE`, `@POST`, and (if UI) `@UX_STATE` contracts.
   - **Self-Review**:
     - *Completeness*: Do `@PRE`/`@POST` cover edge cases identified in Research?
     - *Connectivity*: Do `@RELATION` tags form a coherent graph?
     - *Compliance*: Does syntax match `[DEF:id:Type]` exactly?
   - **Output**: Write verified contracts to `contracts/modules.md`.

3. **Generate API contracts** from functional requirements:
   - For each user action → endpoint
   - Use standard REST/GraphQL patterns
   - Output OpenAPI/GraphQL schema to `/contracts/`

3. **Simulate Contract Usage**:
   - Trace one key user scenario through the defined contracts to ensure data flow continuity.
   - If a contract interface mismatch is found, fix it immediately.

3. **Agent context update**:

4. **Generate API contracts**:
   - Output OpenAPI/GraphQL schema to `/contracts/` for backend-frontend sync.

5. **Agent context update**:
   - Run `.specify/scripts/bash/update-agent-context.sh kilocode`
   - These scripts detect which AI agent is in use
   - Update the appropriate agent-specific context file
@@ -119,7 +119,10 @@ Every task MUST strictly follow this format:

- If tests requested: Tests specific to that story
- Mark story dependencies (most stories should be independent)

2. **From Contracts**:

2. **From Contracts (CRITICAL TIER)**:
   - Identify components marked as `@TIER: CRITICAL` in `contracts/modules.md`.
   - For these components, **MUST** append the summary of `@PRE`, `@POST`, and `@UX_STATE` contracts directly to the task description.
   - Example: `- [ ] T005 [P] [US1] Implement Auth (CRITICAL: PRE: token exists, POST: returns User) in src/auth.py`
   - Map each contract/endpoint → to the user story it serves
   - If tests requested: Each contract → contract test task [P] before implementation in that story's phase
@@ -1,9 +1,16 @@

---
description: Run semantic validation and functional tests for a specific feature, module, or file.

## **speckit.tasks.md**

### Modified Workflow

```markdown
description: Generate an actionable, dependency-ordered tasks.md for the feature based on available design artifacts.
handoffs:
  - label: Fix Implementation
  - label: Analyze For Consistency
    agent: speckit.analyze
    prompt: Run a project analysis for consistency
    send: true
  - label: Implement Project
    agent: speckit.implement
    prompt: Fix the issues found during testing...
    prompt: Start the implementation in phases
    send: true
---
@@ -13,54 +20,97 @@ handoffs:

$ARGUMENTS
```

**Input format:** Can be a file path, a directory, or a feature name.
You **MUST** consider the user input before proceeding (if not empty).

## Outline

1. **Context Analysis**:
   - Determine the target scope (Backend vs Frontend vs Full Feature).
   - Read `semantic_protocol.md` to load validation rules.

1. **Setup**: Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse FEATURE_DIR and AVAILABLE_DOCS list. All paths must be absolute.

2. **Phase 1: Semantic Static Analysis (The "Compiler" Check)**
   - **Command:** Use `grep` or a script to verify Protocol compliance before running code.
   - **Check:**
     - Does the file start with a `[DEF:...]` header?
     - Are `@TIER` and `@PURPOSE` defined?
     - Are imports located *after* the contracts?
     - Do functions marked "Critical" have `@PRE`/`@POST` tags?
   - **Action:** If this phase fails, **STOP** and report "Semantic Compilation Failed". Do not run runtime tests.
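A static check like the one described above can be sketched in a few lines. This is a minimal illustration derived from the header conventions quoted in the checklist (`[DEF:...]` on the first line, `@TIER`, `@PURPOSE`), not the project's actual validation script:

```python
import re

# Minimal sketch of the "compiler" check above: verify a source file starts
# with a [DEF:id:Type] header and declares @TIER and @PURPOSE tags.
def check_semantic_header(source: str) -> list:
    errors = []
    lines = source.splitlines()
    # First line must carry a [DEF:<id>:<Type>] header (optionally commented).
    if not lines or not re.match(r"^#?\s*\[DEF:[^\]]+:[^\]]+\]", lines[0]):
        errors.append("Missing [DEF:...] header on first line")
    if "@TIER" not in source:
        errors.append("Missing @TIER")
    if "@PURPOSE" not in source:
        errors.append("Missing @PURPOSE")
    return errors
```

An empty error list means the file passes the "semantic compilation" gate; otherwise the workflow stops before any runtime tests are attempted.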
2. **Load design documents**: Read from FEATURE_DIR:
   - **Required**: plan.md (tech stack, libraries, structure), spec.md (user stories with priorities), ux_reference.md (experience source of truth)
   - **Optional**: data-model.md (entities), contracts/ (API endpoints), research.md (decisions)

3. **Phase 2: Environment Prep**
   - Detect project type:
     - **Python**: Check if `.venv` is active.
     - **Svelte**: Check if `node_modules` exists.
   - **Command:** Run a linter (e.g., `ruff check`, `eslint`) to catch syntax errors immediately.

3. **Execute task generation workflow**:
   - **Architecture Analysis (CRITICAL)**: Scan existing codebase for patterns (DI, Auth, ORM).
   - Load plan.md/spec.md.
   - Generate tasks organized by user story.
   - **Apply Fractal Co-location**: Ensure all unit tests are mapped to `__tests__` subdirectories relative to the code.
   - Validate task completeness.

4. **Phase 3: Test Execution (Runtime)**
   - Select the test runner based on the file path:
     - **Backend (`*.py`)**:
       - Command: `pytest <path_to_test_file> -v`
       - If no specific test file exists, try to find it by convention: `tests/test_<module_name>.py`.
     - **Frontend (`*.svelte`, `*.ts`)**:
       - Command: `npm run test -- <path_to_component>`
   - **Verification**:
     - Analyze output logs.
     - If tests fail, summarize the failure (AssertionError, Timeout, etc.).

4. **Generate tasks.md**: Use `.specify/templates/tasks-template.md` as structure.
   - Phase 1: Context & Setup.
   - Phase 2: Foundational tasks.
   - Phase 3+: User Stories (Priority order).
   - Final Phase: Polish.
   - **Strict Constraint**: Ensure tasks follow the Co-location and Mocking rules below.

5. **Phase 4: Contract Coverage Check (Manual/LLM verify)**
   - Review the test cases executed.
   - **Question**: Do the tests explicitly verify the `@POST` guarantees defined in the module header?
   - **Report**: Mark as "Weak Coverage" if contracts exist but aren't tested.

5. **Report**: Output path to generated tasks.md and summary.

## Execution Rules

Context for task generation: $ARGUMENTS

- **Fail Fast**: If semantic headers are missing, don't waste time running pytest.
- **No Silent Failures**: Always output the full error log if a command fails.
- **Auto-Correction Hint**: If a test fails, suggest the specific `speckit.implement` command to fix it.

## Example Commands

- **Python**: `pytest backend/tests/test_auth.py`
- **Svelte**: `npm run test:unit -- src/components/Button.svelte`
- **Lint**: `ruff check backend/src/api/`

## Task Generation Rules

**CRITICAL**: Tasks MUST be actionable, specific, architecture-aware, and context-local.

### Implementation & Testing Constraints (ANTI-LOOP & CO-LOCATION)

To prevent infinite debugging loops and context fragmentation, apply these rules:

1. **Fractal Co-location Strategy (MANDATORY)**:
   - **Rule**: Unit tests MUST live next to the code they verify.
   - **Forbidden**: Do NOT create unit tests in root `tests/` or `backend/tests/`. Those are for E2E/Integration only.
   - **Pattern (Python)**:
     - Source: `src/domain/order/processing.py`
     - Test Task: `Create tests in src/domain/order/__tests__/test_processing.py`
   - **Pattern (Frontend)**:
     - Source: `src/lib/components/UserCard.svelte`
     - Test Task: `Create tests in src/lib/components/__tests__/UserCard.test.ts`

2. **Semantic Relations**:
   - Test generation tasks must explicitly instruct to add the relation header: `# @RELATION: VERIFIES -> [TargetComponent]`

3. **Strict Mocking for Unit Tests**:
   - Any task creating Unit Tests MUST specify: *"Use `unittest.mock.MagicMock` for heavy dependencies (DB sessions, Auth). Do NOT instantiate real service classes."*
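As an illustration of the mocking rule, a co-located unit test might look like the sketch below. The `OrderService` class and its method are hypothetical stand-ins (the rule forbids instantiating real *dependency* services, so only the class under test is built, with its heavy DB dependency mocked):

```python
# Hypothetical co-located unit test illustrating the MagicMock rule above:
# the DB session is mocked; no real service dependency is instantiated.
from unittest.mock import MagicMock


class OrderService:
    """Illustrative class under test (not from the repository)."""
    def __init__(self, db):
        self.db = db

    def count_orders(self):
        return len(self.db.fetch_orders())


# @RELATION: VERIFIES -> OrderService
def test_count_orders_uses_mocked_db():
    db = MagicMock()
    db.fetch_orders.return_value = [{"id": 1}, {"id": 2}]
    assert OrderService(db).count_orders() == 2
    db.fetch_orders.assert_called_once()
```

Such a test runs in milliseconds and cannot hang on a real database, which is exactly the anti-loop property the constraint is after.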

4. **Schema/Model Separation**:
   - Explicitly separate tasks for ORM Models (SQLAlchemy) and Pydantic Schemas.

### UX Preservation (CRITICAL)

- **Source of Truth**: `ux_reference.md` is the absolute standard.
- **Verification Task**: You **MUST** add a specific task at the end of each User Story phase: `- [ ] Txxx [USx] Verify implementation matches ux_reference.md (Happy Path & Errors)`

### Checklist Format (REQUIRED)

Every task MUST strictly follow this format:

```text
- [ ] [TaskID] [P?] [Story?] Description with file path
```

**Examples**:
- ✅ `- [ ] T005 [US1] Create unit tests for OrderService in src/services/__tests__/test_order.py (Mock DB)`
- ✅ `- [ ] T006 [US1] Implement OrderService in src/services/order.py`
- ❌ `- [ ] T005 [US1] Create tests in backend/tests/test_order.py` (VIOLATION: Wrong location)
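The required checklist format lends itself to a mechanical check. A sketch follows; the regex is an assumption derived from the format and examples above, not an official validator:

```python
import re

# Sketch of a validator for the checklist format above. The regex is an
# assumption derived from the documented shape, not a tool from the repo.
TASK_RE = re.compile(
    r"^- \[ \] T\d{3}"   # task ID, e.g. T005
    r"(?: \[P\])?"       # optional parallel marker
    r"(?: \[US\d+\])?"   # optional user-story marker
    r" .+"               # description with file path
)


def is_valid_task(line: str) -> bool:
    return bool(TASK_RE.match(line))
```

Running every generated line through such a check would catch format drift (missing task ID, wrong bracket style) before the tasks.md is handed to the implementer.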
### Task Organization & Phase Structure

**Phase 1: Context & Setup**
- **Goal**: Prepare environment and understand existing patterns.
- **Mandatory Task**: `- [ ] T001 Analyze existing project structure, auth patterns, and conftest.py location`

**Phase 2: Foundational (Data & Core)**
- Database Models (ORM).
- Pydantic Schemas (DTOs).
- Core Service interfaces.

**Phase 3+: User Stories (Iterative)**
- **Step 1: Isolation Tests (Co-located)**:
  - `- [ ] Txxx [USx] Create unit tests for [Component] in [Path]/__tests__/test_[name].py`
  - *Note: Specify using MagicMock for external deps.*
- **Step 2: Implementation**: Services -> Endpoints.
- **Step 3: Integration**: Wire up real dependencies (if E2E tests requested).
- **Step 4: UX Verification**.

**Final Phase: Polish**
- Linting, formatting, final manual verify.
@@ -1,9 +1,9 @@

<!--
SYNC IMPACT REPORT
Version: 2.2.0 (ConfigManager Discipline)
Version: 2.3.0 (Tailwind CSS & Scoped CSS Minimization)
Changes:
- Updated Principle II: Added mandatory requirement for using `ConfigManager` (via dependency injection) for all configuration access to ensure consistent environment handling and avoid hardcoded values.
- Updated Principle III: Refined `requestApi` requirement.
- Updated Principle III: Added mandatory requirement for Tailwind CSS usage and minimization of scoped `<style>` blocks in Svelte components.
- Version bump: 2.2.0 -> 2.3.0 (Minor: New material guidance added).
Templates Status:
- .specify/templates/plan-template.md: ✅ Aligned.
- .specify/templates/spec-template.md: ✅ Aligned.
@@ -27,6 +27,7 @@ All functional extensions, tools, or major features must be implemented as modul

### III. Unified Frontend Experience
To ensure a consistent and accessible user experience, all frontend implementations must strictly adhere to the unified design and localization standards.
- **Component Reusability**: All UI elements MUST utilize the standardized Svelte component library (`src/lib/ui`) and centralized design tokens.
- **Tailwind CSS First**: All styling MUST be implemented using Tailwind CSS utility classes. The use of scoped `<style>` blocks in Svelte components MUST be minimized and reserved only for complex animations or third-party overrides that cannot be achieved via Tailwind.
- **Internationalization (i18n)**: All user-facing text MUST be extracted to the translation system (`src/lib/i18n`).
- **Backend Communication**: All API requests MUST use the `requestApi` wrapper (or its derivatives like `fetchApi`, `postApi`) from `src/lib/api.js`. Direct use of the native `fetch` API for backend communication is FORBIDDEN to ensure consistent authentication (JWT) and error handling.
@@ -52,4 +53,4 @@ This Constitution establishes the "Semantic Code Generation Protocol" as the sup

- **Amendments**: Changes to core principles require a Constitution amendment. Changes to technical syntax require a Protocol update.
- **Compliance**: Failure to adhere to the Protocol constitutes a build failure.

**Version**: 2.2.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-01-29
**Version**: 2.3.0 | **Ratified**: 2025-12-19 | **Last Amended**: 2026-02-18
@@ -17,8 +17,8 @@

the iteration process.
-->

**Language/Version**: [e.g., Python 3.11, Swift 5.9, Rust 1.75 or NEEDS CLARIFICATION]
**Primary Dependencies**: [e.g., FastAPI, UIKit, LLVM or NEEDS CLARIFICATION]
**Language/Version**: [e.g., Python 3.11, Swift 5.9, Rust 1.75 or NEEDS CLARIFICATION]
**Primary Dependencies**: [e.g., FastAPI, Tailwind CSS, SvelteKit or NEEDS CLARIFICATION]
**Storage**: [if applicable, e.g., PostgreSQL, CoreData, files or N/A]
**Testing**: [e.g., pytest, XCTest, cargo test or NEEDS CLARIFICATION]
**Target Platform**: [e.g., Linux server, iOS 15+, WASM or NEEDS CLARIFICATION]
@@ -93,7 +93,8 @@ Examples of foundational tasks (adjust based on your project):

- [ ] T014 [US1] Implement [Service] in src/services/[service].py (depends on T012, T013)
- [ ] T015 [US1] Implement [endpoint/feature] in src/[location]/[file].py
- [ ] T016 [US1] Add validation and error handling
- [ ] T017 [US1] Add logging for user story 1 operations
- [ ] T017 [US1] [P] Implement UI using Tailwind CSS (minimize scoped styles)
- [ ] T018 [US1] Add logging for user story 1 operations

**Checkpoint**: At this point, User Story 1 should be fully functional and testable independently
115378 backend/logs/app.log.1 (file diff suppressed because it is too large)
Binary file not shown.
@@ -1 +1,3 @@

from . import plugins, tasks, settings, connections, environments, mappings, migration, git, storage, admin

__all__ = ['plugins', 'tasks', 'settings', 'connections', 'environments', 'mappings', 'migration', 'git', 'storage', 'admin']
@@ -21,8 +21,8 @@ from ...schemas.auth import (

    RoleSchema, RoleCreate, RoleUpdate, PermissionSchema,
    ADGroupMappingSchema, ADGroupMappingCreate
)
from ...models.auth import User, Role, Permission, ADGroupMapping
from ...dependencies import has_permission, get_current_user
from ...models.auth import User, Role, ADGroupMapping
from ...dependencies import has_permission
from ...core.logger import logger, belief_scope
# [/SECTION]
@@ -11,7 +11,7 @@ from fastapi import APIRouter, Depends, HTTPException, status

from sqlalchemy.orm import Session
from ...core.database import get_db
from ...models.connection import ConnectionConfig
from pydantic import BaseModel, Field
from pydantic import BaseModel
from datetime import datetime
from ...core.logger import logger, belief_scope
# [/SECTION]
327 backend/src/api/routes/dashboards.py (new file)
@@ -0,0 +1,327 @@

# [DEF:backend.src.api.routes.dashboards:Module]
#
# @TIER: STANDARD
# @SEMANTICS: api, dashboards, resources, hub
# @PURPOSE: API endpoints for the Dashboard Hub - listing dashboards with Git and task status
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.dependencies
# @RELATION: DEPENDS_ON -> backend.src.services.resource_service
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
#
# @INVARIANT: All dashboard responses include git_status and last_task metadata

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Optional, Dict
from pydantic import BaseModel, Field
from ...dependencies import get_config_manager, get_task_manager, get_resource_service, get_mapping_service, has_permission
from ...core.logger import logger, belief_scope
# [/SECTION]

router = APIRouter(prefix="/api/dashboards", tags=["Dashboards"])
# [DEF:GitStatus:DataClass]
class GitStatus(BaseModel):
    branch: Optional[str] = None
    # Group the alternation: an unanchored "^OK|DIFF$" would also accept
    # values like "OKAY" or "XDIFF".
    sync_status: Optional[str] = Field(None, pattern="^(OK|DIFF)$")
# [/DEF:GitStatus:DataClass]

# [DEF:LastTask:DataClass]
class LastTask(BaseModel):
    task_id: Optional[str] = None
    status: Optional[str] = Field(None, pattern="^(RUNNING|SUCCESS|ERROR|WAITING_INPUT)$")
# [/DEF:LastTask:DataClass]
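A quick note on the `pattern` alternations used in these status fields: a regex like `^OK|DIFF$` is parsed as `(^OK)|(DIFF$)` — each anchor binds to only one alternative — so strings such as `OKAY` or `XDIFF` slip through. Grouping fixes it:

```python
import re

# "^OK|DIFF$" parses as (^OK)|(DIFF$): each anchor binds to one alternative.
loose = re.compile(r"^OK|DIFF$")
strict = re.compile(r"^(OK|DIFF)$")

assert loose.search("OKAY")    # unintended match: starts with "OK"
assert loose.search("XDIFF")   # unintended match: ends with "DIFF"
assert strict.fullmatch("OK") and strict.fullmatch("DIFF")
assert not strict.fullmatch("OKAY")
```

The same reasoning applies to the longer `RUNNING|SUCCESS|ERROR|WAITING_INPUT` alternation: without a group, only the first and last alternatives are anchored.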

# [DEF:DashboardItem:DataClass]
class DashboardItem(BaseModel):
    id: int
    title: str
    slug: Optional[str] = None
    url: Optional[str] = None
    last_modified: Optional[str] = None
    git_status: Optional[GitStatus] = None
    last_task: Optional[LastTask] = None
# [/DEF:DashboardItem:DataClass]

# [DEF:DashboardsResponse:DataClass]
class DashboardsResponse(BaseModel):
    dashboards: List[DashboardItem]
    total: int
    page: int
    page_size: int
    total_pages: int
# [/DEF:DashboardsResponse:DataClass]

# [DEF:get_dashboards:Function]
# @PURPOSE: Fetch list of dashboards from a specific environment with Git status and last task status
# @PRE: env_id must be a valid environment ID
# @PRE: page must be >= 1 if provided
# @PRE: page_size must be between 1 and 100 if provided
# @POST: Returns a list of dashboards with enhanced metadata and pagination info
# @POST: Response includes pagination metadata (page, page_size, total, total_pages)
# @PARAM: env_id (str) - The environment ID to fetch dashboards from
# @PARAM: search (Optional[str]) - Filter by title/slug
# @PARAM: page (Optional[int]) - Page number (default: 1)
# @PARAM: page_size (Optional[int]) - Items per page (default: 10, max: 100)
# @RETURN: DashboardsResponse - List of dashboards with status metadata
# @RELATION: CALLS -> ResourceService.get_dashboards_with_status
@router.get("", response_model=DashboardsResponse)
async def get_dashboards(
    env_id: str,
    search: Optional[str] = None,
    page: int = 1,
    page_size: int = 10,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    resource_service=Depends(get_resource_service),
    _=Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_dashboards", f"env_id={env_id}, search={search}, page={page}, page_size={page_size}"):
        # Validate pagination parameters
        if page < 1:
            logger.error(f"[get_dashboards][Coherence:Failed] Invalid page: {page}")
            raise HTTPException(status_code=400, detail="Page must be >= 1")
        if page_size < 1 or page_size > 100:
            logger.error(f"[get_dashboards][Coherence:Failed] Invalid page_size: {page_size}")
            raise HTTPException(status_code=400, detail="Page size must be between 1 and 100")

        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            logger.error(f"[get_dashboards][Coherence:Failed] Environment not found: {env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Get all tasks for status lookup
            all_tasks = task_manager.get_all_tasks()

            # Fetch dashboards with status using ResourceService
            dashboards = await resource_service.get_dashboards_with_status(env, all_tasks)

            # Apply search filter if provided
            if search:
                search_lower = search.lower()
                dashboards = [
                    d for d in dashboards
                    if search_lower in d.get('title', '').lower()
                    or search_lower in d.get('slug', '').lower()
                ]

            # Calculate pagination
            total = len(dashboards)
            total_pages = (total + page_size - 1) // page_size if total > 0 else 1
            start_idx = (page - 1) * page_size
            end_idx = start_idx + page_size

            # Slice dashboards for current page
            paginated_dashboards = dashboards[start_idx:end_idx]

            logger.info(f"[get_dashboards][Coherence:OK] Returning {len(paginated_dashboards)} dashboards (page {page}/{total_pages}, total: {total})")

            return DashboardsResponse(
                dashboards=paginated_dashboards,
                total=total,
                page=page,
                page_size=page_size,
                total_pages=total_pages
            )

        except Exception as e:
            logger.error(f"[get_dashboards][Coherence:Failed] Failed to fetch dashboards: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to fetch dashboards: {str(e)}")
# [/DEF:get_dashboards:Function]
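The `total_pages` arithmetic in the handler above is integer ceiling division — `(total + page_size - 1) // page_size` rounds up without resorting to floats — and can be checked in isolation:

```python
# Ceiling division as used for total_pages in the handler above.
def total_pages(total: int, page_size: int) -> int:
    return (total + page_size - 1) // page_size if total > 0 else 1


assert total_pages(0, 10) == 1    # an empty list still reports one page
assert total_pages(10, 10) == 1   # exact multiple: no extra page
assert total_pages(11, 10) == 2   # one overflow item adds a page
assert total_pages(25, 10) == 3
```

The `total > 0` guard keeps an empty result set at one (empty) page rather than zero, which matches the slice `dashboards[start_idx:end_idx]` simply returning `[]` for page 1.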

# [DEF:MigrateRequest:DataClass]
class MigrateRequest(BaseModel):
    source_env_id: str = Field(..., description="Source environment ID")
    target_env_id: str = Field(..., description="Target environment ID")
    dashboard_ids: List[int] = Field(..., description="List of dashboard IDs to migrate")
    db_mappings: Optional[Dict[str, str]] = Field(None, description="Database mappings for migration")
    replace_db_config: bool = Field(False, description="Replace database configuration")
# [/DEF:MigrateRequest:DataClass]

# [DEF:TaskResponse:DataClass]
class TaskResponse(BaseModel):
    task_id: str
# [/DEF:TaskResponse:DataClass]

# [DEF:migrate_dashboards:Function]
# @PURPOSE: Trigger bulk migration of dashboards from source to target environment
# @PRE: User has permission plugin:migration:execute
# @PRE: source_env_id and target_env_id are valid environment IDs
# @PRE: dashboard_ids is a non-empty list
# @POST: Returns task_id for tracking migration progress
# @POST: Task is created and queued for execution
# @PARAM: request (MigrateRequest) - Migration request with source, target, and dashboard IDs
# @RETURN: TaskResponse - Task ID for tracking
# @RELATION: DISPATCHES -> MigrationPlugin
# @RELATION: CALLS -> task_manager.create_task
@router.post("/migrate", response_model=TaskResponse)
async def migrate_dashboards(
    request: MigrateRequest,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    _=Depends(has_permission("plugin:migration", "EXECUTE"))
):
    with belief_scope("migrate_dashboards", f"source={request.source_env_id}, target={request.target_env_id}, count={len(request.dashboard_ids)}"):
        # Validate request
        if not request.dashboard_ids:
            logger.error("[migrate_dashboards][Coherence:Failed] No dashboard IDs provided")
            raise HTTPException(status_code=400, detail="At least one dashboard ID must be provided")

        # Validate environments exist
        environments = config_manager.get_environments()
        source_env = next((e for e in environments if e.id == request.source_env_id), None)
        target_env = next((e for e in environments if e.id == request.target_env_id), None)

        if not source_env:
            logger.error(f"[migrate_dashboards][Coherence:Failed] Source environment not found: {request.source_env_id}")
            raise HTTPException(status_code=404, detail="Source environment not found")
        if not target_env:
            logger.error(f"[migrate_dashboards][Coherence:Failed] Target environment not found: {request.target_env_id}")
            raise HTTPException(status_code=404, detail="Target environment not found")

        try:
            # Create migration task
            task_params = {
                'source_env_id': request.source_env_id,
                'target_env_id': request.target_env_id,
                'selected_ids': request.dashboard_ids,
                'replace_db_config': request.replace_db_config,
                'db_mappings': request.db_mappings or {}
            }

            task_obj = await task_manager.create_task(
                plugin_id='superset-migration',
                params=task_params
            )

            logger.info(f"[migrate_dashboards][Coherence:OK] Migration task created: {task_obj.id} for {len(request.dashboard_ids)} dashboards")

            return TaskResponse(task_id=str(task_obj.id))

        except Exception as e:
            logger.error(f"[migrate_dashboards][Coherence:Failed] Failed to create migration task: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to create migration task: {str(e)}")
# [/DEF:migrate_dashboards:Function]
|
||||
# [DEF:BackupRequest:DataClass]
|
||||
class BackupRequest(BaseModel):
|
||||
env_id: str = Field(..., description="Environment ID")
|
||||
dashboard_ids: List[int] = Field(..., description="List of dashboard IDs to backup")
|
||||
schedule: Optional[str] = Field(None, description="Cron schedule for recurring backups (e.g., '0 0 * * *')")
|
# [/DEF:BackupRequest:DataClass]

# [DEF:backup_dashboards:Function]
# @PURPOSE: Trigger bulk backup of dashboards with optional cron schedule
# @PRE: User has permission plugin:backup:execute
# @PRE: env_id is a valid environment ID
# @PRE: dashboard_ids is a non-empty list
# @POST: Returns task_id for tracking backup progress
# @POST: Task is created and queued for execution
# @POST: If schedule is provided, a scheduled task is created
# @PARAM: request (BackupRequest) - Backup request with environment and dashboard IDs
# @RETURN: TaskResponse - Task ID for tracking
# @RELATION: DISPATCHES -> BackupPlugin
# @RELATION: CALLS -> task_manager.create_task
@router.post("/backup", response_model=TaskResponse)
async def backup_dashboards(
    request: BackupRequest,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    _ = Depends(has_permission("plugin:backup", "EXECUTE"))
):
    with belief_scope("backup_dashboards", f"env={request.env_id}, count={len(request.dashboard_ids)}, schedule={request.schedule}"):
        # Validate request
        if not request.dashboard_ids:
            logger.error("[backup_dashboards][Coherence:Failed] No dashboard IDs provided")
            raise HTTPException(status_code=400, detail="At least one dashboard ID must be provided")

        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == request.env_id), None)

        if not env:
            logger.error(f"[backup_dashboards][Coherence:Failed] Environment not found: {request.env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Create backup task
            task_params = {
                'env': request.env_id,
                'dashboards': request.dashboard_ids,
                'schedule': request.schedule
            }

            task_obj = await task_manager.create_task(
                plugin_id='superset-backup',
                params=task_params
            )

            logger.info(f"[backup_dashboards][Coherence:OK] Backup task created: {task_obj.id} for {len(request.dashboard_ids)} dashboards")

            return TaskResponse(task_id=str(task_obj.id))

        except Exception as e:
            logger.error(f"[backup_dashboards][Coherence:Failed] Failed to create backup task: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to create backup task: {str(e)}")
# [/DEF:backup_dashboards:Function]

# [DEF:DatabaseMapping:DataClass]
class DatabaseMapping(BaseModel):
    source_db: str
    target_db: str
    source_db_uuid: Optional[str] = None
    target_db_uuid: Optional[str] = None
    confidence: float
# [/DEF:DatabaseMapping:DataClass]

# [DEF:DatabaseMappingsResponse:DataClass]
class DatabaseMappingsResponse(BaseModel):
    mappings: List[DatabaseMapping]
# [/DEF:DatabaseMappingsResponse:DataClass]

# [DEF:get_database_mappings:Function]
# @PURPOSE: Get database mapping suggestions between source and target environments
# @PRE: User has permission plugin:migration:read
# @PRE: source_env_id and target_env_id are valid environment IDs
# @POST: Returns list of suggested database mappings with confidence scores
# @PARAM: source_env_id (str) - Source environment ID
# @PARAM: target_env_id (str) - Target environment ID
# @RETURN: DatabaseMappingsResponse - List of suggested mappings
# @RELATION: CALLS -> MappingService.get_suggestions
@router.get("/db-mappings", response_model=DatabaseMappingsResponse)
async def get_database_mappings(
    source_env_id: str,
    target_env_id: str,
    mapping_service=Depends(get_mapping_service),
    _ = Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_database_mappings", f"source={source_env_id}, target={target_env_id}"):
        try:
            # Get mapping suggestions using MappingService
            suggestions = await mapping_service.get_suggestions(source_env_id, target_env_id)

            # Format suggestions as DatabaseMapping objects
            mappings = [
                DatabaseMapping(
                    source_db=s.get('source_db', ''),
                    target_db=s.get('target_db', ''),
                    source_db_uuid=s.get('source_db_uuid'),
                    target_db_uuid=s.get('target_db_uuid'),
                    confidence=s.get('confidence', 0.0)
                )
                for s in suggestions
            ]

            logger.info(f"[get_database_mappings][Coherence:OK] Returning {len(mappings)} database mapping suggestions")

            return DatabaseMappingsResponse(mappings=mappings)

        except Exception as e:
            logger.error(f"[get_database_mappings][Coherence:Failed] Failed to get database mappings: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to get database mappings: {str(e)}")
# [/DEF:get_database_mappings:Function]

# [/DEF:backend.src.api.routes.dashboards:Module]
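`MappingService.get_suggestions` itself is not part of this diff; as a rough illustration of where a `confidence` score like the one in these responses could come from, here is a name-similarity sketch using only the standard library (the helper name and threshold are assumptions for illustration, not code from this change):

```python
from difflib import SequenceMatcher

def suggest_mappings(source_dbs, target_dbs, min_confidence=0.5):
    """Pair each source DB with its best-matching target DB by name similarity."""
    suggestions = []
    for src in source_dbs:
        best, score = None, 0.0
        for tgt in target_dbs:
            # Ratio in [0, 1]; 1.0 means identical names (case-insensitive here).
            ratio = SequenceMatcher(None, src.lower(), tgt.lower()).ratio()
            if ratio > score:
                best, score = tgt, ratio
        if best is not None and score >= min_confidence:
            suggestions.append({
                "source_db": src,
                "target_db": best,
                "confidence": round(score, 2),
            })
    return suggestions

pairs = suggest_mappings(["analytics_prod"], ["analytics_staging", "billing"])
```

Any real implementation would likely also compare engines and UUIDs, but a bounded score in [0, 1] keeps the `confidence: float` field meaningful to the frontend.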
395
backend/src/api/routes/datasets.py
Normal file
@@ -0,0 +1,395 @@
# [DEF:backend.src.api.routes.datasets:Module]
#
# @TIER: STANDARD
# @SEMANTICS: api, datasets, resources, hub
# @PURPOSE: API endpoints for the Dataset Hub - listing datasets with mapping progress
# @LAYER: API
# @RELATION: DEPENDS_ON -> backend.src.dependencies
# @RELATION: DEPENDS_ON -> backend.src.services.resource_service
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
#
# @INVARIANT: All dataset responses include last_task metadata

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Optional
from pydantic import BaseModel, Field
from ...dependencies import get_config_manager, get_task_manager, get_resource_service, has_permission
from ...core.logger import logger, belief_scope
from ...core.superset_client import SupersetClient
# [/SECTION]

router = APIRouter(prefix="/api/datasets", tags=["Datasets"])

# [DEF:MappedFields:DataClass]
class MappedFields(BaseModel):
    total: int
    mapped: int
# [/DEF:MappedFields:DataClass]

# [DEF:LastTask:DataClass]
class LastTask(BaseModel):
    task_id: Optional[str] = None
    status: Optional[str] = Field(None, pattern="^(RUNNING|SUCCESS|ERROR|WAITING_INPUT)$")
# [/DEF:LastTask:DataClass]

# [DEF:DatasetItem:DataClass]
class DatasetItem(BaseModel):
    id: int
    table_name: str
    schema: str
    database: str
    mapped_fields: Optional[MappedFields] = None
    last_task: Optional[LastTask] = None
# [/DEF:DatasetItem:DataClass]

# [DEF:LinkedDashboard:DataClass]
class LinkedDashboard(BaseModel):
    id: int
    title: str
    slug: Optional[str] = None
# [/DEF:LinkedDashboard:DataClass]

# [DEF:DatasetColumn:DataClass]
class DatasetColumn(BaseModel):
    id: int
    name: str
    type: Optional[str] = None
    is_dttm: bool = False
    is_active: bool = True
    description: Optional[str] = None
# [/DEF:DatasetColumn:DataClass]

# [DEF:DatasetDetailResponse:DataClass]
class DatasetDetailResponse(BaseModel):
    id: int
    table_name: Optional[str] = None
    schema: Optional[str] = None
    database: str
    description: Optional[str] = None
    columns: List[DatasetColumn]
    column_count: int
    sql: Optional[str] = None
    linked_dashboards: List[LinkedDashboard]
    linked_dashboard_count: int
    is_sqllab_view: bool = False
    created_on: Optional[str] = None
    changed_on: Optional[str] = None
# [/DEF:DatasetDetailResponse:DataClass]

# [DEF:DatasetsResponse:DataClass]
class DatasetsResponse(BaseModel):
    datasets: List[DatasetItem]
    total: int
    page: int
    page_size: int
    total_pages: int
# [/DEF:DatasetsResponse:DataClass]

# [DEF:TaskResponse:DataClass]
class TaskResponse(BaseModel):
    task_id: str
# [/DEF:TaskResponse:DataClass]
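One subtlety in the `status` field's `pattern`: in a Python alternation such as `RUNNING|SUCCESS|ERROR|WAITING_INPUT`, `^` and `$` bind only to the alternative they touch, so without a surrounding group an unanchored validator can let substrings through. A quick standalone check:

```python
import re

# Grouped form: the anchors apply to every alternative.
STATUS = re.compile(r"^(RUNNING|SUCCESS|ERROR|WAITING_INPUT)$")

# Ungrouped form: "^" binds only to RUNNING, "$" only to WAITING_INPUT.
LOOSE = re.compile(r"^RUNNING|SUCCESS|ERROR|WAITING_INPUT$")

ok = bool(STATUS.search("SUCCESS"))           # valid status accepted
bad = STATUS.search("XSUCCESSX") is None      # junk rejected by grouped form
loose_leak = bool(LOOSE.search("XSUCCESSX"))  # bare "SUCCESS" substring slips past the ungrouped form
```

With search-style matching (which unanchored validators use), only the grouped pattern rejects strings that merely contain a status name.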

# [DEF:get_dataset_ids:Function]
# @PURPOSE: Fetch list of all dataset IDs from a specific environment (without pagination)
# @PRE: env_id must be a valid environment ID
# @POST: Returns a list of all dataset IDs
# @PARAM: env_id (str) - The environment ID to fetch datasets from
# @PARAM: search (Optional[str]) - Filter by table name
# @RETURN: List[int] - List of dataset IDs
# @RELATION: CALLS -> ResourceService.get_datasets_with_status
@router.get("/ids")
async def get_dataset_ids(
    env_id: str,
    search: Optional[str] = None,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    resource_service=Depends(get_resource_service),
    _ = Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_dataset_ids", f"env_id={env_id}, search={search}"):
        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            logger.error(f"[get_dataset_ids][Coherence:Failed] Environment not found: {env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Get all tasks for status lookup
            all_tasks = task_manager.get_all_tasks()

            # Fetch datasets with status using ResourceService
            datasets = await resource_service.get_datasets_with_status(env, all_tasks)

            # Apply search filter if provided
            if search:
                search_lower = search.lower()
                datasets = [
                    d for d in datasets
                    if search_lower in d.get('table_name', '').lower()
                ]

            # Extract and return just the IDs
            dataset_ids = [d['id'] for d in datasets]
            logger.info(f"[get_dataset_ids][Coherence:OK] Returning {len(dataset_ids)} dataset IDs")

            return {"dataset_ids": dataset_ids}

        except Exception as e:
            logger.error(f"[get_dataset_ids][Coherence:Failed] Failed to fetch dataset IDs: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to fetch dataset IDs: {str(e)}")
# [/DEF:get_dataset_ids:Function]

# [DEF:get_datasets:Function]
# @PURPOSE: Fetch list of datasets from a specific environment with mapping progress
# @PRE: env_id must be a valid environment ID
# @PRE: page must be >= 1 if provided
# @PRE: page_size must be between 1 and 100 if provided
# @POST: Returns a list of datasets with enhanced metadata and pagination info
# @POST: Response includes pagination metadata (page, page_size, total, total_pages)
# @PARAM: env_id (str) - The environment ID to fetch datasets from
# @PARAM: search (Optional[str]) - Filter by table name
# @PARAM: page (Optional[int]) - Page number (default: 1)
# @PARAM: page_size (Optional[int]) - Items per page (default: 10, max: 100)
# @RETURN: DatasetsResponse - List of datasets with status metadata
# @RELATION: CALLS -> ResourceService.get_datasets_with_status
@router.get("", response_model=DatasetsResponse)
async def get_datasets(
    env_id: str,
    search: Optional[str] = None,
    page: int = 1,
    page_size: int = 10,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    resource_service=Depends(get_resource_service),
    _ = Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_datasets", f"env_id={env_id}, search={search}, page={page}, page_size={page_size}"):
        # Validate pagination parameters
        if page < 1:
            logger.error(f"[get_datasets][Coherence:Failed] Invalid page: {page}")
            raise HTTPException(status_code=400, detail="Page must be >= 1")
        if page_size < 1 or page_size > 100:
            logger.error(f"[get_datasets][Coherence:Failed] Invalid page_size: {page_size}")
            raise HTTPException(status_code=400, detail="Page size must be between 1 and 100")

        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            logger.error(f"[get_datasets][Coherence:Failed] Environment not found: {env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Get all tasks for status lookup
            all_tasks = task_manager.get_all_tasks()

            # Fetch datasets with status using ResourceService
            datasets = await resource_service.get_datasets_with_status(env, all_tasks)

            # Apply search filter if provided
            if search:
                search_lower = search.lower()
                datasets = [
                    d for d in datasets
                    if search_lower in d.get('table_name', '').lower()
                ]

            # Calculate pagination
            total = len(datasets)
            total_pages = (total + page_size - 1) // page_size if total > 0 else 1
            start_idx = (page - 1) * page_size
            end_idx = start_idx + page_size

            # Slice datasets for current page
            paginated_datasets = datasets[start_idx:end_idx]

            logger.info(f"[get_datasets][Coherence:OK] Returning {len(paginated_datasets)} datasets (page {page}/{total_pages}, total: {total})")

            return DatasetsResponse(
                datasets=paginated_datasets,
                total=total,
                page=page,
                page_size=page_size,
                total_pages=total_pages
            )

        except Exception as e:
            logger.error(f"[get_datasets][Coherence:Failed] Failed to fetch datasets: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to fetch datasets: {str(e)}")
# [/DEF:get_datasets:Function]
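The pagination arithmetic in `get_datasets` is self-contained and easy to verify in isolation; this sketch mirrors the ceiling-division logic of that handler:

```python
def paginate(items, page=1, page_size=10):
    """Slice a list the way get_datasets does: ceiling division for the page count."""
    total = len(items)
    # (total + page_size - 1) // page_size rounds up; empty input still reports 1 page.
    total_pages = (total + page_size - 1) // page_size if total > 0 else 1
    start = (page - 1) * page_size
    return items[start:start + page_size], total, total_pages

# 25 items at 10 per page: page 3 holds the trailing 5 items.
rows, total, pages = paginate(list(range(25)), page=3, page_size=10)
```

A page past the end simply yields an empty slice, which matches Python's slicing semantics and avoids an extra bounds check in the handler.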

# [DEF:MapColumnsRequest:DataClass]
class MapColumnsRequest(BaseModel):
    env_id: str = Field(..., description="Environment ID")
    dataset_ids: List[int] = Field(..., description="List of dataset IDs to map")
    source_type: str = Field(..., description="Source type: 'postgresql' or 'xlsx'")
    connection_id: Optional[str] = Field(None, description="Connection ID for PostgreSQL source")
    file_data: Optional[str] = Field(None, description="File path or data for XLSX source")
# [/DEF:MapColumnsRequest:DataClass]

# [DEF:map_columns:Function]
# @PURPOSE: Trigger bulk column mapping for datasets
# @PRE: User has permission plugin:mapper:execute
# @PRE: env_id is a valid environment ID
# @PRE: dataset_ids is a non-empty list
# @POST: Returns task_id for tracking mapping progress
# @POST: Task is created and queued for execution
# @PARAM: request (MapColumnsRequest) - Mapping request with environment and dataset IDs
# @RETURN: TaskResponse - Task ID for tracking
# @RELATION: DISPATCHES -> MapperPlugin
# @RELATION: CALLS -> task_manager.create_task
@router.post("/map-columns", response_model=TaskResponse)
async def map_columns(
    request: MapColumnsRequest,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    _ = Depends(has_permission("plugin:mapper", "EXECUTE"))
):
    with belief_scope("map_columns", f"env={request.env_id}, count={len(request.dataset_ids)}, source={request.source_type}"):
        # Validate request
        if not request.dataset_ids:
            logger.error("[map_columns][Coherence:Failed] No dataset IDs provided")
            raise HTTPException(status_code=400, detail="At least one dataset ID must be provided")

        # Validate source type
        if request.source_type not in ['postgresql', 'xlsx']:
            logger.error(f"[map_columns][Coherence:Failed] Invalid source type: {request.source_type}")
            raise HTTPException(status_code=400, detail="Source type must be 'postgresql' or 'xlsx'")

        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == request.env_id), None)

        if not env:
            logger.error(f"[map_columns][Coherence:Failed] Environment not found: {request.env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Create mapping task
            task_params = {
                'env': request.env_id,
                'dataset_id': request.dataset_ids[0] if request.dataset_ids else None,
                'source': request.source_type,
                'connection_id': request.connection_id,
                'file_data': request.file_data
            }

            task_obj = await task_manager.create_task(
                plugin_id='dataset-mapper',
                params=task_params
            )

            logger.info(f"[map_columns][Coherence:OK] Mapping task created: {task_obj.id} for {len(request.dataset_ids)} datasets")

            return TaskResponse(task_id=str(task_obj.id))

        except Exception as e:
            logger.error(f"[map_columns][Coherence:Failed] Failed to create mapping task: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to create mapping task: {str(e)}")
# [/DEF:map_columns:Function]
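Note that `map_columns` accepts a list of dataset IDs but forwards only `dataset_ids[0]` into the task params. If one task per dataset were ever intended, the fan-out could look roughly like the following sketch, where `create_task` is a stand-in for `task_manager.create_task` (this is an assumption for illustration, not the behavior of this change):

```python
import asyncio

async def create_task(plugin_id, params):
    # Stand-in for task_manager.create_task: returns a fake task id string.
    await asyncio.sleep(0)
    return f"{plugin_id}:{params['dataset_id']}"

async def fan_out(dataset_ids):
    """Create one mapping task per dataset id, concurrently, preserving input order."""
    return list(await asyncio.gather(
        *(create_task("dataset-mapper", {"dataset_id": d}) for d in dataset_ids)
    ))

task_ids = asyncio.run(fan_out([1, 2, 3]))
```

`asyncio.gather` returns results in argument order, so the i-th task id corresponds to the i-th dataset regardless of completion order.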

# [DEF:GenerateDocsRequest:DataClass]
class GenerateDocsRequest(BaseModel):
    env_id: str = Field(..., description="Environment ID")
    dataset_ids: List[int] = Field(..., description="List of dataset IDs to generate docs for")
    llm_provider: str = Field(..., description="LLM provider to use")
    options: Optional[dict] = Field(None, description="Additional options for documentation generation")
# [/DEF:GenerateDocsRequest:DataClass]

# [DEF:generate_docs:Function]
# @PURPOSE: Trigger bulk documentation generation for datasets
# @PRE: User has permission plugin:llm_analysis:execute
# @PRE: env_id is a valid environment ID
# @PRE: dataset_ids is a non-empty list
# @POST: Returns task_id for tracking documentation generation progress
# @POST: Task is created and queued for execution
# @PARAM: request (GenerateDocsRequest) - Documentation generation request
# @RETURN: TaskResponse - Task ID for tracking
# @RELATION: DISPATCHES -> LLMAnalysisPlugin
# @RELATION: CALLS -> task_manager.create_task
@router.post("/generate-docs", response_model=TaskResponse)
async def generate_docs(
    request: GenerateDocsRequest,
    config_manager=Depends(get_config_manager),
    task_manager=Depends(get_task_manager),
    _ = Depends(has_permission("plugin:llm_analysis", "EXECUTE"))
):
    with belief_scope("generate_docs", f"env={request.env_id}, count={len(request.dataset_ids)}, provider={request.llm_provider}"):
        # Validate request
        if not request.dataset_ids:
            logger.error("[generate_docs][Coherence:Failed] No dataset IDs provided")
            raise HTTPException(status_code=400, detail="At least one dataset ID must be provided")

        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == request.env_id), None)

        if not env:
            logger.error(f"[generate_docs][Coherence:Failed] Environment not found: {request.env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Create documentation generation task
            task_params = {
                'environment_id': request.env_id,
                'dataset_id': str(request.dataset_ids[0]) if request.dataset_ids else None,
                'provider_id': request.llm_provider,
                'options': request.options or {}
            }

            task_obj = await task_manager.create_task(
                plugin_id='llm_documentation',
                params=task_params
            )

            logger.info(f"[generate_docs][Coherence:OK] Documentation generation task created: {task_obj.id} for {len(request.dataset_ids)} datasets")

            return TaskResponse(task_id=str(task_obj.id))

        except Exception as e:
            logger.error(f"[generate_docs][Coherence:Failed] Failed to create documentation generation task: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to create documentation generation task: {str(e)}")
# [/DEF:generate_docs:Function]

# [DEF:get_dataset_detail:Function]
# @PURPOSE: Get detailed dataset information including columns and linked dashboards
# @PRE: env_id is a valid environment ID
# @PRE: dataset_id is a valid dataset ID
# @POST: Returns detailed dataset info with columns and linked dashboards
# @PARAM: env_id (str) - The environment ID
# @PARAM: dataset_id (int) - The dataset ID
# @RETURN: DatasetDetailResponse - Detailed dataset information
# @RELATION: CALLS -> SupersetClient.get_dataset_detail
@router.get("/{dataset_id}", response_model=DatasetDetailResponse)
async def get_dataset_detail(
    env_id: str,
    dataset_id: int,
    config_manager=Depends(get_config_manager),
    _ = Depends(has_permission("plugin:migration", "READ"))
):
    with belief_scope("get_dataset_detail", f"env_id={env_id}, dataset_id={dataset_id}"):
        # Validate environment exists
        environments = config_manager.get_environments()
        env = next((e for e in environments if e.id == env_id), None)
        if not env:
            logger.error(f"[get_dataset_detail][Coherence:Failed] Environment not found: {env_id}")
            raise HTTPException(status_code=404, detail="Environment not found")

        try:
            # Fetch detailed dataset info using SupersetClient
            client = SupersetClient(env)
            dataset_detail = client.get_dataset_detail(dataset_id)

            logger.info(f"[get_dataset_detail][Coherence:OK] Retrieved dataset {dataset_id} with {dataset_detail['column_count']} columns and {dataset_detail['linked_dashboard_count']} linked dashboards")

            return DatasetDetailResponse(**dataset_detail)

        except Exception as e:
            logger.error(f"[get_dataset_detail][Coherence:Failed] Failed to fetch dataset detail: {e}")
            raise HTTPException(status_code=503, detail=f"Failed to fetch dataset detail: {str(e)}")
# [/DEF:get_dataset_detail:Function]

# [/DEF:backend.src.api.routes.datasets:Module]

@@ -11,15 +11,14 @@

# [SECTION: IMPORTS]
from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict, Optional
from typing import List, Optional
from ...dependencies import get_config_manager, get_scheduler_service, has_permission
from ...core.superset_client import SupersetClient
from pydantic import BaseModel, Field
from ...core.config_models import Environment as EnvModel
from ...core.logger import belief_scope
# [/SECTION]

router = APIRouter()
router = APIRouter(prefix="/api/environments", tags=["Environments"])

# [DEF:ScheduleSchema:DataClass]
class ScheduleSchema(BaseModel):
@@ -44,6 +43,8 @@ class DatabaseResponse(BaseModel):

# [DEF:get_environments:Function]
# @PURPOSE: List all configured environments.
# @LAYER: API
# @SEMANTICS: list, environments, config
# @PRE: config_manager is injected via Depends.
# @POST: Returns a list of EnvironmentResponse objects.
# @RETURN: List[EnvironmentResponse]
@@ -72,6 +73,8 @@ async def get_environments(

# [DEF:update_environment_schedule:Function]
# @PURPOSE: Update backup schedule for an environment.
# @LAYER: API
# @SEMANTICS: update, schedule, backup, environment
# @PRE: Environment id exists, schedule is valid ScheduleSchema.
# @POST: Backup schedule updated and scheduler reloaded.
# @PARAM: id (str) - The environment ID.
@@ -104,6 +107,8 @@ async def update_environment_schedule(

# [DEF:get_environment_databases:Function]
# @PURPOSE: Fetch the list of databases from a specific environment.
# @LAYER: API
# @SEMANTICS: fetch, databases, superset, environment
# @PRE: Environment id exists.
# @POST: Returns a list of database summaries from the environment.
# @PARAM: id (str) - The environment ID.

@@ -16,17 +16,17 @@ from typing import List, Optional
import typing
from src.dependencies import get_config_manager, has_permission
from src.core.database import get_db
from src.models.git import GitServerConfig, GitStatus, DeploymentEnvironment, GitRepository
from src.models.git import GitServerConfig, GitRepository
from src.api.routes.git_schemas import (
    GitServerConfigSchema, GitServerConfigCreate,
    GitRepositorySchema, BranchSchema, BranchCreate,
    BranchSchema, BranchCreate,
    BranchCheckout, CommitSchema, CommitCreate,
    DeploymentEnvironmentSchema, DeployRequest, RepoInitRequest
)
from src.services.git_service import GitService
from src.core.logger import logger, belief_scope

router = APIRouter(prefix="/api/git", tags=["git"])
router = APIRouter(tags=["git"])
git_service = GitService()

# [DEF:get_git_configs:Function]

@@ -11,7 +11,6 @@
from pydantic import BaseModel, Field
from typing import List, Optional
from datetime import datetime
from uuid import UUID
from src.models.git import GitProvider, GitStatus, SyncStatus

# [DEF:GitServerConfigBase:Class]

@@ -16,7 +16,7 @@ from sqlalchemy.orm import Session

# [DEF:router:Global]
# @PURPOSE: APIRouter instance for LLM routes.
router = APIRouter(prefix="/api/llm", tags=["LLM"])
router = APIRouter(tags=["LLM"])
# [/DEF:router:Global]

# [DEF:get_providers:Function]

@@ -21,7 +21,7 @@ from ...models.mapping import DatabaseMapping
from pydantic import BaseModel
# [/SECTION]

router = APIRouter(prefix="/api/mappings", tags=["mappings"])
router = APIRouter(tags=["mappings"])

# [DEF:MappingCreate:DataClass]
class MappingCreate(BaseModel):
@@ -31,6 +31,7 @@ class MappingCreate(BaseModel):
    target_db_uuid: str
    source_db_name: str
    target_db_name: str
    engine: Optional[str] = None
# [/DEF:MappingCreate:DataClass]

# [DEF:MappingResponse:DataClass]
@@ -42,6 +43,7 @@ class MappingResponse(BaseModel):
    target_db_uuid: str
    source_db_name: str
    target_db_name: str
    engine: Optional[str] = None

    class Config:
        from_attributes = True
@@ -94,6 +96,7 @@ async def create_mapping(
    if existing:
        existing.target_db_uuid = mapping.target_db_uuid
        existing.target_db_name = mapping.target_db_name
        existing.engine = mapping.engine
        db.commit()
        db.refresh(existing)
        return existing

@@ -7,7 +7,7 @@
# @RELATION: DEPENDS_ON -> backend.src.models.dashboard

from fastapi import APIRouter, Depends, HTTPException
from typing import List, Dict
from typing import List
from ...dependencies import get_config_manager, get_task_manager, has_permission
from ...models.dashboard import DashboardMetadata, DashboardSelection
from ...core.superset_client import SupersetClient
@@ -44,7 +44,7 @@ async def get_dashboards(
# @POST: Starts the migration task and returns the task ID.
# @PARAM: selection (DashboardSelection) - The dashboards to migrate.
# @RETURN: Dict - {"task_id": str, "message": str}
@router.post("/migration/execute")
@router.post("/execute")
async def execute_migration(
    selection: DashboardSelection,
    config_manager=Depends(get_config_manager),

@@ -17,9 +17,8 @@ from ...core.config_models import AppConfig, Environment, GlobalSettings, Loggin
from ...models.storage import StorageConfig
from ...dependencies import get_config_manager, has_permission
from ...core.config_manager import ConfigManager
from ...core.logger import logger, belief_scope, get_task_log_level
from ...core.logger import logger, belief_scope
from ...core.superset_client import SupersetClient
import os
# [/SECTION]

# [DEF:LoggingConfigResponse:Class]
@@ -279,4 +278,99 @@ async def update_logging_config(
    )
# [/DEF:update_logging_config:Function]

# [DEF:ConsolidatedSettingsResponse:Class]
class ConsolidatedSettingsResponse(BaseModel):
    environments: List[dict]
    connections: List[dict]
    llm: dict
    llm_providers: List[dict]
    logging: dict
    storage: dict
# [/DEF:ConsolidatedSettingsResponse:Class]

# [DEF:get_consolidated_settings:Function]
# @PURPOSE: Retrieves all settings categories in a single call
# @PRE: Config manager is available.
# @POST: Returns all consolidated settings.
# @RETURN: ConsolidatedSettingsResponse - All settings categories.
@router.get("/consolidated", response_model=ConsolidatedSettingsResponse)
async def get_consolidated_settings(
    config_manager: ConfigManager = Depends(get_config_manager),
    _ = Depends(has_permission("admin:settings", "READ"))
):
    with belief_scope("get_consolidated_settings"):
        logger.info("[get_consolidated_settings][Entry] Fetching all consolidated settings")

        config = config_manager.get_config()

        from ...services.llm_provider import LLMProviderService
        from ...core.database import SessionLocal
        db = SessionLocal()
        try:
            llm_service = LLMProviderService(db)
            providers = llm_service.get_all_providers()
            llm_providers_list = [
                {
                    "id": p.id,
                    "provider_type": p.provider_type,
                    "name": p.name,
                    "base_url": p.base_url,
                    "api_key": "********",
                    "default_model": p.default_model,
                    "is_active": p.is_active
                } for p in providers
            ]
        finally:
            db.close()

        return ConsolidatedSettingsResponse(
            environments=[env.dict() for env in config.environments],
            connections=config.settings.connections,
            llm=config.settings.llm,
            llm_providers=llm_providers_list,
            logging=config.settings.logging.dict(),
            storage=config.settings.storage.dict()
        )
# [/DEF:get_consolidated_settings:Function]

# [DEF:update_consolidated_settings:Function]
# @PURPOSE: Bulk update application settings from the consolidated view.
# @PRE: User has admin permissions, config is valid.
# @POST: Settings are updated and saved via ConfigManager.
@router.patch("/consolidated")
async def update_consolidated_settings(
    settings_patch: dict,
    config_manager: ConfigManager = Depends(get_config_manager),
    _ = Depends(has_permission("admin:settings", "WRITE"))
):
    with belief_scope("update_consolidated_settings"):
        logger.info("[update_consolidated_settings][Entry] Applying consolidated settings patch")

        current_config = config_manager.get_config()
        current_settings = current_config.settings

        # Update connections if provided
        if "connections" in settings_patch:
            current_settings.connections = settings_patch["connections"]

        # Update LLM if provided
        if "llm" in settings_patch:
            current_settings.llm = settings_patch["llm"]

        # Update Logging if provided
        if "logging" in settings_patch:
            current_settings.logging = LoggingConfig(**settings_patch["logging"])

        # Update Storage if provided
        if "storage" in settings_patch:
            new_storage = StorageConfig(**settings_patch["storage"])
            is_valid, message = config_manager.validate_path(new_storage.root_path)
            if not is_valid:
                raise HTTPException(status_code=400, detail=message)
            current_settings.storage = new_storage

        config_manager.update_global_settings(current_settings)
        return {"status": "success", "message": "Settings updated"}
# [/DEF:update_consolidated_settings:Function]

# [/DEF:SettingsRouter:Module]
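`update_consolidated_settings` only touches the keys present in the patch body. The merge semantics can be checked independently of FastAPI; plain dicts stand in for the config objects here (an illustrative assumption, not the route's actual types):

```python
def apply_settings_patch(current, patch,
                         allowed=("connections", "llm", "logging", "storage")):
    """Merge a partial settings patch: only whitelisted keys present in the patch change."""
    merged = dict(current)
    for key in allowed:
        if key in patch:
            merged[key] = patch[key]
    return merged

settings = {"llm": {"provider": "a"}, "logging": {"level": "INFO"}}
patched = apply_settings_patch(settings, {"logging": {"level": "DEBUG"}})
```

Keys absent from the patch survive untouched, and the original dict is not mutated, mirroring how the route leaves unpatched categories alone.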

@@ -6,7 +6,7 @@
# @RELATION: Depends on the TaskManager. It is included by the main app.
from typing import List, Dict, Any, Optional
from fastapi import APIRouter, Depends, HTTPException, status, Query
from pydantic import BaseModel, Field
from pydantic import BaseModel
from ...core.logger import belief_scope

from ...core.task_manager import TaskManager, Task, TaskStatus, LogEntry
||||
@@ -6,26 +6,23 @@
 # @RELATION: Depends on the dependency module and API route modules.
 # @INVARIANT: Only one FastAPI app instance exists per process.
 # @INVARIANT: All WebSocket connections must be properly cleaned up on disconnect.
 import sys
 from pathlib import Path

 # project_root is used for static files mounting
 project_root = Path(__file__).resolve().parent.parent.parent

-from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, Request, HTTPException
+from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Request, HTTPException
 from starlette.middleware.sessions import SessionMiddleware
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
 import asyncio
 import os

 from .dependencies import get_task_manager, get_scheduler_service
 from .core.utils.network import NetworkError
 from .core.logger import logger, belief_scope
-from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage, admin, llm
+from .api.routes import plugins, tasks, settings, environments, mappings, migration, connections, git, storage, admin, llm, dashboards, datasets
 from .api import auth
 from .core.database import init_db

 # [DEF:App:Global]
 # @SEMANTICS: app, fastapi, instance

@@ -118,12 +115,21 @@ app.include_router(plugins.router, prefix="/api/plugins", tags=["Plugins"])
 app.include_router(tasks.router, prefix="/api/tasks", tags=["Tasks"])
 app.include_router(settings.router, prefix="/api/settings", tags=["Settings"])
 app.include_router(connections.router, prefix="/api/settings/connections", tags=["Connections"])
-app.include_router(environments.router, prefix="/api/environments", tags=["Environments"])
-app.include_router(mappings.router)
+app.include_router(environments.router, tags=["Environments"])
+app.include_router(mappings.router, prefix="/api/mappings", tags=["Mappings"])
 app.include_router(migration.router)
-app.include_router(git.router)
-app.include_router(llm.router)
+app.include_router(git.router, prefix="/api/git", tags=["Git"])
+app.include_router(llm.router, prefix="/api/llm", tags=["LLM"])
 app.include_router(storage.router, prefix="/api/storage", tags=["Storage"])
+app.include_router(dashboards.router)
+app.include_router(datasets.router)
+
+# [DEF:api.include_routers:Action]
+# @PURPOSE: Registers all API routers with the FastAPI application.
+# @LAYER: API
+# @SEMANTICS: routes, registration, api
+# [/DEF:api.include_routers:Action]

 # [DEF:websocket_endpoint:Function]
 # @PURPOSE: Provides a WebSocket endpoint for real-time log streaming of a task with server-side filtering.

@@ -234,25 +240,20 @@ async def websocket_endpoint(
 frontend_path = project_root / "frontend" / "build"
 if frontend_path.exists():
     app.mount("/_app", StaticFiles(directory=str(frontend_path / "_app")), name="static")

     # Serve other static files from the root of build directory
     # [DEF:serve_spa:Function]
     # @PURPOSE: Serves frontend static files or index.html for SPA routing.
     # @PRE: file_path is requested by the client.
     # @POST: Returns the requested file or index.html as a fallback.
-    @app.get("/{file_path:path}")
+    @app.get("/{file_path:path}", include_in_schema=False)
     async def serve_spa(file_path: str):
         with belief_scope("serve_spa", f"path={file_path}"):
-            # Don't serve SPA for API routes that fell through
-            if file_path.startswith("api/"):
-                logger.info(f"[DEBUG] API route fell through to serve_spa: {file_path}")
-                raise HTTPException(status_code=404, detail=f"API endpoint not found: {file_path}")
-
-            full_path = frontend_path / file_path
-            if full_path.is_file():
-                return FileResponse(str(full_path))
-            # Fallback to index.html for SPA routing
-            return FileResponse(str(frontend_path / "index.html"))
+            # Only serve SPA for non-API paths
+            # API routes are registered separately and should be matched by FastAPI first
+            if file_path and (file_path.startswith("api/") or file_path.startswith("/api/") or file_path == "api"):
+                # This should not happen if API routers are properly registered
+                # Return 404 instead of serving HTML
+                raise HTTPException(status_code=404, detail=f"API endpoint not found: {file_path}")
+
+            full_path = frontend_path / file_path
+            if file_path and full_path.is_file():
+                return FileResponse(str(full_path))
+            return FileResponse(str(frontend_path / "index.html"))
     # [/DEF:serve_spa:Function]
 else:
     # [DEF:read_root:Function]

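The guard added to `serve_spa` can be isolated as a small predicate. A minimal sketch (the helper name `is_api_path` is hypothetical, written for illustration and not part of the codebase):

```python
def is_api_path(file_path: str) -> bool:
    # Mirrors the new serve_spa guard: api-prefixed paths must never fall
    # back to index.html; everything else is served by the SPA handler.
    return bool(file_path) and (
        file_path.startswith("api/")
        or file_path.startswith("/api/")
        or file_path == "api"
    )
```

This makes the fallthrough contract testable on its own: an unmatched `api/...` request becomes a 404 instead of silently returning HTML to an API client.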
@@ -10,7 +10,6 @@
 # [SECTION: IMPORTS]
-from pydantic import Field
 from pydantic_settings import BaseSettings
 import os
 # [/SECTION]

 # [DEF:AuthConfig:Class]

@@ -11,8 +11,8 @@

 # [SECTION: IMPORTS]
 from datetime import datetime, timedelta
-from typing import Optional, List
-from jose import JWTError, jwt
+from typing import Optional
+from jose import jwt
 from .config import auth_config
 from ..logger import belief_scope
 # [/SECTION]

@@ -11,7 +11,7 @@
 # [SECTION: IMPORTS]
 from typing import Optional, List
 from sqlalchemy.orm import Session
-from ...models.auth import User, Role, Permission, ADGroupMapping
+from ...models.auth import User, Role, Permission
 from ..logger import belief_scope
 # [/SECTION]

@@ -15,7 +15,7 @@ import json
 import os
 from pathlib import Path
 from typing import Optional, List
-from .config_models import AppConfig, Environment, GlobalSettings
+from .config_models import AppConfig, Environment, GlobalSettings, StorageConfig
 from .logger import logger, configure_logger, belief_scope
 # [/SECTION]

@@ -46,7 +46,7 @@ class ConfigManager:
         # 3. Runtime check of @POST
         assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"

-        logger.info(f"[ConfigManager][Exit] Initialized")
+        logger.info("[ConfigManager][Exit] Initialized")
     # [/DEF:__init__:Function]

     # [DEF:_load_config:Function]
@@ -59,7 +59,7 @@ class ConfigManager:
         logger.debug(f"[_load_config][Entry] Loading from {self.config_path}")

         if not self.config_path.exists():
-            logger.info(f"[_load_config][Action] Config file not found. Creating default.")
+            logger.info("[_load_config][Action] Config file not found. Creating default.")
             default_config = AppConfig(
                 environments=[],
                 settings=GlobalSettings()
@@ -75,7 +75,7 @@ class ConfigManager:
                 del data["settings"]["backup_path"]

             config = AppConfig(**data)
-            logger.info(f"[_load_config][Coherence:OK] Configuration loaded")
+            logger.info("[_load_config][Coherence:OK] Configuration loaded")
             return config
         except Exception as e:
             logger.error(f"[_load_config][Coherence:Failed] Error loading config: {e}")
@@ -103,7 +103,7 @@ class ConfigManager:
         try:
             with open(self.config_path, "w") as f:
                 json.dump(config.dict(), f, indent=4)
-            logger.info(f"[_save_config_to_disk][Action] Configuration saved")
+            logger.info("[_save_config_to_disk][Action] Configuration saved")
         except Exception as e:
             logger.error(f"[_save_config_to_disk][Coherence:Failed] Failed to save: {e}")
     # [/DEF:_save_config_to_disk:Function]
@@ -134,7 +134,7 @@ class ConfigManager:
     # @PARAM: settings (GlobalSettings) - The new global settings.
     def update_global_settings(self, settings: GlobalSettings):
         with belief_scope("update_global_settings"):
-            logger.info(f"[update_global_settings][Entry] Updating settings")
+            logger.info("[update_global_settings][Entry] Updating settings")

             # 1. Runtime check of @PRE
             assert isinstance(settings, GlobalSettings), "settings must be an instance of GlobalSettings"
@@ -146,7 +146,7 @@ class ConfigManager:
             # Reconfigure logger with new settings
             configure_logger(settings.logging)

-            logger.info(f"[update_global_settings][Exit] Settings updated")
+            logger.info("[update_global_settings][Exit] Settings updated")
     # [/DEF:update_global_settings:Function]

     # [DEF:validate_path:Function]
@@ -222,7 +222,7 @@ class ConfigManager:
             self.config.environments.append(env)
             self.save()

-            logger.info(f"[add_environment][Exit] Environment added")
+            logger.info("[add_environment][Exit] Environment added")
     # [/DEF:add_environment:Function]

     # [DEF:update_environment:Function]

@@ -48,6 +48,8 @@ class GlobalSettings(BaseModel):
     storage: StorageConfig = Field(default_factory=StorageConfig)
     default_environment_id: Optional[str] = None
     logging: LoggingConfig = Field(default_factory=LoggingConfig)
+    connections: List[dict] = []
+    llm: dict = Field(default_factory=lambda: {"providers": [], "default_provider": ""})

     # Task retention settings
     task_retention_days: int = 30

@@ -11,14 +11,9 @@

 # [SECTION: IMPORTS]
 from sqlalchemy import create_engine
-from sqlalchemy.orm import sessionmaker, Session
+from sqlalchemy.orm import sessionmaker
 from ..models.mapping import Base
 # Import models to ensure they're registered with Base
 from ..models.task import TaskRecord
 from ..models.connection import ConnectionConfig
 from ..models.git import GitServerConfig, GitRepository, DeploymentEnvironment
 from ..models.auth import User, Role, Permission, ADGroupMapping
 from ..models.llm import LLMProvider, ValidationRecord
 from .logger import belief_scope
 from .auth.config import auth_config
 import os

@@ -111,7 +111,6 @@ def configure_logger(config):

     # Add file handler if file_path is set
     if config.file_path:
-        import os
         from pathlib import Path
         log_file = Path(config.file_path)
         log_file.parent.mkdir(parents=True, exist_ok=True)

@@ -11,12 +11,10 @@
 import zipfile
+import yaml
 import os
 import shutil
 import tempfile
 from pathlib import Path
 from typing import Dict
 from .logger import logger, belief_scope
-import yaml
 # [/SECTION]

 # [DEF:MigrationEngine:Class]

@@ -1,9 +1,8 @@
 import importlib.util
 import os
 import sys  # Added this line
-from typing import Dict, Type, List, Optional
+from typing import Dict, List, Optional
 from .plugin_base import PluginBase, PluginConfig
 from jsonschema import validate
 from .logger import belief_scope

 # [DEF:PluginLoader:Class]

@@ -10,7 +10,6 @@ from apscheduler.schedulers.background import BackgroundScheduler
 from apscheduler.triggers.cron import CronTrigger
 from .logger import logger, belief_scope
 from .config_manager import ConfigManager
-from typing import Optional
 import asyncio
 # [/SECTION]

@@ -13,10 +13,10 @@
 import json
 import zipfile
 from pathlib import Path
-from typing import Any, Dict, List, Optional, Tuple, Union, cast
+from typing import Dict, List, Optional, Tuple, Union, cast
 from requests import Response
 from .logger import logger as app_logger, belief_scope
-from .utils.network import APIClient, SupersetAPIError, AuthenticationError, DashboardNotFoundError, NetworkError
+from .utils.network import APIClient, SupersetAPIError
 from .utils.fileio import get_filename_from_headers
 from .config_models import Environment
 # [/SECTION]
@@ -87,11 +87,11 @@ class SupersetClient:
         if 'columns' not in validated_query:
             validated_query['columns'] = ["slug", "id", "changed_on_utc", "dashboard_title", "published"]

-        total_count = self._fetch_total_object_count(endpoint="/dashboard/")
         paginated_data = self._fetch_all_pages(
             endpoint="/dashboard/",
-            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+            pagination_options={"base_query": validated_query, "results_field": "result"},
         )
+        total_count = len(paginated_data)
         app_logger.info("[get_dashboards][Exit] Found %d dashboards.", total_count)
         return total_count, paginated_data
     # [/DEF:get_dashboards:Function]

@@ -203,15 +203,121 @@ class SupersetClient:
         app_logger.info("[get_datasets][Enter] Fetching datasets.")
         validated_query = self._validate_query_params(query)

-        total_count = self._fetch_total_object_count(endpoint="/dataset/")
         paginated_data = self._fetch_all_pages(
             endpoint="/dataset/",
-            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+            pagination_options={"base_query": validated_query, "results_field": "result"},
         )
+        total_count = len(paginated_data)
         app_logger.info("[get_datasets][Exit] Found %d datasets.", total_count)
         return total_count, paginated_data
     # [/DEF:get_datasets:Function]

+    # [DEF:get_datasets_summary:Function]
+    # @PURPOSE: Fetches dataset metadata optimized for the Dataset Hub grid.
+    # @PRE: Client is authenticated.
+    # @POST: Returns a list of dataset metadata summaries.
+    # @RETURN: List[Dict]
+    def get_datasets_summary(self) -> List[Dict]:
+        with belief_scope("SupersetClient.get_datasets_summary"):
+            query = {
+                "columns": ["id", "table_name", "schema", "database"]
+            }
+            _, datasets = self.get_datasets(query=query)
+
+            # Map fields to match the contracts
+            result = []
+            for ds in datasets:
+                result.append({
+                    "id": ds.get("id"),
+                    "table_name": ds.get("table_name"),
+                    "schema": ds.get("schema"),
+                    "database": ds.get("database", {}).get("database_name", "Unknown")
+                })
+            return result
+    # [/DEF:get_datasets_summary:Function]
+
+    # [DEF:get_dataset_detail:Function]
+    # @PURPOSE: Fetches detailed dataset information including columns and linked dashboards.
+    # @PRE: Client is authenticated and dataset_id exists.
+    # @POST: Returns detailed dataset info with columns and linked dashboards.
+    # @PARAM: dataset_id (int) - The dataset ID to fetch details for.
+    # @RETURN: Dict - Dataset details with columns and linked_dashboards.
+    # @RELATION: CALLS -> self.get_dataset
+    # @RELATION: CALLS -> self.network.request (for related_objects)
+    def get_dataset_detail(self, dataset_id: int) -> Dict:
+        with belief_scope("SupersetClient.get_dataset_detail", f"id={dataset_id}"):
+            # Get base dataset info
+            response = self.get_dataset(dataset_id)
+
+            # If the response is a dict and has a 'result' key, use that (standard Superset API)
+            if isinstance(response, dict) and 'result' in response:
+                dataset = response['result']
+            else:
+                dataset = response
+
+            # Extract columns information
+            columns = dataset.get("columns", [])
+            column_info = []
+            for col in columns:
+                column_info.append({
+                    "id": col.get("id"),
+                    "name": col.get("column_name"),
+                    "type": col.get("type"),
+                    "is_dttm": col.get("is_dttm", False),
+                    "is_active": col.get("is_active", True),
+                    "description": col.get("description", "")
+                })
+
+            # Get linked dashboards using related_objects endpoint
+            linked_dashboards = []
+            try:
+                related_objects = self.network.request(
+                    method="GET",
+                    endpoint=f"/dataset/{dataset_id}/related_objects"
+                )
+
+                # Handle different response formats
+                if isinstance(related_objects, dict):
+                    if "dashboards" in related_objects:
+                        dashboards_data = related_objects["dashboards"]
+                    elif "result" in related_objects and isinstance(related_objects["result"], dict):
+                        dashboards_data = related_objects["result"].get("dashboards", [])
+                    else:
+                        dashboards_data = []
+
+                    for dash in dashboards_data:
+                        linked_dashboards.append({
+                            "id": dash.get("id"),
+                            "title": dash.get("dashboard_title") or dash.get("title", "Unknown"),
+                            "slug": dash.get("slug")
+                        })
+            except Exception as e:
+                app_logger.warning(f"[get_dataset_detail][Warning] Failed to fetch related dashboards: {e}")
+                linked_dashboards = []
+
+            # Extract SQL table information
+            sql = dataset.get("sql", "")
+
+            result = {
+                "id": dataset.get("id"),
+                "table_name": dataset.get("table_name"),
+                "schema": dataset.get("schema"),
+                "database": dataset.get("database", {}).get("database_name", "Unknown"),
+                "description": dataset.get("description", ""),
+                "columns": column_info,
+                "column_count": len(column_info),
+                "sql": sql,
+                "linked_dashboards": linked_dashboards,
+                "linked_dashboard_count": len(linked_dashboards),
+                "is_sqllab_view": dataset.get("is_sqllab_view", False),
+                "created_on": dataset.get("created_on"),
+                "changed_on": dataset.get("changed_on")
+            }
+
+            app_logger.info(f"[get_dataset_detail][Exit] Got dataset {dataset_id} with {len(column_info)} columns and {len(linked_dashboards)} linked dashboards")
+            return result
+    # [/DEF:get_dataset_detail:Function]

     # [DEF:get_dataset:Function]
     # @PURPOSE: Gets information about a specific dataset by its ID.
     # @PARAM: dataset_id (int) - The dataset ID.

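`get_dataset_detail` tolerates two response shapes from the `related_objects` endpoint. That normalization can be sketched in isolation (the helper name is hypothetical, written for illustration; unlike the inline version, it also returns an empty list for non-dict payloads):

```python
from typing import Any, Dict, List

def extract_related_dashboards(related_objects: Any) -> List[Dict]:
    # Accept either {"dashboards": [...]} or {"result": {"dashboards": [...]}};
    # anything else yields no linked dashboards.
    if not isinstance(related_objects, dict):
        return []
    if "dashboards" in related_objects:
        dashboards = related_objects["dashboards"]
    elif "result" in related_objects and isinstance(related_objects["result"], dict):
        dashboards = related_objects["result"].get("dashboards", [])
    else:
        dashboards = []
    return [
        {
            "id": d.get("id"),
            "title": d.get("dashboard_title") or d.get("title", "Unknown"),
            "slug": d.get("slug"),
        }
        for d in dashboards
    ]
```

Pulling the shape handling out of the method keeps the fallback behavior (warning plus empty list) easy to exercise without a live Superset instance.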
@@ -264,11 +370,12 @@ class SupersetClient:
         validated_query = self._validate_query_params(query or {})
         if 'columns' not in validated_query:
             validated_query['columns'] = []
-        total_count = self._fetch_total_object_count(endpoint="/database/")

         paginated_data = self._fetch_all_pages(
             endpoint="/database/",
-            pagination_options={"base_query": validated_query, "total_count": total_count, "results_field": "result"},
+            pagination_options={"base_query": validated_query, "results_field": "result"},
         )
+        total_count = len(paginated_data)
         app_logger.info("[get_databases][Exit] Found %d databases.", total_count)
         return total_count, paginated_data
     # [/DEF:get_databases:Function]

@@ -5,7 +5,6 @@
 # @LAYER: Core
 # @RELATION: Uses TaskPersistenceService and TaskLogPersistenceService to delete old tasks and logs.

 from datetime import datetime, timedelta
-from typing import List
 from .persistence import TaskPersistenceService, TaskLogPersistenceService
 from ..logger import logger, belief_scope

@@ -7,7 +7,7 @@
 # @INVARIANT: Each TaskContext is bound to a single task execution.

 # [SECTION: IMPORTS]
-from typing import Dict, Any, Optional, Callable
+from typing import Dict, Any, Callable
 from .task_logger import TaskLogger
 # [/SECTION]

@@ -14,7 +14,7 @@ from concurrent.futures import ThreadPoolExecutor
 from datetime import datetime
 from typing import Dict, Any, List, Optional

-from .models import Task, TaskStatus, LogEntry, LogFilter, LogStats, TaskLog
+from .models import Task, TaskStatus, LogEntry, LogFilter, LogStats
 from .persistence import TaskPersistenceService, TaskLogPersistenceService
 from .context import TaskContext
 from ..logger import logger, belief_scope, should_log_task_level
@@ -136,7 +136,7 @@ class TaskManager:
             logger.error(f"Plugin with ID '{plugin_id}' not found.")
             raise ValueError(f"Plugin with ID '{plugin_id}' not found.")

-        plugin = self.plugin_loader.get_plugin(plugin_id)
+        self.plugin_loader.get_plugin(plugin_id)

         if not isinstance(params, dict):
             logger.error("Task parameters must be a dictionary.")
@@ -248,7 +248,8 @@ class TaskManager:
     async def wait_for_resolution(self, task_id: str):
         with belief_scope("TaskManager.wait_for_resolution", f"task_id={task_id}"):
             task = self.tasks.get(task_id)
-            if not task: return
+            if not task:
+                return

             task.status = TaskStatus.AWAITING_MAPPING
             self.persistence_service.persist_task(task)
@@ -269,7 +270,8 @@ class TaskManager:
     async def wait_for_input(self, task_id: str):
         with belief_scope("TaskManager.wait_for_input", f"task_id={task_id}"):
             task = self.tasks.get(task_id)
-            if not task: return
+            if not task:
+                return

             # Status is already set to AWAITING_INPUT by await_input()
             self.task_futures[task_id] = self.loop.create_future()

@@ -7,11 +7,10 @@

 # [SECTION: IMPORTS]
 from datetime import datetime
-from typing import List, Optional, Dict, Any
+from typing import List, Optional
 import json

 from sqlalchemy.orm import Session
 from sqlalchemy import and_, or_
 from ...models.task import TaskRecord, TaskLogRecord
 from ..database import TasksSessionLocal
 from .models import Task, TaskStatus, LogEntry, TaskLog, LogFilter, LogStats

@@ -8,7 +8,6 @@

 # [SECTION: IMPORTS]
 from typing import Dict, Any, Optional, Callable
-from datetime import datetime
 # [/SECTION]

 # [DEF:TaskLogger:Class]

@@ -11,7 +11,7 @@
 # [SECTION: IMPORTS]
 import pandas as pd  # type: ignore
 import psycopg2  # type: ignore
-from typing import Dict, List, Optional, Any
+from typing import Dict, Optional, Any
 from ..logger import logger as app_logger, belief_scope
 # [/SECTION]

@@ -19,7 +19,6 @@ from datetime import date, datetime
 import shutil
 import zlib
-from dataclasses import dataclass
 import yaml
 from ..logger import logger as app_logger, belief_scope
 # [/SECTION]

@@ -42,6 +42,8 @@ def suggest_mappings(source_databases: List[Dict], target_databases: List[Dict],
             name, score, index = match
             if score >= threshold:
                 suggestions.append({
                     "source_db": s_db['database_name'],
                     "target_db": target_databases[index]['database_name'],
+                    "source_db_uuid": s_db['uuid'],
+                    "target_db_uuid": target_databases[index]['uuid'],
                     "confidence": score / 100.0

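The hunk above threads database UUIDs through the mapping suggestions. A self-contained sketch of the same matching step, using only the stdlib's `difflib` as a stand-in for the project's fuzzy matcher (which appears to return rapidfuzz-style 0-100 scores; the function name and difflib scoring are assumptions for illustration):

```python
import difflib
from typing import Dict, List

def suggest_db_mappings(source_dbs: List[Dict], target_dbs: List[Dict], threshold: int = 80) -> List[Dict]:
    # For each source database, find the closest target name and keep it
    # if the similarity (scaled to 0-100) clears the threshold.
    target_names = [t["database_name"] for t in target_dbs]
    suggestions = []
    for s_db in source_dbs:
        matches = difflib.get_close_matches(s_db["database_name"], target_names, n=1, cutoff=0.0)
        if not matches:
            continue
        index = target_names.index(matches[0])
        score = difflib.SequenceMatcher(None, s_db["database_name"], matches[0]).ratio() * 100
        if score >= threshold:
            suggestions.append({
                "source_db": s_db["database_name"],
                "target_db": target_dbs[index]["database_name"],
                "source_db_uuid": s_db["uuid"],
                "target_db_uuid": target_dbs[index]["uuid"],
                "confidence": score / 100.0,
            })
    return suggestions
```

Carrying the UUIDs (not just the display names) lets the caller apply a suggested mapping unambiguously even when two databases share a name.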
@@ -118,14 +118,41 @@ class APIClient:
     def _init_session(self) -> requests.Session:
         with belief_scope("_init_session"):
             session = requests.Session()

+            # Create a custom adapter that handles TLS issues
+            class TLSAdapter(HTTPAdapter):
+                def init_poolmanager(self, connections, maxsize, block=False):
+                    from urllib3.poolmanager import PoolManager
+                    import ssl
+
+                    # Create an SSL context that ignores TLSv1 unrecognized name errors
+                    ctx = ssl.create_default_context()
+                    ctx.set_ciphers('HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!SRP:!CAMELLIA')
+
+                    # Ignore TLSV1_UNRECOGNIZED_NAME errors by disabling hostname verification
+                    # This is safe when verify_ssl is false (we're already not verifying the certificate)
+                    ctx.check_hostname = False
+
+                    self.poolmanager = PoolManager(
+                        num_pools=connections,
+                        maxsize=maxsize,
+                        block=block,
+                        ssl_context=ctx
+                    )
+
             retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
-            adapter = HTTPAdapter(max_retries=retries)
+            adapter = TLSAdapter(max_retries=retries)
             session.mount('http://', adapter)
             session.mount('https://', adapter)

             if not self.request_settings["verify_ssl"]:
                 urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
                 app_logger.warning("[_init_session][State] SSL verification disabled.")
-                session.verify = self.request_settings["verify_ssl"]
+                # When verify_ssl is false, we should also disable hostname verification
+                session.verify = False
+            else:
+                session.verify = True

             return session
     # [/DEF:_init_session:Function]

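The `TLSAdapter` above hinges on an `SSLContext` with hostname checking relaxed. A minimal stdlib sketch of that context setup (hypothetical helper; it additionally sets `verify_mode = CERT_NONE` to mirror the `verify_ssl: false` configuration branch, which the adapter itself leaves to `session.verify`):

```python
import ssl

def make_lenient_context() -> ssl.SSLContext:
    # Build a default client context, then relax it the way the adapter does:
    # strong ciphers only, but no hostname check. Only appropriate when
    # certificate verification is already disabled by configuration.
    ctx = ssl.create_default_context()
    ctx.set_ciphers("HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!SRP:!CAMELLIA")
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

Note the ordering: `check_hostname` must be disabled before `verify_mode` is set to `CERT_NONE`, or `ssl` raises a `ValueError`.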
@@ -177,7 +204,8 @@
     # @POST: Returns headers including auth tokens.
     def headers(self) -> Dict[str, str]:
         with belief_scope("headers"):
-            if not self._authenticated: self.authenticate()
+            if not self._authenticated:
+                self.authenticate()
             return {
                 "Authorization": f"Bearer {self._tokens['access_token']}",
                 "X-CSRFToken": self._tokens.get("csrf_token", ""),

@@ -200,7 +228,8 @@
         with belief_scope("request"):
             full_url = f"{self.base_url}{endpoint}"
             _headers = self.headers.copy()
-            if headers: _headers.update(headers)
+            if headers:
+                _headers.update(headers)

             try:
                 response = self.session.request(method, full_url, headers=_headers, **kwargs)

@@ -223,9 +252,12 @@
             status_code = e.response.status_code
             if status_code == 502 or status_code == 503 or status_code == 504:
                 raise NetworkError(f"Environment unavailable (Status {status_code})", status_code=status_code) from e
-            if status_code == 404: raise DashboardNotFoundError(endpoint) from e
-            if status_code == 403: raise PermissionDeniedError() from e
-            if status_code == 401: raise AuthenticationError() from e
+            if status_code == 404:
+                raise DashboardNotFoundError(endpoint) from e
+            if status_code == 403:
+                raise PermissionDeniedError() from e
+            if status_code == 401:
+                raise AuthenticationError() from e
             raise SupersetAPIError(f"API Error {status_code}: {e.response.text}") from e
     # [/DEF:_handle_http_error:Function]

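The status-code dispatch in `_handle_http_error` is a fixed mapping plus a gateway-error range. A table-driven sketch of the same classification (the function and the string labels are hypothetical, used here instead of raising the project's exception types):

```python
# Hypothetical table-driven variant of the status-code dispatch.
ERROR_BY_STATUS = {
    404: "DashboardNotFoundError",
    403: "PermissionDeniedError",
    401: "AuthenticationError",
}

def classify_status(status_code: int) -> str:
    # Gateway errors (environment unavailable) take precedence;
    # everything unmapped falls back to the generic API error.
    if status_code in (502, 503, 504):
        return "NetworkError"
    return ERROR_BY_STATUS.get(status_code, "SupersetAPIError")
```

A mapping keeps the precedence explicit and makes adding a new status (say, 429) a one-line change.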
@@ -237,9 +269,12 @@
     # @POST: Raises a NetworkError.
     def _handle_network_error(self, e: requests.exceptions.RequestException, url: str):
         with belief_scope("_handle_network_error"):
-            if isinstance(e, requests.exceptions.Timeout): msg = "Request timeout"
-            elif isinstance(e, requests.exceptions.ConnectionError): msg = "Connection error"
-            else: msg = f"Unknown network error: {e}"
+            if isinstance(e, requests.exceptions.Timeout):
+                msg = "Request timeout"
+            elif isinstance(e, requests.exceptions.ConnectionError):
+                msg = "Connection error"
+            else:
+                msg = f"Unknown network error: {e}"
             raise NetworkError(msg, url=url) from e
     # [/DEF:_handle_network_error:Function]

@@ -256,7 +291,9 @@
     def upload_file(self, endpoint: str, file_info: Dict[str, Any], extra_data: Optional[Dict] = None, timeout: Optional[int] = None) -> Dict:
         with belief_scope("upload_file"):
             full_url = f"{self.base_url}{endpoint}"
-            _headers = self.headers.copy(); _headers.pop('Content-Type', None)
+            _headers = self.headers.copy()
+            _headers.pop('Content-Type', None)

             file_obj, file_name, form_field = file_info.get("file_obj"), file_info.get("file_name"), file_info.get("form_field", "file")

@@ -318,20 +355,40 @@
     # @PURPOSE: Automatically collects data from all pages of a paginated endpoint.
     # @PARAM: endpoint (str) - The endpoint.
     # @PARAM: pagination_options (Dict[str, Any]) - Pagination options.
-    # @PRE: pagination_options must contain 'base_query', 'total_count', 'results_field'.
+    # @PRE: pagination_options must contain 'base_query', 'results_field'. 'total_count' is optional.
     # @POST: Returns all items across all pages.
     # @RETURN: List[Any] - The list of data.
     def fetch_paginated_data(self, endpoint: str, pagination_options: Dict[str, Any]) -> List[Any]:
         with belief_scope("fetch_paginated_data"):
-            base_query, total_count = pagination_options["base_query"], pagination_options["total_count"]
-            results_field, page_size = pagination_options["results_field"], base_query.get('page_size')
-            assert page_size and page_size > 0, "'page_size' must be a positive number."
+            base_query = pagination_options["base_query"]
+            total_count = pagination_options.get("total_count")
+
+            results_field = pagination_options["results_field"]
+            count_field = pagination_options.get("count_field", "count")
+            page_size = base_query.get('page_size', 1000)
+            assert page_size > 0, "'page_size' must be a positive number."

             results = []
-            for page in range((total_count + page_size - 1) // page_size):
+            page = 0
+
+            # Fetch first page to get data and total count if not provided
+            query = {**base_query, 'page': page}
+            response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
+
+            first_page_results = response_json.get(results_field, [])
+            results.extend(first_page_results)
+
+            if total_count is None:
+                total_count = response_json.get(count_field, len(first_page_results))
+                app_logger.debug(f"[fetch_paginated_data][State] Total count resolved from first page: {total_count}")
+
+            # Fetch remaining pages
+            total_pages = (total_count + page_size - 1) // page_size
+            for page in range(1, total_pages):
                 query = {**base_query, 'page': page}
                 response_json = cast(Dict[str, Any], self.request("GET", endpoint, params={"q": json.dumps(query)}))
                 results.extend(response_json.get(results_field, []))

             return results
     # [/DEF:fetch_paginated_data:Function]

@@ -1,11 +1,10 @@
 # [DEF:Dependencies:Module]
 # @SEMANTICS: dependency, injection, singleton, factory, auth, jwt
-# @PURPOSE: Manages the creation and provision of shared application dependencies, such as the PluginLoader and TaskManager, to avoid circular imports.
+# @PURPOSE: Manages creation and provision of shared application dependencies, such as PluginLoader and TaskManager, to avoid circular imports.
 # @LAYER: Core
-# @RELATION: Used by the main app and API routers to get access to shared instances.
+# @RELATION: Used by main app and API routers to get access to shared instances.

 from pathlib import Path
 from typing import Optional
 from fastapi import Depends, HTTPException, status
 from fastapi.security import OAuth2PasswordBearer
 from jose import JWTError
@@ -13,8 +12,10 @@ from .core.plugin_loader import PluginLoader
 from .core.task_manager import TaskManager
 from .core.config_manager import ConfigManager
 from .core.scheduler import SchedulerService
+from .services.resource_service import ResourceService
+from .services.mapping_service import MappingService
 from .core.database import init_db, get_auth_db
-from .core.logger import logger, belief_scope
+from .core.logger import logger
 from .core.auth.jwt import decode_token
 from .core.auth.repository import AuthRepository
 from .models.auth import User
@@ -29,12 +30,12 @@ config_manager = ConfigManager(config_path=str(config_path))
 init_db()

 # [DEF:get_config_manager:Function]
-# @PURPOSE: Dependency injector for the ConfigManager.
+# @PURPOSE: Dependency injector for ConfigManager.
 # @PRE: Global config_manager must be initialized.
 # @POST: Returns shared ConfigManager instance.
 # @RETURN: ConfigManager - The shared config manager instance.
 def get_config_manager() -> ConfigManager:
-    """Dependency injector for the ConfigManager."""
+    """Dependency injector for ConfigManager."""
     return config_manager
 # [/DEF:get_config_manager:Function]

@@ -50,45 +51,68 @@ logger.info("TaskManager initialized")
 scheduler_service = SchedulerService(task_manager, config_manager)
 logger.info("SchedulerService initialized")

+resource_service = ResourceService()
+logger.info("ResourceService initialized")
+
 # [DEF:get_plugin_loader:Function]
-# @PURPOSE: Dependency injector for the PluginLoader.
+# @PURPOSE: Dependency injector for PluginLoader.
 # @PRE: Global plugin_loader must be initialized.
 # @POST: Returns shared PluginLoader instance.
 # @RETURN: PluginLoader - The shared plugin loader instance.
 def get_plugin_loader() -> PluginLoader:
-    """Dependency injector for the PluginLoader."""
+    """Dependency injector for PluginLoader."""
     return plugin_loader
 # [/DEF:get_plugin_loader:Function]

 # [DEF:get_task_manager:Function]
-# @PURPOSE: Dependency injector for the TaskManager.
+# @PURPOSE: Dependency injector for TaskManager.
 # @PRE: Global task_manager must be initialized.
 # @POST: Returns shared TaskManager instance.
 # @RETURN: TaskManager - The shared task manager instance.
 def get_task_manager() -> TaskManager:
-    """Dependency injector for the TaskManager."""
+    """Dependency injector for TaskManager."""
    return task_manager
 # [/DEF:get_task_manager:Function]

 # [DEF:get_scheduler_service:Function]
 # @PURPOSE: Dependency injector for the SchedulerService.
|
||||
# @PURPOSE: Dependency injector for SchedulerService.
|
||||
# @PRE: Global scheduler_service must be initialized.
|
||||
# @POST: Returns shared SchedulerService instance.
|
||||
# @RETURN: SchedulerService - The shared scheduler service instance.
|
||||
def get_scheduler_service() -> SchedulerService:
|
||||
"""Dependency injector for the SchedulerService."""
|
||||
"""Dependency injector for SchedulerService."""
|
||||
return scheduler_service
|
||||
# [/DEF:get_scheduler_service:Function]
|
||||
|
||||
# [DEF:get_resource_service:Function]
|
||||
# @PURPOSE: Dependency injector for ResourceService.
|
||||
# @PRE: Global resource_service must be initialized.
|
||||
# @POST: Returns shared ResourceService instance.
|
||||
# @RETURN: ResourceService - The shared resource service instance.
|
||||
def get_resource_service() -> ResourceService:
|
||||
"""Dependency injector for ResourceService."""
|
||||
return resource_service
|
||||
# [/DEF:get_resource_service:Function]
|
||||
|
||||
# [DEF:get_mapping_service:Function]
|
||||
# @PURPOSE: Dependency injector for MappingService.
|
||||
# @PRE: Global config_manager must be initialized.
|
||||
# @POST: Returns new MappingService instance.
|
||||
# @RETURN: MappingService - A new mapping service instance.
|
||||
def get_mapping_service() -> MappingService:
|
||||
"""Dependency injector for MappingService."""
|
||||
return MappingService(config_manager)
|
||||
# [/DEF:get_mapping_service:Function]
|
||||
|
||||
# [DEF:oauth2_scheme:Variable]
|
||||
# @PURPOSE: OAuth2 password bearer scheme for token extraction.
|
||||
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/auth/login")
|
||||
# [/DEF:oauth2_scheme:Variable]
|
||||
|
||||
# [DEF:get_current_user:Function]
|
||||
# @PURPOSE: Dependency for retrieving the currently authenticated user from a JWT.
|
||||
# @PURPOSE: Dependency for retrieving currently authenticated user from a JWT.
|
||||
# @PRE: JWT token provided in Authorization header.
|
||||
# @POST: Returns the User object if token is valid.
|
||||
# @POST: Returns User object if token is valid.
|
||||
# @THROW: HTTPException 401 if token is invalid or user not found.
|
||||
# @PARAM: token (str) - Extracted JWT token.
|
||||
# @PARAM: db (Session) - Auth database session.
|
||||
@@ -144,4 +168,4 @@ def has_permission(resource: str, action: str):
|
||||
return permission_checker
|
||||
# [/DEF:has_permission:Function]
|
||||
|
||||
# [/DEF:Dependencies:Module]
|
||||
# [/DEF:Dependencies:Module]
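The injector functions in this diff exist so that routers depend on a getter rather than on the module that builds the instance. A minimal sketch of that singleton pattern, with a toy `ConfigManager` standing in for the real one (the dataclass and its field are illustrative, not the project's class):

```python
from dataclasses import dataclass

@dataclass
class ConfigManager:
    """Toy stand-in for the real ConfigManager."""
    config_path: str

# Module-level singleton, created once at import time (as in dependencies.py).
config_manager = ConfigManager(config_path="config.json")

def get_config_manager() -> ConfigManager:
    """Dependency injector for ConfigManager.

    In the FastAPI app this would be consumed as
    `cfg: ConfigManager = Depends(get_config_manager)`, so every request
    handler receives the same shared instance.
    """
    return config_manager

# Every caller sees the same object; routers import only this injector,
# never the module-level wiring, which is what avoids circular imports.
assert get_config_manager() is get_config_manager()
```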

@@ -11,7 +11,7 @@
# [SECTION: IMPORTS]
import uuid
from datetime import datetime
from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Table, Enum
from sqlalchemy import Column, String, Boolean, DateTime, ForeignKey, Table
from sqlalchemy.orm import relationship
from .mapping import Base
# [/SECTION]

@@ -8,7 +8,6 @@
import enum
from datetime import datetime
from sqlalchemy import Column, String, Integer, DateTime, Enum, ForeignKey, Boolean
from sqlalchemy.dialects.postgresql import UUID
import uuid
from src.core.database import Base


@@ -5,7 +5,7 @@
# @LAYER: Domain
# @RELATION: INHERITS_FROM -> backend.src.models.mapping.Base

from sqlalchemy import Column, String, Boolean, DateTime, JSON, Enum, Text
from sqlalchemy import Column, String, Boolean, DateTime, JSON, Text
from datetime import datetime
import uuid
from .mapping import Base

@@ -95,7 +95,7 @@ class BackupPlugin(PluginBase):
with belief_scope("get_schema"):
config_manager = get_config_manager()
envs = [e.name for e in config_manager.get_environments()]
default_path = config_manager.get_config().settings.storage.root_path
config_manager.get_config().settings.storage.root_path

return {
"type": "object",

@@ -5,10 +5,9 @@
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend.src.plugins.llm_analysis.service.LLMClient

from typing import List, Optional
from typing import List
from tenacity import retry, stop_after_attempt, wait_exponential
from ..llm_analysis.service import LLMClient
from ..llm_analysis.models import LLMProviderType
from ...core.logger import belief_scope, logger

# [DEF:GitLLMExtension:Class]

@@ -54,7 +54,7 @@ class GitPlugin(PluginBase):
self.config_manager = config_manager
app_logger.info("GitPlugin initialized using shared config_manager.")
return
except:
except Exception:
config_path = "config.json"

self.config_manager = ConfigManager(config_path)
@@ -135,7 +135,7 @@ class GitPlugin(PluginBase):
# @POST: Plugin is ready to execute tasks.
async def initialize(self):
with belief_scope("GitPlugin.initialize"):
logger.info("[GitPlugin.initialize][Action] Initializing Git Integration Plugin logic.")
app_logger.info("[GitPlugin.initialize][Action] Initializing Git Integration Plugin logic.")

# [DEF:execute:Function]
# @PURPOSE: Main plugin task execution method with TaskContext support.
@@ -246,15 +246,15 @@ class GitPlugin(PluginBase):
# 5. Automatically stage changes (no commit, so the user can review the diff)
try:
repo.git.add(A=True)
logger.info(f"[_handle_sync][Action] Changes staged in git")
app_logger.info("[_handle_sync][Action] Changes staged in git")
except Exception as ge:
logger.warning(f"[_handle_sync][Action] Failed to stage changes: {ge}")
app_logger.warning(f"[_handle_sync][Action] Failed to stage changes: {ge}")

logger.info(f"[_handle_sync][Coherence:OK] Dashboard {dashboard_id} synced successfully.")
app_logger.info(f"[_handle_sync][Coherence:OK] Dashboard {dashboard_id} synced successfully.")
return {"status": "success", "message": "Dashboard synced and flattened in local repository"}

except Exception as e:
logger.error(f"[_handle_sync][Coherence:Failed] Sync failed: {e}")
app_logger.error(f"[_handle_sync][Coherence:Failed] Sync failed: {e}")
raise
# [/DEF:_handle_sync:Function]

@@ -292,7 +292,8 @@ class GitPlugin(PluginBase):
if ".git" in dirs:
dirs.remove(".git")
for file in files:
if file == ".git" or file.endswith(".zip"): continue
if file == ".git" or file.endswith(".zip"):
continue
file_path = Path(root) / file
# Prepend the root directory name to the archive path
arcname = Path(root_dir_name) / file_path.relative_to(repo_path)
@@ -315,16 +316,16 @@ class GitPlugin(PluginBase):
f.write(zip_buffer.getvalue())

try:
logger.info(f"[_handle_deploy][Action] Importing dashboard to {env.name}")
app_logger.info(f"[_handle_deploy][Action] Importing dashboard to {env.name}")
result = client.import_dashboard(temp_zip_path)
logger.info(f"[_handle_deploy][Coherence:OK] Deployment successful for dashboard {dashboard_id}.")
app_logger.info(f"[_handle_deploy][Coherence:OK] Deployment successful for dashboard {dashboard_id}.")
return {"status": "success", "message": f"Dashboard deployed to {env.name}", "details": result}
finally:
if temp_zip_path.exists():
os.remove(temp_zip_path)

except Exception as e:
logger.error(f"[_handle_deploy][Coherence:Failed] Deployment failed: {e}")
app_logger.error(f"[_handle_deploy][Coherence:Failed] Deployment failed: {e}")
raise
# [/DEF:_handle_deploy:Function]

@@ -336,13 +337,13 @@ class GitPlugin(PluginBase):
# @RETURN: Environment - The environment configuration object.
def _get_env(self, env_id: Optional[str] = None):
with belief_scope("GitPlugin._get_env"):
logger.info(f"[_get_env][Entry] Fetching environment for ID: {env_id}")
app_logger.info(f"[_get_env][Entry] Fetching environment for ID: {env_id}")

# Priority 1: ConfigManager (config.json)
if env_id:
env = self.config_manager.get_environment(env_id)
if env:
logger.info(f"[_get_env][Exit] Found environment by ID in ConfigManager: {env.name}")
app_logger.info(f"[_get_env][Exit] Found environment by ID in ConfigManager: {env.name}")
return env

# Priority 2: Database (DeploymentEnvironment)
@@ -355,12 +356,12 @@ class GitPlugin(PluginBase):
db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.id == env_id).first()
else:
# If no ID, try to find active or any environment in DB
db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.is_active == True).first()
db_env = db.query(DeploymentEnvironment).filter(DeploymentEnvironment.is_active).first()
if not db_env:
db_env = db.query(DeploymentEnvironment).first()

if db_env:
logger.info(f"[_get_env][Exit] Found environment in DB: {db_env.name}")
app_logger.info(f"[_get_env][Exit] Found environment in DB: {db_env.name}")
from src.core.config_models import Environment
# Use token as password for SupersetClient
return Environment(
@@ -382,14 +383,14 @@ class GitPlugin(PluginBase):
# but we have other envs, maybe it's one of them?
env = next((e for e in envs if e.id == env_id), None)
if env:
logger.info(f"[_get_env][Exit] Found environment {env_id} in ConfigManager list")
app_logger.info(f"[_get_env][Exit] Found environment {env_id} in ConfigManager list")
return env

if not env_id:
logger.info(f"[_get_env][Exit] Using first environment from ConfigManager: {envs[0].name}")
app_logger.info(f"[_get_env][Exit] Using first environment from ConfigManager: {envs[0].name}")
return envs[0]

logger.error(f"[_get_env][Coherence:Failed] No environments configured (searched config.json and DB). env_id={env_id}")
app_logger.error(f"[_get_env][Coherence:Failed] No environments configured (searched config.json and DB). env_id={env_id}")
raise ValueError("No environments configured. Please add a Superset Environment in Settings.")
# [/DEF:_get_env:Function]
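The archive-building walk above prepends the repository directory name to each file's archive path. The `relative_to` arithmetic can be checked in isolation; the concrete paths and directory names below are illustrative only:

```python
from pathlib import Path

# Illustrative values; the real repo_path and root_dir_name come from the plugin.
repo_path = Path("/tmp/repos/dash_42")
root_dir_name = "dashboard_export"
file_path = repo_path / "charts" / "sales.yaml"

# Prepend the root directory name to the archive path, exactly as in the diff:
# the path inside the zip is the file's location relative to the repo root,
# nested under root_dir_name.
arcname = Path(root_dir_name) / file_path.relative_to(repo_path)
print(arcname.as_posix())  # → dashboard_export/charts/sales.yaml
```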


@@ -9,4 +9,6 @@ LLM Analysis Plugin for automated dashboard validation and dataset documentation

from .plugin import DashboardValidationPlugin, DocumentationPlugin

__all__ = ['DashboardValidationPlugin', 'DocumentationPlugin']

# [/DEF:backend/src/plugins/llm_analysis/__init__.py:Module]

@@ -10,15 +10,13 @@
# @RELATION: USES -> TaskContext
# @INVARIANT: All LLM interactions must be executed as asynchronous tasks.

from typing import Dict, Any, Optional, List
from typing import Dict, Any, Optional
import os
import json
import logging
from datetime import datetime, timedelta
from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...core.database import SessionLocal
from ...core.config_manager import ConfigManager
from ...services.llm_provider import LLMProviderService
from ...core.superset_client import SupersetClient
from .service import ScreenshotService, LLMClient
@@ -97,7 +95,7 @@ class DashboardValidationPlugin(PluginBase):
log.error(f"LLM Provider {provider_id} not found")
raise ValueError(f"LLM Provider {provider_id} not found")

llm_log.debug(f"Retrieved provider config:")
llm_log.debug("Retrieved provider config:")
llm_log.debug(f"  Provider ID: {db_provider.id}")
llm_log.debug(f"  Provider Name: {db_provider.name}")
llm_log.debug(f"  Provider Type: {db_provider.provider_type}")
@@ -299,7 +297,7 @@ class DocumentationPlugin(PluginBase):
log.error(f"LLM Provider {provider_id} not found")
raise ValueError(f"LLM Provider {provider_id} not found")

llm_log.debug(f"Retrieved provider config:")
llm_log.debug("Retrieved provider config:")
llm_log.debug(f"  Provider ID: {db_provider.id}")
llm_log.debug(f"  Provider Name: {db_provider.name}")
llm_log.debug(f"  Provider Type: {db_provider.provider_type}")

@@ -12,12 +12,12 @@ import asyncio
import base64
import json
import io
from typing import List, Optional, Dict, Any
from typing import List, Dict, Any
from PIL import Image
from playwright.async_api import async_playwright
from openai import AsyncOpenAI, RateLimitError, AuthenticationError as OpenAIAuthenticationError
from tenacity import retry, stop_after_attempt, wait_exponential, retry_if_exception
from .models import LLMProviderType, ValidationResult, ValidationStatus, DetectedIssue
from .models import LLMProviderType
from ...core.logger import belief_scope, logger
from ...core.config_models import Environment

@@ -96,7 +96,7 @@ class ScreenshotService:
"password": ['input[name="password"]', 'input#password', 'input[placeholder*="Password"]', 'input[type="password"]'],
"submit": ['button[type="submit"]', 'button#submit', '.btn-primary', 'input[type="submit"]']
}
logger.info(f"[DEBUG] Attempting to find login form elements...")
logger.info("[DEBUG] Attempting to find login form elements...")

try:
# Find and fill username
@@ -190,27 +190,27 @@ class ScreenshotService:
try:
# Wait for the dashboard grid to be present
await page.wait_for_selector('.dashboard-component, .dashboard-header, [data-test="dashboard-grid"]', timeout=30000)
logger.info(f"[DEBUG] Dashboard container loaded")
logger.info("[DEBUG] Dashboard container loaded")

# Wait for charts to finish loading (Superset uses loading spinners/skeletons)
# We wait until loading indicators disappear or a timeout occurs
try:
# Wait for loading indicators to disappear
await page.wait_for_selector('.loading, .ant-skeleton, .spinner', state="hidden", timeout=60000)
logger.info(f"[DEBUG] Loading indicators hidden")
except:
logger.warning(f"[DEBUG] Timeout waiting for loading indicators to hide")
logger.info("[DEBUG] Loading indicators hidden")
except Exception:
logger.warning("[DEBUG] Timeout waiting for loading indicators to hide")

# Wait for charts to actually render their content (e.g., ECharts, NVD3)
# We look for common chart containers that should have content
try:
await page.wait_for_selector('.chart-container canvas, .slice_container svg, .superset-chart-canvas, .grid-content .chart-container', timeout=60000)
logger.info(f"[DEBUG] Chart content detected")
except:
logger.warning(f"[DEBUG] Timeout waiting for chart content")
logger.info("[DEBUG] Chart content detected")
except Exception:
logger.warning("[DEBUG] Timeout waiting for chart content")

# Additional check: wait for all chart containers to have non-empty content
logger.info(f"[DEBUG] Waiting for all charts to have rendered content...")
logger.info("[DEBUG] Waiting for all charts to have rendered content...")
await page.wait_for_function("""() => {
const charts = document.querySelectorAll('.chart-container, .slice_container');
if (charts.length === 0) return true; // No charts to wait for
@@ -223,10 +223,10 @@ class ScreenshotService:
return hasCanvas || hasSvg || hasContent;
});
}""", timeout=60000)
logger.info(f"[DEBUG] All charts have rendered content")
logger.info("[DEBUG] All charts have rendered content")

# Scroll to bottom and back to top to trigger lazy loading of all charts
logger.info(f"[DEBUG] Scrolling to trigger lazy loading...")
logger.info("[DEBUG] Scrolling to trigger lazy loading...")
await page.evaluate("""async () => {
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
for (let i = 0; i < document.body.scrollHeight; i += 500) {
@@ -241,7 +241,7 @@ class ScreenshotService:
logger.warning(f"[DEBUG] Dashboard content wait failed: {e}, proceeding anyway after delay")

# Final stabilization delay - increased for complex dashboards
logger.info(f"[DEBUG] Final stabilization delay...")
logger.info("[DEBUG] Final stabilization delay...")
await asyncio.sleep(15)

# Logic to handle tabs and full-page capture
@@ -251,7 +251,8 @@ class ScreenshotService:
processed_tabs = set()

async def switch_tabs(depth=0):
if depth > 3: return # Limit recursion depth
if depth > 3:
return # Limit recursion depth

tab_selectors = [
'.ant-tabs-nav-list .ant-tabs-tab',
@@ -262,7 +263,8 @@ class ScreenshotService:
found_tabs = []
for selector in tab_selectors:
found_tabs = await page.locator(selector).all()
if found_tabs: break
if found_tabs:
break

if found_tabs:
logger.info(f"[DEBUG][TabSwitching] Found {len(found_tabs)} tabs at depth {depth}")
@@ -292,7 +294,8 @@ class ScreenshotService:
if "ant-tabs-tab-active" not in (await first_tab.get_attribute("class") or ""):
await first_tab.click()
await asyncio.sleep(1)
except: pass
except Exception:
pass

await switch_tabs()

@@ -423,7 +426,7 @@ class LLMClient:
self.default_model = default_model

# DEBUG: Log initialization parameters (without exposing full API key)
logger.info(f"[LLMClient.__init__] Initializing LLM client:")
logger.info("[LLMClient.__init__] Initializing LLM client:")
logger.info(f"[LLMClient.__init__] Provider Type: {provider_type}")
logger.info(f"[LLMClient.__init__] Base URL: {base_url}")
logger.info(f"[LLMClient.__init__] Default Model: {default_model}")
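Several files in this comparison import `tenacity`'s `retry`, `stop_after_attempt`, and `wait_exponential` to wrap flaky network calls. The policy those helpers encode can be sketched with the stdlib alone; the `retry_with_backoff` helper, the `flaky` function, and the delay values below are illustrative assumptions, not the plugin's actual retry configuration:

```python
import time

def retry_with_backoff(fn, attempts=3, base_delay=0.01, max_delay=0.05):
    """Call fn up to `attempts` times, sleeping with capped exponential backoff
    between failures (the behavior tenacity's stop_after_attempt + wait_exponential
    combination provides declaratively)."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: propagate the last error
            time.sleep(min(base_delay * (2 ** i), max_delay))

calls = []
def flaky():
    """Fails twice, then succeeds - a stand-in for a transient network error."""
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(retry_with_backoff(flaky))  # → ok
```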

@@ -7,15 +7,13 @@
# @RELATION: DEPENDS_ON -> superset_tool.utils
# @RELATION: USES -> TaskContext

from typing import Dict, Any, List, Optional
from pathlib import Path
import zipfile
from typing import Dict, Any, Optional
import re

from ..core.plugin_base import PluginBase
from ..core.logger import belief_scope, logger as app_logger
from ..core.superset_client import SupersetClient
from ..core.utils.fileio import create_temp_file, update_yamls, create_dashboard_export
from ..core.utils.fileio import create_temp_file
from ..dependencies import get_config_manager
from ..core.migration_engine import MigrationEngine
from ..core.database import SessionLocal
@@ -151,8 +149,8 @@ class MigrationPlugin(PluginBase):
dashboard_regex = params.get("dashboard_regex")

replace_db_config = params.get("replace_db_config", False)
from_db_id = params.get("from_db_id")
to_db_id = params.get("to_db_id")
params.get("from_db_id")
params.get("to_db_id")

# [DEF:MigrationPlugin.execute:Action]
# @PURPOSE: Execute the migration logic with proper task logging.
@@ -221,22 +219,29 @@ class MigrationPlugin(PluginBase):
log.warning("No dashboards found matching criteria.")
return

# Fetch mappings from database
db_mapping = {}
# Get mappings from params
db_mapping = params.get("db_mappings", {})
if not isinstance(db_mapping, dict):
db_mapping = {}

# Fetch additional mappings from database if requested
if replace_db_config:
db = SessionLocal()
try:
# Find environment IDs by name
src_env = db.query(Environment).filter(Environment.name == from_env_name).first()
tgt_env = db.query(Environment).filter(Environment.name == to_env_name).first()
src_env_db = db.query(Environment).filter(Environment.name == from_env_name).first()
tgt_env_db = db.query(Environment).filter(Environment.name == to_env_name).first()

if src_env and tgt_env:
mappings = db.query(DatabaseMapping).filter(
DatabaseMapping.source_env_id == src_env.id,
DatabaseMapping.target_env_id == tgt_env.id
if src_env_db and tgt_env_db:
stored_mappings = db.query(DatabaseMapping).filter(
DatabaseMapping.source_env_id == src_env_db.id,
DatabaseMapping.target_env_id == tgt_env_db.id
).all()
db_mapping = {m.source_db_uuid: m.target_db_uuid for m in mappings}
log.info(f"Loaded {len(db_mapping)} database mappings.")
# Provided mappings override stored ones
stored_map_dict = {m.source_db_uuid: m.target_db_uuid for m in stored_mappings}
stored_map_dict.update(db_mapping)
db_mapping = stored_map_dict
log.info(f"Loaded {len(stored_mappings)} database mappings from database.")
finally:
db.close()
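The merge order in the hunk above gives request-supplied mappings precedence over the ones stored in the `DatabaseMapping` table: the stored dict is built first, then overlaid with the provided entries. A standalone sketch of that dict merge (the UUID values are illustrative):

```python
# Mappings supplied with the request (highest precedence).
db_mapping = {"uuid-a": "target-1"}

# Mappings loaded from the DatabaseMapping table.
stored_map_dict = {"uuid-a": "target-OLD", "uuid-b": "target-2"}

# Provided mappings override stored ones: start from the stored dict,
# then overlay the request-supplied entries on top.
stored_map_dict.update(db_mapping)
db_mapping = stored_map_dict

print(db_mapping)  # → {'uuid-a': 'target-1', 'uuid-b': 'target-2'}
```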

@@ -301,7 +306,7 @@ class MigrationPlugin(PluginBase):
if match_alt:
db_name = match_alt.group(1)

logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")
app_logger.warning(f"[MigrationPlugin][Action] Detected missing password for database: {db_name}")

if task_id:
input_request = {
@@ -320,19 +325,19 @@ class MigrationPlugin(PluginBase):

# Retry import with password
if passwords:
logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
app_logger.info(f"[MigrationPlugin][Action] Retrying import for {title} with provided passwords.")
to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug, passwords=passwords)
logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
app_logger.info(f"[MigrationPlugin][Success] Dashboard {title} imported after password injection.")
# Clear passwords from params after use for security
if "passwords" in task.params:
del task.params["passwords"]
continue

logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)
app_logger.error(f"[MigrationPlugin][Failure] Failed to migrate dashboard {title}: {exc}", exc_info=True)

logger.info("[MigrationPlugin][Exit] Migration finished.")
app_logger.info("[MigrationPlugin][Exit] Migration finished.")
except Exception as e:
logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
app_logger.critical(f"[MigrationPlugin][Failure] Fatal error during migration: {e}", exc_info=True)
raise e
# [/DEF:MigrationPlugin.execute:Action]
# [/DEF:execute:Function]

@@ -8,7 +8,7 @@

# [SECTION: IMPORTS]
import re
from typing import Dict, Any, List, Optional
from typing import Dict, Any, Optional
from ..core.plugin_base import PluginBase
from ..core.superset_client import SupersetClient
from ..core.logger import logger, belief_scope
@@ -116,7 +116,7 @@ class SearchPlugin(PluginBase):
log = context.logger if context else logger

# Create sub-loggers for different components
superset_log = log.with_source("superset_api") if context else log
log.with_source("superset_api") if context else log
search_log = log.with_source("search") if context else log

if not env_name or not search_query:

@@ -19,7 +19,7 @@ from fastapi import UploadFile

from ...core.plugin_base import PluginBase
from ...core.logger import belief_scope, logger
from ...models.storage import StoredFile, FileCategory, StorageConfig
from ...models.storage import StoredFile, FileCategory
from ...dependencies import get_config_manager
from ...core.task_manager.context import TaskContext
# [/SECTION]
@@ -126,7 +126,7 @@ class StoragePlugin(PluginBase):

# Create sub-loggers for different components
storage_log = log.with_source("storage") if context else log
filesystem_log = log.with_source("filesystem") if context else log
log.with_source("filesystem") if context else log

storage_log.info(f"Executing with params: {params}")
# [/DEF:execute:Function]

@@ -10,7 +10,7 @@

# [SECTION: IMPORTS]
from typing import List, Optional
from pydantic import BaseModel, EmailStr, Field
from pydantic import BaseModel, EmailStr
from datetime import datetime
# [/SECTION]


@@ -20,7 +20,7 @@ sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import AuthSessionLocal, init_db
from src.core.auth.security import get_password_hash
from src.models.auth import User, Role, Permission
from src.models.auth import User, Role
from src.core.logger import logger, belief_scope
# [/SECTION]


@@ -9,13 +9,12 @@

# [SECTION: IMPORTS]
import sys
import os
from pathlib import Path

# Add src to path
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.database import init_db, auth_engine
from src.core.database import init_db
from src.core.logger import logger, belief_scope
from src.scripts.seed_permissions import seed_permissions
# [/SECTION]

backend/src/scripts/test_dataset_dashboard_relations.py (163 lines, new file)
@@ -0,0 +1,163 @@
#!/usr/bin/env python3
"""
Script to test dataset-to-dashboard relationships from Superset API.

Usage:
cd backend && .venv/bin/python3 src/scripts/test_dataset_dashboard_relations.py
"""

import json
import sys
from pathlib import Path

# Add src to path (parent of scripts directory)
sys.path.append(str(Path(__file__).parent.parent.parent))

from src.core.superset_client import SupersetClient
from src.core.config_manager import ConfigManager
from src.core.logger import logger


def test_dashboard_dataset_relations():
"""Test fetching dataset-to-dashboard relationships."""

# Load environment from existing config
config_manager = ConfigManager()
environments = config_manager.get_environments()

if not environments:
logger.error("No environments configured!")
return

# Use first available environment
env = environments[0]
logger.info(f"Using environment: {env.name} ({env.url})")

client = SupersetClient(env)

try:
# Authenticate
logger.info("Authenticating to Superset...")
client.authenticate()
logger.info("Authentication successful!")

# Test dashboard ID 13
dashboard_id = 13
logger.info(f"\n=== Fetching Dashboard {dashboard_id} ===")
dashboard = client.network.request(method="GET", endpoint=f"/dashboard/{dashboard_id}")

print("\nDashboard structure:")
print(f"  ID: {dashboard.get('id')}")
print(f"  Title: {dashboard.get('dashboard_title')}")
print(f"  Published: {dashboard.get('published')}")

# Check for slices/charts
if 'slices' in dashboard:
logger.info(f"\n  Found {len(dashboard['slices'])} slices/charts in dashboard")
for i, slice_data in enumerate(dashboard['slices'][:5]): # Show first 5
print(f"  Slice {i+1}:")
print(f"    ID: {slice_data.get('slice_id')}")
print(f"    Name: {slice_data.get('slice_name')}")
# Check for datasource_id
if 'datasource_id' in slice_data:
print(f"    Datasource ID: {slice_data['datasource_id']}")
if 'datasource_name' in slice_data:
print(f"    Datasource Name: {slice_data['datasource_name']}")
if 'datasource_type' in slice_data:
print(f"    Datasource Type: {slice_data['datasource_type']}")
else:
logger.warning("  No 'slices' field found in dashboard response")
logger.info(f"  Available fields: {list(dashboard.keys())}")

# Test dataset ID 26
dataset_id = 26
logger.info(f"\n=== Fetching Dataset {dataset_id} ===")
dataset = client.get_dataset(dataset_id)

print("\nDataset structure:")
print(f"  ID: {dataset.get('id')}")
print(f"  Table Name: {dataset.get('table_name')}")
print(f"  Schema: {dataset.get('schema')}")
print(f"  Database: {dataset.get('database', {}).get('database_name', 'Unknown')}")

# Check for dashboards that use this dataset
logger.info(f"\n=== Finding Dashboards using Dataset {dataset_id} ===")

# Method: Use Superset's related_objects API
try:
logger.info(f"  Using /api/v1/dataset/{dataset_id}/related_objects endpoint...")
related_objects = client.network.request(
method="GET",
endpoint=f"/dataset/{dataset_id}/related_objects"
)

logger.info(f"  Related objects response type: {type(related_objects)}")
logger.info(f"  Related objects keys: {list(related_objects.keys()) if isinstance(related_objects, dict) else 'N/A'}")

# Check for dashboards in related objects
if 'dashboards' in related_objects:
dashboards = related_objects['dashboards']
logger.info(f"  Found {len(dashboards)} dashboards using this dataset:")

for dash in dashboards:
logger.info(f"    - Dashboard ID {dash.get('id')}: {dash.get('dashboard_title', dash.get('title', 'Unknown'))}")
elif 'result' in related_objects:
# Some Superset versions use 'result' wrapper
result = related_objects['result']
if 'dashboards' in result:
dashboards = result['dashboards']
logger.info(f"  Found {len(dashboards)} dashboards using this dataset:")

for dash in dashboards:
logger.info(f"    - Dashboard ID {dash.get('id')}: {dash.get('dashboard_title', dash.get('title', 'Unknown'))}")
else:
logger.warning(f"  No 'dashboards' key in result. Keys: {list(result.keys())}")
else:
logger.warning(f"  No 'dashboards' key in response. Available keys: {list(related_objects.keys())}")
logger.info("  Full related_objects response:")
print(json.dumps(related_objects, indent=2, default=str)[:1000])

except Exception as e:
logger.error(f"  Error fetching related objects: {e}")
import traceback
traceback.print_exc()

# Method 2: Try to use the position_json from dashboard
logger.info("\n=== Analyzing Dashboard Position JSON ===")
if 'position_json' in dashboard:
position_data = json.loads(dashboard['position_json'])
logger.info(f"  Position data type: {type(position_data)}")

# Look for datasource references
datasource_ids = set()
if isinstance(position_data, dict):
for key, value in position_data.items():
if 'datasource' in key.lower() or key == 'DASHBOARD_VERSION_KEY':
logger.debug(f" Key: {key}, Value type: {type(value)}")
|
||||
elif isinstance(position_data, list):
|
||||
logger.info(f" Position data has {len(position_data)} items")
|
||||
for item in position_data[:3]: # Show first 3
|
||||
logger.debug(f" Item: {type(item)}, keys: {list(item.keys()) if isinstance(item, dict) else 'N/A'}")
|
||||
if isinstance(item, dict):
|
||||
if 'datasource_id' in item:
|
||||
datasource_ids.add(item['datasource_id'])
|
||||
|
||||
if datasource_ids:
|
||||
logger.info(f" Found datasource IDs: {datasource_ids}")
|
||||
|
||||
# Save full response for analysis
|
||||
output_file = Path(__file__).parent / "dataset_dashboard_analysis.json"
|
||||
with open(output_file, 'w') as f:
|
||||
json.dump({
|
||||
'dashboard': dashboard,
|
||||
'dataset': dataset
|
||||
}, f, indent=2, default=str)
|
||||
logger.info(f"\nFull response saved to: {output_file}")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error: {e}", exc_info=True)
|
||||
raise
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
test_dashboard_dataset_relations()
|
||||
18 backend/src/services/__init__.py Normal file
@@ -0,0 +1,18 @@
# [DEF:backend.src.services:Module]
# @TIER: STANDARD
# @SEMANTICS: services, package, init
# @PURPOSE: Package initialization for services module
# @LAYER: Core
# @RELATION: EXPORTS -> resource_service, mapping_service
# @NOTE: Only export services that don't cause circular imports
# @NOTE: GitService, AuthService, LLMProviderService have circular import issues - import directly when needed

# Only export services that don't cause circular imports
from .mapping_service import MappingService
from .resource_service import ResourceService

__all__ = [
    'MappingService',
    'ResourceService',
]
# [/DEF:backend.src.services:Module]
@@ -10,11 +10,11 @@
# @INVARIANT: Authentication must verify both credentials and account status.

# [SECTION: IMPORTS]
from typing import Optional, Dict, Any, List
from typing import Dict, Any
from sqlalchemy.orm import Session
from ..models.auth import User, Role
from ..core.auth.repository import AuthRepository
from ..core.auth.security import verify_password, get_password_hash
from ..core.auth.security import verify_password
from ..core.auth.jwt import create_access_token
from ..core.logger import belief_scope
# [/SECTION]
@@ -10,11 +10,10 @@
# @INVARIANT: All Git operations must be performed on a valid local directory.

import os
import shutil
import httpx
from git import Repo, RemoteProgress
from git import Repo
from fastapi import HTTPException
from typing import List, Optional
from typing import List
from datetime import datetime
from src.core.logger import logger, belief_scope
from src.models.git import GitProvider

@@ -167,7 +166,7 @@ class GitService:

        # Handle empty repository case (no commits)
        if not repo.heads and not repo.remotes:
            logger.warning(f"[create_branch][Action] Repository is empty. Creating initial commit to enable branching.")
            logger.warning("[create_branch][Action] Repository is empty. Creating initial commit to enable branching.")
            readme_path = os.path.join(repo.working_dir, "README.md")
            if not os.path.exists(readme_path):
                with open(readme_path, "w") as f:

@@ -178,7 +177,7 @@ class GitService:
        # Verify source branch exists
        try:
            repo.commit(from_branch)
        except:
        except Exception:
            logger.warning(f"[create_branch][Action] Source branch {from_branch} not found, using HEAD")
            from_branch = repo.head
@@ -9,7 +9,7 @@
from typing import List, Optional
from sqlalchemy.orm import Session
from ..models.llm import LLMProvider
from ..plugins.llm_analysis.models import LLMProviderConfig, LLMProviderType
from ..plugins.llm_analysis.models import LLMProviderConfig
from ..core.logger import belief_scope, logger
from cryptography.fernet import Fernet
import os
251 backend/src/services/resource_service.py Normal file
@@ -0,0 +1,251 @@
# [DEF:backend.src.services.resource_service:Module]
# @TIER: STANDARD
# @SEMANTICS: service, resources, dashboards, datasets, tasks, git
# @PURPOSE: Shared service for fetching resource data with Git status and task status
# @LAYER: Service
# @RELATION: DEPENDS_ON -> backend.src.core.superset_client
# @RELATION: DEPENDS_ON -> backend.src.core.task_manager
# @RELATION: DEPENDS_ON -> backend.src.services.git_service
# @INVARIANT: All resources include metadata about their current state

# [SECTION: IMPORTS]
from typing import List, Dict, Optional, Any
from ..core.superset_client import SupersetClient
from ..core.task_manager.models import Task
from ..services.git_service import GitService
from ..core.logger import logger, belief_scope
# [/SECTION]

# [DEF:ResourceService:Class]
# @PURPOSE: Provides centralized access to resource data with enhanced metadata
class ResourceService:

    # [DEF:__init__:Function]
    # @PURPOSE: Initialize the resource service with dependencies
    # @PRE: None
    # @POST: ResourceService is ready to fetch resources
    def __init__(self):
        with belief_scope("ResourceService.__init__"):
            self.git_service = GitService()
            logger.info("[ResourceService][Action] Initialized ResourceService")
    # [/DEF:__init__:Function]

    # [DEF:get_dashboards_with_status:Function]
    # @PURPOSE: Fetch dashboards from environment with Git status and last task status
    # @PRE: env is a valid Environment object
    # @POST: Returns list of dashboards with enhanced metadata
    # @PARAM: env (Environment) - The environment to fetch from
    # @PARAM: tasks (List[Task]) - List of tasks to check for status
    # @RETURN: List[Dict] - Dashboards with git_status and last_task fields
    # @RELATION: CALLS -> SupersetClient.get_dashboards_summary
    # @RELATION: CALLS -> self._get_git_status_for_dashboard
    # @RELATION: CALLS -> self._get_last_task_for_resource
    async def get_dashboards_with_status(
        self,
        env: Any,
        tasks: Optional[List[Task]] = None
    ) -> List[Dict[str, Any]]:
        with belief_scope("get_dashboards_with_status", f"env={env.id}"):
            client = SupersetClient(env)
            dashboards = client.get_dashboards_summary()

            # Enhance each dashboard with Git status and task status
            result = []
            for dashboard in dashboards:
                # dashboard is already a dict, no need to call .dict()
                dashboard_dict = dashboard
                dashboard_id = dashboard_dict.get('id')

                # Get Git status if repo exists
                git_status = self._get_git_status_for_dashboard(dashboard_id)
                dashboard_dict['git_status'] = git_status

                # Get last task status
                last_task = self._get_last_task_for_resource(
                    f"dashboard-{dashboard_id}",
                    tasks
                )
                dashboard_dict['last_task'] = last_task

                result.append(dashboard_dict)

            logger.info(f"[ResourceService][Coherence:OK] Fetched {len(result)} dashboards with status")
            return result
    # [/DEF:get_dashboards_with_status:Function]

    # [DEF:get_datasets_with_status:Function]
    # @PURPOSE: Fetch datasets from environment with mapping progress and last task status
    # @PRE: env is a valid Environment object
    # @POST: Returns list of datasets with enhanced metadata
    # @PARAM: env (Environment) - The environment to fetch from
    # @PARAM: tasks (List[Task]) - List of tasks to check for status
    # @RETURN: List[Dict] - Datasets with mapped_fields and last_task fields
    # @RELATION: CALLS -> SupersetClient.get_datasets_summary
    # @RELATION: CALLS -> self._get_last_task_for_resource
    async def get_datasets_with_status(
        self,
        env: Any,
        tasks: Optional[List[Task]] = None
    ) -> List[Dict[str, Any]]:
        with belief_scope("get_datasets_with_status", f"env={env.id}"):
            client = SupersetClient(env)
            datasets = client.get_datasets_summary()

            # Enhance each dataset with task status
            result = []
            for dataset in datasets:
                # dataset is already a dict, no need to call .dict()
                dataset_dict = dataset
                dataset_id = dataset_dict.get('id')

                # Get last task status
                last_task = self._get_last_task_for_resource(
                    f"dataset-{dataset_id}",
                    tasks
                )
                dataset_dict['last_task'] = last_task

                result.append(dataset_dict)

            logger.info(f"[ResourceService][Coherence:OK] Fetched {len(result)} datasets with status")
            return result
    # [/DEF:get_datasets_with_status:Function]

    # [DEF:get_activity_summary:Function]
    # @PURPOSE: Get summary of active and recent tasks for the activity indicator
    # @PRE: tasks is a list of Task objects
    # @POST: Returns summary with active_count and recent_tasks
    # @PARAM: tasks (List[Task]) - List of tasks to summarize
    # @RETURN: Dict - Activity summary
    def get_activity_summary(self, tasks: List[Task]) -> Dict[str, Any]:
        with belief_scope("get_activity_summary"):
            # Count active (RUNNING, WAITING_INPUT) tasks
            active_tasks = [
                t for t in tasks
                if t.status in ['RUNNING', 'WAITING_INPUT']
            ]

            # Get recent tasks (last 5)
            recent_tasks = sorted(
                tasks,
                key=lambda t: t.created_at,
                reverse=True
            )[:5]

            # Format recent tasks for frontend
            recent_tasks_formatted = []
            for task in recent_tasks:
                resource_name = self._extract_resource_name_from_task(task)
                recent_tasks_formatted.append({
                    'task_id': str(task.id),
                    'resource_name': resource_name,
                    'resource_type': self._extract_resource_type_from_task(task),
                    'status': task.status,
                    'started_at': task.created_at.isoformat() if task.created_at else None
                })

            return {
                'active_count': len(active_tasks),
                'recent_tasks': recent_tasks_formatted
            }
    # [/DEF:get_activity_summary:Function]

    # [DEF:_get_git_status_for_dashboard:Function]
    # @PURPOSE: Get Git sync status for a dashboard
    # @PRE: dashboard_id is a valid integer
    # @POST: Returns git status or None if no repo exists
    # @PARAM: dashboard_id (int) - The dashboard ID
    # @RETURN: Optional[Dict] - Git status with branch and sync_status
    # @RELATION: CALLS -> GitService.get_repo
    def _get_git_status_for_dashboard(self, dashboard_id: int) -> Optional[Dict[str, Any]]:
        try:
            repo = self.git_service.get_repo(dashboard_id)
            if not repo:
                return None

            # Check if there are uncommitted changes
            try:
                # Get current branch
                branch = repo.active_branch.name

                # Check for uncommitted changes
                is_dirty = repo.is_dirty()

                # Check for unpushed commits
                unpushed = len(list(repo.iter_commits(f'{branch}@{{u}}..{branch}'))) if '@{u}' in str(repo.refs) else 0

                if is_dirty or unpushed > 0:
                    sync_status = 'DIFF'
                else:
                    sync_status = 'OK'

                return {
                    'branch': branch,
                    'sync_status': sync_status
                }
            except Exception:
                logger.warning(f"[ResourceService][Warning] Failed to get git status for dashboard {dashboard_id}")
                return None
        except Exception:
            # No repo exists for this dashboard
            return None
    # [/DEF:_get_git_status_for_dashboard:Function]

    # [DEF:_get_last_task_for_resource:Function]
    # @PURPOSE: Get the most recent task for a specific resource
    # @PRE: resource_id is a valid string
    # @POST: Returns task summary or None if no tasks found
    # @PARAM: resource_id (str) - The resource identifier (e.g., "dashboard-123")
    # @PARAM: tasks (Optional[List[Task]]) - List of tasks to search
    # @RETURN: Optional[Dict] - Task summary with task_id and status
    def _get_last_task_for_resource(
        self,
        resource_id: str,
        tasks: Optional[List[Task]] = None
    ) -> Optional[Dict[str, Any]]:
        if not tasks:
            return None

        # Filter tasks for this resource
        resource_tasks = []
        for task in tasks:
            params = task.params or {}
            if params.get('resource_id') == resource_id:
                resource_tasks.append(task)

        if not resource_tasks:
            return None

        # Get most recent task
        last_task = max(resource_tasks, key=lambda t: t.created_at)

        return {
            'task_id': str(last_task.id),
            'status': last_task.status
        }
    # [/DEF:_get_last_task_for_resource:Function]

    # [DEF:_extract_resource_name_from_task:Function]
    # @PURPOSE: Extract resource name from task params
    # @PRE: task is a valid Task object
    # @POST: Returns resource name or task ID
    # @PARAM: task (Task) - The task to extract from
    # @RETURN: str - Resource name or fallback
    def _extract_resource_name_from_task(self, task: Task) -> str:
        params = task.params or {}
        return params.get('resource_name', f"Task {task.id}")
    # [/DEF:_extract_resource_name_from_task:Function]

    # [DEF:_extract_resource_type_from_task:Function]
    # @PURPOSE: Extract resource type from task params
    # @PRE: task is a valid Task object
    # @POST: Returns resource type or 'unknown'
    # @PARAM: task (Task) - The task to extract from
    # @RETURN: str - Resource type
    def _extract_resource_type_from_task(self, task: Task) -> str:
        params = task.params or {}
        return params.get('resource_type', 'unknown')
    # [/DEF:_extract_resource_type_from_task:Function]

# [/DEF:ResourceService:Class]
# [/DEF:backend.src.services.resource_service:Module]
BIN backend/tasks.db
Binary file not shown.
@@ -1,8 +1,6 @@
#!/usr/bin/env python3
"""Debug script to test Superset API authentication"""

import json
import requests
from pprint import pprint
from src.core.superset_client import SupersetClient
from src.core.config_manager import ConfigManager

@@ -53,7 +51,7 @@ def main():
    print("\n--- Response Headers ---")
    pprint(dict(ui_response.headers))

    print(f"\n--- Response Content Preview (200 chars) ---")
    print("\n--- Response Content Preview (200 chars) ---")
    print(repr(ui_response.text[:200]))

    if ui_response.status_code == 200:
@@ -19,17 +19,17 @@ db = SessionLocal()
provider = db.query(LLMProvider).filter(LLMProvider.id == '6c899741-4108-4196-aea4-f38ad2f0150e').first()

if provider:
    print(f"\nProvider found:")
    print("\nProvider found:")
    print(f"  ID: {provider.id}")
    print(f"  Name: {provider.name}")
    print(f"  Encrypted API Key (first 50 chars): {provider.api_key[:50]}")
    print(f"  Encrypted API Key Length: {len(provider.api_key)}")

    # Test decryption
    print(f"\nAttempting decryption...")
    print("\nAttempting decryption...")
    try:
        decrypted = fernet.decrypt(provider.api_key.encode()).decode()
        print(f"Decryption successful!")
        print("Decryption successful!")
        print(f"  Decrypted key length: {len(decrypted)}")
        print(f"  Decrypted key (first 8 chars): {decrypted[:8]}")
        print(f"  Decrypted key is empty: {len(decrypted) == 0}")
@@ -1,5 +1,4 @@
import sys
import os
from pathlib import Path

# Add src to path

@@ -8,7 +7,7 @@ sys.path.append(str(Path(__file__).parent.parent / "src"))
import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from src.core.database import Base, get_auth_db
from src.core.database import Base
from src.models.auth import User, Role, Permission, ADGroupMapping
from src.services.auth_service import AuthService
from src.core.auth.repository import AuthRepository
73 backend/tests/test_dashboards_api.py Normal file
@@ -0,0 +1,73 @@
# [DEF:backend.tests.test_dashboards_api:Module]
# @TIER: STANDARD
# @PURPOSE: Contract-driven tests for Dashboard Hub API
# @LAYER: Domain (Tests)
# @SEMANTICS: tests, dashboards, api, contract
# @RELATION: TESTS -> backend.src.api.routes.dashboards

from fastapi.testclient import TestClient
from unittest.mock import MagicMock, patch
from src.app import app
from src.api.routes.dashboards import DashboardsResponse

client = TestClient(app)

# [DEF:test_get_dashboards_success:Function]
# @TEST: GET /api/dashboards returns 200 and valid schema
# @PRE: env_id exists
# @POST: Response matches DashboardsResponse schema
def test_get_dashboards_success():
    with patch("src.api.routes.dashboards.get_config_manager") as mock_config, \
         patch("src.api.routes.dashboards.get_resource_service") as mock_service, \
         patch("src.api.routes.dashboards.has_permission") as mock_perm:

        # Mock environment
        mock_env = MagicMock()
        mock_env.id = "prod"
        mock_config.return_value.get_environments.return_value = [mock_env]

        # Mock resource service response
        mock_service.return_value.get_dashboards_with_status.return_value = [
            {
                "id": 1,
                "title": "Sales Report",
                "slug": "sales",
                "git_status": {"branch": "main", "sync_status": "OK"},
                "last_task": {"task_id": "task-1", "status": "SUCCESS"}
            }
        ]

        # Mock permission
        mock_perm.return_value = lambda: True

        response = client.get("/api/dashboards?env_id=prod")

        assert response.status_code == 200
        data = response.json()
        assert "dashboards" in data
        assert len(data["dashboards"]) == 1
        assert data["dashboards"][0]["title"] == "Sales Report"
        # Validate against Pydantic model
        DashboardsResponse(**data)

# [/DEF:test_get_dashboards_success:Function]

# [DEF:test_get_dashboards_env_not_found:Function]
# @TEST: GET /api/dashboards returns 404 if env_id missing
# @PRE: env_id does not exist
# @POST: Returns 404 error
def test_get_dashboards_env_not_found():
    with patch("src.api.routes.dashboards.get_config_manager") as mock_config, \
         patch("src.api.routes.dashboards.has_permission") as mock_perm:

        mock_config.return_value.get_environments.return_value = []
        mock_perm.return_value = lambda: True

        response = client.get("/api/dashboards?env_id=nonexistent")

        assert response.status_code == 404
        assert "Environment not found" in response.json()["detail"]

# [/DEF:test_get_dashboards_env_not_found:Function]

# [/DEF:backend.tests.test_dashboards_api:Module]
@@ -6,9 +6,7 @@
# @TIER: STANDARD

# [SECTION: IMPORTS]
import pytest
from datetime import datetime
from unittest.mock import Mock, patch, MagicMock
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
@@ -1,5 +1,4 @@
import pytest
import logging
from src.core.logger import (
    belief_scope,
    logger,

@@ -1,4 +1,3 @@
import pytest
from src.core.config_models import Environment
from src.core.logger import belief_scope
123 backend/tests/test_resource_hubs.py Normal file
@@ -0,0 +1,123 @@
import pytest
from fastapi.testclient import TestClient
from unittest.mock import MagicMock
from src.app import app
from src.dependencies import get_config_manager, get_task_manager, get_resource_service, has_permission

client = TestClient(app)

# [DEF:test_dashboards_api:Test]
# @PURPOSE: Verify GET /api/dashboards contract compliance
# @TEST: Valid env_id returns 200 and dashboard list
# @TEST: Invalid env_id returns 404
# @TEST: Search filter works

@pytest.fixture
def mock_deps():
    config_manager = MagicMock()
    task_manager = MagicMock()
    resource_service = MagicMock()

    # Mock environment
    env = MagicMock()
    env.id = "env1"
    config_manager.get_environments.return_value = [env]

    # Mock tasks
    task_manager.get_all_tasks.return_value = []

    # Mock dashboards
    resource_service.get_dashboards_with_status.return_value = [
        {"id": 1, "title": "Sales", "slug": "sales", "git_status": {"branch": "main", "sync_status": "OK"}, "last_task": None},
        {"id": 2, "title": "Marketing", "slug": "mkt", "git_status": None, "last_task": {"task_id": "t1", "status": "SUCCESS"}}
    ]

    app.dependency_overrides[get_config_manager] = lambda: config_manager
    app.dependency_overrides[get_task_manager] = lambda: task_manager
    app.dependency_overrides[get_resource_service] = lambda: resource_service

    # Bypass permission check
    mock_user = MagicMock()
    mock_user.username = "testadmin"

    # Override both get_current_user and has_permission
    from src.dependencies import get_current_user
    app.dependency_overrides[get_current_user] = lambda: mock_user

    # We need to override the specific instance returned by has_permission
    app.dependency_overrides[has_permission("plugin:migration", "READ")] = lambda: mock_user

    yield {
        "config": config_manager,
        "task": task_manager,
        "resource": resource_service
    }

    app.dependency_overrides.clear()

def test_get_dashboards_success(mock_deps):
    response = client.get("/api/dashboards?env_id=env1")
    assert response.status_code == 200
    data = response.json()
    assert "dashboards" in data
    assert len(data["dashboards"]) == 2
    assert data["dashboards"][0]["title"] == "Sales"
    assert data["dashboards"][0]["git_status"]["sync_status"] == "OK"

def test_get_dashboards_not_found(mock_deps):
    response = client.get("/api/dashboards?env_id=invalid")
    assert response.status_code == 404

def test_get_dashboards_search(mock_deps):
    response = client.get("/api/dashboards?env_id=env1&search=Sales")
    assert response.status_code == 200
    data = response.json()
    assert len(data["dashboards"]) == 1
    assert data["dashboards"][0]["title"] == "Sales"

# [/DEF:test_dashboards_api:Test]

# [DEF:test_datasets_api:Test]
# @PURPOSE: Verify GET /api/datasets contract compliance
# @TEST: Valid env_id returns 200 and dataset list
# @TEST: Invalid env_id returns 404
# @TEST: Search filter works
# @TEST: Negative - Service failure returns 503

def test_get_datasets_success(mock_deps):
    mock_deps["resource"].get_datasets_with_status.return_value = [
        {"id": 1, "table_name": "orders", "schema": "public", "database": "db1", "mapped_fields": {"total": 10, "mapped": 5}, "last_task": None}
    ]

    response = client.get("/api/datasets?env_id=env1")
    assert response.status_code == 200
    data = response.json()
    assert "datasets" in data
    assert len(data["datasets"]) == 1
    assert data["datasets"][0]["table_name"] == "orders"
    assert data["datasets"][0]["mapped_fields"]["mapped"] == 5

def test_get_datasets_not_found(mock_deps):
    response = client.get("/api/datasets?env_id=invalid")
    assert response.status_code == 404

def test_get_datasets_search(mock_deps):
    mock_deps["resource"].get_datasets_with_status.return_value = [
        {"id": 1, "table_name": "orders", "schema": "public", "database": "db1", "mapped_fields": {"total": 10, "mapped": 5}, "last_task": None},
        {"id": 2, "table_name": "users", "schema": "public", "database": "db1", "mapped_fields": {"total": 5, "mapped": 5}, "last_task": None}
    ]

    response = client.get("/api/datasets?env_id=env1&search=orders")
    assert response.status_code == 200
    data = response.json()
    assert len(data["datasets"]) == 1
    assert data["datasets"][0]["table_name"] == "orders"

def test_get_datasets_service_failure(mock_deps):
    mock_deps["resource"].get_datasets_with_status.side_effect = Exception("Superset down")

    response = client.get("/api/datasets?env_id=env1")
    assert response.status_code == 503
    assert "Failed to fetch datasets" in response.json()["detail"]

# [/DEF:test_datasets_api:Test]
49 backend/tests/test_resource_service.py Normal file
@@ -0,0 +1,49 @@
# [DEF:backend.tests.test_resource_service:Module]
# @TIER: STANDARD
# @PURPOSE: Contract-driven tests for ResourceService
# @RELATION: TESTS -> backend.src.services.resource_service

import pytest
from unittest.mock import MagicMock, patch
from src.services.resource_service import ResourceService

@pytest.mark.asyncio
async def test_get_dashboards_with_status():
    # [DEF:test_get_dashboards_with_status:Function]
    # @TEST: ResourceService correctly enhances dashboard data
    # @PRE: SupersetClient returns raw dashboards
    # @POST: Returned dicts contain git_status and last_task

    with patch("src.services.resource_service.SupersetClient") as mock_client, \
         patch("src.services.resource_service.GitService") as mock_git:

        service = ResourceService()

        # Mock Superset response
        mock_client.return_value.get_dashboards_summary.return_value = [
            {"id": 1, "title": "Test Dashboard", "slug": "test"}
        ]

        # Mock Git status
        mock_git.return_value.get_repo.return_value = None  # No repo

        # Mock tasks
        mock_task = MagicMock()
        mock_task.id = "task-123"
        mock_task.status = "RUNNING"
        mock_task.params = {"resource_id": "dashboard-1"}

        env = MagicMock()
        env.id = "prod"

        result = await service.get_dashboards_with_status(env, [mock_task])

        assert len(result) == 1
        assert result[0]["id"] == 1
        assert "git_status" in result[0]
        assert result[0]["last_task"]["task_id"] == "task-123"
        assert result[0]["last_task"]["status"] == "RUNNING"

    # [/DEF:test_get_dashboards_with_status:Function]

# [/DEF:backend.tests.test_resource_service:Module]
@@ -6,13 +6,10 @@
# @TIER: STANDARD

# [SECTION: IMPORTS]
import pytest
from unittest.mock import Mock, MagicMock
from datetime import datetime
from unittest.mock import Mock

from src.core.task_manager.task_logger import TaskLogger
from src.core.task_manager.context import TaskContext
from src.core.task_manager.models import LogEntry
# [/SECTION]

# [DEF:TestTaskLogger:Class]
145
docs/design/resource_centric_layout.md
Normal file
145
docs/design/resource_centric_layout.md
Normal file
@@ -0,0 +1,145 @@
# Design Document: Resource-Centric UI & Unified Task Experience

## 1. Core Philosophy

The application moves from a **Task-Centric** model (where users navigate to "Migration Tool" or "Git Tool") to a **Resource-Centric** model. Users navigate to the object they want to manage (Dashboard, Dataset) and perform actions on it.

**Goals:**

1. **Context preservation:** Users shouldn't lose their place in a list just to see a log.
2. **Discoverability:** All actions available for a resource are grouped together.
3. **Traceability:** Every action is explicitly linked to a Task ID with accessible logs.

---

## 2. Navigation Structure (Navbar)

**Old Menu:**
`[Home] [Migration] [Git Manager] [Mapper] [Settings] [Logout]`

**New Menu:**
`[Superset Manager] [Dashboards] [Datasets] [Storage] | [Activity (0)] [Settings] [User]`

* **Dashboards**: Main hub for all dashboard operations (Migrate, Backup, Git).
* **Datasets**: Hub for dataset documentation and mapping.
* **Storage**: File management (Backups, Repositories).
* **Activity**: Global indicator of running tasks. Clicking it opens the Task Drawer.

---

## 3. Page Layouts

### 3.1. Dashboard Hub (`/dashboards`)

The central place for managing Superset Dashboards.

**Wireframe:**

```text
+-----------------------------------------------------------------------+
| Select Source Env: [ Development (v) ] [ Refresh ]                    |
+-----------------------------------------------------------------------+
| Search: [ Filter by title... ]                                        |
+-----------------------------------------------------------------------+
| Title            | Slug        | Git Status    | Last Task | Actions  |
|------------------|-------------|---------------|-----------|----------|
| Sales Report     | sales-2023  | 🌿 main (OK)  | (v) Done  | [ ... ]  |
| HR Analytics     | hr-dash     | -             | ( ) Idle  | [ ... ]  |
| Logs Monitor     | logs-v2     | 🌿 dev (Diff) | (@) Run.. | [ ... ]  |
+-----------------------------------------------------------------------+
```

**Interaction Details:**

1. **Source Env Selector**: Loads dashboards via `superset_client.get_dashboards`.
2. **Status Column ("Last Task")**:
   * Shows the status of the *last known action* for this dashboard in the current session.
   * **States**: `Idle`, `Running` (Spinner), `Waiting Input` (Orange Key), `Success` (Green Check), `Error` (Red X).
   * **Click Action**: Clicking the icon/badge opens the **Task Drawer**.
3. **Actions Menu ([ ... ])**:
   * **Migrate**: Opens `DeploymentModal` (Simplified: just Target Env selector).
   * **Backup**: Immediately triggers `BackupPlugin`.
   * **Git Operations**:
     * *Init Repo* (if Git Status is empty).
     * *Commit/Push*, *History*, *Checkout* (if Git initialized).
   * **Validate**: Triggers LLM Analysis.
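The actions menu above can be sketched as a dispatch table from menu entries to task launches. This is a sketch only: `startTask`, the task type strings, and the payload shape are assumptions for illustration, not the real backend API.

```javascript
// Stand-in for a backend call that launches a task and returns a handle.
// Hypothetical: the real app would POST to the task manager here.
function startTask(type, payload) {
  return { taskId: `task-${type}`, type, payload };
}

// One handler per Actions-menu entry (illustrative type names).
const actionHandlers = {
  migrate: (dash, targetEnv) => startTask("migration", { uuid: dash.uuid, targetEnv }),
  backup: (dash) => startTask("backup", { uuid: dash.uuid }),
  validate: (dash) => startTask("llm_analysis", { uuid: dash.uuid }),
};

// Dispatch a menu click to its handler; unknown actions fail loudly.
function runAction(name, dash, extra) {
  const handler = actionHandlers[name];
  if (!handler) throw new Error(`Unknown action: ${name}`);
  return handler(dash, extra);
}
```

The point of the table is that new resource actions become one entry each, rather than new pages.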
### 3.2. Dataset Hub (`/datasets`)

The central place for managing physical datasets and semantic layers.

**Wireframe:**

```text
+-----------------------------------------------------------------------+
| Select Source Env: [ Production (v) ]                                 |
+-----------------------------------------------------------------------+
| Table Name       | Schema      | Mapped Fields | Last Task | Actions  |
|------------------|-------------|---------------|-----------|----------|
| fact_orders      | public      | 15 / 20       | (v) Done  | [ ... ]  |
| dim_users        | auth        | 0 / 5         | ( ) Idle  | [ ... ]  |
+-----------------------------------------------------------------------+
```

**Actions Menu ([ ... ])**:

* **Map Columns**: Opens the Mapping Modal (replaces `MapperPage`).
* **Generate Docs**: Triggers `DocumentationPlugin`.

---

## 4. The Global Task Drawer

**Concept:** A slide-out panel that overlays the right side of the screen. It persists in the DOM layout (Global Layout) but is hidden until triggered.

**Trigger Points:**

1. Clicking a **Status Badge** in any Grid row (Dashboard or Dataset).
2. Clicking the **Activity** indicator in the Navbar.

**Layout:**

```text
+---------------------------------------------------------------+
| Task: Migration "Sales Report"                      [X] Close |
| ID: 1234-5678-uuid                                            |
| Status: WAITING_INPUT (Paused)                                |
+---------------------------------------------------------------+
|                                                               |
| [Log Stream Area]                                             |
| 10:00:01 [INFO] Starting migration...                         |
| 10:00:02 [INFO] Exporting dashboard...                        |
| 10:00:05 [WARN] Target DB requires password!                  |
|                                                               |
+---------------------------------------------------------------+
| INTERACTIVE AREA (Dynamic)                                    |
|                                                               |
| Target Database: "Production DB"                              |
| Enter Password: [ ********** ]                                |
|                                                               |
|                                  [ Cancel ] [ Resume Task ]   |
+---------------------------------------------------------------+
```

**Behavior:**

* **Context Aware**: If I trigger a migration on "Sales Report", the Drawer automatically opens and subscribes to that task's ID.
* **Multi-Tasking**: I can close the drawer (click [X]) to let the task run in the background. The "Activity" badge in the navbar increments.
* **Input Handling**: Components like `PasswordPrompt` or `MissingMappingModal` are no longer center-screen modals blocking the whole UI. They are rendered *inside* the Interactive Area of the Drawer.
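This open/close flow can be sketched against the stores from §5.1. A minimal sketch with a stand-in `writable` (the real stores come from `svelte/store`) and hypothetical helper names:

```javascript
// Stand-in for svelte/store's writable so the sketch runs on its own.
// The .get() accessor is a convenience for this sketch only.
function writable(value) {
  const subs = new Set();
  return {
    set(v) { value = v; subs.forEach((fn) => fn(value)); },
    update(fn) { this.set(fn(value)); },
    subscribe(fn) { fn(value); subs.add(fn); return () => subs.delete(fn); },
    get() { return value; },
  };
}

const resourceTaskMap = writable({});
const activeDrawerTask = writable(null);
const isDrawerOpen = writable(false);

// Context Aware: launching an action focuses the drawer on that task.
function openDrawerForTask(resourceId, taskId) {
  resourceTaskMap.update((m) => ({ ...m, [resourceId]: { taskId, status: "RUNNING" } }));
  activeDrawerTask.set({ taskId });
  isDrawerOpen.set(true);
}

// Multi-Tasking: closing the drawer detaches the view, not the task.
function closeDrawer() {
  isDrawerOpen.set(false);
}
```

Note that `closeDrawer` touches only `isDrawerOpen`; the entry in `resourceTaskMap` stays `RUNNING`, which is what lets the grid badge and Activity counter keep tracking the background task.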
---

## 5. Technical Component Architecture

### 5.1. Stores (`stores/tasks.js`)

A new reactive store structure is needed to map Resources to Tasks.

```javascript
// Map resource UUIDs to their active/latest task UUIDs
export const resourceTaskMap = writable({
  "dashboard-uuid-1": { taskId: "task-uuid-A", status: "RUNNING" },
  "dataset-uuid-2": { taskId: "task-uuid-B", status: "SUCCESS" }
});

// The currently focused task in the Drawer
export const activeDrawerTask = writable(null); // { taskId: "..." }
export const isDrawerOpen = writable(false);
```
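The navbar's `Activity (n)` counter can then be derived from `resourceTaskMap`. A sketch over the plain map value (in the app this would likely sit behind a `derived` store); treating `WAITING_INPUT` as active is an assumption based on the drawer's paused state above:

```javascript
// Count non-terminal tasks for the Activity badge.
function countActive(map) {
  return Object.values(map).filter(
    (t) => t.status === "RUNNING" || t.status === "WAITING_INPUT"
  ).length;
}
```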
### 5.2. Components

* `DashboardHub.svelte`: Main page.
* `DatasetHub.svelte`: Main page.
* `GlobalTaskDrawer.svelte`: Lives in `+layout.svelte`. Connects to `activeDrawerTask`.
* `ActionMenu.svelte`: Reusable dropdown for grids.

@@ -14,7 +14,7 @@
// [/SECTION]

// [SECTION: PROPS]
export let sourceDatabases: Array<{uuid: string, database_name: string}> = [];
export let sourceDatabases: Array<{uuid: string, database_name: string, engine?: string}> = [];
export let targetDatabases: Array<{uuid: string, database_name: string}> = [];
export let mappings: Array<{source_db_uuid: string, target_db_uuid: string}> = [];
export let suggestions: Array<{source_db_uuid: string, target_db_uuid: string, confidence: number}> = [];

@@ -29,7 +29,16 @@
 * @post 'update' event is dispatched.
 */
function updateMapping(sourceUuid: string, targetUuid: string) {
  dispatch('update', { sourceUuid, targetUuid });
  const sDb = sourceDatabases.find(d => d.uuid === sourceUuid);
  const tDb = targetDatabases.find(d => d.uuid === targetUuid);

  dispatch('update', {
    sourceUuid,
    targetUuid,
    sourceName: sDb?.database_name || "",
    targetName: tDb?.database_name || "",
    engine: sDb?.engine || ""
  });
}
// [/DEF:updateMapping:Function]
@@ -1,21 +1,33 @@
<!-- [DEF:TaskLogViewer:Component] -->
<!--
@SEMANTICS: task, log, viewer, modal, inline
@PURPOSE: Displays detailed logs for a specific task in a modal or inline using TaskLogPanel.
@TIER: CRITICAL
@SEMANTICS: task, log, viewer, inline, realtime
@PURPOSE: Displays task logs inline (in drawer) or as modal. Merges real-time WebSocket logs with polled historical logs.
@LAYER: UI
@RELATION: USES -> frontend/src/services/taskService.js, frontend/src/components/tasks/TaskLogPanel.svelte
@RELATION: USES -> frontend/src/services/taskService.js
@RELATION: USES -> frontend/src/components/tasks/TaskLogPanel.svelte
@INVARIANT: Real-time logs are always appended without duplicates.
-->
<script>
  import { createEventDispatcher, onMount, onDestroy } from 'svelte';
  import { getTaskLogs } from '../services/taskService.js';
  import { t } from '../lib/i18n';
  import { Button } from '../lib/ui';
  import TaskLogPanel from './tasks/TaskLogPanel.svelte';
  /**
   * @TIER CRITICAL
   * @PURPOSE Displays detailed logs for a specific task inline or in a modal using TaskLogPanel.
   * @UX_STATE Loading -> Shows spinner/text while fetching initial logs
   * @UX_STATE Streaming -> Displays logs with auto-scroll, real-time appending
   * @UX_STATE Error -> Shows error message with recovery option
   * @UX_FEEDBACK Auto-scroll keeps newest logs visible
   * @UX_RECOVERY Refresh button re-fetches logs from API
   */
  import { createEventDispatcher, onDestroy } from "svelte";
  import { getTaskLogs } from "../services/taskService.js";
  import { t } from "../lib/i18n";
  import TaskLogPanel from "./tasks/TaskLogPanel.svelte";

  export let show = false;
  export let inline = false;
  export let taskId = null;
  export let taskStatus = null; // To know if we should poll
  export let taskStatus = null;
  export let realTimeLogs = [];

  const dispatch = createEventDispatcher();

@@ -24,29 +36,47 @@
  let error = "";
  let interval;
  let autoScroll = true;
  let selectedSource = 'all';
  let selectedLevel = 'all';

  $: shouldShow = inline || show;

  // [DEF:handleRealTimeLogs:Action]
  /** @PURPOSE Append real-time logs as they arrive from WebSocket, preventing duplicates */
  $: if (realTimeLogs && realTimeLogs.length > 0) {
    const lastLog = realTimeLogs[realTimeLogs.length - 1];
    const exists = logs.some(
      (l) =>
        l.timestamp === lastLog.timestamp &&
        l.message === lastLog.message,
    );
    if (!exists) {
      logs = [...logs, lastLog];
      console.log(
        `[TaskLogViewer][Action] Appended real-time log, total=${logs.length}`,
      );
    }
  }
  // [/DEF:handleRealTimeLogs:Action]

  // [DEF:fetchLogs:Function]
  /**
   * @purpose Fetches logs for the current task.
   * @pre taskId must be set.
   * @post logs array is updated with data from taskService.
   * @side_effect Updates logs, loading, and error state.
   * @PURPOSE Fetches logs for the current task from API (polling fallback).
   * @PRE taskId must be set.
   * @POST logs array is updated with data from taskService.
   * @SIDE_EFFECT Updates logs, loading, and error state.
   */
  async function fetchLogs() {
    if (!taskId) return;
    console.log(`[fetchLogs][Action] Fetching logs for task context={{'taskId': '${taskId}', 'source': '${selectedSource}', 'level': '${selectedLevel}'}}`);
    console.log(`[TaskLogViewer][Action] Fetching logs for task=${taskId}`);
    try {
      // Note: getTaskLogs currently doesn't support filters, but we can filter client-side for now
      // or update taskService later. For US1, the WebSocket handles real-time filtering.
      logs = await getTaskLogs(taskId);
      console.log(`[fetchLogs][Coherence:OK] Logs fetched context={{'count': ${logs.length}}}`);
      console.log(
        `[TaskLogViewer][Coherence:OK] Logs fetched count=${logs.length}`,
      );
    } catch (e) {
      error = e.message;
      console.error(`[fetchLogs][Coherence:Failed] Error fetching logs context={{'error': '${e.message}'}}`);
      console.error(
        `[TaskLogViewer][Coherence:Failed] Error: ${e.message}`,
      );
    } finally {
      loading = false;
    }
@@ -55,121 +85,192 @@

  function handleFilterChange(event) {
    const { source, level } = event.detail;
    selectedSource = source;
    selectedLevel = level;
    // Re-fetch or re-filter if needed.
    // For now, we just log it as the WebSocket will handle real-time updates with filters.
    console.log(`[TaskLogViewer] Filter changed: source=${source}, level=${level}`);
    console.log(
      `[TaskLogViewer][Action] Filter changed: source=${source}, level=${level}`,
    );
  }

  // [DEF:close:Function]
  /**
   * @purpose Closes the log viewer modal.
   * @pre Modal is open.
   * @post Modal is closed and close event is dispatched.
   */
  function close() {
    dispatch('close');
    show = false;
  function handleRefresh() {
    console.log(`[TaskLogViewer][Action] Manual refresh`);
    fetchLogs();
  }
  // [/DEF:close:Function]

  // React to changes in show/taskId/taskStatus
  $: if (shouldShow && taskId) {
    if (interval) clearInterval(interval);

    logs = [];
    loading = true;
    error = "";
    fetchLogs();

    // Poll if task is running (Fallback for when WS is not used)
    if (taskStatus === 'RUNNING' || taskStatus === 'AWAITING_INPUT' || taskStatus === 'AWAITING_MAPPING') {
      interval = setInterval(fetchLogs, 3000);

    if (
      taskStatus === "RUNNING" ||
      taskStatus === "AWAITING_INPUT" ||
      taskStatus === "AWAITING_MAPPING"
    ) {
      interval = setInterval(fetchLogs, 5000);
    }
  } else {
    if (interval) clearInterval(interval);
  }

  // [DEF:onDestroy:Function]
  /**
   * @purpose Cleans up the polling interval.
   * @pre Component is being destroyed.
   * @post Polling interval is cleared.
   */
  onDestroy(() => {
    if (interval) clearInterval(interval);
  });
  // [/DEF:onDestroy:Function]
</script>
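The duplicate guard in `handleRealTimeLogs` above can be isolated into a pure helper. A sketch using the same `(timestamp, message)` identity the component uses (the helper name is ours, not the component's):

```javascript
// Append an entry only if no log with the same timestamp + message
// already exists; returns the original array unchanged on a duplicate.
function appendUnique(logs, entry) {
  const exists = logs.some(
    (l) => l.timestamp === entry.timestamp && l.message === entry.message
  );
  return exists ? logs : [...logs, entry];
}
```

Returning the same array reference on a duplicate also keeps Svelte from re-rendering when nothing changed.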
{#if shouldShow}
  {#if inline}
    <div class="flex flex-col h-full w-full p-4">
      <div class="flex justify-between items-center mb-4">
        <h3 class="text-lg font-medium text-gray-900">
          {$t.tasks?.logs_title} <span class="text-sm text-gray-500 font-normal">({taskId})</span>
        </h3>
        <Button variant="ghost" size="sm" on:click={fetchLogs} class="text-blue-600">{$t.tasks?.refresh}</Button>
      </div>

      <div class="flex-1 min-h-[400px]">
        {#if loading && logs.length === 0}
          <p class="text-gray-500 text-center">{$t.tasks?.loading}</p>
        {:else if error}
          <p class="text-red-500 text-center">{error}</p>
        {:else}
          <TaskLogPanel
            {taskId}
            {logs}
            {autoScroll}
            on:filterChange={handleFilterChange}
          />
        {/if}
      </div>
    <div class="log-viewer-inline">
      {#if loading && logs.length === 0}
        <div class="loading-state">
          <div class="loading-spinner"></div>
          <span>{$t.tasks?.loading || "Loading logs..."}</span>
        </div>
      {:else if error}
        <div class="error-state">
          <span class="error-icon">⚠</span>
          <span>{error}</span>
          <button class="retry-btn" on:click={handleRefresh}
            >Retry</button
          >
        </div>
      {:else}
        <TaskLogPanel
          {taskId}
          {logs}
          {autoScroll}
          on:filterChange={handleFilterChange}
          on:refresh={handleRefresh}
        />
      {/if}
    </div>
  {:else}
    <div class="fixed inset-0 z-50 overflow-y-auto" aria-labelledby="modal-title" role="dialog" aria-modal="true">
      <div class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0">
        <!-- Background overlay -->
        <div class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity" aria-hidden="true" on:click={close}></div>
    <div
      class="fixed inset-0 z-50 overflow-y-auto"
      aria-labelledby="modal-title"
      role="dialog"
      aria-modal="true"
    >
      <div
        class="flex items-end justify-center min-h-screen pt-4 px-4 pb-20 text-center sm:block sm:p-0"
      >
        <div
          class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity"
          aria-hidden="true"
          on:click={() => {
            show = false;
            dispatch("close");
          }}
          on:keydown={(e) => e.key === "Escape" && (show = false)}
          role="presentation"
        ></div>

        <span class="hidden sm:inline-block sm:align-middle sm:h-screen" aria-hidden="true">​</span>

        <div class="inline-block align-bottom bg-white rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-4xl sm:w-full">
          <div class="bg-white px-4 pt-5 pb-4 sm:p-6 sm:pb-4">
            <div class="sm:flex sm:items-start">
              <div class="mt-3 text-center sm:mt-0 sm:ml-4 sm:text-left w-full">
                <h3 class="text-lg leading-6 font-medium text-gray-900 flex justify-between items-center mb-4" id="modal-title">
                  <span>{$t.tasks.logs_title} <span class="text-sm text-gray-500 font-normal">({taskId})</span></span>
                  <Button variant="ghost" size="sm" on:click={fetchLogs} class="text-blue-600">{$t.tasks.refresh}</Button>
                </h3>

                <div class="h-[500px]">
                  {#if loading && logs.length === 0}
                    <p class="text-gray-500 text-center">{$t.tasks.loading}</p>
                  {:else if error}
                    <p class="text-red-500 text-center">{error}</p>
                  {:else}
                    <TaskLogPanel
                      {taskId}
                      {logs}
                      {autoScroll}
                      on:filterChange={handleFilterChange}
                    />
                  {/if}
                </div>
              </div>
            </div>
        <div
          class="inline-block align-bottom bg-gray-900 rounded-lg text-left overflow-hidden shadow-xl transform transition-all sm:my-8 sm:align-middle sm:max-w-4xl sm:w-full"
        >
          <div class="p-6">
            <div class="flex justify-between items-center mb-4">
              <h3
                class="text-lg font-medium text-gray-100"
                id="modal-title"
              >
                {$t.tasks?.logs_title || "Task Logs"}
              </h3>
              <button
                class="text-gray-500 hover:text-gray-300"
                on:click={() => {
                  show = false;
                  dispatch("close");
                }}
                aria-label="Close">✕</button
              >
            </div>
            <div class="h-[500px]">
              {#if loading && logs.length === 0}
                <p class="text-gray-500 text-center">
                  {$t.tasks?.loading || "Loading..."}
                </p>
              {:else if error}
                <p class="text-red-400 text-center">{error}</p>
              {:else}
                <TaskLogPanel
                  {taskId}
                  {logs}
                  {autoScroll}
                  on:filterChange={handleFilterChange}
                />
              {/if}
            </div>
          </div>
          <div class="bg-gray-50 px-4 py-3 sm:px-6 sm:flex sm:flex-row-reverse">
            <Button variant="secondary" on:click={close}>
              {$t.common.cancel}
            </Button>
          </div>
        </div>
      </div>
    </div>
  {/if}
{/if}
<!-- [/DEF:TaskLogViewer:Component] -->

<!-- [/DEF:TaskLogViewer:Component] -->
<style>
  .log-viewer-inline {
    display: flex;
    flex-direction: column;
    height: 100%;
    width: 100%;
  }

  .loading-state {
    display: flex;
    align-items: center;
    justify-content: center;
    gap: 0.75rem;
    height: 100%;
    color: #64748b;
    font-size: 0.875rem;
  }

  .loading-spinner {
    width: 1.25rem;
    height: 1.25rem;
    border: 2px solid #334155;
    border-top-color: #3b82f6;
    border-radius: 50%;
    animation: spin 0.8s linear infinite;
  }

  @keyframes spin {
    to {
      transform: rotate(360deg);
    }
  }

  .error-state {
    display: flex;
    align-items: center;
    justify-content: center;
    gap: 0.5rem;
    height: 100%;
    color: #f87171;
    font-size: 0.875rem;
  }

  .error-icon {
    font-size: 1.25rem;
  }

  .retry-btn {
    background-color: #1e293b;
    color: #94a3b8;
    border: 1px solid #334155;
    border-radius: 0.375rem;
    padding: 0.25rem 0.75rem;
    font-size: 0.75rem;
    cursor: pointer;
    transition: all 0.15s;
  }

  .retry-btn:hover {
    background-color: #334155;
    color: #e2e8f0;
  }
</style>
@@ -1,196 +1,207 @@
<!-- [DEF:LogEntryRow:Component] -->
<!-- @SEMANTICS: log, entry, row, ui, svelte -->
<!-- @PURPOSE: Optimized row rendering for a single log entry with color coding and progress bar support. -->
<!-- @TIER: STANDARD -->
<!-- @LAYER: UI -->
<!-- @UX_STATE: Idle -> (displays log entry) -->

<!--
@TIER: STANDARD
@SEMANTICS: log, entry, row, ui
@PURPOSE: Renders a single log entry with stacked layout optimized for narrow drawer panels.
@LAYER: UI
@UX_STATE: Idle -> Displays log entry with color-coded level and source badges.
-->
<script>
  /** @type {Object} log - The log entry object */
  export let log;
  /** @type {boolean} showSource - Whether to show the source tag */
  export let showSource = true;

  // Format timestamp for display
  // [DEF:formatTime:Function]
  /** @PURPOSE Format ISO timestamp to HH:MM:SS */
  $: formattedTime = formatTime(log.timestamp);

  function formatTime(timestamp) {
    if (!timestamp) return '';
    if (!timestamp) return "";
    const date = new Date(timestamp);
    return date.toLocaleTimeString('en-US', {
      hour12: false,
      hour: '2-digit',
      minute: '2-digit',
      second: '2-digit'
    return date.toLocaleTimeString("en-US", {
      hour12: false,
      hour: "2-digit",
      minute: "2-digit",
      second: "2-digit",
    });
  }
  // [/DEF:formatTime:Function]

  // Get level class for styling
  $: levelClass = getLevelClass(log.level);

  function getLevelClass(level) {
    switch (level?.toUpperCase()) {
      case 'DEBUG': return 'level-debug';
      case 'INFO': return 'level-info';
      case 'WARNING': return 'level-warning';
      case 'ERROR': return 'level-error';
      default: return 'level-info';
      case "DEBUG":
        return "level-debug";
      case "INFO":
        return "level-info";
      case "WARNING":
        return "level-warning";
      case "ERROR":
        return "level-error";
      default:
        return "level-info";
    }
  }

  // Get source class for styling
  $: sourceClass = getSourceClass(log.source);

  function getSourceClass(source) {
    if (!source) return 'source-default';
    return `source-${source.toLowerCase().replace(/[^a-z0-9]/g, '-')}`;
    if (!source) return "source-default";
    return `source-${source.toLowerCase().replace(/[^a-z0-9]/g, "-")}`;
  }

  // Check if log has progress metadata
  $: hasProgress = log.metadata?.progress !== undefined;
  $: progressPercent = log.metadata?.progress || 0;
</script>
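The source-to-class mapping above can be exercised on its own. Note how `superset_api` and `superset-api` normalize to the same class, which is presumably why the stylesheet lists both selector spellings:

```javascript
// Normalize a log source into a CSS class: lowercase, then collapse
// every non-alphanumeric character to "-" (mirrors getSourceClass above).
function getSourceClass(source) {
  if (!source) return "source-default";
  return `source-${source.toLowerCase().replace(/[^a-z0-9]/g, "-")}`;
}
```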
<div class="log-entry-row {levelClass}" class:has-progress={hasProgress}>
  <span class="log-time">{formattedTime}</span>
  <span class="log-level {levelClass}">{log.level || 'INFO'}</span>
  {#if showSource}
    <span class="log-source {sourceClass}">{log.source || 'system'}</span>
  {/if}
  <span class="log-message">
    {log.message}
    {#if hasProgress}
      <div class="progress-bar-container">
        <div class="progress-bar" style="width: {progressPercent}%"></div>
        <span class="progress-text">{progressPercent.toFixed(0)}%</span>
      </div>
<div class="log-row">
  <!-- Meta line: time + level + source -->
  <div class="log-meta">
    <span class="log-time">{formattedTime}</span>
    <span class="log-level {levelClass}">{log.level || "INFO"}</span>
    {#if showSource && log.source}
      <span class="log-source {sourceClass}">{log.source}</span>
    {/if}
  </span>
  </div>

  <!-- Message -->
  <div class="log-message">{log.message}</div>

  <!-- Progress bar (if applicable) -->
  {#if hasProgress}
    <div class="progress-container">
      <div class="progress-track">
        <div class="progress-fill" style="width: {progressPercent}%"></div>
      </div>
      <span class="progress-text">{progressPercent.toFixed(0)}%</span>
    </div>
  {/if}
</div>
<!-- [/DEF:LogEntryRow:Component] -->

<style>
  .log-entry-row {
    display: grid;
    grid-template-columns: 80px 70px auto 1fr;
    gap: 0.75rem;
    padding: 0.375rem 0.75rem;
    font-family: 'JetBrains Mono', 'Fira Code', monospace;
    font-size: 0.8125rem;
    border-bottom: 1px solid #1e293b;
    align-items: start;
  .log-row {
    padding: 0.5rem 0.75rem;
    border-bottom: 1px solid rgba(30, 41, 59, 0.6);
    transition: background-color 0.1s;
  }

  .log-entry-row.has-progress {
    grid-template-columns: 80px 70px auto 1fr;
  }

  .log-entry-row:hover {
  .log-row:hover {
    background-color: rgba(30, 41, 59, 0.5);
  }

  /* Alternating row backgrounds handled by parent */
  .log-meta {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    margin-bottom: 0.25rem;
  }

  .log-time {
    color: #64748b;
    font-size: 0.75rem;
    white-space: nowrap;
    font-family: "JetBrains Mono", "Fira Code", monospace;
    font-size: 0.6875rem;
    color: #475569;
    flex-shrink: 0;
  }

  .log-level {
    font-family: "JetBrains Mono", "Fira Code", monospace;
    font-weight: 600;
    text-transform: uppercase;
    font-size: 0.6875rem;
    padding: 0.125rem 0.375rem;
    border-radius: 0.25rem;
    text-align: center;
    font-size: 0.625rem;
    padding: 0.0625rem 0.375rem;
    border-radius: 0.1875rem;
    letter-spacing: 0.03em;
    flex-shrink: 0;
  }

  .level-debug {
    color: #64748b;
    background-color: rgba(100, 116, 139, 0.2);
    background-color: rgba(100, 116, 139, 0.15);
  }

  .level-info {
    color: #3b82f6;
    background-color: rgba(59, 130, 246, 0.15);
    color: #38bdf8;
    background-color: rgba(56, 189, 248, 0.1);
  }

  .level-warning {
    color: #f59e0b;
    background-color: rgba(245, 158, 11, 0.15);
    color: #fbbf24;
    background-color: rgba(251, 191, 36, 0.1);
  }

  .level-error {
    color: #ef4444;
    background-color: rgba(239, 68, 68, 0.15);
    color: #f87171;
    background-color: rgba(248, 113, 113, 0.1);
  }

  .log-source {
    font-size: 0.6875rem;
    padding: 0.125rem 0.375rem;
    border-radius: 0.25rem;
    background-color: rgba(100, 116, 139, 0.2);
    color: #94a3b8;
    white-space: nowrap;
    text-overflow: ellipsis;
    overflow: hidden;
    max-width: 120px;
    font-size: 0.625rem;
    padding: 0.0625rem 0.375rem;
    border-radius: 0.1875rem;
    background-color: rgba(100, 116, 139, 0.15);
    color: #64748b;
    flex-shrink: 0;
  }

  .source-plugin {
    background-color: rgba(34, 197, 94, 0.15);
    color: #22c55e;
    background-color: rgba(34, 197, 94, 0.1);
    color: #4ade80;
  }

  .source-superset-api, .source-superset_api {
    background-color: rgba(168, 85, 247, 0.15);
    color: #a855f7;
  .source-superset-api,
  .source-superset_api {
    background-color: rgba(168, 85, 247, 0.1);
    color: #c084fc;
  }

  .source-git {
    background-color: rgba(249, 115, 22, 0.15);
    color: #f97316;
    background-color: rgba(249, 115, 22, 0.1);
    color: #fb923c;
  }

  .source-system {
    background-color: rgba(59, 130, 246, 0.15);
    color: #3b82f6;
    background-color: rgba(56, 189, 248, 0.1);
    color: #38bdf8;
  }

  .log-message {
    color: #e2e8f0;
    font-family: "JetBrains Mono", "Fira Code", monospace;
    font-size: 0.8125rem;
    line-height: 1.5;
    color: #cbd5e1;
    word-break: break-word;
    white-space: pre-wrap;
  }

  .progress-bar-container {
  .progress-container {
    display: flex;
    align-items: center;
    gap: 0.5rem;
    margin-top: 0.25rem;
    background-color: #1e293b;
    border-radius: 0.25rem;
    overflow: hidden;
    height: 1rem;
    margin-top: 0.375rem;
  }

  .progress-bar {
    background: linear-gradient(90deg, #3b82f6, #8b5cf6);
  .progress-track {
    flex: 1;
    height: 0.375rem;
    background-color: #1e293b;
    border-radius: 9999px;
    overflow: hidden;
  }

  .progress-fill {
    height: 100%;
    background: linear-gradient(90deg, #3b82f6, #8b5cf6);
    border-radius: 9999px;
    transition: width 0.3s ease;
  }

  .progress-text {
    font-family: "JetBrains Mono", "Fira Code", monospace;
    font-size: 0.625rem;
    color: #94a3b8;
    padding: 0 0.25rem;
    position: absolute;
    right: 0.25rem;
  }

  .progress-bar-container {
    position: relative;
    color: #64748b;
    flex-shrink: 0;
  }
</style>

<!-- [/DEF:LogEntryRow:Component] -->
@@ -1,161 +1,234 @@
<!-- [DEF:LogFilterBar:Component] -->
<!-- @SEMANTICS: log, filter, ui, svelte -->
<!-- @PURPOSE: UI component for filtering logs by level, source, and text search. -->
<!-- @TIER: STANDARD -->
<!-- @LAYER: UI -->
<!-- @UX_STATE: Idle -> FilterChanged -> (parent applies filter) -->

<!--
@TIER: STANDARD
@SEMANTICS: log, filter, ui
@PURPOSE: Compact filter toolbar for logs: level, source, and text search in a single dense row.
@LAYER: UI
@UX_STATE: Idle -> Shows filter controls
@UX_STATE: Active -> Filters applied, clear button visible
-->
<script>
  import { createEventDispatcher } from 'svelte';
  import { createEventDispatcher } from "svelte";

  // Props
  /** @type {string[]} availableSources - List of available source options */
  export let availableSources = [];
  /** @type {string} selectedLevel - Currently selected log level filter */
  export let selectedLevel = '';
  export let selectedLevel = "";
  /** @type {string} selectedSource - Currently selected source filter */
  export let selectedSource = '';
  export let selectedSource = "";
  /** @type {string} searchText - Current search text */
  export let searchText = '';
  export let searchText = "";

  const dispatch = createEventDispatcher();

  // Log level options
  const levelOptions = [
    { value: '', label: 'All Levels' },
    { value: 'DEBUG', label: 'Debug' },
    { value: 'INFO', label: 'Info' },
    { value: 'WARNING', label: 'Warning' },
    { value: 'ERROR', label: 'Error' }
    { value: "", label: "All" },
    { value: "DEBUG", label: "Debug" },
    { value: "INFO", label: "Info" },
    { value: "WARNING", label: "Warn" },
    { value: "ERROR", label: "Error" },
  ];

  // Handle filter changes
  function handleLevelChange(event) {
    selectedLevel = event.target.value;
    dispatch('filter-change', { level: selectedLevel, source: selectedSource, search: searchText });
    dispatch("filter-change", {
      level: selectedLevel,
      source: selectedSource,
      search: searchText,
    });
  }

  function handleSourceChange(event) {
    selectedSource = event.target.value;
    dispatch('filter-change', { level: selectedLevel, source: selectedSource, search: searchText });
    dispatch("filter-change", {
      level: selectedLevel,
      source: selectedSource,
      search: searchText,
    });
  }

  function handleSearchChange(event) {
    searchText = event.target.value;
    dispatch('filter-change', { level: selectedLevel, source: selectedSource, search: searchText });
    dispatch("filter-change", {
      level: selectedLevel,
      source: selectedSource,
      search: searchText,
    });
  }

  function clearFilters() {
    selectedLevel = '';
    selectedSource = '';
    searchText = '';
    dispatch('filter-change', { level: '', source: '', search: '' });
    selectedLevel = "";
    selectedSource = "";
    searchText = "";
    dispatch("filter-change", { level: "", source: "", search: "" });
  }

  $: hasActiveFilters = selectedLevel || selectedSource || searchText;
</script>
|
||||
|
||||
<div class="log-filter-bar">
|
||||
<div class="filter-group">
|
||||
<label for="level-filter" class="filter-label">Level:</label>
|
||||
<select id="level-filter" class="filter-select" value={selectedLevel} on:change={handleLevelChange}>
|
||||
<div class="filter-bar">
|
||||
<div class="filter-controls">
|
||||
<select
|
||||
class="filter-select"
|
||||
value={selectedLevel}
|
||||
on:change={handleLevelChange}
|
||||
aria-label="Filter by level"
|
||||
>
|
||||
{#each levelOptions as option}
|
||||
<option value={option.value}>{option.label}</option>
|
||||
{/each}
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div class="filter-group">
|
||||
<label for="source-filter" class="filter-label">Source:</label>
|
||||
<select id="source-filter" class="filter-select" value={selectedSource} on:change={handleSourceChange}>
|
||||
<select
|
||||
class="filter-select"
|
||||
value={selectedSource}
|
||||
on:change={handleSourceChange}
|
||||
aria-label="Filter by source"
|
||||
>
|
||||
<option value="">All Sources</option>
|
||||
{#each availableSources as source}
|
||||
<option value={source}>{source}</option>
|
||||
{/each}
|
||||
</select>
|
||||
|
||||
<div class="search-wrapper">
|
||||
<svg
|
||||
class="search-icon"
|
||||
xmlns="http://www.w3.org/2000/svg"
|
||||
width="14"
|
||||
height="14"
|
||||
viewBox="0 0 24 24"
|
||||
fill="none"
|
||||
stroke="currentColor"
|
||||
stroke-width="2"
|
||||
>
|
||||
<circle cx="11" cy="11" r="8" />
|
||||
<path d="M21 21l-4.35-4.35" />
|
||||
</svg>
|
||||
<input
|
||||
type="text"
|
||||
class="search-input"
|
||||
placeholder="Search..."
|
||||
value={searchText}
|
||||
on:input={handleSearchChange}
|
||||
aria-label="Search logs"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="filter-group search-group">
|
||||
<label for="search-filter" class="filter-label">Search:</label>
|
||||
<input
|
||||
id="search-filter"
|
||||
type="text"
|
||||
class="filter-input"
|
||||
placeholder="Search logs..."
|
||||
value={searchText}
|
||||
on:input={handleSearchChange}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{#if selectedLevel || selectedSource || searchText}
|
||||
<button class="clear-btn" on:click={clearFilters}>
|
||||
Clear Filters
|
||||
{#if hasActiveFilters}
|
||||
<button
|
||||
class="clear-btn"
|
||||
on:click={clearFilters}
|
||||
aria-label="Clear filters"
|
||||
>
|
||||
<svg
|
||||
xmlns="http://www.w3.org/2000/svg"
|
||||
width="12"
|
||||
height="12"
|
||||
viewBox="0 0 24 24"
|
||||
fill="none"
|
||||
stroke="currentColor"
|
||||
stroke-width="2"
|
||||
>
|
||||
<path d="M18 6L6 18M6 6l12 12" />
|
||||
</svg>
|
||||
</button>
|
||||
{/if}
|
||||
</div>
|
||||
<!-- [/DEF:LogFilterBar:Component] -->
|
||||
|
||||
<style>
|
||||
.log-filter-bar {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: 1rem;
|
||||
align-items: center;
|
||||
padding: 0.75rem;
|
||||
background-color: #1e293b;
|
||||
border-radius: 0.5rem;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
|
||||
.filter-group {
|
||||
.filter-bar {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: 0.5rem;
|
||||
}
|
||||
|
||||
.filter-label {
|
||||
font-size: 0.875rem;
|
||||
color: #94a3b8;
|
||||
font-weight: 500;
|
||||
}
|
||||
|
||||
.filter-select, .filter-input {
|
||||
background-color: #334155;
|
||||
color: #e2e8f0;
|
||||
border: 1px solid #475569;
|
||||
border-radius: 0.375rem;
|
||||
gap: 0.375rem;
|
||||
padding: 0.5rem 0.75rem;
|
||||
font-size: 0.875rem;
|
||||
min-width: 120px;
|
||||
background-color: #0f172a;
|
||||
border-bottom: 1px solid #1e293b;
|
||||
}
|
||||
|
||||
.filter-select:focus, .filter-input:focus {
|
||||
.filter-controls {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: 0.375rem;
|
||||
flex: 1;
|
||||
min-width: 0;
|
||||
}
|
||||
|
||||
.filter-select {
|
||||
background-color: #1e293b;
|
||||
color: #94a3b8;
|
||||
border: 1px solid #334155;
|
||||
border-radius: 0.25rem;
|
||||
padding: 0.3125rem 0.5rem;
|
||||
font-size: 0.75rem;
|
||||
cursor: pointer;
|
||||
flex-shrink: 0;
|
||||
appearance: none;
|
||||
-webkit-appearance: none;
|
||||
background-image: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='12' height='12' viewBox='0 0 24 24' fill='none' stroke='%2364748b' stroke-width='2'%3E%3Cpath d='M6 9l6 6 6-6'/%3E%3C/svg%3E");
|
||||
background-repeat: no-repeat;
|
||||
background-position: right 0.375rem center;
|
||||
padding-right: 1.5rem;
|
||||
}
|
||||
|
||||
.filter-select:focus {
|
||||
outline: none;
|
||||
border-color: #3b82f6;
|
||||
box-shadow: 0 0 0 2px rgba(59, 130, 246, 0.2);
|
||||
}
|
||||
|
||||
.search-group {
|
||||
.search-wrapper {
|
||||
position: relative;
|
||||
flex: 1;
|
||||
min-width: 200px;
|
||||
min-width: 0;
|
||||
}
|
||||
|
||||
.filter-input {
|
||||
.search-icon {
|
||||
position: absolute;
|
||||
left: 0.5rem;
|
||||
top: 50%;
|
||||
transform: translateY(-50%);
|
||||
color: #475569;
|
||||
pointer-events: none;
|
||||
}
|
||||
|
||||
.search-input {
|
||||
width: 100%;
|
||||
max-width: 300px;
|
||||
background-color: #1e293b;
|
||||
color: #e2e8f0;
|
||||
border: 1px solid #334155;
|
||||
border-radius: 0.25rem;
|
||||
padding: 0.3125rem 0.5rem 0.3125rem 1.75rem;
|
||||
font-size: 0.75rem;
|
||||
}
|
||||
|
||||
.search-input::placeholder {
|
||||
color: #475569;
|
||||
}
|
||||
|
||||
.search-input:focus {
|
||||
outline: none;
|
||||
border-color: #3b82f6;
|
||||
}
|
||||
|
||||
.clear-btn {
|
||||
background-color: #475569;
|
||||
color: #e2e8f0;
|
||||
border: none;
|
||||
border-radius: 0.375rem;
|
||||
padding: 0.5rem 1rem;
|
||||
font-size: 0.875rem;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
padding: 0.3125rem;
|
||||
background: none;
|
||||
border: 1px solid #334155;
|
||||
border-radius: 0.25rem;
|
||||
color: #64748b;
|
||||
cursor: pointer;
|
||||
transition: background-color 0.2s;
|
||||
flex-shrink: 0;
|
||||
transition: all 0.15s;
|
||||
}
|
||||
|
||||
.clear-btn:hover {
|
||||
background-color: #64748b;
|
||||
color: #f87171;
|
||||
border-color: #f87171;
|
||||
background-color: rgba(248, 113, 113, 0.1);
|
||||
}
|
||||
</style>
|
||||
|
||||
<!-- [/DEF:LogFilterBar:Component] -->
|
||||
|
||||
@@ -2,58 +2,82 @@
<!--
@TIER: STANDARD
@SEMANTICS: task, log, panel, filter, list
@PURPOSE: Combines log filtering and display into a single cohesive dark-themed panel.
@LAYER: UI
@RELATION: USES -> frontend/src/components/tasks/LogFilterBar.svelte
@RELATION: USES -> frontend/src/components/tasks/LogEntryRow.svelte
@INVARIANT: Must always display logs in chronological order and respect auto-scroll preference.
@UX_STATE: Empty -> Displays "No logs" message
@UX_STATE: Populated -> Displays list of LogEntryRow components
@UX_STATE: AutoScroll -> Automatically scrolls to bottom on new logs
-->
<script>
  import { createEventDispatcher, onMount, afterUpdate } from "svelte";
  import LogFilterBar from "./LogFilterBar.svelte";
  import LogEntryRow from "./LogEntryRow.svelte";

  /**
   * @PURPOSE Component properties and state.
   * @PRE taskId is a valid string, logs is an array of LogEntry objects.
   */
  export let taskId = "";
  export let logs = [];
  export let autoScroll = true;

  const dispatch = createEventDispatcher();
  let scrollContainer;
  let selectedSource = "all";
  let selectedLevel = "all";
  let searchText = "";

  // Filtered logs based on current filters
  $: filteredLogs = filterLogs(logs, selectedLevel, selectedSource, searchText);

  function filterLogs(allLogs, level, source, search) {
    return allLogs.filter((log) => {
      if (
        level &&
        level !== "all" &&
        log.level?.toUpperCase() !== level.toUpperCase()
      )
        return false;
      if (
        source &&
        source !== "all" &&
        log.source?.toLowerCase() !== source.toLowerCase()
      )
        return false;
      if (search && !log.message?.toLowerCase().includes(search.toLowerCase()))
        return false;
      return true;
    });
  }

  // Extract unique sources from logs
  $: availableSources = [...new Set(logs.map((l) => l.source).filter(Boolean))];

  /**
   * @PURPOSE: Handles filter changes from LogFilterBar.
   * @PRE: event.detail contains source and level.
   * @POST: Dispatches filterChange event to parent.
   * @SIDE_EFFECT: Updates local filter state.
   */
  function handleFilterChange(event) {
    const { source, level, search } = event.detail;
    selectedSource = source || "all";
    selectedLevel = level || "all";
    searchText = search || "";
    console.log(
      `[TaskLogPanel][Action] Filter: level=${selectedLevel}, source=${selectedSource}, search=${searchText}`,
    );
    dispatch("filterChange", { source, level });
  }

  /**
   * @PURPOSE: Scrolls the log container to the bottom.
   * @PRE: autoScroll is true and scrollContainer is bound.
   * @POST: scrollContainer.scrollTop is set to scrollHeight.
   */
  function scrollToBottom() {
    if (autoScroll && scrollContainer) {
      scrollContainer.scrollTop = scrollContainer.scrollHeight;
    }
  }

  function toggleAutoScroll() {
    autoScroll = !autoScroll;
    if (autoScroll) scrollToBottom();
  }

  afterUpdate(() => {
    scrollToBottom();
  });
@@ -63,57 +87,162 @@
  });
</script>

<div class="log-panel">
  <!-- Filter Bar -->
  <LogFilterBar {availableSources} on:filter-change={handleFilterChange} />

  <!-- Log List -->
  <div bind:this={scrollContainer} class="log-list">
    {#if filteredLogs.length === 0}
      <div class="empty-logs">
        <svg
          xmlns="http://www.w3.org/2000/svg"
          width="32"
          height="32"
          viewBox="0 0 24 24"
          fill="none"
          stroke="currentColor"
          stroke-width="1.5"
        >
          <path d="M14 2H6a2 2 0 00-2 2v16a2 2 0 002 2h12a2 2 0 002-2V8z" />
          <polyline points="14 2 14 8 20 8" />
          <line x1="16" y1="13" x2="8" y2="13" />
          <line x1="16" y1="17" x2="8" y2="17" />
          <polyline points="10 9 9 9 8 9" />
        </svg>
        <span>No logs available</span>
      </div>
    {:else}
      {#each filteredLogs as log}
        <LogEntryRow {log} />
      {/each}
    {/if}
  </div>

  <!-- Footer Stats -->
  <div class="log-footer">
    <span class="log-count">
      {filteredLogs.length}{filteredLogs.length !== logs.length
        ? ` / ${logs.length}`
        : ""} entries
    </span>
    <button
      class="autoscroll-btn"
      class:active={autoScroll}
      on:click={toggleAutoScroll}
      aria-label="Toggle auto-scroll"
    >
      {#if autoScroll}
        <span class="pulse-dot"></span>
      {/if}
      Auto-scroll {autoScroll ? "on" : "off"}
    </button>
  </div>
</div>

<style>
  .log-panel {
    display: flex;
    flex-direction: column;
    height: 100%;
    background-color: #0f172a;
    overflow: hidden;
  }

  .log-list {
    flex: 1;
    overflow-y: auto;
    overflow-x: hidden;
  }

  /* Custom scrollbar */
  .log-list::-webkit-scrollbar {
    width: 6px;
  }

  .log-list::-webkit-scrollbar-track {
    background: #0f172a;
  }

  .log-list::-webkit-scrollbar-thumb {
    background: #334155;
    border-radius: 3px;
  }

  .log-list::-webkit-scrollbar-thumb:hover {
    background: #475569;
  }

  .empty-logs {
    display: flex;
    flex-direction: column;
    align-items: center;
    justify-content: center;
    padding: 3rem 1rem;
    color: #334155;
    gap: 0.75rem;
  }

  .empty-logs span {
    font-size: 0.8125rem;
    color: #475569;
  }

  .log-footer {
    display: flex;
    align-items: center;
    justify-content: space-between;
    padding: 0.375rem 0.75rem;
    border-top: 1px solid #1e293b;
    background-color: #0f172a;
  }

  .log-count {
    font-family: "JetBrains Mono", "Fira Code", monospace;
    font-size: 0.6875rem;
    color: #475569;
  }

  .autoscroll-btn {
    display: flex;
    align-items: center;
    gap: 0.375rem;
    background: none;
    border: none;
    color: #475569;
    font-size: 0.6875rem;
    cursor: pointer;
    padding: 0.125rem 0.375rem;
    border-radius: 0.25rem;
    transition: all 0.15s;
  }

  .autoscroll-btn:hover {
    background-color: #1e293b;
    color: #94a3b8;
  }

  .autoscroll-btn.active {
    color: #22d3ee;
  }

  .pulse-dot {
    display: inline-block;
    width: 5px;
    height: 5px;
    border-radius: 50%;
    background-color: #22d3ee;
    animation: pulse 2s infinite;
  }

  @keyframes pulse {
    0%,
    100% {
      opacity: 1;
    }
    50% {
      opacity: 0.3;
    }
  }
</style>
<!-- [/DEF:TaskLogPanel:Component] -->
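The `filterLogs` helper above is plain JavaScript and can be exercised on its own; a minimal standalone check of its matching rules (the sample log objects are illustrative):

```javascript
// Same matching rules as TaskLogPanel.filterLogs: a log passes when
// level/source match (or are "all") and the message contains the
// search text, all compared case-insensitively.
function filterLogs(allLogs, level, source, search) {
  return allLogs.filter((log) => {
    if (level && level !== "all" && log.level?.toUpperCase() !== level.toUpperCase())
      return false;
    if (source && source !== "all" && log.source?.toLowerCase() !== source.toLowerCase())
      return false;
    if (search && !log.message?.toLowerCase().includes(search.toLowerCase()))
      return false;
    return true;
  });
}

const demo = [
  { level: "INFO", source: "api", message: "Task started" },
  { level: "ERROR", source: "worker", message: "Task failed" },
];
console.log(filterLogs(demo, "error", "all", "task").length); // 1
```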
@@ -165,6 +165,36 @@ export const api = {
  getStorageSettings: () => fetchApi('/settings/storage'),
  updateStorageSettings: (storage) => requestApi('/settings/storage', 'PUT', storage),
  getEnvironmentsList: () => fetchApi('/environments'),
  getEnvironmentDatabases: (id) => fetchApi(`/environments/${id}/databases`),

  // Dashboards
  getDashboards: (envId, options = {}) => {
    const params = new URLSearchParams({ env_id: envId });
    if (options.search) params.append('search', options.search);
    if (options.page) params.append('page', options.page);
    if (options.page_size) params.append('page_size', options.page_size);
    return fetchApi(`/dashboards?${params.toString()}`);
  },
  getDatabaseMappings: (sourceEnvId, targetEnvId) => fetchApi(`/dashboards/db-mappings?source_env_id=${sourceEnvId}&target_env_id=${targetEnvId}`),

  // Datasets
  getDatasets: (envId, options = {}) => {
    const params = new URLSearchParams({ env_id: envId });
    if (options.search) params.append('search', options.search);
    if (options.page) params.append('page', options.page);
    if (options.page_size) params.append('page_size', options.page_size);
    return fetchApi(`/datasets?${params.toString()}`);
  },
  getDatasetIds: (envId, options = {}) => {
    const params = new URLSearchParams({ env_id: envId });
    if (options.search) params.append('search', options.search);
    return fetchApi(`/datasets/ids?${params.toString()}`);
  },
  getDatasetDetail: (envId, datasetId) => fetchApi(`/datasets/${datasetId}?env_id=${envId}`),

  // Settings
  getConsolidatedSettings: () => fetchApi('/settings/consolidated'),
  updateConsolidatedSettings: (settings) => requestApi('/settings/consolidated', 'PATCH', settings),
};
// [/DEF:api:Data]
@@ -187,3 +217,7 @@ export const updateEnvironmentSchedule = api.updateEnvironmentSchedule;
export const getEnvironmentsList = api.getEnvironmentsList;
export const getStorageSettings = api.getStorageSettings;
export const updateStorageSettings = api.updateStorageSettings;
export const getDashboards = api.getDashboards;
export const getDatasets = api.getDatasets;
export const getConsolidatedSettings = api.getConsolidatedSettings;
export const updateConsolidatedSettings = api.updateConsolidatedSettings;
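The list endpoints (`getDashboards`, `getDatasets`) share one query-string pattern: a required `env_id` plus optional `search`/`page`/`page_size`. A standalone sketch of just that pattern, using only the standard `URLSearchParams` API (the `buildListQuery` name is illustrative, not part of the module):

```javascript
// Illustrative helper mirroring the query construction in
// getDashboards/getDatasets: required env_id, optional filters.
function buildListQuery(envId, options = {}) {
  const params = new URLSearchParams({ env_id: envId });
  if (options.search) params.append('search', options.search);
  if (options.page) params.append('page', options.page);
  if (options.page_size) params.append('page_size', options.page_size);
  return params.toString();
}

console.log(buildListQuery('dev', { search: 'sales', page: 2 }));
// env_id=dev&search=sales&page=2
```

`URLSearchParams` preserves insertion order and percent-encodes values, so search terms with spaces or special characters stay safe in the URL.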
@@ -1,30 +1,70 @@
<!-- [DEF:frontend.src.routes.+layout:Module] -->
<!--
@TIER: STANDARD
@SEMANTICS: layout, root, navigation, sidebar, toast
@PURPOSE: Root layout component that provides global UI structure (Sidebar, Navbar, Footer, TaskDrawer, Toasts).
@LAYER: UI (Layout)
@RELATION: DEPENDS_ON -> Sidebar
@RELATION: DEPENDS_ON -> TopNavbar
@RELATION: DEPENDS_ON -> Footer
@RELATION: DEPENDS_ON -> Toast
@RELATION: DEPENDS_ON -> ProtectedRoute
@RELATION: DEPENDS_ON -> TaskDrawer
@INVARIANT: All pages except /login are wrapped in ProtectedRoute.
-->
<script>
  import '../app.css';
  import Footer from '../components/Footer.svelte';
  import Toast from '../components/Toast.svelte';
  import ProtectedRoute from '../components/auth/ProtectedRoute.svelte';
  import Breadcrumbs from '$lib/components/layout/Breadcrumbs.svelte';
  import Sidebar from '$lib/components/layout/Sidebar.svelte';
  import TopNavbar from '$lib/components/layout/TopNavbar.svelte';
  import TaskDrawer from '$lib/components/layout/TaskDrawer.svelte';
  import { page } from '$app/stores';
  import { sidebarStore } from '$lib/stores/sidebar.js';

  $: isLoginPage = $page.url.pathname === '/login';
  // `??` (not `||`) so an explicit `isExpanded: false` collapses the sidebar
  $: isExpanded = $sidebarStore?.isExpanded ?? true;
</script>

<Toast />

<main class="bg-gray-50 min-h-screen">
  {#if isLoginPage}
    <div class="p-4">
      <slot />
    </div>
  {:else}
    <ProtectedRoute>
      <!-- Sidebar -->
      <Sidebar />

      <!-- Main content area with TopNavbar -->
      <div class="flex flex-col min-h-screen {isExpanded ? 'md:ml-60' : 'md:ml-16'} transition-all duration-200">
        <!-- Top Navigation Bar -->
        <TopNavbar />
        <!-- Breadcrumbs -->
        <div class="mt-16">
          <Breadcrumbs />
        </div>

        <!-- Page content -->
        <div class="p-4 flex-grow">
          <slot />
        </div>

        <!-- Footer -->
        <Footer />
      </div>

      <!-- Global Task Drawer -->
      <TaskDrawer />
    </ProtectedRoute>
  {/if}
</main>
<!-- [/DEF:frontend.src.routes.+layout:Module] -->
@@ -1,99 +1,35 @@
<!-- [DEF:HomePage:Page] -->
<script>
  /**
   * @TIER: CRITICAL
   * @PURPOSE: Redirect to Dashboard Hub as per UX requirements
   * @LAYER: UI
   * @INVARIANT: Always redirects to /dashboards
   *
   * @UX_STATE: Loading -> Shows loading indicator
   * @UX_FEEDBACK: Redirects to /dashboards
   */
  import { onMount } from 'svelte';
  import { goto } from '$app/navigation';

  onMount(() => {
    // Redirect to Dashboard Hub as per UX requirements
    goto('/dashboards', { replaceState: true });
  });
</script>

<div class="loading">
  <svg class="animate-spin h-8 w-8 text-blue-600" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
    <circle class="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" stroke-width="4"></circle>
    <path class="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
  </svg>
</div>

<style>
  .loading {
    @apply flex items-center justify-center min-h-screen;
  }
</style>
<!-- [/DEF:HomePage:Page] -->
943
frontend/src/routes/datasets/+page.svelte
Normal file
@@ -0,0 +1,943 @@
|
||||
<!-- [DEF:DatasetHub:Page] -->
|
||||
<script>
|
||||
/**
|
||||
* @TIER: CRITICAL
|
||||
* @PURPOSE: Dataset Hub - Dedicated hub for datasets with mapping progress
|
||||
* @LAYER: UI
|
||||
* @RELATION: BINDS_TO -> sidebarStore, taskDrawerStore
|
||||
* @INVARIANT: Always shows environment selector and dataset grid
|
||||
*
|
||||
* @UX_STATE: Loading -> Shows skeleton loader
|
||||
* @UX_STATE: Loaded -> Shows dataset grid with mapping progress
|
||||
* @UX_STATE: Error -> Shows error banner with retry button
|
||||
* @UX_STATE: Selecting -> Checkboxes checked, floating action panel appears
|
||||
* @UX_STATE: BulkAction-Modal -> Map Columns or Generate Docs modal open
|
||||
* @UX_FEEDBACK: Clicking task status opens Task Drawer
|
||||
* @UX_FEEDBACK: Mapped % column shows progress bar + percentage text
|
||||
* @UX_FEEDBACK: Floating panel slides up from bottom when items selected
|
||||
* @UX_RECOVERY: Refresh button reloads dataset list
|
||||
*/
|
||||
|
||||
import { onMount } from 'svelte';
|
||||
import { goto } from '$app/navigation';
|
||||
import { t } from '$lib/i18n';
|
||||
import { openDrawerForTask } from '$lib/stores/taskDrawer.js';
|
||||
import { api } from '$lib/api.js';
|
||||
import { debounce } from '$lib/utils/debounce.js';
|
||||
|
||||
// State
|
||||
let selectedEnv = null;
|
||||
let datasets = [];
|
||||
let isLoading = true;
|
||||
let error = null;
|
||||
|
||||
// Pagination state
|
||||
let currentPage = 1;
|
||||
let pageSize = 10;
|
||||
let totalPages = 1;
|
||||
let total = 0;
|
||||
|
||||
// Selection state
|
||||
let selectedIds = new Set();
|
||||
let isAllSelected = false;
|
||||
let isAllVisibleSelected = false;
|
||||
|
||||
// Search state
|
||||
let searchQuery = '';
|
||||
|
||||
// Bulk action modal state
|
||||
let showMapColumnsModal = false;
|
||||
let showGenerateDocsModal = false;
|
||||
let mapSourceType = 'postgresql';
|
||||
let mapConnectionId = '';
|
||||
let mapFileData = null;
|
||||
let mapFileInput;
|
||||
let llmProvider = '';
|
||||
let llmOptions = {};
|
||||
|
||||
// Environment options - will be loaded from API
|
||||
let environments = [];
|
||||
|
||||
// Debounced search function
|
||||
const debouncedSearch = debounce((query) => {
|
||||
searchQuery = query;
|
||||
loadDatasets();
|
||||
}, 300);
|
||||
|
||||
// Load environments and datasets on mount
|
||||
onMount(async () => {
|
||||
await loadEnvironments();
|
||||
await loadDatasets();
|
||||
});
|
||||
|
||||
// Load environments from API
|
||||
async function loadEnvironments() {
|
||||
try {
|
||||
const response = await api.getEnvironments();
|
||||
environments = response;
|
||||
// Set first environment as default if no selection
|
||||
if (environments.length > 0 && !selectedEnv) {
|
||||
selectedEnv = environments[0].id;
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('[DatasetHub][Coherence:Failed] Failed to load environments:', err);
|
||||
// Use fallback environments if API fails
|
||||
environments = [
|
||||
{ id: 'development', name: 'Development' },
|
||||
{ id: 'staging', name: 'Staging' },
|
||||
{ id: 'production', name: 'Production' }
|
||||
];
|
||||
}
|
||||
}
|
||||
|
||||
// Load datasets from API
|
||||
async function loadDatasets() {
|
||||
if (!selectedEnv) return;
|
||||
|
||||
isLoading = true;
|
||||
error = null;
|
||||
try {
|
||||
const response = await api.getDatasets(selectedEnv, {
|
||||
search: searchQuery || undefined,
|
||||
page: currentPage,
|
||||
page_size: pageSize
|
||||
});
|
||||
|
||||
// Preserve selected IDs across pagination
|
||||
const newSelectedIds = new Set();
|
||||
response.datasets.forEach(d => {
|
||||
if (selectedIds.has(d.id)) {
|
||||
newSelectedIds.add(d.id);
|
||||
}
|
||||
});
|
||||
selectedIds = newSelectedIds;
|
||||
|
||||
datasets = response.datasets.map(d => ({
|
||||
id: d.id,
|
||||
table_name: d.table_name,
|
||||
schema: d.schema,
|
||||
database: d.database,
|
||||
mappedFields: d.mapped_fields ? {
|
||||
total: d.mapped_fields.total,
|
||||
mapped: d.mapped_fields.mapped
|
||||
} : null,
|
||||
lastTask: d.last_task ? {
|
||||
status: d.last_task.status?.toLowerCase() || null,
|
||||
id: d.last_task.task_id
|
||||
} : null,
|
||||
actions: ['map_columns'] // All datasets have map columns option
|
||||
}));
|
||||
|
||||
// Update pagination state
|
||||
total = response.total;
|
||||
totalPages = response.total_pages;
|
||||
|
||||
// Update selection state
|
||||
updateSelectionState();
|
||||
} catch (err) {
|
||||
error = err.message || 'Failed to load datasets';
|
||||
console.error('[DatasetHub][Coherence:Failed]', err);
|
||||
} finally {
|
||||
isLoading = false;
|
||||
}
|
||||
}
|
||||
|
||||
// Handle environment change
|
||||
function handleEnvChange(event) {
|
||||
selectedEnv = event.target.value;
|
||||
currentPage = 1;
|
||||
selectedIds.clear();
|
||||
loadDatasets();
|
||||
}
|
||||
|
||||
// Handle search input
|
||||
function handleSearch(event) {
|
||||
debouncedSearch(event.target.value);
|
||||
}
|
||||
|
||||
// Handle page change
|
||||
function handlePageChange(page) {
|
||||
currentPage = page;
|
||||
loadDatasets();
|
||||
}
|
||||
|
||||
// Handle page size change
|
||||
function handlePageSizeChange(event) {
|
||||
pageSize = parseInt(event.target.value);
|
||||
currentPage = 1;
|
||||
loadDatasets();
|
||||
}
|
||||
|
||||
// Update selection state based on current selection
function updateSelectionState() {
  const visibleCount = datasets.length;
  const totalCount = total;

  isAllSelected = selectedIds.size === totalCount && totalCount > 0;
  // Check membership of each visible row rather than comparing counts,
  // so selections carried over from other pages are handled correctly.
  isAllVisibleSelected = visibleCount > 0 && datasets.every(d => selectedIds.has(d.id));
}

// Handle checkbox change for individual dataset
function handleCheckboxChange(dataset, event) {
  if (event.target.checked) {
    selectedIds.add(dataset.id);
  } else {
    selectedIds.delete(dataset.id);
  }
  selectedIds = selectedIds; // Trigger reactivity
  updateSelectionState();
}

// Handle select all
async function handleSelectAll() {
  if (isAllSelected) {
    selectedIds.clear();
  } else {
    // Get all dataset IDs from API (including non-visible ones)
    try {
      const response = await api.getDatasetIds(selectedEnv, {
        search: searchQuery || undefined
      });
      response.dataset_ids.forEach(id => selectedIds.add(id));
    } catch (err) {
      console.error('[DatasetHub][Coherence:Failed] Failed to fetch all dataset IDs:', err);
      // Fallback to selecting visible datasets if API fails
      datasets.forEach(d => selectedIds.add(d.id));
    }
  }
  selectedIds = selectedIds; // Trigger reactivity
  updateSelectionState();
}

// Handle select visible
function handleSelectVisible() {
  if (isAllVisibleSelected) {
    datasets.forEach(d => selectedIds.delete(d.id));
  } else {
    datasets.forEach(d => selectedIds.add(d.id));
  }
  selectedIds = selectedIds; // Trigger reactivity
  updateSelectionState();
}

// Handle action click
function handleAction(dataset, action) {
  console.log(`[DatasetHub][Action] ${action} on dataset ${dataset.table_name}`);

  if (action === 'map_columns') {
    // Show map columns modal
    showMapColumnsModal = true;
    mapSourceType = 'postgresql';
    mapConnectionId = null;
    mapFileData = null;
  } else if (action === 'generate_docs') {
    // Show generate docs modal
    showGenerateDocsModal = true;
    llmProvider = '';
    llmOptions = {};
  }
}

// Handle bulk map columns
async function handleBulkMapColumns() {
  console.log('[DatasetHub][handleBulkMapColumns][Entry]', {
    selectedIds: Array.from(selectedIds),
    mapSourceType,
    mapConnectionId,
    mapFileData
  });

  if (selectedIds.size === 0) {
    console.log('[DatasetHub][handleBulkMapColumns] No datasets selected');
    return;
  }

  if (mapSourceType === 'postgresql' && !mapConnectionId) {
    console.log('[DatasetHub][handleBulkMapColumns] No connection ID provided for PostgreSQL');
    return;
  }

  if (mapSourceType === 'xlsx' && (!mapFileData || mapFileData.length === 0)) {
    console.log('[DatasetHub][handleBulkMapColumns] No file selected for XLSX');
    return;
  }

  try {
    let fileData = null;
    if (mapSourceType === 'xlsx' && mapFileData && mapFileData.length > 0) {
      // For now we send the filename as a placeholder or handle upload if needed.
      // The backend expects a string 'file_data' in the current schema.
      fileData = mapFileData[0].name;
    }

    const response = await api.postApi('/datasets/map-columns', {
      env_id: selectedEnv,
      dataset_ids: Array.from(selectedIds),
      source_type: mapSourceType,
      connection_id: mapConnectionId || undefined,
      file_data: fileData || undefined
    });
    console.log('[DatasetHub][Action] Bulk map columns task created:', response.task_id);

    // Close modal and open task drawer
    showMapColumnsModal = false;
    selectedIds.clear();
    updateSelectionState();

    if (response.task_id) {
      openDrawerForTask(response.task_id);
    }
  } catch (err) {
    console.error('[DatasetHub][Coherence:Failed]', err);
    alert('Failed to create mapping task');
  }
}

// Handle bulk generate docs
async function handleBulkGenerateDocs() {
  if (selectedIds.size === 0) return;
  if (!llmProvider) {
    alert('Please select an LLM provider');
    return;
  }

  try {
    const response = await api.postApi('/datasets/generate-docs', {
      env_id: selectedEnv,
      dataset_ids: Array.from(selectedIds),
      llm_provider: llmProvider,
      options: llmOptions
    });
    console.log('[DatasetHub][Action] Bulk generate docs task created:', response.task_id);

    // Close modal and open task drawer
    showGenerateDocsModal = false;
    selectedIds.clear();
    updateSelectionState();

    if (response.task_id) {
      openDrawerForTask(response.task_id);
    }
  } catch (err) {
    console.error('[DatasetHub][Coherence:Failed]', err);
    alert('Failed to create documentation generation task');
  }
}

// Handle task status click - open Task Drawer
function handleTaskStatusClick(dataset) {
  if (dataset.lastTask?.id) {
    console.log(`[DatasetHub][Action] Open task drawer for task ${dataset.lastTask.id}`);
    openDrawerForTask(dataset.lastTask.id);
  }
}

// Get task status icon
function getTaskStatusIcon(status) {
  if (!status) return '';
  switch (status.toLowerCase()) {
    case 'running':
      return '<svg class="animate-spin" width="16" height="16" viewBox="0 0 24 24"><path fill="currentColor" d="M12 2a10 10 0 1 0 10 10A10 10 0 0 0 12 2zm0 18a8 8 0 1 1 8-8 8 8 0 0 1-8 8z"/></svg>';
    case 'success':
      return '<svg width="16" height="16" viewBox="0 0 24 24" fill="currentColor"><path d="M9 16.17L4.83 12l-1.42 1.41L9 19 21 7l-1.41-1.41L9 16.17z"/></svg>';
    case 'error':
      return '<svg width="16" height="16" viewBox="0 0 24 24" fill="currentColor"><path d="M12 2C6.48 2 2 6.48 2 12s4.48 10 10 10 10-4.48 10-10S17.52 2 12 2zm1 15h-2v-2h2v2zm0-4h-2V7h2v6z"/></svg>';
    case 'waiting_input':
      return '<svg width="16" height="16" viewBox="0 0 24 24" fill="currentColor"><path d="M12 2C6.48 2 2 6.48 2 12s4.48 10 10 10 10-4.48 10-10S17.52 2 12 2zm1 15h-2v-2h2v2zm0-4h-2V7h2v6z"/></svg>';
    default:
      return '';
  }
}

// Get mapping progress bar class
function getMappingProgressClass(mapped, total) {
  if (!mapped || !total) return 'bg-gray-200';
  const percentage = (mapped / total) * 100;
  if (percentage === 100) {
    return 'bg-green-500';
  } else if (percentage >= 50) {
    return 'bg-yellow-400';
  } else {
    return 'bg-blue-400';
  }
}
</script>

<style>
  .container {
    @apply max-w-7xl mx-auto px-4 py-6;
  }

  .header {
    @apply flex items-center justify-between mb-6;
  }

  .title {
    @apply text-2xl font-bold text-gray-900;
  }

  .env-selector {
    @apply flex items-center space-x-4;
  }

  .env-dropdown {
    @apply px-4 py-2 border border-gray-300 rounded-lg bg-white focus:outline-none focus:ring-2 focus:ring-blue-500;
  }

  .refresh-btn {
    @apply px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 transition-colors;
  }

  .search-input {
    @apply px-4 py-2 border border-gray-300 rounded-lg bg-white focus:outline-none focus:ring-2 focus:ring-blue-500;
  }

  .error-banner {
    @apply bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4 flex items-center justify-between;
  }

  .retry-btn {
    @apply px-4 py-2 bg-red-600 text-white rounded hover:bg-red-700 transition-colors;
  }

  .toolbar {
    @apply flex items-center justify-between mb-4 gap-4;
  }

  .selection-buttons {
    @apply flex items-center gap-2;
  }

  .dataset-grid {
    @apply bg-white border border-gray-200 rounded-lg overflow-hidden;
  }

  .grid-header {
    @apply grid grid-cols-12 gap-4 px-6 py-3 bg-gray-50 border-b border-gray-200 font-semibold text-sm text-gray-700;
  }

  .grid-row {
    @apply grid grid-cols-12 gap-4 px-6 py-4 border-b border-gray-200 hover:bg-gray-50 transition-colors;
  }

  .grid-row:last-child {
    @apply border-b-0;
  }

  .col-checkbox {
    @apply col-span-1;
  }

  .col-table-name {
    @apply col-span-3 font-medium text-gray-900;
  }

  .col-schema {
    @apply col-span-2;
  }

  .col-mapping {
    @apply col-span-2;
  }

  .col-task {
    @apply col-span-3;
  }

  .col-actions {
    @apply col-span-1;
  }

  .mapping-progress {
    @apply w-24 h-2 rounded-full overflow-hidden;
  }

  .mapping-bar {
    @apply h-full transition-all duration-300;
  }

  .task-status {
    @apply inline-flex items-center space-x-2 cursor-pointer hover:text-blue-600 transition-colors;
  }

  .action-btn {
    @apply px-3 py-1 text-sm border border-gray-300 rounded hover:bg-gray-100 transition-colors;
  }

  .action-btn.primary {
    @apply bg-blue-600 text-white border-blue-600 hover:bg-blue-700;
  }

  .empty-state {
    @apply py-12 text-center text-gray-500;
  }

  .skeleton {
    @apply animate-pulse bg-gray-200 rounded;
  }

  .floating-panel {
    @apply fixed bottom-0 left-0 right-0 bg-white border-t border-gray-200 shadow-lg p-4 transition-transform transform translate-y-full;
  }

  .floating-panel.visible {
    @apply transform translate-y-0;
  }

  .modal-overlay {
    @apply fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50;
  }

  .modal {
    @apply bg-white rounded-lg shadow-xl max-w-2xl w-full mx-4 max-h-[80vh] overflow-y-auto;
  }

  .modal-header {
    @apply px-6 py-4 border-b border-gray-200 flex items-center justify-between relative;
  }

  .close-modal-btn {
    @apply absolute top-4 right-4 p-2 text-gray-400 hover:text-gray-600 hover:bg-gray-100 rounded-full transition-all;
  }

  .modal-body {
    @apply px-6 py-4;
  }

  .modal-footer {
    @apply px-6 py-4 border-t border-gray-200 flex justify-end gap-3;
  }

  .pagination {
    @apply flex items-center justify-between px-4 py-3 bg-gray-50 border-t border-gray-200;
  }

  .pagination-info {
    @apply text-sm text-gray-600;
  }

  .pagination-controls {
    @apply flex items-center gap-2;
  }

  .page-btn {
    @apply px-3 py-1 border border-gray-300 rounded hover:bg-gray-100 disabled:opacity-50 disabled:cursor-not-allowed;
  }

  .page-btn.active {
    @apply bg-blue-600 text-white border-blue-600;
  }
</style>

<div class="container">
  <!-- Header -->
  <div class="header">
    <h1 class="title">{$t.nav?.datasets || 'Datasets'}</h1>
    <div class="env-selector">
      <select class="env-dropdown" bind:value={selectedEnv} on:change={handleEnvChange}>
        {#each environments as env}
          <option value={env.id}>{env.name}</option>
        {/each}
      </select>
      <button class="refresh-btn" on:click={loadDatasets}>
        {$t.common?.refresh || 'Refresh'}
      </button>
    </div>
  </div>

  <!-- Error Banner -->
  {#if error}
    <div class="error-banner">
      <span>{error}</span>
      <button class="retry-btn" on:click={loadDatasets}>
        {$t.common?.retry || 'Retry'}
      </button>
    </div>
  {/if}

  <!-- Loading State -->
  {#if isLoading}
    <div class="dataset-grid">
      <div class="grid-header">
        <div class="col-checkbox skeleton h-4"></div>
        <div class="col-table-name skeleton h-4"></div>
        <div class="col-schema skeleton h-4"></div>
        <div class="col-mapping skeleton h-4"></div>
        <div class="col-task skeleton h-4"></div>
        <div class="col-actions skeleton h-4"></div>
      </div>
      {#each Array(5) as _}
        <div class="grid-row">
          <div class="col-checkbox skeleton h-4"></div>
          <div class="col-table-name skeleton h-4"></div>
          <div class="col-schema skeleton h-4"></div>
          <div class="col-mapping skeleton h-4"></div>
          <div class="col-task skeleton h-4"></div>
          <div class="col-actions skeleton h-4"></div>
        </div>
      {/each}
    </div>
  {:else if datasets.length === 0}
    <!-- Empty State -->
    <div class="empty-state">
      <svg class="w-16 h-16 mx-auto mb-4 text-gray-400" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
        <path d="M3 3h18v18H3V3zm16 16V5H5v14h14z"/>
      </svg>
      <p>{$t.datasets?.empty || 'No datasets found'}</p>
    </div>
  {:else}
    <!-- Toolbar -->
    <div class="toolbar">
      <div class="selection-buttons">
        <button
          class="action-btn"
          on:click={handleSelectAll}
          disabled={total === 0}
        >
          {isAllSelected ? 'Deselect All' : 'Select All'}
        </button>
        <button
          class="action-btn"
          on:click={handleSelectVisible}
          disabled={datasets.length === 0}
        >
          {isAllVisibleSelected ? 'Deselect Visible' : 'Select Visible'}
        </button>
        {#if selectedIds.size > 0}
          <span class="text-sm text-gray-600">
            {selectedIds.size} selected
          </span>
        {/if}
      </div>
      <div>
        <input
          type="text"
          class="search-input"
          placeholder="Search datasets..."
          on:input={handleSearch}
          value={searchQuery}
        />
      </div>
    </div>

    <!-- Dataset Grid -->
    <div class="dataset-grid">
      <!-- Grid Header -->
      <div class="grid-header">
        <div class="col-checkbox"></div>
        <div class="col-table-name">{$t.datasets?.table_name || 'Table Name'}</div>
        <div class="col-schema">{$t.datasets?.schema || 'Schema'}</div>
        <div class="col-mapping">{$t.datasets?.mapped_fields || 'Mapped Fields'}</div>
        <div class="col-task">{$t.datasets?.last_task || 'Last Task'}</div>
        <div class="col-actions">{$t.datasets?.actions || 'Actions'}</div>
      </div>

      <!-- Grid Rows -->
      {#each datasets as dataset}
        <div class="grid-row">
          <!-- Checkbox -->
          <div class="col-checkbox">
            <input
              type="checkbox"
              checked={selectedIds.has(dataset.id)}
              on:change={(e) => handleCheckboxChange(dataset, e)}
            />
          </div>

          <!-- Table Name -->
          <div class="col-table-name">
            <a
              href={`/datasets/${dataset.id}?env_id=${selectedEnv}`}
              class="text-blue-600 hover:text-blue-800 hover:underline"
            >
              {dataset.table_name}
            </a>
          </div>

          <!-- Schema -->
          <div class="col-schema">
            {dataset.schema}
          </div>

          <!-- Mapping Progress -->
          <div class="col-mapping">
            {#if dataset.mappedFields}
              <div class="mapping-progress" title="{$t.datasets?.mapped_of_total || 'Mapped of total'}: {dataset.mappedFields.mapped} / {dataset.mappedFields.total}">
                <div class="mapping-bar {getMappingProgressClass(dataset.mappedFields.mapped, dataset.mappedFields.total)}" style="width: {dataset.mappedFields.mapped / dataset.mappedFields.total * 100}%"></div>
              </div>
            {:else}
              <span class="text-gray-400">-</span>
            {/if}
          </div>

          <!-- Last Task -->
          <div class="col-task">
            {#if dataset.lastTask}
              <div
                class="task-status"
                on:click={() => handleTaskStatusClick(dataset)}
                role="button"
                tabindex="0"
                aria-label={$t.datasets?.view_task || 'View task'}
              >
                {@html getTaskStatusIcon(dataset.lastTask.status)}
                <span>
                  <!-- status may be null; optional chaining avoids a TypeError -->
                  {#if dataset.lastTask.status?.toLowerCase() === 'running'}
                    {$t.datasets?.task_running || 'Running...'}
                  {:else if dataset.lastTask.status?.toLowerCase() === 'success'}
                    {$t.datasets?.task_done || 'Done'}
                  {:else if dataset.lastTask.status?.toLowerCase() === 'error'}
                    {$t.datasets?.task_failed || 'Failed'}
                  {:else if dataset.lastTask.status?.toLowerCase() === 'waiting_input'}
                    {$t.datasets?.task_waiting || 'Waiting'}
                  {/if}
                </span>
              </div>
            {:else}
              <span class="text-gray-400">-</span>
            {/if}
          </div>

          <!-- Actions -->
          <div class="col-actions">
            {#if dataset.actions.includes('map_columns')}
              <button
                class="action-btn primary"
                on:click={() => handleAction(dataset, 'map_columns')}
                aria-label={$t.datasets?.action_map_columns || 'Map Columns'}
              >
                {$t.datasets?.action_map_columns || 'Map Columns'}
              </button>
            {/if}
          </div>
        </div>
      {/each}
    </div>

    <!-- Pagination -->
    {#if totalPages > 1}
      <div class="pagination">
        <div class="pagination-info">
          Showing {((currentPage - 1) * pageSize) + 1}-{Math.min(currentPage * pageSize, total)} of {total}
        </div>
        <div class="pagination-controls">
          <button
            class="page-btn"
            on:click={() => handlePageChange(1)}
            disabled={currentPage === 1}
          >
            First
          </button>
          <button
            class="page-btn"
            on:click={() => handlePageChange(currentPage - 1)}
            disabled={currentPage === 1}
          >
            Previous
          </button>
          {#each Array.from({length: totalPages}, (_, i) => i + 1) as pageNum}
            <button
              class="page-btn {pageNum === currentPage ? 'active' : ''}"
              on:click={() => handlePageChange(pageNum)}
            >
              {pageNum}
            </button>
          {/each}
          <button
            class="page-btn"
            on:click={() => handlePageChange(currentPage + 1)}
            disabled={currentPage === totalPages}
          >
            Next
          </button>
          <button
            class="page-btn"
            on:click={() => handlePageChange(totalPages)}
            disabled={currentPage === totalPages}
          >
            Last
          </button>
        </div>
        <div>
          <select
            class="env-dropdown"
            value={pageSize}
            on:change={handlePageSizeChange}
          >
            <option value={5}>5 per page</option>
            <option value={10}>10 per page</option>
            <option value={25}>25 per page</option>
            <option value={50}>50 per page</option>
            <option value={100}>100 per page</option>
          </select>
        </div>
      </div>
    {/if}

    <!-- Floating Bulk Action Panel -->
    {#if selectedIds.size > 0}
      <div class="floating-panel visible">
        <div class="flex items-center justify-between max-w-7xl mx-auto">
          <div class="flex items-center gap-4">
            <span class="font-medium">
              ✓ {selectedIds.size} selected
            </span>
          </div>
          <div class="flex gap-3">
            <button
              class="action-btn primary"
              on:click={() => showMapColumnsModal = true}
            >
              Map Columns
            </button>
            <button
              class="action-btn primary"
              on:click={() => showGenerateDocsModal = true}
            >
              Generate Docs
            </button>
            <button
              class="action-btn"
              on:click={() => { selectedIds.clear(); selectedIds = selectedIds; updateSelectionState(); }}
            >
              Cancel
            </button>
          </div>
        </div>
      </div>
    {/if}
  {/if}

  <!-- Map Columns Modal -->
  {#if showMapColumnsModal}
    <div class="modal-overlay" on:click={() => showMapColumnsModal = false}>
      <div class="modal" on:click|stopPropagation>
        <div class="modal-header">
          <h2 class="text-xl font-bold">Bulk Column Mapping</h2>
          <button on:click={() => showMapColumnsModal = false} class="close-modal-btn" aria-label="Close modal">
            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
              <line x1="18" y1="6" x2="6" y2="18"></line>
              <line x1="6" y1="6" x2="18" y2="18"></line>
            </svg>
          </button>
        </div>
        <div class="modal-body">
          <div class="space-y-4">
            <div>
              <label class="block text-sm font-medium mb-2">Source Type</label>
              <select
                class="env-dropdown w-full"
                bind:value={mapSourceType}
              >
                <option value="postgresql">PostgreSQL Comments</option>
                <option value="xlsx">XLSX File</option>
              </select>
            </div>
            {#if mapSourceType === 'postgresql'}
              <div>
                <label class="block text-sm font-medium mb-2">Connection ID</label>
                <input
                  type="text"
                  class="search-input w-full"
                  placeholder="Enter connection ID..."
                  bind:value={mapConnectionId}
                />
              </div>
            {:else}
              <div>
                <label class="block text-sm font-medium mb-2">XLSX File</label>
                <input
                  type="file"
                  class="w-full"
                  accept=".xlsx,.xls"
                  bind:files={mapFileData}
                  bind:this={mapFileInput}
                />
              </div>
            {/if}
            <div>
              <label class="block text-sm font-medium mb-2">Selected Datasets</label>
              <div class="max-h-40 overflow-y-auto">
                {#each Array.from(selectedIds) as id}
                  {#each datasets as d}
                    {#if d.id === id}
                      <div class="text-sm py-1 border-b border-gray-200">{d.table_name}</div>
                    {/if}
                  {/each}
                {/each}
              </div>
            </div>
          </div>
        </div>
        <div class="modal-footer">
          <button class="action-btn" on:click={() => showMapColumnsModal = false}>Cancel</button>
          <button
            type="button"
            class="action-btn primary"
            on:click|preventDefault={handleBulkMapColumns}
            disabled={selectedIds.size === 0 || (mapSourceType === 'postgresql' && !mapConnectionId) || (mapSourceType === 'xlsx' && (!mapFileData || mapFileData.length === 0))}
          >
            Start Mapping
          </button>
        </div>
      </div>
    </div>
  {/if}

  <!-- Generate Docs Modal -->
  {#if showGenerateDocsModal}
    <div class="modal-overlay" on:click={() => showGenerateDocsModal = false}>
      <div class="modal" on:click|stopPropagation>
        <div class="modal-header">
          <h2 class="text-xl font-bold">Bulk Documentation Generation</h2>
          <button on:click={() => showGenerateDocsModal = false} class="close-modal-btn" aria-label="Close modal">
            <svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
              <line x1="18" y1="6" x2="6" y2="18"></line>
              <line x1="6" y1="6" x2="18" y2="18"></line>
            </svg>
          </button>
        </div>
        <div class="modal-body">
          <div class="space-y-4">
            <div>
              <label class="block text-sm font-medium mb-2">LLM Provider</label>
              <select
                class="env-dropdown w-full"
                bind:value={llmProvider}
              >
                <option value="">Select LLM provider...</option>
                <option value="openai">OpenAI</option>
                <option value="anthropic">Anthropic</option>
                <option value="cohere">Cohere</option>
              </select>
            </div>
            <div>
              <label class="block text-sm font-medium mb-2">Selected Datasets</label>
              <div class="max-h-40 overflow-y-auto">
                {#each Array.from(selectedIds) as id}
                  {#each datasets as d}
                    {#if d.id === id}
                      <div class="text-sm py-1 border-b border-gray-200">{d.table_name}</div>
                    {/if}
                  {/each}
                {/each}
              </div>
            </div>
          </div>
        </div>
        <div class="modal-footer">
          <button class="action-btn" on:click={() => showGenerateDocsModal = false}>Cancel</button>
          <button
            class="action-btn primary"
            on:click={handleBulkGenerateDocs}
            disabled={!llmProvider || selectedIds.size === 0}
          >
            Generate Documentation
          </button>
        </div>
      </div>
    </div>
  {/if}
</div>

<!-- [/DEF:DatasetHub:Page] -->
418
frontend/src/routes/datasets/[id]/+page.svelte
Normal file
@@ -0,0 +1,418 @@
<!-- [DEF:DatasetDetail:Page] -->
<script>
/**
 * @TIER: CRITICAL
 * @PURPOSE: Dataset Detail View - Shows detailed dataset information with columns, SQL, and linked dashboards
 * @LAYER: UI
 * @RELATION: BINDS_TO -> sidebarStore
 * @INVARIANT: Always shows dataset details when loaded
 *
 * @UX_STATE: Loading -> Shows skeleton loader
 * @UX_STATE: Loaded -> Shows dataset details with columns and linked dashboards
 * @UX_STATE: Error -> Shows error banner with retry button
 * @UX_FEEDBACK: Clicking linked dashboard navigates to dashboard detail
 * @UX_RECOVERY: Refresh button reloads dataset details
 */

import { onMount } from 'svelte';
import { goto } from '$app/navigation';
import { page } from '$app/stores';
import { t } from '$lib/i18n';
import { api } from '$lib/api.js';
import { openDrawerForTask } from '$lib/stores/taskDrawer.js';

// Get dataset ID from URL params
$: datasetId = $page.params.id;
$: envId = $page.url.searchParams.get('env_id') || '';

// State
let dataset = null;
let isLoading = true;
let error = null;

// Load dataset details on mount
onMount(async () => {
  await loadDatasetDetail();
});

// Load dataset details from API
async function loadDatasetDetail() {
  if (!datasetId || !envId) {
    error = 'Missing dataset ID or environment ID';
    isLoading = false;
    return;
  }

  isLoading = true;
  error = null;
  try {
    const response = await api.getDatasetDetail(envId, datasetId);
    dataset = response;
  } catch (err) {
    error = err.message || 'Failed to load dataset details';
    console.error('[DatasetDetail][Coherence:Failed]', err);
  } finally {
    isLoading = false;
  }
}

// Navigate to linked dashboard
function navigateToDashboard(dashboardId) {
  goto(`/dashboards/${dashboardId}?env_id=${envId}`);
}

// Navigate back to dataset list
function goBack() {
  goto(`/datasets?env_id=${envId}`);
}
|
||||
|
||||
// Get column type icon/color
|
||||
function getColumnTypeClass(type) {
|
||||
if (!type) return 'text-gray-500';
|
||||
const lowerType = type.toLowerCase();
|
||||
if (lowerType.includes('int') || lowerType.includes('float') || lowerType.includes('num')) {
|
||||
return 'text-blue-600 bg-blue-50';
|
||||
} else if (lowerType.includes('date') || lowerType.includes('time')) {
|
||||
return 'text-green-600 bg-green-50';
|
||||
} else if (lowerType.includes('str') || lowerType.includes('text') || lowerType.includes('char')) {
|
||||
return 'text-purple-600 bg-purple-50';
|
||||
} else if (lowerType.includes('bool')) {
|
||||
return 'text-orange-600 bg-orange-50';
|
||||
}
|
||||
return 'text-gray-600 bg-gray-50';
|
||||
}
|
||||
|
||||
// Get mapping progress percentage
|
||||
function getMappingProgress(column) {
|
||||
// Placeholder: In real implementation, this would check if column has mapping
|
||||
return column.description ? 100 : 0;
|
||||
}
|
||||
</script>
|
||||
|
||||
<style>
|
||||
.container {
|
||||
@apply max-w-7xl mx-auto px-4 py-6;
|
||||
}
|
||||
|
||||
.header {
|
||||
@apply flex items-center justify-between mb-6;
|
||||
}
|
||||
|
||||
.back-btn {
|
||||
@apply flex items-center gap-2 text-gray-600 hover:text-gray-900 transition-colors;
|
||||
}
|
||||
|
||||
.title {
|
||||
@apply text-2xl font-bold text-gray-900;
|
||||
}
|
||||
|
||||
.subtitle {
|
||||
@apply text-sm text-gray-500 mt-1;
|
||||
}
|
||||
|
||||
.detail-grid {
|
||||
@apply grid grid-cols-1 lg:grid-cols-3 gap-6;
|
||||
}
|
||||
|
||||
.detail-card {
|
||||
@apply bg-white border border-gray-200 rounded-lg p-6;
|
||||
}
|
||||
|
||||
.card-title {
|
||||
@apply text-lg font-semibold text-gray-900 mb-4;
|
||||
}
|
||||
|
||||
.info-row {
|
||||
@apply flex justify-between py-2 border-b border-gray-100 last:border-0;
|
||||
}
|
||||
|
||||
.info-label {
|
||||
@apply text-sm text-gray-500;
|
||||
}
|
||||
|
||||
.info-value {
|
||||
@apply text-sm font-medium text-gray-900;
|
||||
}
|
||||
|
||||
.columns-section {
|
||||
@apply lg:col-span-2;
|
||||
}
|
||||
|
||||
.columns-grid {
|
||||
@apply grid grid-cols-1 md:grid-cols-2 gap-3;
|
||||
}
|
||||
|
||||
.column-item {
|
||||
@apply p-3 border border-gray-200 rounded-lg hover:border-blue-300 transition-colors;
|
||||
}
|
||||
|
||||
.column-header {
|
||||
@apply flex items-center justify-between mb-2;
|
||||
}
|
||||
|
||||
.column-name {
|
||||
@apply font-medium text-gray-900;
|
||||
}
|
||||
|
||||
.column-type {
|
||||
@apply text-xs px-2 py-1 rounded;
|
||||
}
|
||||
|
||||
.column-meta {
|
||||
@apply flex items-center gap-2 text-xs text-gray-500;
|
||||
}
|
||||
|
||||
.column-description {
|
||||
@apply text-sm text-gray-600 mt-2;
|
||||
}
|
||||
|
||||
.mapping-badge {
|
||||
@apply inline-flex items-center px-2 py-0.5 text-xs rounded-full;
|
||||
}
|
||||
|
||||
.mapping-badge.mapped {
|
||||
@apply bg-green-100 text-green-800;
|
||||
}
|
||||
|
||||
.mapping-badge.unmapped {
|
||||
@apply bg-gray-100 text-gray-600;
|
||||
}
|
||||
|
||||
.linked-dashboards-list {
|
||||
@apply space-y-2;
|
||||
}
|
||||
|
||||
.linked-dashboard-item {
|
||||
@apply flex items-center gap-3 p-3 border border-gray-200 rounded-lg hover:bg-gray-50 cursor-pointer transition-colors;
|
||||
}
|
||||
|
||||
.dashboard-icon {
|
||||
@apply w-8 h-8 bg-blue-100 rounded-lg flex items-center justify-center text-blue-600;
|
||||
}
|
||||
|
||||
.dashboard-info {
|
||||
@apply flex-1;
|
||||
}
|
||||
|
||||
.dashboard-title {
|
||||
@apply font-medium text-gray-900;
|
||||
}
|
||||
|
||||
.dashboard-id {
|
||||
@apply text-xs text-gray-500;
|
||||
}
|
||||
|
||||
.sql-section {
|
||||
@apply mt-6;
|
||||
}
|
||||
|
||||
.sql-code {
|
||||
@apply bg-gray-900 text-gray-100 p-4 rounded-lg overflow-x-auto text-sm font-mono;
|
||||
}
|
||||
|
||||
.empty-state {
|
||||
@apply py-8 text-center text-gray-500;
|
||||
}
|
||||
|
||||
.skeleton {
|
||||
@apply animate-pulse bg-gray-200 rounded;
|
||||
}
|
||||
|
||||
.error-banner {
|
||||
@apply bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4 flex items-center justify-between;
|
||||
}
|
||||
|
||||
.retry-btn {
|
||||
@apply px-4 py-2 bg-red-600 text-white rounded hover:bg-red-700 transition-colors;
|
||||
}
|
||||
</style>
|
||||
|
||||
<div class="container">
  <!-- Header -->
  <div class="header">
    <div>
      <button class="back-btn" on:click={goBack}>
        <svg width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
          <path d="M19 12H5M12 19l-7-7 7-7"/>
        </svg>
        {$t.common?.back || 'Back to Datasets'}
      </button>
      {#if dataset}
        <h1 class="title mt-4">{dataset.table_name}</h1>
        <p class="subtitle">{dataset.schema} • {dataset.database}</p>
      {:else if !isLoading}
        <h1 class="title mt-4">{$t.datasets?.detail_title || 'Dataset Details'}</h1>
      {/if}
    </div>
    <button class="retry-btn" on:click={loadDatasetDetail}>
      {$t.common?.refresh || 'Refresh'}
    </button>
  </div>

  <!-- Error Banner -->
  {#if error}
    <div class="error-banner">
      <span>{error}</span>
      <button class="retry-btn" on:click={loadDatasetDetail}>
        {$t.common?.retry || 'Retry'}
      </button>
    </div>
  {/if}

  <!-- Loading State -->
  {#if isLoading}
    <div class="detail-grid">
      <div class="detail-card">
        <div class="skeleton h-6 w-1/2 mb-4"></div>
        {#each Array(5) as _}
          <div class="info-row">
            <div class="skeleton h-4 w-20"></div>
            <div class="skeleton h-4 w-32"></div>
          </div>
        {/each}
      </div>
      <div class="detail-card columns-section">
        <div class="skeleton h-6 w-1/3 mb-4"></div>
        <div class="columns-grid">
          {#each Array(4) as _}
            <div class="column-item">
              <div class="skeleton h-4 w-full mb-2"></div>
              <div class="skeleton h-3 w-16"></div>
            </div>
          {/each}
        </div>
      </div>
    </div>
  {:else if dataset}
    <div class="detail-grid">
      <!-- Dataset Info Card -->
      <div class="detail-card">
        <h2 class="card-title">{$t.datasets?.info || 'Dataset Information'}</h2>
        <div class="info-row">
          <span class="info-label">{$t.datasets?.table_name || 'Table Name'}</span>
          <span class="info-value">{dataset.table_name}</span>
        </div>
        <div class="info-row">
          <span class="info-label">{$t.datasets?.schema || 'Schema'}</span>
          <span class="info-value">{dataset.schema || '-'}</span>
        </div>
        <div class="info-row">
          <span class="info-label">{$t.datasets?.database || 'Database'}</span>
          <span class="info-value">{dataset.database}</span>
        </div>
        <div class="info-row">
          <span class="info-label">{$t.datasets?.columns_count || 'Columns'}</span>
          <span class="info-value">{dataset.column_count}</span>
        </div>
        <div class="info-row">
          <span class="info-label">{$t.datasets?.linked_dashboards || 'Linked Dashboards'}</span>
          <span class="info-value">{dataset.linked_dashboard_count}</span>
        </div>
        {#if dataset.is_sqllab_view}
          <div class="info-row">
            <span class="info-label">{$t.datasets?.type || 'Type'}</span>
            <span class="info-value">SQL Lab View</span>
          </div>
        {/if}
        {#if dataset.created_on}
          <div class="info-row">
            <span class="info-label">{$t.datasets?.created || 'Created'}</span>
            <span class="info-value">{new Date(dataset.created_on).toLocaleDateString()}</span>
          </div>
        {/if}
        {#if dataset.changed_on}
          <div class="info-row">
            <span class="info-label">{$t.datasets?.updated || 'Updated'}</span>
            <span class="info-value">{new Date(dataset.changed_on).toLocaleDateString()}</span>
          </div>
        {/if}
      </div>

      <!-- Linked Dashboards Card -->
      {#if dataset.linked_dashboards && dataset.linked_dashboards.length > 0}
        <div class="detail-card">
          <h2 class="card-title">{$t.datasets?.linked_dashboards || 'Linked Dashboards'} ({dataset.linked_dashboard_count})</h2>
          <div class="linked-dashboards-list">
            {#each dataset.linked_dashboards as dashboard}
              <div
                class="linked-dashboard-item"
                on:click={() => navigateToDashboard(dashboard.id)}
                role="button"
                tabindex="0"
              >
                <div class="dashboard-icon">
                  <svg width="16" height="16" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
                    <rect x="3" y="3" width="18" height="18" rx="2" ry="2"/>
                    <line x1="3" y1="9" x2="21" y2="9"/>
                    <line x1="9" y1="21" x2="9" y2="9"/>
                  </svg>
                </div>
                <div class="dashboard-info">
                  <div class="dashboard-title">{dashboard.title}</div>
                  <div class="dashboard-id">ID: {dashboard.id}{#if dashboard.slug} • {dashboard.slug}{/if}</div>
                </div>
                <svg width="16" height="16" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" class="text-gray-400">
                  <path d="M9 18l6-6-6-6"/>
                </svg>
              </div>
            {/each}
          </div>
        </div>
      {/if}

      <!-- Columns Card -->
      <div class="detail-card columns-section">
        <h2 class="card-title">{$t.datasets?.columns || 'Columns'} ({dataset.column_count})</h2>
        {#if dataset.columns && dataset.columns.length > 0}
          <div class="columns-grid">
            {#each dataset.columns as column}
              <div class="column-item">
                <div class="column-header">
                  <span class="column-name">{column.name}</span>
                  {#if column.type}
                    <span class="column-type {getColumnTypeClass(column.type)}">{column.type}</span>
                  {/if}
                </div>
                <div class="column-meta">
                  {#if column.is_dttm}
                    <span class="text-xs text-green-600">📅 Date/Time</span>
                  {/if}
                  {#if !column.is_active}
                    <span class="text-xs text-gray-400">(Inactive)</span>
                  {/if}
                  <span class="mapping-badge {column.description ? 'mapped' : 'unmapped'}">
                    {column.description ? '✓ Mapped' : 'Unmapped'}
                  </span>
                </div>
                {#if column.description}
                  <p class="column-description">{column.description}</p>
                {/if}
              </div>
            {/each}
          </div>
        {:else}
          <div class="empty-state">
            {$t.datasets?.no_columns || 'No columns found'}
          </div>
        {/if}
      </div>

      <!-- SQL Section (for SQL Lab views) -->
      {#if dataset.sql}
        <div class="detail-card sql-section lg:col-span-3">
          <h2 class="card-title">{$t.datasets?.sql_query || 'SQL Query'}</h2>
          <pre class="sql-code">{dataset.sql}</pre>
        </div>
      {/if}
    </div>
  {:else}
    <div class="empty-state">
      <svg class="w-16 h-16 mx-auto mb-4 text-gray-400" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
        <path d="M3 3h18v18H3V3zm16 16V5H5v14h14z"/>
      </svg>
      <p>{$t.datasets?.not_found || 'Dataset not found'}</p>
    </div>
  {/if}
</div>

<!-- [/DEF:DatasetDetail:Page] -->
@@ -1,295 +1,539 @@
<!-- [DEF:SettingsPage:Page] -->
<script>
  import { onMount } from 'svelte';
  import { updateGlobalSettings, addEnvironment, updateEnvironment, deleteEnvironment, testEnvironmentConnection, updateStorageSettings } from '../../lib/api';
  import { addToast } from '../../lib/toasts';
  import { t } from '$lib/i18n';
  import { Button, Input, Card, PageHeader } from '$lib/ui';
  /**
   * @TIER: CRITICAL
   * @PURPOSE: Consolidated Settings Page - All settings in one place with tabbed navigation
   * @LAYER: UI
   * @RELATION: BINDS_TO -> sidebarStore
   * @INVARIANT: Always shows tabbed interface with all settings categories
   *
   * @UX_STATE: Loading -> Shows skeleton loader
   * @UX_STATE: Loaded -> Shows tabbed settings interface
   * @UX_STATE: Error -> Shows error banner with retry button
   * @UX_FEEDBACK: Toast notifications on save success/failure
   * @UX_RECOVERY: Refresh button reloads settings data
   */

  /** @type {import('./$types').PageData} */
  export let data;
  import { onMount } from 'svelte';
  import { t } from '$lib/i18n';
  import { api } from '$lib/api.js';
  import { addToast } from '$lib/toasts';
  import ProviderConfig from '../../components/llm/ProviderConfig.svelte';

  let settings = data.settings || {
    environments: [],
    settings: {
      storage: {
        root_path: '',
        backup_structure_pattern: '',
        repo_structure_pattern: '',
        filename_pattern: ''
      }
    }
  // State
  let activeTab = 'environments';
  let settings = null;
  let isLoading = true;
  let error = null;

  // Environment editing state
  let editingEnvId = null;
  let isAddingEnv = false;
  let newEnv = {
    id: '',
    name: '',
    url: '',
    username: '',
    password: '',
    is_default: false,
    backup_schedule: {
      enabled: false,
      cron_expression: '0 0 * * *'
    }
  };

  // Load settings on mount
  onMount(async () => {
    await loadSettings();
  });

  // Load consolidated settings from API
  async function loadSettings() {
    isLoading = true;
    error = null;
    try {
      const response = await api.getConsolidatedSettings();
      settings = response;
    } catch (err) {
      error = err.message || 'Failed to load settings';
      console.error('[SettingsPage][Coherence:Failed]', err);
    } finally {
      isLoading = false;
    }
  }

  // Handle tab change
  function handleTabChange(tab) {
    activeTab = tab;
  }

  // Get tab class
  function getTabClass(tab) {
    return activeTab === tab
      ? 'text-blue-600 border-b-2 border-blue-600'
      : 'text-gray-600 hover:text-gray-800 border-transparent hover:border-gray-300';
  }

  // Handle global settings save (Logging, Storage)
  async function handleSave() {
    console.log('[SettingsPage][Action] Saving settings');
    try {
      // In a real app we might want to only send the changed section,
      // but updateConsolidatedSettings expects full object or we can use specific endpoints.
      // For now we use the consolidated update.
      await api.updateConsolidatedSettings(settings);
      addToast($t.settings?.save_success || 'Settings saved', 'success');
    } catch (err) {
      console.error('[SettingsPage][Coherence:Failed]', err);
      addToast($t.settings?.save_failed || 'Failed to save settings', 'error');
    }
  }

  // Handle environment actions
  async function handleTestEnv(id) {
    console.log(`[SettingsPage][Action] Test environment ${id}`);
    addToast('Testing connection...', 'info');
    try {
      const result = await api.testEnvironmentConnection(id);
      if (result.status === 'success') {
        addToast('Connection successful', 'success');
      } else {
        addToast(`Connection failed: ${result.message}`, 'error');
      }
    } catch (err) {
      console.error('[SettingsPage][Coherence:Failed] Error testing connection:', err);
      addToast('Failed to test connection', 'error');
    }
  }

  function editEnv(env) {
    console.log(`[SettingsPage][Action] Edit environment ${env.id}`);
    newEnv = JSON.parse(JSON.stringify(env)); // Deep copy
    // Ensure backup_schedule exists
    if (!newEnv.backup_schedule) {
      newEnv.backup_schedule = { enabled: false, cron_expression: '0 0 * * *' };
    }
    editingEnvId = env.id;
    isAddingEnv = false;
  }

  function resetEnvForm() {
    newEnv = {
      id: '',
      name: '',
      url: '',
      username: '',
      password: '',
      is_default: false,
      backup_schedule: {
        enabled: false,
        cron_expression: '0 0 * * *'
      }
    };
    editingEnvId = null;
  }

  $: if (data.settings) {
    settings = { ...data.settings };
    if (settings.settings && !settings.settings.storage) {
      settings.settings.storage = {
        root_path: '',
        backup_structure_pattern: '',
        repo_structure_pattern: '',
        filename_pattern: ''
      };
    }
  async function handleAddOrUpdateEnv() {
    try {
      console.log(`[SettingsPage][Action] ${editingEnvId ? 'Updating' : 'Adding'} environment.`);

      // Basic validation
      if (!newEnv.id || !newEnv.name || !newEnv.url) {
        addToast('Please fill in all required fields (ID, Name, URL)', 'error');
        return;
      }

      if (editingEnvId) {
        await api.updateEnvironment(editingEnvId, newEnv);
        addToast('Environment updated', 'success');
      } else {
        await api.addEnvironment(newEnv);
        addToast('Environment added', 'success');
      }

      resetEnvForm();
      editingEnvId = null;
      isAddingEnv = false;
      await loadSettings();
    } catch (error) {
      console.error("[SettingsPage][Coherence:Failed] Failed to save environment:", error);
      addToast(error.message || 'Failed to save environment', 'error');
    }
  }

  async function handleDeleteEnv(id) {
    if (confirm('Are you sure you want to delete this environment?')) {
      console.log(`[SettingsPage][Action] Delete environment ${id}`);
      try {
        await api.deleteEnvironment(id);
        addToast('Environment deleted', 'success');
        await loadSettings();
      } catch (error) {
        console.error("[SettingsPage][Coherence:Failed] Failed to delete environment:", error);
        addToast('Failed to delete environment', 'error');
      }
    }

  let newEnv = {
    id: '',
    name: '',
    url: '',
    username: '',
    password: '',
    is_default: false
  };

  let editingEnvId = null;

  // [DEF:handleSaveGlobal:Function]
  /* @PURPOSE: Saves global application settings.
     @PRE: settings.settings must contain valid configuration.
     @POST: Global settings are updated via API.
  */
  async function handleSaveGlobal() {
    try {
      console.log("[Settings.handleSaveGlobal][Action] Saving global settings.");
      await updateGlobalSettings(settings.settings);
      addToast('Global settings saved', 'success');
      console.log("[Settings.handleSaveGlobal][Coherence:OK] Global settings saved.");
    } catch (error) {
      console.error("[Settings.handleSaveGlobal][Coherence:Failed] Failed to save global settings:", error);
      addToast('Failed to save global settings', 'error');
    }
  }
  // [/DEF:handleSaveGlobal:Function]

  // [DEF:handleSaveStorage:Function]
  /* @PURPOSE: Saves storage-specific settings.
     @PRE: settings.settings.storage must contain valid configuration.
     @POST: Storage settings are updated via API.
  */
  async function handleSaveStorage() {
    try {
      console.log("[Settings.handleSaveStorage][Action] Saving storage settings.");
      await updateStorageSettings(settings.settings.storage);
      addToast('Storage settings saved', 'success');
      console.log("[Settings.handleSaveStorage][Coherence:OK] Storage settings saved.");
    } catch (error) {
      console.error("[Settings.handleSaveStorage][Coherence:Failed] Failed to save storage settings:", error);
      addToast(error.message || 'Failed to save storage settings', 'error');
    }
  }
  // [/DEF:handleSaveStorage:Function]

  // [DEF:handleAddOrUpdateEnv:Function]
  /* @PURPOSE: Adds a new environment or updates an existing one.
     @PRE: newEnv must contain valid environment details.
     @POST: Environment is saved and page is reloaded to reflect changes.
  */
  async function handleAddOrUpdateEnv() {
    try {
      console.log(`[Settings.handleAddOrUpdateEnv][Action] ${editingEnvId ? 'Updating' : 'Adding'} environment.`);
      if (editingEnvId) {
        await updateEnvironment(editingEnvId, newEnv);
        addToast('Environment updated', 'success');
      } else {
        await addEnvironment(newEnv);
        addToast('Environment added', 'success');
      }
      resetEnvForm();
      // In a real app, we might want to invalidate the load function here
      // For now, we'll just manually update the local state or re-fetch
      // But since we are using SvelteKit, we should ideally use invalidateAll()
      location.reload(); // Simple way to refresh data for now
      console.log("[Settings.handleAddOrUpdateEnv][Coherence:OK] Environment saved.");
    } catch (error) {
      console.error("[Settings.handleAddOrUpdateEnv][Coherence:Failed] Failed to save environment:", error);
      addToast('Failed to save environment', 'error');
    }
  }
  // [/DEF:handleAddOrUpdateEnv:Function]

  // [DEF:handleDeleteEnv:Function]
  /* @PURPOSE: Deletes a Superset environment.
     @PRE: id must be a valid environment ID.
     @POST: Environment is removed and page is reloaded.
  */
  async function handleDeleteEnv(id) {
    if (confirm('Are you sure you want to delete this environment?')) {
      try {
        console.log(`[Settings.handleDeleteEnv][Action] Deleting environment: ${id}`);
        await deleteEnvironment(id);
        addToast('Environment deleted', 'success');
        location.reload();
        console.log("[Settings.handleDeleteEnv][Coherence:OK] Environment deleted.");
      } catch (error) {
        console.error("[Settings.handleDeleteEnv][Coherence:Failed] Failed to delete environment:", error);
        addToast('Failed to delete environment', 'error');
      }
    }
  }
  // [/DEF:handleDeleteEnv:Function]

  // [DEF:handleTestEnv:Function]
  /* @PURPOSE: Tests the connection to a Superset environment.
     @PRE: id must be a valid environment ID.
     @POST: Displays success or error toast based on connection result.
  */
  async function handleTestEnv(id) {
    try {
      console.log(`[Settings.handleTestEnv][Action] Testing environment: ${id}`);
      const result = await testEnvironmentConnection(id);
      if (result.status === 'success') {
        addToast('Connection successful', 'success');
        console.log("[Settings.handleTestEnv][Coherence:OK] Connection successful.");
      } else {
        addToast(`Connection failed: ${result.message}`, 'error');
        console.log("[Settings.handleTestEnv][Coherence:Failed] Connection failed.");
      }
    } catch (error) {
      console.error("[Settings.handleTestEnv][Coherence:Failed] Error testing connection:", error);
      addToast('Failed to test connection', 'error');
    }
  }
  // [/DEF:handleTestEnv:Function]

  // [DEF:editEnv:Function]
  /* @PURPOSE: Populates the environment form for editing.
     @PRE: env object must be provided.
     @POST: newEnv and editingEnvId are updated.
  */
  function editEnv(env) {
    newEnv = { ...env };
    editingEnvId = env.id;
  }
  // [/DEF:editEnv:Function]

  // [DEF:resetEnvForm:Function]
  /* @PURPOSE: Resets the environment creation/edit form to default state.
     @PRE: None.
     @POST: newEnv is cleared and editingEnvId is set to null.
  */
  function resetEnvForm() {
    newEnv = {
      id: '',
      name: '',
      url: '',
      username: '',
      password: '',
      is_default: false
    };
    editingEnvId = null;
  }
  // [/DEF:resetEnvForm:Function]
  }
</script>

<div class="container mx-auto p-4">
  <PageHeader title={$t.settings.title} />
<style>
  .container {
    @apply max-w-7xl mx-auto px-4 py-6;
  }

  {#if data.error}
    <div class="bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4">
      {data.error}
    </div>
  {/if}
  .header {
    @apply flex items-center justify-between mb-6;
  }

  .title {
    @apply text-2xl font-bold text-gray-900;
  }

  <div class="mb-8">
    <Card title={$t.settings?.storage_title || "File Storage Configuration"}>
      <div class="grid grid-cols-1 md:grid-cols-2 gap-6">
        <div class="md:col-span-2">
          <Input
            label={$t.settings?.storage_root || "Storage Root Path"}
            bind:value={settings.settings.storage.root_path}
          />
        </div>
        <Input
          label={$t.settings?.storage_backup_pattern || "Backup Directory Pattern"}
          bind:value={settings.settings.storage.backup_structure_pattern}
        />
        <Input
          label={$t.settings?.storage_repo_pattern || "Repository Directory Pattern"}
          bind:value={settings.settings.storage.repo_structure_pattern}
        />
        <Input
          label={$t.settings?.storage_filename_pattern || "Filename Pattern"}
          bind:value={settings.settings.storage.filename_pattern}
        />
        <div class="bg-gray-50 p-4 rounded border border-gray-200">
          <span class="block text-xs font-semibold text-gray-500 uppercase mb-2">{$t.settings?.storage_preview || "Path Preview"}</span>
          <code class="text-sm text-indigo-600">
            {settings.settings.storage.root_path}/backups/sample_backup.zip
          </code>
        </div>
      </div>
      <div class="mt-6">
        <Button on:click={handleSaveStorage}>
          {$t.common.save}
        </Button>
      </div>
    </Card>
  .refresh-btn {
    @apply px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 transition-colors;
  }

  .error-banner {
    @apply bg-red-100 border border-red-400 text-red-700 px-4 py-3 rounded mb-4 flex items-center justify-between;
  }

  .retry-btn {
    @apply px-4 py-2 bg-red-600 text-white rounded hover:bg-red-700 transition-colors;
  }

  .tabs {
    @apply border-b border-gray-200 mb-6;
  }

  .tab-btn {
    @apply px-4 py-2 text-sm font-medium transition-colors focus:outline-none;
  }

  .tab-content {
    @apply bg-white rounded-lg p-6 border border-gray-200;
  }

  .skeleton {
    @apply animate-pulse bg-gray-200 rounded;
  }
</style>

<div class="container">
|
||||
<!-- Header -->
|
||||
<div class="header">
|
||||
<h1 class="title">{$t.settings?.title || 'Settings'}</h1>
|
||||
<button class="refresh-btn" on:click={loadSettings}>
|
||||
{$t.common?.refresh || 'Refresh'}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<!-- Error Banner -->
|
||||
{#if error}
|
||||
<div class="error-banner">
|
||||
<span>{error}</span>
|
||||
<button class="retry-btn" on:click={loadSettings}>
|
||||
{$t.common?.retry || 'Retry'}
|
||||
</button>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<!-- Loading State -->
|
||||
{#if isLoading}
|
||||
<div class="tab-content">
|
||||
<div class="skeleton h-8"></div>
|
||||
<div class="skeleton h-8"></div>
|
||||
<div class="skeleton h-8"></div>
|
||||
<div class="skeleton h-8"></div>
|
||||
<div class="skeleton h-8"></div>
|
||||
</div>
|
||||
{:else if settings}
|
||||
<!-- Tabs -->
|
||||
<div class="tabs">
|
||||
<button
|
||||
class="tab-btn {getTabClass('environments')}"
|
||||
on:click={() => handleTabChange('environments')}
|
||||
>
|
||||
{$t.settings?.environments || 'Environments'}
|
||||
</button>
|
||||
<button
|
||||
class="tab-btn {getTabClass('logging')}"
|
||||
on:click={() => handleTabChange('logging')}
|
||||
>
|
||||
{$t.settings?.logging || 'Logging'}
|
||||
</button>
|
||||
<button
|
||||
class="tab-btn {getTabClass('connections')}"
|
||||
on:click={() => handleTabChange('connections')}
|
||||
>
|
||||
{$t.settings?.connections || 'Connections'}
|
||||
</button>
|
||||
<button
|
||||
class="tab-btn {getTabClass('llm')}"
|
||||
on:click={() => handleTabChange('llm')}
|
||||
>
|
||||
{$t.settings?.llm || 'LLM'}
|
||||
</button>
|
||||
<button
|
||||
class="tab-btn {getTabClass('storage')}"
|
||||
on:click={() => handleTabChange('storage')}
|
||||
>
|
||||
{$t.settings?.storage || 'Storage'}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<section class="mb-8">
|
||||
<Card title={$t.settings?.env_title || "Superset Environments"}>
|
||||
<!-- Tab Content -->
|
||||
<div class="tab-content">
|
||||
{#if activeTab === 'environments'}
|
||||
<!-- Environments Tab -->
|
||||
<div class="text-lg font-medium mb-4">
|
||||
<h2 class="text-xl font-bold mb-4">{$t.settings?.environments || 'Superset Environments'}</h2>
|
||||
<p class="text-gray-600 mb-6">
|
||||
{$t.settings?.env_description || 'Configure Superset environments for dashboards and datasets.'}
|
||||
</p>
|
||||
|
||||
{#if !editingEnvId && !isAddingEnv}
|
||||
<div class="flex justify-end mb-6">
|
||||
<button
|
||||
class="bg-blue-600 text-white px-4 py-2 rounded-lg hover:bg-blue-700"
|
||||
on:click={() => { isAddingEnv = true; resetEnvForm(); }}
|
||||
>
|
||||
{$t.settings?.env_add || 'Add Environment'}
|
||||
</button>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
{#if settings.environments.length === 0}
|
||||
<div class="mb-4 p-4 bg-yellow-100 border-l-4 border-yellow-500 text-yellow-700">
|
||||
<p class="font-bold">Warning</p>
|
||||
<p>{$t.settings?.env_warning || "No Superset environments configured."}</p>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
<div class="mb-6 overflow-x-auto">
|
||||
<table class="min-w-full divide-y divide-gray-200">
|
||||
{#if editingEnvId || isAddingEnv}
|
||||
<!-- Add/Edit Environment Form -->
|
||||
<div class="bg-gray-50 p-6 rounded-lg mb-6 border border-gray-200">
|
||||
<h3 class="text-lg font-medium mb-4">{editingEnvId ? 'Edit' : 'Add'} Environment</h3>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<label for="env_id" class="block text-sm font-medium text-gray-700">ID</label>
|
||||
<input
|
||||
type="text"
|
||||
id="env_id"
|
||||
bind:value={newEnv.id}
|
||||
disabled={!!editingEnvId}
|
||||
class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2 disabled:bg-gray-100 disabled:text-gray-500"
|
||||
/>
|
||||
</div>
|
||||
<div>
|
||||
<label for="env_name" class="block text-sm font-medium text-gray-700">Name</label>
|
||||
<input type="text" id="env_name" bind:value={newEnv.name} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="env_url" class="block text-sm font-medium text-gray-700">URL</label>
|
||||
<input type="text" id="env_url" bind:value={newEnv.url} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="env_user" class="block text-sm font-medium text-gray-700">Username</label>
|
||||
<input type="text" id="env_user" bind:value={newEnv.username} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div>
|
||||
<label for="env_pass" class="block text-sm font-medium text-gray-700">Password</label>
|
||||
<input type="password" id="env_pass" bind:value={newEnv.password} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
</div>
|
||||
<div class="flex items-center mt-6">
|
||||
<input type="checkbox" id="env_default" bind:checked={newEnv.is_default} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
|
||||
<label for="env_default" class="ml-2 block text-sm text-gray-900">Default Environment</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<h3 class="text-lg font-medium mb-4 mt-6">Backup Schedule</h3>
|
||||
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div class="flex items-center">
|
||||
<input type="checkbox" id="backup_enabled" bind:checked={newEnv.backup_schedule.enabled} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
|
||||
<label for="backup_enabled" class="ml-2 block text-sm text-gray-900">Enable Automatic Backups</label>
|
||||
</div>
|
||||
<div>
|
||||
<label for="cron_expression" class="block text-sm font-medium text-gray-700">Cron Expression</label>
|
||||
<input type="text" id="cron_expression" bind:value={newEnv.backup_schedule.cron_expression} placeholder="0 0 * * *" class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
|
||||
<p class="text-xs text-gray-500 mt-1">Example: 0 0 * * * (daily at midnight)</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="mt-6 flex gap-2 justify-end">
|
||||
<button
|
||||
on:click={() => { isAddingEnv = false; editingEnvId = null; resetEnvForm(); }}
|
||||
class="bg-gray-200 text-gray-700 px-4 py-2 rounded hover:bg-gray-300"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
<button
|
||||
on:click={handleAddOrUpdateEnv}
|
||||
class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700"
|
||||
>
|
||||
{editingEnvId ? 'Update' : 'Add'} Environment
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
{/if}
|
||||
|
||||
{#if settings.environments && settings.environments.length > 0}
|
||||
<div class="mt-6 overflow-x-auto border border-gray-200 rounded-lg">
|
||||
<table class="min-w-full divide-y divide-gray-200">
|
||||
<thead class="bg-gray-50">
|
||||
<tr>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.connections?.name || "Name"}</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">URL</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.connections?.user || "Username"}</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Default</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.git?.actions || "Actions"}</th>
|
||||
</tr>
|
||||
<tr>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.connections?.name || "Name"}</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">URL</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.connections?.user || "Username"}</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider">Default</th>
|
||||
<th class="px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase tracking-wider">{$t.settings?.env_actions || "Actions"}</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody class="bg-white divide-y divide-gray-200">
|
||||
{#each settings.environments as env}
|
||||
<tr>
|
||||
<td class="px-6 py-4 whitespace-nowrap">{env.name}</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">{env.url}</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">{env.username}</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">{env.is_default ? 'Yes' : 'No'}</td>
|
||||
<td class="px-6 py-4 whitespace-nowrap">
|
||||
<button on:click={() => handleTestEnv(env.id)} class="text-green-600 hover:text-green-900 mr-4">{$t.settings?.env_test || "Test"}</button>
|
||||
<button on:click={() => editEnv(env)} class="text-indigo-600 hover:text-indigo-900 mr-4">{$t.common.edit}</button>
|
||||
<button on:click={() => handleDeleteEnv(env.id)} class="text-red-600 hover:text-red-900">{$t.settings?.env_delete || "Delete"}</button>
|
||||
</td>
|
||||
</tr>
|
||||
{/each}
|
||||
{#each settings.environments as env}
<tr>
<td class="px-6 py-4 whitespace-nowrap">{env.name}</td>
<td class="px-6 py-4 whitespace-nowrap">{env.url}</td>
<td class="px-6 py-4 whitespace-nowrap">{env.username}</td>
<td class="px-6 py-4 whitespace-nowrap">
{#if env.is_default}
<span class="px-2 inline-flex text-xs leading-5 font-semibold rounded-full bg-green-100 text-green-800">
Yes
</span>
{:else}
<span class="text-gray-500">No</span>
{/if}
</td>
<td class="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
<button class="text-green-600 hover:text-green-900 mr-4" on:click={() => handleTestEnv(env.id)}>
{$t.settings?.env_test || "Test"}
</button>
<button class="text-indigo-600 hover:text-indigo-900 mr-4" on:click={() => editEnv(env)}>
{$t.common.edit || "Edit"}
</button>
<button class="text-red-600 hover:text-red-900" on:click={() => handleDeleteEnv(env.id)}>
{$t.settings?.env_delete || "Delete"}
</button>
</td>
</tr>
{/each}
</tbody>
</table>
</div>
{:else if !isAddingEnv}
<div class="mb-4 p-4 bg-yellow-100 border-l-4 border-yellow-500 text-yellow-700">
<p class="font-bold">Warning</p>
<p>No Superset environments configured. You must add at least one environment to perform backups or migrations.</p>
</div>
{/if}
</div>
{:else if activeTab === 'logging'}
<!-- Logging Tab -->
<div class="text-lg font-medium mb-4">
<h2 class="text-xl font-bold mb-4">{$t.settings?.logging || 'Logging Configuration'}</h2>
<p class="text-gray-600 mb-6">
{$t.settings?.logging_description || 'Configure logging and task log levels.'}
</p>

<div class="bg-gray-50 p-6 rounded-lg border border-gray-200">
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label for="log_level" class="block text-sm font-medium text-gray-700">Log Level</label>
<select id="log_level" bind:value={settings.logging.level} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2">
<option value="DEBUG">DEBUG</option>
<option value="INFO">INFO</option>
<option value="WARNING">WARNING</option>
<option value="ERROR">ERROR</option>
<option value="CRITICAL">CRITICAL</option>
</select>
</div>
<div>
<label for="task_log_level" class="block text-sm font-medium text-gray-700">Task Log Level</label>
<select id="task_log_level" bind:value={settings.logging.task_log_level} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2">
<option value="DEBUG">DEBUG</option>
<option value="INFO">INFO</option>
<option value="WARNING">WARNING</option>
<option value="ERROR">ERROR</option>
</select>
</div>
<div class="md:col-span-2">
<label class="flex items-center">
<input type="checkbox" id="enable_belief_state" bind:checked={settings.logging.enable_belief_state} class="h-4 w-4 text-blue-600 border-gray-300 rounded" />
<span class="ml-2 block text-sm text-gray-900">Enable Belief State Logging (Beta)</span>
</label>
<p class="text-xs text-gray-500 mt-1 ml-6">Logs agent reasoning and internal state changes for debugging.</p>
</div>
</div>

<div class="mt-8 bg-gray-50 p-6 rounded-lg border border-gray-100">
<h3 class="text-lg font-medium mb-6">{editingEnvId ? ($t.settings?.env_edit || "Edit Environment") : ($t.settings?.env_add || "Add Environment")}</h3>
<div class="grid grid-cols-1 md:grid-cols-2 gap-6">
<Input label="ID" bind:value={newEnv.id} disabled={!!editingEnvId} />
<Input label={$t.connections?.name || "Name"} bind:value={newEnv.name} />
<Input label="URL" bind:value={newEnv.url} />
<Input label={$t.connections?.user || "Username"} bind:value={newEnv.username} />
<Input label={$t.connections?.pass || "Password"} type="password" bind:value={newEnv.password} />
<div class="flex items-center gap-2 h-10 mt-auto">
<input type="checkbox" id="env_default" bind:checked={newEnv.is_default} class="h-4 w-4 text-blue-600 border-gray-300 rounded focus:ring-blue-500" />
<label for="env_default" class="text-sm font-medium text-gray-700">{$t.settings?.env_default || "Default Environment"}</label>
</div>
</div>
<div class="mt-8 flex gap-3">
<Button on:click={handleAddOrUpdateEnv}>
{editingEnvId ? $t.common.save : ($t.settings?.env_add || "Add Environment")}
</Button>
{#if editingEnvId}
<Button variant="secondary" on:click={resetEnvForm}>
{$t.common.cancel}
</Button>
{/if}
<div class="mt-6 flex justify-end">
<button
on:click={handleSave}
class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700"
>
Save Logging Config
</button>
</div>
</div>
</div>
</Card>
</section>
{:else if activeTab === 'connections'}
<!-- Connections Tab -->
<div class="text-lg font-medium mb-4">
<h2 class="text-xl font-bold mb-4">{$t.settings?.connections || 'Database Connections'}</h2>
<p class="text-gray-600 mb-6">
{$t.settings?.connections_description || 'Configure database connections for data mapping.'}
</p>

{#if settings.connections && settings.connections.length > 0}
<!-- Connections list would go here -->
<p class="text-gray-500 italic">No additional connections configured. Superset database connections are used by default.</p>
{:else}
<div class="text-center py-8 bg-gray-50 rounded-lg border border-dashed border-gray-300">
<p class="text-gray-500">No external connections configured.</p>
<button class="mt-4 px-4 py-2 border border-blue-600 text-blue-600 rounded hover:bg-blue-50">
Add Connection
</button>
</div>
{/if}
</div>
{:else if activeTab === 'llm'}
<!-- LLM Tab -->
<div class="text-lg font-medium mb-4">
<h2 class="text-xl font-bold mb-4">{$t.settings?.llm || 'LLM Providers'}</h2>
<p class="text-gray-600 mb-6">
{$t.settings?.llm_description || 'Configure LLM providers for dataset documentation.'}
</p>

<ProviderConfig providers={settings.llm_providers || []} onSave={loadSettings} />
</div>
{:else if activeTab === 'storage'}
<!-- Storage Tab -->
<div class="text-lg font-medium mb-4">
<h2 class="text-xl font-bold mb-4">{$t.settings?.storage || 'File Storage Configuration'}</h2>
<p class="text-gray-600 mb-6">
{$t.settings?.storage_description || 'Configure file storage paths and patterns.'}
</p>

<div class="bg-gray-50 p-6 rounded-lg border border-gray-200">
<div class="grid grid-cols-1 gap-4">
<div>
<label for="storage_path" class="block text-sm font-medium text-gray-700">Root Path</label>
<input type="text" id="storage_path" bind:value={settings.storage.root_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
<div>
<label for="backup_path" class="block text-sm font-medium text-gray-700">Backup Path</label>
<input type="text" id="backup_path" bind:value={settings.storage.backup_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
<div>
<label for="repo_path" class="block text-sm font-medium text-gray-700">Repository Path</label>
<input type="text" id="repo_path" bind:value={settings.storage.repo_path} class="mt-1 block w-full border border-gray-300 rounded-md shadow-sm p-2" />
</div>
</div>

<div class="mt-6 flex justify-end">
<button
on:click={() => handleSave()}
class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700"
>
Save Storage Config
</button>
</div>
</div>
</div>
{/if}
</div>
{/if}
</div>

<!-- [/DEF:SettingsPage:Page] -->

16
frontend/src/routes/storage/+page.svelte
Normal file
@@ -0,0 +1,16 @@
<!-- [DEF:StorageIndexPage:Page] -->
<!--
@TIER: TRIVIAL
@PURPOSE: Redirect to the backups page as the default storage view.
@LAYER: Page
@INVARIANT: Always redirects to /storage/backups.
-->
<script>
import { onMount } from 'svelte';
import { goto } from '$app/navigation';

onMount(() => {
goto('/storage/backups');
});
</script>
<!-- [/DEF:StorageIndexPage:Page] -->
35
frontend/src/routes/storage/backups/+page.svelte
Normal file
@@ -0,0 +1,35 @@
<!-- [DEF:StorageBackupsPage:Page] -->
<!--
@TIER: STANDARD
@SEMANTICS: backup, page, tools
@PURPOSE: Entry point for the Backup Management interface (moved from /tools/backups).
@LAYER: Page
@RELATION: USES -> BackupManager

@INVARIANT: BackupManager component is always rendered.
-->

<script lang="ts">
/**
 * @UX_STATE: Loading -> (via BackupManager) showing spinner.
 * @UX_STATE: Idle -> Showing BackupManager interface.
 * @UX_FEEDBACK: Toast -> (via BackupManager) success/error notifications.
 */
// [SECTION: IMPORTS]
import { t } from '$lib/i18n';
import { PageHeader } from '$lib/ui';
import BackupManager from '../../../components/backups/BackupManager.svelte';
// [/SECTION]
</script>

<!-- [SECTION: TEMPLATE] -->
<div class="container mx-auto p-4 max-w-6xl">
<PageHeader title={$t.nav?.backups || "Backups"} />

<div class="mt-6">
<BackupManager />
</div>
</div>
<!-- [/SECTION] -->

<!-- [/DEF:StorageBackupsPage:Page] -->
110
frontend/src/routes/storage/repos/+page.svelte
Normal file
@@ -0,0 +1,110 @@
<!-- [DEF:frontend.src.routes.storage.repos.+page:Module] -->
<!--
@TIER: STANDARD
@SEMANTICS: git, dashboard, management, ui
@PURPOSE: Dashboard management page for Git integration (moved from /git).
@LAYER: UI (Page)
@RELATION: DEPENDS_ON -> DashboardGrid
@RELATION: DEPENDS_ON -> api
@INVARIANT: Dashboard grid is always shown when an environment is selected.
-->

<!-- [DEF:StorageReposPage:Page] -->
<script lang="ts">
/**
 * @UX_STATE: Loading -> Showing spinner while fetching environments/dashboards.
 * @UX_STATE: Idle -> Showing dashboard grid with actions.
 * @UX_FEEDBACK: Toast -> Error messages on fetch failure.
 * @UX_RECOVERY: Environment Selection -> Switch environment to retry loading.
 */
import { onMount } from 'svelte';
import DashboardGrid from '../../../components/DashboardGrid.svelte';
import { addToast as toast } from '$lib/toasts.js';
import { api } from '$lib/api.js';
import type { DashboardMetadata } from '$lib/types/dashboard';
import { t } from '$lib/i18n';
import { Button, Card, PageHeader, Select } from '$lib/ui';

let environments: any[] = [];
let selectedEnvId = "";
let dashboards: DashboardMetadata[] = [];
let loading = true;
let fetchingDashboards = false;

// [DEF:fetchEnvironments:Function]
/**
 * @PURPOSE: Fetches the list of available environments.
 * @PRE: None.
 * @POST: environments array is populated, selectedEnvId is set to first env if available.
 */
async function fetchEnvironments() {
try {
environments = await api.getEnvironmentsList();
if (environments.length > 0) {
selectedEnvId = environments[0].id;
}
} catch (e) {
toast(e.message, 'error');
} finally {
loading = false;
}
}
// [/DEF:fetchEnvironments:Function]

// [DEF:fetchDashboards:Function]
/**
 * @PURPOSE: Fetches dashboards for a specific environment.
 * @PRE: envId is a valid environment ID.
 * @POST: dashboards array is populated with metadata for the selected environment.
 */
async function fetchDashboards(envId: string) {
if (!envId) return;
fetchingDashboards = true;
try {
dashboards = await api.requestApi(`/environments/${envId}/dashboards`);
} catch (e) {
toast(e.message, 'error');
dashboards = [];
} finally {
fetchingDashboards = false;
}
}
// [/DEF:fetchDashboards:Function]

onMount(fetchEnvironments);

$: if (selectedEnvId) {
fetchDashboards(selectedEnvId);
localStorage.setItem('selected_env_id', selectedEnvId);
}
</script>

<div class="max-w-6xl mx-auto p-6">
<PageHeader title={$t.nav?.repositories || "Git Repositories"}>
<div slot="actions" class="flex items-center space-x-4">
<Select
label="Environment"
bind:value={selectedEnvId}
options={environments.map(e => ({ value: e.id, label: e.name }))}
/>
</div>
</PageHeader>

{#if loading}
<div class="flex justify-center py-12">
<div class="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600"></div>
</div>
{:else}
<Card title="Select Dashboard to Manage">
{#if fetchingDashboards}
<p class="text-gray-500">Loading dashboards...</p>
{:else if dashboards.length > 0}
<DashboardGrid {dashboards} />
{:else}
<p class="text-gray-500 italic">No dashboards found in this environment.</p>
{/if}
</Card>
{/if}
</div>
<!-- [/DEF:StorageReposPage:Page] -->
<!-- [/DEF:frontend.src.routes.storage.repos.+page:Module] -->
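The reactive `$:` block in the repos page both refetches dashboards and persists the selected environment id to `localStorage`. A minimal sketch of that persistence step, factored into a testable helper (the `persistSelection` name and the injected storage interface are illustrative, not part of the page's actual code):

```typescript
// Minimal Storage-like interface so the logic can run outside a browser.
interface KeyValueStore {
  setItem(key: string, value: string): void;
}

// Mirrors the page's `$: if (selectedEnvId) ...` guard: empty ids are not persisted.
function persistSelection(store: KeyValueStore, envId: string): void {
  if (envId) {
    store.setItem('selected_env_id', envId);
  }
}
```

Injecting the store keeps the unit testable in Node, where `localStorage` is undefined; in the page itself, `localStorage` would be passed directly.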
BIN
frontend/static/favicon.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 523 B
@@ -6,6 +6,10 @@ const config = {
preprocess: vitePreprocess(),

kit: {
alias: {
'$components': 'src/components',
'$lib': 'src/lib'
},
adapter: adapter({
pages: 'build',
assets: 'build',

298
logs/лог.md
@@ -1,298 +0,0 @@
PS H:\dev\ss-tools> & C:/ProgramData/anaconda3/python.exe h:/dev/ss-tools/migration_script.py
2025-12-16 11:50:28,192 - INFO - [run][Entry] Starting migration script.

=== Behavior on import failure ===
If the import fails, delete the existing dashboard and try importing again? (y/n): n
2025-12-16 11:50:33,363 - INFO - [ask_delete_on_failure][State] Delete-on-failure = False
2025-12-16 11:50:33,368 - INFO - [select_environments][Entry] Step 1/5: Environment selection.
2025-12-16 11:50:33,374 - INFO - [setup_clients][Enter] Starting Superset clients initialization.
2025-12-16 11:50:33,730 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,734 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,739 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,742 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,746 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,750 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,754 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,758 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,761 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,764 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,769 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,772 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,776 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,779 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,782 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,786 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,790 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,794 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,799 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,805 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,808 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,811 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,815 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,820 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,823 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,827 - INFO - [SupersetClient.__init__][Enter] Initializing SupersetClient.
2025-12-16 11:50:33,831 - INFO - [APIClient.__init__][Entry] Initializing APIClient.
2025-12-16 11:50:33,834 - WARNING - [_init_session][State] SSL verification disabled.
2025-12-16 11:50:33,838 - INFO - [APIClient.__init__][Exit] APIClient initialized.
2025-12-16 11:50:33,840 - INFO - [SupersetClient.__init__][Exit] SupersetClient initialized.
2025-12-16 11:50:33,847 - INFO - [setup_clients][Exit] All clients (dev, prod, sbx, preprod, uatta, dev5) initialized successfully.

=== Environment selection ===
Source environment:
1) dev
2) prod
3) sbx
4) preprod
5) uatta
6) dev5

Enter a number (0 to cancel): 4
2025-12-16 11:50:42,379 - INFO - [select_environments][State] from = preprod

=== Environment selection ===
Target environment:
1) dev
2) prod
3) sbx
4) uatta
5) dev5

Enter a number (0 to cancel): 5
2025-12-16 11:50:45,176 - INFO - [select_environments][State] to = dev5
2025-12-16 11:50:45,182 - INFO - [select_environments][Exit] Step 1 complete.
2025-12-16 11:50:45,186 - INFO - [select_dashboards][Entry] Step 2/5: Dashboard selection.
2025-12-16 11:50:45,190 - INFO - [get_dashboards][Enter] Fetching dashboards.
2025-12-16 11:50:45,197 - INFO - [authenticate][Enter] Authenticating to https://preprodta.bi.dwh.rusal.com/api/v1
2025-12-16 11:50:45,880 - INFO - [authenticate][Exit] Authenticated successfully.
2025-12-16 11:50:46,025 - INFO - [get_dashboards][Exit] Found 95 dashboards.

=== Search ===
Enter a regular expression to search dashboards:
fi

=== Dashboard selection ===
Select the desired dashboards (enter numbers):
1) [ALL] All dashboards
2) [185] FI-0060 Finance. Taxes. Tax data. Old
3) [184] FI-0083 Receivables/overdue receivables statistics
4) [187] FI-0081 Overdue receivables. Treasury
5) [122] FI-0080 Finance. Working capital AR/AP
6) [208] FI-0020 Overdue receivables and payables over time
7) [126] FI-0022 Accounts payable for treasury
8) [196] FI-0023 Accounts receivable for treasury
9) [113] FI-0060 Finance. Taxes. Tax data.
10) [173] FI-0040 Trial balance by counterparty
11) [174] FI-0021 Receivables and payables by document
12) [172] FI-0030 Receivables on fines
13) [170] FI-0050 Income tax (deferred tax assets and liabilities)
14) [159] FI-0070 Counterparty dossier

Enter numbers separated by commas (empty input cancels): 2
2025-12-16 11:50:52,235 - INFO - [select_dashboards][State] 1 dashboard selected.
2025-12-16 11:50:52,242 - INFO - [select_dashboards][Exit] Step 2 complete.

=== Database replacement ===
Replace the database configuration in the YAML files? (y/n): y
2025-12-16 11:50:53,808 - INFO - [_select_databases][Entry] Selecting databases from both environments.
2025-12-16 11:50:53,816 - INFO - [get_databases][Enter] Fetching databases.
2025-12-16 11:50:53,918 - INFO - [get_databases][Exit] Found 12 databases.
2025-12-16 11:50:53,923 - INFO - [get_databases][Enter] Fetching databases.
2025-12-16 11:50:53,926 - INFO - [authenticate][Enter] Authenticating to https://dev.bi.dwh.rusal.com/api/v1
2025-12-16 11:50:54,450 - INFO - [authenticate][Exit] Authenticated successfully.
2025-12-16 11:50:54,551 - INFO - [get_databases][Exit] Found 4 databases.

=== Source database selection ===
Select the source database:
1) DEV datalab (ID: 9)
2) Prod Greenplum (ID: 7)
3) DEV Clickhouse New (OLD) (ID: 16)
4) Preprod Clickhouse New (ID: 15)
5) DEV Greenplum (ID: 1)
6) Prod Clickhouse Node 1 (ID: 11)
7) Preprod Postgre Superset Internal (ID: 5)
8) Prod Postgre Superset Internal (ID: 28)
9) Prod Clickhouse (ID: 10)
10) Dev Clickhouse (correct) (ID: 14)
11) DEV ClickHouse New (ID: 23)
12) Sandbox Postgre Superset Internal (ID: 12)

Enter a number (0 to cancel): 9
2025-12-16 11:51:11,008 - INFO - [get_database][Enter] Fetching database 10.
2025-12-16 11:51:11,038 - INFO - [get_database][Exit] Got database 10.

=== Target database selection ===
Select the target database:
1) DEV Greenplum (ID: 2)
2) DEV Clickhouse (ID: 3)
3) DEV ClickHouse New (ID: 4)
4) Dev Postgre Superset Internal (ID: 1)

Enter a number (0 to cancel): 2
2025-12-16 11:51:15,559 - INFO - [get_database][Enter] Fetching database 3.
2025-12-16 11:51:15,586 - INFO - [get_database][Exit] Got database 3.
2025-12-16 11:51:15,589 - INFO - [_select_databases][Exit] Selected databases: Unnamed -> Unnamed
old_db: {'id': 10, 'result': {'allow_ctas': False, 'allow_cvas': False, 'allow_dml': True, 'allow_file_upload': False, 'allow_run_async': False, 'backend': 'clickhousedb', 'cache_timeout': None, 'configuration_method': 'sqlalchemy_form', 'database_name': 'Prod Clickhouse', 'driver': 'connect', 'engine_information': {'disable_ssh_tunneling': False, 'supports_file_upload': False}, 'expose_in_sqllab': True, 'force_ctas_schema': None, 'id': 10, 'impersonate_user': False, 'is_managed_externally': False, 'uuid': '97aced68-326a-4094-b381-27980560efa9'}}
2025-12-16 11:51:15,591 - INFO - [confirm_db_config_replacement][State] Replacement set: {'old': {'database_name': None, 'uuid': None, 'id': '10'}, 'new': {'database_name': None, 'uuid': None, 'id': '3'}}
2025-12-16 11:51:15,594 - INFO - [execute_migration][Entry] Starting migration of 1 dashboards.

=== Migration... ===
Migration: FI-0060 Finance. Taxes. Tax data. Old (1/1) 0%2025-12-16 11:51:15,598 - INFO - [export_dashboard][Enter] Exporting dashboard 185.
2025-12-16 11:51:16,142 - INFO - [export_dashboard][Exit] Exported dashboard 185 to dashboard_export_20251216T085115.zip.
2025-12-16 11:51:16,205 - INFO - [update_yamls][Enter] Starting YAML configuration update.
2025-12-16 11:51:16,208 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\metadata.yaml
2025-12-16 11:51:16,209 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-01_2787.yaml
2025-12-16 11:51:16,210 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_2_4030.yaml
2025-12-16 11:51:16,212 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_4029.yaml
2025-12-16 11:51:16,213 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_TOTAL2_4036.yaml
2025-12-16 11:51:16,215 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_TOTAL2_4037.yaml
2025-12-16 11:51:16,216 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_TOTAL_4028.yaml
2025-12-16 11:51:16,217 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT2_4024.yaml
2025-12-16 11:51:16,218 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-01_ZNODE_ROOT_4033.yaml
2025-12-16 11:51:16,220 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-02_ZFUND-BD2_4021.yaml
2025-12-16 11:51:16,221 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-02_ZFUND_4027.yaml
2025-12-16 11:51:16,222 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02-02_ZFUND_4034.yaml
2025-12-16 11:51:16,224 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02_ZTAX_4022.yaml
2025-12-16 11:51:16,226 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-02_ZTAX_4035.yaml
2025-12-16 11:51:16,227 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-04-2_4031.yaml
2025-12-16 11:51:16,228 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-05-01_4026.yaml
2025-12-16 11:51:16,230 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-05-01_4032.yaml
2025-12-16 11:51:16,231 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-06_1_4023.yaml
2025-12-16 11:51:16,233 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060-06_2_4020.yaml
2025-12-16 11:51:16,234 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\charts\FI-0060_4025.yaml
2025-12-16 11:51:16,236 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\dashboards\FI-0060_185.yaml
2025-12-16 11:51:16,238 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\databases\Prod_Clickhouse_10.yaml
2025-12-16 11:51:16,240 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0000_-_685.yaml
2025-12-16 11:51:16,241 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01-2_zfund_reciever_-_861.yaml
2025-12-16 11:51:16,242 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-01_zfund_reciever_click_689.yaml
2025-12-16 11:51:16,244 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-02_680.yaml
2025-12-16 11:51:16,245 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-03_ztax_862.yaml
2025-12-16 11:51:16,246 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-04_zpbe_681.yaml
2025-12-16 11:51:16,247 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-05_ZTAXZFUND_679.yaml
2025-12-16 11:51:16,249 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-06_860.yaml
2025-12-16 11:51:16,250 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-08_682.yaml
2025-12-16 11:51:16,251 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-10_zpbe_688.yaml
2025-12-16 11:51:16,253 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060-11_ZTAX_NAME_863.yaml
2025-12-16 11:51:16,254 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_683.yaml
2025-12-16 11:51:16,255 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_684.yaml
2025-12-16 11:51:16,256 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_686.yaml
2025-12-16 11:51:16,258 - INFO - [_update_yaml_file][State] Replaced '10' with '3' for key id in C:\Users\LO54FB~1\Temp\tmpuidfegpd.dir\dashboard_export_20251216T085115\datasets\Prod_Clickhouse_10\FI-0060_690.yaml
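The `_update_yaml_file` entries above show the migration script rewriting the database id across every YAML file in the export. A minimal sketch of that kind of keyed replacement, assuming a line-oriented scan for `key: old_id` lines (the `update_yaml_id` name and signature are illustrative, not the tool's actual `_update_yaml_file`):

```python
import re
from pathlib import Path


def update_yaml_id(path: Path, key: str, old_id: str, new_id: str) -> bool:
    """Replace `key: old_id` with `key: new_id` in one YAML file.

    Returns True if at least one replacement was made. This is a plain-text
    sketch; the real tool may parse the YAML instead of pattern-matching it.
    """
    text = path.read_text(encoding="utf-8")
    # Match the key at the start of a line (with optional indent) whose value
    # is exactly the old id, so e.g. id '10' does not touch '100'.
    pattern = rf"(?m)^([ \t]*{re.escape(key)}:[ \t]*){re.escape(old_id)}[ \t]*$"
    new_text, count = re.subn(pattern, rf"\g<1>{new_id}", text)
    if count:
        path.write_text(new_text, encoding="utf-8")
    return count > 0
```

Applied over something like `export_dir.rglob('*.yaml')`, a helper of this shape would produce one "Replaced '10' with '3' for key id in ..." line per file, as seen in the log.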
2025-12-16 11:51:16,259 - INFO - [create_dashboard_export][Enter] Packing dashboard: ['C:\\Users\\LO54FB~1\\Temp\\tmpuidfegpd.dir'] -> C:\Users\LO54FB~1\Temp\tmps7cuv2ti.zip
2025-12-16 11:51:16,347 - INFO - [create_dashboard_export][Exit] Archive created: C:\Users\LO54FB~1\Temp\tmps7cuv2ti.zip
2025-12-16 11:51:16,372 - ERROR - [import_dashboard][Failure] First import attempt failed: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - Произошла неизвестная ошибка."}]}}]} | Context: {'type': 'api_call'}
Traceback (most recent call last):
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
    response.raise_for_status()
  File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
    return self._do_import(file_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
    return self.network.upload_file(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
    return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
    raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - Произошла неизвестная ошибка."}]}}]} | Context: {'type': 'api_call'}
2025-12-16 11:51:16,511 - ERROR - [execute_migration][Failure] [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - Произошла неизвестная ошибка."}]}}]} | Context: {'type': 'api_call'}
Traceback (most recent call last):
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 186, in _perform_upload
    response.raise_for_status()
  File "C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: https://dev.bi.dwh.rusal.com/api/v1/dashboard/import/

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "h:\dev\ss-tools\migration_script.py", line 366, in execute_migration
    self.to_c.import_dashboard(file_name=tmp_new_zip, dash_id=dash_id, dash_slug=dash_slug)
  File "h:\dev\ss-tools\superset_tool\client.py", line 141, in import_dashboard
    return self._do_import(file_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\client.py", line 197, in _do_import
    return self.network.upload_file(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 172, in upload_file
    return self._perform_upload(full_url, files_payload, extra_data, _headers, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "h:\dev\ss-tools\superset_tool\utils\network.py", line 196, in _perform_upload
    raise SupersetAPIError(f"API error during upload: {e.response.text}") from e
superset_tool.exceptions.SupersetAPIError: [API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - Произошла неизвестная ошибка."}]}}]} | Context: {'type': 'api_call'}

=== Error ===
Failed to migrate dashboard FI-0060 Финансы. Налоги. Данные по налогам. Старый.

[API_FAILURE] API error during upload: {"errors": [{"message": "Expecting value: line 1 column 1 (char 0)", "error_type": "GENERIC_BACKEND_ERROR", "level": "error", "extra": {"issue_codes": [{"code": 1011, "message": "Issue 1011 - Произошла неизвестная ошибка."}]}}]} | Context: {'type': 'api_call'}

100%
2025-12-16 11:51:16,598 - INFO - [execute_migration][Exit] Migration finished.

=== Information ===
Migration complete!

2025-12-16 11:51:16,605 - INFO - [run][Exit] Migration script finished.
File diff suppressed because it is too large

specs/019-superset-ux-redesign/checklists/requirements.md (new file, 46 lines)
@@ -0,0 +1,46 @@
# Specification Quality Checklist: Superset-Style UX Redesign

**Purpose**: Validate specification completeness and quality before proceeding to planning
**Created**: 2026-02-08
**Feature**: [specs/019-superset-ux-redesign/spec.md](../spec.md)

## Content Quality

- [x] No implementation details (languages, frameworks, APIs)
- [x] Focused on user value and business needs
- [x] Written for non-technical stakeholders
- [x] All mandatory sections completed

## UX Consistency

- [x] Functional requirements fully support the 'Happy Path' in ux_reference.md
- [x] Error handling requirements match the 'Error Experience' in ux_reference.md
- [x] No requirements contradict the defined User Persona or Context
- [x] Top navbar design is consistent with mockups in ux_reference.md
- [x] Settings consolidation aligns with sidebar navigation structure

## Requirement Completeness

- [x] No [NEEDS CLARIFICATION] markers remain
- [x] Requirements are testable and unambiguous
- [x] Success criteria are measurable
- [x] Success criteria are technology-agnostic (no implementation details)
- [x] All acceptance scenarios are defined
- [x] Edge cases are identified
- [x] Scope is clearly bounded
- [x] Dependencies and assumptions identified

## Feature Readiness

- [x] All functional requirements have clear acceptance criteria
- [x] User scenarios cover primary flows (6 user stories defined)
- [x] Feature meets measurable outcomes defined in Success Criteria
- [x] No implementation details leak into specification

## Notes

- The specification has been updated to align with the "Resource-Centric" philosophy described in `docs/design/resource_centric_layout.md`.
- All "Tool-Centric" references (Migration Tool, Git Tool) have been replaced with "Resource Hubs" (Dashboard Hub, Dataset Hub).
- The Task Drawer is now the central mechanism for all task interactions.
- **New additions**: Top Navigation Bar design (User Story 5) and Consolidated Settings experience (User Story 6).
- Functional requirements have been reorganized into logical groups: Navigation & Layout, Resource Hubs, Task Management, Settings & Configuration.
specs/019-superset-ux-redesign/contracts/api.md (new file, 242 lines)
@@ -0,0 +1,242 @@
# API Contracts: Superset-Style UX Redesign

**Feature**: 019-superset-ux-redesign
**Date**: 2026-02-09

## Overview

This document defines the API contracts for new endpoints required by the Resource-Centric UI. All endpoints follow existing patterns in the codebase.

## New Endpoints

### 1. Dashboard Hub API

#### GET /api/dashboards

**Purpose**: Fetch list of dashboards from a specific environment for the Dashboard Hub grid.

**Query Parameters**:
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| env_id | string | Yes | Environment ID to fetch dashboards from |
| search | string | No | Filter by title/slug |

**Response**:
```json
{
  "dashboards": [
    {
      "id": "uuid-string",
      "title": "Sales Report",
      "slug": "sales-2023",
      "url": "/superset/dashboard/sales-2023",
      "git_status": {
        "branch": "main",
        "sync_status": "OK" | "DIFF" | null
      },
      "last_task": {
        "task_id": "task-uuid",
        "status": "SUCCESS" | "RUNNING" | "ERROR" | "WAITING_INPUT" | null
      }
    }
  ]
}
```

**Errors**:
- `404`: Environment not found
- `503`: Superset connection error
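On the client side, the query parameters above reduce to straightforward URL construction. A minimal sketch using the standard `URLSearchParams` API — the `/api/dashboards` path comes from this contract, while the helper name `buildDashboardsUrl` is illustrative:

```javascript
// Illustrative request-URL builder for GET /api/dashboards.
// env_id is required by the contract; search is optional and omitted when empty.
function buildDashboardsUrl(envId, search = '') {
  const params = new URLSearchParams({ env_id: envId });
  if (search) params.set('search', search);
  return `/api/dashboards?${params.toString()}`;
}
```

The hub component would pass the result to its existing fetch wrapper (`requestApi` in this codebase).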

---

### 2. Dataset Hub API

#### GET /api/datasets

**Purpose**: Fetch list of datasets from a specific environment for the Dataset Hub grid.

**Query Parameters**:
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| env_id | string | Yes | Environment ID to fetch datasets from |
| search | string | No | Filter by table name |

**Response**:
```json
{
  "datasets": [
    {
      "id": "uuid-string",
      "table_name": "fact_orders",
      "schema": "public",
      "database": "Production DB",
      "mapped_fields": {
        "total": 20,
        "mapped": 15
      },
      "last_task": {
        "task_id": "task-uuid",
        "status": "SUCCESS" | null
      }
    }
  ]
}
```

---

### 3. Activity API

#### GET /api/activity

**Purpose**: Fetch summary of active and recent tasks for the navbar indicator.

**Response**:
```json
{
  "active_count": 3,
  "recent_tasks": [
    {
      "task_id": "task-uuid",
      "resource_name": "Sales Report",
      "resource_type": "dashboard",
      "status": "RUNNING",
      "started_at": "2026-02-09T10:00:00Z"
    }
  ]
}
```

---

### 4. Consolidated Settings API

#### GET /api/settings

**Purpose**: Fetch all settings categories for the consolidated settings page.

**Response**:
```json
{
  "environments": [
    { "id": "1", "name": "Development", "url": "http://dev...", "status": "OK" }
  ],
  "connections": [
    { "id": "1", "name": "Prod Clickhouse", "type": "clickhouse" }
  ],
  "llm": {
    "provider": "openai",
    "model": "gpt-4",
    "enabled": true
  },
  "logging": {
    "level": "INFO",
    "task_log_level": "DEBUG",
    "belief_scope_enabled": false
  }
}
```

---

## WebSocket Events (Existing)

The Task Drawer subscribes to existing WebSocket events:

### task:status
```json
{
  "task_id": "uuid",
  "status": "RUNNING" | "SUCCESS" | "ERROR" | "WAITING_INPUT"
}
```

### task:log
```json
{
  "task_id": "uuid",
  "timestamp": "2026-02-09T10:00:00Z",
  "level": "INFO",
  "source": "plugin",
  "message": "Starting migration..."
}
```

### task:input_required
```json
{
  "task_id": "uuid",
  "input_type": "password" | "mapping_selection",
  "prompt": "Enter database password"
}
```
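A thin client-side dispatcher is enough to route these three event types to the stores. A framework-free sketch — the event names come from this contract, but the `{ event, data }` envelope and the handler names are assumptions:

```javascript
// Illustrative router for the three WebSocket task events defined above.
// message: { event: 'task:status' | 'task:log' | 'task:input_required', data: {...} }
// Assumes the server wraps each payload in an { event, data } envelope.
function routeTaskEvent(message, handlers) {
  const { event, data } = message;
  const handler = {
    'task:status': handlers.onStatus,
    'task:log': handlers.onLog,
    'task:input_required': handlers.onInputRequired
  }[event];
  if (!handler) return false; // unknown event: ignore rather than throw
  handler(data);
  return true;
}
```

Ignoring unknown events keeps the client forward-compatible when new event types are added server-side.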
---

## Module Contracts (Semantic Protocol)

### SidebarStore (frontend/src/lib/stores/sidebar.js)

```javascript
// [DEF:SidebarStore:Store]
// @TIER: STANDARD
// @PURPOSE: Manage sidebar visibility and navigation state
// @LAYER: UI
// @INVARIANT: isExpanded state is always synced with localStorage

// @UX_STATE: Idle -> Sidebar visible with current state
// @UX_STATE: Toggling -> Animation plays for 200ms

export const sidebarStore = writable({
  isExpanded: true,
  activeCategory: 'dashboards',
  activeItem: '/dashboards',
  isMobileOpen: false
});

export function toggleSidebar() { /* ... */ }
export function setActiveItem(path) { /* ... */ }
// [/DEF:SidebarStore]
```

### TaskDrawerStore (frontend/src/lib/stores/taskDrawer.js)

```javascript
// [DEF:TaskDrawerStore:Store]
// @TIER: CRITICAL
// @PURPOSE: Manage Task Drawer visibility and resource-to-task mapping
// @LAYER: UI
// @INVARIANT: resourceTaskMap always reflects current task associations

// @UX_STATE: Closed -> Drawer hidden, no active task
// @UX_STATE: Open -> Drawer visible, logs streaming
// @UX_STATE: InputRequired -> Interactive form rendered in drawer

export const taskDrawerStore = writable({
  isOpen: false,
  activeTaskId: null,
  resourceTaskMap: {}
});

export function openDrawerForTask(taskId) { /* ... */ }
export function closeDrawer() { /* ... */ }
export function updateResourceTask(resourceId, taskId, status) { /* ... */ }
// [/DEF:TaskDrawerStore]
```

### ActivityStore (frontend/src/lib/stores/activity.js)

```javascript
// [DEF:ActivityStore:Store]
// @TIER: STANDARD
// @PURPOSE: Track active task count for navbar indicator
// @LAYER: UI
// @RELATION: DEPENDS_ON -> WebSocket connection

export const activityStore = derived(taskDrawerStore, ($drawer) => {
  const activeCount = Object.values($drawer.resourceTaskMap)
    .filter(t => t.status === 'RUNNING').length;
  return { activeCount };
});
// [/DEF:ActivityStore]
```
specs/019-superset-ux-redesign/contracts/modules.md (new file, 613 lines)
@@ -0,0 +1,613 @@
# Module Contracts: Superset-Style UX Redesign

**Feature**: 019-superset-ux-redesign
**Date**: 2026-02-10
**Semantic Protocol Version**: GRACE-Poly (UX Edition)

---

## Overview

This document defines module-level contracts following the Semantic Protocol. Each module is annotated with `[DEF]` anchors, `@TIER` classification, and full Design-by-Contract (`@PRE`, `@POST`, `@UX_STATE`) specifications.

---

## Frontend Modules

### 1. SidebarStore

```javascript
// [DEF:SidebarStore:Store]
// @TIER: STANDARD
// @SEMANTICS: navigation, state-management, persistence
// @PURPOSE: Manage sidebar visibility, active navigation state, and mobile overlay behavior
// @LAYER: UI
// @RELATION: DEPENDS_ON -> localStorage
// @RELATION: BINDS_TO -> +layout.svelte
// @INVARIANT: isExpanded state is always synced with localStorage key 'sidebar_state'

// @UX_STATE: Idle -> Sidebar visible with current category highlighted
// @UX_STATE: Collapsed -> Icons-only mode, tooltips on hover
// @UX_STATE: MobileOverlay -> Full-screen overlay with backdrop, close on outside click
// @UX_STATE: Toggling -> CSS transition animation (200ms ease-in-out)

// @PRE: localStorage is available and accessible
// @PRE: defaultValue has valid shape { isExpanded: boolean, activeCategory: string, activeItem: string, isMobileOpen: boolean }

// @POST: Store always emits valid SidebarState shape
// @POST: localStorage 'sidebar_state' is updated on every state change
// @POST: isMobileOpen is reset to false when screen size changes to desktop (>=768px)

// @SIDE_EFFECT: Writes to localStorage
// @SIDE_EFFECT: Triggers CSS transitions via class binding

export const sidebarStore = persistentStore('sidebar_state', {
  isExpanded: true,
  activeCategory: 'dashboards',
  activeItem: '/dashboards',
  isMobileOpen: false
});

export function toggleSidebar() { /* ... */ }
export function setActiveCategory(category) { /* ... */ }
export function setActiveItem(path) { /* ... */ }
export function openMobileSidebar() { /* ... */ }
export function closeMobileSidebar() { /* ... */ }
// [/DEF:SidebarStore]
```
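The `persistentStore` factory referenced above is not defined in this document. A minimal framework-free sketch of what it could look like — assuming a Svelte-style `subscribe`/`set`/`update` contract, with the storage object injectable so the helper can run outside the browser:

```javascript
// Hypothetical sketch of the persistentStore helper used by SidebarStore.
// Assumes a Svelte-like store contract; storage defaults to localStorage
// but is injectable for testing.
function persistentStore(key, defaultValue, storage = globalThis.localStorage) {
  const stored = storage.getItem(key);
  let value = stored !== null ? JSON.parse(stored) : defaultValue;
  const subscribers = new Set();

  return {
    subscribe(fn) {
      subscribers.add(fn);
      fn(value); // Svelte stores notify new subscribers immediately
      return () => subscribers.delete(fn);
    },
    set(next) {
      value = next;
      storage.setItem(key, JSON.stringify(value)); // @POST: storage updated on every change
      subscribers.forEach((fn) => fn(value));
    },
    update(fn) {
      this.set(fn(value));
    }
  };
}
```

Serializing on every `set` keeps the `@INVARIANT` above trivially true, at the cost of a `JSON.stringify` per state change — negligible for a four-field object.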

---

### 2. TaskDrawerStore

```javascript
// [DEF:TaskDrawerStore:Store]
// @TIER: CRITICAL
// @SEMANTICS: task-management, resource-mapping, real-time, drawer-state
// @PURPOSE: Manage Task Drawer visibility, active task tracking, and resource-to-task associations
// @LAYER: UI
// @RELATION: DEPENDS_ON -> WebSocket connection
// @RELATION: DEPENDS_ON -> task:status events
// @RELATION: DEPENDS_ON -> task:log events
// @RELATION: BINDS_TO -> TaskDrawer.svelte
// @INVARIANT: resourceTaskMap always contains valid task associations
// @INVARIANT: When isOpen is false, activeTaskId may still reference a running task

// @UX_STATE: Closed -> Drawer hidden, Activity indicator shows active count
// @UX_STATE: Open-Loading -> Drawer visible, loading spinner for logs
// @UX_STATE: Open-Streaming -> Drawer visible, real-time log stream
// @UX_STATE: Open-InputRequired -> Interactive form rendered in drawer (PasswordPrompt, etc.)
// @UX_STATE: Open-Completed -> Task finished, success/error status displayed

// @UX_FEEDBACK: Toast notification when task completes while drawer is closed
// @UX_FEEDBACK: Activity indicator badge pulses when new tasks start
// @UX_RECOVERY: If WebSocket disconnects, show "Reconnecting..." with retry button

// @PRE: WebSocket connection is established before subscribing to task events
// @PRE: taskId passed to openDrawerForTask() is a valid UUID string
// @PRE: resourceId in updateResourceTask() matches pattern {type}:{uuid}

// @POST: After openDrawerForTask(taskId): isOpen === true && activeTaskId === taskId
// @POST: After closeDrawer(): isOpen === false, activeTaskId unchanged
// @POST: resourceTaskMap is updated when task:status events are received
// @POST: When task status changes to SUCCESS|ERROR, entry remains in map for 5 minutes then auto-cleanup

// @SIDE_EFFECT: Subscribes to WebSocket events
// @SIDE_EFFECT: May trigger browser notification for task completion

export const taskDrawerStore = writable({
  isOpen: false,
  activeTaskId: null,
  resourceTaskMap: {} // { resourceId: { taskId, status, startedAt } }
});

export function openDrawerForTask(taskId) { /* ... */ }
export function closeDrawer() { /* ... */ }
export function updateResourceTask(resourceId, taskId, status) { /* ... */ }
export function clearCompletedTasks() { /* ... */ }
// [/DEF:TaskDrawerStore]
```
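The five-minute auto-cleanup promised by the last `@POST` above is easiest to keep correct as a pure helper that `clearCompletedTasks` can call on a timer. The names below (`pruneCompletedTasks`, `TASK_RETENTION_MS`, the `finishedAt` field) are illustrative, not part of the contract:

```javascript
// Illustrative helper for the 5-minute retention rule on completed tasks.
const TASK_RETENTION_MS = 5 * 60 * 1000; // hypothetical constant

// Returns a new resourceTaskMap with SUCCESS/ERROR entries older than the
// retention window removed; RUNNING and WAITING_INPUT entries are always kept.
function pruneCompletedTasks(resourceTaskMap, now = Date.now()) {
  const pruned = {};
  for (const [resourceId, task] of Object.entries(resourceTaskMap)) {
    const isFinished = task.status === 'SUCCESS' || task.status === 'ERROR';
    const finishedAt = task.finishedAt ?? task.startedAt; // assumed timestamp field
    const expired = isFinished && now - new Date(finishedAt).getTime() > TASK_RETENTION_MS;
    if (!expired) pruned[resourceId] = task;
  }
  return pruned;
}
```

Returning a fresh object (rather than mutating in place) keeps the store update compatible with Svelte's assignment-based reactivity.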

---

### 3. ActivityStore

```javascript
// [DEF:ActivityStore:Store]
// @TIER: STANDARD
// @SEMANTICS: activity-indicator, derived-state, task-count
// @PURPOSE: Provide reactive count of active tasks for the navbar Activity indicator
// @LAYER: UI
// @RELATION: DEPENDS_ON -> TaskDrawerStore
// @RELATION: BINDS_TO -> TopNavbar.svelte
// @INVARIANT: activeCount is always >= 0
// @INVARIANT: activeCount equals count of RUNNING tasks in resourceTaskMap

// @UX_STATE: Idle -> Badge hidden or shows "0"
// @UX_STATE: Active-N -> Badge shows count N with pulse animation
// @UX_STATE: Active-Many -> Badge shows "9+" when count exceeds 9

// @PRE: TaskDrawerStore is initialized and emitting values

// @POST: activeCount reflects exact count of tasks with status === 'RUNNING'
// @POST: recentTasks contains last 5 tasks sorted by startedAt desc

export const activityStore = derived(
  taskDrawerStore,
  ($drawer) => {
    const tasks = Object.values($drawer.resourceTaskMap);
    const activeCount = tasks.filter(t => t.status === 'RUNNING').length;
    const recentTasks = tasks
      .sort((a, b) => new Date(b.startedAt) - new Date(a.startedAt))
      .slice(0, 5);
    return { activeCount, recentTasks };
  }
);
// [/DEF:ActivityStore]
```
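The badge text rules in the `@UX_STATE` lines above (hide at zero, cap at "9+") are worth centralizing in one presentation helper so TopNavbar does not re-derive them; `formatActivityBadge` is an illustrative name, not an existing export:

```javascript
// Illustrative formatter for the navbar activity badge.
// Returns '' when the badge should be hidden, '9+' when the count exceeds 9.
function formatActivityBadge(activeCount) {
  if (activeCount <= 0) return '';   // Idle: badge hidden
  if (activeCount > 9) return '9+';  // Active-Many
  return String(activeCount);        // Active-N
}
```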

---

### 4. Sidebar Component

```svelte
<!-- [DEF:Sidebar:Component] -->
<!-- @TIER: CRITICAL -->
<!-- @SEMANTICS: navigation, layout, responsive -->
<!-- @PURPOSE: Persistent left sidebar with resource categories and collapsible sections -->
<!-- @LAYER: UI -->
<!-- @RELATION: DEPENDS_ON -> sidebarStore -->
<!-- @RELATION: BINDS_TO -> +layout.svelte -->
<!-- @INVARIANT: Exactly one category is marked as active at all times -->

<!-- @UX_STATE: Idle-Expanded -> Full sidebar with text labels, 240px width -->
<!-- @UX_STATE: Idle-Collapsed -> Icon-only mode, 64px width, tooltips visible -->
<!-- @UX_STATE: Mobile-Hidden -> Sidebar off-screen, hamburger button visible in TopNavbar -->
<!-- @UX_STATE: Mobile-Open -> Full overlay with backdrop, closes on outside click -->
<!-- @UX_STATE: Category-Expanded -> Sub-items visible under category -->
<!-- @UX_STATE: Category-Collapsed -> Sub-items hidden, chevron rotated -->

<!-- @UX_FEEDBACK: Active item has highlighted background and left border accent -->
<!-- @UX_FEEDBACK: Collapse/expand animation is 200ms ease-in-out -->
<!-- @UX_FEEDBACK: Mobile overlay has semi-transparent backdrop -->

<!-- @PRE: sidebarStore is initialized -->
<!-- @PRE: i18n translations are loaded for navigation labels -->

<!-- @POST: Clicking category toggles its expanded state -->
<!-- @POST: Clicking item navigates to route and sets activeItem -->
<!-- @POST: On mobile, clicking outside closes the sidebar -->

<script>
  // Props
  export let categories = [
    { id: 'dashboards', label: 'DASHBOARDS', items: [{ path: '/dashboards', label: 'Overview' }] },
    { id: 'datasets', label: 'DATASETS', items: [{ path: '/datasets', label: 'All Datasets' }] },
    { id: 'storage', label: 'STORAGE', items: [{ path: '/backups', label: 'Backups' }, { path: '/repos', label: 'Repositories' }] },
    { id: 'admin', label: 'ADMIN', items: [{ path: '/users', label: 'Users' }, { path: '/settings', label: 'Settings' }], adminOnly: true }
  ];
</script>
<!-- [/DEF:Sidebar] -->
```

---

### 5. TopNavbar Component

```svelte
<!-- [DEF:TopNavbar:Component] -->
<!-- @TIER: CRITICAL -->
<!-- @SEMANTICS: global-navigation, activity-indicator, user-menu -->
<!-- @PURPOSE: Consistent top navigation bar with global search, activity indicator, and user menu -->
<!-- @LAYER: UI -->
<!-- @RELATION: DEPENDS_ON -> activityStore -->
<!-- @RELATION: DEPENDS_ON -> authStore -->
<!-- @RELATION: BINDS_TO -> +layout.svelte -->
<!-- @INVARIANT: Height is fixed at 64px across all states -->

<!-- @UX_STATE: Idle -> All elements visible, activity badge shows count or hidden -->
<!-- @UX_STATE: UserMenu-Open -> Dropdown visible with Profile, Settings, Logout -->
<!-- @UX_STATE: Search-Focused -> Search input has focus ring, placeholder visible -->
<!-- @UX_STATE: Activity-Pulse -> Badge pulses when new task starts -->

<!-- @UX_FEEDBACK: Clicking logo navigates to home (/dashboards) -->
<!-- @UX_FEEDBACK: Activity badge shows count with color: gray (0), blue (running), red (error) -->
<!-- @UX_FEEDBACK: User menu closes on outside click or Escape key -->

<!-- @PRE: User is authenticated -->
<!-- @PRE: activityStore is subscribed and emitting values -->

<!-- @POST: Clicking Activity indicator opens TaskDrawer -->
<!-- @POST: Clicking Logout clears auth state and redirects to /login -->

<script>
  // Props
  export let logoText = 'Superset Tools';
  export let searchPlaceholder = 'Search...';
</script>
<!-- [/DEF:TopNavbar] -->
```
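The user-menu close behavior above ("outside click or Escape key") can be expressed as a small pure reducer over UI events, which the component's handlers would call; the event shape here is an assumption for illustration:

```javascript
// Illustrative reducer for the TopNavbar user-menu open/close state.
// event: { type: 'toggle' | 'keydown' | 'outside-click', key?: string }
function userMenuReducer(isOpen, event) {
  switch (event.type) {
    case 'toggle':
      return !isOpen;                                   // avatar button toggles the menu
    case 'keydown':
      return event.key === 'Escape' ? false : isOpen;   // only Escape closes it
    case 'outside-click':
      return false;                                     // clicking outside closes it
    default:
      return isOpen;
  }
}
```

Keeping the transition logic out of the DOM handlers makes the close rules testable without a browser.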

---

### 6. TaskDrawer Component

```svelte
<!-- [DEF:TaskDrawer:Component] -->
<!-- @TIER: CRITICAL -->
<!-- @SEMANTICS: task-monitoring, log-streaming, interactive-input -->
<!-- @PURPOSE: Slide-out drawer for real-time task monitoring and interaction -->
<!-- @LAYER: UI -->
<!-- @RELATION: DEPENDS_ON -> taskDrawerStore -->
<!-- @RELATION: DEPENDS_ON -> TaskLogViewer component -->
<!-- @RELATION: DEPENDS_ON -> WebSocket service -->
<!-- @RELATION: BINDS_TO -> +layout.svelte -->
<!-- @INVARIANT: Drawer width is fixed at 480px on desktop, 100% on mobile -->
<!-- @INVARIANT: Log stream auto-scrolls to bottom unless user scrolls up -->

<!-- @UX_STATE: Closed -> Drawer off-screen right, no content rendered -->
<!-- @UX_STATE: Opening -> Slide animation 150ms, content loads -->
<!-- @UX_STATE: Open-Loading -> Spinner shown while fetching initial logs -->
<!-- @UX_STATE: Open-Streaming -> Logs append in real-time with timestamps -->
<!-- @UX_STATE: Open-InputRequired -> Input form overlays log area -->
<!-- @UX_STATE: Open-Error -> Error banner at top with retry option -->

<!-- @UX_FEEDBACK: Close button (X) in top-right corner -->
<!-- @UX_FEEDBACK: Task status badge in header (color-coded) -->
<!-- @UX_FEEDBACK: Log entries have source labels (plugin, system, user) -->
<!-- @UX_RECOVERY: If task fails, show "View Details" button linking to full error -->

<!-- @PRE: taskDrawerStore.isOpen === true before rendering content -->
<!-- @PRE: WebSocket connection is active for real-time updates -->
<!-- @PRE: TaskLogViewer component is available -->

<!-- @POST: Drawer closes when clicking X, Escape key, or outside (optional) -->
<!-- @POST: Closing drawer does NOT cancel running task -->
<!-- @POST: Re-opening drawer restores previous scroll position -->

<script>
  // Props
  export let width = 480; // px
  export let closeOnOutsideClick = false;
</script>
<!-- [/DEF:TaskDrawer] -->
```
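The auto-scroll `@INVARIANT` above ("auto-scrolls to bottom unless user scrolls up") is usually implemented as a sticky-bottom check performed before each log entry is appended. A framework-free sketch — the helper name and the 16px threshold are assumptions:

```javascript
// Illustrative sticky-scroll check for the TaskDrawer log pane.
// Returns true when the view is at (or near) the bottom, i.e. auto-scroll
// should continue after the next log entry is appended.
function shouldAutoScroll(scrollTop, clientHeight, scrollHeight, thresholdPx = 16) {
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= thresholdPx;
}
```

The component would read the three values from the log container element before appending, and only call `scrollTo(bottom)` afterwards when the check returned true; a user who has scrolled up is left undisturbed.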

---

### 7. DashboardHub Component

```svelte
<!-- [DEF:DashboardHub:Component] -->
<!-- @TIER: CRITICAL -->
<!-- @SEMANTICS: resource-hub, bulk-operations, data-grid -->
<!-- @PURPOSE: Central hub for dashboard management with bulk selection and actions -->
<!-- @LAYER: UI -->
<!-- @RELATION: DEPENDS_ON -> requestApi -->
<!-- @RELATION: DEPENDS_ON -> taskDrawerStore -->
<!-- @RELATION: CALLS -> GET /api/dashboards -->
<!-- @RELATION: DISPATCHES -> MigrationPlugin -->
<!-- @RELATION: DISPATCHES -> BackupPlugin -->
<!-- @INVARIANT: Selected dashboards persist across pagination -->

<!-- @UX_STATE: Loading -> Skeleton loaders in grid rows -->
<!-- @UX_STATE: Empty-NoEnv -> "Select an environment" message with selector -->
<!-- @UX_STATE: Empty-NoData -> "No dashboards found" with refresh button -->
<!-- @UX_STATE: Idle-Grid -> Dashboard rows with checkboxes, pagination visible -->
<!-- @UX_STATE: Selecting -> Checkboxes checked, floating action panel appears -->
<!-- @UX_STATE: BulkAction-Modal -> Migration or Backup modal open -->
<!-- @UX_STATE: Task-Triggered -> TaskDrawer opens automatically -->

<!-- @UX_FEEDBACK: Row hover highlights entire row -->
<!-- @UX_FEEDBACK: Git status icon: green check (synced), orange diff (changes), gray (none) -->
<!-- @UX_FEEDBACK: Last task status: badge with color (green=success, red=error, blue=running) -->
<!-- @UX_FEEDBACK: Floating panel slides up from bottom when items selected -->
<!-- @UX_RECOVERY: Failed environment connection shows error banner with retry -->

<!-- @PRE: User has permission plugin:migration:execute for Migrate action -->
<!-- @PRE: User has permission plugin:backup:execute for Backup action -->
<!-- @PRE: At least one environment is configured -->

<!-- @POST: Clicking status badge opens TaskDrawer with that task -->
<!-- @POST: Bulk Migrate opens modal with source (read-only), target selector, db mappings -->
<!-- @POST: Bulk Backup opens modal with schedule options (one-time or cron) -->
<!-- @POST: Search filters results in real-time (debounced 300ms) -->
<!-- @POST: Pagination preserves selected items across page changes -->

<script>
  // State
  let dashboards = [];
  let selectedIds = new Set();
  let currentPage = 1;
  let pageSize = 10;
  let searchQuery = '';
  let environmentId = null;
</script>
<!-- [/DEF:DashboardHub] -->
```
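The `@INVARIANT` that selections persist across pagination follows naturally from keying `selectedIds` by dashboard id (a `Set`, as declared in the state above) rather than by row index. A sketch of the toggle helper — the function name is illustrative:

```javascript
// Illustrative selection toggle for the DashboardHub grid.
// Returns a new Set so Svelte's assignment-based reactivity picks up the
// change; because keys are dashboard ids (not row indices), the selection
// survives page changes and re-fetches.
function toggleSelection(selectedIds, dashboardId) {
  const next = new Set(selectedIds);
  if (next.has(dashboardId)) {
    next.delete(dashboardId);
  } else {
    next.add(dashboardId);
  }
  return next;
}
```

In the component, a checkbox handler would do `selectedIds = toggleSelection(selectedIds, row.id)`; the floating action panel simply watches `selectedIds.size > 0`.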
|
||||
---
|
||||
|
||||
### 8. DatasetHub Component
|
||||
|
||||
```svelte
|
||||
<!-- [DEF:DatasetHub:Component] -->
|
||||
<!-- @TIER: CRITICAL -->
|
||||
<!-- @SEMANTICS: resource-hub, semantic-mapping, documentation -->
|
||||
<!-- @PURPOSE: Central hub for dataset management with column mapping and documentation generation -->
|
||||
<!-- @LAYER: UI -->
|
||||
<!-- @RELATION: DEPENDS_ON -> requestApi -->
|
||||
<!-- @RELATION: DEPENDS_ON -> taskDrawerStore -->
|
||||
<!-- @RELATION: CALLS -> GET /api/datasets -->
|
||||
<!-- @RELATION: DISPATCHES -> MapperPlugin -->
|
||||
<!-- @RELATION: DISPATCHES -> LLMAnalysisPlugin -->
|
||||
<!-- @INVARIANT: Mapped % is calculated as (mapped_columns / total_columns) * 100 -->
|
||||
|
||||
<!-- @UX_STATE: Loading -> Skeleton loaders in grid rows -->
|
||||
<!-- @UX_STATE: Empty-NoEnv -> "Select an environment" message -->
|
||||
<!-- @UX_STATE: Empty-NoData -> "No datasets found" -->
|
||||
<!-- @UX_STATE: Idle-Grid -> Dataset rows with metadata columns -->
|
||||
<!-- @UX_STATE: Selecting -> Floating panel with Map Columns, Generate Docs, Validate -->
|
||||
<!-- @UX_STATE: Detail-View -> Expanded row or modal with table breakdown -->
|
||||
|
||||
<!-- @UX_FEEDBACK: Mapped % column shows progress bar + percentage text -->
|
||||
<!-- @UX_FEEDBACK: Tables column shows count of SQL tables extracted -->
|
||||
<!-- @UX_FEEDBACK: Columns column shows "X/Y" format (mapped/total) -->
|
||||
<!-- @UX_FEEDBACK: Start Mapping button is disabled until valid source is configured -->
|
||||
<!-- @UX_RECOVERY: Failed mapping shows error toast with "Retry" action -->
|
||||
|
||||
<!-- @PRE: User has permission plugin:mapper:execute for Map Columns -->
|
||||
<!-- @PRE: User has permission plugin:llm_analysis:execute for Generate Docs -->
|
||||
<!-- @PRE: LLM providers are configured for documentation generation -->
|
||||
|
||||
<!-- @POST: Clicking dataset name opens detail view with tables breakdown -->
|
||||
<!-- @POST: Map Columns modal shows source selection (PostgreSQL or XLSX) -->
|
||||
<!-- @POST: Generate Docs modal shows LLM provider selection and options -->
|
||||
<!-- @POST: Search filters by name, schema, and table names -->
|
||||
|
||||
<script>
|
||||
// State
|
||||
let datasets = [];
|
||||
let selectedIds = new Set();
|
||||
let currentPage = 1;
|
||||
let pageSize = 10;
|
||||
let searchQuery = '';
|
||||
let environmentId = null;
|
||||
</script>
|
||||
<!-- [/DEF:DatasetHub] -->
|
||||
```
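The @INVARIANT and the "X/Y" feedback rule above pin down exactly what the grid renders per dataset. A minimal sketch (helper names `mappedPercent` and `columnsLabel` are illustrative, not from the codebase):

```typescript
// Mapped % per the invariant: (mapped_columns / total_columns) * 100.
function mappedPercent(mappedColumns: number, totalColumns: number): number {
  if (totalColumns === 0) return 0; // guard: dataset with no columns maps to 0%
  return Math.round((mappedColumns / totalColumns) * 100);
}

// "X/Y" format (mapped/total) shown in the Columns column.
function columnsLabel(mappedColumns: number, totalColumns: number): string {
  return `${mappedColumns}/${totalColumns}`;
}
```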

---

### 9. Breadcrumbs Component

```svelte
<!-- [DEF:Breadcrumbs:Component] -->
<!-- @TIER: STANDARD -->
<!-- @SEMANTICS: navigation, hierarchy, wayfinding -->
<!-- @PURPOSE: Show page hierarchy and allow navigation to parent levels -->
<!-- @LAYER: UI -->
<!-- @RELATION: DEPENDS_ON -> $page store from $app/stores -->
<!-- @INVARIANT: Home is always the first item, current page is the last item -->

<!-- @UX_STATE: Short-Path -> All items visible (<= 3 levels) -->
<!-- @UX_STATE: Long-Path -> Middle items truncated with ellipsis (...) -->

<!-- @UX_FEEDBACK: Clickable items have hover underline -->
<!-- @UX_FEEDBACK: Current page (last item) is not clickable, shown in bold -->

<!-- @PRE: Route structure is defined and consistent -->

<!-- @POST: Clicking a breadcrumb navigates to that level -->
<!-- @POST: Deep paths (>3 levels) truncate middle segments -->

<script>
// Props
export let items = []; // [{ label: string, path: string | null }]
export let maxVisible = 3;
</script>
<!-- [/DEF:Breadcrumbs] -->
```
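The Long-Path truncation rule can be sketched as a pure function over the `items` prop. This is an assumed implementation (the spec only fixes the invariant: Home stays first, the current page stays last, middle segments collapse into an ellipsis item):

```typescript
interface Crumb { label: string; path: string | null; }

// Collapse middle segments into a single non-clickable "..." crumb when the
// path is deeper than maxVisible levels. Home (items[0]) is always kept.
function truncateCrumbs(items: Crumb[], maxVisible = 3): Crumb[] {
  if (items.length <= maxVisible) return items;
  const tailCount = maxVisible - 1; // keep this many trailing items
  return [
    items[0],
    { label: '...', path: null }, // path: null => rendered non-clickable
    ...items.slice(items.length - tailCount),
  ];
}
```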

---

## Backend Modules

### 10. Dashboards API

```python
# [DEF:DashboardsAPI:Module]
# @TIER: CRITICAL
# @SEMANTICS: rest-api, resource-fetching, git-integration
# @PURPOSE: API endpoints for Dashboard Hub - list dashboards with metadata
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> superset_client
# @RELATION: DEPENDS_ON -> git_service
# @RELATION: DEPENDS_ON -> task_manager
# @RELATION: CALLS -> Superset API /api/v1/dashboard/

# @PRE: env_id parameter is valid UUID of existing environment
# @PRE: User has permission to read dashboards from specified environment

# @POST: Response includes all dashboards accessible in environment
# @POST: Each dashboard has git_status with branch and sync_status
# @POST: Each dashboard has last_task with most recent task status
# @POST: Search parameter filters by title and slug (case-insensitive)

# Endpoint: GET /api/dashboards
# Query: env_id (required), search (optional)
# Response: { dashboards: [...] }

# Endpoint: POST /api/dashboards/migrate
# Body: { source_env_id, target_env_id, dashboard_ids, db_mappings }
# Response: { task_id }

# Endpoint: POST /api/dashboards/backup
# Body: { env_id, dashboard_ids, schedule (optional cron) }
# Response: { task_id }
# [/DEF:DashboardsAPI]
```

---

### 11. Datasets API

```python
# [DEF:DatasetsAPI:Module]
# @TIER: CRITICAL
# @SEMANTICS: rest-api, dataset-metadata, column-mapping
# @PURPOSE: API endpoints for Dataset Hub - list datasets with mapping info
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> superset_client
# @RELATION: DEPENDS_ON -> mapping_service
# @RELATION: DEPENDS_ON -> task_manager
# @RELATION: CALLS -> Superset API /api/v1/dataset/

# @PRE: env_id parameter is valid UUID of existing environment
# @PRE: User has permission to read datasets from specified environment

# @POST: Response includes all datasets accessible in environment
# @POST: Each dataset has mapped_fields count (mapped/total)
# @POST: Each dataset has extracted SQL table names
# @POST: Search filters by table_name, schema, and extracted table names

# Endpoint: GET /api/datasets
# Query: env_id (required), search (optional)
# Response: { datasets: [...] }

# Endpoint: POST /api/datasets/map-columns
# Body: { env_id, dataset_ids, source_type, connection_id or file }
# Response: { task_id }

# Endpoint: POST /api/datasets/generate-docs
# Body: { env_id, dataset_ids, llm_provider, options }
# Response: { task_id }
# [/DEF:DatasetsAPI]
```

---

### 12. Activity API

```python
# [DEF:ActivityAPI:Module]
# @TIER: STANDARD
# @SEMANTICS: rest-api, activity-summary, task-metrics
# @PURPOSE: Provide activity summary for navbar indicator
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> task_manager
# @RELATION: CALLS -> task_manager.get_active_tasks()

# @PRE: User is authenticated

# @POST: Returns count of RUNNING tasks
# @POST: Returns last 5 tasks sorted by started_at desc
# @POST: Tasks include resource_name and resource_type for display

# Endpoint: GET /api/activity
# Response: { active_count: number, recent_tasks: [...] }
# [/DEF:ActivityAPI]
```

---

### 13. ResourceService

```python
# [DEF:ResourceService:Class]
# @TIER: STANDARD
# @SEMANTICS: service-layer, shared-logic, resource-fetching
# @PURPOSE: Shared logic for fetching resources from Superset with caching
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> superset_client
# @RELATION: DEPENDS_ON -> ConfigManager
# @INVARIANT: All methods accept env_id and resolve it via ConfigManager

# @PRE: env_id corresponds to a valid Environment configuration
# @PRE: ConfigManager returns valid Superset connection params

# @POST: Returns normalized resource data structure
# @POST: Handles connection errors gracefully with meaningful messages
# @POST: Caches results for 30 seconds to reduce API load

class ResourceService:
    async def fetch_dashboards(self, env_id: str, search: Optional[str] = None) -> list[Dashboard]: ...
    async def fetch_datasets(self, env_id: str, search: Optional[str] = None) -> list[Dataset]: ...
    async def get_git_status(self, env_id: str, resource_id: str) -> GitStatus: ...
    async def get_last_task(self, resource_type: str, resource_id: str) -> TaskSummary: ...
# [/DEF:ResourceService]
```
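The 30-second caching @POST amounts to a TTL cache keyed per environment and query. A minimal sketch, transplanted to TypeScript for brevity (the real service is Python) with an injectable clock; `TtlCache` and the key format are assumptions, not the actual implementation:

```typescript
// TTL cache: entries expire ttlMs after insertion; stale reads evict and miss.
class TtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (this.now() > entry.expiresAt) {
      this.entries.delete(key); // expired: evict and report a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}
```

Keying on `env_id` plus the search term keeps distinct queries from colliding; injecting `now` makes the expiry behavior unit-testable without sleeping.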

---

## Contract Compliance Matrix

| Module | TIER | @PRE/@POST | @UX_STATE | @RELATION | Status |
|--------|------|------------|-----------|-----------|--------|
| SidebarStore | STANDARD | ✅ | ✅ | ✅ | Ready |
| TaskDrawerStore | CRITICAL | ✅ | ✅ | ✅ | Ready |
| ActivityStore | STANDARD | ✅ | ✅ | ✅ | Ready |
| Sidebar | CRITICAL | ✅ | ✅ | ✅ | Ready |
| TopNavbar | CRITICAL | ✅ | ✅ | ✅ | Ready |
| TaskDrawer | CRITICAL | ✅ | ✅ | ✅ | Ready |
| DashboardHub | CRITICAL | ✅ | ✅ | ✅ | Ready |
| DatasetHub | CRITICAL | ✅ | ✅ | ✅ | Ready |
| Breadcrumbs | STANDARD | ✅ | ✅ | ✅ | Ready |
| DashboardsAPI | CRITICAL | ✅ | N/A | ✅ | Ready |
| DatasetsAPI | CRITICAL | ✅ | N/A | ✅ | Ready |
| ActivityAPI | STANDARD | ✅ | N/A | ✅ | Ready |
| ResourceService | STANDARD | ✅ | N/A | ✅ | Ready |

---

## UX State Mapping to Components

| UX State (from ux_reference.md) | Component | @UX_STATE Tag |
|----------------------------------|-----------|---------------|
| Sidebar Expanded | Sidebar | Idle-Expanded |
| Sidebar Collapsed | Sidebar | Idle-Collapsed |
| Mobile Overlay | Sidebar | Mobile-Open |
| Task Drawer Closed | TaskDrawer | Closed |
| Task Drawer Streaming | TaskDrawer | Open-Streaming |
| Task Drawer Input Required | TaskDrawer | Open-InputRequired |
| Activity Badge Active | TopNavbar | Activity-Pulse |
| Dashboard Grid Loading | DashboardHub | Loading |
| Dashboard Grid Empty | DashboardHub | Empty-NoData |
| Bulk Selection Active | DashboardHub | Selecting |
| Migration Modal Open | DashboardHub | BulkAction-Modal |
| Dataset Grid Loading | DatasetHub | Loading |
| Dataset Detail View | DatasetHub | Detail-View |

---

## Design-by-Contract Enforcement

### Critical Modules (Full Contracts Required)

All CRITICAL tier modules MUST have:
- Complete `@PRE` conditions for all public functions
- Complete `@POST` conditions guaranteeing output
- All `@UX_STATE` transitions documented
- `@UX_FEEDBACK` and `@UX_RECOVERY` where applicable
- `@RELATION` tags for all dependencies

### Standard Modules (Basic Contracts Required)

All STANDARD tier modules MUST have:
- `@PURPOSE` declaration
- Key `@PRE` and `@POST` for main operations
- Primary `@UX_STATE` states
- `@RELATION` tags for major dependencies

---

## Constitution Compliance

| Principle | Compliance | Notes |
|-----------|------------|-------|
| I. Semantic Protocol | ✅ | All modules use [DEF] anchors and proper tagging |
| II. Everything is a Plugin | ✅ | Dashboard/Dataset hubs dispatch to existing plugins |
| III. Unified Frontend | ✅ | Uses requestApi, i18n, and component library |
| IV. Security & Access Control | ✅ | Permission checks in @PRE conditions |
| V. Independent Testability | ✅ | Each hub has defined independent test scenarios |
| VI. Asynchronous Execution | ✅ | All bulk operations return task_id, use TaskManager |
122 specs/019-superset-ux-redesign/data-model.md (new file)
@@ -0,0 +1,122 @@

# Data Model: Superset-Style UX Redesign

**Feature**: 019-superset-ux-redesign
**Date**: 2026-02-09

## Overview

This feature primarily introduces frontend state management and UI components. No new database tables are required. The data model focuses on client-side stores for managing UI state.

## Frontend Stores

### 1. SidebarStore

**Purpose**: Manage sidebar visibility and collapse state

```typescript
interface SidebarState {
  isExpanded: boolean;    // true = full width, false = icons only
  activeCategory: string; // 'dashboards' | 'datasets' | 'storage' | 'admin'
  activeItem: string;     // Current route path
  isMobileOpen: boolean;  // For mobile overlay mode
}
```

**Persistence**: localStorage key `sidebar_state`
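Hydrating that localStorage key needs a fallback path for a missing or corrupt value. A sketch under assumptions (the default values and the `hydrateSidebar` helper are invented for illustration; the spec only fixes the key name and the state shape):

```typescript
interface SidebarState {
  isExpanded: boolean;
  activeCategory: string;
  activeItem: string;
  isMobileOpen: boolean;
}

// Assumed defaults; the actual app may choose different initial values.
const DEFAULT_SIDEBAR: SidebarState = {
  isExpanded: true,
  activeCategory: 'dashboards',
  activeItem: '/',
  isMobileOpen: false,
};

// raw is what localStorage.getItem('sidebar_state') returned (null if unset).
function hydrateSidebar(raw: string | null): SidebarState {
  if (raw === null) return DEFAULT_SIDEBAR;
  try {
    // Spread over defaults so a partial/older payload still yields a full state.
    return { ...DEFAULT_SIDEBAR, ...JSON.parse(raw) };
  } catch {
    return DEFAULT_SIDEBAR; // corrupt JSON: fall back rather than crash
  }
}
```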

### 2. TaskDrawerStore

**Purpose**: Manage Task Drawer visibility and resource-to-task mapping

```typescript
interface TaskDrawerState {
  isOpen: boolean;
  activeTaskId: string | null;
  resourceTaskMap: Record<string, {
    taskId: string;
    status: 'IDLE' | 'RUNNING' | 'WAITING_INPUT' | 'SUCCESS' | 'ERROR';
  }>;
}
```

**Example**:

```javascript
resourceTaskMap: {
  "dashboard-uuid-123": { taskId: "task-abc", status: "RUNNING" },
  "dataset-uuid-456": { taskId: "task-def", status: "SUCCESS" }
}
```
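When a task status event arrives, `resourceTaskMap` should be replaced immutably so store subscribers re-render. A minimal sketch (`setResourceTask` is an illustrative name, not the store's actual API):

```typescript
type TaskStatus = 'IDLE' | 'RUNNING' | 'WAITING_INPUT' | 'SUCCESS' | 'ERROR';
type ResourceTaskMap = Record<string, { taskId: string; status: TaskStatus }>;

// Returns a new map with the entry for resourceId upserted; never mutates input,
// which is what Svelte-style stores expect for change detection.
function setResourceTask(
  map: ResourceTaskMap,
  resourceId: string,
  taskId: string,
  status: TaskStatus,
): ResourceTaskMap {
  return { ...map, [resourceId]: { taskId, status } };
}
```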

### 3. ActivityStore

**Purpose**: Track count of active tasks for navbar indicator

```typescript
interface ActivityState {
  activeCount: number;        // Number of RUNNING tasks
  recentTasks: TaskSummary[]; // Last 5 tasks for quick access
}

interface TaskSummary {
  taskId: string;
  resourceName: string;
  resourceType: 'dashboard' | 'dataset';
  status: string;
  startedAt: string;
}
```
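ActivityState is derivable from a task list, mirroring the Activity API contract (count of RUNNING tasks, last 5 by `startedAt` descending). A sketch assuming ISO-8601 timestamps, so lexicographic comparison orders correctly:

```typescript
interface TaskSummary {
  taskId: string;
  resourceName: string;
  resourceType: 'dashboard' | 'dataset';
  status: string;
  startedAt: string; // ISO-8601, so string comparison matches time order
}

function deriveActivity(tasks: TaskSummary[]) {
  return {
    activeCount: tasks.filter(t => t.status === 'RUNNING').length,
    recentTasks: [...tasks] // copy before sorting to avoid mutating the input
      .sort((a, b) => b.startedAt.localeCompare(a.startedAt))
      .slice(0, 5),
  };
}
```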

### 4. SettingsStore

**Purpose**: Cache settings data for consolidated settings page

```typescript
interface SettingsState {
  activeTab: 'environments' | 'connections' | 'llm' | 'logging' | 'system';
  environments: Environment[];
  connections: Connection[];
  llmSettings: LLMSettings;
  loggingSettings: LoggingSettings;
}
```

## Backend Entities (Existing)

The following entities are used but not modified:

### Task (from `backend/src/models/task.py`)
- `id`: UUID
- `status`: Enum (RUNNING, SUCCESS, ERROR, WAITING_INPUT)
- `plugin_name`: String
- `created_at`: DateTime
- `metadata`: JSON (includes `resource_id`, `resource_type`)

### Environment (from `backend/src/models/connection.py`)
- Used for Source Environment selector in hubs

## State Transitions

### Task Status in Resource Grid

```
IDLE → (user triggers action) → RUNNING
RUNNING → (task needs input) → WAITING_INPUT
RUNNING → (task completes) → SUCCESS
RUNNING → (task fails) → ERROR
WAITING_INPUT → (user provides input) → RUNNING
```
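These transitions can be encoded as an allow-list, which is handy for guarding store updates against out-of-order WebSocket events. An illustrative sketch (`canTransition` is an assumed helper, not part of the spec):

```typescript
// Allowed next states per the task status diagram; SUCCESS and ERROR are terminal.
const ALLOWED: Record<string, string[]> = {
  IDLE: ['RUNNING'],
  RUNNING: ['WAITING_INPUT', 'SUCCESS', 'ERROR'],
  WAITING_INPUT: ['RUNNING'],
  SUCCESS: [],
  ERROR: [],
};

function canTransition(from: string, to: string): boolean {
  return (ALLOWED[from] ?? []).includes(to);
}
```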

### Sidebar State Flow

```
Expanded ←→ Collapsed (user toggle)
Hidden (mobile) → Overlay Open (hamburger click) → Hidden (outside click)
```

### Task Drawer State Flow

```
Closed → Open (click status badge or activity indicator)
Open → Closed (click X or select different task)
Open → Open+DifferentTask (click different status badge)
```