Compare commits: fdcbe32dfa...019-supers (3 commits)

| Author | SHA1 | Date |
|---|---|---|
| | 0cf0ef25f1 | |
| | af74841765 | |
| | d7e4919d54 | |

.ai/MODULE_MAP.md (new file, 1250 lines)
File diff suppressed because it is too large.
@@ -3,7 +3,7 @@
> Compressed view for AI Context. Generated automatically.
- 📦 **generate_semantic_map** (`Module`) `[CRITICAL]`
- 📝 Scans the codebase to generate a Semantic Map and Compliance Report based on the System Standard.
- 📝 Scans the codebase to generate a Semantic Map, Module Map, and Compliance Report based on the System Standard.
- 🏗️ Layer: DevOps/Tooling
- 🔒 Invariant: All DEF anchors must have matching closing anchors; TIER determines validation strictness.
- ƒ **__init__** (`Function`) `[TRIVIAL]`
@@ -71,8 +71,57 @@
- 📝 Generates the token-optimized project map with enhanced Svelte details.
- ƒ **_write_entity_md** (`Function`) `[CRITICAL]`
- 📝 Recursive helper to write entity tree to Markdown with tier badges and enhanced details.
- ƒ **_generate_module_map** (`Function`) `[CRITICAL]`
- 📝 Generates a module-centric map grouping entities by directory structure.
- ƒ **_get_module_path** (`Function`)
- 📝 Extracts the module path from a file path.
- ƒ **_collect_all_entities** (`Function`)
- 📝 Flattens entity tree for easier grouping.
- ƒ **to_dict** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 📦 **TransactionCore** (`Module`) `[CRITICAL]`
- 📝 Core banking transaction processor with ACID guarantees.
- 🏗️ Layer: Domain (Core)
- 🔒 Invariant: Negative transfers are strictly forbidden.
- 🔗 DEPENDS_ON -> `[DEF:Infra:PostgresDB]`
- 🔗 DEPENDS_ON -> `[DEF:Infra:AuditLog]`
- ƒ **execute_transfer** (`Function`)
- 📝 Atomically move funds between accounts with audit trails.
- 🔗 CALLS -> `atomic_transaction`
- 📦 **PluginExampleShot** (`Module`)
- 📝 Reference implementation of a plugin following GRACE standards.
- 🏗️ Layer: Domain (Business Logic)
- 🔒 Invariant: get_schema must return valid JSON Schema.
- 🔗 INHERITS -> `PluginBase`
- ƒ **get_schema** (`Function`)
- 📝 Defines input validation schema.
- ƒ **execute** (`Function`)
- 📝 Core plugin logic with structured logging and scope isolation.
- ƒ **id** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 📦 **BackendRouteShot** (`Module`)
- 📝 Reference implementation of a task-based route using GRACE-Poly.
- 🏗️ Layer: Interface (API)
- 🔒 Invariant: TaskManager must be available in dependency graph.
- 🔗 IMPLEMENTS -> `[DEF:Std:API_FastAPI]`
- ƒ **create_task** (`Function`)
- 📝 Create and start a new task using TaskManager. Non-blocking.
- 🔗 CALLS -> `task_manager.create_task`
- 🧩 **FrontendComponentShot** (`Component`) `[CRITICAL]`
- 📝 Action button to spawn a new task with full UX feedback cycle.
- 🏗️ Layer: UI (Presentation)
- 🔒 Invariant: Must prevent double-submission while loading.
- 📥 Props: plugin_id: any, params: any
- ⬅️ READS_FROM `lib`
- ⬅️ READS_FROM `t`
- ƒ **spawnTask** (`Function`)
- 📦 **DashboardTypes** (`Module`) `[TRIVIAL]`
- 📝 TypeScript interfaces for Dashboard entities
- 🏗️ Layer: Domain
- 🧩 **Counter** (`Component`) `[TRIVIAL]`
- 📝 Simple counter demo component
- 🏗️ Layer: UI
- ➡️ WRITES_TO `state`
- 📦 **stores_module** (`Module`)
- 📝 Global state management using Svelte stores.
- 🏗️ Layer: UI-State
@@ -116,6 +165,11 @@
- 📝 Generic request wrapper.
- 📦 **api** (`Data`)
- 📝 API client object with specific methods.
- 📦 **Utils** (`Module`) `[TRIVIAL]`
- 📝 General utility functions (class merging)
- 🏗️ Layer: Infra
- ƒ **cn** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 🗄️ **authStore** (`Store`)
- 📝 Manages the global authentication state on the frontend.
- 🏗️ Layer: Feature
@@ -131,9 +185,9 @@
- 📝 Clears authentication state and storage.
- ƒ **setLoading** (`Function`)
- 📝 Updates the loading state.
- 📦 **debounce** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/lib/utils/debounce.js
- 🏗️ Layer: Unknown
- 📦 **Debounce** (`Module`) `[TRIVIAL]`
- 📝 Debounce utility for limiting function execution rate
- 🏗️ Layer: Infra
- ƒ **debounce** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
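The Debounce entry above describes a utility that limits how often a function executes. As a rough illustration of the technique only (the project's actual `frontend/src/lib/utils/debounce.js` is JavaScript and is not shown in this diff), here is a minimal Python sketch using `threading.Timer`:

```python
import threading

def debounce(wait_seconds):
    """Decorator: delay each call; only the last call within the window fires."""
    def decorator(fn):
        timer = None
        lock = threading.Lock()

        def wrapper(*args, **kwargs):
            nonlocal timer
            with lock:
                # Cancel any pending invocation and restart the countdown.
                if timer is not None:
                    timer.cancel()
                timer = threading.Timer(wait_seconds, fn, args, kwargs)
                timer.start()
        return wrapper
    return decorator
```

Rapid repeated calls collapse into a single trailing invocation, which is the usual contract for rate-limiting UI event handlers.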
- 🗄️ **taskDrawer** (`Store`) `[CRITICAL]`
@@ -181,8 +235,9 @@
- 📝 Unit tests for sidebar store
- 🏗️ Layer: Domain (Tests)
- ƒ **test_sidebar_initial_state** (`Function`)
- ƒ **test_toggleSidebar** (`Function`)
- ƒ **test_setActiveItem** (`Function`)
- ƒ **test_toggleSidebar** (`Function`)
- ƒ **test_setActiveItem** (`Function`)
- ƒ **test_mobile_functions** (`Function`)
- 📦 **frontend.src.lib.stores.__tests__.test_activity** (`Module`)
- 📝 Unit tests for activity store
- 🏗️ Layer: UI
@@ -202,7 +257,9 @@
- 🧩 **Select** (`Component`) `[TRIVIAL]`
- 📝 Standardized dropdown selection component.
- 🏗️ Layer: Atom
- 📥 Props: label: string, value: string | number, disabled: boolean
- ⬅️ READS_FROM `lib`
- ➡️ WRITES_TO `bindable`
- ➡️ WRITES_TO `props`
- 📦 **ui** (`Module`) `[TRIVIAL]`
- 📝 Central export point for standardized UI components.
- 🏗️ Layer: Atom
@@ -210,21 +267,26 @@
- 🧩 **PageHeader** (`Component`) `[TRIVIAL]`
- 📝 Standardized page header with title and action area.
- 🏗️ Layer: Atom
- 📥 Props: title: string
- ⬅️ READS_FROM `lib`
- ➡️ WRITES_TO `props`
- 🧩 **Card** (`Component`) `[TRIVIAL]`
- 📝 Standardized container with padding and elevation.
- 🏗️ Layer: Atom
- 📥 Props: title: string
- ⬅️ READS_FROM `lib`
- ➡️ WRITES_TO `props`
- 🧩 **Button** (`Component`) `[TRIVIAL]`
- 📝 Define component interface and default values.
- 📝 Define component interface and default values (Svelte 5 Runes).
- 🏗️ Layer: Atom
- 🔒 Invariant: Supports accessible labels and keyboard navigation.
- 📥 Props: isLoading: boolean, disabled: boolean
- ⬅️ READS_FROM `lib`
- ➡️ WRITES_TO `props`
- 🧩 **Input** (`Component`) `[TRIVIAL]`
- 📝 Standardized text input component with label and error handling.
- 🏗️ Layer: Atom
- 🔒 Invariant: Consistent spacing and focus states.
- 📥 Props: label: string, value: string, placeholder: string, error: string, disabled: boolean
- ⬅️ READS_FROM `lib`
- ➡️ WRITES_TO `bindable`
- ➡️ WRITES_TO `props`
- 🧩 **LanguageSwitcher** (`Component`) `[TRIVIAL]`
- 📝 Dropdown component to switch between supported languages.
- 🏗️ Layer: Atom
@@ -293,10 +355,9 @@
- 📝 Display page hierarchy navigation
- 🏗️ Layer: UI
- 🔒 Invariant: Always shows current page path
- 📥 Props: maxVisible: any
- ⬅️ READS_FROM `app`
- ⬅️ READS_FROM `lib`
- ⬅️ READS_FROM `page`
- ➡️ WRITES_TO `props`
- 📦 **Breadcrumbs** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/lib/components/layout/Breadcrumbs.svelte
- 🏗️ Layer: Unknown
@@ -328,6 +389,12 @@
- 📝 Auto-detected function (orphan)
- ƒ **disconnectWebSocket** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 📦 **ErrorPage** (`Page`)
- 📝 Global error page displaying HTTP status and messages
- 🏗️ Layer: UI
- 📦 **RootLayoutConfig** (`Module`) `[TRIVIAL]`
- 📝 Root layout configuration (SPA mode)
- 🏗️ Layer: Infra
- 📦 **HomePage** (`Page`) `[CRITICAL]`
- 📝 Redirect to Dashboard Hub as per UX requirements
- 🏗️ Layer: UI
@@ -692,8 +759,10 @@
- 🧩 **PasswordPrompt** (`Component`)
- 📝 A modal component to prompt the user for database passwords when a migration task is paused.
- 🏗️ Layer: UI
- 📥 Props: show: any, databases: any, errorMessage: any
- ⚡ Events: cancel, resume
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `state`
- ⬅️ READS_FROM `effect`
- ƒ **handleSubmit** (`Function`)
- 📝 Validates and dispatches the passwords to resume the task.
- ƒ **handleCancel** (`Function`)
@@ -703,6 +772,7 @@
- 🏗️ Layer: Feature
- 🔒 Invariant: Each source database can be mapped to one target database.
- ⚡ Events: update
- ➡️ WRITES_TO `props`
- ƒ **updateMapping** (`Function`)
- 📝 Updates a mapping for a specific source database.
- ƒ **getSuggestion** (`Function`)
@@ -711,13 +781,12 @@
- 📝 Displays detailed logs for a specific task inline or in a modal using TaskLogPanel.
- 🏗️ Layer: UI
- 🔒 Invariant: Real-time logs are always appended without duplicates.
- 📥 Props: show: any, inline: any, taskId: any, taskStatus: any, realTimeLogs: any
- ⚡ Events: close
- ⬅️ READS_FROM `t`
- ➡️ WRITES_TO `bindable`
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `state`
- 📦 **handleRealTimeLogs** (`Action`)
- 📝 Append real-time logs as they arrive from WebSocket, preventing duplicates
- ƒ **fetchLogs** (`Function`)
- 📝 Fetches logs for the current task from API (polling fallback).
- 📦 **TaskLogViewer** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/components/TaskLogViewer.svelte
- 🏗️ Layer: Unknown
@@ -732,8 +801,8 @@
- 📝 Prompts the user to provide a database mapping when one is missing during migration.
- 🏗️ Layer: Feature
- 🔒 Invariant: Modal blocks migration progress until resolved or cancelled.
- 📥 Props: show: boolean, sourceDbName: string, sourceDbUuid: string
- ⚡ Events: cancel, resolve
- ➡️ WRITES_TO `props`
- ƒ **resolve** (`Function`)
- 📝 Dispatches the resolution event with the selected mapping.
- ƒ **cancel** (`Function`)
@@ -742,10 +811,10 @@
- 📝 Displays a grid of dashboards with selection and pagination.
- 🏗️ Layer: Component
- 🔒 Invariant: Selected IDs must be a subset of available dashboards.
- 📥 Props: dashboards: DashboardMetadata[], selectedIds: number[], environmentId: string
- ⚡ Events: selectionChanged
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `derived`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- ƒ **handleValidate** (`Function`)
- 📝 Triggers dashboard validation task.
- ƒ **handleSort** (`Function`)
@@ -816,8 +885,8 @@
- 🧩 **TaskList** (`Component`)
- 📝 Displays a list of tasks with their status and execution details.
- 🏗️ Layer: Component
- 📥 Props: tasks: Array<any>, loading: boolean
- ⚡ Events: select
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- ƒ **getStatusColor** (`Function`)
@@ -829,8 +898,8 @@
- 🧩 **DynamicForm** (`Component`)
- 📝 Generates a form dynamically based on a JSON schema.
- 🏗️ Layer: UI
- 📥 Props: schema: any
- ⚡ Events: submit
- ➡️ WRITES_TO `props`
- ƒ **handleSubmit** (`Function`)
- 📝 Dispatches the submit event with the form data.
- ƒ **initializeForm** (`Function`)
@@ -839,8 +908,8 @@
- 📝 Provides a UI component for selecting source and target environments.
- 🏗️ Layer: Feature
- 🔒 Invariant: Source and target environments must be selectable from the list of configured environments.
- 📥 Props: label: string, selectedId: string
- ⚡ Events: change
- ➡️ WRITES_TO `props`
- ƒ **handleSelect** (`Function`)
- 📝 Dispatches the selection change event.
- 🧩 **ProtectedRoute** (`Component`) `[TRIVIAL]`
@@ -850,11 +919,13 @@
- ⬅️ READS_FROM `app`
- ⬅️ READS_FROM `auth`
- 🧩 **TaskLogPanel** (`Component`)
- 📝 Component properties and state.
- 📝 Combines log filtering and display into a single cohesive dark-themed panel.
- 🏗️ Layer: UI
- 🔒 Invariant: Must always display logs in chronological order and respect auto-scroll preference.
- 📥 Props: logs: any, autoScroll: any
- ⚡ Events: filterChange
- ➡️ WRITES_TO `bindable`
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `state`
- 📦 **TaskLogPanel** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/components/tasks/TaskLogPanel.svelte
- 🏗️ Layer: Unknown
@@ -869,7 +940,9 @@
- 🧩 **LogFilterBar** (`Component`)
- 📝 Compact filter toolbar for logs — level, source, and text search in a single dense row.
- 🏗️ Layer: UI
- 📥 Props: availableSources: any, selectedLevel: any, selectedSource: any, searchText: any
- ➡️ WRITES_TO `bindable`
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `derived`
- 📦 **LogFilterBar** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/components/tasks/LogFilterBar.svelte
- 🏗️ Layer: Unknown
@@ -884,21 +957,15 @@
- 🧩 **LogEntryRow** (`Component`)
- 📝 Renders a single log entry with stacked layout optimized for narrow drawer panels.
- 🏗️ Layer: UI
- 📥 Props: log: any, showSource: any
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `derived`
- ƒ **formatTime** (`Function`)
- 📝 Format ISO timestamp to HH:MM:SS
- 📦 **LogEntryRow** (`Module`) `[TRIVIAL]`
- 📝 Auto-generated module for frontend/src/components/tasks/LogEntryRow.svelte
- 🏗️ Layer: Unknown
- ƒ **getLevelClass** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- ƒ **getSourceClass** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 🧩 **FileList** (`Component`)
- 📝 Displays a table of files with metadata and actions.
- 🏗️ Layer: UI
- 📥 Props: files: any
- ⚡ Events: delete, navigate
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- ƒ **isDirectory** (`Function`)
@@ -911,6 +978,7 @@
- 📝 Provides a form for uploading files to a specific category.
- 🏗️ Layer: UI
- ⚡ Events: uploaded
- ➡️ WRITES_TO `props`
- ⬅️ READS_FROM `t`
- ➡️ WRITES_TO `t`
- ƒ **handleUpload** (`Function`)
@@ -964,7 +1032,7 @@
- 🧩 **CommitHistory** (`Component`)
- 📝 Displays the commit history for a specific dashboard.
- 🏗️ Layer: Component
- 📥 Props: dashboardId: any
- ➡️ WRITES_TO `props`
- ⬅️ READS_FROM `t`
- ➡️ WRITES_TO `t`
- ƒ **onMount** (`Function`)
@@ -975,8 +1043,9 @@
- 📝 Modal for deploying a dashboard to a target environment.
- 🏗️ Layer: Component
- 🔒 Invariant: Cannot deploy without a selected environment.
- 📥 Props: dashboardId: any, show: any
- ⚡ Events: deploy
- ➡️ WRITES_TO `props`
- ⬅️ READS_FROM `effect`
- 📦 **loadStatus** (`Watcher`)
- ƒ **loadEnvironments** (`Function`)
- 📝 Fetch available environments from API.
@@ -986,8 +1055,8 @@
- 📝 UI for resolving merge conflicts (Keep Mine / Keep Theirs).
- 🏗️ Layer: Component
- 🔒 Invariant: User must resolve all conflicts before saving.
- 📥 Props: conflicts: any, show: any
- ⚡ Events: resolve
- ➡️ WRITES_TO `props`
- ƒ **resolve** (`Function`)
- 📝 Set resolution strategy for a file.
- ƒ **handleSave** (`Function`)
@@ -995,8 +1064,9 @@
- 🧩 **CommitModal** (`Component`)
- 📝 Modal window for creating a commit with a change preview (diff).
- 🏗️ Layer: Component
- 📥 Props: dashboardId: any, show: any
- ⚡ Events: commit
- ➡️ WRITES_TO `props`
- ⬅️ READS_FROM `effect`
- ƒ **handleGenerateMessage** (`Function`)
- 📝 Generates a commit message using LLM.
- ƒ **loadStatus** (`Function`)
@@ -1006,8 +1076,8 @@
- 🧩 **BranchSelector** (`Component`)
- 📝 UI for selecting and creating Git branches.
- 🏗️ Layer: Component
- 📥 Props: dashboardId: any, currentBranch: any
- ⚡ Events: change
- ➡️ WRITES_TO `props`
- ⬅️ READS_FROM `t`
- ƒ **onMount** (`Function`)
- 📝 Load branches when component is mounted.
@@ -1022,7 +1092,7 @@
- 🧩 **GitManager** (`Component`)
- 📝 Central component for managing Git operations for a specific dashboard.
- 🏗️ Layer: Component
- 📥 Props: dashboardId: any, dashboardTitle: any, show: any
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- ƒ **checkStatus** (`Function`)
@@ -1038,7 +1108,7 @@
- 🧩 **DocPreview** (`Component`)
- 📝 UI component for previewing generated dataset documentation before saving.
- 🏗️ Layer: UI
- 📥 Props: documentation: any, onSave: any, onCancel: any
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- 📦 **DocPreview** (`Module`) `[TRIVIAL]`
@@ -1049,7 +1119,7 @@
- 🧩 **ProviderConfig** (`Component`)
- 📝 UI form for managing LLM provider configurations.
- 🏗️ Layer: UI
- 📥 Props: providers: any, onSave: any
- ➡️ WRITES_TO `props`
- ➡️ WRITES_TO `t`
- ⬅️ READS_FROM `t`
- 📦 **ProviderConfig** (`Module`) `[TRIVIAL]`
@@ -1099,14 +1169,14 @@
- 📝 Provides a WebSocket endpoint for real-time log streaming of a task with server-side filtering.
- 📦 **StaticFiles** (`Mount`)
- 📝 Mounts the frontend build directory to serve static assets.
- ƒ **serve_spa** (`Function`)
- 📝 Serves the SPA frontend for any path not matched by API routes.
- ƒ **read_root** (`Function`)
- 📝 A simple root endpoint to confirm that the API is running when frontend is missing.
- ƒ **network_error_handler** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- ƒ **matches_filters** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- ƒ **serve_spa** (`Function`) `[TRIVIAL]`
- 📝 Auto-detected function (orphan)
- 📦 **Dependencies** (`Module`)
- 📝 Manages creation and provision of shared application dependencies, such as PluginLoader and TaskManager, to avoid circular imports.
- 🏗️ Layer: Core
@@ -1,7 +1,7 @@
# [DEF:Project_Knowledge_Map:Root]
# @TIER: CRITICAL
# @PURPOSE: Global navigation map for AI-Agent (GRACE Knowledge Graph).
# @LAST_UPDATE: 2026-02-19
# @LAST_UPDATE: 2026-02-20

## 1. SYSTEM STANDARDS (Rules of the Game)
Strict policies and formatting rules.
@@ -26,8 +26,11 @@ Use these for code generation (Style Transfer).
* Ref: `.ai/shots/frontend_component.svelte` -> `[DEF:Shot:Svelte_Component]`
* **Plugin Module:** Reference implementation of a task plugin.
* Ref: `.ai/shots/plugin_example.py` -> `[DEF:Shot:Plugin_Example]`
* **Critical Module:** Core banking transaction processor with ACID guarantees.
* Ref: `.ai/shots/critical_module.py` -> `[DEF:Shot:Critical_Module]`

## 3. DOMAIN MAP (Modules)
* **Module Map:** `.ai/MODULE_MAP.md` -> `[DEF:Module_Map]`
* **Project Map:** `.ai/PROJECT_MAP.md` -> `[DEF:Project_Map]`
* **Backend Core:** `backend/src/core` -> `[DEF:Module:Backend_Core]`
* **Backend API:** `backend/src/api` -> `[DEF:Module:Backend_API]`

@@ -1,14 +1,18 @@
# [DEF:Shot:FastAPI_Route:Example]
# [DEF:BackendRouteShot:Module]
# @TIER: STANDARD
# @SEMANTICS: Route, Task, API, Async
# @PURPOSE: Reference implementation of a task-based route using GRACE-Poly.
# @LAYER: Interface (API)
# @RELATION: IMPLEMENTS -> [DEF:Std:API_FastAPI]
# @INVARIANT: TaskManager must be available in dependency graph.

from typing import List, Dict, Any, Optional
from typing import Dict, Any
from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel
from ...core.logger import belief_scope
from ...core.task_manager import TaskManager, Task
from ...core.config_manager import ConfigManager
from ...dependencies import get_task_manager, get_config_manager, has_permission, get_current_user
from ...dependencies import get_task_manager, get_config_manager, get_current_user

router = APIRouter()

@@ -21,37 +25,41 @@ class CreateTaskRequest(BaseModel):
# @PURPOSE: Create and start a new task using TaskManager. Non-blocking.
# @PARAM: request (CreateTaskRequest) - Plugin and params.
# @PARAM: task_manager (TaskManager) - Async task executor.
# @PARAM: config (ConfigManager) - Centralized configuration.
# @PRE: plugin_id must exist; config must be initialized.
# @PRE: plugin_id must match a registered plugin.
# @POST: A new task is spawned; Task ID returned immediately.
# @SIDE_EFFECT: Writes to DB, Trigger background worker.
async def create_task(
    request: CreateTaskRequest,
    task_manager: TaskManager = Depends(get_task_manager),
    config: ConfigManager = Depends(get_config_manager),
    current_user = Depends(get_current_user)
):
    # RBAC: Dynamic permission check
    has_permission(f"plugin:{request.plugin_id}", "EXECUTE")(current_user)

    # Context Logging
    with belief_scope("create_task"):
        try:
            # 1. Action: Resolve setting using ConfigManager (Example)
            # 1. Action: Configuration Resolution
            timeout = config.get("TASKS_DEFAULT_TIMEOUT", 3600)

            # 2. Action: Spawn async task via TaskManager
            # 2. Action: Spawn async task
            # @RELATION: CALLS -> task_manager.create_task
            task = await task_manager.create_task(
                plugin_id=request.plugin_id,
                params={**request.params, "timeout": timeout}
            )
            return task

        except ValueError as e:
            # 3. Recovery: Domain logic error mapping
            raise HTTPException(
                status_code=status.HTTP_400_BAD_REQUEST,
                detail=str(e)
            )
        except Exception as e:
            # Evaluation: Proper error mapping and logging
            # @UX_STATE: Error feedback to frontend
            # @UX_STATE: Error feedback -> 500 Internal Error
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail=f"Task creation failed: {str(e)}"
                detail="Internal Task Spawning Error"
            )
# [/DEF:create_task:Function]

# [/DEF:Shot:FastAPI_Route]
# [/DEF:BackendRouteShot:Module]
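The route above applies RBAC with `has_permission(f"plugin:{request.plugin_id}", "EXECUTE")(current_user)`, a factory call that returns a checker applied to the user. The real implementation in `...dependencies` is not shown in this diff; the sketch below is a hypothetical illustration of that factory shape, with an assumed dict-based user model:

```python
def has_permission(resource: str, action: str):
    """Factory: build a checker for one (resource, action) pair.

    Hypothetical sketch; the project's actual checker and user model
    are defined in `...dependencies` and are not shown in the diff.
    """
    def checker(current_user: dict):
        # Assumed user shape: {"permissions": {(resource, action), ...}}
        granted = current_user.get("permissions", set())
        if (resource, action) not in granted:
            raise PermissionError(f"{action} on {resource} denied")
        return current_user
    return checker
```

Because the resource string is built at request time (`f"plugin:{request.plugin_id}"`), the same factory covers every plugin without a static permission list.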
.ai/shots/critical_module.py (new file, 79 lines)
@@ -0,0 +1,79 @@
# [DEF:TransactionCore:Module]
# @TIER: CRITICAL
# @SEMANTICS: Finance, ACID, Transfer, Ledger
# @PURPOSE: Core banking transaction processor with ACID guarantees.
# @LAYER: Domain (Core)
# @RELATION: DEPENDS_ON -> [DEF:Infra:PostgresDB]
# @RELATION: DEPENDS_ON -> [DEF:Infra:AuditLog]
# @INVARIANT: Total system balance must remain constant (Double-Entry Bookkeeping).
# @INVARIANT: Negative transfers are strictly forbidden.

# @TEST_DATA: sufficient_funds -> {"from": "acc_A", "to": "acc_B", "amt": 100.00}
# @TEST_DATA: insufficient_funds -> {"from": "acc_empty", "to": "acc_B", "amt": 1000.00}
# @TEST_DATA: concurrency_lock -> {./fixtures/transactions.json#race_condition}

from decimal import Decimal
from typing import NamedTuple
from ...core.logger import belief_scope
from ...core.db import atomic_transaction, get_balance, update_balance
from ...core.exceptions import BusinessRuleViolation

class TransferResult(NamedTuple):
    tx_id: str
    status: str
    new_balance: Decimal

# [DEF:execute_transfer:Function]
# @PURPOSE: Atomically move funds between accounts with audit trails.
# @PARAM: sender_id (str) - Source account.
# @PARAM: receiver_id (str) - Destination account.
# @PARAM: amount (Decimal) - Positive amount to transfer.
# @PRE: amount > 0; sender != receiver; sender_balance >= amount.
# @POST: sender_balance -= amount; receiver_balance += amount; Audit Record Created.
# @SIDE_EFFECT: Database mutation (Rows locked), Audit IO.
#
# @UX_STATE: Success -> Returns 200 OK + Transaction Receipt.
# @UX_STATE: Error(LowBalance) -> 422 Unprocessable -> UI shows "Top-up needed" modal.
# @UX_STATE: Error(System) -> 500 Internal -> UI shows "Retry later" toast.
def execute_transfer(sender_id: str, receiver_id: str, amount: Decimal) -> TransferResult:
    # Guard: Input Validation
    if amount <= Decimal("0.00"):
        raise BusinessRuleViolation("Transfer amount must be positive.")
    if sender_id == receiver_id:
        raise BusinessRuleViolation("Cannot transfer to self.")

    with belief_scope("execute_transfer") as context:
        context.logger.info("Initiating transfer", data={"from": sender_id, "to": receiver_id})

        try:
            # 1. Action: Atomic DB Transaction
            # @RELATION: CALLS -> atomic_transaction
            with atomic_transaction():
                # Guard: State Validation (Strict)
                current_balance = get_balance(sender_id, for_update=True)

                if current_balance < amount:
                    # @UX_FEEDBACK: Triggers specific UI flow for insufficient funds
                    context.logger.warn("Insufficient funds", data={"balance": current_balance})
                    raise BusinessRuleViolation("INSUFFICIENT_FUNDS")

                # 2. Action: Mutation
                new_src_bal = update_balance(sender_id, -amount)
                new_dst_bal = update_balance(receiver_id, +amount)

                # 3. Action: Audit
                tx_id = context.audit.log_transfer(sender_id, receiver_id, amount)

            context.logger.info("Transfer committed", data={"tx_id": tx_id})
            return TransferResult(tx_id, "COMPLETED", new_src_bal)

        except BusinessRuleViolation as e:
            # Logic: Explicit re-raise for UI mapping
            raise e
        except Exception as e:
            # Logic: Catch-all safety net
            context.logger.error("Critical Transfer Failure", error=e)
            raise RuntimeError("TRANSACTION_ABORTED") from e
# [/DEF:execute_transfer:Function]

# [/DEF:TransactionCore:Module]
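The module's first invariant (total system balance must remain constant) can be exercised against a toy in-memory ledger. The sketch below replaces the real `atomic_transaction`/`get_balance`/`update_balance` infrastructure with a plain dict, so it is only a model of the rule, not the production path; the account names and amounts follow the `@TEST_DATA` entries above:

```python
from decimal import Decimal

class MiniLedger:
    """Toy in-memory ledger illustrating the double-entry invariant."""

    def __init__(self, balances):
        self.balances = dict(balances)

    def total(self) -> Decimal:
        # Invariant under test: this sum never changes across transfers.
        return sum(self.balances.values(), Decimal("0"))

    def transfer(self, sender: str, receiver: str, amount: Decimal):
        # Guard: negative or zero transfers are forbidden.
        if amount <= Decimal("0"):
            raise ValueError("Transfer amount must be positive.")
        if self.balances[sender] < amount:
            raise ValueError("INSUFFICIENT_FUNDS")
        # Double entry: one debit, one matching credit.
        self.balances[sender] -= amount
        self.balances[receiver] += amount
```

A property-style test would run many random transfers and assert `total()` is unchanged after each; the production version gets the same guarantee from row locks inside `atomic_transaction`.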
@@ -1,19 +1,23 @@
<!-- [DEF:Shot:Svelte_Component:Example] -->
# @PURPOSE: Reference implementation of a task-spawning component using Constitution rules.
# @RELATION: IMPLEMENTS -> [DEF:Std:UI_Svelte]

<!-- [DEF:FrontendComponentShot:Component] -->
<script>
/**
 * @TIER: STANDARD
 * @PURPOSE: Action button to spawn a new task.
 * @LAYER: UI
 * @SEMANTICS: Task, Creation, Button
 * @TIER: CRITICAL
 * @SEMANTICS: Task, Button, Action, UX
 * @PURPOSE: Action button to spawn a new task with full UX feedback cycle.
 * @LAYER: UI (Presentation)
 * @RELATION: CALLS -> postApi
 * @INVARIANT: Must prevent double-submission while loading.
 *
 * @UX_STATE: Idle -> Button enabled with primary color.
 * @UX_STATE: Loading -> Button disabled with spinner while postApi resolves.
 * @UX_FEEDBACK: toast.success on completion; toast.error on failure.
 * @UX_TEST: Idle -> {click: spawnTask, expected: loading state then success}
 * @TEST_DATA: idle_state -> {"isLoading": false}
 * @TEST_DATA: loading_state -> {"isLoading": true}
 *
 * @UX_STATE: Idle -> Button enabled, primary color.
 * @UX_STATE: Loading -> Button disabled, spinner visible.
 * @UX_STATE: Error -> Toast notification triggers.
 *
 * @UX_FEEDBACK: Toast success/error.
 * @UX_TEST: Idle -> {click: spawnTask, expected: isLoading=true}
 * @UX_TEST: Success -> {api_resolve: 200, expected: toast.success called}
 */
import { postApi } from "$lib/api.js";
import { t } from "$lib/i18n";
@@ -24,40 +28,43 @@

let isLoading = false;

async def spawnTask() {
// [DEF:spawnTask:Function]
async function spawnTask() {
  isLoading = true;
  console.log("[FrontendComponentShot][Loading] Spawning task...");

  try {
    // 1. Action: Constitution Rule - MUST use postApi wrapper
    // 1. Action: API Call
    const response = await postApi("/api/tasks", {
      plugin_id,
      params
    });

    // 2. Feedback: UX state management
    // 2. Feedback: Success
    if (response.task_id) {
      console.log("[FrontendComponentShot][Success] Task created.");
      toast.success($t.tasks.spawned_success);
    }
  } catch (error) {
    // 3. Recovery: Evaluation & UI reporting
    // 3. Recovery: User notification
    console.log("[FrontendComponentShot][Error] Failed:", error);
    toast.error(`${$t.errors.task_failed}: ${error.message}`);
  } finally {
    isLoading = false;
  }
}
// [/DEF:spawnTask:Function]
</script>

<button
  on:click={spawnTask}
  disabled={isLoading}
  class="bg-blue-600 hover:bg-blue-700 text-white px-4 py-2 rounded-lg flex items-center gap-2"
  class="btn-primary flex items-center gap-2"
  aria-busy={isLoading}
>
  {#if isLoading}
    <span class="animate-spin text-sm">🌀</span>
    <span class="animate-spin" aria-label="Loading">🌀</span>
  {/if}
  <span>{$t.actions.start_task}</span>
</button>

<style>
  /* Local styles minimized as per Constitution Rule III */
</style>
<!-- [/DEF:Shot:Svelte_Component] -->
<!-- [/DEF:FrontendComponentShot:Component] -->
@@ -1,6 +1,10 @@
# [DEF:Shot:Plugin_Example:Example]
# [DEF:PluginExampleShot:Module]
# @TIER: STANDARD
# @SEMANTICS: Plugin, Core, Extension
# @PURPOSE: Reference implementation of a plugin following GRACE standards.
# @RELATION: IMPLEMENTS -> [DEF:Std:Plugin]
# @LAYER: Domain (Business Logic)
# @RELATION: INHERITS -> PluginBase
# @INVARIANT: get_schema must return valid JSON Schema.

from typing import Dict, Any, Optional
from ..core.plugin_base import PluginBase
@@ -11,28 +15,15 @@ class ExamplePlugin(PluginBase):
    def id(self) -> str:
        return "example-plugin"

    @property
    def name(self) -> str:
        return "Example Plugin"

    @property
    def description(self) -> str:
        return "A simple plugin that demonstrates structured logging and progress tracking."

    @property
    def version(self) -> str:
        return "1.0.0"

    # [DEF:get_schema:Function]
    # @PURPOSE: Defines input validation schema.
    # @POST: Returns dict compliant with JSON Schema draft 7.
    def get_schema(self) -> Dict[str, Any]:
        return {
            "type": "object",
            "properties": {
                "message": {
                    "type": "string",
                    "title": "Message",
                    "description": "A message to log.",
                    "default": "Hello, GRACE!",
                }
            },
@@ -41,27 +32,33 @@ class ExamplePlugin(PluginBase):
|
||||
# [/DEF:get_schema:Function]
|
||||
|
||||
# [DEF:execute:Function]
|
||||
# @PURPOSE: Core plugin logic with structured logging and progress reporting.
|
||||
# @PURPOSE: Core plugin logic with structured logging and scope isolation.
|
||||
# @PARAM: params (Dict) - Validated input parameters.
|
||||
# @PARAM: context (TaskContext) - Execution context with logging and progress tools.
|
||||
async def execute(self, params: Dict[str, Any], context: Optional[TaskContext] = None):
|
||||
message = params["message"]
|
||||
# @PARAM: context (TaskContext) - Execution tools (log, progress).
|
||||
# @SIDE_EFFECT: Emits logs to centralized system.
|
||||
async def execute(self, params: Dict, context: Optional = None):
|
||||
message = params
|
||||
|
||||
# 1. Action: Structured Logging with Source Attribution
|
||||
if context:
|
||||
log = context.logger.with_source("example_plugin")
|
||||
log.info(f"Starting execution with message: {message}")
|
||||
# 1. Action: System-level tracing (Rule VI)
|
||||
with belief_scope("example_plugin_exec") as b_scope:
|
||||
if context:
|
||||
                # Task Logs: write to the user-facing task execution context
                # @RELATION: BINDS_TO -> context.logger
                log = context.logger.with_source("example_plugin")

            # 2. Action: Progress Reporting
            log.progress("Processing step 1...", percent=25)
            # Simulating some async work...
            # await some_async_op()
                b_scope.logger.info("Using provided TaskContext")  # System log
                log.info("Starting execution", data={"msg": message})  # Task log

            log.progress("Processing step 2...", percent=75)
            log.info("Execution completed successfully.")
        else:
            # Fallback for manual/standalone execution
            print(f"Standalone execution: {message}")
                # 2. Action: Progress Reporting
                log.progress("Processing...", percent=50)

                # 3. Action: Finalize
                log.info("Execution completed.")
            else:
                # Standalone Fallback: fall back to the system scope
                b_scope.logger.warning("No TaskContext provided. Running standalone.")
                b_scope.logger.info("Standalone execution", data={"msg": message})
                print(f"Standalone: {message}")
    # [/DEF:execute:Function]

# [/DEF:Shot:Plugin_Example]
# [/DEF:PluginExampleShot:Module]
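The revised `execute` flow above separates system-level tracing (a belief scope) from task-level logging, and falls back to the system scope when no `TaskContext` is supplied. A minimal sketch of that pattern, using stand-in `belief_scope` and logger objects rather than the real GRACE implementations:

```python
import logging
from contextlib import contextmanager

@contextmanager
def belief_scope(name: str):
    # Stand-in for the GRACE belief_scope: yields an object carrying a system logger.
    class Scope:
        logger = logging.getLogger(f"system.{name}")
    yield Scope()

def execute(message: str, context=None) -> str:
    with belief_scope("example_plugin_exec") as b_scope:
        if context:
            b_scope.logger.info("Using provided TaskContext")  # system log
            context.info("Starting execution: %s", message)    # task log
            return "task"
        # Standalone fallback: only the system scope is available.
        b_scope.logger.warning("No TaskContext provided. Running standalone.")
        return "standalone"

print(execute("Hello, GRACE!"))  # → standalone
```

The key design point is that system tracing always happens, while task logging is conditional on the injected context.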
27
.dockerignore
Normal file
@@ -0,0 +1,27 @@
.git
.gitignore
.pytest_cache
.ruff_cache
.vscode
.ai
.specify
.kilocode
venv
backend/.venv
backend/.pytest_cache
frontend/node_modules
frontend/.svelte-kit
frontend/.vite
frontend/build
backend/__pycache__
backend/src/__pycache__
backend/tests/__pycache__
**/__pycache__
*.pyc
*.pyo
*.pyd
*.db
*.log
backups
semantics
specs
36
Dockerfile
Normal file
@@ -0,0 +1,36 @@
|
||||
# Stage 1: Build frontend static assets
|
||||
FROM node:20-alpine AS frontend-build
|
||||
WORKDIR /app/frontend
|
||||
|
||||
COPY frontend/package*.json ./
|
||||
RUN npm ci
|
||||
|
||||
COPY frontend/ ./
|
||||
RUN npm run build
|
||||
|
||||
|
||||
# Stage 2: Runtime image for backend + static frontend
|
||||
FROM python:3.11-slim AS runtime
|
||||
|
||||
ENV PYTHONDONTWRITEBYTECODE=1
|
||||
ENV PYTHONUNBUFFERED=1
|
||||
ENV BACKEND_PORT=8000
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
curl \
|
||||
git \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
|
||||
COPY backend/requirements.txt /app/backend/requirements.txt
|
||||
RUN pip install --no-cache-dir -r /app/backend/requirements.txt
|
||||
|
||||
COPY backend/ /app/backend/
|
||||
COPY --from=frontend-build /app/frontend/build /app/frontend/build
|
||||
|
||||
WORKDIR /app/backend
|
||||
|
||||
EXPOSE 8000
|
||||
|
||||
CMD ["python", "-m", "uvicorn", "src.app:app", "--host", "0.0.0.0", "--port", "8000"]
|
||||
53
README.md
@@ -32,7 +32,7 @@
## Technology stack
- **Backend**: Python 3.9+, FastAPI, SQLAlchemy, APScheduler, Pydantic.
- **Frontend**: Node.js 18+, SvelteKit, Tailwind CSS.
- **Database**: SQLite (stores metadata, tasks, and access settings).
- **Database**: PostgreSQL (stores metadata, tasks, logs, and configuration).

## Project structure
- `backend/` — server side, API, and plugin logic.
@@ -61,6 +61,10 @@
Environment variables:
- `BACKEND_PORT`: API port (default 8000).
- `FRONTEND_PORT`: UI port (default 5173).
- `POSTGRES_URL`: base default PostgreSQL URL for all subsystems.
- `DATABASE_URL`: main database URL (falls back to `POSTGRES_URL` if unset).
- `TASKS_DATABASE_URL`: tasks/logs database URL (falls back to `DATABASE_URL` if unset).
- `AUTH_DATABASE_URL`: auth database URL (falls back to the PostgreSQL default if unset).
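The fallback chain for these variables can be expressed directly; a small sketch (the default URL below mirrors the compose defaults and is illustrative):

```python
DEFAULT_POSTGRES_URL = "postgresql+psycopg2://postgres:postgres@localhost:5432/ss_tools"

def resolve_urls(env: dict) -> dict:
    # Each URL falls back along the documented chain when unset.
    postgres = env.get("POSTGRES_URL", DEFAULT_POSTGRES_URL)
    database = env.get("DATABASE_URL", postgres)
    tasks = env.get("TASKS_DATABASE_URL", database)
    auth = env.get("AUTH_DATABASE_URL", postgres)
    return {"DATABASE_URL": database,
            "TASKS_DATABASE_URL": tasks,
            "AUTH_DATABASE_URL": auth}

print(resolve_urls({"DATABASE_URL": "postgresql://db:5432/app"}))
```

Passing `os.environ` in place of the dict reproduces the runtime behavior.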
## Development
The project follows strict development rules:
@@ -73,5 +77,52 @@
- **Frontend**: `cd frontend && npm run dev`
- **Tests**: `cd backend && .venv/bin/pytest`

## Docker and CI/CD
### Running locally in Docker (app + PostgreSQL)
```bash
docker compose up --build
```

After startup:
- UI/API: `http://localhost:8000`
- PostgreSQL: `localhost:5432` (`postgres/postgres`, DB `ss_tools`)

Stop:
```bash
docker compose down
```

Full cleanup of the DB volume:
```bash
docker compose down -v
```

If `postgres:16-alpine` cannot be pulled from Docker Hub (TLS timeout), use a fallback image:
```bash
POSTGRES_IMAGE=mirror.gcr.io/library/postgres:16-alpine docker compose up -d db
```
or:
```bash
POSTGRES_IMAGE=bitnami/postgresql:latest docker compose up -d db
```
If port `5432` is already in use on the host, run Postgres on a different port:
```bash
POSTGRES_HOST_PORT=5433 docker compose up -d db
```

### Migrating legacy data to PostgreSQL
If you need to migrate old data from `tasks.db`/`config.json`:
```bash
cd backend
PYTHONPATH=. .venv/bin/python src/scripts/migrate_sqlite_to_postgres.py --sqlite-path tasks.db
```

### CI/CD
Workflow added: `.github/workflows/ci-cd.yml`
- backend smoke tests
- frontend build
- docker build
- image push to GHCR on `main/master`

## Contacts and contributing
To add new features or fix bugs, please read `docs/plugin_dev.md` and create a corresponding specification in `specs/`.
Binary file not shown.
@@ -241,6 +241,10 @@ frontend_path = project_root / "frontend" / "build"
if frontend_path.exists():
    app.mount("/_app", StaticFiles(directory=str(frontend_path / "_app")), name="static")

# [DEF:serve_spa:Function]
# @PURPOSE: Serves the SPA frontend for any path not matched by API routes.
# @PRE: frontend_path exists.
# @POST: Returns the requested file or index.html.
@app.get("/{file_path:path}", include_in_schema=False)
async def serve_spa(file_path: str):
    # Only serve SPA for non-API paths
@@ -24,7 +24,10 @@ class AuthConfig(BaseSettings):
    REFRESH_TOKEN_EXPIRE_DAYS: int = 7

    # Database Settings
    AUTH_DATABASE_URL: str = Field(default="sqlite:///./backend/auth.db", env="AUTH_DATABASE_URL")
    AUTH_DATABASE_URL: str = Field(
        default="postgresql+psycopg2://postgres:postgres@localhost:5432/ss_tools",
        env="AUTH_DATABASE_URL",
    )

    # ADFS Settings
    ADFS_CLIENT_ID: str = Field(default="", env="ADFS_CLIENT_ID")
269
backend/src/core/config_manager.py
Executable file → Normal file
@@ -1,11 +1,11 @@
|
||||
# [DEF:ConfigManagerModule:Module]
|
||||
#
|
||||
# @SEMANTICS: config, manager, persistence, json
|
||||
# @PURPOSE: Manages application configuration, including loading/saving to JSON and CRUD for environments.
|
||||
# @SEMANTICS: config, manager, persistence, postgresql
|
||||
# @PURPOSE: Manages application configuration persisted in database with one-time migration from JSON.
|
||||
# @LAYER: Core
|
||||
# @RELATION: DEPENDS_ON -> ConfigModels
|
||||
# @RELATION: DEPENDS_ON -> AppConfigRecord
|
||||
# @RELATION: CALLS -> logger
|
||||
# @RELATION: WRITES_TO -> config.json
|
||||
#
|
||||
# @INVARIANT: Configuration must always be valid according to AppConfig model.
|
||||
# @PUBLIC_API: ConfigManager
|
||||
@@ -15,112 +15,143 @@ import json
import os
from pathlib import Path
from typing import Optional, List

from sqlalchemy.orm import Session

from .config_models import AppConfig, Environment, GlobalSettings, StorageConfig
from .database import SessionLocal
from ..models.config import AppConfigRecord
from .logger import logger, configure_logger, belief_scope
# [/SECTION]


# [DEF:ConfigManager:Class]
# @PURPOSE: A class to handle application configuration persistence and management.
# @RELATION: WRITES_TO -> config.json
class ConfigManager:

    # [DEF:__init__:Function]
    # @PURPOSE: Initializes the ConfigManager.
    # @PRE: isinstance(config_path, str) and len(config_path) > 0
    # @POST: self.config is an instance of AppConfig
    # @PARAM: config_path (str) - Path to the configuration file.
    # @PARAM: config_path (str) - Path to legacy JSON config (used only for initial migration fallback).
    def __init__(self, config_path: str = "config.json"):
        with belief_scope("__init__"):
            # 1. Runtime check of @PRE
            assert isinstance(config_path, str) and config_path, "config_path must be a non-empty string"

            logger.info(f"[ConfigManager][Entry] Initializing with {config_path}")
            logger.info(f"[ConfigManager][Entry] Initializing with legacy path {config_path}")

            # 2. Logic implementation
            self.config_path = Path(config_path)
            self.config: AppConfig = self._load_config()

            # Configure logger with loaded settings
            configure_logger(self.config.settings.logging)

            # 3. Runtime check of @POST
            assert isinstance(self.config, AppConfig), "self.config must be an instance of AppConfig"

            logger.info("[ConfigManager][Exit] Initialized")
    # [/DEF:__init__:Function]

    # [DEF:_default_config:Function]
    # @PURPOSE: Returns default application configuration.
    # @RETURN: AppConfig - Default configuration.
    def _default_config(self) -> AppConfig:
        return AppConfig(
            environments=[],
            settings=GlobalSettings(storage=StorageConfig()),
        )
    # [/DEF:_default_config:Function]

    # [DEF:_load_from_legacy_file:Function]
    # @PURPOSE: Loads legacy configuration from config.json for migration fallback.
    # @RETURN: AppConfig - Loaded or default configuration.
    def _load_from_legacy_file(self) -> AppConfig:
        with belief_scope("_load_from_legacy_file"):
            if not self.config_path.exists():
                logger.info("[_load_from_legacy_file][Action] Legacy config file not found, using defaults")
                return self._default_config()

            try:
                with open(self.config_path, "r", encoding="utf-8") as f:
                    data = json.load(f)
                logger.info("[_load_from_legacy_file][Coherence:OK] Legacy configuration loaded")
                return AppConfig(**data)
            except Exception as e:
                logger.error(f"[_load_from_legacy_file][Coherence:Failed] Error loading legacy config: {e}")
                return self._default_config()
    # [/DEF:_load_from_legacy_file:Function]

    # [DEF:_get_record:Function]
    # @PURPOSE: Loads config record from DB.
    # @PARAM: session (Session) - DB session.
    # @RETURN: Optional[AppConfigRecord] - Existing record or None.
    def _get_record(self, session: Session) -> Optional[AppConfigRecord]:
        return session.query(AppConfigRecord).filter(AppConfigRecord.id == "global").first()
    # [/DEF:_get_record:Function]

    # [DEF:_load_config:Function]
    # @PURPOSE: Loads the configuration from disk or creates a default one.
    # @PRE: self.config_path is set.
    # @PURPOSE: Loads the configuration from DB or performs one-time migration from JSON file.
    # @PRE: DB session factory is available.
    # @POST: isinstance(return, AppConfig)
    # @RETURN: AppConfig - The loaded or default configuration.
    # @RETURN: AppConfig - Loaded configuration.
    def _load_config(self) -> AppConfig:
        with belief_scope("_load_config"):
            logger.debug(f"[_load_config][Entry] Loading from {self.config_path}")
            session: Session = SessionLocal()
            try:
                record = self._get_record(session)
                if record and record.payload:
                    logger.info("[_load_config][Coherence:OK] Configuration loaded from database")
                    return AppConfig(**record.payload)

            if not self.config_path.exists():
                logger.info("[_load_config][Action] Config file not found. Creating default.")
                default_config = AppConfig(
                    environments=[],
                    settings=GlobalSettings()
                )
                self._save_config_to_disk(default_config)
                return default_config
            try:
                with open(self.config_path, "r") as f:
                    data = json.load(f)

                # Check for deprecated field
                if "settings" in data and "backup_path" in data["settings"]:
                    del data["settings"]["backup_path"]

                config = AppConfig(**data)
                logger.info("[_load_config][Coherence:OK] Configuration loaded")
                logger.info("[_load_config][Action] No database config found, migrating legacy config")
                config = self._load_from_legacy_file()
                self._save_config_to_db(config, session=session)
                return config
            except Exception as e:
                logger.error(f"[_load_config][Coherence:Failed] Error loading config: {e}")
                # Fallback but try to preserve existing settings if possible?
                # For now, return default to be safe, but log the error prominently.
                return AppConfig(
                    environments=[],
                    settings=GlobalSettings(storage=StorageConfig())
                )
            except Exception as e:
                logger.error(f"[_load_config][Coherence:Failed] Error loading config from DB: {e}")
                return self._default_config()
            finally:
                session.close()
    # [/DEF:_load_config:Function]

    # [DEF:_save_config_to_disk:Function]
    # @PURPOSE: Saves the provided configuration object to disk.
    # [DEF:_save_config_to_db:Function]
    # @PURPOSE: Saves the provided configuration object to DB.
    # @PRE: isinstance(config, AppConfig)
    # @POST: Configuration saved to disk.
    # @POST: Configuration saved to database.
    # @PARAM: config (AppConfig) - The configuration to save.
    def _save_config_to_disk(self, config: AppConfig):
        with belief_scope("_save_config_to_disk"):
            logger.debug(f"[_save_config_to_disk][Entry] Saving to {self.config_path}")
    # @PARAM: session (Optional[Session]) - Existing DB session for transactional reuse.
    def _save_config_to_db(self, config: AppConfig, session: Optional[Session] = None):
        with belief_scope("_save_config_to_db"):
            assert isinstance(config, AppConfig), "config must be an instance of AppConfig"

            # 1. Runtime check of @PRE
            assert isinstance(config, AppConfig), "config must be an instance of AppConfig"

            # 2. Logic implementation
            try:
                with open(self.config_path, "w") as f:
                    json.dump(config.dict(), f, indent=4)
                logger.info("[_save_config_to_disk][Action] Configuration saved")
            except Exception as e:
                logger.error(f"[_save_config_to_disk][Coherence:Failed] Failed to save: {e}")
    # [/DEF:_save_config_to_disk:Function]
            owns_session = session is None
            db = session or SessionLocal()
            try:
                record = self._get_record(db)
                payload = config.model_dump()
                if record is None:
                    record = AppConfigRecord(id="global", payload=payload)
                    db.add(record)
                else:
                    record.payload = payload
                db.commit()
                logger.info("[_save_config_to_db][Action] Configuration saved to database")
            except Exception as e:
                db.rollback()
                logger.error(f"[_save_config_to_db][Coherence:Failed] Failed to save: {e}")
                raise
            finally:
                if owns_session:
                    db.close()
    # [/DEF:_save_config_to_db:Function]

    # [DEF:save:Function]
    # @PURPOSE: Saves the current configuration state to disk.
    # @PURPOSE: Saves the current configuration state to DB.
    # @PRE: self.config is set.
    # @POST: self._save_config_to_disk called.
    # @POST: self._save_config_to_db called.
    def save(self):
        with belief_scope("save"):
            self._save_config_to_disk(self.config)
            self._save_config_to_db(self.config)
    # [/DEF:save:Function]

    # [DEF:get_config:Function]
    # @PURPOSE: Returns the current configuration.
    # @PRE: self.config is set.
    # @POST: Returns self.config.
    # @RETURN: AppConfig - The current configuration.
    def get_config(self) -> AppConfig:
        with belief_scope("get_config"):
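The `_save_config_to_db` body above follows a session-ownership pattern: reuse a caller-provided session (leaving its lifecycle to the caller) or open and close a private one. A generic sketch with a dummy session class standing in for SQLAlchemy's `Session` (all names illustrative):

```python
class DummySession:
    # Minimal stand-in for a SQLAlchemy Session.
    def __init__(self):
        self.committed = False
        self.closed = False
    def commit(self): self.committed = True
    def rollback(self): pass
    def close(self): self.closed = True

def save_record(payload: dict, session=None, factory=DummySession):
    owns_session = session is None
    db = session or factory()
    try:
        # ... upsert payload through db here ...
        db.commit()
    except Exception:
        db.rollback()
        raise
    finally:
        if owns_session:
            db.close()  # only close what we opened ourselves
    return db

owned = save_record({"id": "global"})
print(owned.committed, owned.closed)  # → True True
```

Reusing the caller's session is what lets `_load_config` migrate and persist within a single transaction scope.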
@@ -136,44 +167,34 @@ class ConfigManager:
        with belief_scope("update_global_settings"):
            logger.info("[update_global_settings][Entry] Updating settings")

            # 1. Runtime check of @PRE
            assert isinstance(settings, GlobalSettings), "settings must be an instance of GlobalSettings"

            # 2. Logic implementation
            self.config.settings = settings
            self.save()

            # Reconfigure logger with new settings
            configure_logger(settings.logging)

            logger.info("[update_global_settings][Exit] Settings updated")
            assert isinstance(settings, GlobalSettings), "settings must be an instance of GlobalSettings"
            self.config.settings = settings
            self.save()
            configure_logger(settings.logging)
            logger.info("[update_global_settings][Exit] Settings updated")
    # [/DEF:update_global_settings:Function]

    # [DEF:validate_path:Function]
    # @PURPOSE: Validates if a path exists and is writable.
    # @PRE: path is a string.
    # @POST: Returns (bool, str) status.
    # @PARAM: path (str) - The path to validate.
    # @RETURN: tuple (bool, str) - (is_valid, message)
    def validate_path(self, path: str) -> tuple[bool, str]:
        with belief_scope("validate_path"):
            p = os.path.abspath(path)
            if not os.path.exists(p):
                try:
                    os.makedirs(p, exist_ok=True)
                except Exception as e:
                    return False, f"Path does not exist and could not be created: {e}"
            if not os.path.exists(p):
                try:
                    os.makedirs(p, exist_ok=True)
                except Exception as e:
                    return False, f"Path does not exist and could not be created: {e}"

            if not os.access(p, os.W_OK):
                return False, "Path is not writable"
            if not os.access(p, os.W_OK):
                return False, "Path is not writable"

            return True, "Path is valid and writable"
            return True, "Path is valid and writable"
    # [/DEF:validate_path:Function]

    # [DEF:get_environments:Function]
    # @PURPOSE: Returns the list of configured environments.
    # @PRE: self.config is set.
    # @POST: Returns list of environments.
    # @RETURN: List[Environment] - List of environments.
    def get_environments(self) -> List[Environment]:
        with belief_scope("get_environments"):
@@ -182,8 +203,6 @@ class ConfigManager:

    # [DEF:has_environments:Function]
    # @PURPOSE: Checks if at least one environment is configured.
    # @PRE: self.config is set.
    # @POST: Returns boolean indicating if environments exist.
    # @RETURN: bool - True if at least one environment exists.
    def has_environments(self) -> bool:
        with belief_scope("has_environments"):
@@ -192,8 +211,6 @@ class ConfigManager:

    # [DEF:get_environment:Function]
    # @PURPOSE: Returns a single environment by ID.
    # @PRE: self.config is set and isinstance(env_id, str) and len(env_id) > 0.
    # @POST: Returns Environment object if found, None otherwise.
    # @PARAM: env_id (str) - The ID of the environment to retrieve.
    # @RETURN: Optional[Environment] - The environment with the given ID, or None.
    def get_environment(self, env_id: str) -> Optional[Environment]:
@@ -206,79 +223,61 @@ class ConfigManager:

    # [DEF:add_environment:Function]
    # @PURPOSE: Adds a new environment to the configuration.
    # @PRE: isinstance(env, Environment)
    # @POST: Environment added or updated in self.config.environments.
    # @PARAM: env (Environment) - The environment to add.
    def add_environment(self, env: Environment):
        with belief_scope("add_environment"):
            logger.info(f"[add_environment][Entry] Adding environment {env.id}")
            assert isinstance(env, Environment), "env must be an instance of Environment"

            # 1. Runtime check of @PRE
            assert isinstance(env, Environment), "env must be an instance of Environment"

            # 2. Logic implementation
            # Check for duplicate ID and remove if exists
            self.config.environments = [e for e in self.config.environments if e.id != env.id]
            self.config.environments.append(env)
            self.save()

            logger.info("[add_environment][Exit] Environment added")
            self.config.environments = [e for e in self.config.environments if e.id != env.id]
            self.config.environments.append(env)
            self.save()
            logger.info("[add_environment][Exit] Environment added")
    # [/DEF:add_environment:Function]

    # [DEF:update_environment:Function]
    # @PURPOSE: Updates an existing environment.
    # @PRE: isinstance(env_id, str) and len(env_id) > 0 and isinstance(updated_env, Environment)
    # @POST: Returns True if environment was found and updated.
    # @PARAM: env_id (str) - The ID of the environment to update.
    # @PARAM: updated_env (Environment) - The updated environment data.
    # @RETURN: bool - True if updated, False otherwise.
    def update_environment(self, env_id: str, updated_env: Environment) -> bool:
        with belief_scope("update_environment"):
            logger.info(f"[update_environment][Entry] Updating {env_id}")
            assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"
            assert isinstance(updated_env, Environment), "updated_env must be an instance of Environment"

            # 1. Runtime check of @PRE
            assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"
            assert isinstance(updated_env, Environment), "updated_env must be an instance of Environment"
            for i, env in enumerate(self.config.environments):
                if env.id == env_id:
                    if updated_env.password == "********":
                        updated_env.password = env.password

            # 2. Logic implementation
            for i, env in enumerate(self.config.environments):
                if env.id == env_id:
                    # If password is masked, keep the old one
                    if updated_env.password == "********":
                        updated_env.password = env.password
                    self.config.environments[i] = updated_env
                    self.save()
                    logger.info(f"[update_environment][Coherence:OK] Updated {env_id}")
                    return True

                    self.config.environments[i] = updated_env
                    self.save()
                    logger.info(f"[update_environment][Coherence:OK] Updated {env_id}")
                    return True

            logger.warning(f"[update_environment][Coherence:Failed] Environment {env_id} not found")
            return False
            logger.warning(f"[update_environment][Coherence:Failed] Environment {env_id} not found")
            return False
    # [/DEF:update_environment:Function]

    # [DEF:delete_environment:Function]
    # @PURPOSE: Deletes an environment by ID.
    # @PRE: isinstance(env_id, str) and len(env_id) > 0
    # @POST: Environment removed from self.config.environments if it existed.
    # @PARAM: env_id (str) - The ID of the environment to delete.
    def delete_environment(self, env_id: str):
        with belief_scope("delete_environment"):
            logger.info(f"[delete_environment][Entry] Deleting {env_id}")
            assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"

            # 1. Runtime check of @PRE
            assert env_id and isinstance(env_id, str), "env_id must be a non-empty string"
            original_count = len(self.config.environments)
            self.config.environments = [e for e in self.config.environments if e.id != env_id]

            # 2. Logic implementation
            original_count = len(self.config.environments)
            self.config.environments = [e for e in self.config.environments if e.id != env_id]

            if len(self.config.environments) < original_count:
                self.save()
                logger.info(f"[delete_environment][Action] Deleted {env_id}")
            else:
                logger.warning(f"[delete_environment][Coherence:Failed] Environment {env_id} not found")
            if len(self.config.environments) < original_count:
                self.save()
                logger.info(f"[delete_environment][Action] Deleted {env_id}")
            else:
                logger.warning(f"[delete_environment][Coherence:Failed] Environment {env_id} not found")
    # [/DEF:delete_environment:Function]

# [/DEF:ConfigManager:Class]

# [/DEF:ConfigManager:Class]
# [/DEF:ConfigManagerModule:Module]
@@ -3,7 +3,7 @@
# @SEMANTICS: config, models, pydantic
# @PURPOSE: Defines the data models for application configuration using Pydantic.
# @LAYER: Core
# @RELATION: READS_FROM -> config.json
# @RELATION: READS_FROM -> app_configurations (database)
# @RELATION: USED_BY -> ConfigManager

from pydantic import BaseModel, Field
@@ -36,7 +36,7 @@ class Environment(BaseModel):
class LoggingConfig(BaseModel):
    level: str = "INFO"
    task_log_level: str = "INFO"  # Minimum level for task-specific logs (DEBUG, INFO, WARNING, ERROR)
    file_path: Optional[str] = "logs/app.log"
    file_path: Optional[str] = None
    max_bytes: int = 10 * 1024 * 1024
    backup_count: int = 5
    enable_belief_state: bool = True
@@ -1,7 +1,7 @@
# [DEF:backend.src.core.database:Module]
#
# @SEMANTICS: database, sqlite, sqlalchemy, session, persistence
# @PURPOSE: Configures the SQLite database connection and session management.
# @SEMANTICS: database, postgresql, sqlalchemy, session, persistence
# @PURPOSE: Configures database connection and session management (PostgreSQL-first).
# @LAYER: Core
# @RELATION: DEPENDS_ON -> sqlalchemy
# @RELATION: USES -> backend.src.models.mapping
@@ -14,6 +14,9 @@ from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from ..models.mapping import Base
# Import models to ensure they're registered with Base
from ..models import task as _task_models  # noqa: F401
from ..models import auth as _auth_models  # noqa: F401
from ..models import config as _config_models  # noqa: F401
from .logger import belief_scope
from .auth.config import auth_config
import os
@@ -21,44 +24,50 @@ from pathlib import Path
# [/SECTION]

# [DEF:BASE_DIR:Variable]
# @PURPOSE: Base directory for the backend (where .db files should reside).
# @PURPOSE: Base directory for the backend.
BASE_DIR = Path(__file__).resolve().parent.parent.parent
# [/DEF:BASE_DIR:Variable]

# [DEF:DATABASE_URL:Constant]
# @PURPOSE: URL for the main mappings database.
DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite:///{BASE_DIR}/mappings.db")
# @PURPOSE: URL for the main application database.
DEFAULT_POSTGRES_URL = os.getenv(
    "POSTGRES_URL",
    "postgresql+psycopg2://postgres:postgres@localhost:5432/ss_tools",
)
DATABASE_URL = os.getenv("DATABASE_URL", DEFAULT_POSTGRES_URL)
# [/DEF:DATABASE_URL:Constant]

# [DEF:TASKS_DATABASE_URL:Constant]
# @PURPOSE: URL for the tasks execution database.
TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", f"sqlite:///{BASE_DIR}/tasks.db")
# Defaults to DATABASE_URL to keep task logs in the same PostgreSQL instance.
TASKS_DATABASE_URL = os.getenv("TASKS_DATABASE_URL", DATABASE_URL)
# [/DEF:TASKS_DATABASE_URL:Constant]

# [DEF:AUTH_DATABASE_URL:Constant]
# @PURPOSE: URL for the authentication database.
AUTH_DATABASE_URL = os.getenv("AUTH_DATABASE_URL", auth_config.AUTH_DATABASE_URL)
# If it's a relative sqlite path starting with ./backend/, fix it to be absolute or relative to BASE_DIR
if AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./backend/", f"sqlite:///{BASE_DIR}/")
elif AUTH_DATABASE_URL.startswith("sqlite:///./") and not AUTH_DATABASE_URL.startswith("sqlite:///./backend/"):
    # If it's just ./ but we are in backend, it's fine, but let's make it absolute for robustness
    AUTH_DATABASE_URL = AUTH_DATABASE_URL.replace("sqlite:///./", f"sqlite:///{BASE_DIR}/")
# [/DEF:AUTH_DATABASE_URL:Constant]

# [DEF:engine:Variable]
def _build_engine(db_url: str):
    with belief_scope("_build_engine"):
        if db_url.startswith("sqlite"):
            return create_engine(db_url, connect_args={"check_same_thread": False})
        return create_engine(db_url, pool_pre_ping=True)


# @PURPOSE: SQLAlchemy engine for mappings database.
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
engine = _build_engine(DATABASE_URL)
# [/DEF:engine:Variable]

# [DEF:tasks_engine:Variable]
# @PURPOSE: SQLAlchemy engine for tasks database.
tasks_engine = create_engine(TASKS_DATABASE_URL, connect_args={"check_same_thread": False})
tasks_engine = _build_engine(TASKS_DATABASE_URL)
# [/DEF:tasks_engine:Variable]

# [DEF:auth_engine:Variable]
# @PURPOSE: SQLAlchemy engine for authentication database.
auth_engine = create_engine(AUTH_DATABASE_URL, connect_args={"check_same_thread": False})
auth_engine = _build_engine(AUTH_DATABASE_URL)
# [/DEF:auth_engine:Variable]

# [DEF:SessionLocal:Class]
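`_build_engine` above dispatches on the URL scheme; the essential decision is just which `create_engine` keyword arguments apply per dialect. A sketch of that dispatch, returning the kwargs rather than an engine so it runs without a database:

```python
def engine_kwargs(db_url: str) -> dict:
    # SQLite connections are bound to one thread by default, so multi-threaded
    # servers need check_same_thread disabled; networked databases instead
    # benefit from pool_pre_ping, which discards stale pooled connections.
    if db_url.startswith("sqlite"):
        return {"connect_args": {"check_same_thread": False}}
    return {"pool_pre_ping": True}

print(engine_kwargs("postgresql+psycopg2://localhost:5432/ss_tools"))  # → {'pool_pre_ping': True}
```

Routing all three engines through one helper keeps the SQLite fallback working while making PostgreSQL the primary target.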
@@ -24,10 +24,10 @@ from .models.auth import User
# Use absolute path relative to this file to ensure plugins are found regardless of CWD
project_root = Path(__file__).parent.parent.parent
config_path = project_root / "config.json"
config_manager = ConfigManager(config_path=str(config_path))

# Initialize database before any other services that might use it
# Initialize database before services that use persisted configuration.
init_db()
config_manager = ConfigManager(config_path=str(config_path))

# [DEF:get_config_manager:Function]
# @PURPOSE: Dependency injector for ConfigManager.
26
backend/src/models/config.py
Normal file
@@ -0,0 +1,26 @@
# [DEF:backend.src.models.config:Module]
#
# @TIER: STANDARD
# @SEMANTICS: database, config, settings, sqlalchemy
# @PURPOSE: Defines database schema for persisted application configuration.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> sqlalchemy

from sqlalchemy import Column, String, DateTime, JSON
from sqlalchemy.sql import func

from .mapping import Base


# [DEF:AppConfigRecord:Class]
# @PURPOSE: Stores the single source of truth for application configuration.
class AppConfigRecord(Base):
    __tablename__ = "app_configurations"

    id = Column(String, primary_key=True)
    payload = Column(JSON, nullable=False)
    updated_at = Column(DateTime(timezone=True), server_default=func.now(), onupdate=func.now())


# [/DEF:AppConfigRecord:Class]
# [/DEF:backend.src.models.config:Module]
@@ -22,6 +22,8 @@ class FileCategory(str, Enum):
# @PURPOSE: Configuration model for the storage system, defining paths and naming patterns.
class StorageConfig(BaseModel):
    root_path: str = Field(default="backups", description="Absolute path to the storage root directory.")
    backup_path: str = Field(default="backups", description="Subpath for backups.")
    repo_path: str = Field(default="repositorys", description="Subpath for repositories.")
    backup_structure_pattern: str = Field(default="{category}/", description="Pattern for backup directory structure.")
    repo_structure_pattern: str = Field(default="{category}/", description="Pattern for repository directory structure.")
    filename_pattern: str = Field(default="{name}_{timestamp}", description="Pattern for filenames.")
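The `*_pattern` defaults read like plain `str.format` templates; a small sketch of how such patterns would expand, assuming that interpretation (the field names come from the defaults above, the values are made up):

```python
# Templates copied from the StorageConfig defaults above; expansion via
# str.format is an assumption about how the storage layer consumes them.
backup_structure_pattern = "{category}/"
filename_pattern = "{name}_{timestamp}"

path = backup_structure_pattern.format(category="superset") + filename_pattern.format(
    name="dashboards", timestamp="20240101T120000"
)
print(path)  # superset/dashboards_20240101T120000
```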

350
backend/src/scripts/migrate_sqlite_to_postgres.py
Normal file
@@ -0,0 +1,350 @@
# [DEF:backend.src.scripts.migrate_sqlite_to_postgres:Module]
#
# @SEMANTICS: migration, sqlite, postgresql, config, task_logs, task_records
# @PURPOSE: Migrates legacy config and task history from SQLite/file storage to PostgreSQL.
# @LAYER: Scripts
# @RELATION: READS_FROM -> backend/tasks.db
# @RELATION: READS_FROM -> backend/config.json
# @RELATION: WRITES_TO -> postgresql.task_records
# @RELATION: WRITES_TO -> postgresql.task_logs
# @RELATION: WRITES_TO -> postgresql.app_configurations
#
# @INVARIANT: Script is idempotent for task_records and app_configurations.

# [SECTION: IMPORTS]
import argparse
import json
import os
import sqlite3
from pathlib import Path
from typing import Any, Dict, Iterable, Optional

from sqlalchemy import create_engine, text
from sqlalchemy.exc import SQLAlchemyError

from src.core.logger import belief_scope, logger
# [/SECTION]


# [DEF:Constants:Section]
DEFAULT_TARGET_URL = os.getenv(
    "DATABASE_URL",
    os.getenv("POSTGRES_URL", "postgresql+psycopg2://postgres:postgres@localhost:5432/ss_tools"),
)
# [/DEF:Constants:Section]


# [DEF:_json_load_if_needed:Function]
# @PURPOSE: Parses JSON-like values from SQLite TEXT/JSON columns to Python objects.
def _json_load_if_needed(value: Any) -> Any:
    with belief_scope("_json_load_if_needed"):
        if value is None:
            return None
        if isinstance(value, (dict, list)):
            return value
        if isinstance(value, str):
            raw = value.strip()
            if not raw:
                return None
            if raw[0] in "{[":
                try:
                    return json.loads(raw)
                except json.JSONDecodeError:
                    return value
        return value
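The function's contract is easiest to see with a few concrete inputs; the sketch below reproduces its body standalone (renamed, so it runs without the module's `belief_scope` wrapper):

```python
import json

def json_load_if_needed(value):
    # Standalone mirror of _json_load_if_needed above, for illustration only.
    if value is None:
        return None
    if isinstance(value, (dict, list)):
        return value
    if isinstance(value, str):
        raw = value.strip()
        if not raw:
            return None
        if raw[0] in "{[":
            try:
                return json.loads(raw)
            except json.JSONDecodeError:
                return value
    return value

print(json_load_if_needed('{"a": 1}'))    # parsed to a dict
print(json_load_if_needed("plain text"))  # returned unchanged
print(json_load_if_needed("{broken"))     # invalid JSON falls back to the raw string
```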


# [DEF:_find_legacy_config_path:Function]
# @PURPOSE: Resolves the existing legacy config.json path from candidates.
def _find_legacy_config_path(explicit_path: Optional[str]) -> Optional[Path]:
    with belief_scope("_find_legacy_config_path"):
        if explicit_path:
            p = Path(explicit_path)
            return p if p.exists() else None

        candidates = [
            Path("backend/config.json"),
            Path("config.json"),
        ]
        for candidate in candidates:
            if candidate.exists():
                return candidate
        return None


# [DEF:_connect_sqlite:Function]
# @PURPOSE: Opens a SQLite connection with row factory.
def _connect_sqlite(path: Path) -> sqlite3.Connection:
    with belief_scope("_connect_sqlite"):
        conn = sqlite3.connect(str(path))
        conn.row_factory = sqlite3.Row
        return conn


# [DEF:_ensure_target_schema:Function]
# @PURPOSE: Ensures required PostgreSQL tables exist before migration.
def _ensure_target_schema(engine) -> None:
    with belief_scope("_ensure_target_schema"):
        stmts: Iterable[str] = (
            """
            CREATE TABLE IF NOT EXISTS app_configurations (
                id TEXT PRIMARY KEY,
                payload JSONB NOT NULL,
                updated_at TIMESTAMPTZ DEFAULT NOW()
            )
            """,
            """
            CREATE TABLE IF NOT EXISTS task_records (
                id TEXT PRIMARY KEY,
                type TEXT NOT NULL,
                status TEXT NOT NULL,
                environment_id TEXT NULL,
                started_at TIMESTAMPTZ NULL,
                finished_at TIMESTAMPTZ NULL,
                logs JSONB NULL,
                error TEXT NULL,
                result JSONB NULL,
                created_at TIMESTAMPTZ DEFAULT NOW(),
                params JSONB NULL
            )
            """,
            """
            CREATE TABLE IF NOT EXISTS task_logs (
                id INTEGER PRIMARY KEY,
                task_id TEXT NOT NULL,
                timestamp TIMESTAMPTZ NOT NULL,
                level VARCHAR(16) NOT NULL,
                source VARCHAR(64) NOT NULL DEFAULT 'system',
                message TEXT NOT NULL,
                metadata_json TEXT NULL,
                CONSTRAINT fk_task_logs_task
                    FOREIGN KEY(task_id)
                    REFERENCES task_records(id)
                    ON DELETE CASCADE
            )
            """,
            "CREATE INDEX IF NOT EXISTS ix_task_logs_task_timestamp ON task_logs (task_id, timestamp)",
            "CREATE INDEX IF NOT EXISTS ix_task_logs_task_level ON task_logs (task_id, level)",
            "CREATE INDEX IF NOT EXISTS ix_task_logs_task_source ON task_logs (task_id, source)",
            """
            DO $$
            BEGIN
                IF EXISTS (
                    SELECT 1 FROM pg_class WHERE relkind = 'S' AND relname = 'task_logs_id_seq'
                ) THEN
                    PERFORM 1;
                ELSE
                    CREATE SEQUENCE task_logs_id_seq OWNED BY task_logs.id;
                END IF;
            END $$;
            """,
            "ALTER TABLE task_logs ALTER COLUMN id SET DEFAULT nextval('task_logs_id_seq')",
        )
        with engine.begin() as conn:
            for stmt in stmts:
                conn.execute(text(stmt))


# [DEF:_migrate_config:Function]
# @PURPOSE: Migrates legacy config.json into app_configurations(global).
def _migrate_config(engine, legacy_config_path: Optional[Path]) -> int:
    with belief_scope("_migrate_config"):
        if legacy_config_path is None:
            logger.info("[_migrate_config][Action] No legacy config.json found, skipping")
            return 0

        payload = json.loads(legacy_config_path.read_text(encoding="utf-8"))
        with engine.begin() as conn:
            conn.execute(
                text(
                    """
                    INSERT INTO app_configurations (id, payload, updated_at)
                    VALUES ('global', CAST(:payload AS JSONB), NOW())
                    ON CONFLICT (id)
                    DO UPDATE SET payload = EXCLUDED.payload, updated_at = NOW()
                    """
                ),
                {"payload": json.dumps(payload, ensure_ascii=True)},
            )
        logger.info("[_migrate_config][Coherence:OK] Config migrated from %s", legacy_config_path)
        return 1


# [DEF:_migrate_tasks_and_logs:Function]
# @PURPOSE: Migrates task_records and task_logs from SQLite into PostgreSQL.
def _migrate_tasks_and_logs(engine, sqlite_conn: sqlite3.Connection) -> Dict[str, int]:
    with belief_scope("_migrate_tasks_and_logs"):
        stats = {"task_records_total": 0, "task_records_inserted": 0, "task_logs_total": 0, "task_logs_inserted": 0}

        rows = sqlite_conn.execute(
            """
            SELECT id, type, status, environment_id, started_at, finished_at, logs, error, result, created_at, params
            FROM task_records
            ORDER BY created_at ASC
            """
        ).fetchall()
        stats["task_records_total"] = len(rows)

        with engine.begin() as conn:
            existing_env_ids = {
                row[0]
                for row in conn.execute(text("SELECT id FROM environments")).fetchall()
            }
            for row in rows:
                params_obj = _json_load_if_needed(row["params"])
                result_obj = _json_load_if_needed(row["result"])
                logs_obj = _json_load_if_needed(row["logs"])
                environment_id = row["environment_id"]
                if environment_id and environment_id not in existing_env_ids:
                    # Legacy task may reference environments that were not migrated; keep task row and drop FK value.
                    environment_id = None

                res = conn.execute(
                    text(
                        """
                        INSERT INTO task_records (
                            id, type, status, environment_id, started_at, finished_at,
                            logs, error, result, created_at, params
                        ) VALUES (
                            :id, :type, :status, :environment_id, :started_at, :finished_at,
                            CAST(:logs AS JSONB), :error, CAST(:result AS JSONB), :created_at, CAST(:params AS JSONB)
                        )
                        ON CONFLICT (id) DO NOTHING
                        """
                    ),
                    {
                        "id": row["id"],
                        "type": row["type"],
                        "status": row["status"],
                        "environment_id": environment_id,
                        "started_at": row["started_at"],
                        "finished_at": row["finished_at"],
                        "logs": json.dumps(logs_obj, ensure_ascii=True) if logs_obj is not None else None,
                        "error": row["error"],
                        "result": json.dumps(result_obj, ensure_ascii=True) if result_obj is not None else None,
                        "created_at": row["created_at"],
                        "params": json.dumps(params_obj, ensure_ascii=True) if params_obj is not None else None,
                    },
                )
                if res.rowcount and res.rowcount > 0:
                    stats["task_records_inserted"] += int(res.rowcount)

        log_rows = sqlite_conn.execute(
            """
            SELECT id, task_id, timestamp, level, source, message, metadata_json
            FROM task_logs
            ORDER BY id ASC
            """
        ).fetchall()
        stats["task_logs_total"] = len(log_rows)

        with engine.begin() as conn:
            for row in log_rows:
                # Preserve original IDs to keep migration idempotent.
                res = conn.execute(
                    text(
                        """
                        INSERT INTO task_logs (id, task_id, timestamp, level, source, message, metadata_json)
                        VALUES (:id, :task_id, :timestamp, :level, :source, :message, :metadata_json)
                        ON CONFLICT (id) DO NOTHING
                        """
                    ),
                    {
                        "id": row["id"],
                        "task_id": row["task_id"],
                        "timestamp": row["timestamp"],
                        "level": row["level"],
                        "source": row["source"] or "system",
                        "message": row["message"],
                        "metadata_json": row["metadata_json"],
                    },
                )
                if res.rowcount and res.rowcount > 0:
                    stats["task_logs_inserted"] += int(res.rowcount)

            # Ensure sequence is aligned after explicit id inserts.
            conn.execute(
                text(
                    """
                    SELECT setval(
                        'task_logs_id_seq',
                        COALESCE((SELECT MAX(id) FROM task_logs), 1),
                        TRUE
                    )
                    """
                )
            )

        logger.info(
            "[_migrate_tasks_and_logs][Coherence:OK] task_records=%s/%s task_logs=%s/%s",
            stats["task_records_inserted"],
            stats["task_records_total"],
            stats["task_logs_inserted"],
            stats["task_logs_total"],
        )
        return stats


# [DEF:run_migration:Function]
# @PURPOSE: Orchestrates migration from SQLite/file to PostgreSQL.
def run_migration(sqlite_path: Path, target_url: str, legacy_config_path: Optional[Path]) -> Dict[str, int]:
    with belief_scope("run_migration"):
        logger.info("[run_migration][Entry] sqlite=%s target=%s", sqlite_path, target_url)
        if not sqlite_path.exists():
            raise FileNotFoundError(f"SQLite source not found: {sqlite_path}")

        sqlite_conn = _connect_sqlite(sqlite_path)
        engine = create_engine(target_url, pool_pre_ping=True)
        try:
            _ensure_target_schema(engine)
            config_upserted = _migrate_config(engine, legacy_config_path)
            stats = _migrate_tasks_and_logs(engine, sqlite_conn)
            stats["config_upserted"] = config_upserted
            return stats
        finally:
            sqlite_conn.close()


# [DEF:main:Function]
# @PURPOSE: CLI entrypoint.
def main() -> int:
    with belief_scope("main"):
        parser = argparse.ArgumentParser(
            description="Migrate legacy config.json and task logs from SQLite to PostgreSQL.",
        )
        parser.add_argument(
            "--sqlite-path",
            default="backend/tasks.db",
            help="Path to source SQLite DB with task_records/task_logs (default: backend/tasks.db).",
        )
        parser.add_argument(
            "--target-url",
            default=DEFAULT_TARGET_URL,
            help="Target PostgreSQL SQLAlchemy URL (default: DATABASE_URL/POSTGRES_URL env).",
        )
        parser.add_argument(
            "--config-path",
            default=None,
            help="Optional path to legacy config.json (auto-detected when omitted).",
        )

        args = parser.parse_args()

        sqlite_path = Path(args.sqlite_path)
        legacy_config_path = _find_legacy_config_path(args.config_path)
        try:
            stats = run_migration(sqlite_path=sqlite_path, target_url=args.target_url, legacy_config_path=legacy_config_path)
            print("Migration completed.")
            print(json.dumps(stats, indent=2))
            return 0
        except (SQLAlchemyError, OSError, sqlite3.Error, ValueError) as e:
            logger.error("[main][Coherence:Failed] Migration failed: %s", e)
            print(f"Migration failed: {e}")
            return 1


if __name__ == "__main__":
    raise SystemExit(main())
# [/DEF:main:Function]

# [/DEF:backend.src.scripts.migrate_sqlite_to_postgres:Module]
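The @INVARIANT above rests on `ON CONFLICT (id) DO NOTHING` (and `DO UPDATE` for the config row): replaying the script re-issues the same INSERTs, which become no-ops. SQLite happens to support the same upsert clause, so the pattern can be sketched without a PostgreSQL server (table and values here are illustrative, not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE task_records (id TEXT PRIMARY KEY, status TEXT)")

def insert_task(task_id, status):
    # rowcount is 1 on first insert and 0 on replay, mirroring the
    # task_records_inserted / task_records_total counters above.
    cur = conn.execute(
        "INSERT INTO task_records (id, status) VALUES (?, ?) ON CONFLICT (id) DO NOTHING",
        (task_id, status),
    )
    return cur.rowcount

print(insert_task("t1", "completed"))  # first run inserts the row
print(insert_task("t1", "completed"))  # replay is a no-op
```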
@@ -18,3 +18,4 @@ def __getattr__(name):
        from .resource_service import ResourceService
        return ResourceService
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
# [/DEF:backend.src.services:Module]

BIN
backend/tasks.db
Binary file not shown.

42
docker-compose.yml
Normal file
@@ -0,0 +1,42 @@
services:
  db:
    image: ${POSTGRES_IMAGE:-postgres:16-alpine}
    container_name: ss_tools_db
    restart: unless-stopped
    environment:
      POSTGRES_DB: ss_tools
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    ports:
      - "${POSTGRES_HOST_PORT:-5432}:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d ss_tools"]
      interval: 10s
      timeout: 5s
      retries: 10

  app:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: ss_tools_app
    restart: unless-stopped
    depends_on:
      db:
        condition: service_healthy
    environment:
      POSTGRES_URL: postgresql+psycopg2://postgres:postgres@db:5432/ss_tools
      DATABASE_URL: postgresql+psycopg2://postgres:postgres@db:5432/ss_tools
      TASKS_DATABASE_URL: postgresql+psycopg2://postgres:postgres@db:5432/ss_tools
      AUTH_DATABASE_URL: postgresql+psycopg2://postgres:postgres@db:5432/ss_tools
      BACKEND_PORT: 8000
    ports:
      - "8000:8000"
    volumes:
      - ./backups:/app/backups
      - ./backend/git_repos:/app/backend/git_repos

volumes:
  postgres_data:
@@ -13,8 +13,8 @@

  const dispatch = createEventDispatcher();

  let passwords = {};
  let submitting = false;
  let passwords = $state({});
  let submitting = $state(false);

  // [DEF:handleSubmit:Function]
  // @PURPOSE: Validates and dispatches the passwords to resume the task.
@@ -69,7 +69,7 @@
  <div
    class="fixed inset-0 bg-gray-500 bg-opacity-75 transition-opacity"
    aria-hidden="true"
    on:click={handleCancel}
    onclick={handleCancel}
  ></div>

  <span
@@ -126,7 +126,7 @@
  {/if}

  <form
    on:submit|preventDefault={handleSubmit}
    onsubmit={(e) => { e.preventDefault(); handleSubmit(); }}
    class="space-y-4"
  >
    {#each databases as dbName}
@@ -158,7 +158,7 @@
  <button
    type="button"
    class="w-full inline-flex justify-center rounded-md border border-transparent shadow-sm px-4 py-2 bg-indigo-600 text-base font-medium text-white hover:bg-indigo-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:ml-3 sm:w-auto sm:text-sm disabled:opacity-50"
    on:click={handleSubmit}
    onclick={handleSubmit}
    disabled={submitting}
  >
    {submitting ? "Resuming..." : "Resume Migration"}
@@ -166,7 +166,7 @@
  <button
    type="button"
    class="mt-3 w-full inline-flex justify-center rounded-md border border-gray-300 shadow-sm px-4 py-2 bg-white text-base font-medium text-gray-700 hover:bg-gray-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-indigo-500 sm:mt-0 sm:ml-3 sm:w-auto sm:text-sm"
    on:click={handleCancel}
    onclick={handleCancel}
    disabled={submitting}
  >
    Cancel
@@ -123,7 +123,7 @@
    <span>{error}</span>
    <button
      class="bg-terminal-surface text-terminal-text-subtle border border-terminal-border rounded-md px-3 py-1 text-xs cursor-pointer transition-all hover:bg-terminal-border hover:text-terminal-text-bright"
      on:click={handleRefresh}>Retry</button
      onclick={handleRefresh}>Retry</button
    >
  </div>
{:else}
@@ -149,11 +149,11 @@
  <div
    class="fixed inset-0 bg-gray-500/75 transition-opacity"
    aria-hidden="true"
    on:click={() => {
    onclick={() => {
      show = false;
      dispatch("close");
    }}
    on:keydown={(e) => e.key === "Escape" && (show = false)}
    onkeydown={(e) => e.key === "Escape" && (show = false)}
    role="presentation"
  ></div>

@@ -170,7 +170,7 @@
  </h3>
  <button
    class="text-gray-500 hover:text-gray-300"
    on:click={() => {
    onclick={() => {
      show = false;
      dispatch("close");
    }}
@@ -7,67 +7,74 @@
|
||||
-->
|
||||
|
||||
<script>
|
||||
import { onMount } from 'svelte';
|
||||
import { t } from '../../lib/i18n';
|
||||
import { requestApi } from '../../lib/api';
|
||||
import { onMount } from "svelte";
|
||||
import { t } from "../../lib/i18n";
|
||||
import { requestApi } from "../../lib/api";
|
||||
|
||||
/** @type {Array} */
|
||||
let {
|
||||
provider,
|
||||
config = {},
|
||||
} = $props();
|
||||
|
||||
let { providers = [], onSave = () => {} } = $props();
|
||||
|
||||
let editingProvider = null;
|
||||
let showForm = false;
|
||||
|
||||
let formData = {
|
||||
name: '',
|
||||
provider_type: 'openai',
|
||||
base_url: 'https://api.openai.com/v1',
|
||||
api_key: '',
|
||||
default_model: 'gpt-4o',
|
||||
is_active: true
|
||||
name: "",
|
||||
provider_type: "openai",
|
||||
base_url: "https://api.openai.com/v1",
|
||||
api_key: "",
|
||||
default_model: "gpt-4o",
|
||||
is_active: true,
|
||||
};
|
||||
|
||||
let testStatus = { type: '', message: '' };
|
||||
let testStatus = { type: "", message: "" };
|
||||
let isTesting = false;
|
||||
|
||||
function resetForm() {
|
||||
formData = {
|
||||
name: '',
|
||||
provider_type: 'openai',
|
||||
base_url: 'https://api.openai.com/v1',
|
||||
api_key: '',
|
||||
default_model: 'gpt-4o',
|
||||
is_active: true
|
||||
name: "",
|
||||
provider_type: "openai",
|
||||
base_url: "https://api.openai.com/v1",
|
||||
api_key: "",
|
||||
default_model: "gpt-4o",
|
||||
is_active: true,
|
||||
};
|
||||
editingProvider = null;
|
||||
testStatus = { type: '', message: '' };
|
||||
testStatus = { type: "", message: "" };
|
||||
}
|
||||
|
||||
function handleEdit(provider) {
|
||||
editingProvider = provider;
|
||||
formData = { ...provider, api_key: '' }; // Don't populate key for security
|
||||
formData = { ...provider, api_key: "" }; // Don't populate key for security
|
||||
showForm = true;
|
||||
}
|
||||
|
||||
async function testConnection() {
|
||||
console.log("[ProviderConfig][Action] Testing connection", formData);
|
||||
isTesting = true;
|
||||
testStatus = { type: 'info', message: $t.llm.testing };
|
||||
testStatus = { type: "info", message: $t.llm.testing };
|
||||
|
||||
try {
|
||||
const endpoint = editingProvider ? `/llm/providers/${editingProvider.id}/test` : '/llm/providers/test';
|
||||
const result = await requestApi(endpoint, 'POST', formData);
|
||||
const endpoint = editingProvider
|
||||
? `/llm/providers/${editingProvider.id}/test`
|
||||
: "/llm/providers/test";
|
||||
const result = await requestApi(endpoint, "POST", formData);
|
||||
|
||||
if (result.success) {
|
||||
testStatus = { type: 'success', message: $t.llm.connection_success };
|
||||
testStatus = { type: "success", message: $t.llm.connection_success };
|
||||
} else {
|
||||
testStatus = { type: 'error', message: $t.llm.connection_failed.replace('{error}', result.error || 'Unknown error') };
|
||||
testStatus = {
|
||||
type: "error",
|
||||
message: $t.llm.connection_failed.replace(
|
||||
"{error}",
|
||||
result.error || "Unknown error",
|
||||
),
|
||||
};
|
||||
}
|
||||
} catch (err) {
|
||||
testStatus = { type: 'error', message: $t.llm.connection_failed.replace('{error}', err.message) };
|
||||
testStatus = {
|
||||
type: "error",
|
||||
message: $t.llm.connection_failed.replace("{error}", err.message),
|
||||
};
|
||||
} finally {
|
||||
isTesting = false;
|
||||
}
|
||||
@@ -75,8 +82,10 @@
|
||||
|
||||
async function handleSubmit() {
|
||||
console.log("[ProviderConfig][Action] Submitting provider config");
|
||||
const method = editingProvider ? 'PUT' : 'POST';
|
||||
const endpoint = editingProvider ? `/llm/providers/${editingProvider.id}` : '/llm/providers';
|
||||
const method = editingProvider ? "PUT" : "POST";
|
||||
const endpoint = editingProvider
|
||||
? `/llm/providers/${editingProvider.id}`
|
||||
: "/llm/providers";
|
||||
|
||||
// When editing, only include api_key if user entered a new one
|
||||
const submitData = { ...formData };
|
||||
@@ -97,9 +106,9 @@
|
||||
|
||||
async function toggleActive(provider) {
|
||||
try {
|
||||
await requestApi(`/llm/providers/${provider.id}`, 'PUT', {
|
||||
await requestApi(`/llm/providers/${provider.id}`, "PUT", {
|
||||
...provider,
|
||||
is_active: !provider.is_active
|
||||
is_active: !provider.is_active,
|
||||
});
|
||||
onSave();
|
||||
} catch (err) {
|
||||
@@ -113,26 +122,51 @@
|
||||
<h2 class="text-xl font-bold">{$t.llm.providers_title}</h2>
|
||||
<button
|
||||
class="bg-blue-600 text-white px-4 py-2 rounded hover:bg-blue-700 transition"
|
||||
on:click={() => { resetForm(); showForm = true; }}
|
||||
on:click={() => {
|
||||
resetForm();
|
||||
showForm = true;
|
||||
}}
|
||||
>
|
||||
{$t.llm.add_provider}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{#if showForm}
|
||||
<div class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
|
||||
<div
|
||||
class="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50"
|
||||
>
|
||||
<div class="bg-white p-6 rounded-lg shadow-xl w-full max-w-md">
|
||||
<h3 class="text-lg font-semibold mb-4">{editingProvider ? $t.llm.edit_provider : $t.llm.new_provider}</h3>
|
||||
<h3 class="text-lg font-semibold mb-4">
|
||||
{editingProvider ? $t.llm.edit_provider : $t.llm.new_provider}
|
||||
</h3>
|
||||
|
||||
<div class="space-y-4">
|
||||
<div>
|
||||
<label for="provider-name" class="block text-sm font-medium text-gray-700">{$t.llm.name}</label>
|
||||
<input id="provider-name" type="text" bind:value={formData.name} class="mt-1 block w-full border rounded-md p-2" placeholder="e.g. My OpenAI" />
|
||||
<label
|
||||
for="provider-name"
|
||||
class="block text-sm font-medium text-gray-700"
|
||||
>{$t.llm.name}</label
|
||||
>
|
||||
<input
|
||||
id="provider-name"
|
||||
type="text"
|
||||
bind:value={formData.name}
|
||||
class="mt-1 block w-full border rounded-md p-2"
|
||||
placeholder="e.g. My OpenAI"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label for="provider-type" class="block text-sm font-medium text-gray-700">{$t.llm.type}</label>
|
||||
<select id="provider-type" bind:value={formData.provider_type} class="mt-1 block w-full border rounded-md p-2">
|
||||
<label
|
||||
for="provider-type"
|
||||
class="block text-sm font-medium text-gray-700"
|
||||
>{$t.llm.type}</label
|
||||
>
|
||||
<select
|
||||
id="provider-type"
|
||||
bind:value={formData.provider_type}
|
||||
class="mt-1 block w-full border rounded-md p-2"
|
||||
>
|
||||
<option value="openai">OpenAI</option>
|
||||
<option value="openrouter">OpenRouter</option>
|
||||
<option value="kilo">Kilo</option>
|
||||
@@ -140,28 +174,67 @@
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label for="provider-base-url" class="block text-sm font-medium text-gray-700">{$t.llm.base_url}</label>
|
||||
<input id="provider-base-url" type="text" bind:value={formData.base_url} class="mt-1 block w-full border rounded-md p-2" />
|
||||
<label
|
||||
for="provider-base-url"
|
||||
class="block text-sm font-medium text-gray-700"
|
||||
>{$t.llm.base_url}</label
|
||||
>
|
||||
<input
|
||||
id="provider-base-url"
|
||||
type="text"
|
||||
bind:value={formData.base_url}
|
||||
class="mt-1 block w-full border rounded-md p-2"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label for="provider-api-key" class="block text-sm font-medium text-gray-700">{$t.llm.api_key}</label>
|
||||
<input id="provider-api-key" type="password" bind:value={formData.api_key} class="mt-1 block w-full border rounded-md p-2" placeholder={editingProvider ? "••••••••" : "sk-..."} />
|
||||
<label
|
||||
for="provider-api-key"
|
||||
class="block text-sm font-medium text-gray-700"
|
||||
>{$t.llm.api_key}</label
|
||||
>
|
||||
<input
|
||||
id="provider-api-key"
|
||||
type="password"
|
||||
bind:value={formData.api_key}
|
||||
class="mt-1 block w-full border rounded-md p-2"
|
||||
placeholder={editingProvider ? "••••••••" : "sk-..."}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label for="provider-default-model" class="block text-sm font-medium text-gray-700">{$t.llm.default_model}</label>
|
||||
<input id="provider-default-model" type="text" bind:value={formData.default_model} class="mt-1 block w-full border rounded-md p-2" placeholder="gpt-4o" />
|
||||
<label
|
||||
for="provider-default-model"
|
||||
class="block text-sm font-medium text-gray-700"
|
||||
>{$t.llm.default_model}</label
|
||||
>
|
||||
<input
|
||||
id="provider-default-model"
|
||||
type="text"
|
||||
bind:value={formData.default_model}
|
||||
class="mt-1 block w-full border rounded-md p-2"
|
||||
placeholder="gpt-4o"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div class="flex items-center">
|
||||
<input id="provider-active" type="checkbox" bind:checked={formData.is_active} class="mr-2" />
|
||||
<label for="provider-active" class="text-sm font-medium text-gray-700">{$t.llm.active}</label>
|
||||
<input
|
||||
id="provider-active"
|
||||
type="checkbox"
|
||||
bind:checked={formData.is_active}
|
||||
class="mr-2"
|
||||
/>
|
||||
<label
|
||||
for="provider-active"
|
||||
class="text-sm font-medium text-gray-700">{$t.llm.active}</label
|
||||
>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{#if testStatus.message}
|
||||
<div class={`mt-4 p-2 rounded text-sm ${testStatus.type === 'success' ? 'bg-green-100 text-green-800' : testStatus.type === 'error' ? 'bg-red-100 text-red-800' : 'bg-blue-100 text-blue-800'}`}>
|
||||
<div
|
||||
class={`mt-4 p-2 rounded text-sm ${testStatus.type === "success" ? "bg-green-100 text-green-800" : testStatus.type === "error" ? "bg-red-100 text-red-800" : "bg-blue-100 text-blue-800"}`}
|
||||
>
|
||||
{testStatus.message}
|
||||
</div>
|
||||
{/if}
|
||||
@@ -169,7 +242,9 @@
|
||||
<div class="mt-6 flex justify-between gap-2">
|
||||
<button
|
||||
class="px-4 py-2 border rounded hover:bg-gray-50 flex-1"
|
||||
on:click={() => { showForm = false; }}
|
||||
on:click={() => {
|
||||
showForm = false;
|
||||
}}
|
||||
>
|
||||
{$t.llm.cancel}
|
||||
</button>
|
||||
@@ -193,15 +268,21 @@
|
||||
|
||||
<div class="grid gap-4">
|
||||
{#each providers as provider}
|
||||
<div class="border rounded-lg p-4 flex justify-between items-center bg-white shadow-sm">
|
||||
<div
|
||||
class="border rounded-lg p-4 flex justify-between items-center bg-white shadow-sm"
|
||||
>
|
||||
<div>
|
||||
<div class="font-bold flex items-center gap-2">
|
||||
{provider.name}
|
||||
<span class={`text-xs px-2 py-0.5 rounded-full ${provider.is_active ? 'bg-green-100 text-green-800' : 'bg-gray-100 text-gray-800'}`}>
|
||||
{provider.is_active ? $t.llm.active : 'Inactive'}
|
||||
<span
|
||||
class={`text-xs px-2 py-0.5 rounded-full ${provider.is_active ? "bg-green-100 text-green-800" : "bg-gray-100 text-gray-800"}`}
|
||||
>
|
||||
{provider.is_active ? $t.llm.active : "Inactive"}
|
||||
</span>
|
||||
</div>
|
||||
<div class="text-sm text-gray-500">{provider.provider_type} • {provider.default_model}</div>
|
||||
<div class="text-sm text-gray-500">
|
||||
{provider.provider_type} • {provider.default_model}
|
||||
</div>
|
||||
</div>
|
||||
<div class="flex gap-2">
|
||||
<button
|
||||
@@ -211,15 +292,17 @@
|
||||
{$t.common.edit}
|
||||
</button>
|
||||
<button
|
||||
class={`text-sm ${provider.is_active ? 'text-orange-600' : 'text-green-600'} hover:underline`}
|
||||
class={`text-sm ${provider.is_active ? "text-orange-600" : "text-green-600"} hover:underline`}
|
||||
on:click={() => toggleActive(provider)}
|
||||
>
|
||||
{provider.is_active ? 'Deactivate' : 'Activate'}
|
||||
{provider.is_active ? "Deactivate" : "Activate"}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
{:else}
|
||||
<div class="text-center py-8 text-gray-500 border-2 border-dashed rounded-lg">
|
||||
<div
|
||||
class="text-center py-8 text-gray-500 border-2 border-dashed rounded-lg"
|
||||
>
|
||||
{$t.llm.no_providers}
|
||||
</div>
|
||||
{/each}
|
||||
|
||||
@@ -74,7 +74,7 @@
  class="bg-terminal-surface text-terminal-text-subtle border border-terminal-border rounded px-2 py-[0.3125rem] text-xs cursor-pointer shrink-0 appearance-none pr-6 focus:outline-none focus:border-primary-ring"
  style="background-image: url(&quot;data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='12' height='12' viewBox='0 0 24 24' fill='none' stroke='%2364748b' stroke-width='2'%3E%3Cpath d='M6 9l6 6 6-6'/%3E%3C/svg%3E&quot;); background-repeat: no-repeat; background-position: right 0.375rem center;"
  value={selectedLevel}
  on:change={handleLevelChange}
  onchange={handleLevelChange}
  aria-label="Filter by level"
>
{#each levelOptions as option}
@@ -86,7 +86,7 @@
  class="bg-terminal-surface text-terminal-text-subtle border border-terminal-border rounded px-2 py-[0.3125rem] text-xs cursor-pointer shrink-0 appearance-none pr-6 focus:outline-none focus:border-primary-ring"
  style="background-image: url(&quot;data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='12' height='12' viewBox='0 0 24 24' fill='none' stroke='%2364748b' stroke-width='2'%3E%3Cpath d='M6 9l6 6 6-6'/%3E%3C/svg%3E&quot;); background-repeat: no-repeat; background-position: right 0.375rem center;"
  value={selectedSource}
  on:change={handleSourceChange}
  onchange={handleSourceChange}
  aria-label="Filter by source"
>
<option value="">All Sources</option>
@@ -114,7 +114,7 @@
  class="w-full bg-terminal-surface text-terminal-text-bright border border-terminal-border rounded py-[0.3125rem] px-2 pl-7 text-xs placeholder:text-terminal-text-muted focus:outline-none focus:border-primary-ring"
  placeholder="Search..."
  value={searchText}
  on:input={handleSearchChange}
  oninput={handleSearchChange}
  aria-label="Search logs"
/>
</div>
@@ -123,7 +123,7 @@
{#if hasActiveFilters}
<button
  class="flex items-center justify-center p-[0.3125rem] bg-transparent border border-terminal-border rounded text-terminal-text-subtle shrink-0 cursor-pointer transition-all hover:text-log-error hover:border-log-error hover:bg-log-error/10"
  on:click={clearFilters}
  onclick={clearFilters}
  aria-label="Clear filters"
>
<svg

@@ -134,7 +134,7 @@
<button
  class="flex items-center gap-1.5 bg-transparent border-none text-terminal-text-muted text-[0.6875rem] cursor-pointer py-px px-1.5 rounded transition-all hover:bg-terminal-surface hover:text-terminal-text-subtle
  {autoScroll ? 'text-terminal-accent' : ''}"
  on:click={toggleAutoScroll}
  onclick={toggleAutoScroll}
  aria-label="Toggle auto-scroll"
>
{#if autoScroll}

@@ -1,10 +1,17 @@
<!-- [DEF:Counter:Component] -->
<!--
@TIER: TRIVIAL
@PURPOSE: Simple counter demo component
@LAYER: UI
-->
<script>
  let count = $state(0)
  let count = $state(0);
  const increment = () => {
    count += 1
  }
    count += 1;
  };
</script>

<button onclick={increment}>
  count is {count}
</button>
<!-- [/DEF:Counter:Component] -->

@@ -28,6 +28,7 @@ describe('SidebarStore', () => {
    expect(state.isMobileOpen).toBe(false);
  });
});
// [/DEF:test_sidebar_initial_state:Function]

// [DEF:test_toggleSidebar:Function]
// @TEST: toggleSidebar toggles isExpanded state
@@ -52,6 +53,7 @@ describe('SidebarStore', () => {
    expect(state.isExpanded).toBe(true);
  });
});
// [/DEF:test_toggleSidebar:Function]

// [DEF:test_setActiveItem:Function]
// @TEST: setActiveItem updates activeCategory and activeItem
@@ -74,6 +76,7 @@ describe('SidebarStore', () => {
    expect(state.activeItem).toBe('/settings');
  });
});
// [/DEF:test_setActiveItem:Function]

// [DEF:test_mobile_functions:Function]
// @TEST: Mobile functions correctly update isMobileOpen
@@ -110,6 +113,7 @@ describe('SidebarStore', () => {
    expect(state2.isMobileOpen).toBe(initialMobileOpen);
  });
});
// [/DEF:test_mobile_functions:Function]
});

// [/DEF:frontend.src.lib.stores.__tests__.sidebar:Module]

@@ -1,4 +1,9 @@
// [DEF:Utils:Module]
/**
 * @TIER: TRIVIAL
 * @PURPOSE: General utility functions (class merging)
 * @LAYER: Infra
 *
 * Merges class names into a single string.
 * @param {...(string | undefined | null | false)} inputs
 * @returns {string}
@@ -6,3 +11,4 @@
export function cn(...inputs) {
  return inputs.filter(Boolean).join(" ");
}
// [/DEF:Utils:Module]

@@ -1,4 +1,9 @@
// [DEF:Debounce:Module]
/**
 * @TIER: TRIVIAL
 * @PURPOSE: Debounce utility for limiting function execution rate
 * @LAYER: Infra
 *
 * Debounce utility function
 * Delays the execution of a function until a specified time has passed since the last call
 *
@@ -17,3 +22,4 @@ export function debounce(func, wait) {
  timeout = setTimeout(later, wait);
};
}
// [/DEF:Debounce:Module]

@@ -1,11 +1,24 @@
<!-- [DEF:ErrorPage:Page] -->
<script>
  import { page } from '$app/stores';
  /**
   * @TIER: STANDARD
   * @PURPOSE: Global error page displaying HTTP status and messages
   * @LAYER: UI
   * @UX_STATE: Error -> Displays error code and message with home link
   */
  import { page } from "$app/stores";
</script>

<div class="container mx-auto p-4 text-center mt-20">
  <h1 class="text-6xl font-bold text-gray-800 mb-4">{$page.status}</h1>
  <p class="text-2xl text-gray-600 mb-8">{$page.error?.message || 'Page not found'}</p>
  <a href="/" class="bg-blue-500 text-white px-6 py-3 rounded-lg hover:bg-blue-600 transition-colors">
  <p class="text-2xl text-gray-600 mb-8">
    {$page.error?.message || "Page not found"}
  </p>
  <a
    href="/"
    class="bg-blue-500 text-white px-6 py-3 rounded-lg hover:bg-blue-600 transition-colors"
  >
    Back to Dashboard
  </a>
</div>
<!-- [/DEF:ErrorPage:Page] -->

@@ -1,2 +1,9 @@
// [DEF:RootLayoutConfig:Module]
/**
 * @TIER: TRIVIAL
 * @PURPOSE: Root layout configuration (SPA mode)
 * @LAYER: Infra
 */
export const ssr = false;
export const prerender = false;
// [/DEF:RootLayoutConfig:Module]

@@ -1,3 +1,9 @@
// [DEF:DashboardTypes:Module]
/**
 * @TIER: TRIVIAL
 * @PURPOSE: TypeScript interfaces for Dashboard entities
 * @LAYER: Domain
 */
export interface DashboardMetadata {
  id: number;
  title: string;
@@ -11,3 +17,4 @@ export interface DashboardSelection {
  target_env_id: string;
  replace_db_config?: boolean;
}
// [/DEF:DashboardTypes:Module]
@@ -1,13 +1,14 @@
# [DEF:generate_semantic_map:Module]
#
# @TIER: CRITICAL
# @SEMANTICS: semantic_analysis, parser, map_generator, compliance_checker, tier_validation, svelte_props, data_flow
# @PURPOSE: Scans the codebase to generate a Semantic Map and Compliance Report based on the System Standard.
# @SEMANTICS: semantic_analysis, parser, map_generator, compliance_checker, tier_validation, svelte_props, data_flow, module_map
# @PURPOSE: Scans the codebase to generate a Semantic Map, Module Map, and Compliance Report based on the System Standard.
# @LAYER: DevOps/Tooling
# @INVARIANT: All DEF anchors must have matching closing anchors; TIER determines validation strictness.
# @RELATION: READS -> FileSystem
# @RELATION: PRODUCES -> semantics/semantic_map.json
# @RELATION: PRODUCES -> .ai/PROJECT_MAP.md
# @RELATION: PRODUCES -> .ai/MODULE_MAP.md
# @RELATION: PRODUCES -> semantics/reports/semantic_report_*.md

# [SECTION: IMPORTS]
@@ -83,6 +84,7 @@ IGNORE_FILES = {
}
OUTPUT_JSON = "semantics/semantic_map.json"
OUTPUT_COMPRESSED_MD = ".ai/PROJECT_MAP.md"
OUTPUT_MODULE_MAP_MD = ".ai/MODULE_MAP.md"
REPORTS_DIR = "semantics/reports"

# Tier-based mandatory tags
@@ -830,6 +832,7 @@ class SemanticMapGenerator:

        self._generate_report()
        self._generate_compressed_map()
        self._generate_module_map()
    # [/DEF:_generate_artifacts:Function]

    # [DEF:_generate_report:Function]
@@ -990,6 +993,163 @@ class SemanticMapGenerator:
                self._write_entity_md(f, child, level + 1)
    # [/DEF:_write_entity_md:Function]

    # [DEF:_generate_module_map:Function]
    # @TIER: CRITICAL
    # @PURPOSE: Generates a module-centric map grouping entities by directory structure.
    # @PRE: Entities have been processed.
    # @POST: Markdown module map is written to .ai/MODULE_MAP.md.
    def _generate_module_map(self):
        with belief_scope("_generate_module_map"):
            os.makedirs(os.path.dirname(OUTPUT_MODULE_MAP_MD), exist_ok=True)

            # Group entities by directory/module
            modules: Dict[str, Dict[str, Any]] = {}

            # [DEF:_get_module_path:Function]
            # @TIER: STANDARD
            # @PURPOSE: Extracts the module path from a file path.
            # @PRE: file_path is a valid relative path.
            # @POST: Returns a module path string.
            def _get_module_path(file_path: str) -> str:
                # Convert file path to module-like path
                parts = file_path.replace(os.sep, '/').split('/')
                # Remove filename
                if len(parts) > 1:
                    return '/'.join(parts[:-1])
                return 'root'
            # [/DEF:_get_module_path:Function]
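A minimal standalone sketch of the path-to-module mapping in `_get_module_path` (the function name and sample paths here are illustrative, not from the repository):

```python
import os

def get_module_path(file_path: str) -> str:
    # Normalize separators, then drop the filename; everything
    # before it is treated as the "module" directory.
    parts = file_path.replace(os.sep, '/').split('/')
    if len(parts) > 1:
        return '/'.join(parts[:-1])
    return 'root'

print(get_module_path("frontend/src/lib/utils.js"))  # frontend/src/lib
print(get_module_path("README.md"))                  # root
```

Top-level files collapse into a single synthetic `root` module, which keeps the generated map free of empty group headers.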

            # [DEF:_collect_all_entities:Function]
            # @TIER: STANDARD
            # @PURPOSE: Flattens entity tree for easier grouping.
            # @PRE: entity list is valid.
            # @POST: Returns flat list of all entities with their hierarchy.
            def _collect_all_entities(entities: List[SemanticEntity], result: List[Tuple[str, SemanticEntity]]):
                for e in entities:
                    result.append((_get_module_path(e.file_path), e))
                    _collect_all_entities(e.children, result)
            # [/DEF:_collect_all_entities:Function]
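The recursive flattening in `_collect_all_entities` is a pre-order tree walk; it can be exercised on a toy tree (the `Node` type below is a hypothetical stand-in for `SemanticEntity`, keeping only the `file_path` and `children` fields the walk touches):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    name: str
    file_path: str
    children: List["Node"] = field(default_factory=list)

def collect(entities: List[Node], result: List[Tuple[str, Node]]) -> None:
    # Pre-order walk: append the parent, then recurse into its children,
    # so nested entities inherit their position in the output order.
    for e in entities:
        module = e.file_path.rsplit("/", 1)[0] if "/" in e.file_path else "root"
        result.append((module, e))
        collect(e.children, result)

tree = [Node("Mod", "src/app.py", [Node("fn", "src/app.py")])]
out: List[Tuple[str, Node]] = []
collect(tree, out)
print([m for m, _ in out])  # ['src', 'src']
```

Passing the accumulator list in, as the diff does, avoids allocating intermediate lists at every recursion level.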

            # Collect all entities
            all_entities: List[Tuple[str, SemanticEntity]] = []
            _collect_all_entities(self.entities, all_entities)

            # Group by module path
            for module_path, entity in all_entities:
                if module_path not in modules:
                    modules[module_path] = {
                        'entities': [],
                        'files': set(),
                        'layers': set(),
                        'tiers': {'CRITICAL': 0, 'STANDARD': 0, 'TRIVIAL': 0},
                        'relations': []
                    }
                modules[module_path]['entities'].append(entity)
                modules[module_path]['files'].add(entity.file_path)
                if entity.tags.get('LAYER'):
                    modules[module_path]['layers'].add(entity.tags.get('LAYER'))
                tier = entity.get_tier().value
                modules[module_path]['tiers'][tier] = modules[module_path]['tiers'].get(tier, 0) + 1
                for rel in entity.relations:
                    modules[module_path]['relations'].append(rel)

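The grouping loop above is the usual accumulate-into-dict pattern; for reference, a compact equivalent using `collections.defaultdict` (bucket fields mirror the diff, entity objects replaced by plain dicts purely for illustration):

```python
from collections import defaultdict

def new_bucket():
    # Same shape as the per-module dict built in the loop above.
    return {"entities": [], "files": set(),
            "tiers": {"CRITICAL": 0, "STANDARD": 0, "TRIVIAL": 0}}

modules = defaultdict(new_bucket)
entities = [
    ("src", {"file": "src/a.py", "tier": "CRITICAL"}),
    ("src", {"file": "src/b.py", "tier": "TRIVIAL"}),
]
for module_path, e in entities:
    bucket = modules[module_path]   # bucket is auto-created on first access
    bucket["entities"].append(e)
    bucket["files"].add(e["file"])
    bucket["tiers"][e["tier"]] += 1

print(modules["src"]["tiers"])  # {'CRITICAL': 1, 'STANDARD': 0, 'TRIVIAL': 1}
```

The explicit `if module_path not in modules` check in the diff does the same job; `defaultdict` simply moves the bucket-creation logic into one factory function.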
            # Write module map
            with open(OUTPUT_MODULE_MAP_MD, 'w', encoding='utf-8') as f:
                f.write("# Module Map\n\n")
                f.write("> High-level module structure for AI Context. Generated automatically.\n\n")
                f.write(f"**Generated:** {datetime.datetime.now().isoformat()}\n\n")

                # Summary statistics
                total_modules = len(modules)
                total_entities = len(all_entities)
                f.write("## Summary\n\n")
                f.write(f"- **Total Modules:** {total_modules}\n")
                f.write(f"- **Total Entities:** {total_entities}\n\n")

                # Module hierarchy
                f.write("## Module Hierarchy\n\n")

                # Sort modules by path for consistent output
                sorted_modules = sorted(modules.items(), key=lambda x: x[0])

                for module_path, data in sorted_modules:
                    # Calculate module depth for indentation
                    depth = module_path.count('/')
                    indent = " " * depth

                    # Module header
                    module_name = module_path.split('/')[-1] if module_path != 'root' else 'root'
                    f.write(f"{indent}### 📁 `{module_name}/`\n\n")

                    # Module metadata
                    if data['layers']:
                        layers_str = ", ".join(sorted(data['layers']))
                        f.write(f"{indent}- 🏗️ **Layers:** {layers_str}\n")

                    tiers_summary = []
                    for tier_name, count in data['tiers'].items():
                        if count > 0:
                            tiers_summary.append(f"{tier_name}: {count}")
                    if tiers_summary:
                        f.write(f"{indent}- 📊 **Tiers:** {', '.join(tiers_summary)}\n")

                    f.write(f"{indent}- 📄 **Files:** {len(data['files'])}\n")
                    f.write(f"{indent}- 📦 **Entities:** {len(data['entities'])}\n")

                    # List key entities (Modules, Classes, Components only)
                    key_entities = [e for e in data['entities'] if e.type in ['Module', 'Class', 'Component', 'Store']]
                    if key_entities:
                        f.write(f"\n{indent}**Key Entities:**\n\n")
                        for entity in sorted(key_entities, key=lambda x: (x.type, x.name))[:10]:
                            icon = "📦" if entity.type == "Module" else "ℂ" if entity.type == "Class" else "🧩" if entity.type == "Component" else "🗄️"
                            tier_badge = ""
                            if entity.get_tier() == Tier.CRITICAL:
                                tier_badge = " `[CRITICAL]`"
                            elif entity.get_tier() == Tier.TRIVIAL:
                                tier_badge = " `[TRIVIAL]`"
                            purpose = entity.tags.get('PURPOSE', '')[:60] + "..." if entity.tags.get('PURPOSE') and len(entity.tags.get('PURPOSE', '')) > 60 else entity.tags.get('PURPOSE', '')
                            f.write(f"{indent}  - {icon} **{entity.name}** ({entity.type}){tier_badge}\n")
                            if purpose:
                                f.write(f"{indent}    - {purpose}\n")

                    # External relations
                    external_relations = [r for r in data['relations'] if r['type'] in ['DEPENDS_ON', 'IMPLEMENTS', 'INHERITS']]
                    if external_relations:
                        unique_deps = {}
                        for rel in external_relations:
                            key = f"{rel['type']} -> {rel['target']}"
                            unique_deps[key] = rel
                        f.write(f"\n{indent}**Dependencies:**\n\n")
                        for rel_str in sorted(unique_deps.keys())[:5]:
                            f.write(f"{indent}  - 🔗 {rel_str}\n")

                    f.write("\n")

                # Cross-module dependency graph
                f.write("## Cross-Module Dependencies\n\n")
                f.write("```mermaid\n")
                f.write("graph TD\n")

                # Find inter-module dependencies
                for module_path, data in sorted_modules:
                    module_name = module_path.split('/')[-1] if module_path != 'root' else 'root'
                    safe_name = module_name.replace('-', '_').replace('.', '_')

                    for rel in data['relations']:
                        target = rel.get('target', '')
                        # Check if target references another module
                        for other_module in modules:
                            if other_module != module_path and other_module in target:
                                other_name = other_module.split('/')[-1]
                                safe_other = other_name.replace('-', '_').replace('.', '_')
                                f.write(f"    {safe_name}-->|{rel['type']}|{safe_other}\n")
                                break

                f.write("```\n")
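As written, the Mermaid loop emits one edge per matching relation, so two modules linked by several relations of the same type produce duplicate arrows. If that becomes noisy, a seen-set is a common guard (a sketch with hypothetical module names, not the diff's own code):

```python
# Deduplicate (source, relation type, target) edges before emitting them.
seen = set()
edges = [("core", "DEPENDS_ON", "infra"), ("core", "DEPENDS_ON", "infra")]
lines = []
for src, rel_type, dst in edges:
    key = (src, rel_type, dst)
    if key in seen:
        continue            # skip an edge already drawn
    seen.add(key)
    lines.append(f"    {src}-->|{rel_type}|{dst}")

print(lines)  # ['    core-->|DEPENDS_ON|infra']
```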

            print(f"Generated {OUTPUT_MODULE_MAP_MD}")
    # [/DEF:_generate_module_map:Function]

# [/DEF:SemanticMapGenerator:Class]

File diff suppressed because it is too large

15
ut
Normal file
@@ -0,0 +1,15 @@
Prepended http:// to './RealiTLScanner'
--2026-02-20 11:14:59-- http://./RealiTLScanner
Resolving . (.)… failed: No address associated with hostname.
wget: unable to resolve host address ‘.’
Prepended http:// to 'www.microsoft.com'
--2026-02-20 11:14:59-- http://www.microsoft.com/
Resolving www.microsoft.com (www.microsoft.com)… 95.100.178.81
Connecting to www.microsoft.com (www.microsoft.com)|95.100.178.81|:80... connected.
HTTP request sent, awaiting response… 403 Forbidden
2026-02-20 11:15:00 ERROR 403: Forbidden.

Prepended http:// to 'file.csv'
--2026-02-20 11:15:00-- http://file.csv/
Resolving file.csv (file.csv)… failed: Name or service not known.
wget: unable to resolve host address ‘file.csv’