# Module Contracts: LLM Analysis Plugin

**Feature**: `017-llm-analysis-plugin`
**Protocol**: `semantic_protocol.md`

## Backend Modules

### Plugin Entry Point

```python
# [DEF:backend/src/plugins/llm_analysis/plugin.py:Module]
# @TIER: STANDARD
# @SEMANTICS: plugin, task_handler, validation, documentation
# @PURPOSE: Orchestrate LLM-based analysis tasks (Dashboard Validation, Documentation) within the Plugin architecture.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend/src/plugins/llm_analysis/service.py
# @RELATION: IMPLEMENTS -> backend/src/core/plugin_base.py
# @INVARIANT: All tasks must report status updates via WebSocket.

# [DEF:validate_dashboard:Function]
# @PURPOSE: Analyze dashboard health using screenshot and logs.
# @PRE: Dashboard ID exists and provider is configured.
# @POST: ValidationResult is persisted and notification sent if failed.
# [/DEF:validate_dashboard]

# [DEF:generate_documentation:Function]
# @PURPOSE: Generate and apply metadata descriptions for datasets.
# @PRE: Dataset is accessible and schema is valid.
# @POST: Dataset metadata is updated in the database.
# [/DEF:generate_documentation]

# [/DEF:backend/src/plugins/llm_analysis/plugin.py]
```

### LLM Service

```python
# [DEF:backend/src/plugins/llm_analysis/service.py:Module]
# @TIER: CRITICAL
# @SEMANTICS: llm, api, client, retry, prompt
# @PURPOSE: Execute reliable interactions with external LLM providers (OpenAI, etc.) with retry logic and prompt management.
# @LAYER: Infrastructure
# @RELATION: CALLS -> external_api
# @RELATION: DEPENDS_ON -> backend/src/services/llm_provider.py
# @INVARIANT: API keys must never be logged in plain text.

# [DEF:analyze_multimodal:Function]
# @PURPOSE: Send image and text context to LLM for analysis.
# @PRE: Provider is active and quota is available.
# @POST: Returns structured analysis or raises RetryError.
# @SIDE_EFFECT: Makes external HTTP request.
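# --- Illustrative sketch (not part of the contract) ---
# A minimal, stdlib-only retry wrapper showing one way the @POST condition
# ("returns structured analysis or raises RetryError") could be enforced.
# `RetryError`, `with_retries`, and `max_attempts` are assumptions for this
# sketch, not names mandated by the contract.
import time

class RetryError(Exception):
    """Raised once every attempt against the provider has failed."""

def with_retries(call, max_attempts=3, backoff_s=0.0):
    """Invoke `call`; retry with exponential backoff, then raise RetryError."""
    last_exc = None
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception as exc:  # in practice, narrow to provider/HTTP errors
            last_exc = exc
            time.sleep(backoff_s * (2 ** attempt))
    raise RetryError(f"LLM call failed after {max_attempts} attempts") from last_exc
# --- End sketch ---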
# [/DEF:analyze_multimodal]

# [/DEF:backend/src/plugins/llm_analysis/service.py]
```

### Provider Management Service

```python
# [DEF:backend/src/services/llm_provider.py:Module]
# @TIER: CRITICAL
# @SEMANTICS: security, config, encryption, provider
# @PURPOSE: Manage lifecycle and secure storage of LLM Provider configurations.
# @LAYER: Domain
# @RELATION: DEPENDS_ON -> backend/src/core/database.py
# @INVARIANT: Stored API keys must be encrypted at rest.

# [DEF:get_provider_client:Function]
# @PURPOSE: Instantiate an authenticated LLM client for a specific provider.
# @PRE: Provider ID exists and is active.
# @POST: Returns configured client instance with decrypted key.
# [/DEF:get_provider_client]

# [/DEF:backend/src/services/llm_provider.py]
```

### Data Models

```python
# [DEF:backend/src/plugins/llm_analysis/models.py:Module]
# @TIER: TRIVIAL
# @SEMANTICS: dto, pydantic, schema
# @PURPOSE: Define Pydantic models for configuration, requests, and results.
# @LAYER: Domain/Data
# [/DEF:backend/src/plugins/llm_analysis/models.py]
```

## Frontend Components

### Configuration UI

```html
```

### Validation Report

```html
```
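The `models.py` contract names Pydantic models for configuration, requests, and results without spelling out their shapes. As a minimal, self-contained sketch of what those shapes could look like (stdlib `dataclasses` stand in for Pydantic so the snippet runs without dependencies; every field name below is an assumption, not part of the contract):

```python
from dataclasses import dataclass
from enum import Enum

class ValidationStatus(str, Enum):
    PASSED = "passed"
    FAILED = "failed"

@dataclass(frozen=True)
class ProviderConfig:
    # Mirrors llm_provider.py's invariant: only the encrypted key is stored.
    name: str
    model: str
    api_key_encrypted: bytes

@dataclass
class ValidationResult:
    # Persisted by validate_dashboard per its @POST condition.
    dashboard_id: str
    status: ValidationStatus
    summary: str = ""
```

A real implementation would likely swap the dataclasses for Pydantic `BaseModel` subclasses to get request validation and JSON serialization for free, as the `@SEMANTICS: dto, pydantic, schema` annotation suggests.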