Phase 0 Research: Service ID Synchronization and Cross-Filter Recovery
1. Technical Context Clarifications
- Language/Version: Python 3.11+
- Primary Dependencies: FastAPI, SQLite (local mapping DB), APScheduler (background jobs)
- Storage: SQLite (`mappings.db`) or a similar existing local store / ORM for storing `resource_mappings`
- Testing: pytest (backend)
- Target Platform: Dockerized Linux service
2. Research Tasks & Unknowns
Q: Where to store the environment mappings?
- Decision: A new SQLite table `resource_mappings` inside the existing backend database (or standard SQLAlchemy models if used).
- Rationale: Minimal infrastructure overhead. We already use an internal database (implied by `mappings.db` or SQLAlchemy models).
- Alternatives considered: Redis (too volatile; we don't want to lose mappings on restart), separate PostgreSQL (overkill).
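A minimal sketch of what the table could look like; the column names and constraints below are assumptions for illustration, not the final schema:

```python
import sqlite3

# Hypothetical schema sketch for resource_mappings; column names are
# assumptions. The UNIQUE constraint makes sync runs idempotent per
# (environment, resource_type, uuid).
DDL = """
CREATE TABLE IF NOT EXISTS resource_mappings (
    id            INTEGER PRIMARY KEY AUTOINCREMENT,
    environment   TEXT NOT NULL,   -- e.g. 'staging', 'prod'
    resource_type TEXT NOT NULL,   -- e.g. 'chart', 'dashboard'
    uuid          TEXT NOT NULL,   -- stable cross-environment identifier
    native_id     INTEGER NOT NULL, -- environment-local numeric ID
    UNIQUE (environment, resource_type, uuid)
);
"""

def init_db(path: str = "mappings.db") -> sqlite3.Connection:
    """Open the mapping database and ensure the table exists."""
    conn = sqlite3.connect(path)
    conn.executescript(DDL)
    return conn
```

With SQLAlchemy in the project, the same shape would become a declarative model instead of raw DDL.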
Q: How to schedule the synchronization background job?
- Decision: Use `APScheduler` or FastAPI's `BackgroundTasks` triggered by a simple internal mechanism. If `APScheduler` is already in the project, we'll use that.
- Rationale: Standard Python ecosystem approach for periodic in-process background tasks.
- Alternatives considered: Celery (too heavy), cron container (requires external setup).
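With APScheduler this is a one-liner (`BackgroundScheduler().add_job(sync, "interval", minutes=15)`). If it turns out not to be in the project, a stdlib-only in-process runner is a workable fallback; the sketch below is that fallback, not APScheduler itself:

```python
import threading

class PeriodicJob:
    """Stdlib fallback for an in-process periodic sync job.

    If APScheduler is available, BackgroundScheduler.add_job(fn, "interval",
    minutes=...) replaces this class entirely.
    """

    def __init__(self, interval_s: float, fn):
        self.interval_s = interval_s
        self.fn = fn
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self) -> None:
        # Event.wait doubles as an interruptible sleep: it returns True
        # (and ends the loop) as soon as stop() sets the event.
        while not self._stop.wait(self.interval_s):
            self.fn()

    def start(self) -> None:
        self._thread.start()

    def stop(self) -> None:
        self._stop.set()
        self._thread.join()
```

Running the sync in the API process keeps deployment to a single container, which matches the "Dockerized Linux service" target above.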
Q: How to intercept the migration payload (ZIP) and patch it?
- Decision: In `MigrationEngine.migrate()`, before uploading to the Target Environment via `api_client`, intercept the downloaded ZIP bytes. Extract `dashboards/*.yaml`, parse the YAML, find and replace `id` references in `json_metadata`, write back to YAML, repack the ZIP, and dispatch.
- Rationale: Dashboards export as ZIP files containing YAML manifests. `json_metadata` in these YAMLs contains the `chart_configuration` we need to patch.
- Alternatives considered: Patching via the Superset API after import (harder; requires precise API calls and dealing with Superset's strict dashboard validation).
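The repack step can be done entirely in memory. The sketch below handles only the ZIP plumbing; the actual `json_metadata` ID rewrite (YAML parse, JSON patch) would live in the `patch_fn` callback and is omitted here:

```python
import io
import zipfile

def patch_zip(zip_bytes: bytes, should_patch, patch_fn) -> bytes:
    """Rewrite selected members of an export ZIP in memory.

    should_patch(name) -> bool selects members (e.g. dashboards/*.yaml);
    patch_fn(name, data) -> bytes returns the patched member content.
    All other members are copied through unchanged.
    """
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as src, \
         zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            data = src.read(info.filename)
            if should_patch(info.filename):
                data = patch_fn(info.filename, data)
            dst.writestr(info, data)
    return out.getvalue()
```

Keeping the transform pure (bytes in, bytes out) makes it easy to unit-test the patching step without any network or filesystem access.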
Q: How to handle the "Double Import" strategy?
- Decision: If the Target Check shows missing UUIDs, execute a temporary import, trigger `IdMappingService.sync_environment(target_env)` directly to fetch the newly created IDs, then proceed with the patching and a final import with `overwrite=true`.
- Rationale: Superset generates IDs upon creation. We cannot guess them; we must force it to create them first.
- Alternatives considered: Creating charts via API manually one-by-one (loss of complex export logic, reinventing the importer).
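The double-import flow above can be sketched as a small orchestration function. The `api_client` and `IdMappingService` interfaces here are assumptions based on the names mentioned in this document, not the real signatures:

```python
def migrate_with_double_import(
    bundle: bytes,
    target_env: str,
    api_client,            # assumed: .import_dashboards(env, bundle, overwrite=False)
    id_mapping_service,    # assumed: .sync_environment(env)
    patch_bundle,          # patch_bundle(bundle, mapping_service, env) -> bytes
    missing_uuids,         # missing_uuids(env, bundle) -> bool (the Target Check)
) -> None:
    """Sketch of the 'double import' strategy; interface names are assumptions."""
    if missing_uuids(target_env, bundle):
        # First import: let Superset create the objects and generate IDs.
        api_client.import_dashboards(target_env, bundle)
        # Pull the freshly generated IDs into the local mapping table.
        id_mapping_service.sync_environment(target_env)
    # Patch cross-filter ID references, then do the authoritative import.
    patched = patch_bundle(bundle, id_mapping_service, target_env)
    api_client.import_dashboards(target_env, patched, overwrite=True)
```

When no UUIDs are missing, the flow degrades to a single patched import, so the extra round-trip only happens when it is actually needed.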
Q: Integrating the Checkbox into UI
- Decision: Pass a `fix_cross_filters=True` boolean in the migration payload from the Frontend to the Backend.
- Rationale: Standard API contract extension.
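In the FastAPI backend this would be a Pydantic model field; a plain dataclass sketch of the extended payload shape (the existing fields are assumptions about the current contract):

```python
from dataclasses import dataclass, field

@dataclass
class MigrationRequest:
    # Existing fields are assumptions about the current payload, for illustration.
    source_env: str
    target_env: str
    dashboard_ids: list[int] = field(default_factory=list)
    # New flag: opt in to cross-filter ID patching. Defaults to False so
    # existing frontend callers keep their current behavior unchanged.
    fix_cross_filters: bool = False
```

Defaulting the flag to `False` keeps the change backward-compatible: older clients that omit it get the current migration behavior.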
3. Existing Code Integration
We need to hook into `backend/src/core/migration_engine.py`. This file likely handles the export -> import flow. We will add a pre-processing step right before the final `api_client.import_dashboards()` call on the target environment.