SIDEL ScriptsManager - Multi-Language Script Manager Application Specification
Overview
A Python Flask web application for Linux environments that automatically discovers and executes scripts from a backend directory structure. The SIDEL ScriptsManager provides user management, multi-language support, dark/light mode themes, SIDEL corporate branding, and advanced script metadata management with automatic header parsing and web interface lifecycle management.
Basic Concepts
Script Architecture
Each script in SIDEL ScriptsManager follows a standardized architecture pattern:
- Individual Flask Servers: Every script runs its own Flask server instance
- Port Management: SIDEL ScriptsManager assigns unique ports to each script instance
- Parameter Injection: Scripts receive essential parameters from SIDEL ScriptsManager:
- Data Directory: Path to persistent data storage for the script
- User Level: Current user's permission level (admin, developer, operator, viewer)
- Flask Port: Assigned port number for the script's web interface
- Project ID: Current project identifier for data isolation
- Project Name: Human-readable project name for display purposes
- Theme: Current user's selected theme (light, dark)
- Language: Current user's selected language (en, es, it, fr)
- SIDEL Branding Integration: Scripts receive SIDEL logo path and must display corporate branding
- Consistent User Experience: All script interfaces maintain SIDEL visual identity, theme, and language
Data Persistence Architecture
data/
├── script_groups/
│ ├── group1/
│ │ ├── user1/
│ │ │ ├── project_default/
│ │ │ │ ├── config.json
│ │ │ │ ├── settings.json
│ │ │ │ └── custom_data.json
│ │ │ └── project_alpha/
│ │ │ ├── config.json
│ │ │ └── results.json
│ │ └── user2/
│ │ └── project_default/
│ │ └── config.json
│ └── group2/
│ └── user1/
│ └── project_beta/
│ └── workflow.json
├── logs/
│ ├── executions/
│ │ ├── user1/
│ │ │ ├── 2025-09-12/
│ │ │ │ ├── execution_abc123.log
│ │ │ │ ├── execution_def456.log
│ │ │ │ └── execution_ghi789.log
│ │ │ └── 2025-09-11/
│ │ │ └── execution_jkl012.log
│ │ └── user2/
│ │ └── 2025-09-12/
│ │ └── execution_mno345.log
│ ├── system/
│ │ ├── application.log
│ │ ├── error.log
│ │ └── access.log
│ └── audit/
│ ├── user_actions.log
│ └── admin_actions.log
Multi-User & Multi-Project Management
- User Isolation: Each user maintains separate configuration and data files
- Project Segregation: Users can work on multiple projects with independent settings
- Default Project: Every user automatically gets a project_default for immediate use
- Port Allocation: Dynamic port assignment prevents conflicts between users and scripts
Script Lifecycle
- Initialization: ScriptsManager discovers and registers the script
- Execution Request: User requests script execution through ScriptsManager interface
- Parameter Injection: ScriptsManager passes data directory, user level, and port
- Flask Startup: Script starts its own Flask server on assigned port
- Interface Opening: ScriptsManager opens browser tab to script's interface
- Session Management: Script maintains user session and project context
- Log Generation: Comprehensive logging throughout script execution with user context
- Graceful Shutdown: Script terminates when tab is closed or session expires
User-Centric Logging Architecture
- User Isolation: Each user can only access their own execution logs
- Project Context: Logs are associated with specific user projects for better organization
- Real-time Streaming: Live log updates during script execution via WebSocket
- Persistent Storage: All execution logs are permanently stored for future reference
- Comprehensive Capture: Logs include:
- Standard Output: All script output and results
- Error Output: Error messages and stack traces
- Debug Information: Internal system messages and performance metrics
- Execution Context: Parameters, environment, duration, and exit codes
- Session Metadata: User, project, script, and interface information
- Post-Execution Access: Users can review logs after script completion
- Retention Management: Configurable log retention based on user level and storage policies
- Export Capabilities: Download logs in multiple formats for external analysis
Core Features
1. Automatic Script Discovery
- Directory Structure: Scans app/backend/script_groups/ for script collections
- ⚠️ Important: Scripts must be located ONLY in app/backend/script_groups/ - there should be NO separate backend/script_groups/ directory
- Metadata Support: Reads JSON configuration files for script descriptions and parameters
- Header Parsing: Automatically extracts script metadata from file headers on first discovery
- Dynamic Loading: Automatically detects new scripts without application restart
- File Types: Supports Python scripts (.py)
- Conda Environment Management: Each script group can use a different conda environment
- Environment Auto-Detection: Automatically detects available conda environments on Windows/Linux
- Script Metadata Editing: Developers and administrators can edit script descriptions and execution levels
- Web Interface Management: Automatically manages Flask server lifecycle for each script
2. User Management & Access Control
- User Levels:
- admin: Full access to all scripts, user management, and script metadata editing
- developer: Access to development and testing scripts, script metadata editing capabilities
- operator: Access to production and operational scripts
- viewer: Read-only access to logs and documentation
- Authentication: Simple login system with session management
- Permission System: Script-level permissions based on user roles
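The four levels form a strict hierarchy (viewer < operator < developer < admin). A minimal sketch of a script-level permission check under that assumed ordering; the LEVEL_RANK mapping and has_permission helper are illustrative names, not part of the specification:
# Permission-hierarchy sketch (assumed ordering: viewer < operator < developer < admin)
LEVEL_RANK = {'viewer': 0, 'operator': 1, 'developer': 2, 'admin': 3}

def has_permission(user_level, required_level):
    """Return True if the user's level meets or exceeds the script's required level."""
    return LEVEL_RANK.get(user_level, -1) >= LEVEL_RANK.get(required_level, len(LEVEL_RANK))

# Example: an operator may run operator-level scripts but not admin-only ones
assert has_permission('operator', 'operator')
assert not has_permission('operator', 'admin')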
3. Multi-Language Support
- Primary Language: English (default)
- Supported Languages:
- Spanish (es)
- Italian (it)
- French (fr)
- Translation Files: JSON-based language files in translations/ directory
- Dynamic Switching: Language can be changed without logout
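A minimal sketch of the lookup-with-fallback behaviour, assuming flat key/value JSON files such as translations/en.json and translations/es.json; the helper names are illustrative:
# Translation lookup sketch with English fallback (flat JSON files assumed)
import json
import os

def load_translations(language, base_dir='translations'):
    """Load one language file, e.g. translations/es.json; return an empty dict if it is missing."""
    path = os.path.join(base_dir, f'{language}.json')
    if os.path.exists(path):
        with open(path, encoding='utf-8') as f:
            return json.load(f)
    return {}

def translate(key, language='en', base_dir='translations'):
    """Return the translated string, falling back to English and finally to the key itself."""
    text = load_translations(language, base_dir).get(key)
    if text is None:
        text = load_translations('en', base_dir).get(key, key)
    return text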
4. User Interface
- Theme Support: Light/Dark mode toggle with user preference persistence
- Responsive Design: Bootstrap-based responsive layout
- Real-time Updates: WebSocket integration for live log streaming
- Modern UI: Clean, intuitive interface with icons and visual feedback
- Script Metadata Editor: Inline editing capabilities for script descriptions and execution levels (developer+)
- Web Interface Lifecycle: Automatic management of script Flask servers with tab monitoring
Technical Architecture
⚠️ Directory Structure Clarification
IMPORTANT: All scripts must be located in app/backend/script_groups/ only. There should be NO separate backend/script_groups/ directory at the root level.
Correct Structure:
- ✅ app/backend/script_groups/ - Contains all script groups and scripts
- ❌ backend/script_groups/ - Should NOT exist
Rationale: This ensures a clean separation of concerns and prevents confusion between the application backend and the scripts directory.
Backend Structure
app/
├── app.py # Main Flask application
├── config/
│ ├── config.py # Application configuration
│ ├── database.py # Database models and setup
│ └── permissions.py # Permission management
├── backend/
│ └── script_groups/ # Auto-discovered script directories
│ ├── group1/
│ │ ├── metadata.json # Group description and settings
│ │ ├── script1.py
│ │ ├── script2.py
│ │ ├── docs/ # Markdown documentation files
│ │ │ ├── script1.md
│ │ │ ├── script1_es.md
│ │ │ ├── script1_it.md
│ │ │ └── script1_fr.md
│ │ └── scripts/
│ │ └── script_metadata.json
│ └── group2/
│ ├── metadata.json
│ └── scripts/
├── translations/
│ ├── en.json # English (default)
│ ├── es.json # Spanish
│ ├── it.json # Italian
│ └── fr.json # French
├── static/
│ ├── css/
│ │ ├── main.css
│ │ ├── themes.css # Light/dark themes
│ │ ├── responsive.css
│ │ └── markdown-viewer.css # Markdown styling
│ ├── js/
│ │ ├── main.js
│ │ ├── websocket.js # Real-time log updates
│ │ ├── theme-manager.js # Theme switching
│ │ ├── language-manager.js # Language switching
│ │ ├── markdown-viewer.js # Markdown rendering
│ │ └── markdown-editor.js # Markdown editing
│ ├── images/
│ │ └── SIDEL.png # SIDEL corporate logo
│ └── icons/
├── templates/
│ ├── base.html # Base template with theme/language support
│ ├── login.html # Authentication
│ ├── dashboard.html # Main script discovery interface
│ ├── script_group.html # Individual group view
│ ├── logs.html # User log viewer with filtering and search
│ ├── log_detail.html # Detailed log view for specific execution
│ └── admin/
│ ├── users.html # User management
│ ├── permissions.html # Permission management
│ └── system_logs.html # System-wide log management (admin only)
├── models/
│ ├── user.py # User model
│ ├── script.py # Script metadata model
│ └── execution_log.py # Enhanced execution log model with user context
├── services/
│ ├── script_discovery.py # Auto-discovery service with header parsing
│ ├── script_executor.py # Script execution service with comprehensive logging
│ ├── user_service.py # User management
│ ├── translation_service.py # Multi-language support
│ ├── conda_service.py # Conda environment management
│ ├── metadata_service.py # Script metadata management and editing
│ ├── web_interface_manager.py # Flask server lifecycle management
│ ├── data_manager.py # Data persistence and project management
│ ├── port_manager.py # Port allocation and management
│ ├── markdown_service.py # Markdown processing and editing
│ ├── log_service.py # User-centric log management service
│ ├── websocket_service.py # Real-time log streaming service
│ ├── tags_service.py # User tags management
│ └── backup_service.py # System backup management
└── utils/
├── permissions.py # Permission decorators
├── validators.py # Input validation
└── helpers.py # Utility functions
Database Schema
-- Users table
CREATE TABLE users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username VARCHAR(50) UNIQUE NOT NULL,
email VARCHAR(100) UNIQUE NOT NULL,
password_hash VARCHAR(255) NOT NULL,
user_level VARCHAR(20) NOT NULL,
preferred_language VARCHAR(5) DEFAULT 'en',
preferred_theme VARCHAR(10) DEFAULT 'light',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_login TIMESTAMP,
is_active BOOLEAN DEFAULT TRUE
);
-- Script groups table
CREATE TABLE script_groups (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name VARCHAR(100) NOT NULL,
directory_path VARCHAR(255) NOT NULL,
description TEXT,
required_level VARCHAR(20) NOT NULL,
conda_environment VARCHAR(100),
is_active BOOLEAN DEFAULT TRUE,
discovered_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- Scripts table
CREATE TABLE scripts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
group_id INTEGER REFERENCES script_groups(id),
filename VARCHAR(100) NOT NULL,
display_name VARCHAR(100),
description TEXT,
description_long_path VARCHAR(255),
tags TEXT, -- Comma-separated tags
required_level VARCHAR(20) NOT NULL,
parameters JSON,
is_active BOOLEAN DEFAULT TRUE,
last_modified TIMESTAMP
);
-- User script tags table (for user-specific tagging)
CREATE TABLE user_script_tags (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER REFERENCES users(id),
script_id INTEGER REFERENCES scripts(id),
tags TEXT, -- Comma-separated user-specific tags
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
UNIQUE(user_id, script_id)
);
-- Conda environments table
CREATE TABLE conda_environments (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name VARCHAR(100) UNIQUE NOT NULL,
path VARCHAR(255) NOT NULL,
python_version VARCHAR(20),
is_available BOOLEAN DEFAULT TRUE,
detected_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_verified TIMESTAMP
);
-- User projects table
CREATE TABLE user_projects (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER REFERENCES users(id),
project_name VARCHAR(100) NOT NULL,
group_id INTEGER REFERENCES script_groups(id),
description TEXT,
is_default BOOLEAN DEFAULT FALSE,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_accessed TIMESTAMP,
UNIQUE(user_id, project_name, group_id)
);
-- Port allocations table
CREATE TABLE port_allocations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
port_number INTEGER UNIQUE NOT NULL,
script_id INTEGER REFERENCES scripts(id),
user_id INTEGER REFERENCES users(id),
project_id INTEGER REFERENCES user_projects(id),
process_id INTEGER,
status VARCHAR(20) NOT NULL, -- 'allocated', 'active', 'released'
allocated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
released_at TIMESTAMP
);
-- Script execution processes table
CREATE TABLE script_processes (
id INTEGER PRIMARY KEY AUTOINCREMENT,
script_id INTEGER REFERENCES scripts(id),
user_id INTEGER REFERENCES users(id),
process_id INTEGER NOT NULL,
flask_port INTEGER,
tab_session_id VARCHAR(100),
status VARCHAR(20) NOT NULL,
started_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_ping TIMESTAMP,
ended_at TIMESTAMP
);
-- Execution logs table (User-centric logging)
CREATE TABLE execution_logs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
script_id INTEGER REFERENCES scripts(id),
user_id INTEGER REFERENCES users(id),
project_id INTEGER REFERENCES user_projects(id),
session_id VARCHAR(100), -- Links to script interface session
execution_uuid VARCHAR(36) UNIQUE NOT NULL, -- Unique execution identifier
status VARCHAR(20) NOT NULL, -- 'running', 'completed', 'failed', 'terminated'
start_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
end_time TIMESTAMP,
duration_seconds INTEGER,
output TEXT, -- Standard output from script execution
error_output TEXT, -- Error output from script execution
debug_output TEXT, -- Debug information and internal logs
exit_code INTEGER,
parameters JSON, -- Execution parameters for reference
conda_environment VARCHAR(100), -- Environment used for execution
flask_port INTEGER, -- Port used for script interface
log_level VARCHAR(10) DEFAULT 'INFO', -- DEBUG, INFO, WARNING, ERROR
tags TEXT, -- Comma-separated tags for log categorization
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
JSON Configuration Files
Script Header Format for Auto-Discovery
"""
ScriptsManager Metadata:
@description: Creates a complete system backup with compression options
@description_long: docs/backup_system.md
@description_es: Crea una copia de seguridad completa del sistema con opciones de compresión
@description_long_es: docs/backup_system_es.md
@description_it: Crea un backup completo del sistema con opzioni di compressione
@description_long_it: docs/backup_system_it.md
@description_fr: Crée une sauvegarde complète du système avec options de compression
@description_long_fr: docs/backup_system_fr.md
@required_level: admin
@category: backup
@tags: system,maintenance,storage
@parameters: [
{
"name": "destination",
"type": "path",
"required": true,
"description": "Backup destination directory"
},
{
"name": "compression",
"type": "select",
"options": ["none", "gzip", "bzip2"],
"default": "gzip"
}
]
@execution_timeout: 3600
@flask_port: 5200
"""
# Your script code here...
Group Metadata (metadata.json)
{
"name": "System Administration",
"description": {
"en": "System administration and maintenance scripts",
"es": "Scripts de administración y mantenimiento del sistema",
"it": "Script di amministrazione e manutenzione del sistema",
"fr": "Scripts d'administration et de maintenance du système"
},
"icon": "server",
"required_level": "operator",
"category": "system",
"conda_environment": "base",
"auto_discovery": true,
"execution_timeout": 300
}
Script Metadata (script_metadata.json)
{
"backup_system.py": {
"display_name": {
"en": "System Backup",
"es": "Copia de Seguridad del Sistema",
"it": "Backup del Sistema",
"fr": "Sauvegarde du Système"
},
"description": {
"en": "Creates a complete system backup",
"es": "Crea una copia de seguridad completa del sistema",
"it": "Crea un backup completo del sistema",
"fr": "Crée une sauvegarde complète du système"
},
"description_long": {
"en": "docs/backup_system.md",
"es": "docs/backup_system_es.md",
"it": "docs/backup_system_it.md",
"fr": "docs/backup_system_fr.md"
},
"tags": ["system", "maintenance", "storage"],
"required_level": "admin",
"category": "backup",
"parameters": [
{
"name": "destination",
"type": "path",
"required": true,
"description": {
"en": "Backup destination directory",
"es": "Directorio de destino de la copia",
"it": "Directory di destinazione del backup",
"fr": "Répertoire de destination de la sauvegarde"
}
},
{
"name": "compression",
"type": "select",
"options": ["none", "gzip", "bzip2"],
"default": "gzip",
"description": {
"en": "Compression method",
"es": "Método de compresión",
"it": "Metodo di compressione",
"fr": "Méthode de compression"
}
}
],
"execution_timeout": 3600,
"flask_port": 5200,
"auto_open_interface": true,
"requires_parameters": true
}
}
API Endpoints
Authentication
- POST /api/auth/login - User login
- POST /api/auth/logout - User logout
- GET /api/auth/user - Get current user info
Script Discovery & Management
- GET /api/script-groups - List all discovered script groups
- GET /api/script-groups/{group_id}/scripts - List scripts in group
- POST /api/scripts/{script_id}/execute - Execute script
- GET /api/scripts/{script_id}/status - Get script execution status
- POST /api/scripts/{script_id}/stop - Stop running script
- GET /api/scripts/{script_id}/metadata - Get script metadata
- PUT /api/scripts/{script_id}/metadata - Update script metadata (developer+)
- POST /api/scripts/{script_id}/refresh-metadata - Re-parse script header
- GET /api/script-groups/{group_id}/refresh - Refresh all scripts in group
Conda Environment Management
- GET /api/conda/environments - List all available conda environments
- POST /api/conda/refresh - Refresh conda environment detection
- GET /api/conda/environments/{env_name}/info - Get environment details
- POST /api/script-groups/{group_id}/conda - Set conda environment for group
User Management (Admin only)
- GET /api/admin/users - List all users
- POST /api/admin/users - Create new user
- PUT /api/admin/users/{user_id} - Update user
- DELETE /api/admin/users/{user_id} - Delete user
- GET /api/admin/users/{user_id}/projects - Get user's projects
- POST /api/admin/users/{user_id}/reset-data - Reset user's data directories
Project Management
- GET /api/projects - List current user's projects
- POST /api/projects - Create new project for current user
- PUT /api/projects/{project_id} - Update project details
- DELETE /api/projects/{project_id} - Delete project and associated data
- POST /api/projects/{project_id}/set-active - Set active project for session
- GET /api/projects/active - Get current active project
Data Management
- GET /api/data/{group_id}/{project_id}/files - List data files for project
- GET /api/data/{group_id}/{project_id}/{filename} - Get specific data file
- POST /api/data/{group_id}/{project_id}/{filename} - Create/Update data file
- DELETE /api/data/{group_id}/{project_id}/{filename} - Delete data file
- POST /api/data/{group_id}/{project_id}/backup - Create project data backup
Script Documentation & Tags
- GET /api/scripts/{script_id}/description-long - Get script long description (Markdown)
- PUT /api/scripts/{script_id}/description-long - Update script long description (developer+)
- GET /api/scripts/{script_id}/tags - Get script tags
- PUT /api/scripts/{script_id}/tags - Update user-specific script tags
- GET /api/scripts/search-tags - Search scripts by tags
- GET /api/user/tags - Get all user's tags across scripts
System Backup
- POST /api/system/backup - Create immediate system backup
- GET /api/system/backups - List available system backups
- DELETE /api/system/backups/{backup_date} - Delete specific backup
- GET /api/system/backup-status - Get backup service status
Logs & Monitoring (User-Centric)
- GET /api/logs/execution - Get current user's execution logs (paginated)
- GET /api/logs/execution/{execution_id} - Get specific execution log details
- GET /api/logs/execution/{execution_id}/download - Download execution log file
- GET /api/logs/script/{script_id} - Get user's logs for specific script
- GET /api/logs/project/{project_id} - Get user's logs for specific project
- GET /api/logs/search - Search user's logs with filters (script, date, status, text)
- DELETE /api/logs/execution/{execution_id} - Delete specific execution log (user owns)
- POST /api/logs/cleanup - Clean up old logs based on retention policy
- GET /api/logs/stats - Get user's logging statistics and usage
- WebSocket /ws/logs/{execution_id} - Real-time log streaming for specific execution
- WebSocket /ws/logs/user - Real-time log streaming for all user executions
Admin Logs & Monitoring (Admin only)
- GET /api/admin/logs/execution - Get all users' execution logs
- GET /api/admin/logs/system - Get system-wide logs
- GET /api/admin/logs/user/{user_id} - Get specific user's logs
- DELETE /api/admin/logs/user/{user_id} - Delete all logs for specific user
- GET /api/admin/logs/audit - Get system audit logs
- POST /api/admin/logs/export - Export system logs in various formats
Web Interface Management
- POST /api/scripts/{script_id}/open-interface - Open script web interface
- GET /api/scripts/{script_id}/interface-status - Check interface status
- POST /api/scripts/{script_id}/ping-interface - Ping from tab to keep alive
- POST /api/scripts/{script_id}/close-interface - Close script interface
- GET /api/active-interfaces - List all active script interfaces
Internationalization
- GET /api/i18n/languages - Get available languages
- GET /api/i18n/{language} - Get translations for language
- POST /api/user/preferences - Update user preferences
Features Implementation
1. Script Auto-Discovery Service with Header Parsing
# Pseudo-code for auto-discovery with header parsing
class ScriptDiscoveryService:
def scan_script_groups(self):
# Scan app/backend/script_groups/ directory
# Read metadata.json files
# Detect script files
# Parse script headers for metadata on first discovery
# Update database with discovered scripts
# Handle permission assignments
# Validate conda environment assignments
def parse_script_header(self, script_path):
# Extract metadata from script comments/docstrings
# Parse description, required_level, parameters
# Return structured metadata
def watch_for_changes(self):
# File system watcher for real-time discovery
# Hot-reload capability
# Re-parse headers when files change
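A minimal header-parsing sketch for the '@key: value' convention shown under Script Header Format below: it reads the module docstring with Python's ast module and collects single-line tags, leaving multi-line values such as @parameters to a fuller implementation. The regular expression and function name are illustrative:
# Header-parsing sketch: extract "@key: value" pairs from a script's module docstring
import ast
import re

HEADER_TAG = re.compile(r'^@([A-Za-z_]+):\s*(.*)$')

def parse_script_header(script_path):
    """Return a dict of single-line metadata tags found in the script's docstring."""
    with open(script_path, encoding='utf-8') as f:
        docstring = ast.get_docstring(ast.parse(f.read())) or ''
    metadata = {}
    for line in docstring.splitlines():
        match = HEADER_TAG.match(line.strip())
        if match:
            key, value = match.groups()
            metadata[key] = value.strip()
    return metadata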
2. Conda Environment Service
# Pseudo-code for conda management
class CondaService:
def detect_environments(self):
# Scan for conda installations (Windows/Linux)
# Parse conda environment list
# Validate environment availability
# Update database with environment info
def get_environment_info(self, env_name):
# Get Python version and packages
# Verify environment is functional
# Return environment details
def execute_in_environment(self, env_name, command, args):
# Activate conda environment
# Execute command with proper environment
# Handle cross-platform differences
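A minimal detection sketch that relies on conda's own JSON output ('conda env list --json'), which behaves the same on Windows and Linux when conda is on the PATH; mapping a prefix path to an environment name is simplified here, and error handling and database updates are omitted:
# Conda environment detection sketch using "conda env list --json"
import json
import os
import subprocess

def detect_conda_environments():
    """Return a list of {'name', 'path'} entries for the environments conda reports."""
    result = subprocess.run(['conda', 'env', 'list', '--json'],
                            capture_output=True, text=True, check=True)
    prefixes = json.loads(result.stdout).get('envs', [])
    # Note: deriving the name from the prefix path is a simplification (the root prefix is the 'base' env)
    return [{'name': os.path.basename(prefix), 'path': prefix} for prefix in prefixes]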
3. Script Execution Service with Multi-User Support
# Pseudo-code for script execution with multi-user management
class ScriptExecutor:
def execute_script(self, script_id, parameters, user_id, project_id=None):
# Validate user permissions
# Get or create user project (default if none specified)
# Get user preferences (theme, language)
# Allocate unique port for user/script combination
# Get script group conda environment
# Prepare execution environment with conda
# Create user-specific data directory
# Handle parameter injection including:
# - Data directory path
# - User level
# - Flask port
# - Project ID
# - Project name
# - Theme preference
# - Language preference
# Start Flask server for script interface
# Register process with port and user info
# Open new browser tab with user session
# Execute with timeout in specified environment
# Stream output via WebSocket
# Log execution details with user context
# Monitor tab session for closure
def start_script_with_params(self, script_path, data_dir, user_level, port, project_id, project_name, theme, language):
# Execute script with standardized parameters:
# python script.py --data-dir {data_dir} --user-level {user_level} --port {port} --project-id {project_id} --project-name {project_name} --theme {theme} --language {language}
def manage_multi_user_interface(self, script_id, user_id, project_id, port):
# Start Flask server process with user context including theme and language
# Register in active interfaces with user/project info
# Monitor user-specific tab session
# Handle graceful shutdown maintaining user data
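A minimal launch sketch that assembles the standardized command line shown above and starts the script with subprocess; the python_exe argument (the interpreter inside the assigned conda environment) stands in for full conda activation, and the function name is illustrative:
# Launch sketch: run a script with the standardized ScriptsManager parameters
import subprocess

def start_script_process(python_exe, script_path, data_dir, user_level, port,
                         project_id, project_name, theme, language):
    """Start the script's Flask process and return the Popen handle for monitoring."""
    cmd = [
        python_exe, script_path,
        '--data-dir', data_dir,
        '--user-level', user_level,
        '--port', str(port),
        '--project-id', str(project_id),
        '--project-name', project_name,
        '--theme', theme,
        '--language', language,
    ]
    # stdout/stderr are piped so the log service can stream them over WebSocket
    return subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)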
4. Multi-Language Support
# Pseudo-code for translations
class TranslationService:
def get_translation(self, key, language='en'):
# Load translation file for language
# Return translated string
# Fall back to English if not found
def get_user_language(self, user):
# Return user's preferred language
5. Script Metadata Management Service
# Pseudo-code for metadata management
class MetadataService:
def update_script_metadata(self, script_id, metadata, user):
# Validate user permissions (developer+)
# Update script description, required_level, etc.
# Maintain version history
# Validate required_level values
def parse_and_update_from_header(self, script_id):
# Re-parse script file header
# Update metadata from parsed content
# Preserve user-edited fields
def get_editable_fields(self, user_level):
# Return list of fields user can edit
# Based on permission level
6. Web Interface Manager Service
# Pseudo-code for web interface lifecycle
class WebInterfaceManager:
def start_script_interface(self, script_id, user_id):
# Allocate port from available pool
# Start Flask process for script
# Register in active interfaces table
# Generate session ID for tab tracking
# Return interface URL and session ID
def monitor_tab_session(self, session_id):
# Monitor tab heartbeat via JavaScript
# Detect tab closure
# Trigger graceful script shutdown
def cleanup_inactive_interfaces(self):
# Periodic cleanup of orphaned processes
# Check for unresponsive tabs
# Terminate associated script processes
def get_available_port(self):
# Find available port in configured range
# Avoid conflicts with existing interfaces
# Return available port number
8. Data Management Service
# Pseudo-code for data persistence management
class DataManager:
def __init__(self, base_data_path):
self.base_path = base_data_path
def get_user_project_path(self, user_id, group_id, project_name):
# Returns: data/script_groups/group_{group_id}/user_{user_id}/{project_name}/
return os.path.join(
self.base_path,
"script_groups",
f"group_{group_id}",
f"user_{user_id}",
project_name
)
def ensure_project_directory(self, user_id, group_id, project_name):
# Create directory structure if it doesn't exist
# Initialize default configuration files
# Set proper permissions
def get_config_file(self, user_id, group_id, project_name, filename):
# Load JSON configuration file
# Return parsed data or default values
def save_config_file(self, user_id, group_id, project_name, filename, data):
# Save JSON data to configuration file
# Validate data structure
# Handle concurrent access
def list_user_projects(self, user_id, group_id):
# List all projects for user in specific group
# Return project metadata
def create_default_project(self, user_id, group_id):
# Create 'project_default' for new users
# Initialize with default settings
def backup_project_data(self, user_id, group_id, project_name):
# Create timestamped backup of project data
# Compress and store in backups directory
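Since several running interfaces may write to the same project directory, save_config_file above has to cope with concurrent access; one common approach is an atomic write (write to a temporary file, then os.replace) so readers never see a half-written file. A minimal sketch of that approach, offered as an assumption rather than a mandated mechanism:
# Atomic JSON write sketch: readers never observe partially written config files
import json
import os
import tempfile

def save_json_atomic(path, data):
    """Write JSON to a temp file in the target directory, then atomically replace the target."""
    directory = os.path.dirname(path) or '.'
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix='.tmp')
    try:
        with os.fdopen(fd, 'w', encoding='utf-8') as f:
            json.dump(data, f, indent=2)
        os.replace(tmp_path, path)  # atomic on both POSIX and Windows
    except Exception:
        os.unlink(tmp_path)
        raise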
9. Port Management Service
# Pseudo-code for port allocation management
class PortManager:
def __init__(self, port_range_start=5200, port_range_end=5400):
self.port_range = range(port_range_start, port_range_end + 1)
self.allocated_ports = set()
def allocate_port(self, script_id, user_id, project_id):
# Find available port in range (5200+)
# Verify port is not in use by system before allocation
# Mark as allocated in database
# Return port number or None if none available
def release_port(self, port_number):
# Mark port as released
# Clean up database entry
# Add back to available pool
def get_active_ports(self):
# Return list of currently active ports
# With associated script/user information
def cleanup_orphaned_ports(self):
# Release ports from terminated processes
# Clean up stale allocations
def is_port_available(self, port_number):
# Check if specific port is available
# Validate against system and allocated ports
# Verify port is not currently bound to any process
def check_system_port_usage(self, port_number):
# Use system tools to verify port availability
# Check for any existing bindings on the port
# Return True if port is free for use
Port Management Architecture
Port Range Allocation
- Reserved Range: Ports 5200-5400 are exclusively reserved for script Flask interfaces
- System Protection: Ports below 5200 remain available for system services and the main application
- Dynamic Assignment: Ports are allocated dynamically based on availability
- Conflict Prevention: Each allocation is verified against system port usage
Port Availability Verification
The system implements comprehensive port checking before allocation:
1. Database Check: Verify the port is not already allocated to another script
2. System Check: Use system tools to confirm the port is not in use
3. Binding Test: Attempt a temporary binding to confirm availability
4. Retry Logic: If the port is unavailable, try the next available port in range
Port Lifecycle Management
- Allocation: Port assigned when a script interface starts
- Monitoring: Regular health checks to ensure the script is still running
- Cleanup: Automatic release when the script terminates or its tab closes
- Recovery: Orphaned port detection and automatic cleanup
# Enhanced port checking implementation
import socket
import subprocess  # available for check_system_port_usage() system-level queries
import psutil      # optional: process-level inspection of port owners

def is_port_available(port):
    """Comprehensive port availability check"""
    # Only ports inside the reserved script-interface range are eligible
    if port < 5200 or port > 5400:
        return False
    # Binding test: try a temporary bind to confirm nothing holds the port
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.bind(('127.0.0.1', port))
        return True  # Bind succeeded, port is free
    except OSError:
        return False  # Port is already in use

def find_available_port(start_port=5200, end_port=5400):
    """Find first available port in range"""
    for port in range(start_port, end_port + 1):
        if is_port_available(port):
            return port
    return None  # No available ports
11. Markdown Processing Service
# Pseudo-code for markdown management
class MarkdownService:
def __init__(self, script_groups_path):
self.base_path = script_groups_path
def get_script_long_description(self, script_id, language='en'):
# Get script metadata and long description path
# Load markdown file for specified language
# Return markdown content or fallback to English
def update_script_long_description(self, script_id, markdown_content, language='en'):
# Validate user permissions (developer+)
# Save markdown content to appropriate file
# Update script metadata with new path
def render_markdown_to_html(self, markdown_content):
# Convert markdown to HTML
# Apply syntax highlighting for code blocks
# Handle mathematical formulas if needed
# Return rendered HTML
def get_available_languages_for_script(self, script_id):
# Return list of available language versions
# Check for existing markdown files
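A minimal rendering sketch using the Python 'markdown' package with the codehilite, tables, and toc extensions listed in the documentation configuration; math rendering is typically delegated to MathJax in the browser and is omitted here. Assumes the 'markdown' package (and Pygments for codehilite) is installed:
# Markdown-to-HTML rendering sketch using the "markdown" package
import markdown

def render_markdown_to_html(markdown_content):
    """Convert Markdown to HTML with code highlighting, tables, and a table of contents."""
    return markdown.markdown(
        markdown_content,
        extensions=['codehilite', 'tables', 'toc'],
    )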
12. User Tags Service
# Pseudo-code for user tags management
class TagsService:
def get_user_script_tags(self, user_id, script_id):
# Get user-specific tags for a script
# Return list of tags
def update_user_script_tags(self, user_id, script_id, tags):
# Update user-specific tags for a script
# Validate and clean tag format
# Save to database
def search_scripts_by_tags(self, user_id, tags, match_all=False):
# Search scripts by user tags
# match_all: True for AND operation, False for OR
# Return list of matching script IDs
def get_all_user_tags(self, user_id):
# Get all unique tags used by user
# Return sorted list of tags with usage count
def get_script_system_tags(self, script_id):
# Get system-defined tags from script metadata
# Return list of system tags
def merge_script_tags(self, user_id, script_id):
# Combine system tags and user tags
# Return unified tag list with source indication
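A minimal validation sketch enforcing the limits in the tagging configuration (alphanumeric plus underscore and dash, maximum length 30, at most 20 tags per script); the helper name is illustrative:
# Tag normalization sketch enforcing the documented tagging limits
import re

TAG_PATTERN = re.compile(r'^[A-Za-z0-9_-]{1,30}$')
MAX_TAGS_PER_SCRIPT = 20

def normalize_tags(raw_tags):
    """Lower-case, deduplicate, and validate a comma-separated tag string or a list of tags."""
    if isinstance(raw_tags, str):
        raw_tags = raw_tags.split(',')
    cleaned = []
    for tag in raw_tags:
        tag = tag.strip().lower()
        if tag and TAG_PATTERN.match(tag) and tag not in cleaned:
            cleaned.append(tag)
    return cleaned[:MAX_TAGS_PER_SCRIPT]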
13. System Backup Service
# Pseudo-code for system backup management
class BackupService:
def __init__(self, data_path, backup_path):
self.data_path = data_path
self.backup_path = backup_path
def create_daily_backup(self):
# Create compressed backup of entire /data/ directory
# Use current date as backup name: /backup/2025-09-11/
# Compress using gzip or zip
# Clean up old backups based on retention policy
def schedule_daily_backup(self):
# Set up automatic daily backup at configured time
# Use threading.Timer or scheduling library
# Handle backup rotation and cleanup
def list_available_backups(self):
# Return list of available backup dates
# Include backup size and creation time
def delete_backup(self, backup_date):
# Delete specific backup by date
# Validate backup exists before deletion
def get_backup_status(self):
# Return current backup service status
# Include last backup time, next scheduled backup
# Disk usage information
def restore_from_backup(self, backup_date, target_path=None):
# Restore data from specific backup
# Optional: restore to different location
# Admin-only operation
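A minimal sketch of the daily backup step using tarfile with gzip compression and the date-based layout shown under System Backup Architecture; scheduling, exclusion patterns, and retention cleanup are left to the surrounding service, and the paths are the documented defaults:
# Daily backup sketch: compress the data directory into backup/YYYY-MM-DD/
import os
import tarfile
from datetime import datetime

def create_daily_backup(data_path='./data', backup_path='./backup'):
    """Create a gzip-compressed tar archive of the data directory under today's date."""
    now = datetime.now()
    target_dir = os.path.join(backup_path, now.strftime('%Y-%m-%d'))
    os.makedirs(target_dir, exist_ok=True)
    archive = os.path.join(target_dir, f"data_backup_{now.strftime('%Y%m%d_%H%M%S')}.tar.gz")
    with tarfile.open(archive, 'w:gz') as tar:
        tar.add(data_path, arcname='data')
    return archive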
14. Log Management Service
# Pseudo-code for user-centric log management
class LogService:
def __init__(self, base_log_path, database):
self.base_path = base_log_path
self.db = database
def create_execution_log(self, script_id, user_id, project_id, session_id):
# Create new execution log entry in database
# Generate unique execution UUID
# Set up log file path: logs/executions/user_{user_id}/YYYY-MM-DD/execution_{uuid}.log
# Return execution log ID and file path
def log_execution_start(self, execution_id, parameters, conda_env, port):
# Log execution start with full context
# Record parameters, environment, and execution setup
# Start real-time log streaming setup
def append_log_output(self, execution_id, output_type, content):
# Append output to log file and database
# output_type: 'stdout', 'stderr', 'debug', 'system'
# Handle real-time streaming to connected WebSocket clients
# Update database with latest output
def log_execution_end(self, execution_id, exit_code, duration):
# Mark execution as completed
# Calculate final statistics
# Close log file and update database
# Notify connected clients of completion
def get_user_logs(self, user_id, filters=None, page=1, per_page=50):
# Get paginated list of user's execution logs
# Apply filters: script_id, project_id, date_range, status
# Return logs with metadata (script name, project, duration, etc.)
def get_execution_log_detail(self, execution_id, user_id):
# Get detailed log information for specific execution
# Validate user owns the log
# Return full log content and metadata
def search_user_logs(self, user_id, search_query, filters=None):
# Full-text search across user's log content
# Search in output, parameters, and metadata
# Return matching executions with highlighted snippets
def delete_user_log(self, execution_id, user_id):
# Delete specific execution log (user validation)
# Remove from database and file system
# Update storage statistics
def export_user_logs(self, user_id, format='json', filters=None):
# Export user's logs in specified format (json, csv, txt)
# Apply optional filters
# Return file path or stream for download
def cleanup_old_logs(self, retention_days=30):
# Clean up logs older than retention period
# Respect user-level retention policies
# Update storage statistics
def get_user_log_statistics(self, user_id):
# Return user's logging statistics
# Total executions, success rate, storage usage
# Most used scripts and projects
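A minimal sketch of deriving a new execution's UUID and log file location, matching the logs/executions/user_{user_id}/YYYY-MM-DD/execution_{uuid}.log layout mentioned in create_execution_log above; the helper name is illustrative:
# Log path sketch for logs/executions/user_{user_id}/YYYY-MM-DD/execution_{uuid}.log
import os
import uuid
from datetime import date

def new_execution_log_path(base_log_path, user_id):
    """Generate a fresh execution UUID and its per-user, per-day log file path."""
    execution_uuid = str(uuid.uuid4())
    day_dir = os.path.join(base_log_path, 'executions', f'user_{user_id}', date.today().isoformat())
    os.makedirs(day_dir, exist_ok=True)
    return execution_uuid, os.path.join(day_dir, f'execution_{execution_uuid}.log')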
15. WebSocket Log Streaming Service
# Pseudo-code for real-time log streaming
class WebSocketLogService:
def __init__(self, socketio_instance):
self.socketio = socketio_instance
self.active_connections = {} # execution_id -> [client_sessions]
def connect_to_execution_log(self, client_session, execution_id, user_id):
# Validate user access to execution log
# Add client to active connections for this execution
# Send initial log content to client
# Set up real-time streaming
def broadcast_log_update(self, execution_id, log_content, output_type):
# Send log update to all connected clients for this execution
# Format message with timestamp and output type
# Handle client disconnections gracefully
def disconnect_from_execution_log(self, client_session, execution_id):
# Remove client from active connections
# Clean up resources if no more clients connected
def connect_to_user_logs(self, client_session, user_id):
# Connect to all active executions for user
# Send updates for any of user's running scripts
# Useful for dashboard real-time updates
def get_active_connections_count(self, execution_id):
# Return number of clients watching this execution
# Used for resource management decisions
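A minimal streaming sketch using Flask-SocketIO rooms, where each execution gets its own room keyed by its ID; the event names ('join_execution', 'log_update') and the room naming are illustrative choices, not fixed by this specification:
# WebSocket streaming sketch with Flask-SocketIO rooms (event and room names are illustrative)
from flask import Flask
from flask_socketio import SocketIO, join_room

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on('join_execution')
def handle_join(data):
    """A client asks to follow one execution; add it to that execution's room."""
    join_room(f"execution_{data['execution_id']}")

def broadcast_log_update(execution_id, content, output_type):
    """Push a log chunk to every client currently watching this execution."""
    socketio.emit('log_update',
                  {'execution_id': execution_id, 'type': output_type, 'content': content},
                  to=f'execution_{execution_id}')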
# Pseudo-code for multi-user session handling
class SessionManager:
def create_user_session(self, user_id, active_project_id=None):
# Create session context for user
# Set active project (default if none specified)
# Initialize user-specific settings
def get_user_context(self, session_id):
# Return user context including:
# - User ID and level
# - Active project
# - Permissions
# - Preferences
def switch_project(self, session_id, project_id):
# Change active project for session
# Validate user access to project
# Update session context
def cleanup_inactive_sessions(self):
# Remove expired sessions
# Release associated resources
# Pseudo-code for permissions
def require_permission(required_level):
    def decorator(func):
        def wrapper(*args, **kwargs):
            # Check user authentication
            # Validate user level against required_level
            # Allow the call, or abort with 403 when the level is insufficient
            return func(*args, **kwargs)
        return wrapper
    return decorator

def can_edit_metadata(user_level):
    # Returns True if user can edit script metadata
    return user_level in ['developer', 'admin']
Conda Environment Management
Environment Detection
- Automatic Discovery: Scans system for conda installations
- Cross-Platform Support: Works on both Windows and Linux environments
- Environment Validation: Verifies each environment is functional
- Python Version Detection: Identifies Python version for each environment
- Package Information: Optional package listing for environment details
Environment Assignment
- Group-Level Configuration: Each script group can use a different conda environment
- Default Environment: Falls back to 'base' environment if none specified
- Environment Validation: Ensures assigned environment exists before script execution
- Dynamic Updates: Environment assignments can be changed without restart
Execution Integration
- Seamless Activation: Scripts execute within their assigned conda environment
- Cross-Platform Commands: Handles conda activation differences between Windows/Linux
- Error Handling: Graceful fallback if environment becomes unavailable
- Logging: Environment activation and execution logged for troubleshooting
Management Interface
- Environment List: Visual display of all detected conda environments
- Assignment Interface: Dropdown selection for script group environment assignment
- Status Indicators: Visual indicators for environment availability
- Refresh Capability: Manual refresh of environment detection
User Interface Components
SIDEL Corporate Branding Integration
- SIDEL Logo: app/static/images/SIDEL.png displayed prominently in application header
- Corporate Identity: Consistent SIDEL branding across all pages and script interfaces
- Logo Propagation: SIDEL logo passed to script interfaces for consistent branding
- Responsive Logo: Logo adapts to different screen sizes and theme modes
1. Dashboard (Multi-User)
- SIDEL Header: SIDEL logo and corporate branding in main navigation
- User Context: Display current user information and active project
- Script Group Cards: Visual representation of script groups with icons
- Project Selector: Dropdown to switch between user's projects
- Quick Actions: Recently used scripts and favorites (user-specific)
- System Status: System health indicators and conda environment status
- User Info: Current user, language, theme settings, and active project
- Environment Selector: Quick conda environment overview per group
- Active Sessions: List of user's currently running script interfaces
2. Script Group View (Project-Aware with Enhanced Documentation)
- SIDEL Branding: Consistent SIDEL logo and corporate identity
- Project Context: Display current active project at top
- Script List: Filterable and searchable script listing with tag filtering
- Tag Management: Add/edit user-specific tags for scripts
- Documentation Viewer: Integrated Markdown viewer for long descriptions
- Multi-Language Documentation: Switch between available language versions
- Execution Forms: Dynamic forms based on script parameters
- Inline Help: Short descriptions with expandable long descriptions
- Execution History: Previous runs for current user/project
- Environment Info: Display active conda environment for the group
- Environment Management: Change conda environment (admin/developer only)
- Metadata Editor: Inline editing for script descriptions and levels (developer+)
- Markdown Editor: Edit long descriptions in Markdown format (developer+)
- Header Re-parsing: Refresh metadata from script headers
- Active Interfaces: List of currently running script web interfaces for user
- Project Management: Create, switch, or manage projects (inline controls)
3. Log Viewer (User-Centric)
- User-Isolated Logs: Each user can only view their own script execution logs
- Project-Specific Logs: Logs are organized by user projects for better context
- Real-time Logs: Live log streaming via WebSocket during script execution
- Post-Execution Logs: Persistent log storage for reviewing past script executions
- Execution Status Indicators: Visual status badges (Running, Completed, Failed, Terminated)
- Log Filtering & Search:
- Filter by script name, project, execution date range
- Filter by execution status and duration
- Full-text search across log content and parameters
- Save and reuse filter presets
- Log List View:
- Paginated table with execution summary
- Columns: Script Name, Project, Start Time, Duration, Status, Actions
- Sortable by any column
- Quick preview of output and errors
- Detailed Log View:
- Full execution log with syntax highlighting
- Tabbed interface: Output, Errors, Debug, Parameters, Environment
- Execution timeline and performance metrics
- Download individual log files
- Real-time Monitoring:
- Live updates during script execution
- Progress indicators and execution timeline
- Auto-scroll with pause/resume controls
- Multiple execution monitoring in tabs
- Export & Management:
- Download logs in various formats (TXT, JSON, CSV, PDF)
- Bulk operations: select multiple logs for export or deletion
- Log retention settings per user
- Storage usage statistics and cleanup recommendations
- Integration Features:
- Quick navigation to script or project from log entry
- Re-run script with same parameters from log
- Share log details with developers (admin only)
- Log bookmarking and notes
4. Admin Panel (Enhanced)
- User Management: CRUD operations for users
- User Data Management: View and manage user projects and data
- Permission Matrix: Visual permission assignment
- System Configuration: Application settings
- Discovery Management: Manual script discovery triggers
- Environment Management: Conda environment detection and assignment
- Interface Monitoring: Active script web interfaces management (all users)
- Metadata Audit: Track metadata changes
- Port Allocation: Monitor and manage port usage across users
- Data Directory Overview: System-wide view of user data usage
- Backup Management: System backup configuration and monitoring
- Documentation Management: Overview of script documentation status
Engineering-Focused Features
Multi-Language Technical Documentation
- Markdown Support: Full Markdown rendering with syntax highlighting for code blocks
- Multi-Language Documentation: Support for technical documentation in multiple languages
- Mathematical Formulas: Support for LaTeX/MathJax mathematical expressions in documentation
- Technical Diagrams: Embedding of diagrams and flowcharts in Markdown
- Code Examples: Syntax-highlighted code examples in documentation
User-Centric Organization
- Personal Tagging System: Each user can tag scripts with custom labels for personal organization
- Search by Tags: Quick filtering and searching of scripts using personal tags
- Algorithm Categorization: Organize engineering algorithms by domain, complexity, or use case
- Long-Term Accessibility: Documentation designed for easy understanding after extended periods
Documentation Structure
app/backend/script_groups/thermal_analysis/
├── metadata.json
├── heat_transfer.py
├── thermal_stress.py
├── docs/
│ ├── heat_transfer.md # English documentation
│ ├── heat_transfer_es.md # Spanish documentation
│ ├── heat_transfer_it.md # Italian documentation
│ ├── heat_transfer_fr.md # French documentation
│ ├── thermal_stress.md
│ ├── thermal_stress_es.md
│ ├── thermal_stress_it.md
│ └── thermal_stress_fr.md
└── examples/
├── heat_exchanger_example.json
└── stress_analysis_case.json
Backup and Data Integrity
- Daily System Backup: Automatic compression and backup of entire data directory
- Simple Backup Structure: /backup/YYYY-MM-DD/ format for easy manual recovery
- Data Persistence: All configurations and results preserved across sessions
- Project Isolation: Each engineering project maintains independent configurations
Engineering Workflow Integration
- Parameter Persistence: Engineering calculations and configurations saved per project
- Result Documentation: Integration with script-generated reports and outputs
- Configuration Templates: Reusable parameter sets for common engineering scenarios
- Cross-Reference Documentation: Link related algorithms and reference implementations
System Backup Architecture
Backup Directory Structure
backup/
├── 2025-09-11/
│ └── data_backup_20250911_020000.tar.gz
├── 2025-09-10/
│ └── data_backup_20250910_020000.tar.gz
├── 2025-09-09/
│ └── data_backup_20250909_020000.tar.gz
└── backup_logs/
├── backup_20250911.log
├── backup_20250910.log
└── backup_20250909.log
Backup Service Configuration
{
"backup": {
"enabled": true,
"schedule_time": "02:00",
"retention_days": 30,
"compression": "gzip",
"backup_path": "./backup",
"exclude_patterns": ["*.tmp", "*.log", "__pycache__"],
"max_backup_size_gb": 10
}
}
User-Centric Log Management System
Log Organization Structure
- User-Isolated Storage: Each user's logs stored separately to ensure privacy
- Project-Based Organization: Logs grouped by user projects for better context
- Date-Based Hierarchy: Daily directories for efficient log retrieval
- Unique Execution IDs: Each script execution gets a UUID for precise tracking
Log Content Structure
{
"execution_id": "abc123-def456-ghi789",
"script_id": 42,
"script_name": "data_analysis.py",
"user_id": 5,
"username": "john_doe",
"project_id": 12,
"project_name": "monthly_reports",
"execution_start": "2025-09-12T10:30:00Z",
"execution_end": "2025-09-12T10:35:30Z",
"duration_seconds": 330,
"status": "completed",
"exit_code": 0,
"parameters": {
"input_file": "data/reports/raw_data.csv",
"output_format": "excel",
"date_range": "2025-09-01_to_2025-09-11"
},
"environment": {
"conda_env": "data_analysis",
"python_version": "3.11.4",
"flask_port": 5023
},
"output": {
"stdout": "Processing 1000 records...\nAnalysis complete.\nReport saved to output/monthly_report_2025-09.xlsx",
"stderr": "",
"debug": "Memory usage: 45MB\nExecution time breakdown: Data loading (2.3s), Processing (4.1s), Export (1.2s)"
},
"session_info": {
"browser_session": "sess_xyz789",
"interface_url": "http://127.0.0.1:5023",
"real_time_viewers": 1
}
}
Real-time Log Streaming
- WebSocket Integration: Live updates during script execution
- Multiple Viewer Support: Multiple users can monitor same execution (admin)
- Selective Streaming: Choose which output types to stream (stdout, stderr, debug)
- Bandwidth Optimization: Compress and buffer log updates for efficiency
Log Retention & Cleanup
- User Level Based Retention:
- viewer: 7 days
- operator: 30 days
- developer: 90 days
- admin: 365 days (configurable)
- Automatic Cleanup: Daily maintenance job removes expired logs
- Manual Management: Users can delete their own logs before expiration
- Storage Quotas: Per-user storage limits with notifications
Log Search & Analytics
- Full-Text Search: Search across all log content using database indexing
- Advanced Filters: Combine multiple criteria (date, script, status, duration)
- Execution Statistics: Success rates, average durations, most used scripts
- Trend Analysis: Execution patterns and performance trends over time
Log Security & Privacy
- User Isolation: Strict access control - users see only their own logs
- Admin Override: Administrators can access any user's logs for debugging
- Audit Trail: Log access events are recorded for security auditing
- Data Protection: Sensitive parameters can be masked in log storage
Script Standard and Interface Contract
Required Script Parameters
Every script must accept the following command-line parameters:
python script.py --data-dir <path> --user-level <level> --port <number> --project-id <id> --project-name <name> --theme <theme> --language <lang>
Parameter Details:
- --data-dir: Absolute path to user/project data directory
- --user-level: User permission level (admin, developer, operator, viewer)
- --port: Assigned Flask port number for the script's web interface
- --project-id: Current project identifier for data isolation
- --project-name: Human-readable project name for display in script frontend
- --theme: Current user theme (light, dark) for consistent UI appearance
- --language: Current user language (en, es, it, fr) for localized interfaces
Script Implementation Template
import argparse
import os
import json
from flask import Flask, render_template, request, jsonify

def parse_arguments():
    parser = argparse.ArgumentParser(description='SIDEL ScriptsManager Script')
    parser.add_argument('--data-dir', required=True, help='Data directory path')
    parser.add_argument('--user-level', required=True, choices=['admin', 'developer', 'operator', 'viewer'])
    parser.add_argument('--port', type=int, required=True, help='Flask port number')
    parser.add_argument('--project-id', required=True, help='Project identifier')
    parser.add_argument('--project-name', required=True, help='Project display name')
    parser.add_argument('--theme', required=True, choices=['light', 'dark'], help='Current user theme')
    parser.add_argument('--language', required=True, choices=['en', 'es', 'it', 'fr'], help='Current user language')
    return parser.parse_args()

class ScriptDataManager:
    def __init__(self, data_dir, project_id, project_name):
        self.data_dir = data_dir
        self.project_id = project_id
        self.project_name = project_name
        self.ensure_data_structure()

    def ensure_data_structure(self):
        """Create data directory structure if it doesn't exist"""
        os.makedirs(self.data_dir, exist_ok=True)

    def load_config(self, filename='config.json'):
        """Load configuration from JSON file"""
        config_path = os.path.join(self.data_dir, filename)
        if os.path.exists(config_path):
            with open(config_path, 'r') as f:
                return json.load(f)
        return {}

    def save_config(self, config, filename='config.json'):
        """Save configuration to JSON file"""
        config_path = os.path.join(self.data_dir, filename)
        with open(config_path, 'w') as f:
            json.dump(config, f, indent=2)

def create_flask_app(data_manager, user_level, project_id, project_name, theme, language):
    app = Flask(__name__)
    # SIDEL logo path for consistent branding
    sidel_logo = '/static/images/SIDEL.png'

    @app.route('/')
    def index():
        config = data_manager.load_config()
        return render_template('index.html',
                               config=config,
                               user_level=user_level,
                               project_id=project_id,
                               project_name=project_name,
                               theme=theme,
                               language=language,
                               sidel_logo=sidel_logo)

    @app.route('/api/config', methods=['GET', 'POST'])
    def handle_config():
        if request.method == 'GET':
            return jsonify(data_manager.load_config())
        else:
            config = request.json
            data_manager.save_config(config)
            return jsonify({'status': 'success'})

    @app.route('/api/project-info')
    def get_project_info():
        return jsonify({
            'project_id': project_id,
            'project_name': project_name,
            'user_level': user_level,
            'theme': theme,
            'language': language,
            'sidel_logo': sidel_logo
        })

    return app

if __name__ == '__main__':
    args = parse_arguments()
    # Initialize data manager with project information
    data_manager = ScriptDataManager(args.data_dir, args.project_id, args.project_name)
    # Create Flask application with SIDEL branding, project context, theme and language
    app = create_flask_app(data_manager, args.user_level, args.project_id, args.project_name, args.theme, args.language)
    # Run Flask server
    print(f"Starting SIDEL script for project: {args.project_name} (Theme: {args.theme}, Language: {args.language})")
    app.run(host='127.0.0.1', port=args.port, debug=False)
Data Management Guidelines
- Use Provided Data Directory: Always use the --data-dir parameter for persistent storage
- JSON Configuration: Store settings in JSON files for easy management
- User Level Awareness: Adapt interface based on user permission level
- Project Isolation: Use project ID to separate data when needed
- Project Display: Use project name for user-friendly display in interfaces
- SIDEL Branding: Include SIDEL logo and corporate branding in all interfaces
- Theme Consistency: Apply the provided theme (light/dark) to maintain visual consistency
- Language Localization: Use the provided language parameter for interface localization
- Error Handling: Gracefully handle missing or corrupted data files
Flask Interface Requirements
- Port Binding: Must bind to the exact port provided by SIDEL ScriptsManager
- Host Restriction: Bind only to 127.0.0.1 for security
- Graceful Shutdown: Handle SIGTERM for clean shutdown (see the sketch after this list)
- Session Management: Maintain user context throughout session
- Error Reporting: Report errors through standard logging
- SIDEL Branding: Include SIDEL logo and consistent visual identity
- Project Context: Display project name prominently in interface
- Theme Consistency: Apply the provided theme (light/dark) throughout the interface
- Language Support: Use the provided language for interface localization and messages
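A minimal sketch of the SIGTERM handling required above, registered before app.run() so the script can flush its state when SIDEL ScriptsManager terminates the process; the handler name is illustrative, and Windows support for SIGTERM handlers is limited:
# Graceful shutdown sketch: let ScriptsManager stop the script cleanly with SIGTERM
import signal
import sys

def handle_sigterm(signum, frame):
    """Save any pending configuration or results here, then exit cleanly."""
    print('SIGTERM received, shutting down script interface')
    sys.exit(0)

signal.signal(signal.SIGTERM, handle_sigterm)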
Multi-User Data Architecture
Data Directory Structure
data/
├── script_groups/
│ ├── group_analytics/
│ │ ├── user_john/
│ │ │ ├── project_default/
│ │ │ │ ├── config.json
│ │ │ │ ├── datasets.json
│ │ │ │ └── analysis_results.json
│ │ │ ├── project_monthly_report/
│ │ │ │ ├── config.json
│ │ │ │ └── report_data.json
│ │ │ └── project_customer_analysis/
│ │ │ └── config.json
│ │ └── user_mary/
│ │ ├── project_default/
│ │ │ └── config.json
│ │ └── project_experimental/
│ │ ├── config.json
│ │ └── experiments.json
│ └── group_automation/
│ ├── user_john/
│ │ └── project_default/
│ │ ├── workflows.json
│ │ └── schedules.json
│ └── user_admin/
│ └── project_system_maintenance/
│ └── maintenance_config.json
├── backups/
│ ├── user_john_project_monthly_report_20250911_143022.zip
│ └── user_mary_project_experimental_20250910_091545.zip
└── system/
├── port_allocations.json
└── active_sessions.json
Project Management Workflow
- User Login: ScriptsManager creates user session
- Project Selection: User selects or creates project
- Script Execution: ScriptsManager passes project-specific data directory
- Data Persistence: Script manages its own JSON files within provided directory
- Session Continuity: Project context maintained across script executions
- Data Backup: Automatic and manual backup capabilities
Web Interface Lifecycle Management
Interface Startup
- Port Allocation: Automatically assigns available ports from configured range (5200-5400)
- Process Registration: Tracks script processes with PID, port, and session ID
- Tab Session Tracking: Generates unique session IDs for browser tab monitoring
- Automatic Opening: Opens script interface in new browser tab upon execution
Session Monitoring
- Heartbeat System: JavaScript in each tab sends periodic pings to maintain session
- Tab Detection: Detects when browser tabs are closed or become inactive
- Process Linking: Links browser sessions to running script processes
- Timeout Management: Configurable timeout for inactive sessions
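A minimal sketch of the heartbeat bookkeeping described above, seen from the ScriptsManager side: each ping-interface call refreshes the session's timestamp, and a periodic pass flags sessions whose last ping is older than the configured timeout. The in-memory dictionary is illustrative; the specification keeps this state in script_processes.last_ping:
# Heartbeat sketch: record tab pings and detect sessions that stopped pinging
import time

last_ping = {}  # session_id -> timestamp (the spec persists this in script_processes.last_ping)

def record_ping(session_id):
    """Called when POST /api/scripts/{script_id}/ping-interface arrives from a tab."""
    last_ping[session_id] = time.time()

def find_stale_sessions(timeout_seconds=1800):
    """Return session IDs whose tabs have not pinged within the timeout (default matches session_timeout)."""
    now = time.time()
    return [sid for sid, ts in last_ping.items() if now - ts > timeout_seconds]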
Graceful Shutdown
- Tab Closure Detection: Monitors for tab closure events via JavaScript
- Process Termination: Gracefully terminates script processes when tabs close
- Resource Cleanup: Frees allocated ports and removes database records
- Orphan Prevention: Periodic cleanup of abandoned processes
Configuration Options
{
"web_interface": {
"port_range": {"start": 5200, "end": 5400},
"session_timeout": 1800,
"heartbeat_interval": 30,
"cleanup_interval": 300,
"max_concurrent_interfaces": 20,
"max_interfaces_per_user": 5,
"port_availability_check": true,
"port_check_retries": 3
},
"data_management": {
"base_data_path": "./data",
"auto_backup": true,
"backup_interval_hours": 24,
"max_backup_versions": 30,
"compress_backups": true,
"backup_schedule_time": "02:00"
},
"documentation": {
"markdown_extensions": ["codehilite", "tables", "toc", "math"],
"supported_languages": ["en", "es", "it", "fr"],
"default_language": "en",
"enable_math_rendering": true,
"enable_diagram_rendering": false
},
"tagging": {
"max_tags_per_script": 20,
"max_tag_length": 30,
"allowed_tag_chars": "alphanumeric_underscore_dash",
"enable_tag_suggestions": true
},
"multi_user": {
"max_projects_per_user": 50,
"default_project_name": "project_default",
"auto_create_default_project": true,
"user_data_isolation": true
},
"security": {
"data_directory_permissions": "755",
"config_file_permissions": "644",
"enable_project_sharing": false,
"admin_can_access_all_data": true
}
}
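A minimal sketch of loading this configuration with fallbacks to built-in defaults; the file path and helper name are assumptions:
# Sketch: load web interface configuration with defaults (path is an assumption)
import copy
import json
from pathlib import Path

DEFAULTS = {
    "web_interface": {"port_range": {"start": 5200, "end": 5400}, "session_timeout": 1800},
    "multi_user": {"default_project_name": "project_default"},
}

def load_config(path="config/config.json"):
    config = copy.deepcopy(DEFAULTS)
    config_path = Path(path)
    if config_path.exists():
        with config_path.open() as f:
            # Values from the file override the built-in defaults, section by section
            for section, values in json.load(f).items():
                config.setdefault(section, {}).update(values)
    return config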
Script Metadata Management
Header Parsing Rules
- First Discovery: Automatically extracts metadata from script docstring/comments (see the sketch after this list)
- Precedence: User-edited metadata takes precedence over header-parsed data
- Re-parsing: Manual refresh option to update from modified headers
- Validation: Validates required_level values against allowed user levels
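A minimal sketch of header parsing, assuming a simple "Key: value" layout inside the script docstring; the authoritative header format is the one defined elsewhere in this specification:
# Sketch: extract metadata from a script's docstring header (format is an assumption)
import ast

ALLOWED_LEVELS = {"admin", "developer", "operator", "viewer"}

def parse_script_header(script_path):
    with open(script_path, "r", encoding="utf-8") as f:
        docstring = ast.get_docstring(ast.parse(f.read())) or ""
    metadata = {}
    for line in docstring.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            metadata[key.strip().lower()] = value.strip()
    # Validate required_level against the allowed user levels
    level = metadata.get("required_level")
    if level and level not in ALLOWED_LEVELS:
        raise ValueError(f"Invalid required_level: {level}")
    return metadata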
Editable Fields (Developer+ Only)
- Description: Multi-language script descriptions (short)
- Long Description: Multi-language Markdown documentation (long)
- Required Level: Minimum user level required for script execution/viewing
- Category: Script categorization for filtering and organization
- System Tags: Script tags for classification and searching
- Parameters: Script parameter definitions and validation rules
- Execution Settings: Timeout, conda environment, interface settings
Documentation Management
- Markdown Files: Automatic creation and management of documentation files
- Language Versions: Support for multiple language versions of documentation
- Template Generation: Auto-generate documentation templates for new scripts (see the sketch below)
- Content Validation: Basic validation of Markdown syntax and structure
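A minimal sketch of generating per-language documentation templates for a newly discovered script; the file naming and directory layout are assumptions:
# Sketch: generate per-language Markdown documentation templates (layout is an assumption)
from pathlib import Path

def generate_doc_templates(script_name, docs_dir="docs", languages=("en", "es", "it", "fr")):
    created = []
    for lang in languages:
        doc_path = Path(docs_dir) / f"{script_name}_{lang}.md"
        if not doc_path.exists():
            doc_path.parent.mkdir(parents=True, exist_ok=True)
            # Minimal skeleton to be completed by a developer-level user
            doc_path.write_text(
                f"# {script_name}\n\n## Description\n\n## Parameters\n\n## Usage\n",
                encoding="utf-8",
            )
            created.append(doc_path)
    return created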
Security Considerations
1. Authentication & Authorization
- Secure password hashing with bcrypt (see the sketch below)
- Session management with timeout
- CSRF protection
- Input validation and sanitization
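A minimal sketch of the bcrypt hashing referenced above, assuming the bcrypt package is added to the dependency list (it is not in the list further below):
# Sketch: bcrypt-based password hashing (assumes the 'bcrypt' package is installed)
import bcrypt

def hash_password(password: str) -> str:
    # gensalt() embeds a per-password salt and work factor in the resulting hash
    return bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt()).decode("utf-8")

def verify_password(password: str, password_hash: str) -> bool:
    return bcrypt.checkpw(password.encode("utf-8"), password_hash.encode("utf-8"))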
2. Script Execution Security
- Sandboxed execution environment
- Resource limits (CPU, memory, time); see the sketch below
- Whitelist of allowed script locations
- Parameter validation and escaping
- Web interface port isolation
- Process monitoring and automatic cleanup
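A minimal sketch of enforcing CPU and memory limits on a launched script process on Linux, using the standard resource module; the limit values are illustrative:
# Sketch: resource limits for script execution on Linux (limit values are illustrative)
import resource
import subprocess

def limit_resources():
    # Applied in the child process before exec: cap CPU seconds and address space
    resource.setrlimit(resource.RLIMIT_CPU, (300, 300))                      # 5 minutes of CPU time
    resource.setrlimit(resource.RLIMIT_AS, (1_000_000_000, 1_000_000_000))   # roughly 1 GB of memory

def run_limited(cmd, timeout=600):
    # The timeout bounds wall-clock time; the rlimits bound CPU and memory
    return subprocess.run(cmd, preexec_fn=limit_resources, timeout=timeout,
                          capture_output=True, text=True)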
3. Access Control
- Role-based access control (RBAC); see the sketch below
- Principle of least privilege
- Audit logging for all actions
- Secure file path handling
- Metadata editing permissions (developer+ only)
- Web interface session security
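A minimal sketch of role-based access control as a Flask view decorator, assuming the level ordering viewer < operator < developer < admin and that the current level is stored in the session; the helper names are assumptions:
# Sketch: role-based access control decorator (helper names are assumptions)
from functools import wraps
from flask import session, abort

LEVEL_ORDER = {"viewer": 0, "operator": 1, "developer": 2, "admin": 3}

def require_level(minimum):
    def decorator(view):
        @wraps(view)
        def wrapped(*args, **kwargs):
            current = session.get("user_level", "viewer")
            if LEVEL_ORDER.get(current, -1) < LEVEL_ORDER[minimum]:
                abort(403)  # insufficient privileges
            return view(*args, **kwargs)
        return wrapped
    return decorator

# Example: metadata editing restricted to developer level and above
# @app.route("/scripts/<script_id>/metadata", methods=["POST"])
# @require_level("developer")
# def edit_metadata(script_id): ...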
Installation & Deployment
Requirements
- Python 3.12+ (minimum required version)
- Operating System: Linux (primary) with Windows support
- Conda/Miniconda: Required for environment management
- WebSocket support: For real-time log streaming
Database Engine
SQLite (Recommended for cross-platform deployment)
- Rationale:
- Zero-configuration setup
- Cross-platform compatibility (Linux/Windows)
- Single file database for easy backup
- Built-in Python support
- Sufficient performance for engineering script management
- No additional server requirements
- File-based storage: Simplifies deployment and maintenance
- Automatic backup integration: Single file backup with system data
- Migration path: Can upgrade to PostgreSQL if needed in future
Python Dependencies
# Core Framework
flask>=3.0.0
flask-sqlalchemy>=3.1.0
flask-login>=0.6.0
flask-wtf>=1.2.0
flask-socketio>=5.3.0
# Database
# sqlite3 is built into Python 3.12+; no package entry required
# Web Server
gunicorn>=21.2.0 # Production WSGI server
eventlet>=0.33.0 # WebSocket support
# Conda Environment Management
conda-pack>=0.7.0 # Conda environment utilities
# subprocess is built into Python 3.12+; the subprocess32 backport targets Python 2 and is not needed
# Markdown Processing
markdown>=3.5.0
pymdown-extensions>=10.0 # Additional Markdown extensions (codehilite, tables, and toc ship with markdown itself)
pygments>=2.16.0 # Syntax highlighting
# File Management & Compression
watchdog>=3.0.0 # File system monitoring
# zipfile (built into Python 3.12+) covers backup compression; the zipfile36 backport is not needed
# Utilities
pyyaml>=6.0.1 # YAML configuration support
python-dateutil>=2.8.2 # Date/time utilities
psutil>=5.9.0 # Process management
requests>=2.31.0 # HTTP client for health checks
# Development & Testing (optional)
pytest>=7.4.0
pytest-flask>=1.3.0
black>=23.9.0 # Code formatting
flake8>=6.1.0 # Code linting
Installation Script
# Create Python 3.12+ virtual environment
python3.12 -m venv scriptsmanager_env
source scriptsmanager_env/bin/activate # Linux
# scriptsmanager_env\Scripts\activate # Windows
# Install dependencies
pip install --upgrade pip
pip install -r requirements.txt
# Initialize database
python init_db.py
# Create initial admin user
python create_admin.py --username admin --password <secure_password>
# Start application
python app.py
Database Setup
# Database initialization with SQLite
import sqlite3
import os
from datetime import datetime

def initialize_database(db_path="data/scriptsmanager.db"):
    """Initialize SQLite database with required tables"""
    # Ensure data directory exists
    os.makedirs(os.path.dirname(db_path), exist_ok=True)
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    # Create all tables from schema
    with open('sql/create_tables.sql', 'r') as f:
        schema_sql = f.read()
    cursor.executescript(schema_sql)
    # Create default admin user
    # hash_password is the application's bcrypt-based helper (see Security Considerations); change this default in production
    admin_password = hash_password("admin123")
    cursor.execute("""
        INSERT OR IGNORE INTO users
        (username, email, password_hash, user_level, is_active)
        VALUES (?, ?, ?, ?, ?)
    """, ("admin", "admin@localhost", admin_password, "admin", True))
    conn.commit()
    conn.close()
    print(f"Database initialized at: {db_path}")

if __name__ == "__main__":
    initialize_database()
Configuration Management
# config/app_config.py
import os
from pathlib import Path

class Config:
    # Database Configuration
    DATABASE_URL = os.getenv('DATABASE_URL', 'sqlite:///data/scriptsmanager.db')

    # Application Settings
    SECRET_KEY = os.getenv('SECRET_KEY', 'your-secret-key-change-in-production')
    DEBUG = os.getenv('DEBUG', 'False').lower() == 'true'

    # Multi-user Settings
    BASE_DATA_PATH = Path(os.getenv('BASE_DATA_PATH', './data'))
    MAX_PROJECTS_PER_USER = int(os.getenv('MAX_PROJECTS_PER_USER', '50'))

    # Port Management
    PORT_RANGE_START = int(os.getenv('PORT_RANGE_START', '5200'))
    PORT_RANGE_END = int(os.getenv('PORT_RANGE_END', '5400'))
    PORT_AVAILABILITY_CHECK = os.getenv('PORT_AVAILABILITY_CHECK', 'True').lower() == 'true'
    PORT_CHECK_RETRIES = int(os.getenv('PORT_CHECK_RETRIES', '3'))

    # Backup Configuration
    BACKUP_ENABLED = os.getenv('BACKUP_ENABLED', 'True').lower() == 'true'
    BACKUP_SCHEDULE_TIME = os.getenv('BACKUP_SCHEDULE_TIME', '02:00')
    BACKUP_RETENTION_DAYS = int(os.getenv('BACKUP_RETENTION_DAYS', '30'))

    # Conda Environment
    CONDA_AUTO_DETECT = os.getenv('CONDA_AUTO_DETECT', 'True').lower() == 'true'

    # Supported Languages
    SUPPORTED_LANGUAGES = ['en', 'es', 'it', 'fr']
    DEFAULT_LANGUAGE = os.getenv('DEFAULT_LANGUAGE', 'en')
Conda Environment Detection
# services/conda_service.py
import subprocess
import json
import os
from pathlib import Path

class CondaService:
    def __init__(self):
        self.conda_executable = self.find_conda_executable()

    def find_conda_executable(self):
        """Find conda executable on Windows/Linux"""
        possible_paths = [
            'conda',
            '/opt/conda/bin/conda',
            '/usr/local/bin/conda',
            os.path.expanduser('~/miniconda3/bin/conda'),
            os.path.expanduser('~/anaconda3/bin/conda'),
            # Windows paths
            r'C:\ProgramData\Miniconda3\Scripts\conda.exe',
            r'C:\ProgramData\Anaconda3\Scripts\conda.exe',
            os.path.expanduser(r'~\Miniconda3\Scripts\conda.exe'),
            os.path.expanduser(r'~\Anaconda3\Scripts\conda.exe'),
        ]
        for path in possible_paths:
            try:
                result = subprocess.run([path, '--version'],
                                        capture_output=True, text=True, timeout=10)
                if result.returncode == 0:
                    return path
            except (FileNotFoundError, subprocess.TimeoutExpired):
                continue
        raise RuntimeError("Conda executable not found. Please install Miniconda or Anaconda.")

    def list_environments(self):
        """List all available conda environments"""
        try:
            result = subprocess.run([self.conda_executable, 'env', 'list', '--json'],
                                    capture_output=True, text=True, timeout=30)
            if result.returncode == 0:
                env_data = json.loads(result.stdout)
                return env_data.get('envs', [])
        except Exception as e:
            print(f"Error listing conda environments: {e}")
        return []  # Fall back to an empty list if conda cannot be queried
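A minimal sketch of executing a script inside a selected conda environment with `conda run`, which avoids platform-specific activation commands; the function name and placement are assumptions:
# Sketch: run a script inside a named conda environment via 'conda run'
import subprocess

def run_in_environment(conda_executable, env_name, script_path, extra_args=None, timeout=300):
    # 'conda run -n <env> python <script>' behaves the same on Linux and Windows,
    # so no shell-specific activation step is required
    cmd = [conda_executable, 'run', '-n', env_name, 'python', script_path]
    cmd += list(extra_args or [])
    return subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)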
Production Deployment
# Production deployment with Gunicorn
gunicorn --bind 0.0.0.0:8000 \
--workers 4 \
--worker-class eventlet \
--timeout 300 \
--keep-alive 30 \
--access-logfile logs/access.log \
--error-logfile logs/error.log \
app:app
# Systemd service file (Linux)
# /etc/systemd/system/scriptsmanager.service
[Unit]
Description=ScriptsManager Web Application
After=network.target
[Service]
Type=simple
User=scriptsmanager
Group=scriptsmanager
WorkingDirectory=/opt/scriptsmanager
Environment=PATH=/opt/scriptsmanager/venv/bin
ExecStart=/opt/scriptsmanager/venv/bin/gunicorn --config gunicorn.conf.py app:app
Restart=always
RestartSec=10
[Install]
WantedBy=multi-user.target
Development Setup
# Development environment setup
git clone <repository-url> scriptsmanager
cd scriptsmanager
# Create virtual environment with Python 3.12+
python3.12 -m venv venv
source venv/bin/activate # Linux/Mac
# venv\Scripts\activate # Windows
# Install development dependencies
pip install -r requirements-dev.txt
# Initialize development database
python scripts/init_dev_db.py
# Start development server
flask run --debug --host=127.0.0.1 --port=5000
Cross-Platform Considerations
- Path Handling: Use pathlib.Path for cross-platform file operations
- Process Management: Platform-specific conda activation commands
- Service Installation: Different approaches for Linux (systemd) vs Windows (Windows Service)
- File Permissions: Appropriate permission handling for each OS
- Environment Variables: Platform-specific environment variable handling
Monitoring & Health Checks
# Health check endpoint
# (check_database_connection and the other helpers below are application-level functions defined elsewhere)
from datetime import datetime

@app.route('/health')
def health_check():
    return {
        'status': 'healthy',
        'timestamp': datetime.utcnow().isoformat(),
        'database': check_database_connection(),
        'conda': check_conda_availability(),
        'active_scripts': get_active_script_count(),
        'port_usage': get_port_allocation_stats()
    }
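A minimal sketch of two of these helpers, assuming the SQLite database path from the Database Setup section and the CondaService defined above; the function bodies are illustrative:
# Sketch: health-check helpers (database path and return conventions are assumptions)
import sqlite3

def check_database_connection(db_path="data/scriptsmanager.db"):
    try:
        # A trivial query confirms the database file is reachable and readable
        with sqlite3.connect(db_path) as conn:
            conn.execute("SELECT 1")
        return True
    except sqlite3.Error:
        return False

def check_conda_availability():
    try:
        CondaService()  # raises RuntimeError if no conda executable is found
        return True
    except RuntimeError:
        return False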