Apply global backup configurations to new and existing projects

Miguel 2025-09-03 09:41:25 +02:00
parent 3e517bb1ee
commit 958d8ac994
9 changed files with 13928 additions and 160 deletions


@@ -0,0 +1,125 @@
# Fix: Applying Global Configurations to Projects
## Problem Identified
When new projects were added to the system, the global backup configurations defined in `config.json` were not being assigned. Projects only received basic settings such as `default_schedule` and `default_schedule_time`, and were missing important settings (a before/after sketch follows the list):
- `retry_delay_hours`: how long to wait before retrying a failed backup
- `backup_timeout_minutes`: maximum time allowed for a backup to complete
- `min_backup_interval_minutes`: minimum interval between backups of the same project
- `min_free_space_mb`: minimum free space required before running a backup
- every setting in `backup_options` (compression, hash algorithm, etc.)
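A minimal before/after sketch of a project entry, with hypothetical values; only the key names and defaults are taken from the sections below:
```python
# Hypothetical project entry BEFORE the fix: only the basic schedule keys.
before = {"schedule_config": {"schedule": "daily", "schedule_time": "02:00"}}

# The same entry AFTER the fix: global settings merged into schedule_config
# and a backup_options section added (abbreviated to two options here).
after = {
    "schedule_config": {
        "schedule": "daily",
        "schedule_time": "02:00",
        "retry_delay_hours": 1,
        "backup_timeout_minutes": 0,
        "min_backup_interval_minutes": 10,
        "min_free_space_mb": 100,
    },
    "backup_options": {"compression_level": 6, "hash_algorithm": "md5"},
}
```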
## Changes Made
### 1. New Method in ProjectDiscoveryService
The method `_apply_global_configurations()` was added to `src/services/project_discovery_service.py`:
```python
def _apply_global_configurations(self, project_data: Dict[str, Any]) -> Dict[str, Any]:
    """Apply all global configurations to a new project"""
    global_settings = self.config.global_settings
    backup_options = self.config.backup_options

    # Apply global settings to schedule_config
    project_data["schedule_config"].update({
        "retry_delay_hours": global_settings.get("retry_delay_hours", 1),
        "backup_timeout_minutes": global_settings.get("backup_timeout_minutes", 0),
        "min_backup_interval_minutes": global_settings.get("min_backup_interval_minutes", 10),
        "min_free_space_mb": global_settings.get("min_free_space_mb", 100),
        # ... existing settings
    })

    # Apply backup options
    project_data["backup_options"] = {
        "compression_level": backup_options.get("compression_level", 6),
        "include_subdirectories": backup_options.get("include_subdirectories", True),
        # ... all backup options
    }

    return project_data
```
### 2. Updating the Project-Creation Functions
The functions `_create_project_from_s7p()` and `_create_manual_project()` were updated to apply the global configurations before constructing the `Project`:
```python
# Apply global configurations
project_data = self._apply_global_configurations(project_data)

return Project(project_data)
```
### 3. Project Model Update
The `Project` class in `src/models/project_model.py` was extended to handle the new settings:
**In the constructor:**
```python
# Backup settings (inherited from the global configuration)
backup_options = project_data.get("backup_options", {})
self.compression_level = backup_options.get("compression_level", 6)
self.include_subdirectories = backup_options.get("include_subdirectories", True)
# ... all backup options

# Additional project settings (inherited from global_settings)
self.retry_delay_hours = schedule_config.get("retry_delay_hours", 1)
self.backup_timeout_minutes = schedule_config.get("backup_timeout_minutes", 0)
# ... additional settings
```
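Because every new attribute falls back through `.get()`, a project dict written before this change still loads. A minimal sketch, assuming (as the fallbacks suggest) that the constructor tolerates a dict without `backup_options`:
```python
from models.project_model import Project  # import path as used in the test scripts

# Hypothetical minimal input: no backup_options, no extended schedule_config.
project = Project({"name": "Demo", "schedule_config": {}})

assert project.compression_level == 6  # backup_options default
assert project.retry_delay_hours == 1  # global_settings default
```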
**In the `to_dict()` method:**
```python
"schedule_config": {
    # ... existing settings
    "retry_delay_hours": self.retry_delay_hours,
    "backup_timeout_minutes": self.backup_timeout_minutes,
    "min_backup_interval_minutes": self.min_backup_interval_minutes,
    "min_free_space_mb": self.min_free_space_mb,
},
"backup_options": {
    "compression_level": self.compression_level,
    "include_subdirectories": self.include_subdirectories,
    # ... all backup options
},
```
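With both the constructor and `to_dict()` updated, the inherited fields survive a serialization round trip. A sketch, under the same minimal-dict assumption as above:
```python
p = Project({"name": "Demo", "schedule_config": {}})
restored = Project(p.to_dict())  # to_dict() now emits the inherited fields

assert restored.compression_level == p.compression_level
assert restored.min_free_space_mb == p.min_free_space_mb
```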
### 4. Updating Existing Projects
A script was run to update the 158 existing projects with the missing global configurations, creating a safety backup before the update (the full script, `update_projects_config.py`, is included later in this commit).
## Global Configurations Applied
The defaults below are the ones used throughout the code above; a sketch of how they might look in `config.json` follows the two lists.
### Global Settings (in schedule_config)
- `retry_delay_hours`: 1 hour by default
- `backup_timeout_minutes`: 0 (no limit) by default
- `min_backup_interval_minutes`: 10 minutes by default
- `min_free_space_mb`: 100 MB by default
### Backup Options
- `compression_level`: 6 (balanced)
- `include_subdirectories`: true
- `preserve_directory_structure`: true
- `hash_algorithm`: "md5"
- `hash_includes`: ["timestamp", "size"]
- `backup_type`: "complete"
- `process_priority`: "low"
- `sequential_execution`: true
- `filename_format`: "HH-MM-SS_projects.zip"
- `date_format`: "YYYY-MM-DD"
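For reference, a sketch of the corresponding `config.json` content, written here as the Python dicts that `json.load` would return; the key names and defaults come from the lists above, while the exact layout of the real file is an assumption:
```python
# Hypothetical config.json content (as parsed); only the key names and
# default values are confirmed by the lists above.
global_settings = {
    "default_schedule": "daily",
    "default_schedule_time": "02:00",
    "retry_delay_hours": 1,
    "backup_timeout_minutes": 0,
    "min_backup_interval_minutes": 10,
    "min_free_space_mb": 100,
}
backup_options = {
    "compression_level": 6,
    "include_subdirectories": True,
    "preserve_directory_structure": True,
    "hash_algorithm": "md5",
    "hash_includes": ["timestamp", "size"],
    "backup_type": "complete",
    "process_priority": "low",
    "sequential_execution": True,
    "filename_format": "HH-MM-SS_projects.zip",
    "date_format": "YYYY-MM-DD",
}
```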
## Verification
Tests confirmed that:
1. ✅ **Existing projects updated**: all 158 existing projects now carry every global configuration
2. ✅ **New manual projects**: automatically receive all global configurations
3. ✅ **New S7P projects**: also automatically receive all global configurations
## Benefits
- **Consistency**: all projects now inherit the same global configurations
- **Maintainability**: changes to the global configurations are applied to future projects
- **Flexibility**: each project keeps its own settings, which can still be modified independently
- **Backward compatibility**: existing projects keep their current values while receiving the missing ones (see the `setdefault` sketch after this list)
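The backward compatibility comes from the migration script's use of `dict.setdefault`, which only fills in keys that are absent. A minimal illustration with a hypothetical per-project override:
```python
schedule_config = {"retry_delay_hours": 4}  # hypothetical per-project override

schedule_config.setdefault("retry_delay_hours", 1)    # existing value wins
schedule_config.setdefault("min_free_space_mb", 100)  # missing key is filled in

print(schedule_config)  # {'retry_delay_hours': 4, 'min_free_space_mb': 100}
```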
## Files Modified
- `src/services/project_discovery_service.py`: new `_apply_global_configurations()` method
- `src/models/project_model.py`: `Project` model extended to handle the additional settings
- `projects.json`: updated with global configurations for all existing projects
- `projects.json.backup`: safety backup created automatically

projects.json (modified; diff suppressed because it is too large)

projects.json.backup (new file, 10130 lines; diff suppressed because it is too large)

src/models/project_model.py

@@ -74,6 +74,32 @@ class Project:
        self.discovery_method = discovery_info.get("discovery_method", "")
        self.auto_discovered = discovery_info.get("auto_discovered", True)

        # Backup settings (inherited from the global configuration)
        backup_options = project_data.get("backup_options", {})
        self.compression_level = backup_options.get("compression_level", 6)
        self.include_subdirectories = backup_options.get("include_subdirectories", True)
        self.preserve_directory_structure = backup_options.get(
            "preserve_directory_structure", True
        )
        self.hash_algorithm = backup_options.get("hash_algorithm", "md5")
        self.hash_includes = backup_options.get("hash_includes", ["timestamp", "size"])
        self.backup_type = backup_options.get("backup_type", "complete")
        self.process_priority = backup_options.get("process_priority", "low")
        self.sequential_execution = backup_options.get("sequential_execution", True)
        self.filename_format = backup_options.get(
            "filename_format", "HH-MM-SS_projects.zip"
        )
        self.date_format = backup_options.get("date_format", "YYYY-MM-DD")

        # Additional project settings (inherited from global_settings)
        schedule_config = project_data.get("schedule_config", {})
        self.retry_delay_hours = schedule_config.get("retry_delay_hours", 1)
        self.backup_timeout_minutes = schedule_config.get("backup_timeout_minutes", 0)
        self.min_backup_interval_minutes = schedule_config.get(
            "min_backup_interval_minutes", 10
        )
        self.min_free_space_mb = schedule_config.get("min_free_space_mb", 100)

    def to_dict(self) -> Dict[str, Any]:
        """Convert the project to a dictionary for JSON serialization"""
        return {
@@ -90,6 +116,10 @@ class Project:
                "schedule_time": self.schedule_time,
                "enabled": self.enabled,
                "next_scheduled_backup": self.next_scheduled_backup,
                "retry_delay_hours": self.retry_delay_hours,
                "backup_timeout_minutes": self.backup_timeout_minutes,
                "min_backup_interval_minutes": self.min_backup_interval_minutes,
                "min_free_space_mb": self.min_free_space_mb,
            },
            "backup_history": {
                "last_backup_date": self.last_backup_date,
@@ -120,6 +150,18 @@ class Project:
                "discovery_method": self.discovery_method,
                "auto_discovered": self.auto_discovered,
            },
            "backup_options": {
                "compression_level": self.compression_level,
                "include_subdirectories": self.include_subdirectories,
                "preserve_directory_structure": self.preserve_directory_structure,
                "hash_algorithm": self.hash_algorithm,
                "hash_includes": self.hash_includes,
                "backup_type": self.backup_type,
                "process_priority": self.process_priority,
                "sequential_execution": self.sequential_execution,
                "filename_format": self.filename_format,
                "date_format": self.date_format,
            },
        }

    def update_status(self, status: ProjectStatus, error_message: str = None) -> None:

src/services/project_discovery_service.py

@@ -262,6 +262,9 @@ class ProjectDiscoveryService:
                },
            }

            # Apply global configurations
            project_data = self._apply_global_configurations(project_data)

            return Project(project_data)
        except Exception as e:
@@ -329,6 +332,9 @@ class ProjectDiscoveryService:
                },
            }

            # Apply global configurations
            project_data = self._apply_global_configurations(project_data)

            return Project(project_data)
        except Exception as e:
@@ -337,6 +343,54 @@ class ProjectDiscoveryService:
            )
            return None

    def _apply_global_configurations(
        self, project_data: Dict[str, Any]
    ) -> Dict[str, Any]:
        """Apply all global configurations to a new project"""
        global_settings = self.config.global_settings
        backup_options = self.config.backup_options

        # Apply global settings to schedule_config
        if "schedule_config" not in project_data:
            project_data["schedule_config"] = {}

        project_data["schedule_config"].update(
            {
                "schedule": global_settings.get("default_schedule", "daily"),
                "schedule_time": global_settings.get("default_schedule_time", "02:00"),
                "retry_delay_hours": global_settings.get("retry_delay_hours", 1),
                "backup_timeout_minutes": global_settings.get(
                    "backup_timeout_minutes", 0
                ),
                "min_backup_interval_minutes": global_settings.get(
                    "min_backup_interval_minutes", 10
                ),
                "min_free_space_mb": global_settings.get("min_free_space_mb", 100),
            }
        )

        # Apply backup options
        project_data["backup_options"] = {
            "compression_level": backup_options.get("compression_level", 6),
            "include_subdirectories": backup_options.get(
                "include_subdirectories", True
            ),
            "preserve_directory_structure": backup_options.get(
                "preserve_directory_structure", True
            ),
            "hash_algorithm": backup_options.get("hash_algorithm", "md5"),
            "hash_includes": backup_options.get("hash_includes", ["timestamp", "size"]),
            "backup_type": backup_options.get("backup_type", "complete"),
            "process_priority": backup_options.get("process_priority", "low"),
            "sequential_execution": backup_options.get("sequential_execution", True),
            "filename_format": backup_options.get(
                "filename_format", "HH-MM-SS_projects.zip"
            ),
            "date_format": backup_options.get("date_format", "YYYY-MM-DD"),
        }

        return project_data

    def _generate_project_id(self, path: str) -> str:
        """Generate a unique ID for a project based on its path"""
        # Use a hash of the path + timestamp to guarantee uniqueness

test_new_project_config.py (new file, 117 lines)

@@ -0,0 +1,117 @@
#!/usr/bin/env python3
"""
Test script to verify that new projects get global configurations applied
"""
import sys
import os

# Add src to path so we can import the modules
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "src"))

from models.config_model import Config
from services.project_discovery_service import ProjectDiscoveryService
from models.project_model import ProjectManager


def test_new_project_global_config():
    """Test that new projects get global configurations applied"""
    # Create a temporary config
    config = Config()

    # Create project manager
    project_manager = ProjectManager()

    # Create discovery service
    discovery_service = ProjectDiscoveryService(config, project_manager)

    # Test manual project creation
    print("Testing manual project creation...")
    obs_dir = {
        "path": "D:\\Test\\TestProject",
        "type": "manual",
        "description": "Test project",
    }

    # Create the manual project
    manual_project = discovery_service._create_manual_project(obs_dir)

    if manual_project:
        print("✅ Manual project created successfully")

        # Check if global configurations were applied
        project_dict = manual_project.to_dict()
        schedule_config = project_dict["schedule_config"]
        backup_options = project_dict.get("backup_options", {})

        print(f"Schedule config keys: {list(schedule_config.keys())}")
        print(f"Has backup_options: {'backup_options' in project_dict}")

        # Verify specific global settings
        print(f"Retry delay hours: {schedule_config.get('retry_delay_hours', 'MISSING')}")
        print(f"Backup timeout: {schedule_config.get('backup_timeout_minutes', 'MISSING')}")
        print(
            f"Min backup interval: {schedule_config.get('min_backup_interval_minutes', 'MISSING')}"
        )
        print(f"Min free space: {schedule_config.get('min_free_space_mb', 'MISSING')}")
        print(f"Compression level: {backup_options.get('compression_level', 'MISSING')}")
        print(
            f"Include subdirectories: {backup_options.get('include_subdirectories', 'MISSING')}"
        )

        # Check if all expected keys are present
        expected_schedule_keys = [
            "schedule",
            "schedule_time",
            "enabled",
            "next_scheduled_backup",
            "retry_delay_hours",
            "backup_timeout_minutes",
            "min_backup_interval_minutes",
            "min_free_space_mb",
        ]
        expected_backup_keys = [
            "compression_level",
            "include_subdirectories",
            "preserve_directory_structure",
            "hash_algorithm",
            "hash_includes",
            "backup_type",
            "process_priority",
            "sequential_execution",
            "filename_format",
            "date_format",
        ]

        missing_schedule_keys = [
            key for key in expected_schedule_keys if key not in schedule_config
        ]
        missing_backup_keys = [
            key for key in expected_backup_keys if key not in backup_options
        ]

        if not missing_schedule_keys and not missing_backup_keys:
            print("✅ ALL global configurations successfully applied to new project!")
        else:
            print(f"❌ Missing schedule keys: {missing_schedule_keys}")
            print(f"❌ Missing backup keys: {missing_backup_keys}")
    else:
        print("❌ Failed to create manual project")


if __name__ == "__main__":
    test_new_project_global_config()

test_projects_config.py (new file, 38 lines)

@@ -0,0 +1,38 @@
#!/usr/bin/env python3
"""
Test script to verify project configurations
"""
import json
from pathlib import Path


def main():
    projects_file = Path("projects.json")
    if projects_file.exists():
        with open(projects_file, "r", encoding="utf-8") as f:
            data = json.load(f)

        print(f'Existing projects: {len(data.get("projects", []))}')

        if data.get("projects"):
            project = data["projects"][0]
            print(f'First project: {project.get("name", "N/A")}')
            print(
                f'Keys in schedule_config: {list(project.get("schedule_config", {}).keys())}'
            )
            print(f'Has backup_options: {"backup_options" in project}')

            # Show current settings
            schedule_config = project.get("schedule_config", {})
            print(f"Current schedule config: {schedule_config}")

            backup_options = project.get("backup_options", {})
            print(f"Current backup options: {backup_options}")
        else:
            print("No projects in the file")
    else:
        print("projects.json not found")


if __name__ == "__main__":
    main()

test_s7p_project_config.py (new file, 116 lines)

@@ -0,0 +1,116 @@
#!/usr/bin/env python3
"""
Test script to verify that S7P projects also get global configurations applied
"""
import sys
import os

# Add src to path so we can import the modules
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "src"))

from models.config_model import Config
from services.project_discovery_service import ProjectDiscoveryService
from models.project_model import ProjectManager


def test_s7p_project_global_config():
    """Test that S7P projects get global configurations applied"""
    # Create a temporary config
    config = Config()

    # Create project manager
    project_manager = ProjectManager()

    # Create discovery service
    discovery_service = ProjectDiscoveryService(config, project_manager)

    # Test S7P project creation
    print("Testing S7P project creation...")
    obs_dir = {
        "path": "D:\\Test\\ObservationDir",
        "type": "siemens_s7",
        "description": "Test S7 observation directory",
    }

    # Simulate an S7P file path
    s7p_file_path = "D:\\Test\\ObservationDir\\TestProject\\TestProject.s7p"

    # Create the S7P project
    s7p_project = discovery_service._create_project_from_s7p(s7p_file_path, obs_dir)

    if s7p_project:
        print("✅ S7P project created successfully")

        # Check if global configurations were applied
        project_dict = s7p_project.to_dict()
        schedule_config = project_dict["schedule_config"]
        backup_options = project_dict.get("backup_options", {})

        print(f"Schedule config keys: {list(schedule_config.keys())}")
        print(f"Has backup_options: {'backup_options' in project_dict}")

        # Verify specific global settings
        print(f"Retry delay hours: {schedule_config.get('retry_delay_hours', 'MISSING')}")
        print(f"Backup timeout: {schedule_config.get('backup_timeout_minutes', 'MISSING')}")
        print(
            f"Min backup interval: {schedule_config.get('min_backup_interval_minutes', 'MISSING')}"
        )
        print(f"Min free space: {schedule_config.get('min_free_space_mb', 'MISSING')}")
        print(f"Compression level: {backup_options.get('compression_level', 'MISSING')}")
        print(f"Hash algorithm: {backup_options.get('hash_algorithm', 'MISSING')}")

        # Check if all expected keys are present
        expected_schedule_keys = [
            "schedule",
            "schedule_time",
            "enabled",
            "next_scheduled_backup",
            "retry_delay_hours",
            "backup_timeout_minutes",
            "min_backup_interval_minutes",
            "min_free_space_mb",
        ]
        expected_backup_keys = [
            "compression_level",
            "include_subdirectories",
            "preserve_directory_structure",
            "hash_algorithm",
            "hash_includes",
            "backup_type",
            "process_priority",
            "sequential_execution",
            "filename_format",
            "date_format",
        ]

        missing_schedule_keys = [
            key for key in expected_schedule_keys if key not in schedule_config
        ]
        missing_backup_keys = [
            key for key in expected_backup_keys if key not in backup_options
        ]

        if not missing_schedule_keys and not missing_backup_keys:
            print("✅ ALL global configurations successfully applied to S7P project!")
        else:
            print(f"❌ Missing schedule keys: {missing_schedule_keys}")
            print(f"❌ Missing backup keys: {missing_backup_keys}")
    else:
        print("❌ Failed to create S7P project")


if __name__ == "__main__":
    test_s7p_project_global_config()

update_projects_config.py (new file, 143 lines)

@@ -0,0 +1,143 @@
#!/usr/bin/env python3
"""
Script to update existing projects with global configurations
"""
import json
from pathlib import Path
from datetime import datetime, timezone


def apply_global_configurations_to_project(
    project_data, global_settings, backup_options
):
    """Apply global configurations to a project data dictionary"""
    # Apply global settings to schedule_config
    if "schedule_config" not in project_data:
        project_data["schedule_config"] = {}

    schedule_config = project_data["schedule_config"]

    # Add missing global settings (preserving existing values)
    schedule_config.setdefault(
        "retry_delay_hours", global_settings.get("retry_delay_hours", 1)
    )
    schedule_config.setdefault(
        "backup_timeout_minutes", global_settings.get("backup_timeout_minutes", 0)
    )
    schedule_config.setdefault(
        "min_backup_interval_minutes",
        global_settings.get("min_backup_interval_minutes", 10),
    )
    schedule_config.setdefault(
        "min_free_space_mb", global_settings.get("min_free_space_mb", 100)
    )

    # Apply backup options (only if not present)
    if "backup_options" not in project_data:
        project_data["backup_options"] = {
            "compression_level": backup_options.get("compression_level", 6),
            "include_subdirectories": backup_options.get(
                "include_subdirectories", True
            ),
            "preserve_directory_structure": backup_options.get(
                "preserve_directory_structure", True
            ),
            "hash_algorithm": backup_options.get("hash_algorithm", "md5"),
            "hash_includes": backup_options.get("hash_includes", ["timestamp", "size"]),
            "backup_type": backup_options.get("backup_type", "complete"),
            "process_priority": backup_options.get("process_priority", "low"),
            "sequential_execution": backup_options.get("sequential_execution", True),
            "filename_format": backup_options.get(
                "filename_format", "HH-MM-SS_projects.zip"
            ),
            "date_format": backup_options.get("date_format", "YYYY-MM-DD"),
        }

    return project_data


def main():
    # Load current config
    config_file = Path("config.json")
    if not config_file.exists():
        print("Error: config.json not found")
        return

    with open(config_file, "r", encoding="utf-8") as f:
        config_data = json.load(f)

    global_settings = config_data.get("global_settings", {})
    backup_options = config_data.get("backup_options", {})

    print("Loaded global configurations:")
    print(f"  - Global settings: {list(global_settings.keys())}")
    print(f"  - Backup options: {list(backup_options.keys())}")

    # Load current projects
    projects_file = Path("projects.json")
    if not projects_file.exists():
        print("Error: projects.json not found")
        return

    with open(projects_file, "r", encoding="utf-8") as f:
        projects_data = json.load(f)

    # Create the safety backup from the original, unmodified data
    # (written before any update so the backup reflects the previous state)
    backup_file = Path("projects.json.backup")
    with open(backup_file, "w", encoding="utf-8") as f:
        json.dump(projects_data, f, indent=2, ensure_ascii=False)

    projects = projects_data.get("projects", [])
    print(f"Processing {len(projects)} projects...")

    # Update each project
    updated_count = 0
    for project in projects:
        original_schedule_keys = len(project.get("schedule_config", {}))
        has_backup_options = "backup_options" in project

        # Apply global configurations
        project = apply_global_configurations_to_project(
            project, global_settings, backup_options
        )

        new_schedule_keys = len(project.get("schedule_config", {}))
        now_has_backup_options = "backup_options" in project

        if new_schedule_keys > original_schedule_keys or (
            not has_backup_options and now_has_backup_options
        ):
            updated_count += 1

    # Update metadata
    projects_data["metadata"]["last_updated"] = datetime.now(timezone.utc).isoformat()
    projects_data["metadata"]["auto_config_update"] = datetime.now(
        timezone.utc
    ).isoformat()

    # Save updated file
    with open(projects_file, "w", encoding="utf-8") as f:
        json.dump(projects_data, f, indent=2, ensure_ascii=False)

    print("✅ Update complete:")
    print(f"  - Projects updated: {updated_count}/{len(projects)}")
    print(f"  - Backup created at: {backup_file}")
    print(f"  - File updated: {projects_file}")

    # Verify first project
    if projects:
        first_project = projects[0]
        print(f"\nVerification of first project '{first_project.get('name', 'N/A')}':")
        print(
            f"  - Schedule config keys: {list(first_project.get('schedule_config', {}).keys())}"
        )
        print(f"  - Has backup_options: {'backup_options' in first_project}")


if __name__ == "__main__":
    main()