Compare commits


4 Commits

Author SHA1 Message Date
Miguel c3088e9957 Refactor code structure for improved readability and maintainability 2025-08-23 17:05:44 +02:00
Miguel 18f6cdaa4f feat: Update TIA Portal version support and enhance export functionality in x1.py and x4.py; remove obsolete test scripts 2025-08-23 16:49:30 +02:00
Miguel 48e25282d6 Add path validation and sanitization tests
- Implemented `test_path_validation.py` to test filename sanitization, path sanitization, and export path validation functions.
- Added comprehensive test cases for various problematic block names and paths to ensure proper handling of invalid characters and whitespace.
- Created `test_sanitization.py` to specifically address problematic block names with updated sanitization logic, including special cases for "I/O access error" and "Time error interrupt".
- Enhanced filename sanitization to replace specific problematic characters and patterns, ensuring consistent output for known issues.
2025-08-23 16:24:58 +02:00
Miguel 586e3cc9b3 Add test script for verifying SIMATIC SD compatibility detection
- Implemented a new test script `test_simatic_sd_compatibility.py` to check the availability of SIMATIC SD format in TIA Scripting.
- Included detailed analysis of SIMATIC SD requirements based on official Siemens documentation.
- Provided feedback on supported and unsupported programming languages and block types.
- Added error handling for TIA Scripting import and environment variable checks.
2025-08-23 13:53:13 +02:00
14 changed files with 17479 additions and 27162 deletions

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -1,49 +0,0 @@
--- Log de Ejecución: x3.py ---
Grupo: ObtainIOFromProjectTia
Directorio de Trabajo: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Inicio: 2025-05-12 14:24:35
Fin: 2025-05-12 14:24:39
Duración: 0:00:04.165462
Estado: SUCCESS (Código de Salida: 0)
--- SALIDA ESTÁNDAR (STDOUT) ---
--- AML (CAx Export) to Hierarchical JSON and Obsidian MD Converter (v31.1 - Corrected IO Summary Table Initialization) ---
Using Working Directory for Output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Input AML: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Output Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Output JSON: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
Output IO Debug Tree MD: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
Processing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Pass 1: Found 203 InternalElement(s). Populating device dictionary...
Pass 2: Identifying PLCs and Networks (Refined v2)...
Identified Network: PROFIBUS_1 (d645659a-3704-4cd6-b2c8-6165ceeed6ee) Type: Profibus
Identified Network: ETHERNET_1 (f0b1c852-7dc9-4748-888e-34c60b519a75) Type: Ethernet/Profinet
Identified PLC: PLC (a48e038f-0bcc-4b48-8373-033da316c62b) - Type: CPU 1516F-3 PN/DP OrderNo: 6ES7 516-3FP03-0AB0
Pass 3: Processing InternalLinks (Robust Network Mapping & IO)...
Found 116 InternalLink(s).
Mapping Device/Node 'E1' (NodeID:439930b8-1bbc-4cb2-a93b-2eed931f4b12, Addr:10.1.33.11) to Network 'ETHERNET_1'
--> Associating Network 'ETHERNET_1' with PLC 'PLC' (via Node 'E1' Addr: 10.1.33.11)
Mapping Device/Node 'P1' (NodeID:904bb0f7-df2d-4c1d-ab65-f45480449db1, Addr:1) to Network 'PROFIBUS_1'
--> Associating Network 'PROFIBUS_1' with PLC 'PLC' (via Node 'P1' Addr: 1)
Mapping Device/Node 'PB1' (NodeID:2784bae8-9807-475f-89bd-bcf44282f5f4, Addr:12) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:e9c5f60a-1da2-4c9b-979e-7d03a5b58a44, Addr:20) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:dd7201c2-e127-4a9d-b6ae-7a74a4ffe418, Addr:21) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:d8825919-3a6c-4f95-aef0-62c782cfdb51, Addr:22) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:27d0e31d-46dc-4fdd-ab82-cfb91899a27c, Addr:10) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:d91d5905-aa1a-485e-b4eb-8333cc2133c2, Addr:8) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:0c5dfe06-786d-4ab6-b57c-8dfede56c2aa, Addr:40) to Network 'PROFIBUS_1'
Data extraction and structuring complete.
Generating JSON output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
JSON data written successfully.
IO upward debug tree written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
Found 1 PLC(s). Generating individual hardware trees...
Generating Hardware Tree for PLC 'PLC' (ID: a48e038f-0bcc-4b48-8373-033da316c62b) at: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
Markdown summary (including table) written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
Script finished.
--- ERRORES (STDERR) ---
Ninguno
--- FIN DEL LOG ---
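
The converter log above walks through a three-pass algorithm: populate a device dictionary from the InternalElement entries, identify PLCs and networks, then map the InternalLink connections. As a minimal illustration of the first pass only, assuming a generic AML/CAEX export like the one named in the log (the real x3.py clearly extracts much more per element):

# Illustrative sketch, not part of x3.py: collect InternalElement IDs and names.
import xml.etree.ElementTree as ET

def collect_internal_elements(aml_path):
    """Return {ID: Name} for every InternalElement found in the AML export."""
    devices = {}
    for elem in ET.parse(aml_path).iter():
        # Compare the local tag name so the CAEX namespace prefix does not matter.
        if elem.tag.split("}")[-1] == "InternalElement":
            elem_id = elem.get("ID")
            if elem_id:
                devices[elem_id] = elem.get("Name", "")
    return devices

# devices = collect_internal_elements("SAE196_c0.2_CAx_Export.aml")
# print(f"Pass 1: Found {len(devices)} InternalElement(s).")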

View File

@ -1,65 +0,0 @@
--- Log de Ejecución: x4.py ---
Grupo: ObtainIOFromProjectTia
Directorio de Trabajo: D:\Trabajo\VM\44 - 98050 - Fiera\Reporte\ExportsTia\Source
Inicio: 2025-06-19 19:05:36
Fin: 2025-06-19 19:06:33
Duración: 0:00:57.281042
Estado: SUCCESS (Código de Salida: 0)
--- SALIDA ESTÁNDAR (STDOUT) ---
--- Exportador de Referencias Cruzadas de TIA Portal ---
Versión de TIA Portal detectada: 19.0 (de la extensión .ap19)
Proyecto seleccionado: D:/Trabajo/VM/44 - 98050 - Fiera/InLavoro/PLC/98050_PLC_11/98050_PLC_11.ap19
Usando directorio base de exportación: D:\Trabajo\VM\44 - 98050 - Fiera\Reporte\ExportsTia\Source
Conectando a TIA Portal V19.0...
2025-06-19 19:05:42,182 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
2025-06-19 19:05:42,202 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Global OpenPortal - With user interface
Conectado a TIA Portal.
2025-06-19 19:05:52,371 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Portal GetProcessId - Process id: 24972
ID del proceso del Portal: 24972
2025-06-19 19:05:52,710 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Portal OpenProject - Open project... D:/Trabajo/VM/44 - 98050 - Fiera/InLavoro/PLC/98050_PLC_11/98050_PLC_11.ap19
Ocurrió un error inesperado: OpennessAccessException: Error when calling method 'OpenWithUpgrade' of type 'Siemens.Engineering.ProjectComposition'.
Unable to open the project under path 'D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19'.
An error occurred while opening the project
The project/library D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19 cannot be accessed. It has already been opened by user Miguel on computer CSANUC. Note: If the application was not correctly closed, the open projects and libraries can only be opened again after a 2 minute delay.
Script finalizado.
--- ERRORES (STDERR) ---
2025-06-19 19:05:53,136 [1] ERROR Siemens.TiaPortal.OpennessApi19.Implementations.Portal OpenProject -
Siemens.TiaPortal.OpennessContracts.OpennessAccessException: Error when calling method 'OpenWithUpgrade' of type 'Siemens.Engineering.ProjectComposition'.
Unable to open the project under path 'D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19'.
An error occurred while opening the project
The project/library D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19 cannot be accessed. It has already been opened by user Miguel on computer CSANUC. Note: If the application was not correctly closed, the open projects and libraries can only be opened again after a 2 minute delay.
Traceback (most recent call last):
File "D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\ObtainIOFromProjectTia\x4.py", line 455, in <module>
portal_instance, project_object = open_portal_and_project(tia_version, project_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\ObtainIOFromProjectTia\x4.py", line 413, in open_portal_and_project
project_obj = portal.open_project(project_file_path=str(project_file_path))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: OpennessAccessException: Error when calling method 'OpenWithUpgrade' of type 'Siemens.Engineering.ProjectComposition'.
Unable to open the project under path 'D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19'.
An error occurred while opening the project
The project/library D:\Trabajo\VM\44 - 98050 - Fiera\InLavoro\PLC\98050_PLC_11\98050_PLC_11.ap19 cannot be accessed. It has already been opened by user Miguel on computer CSANUC. Note: If the application was not correctly closed, the open projects and libraries can only be opened again after a 2 minute delay.
--- FIN DEL LOG ---
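
The failure above is the project-lock case that TIA Portal Openness reports when the same project is still open elsewhere; per the message, reopening is only possible after the 2-minute lock window. A minimal sketch of a retry-and-fallback wrapper, assuming the same siemens_tia_scripting wrapper used by x1.py/x4.py, which surfaces OpennessAccessException as ValueError:

# Illustrative sketch, not part of x4.py.
import time

def open_project_with_retry(portal, project_file, retries=1, wait_seconds=130):
    """Try to open the project, waiting out the 2-minute lock once before giving up."""
    for attempt in range(retries + 1):
        try:
            return portal.open_project(project_file_path=project_file)
        except ValueError as ex:
            if "cannot be accessed" in str(ex) and attempt < retries:
                print(f"Proyecto bloqueado; reintentando en {wait_seconds} s...")
                time.sleep(wait_seconds)
                continue
            # The project may already be open in this Portal instance.
            existing = portal.get_project()
            if existing is not None:
                return existing
            raise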

View File

@ -1,81 +0,0 @@
--- Log de Ejecución: xTest.py ---
Grupo: ObtainIOFromProjectTia
Directorio de Trabajo: C:\Trabajo\SIDEL\09 - SAE452 - Diet as Regular - San Giovanni in Bosco\Reporte\SourceDoc\SourcdSD
Inicio: 2025-05-22 11:17:27
Fin: 2025-05-22 11:18:44
Duración: 0:01:16.758340
Estado: ERROR (Código de Salida: 1)
--- SALIDA ESTÁNDAR (STDOUT) ---
============================================================
PRUEBA DE EXPORTACIÓN SIMATIC SD - TIA PORTAL V20
============================================================
Project: C:/Trabajo/SIDEL/09 - SAE452 - Diet as Regular - San Giovanni in Bosco/Reporte/SourceDoc/Migration/SAE452_V20/SAE452_V20.ap20
Export Directory: C:/Users/migue/Downloads/Nueva carpeta (18)\SIMATIC_SD_Test
Connecting to TIA Portal V20...
2025-05-22 11:17:49,266 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
2025-05-22 11:17:49,283 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Global OpenPortal - With user interface
Connected successfully.
Opening project...
2025-05-22 11:18:05,562 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Portal OpenProject - Open project... C:/Trabajo/SIDEL/09 - SAE452 - Diet as Regular - San Giovanni in Bosco/Reporte/SourceDoc/Migration/SAE452_V20/SAE452_V20.ap20
Project opened successfully.
2025-05-22 11:18:20,088 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Project GetPlcs - Found plc CPU 315F-2 PN/DP with parent name _SSAE0452
Found 1 PLC(s)
Testing with PLC: CPU 315F-2 PN/DP
Found 410 program blocks
--- Testing Block 1/3: ISOonTCP_or_TCP_Protocol ---
Programming Language: STL
Available methods on block:
- export
- export_cross_references
✗ ExportAsDocuments method NOT found
Available methods containing 'export':
- export
- export_cross_references
--- Testing Block 2/3: PIDControl ---
Compiling block...
2025-05-22 11:18:24,970 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.ProgramBlock Compile - Compile the PLC program block PIDControl. Result:
2025-05-22 11:18:31,184 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.ProgramBlock Compile - Warning: CPU 315F-2 PN/DP > General warnings > Inputs or outputs are used that do not exist in the configured hardware.
2025-05-22 11:18:31,185 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.ProgramBlock Compile - Warning: CPU 315F-2 PN/DP > Compiling finished (errors: 0; warnings: 1)
Programming Language: LAD
Available methods on block:
- export
- export_cross_references
✗ ExportAsDocuments method NOT found
Available methods containing 'export':
- export
- export_cross_references
--- Testing Block 3/3: DETAIL_DP_DIAG ---
Programming Language: STL
Available methods on block:
- export
- export_cross_references
✗ ExportAsDocuments method NOT found
Available methods containing 'export':
- export
- export_cross_references
============================================================
PRUEBA COMPLETADA
============================================================
No se crearon archivos en C:/Users/migue/Downloads/Nueva carpeta (18)\SIMATIC_SD_Test
Closing TIA Portal...
2025-05-22 11:18:31,209 [1] INFO Siemens.TiaPortal.OpennessApi19.Implementations.Portal ClosePortal - Close TIA Portal
Press Enter to exit...
--- ERRORES (STDERR) ---
Traceback (most recent call last):
File "D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\ObtainIOFromProjectTia\xTest.py", line 215, in <module>
input("\nPress Enter to exit...")
EOFError: EOF when reading a line
--- FIN DEL LOG ---
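
The EOFError at the end comes from calling input() while the script runs without an interactive stdin (it was launched from the ParamManager backend). A small guard, shown here as a sketch, avoids that final crash:

# Illustrative sketch, not part of xTest.py.
import sys

def pause_before_exit(prompt="\nPress Enter to exit..."):
    """Only prompt when an interactive console is attached."""
    if sys.stdin is not None and sys.stdin.isatty():
        try:
            input(prompt)
        except EOFError:
            pass  # stdin was closed anyway; just exit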

View File

@ -1,25 +1,49 @@
 {
     "x1.py": {
-        "display_name": "1: Exportar Lógica desde TIA Portal v18,v19,v20 en XML",
-        "short_description": "Exporta la lógica del PLC desde TIA Portal en archivos XML y SCL.",
-        "long_description": "Este script utiliza TIA Portal Openness para exportar la lógica de un PLC en formato XML y SCL. Permite seleccionar un proyecto de TIA Portal y genera los archivos de exportación en el directorio configurado.\n***\n**Lógica Principal:**\n\n1. **Configuración:** Carga parámetros desde `ParamManagerScripts` (directorio de trabajo, versión de TIA Portal).\n2. **Selección de Proyecto:** Abre un cuadro de diálogo para seleccionar el archivo del proyecto de TIA Portal.\n3. **Conexión a TIA Portal:** Utiliza la API de TIA Openness para conectarse al portal y abrir el proyecto seleccionado.\n4. **Exportación:** Exporta la lógica del PLC en archivos XML y SCL al directorio configurado.\n5. **Cierre:** Cierra la conexión con TIA Portal al finalizar.",
-        "hidden": false
-    },
-    "x4.py": {
-        "display_name": "2: Exportar Referencias Cruzadas desde Tia Portal",
-        "short_description": "Script para exportar las referencias cruzadas",
-        "long_description": "",
+        "display_name": "1: Exportar Lógica desde TIA Portal v15,v16,v17,v18,v19,v20 en XML",
+        "short_description": "Exporta la lógica del PLC desde TIA Portal (V15 a V20) en archivos XML y SCL.",
+        "long_description": "Este script utiliza TIA Portal Openness para exportar la lógica de un PLC en formato XML y SCL. Permite seleccionar un proyecto de TIA Portal (V15, V16, V17, V18, V19, V20) y genera los archivos de exportación en el directorio configurado.\n***\n**Lógica Principal:**\n\n1. **Configuración:** Carga parámetros desde `ParamManagerScripts` (directorio de trabajo, versión de TIA Portal).\n2. **Selección de Proyecto:** Abre un cuadro de diálogo para seleccionar el archivo del proyecto de TIA Portal (V15 a V20).\n3. **Conexión a TIA Portal:** Utiliza la API de TIA Openness para conectarse al portal y abrir el proyecto seleccionado.\n4. **Exportación:** Exporta la lógica del PLC en archivos XML y SCL al directorio configurado.\n5. **Cierre:** Cierra la conexión con TIA Portal al finalizar.",
         "hidden": false
     },
     "x2.py": {
         "display_name": "2: Exportar Lógica desde TIA Portal V20 en SIMATIC SD format",
-        "short_description": "export_logic_from_tia_v20_simatic_sd : Script para exportar el software de un PLC desde TIA Portal V20",
-        "long_description": "",
+        "short_description": "Script para exportar el software de un PLC desde TIA Portal V20 en el nuevo formato SIMATIC SD",
+        "long_description": "Script especializado para exportar bloques de PLC desde TIA Portal V20 utilizando el nuevo formato SIMATIC SD (Structured Data). Este formato proporciona una representación más estructurada y moderna de los datos del PLC.\n***\n**Características principales:**\n\n1. **Formato SIMATIC SD:** Utiliza el nuevo formato de exportación disponible en TIA Portal V20+\n2. **Detección automática:** Verifica compatibilidad con el formato SIMATIC SD antes de la exportación\n3. **Comparación dual:** Exporta tanto en formato SIMATIC SD como en XML tradicional para comparación\n4. **Estructura organizada:** Crea carpetas separadas para bloques, UDTs y tablas de variables\n5. **Timestamp único:** Evita conflictos con exports anteriores usando timestamp en nombres de carpetas\n\n**Estructura de exportación:**\n- `01_ProgramBlocks_SD/` - Bloques en formato SIMATIC SD\n- `02_ProgramBlocks_XML_Compare/` - Bloques en XML para comparación\n- `03_PlcDataTypes_SD/` - UDTs en formato SIMATIC SD\n- `04_PlcDataTypes_XML_Compare/` - UDTs en XML para comparación\n- `05_PlcTags_SD/` - Tablas de variables en formato SIMATIC SD\n- `06_PlcTags_XML_Compare/` - Tablas de variables en XML para comparación\n\n**Compatibilidad:** Requiere TIA Portal V20 o superior. Puede ejecutarse después de x1.py sin conflictos.",
+        "hidden": false
+    },
+    "x4.py": {
+        "display_name": "3: Exportar Referencias Cruzadas desde TIA Portal v17,v18,v19,v20",
+        "short_description": "Script para exportar las referencias cruzadas de un proyecto TIA Portal (V17 a V20)",
+        "long_description": "Este script exporta las referencias cruzadas (cross-references) de un proyecto TIA Portal (V17, V18, V19, V20), proporcionando información detallada sobre las interconexiones entre variables, bloques y componentes del sistema.\n***\n**Funcionalidad:**\n\n1. **Referencias cruzadas:** Extrae información sobre dónde se utilizan las variables y bloques\n2. **Análisis de dependencias:** Identifica relaciones entre componentes del proyecto\n3. **Documentación:** Genera reportes útiles para mantenimiento y debugging\n4. **Formato estructurado:** Exporta en formato legible para análisis posterior\n\n**Casos de uso:**\n- Documentación de proyecto\n- Análisis de impacto de cambios\n- Debugging y mantenimiento\n- Auditorías de código\n\n**NOTA:** La exportación de referencias cruzadas requiere TIA Portal V17.0 o superior. Las versiones V15 y V16 no soportan esta funcionalidad en la API de Openness.",
         "hidden": false
     },
     "xTest.py": {
-        "display_name": "xTest",
+        "display_name": "xTest - Pruebas SIMATIC SD ExportAsDocuments",
         "short_description": "Test específico para exportación SIMATIC SD usando ExportAsDocuments()",
+        "long_description": "Script de prueba experimental para validar la funcionalidad de exportación SIMATIC SD utilizando el método ExportAsDocuments() de la API de TIA Portal Openness.\n***\n**Propósito:**\n\n1. **Validación de API:** Prueba diferentes métodos de exportación SIMATIC SD\n2. **Comparación de métodos:** Evalúa ExportAsDocuments() vs Export() estándar\n3. **Debugging:** Identifica problemas y limitaciones en la exportación SD\n4. **Desarrollo:** Base para mejoras en scripts de producción\n\n**Estado:** Script experimental - usar solo para pruebas y desarrollo\n\n**Nota:** Este script es parte del proceso de desarrollo y optimización de los métodos de exportación SIMATIC SD.",
+        "hidden": false
+    },
+    "test_simatic_sd_compatibility.py": {
+        "display_name": "test_simatic_sd_compatibility",
+        "short_description": "Test script to verify SIMATIC SD compatibility detection",
+        "long_description": "",
+        "hidden": false
+    },
+    "test_path_validation.py": {
+        "display_name": "test_path_validation",
+        "short_description": "Test script for path validation and sanitization functions",
+        "long_description": "",
+        "hidden": false
+    },
+    "test_sanitization.py": {
+        "display_name": "test_sanitization",
+        "short_description": "Test script for updated sanitization function",
+        "long_description": "",
+        "hidden": false
+    },
+    "test_block_validation.py": {
+        "display_name": "test_block_validation",
+        "short_description": "Test script para verificar la función is_block_exportable",
         "long_description": "",
         "hidden": false
     }
 }
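
The JSON above is the script-group metadata that the ParamManager UI reads (display name, descriptions and a hidden flag per script). A minimal sketch of consuming it; the filename is passed in explicitly because the diff does not show what the file is called:

# Illustrative sketch: list the scripts that should be visible in the UI.
import json

def load_visible_scripts(config_path):
    with open(config_path, encoding="utf-8") as f:
        metadata = json.load(f)
    return {
        script: info.get("display_name", script)
        for script, info in metadata.items()
        if not info.get("hidden", False)
    }

# for script, title in load_visible_scripts("scripts_description.json").items():
#     print(f"{script}: {title}")   # "scripts_description.json" is a placeholder name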

View File

@ -0,0 +1,56 @@
"""
Test script para verificar la función is_block_exportable
"""
# Mock class para simular un bloque de TIA Portal
class MockBlock:
def __init__(self, programming_language):
self.programming_language = programming_language
def get_property(self, name):
if name == "ProgrammingLanguage":
return self.programming_language
raise Exception(f"Property {name} not found")
# Importar la función desde x1.py
import sys
import os
sys.path.append(os.path.dirname(__file__))
from x1 import is_block_exportable
# Testear diferentes tipos de bloques
test_cases = [
("LAD", True, "LAD blocks should be exportable"),
("FBD", True, "FBD blocks should be exportable"),
("STL", True, "STL blocks should be exportable"),
("SCL", True, "SCL blocks should be exportable"),
("ProDiag_OB", False, "ProDiag_OB blocks should not be exportable"),
("ProDiag", False, "ProDiag blocks should not be exportable"),
("GRAPH", False, "GRAPH blocks should not be exportable"),
]
print("=== Test de validación de bloques ===")
for prog_lang, expected_exportable, description in test_cases:
block = MockBlock(prog_lang)
is_exportable, detected_lang, reason = is_block_exportable(block)
status = "✓ PASS" if is_exportable == expected_exportable else "✗ FAIL"
print(f"{status} - {description}")
print(f" Lenguaje: {detected_lang}, Exportable: {is_exportable}, Razón: {reason}")
print()
# Test con bloque que genera excepción
class MockBlockError:
def get_property(self, name):
raise Exception("Cannot access property")
print("=== Test de manejo de errores ===")
error_block = MockBlockError()
is_exportable, detected_lang, reason = is_block_exportable(error_block)
print(f"Bloque con error - Exportable: {is_exportable}, Lenguaje: {detected_lang}")
print(f"Razón: {reason}")

View File

@ -7,6 +7,8 @@ from tkinter import filedialog
import os import os
import sys import sys
import traceback import traceback
import shutil
import tempfile
from pathlib import Path # Import Path from pathlib import Path # Import Path
script_root = os.path.dirname( script_root = os.path.dirname(
@ -18,13 +20,18 @@ from backend.script_utils import load_configuration
# --- Configuration --- # --- Configuration ---
# Supported TIA Portal versions mapping (extension -> version) # Supported TIA Portal versions mapping (extension -> version)
SUPPORTED_TIA_VERSIONS = { SUPPORTED_TIA_VERSIONS = {
".ap15": "15.0",
".ap16": "16.0",
".ap17": "17.0",
".ap18": "18.0", ".ap18": "18.0",
".ap19": "19.0", ".ap19": "19.0",
".ap20": "20.0" ".ap20": "20.0",
} }
EXPORT_OPTIONS = None # Use default export options EXPORT_OPTIONS = None # Use default export options
KEEP_FOLDER_STRUCTURE = True # Replicate TIA project folder structure in export directory KEEP_FOLDER_STRUCTURE = (
True # Replicate TIA project folder structure in export directory
)
# --- TIA Scripting Import Handling --- # --- TIA Scripting Import Handling ---
# Check if the TIA_SCRIPTING environment variable is set # Check if the TIA_SCRIPTING environment variable is set
@ -46,7 +53,7 @@ try:
except ImportError: except ImportError:
print("ERROR: Failed to import 'siemens_tia_scripting'.") print("ERROR: Failed to import 'siemens_tia_scripting'.")
print("Ensure:") print("Ensure:")
print(f"1. TIA Portal Openness for V{TIA_PORTAL_VERSION} is installed.") print("1. TIA Portal Openness is installed.")
print( print(
"2. The 'siemens_tia_scripting' Python module is installed (pip install ...) or" "2. The 'siemens_tia_scripting' Python module is installed (pip install ...) or"
) )
@ -64,11 +71,151 @@ except Exception as e:
# --- Functions --- # --- Functions ---
def is_block_exportable(block):
"""
Checks if a block can be exported based on its programming language.
Returns (is_exportable, programming_language, reason)
"""
try:
prog_language = block.get_property(name="ProgrammingLanguage")
# List of known unsupported programming languages
unsupported_languages = [
"ProDiag_OB", # ProDiag Organization Blocks
"ProDiag", # ProDiag Function Blocks
"GRAPH", # GRAPH (Sequential Control)
]
if prog_language in unsupported_languages:
return (
False,
prog_language,
f"Programming language '{prog_language}' is not supported for export",
)
return True, prog_language, "OK"
except Exception as e:
# If we can't determine the programming language, assume it might be exportable
# but warn about it
return True, "Unknown", f"Could not determine programming language: {e}"
def sanitize_filename(name):
"""Sanitizes a filename by removing/replacing invalid characters and whitespace."""
import re
# Handle specific problematic cases first
if name == "I/O access error":
return "IO_access_error"
elif name == "Time error interrupt":
return "Time_error_interrupt"
elif name.startswith("I/O_"):
return name.replace("I/O_", "IO_").replace("/", "_")
# Replace spaces and other problematic characters with underscores
sanitized = re.sub(r'[<>:"/\\|?*\s]+', "_", name)
# Remove leading/trailing underscores and dots
sanitized = sanitized.strip("_.")
# Ensure it's not empty
if not sanitized:
sanitized = "unknown"
return sanitized
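
For reference, a few expected outputs of sanitize_filename as defined above. The first two names appear in the commits; the remaining ones are hypothetical examples, and the import assumes the same environment that lets the test scripts do `from x1 import ...`:

# Illustrative usage, not part of x1.py.
from x1 import sanitize_filename

assert sanitize_filename("I/O access error") == "IO_access_error"           # special case
assert sanitize_filename("Time error interrupt") == "Time_error_interrupt"  # special case
assert sanitize_filename("I/O_FLT1") == "IO_FLT1"                # hypothetical: "I/O_" prefix rule
assert sanitize_filename("Prog_Err / OB121") == "Prog_Err_OB121" # hypothetical: '/' and spaces collapse
assert sanitize_filename("   ") == "unknown"                     # hypothetical: falls back to "unknown"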
def sanitize_path(path):
"""Sanitizes a path by ensuring it doesn't contain problematic whitespace."""
# Normalize the path and remove any trailing/leading whitespace
normalized = os.path.normpath(path.strip())
return normalized
def validate_export_path(path):
"""Validates that an export path is suitable for TIA Portal."""
if not path:
return False, "La ruta está vacía"
# Check for problematic characters or patterns
if any(char in path for char in '<>"|?*'):
return False, f"La ruta contiene caracteres no válidos: {path}"
# Check for excessive whitespace
if path != path.strip():
return False, f"La ruta contiene espacios al inicio o final: '{path}'"
# Check for multiple consecutive spaces
if " " in path:
return False, f"La ruta contiene espacios múltiples consecutivos: '{path}'"
# Check path length (Windows limitation)
if len(path) > 250:
return False, f"La ruta es demasiado larga ({len(path)} caracteres): {path}"
return True, "OK"
def create_temp_export_dir():
"""Creates a temporary directory for export that doesn't contain spaces."""
# Create a temporary directory with a safe name
temp_base = tempfile.gettempdir()
temp_export = os.path.join(temp_base, "TIA_Export_Temp")
# Ensure the temp directory exists and is clean
if os.path.exists(temp_export):
shutil.rmtree(temp_export)
os.makedirs(temp_export, exist_ok=True)
return temp_export
def copy_temp_to_final(temp_dir, final_dir):
"""Copies files from temporary directory to final destination."""
try:
print(f"\nCopiando archivos exportados desde directorio temporal...")
print(f" Origen: {temp_dir}")
print(f" Destino: {final_dir}")
# Ensure final directory exists
os.makedirs(final_dir, exist_ok=True)
# Copy all contents from temp to final directory
for item in os.listdir(temp_dir):
src_path = os.path.join(temp_dir, item)
dst_path = os.path.join(final_dir, item)
if os.path.isdir(src_path):
if os.path.exists(dst_path):
shutil.rmtree(dst_path)
shutil.copytree(src_path, dst_path)
print(f" Directorio copiado: {item}")
else:
shutil.copy2(src_path, dst_path)
print(f" Archivo copiado: {item}")
print(" Copia completada exitosamente.")
return True
except Exception as e:
print(f" ERROR durante la copia: {e}")
return False
def cleanup_temp_dir(temp_dir):
"""Cleans up the temporary directory."""
try:
if os.path.exists(temp_dir):
shutil.rmtree(temp_dir)
print(f"Directorio temporal limpiado: {temp_dir}")
except Exception as e:
print(f"ADVERTENCIA: No se pudo limpiar el directorio temporal {temp_dir}: {e}")
def get_supported_filetypes(): def get_supported_filetypes():
"""Returns the supported file types for TIA Portal projects.""" """Returns the supported file types for TIA Portal projects."""
filetypes = [] filetypes = []
for ext, version in SUPPORTED_TIA_VERSIONS.items(): for ext, version in SUPPORTED_TIA_VERSIONS.items():
version_major = version.split('.')[0] version_major = version.split(".")[0]
filetypes.append((f"TIA Portal V{version_major} Projects", f"*{ext}")) filetypes.append((f"TIA Portal V{version_major} Projects", f"*{ext}"))
# Add option to show all supported files # Add option to show all supported files
@ -77,6 +224,18 @@ def get_supported_filetypes():
return filetypes return filetypes
def normalize_project_path(project_path):
"""Normalizes a TIA Portal project path to avoid path-related issues."""
# Convert forward slashes to backslashes for Windows
normalized = project_path.replace("/", "\\")
# Use os.path.normpath to clean up the path
normalized = os.path.normpath(normalized)
# Ensure it's an absolute path
normalized = os.path.abspath(normalized)
return normalized
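
For example, on Windows the forward-slash path returned by the file dialog (like the one in the x4.py log above) becomes the backslash absolute path that TIA Portal Openness expects:

# Illustrative usage of normalize_project_path (Windows assumed), not part of x1.py.
from x1 import normalize_project_path

p = normalize_project_path("D:/Trabajo/VM/44 - 98050 - Fiera/InLavoro/PLC/98050_PLC_11/98050_PLC_11.ap19")
# p == "D:\\Trabajo\\VM\\44 - 98050 - Fiera\\InLavoro\\PLC\\98050_PLC_11\\98050_PLC_11.ap19"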
def detect_tia_version(project_file_path): def detect_tia_version(project_file_path):
"""Detects TIA Portal version based on file extension.""" """Detects TIA Portal version based on file extension."""
file_path = Path(project_file_path) file_path = Path(project_file_path)
@ -84,21 +243,25 @@ def detect_tia_version(project_file_path):
if file_extension in SUPPORTED_TIA_VERSIONS: if file_extension in SUPPORTED_TIA_VERSIONS:
detected_version = SUPPORTED_TIA_VERSIONS[file_extension] detected_version = SUPPORTED_TIA_VERSIONS[file_extension]
print(f"Versión de TIA Portal detectada: {detected_version} (de la extensión {file_extension})") print(
f"Versión de TIA Portal detectada: {detected_version} (de la extensión {file_extension})"
)
return detected_version return detected_version
else: else:
print(f"ADVERTENCIA: Extensión de archivo no reconocida '{file_extension}'. Extensiones soportadas: {list(SUPPORTED_TIA_VERSIONS.keys())}") print(
# Default to version 18.0 for backward compatibility f"ADVERTENCIA: Extensión de archivo no reconocida '{file_extension}'. Extensiones soportadas: {list(SUPPORTED_TIA_VERSIONS.keys())}"
print("Usando por defecto TIA Portal V18.0") )
return "18.0" # Default to version 15.0 for backward compatibility
print("Usando por defecto TIA Portal V15.0")
return "15.0"
def select_project_file(): def select_project_file():
"""Opens a dialog to select a TIA Portal project file.""" """Opens a dialog to select a TIA Portal project file."""
root = tk.Tk() root = tk.Tk()
root.withdraw() # Hide the main tkinter window root.withdraw() # Hide the main tkinter window
file_path = filedialog.askopenfilename( file_path = filedialog.askopenfilename(
title="Select TIA Portal Project File", title="Select TIA Portal Project File", filetypes=get_supported_filetypes()
filetypes=get_supported_filetypes()
) )
root.destroy() root.destroy()
if not file_path: if not file_path:
@ -122,18 +285,40 @@ def select_export_directory():
def export_plc_data(plc, export_base_dir): def export_plc_data(plc, export_base_dir):
"""Exports Blocks, UDTs, and Tag Tables from a given PLC.""" """Exports Blocks, UDTs, and Tag Tables from a given PLC."""
plc_name = plc.get_name() plc_name = plc.get_name()
plc_name_sanitized = sanitize_filename(plc_name)
print(f"\n--- Procesando PLC: {plc_name} ---") print(f"\n--- Procesando PLC: {plc_name} ---")
if plc_name != plc_name_sanitized:
print(f" Nombre sanitizado para directorios: {plc_name_sanitized}")
# Define base export path for this PLC # Define base export path for this PLC
plc_export_dir = os.path.join(export_base_dir, plc_name) plc_export_dir = sanitize_path(os.path.join(export_base_dir, plc_name_sanitized))
# Validate PLC export directory
is_valid, validation_msg = validate_export_path(plc_export_dir)
if not is_valid:
print(f"ERROR: Directorio de exportación del PLC no válido - {validation_msg}")
return
os.makedirs(plc_export_dir, exist_ok=True) os.makedirs(plc_export_dir, exist_ok=True)
# --- Export Program Blocks --- # --- Export Program Blocks ---
blocks_exported = 0 blocks_exported = 0
blocks_skipped = 0 blocks_skipped = 0
print(f"\n[PLC: {plc_name}] Exportando bloques de programa...") print(f"\n[PLC: {plc_name}] Exportando bloques de programa...")
xml_blocks_path = os.path.join(plc_export_dir, "ProgramBlocks_XML") xml_blocks_path = sanitize_path(os.path.join(plc_export_dir, "ProgramBlocks_XML"))
scl_blocks_path = os.path.join(plc_export_dir, "ProgramBlocks_SCL") scl_blocks_path = sanitize_path(os.path.join(plc_export_dir, "ProgramBlocks_SCL"))
# Validate block export paths
xml_valid, xml_msg = validate_export_path(xml_blocks_path)
scl_valid, scl_msg = validate_export_path(scl_blocks_path)
if not xml_valid:
print(f" ERROR: Ruta XML no válida - {xml_msg}")
return
if not scl_valid:
print(f" ERROR: Ruta SCL no válida - {scl_msg}")
return
os.makedirs(xml_blocks_path, exist_ok=True) os.makedirs(xml_blocks_path, exist_ok=True)
os.makedirs(scl_blocks_path, exist_ok=True) os.makedirs(scl_blocks_path, exist_ok=True)
print(f" Destino XML: {xml_blocks_path}") print(f" Destino XML: {xml_blocks_path}")
@ -145,6 +330,18 @@ def export_plc_data(plc, export_base_dir):
for block in program_blocks: for block in program_blocks:
block_name = block.get_name() block_name = block.get_name()
print(f" Procesando bloque: {block_name}...") print(f" Procesando bloque: {block_name}...")
# Check if block is exportable
is_exportable, prog_language, reason = is_block_exportable(block)
if not is_exportable:
print(f" ADVERTENCIA: {reason}. Omitiendo bloque {block_name}.")
blocks_skipped += 1
continue
if prog_language == "Unknown":
print(f" ADVERTENCIA: {reason}")
try: try:
if not block.is_consistent(): if not block.is_consistent():
print(f" Compilando bloque {block_name}...") print(f" Compilando bloque {block_name}...")
@ -157,6 +354,63 @@ def export_plc_data(plc, export_base_dir):
continue continue
print(f" Exportando {block_name} como XML...") print(f" Exportando {block_name} como XML...")
try:
print(f" Destino: {xml_blocks_path}")
# Check if this is a system block that might need special handling
is_system_block = any(
keyword in block_name.lower()
for keyword in [
"interrupt",
"error",
"startup",
"i/o",
"rack_flt",
"prog_err",
"time error",
"io access",
"createsan",
]
)
# Try creating a sanitized filename for problematic blocks
if is_system_block or " " in block_name or "/" in block_name:
print(
f" Detectado bloque con nombre problemático: '{block_name}'"
)
# Create a temporary export directory with sanitized name
sanitized_block_name = sanitize_filename(block_name)
temp_block_dir = os.path.join(
xml_blocks_path, sanitized_block_name
)
os.makedirs(temp_block_dir, exist_ok=True)
print(f" Usando directorio sanitizado: {temp_block_dir}")
block.export(
target_directory_path=temp_block_dir,
export_options=EXPORT_OPTIONS,
export_format=ts.Enums.ExportFormats.SimaticML,
keep_folder_structure=False, # Disable folder structure for problematic blocks
)
# Rename files to use original block name in metadata
for file in os.listdir(temp_block_dir):
if file.endswith(".xml"):
original_path = os.path.join(temp_block_dir, file)
# Move file to main directory with original name preserved in content
target_path = os.path.join(xml_blocks_path, file)
if os.path.exists(target_path):
os.remove(target_path)
shutil.move(original_path, target_path)
# Remove temporary directory
if os.path.exists(temp_block_dir):
os.rmdir(temp_block_dir)
else:
# Normal export for regular blocks
block.export( block.export(
target_directory_path=xml_blocks_path, target_directory_path=xml_blocks_path,
export_options=EXPORT_OPTIONS, export_options=EXPORT_OPTIONS,
@ -164,19 +418,103 @@ def export_plc_data(plc, export_base_dir):
keep_folder_structure=KEEP_FOLDER_STRUCTURE, keep_folder_structure=KEEP_FOLDER_STRUCTURE,
) )
except Exception as xml_ex:
print(
f" ERROR en exportación XML para {block_name}: {xml_ex}"
)
print(f" Ruta problemática: '{xml_blocks_path}'")
print(f" Tipo de bloque: {type(block).__name__}")
print(f" Lenguaje de programación: {prog_language}")
# Check if it's a ProDiag related error
if "ProDiag" in str(
xml_ex
) or "not supported during import and export" in str(xml_ex):
print(
f" Este bloque usa un lenguaje no soportado para exportación. Omitiendo."
)
# Skip this block and continue with others
blocks_skipped += 1
continue
# If we get here, XML export was successful
# Now try SCL export if applicable
try: try:
prog_language = block.get_property(name="ProgrammingLanguage")
if prog_language == "SCL": if prog_language == "SCL":
print(f" Exportando {block_name} como SCL...") print(f" Exportando {block_name} como SCL...")
try:
print(f" Destino: {scl_blocks_path}")
# Use same logic for SCL export
is_system_block = any(
keyword in block_name.lower()
for keyword in [
"interrupt",
"error",
"startup",
"i/o",
"rack_flt",
"prog_err",
"time error",
"io access",
"createsan",
]
)
if (
is_system_block
or " " in block_name
or "/" in block_name
):
sanitized_block_name = sanitize_filename(block_name)
temp_block_dir = os.path.join(
scl_blocks_path, sanitized_block_name
)
os.makedirs(temp_block_dir, exist_ok=True)
block.export(
target_directory_path=temp_block_dir,
export_options=EXPORT_OPTIONS,
export_format=ts.Enums.ExportFormats.ExternalSource,
keep_folder_structure=False,
)
# Move files to main directory
for file in os.listdir(temp_block_dir):
if file.endswith(".scl"):
original_path = os.path.join(
temp_block_dir, file
)
target_path = os.path.join(
scl_blocks_path, file
)
if os.path.exists(target_path):
os.remove(target_path)
shutil.move(original_path, target_path)
if os.path.exists(temp_block_dir):
os.rmdir(temp_block_dir)
else:
block.export( block.export(
target_directory_path=scl_blocks_path, target_directory_path=scl_blocks_path,
export_options=EXPORT_OPTIONS, export_options=EXPORT_OPTIONS,
export_format=ts.Enums.ExportFormats.ExternalSource, export_format=ts.Enums.ExportFormats.ExternalSource,
keep_folder_structure=KEEP_FOLDER_STRUCTURE, keep_folder_structure=KEEP_FOLDER_STRUCTURE,
) )
except Exception as prop_ex: except Exception as scl_ex:
print( print(
f" No se pudo obtener el lenguaje de programación para {block_name}. Omitiendo SCL. Error: {prop_ex}" f" ERROR en exportación SCL para {block_name}: {scl_ex}"
)
print(f" Ruta problemática: '{scl_blocks_path}'")
# Don't raise, just continue
else:
print(
f" Bloque {block_name} no es SCL (lenguaje: {prog_language}). Omitiendo exportación SCL."
)
except Exception as scl_check_ex:
print(
f" Error verificando lenguaje para exportación SCL: {scl_check_ex}"
) )
blocks_exported += 1 blocks_exported += 1
@ -194,7 +532,14 @@ def export_plc_data(plc, export_base_dir):
udts_exported = 0 udts_exported = 0
udts_skipped = 0 udts_skipped = 0
print(f"\n[PLC: {plc_name}] Exportando tipos de datos PLC (UDTs)...") print(f"\n[PLC: {plc_name}] Exportando tipos de datos PLC (UDTs)...")
udt_export_path = os.path.join(plc_export_dir, "PlcDataTypes") udt_export_path = sanitize_path(os.path.join(plc_export_dir, "PlcDataTypes"))
# Validate UDT export path
udt_valid, udt_msg = validate_export_path(udt_export_path)
if not udt_valid:
print(f" ERROR: Ruta UDT no válida - {udt_msg}")
return
os.makedirs(udt_export_path, exist_ok=True) os.makedirs(udt_export_path, exist_ok=True)
print(f" Destino: {udt_export_path}") print(f" Destino: {udt_export_path}")
@ -216,11 +561,19 @@ def export_plc_data(plc, export_base_dir):
continue continue
print(f" Exportando {udt_name}...") print(f" Exportando {udt_name}...")
try:
print(f" Destino: {udt_export_path}")
udt.export( udt.export(
target_directory_path=udt_export_path, target_directory_path=udt_export_path,
export_options=EXPORT_OPTIONS, export_options=EXPORT_OPTIONS,
keep_folder_structure=KEEP_FOLDER_STRUCTURE, keep_folder_structure=KEEP_FOLDER_STRUCTURE,
) )
except Exception as udt_export_ex:
print(
f" ERROR en exportación UDT para {udt_name}: {udt_export_ex}"
)
print(f" Ruta problemática: '{udt_export_path}'")
raise udt_export_ex
udts_exported += 1 udts_exported += 1
except Exception as udt_ex: except Exception as udt_ex:
print(f" ERROR exportando UDT {udt_name}: {udt_ex}") print(f" ERROR exportando UDT {udt_name}: {udt_ex}")
@ -236,7 +589,14 @@ def export_plc_data(plc, export_base_dir):
tags_exported = 0 tags_exported = 0
tags_skipped = 0 tags_skipped = 0
print(f"\n[PLC: {plc_name}] Exportando tablas de variables PLC...") print(f"\n[PLC: {plc_name}] Exportando tablas de variables PLC...")
tags_export_path = os.path.join(plc_export_dir, "PlcTags") tags_export_path = sanitize_path(os.path.join(plc_export_dir, "PlcTags"))
# Validate tags export path
tags_valid, tags_msg = validate_export_path(tags_export_path)
if not tags_valid:
print(f" ERROR: Ruta Tags no válida - {tags_msg}")
return
os.makedirs(tags_export_path, exist_ok=True) os.makedirs(tags_export_path, exist_ok=True)
print(f" Destino: {tags_export_path}") print(f" Destino: {tags_export_path}")
@ -248,14 +608,24 @@ def export_plc_data(plc, export_base_dir):
print(f" Procesando tabla de variables: {table_name}...") print(f" Procesando tabla de variables: {table_name}...")
try: try:
print(f" Exportando {table_name}...") print(f" Exportando {table_name}...")
try:
print(f" Destino: {tags_export_path}")
table.export( table.export(
target_directory_path=tags_export_path, target_directory_path=tags_export_path,
export_options=EXPORT_OPTIONS, export_options=EXPORT_OPTIONS,
keep_folder_structure=KEEP_FOLDER_STRUCTURE, keep_folder_structure=KEEP_FOLDER_STRUCTURE,
) )
except Exception as table_export_ex:
print(
f" ERROR en exportación tabla para {table_name}: {table_export_ex}"
)
print(f" Ruta problemática: '{tags_export_path}'")
raise table_export_ex
tags_exported += 1 tags_exported += 1
except Exception as table_ex: except Exception as table_ex:
print(f" ERROR exportando tabla de variables {table_name}: {table_ex}") print(
f" ERROR exportando tabla de variables {table_name}: {table_ex}"
)
tags_skipped += 1 tags_skipped += 1
print( print(
f" Resumen de exportación de tablas de variables: Exportados={tags_exported}, Omitidos/Errores={tags_skipped}" f" Resumen de exportación de tablas de variables: Exportados={tags_exported}, Omitidos/Errores={tags_skipped}"
@ -279,12 +649,24 @@ if __name__ == "__main__":
# Validate working directory # Validate working directory
if not working_directory or not os.path.isdir(working_directory): if not working_directory or not os.path.isdir(working_directory):
print("ERROR: Directorio de trabajo no configurado o inválido.") print("ERROR: Directorio de trabajo no configurado o inválido.")
print("Por favor configure el directorio de trabajo usando la aplicación principal.") print(
"Por favor configure el directorio de trabajo usando la aplicación principal."
)
sys.exit(1) sys.exit(1)
# 1. Select Project File, Export Directory comes from config # 1. Select Project File, Export Directory comes from config
project_file = select_project_file() project_file = select_project_file()
export_dir = working_directory # Use working directory from config # Normalize the project file path to avoid TIA Portal path issues
project_file = normalize_project_path(project_file)
export_dir = sanitize_path(
working_directory
) # Use working directory from config with sanitization
# Validate export directory
is_valid, validation_msg = validate_export_path(export_dir)
if not is_valid:
print(f"ERROR: Directorio de exportación no válido - {validation_msg}")
sys.exit(1)
# 2. Detect TIA Portal version from project file # 2. Detect TIA Portal version from project file
tia_version = detect_tia_version(project_file) tia_version = detect_tia_version(project_file)
@ -307,9 +689,21 @@ if __name__ == "__main__":
# 4. Open Project # 4. Open Project
print(f"Abriendo proyecto: {os.path.basename(project_file)}...") print(f"Abriendo proyecto: {os.path.basename(project_file)}...")
project_object = portal_instance.open_project(project_file_path=project_file) print(f"Ruta completa del proyecto: {project_file}")
try:
project_object = portal_instance.open_project(
project_file_path=project_file
)
except Exception as open_ex:
print(f"Error al abrir el proyecto: {open_ex}")
print("Intentando obtener proyecto ya abierto...")
project_object = None
if project_object is None: if project_object is None:
print("El proyecto podría estar ya abierto, intentando obtener el manejador...") print(
"El proyecto podría estar ya abierto, intentando obtener el manejador..."
)
project_object = portal_instance.get_project() project_object = portal_instance.get_project()
if project_object is None: if project_object is None:
raise Exception("No se pudo abrir u obtener el proyecto especificado.") raise Exception("No se pudo abrir u obtener el proyecto especificado.")
@ -320,7 +714,9 @@ if __name__ == "__main__":
if not plcs: if not plcs:
print("No se encontraron dispositivos PLC en el proyecto.") print("No se encontraron dispositivos PLC en el proyecto.")
else: else:
print(f"Se encontraron {len(plcs)} PLC(s). Iniciando proceso de exportación...") print(
f"Se encontraron {len(plcs)} PLC(s). Iniciando proceso de exportación..."
)
# 6. Iterate and Export Data for each PLC # 6. Iterate and Export Data for each PLC
for plc_device in plcs: for plc_device in plcs:
@ -328,8 +724,21 @@ if __name__ == "__main__":
print("\nProceso de exportación completado.") print("\nProceso de exportación completado.")
except ts.TiaException as tia_ex: except ValueError as val_ex:
print(f"\nError de TIA Portal Openness: {tia_ex}") # Handle TIA Portal Openness exceptions (they come as ValueError)
if "OpennessAccessException" in str(val_ex):
print(f"\nError de TIA Portal Openness: {val_ex}")
print("Posibles causas:")
print("- El proyecto puede estar corrupto o en un formato incompatible")
print(
"- El proyecto puede requerir actualización a una versión más reciente"
)
print(
"- Verificar que la ruta del proyecto no contenga caracteres especiales"
)
print("- Asegurarse de que TIA Portal esté instalado correctamente")
else:
print(f"\nError de valor: {val_ex}")
traceback.print_exc() traceback.print_exc()
except FileNotFoundError: except FileNotFoundError:
print(f"\nERROR: Archivo de proyecto no encontrado en {project_file}") print(f"\nERROR: Archivo de proyecto no encontrado en {project_file}")

View File

@ -61,6 +61,154 @@ except Exception as e:
# --- Functions --- # --- Functions ---
def verify_export_format(export_path, expected_format="SIMATIC_SD"):
"""
Verifies what format was actually exported by examining file extensions.
Returns (actual_format, file_extensions, file_count)
"""
if not os.path.exists(export_path):
return "NO_FILES", [], 0
files = [
f
for f in os.listdir(export_path)
if os.path.isfile(os.path.join(export_path, f))
]
if not files:
return "EMPTY_FOLDER", [], 0
extensions = [os.path.splitext(f)[1].lower() for f in files]
extension_counts = {}
for ext in extensions:
extension_counts[ext] = extension_counts.get(ext, 0) + 1
# Determine actual format based on file extensions
if all(ext == ".xml" for ext in extensions):
actual_format = "XML_ONLY"
elif any(ext in [".sd", ".simatic"] for ext in extensions):
actual_format = "SIMATIC_SD"
elif ".xml" in extensions and len(set(extensions)) > 1:
actual_format = "MIXED"
else:
actual_format = "UNKNOWN"
return actual_format, extension_counts, len(files)
def check_simatic_sd_block_compatibility(block):
"""
Checks if a block is compatible with SIMATIC SD export format.
Returns (is_compatible, reason)
"""
try:
# Check if block is consistent
if not block.is_consistent():
return False, "Block is not consistent/compiled"
# Check programming language - SIMATIC SD only supports LAD
try:
prog_lang = block.get_programming_language()
if prog_lang != ts.Enums.ProgrammingLanguage.LAD:
return (
False,
f"Language {prog_lang} not supported (SIMATIC SD requires LAD)",
)
except Exception:
return False, "Could not determine programming language"
# Check block type - SIMATIC SD typically supports FB, FC, OB
try:
block_type = block.get_block_type()
supported_types = [
ts.Enums.BlockType.FB, # Function Block
ts.Enums.BlockType.FC, # Function
ts.Enums.BlockType.OB, # Organization Block
]
if block_type not in supported_types:
return False, f"Block type {block_type} may not be supported"
except Exception:
# If we can't determine type, assume it might work
pass
return True, "Block appears compatible with SIMATIC SD"
except Exception as e:
return False, f"Error checking compatibility: {e}"
def verify_export_format(export_path, expected_format="SimaticSD"):
"""
Verifies if the exported files are actually in the expected format.
For SIMATIC SD, looks for specific keywords like RUNG, END_RUNG, wire#
Returns (is_correct_format, file_count, sample_files, format_details)
"""
if not os.path.exists(export_path):
return False, 0, [], "Directory does not exist"
files = [
f
for f in os.listdir(export_path)
if os.path.isfile(os.path.join(export_path, f))
]
if not files:
return False, 0, [], "No files found"
# Check first few files for format
sample_files = files[:3]
format_details = []
for file_name in sample_files:
file_path = os.path.join(export_path, file_name)
try:
with open(file_path, "r", encoding="utf-8", errors="ignore") as f:
content = f.read(2000) # Read first 2KB
file_info = {"file": file_name, "size": len(content)}
if expected_format == "SimaticSD":
# SIMATIC SD specific keywords and structure
sd_keywords = ["RUNG", "END_RUNG", "wire#", "NETWORK", "TITLE", "LAD"]
xml_indicators = ["<?xml", "<Document", "<SW.Blocks"]
found_sd_keywords = [kw for kw in sd_keywords if kw in content]
found_xml_indicators = [xi for xi in xml_indicators if xi in content]
file_info["sd_keywords"] = found_sd_keywords
file_info["xml_indicators"] = found_xml_indicators
file_info["is_xml"] = len(found_xml_indicators) > 0
file_info["is_simatic_sd"] = (
len(found_sd_keywords) > 0 and not file_info["is_xml"]
)
file_info["first_100_chars"] = (
content[:100].replace("\n", " ").replace("\r", "")
)
else: # XML format
file_info["is_xml"] = (
content.strip().startswith("<?xml") or "<Document" in content
)
file_info["first_100_chars"] = (
content[:100].replace("\n", " ").replace("\r", "")
)
format_details.append(file_info)
except Exception as e:
format_details.append({"file": file_name, "error": str(e)})
# Determine overall result for SIMATIC SD
if expected_format == "SimaticSD":
is_correct_format = any(
f.get("is_simatic_sd", False) for f in format_details if "error" not in f
)
else:
is_correct_format = any(
f.get("is_xml", False) for f in format_details if "error" not in f
)
return is_correct_format, len(files), sample_files, format_details
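
A self-contained restatement of the keyword check that verify_export_format applies to each sampled file, so the classification logic can be exercised on plain strings without the TIA Scripting dependency:

# Illustrative sketch, not part of x2.py.
SD_KEYWORDS = ["RUNG", "END_RUNG", "wire#", "NETWORK", "TITLE", "LAD"]
XML_INDICATORS = ["<?xml", "<Document", "<SW.Blocks"]

def classify_export_content(content):
    """Return 'XML', 'SimaticSD' or 'UNKNOWN' for the first kilobytes of a file."""
    if any(marker in content for marker in XML_INDICATORS):
        return "XML"
    if any(keyword in content for keyword in SD_KEYWORDS):
        return "SimaticSD"
    return "UNKNOWN"

print(classify_export_content('<?xml version="1.0"?><Document><SW.Blocks.FC /></Document>'))  # XML
print(classify_export_content("NETWORK\nTITLE = Main\nRUNG 1\nwire#1\nEND_RUNG"))             # SimaticSD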
def select_project_file(): def select_project_file():
"""Opens a dialog to select a TIA Portal project file.""" """Opens a dialog to select a TIA Portal project file."""
root = tk.Tk() root = tk.Tk()
@ -98,11 +246,35 @@ def check_simatic_sd_support():
try: try:
# Check if SimaticSD is available in ExportFormats enum # Check if SimaticSD is available in ExportFormats enum
simatic_sd_format = ts.Enums.ExportFormats.SimaticSD simatic_sd_format = ts.Enums.ExportFormats.SimaticSD
print(f"✓ SIMATIC SD format supported (enum value: {simatic_sd_format})") print(f"✓ SIMATIC SD format enum found (value: {simatic_sd_format})")
# Try to get more information about available formats
try:
all_formats = [
attr for attr in dir(ts.Enums.ExportFormats) if not attr.startswith("_")
]
print(f" Available export formats: {all_formats}")
except Exception:
pass
return True return True
except AttributeError: except AttributeError:
print("✗ ERROR: SIMATIC SD format not available in this TIA Scripting version.") print("✗ ERROR: SIMATIC SD format not available in this TIA Scripting version.")
print("Please ensure you are using TIA Portal V20 or later with compatible TIA Scripting.") print(
"Please ensure you are using TIA Portal V20 or later with compatible TIA Scripting."
)
return False
def check_tia_portal_version():
"""Check TIA Portal version and compatibility."""
print("\n=== TIA PORTAL VERSION CHECK ===")
try:
# This will be filled when we connect to TIA Portal
print("TIA Portal version check will be performed after connection...")
return True
except Exception as e:
print(f"Could not check TIA Portal version: {e}")
return False return False
@ -113,14 +285,18 @@ def export_plc_data_simatic_sd(plc, export_base_dir):
# Define base export path for this PLC with timestamp to avoid conflicts # Define base export path for this PLC with timestamp to avoid conflicts
import datetime import datetime
timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S") timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
plc_export_dir = os.path.join(export_base_dir, f"{plc_name}_SimaticSD_{timestamp}") plc_export_dir = os.path.join(export_base_dir, f"{plc_name}_SimaticSD_{timestamp}")
os.makedirs(plc_export_dir, exist_ok=True) os.makedirs(plc_export_dir, exist_ok=True)
# --- Export Program Blocks in SIMATIC SD Format --- # --- Export Program Blocks in SIMATIC SD Format ---
blocks_exported = 0 blocks_exported_sd = 0
blocks_exported_xml = 0
blocks_skipped = 0 blocks_skipped = 0
blocks_not_lad = 0
print(f"\n[PLC: {plc_name}] Exporting Program Blocks (SIMATIC SD)...") print(f"\n[PLC: {plc_name}] Exporting Program Blocks (SIMATIC SD)...")
print(" NOTE: SIMATIC SD format only supports LAD (Ladder) programming language!")
sd_blocks_path = os.path.join(plc_export_dir, "01_ProgramBlocks_SD") sd_blocks_path = os.path.join(plc_export_dir, "01_ProgramBlocks_SD")
os.makedirs(sd_blocks_path, exist_ok=True) os.makedirs(sd_blocks_path, exist_ok=True)
print(f" SIMATIC SD Target: {sd_blocks_path}") print(f" SIMATIC SD Target: {sd_blocks_path}")
@ -146,16 +322,85 @@ def export_plc_data_simatic_sd(plc, export_base_dir):
blocks_skipped += 1 blocks_skipped += 1
continue continue
print(f" Exporting {block_name} as SIMATIC SD...") # Check programming language - CRITICAL for SIMATIC SD
is_compatible, compatibility_reason = (
check_simatic_sd_block_compatibility(block)
)
print(f" Compatibility check: {compatibility_reason}")
if not is_compatible:
print(f" Block {block_name} not compatible with SIMATIC SD.")
print(f" Exporting XML only: {compatibility_reason}")
blocks_not_lad += 1
# Export only in XML for incompatible blocks
block.export(
target_directory_path=xml_blocks_path,
export_options=EXPORT_OPTIONS,
export_format=ts.Enums.ExportFormats.SimaticML,
keep_folder_structure=KEEP_FOLDER_STRUCTURE,
)
blocks_exported_xml += 1
print(
f" ✓ Exported {block_name} in XML (incompatible with SIMATIC SD)"
)
continue
# Try SIMATIC SD export for LAD blocks
print(f" Exporting LAD block {block_name} as SIMATIC SD...")
try: try:
block.export( block.export(
target_directory_path=sd_blocks_path, target_directory_path=sd_blocks_path,
export_options=EXPORT_OPTIONS, export_options=EXPORT_OPTIONS,
export_format=ts.Enums.ExportFormats.SimaticSD, # New SIMATIC SD format export_format=ts.Enums.ExportFormats.SimaticSD, # SIMATIC SD format
keep_folder_structure=KEEP_FOLDER_STRUCTURE, keep_folder_structure=KEEP_FOLDER_STRUCTURE,
) )
blocks_exported += 1
print(f" ✓ Successfully exported {block_name} in SIMATIC SD") # Verify if the export was actually in SIMATIC SD format
print(f" Verifying SIMATIC SD format for {block_name}...")
is_sd, file_count, sample_files, format_details = (
verify_export_format(sd_blocks_path, "SimaticSD")
)
if is_sd:
blocks_exported_sd += 1
print(
f" ✓ Successfully exported {block_name} in REAL SIMATIC SD format"
)
# Show sample of SD content
for detail in format_details:
if detail.get("is_simatic_sd") and detail.get(
"sd_keywords"
):
print(
f" 🎯 SD Keywords found: {', '.join(detail['sd_keywords'])}"
)
break
else:
print(
f" ❌ FAILED: Export claimed SD but files are actually XML!"
)
print(f" 📋 Format analysis:")
for detail in format_details:
if "error" not in detail:
print(f" File: {detail['file']}")
print(
f" SIMATIC SD: {detail.get('is_simatic_sd', False)}"
)
print(
f" XML format: {detail.get('is_xml', False)}"
)
if detail.get("sd_keywords"):
print(
f" SD keywords: {detail['sd_keywords']}"
)
if detail.get("xml_indicators"):
print(
f" XML indicators: {detail['xml_indicators']}"
)
print(
f" Content start: {detail.get('first_100_chars', '')[:50]}..."
)
# Also export same block in XML for comparison # Also export same block in XML for comparison
print(f" Exporting {block_name} as XML for comparison...") print(f" Exporting {block_name} as XML for comparison...")
@@ -165,23 +410,37 @@ def export_plc_data_simatic_sd(plc, export_base_dir):
                        export_format=ts.Enums.ExportFormats.SimaticML,  # Traditional XML format
                        keep_folder_structure=KEEP_FOLDER_STRUCTURE,
                    )
                    blocks_exported_xml += 1
                    print(f" + Also exported {block_name} in XML for comparison")

                except Exception as export_ex:
                    print(f" ERROR during SIMATIC SD export: {export_ex}")
                    print(f" This is likely because SIMATIC SD has specific requirements:")
                    print(f" - Block must be in LAD (Ladder) format")
                    print(f" - Block must be compatible with SIMATIC SD specification")

                    # Try to export only in XML if SD fails
                    try:
                        print(f" Attempting fallback XML export for {block_name}...")
                        block.export(
                            target_directory_path=xml_blocks_path,
                            export_options=EXPORT_OPTIONS,
                            export_format=ts.Enums.ExportFormats.SimaticML,
                            keep_folder_structure=KEEP_FOLDER_STRUCTURE,
                        )
                        print(f" ✓ Fallback XML export successful for {block_name}")
                        blocks_exported_xml += 1
                    except Exception as fallback_ex:
                        print(f" ERROR: Both SD and XML export failed: {fallback_ex}")
                        blocks_skipped += 1

            except Exception as block_ex:
@@ -189,7 +448,7 @@ def export_plc_data_simatic_sd(plc, export_base_dir):
                blocks_skipped += 1

        print(
            f" Program Blocks Export Summary: SIMATIC SD={blocks_exported_sd}, XML={blocks_exported_xml}, Non-LAD={blocks_not_lad}, Skipped/Errors={blocks_skipped}"
        )
    except Exception as e:
        print(f" ERROR processing Program Blocks: {e}")
@@ -311,7 +570,9 @@ def export_plc_data_simatic_sd(plc, export_base_dir):

def export_additional_formats(plc, export_base_dir):
    """Optional: Export in traditional formats alongside SIMATIC SD for comparison."""
    plc_name = plc.get_name()
    print(f"\n[Optional] Exporting traditional formats for comparison - PLC: {plc_name}")

    # Create comparison directory
    comparison_dir = os.path.join(export_base_dir, plc_name, "Comparison_Formats")
@@ -378,6 +639,24 @@ if __name__ == "__main__":
        print("Connected to TIA Portal V20.")
        print(f"Portal Process ID: {portal_instance.get_process_id()}")

        # Get TIA Portal version information
        try:
            portal_version = portal_instance.get_version()
            print(f"TIA Portal Version: {portal_version}")
            # Check if this version really supports SIMATIC SD
            version_parts = portal_version.split(".")
            major_version = int(version_parts[0]) if version_parts else 0
            if major_version < 20:
                print(f"⚠️ WARNING: TIA Portal V{major_version} may not fully support SIMATIC SD (requires V20+)")
            else:
                print(f"✓ TIA Portal V{major_version} should support SIMATIC SD")
        except Exception as ver_ex:
            print(f"Could not get TIA Portal version: {ver_ex}")

        # 3. Open Project
        print(f"Opening project: {os.path.basename(project_file)}...")
        project_object = portal_instance.open_project(project_file_path=project_file)
@@ -397,28 +676,40 @@ if __name__ == "__main__":
        # 5. Iterate and Export Data for each PLC in SIMATIC SD format
        import datetime  # Add this import for timestamp

        for plc_device in plcs:
            export_plc_data_simatic_sd(plc=plc_device, export_base_dir=export_dir)

        print("\n🎉 SIMATIC SD Export process completed successfully!")
        print("\nExported files structure:")
        print("├── [PLC_Name]_SimaticSD_[timestamp]/")
        print("│ ├── 01_ProgramBlocks_SD/ # SIMATIC SD format (LAD blocks only)")
        print("│ ├── 02_ProgramBlocks_XML_Compare/ # Traditional XML for comparison")
        print("│ ├── 03_PlcDataTypes_SD/")
        print("│ ├── 04_PlcDataTypes_XML_Compare/")
        print("│ ├── 05_PlcTags_SD/")
        print("│ └── 06_PlcTags_XML_Compare/")

        print("\n📋 IMPORTANT SIMATIC SD LIMITATIONS:")
        print(" • SIMATIC SD format ONLY supports LAD (Ladder) programming language")
        print(" • SCL, STL, FBD blocks are exported as XML only")
        print(" • Only FB, FC, OB block types are typically supported")
        print(" • Complex LAD elements may still fall back to XML")

        print("\nNow you can compare the differences between SIMATIC SD and traditional XML formats!")

        # Add file analysis
        print("\n=== FILE FORMAT ANALYSIS ===")
        for plc_device in plcs:
            plc_name = plc_device.get_name()
            timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
            plc_export_dir = os.path.join(export_dir, f"{plc_name}_SimaticSD_{timestamp}")

            print(f"\nAnalyzing exported files for PLC: {plc_name}")
            folders_to_check = [
@@ -427,20 +718,56 @@ if __name__ == "__main__":
                ("03_PlcDataTypes_SD", "SIMATIC SD UDTs"),
                ("04_PlcDataTypes_XML_Compare", "XML UDTs"),
                ("05_PlcTags_SD", "SIMATIC SD Tags"),
                ("06_PlcTags_XML_Compare", "XML Tags"),
            ]

            simatic_sd_working = False
            total_sd_files = 0
            total_xml_fallbacks = 0

            for folder_name, description in folders_to_check:
                folder_path = os.path.join(plc_export_dir, folder_name)
                if os.path.exists(folder_path):
                    actual_format, extensions, file_count = verify_export_format(folder_path)
                    print(f" {description}: {file_count} files")
                    print(f" Format detected: {actual_format}")
                    print(f" File extensions: {extensions}")

                    if "SD" in folder_name:  # This should be SIMATIC SD folder
                        if actual_format == "XML_ONLY":
                            print(f" ⚠️ WARNING: Expected SIMATIC SD but got XML only!")
                            total_xml_fallbacks += file_count
                        elif actual_format == "SIMATIC_SD":
                            simatic_sd_working = True
                            total_sd_files += file_count
                        elif actual_format == "UNKNOWN" and file_count > 0:
                            print(f" 🔍 UNKNOWN format - needs manual inspection")
                else:
                    print(f" {description}: Folder not found")

            print(f"\n🔍 SIMATIC SD DIAGNOSIS FOR {plc_name}:")
            if simatic_sd_working:
                print(f" ✅ SIMATIC SD is working: {total_sd_files} files in true SD format")
            else:
                print(f" ❌ SIMATIC SD NOT working: All 'SD' exports are actually XML")
                print(f" 📊 Total XML fallbacks: {total_xml_fallbacks}")
                if total_xml_fallbacks > 0:
                    print(f"\n 💡 POSSIBLE CAUSES:")
                    print(f" • TIA Portal version doesn't fully support SIMATIC SD")
                    print(f" • TIA Scripting version incompatible with SIMATIC SD")
                    print(f" • Project blocks contain unsupported LAD elements")
                    print(f" • SIMATIC SD enum exists but falls back to XML silently")

    except ts.TiaException as tia_ex:
        print(f"\nTIA Portal Openness Error: {tia_ex}")
        traceback.print_exc()
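The export loop above leans on a `verify_export_format()` helper that never appears in this diff. As a rough, self-contained sketch of the underlying idea — sniffing exported files to tell an XML fallback apart from a plain-text SIMATIC SD dump — the following is an illustration only (the function name, keyword list and return labels are assumptions, not the project's actual helper):

```python
import os

def sniff_export_format(folder_path, sd_keywords=("NETWORK", "CONTACT", "COIL")):
    """Classify a folder of exported files as 'XML_ONLY', 'SIMATIC_SD', 'UNKNOWN' or 'EMPTY'."""
    verdicts = []
    for name in os.listdir(folder_path):
        path = os.path.join(folder_path, name)
        if not os.path.isfile(path):
            continue
        with open(path, "r", encoding="utf-8", errors="ignore") as fh:
            head = fh.read(500)  # the first few hundred characters are enough to sniff
        if head.lstrip().startswith("<?xml") or "<Document" in head:
            verdicts.append("XML")
        elif any(keyword in head for keyword in sd_keywords):
            verdicts.append("SD")
        else:
            verdicts.append("UNKNOWN")
    if not verdicts:
        return "EMPTY"
    if all(v == "XML" for v in verdicts):
        return "XML_ONLY"
    if any(v == "SD" for v in verdicts):
        return "SIMATIC_SD"
    return "UNKNOWN"
```

Content sniffing of this kind is what lets the script flag "claimed SD but actually XML" cases instead of trusting the success of the export call alone.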

View File

@@ -19,7 +19,18 @@ from backend.script_utils import load_configuration

# --- Configuration ---
# Supported TIA Portal versions mapping (extension -> version)
SUPPORTED_TIA_VERSIONS = {
    ".ap15": "15.0",
    ".ap16": "16.0",
    ".ap17": "17.0",
    ".ap18": "18.0",
    ".ap19": "19.0",
    ".ap20": "20.0",
}

# Cross-references export support was introduced in TIA Portal V17+
# Earlier versions don't support the export_cross_references() method
CROSS_REFERENCES_SUPPORTED_VERSIONS = ["17.0", "18.0", "19.0", "20.0"]

# Filter for cross-references. Based on documentation:
# 1: 'AllObjects', 2: 'ObjectsWithReferences', 3: 'ObjectsWithoutReferences', 4: 'UnusedObjects'
@@ -83,6 +94,34 @@ except Exception as e:

# --- Functions ---


def normalize_project_path(project_path):
    """Normalizes a project path to ensure it's compatible with TIA Portal."""
    if not project_path:
        return project_path

    # Convert to Path object for easier manipulation
    path_obj = Path(project_path)

    # Resolve to absolute path and normalize
    try:
        normalized = path_obj.resolve()
        # Convert back to string with Windows-style separators
        normalized_str = str(normalized).replace("/", "\\")
        print(f" Ruta original: {project_path}")
        print(f" Ruta normalizada: {normalized_str}")
        return normalized_str
    except Exception as e:
        print(f" ADVERTENCIA: Error al normalizar ruta: {e}")
        return str(project_path)


def is_cross_references_supported(tia_version):
    """Check if cross-references export is supported in the given TIA Portal version."""
    return tia_version in CROSS_REFERENCES_SUPPORTED_VERSIONS


def get_supported_filetypes():
    """Returns the supported file types for TIA Portal projects."""
    filetypes = []
@@ -112,9 +151,9 @@ def detect_tia_version(project_file_path):
        print(
            f"ADVERTENCIA: Extensión de archivo no reconocida '{file_extension}'. Extensiones soportadas: {list(SUPPORTED_TIA_VERSIONS.keys())}"
        )
        # Default to version 15.0 for backward compatibility
        print("Usando por defecto TIA Portal V15.0")
        return "15.0"


def select_project_file():
@@ -249,19 +288,31 @@ def export_plc_cross_references(
                print(f" Exportación completada en {elapsed_time:.2f} segundos")
                blocks_cr_exported += 1
                exported_blocks.add(norm_block)
            except Exception as block_ex:
                error_msg = str(block_ex)
                if (
                    "NotSupportedException" in error_msg
                    and "not supported in this API version" in error_msg
                ):
                    print(f" ERROR: Método export_cross_references() no soportado en esta versión de TIA Portal para el bloque {block_name}")
                    print(f" Esta funcionalidad requiere TIA Portal V17.0 o superior.")
                elif "RuntimeError" in str(type(block_ex)):
                    print(f" ERROR TIA al exportar referencias cruzadas para el bloque {block_name}: {block_ex}")
                else:
                    print(f" ERROR GENERAL al exportar referencias cruzadas para el bloque {block_name}: {block_ex}")
                    traceback.print_exc()

                # Always mark as problematic
                problematic_blocks.add(norm_block)
                blocks_cr_skipped += 1
                if _is_disposed_exception(block_ex):
                    # Escalamos para que el script pueda re-abrir el Portal y omitir el bloque
                    raise PortalDisposedException(block_ex, failed_block=block_name)
@@ -496,13 +547,39 @@ def open_portal_and_project(tia_version: str, project_file_path: str):
    print("Conectado a TIA Portal.")
    print(f"ID del proceso del Portal: {portal.get_process_id()}")

    # Normalize the project path
    normalized_path = normalize_project_path(project_file_path)
    print(f"Abriendo proyecto: {Path(normalized_path).name}...")

    try:
        project_obj = portal.open_project(project_file_path=normalized_path)
        if project_obj is None:
            print("El proyecto podría estar ya abierto, intentando obtener el manejador...")
            project_obj = portal.get_project()
        if project_obj is None:
            raise Exception(
                "No se pudo abrir u obtener el proyecto especificado tras la reapertura."
            )
        print("Proyecto abierto exitosamente.")
        return portal, project_obj
    except Exception as e:
        error_msg = str(e)
        print(f"ERROR al abrir proyecto: {error_msg}")
        if "path" in error_msg.lower() and "cannot be" in error_msg.lower():
            print(f" Problema con formato de ruta. Ruta utilizada: '{normalized_path}'")
            print(f" Ruta original: '{project_file_path}'")
        # Re-raise with more context
        raise Exception(f"Error al abrir proyecto TIA Portal: {error_msg}")

    return portal, project_obj
@@ -535,6 +612,23 @@ if __name__ == "__main__":
        # 2. Detect TIA Portal version from project file
        tia_version = detect_tia_version(project_file)

        # Check if cross-references export is supported in this version
        if not is_cross_references_supported(tia_version):
            print(f"\nADVERTENCIA: La exportación de referencias cruzadas no está soportada en TIA Portal V{tia_version}")
            print(f"Las referencias cruzadas están soportadas desde TIA Portal V17.0 en adelante.")
            print(
                "Versiones soportadas para referencias cruzadas:",
                ", ".join([f"V{v}" for v in CROSS_REFERENCES_SUPPORTED_VERSIONS]),
            )
            print("\nEl script se cerrará. Por favor use TIA Portal V17.0 o superior para exportar referencias cruzadas.")
            sys.exit(1)

        # 3. Define Export Directory using working_directory and subfolder
        export_base_dir = Path(working_directory)
        try:
@@ -640,7 +734,7 @@ if __name__ == "__main__":
        print("\nProceso de exportación de referencias cruzadas completado.")

    except Exception as tia_ex:
        print(f"\nError de TIA Portal Openness: {tia_ex}")
        traceback.print_exc()
    except FileNotFoundError:
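The widened `except Exception` handler introduced above folds three failure modes of `export_cross_references()` into one branch by inspecting the exception text. A compact restatement of that triage, hedged as a sketch (the function name and labels are illustrative, and `isinstance` is used here in place of the script's string check on the exception type):

```python
def classify_export_error(exc: Exception) -> str:
    """Rough classification of export_cross_references() failures, mirroring the handler above."""
    msg = str(exc)
    if "NotSupportedException" in msg and "not supported in this API version" in msg:
        return "unsupported_api"    # cross-references need TIA Portal V17.0 or later
    if isinstance(exc, RuntimeError):
        return "tia_runtime_error"  # TIA-side failure for this particular block
    return "general_error"          # anything else: log the traceback and skip the block
```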

View File

@@ -3,7 +3,10 @@
from lxml import etree
import traceback

# Definición de 'ns' - Namespaces para TIA Portal XML (v15-v20)
# NOTA: Los valores iniciales corresponden a versiones recientes (v18-v20)
# La función adapt_namespaces() actualiza automáticamente estos valores
# para soportar versiones anteriores (v15, v16, v17) según el XML detectado
ns = {
    "iface": "http://www.siemens.com/automation/Openness/SW/Interface/v5",
    "flg": "http://www.siemens.com/automation/Openness/SW/NetworkSource/FlgNet/v4",
@@ -563,6 +566,17 @@ def adapt_namespaces(root):
    """Actualiza dinámicamente los valores en el diccionario global `ns` para que
    coincidan con los namespaces reales presentes en el XML exportado por TIA.

    SOPORTE DE VERSIONES TIA PORTAL:
    - v15, v16, v17: Versiones anteriores con URIs diferentes
    - v18, v19, v20: Versiones recientes (valores por defecto en ns)

    Esta función detecta automáticamente la versión de TIA Portal utilizada
    analizando los namespaces declarados en el XML y actualiza el diccionario
    global 'ns' para usar los URIs correctos.

    Args:
        root: Elemento raíz del árbol XML parseado con lxml

    Debe llamarse después de obtener la raíz (`root = tree.getroot()`). Si en el
    XML aparecen nuevas versiones (p.ej. v6) de los URIs, esta función las
    detectará y sobreescribirá las entradas correspondientes en `ns`.
@@ -612,7 +626,18 @@ def adapt_namespaces(root):
        detected["iface"] = iface_uri

    if detected:
        print(f"INFO: Namespaces TIA Portal detectados y adaptados:")
        for prefix, uri in detected.items():
            # Extraer versión del URI si es posible
            version_info = ""
            if "/v" in uri:
                version_part = uri.split("/v")[-1]
                if version_part.isdigit():
                    version_info = f" (v{version_part})"
            print(f"  - {prefix}: {uri}{version_info}")
        ns.update(detected)
    else:
        print("INFO: Usando namespaces por defecto (TIA Portal v18-v20)")


# --- función auxiliar privada para adapt_namespaces ---
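Only the tail of `adapt_namespaces()` is visible in this hunk. A minimal sketch of the detection idea it describes — collect the namespace URIs actually declared in the parsed XML and keep whichever version suffix the file uses — might look like this (the helper and constant names are assumptions, not the module's real code):

```python
from lxml import etree

# Known Openness namespace stems; only the trailing version (v4, v5, ...) changes
# between TIA Portal releases.
NS_STEMS = {
    "iface": "http://www.siemens.com/automation/Openness/SW/Interface/v",
    "flg": "http://www.siemens.com/automation/Openness/SW/NetworkSource/FlgNet/v",
}

def detect_openness_namespaces(root):
    """Return a prefix -> URI map for the Openness namespaces this XML really uses."""
    uris = set()
    for elem in root.iter(tag=etree.Element):  # skip comments and processing instructions
        uris.update(u for u in elem.nsmap.values() if u)
    detected = {}
    for uri in uris:
        for prefix, stem in NS_STEMS.items():
            if uri.startswith(stem):
                detected[prefix] = uri
    return detected

# Typical use after parsing: ns.update(detect_openness_namespaces(tree.getroot()))
```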

View File

@@ -4,6 +4,10 @@

Este documento describe un pipeline de scripts de Python diseñado para convertir bloques de función o funciones (FC/FB) escritos en Ladder Logic (LAD) desde archivos XML de TIA Portal Openness a un código SCL (Structured Control Language) semánticamente equivalente.

**Versiones TIA Portal Soportadas:** v15, v16, v17, v18, v19, v20
- Detección automática de namespaces XML según la versión
- Compatibilidad total con exportaciones de TIA Portal Openness

El proceso se divide en tres etapas principales, cada una manejada por un script específico:

1. **XML a JSON Enriquecido (`x1_to_json.py`):** Parsea el XML de Openness, extrae la estructura lógica y las conexiones explícitas, e **infiere conexiones implícitas** (especialmente las habilitaciones EN) para crear un archivo JSON detallado.
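The documentation hunk above describes stage 1 of the pipeline (Openness XML to enriched JSON), implemented by `convert_xml_to_json()` in `x1_to_json.py` later in this diff. A minimal batch driver for that stage, assuming the module is importable from the working directory (the import path, folder layout and script name `convert_folder` are assumptions):

```python
from pathlib import Path

from x1_to_json import convert_xml_to_json  # stage-1 entry point shown later in this diff

def convert_folder(xml_dir: str) -> None:
    """Run the XML -> enriched JSON stage over every Openness export in a folder."""
    for xml_file in sorted(Path(xml_dir).glob("*.xml")):
        json_file = xml_file.with_suffix(".json")
        ok = convert_xml_to_json(str(xml_file), str(json_file))
        print(f"{xml_file.name}: {'OK' if ok else 'FAILED'}")

if __name__ == "__main__":
    convert_folder("./exports")
```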

View File

@@ -4,6 +4,7 @@ LadderToSCL - Conversor de Siemens LAD/FUP XML a SCL
Este script convierte archivos XML de Siemens LAD/FUP a un formato JSON simplificado.
"""

# ToUpload/x1_to_json.py
# -*- coding: utf-8 -*-
import json
@@ -13,10 +14,13 @@ import sys
import traceback
import importlib
from lxml import etree
from lxml.etree import XMLSyntaxError as etree_XMLSyntaxError  # Alias para evitar conflicto
from collections import defaultdict
import copy
import time  # <-- NUEVO: Para obtener metadatos

script_root = os.path.dirname(
    os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
)
@@ -25,7 +29,12 @@ from backend.script_utils import load_configuration

# Importar funciones comunes y namespaces desde el nuevo módulo de utils
try:
    from parsers.parser_utils import (
        ns,
        get_multilingual_text,
        parse_interface_members,
        adapt_namespaces,
    )
except ImportError as e:
    print(
        f"Error crítico: No se pudieron importar funciones desde parsers.parser_utils: {e}"
@@ -253,7 +262,7 @@ def convert_xml_to_json(xml_filepath, json_filepath):
        parser = etree.XMLParser(remove_blank_text=True, recover=True)
        tree = etree.parse(xml_filepath, parser)
        root = tree.getroot()

        # Ajustar namespaces dinámicamente para soportar TIA Portal v15-v20
        try:
            adapt_namespaces(root)
        except Exception as e_ns:
@@ -272,10 +281,14 @@ def convert_xml_to_json(xml_filepath, json_filepath):
        # Intentar Tag Table si no es UDT
        if result is None:
            tag_table_element = root.find(".//SW.Tags.PlcTagTable", namespaces=root.nsmap)
            if tag_table_element is not None:
                the_block = tag_table_element
                result = parse_tag_table(the_block)  # Llamar a la función de parseo de TagTable

        # Intentar Bloques (OB, FC, FB, DB) si no es UDT ni TagTable
        if result is None:
@@ -287,19 +300,33 @@ def convert_xml_to_json(xml_filepath, json_filepath):
            if block_list:
                the_block = block_list[0]  # Tomar el primer bloque encontrado
                block_tag_name = etree.QName(the_block.tag).localname  # Nombre del tag (ej. SW.Blocks.OB)
                block_type_map = {
                    "SW.Blocks.FC": "FC",
                    "SW.Blocks.FB": "FB",
                    "SW.Blocks.GlobalDB": "GlobalDB",
                    "SW.Blocks.OB": "OB",
                    "SW.Blocks.InstanceDB": "InstanceDB",  # <-- ADDED: Recognize InstanceDB
                }
                block_type_found = block_type_map.get(block_tag_name, "UnknownBlockType")
                print(f"Paso 2b: Bloque {block_tag_name} (Tipo: {block_type_found}) encontrado (ID={the_block.get('ID')}).")

                # --- Extraer información del Bloque (FC, FB, OB, DB) ---
                print("Paso 3: Extrayendo atributos del bloque...")
                attribute_list_node = the_block.xpath("./AttributeList")  # Buscar hijo directo
                block_name_val, block_number_val, block_lang_val = "Unknown", None, "Unknown"
                instance_of_name_val = None  # <-- NUEVO: Para InstanceDB
                instance_of_type_val = None  # <-- NUEVO: Para InstanceDB
                block_comment_val = ""
@@ -307,41 +334,70 @@ def convert_xml_to_json(xml_filepath, json_filepath):
                if attribute_list_node:
                    attr_list = attribute_list_node[0]
                    name_node = attr_list.xpath("./Name/text()")
                    block_name_val = name_node[0].strip() if name_node else block_name_val
                    num_node = attr_list.xpath("./Number/text()")
                    try:
                        block_number_val = int(num_node[0]) if num_node else None
                    except (ValueError, TypeError):
                        block_number_val = None
                    lang_node = attr_list.xpath("./ProgrammingLanguage/text()")
                    # Asignar lenguaje por defecto si no se encuentra
                    block_lang_val = (
                        lang_node[0].strip()
                        if lang_node
                        else ("DB" if block_type_found in ["GlobalDB", "InstanceDB"] else "Unknown")
                    )  # <-- MODIFIED: Include InstanceDB for DB language default

                    # <-- NUEVO: Extraer info de instancia si es InstanceDB -->
                    if block_type_found == "InstanceDB":
                        inst_name_node = attr_list.xpath("./InstanceOfName/text()")
                        instance_of_name_val = inst_name_node[0].strip() if inst_name_node else None
                        inst_type_node = attr_list.xpath("./InstanceOfType/text()")  # Generalmente 'FB'
                        instance_of_type_val = inst_type_node[0].strip() if inst_type_node else None

                    print(f"Paso 3: Atributos: Nombre='{block_name_val}', Número={block_number_val}, Lenguaje Bloque='{block_lang_val}'")

                    # Extraer comentario del bloque (puede estar en AttributeList o ObjectList)
                    comment_node_attr = attr_list.xpath("./Comment")
                    if comment_node_attr:
                        block_comment_val = get_multilingual_text(comment_node_attr[0])
                    else:
                        comment_node_obj = the_block.xpath("./ObjectList/MultilingualText[@CompositionName='Comment']")
                        if comment_node_obj:
                            block_comment_val = get_multilingual_text(comment_node_obj[0])
                    print(f"Paso 3b: Comentario bloque: '{block_comment_val[:50]}...'")
                else:
                    print(f"Advertencia: No se encontró AttributeList para el bloque {block_type_found}.")
                    if block_type_found in ["GlobalDB", "InstanceDB"]:
                        block_lang_val = "DB"  # Default para DB/InstanceDB # <-- MODIFIED: Include InstanceDB

                # Inicializar diccionario de resultado para el bloque
                result = {
                    "block_name": block_name_val,
                    "block_number": block_number_val,
                    "language": block_lang_val,
                    "block_type": block_type_found,
                    "block_comment": block_comment_val,
                    "interface": {},
                    "networks": [],
                }

                # --- Extraer Interfaz del Bloque ---
@@ -349,44 +405,81 @@ def convert_xml_to_json(xml_filepath, json_filepath):
                interface_node = None
                if attribute_list_node:
                    interface_node_list = attribute_list_node[0].xpath("./Interface")
                    if interface_node_list:
                        interface_node = interface_node_list[0]

                if interface_node is not None:
                    # Buscar secciones dentro de la interfaz usando el namespace 'iface'
                    all_sections = interface_node.xpath(".//iface:Section", namespaces=ns)
                    if all_sections:
                        processed_sections = set()
                        for section in all_sections:
                            section_name = section.get("Name")
                            if not section_name or section_name in processed_sections:
                                continue
                            members_in_section = section.xpath("./iface:Member", namespaces=ns)
                            if members_in_section:
                                result["interface"][section_name] = parse_interface_members(members_in_section)
                                processed_sections.add(section_name)
                    else:
                        print("Advertencia: Nodo Interface no contiene secciones <iface:Section>.")
                    if not result["interface"]:
                        print("Advertencia: Interface encontrada pero sin secciones procesables.")
                elif block_type_found == "GlobalDB":
                    # Buscar Static directamente si es DB y no hay nodo Interface
                    static_members = the_block.xpath(".//iface:Section[@Name='Static']/iface:Member", namespaces=ns)
                    if static_members:
                        print("Paso 4: Encontrada sección Static para GlobalDB (sin nodo Interface explícito).")
                        result["interface"]["Static"] = parse_interface_members(static_members)
                    else:
                        print("Advertencia: No se encontró sección 'Static' para GlobalDB.")
                else:
                    print(f"Advertencia: No se encontró <Interface> para bloque {block_type_found}.")
                if not result["interface"]:
                    print("Advertencia: No se pudo extraer información de la interfaz.")

                # --- Procesar Redes (CompileUnits) ---
                if block_type_found not in [
                    "GlobalDB",
                    "InstanceDB",
                ]:  # DBs/InstanceDBs no tienen redes ejecutables # <-- MODIFIED: Include InstanceDB
                    print("Paso 5: Buscando y PROCESANDO redes (CompileUnits)...")
                    networks_processed_count = 0
                    result["networks"] = []  # Asegurar que esté inicializado
                    object_list_node = the_block.xpath("./ObjectList")
                    if object_list_node:
                        compile_units = object_list_node[0].xpath("./SW.Blocks.CompileUnit")
                        print(f"Paso 5: Se encontraron {len(compile_units)} elementos SW.Blocks.CompileUnit.")

                        for network_elem in compile_units:  # network_elem es el nodo <SW.Blocks.CompileUnit>
                            networks_processed_count += 1
                            network_id = network_elem.get("ID")
                            network_lang = "Unknown"
@@ -398,11 +491,17 @@ def convert_xml_to_json(xml_filepath, json_filepath):
                            # Determinar lenguaje
                            net_attr_list = network_elem.xpath("./AttributeList")
                            if net_attr_list:
                                lang_node = net_attr_list[0].xpath("./ProgrammingLanguage/text()")
                                if lang_node:
                                    network_lang = lang_node[0].strip()
                                elif result["language"] != "Unknown":
                                    network_lang = result["language"]

                            print(f"  - Procesando Red ID={network_id}, Lenguaje Red={network_lang}")

                            parser_func = parser_map.get(network_lang.upper())
                            parsed_network_data = None
@@ -410,73 +509,150 @@ def convert_xml_to_json(xml_filepath, json_filepath):
                                try:
                                    parsed_network_data = parser_func(network_elem)
                                except Exception as e_parse:
                                    print(f"  ERROR durante el parseo de Red {network_id} ({network_lang}): {e_parse}")
                                    parsed_network_data = {
                                        "id": network_id,
                                        "language": network_lang,
                                        "logic": [],
                                        "error": f"Parser failed: {e_parse}",
                                    }
                            else:
                                print(f"  Advertencia: Lenguaje de red '{network_lang}' no soportado.")
                                parsed_network_data = {
                                    "id": network_id,
                                    "language": network_lang,
                                    "logic": [],
                                    "error": f"Unsupported language: {network_lang}",
                                }

                            # --- LÓGICA CORREGIDA: Asegurar que se procese incluso si el parser falló ---
                            if parsed_network_data is None:  # Si el parser falló TANTO que ni devolvió un dict
                                parsed_network_data = {
                                    "id": network_id,
                                    "language": network_lang,
                                    "logic": [],
                                    "error": "Parser function returned None",
                                }

                            # Extraer Título y Comentario de la red SIEMPRE
                            try:
                                title_element = network_elem.xpath("./ObjectList/MultilingualText[@CompositionName='Title']")
                                parsed_network_data["title"] = (
                                    get_multilingual_text(title_element[0]) if title_element else f"Network {network_id}"
                                )
                                comment_elem_net = network_elem.xpath("./ObjectList/MultilingualText[@CompositionName='Comment']")
                                parsed_network_data["comment"] = (
                                    get_multilingual_text(comment_elem_net[0]) if comment_elem_net else ""
                                )
                            except Exception as e_comment:
                                print(f"  ERROR extrayendo Título/Comentario para Red {network_id}: {e_comment}")
                                # Añadir valores por defecto si falla la extracción
                                parsed_network_data["title"] = f"Network {network_id}"  # Asegurar que title exista
                                parsed_network_data["comment"] = f"// Error extrayendo comentario: {e_comment}"  # Asegurar que comment exista

                            result["networks"].append(parsed_network_data)  # Añadir SIEMPRE a la lista

                        if networks_processed_count == 0:
                            print(f"Advertencia: ObjectList para {block_type_found} sin SW.Blocks.CompileUnit.")
                    else:
                        print(f"Advertencia: No se encontró ObjectList para el bloque {block_type_found}.")
                else:
                    print(f"Paso 5: Saltando procesamiento de redes para {block_type_found}.")  # <-- MODIFIED: Updated message

        else:  # No se encontró ningún bloque SW.Blocks.*
            print(
                "Error Crítico: No se encontró el elemento raíz del bloque (<SW.Blocks.FC/FB/GlobalDB/OB/InstanceDB>) después de descartar UDT/TagTable."
            )  # <-- MODIFIED: Updated message
        # --- Fin del manejo de Bloques ---

        # --- Escritura del JSON Final ---
        if result:
            # Añadir metadatos XML al diccionario final
            if xml_mod_time is not None:
                result["source_xml_mod_time"] = xml_mod_time
            if xml_size is not None:
                result["source_xml_size"] = xml_size

            print("Paso 6: Escribiendo el resultado en el archivo JSON...")
            # Advertencias finales si faltan partes clave
            if result.get("block_type") not in ["PlcUDT", "PlcTagTable"] and not result.get("interface"):
                print("ADVERTENCIA FINAL: 'interface' está vacía en el JSON.")
            if result.get("block_type") not in ["PlcUDT", "PlcTagTable", "GlobalDB", "InstanceDB"] and not result.get("networks"):
                print("ADVERTENCIA FINAL: 'networks' está vacía en el JSON.")  # <-- MODIFIED: Include InstanceDB

            # Escribir el archivo JSON
            try:
                with open(json_filepath, "w", encoding="utf-8") as f:
                    json.dump(result, f, indent=4, ensure_ascii=False)
                print("Paso 6: Escritura JSON completada.")
                print(f"Conversión finalizada. JSON guardado en: '{os.path.relpath(json_filepath)}'")
                return True  # Indicar éxito
            except IOError as e:
                print(f"Error Crítico: No se pudo escribir JSON en '{json_filepath}'. Error: {e}")
                return False
            except TypeError as e:
                print(f"Error Crítico: Problema al serializar a JSON. Error: {e}")
                return False
        else:
            # Esto no debería ocurrir si se manejaron todos los tipos o hubo error antes
            print("Error Crítico: No se generó ningún resultado para el archivo XML.")
            return False

    except etree_XMLSyntaxError as e:  # Usar alias
        print(f"Error Crítico: Sintaxis XML inválida en '{xml_filepath}'. Detalles: {e}")
        return False
    except Exception as e:
        print(f"Error Crítico: Error inesperado durante la conversión: {e}")
        traceback.print_exc()
        return False


# --- Punto de Entrada Principal (__main__) ---
if __name__ == "__main__":
    # Lógica para ejecución standalone
@@ -484,7 +660,10 @@ if __name__ == "__main__":
        import tkinter as tk
        from tkinter import filedialog
    except ImportError:
        print(
            "Error: Tkinter no está instalado. No se puede mostrar el diálogo de archivo.",
            file=sys.stderr,
        )
        # No salimos, podríamos intentar obtener el path de otra forma o fallar más adelante
        tk = None  # Marcar como no disponible
@@ -495,7 +674,7 @@ if __name__ == "__main__":
        print("Por favor, selecciona el archivo XML de entrada...")
        xml_input_file = filedialog.askopenfilename(
            title="Selecciona el archivo XML de entrada",
            filetypes=[("XML files", "*.xml"), ("All files", "*.*")],
        )
        root.destroy()  # Cerrar Tkinter
@@ -503,9 +682,7 @@ if __name__ == "__main__":
            print("No se seleccionó ningún archivo. Saliendo.", file=sys.stderr)
            # sys.exit(1) # No usar sys.exit aquí
        else:
            print(f"Archivo XML seleccionado: {xml_input_file}")

            # Calcular ruta de salida JSON
            xml_filename_base = os.path.splitext(os.path.basename(xml_input_file))[0]
@@ -524,4 +701,7 @@ if __name__ == "__main__":
            if success:
                print("\nConversión completada exitosamente.")
            else:
                print(
                    f"\nError durante la conversión de '{os.path.relpath(xml_input_file)}'.",
                    file=sys.stderr,
                )
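`get_multilingual_text()` is imported from `parsers.parser_utils` and used throughout the hunks above, but its body is not part of this diff. As an illustration only, a helper of that kind typically walks the usual Openness `MultilingualText` layout (`ObjectList/MultilingualTextItem/AttributeList` with `Culture` and `Text` children); the function name, preferred culture and fallback rule below are assumptions:

```python
def get_text_for_culture(multilingual_text_elem, preferred="en-US"):
    """Illustrative only: pull one culture's text out of a MultilingualText node."""
    items = multilingual_text_elem.xpath("./ObjectList/MultilingualTextItem/AttributeList")
    fallback = ""
    for attr in items:
        culture = (attr.findtext("Culture") or "").strip()
        text = (attr.findtext("Text") or "").strip()
        if culture == preferred and text:
            return text
        if text and not fallback:
            fallback = text  # keep the first non-empty text as a fallback
    return fallback
```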

37023
data/log.txt

File diff suppressed because it is too large