Compare commits

...

6 Commits

Author SHA1 Message Date
Miguel 88ff4a25a2 Add README and execution log for ObtainIOFromProjectTia and XML Parser to SCL
- Created a README file for the ObtainIOFromProjectTia script group detailing the directory structure and file organization.
- Added a detailed execution log for the x4_cross_reference.py script, including timestamps, output summaries, and error logs.
2025-05-12 12:05:14 +02:00
Miguel 1f64cdf448 Add technical documentation for parsing TIA Portal _XRef.xml files to generate call trees
- Introduced a comprehensive guide detailing the structure and interpretation of _XRef.xml files.
- Explained key XML elements necessary for constructing call trees, including SourceObject, ReferenceObject, and Location.
- Provided a step-by-step data extraction strategy for identifying caller-callee relationships.
- Included example Python code for parsing XML and aggregating call relationships.
- Addressed considerations for handling multiple files and variations across TIA Portal versions.
2025-05-05 12:33:30 +02:00
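The extraction strategy this commit documents (walk the _XRef.xml files, collect caller→callee pairs, aggregate them into a call tree) can be sketched as follows. This is a minimal, illustrative example only: the element and attribute names (`SourceObject`, `ReferenceObject`, `Name`) follow the commit's own description of the schema and may differ across TIA Portal versions.

```python
# Minimal sketch: aggregate caller -> callee relationships from a TIA Portal
# _XRef.xml export. Element/attribute names (SourceObject, ReferenceObject,
# "Name") are assumptions taken from the commit description above; real
# exports vary across TIA Portal versions.
import xml.etree.ElementTree as ET
from collections import defaultdict

def build_call_tree(xml_path: str) -> dict[str, set[str]]:
    """Map each caller (SourceObject) to the set of objects it references."""
    tree = ET.parse(xml_path)
    calls: defaultdict[str, set[str]] = defaultdict(set)
    for source in tree.getroot().iter("SourceObject"):
        caller = source.get("Name", "?")
        for ref in source.iter("ReferenceObject"):
            callee = ref.get("Name")
            if callee:
                calls[caller].add(callee)
    return dict(calls)
```

When processing a whole project, the per-block dictionaries would be merged (set union per caller) before rendering the tree.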
Miguel 8fcb441003 Add script to export TIA Portal project cross-references to files 2025-05-05 12:32:17 +02:00
Miguel 625b639ff5 Copy the SCL files exported from TIA Portal as well, to complete the source files 2025-05-04 20:43:45 +02:00
Miguel 24cf3c670b Add function to generate instance DBs 2025-05-04 00:01:00 +02:00
Miguel 9f8437fc2d Update descriptions 2025-05-03 23:35:29 +02:00
37 changed files with 29854 additions and 4329 deletions

View File

@@ -0,0 +1 @@
{}

View File

@@ -0,0 +1,8 @@
{
"x1.py": {
"display_name": "x1",
"short_description": "Script para hacer una union de los cambios generados por un LLM en un archivo de código C#.",
"long_description": "",
"hidden": false
}
}

View File

@@ -1,22 +1,22 @@
 --- Log de Ejecución: x1.py ---
 Grupo: EmailCrono
-Directorio de Trabajo: C:\Trabajo\SIDEL\EMAILs\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS
+Directorio de Trabajo: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email
-Inicio: 2025-05-03 17:15:12
+Inicio: 2025-05-09 16:58:28
-Fin: 2025-05-03 17:15:14
+Fin: 2025-05-09 16:58:29
-Duración: 0:00:01.628641
+Duración: 0:00:00.434600
 Estado: SUCCESS (Código de Salida: 0)
 --- SALIDA ESTÁNDAR (STDOUT) ---
-Working directory: C:\Trabajo\SIDEL\EMAILs\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS
+Working directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email
-Input directory: C:\Trabajo\SIDEL\EMAILs\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS
+Input directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email
-Output directory: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/00 - MASTER/EMAILs
+Output directory: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463
-Cronologia file: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/00 - MASTER/EMAILs\cronologia.md
+Cronologia file: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463\cronologia.md
-Attachments directory: C:\Trabajo\SIDEL\EMAILs\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS\adjuntos
+Attachments directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email\adjuntos
 Beautify rules file: D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\EmailCrono\config\beautify_rules.json
 Found 1 .eml files
 Loaded 0 existing messages
-Processing C:\Trabajo\SIDEL\EMAILs\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS.eml
+Processing C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email\R_ {EXT} E5.006894 - Modifica O&U - SAE463 New Analyzer.eml
 Aplicando reglas de prioridad 1
 Aplicando reglas de prioridad 2
 Aplicando reglas de prioridad 3
@@ -27,7 +27,7 @@ Estadísticas de procesamiento:
 - Mensajes únicos añadidos: 1
 - Mensajes duplicados ignorados: 0
-Writing 1 messages to C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/00 - MASTER/EMAILs\cronologia.md
+Writing 1 messages to C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463\cronologia.md
 --- ERRORES (STDERR) ---
 Ninguno

View File

@@ -8,7 +8,7 @@
   "cronologia_file": "cronologia.md"
  },
  "level3": {
-  "output_directory": "C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/00 - MASTER/EMAILs"
+  "output_directory": "C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463"
  },
- "working_directory": "C:\\Trabajo\\SIDEL\\EMAILs\\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS"
+ "working_directory": "C:\\Trabajo\\SIDEL\\10 - E5.007095 - Modifica O&U - SAE463\\Reporte\\Email"
 }

View File

@@ -1,6 +1,8 @@
 {
- "path": "C:\\Trabajo\\SIDEL\\EMAILs\\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS",
+ "path": "C:\\Trabajo\\SIDEL\\10 - E5.007095 - Modifica O&U - SAE463\\Reporte\\Email",
  "history": [
+  "C:\\Trabajo\\SIDEL\\10 - E5.007095 - Modifica O&U - SAE463\\Reporte\\Email",
+  "C:\\Trabajo\\SIDEL\\08 - Masselli TEST\\Reporte\\EMAILs",
   "C:\\Trabajo\\SIDEL\\EMAILs\\I_ E5.007727 _ Evo On - SFSRFH300172 + SFSRFH300109 - ANDIA LACTEOS",
   "C:\\Estudio",
   "C:\\Trabajo\\VM\\40 - 93040 - HENKEL - NEXT2 Problem\\Reporte\\EmailTody",

View File

@@ -0,0 +1,8 @@
{
"x1.py": {
"display_name": "x1",
"short_description": "Script para importar archivos HTML o DOCX y convertirlos a un archivo Markdown.",
"long_description": "",
"hidden": false
}
}

File diff suppressed because it is too large

View File

@@ -1,9 +1,9 @@
 --- Log de Ejecución: x2.py ---
 Grupo: ObtainIOFromProjectTia
 Directorio de Trabajo: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
-Inicio: 2025-05-02 23:34:21
+Inicio: 2025-05-05 12:39:16
-Fin: 2025-05-02 23:36:20
+Fin: 2025-05-05 12:40:41
-Duración: 0:01:58.373747
+Duración: 0:01:25.846312
 Estado: SUCCESS (Código de Salida: 0)
 --- SALIDA ESTÁNDAR (STDOUT) ---
@@ -16,17 +16,17 @@ Will generate summary to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE19
 Export log file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.log
 Connecting to TIA Portal V18.0...
-2025-05-02 23:34:30,132 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
+2025-05-05 12:39:20,828 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
-2025-05-02 23:34:30,155 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - With user interface
+2025-05-05 12:39:20,847 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - With user interface
 Connected.
 Opening project: SAE196_c0.2.ap18...
-2025-05-02 23:35:01,950 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal OpenProject - Open project... C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\InLavoro\PLC\SAE196_c0.2\SAE196_c0.2.ap18
+2025-05-05 12:39:43,534 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal OpenProject - Open project... C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\InLavoro\PLC\SAE196_c0.2\SAE196_c0.2.ap18
 Project opened.
 Exporting CAx data for the project to C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml...
 CAx data exported successfully.
 Closing TIA Portal...
-2025-05-02 23:36:15,947 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal ClosePortal - Close TIA Portal
+2025-05-05 12:40:38,187 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal ClosePortal - Close TIA Portal
 TIA Portal closed.
 Parsing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
 Markdown summary written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Summary.md

View File

@@ -1,9 +1,9 @@
 --- Log de Ejecución: x3.py ---
 Grupo: ObtainIOFromProjectTia
 Directorio de Trabajo: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
-Inicio: 2025-05-02 23:43:07
+Inicio: 2025-05-05 12:48:16
-Fin: 2025-05-02 23:43:12
+Fin: 2025-05-05 12:48:22
-Duración: 0:00:05.235415
+Duración: 0:00:06.125698
 Estado: SUCCESS (Código de Salida: 0)
 --- SALIDA ESTÁNDAR (STDOUT) ---
@@ -17,22 +17,22 @@ Output IO Debug Tree MD: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196
 Processing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
 Pass 1: Found 203 InternalElement(s). Populating device dictionary...
 Pass 2: Identifying PLCs and Networks (Refined v2)...
-Identified Network: PROFIBUS_1 (bcc6f2bd-3d71-4407-90f2-bccff6064051) Type: Profibus
+Identified Network: PROFIBUS_1 (17667a38-6bbe-481a-a234-6c9ac582adb9) Type: Profibus
-Identified Network: ETHERNET_1 (c6d49787-a076-4592-994d-876eea123dfd) Type: Ethernet/Profinet
+Identified Network: ETHERNET_1 (4fa8d8c4-4fb5-4df5-a82e-ec6829530c2e) Type: Ethernet/Profinet
 Identified PLC: PLC (a48e038f-0bcc-4b48-8373-033da316c62b) - Type: CPU 1516F-3 PN/DP OrderNo: 6ES7 516-3FP03-0AB0
 Pass 3: Processing InternalLinks (Robust Network Mapping & IO)...
 Found 118 InternalLink(s).
-Mapping Device/Node 'E1' (NodeID:1643b51f-7067-4565-8f8e-109a1a775fed, Addr:10.1.33.11) to Network 'ETHERNET_1'
+Mapping Device/Node 'E1' (NodeID:e15ed19e-b5e1-4cc2-9690-ee5b2132ed74, Addr:10.1.33.11) to Network 'ETHERNET_1'
 --> Associating Network 'ETHERNET_1' with PLC 'PLC' (via Node 'E1' Addr: 10.1.33.11)
-Mapping Device/Node 'P1' (NodeID:5aff409b-2573-485f-82bf-0e08c9200086, Addr:1) to Network 'PROFIBUS_1'
+Mapping Device/Node 'P1' (NodeID:d9426769-3159-4c09-af54-e00a677183fd, Addr:1) to Network 'PROFIBUS_1'
 --> Associating Network 'PROFIBUS_1' with PLC 'PLC' (via Node 'P1' Addr: 1)
-Mapping Device/Node 'PB1' (NodeID:c796e175-c770-43f0-8191-fc91996c0147, Addr:12) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:086deb3e-1f8a-471c-8d00-879d11991c6d, Addr:12) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:0b44f55a-63c1-49e8-beea-24dc5d3226e3, Addr:20) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:7cf9f331-96fd-4a89-bf31-7faf501077cd, Addr:20) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:25cfc251-f946-40c5-992d-ad6387677acb, Addr:21) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:d5a8a086-5c97-4c7d-b488-823e7d75370e, Addr:21) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:57999375-ec72-46ef-8ec2-6c3178e8acf8, Addr:22) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:e2893de8-90e6-42e6-9e83-7838a57f5038, Addr:22) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:54e8db6a-9443-41a4-a85b-cf0722c1d299, Addr:10) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:af0f7eb6-720e-42c3-9a97-bf75183d0dc2, Addr:10) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:4786bab6-4097-4651-ac19-6cadfc7ea735, Addr:8) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:bfff87c4-b07c-441c-b977-58e967b96587, Addr:8) to Network 'PROFIBUS_1'
-Mapping Device/Node 'PB1' (NodeID:1f08afcb-111f-428f-915e-69363af1b09a, Addr:40) to Network 'PROFIBUS_1'
+Mapping Device/Node 'PB1' (NodeID:69c65f28-7810-44e6-aae6-fcecb035f91b, Addr:40) to Network 'PROFIBUS_1'
 Data extraction and structuring complete.
 Generating JSON output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
 JSON data written successfully.
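The x3.py log above reflects a multi-pass walk over the exported CAx AML file: pass 1 builds a device dictionary from `InternalElement` entries, later passes classify networks and resolve `InternalLink` connections. A minimal sketch of that first pass, assuming only the standard CAEX element names that AML files use (the real script's classification logic is not shown in this diff):

```python
# Minimal sketch of the AML walk suggested by the x3.py log. AML
# (AutomationML) files are CAEX XML; InternalElement and InternalLink are
# standard CAEX elements. The summary returned here is illustrative only.
import xml.etree.ElementTree as ET

def summarize_aml(aml_path: str) -> dict:
    root = ET.parse(aml_path).getroot()

    # CAEX files normally declare a namespace; match on the local tag name.
    def local(tag: str) -> str:
        return tag.rsplit("}", 1)[-1]

    elements = [e for e in root.iter() if local(e.tag) == "InternalElement"]
    links = [e for e in root.iter() if local(e.tag) == "InternalLink"]
    # Pass 1: device dictionary keyed by element ID
    devices = {e.get("ID"): e.get("Name") for e in elements if e.get("ID")}
    return {
        "internal_elements": len(elements),
        "internal_links": len(links),
        "devices": devices,
    }
```

Subsequent passes would inspect each element's attributes and role classes to tell PLCs from networks, then resolve the `RefPartnerSideA`/`RefPartnerSideB` endpoints of each `InternalLink`.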

View File

@@ -0,0 +1,433 @@
--- Log de Ejecución: x4.py ---
Grupo: ObtainIOFromProjectTia
Directorio de Trabajo: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Inicio: 2025-05-05 13:26:23
Fin: 2025-05-05 13:35:16
Duración: 0:08:53.119788
Estado: SUCCESS (Código de Salida: 0)
--- SALIDA ESTÁNDAR (STDOUT) ---
--- TIA Portal Cross-Reference Exporter ---
Selected Project: C:/Trabajo/SIDEL/06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)/InLavoro/PLC/SAE196_c0.2/SAE196_c0.2.ap18
Using Base Export Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Connecting to TIA Portal V18.0...
2025-05-05 13:26:29,175 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
2025-05-05 13:26:29,200 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - With user interface
Connected to TIA Portal.
2025-05-05 13:27:07,831 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal GetProcessId - Process id: 5272
Portal Process ID: 5272
Opening project: SAE196_c0.2.ap18...
2025-05-05 13:27:08,303 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal OpenProject - Open project... C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\InLavoro\PLC\SAE196_c0.2\SAE196_c0.2.ap18
Project opened successfully.
2025-05-05 13:27:39,932 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Project GetPlcs - Found plc PLC with parent name S71500/ET200MP station_1
Found 1 PLC(s). Starting cross-reference export process...
--- Processing PLC: PLC ---
[PLC: PLC] Exporting Program Block Cross-References...
Target: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\ProgramBlocks_CR
Found 380 program blocks.
Processing block: _CYCL_EXC...
Exporting cross-references for _CYCL_EXC...
Processing block: COMPLETE RESTART...
Exporting cross-references for COMPLETE RESTART...
Processing block: I/O_FLT1...
Exporting cross-references for I/O_FLT1...
Processing block: MOD_ERR...
Exporting cross-references for MOD_ERR...
Processing block: ProDiagOB...
Exporting cross-references for ProDiagOB...
Processing block: Programming error...
Exporting cross-references for Programming error...
Processing block: RACK_FLT...
Exporting cross-references for RACK_FLT...
Processing block: Time error interrupt...
Exporting cross-references for Time error interrupt...
Processing block: Baialage...
Exporting cross-references for Baialage...
Processing block: BlenderCtrl__Main...
Exporting cross-references for BlenderCtrl__Main...
Processing block: BlenderCtrl_CIPModeInit...
Exporting cross-references for BlenderCtrl_CIPModeInit...
Processing block: BlenderCtrl_ProdModeInit...
Exporting cross-references for BlenderCtrl_ProdModeInit...
Processing block: BlenderCtrl_ResetSPWord...
Exporting cross-references for BlenderCtrl_ResetSPWord...
Processing block: BlenderCtrl_UpdatePWord...
Exporting cross-references for BlenderCtrl_UpdatePWord...
Processing block: BlenderPID_NextRecipe...
Exporting cross-references for BlenderPID_NextRecipe...
Processing block: BlenderRinse...
Exporting cross-references for BlenderRinse...
Processing block: BlenderRinse_Done...
Exporting cross-references for BlenderRinse_Done...
Processing block: BlenderRun_ProdTime...
Exporting cross-references for BlenderRun_ProdTime...
Processing block: BlenderRun_Stopping...
Exporting cross-references for BlenderRun_Stopping...
Processing block: Blocco_1...
Exporting cross-references for Blocco_1...
Processing block: Block_compare...
Exporting cross-references for Block_compare...
Processing block: Block_move...
Exporting cross-references for Block_move...
Processing block: CarboWaterLine_Seq...
Exporting cross-references for CarboWaterLine_Seq...
Processing block: Cetrifugal_Head...
Exporting cross-references for Cetrifugal_Head...
Processing block: CIP CVQ...
Exporting cross-references for CIP CVQ...
Processing block: CIP FlipFlop...
Exporting cross-references for CIP FlipFlop...
Processing block: CIPLocal_ProgInizialize...
Exporting cross-references for CIPLocal_ProgInizialize...
Processing block: CIPLocal_WaitEvent_Ctrl...
Exporting cross-references for CIPLocal_WaitEvent_Ctrl...
Processing block: CIPMain...
Exporting cross-references for CIPMain...
Processing block: CIPMain_Flood...
Exporting cross-references for CIPMain_Flood...
Processing block: CIPMain_Total Drain...
Exporting cross-references for CIPMain_Total Drain...
Processing block: Clock Signal...
Exporting cross-references for Clock Signal...
Processing block: CO2 Solubility...
Exporting cross-references for CO2 Solubility...
Processing block: CO2EqPress...
Exporting cross-references for CO2EqPress...
Processing block: CO2InjPressure...
Exporting cross-references for CO2InjPressure...
Processing block: CTRLCoolingSystem...
Exporting cross-references for CTRLCoolingSystem...
Processing block: DeairCO2TempComp...
Exporting cross-references for DeairCO2TempComp...
Processing block: DeaireationValve...
Exporting cross-references for DeaireationValve...
Processing block: Deaireator StartUp_Seq...
Exporting cross-references for Deaireator StartUp_Seq...
Processing block: DeltaP...
Exporting cross-references for DeltaP...
Processing block: FeedForward...
Exporting cross-references for FeedForward...
Processing block: Flow_To_Press_Loss...
Exporting cross-references for Flow_To_Press_Loss...
Processing block: Freq_To_mmH2O...
Exporting cross-references for Freq_To_mmH2O...
Processing block: FrictionLoss...
Exporting cross-references for FrictionLoss...
Processing block: FW_DRand...
Exporting cross-references for FW_DRand...
Processing block: GetProdBrixCO2_Anal_Inpt...
Exporting cross-references for GetProdBrixCO2_Anal_Inpt...
Processing block: Interlocking_Panel_1...
Exporting cross-references for Interlocking_Panel_1...
Processing block: ITC Communic CIPRoom...
Exporting cross-references for ITC Communic CIPRoom...
Processing block: ITC Communic Filler...
Exporting cross-references for ITC Communic Filler...
Processing block: ITC Communic MainRoutine...
Exporting cross-references for ITC Communic MainRoutine...
Processing block: ITC Communic ProdRoom...
Exporting cross-references for ITC Communic ProdRoom...
Processing block: ITC DataIn...
Exporting cross-references for ITC DataIn...
Processing block: ITC DataOut...
Exporting cross-references for ITC DataOut...
Processing block: ITC Exchange MainRoutine...
Exporting cross-references for ITC Exchange MainRoutine...
Processing block: ITC MainRoutine...
Exporting cross-references for ITC MainRoutine...
Processing block: LIMIT_I...
Exporting cross-references for LIMIT_I...
Processing block: LIMIT_R...
Exporting cross-references for LIMIT_R...
Processing block: Maselli_PA_Control...
Exporting cross-references for Maselli_PA_Control...
Processing block: Maselli_PA_Ctrl_Transfer...
Exporting cross-references for Maselli_PA_Ctrl_Transfer...
Processing block: Maselli_PA_Ctrl_Write...
Exporting cross-references for Maselli_PA_Ctrl_Write...
Processing block: MFMAnalogValues_Totalize...
Exporting cross-references for MFMAnalogValues_Totalize...
Processing block: mmH2O_TO_Freq...
Exporting cross-references for mmH2O_TO_Freq...
Processing block: ModValveFault...
Exporting cross-references for ModValveFault...
Processing block: mPDS_SYR_PA_Control...
Exporting cross-references for mPDS_SYR_PA_Control...
Processing block: ONS_R...
Exporting cross-references for ONS_R...
Processing block: Prod Tank RunOut_Seq...
Exporting cross-references for Prod Tank RunOut_Seq...
Processing block: ProductLiterInTank...
Exporting cross-references for ProductLiterInTank...
Processing block: ProductPipeDrain_Seq...
Exporting cross-references for ProductPipeDrain_Seq...
Processing block: ProductPipeRunOut_Seq...
Exporting cross-references for ProductPipeRunOut_Seq...
Processing block: ProductQuality...
Exporting cross-references for ProductQuality...
Processing block: SEL_I...
Exporting cross-references for SEL_I...
Processing block: SEL_R...
Exporting cross-references for SEL_R...
Processing block: SelCheckBrixSource...
Exporting cross-references for SelCheckBrixSource...
Processing block: SLIM_Block...
Exporting cross-references for SLIM_Block...
Processing block: SpeedAdjust...
Exporting cross-references for SpeedAdjust...
Processing block: Syrup Line MFM Prep_Seq...
Exporting cross-references for Syrup Line MFM Prep_Seq...
Processing block: Syrup MFM StartUp_Seq...
Exporting cross-references for Syrup MFM StartUp_Seq...
Processing block: SyrupDensity...
Exporting cross-references for SyrupDensity...
Processing block: SyrupRoomCtrl...
Exporting cross-references for SyrupRoomCtrl...
Processing block: WaterDensity...
Exporting cross-references for WaterDensity...
Processing block: WritePeripheral...
Exporting cross-references for WritePeripheral...
Processing block: CIPRecipeManagement_Data...
Exporting cross-references for CIPRecipeManagement_Data...
Processing block: Co2_Counters_DB...
Exporting cross-references for Co2_Counters_DB...
Processing block: Default_SupervisionDB...
Exporting cross-references for Default_SupervisionDB...
Processing block: ITC Communic CIP DI...
Exporting cross-references for ITC Communic CIP DI...
Processing block: ITC Communic Filler DI...
Exporting cross-references for ITC Communic Filler DI...
Processing block: ITC Communic Mixer DI...
Exporting cross-references for ITC Communic Mixer DI...
Processing block: ITC Communic Product Room DI...
Exporting cross-references for ITC Communic Product Room DI...
Processing block: Key Read & Write Data...
Exporting cross-references for Key Read & Write Data...
Processing block: mPPM303StartUpRamp...
Exporting cross-references for mPPM303StartUpRamp...
Processing block: PID_RMM304_Data...
Exporting cross-references for PID_RMM304_Data...
Processing block: PID_RVN302_Data...
Exporting cross-references for PID_RVN302_Data...
Processing block: PID_RVS318_Data...
Exporting cross-references for PID_RVS318_Data...
Processing block: ProdBrixRecovery_DB...
Exporting cross-references for ProdBrixRecovery_DB...
Processing block: Prod Tank Drain_Seq...
Exporting cross-references for Prod Tank Drain_Seq...
Processing block: _StepMove...
Exporting cross-references for _StepMove...
Processing block: _StepMove_Test...
Exporting cross-references for _StepMove_Test...
Processing block: RecipeManagement_Data...
Exporting cross-references for RecipeManagement_Data...
Processing block: Blender_Procedure Data...
Exporting cross-references for Blender_Procedure Data...
Processing block: BlenderPID__Main_Data...
Exporting cross-references for BlenderPID__Main_Data...
Processing block: BlenderRun_MeasFil_Data...
Exporting cross-references for BlenderRun_MeasFil_Data...
Processing block: BrixTracking_Data...
Exporting cross-references for BrixTracking_Data...
Processing block: CO2Tracking_Data...
Exporting cross-references for CO2Tracking_Data...
Processing block: FirstProduction_Data...
Exporting cross-references for FirstProduction_Data...
Processing block: Input_Data...
Exporting cross-references for Input_Data...
Processing block: ISOonTCP_or_TCP_Protocol_DB...
Exporting cross-references for ISOonTCP_or_TCP_Protocol_DB...
Processing block: MFM_Analog_Value_Data...
Exporting cross-references for MFM_Analog_Value_Data...
Processing block: PID MAIN Data...
Exporting cross-references for PID MAIN Data...
Processing block: PID_Filling_Head_Data...
Exporting cross-references for PID_Filling_Head_Data...
Processing block: PID_RMM301_Data...
Exporting cross-references for PID_RMM301_Data...
Processing block: PID_RMM303_Data...
Exporting cross-references for PID_RMM303_Data...
Processing block: PID_RMP302_Data...
Exporting cross-references for PID_RMP302_Data...
Processing block: PID_RVM301_Data...
Exporting cross-references for PID_RVM301_Data...
Processing block: PID_RVM319_Data...
Exporting cross-references for PID_RVM319_Data...
Processing block: PID_RVP303_Data...
Exporting cross-references for PID_RVP303_Data...
Processing block: Sel_Check_Brix_Data...
Exporting cross-references for Sel_Check_Brix_Data...
Processing block: Signal_Gen_Data...
Exporting cross-references for Signal_Gen_Data...
Processing block: System_Run_Out_Data...
Exporting cross-references for System_Run_Out_Data...
Processing block: SubCarb_DB...
Exporting cross-references for SubCarb_DB...
Processing block: CYC_INT5...
Exporting cross-references for CYC_INT5...
Processing block: BlenderCtrl_All Auto...
Exporting cross-references for BlenderCtrl_All Auto...
Processing block: BlenderCtrl_InitErrors...
Exporting cross-references for BlenderCtrl_InitErrors...
Processing block: BlenderCtrl_ManualActive...
Exporting cross-references for BlenderCtrl_ManualActive...
Processing block: BlenderCtrl_MFM Command...
Exporting cross-references for BlenderCtrl_MFM Command...
Processing block: BlenderPID_FlowMeterErro...
Exporting cross-references for BlenderPID_FlowMeterErro...
Processing block: BlenderPID_PIDResInteg...
Exporting cross-references for BlenderPID_PIDResInteg...
Processing block: BlenderPIDCtrl_PresRelea...
Exporting cross-references for BlenderPIDCtrl_PresRelea...
Processing block: BlenderPIDCtrl_SaveValve...
Exporting cross-references for BlenderPIDCtrl_SaveValve...
Processing block: BlenderRun__Control...
Exporting cross-references for BlenderRun__Control...
Processing block: BlenderRun_SelectConstan...
Exporting cross-references for BlenderRun_SelectConstan...
Processing block: BlendFill StartUp_Seq...
Exporting cross-references for BlendFill StartUp_Seq...
Processing block: CIP_SimpleProgr_Init...
Exporting cross-references for CIP_SimpleProgr_Init...
Processing block: CIPLocal...
Exporting cross-references for CIPLocal...
Processing block: CIPLocal_ExecSimpleCIP...
Exporting cross-references for CIPLocal_ExecSimpleCIP...
Processing block: CIPLocal_ExecStep...
Exporting cross-references for CIPLocal_ExecStep...
Processing block: CIPLocal_ProgStepDown...
Exporting cross-references for CIPLocal_ProgStepDown...
Processing block: CIPLocal_ProgStepUp...
Exporting cross-references for CIPLocal_ProgStepUp...
Processing block: CIPReportManager...
Exporting cross-references for CIPReportManager...
ERROR accessing Program Blocks for cross-reference export: RemotingException: El objeto '/460a527c_f027_40c0_bbfb_2f9184c04002/hwhq0szmkxqfz2pc1xmghz0a_310.rem' se desconectó o no existe en el servidor.
[PLC: PLC] Exporting PLC Tag Table Cross-References...
Target: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\PlcTags_CR
Found 10 Tag Tables.
Processing Tag Table: Memories...
Exporting cross-references for Memories...
Processing Tag Table: Tabella delle variabili standard...
Exporting cross-references for Tabella delle variabili standard...
Processing Tag Table: Timers_Counters...
Exporting cross-references for Timers_Counters...
Processing Tag Table: Inputs...
Exporting cross-references for Inputs...
Processing Tag Table: Outputs...
Exporting cross-references for Outputs...
Processing Tag Table: Tabella delle variabili_1...
Exporting cross-references for Tabella delle variabili_1...
Processing Tag Table: Tabella delle variabili_2...
Exporting cross-references for Tabella delle variabili_2...
Processing Tag Table: OutputsFesto...
Exporting cross-references for OutputsFesto...
Processing Tag Table: InputsMaster...
Exporting cross-references for InputsMaster...
Processing Tag Table: OutputsMaster...
Exporting cross-references for OutputsMaster...
Tag Table CR Export Summary: Exported=10, Skipped/Errors=0
[PLC: PLC] Exporting PLC Data Type (UDT) Cross-References...
Target: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\PlcDataTypes_CR
Found 24 UDTs.
Processing UDT: AnalogInstrument...
Exporting cross-references for AnalogInstrument...
Processing UDT: CIP_Link_Type...
Exporting cross-references for CIP_Link_Type...
Processing UDT: CIP_Simple_Type...
Exporting cross-references for CIP_Simple_Type...
Processing UDT: CIP_Step_Type...
Exporting cross-references for CIP_Step_Type...
Processing UDT: CIP_WaitEvent_Type...
Exporting cross-references for CIP_WaitEvent_Type...
Processing UDT: Device...
Exporting cross-references for Device...
Processing UDT: DigitalInstrument...
Exporting cross-references for DigitalInstrument...
Processing UDT: FunctionButton...
Exporting cross-references for FunctionButton...
Processing UDT: PID...
Exporting cross-references for PID...
Processing UDT: QCO Phase...
Exporting cross-references for QCO Phase...
Processing UDT: QCO Spare...
Exporting cross-references for QCO Spare...
Processing UDT: QCO Timer...
Exporting cross-references for QCO Timer...
Processing UDT: QCO Timer_Array_1...
Exporting cross-references for QCO Timer_Array_1...
Processing UDT: Recipe_Prod...
Exporting cross-references for Recipe_Prod...
Processing UDT: ReportCIPSimpleData...
Exporting cross-references for ReportCIPSimpleData...
Processing UDT: TADDR_PAR...
Exporting cross-references for TADDR_PAR...
Processing UDT: TCON_PAR...
Exporting cross-references for TCON_PAR...
Processing UDT: TCON_PAR_LF...
Exporting cross-references for TCON_PAR_LF...
Processing UDT: Tipo di dati utente_1...
Exporting cross-references for Tipo di dati utente_1...
Processing UDT: Tipo di dati utente_2...
Exporting cross-references for Tipo di dati utente_2...
Processing UDT: ASLeds...
Exporting cross-references for ASLeds...
Processing UDT: IFLeds...
Exporting cross-references for IFLeds...
Processing UDT: SV_FB_State...
Exporting cross-references for SV_FB_State...
Processing UDT: SV_State...
Exporting cross-references for SV_State...
UDT CR Export Summary: Exported=24, Skipped/Errors=0
[PLC: PLC] Attempting to Export System Block Cross-References...
Target: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\SystemBlocks_CR
Found 9 system blocks (using get_system_blocks).
Processing System Block: T_301...
Exporting cross-references for T_301...
Processing System Block: IEC_Timer_0_DB_9...
Exporting cross-references for IEC_Timer_0_DB_9...
Processing System Block: T_302...
Exporting cross-references for T_302...
Processing System Block: GET_Reciver...
Exporting cross-references for GET_Reciver...
Processing System Block: PUT_Send_Filler...
Exporting cross-references for PUT_Send_Filler...
Processing System Block: LED...
Exporting cross-references for LED...
Processing System Block: SCALE...
Exporting cross-references for SCALE...
Processing System Block: CONT_C...
Exporting cross-references for CONT_C...
Processing System Block: DeviceStates...
Exporting cross-references for DeviceStates...
System Block CR Export Summary: Exported=9, Skipped/Errors=0
[PLC: PLC] Attempting to Export Software Unit Cross-References...
Target: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\SoftwareUnits_CR
Found 0 Software Units.
Software Unit CR Export Summary: Exported=0, Skipped/Errors=0
--- Finished processing PLC: PLC ---
Cross-reference export process completed.
Closing TIA Portal...
2025-05-05 13:35:02,332 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal ClosePortal - Close TIA Portal
TIA Portal closed.
Script finished.
--- ERRORES (STDERR) ---
Traceback (most recent call last):
File "D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\ObtainIOFromProjectTia\x4.py", line 99, in export_plc_cross_references
block_name = block.get_name()
^^^^^^^^^^^^^^^^
ValueError: RemotingException: El objeto '/460a527c_f027_40c0_bbfb_2f9184c04002/hwhq0szmkxqfz2pc1xmghz0a_310.rem' se desconectó o no existe en el servidor.
--- FIN DEL LOG ---
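The traceback above shows the run aborting inside `export_plc_cross_references` because `block.get_name()` raised a RemotingException on a stale Openness proxy before control reached the per-block try/except. A hedged sketch of a guard (the helper name is ours, not part of the script; any object exposing `get_name()` works):

```python
def safe_block_name(block):
    """Read a block's name defensively: a disconnected Openness proxy
    (RemotingException) should skip one block, not abort the whole run."""
    try:
        return block.get_name()
    except Exception as ex:  # the Openness error surfaces as a generic exception
        print(f"  Could not read block name, skipping: {ex}")
        return None
```

Calling this before the export step lets the loop `continue` on a `None` name instead of crashing the whole export.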


@@ -0,0 +1,39 @@
### Directory structure
<working_directory>/
├── <PLC1_Name>/
│ ├── ProgramBlocks_XML/
│ │ └── ... (archivos XML de bloques)
│ ├── ProgramBlocks_SCL/
│ │ └── ... (archivos SCL de bloques)
│ ├── ProgramBlocks_CR/
│ │ └── ... (archivos XML de referencias cruzadas de bloques)
│ ├── PlcTags/
│ │ └── ... (archivos XML de tablas de tags)
│ ├── PlcTags_CR/
│ │ └── ... (archivos XML de referencias cruzadas de tablas de tags)
│ ├── PlcDataTypes_CR/
│ │ └── ... (archivos XML de referencias cruzadas de UDTs)
│ ├── SystemBlocks_CR/
│ │ └── ...
│   ├── SoftwareUnits_CR/
│   │   └── ...
│   └── Documentation/
│       ├── Source/
│       │   └── ... (archivos md de bloques de programa)
│       ├── JSON/
│       │   └── ... (archivos JSON temporales)
│       ├── xref_calls_tree.md
│       ├── xref_db_usage_summary.md
│       ├── xref_plc_tags_summary.md
│       ├── full_project_representation.md
│       └── SAE196_c0.2_CAx_Export_Hardware_Tree.md
├── <PLC2_Name>/
│ ├── ProgramBlocks_XML/
│ │ └── ...
│ └── ...
└── ...
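The per-PLC layout above can be created idempotently before the exporters run. A minimal sketch assuming only the folder names shown in the tree (`scaffold_plc_dirs` is an illustrative helper, not part of the repo):

```python
from pathlib import Path

# Folder names taken from the directory tree above; the helper is illustrative.
SUBDIRS = [
    "ProgramBlocks_XML", "ProgramBlocks_SCL", "ProgramBlocks_CR",
    "PlcTags", "PlcTags_CR", "PlcDataTypes_CR",
    "SystemBlocks_CR", "SoftwareUnits_CR",
    "Documentation/Source", "Documentation/JSON",
]

def scaffold_plc_dirs(working_dir: str, plc_name: str) -> Path:
    """Create the per-PLC folder layout (safe to re-run: exist_ok=True)."""
    plc_root = Path(working_dir) / plc_name
    for sub in SUBDIRS:
        (plc_root / sub).mkdir(parents=True, exist_ok=True)
    return plc_root
```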


@@ -0,0 +1,26 @@
{
"x1.py": {
"display_name": "1: Exportar Lógica desde TIA",
"short_description": "Exporta la lógica del PLC desde TIA Portal en archivos XML y SCL.",
"long_description": "Este script utiliza TIA Portal Openness para exportar la lógica de un PLC en formato XML y SCL. Permite seleccionar un proyecto de TIA Portal y genera los archivos de exportación en el directorio configurado.\n***\n**Lógica Principal:**\n\n1. **Configuración:** Carga parámetros desde `ParamManagerScripts` (directorio de trabajo, versión de TIA Portal).\n2. **Selección de Proyecto:** Abre un cuadro de diálogo para seleccionar el archivo del proyecto de TIA Portal.\n3. **Conexión a TIA Portal:** Utiliza la API de TIA Openness para conectarse al portal y abrir el proyecto seleccionado.\n4. **Exportación:** Exporta la lógica del PLC en archivos XML y SCL al directorio configurado.\n5. **Cierre:** Cierra la conexión con TIA Portal al finalizar.",
"hidden": false
},
"x2.py": {
"display_name": "2: Exportar CAx desde TIA",
"short_description": "Exporta datos CAx de un proyecto TIA Portal y genera un resumen en Markdown.",
"long_description": "Este script utiliza TIA Portal Openness para exportar datos CAx de un proyecto de TIA Portal y generar un resumen en formato Markdown.\n***\n**Lógica Principal:**\n\n1. **Configuración:** Carga parámetros desde `ParamManagerScripts` (directorio de trabajo, versión de TIA Portal).\n2. **Selección de Proyecto:** Abre un cuadro de diálogo para seleccionar el archivo del proyecto de TIA Portal.\n3. **Conexión a TIA Portal:** Utiliza la API de TIA Openness para conectarse al portal y abrir el proyecto seleccionado.\n4. **Exportación CAx:** Exporta los datos CAx en formato AML y genera un archivo de resumen en Markdown con la jerarquía del proyecto y los dispositivos encontrados.\n5. **Cierre:** Cierra la conexión con TIA Portal al finalizar.",
"hidden": false
},
"x3.py": {
"display_name": "3: Procesar la exportación AML y generar documentación de IOs",
"short_description": "Extrae IOs de un archivo AML exportado del TIA Portal y genera un archivo Markdown.",
"long_description": "Este script procesa un archivo AML exportado desde TIA Portal para extraer información de los IOs y generar un archivo Markdown con un resumen detallado.\n***\n**Lógica Principal:**\n\n1. **Selección de Archivo AML:** Abre un cuadro de diálogo para seleccionar el archivo AML exportado desde TIA Portal.\n2. **Procesamiento de Datos:**\n * Extrae información de dispositivos, redes y conexiones desde el archivo AML.\n * Identifica PLCs, redes y módulos IO.\n * Genera una estructura jerárquica de los dispositivos y sus conexiones.\n3. **Generación de Markdown:**\n * Crea un archivo Markdown con un resumen jerárquico de hardware y conexiones IO.\n * Incluye un árbol de conexiones IO hacia arriba para depuración.\n4. **Salida:** Guarda los resultados en archivos Markdown y JSON en el directorio configurado.",
"hidden": false
},
"x4.py": {
"display_name": "4: Exportar Referencias Cruzadas",
"short_description": "Script para exportar las referencias cruzadas",
"long_description": "",
"hidden": false
}
}

File diff suppressed because it is too large


@@ -1,6 +1,5 @@
 """
-export_logic_from_tia :
-Script para exportar el software de un PLC desde TIA Portal en archivos XML y SCL.
+export_logic_from_tia : Script para exportar el software de un PLC desde TIA Portal en archivos XML y SCL.
 """
 
 import tkinter as tk


@@ -1,6 +1,5 @@
 """
-export_CAx_from_tia :
-Script que exporta los datos CAx de un proyecto de TIA Portal y genera un resumen en Markdown.
+export_CAx_from_tia : Script que exporta los datos CAx de un proyecto de TIA Portal y genera un resumen en Markdown.
 """
 
 import tkinter as tk


@@ -1,7 +1,5 @@
 """
-export_io_from_CAx :
-Script que sirve para extraer los IOs de un proyecto de TIA Portal y
-generar un archivo Markdown con la información.
+export_io_from_CAx : Script que sirve para extraer los IOs de un proyecto de TIA Portal y generar un archivo Markdown con la información.
 """
 
 import os


@@ -0,0 +1,422 @@
"""
export_cross_references_from_tia : Script para exportar las referencias cruzadas
de un proyecto TIA Portal a archivos (probablemente XML).
"""
import tkinter as tk
from tkinter import filedialog
import os
import sys
import traceback
from pathlib import Path # Import Path for easier path manipulation
script_root = os.path.dirname(
os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
)
sys.path.append(script_root)
from backend.script_utils import load_configuration
# --- Configuration ---
TIA_PORTAL_VERSION = "18.0" # Target TIA Portal version (e.g., "18.0")
# Filter for cross-references. Based on documentation:
# 1: 'AllObjects', 2: 'ObjectsWithReferences', 3: 'ObjectsWithoutReferences', 4: 'UnusedObjects'
# Using 1 to export all. 0 might also work as a default in some API versions.
CROSS_REF_FILTER = 1
# --- TIA Scripting Import Handling ---
# (Same import handling as x1.py)
if os.getenv("TIA_SCRIPTING"):
sys.path.append(os.getenv("TIA_SCRIPTING"))
else:
pass
try:
import siemens_tia_scripting as ts
except ImportError:
print("ERROR: Failed to import 'siemens_tia_scripting'.")
print("Ensure:")
print(f"1. TIA Portal Openness for V{TIA_PORTAL_VERSION} is installed.")
print(
"2. The 'siemens_tia_scripting' Python module is installed (pip install ...) or"
)
print(
" the path to its binaries is set in the 'TIA_SCRIPTING' environment variable."
)
print(
"3. You are using a compatible Python version (e.g., 3.12.X as per documentation)."
)
sys.exit(1)
except Exception as e:
print(f"An unexpected error occurred during import: {e}")
traceback.print_exc()
sys.exit(1)
# --- Functions ---
def select_project_file():
"""Opens a dialog to select a TIA Portal project file."""
root = tk.Tk()
root.withdraw() # Hide the main tkinter window
file_path = filedialog.askopenfilename(
title="Select TIA Portal Project File",
filetypes=[
(
f"TIA Portal V{TIA_PORTAL_VERSION} Projects",
f"*.ap{TIA_PORTAL_VERSION.split('.')[0]}",
)
], # e.g. *.ap18
)
root.destroy()
if not file_path:
print("No project file selected. Exiting.")
sys.exit(0)
return file_path
def export_plc_cross_references(plc, export_base_dir):
"""Exports cross-references for various elements from a given PLC."""
plc_name = plc.get_name()
print(f"\n--- Processing PLC: {plc_name} ---")
# Define base export path for this PLC's cross-references
plc_export_dir = export_base_dir / plc_name
plc_export_dir.mkdir(parents=True, exist_ok=True) # Use pathlib's mkdir
# --- Export Program Block Cross-References ---
blocks_cr_exported = 0
blocks_cr_skipped = 0
print(f"\n[PLC: {plc_name}] Exporting Program Block Cross-References...")
blocks_cr_path = plc_export_dir / "ProgramBlocks_CR"
blocks_cr_path.mkdir(exist_ok=True)
print(f" Target: {blocks_cr_path}")
try:
# Assuming get_program_blocks() doesn't need folder_path to get all blocks
program_blocks = plc.get_program_blocks()
print(f" Found {len(program_blocks)} program blocks.")
for block in program_blocks:
block_name = block.get_name()
print(f" Processing block: {block_name}...")
try:
# Note: Consistency check might not be needed/available before cross-ref export
print(f" Exporting cross-references for {block_name}...")
block.export_cross_references(
target_directorypath=str(
blocks_cr_path
), # API likely needs string path
filter=CROSS_REF_FILTER,
)
blocks_cr_exported += 1
except RuntimeError as block_ex:
print(
f" TIA ERROR exporting cross-references for block {block_name}: {block_ex}"
)
blocks_cr_skipped += 1
except Exception as block_ex:
print(
f" GENERAL ERROR exporting cross-references for block {block_name}: {block_ex}"
)
traceback.print_exc() # Print stack trace for general errors
blocks_cr_skipped += 1
print(
f" Program Block CR Export Summary: Exported={blocks_cr_exported}, Skipped/Errors={blocks_cr_skipped}"
)
except AttributeError:
print(
" AttributeError: Could not find 'get_program_blocks' on PLC object. Skipping Program Blocks."
)
except Exception as e:
print(f" ERROR accessing Program Blocks for cross-reference export: {e}")
traceback.print_exc()
# --- Export PLC Tag Table Cross-References ---
tags_cr_exported = 0
tags_cr_skipped = 0
print(f"\n[PLC: {plc_name}] Exporting PLC Tag Table Cross-References...")
tags_cr_path = plc_export_dir / "PlcTags_CR"
tags_cr_path.mkdir(exist_ok=True)
print(f" Target: {tags_cr_path}")
try:
# Assuming get_plc_tag_tables() doesn't need folder_path to get all tables
tag_tables = plc.get_plc_tag_tables()
print(f" Found {len(tag_tables)} Tag Tables.")
for table in tag_tables:
table_name = table.get_name()
print(f" Processing Tag Table: {table_name}...")
try:
print(f" Exporting cross-references for {table_name}...")
table.export_cross_references(
target_directorypath=str(tags_cr_path), filter=CROSS_REF_FILTER
)
tags_cr_exported += 1
except RuntimeError as table_ex:
print(
f" TIA ERROR exporting cross-references for Tag Table {table_name}: {table_ex}"
)
tags_cr_skipped += 1
except Exception as table_ex:
print(
f" GENERAL ERROR exporting cross-references for Tag Table {table_name}: {table_ex}"
)
traceback.print_exc()
tags_cr_skipped += 1
print(
f" Tag Table CR Export Summary: Exported={tags_cr_exported}, Skipped/Errors={tags_cr_skipped}"
)
except AttributeError:
print(
" AttributeError: Could not find 'get_plc_tag_tables' on PLC object. Skipping Tag Tables."
)
except Exception as e:
print(f" ERROR accessing Tag Tables for cross-reference export: {e}")
traceback.print_exc()
# --- Export PLC Data Type (UDT) Cross-References ---
udts_cr_exported = 0
udts_cr_skipped = 0
print(f"\n[PLC: {plc_name}] Exporting PLC Data Type (UDT) Cross-References...")
udts_cr_path = plc_export_dir / "PlcDataTypes_CR"
udts_cr_path.mkdir(exist_ok=True)
print(f" Target: {udts_cr_path}")
try:
# Assuming get_user_data_types() doesn't need folder_path to get all UDTs
udts = plc.get_user_data_types()
print(f" Found {len(udts)} UDTs.")
for udt in udts:
udt_name = udt.get_name()
print(f" Processing UDT: {udt_name}...")
try:
print(f" Exporting cross-references for {udt_name}...")
udt.export_cross_references(
target_directorypath=str(udts_cr_path), filter=CROSS_REF_FILTER
)
udts_cr_exported += 1
except RuntimeError as udt_ex:
print(
f" TIA ERROR exporting cross-references for UDT {udt_name}: {udt_ex}"
)
udts_cr_skipped += 1
except Exception as udt_ex:
print(
f" GENERAL ERROR exporting cross-references for UDT {udt_name}: {udt_ex}"
)
traceback.print_exc()
udts_cr_skipped += 1
print(
f" UDT CR Export Summary: Exported={udts_cr_exported}, Skipped/Errors={udts_cr_skipped}"
)
except AttributeError:
print(
" AttributeError: Could not find 'get_user_data_types' on PLC object. Skipping UDTs."
)
except Exception as e:
print(f" ERROR accessing UDTs for cross-reference export: {e}")
traceback.print_exc()
# --- Export System Block Cross-References ---
sys_blocks_cr_exported = 0
sys_blocks_cr_skipped = 0
print(f"\n[PLC: {plc_name}] Attempting to Export System Block Cross-References...")
sys_blocks_cr_path = plc_export_dir / "SystemBlocks_CR"
sys_blocks_cr_path.mkdir(exist_ok=True)
print(f" Target: {sys_blocks_cr_path}")
try:
# Check if method exists before calling
if hasattr(plc, "get_system_blocks"):
system_blocks = plc.get_system_blocks()
print(
f" Found {len(system_blocks)} system blocks (using get_system_blocks)."
)
for sys_block in system_blocks:
sys_block_name = sys_block.get_name()
print(f" Processing System Block: {sys_block_name}...")
try:
print(f" Exporting cross-references for {sys_block_name}...")
sys_block.export_cross_references(
target_directorypath=str(sys_blocks_cr_path),
filter=CROSS_REF_FILTER,
)
sys_blocks_cr_exported += 1
except RuntimeError as sys_ex:
print(
f" TIA ERROR exporting cross-references for System Block {sys_block_name}: {sys_ex}"
)
sys_blocks_cr_skipped += 1
except Exception as sys_ex:
print(
f" GENERAL ERROR exporting cross-references for System Block {sys_block_name}: {sys_ex}"
)
traceback.print_exc()
sys_blocks_cr_skipped += 1
else:
print(
" Method 'get_system_blocks' not found on PLC object. Skipping System Blocks."
)
# Alternative: Try navigating DeviceItems if needed, but that's more complex.
print(
f" System Block CR Export Summary: Exported={sys_blocks_cr_exported}, Skipped/Errors={sys_blocks_cr_skipped}"
)
except AttributeError: # Catch if get_name() or other methods fail on sys_block
print(
" AttributeError during System Block processing. Skipping remaining System Blocks."
)
traceback.print_exc()
except Exception as e:
print(
f" ERROR accessing/processing System Blocks for cross-reference export: {e}"
)
traceback.print_exc()
# --- Export Software Unit Cross-References ---
sw_units_cr_exported = 0
sw_units_cr_skipped = 0
print(f"\n[PLC: {plc_name}] Attempting to Export Software Unit Cross-References...")
sw_units_cr_path = plc_export_dir / "SoftwareUnits_CR"
sw_units_cr_path.mkdir(exist_ok=True)
print(f" Target: {sw_units_cr_path}")
try:
# Check if method exists before calling
if hasattr(plc, "get_software_units"):
software_units = plc.get_software_units()
print(f" Found {len(software_units)} Software Units.")
for unit in software_units:
unit_name = unit.get_name()
print(f" Processing Software Unit: {unit_name}...")
try:
print(f" Exporting cross-references for {unit_name}...")
unit.export_cross_references(
target_directorypath=str(sw_units_cr_path),
filter=CROSS_REF_FILTER,
)
sw_units_cr_exported += 1
except RuntimeError as unit_ex:
print(
f" TIA ERROR exporting cross-references for Software Unit {unit_name}: {unit_ex}"
)
sw_units_cr_skipped += 1
except Exception as unit_ex:
print(
f" GENERAL ERROR exporting cross-references for Software Unit {unit_name}: {unit_ex}"
)
traceback.print_exc()
sw_units_cr_skipped += 1
print(
f" Software Unit CR Export Summary: Exported={sw_units_cr_exported}, Skipped/Errors={sw_units_cr_skipped}"
)
else:
print(
" Method 'get_software_units' not found on PLC object. Skipping Software Units."
)
except AttributeError: # Catch if get_name() or other methods fail on unit
print(
" AttributeError during Software Unit processing. Skipping remaining Software Units."
)
traceback.print_exc()
except Exception as e:
print(
f" ERROR accessing/processing Software Units for cross-reference export: {e}"
)
traceback.print_exc()
print(f"\n--- Finished processing PLC: {plc_name} ---")
# --- Main Script ---
if __name__ == "__main__":
configs = load_configuration()
working_directory = configs.get("working_directory")
print("--- TIA Portal Cross-Reference Exporter ---")
# Validate working directory
if not working_directory or not os.path.isdir(working_directory):
print("ERROR: Working directory not set or invalid in configuration.")
print("Please configure the working directory using the main application.")
sys.exit(1)
# 1. Select Project File
project_file = select_project_file()
# 2. Define Export Directory using working_directory and subfolder
# The export base directory is the working directory. PLC-specific folders will be created inside.
export_base_dir = Path(working_directory)
try:
# Ensure the base working directory exists (it should, but check doesn't hurt)
export_base_dir.mkdir(parents=True, exist_ok=True)
print(f"\nSelected Project: {project_file}")
print(f"Using Base Export Directory: {export_base_dir.resolve()}")
except Exception as e:
        print(f"ERROR: Could not create export directory '{export_base_dir}'. Error: {e}")
sys.exit(1)
portal_instance = None
project_object = None
try:
# 3. Connect to TIA Portal
print(f"\nConnecting to TIA Portal V{TIA_PORTAL_VERSION}...")
# Connect using WithGraphicalUserInterface mode for visibility
portal_instance = ts.open_portal(
version=TIA_PORTAL_VERSION,
portal_mode=ts.Enums.PortalMode.WithGraphicalUserInterface,
)
print("Connected to TIA Portal.")
print(f"Portal Process ID: {portal_instance.get_process_id()}")
# 4. Open Project
print(f"Opening project: {os.path.basename(project_file)}...")
project_path_obj = Path(project_file) # Use Path object
project_object = portal_instance.open_project(
project_file_path=str(project_path_obj)
)
if project_object is None:
print("Project might already be open, attempting to get handle...")
project_object = portal_instance.get_project()
if project_object is None:
raise Exception("Failed to open or get the specified project.")
print("Project opened successfully.")
# 5. Get PLCs
plcs = project_object.get_plcs()
if not plcs:
print("No PLC devices found in the project.")
else:
print(
f"Found {len(plcs)} PLC(s). Starting cross-reference export process..."
)
# 6. Iterate and Export Cross-References for each PLC
for plc_device in plcs:
export_plc_cross_references(
plc=plc_device,
export_base_dir=export_base_dir, # Pass the base directory
)
print("\nCross-reference export process completed.")
except RuntimeError as tia_ex:
print(f"\nTIA Portal Openness Error: {tia_ex}")
traceback.print_exc()
except FileNotFoundError:
print(f"\nERROR: Project file not found at {project_file}")
except Exception as e:
print(f"\nAn unexpected error occurred: {e}")
traceback.print_exc()
finally:
# 7. Cleanup
if portal_instance:
try:
print("\nClosing TIA Portal...")
portal_instance.close_portal()
print("TIA Portal closed.")
except Exception as close_ex:
print(f"Error during TIA Portal cleanup: {close_ex}")
print("\nScript finished.")


@@ -1,11 +1,11 @@
 {
-    "scl_output_dir": "scl_output",
-    "xref_output_dir": "xref_output",
-    "xref_source_subdir": "source",
+    "aggregated_filename": "full_project_representation.md",
     "call_xref_filename": "xref_calls_tree.md",
     "db_usage_xref_filename": "xref_db_usage_summary.md",
+    "max_call_depth": "10",
+    "max_users_list": "20",
     "plc_tag_xref_filename": "xref_plc_tags_summary.md",
-    "max_call_depth": 5,
-    "max_users_list": 20,
-    "aggregated_filename": "full_project_representation.md"
+    "scl_output_dir": "scl_output",
+    "xref_output_dir": "xref_output",
+    "xref_source_subdir": "source"
 }
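Note that this change also stores `max_call_depth` and `max_users_list` as strings ("10", "20") where the old file held numbers (5, 20). Any consumer of this config should coerce defensively; a small sketch (the helper name is illustrative, not from the repo):

```python
def get_int_param(cfg: dict, key: str, default: int) -> int:
    """Coerce a config value that may be stored as an int (5) or a string ("10")."""
    try:
        return int(cfg.get(key, default))
    except (TypeError, ValueError):
        return default
```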


@@ -1,5 +1,5 @@
 {
-    "name": "Procesador de XML exportado de TIA",
+    "name": "Procesador de XML LAD-SCL-AWL exportado de TIA a SCL / Markdown",
     "description": "Conjunto de scripts que procesan archivos XML exportados de TIA, convirtiendo los objetos LAD a SCL y generando documentación en formato Markdown. ",
     "version": "1.0",
     "author": "Miguel"


@@ -14,7 +14,7 @@ try:
     processors_dir = os.path.join(project_base_dir, 'processors')
     if processors_dir not in sys.path:
         sys.path.insert(0, processors_dir)  # Añadir al path si no está
-    from processor_utils import format_variable_name
+    from processors.processor_utils import format_variable_name
 except ImportError:
     print("Advertencia: No se pudo importar 'format_variable_name' desde processors.processor_utils.")
     print("Usando una implementación local básica.")
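The corrected import is wrapped, per the surrounding context, in a try/except that falls back to a local implementation when the package is unavailable. A minimal sketch of that pattern — the fallback body here is an assumption for illustration, not the project's actual `format_variable_name`:

```python
try:
    from processors.processor_utils import format_variable_name
except ImportError:
    import re

    def format_variable_name(name: str) -> str:
        # Hypothetical fallback: strip surrounding quotes and replace any
        # non-word character with '_' so the result is a valid identifier.
        return re.sub(r"\W", "_", name.strip('"'))
```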

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -0,0 +1,34 @@
--- Log de Ejecución: x4_cross_reference.py ---
Grupo: XML Parser to SCL
Directorio de Trabajo: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Inicio: 2025-05-05 16:34:28
Fin: 2025-05-05 16:34:30
Duración: 0:00:01.642768
Estado: SUCCESS (Código de Salida: 0)
--- SALIDA ESTÁNDAR (STDOUT) ---
(x4 - Standalone) Ejecutando generación de referencias cruzadas...
--- Iniciando Generación de Referencias Cruzadas y Fuentes MD (x4) ---
Buscando archivos JSON procesados en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC
Directorio de salida XRef: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\xref_output
Directorio fuente SCL/MD (para análisis DB/Tag y copia): scl_output
Subdirectorio fuentes MD para XRef: source
Copiando y preparando archivos fuente para Obsidian en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\xref_output\source
Archivos fuente preparados: 378 SCL convertidos, 30 MD copiados.
Buscando archivos XML XRef en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\ProgramBlocks_CR
Archivos JSON encontrados: 342
Datos cargados para 342 bloques.
Mapa InstanciaDB -> FB creado con 0 entradas.
Datos cargados para 342 bloques (1793 PLC Tags globales).
Construyendo grafo de llamadas desde archivos XML XRef...
Archivos XML XRef encontrados: 138
Generando ÁRBOL XRef de llamadas en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\xref_output\xref_calls_tree.md
Generando RESUMEN XRef de uso de DBs en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\xref_output\xref_db_usage_summary.md
Generando RESUMEN XRef de uso de PLC Tags en: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\xref_output\xref_plc_tags_summary.md
--- Generación de Referencias Cruzadas y Fuentes MD (x4) Completada ---
(x4 - Standalone) Proceso completado exitosamente.
--- ERRORES (STDERR) ---
Ninguno
--- FIN DEL LOG ---
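The call-graph step in the log above is built from the exported XRef XML files. A hedged sketch of the aggregation idea — the `SourceObject`/`ReferenceObject` element names and the `Name` attribute are assumptions taken from the parsing guide added in this branch; adjust to the real schema of your TIA Portal version:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def collect_calls(xref_path):
    """Aggregate caller -> set(callees) from one _XRef.xml file (sketch)."""
    calls = defaultdict(set)
    root = ET.parse(xref_path).getroot()
    for src in root.iter("SourceObject"):        # assumed element name
        caller = src.get("Name")                 # assumed attribute name
        for ref in src.iter("ReferenceObject"):  # assumed element name
            callee = ref.get("Name")
            if caller and callee:
                calls[caller].add(callee)
    return calls
```

Merging the dictionaries returned for each file yields the project-wide graph that `xref_calls_tree.md` is rendered from.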


@@ -10,6 +10,44 @@ El proceso se divide en tres etapas principales, cada una manejada por un script
2. **Procesamiento Semántico (`process.py`):** Lee el JSON enriquecido y, de forma iterativa, traduce cada instrucción LAD a su equivalente SCL, manejando dependencias, propagando el estado lógico (RLO), y agrupando lógica paralela. El SCL generado se almacena *dentro* del propio JSON.
3. **Generación de SCL Final (`generate_scl.py`):** Lee el JSON completamente procesado y ensambla el código SCL final en un archivo `.scl` formateado, incluyendo declaraciones de variables y el cuerpo del programa.
### Directory structure
<working_directory>/
├── <PLC1_Name>/
│ ├── ProgramBlocks_XML/
│ │ └── ... (archivos XML de bloques)
│ ├── ProgramBlocks_SCL/
│ │ └── ... (archivos SCL de bloques)
│ ├── ProgramBlocks_CR/
│ │ └── ... (archivos XML de referencias cruzadas de bloques)
│ ├── PlcTags/
│ │ └── ... (archivos XML de tablas de tags)
│ ├── PlcTags_CR/
│ │ └── ... (archivos XML de referencias cruzadas de tablas de tags)
│ ├── PlcDataTypes_CR/
│ │ └── ... (archivos XML de referencias cruzadas de UDTs)
│ ├── SystemBlocks_CR/
│ │ └── ...
│   ├── SoftwareUnits_CR/
│   │   └── ...
│   └── Documentation/
│       ├── Source/
│       │   └── ... (archivos md de bloques de programa)
│       ├── JSON/
│       │   └── ... (archivos JSON temporales)
│       ├── xref_calls_tree.md
│       ├── xref_db_usage_summary.md
│       ├── xref_plc_tags_summary.md
│       ├── full_project_representation.md
│       └── SAE196_c0.2_CAx_Export_Hardware_Tree.md
├── <PLC2_Name>/
│ ├── ProgramBlocks_XML/
│ │ └── ...
│ └── ...
└── ...
## 2. Etapas del Pipeline

### Etapa 1: XML a JSON Enriquecido (`x1_to_json.py`)


@@ -4,15 +4,15 @@
         "model": "gpt-3.5-turbo"
     },
     "level2": {
-        "scl_output_dir": "scl_output",
-        "xref_output_dir": "xref_output",
-        "xref_source_subdir": "source",
+        "aggregated_filename": "full_project_representation.md",
         "call_xref_filename": "xref_calls_tree.md",
         "db_usage_xref_filename": "xref_db_usage_summary.md",
+        "max_call_depth": "10",
+        "max_users_list": "20",
         "plc_tag_xref_filename": "xref_plc_tags_summary.md",
-        "max_call_depth": 5,
-        "max_users_list": 20,
-        "aggregated_filename": "full_project_representation.md"
+        "scl_output_dir": "scl_output",
+        "xref_output_dir": "xref_output",
+        "xref_source_subdir": "source"
     },
     "level3": {},
     "working_directory": "C:\\Trabajo\\SIDEL\\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\\Reporte\\IOExport"


@@ -1,6 +1,6 @@
 {
     "x0_main.py": {
-        "display_name": "Procesar Exportación XML",
+        "display_name": "1: Procesar Exportación XML",
         "short_description": "LadderToSCL - Conversor de Siemens LAD/FUP XML a SCL",
         "long_description": "Este script es el punto de entrada y orquestador principal para el proceso de conversión de archivos XML de Siemens TIA Portal (LAD/FUP) a código SCL y la generación de documentación relacionada.\n\n**Lógica Principal:**\n\n1. **Configuración:** Carga parámetros desde `ParamManagerScripts` (directorio de trabajo, nombres de carpetas de salida, etc.).\n2. **Logging:** Inicia un archivo `log.txt` para registrar detalladamente el progreso y los errores.\n3. **Descubrimiento:** Busca recursivamente todos los archivos `.xml` dentro del subdirectorio `PLC` del directorio de trabajo configurado.\n4. **Procesamiento Individual (Pasos x1-x3):**\n * Itera sobre cada archivo XML encontrado.\n * Implementa lógica para **saltar** pasos si el XML no ha cambiado y las salidas ya existen y están actualizadas.\n * Llama a funciones de `x1_to_json.py`, `x2_process.py`, y `x3_generate_scl.py` para convertir XML -> JSON intermedio -> JSON procesado -> archivo SCL/Markdown final.\n5. **Referencias Cruzadas (Paso x4):** Llama a una función de `x4_cross_reference.py` para generar análisis de llamadas, uso de DBs, etc., basándose en los archivos procesados.\n6. **Agregación (Paso x5):** Llama a una función de `x5_aggregate.py` para combinar las salidas SCL/Markdown y las referencias cruzadas en un único archivo Markdown resumen.\n7. **Resumen y Salida:** Registra un resumen final del proceso (éxitos, saltos, fallos) y finaliza con un código de estado (0 para éxito, 1 si hubo errores).\n",
         "hidden": false
@@ -24,10 +24,10 @@
         "hidden": true
     },
     "x4_cross_reference.py": {
-        "display_name": "x4_cross_reference",
+        "display_name": "4: Generar Cross References",
         "short_description": "LadderToSCL - Conversor de Siemens LAD/FUP XML a SCL",
         "long_description": "",
-        "hidden": true
+        "hidden": false
     },
     "x5_aggregate.py": {
         "display_name": "x5_aggregate",

View File

@ -17,6 +17,7 @@ import time
import traceback import traceback
import json import json
import datetime # <-- NUEVO: Para timestamps import datetime # <-- NUEVO: Para timestamps
import shutil # <-- ADDED: Import shutil for file copying
script_root = os.path.dirname( script_root = os.path.dirname(
os.path.dirname(os.path.dirname(os.path.dirname(__file__))) os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
) )
@ -232,10 +233,37 @@ if __name__ == "__main__":
for xml_file in xml_files_found for xml_file in xml_files_found
] ]
# --- NUEVO: Identificar bloques SCL nativos ---
log_message("\n--- Fase 0.5: Identificando archivos .scl nativos existentes ---", log_f)
native_scl_blocks = set()
try:
# Usar un patrón similar a la Fase 1.5 para encontrar SCLs en el proyecto fuente
search_scl_pattern_native = os.path.join(xml_project_dir, "**", "*.scl")
existing_scl_files_native = glob.glob(search_scl_pattern_native, recursive=True)
# Excluir directorios de salida para evitar auto-referencias si están anidados
scl_output_dir_abs_native = os.path.abspath(os.path.join(xml_project_dir, cfg_scl_output_dirname))
xref_output_dir_abs_native = os.path.abspath(os.path.join(xml_project_dir, cfg_xref_output_dirname))
for scl_file_path in existing_scl_files_native:
if not os.path.abspath(os.path.dirname(scl_file_path)).startswith(scl_output_dir_abs_native) and \
not os.path.abspath(os.path.dirname(scl_file_path)).startswith(xref_output_dir_abs_native):
base_name = os.path.splitext(os.path.basename(scl_file_path))[0]
native_scl_blocks.add(base_name)
log_message(f"Se identificaron {len(native_scl_blocks)} posibles bloques SCL nativos (con archivo .scl).", log_f)
except Exception as e:
log_message(f"Error durante la identificación de SCL nativos: {e}. Se continuará sin priorización.", log_f)
# --- FIN NUEVO ---
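The exclusion logic above relies on `str.startswith` over absolute directory paths, which can misfire on sibling directories sharing a prefix (e.g. `scl_output_backup` would match `scl_output`). A small standalone sketch of a safer containment check, using made-up paths:

```python
import os

def is_under(path, directory):
    """True if `path` lies inside `directory` (or a subdirectory of it)."""
    path_abs = os.path.abspath(path)
    dir_abs = os.path.abspath(directory)
    # commonpath compares whole path components, so 'scl_output_backup'
    # does not count as being inside 'scl_output'
    return os.path.commonpath([path_abs, dir_abs]) == dir_abs
```

With a plain prefix check the second case below would wrongly return true.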
# --- Directorios de salida --- # --- Directorios de salida ---
# Estos directorios ahora se crearán DENTRO de xml_project_dir (es decir, dentro de 'PLC') # Estos directorios ahora se crearán DENTRO de xml_project_dir (es decir, dentro de 'PLC')
scl_output_dir = os.path.join(xml_project_dir, cfg_scl_output_dirname) # Usar valor de config scl_output_dir = os.path.join(xml_project_dir, cfg_scl_output_dirname) # Usar valor de config
xref_output_dir = os.path.join(xml_project_dir, cfg_xref_output_dirname) # Usar valor de config xref_output_dir = os.path.join(xml_project_dir, cfg_xref_output_dirname) # Usar valor de config
# <-- ADDED: Ensure output directories exist -->
os.makedirs(scl_output_dir, exist_ok=True)
os.makedirs(xref_output_dir, exist_ok=True)
# <-- END ADDED -->
# --- PARTE 2: PROCESAMIENTO INDIVIDUAL (x1, x2, x3) --- # --- PARTE 2: PROCESAMIENTO INDIVIDUAL (x1, x2, x3) ---
log_message("\n--- Fase 1: Procesamiento Individual (x1, x2, x3) ---", log_f) log_message("\n--- Fase 1: Procesamiento Individual (x1, x2, x3) ---", log_f)
@ -247,6 +275,7 @@ if __name__ == "__main__":
skipped_full_count = 0 skipped_full_count = 0
failed_count = 0 failed_count = 0
skipped_partial_count = 0 skipped_partial_count = 0
skipped_for_native_scl = 0 # <-- NUEVO: Contador para SCL nativos
for i, xml_filepath in enumerate(xml_files_found): for i, xml_filepath in enumerate(xml_files_found):
relative_path = os.path.relpath(xml_filepath, working_directory) relative_path = os.path.relpath(xml_filepath, working_directory)
@ -261,6 +290,16 @@ if __name__ == "__main__":
parsing_dir, f"{base_filename}_processed.json" # <-- Corregido: nombre correcto parsing_dir, f"{base_filename}_processed.json" # <-- Corregido: nombre correcto
) )
# --- NUEVO: Comprobar si es un SCL nativo ---
if base_filename in native_scl_blocks:
log_message(
f"--- SALTANDO PROCESAMIENTO XML (x1, x2, x3) para: {relative_path}. Se usará el archivo .scl original existente. ---",
log_f,
)
skipped_for_native_scl += 1
continue # Pasar al siguiente archivo XML
# --- FIN NUEVO ---
# 1. Comprobar estado de salto # 1. Comprobar estado de salto
skip_info = check_skip_status( skip_info = check_skip_status(
xml_filepath, processed_json_filepath, scl_output_dir, log_f xml_filepath, processed_json_filepath, scl_output_dir, log_f
@ -348,6 +387,50 @@ if __name__ == "__main__":
failed_count += 1 failed_count += 1
continue # Pasar al siguiente archivo continue # Pasar al siguiente archivo
# <-- ADDED: Phase 1.5: Copy existing SCL files -->
log_message(f"\n--- Fase 1.5: Copiando archivos SCL existentes desde '{xml_project_dir}' a '{scl_output_dir}' ---", log_f)
copied_scl_count = 0
skipped_scl_count = 0
try:
search_scl_pattern = os.path.join(xml_project_dir, "**", "*.scl")
existing_scl_files = glob.glob(search_scl_pattern, recursive=True)
# Exclude files already in the target scl_output_dir or xref_output_dir to avoid self-copying if nested
scl_output_dir_abs = os.path.abspath(scl_output_dir)
xref_output_dir_abs = os.path.abspath(xref_output_dir)
filtered_scl_files = [
f for f in existing_scl_files
if not os.path.abspath(os.path.dirname(f)).startswith(scl_output_dir_abs) and \
not os.path.abspath(os.path.dirname(f)).startswith(xref_output_dir_abs)
]
if not filtered_scl_files:
log_message("No se encontraron archivos .scl existentes para copiar (excluyendo directorios de salida).", log_f)
else:
log_message(f"Se encontraron {len(filtered_scl_files)} archivos .scl existentes para copiar:", log_f)
for src_scl_path in filtered_scl_files:
relative_scl_path = os.path.relpath(src_scl_path, xml_project_dir)
dest_scl_path = os.path.join(scl_output_dir, os.path.basename(src_scl_path)) # Copy directly into scl_output_dir
# Check if a file with the same name was already generated from XML
if os.path.exists(dest_scl_path):
log_message(f" - Omitiendo copia de '{relative_scl_path}': Ya existe un archivo generado con el mismo nombre en el destino.", log_f, also_print=False)
skipped_scl_count += 1
else:
try:
log_message(f" - Copiando '{relative_scl_path}' a '{os.path.relpath(dest_scl_path, working_directory)}'", log_f, also_print=False)
shutil.copy2(src_scl_path, dest_scl_path) # copy2 preserves metadata
copied_scl_count += 1
except Exception as copy_err:
log_message(f" - ERROR copiando '{relative_scl_path}': {copy_err}", log_f)
# Decide if this should count as a general failure
log_message(f"Copia de SCL existentes finalizada. Copiados: {copied_scl_count}, Omitidos (conflicto nombre): {skipped_scl_count}", log_f)
except Exception as e:
log_message(f"Error durante la Fase 1.5 (Copia SCL): {e}", log_f)
# <-- END ADDED -->
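The core of Phase 1.5 — copy native `.scl` files into the output directory unless an XML-generated file already claimed that name — can be reduced to the following sketch (function name and counters are illustrative, not from the script):

```python
import os
import shutil

def copy_existing_scl(src_files, dest_dir):
    """Copy native .scl files into dest_dir; files already generated
    from XML with the same name take precedence and are not overwritten."""
    copied, skipped = 0, 0
    os.makedirs(dest_dir, exist_ok=True)
    for src in src_files:
        dest = os.path.join(dest_dir, os.path.basename(src))
        if os.path.exists(dest):
            skipped += 1
        else:
            shutil.copy2(src, dest)  # copy2 also preserves timestamps/metadata
            copied += 1
    return copied, skipped
```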
# --- PARTE 3: EJECUTAR x4 (Referencias Cruzadas) --- # --- PARTE 3: EJECUTAR x4 (Referencias Cruzadas) ---
log_message( log_message(
f"\n--- Fase 2: Ejecutando x4_cross_reference.py (salida en '{cfg_xref_output_dirname}/') ---", # Usar valor de config f"\n--- Fase 2: Ejecutando x4_cross_reference.py (salida en '{cfg_xref_output_dirname}/') ---", # Usar valor de config
@ -464,12 +547,12 @@ if __name__ == "__main__":
log_f, log_f,
) )
log_message(f"Archivos fallidos (en x1, x2, x3 o error inesperado): {failed_count}", log_f) log_message(f"Archivos fallidos (en x1, x2, x3 o error inesperado): {failed_count}", log_f)
# El detalle de archivos fallidos es más difícil de rastrear ahora sin el dict 'file_status' log_message( # <-- NUEVO: Reportar SCL nativos saltados
# Se podría reintroducir si es necesario, actualizándolo en cada paso. f"Archivos XML omitidos (priorizando .scl nativo): {skipped_for_native_scl}",
# Por ahora, solo mostramos el conteo. log_f,
# if failed_count > 0: )
# log_message("Archivos fallidos:", log_f) log_message(f"Archivos SCL existentes copiados (Fase 1.5): {copied_scl_count}", log_f) # <-- ADDED: Report copied SCL
# ... (lógica para mostrar cuáles fallaron) ... log_message(f"Archivos SCL existentes omitidos por conflicto (Fase 1.5): {skipped_scl_count}", log_f) # <-- ADDED: Report skipped SCL
log_message( log_message(
f"Fase 2 (Generación XRef - x4): {'Completada' if run_x4 and success_x4 else ('Fallida' if run_x4 and not success_x4 else 'Omitida')}", f"Fase 2 (Generación XRef - x4): {'Completada' if run_x4 and success_x4 else ('Fallida' if run_x4 and not success_x4 else 'Omitida')}",
log_f, log_f,

View File

@ -285,7 +285,8 @@ def convert_xml_to_json(xml_filepath, json_filepath):
block_tag_name = etree.QName(the_block.tag).localname # Nombre del tag (ej. SW.Blocks.OB) block_tag_name = etree.QName(the_block.tag).localname # Nombre del tag (ej. SW.Blocks.OB)
block_type_map = { block_type_map = {
"SW.Blocks.FC": "FC", "SW.Blocks.FB": "FB", "SW.Blocks.FC": "FC", "SW.Blocks.FB": "FB",
"SW.Blocks.GlobalDB": "GlobalDB", "SW.Blocks.OB": "OB" "SW.Blocks.GlobalDB": "GlobalDB", "SW.Blocks.OB": "OB",
"SW.Blocks.InstanceDB": "InstanceDB" # <-- ADDED: Recognize InstanceDB
} }
block_type_found = block_type_map.get(block_tag_name, "UnknownBlockType") block_type_found = block_type_map.get(block_tag_name, "UnknownBlockType")
print(f"Paso 2b: Bloque {block_tag_name} (Tipo: {block_type_found}) encontrado (ID={the_block.get('ID')}).") print(f"Paso 2b: Bloque {block_tag_name} (Tipo: {block_type_found}) encontrado (ID={the_block.get('ID')}).")
@ -294,6 +295,8 @@ def convert_xml_to_json(xml_filepath, json_filepath):
print("Paso 3: Extrayendo atributos del bloque...") print("Paso 3: Extrayendo atributos del bloque...")
attribute_list_node = the_block.xpath("./AttributeList") # Buscar hijo directo attribute_list_node = the_block.xpath("./AttributeList") # Buscar hijo directo
block_name_val, block_number_val, block_lang_val = "Unknown", None, "Unknown" block_name_val, block_number_val, block_lang_val = "Unknown", None, "Unknown"
instance_of_name_val = None # <-- NUEVO: Para InstanceDB
instance_of_type_val = None # <-- NUEVO: Para InstanceDB
block_comment_val = "" block_comment_val = ""
if attribute_list_node: if attribute_list_node:
@ -306,7 +309,13 @@ def convert_xml_to_json(xml_filepath, json_filepath):
lang_node = attr_list.xpath("./ProgrammingLanguage/text()") lang_node = attr_list.xpath("./ProgrammingLanguage/text()")
# Asignar lenguaje por defecto si no se encuentra # Asignar lenguaje por defecto si no se encuentra
block_lang_val = lang_node[0].strip() if lang_node else \ block_lang_val = lang_node[0].strip() if lang_node else \
("DB" if block_type_found == "GlobalDB" else "Unknown") ("DB" if block_type_found in ["GlobalDB", "InstanceDB"] else "Unknown") # <-- MODIFIED: Include InstanceDB for DB language default
# <-- NUEVO: Extraer info de instancia si es InstanceDB -->
if block_type_found == "InstanceDB":
inst_name_node = attr_list.xpath("./InstanceOfName/text()")
instance_of_name_val = inst_name_node[0].strip() if inst_name_node else None
inst_type_node = attr_list.xpath("./InstanceOfType/text()") # Generalmente 'FB'
instance_of_type_val = inst_type_node[0].strip() if inst_type_node else None
print(f"Paso 3: Atributos: Nombre='{block_name_val}', Número={block_number_val}, Lenguaje Bloque='{block_lang_val}'") print(f"Paso 3: Atributos: Nombre='{block_name_val}', Número={block_number_val}, Lenguaje Bloque='{block_lang_val}'")
# Extraer comentario del bloque (puede estar en AttributeList o ObjectList) # Extraer comentario del bloque (puede estar en AttributeList o ObjectList)
@ -320,7 +329,7 @@ def convert_xml_to_json(xml_filepath, json_filepath):
print(f"Paso 3b: Comentario bloque: '{block_comment_val[:50]}...'") print(f"Paso 3b: Comentario bloque: '{block_comment_val[:50]}...'")
else: else:
print(f"Advertencia: No se encontró AttributeList para el bloque {block_type_found}.") print(f"Advertencia: No se encontró AttributeList para el bloque {block_type_found}.")
if block_type_found == "GlobalDB": block_lang_val = "DB" # Default para DB if block_type_found in ["GlobalDB", "InstanceDB"]: block_lang_val = "DB" # Default para DB/InstanceDB # <-- MODIFIED: Include InstanceDB
# Inicializar diccionario de resultado para el bloque # Inicializar diccionario de resultado para el bloque
result = { result = {
@ -363,7 +372,7 @@ def convert_xml_to_json(xml_filepath, json_filepath):
# --- Procesar Redes (CompileUnits) --- # --- Procesar Redes (CompileUnits) ---
if block_type_found not in ["GlobalDB"]: # DBs no tienen redes ejecutables if block_type_found not in ["GlobalDB", "InstanceDB"]: # DBs/InstanceDBs no tienen redes ejecutables # <-- MODIFIED: Include InstanceDB
print("Paso 5: Buscando y PROCESANDO redes (CompileUnits)...") print("Paso 5: Buscando y PROCESANDO redes (CompileUnits)...")
networks_processed_count = 0 networks_processed_count = 0
result["networks"] = [] # Asegurar que esté inicializado result["networks"] = [] # Asegurar que esté inicializado
@ -424,11 +433,10 @@ def convert_xml_to_json(xml_filepath, json_filepath):
if networks_processed_count == 0: print(f"Advertencia: ObjectList para {block_type_found} sin SW.Blocks.CompileUnit.") if networks_processed_count == 0: print(f"Advertencia: ObjectList para {block_type_found} sin SW.Blocks.CompileUnit.")
else: print(f"Advertencia: No se encontró ObjectList para el bloque {block_type_found}.") else: print(f"Advertencia: No se encontró ObjectList para el bloque {block_type_found}.")
else: print("Paso 5: Saltando procesamiento de redes para GlobalDB.") else: print(f"Paso 5: Saltando procesamiento de redes para {block_type_found}.") # <-- MODIFIED: Updated message
else: # No se encontró ningún bloque SW.Blocks.* else: # No se encontró ningún bloque SW.Blocks.*
print("Error Crítico: No se encontró el elemento raíz del bloque (<SW.Blocks.FC/FB/GlobalDB/OB>) después de descartar UDT/TagTable.") print("Error Crítico: No se encontró el elemento raíz del bloque (<SW.Blocks.FC/FB/GlobalDB/OB/InstanceDB>) después de descartar UDT/TagTable.") # <-- MODIFIED: Updated message
return False
# --- Fin del manejo de Bloques --- # --- Fin del manejo de Bloques ---
# --- Escritura del JSON Final --- # --- Escritura del JSON Final ---
@ -440,7 +448,7 @@ def convert_xml_to_json(xml_filepath, json_filepath):
print("Paso 6: Escribiendo el resultado en el archivo JSON...") print("Paso 6: Escribiendo el resultado en el archivo JSON...")
# Advertencias finales si faltan partes clave # Advertencias finales si faltan partes clave
if result.get("block_type") not in ["PlcUDT", "PlcTagTable"] and not result.get("interface"): print("ADVERTENCIA FINAL: 'interface' está vacía en el JSON.") if result.get("block_type") not in ["PlcUDT", "PlcTagTable"] and not result.get("interface"): print("ADVERTENCIA FINAL: 'interface' está vacía en el JSON.")
if result.get("block_type") not in ["PlcUDT", "PlcTagTable", "GlobalDB"] and not result.get("networks"): print("ADVERTENCIA FINAL: 'networks' está vacía en el JSON.") if result.get("block_type") not in ["PlcUDT", "PlcTagTable", "GlobalDB", "InstanceDB"] and not result.get("networks"): print("ADVERTENCIA FINAL: 'networks' está vacía en el JSON.") # <-- MODIFIED: Include InstanceDB
# Escribir el archivo JSON # Escribir el archivo JSON
try: try:

View File

@ -275,7 +275,7 @@ def process_json_to_scl(json_filepath, output_json_filepath):
block_type = data.get("block_type", "Unknown") block_type = data.get("block_type", "Unknown")
print(f"Procesando bloque tipo: {block_type}") print(f"Procesando bloque tipo: {block_type}")
if block_type in ["GlobalDB", "PlcUDT", "PlcTagTable"]: if block_type in ["GlobalDB", "PlcUDT", "PlcTagTable", "InstanceDB"]: # <-- MODIFIED: Add InstanceDB
print(f"INFO: El bloque es {block_type}. Saltando procesamiento lógico de x2.") print(f"INFO: El bloque es {block_type}. Saltando procesamiento lógico de x2.")
print( print(
f"Guardando JSON de {block_type} (con metadatos) en: {output_json_filepath}" f"Guardando JSON de {block_type} (con metadatos) en: {output_json_filepath}"

View File

@ -83,6 +83,11 @@ def generate_scl_or_markdown(
generation_function = generate_scl_for_db generation_function = generate_scl_for_db
func_args["project_root_dir"] = project_root_dir func_args["project_root_dir"] = project_root_dir
output_extension = ".scl" output_extension = ".scl"
elif block_type == "InstanceDB": # <-- ADDED: Handle InstanceDB
print(" -> Modo de generación: INSTANCE_DATA_BLOCK SCL")
generation_function = generate_scl_for_db # Use the same generator as GlobalDB
func_args["project_root_dir"] = project_root_dir
output_extension = ".scl"
elif block_type in ["FC", "FB", "OB"]: elif block_type in ["FC", "FB", "OB"]:
print(f" -> Modo de generación: {block_type} SCL") print(f" -> Modo de generación: {block_type} SCL")
generation_function = generate_scl_for_code_block generation_function = generate_scl_for_code_block

View File

@ -14,7 +14,9 @@ import traceback
import glob import glob
import re import re
import urllib.parse import urllib.parse
import xml.etree.ElementTree as ET # <-- NUEVO: Para parsear XML
import shutil # <-- NUEVO: Para copiar archivos import shutil # <-- NUEVO: Para copiar archivos
from generators.generator_utils import format_variable_name
from collections import defaultdict from collections import defaultdict
script_root = os.path.dirname( script_root = os.path.dirname(
os.path.dirname(os.path.dirname(os.path.dirname(__file__))) os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
@ -22,50 +24,15 @@ script_root = os.path.dirname(
sys.path.append(script_root) sys.path.append(script_root)
from backend.script_utils import load_configuration from backend.script_utils import load_configuration
# --- Importar format_variable_name (sin cambios) ---
try:
current_dir = os.path.dirname(os.path.abspath(__file__))
parent_dir = os.path.dirname(current_dir)
if parent_dir not in sys.path:
sys.path.insert(0, parent_dir)
from generators.generator_utils import format_variable_name
print("INFO: format_variable_name importado desde generators.generator_utils")
except ImportError:
print(
"ADVERTENCIA: No se pudo importar format_variable_name desde generators. Usando copia local."
)
def format_variable_name(name): # Fallback
if not name:
return "_INVALID_NAME_"
if name.startswith('"') and name.endswith('"'):
return name
prefix = "#" if name.startswith("#") else ""
if prefix:
name = name[1:]
if name and name[0].isdigit():
name = "_" + name
name = re.sub(r"[^a-zA-Z0-9_]", "_", name)
return prefix + name
# --- Constantes --- # --- Constantes ---
# SCL_OUTPUT_DIRNAME = "scl_output" # Se leerá de config
# XREF_SOURCE_SUBDIR = "source" # Se leerá de config
# CALL_XREF_FILENAME = "xref_calls_tree.md" # Se leerá de config
# DB_USAGE_XREF_FILENAME = "xref_db_usage_summary.md" # Se leerá de config
# PLC_TAG_XREF_FILENAME = "xref_plc_tags_summary.md" # Se leerá de config
# MAX_CALL_DEPTH = 5 # Se leerá de config
INDENT_STEP = " " INDENT_STEP = " "
# MAX_USERS_LIST = 20 # Se leerá de config
# --- Funciones de Análisis (find_calls_in_scl, find_db_tag_usage, find_plc_tag_usage sin cambios) --- # --- Funciones de Análisis (find_calls_in_scl, find_db_tag_usage, find_plc_tag_usage sin cambios) ---
# (Se omiten por brevedad, son las mismas de la versión anterior) # <-- MODIFICADO: Añadir instance_db_to_fb_map como parámetro -->
def find_calls_in_scl(scl_code, block_data): def find_calls_in_scl(scl_code, block_data, instance_db_to_fb_map):
calls = defaultdict(int) calls = defaultdict(int)
known_blocks = set(block_data.keys()) known_blocks = set(block_data.keys())
# La lógica de known_instances puede ser menos relevante ahora, pero la dejamos por si acaso
known_instances = set() known_instances = set()
for name, data in block_data.items(): for name, data in block_data.items():
block_info = data.get("data", {}) block_info = data.get("data", {})
@ -146,17 +113,28 @@ def find_calls_in_scl(scl_code, block_data):
potential_name_quoted = match.group(1) potential_name_quoted = match.group(1)
potential_name_clean = match.group(2) potential_name_clean = match.group(2)
if potential_name_clean.upper() in system_funcs: if potential_name_clean.upper() in system_funcs:
continue continue # Ignorar palabras clave del lenguaje y funciones estándar
is_instance_call = (
potential_name_clean.startswith("#") # <-- NUEVO: Comprobar si es una llamada a un DB de instancia conocido -->
or potential_name_quoted in known_instances fb_type_name = instance_db_to_fb_map.get(potential_name_clean)
) if fb_type_name:
if is_instance_call: # ¡Encontrado! Es una llamada vía DB de instancia. Contabilizar para el FB base.
pass calls[fb_type_name] += 1
elif potential_name_clean in known_blocks: else:
callee_type = block_data[potential_name_clean]["data"].get("block_type") # <-- Lógica Original (modificada para else) -->
if callee_type in ["FC", "FB"]: # No es un DB de instancia conocido, ¿es una llamada a FC/FB directamente o una instancia local (#)?
calls[potential_name_clean] += 1 is_local_instance_call = potential_name_clean.startswith("#")
# La comprobación 'potential_name_quoted in known_instances' es menos fiable, priorizamos el mapa.
if is_local_instance_call:
# Podríamos intentar resolver el tipo de la instancia local si tuviéramos esa info aquí,
# pero por ahora, simplemente la ignoramos para no contarla incorrectamente.
pass
elif potential_name_clean in known_blocks:
# Es un nombre de bloque conocido, ¿es FC o FB?
callee_type = block_data[potential_name_clean]["data"].get("block_type")
if callee_type in ["FC", "FB"]:
calls[potential_name_clean] += 1 # Llamada directa a FC o FB
return calls return calls
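The key change in the hunk above is attributing an instance-DB call like `"Motor1_DB"(...)` to its underlying FB via `instance_db_to_fb_map`. A simplified standalone sketch of that attribution (the call-matching regex here is deliberately minimal and omits the keyword/system-function filtering the real `find_calls_in_scl` performs; all block names are invented):

```python
import re
from collections import defaultdict

def count_calls(scl_code, instance_db_to_fb_map, known_blocks):
    """Credit calls made through an instance DB to the FB it instantiates."""
    calls = defaultdict(int)
    for m in re.finditer(r'"?([A-Za-z_]\w*)"?\s*\(', scl_code):
        name = m.group(1)
        fb = instance_db_to_fb_map.get(name)
        if fb:
            calls[fb] += 1       # call via instance DB -> counted for the FB
        elif name in known_blocks:
            calls[name] += 1     # direct FC/FB call
    return dict(calls)
```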
@ -364,7 +342,7 @@ def build_call_tree_recursive( # Añadido max_call_depth, xref_source_subdir
visited_in_path.add(current_node) visited_in_path.add(current_node)
if current_node in call_graph: if current_node in call_graph:
callees = sorted(call_graph[current_node].keys()) callees = sorted(call_graph[current_node])
for callee in callees: for callee in callees:
# Llamada recursiva # Llamada recursiva
build_call_tree_recursive( build_call_tree_recursive(
@ -387,7 +365,7 @@ def generate_call_tree_output(call_graph, block_data, base_xref_dir, max_call_de
a los archivos .md en xref_output/source. a los archivos .md en xref_output/source.
""" """
output_lines = ["# Árbol de Referencias Cruzadas de Llamadas\n"] output_lines = ["# Árbol de Referencias Cruzadas de Llamadas\n"]
output_lines.append(f"(Profundidad máxima: {MAX_CALL_DEPTH})\n") output_lines.append(f"(Profundidad máxima: {max_call_depth})\n") # <-- Usar el parámetro
root_nodes = sorted( # Encontrar OBs root_nodes = sorted( # Encontrar OBs
[ [
name name
@ -493,6 +471,84 @@ def generate_plc_tag_summary_output(plc_tag_users, max_users_list): # Añadido m
output_lines.append("") output_lines.append("")
return output_lines return output_lines
# --- NUEVA FUNCION: Parseador de XML XRef ---
def parse_xref_xml_for_calls(xml_file_path):
"""
Parsea un archivo _XRef.xml de TIA Portal y extrae las relaciones de llamada (Caller -> Callee).
Se basa en la estructura descrita en xref_info.md.
Devuelve un diccionario: {caller_name: [callee_name1, callee_name2, ...]}
"""
calls = defaultdict(list)
try:
tree = ET.parse(xml_file_path)
root = tree.getroot()
# Determinar el namespace (puede variar, esto es un intento común)
# Si el namespace es diferente, habrá que ajustarlo aquí.
ns_match = re.match(r'\{([^}]+)\}', root.tag)
ns = {'ns': ns_match.group(1)} if ns_match else {}
ns_prefix = f"{{{ns['ns']}}}" if ns else ""
# Encuentra el SourceObject (el llamador en este archivo)
source_object = root.find(f'.//{ns_prefix}SourceObject')
if source_object is None:
print(f"Advertencia: No se encontró SourceObject en {xml_file_path}", file=sys.stderr)
return {} # Devuelve diccionario vacío si no hay SourceObject
caller_name_elem = source_object.find(f'{ns_prefix}Name')
caller_name = caller_name_elem.text if caller_name_elem is not None and caller_name_elem.text else f"UnknownCaller_{os.path.basename(xml_file_path)}"
# Itera sobre los objetos referenciados (potenciales llamados)
references = source_object.find(f'{ns_prefix}References')
if references is not None:
for ref_object in references.findall(f'{ns_prefix}ReferenceObject'):
ref_name_elem = ref_object.find(f'{ns_prefix}Name')
ref_name = ref_name_elem.text if ref_name_elem is not None and ref_name_elem.text else None
ref_type_name_elem = ref_object.find(f'{ns_prefix}TypeName')
ref_type_name = ref_type_name_elem.text if ref_type_name_elem is not None and ref_type_name_elem.text else ""
if not ref_name: continue # Saltar si el objeto referenciado no tiene nombre
# Itera sobre las localizaciones de la referencia
locations = ref_object.find(f'{ns_prefix}Locations')
if locations is not None:
for location in locations.findall(f'{ns_prefix}Location'):
# <-- NUEVO: Comprobar primero el ReferenceType -->
ref_type_elem = location.find(f'{ns_prefix}ReferenceType')
ref_type_text = ref_type_elem.text if ref_type_elem is not None else ""
# Solo procesar si el SourceObject 'Uses' el ReferenceObject en esta Location
if ref_type_text == 'Uses':
access_elem = location.find(f'{ns_prefix}Access')
access_type = access_elem.text if access_elem is not None and access_elem.text else ""
callee_name = None
if access_type == 'Call':
# Llamada directa a FC
callee_name = ref_name
elif access_type == 'InstanceDB':
# Llamada a FB via DB de Instancia
# Extraer nombre/número del FB desde TypeName (ej: "Instance DB of BlockName [FB123]")
match = re.search(r'Instance DB of\s+(.*?)\s+\[([A-Za-z]+[0-9]+)\]', ref_type_name)
if match:
# Preferir nombre simbólico si existe, si no, el número (FBxxx)
callee_name = match.group(1) if match.group(1) else match.group(2)
elif 'Instance DB of' in ref_type_name: # Fallback si regex falla
callee_name = ref_type_name.split('Instance DB of ')[-1].strip()
if callee_name and callee_name not in calls[caller_name]:
calls[caller_name].append(callee_name)
except ET.ParseError as e:
print(f"Error parseando XML {xml_file_path}: {e}", file=sys.stderr)
except FileNotFoundError:
print(f"Error: Archivo XML no encontrado {xml_file_path}", file=sys.stderr)
except Exception as e:
print(f"Error inesperado procesando XML {xml_file_path}: {e}", file=sys.stderr)
traceback.print_exc(file=sys.stderr)
return dict(calls) # Convertir de defaultdict a dict
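Two techniques in `parse_xref_xml_for_calls` can be demonstrated in isolation: recovering the default namespace from `root.tag` for use in `find()`, and the `Instance DB of X [FBnnn]` regex. The sample XML below is invented and only mimics the shape described; the real `_XRef.xml` namespace will differ:

```python
import re
import xml.etree.ElementTree as ET

SAMPLE = """<Document xmlns="http://example.com/xref">
  <SourceObject>
    <Name>OB1</Name>
    <References>
      <ReferenceObject>
        <Name>FC_Init</Name>
        <Locations>
          <Location>
            <ReferenceType>Uses</ReferenceType>
            <Access>Call</Access>
          </Location>
        </Locations>
      </ReferenceObject>
    </References>
  </SourceObject>
</Document>"""

root = ET.fromstring(SAMPLE)
# Same trick as in the function above: pull the default namespace off root.tag
ns = re.match(r"\{([^}]+)\}", root.tag).group(1)
caller = root.find(f".//{{{ns}}}SourceObject/{{{ns}}}Name").text

# The 'Instance DB of X [FBnnn]' pattern used to resolve the callee FB:
type_name = "Instance DB of FB_Valve [FB201]"
m = re.search(r"Instance DB of\s+(.*?)\s+\[([A-Za-z]+[0-9]+)\]", type_name)
callee = m.group(1) or m.group(2)  # prefer the symbolic name over FB201
```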
# --- Función Principal (MODIFICADA para llamar a copy_and_prepare_source_files) --- # --- Función Principal (MODIFICADA para llamar a copy_and_prepare_source_files) ---
def generate_cross_references( def generate_cross_references(
@ -514,13 +570,21 @@ def generate_cross_references(
print(f"--- Iniciando Generación de Referencias Cruzadas y Fuentes MD (x4) ---") print(f"--- Iniciando Generación de Referencias Cruzadas y Fuentes MD (x4) ---")
print(f"Buscando archivos JSON procesados en: {project_root_dir}") print(f"Buscando archivos JSON procesados en: {project_root_dir}")
print(f"Directorio de salida XRef: {output_dir}") print(f"Directorio de salida XRef: {output_dir}")
print(f"Directorio fuente SCL/MD: {scl_output_dirname}") print(f"Directorio fuente SCL/MD (para análisis DB/Tag y copia): {scl_output_dirname}")
print(f"Subdirectorio fuentes MD para XRef: {xref_source_subdir}") print(f"Subdirectorio fuentes MD para XRef: {xref_source_subdir}")
output_dir_abs = os.path.abspath(output_dir) output_dir_abs = os.path.abspath(output_dir)
# <-- NUEVO: Crear directorio y preparar archivos fuente ANTES de generar XRefs --> # <-- NUEVO: Crear directorio y preparar archivos fuente ANTES de generar XRefs -->
# Pasar los nombres de directorios leídos de la config # Pasar los nombres de directorios leídos de la config
copy_and_prepare_source_files(project_root_dir, output_dir_abs, scl_output_dirname, xref_source_subdir) copy_and_prepare_source_files(project_root_dir, output_dir_abs, scl_output_dirname, xref_source_subdir)
# <-- NUEVO: Definir directorio donde buscar los XML de XRef -->
# <-- MODIFICADO: Buscar dentro del directorio del PLC actual (project_root_dir) -->
# xref_xml_dir = os.path.join(os.path.dirname(project_root_dir), "cross_ref", "PLC", "ProgramBlocks_CR") # Ruta anterior incorrecta
xref_xml_dir = os.path.join(project_root_dir, "ProgramBlocks_CR") # Ruta correcta: <working_dir>/<PLC_Name>/ProgramBlocks_CR/
print(f"Buscando archivos XML XRef en: {xref_xml_dir}")
# <-- FIN NUEVO --> # <-- FIN NUEVO -->
json_files = glob.glob( json_files = glob.glob(
os.path.join(project_root_dir, "**", "*_processed.json"), recursive=True os.path.join(project_root_dir, "**", "*_processed.json"), recursive=True
@ -530,7 +594,7 @@ def generate_cross_references(
return False return False
print(f"Archivos JSON encontrados: {len(json_files)}") print(f"Archivos JSON encontrados: {len(json_files)}")
# 1. Cargar datos (sin cambios) # 1. Cargar datos de JSON (sigue siendo útil para metadatos, enlaces, y análisis DB/Tag)
block_data = {} block_data = {}
all_db_names = set() all_db_names = set()
plc_tag_names = set() plc_tag_names = set()
@ -560,53 +624,82 @@ def generate_cross_references(
if not block_data: if not block_data:
print("Error: No se pudieron cargar datos.", file=sys.stderr) print("Error: No se pudieron cargar datos.", file=sys.stderr)
return False return False
print(
f"Datos cargados para {len(block_data)} bloques."
)
# <-- NUEVO: Crear mapa de DB de Instancia a FB -->
instance_db_to_fb_map = {}
for block_name, block_entry in block_data.items():
b_data = block_entry.get("data", {})
if b_data.get("block_type") == "InstanceDB":
instance_of_name = b_data.get("InstanceOfName") # Clave añadida en x1
if instance_of_name and instance_of_name in block_data: # Verificar que el FB existe
instance_db_to_fb_map[block_name] = instance_of_name
elif instance_of_name:
print(f"Advertencia: InstanceDB '{block_name}' instancia a '{instance_of_name}', pero ese FB no se encontró en los datos cargados.", file=sys.stderr)
print(f"Mapa InstanciaDB -> FB creado con {len(instance_db_to_fb_map)} entradas.")
print( print(
f"Datos cargados para {len(block_data)} bloques ({len(plc_tag_names)} PLC Tags globales)." f"Datos cargados para {len(block_data)} bloques ({len(plc_tag_names)} PLC Tags globales)."
) )
-    # 2. Analizar datos (sin cambios)
-    call_graph = defaultdict(lambda: defaultdict(int))
+    # 2. Construir Grafo de Llamadas desde XML XRef
+    print("Construyendo grafo de llamadas desde archivos XML XRef...")
+    call_graph = defaultdict(list)  # Usamos lista, no necesitamos contar llamadas múltiples aquí
+    xref_xml_files = glob.glob(os.path.join(xref_xml_dir, "*_XRef.xml"))
+    if not xref_xml_files:
+        print(f"ADVERTENCIA: No se encontraron archivos '*_XRef.xml' en {xref_xml_dir}. El árbol de llamadas estará vacío.", file=sys.stderr)
+    else:
+        print(f"Archivos XML XRef encontrados: {len(xref_xml_files)}")
+        for xml_file in xref_xml_files:
+            file_calls = parse_xref_xml_for_calls(xml_file)
+            for caller, callees in file_calls.items():
+                if caller not in call_graph:
+                    call_graph[caller] = []
+                for callee in callees:
+                    if callee not in call_graph[caller]:  # Evitar duplicados si un bloque llama varias veces al mismo
+                        call_graph[caller].append(callee)
+
+    # 3. Analizar uso de DBs y PLC Tags desde SCL (esta parte no cambia)
     db_users = defaultdict(set)
     plc_tag_users = defaultdict(set)
-    print("Analizando llamadas y uso de DBs/PLC Tags...")
     for block_name, block_entry in block_data.items():
         data = block_entry["data"]
         block_type = data.get("block_type")
         if block_type not in ["OB", "FC", "FB"]:
             continue
         caller_name = block_name
-        for network in data.get("networks", []):
-            combined_scl = ""
-            network_has_code = False
-            for instruction in network.get("logic", []):
-                if not instruction.get("grouped", False):
-                    scl_code = instruction.get("scl", "")
-                    edge_update_code = instruction.get("_edge_mem_update_scl", "")
-                    if scl_code or edge_update_code:
-                        network_has_code = True
-                        combined_scl += (
-                            (scl_code or "") + "\n" + (edge_update_code or "") + "\n"
-                        )
-            if not network_has_code:
-                continue
-            calls_found = find_calls_in_scl(combined_scl, block_data)
-            for callee_name, count in calls_found.items():
-                if callee_name in block_data and block_data[callee_name]["data"].get(
-                    "block_type"
-                ) in ["FC", "FB"]:
-                    call_graph[caller_name][callee_name] += count
-            db_usage_found = find_db_tag_usage(combined_scl)
+        # Leer el archivo SCL para análisis de DB/Tags
+        scl_filename = format_variable_name(caller_name) + ".scl"
+        # Construir la ruta al archivo SCL dentro del directorio scl_output
+        scl_filepath = os.path.join(project_root_dir, scl_output_dirname, scl_filename)
+        full_scl_content = ""
+        if os.path.exists(scl_filepath):
+            try:
+                with open(scl_filepath, "r", encoding="utf-8") as f_scl:
+                    full_scl_content = f_scl.read()
+            except Exception as read_err:
+                print(f"  Advertencia: No se pudo leer el archivo SCL '{scl_filepath}' para análisis: {read_err}", file=sys.stderr)
+        else:
+            print(f"  Advertencia: No se encontró el archivo SCL '{scl_filepath}' para análisis. El bloque podría no tener código ejecutable o hubo un error previo.", file=sys.stderr)
+
+        if full_scl_content:
+            # Ya no usamos find_calls_in_scl para el grafo principal
+            # Analizar uso de DBs
+            db_usage_found = find_db_tag_usage(full_scl_content)
             for db_tag, access_counts in db_usage_found.items():
-                db_name_part = db_tag.split(".")[0]
+                db_name_part = db_tag.split(".")[0].strip('"')  # Limpiar comillas
                 if db_name_part in all_db_names or (
                     db_name_part.startswith("DB") and db_name_part[2:].isdigit()
                 ):
                     db_users[db_name_part].add(caller_name)
-            plc_usage_found = find_plc_tag_usage(combined_scl, plc_tag_names)
+            # Analizar uso de PLC Tags
+            plc_usage_found = find_plc_tag_usage(full_scl_content, plc_tag_names)
             for plc_tag, access_counts in plc_usage_found.items():
                 plc_tag_users[plc_tag].add(caller_name)

-    # 3. Generar Archivos de Salida XRef (MODIFICADO para usar la nueva función de árbol)
+    # 4. Generar Archivos de Salida XRef
     os.makedirs(output_dir_abs, exist_ok=True)
     call_xref_path = os.path.join(output_dir_abs, call_xref_filename)  # Usar parámetro
     db_usage_xref_path = os.path.join(output_dir_abs, db_usage_xref_filename)  # Usar parámetro
@@ -615,8 +708,8 @@ def generate_cross_references(
     print(f"Generando ÁRBOL XRef de llamadas en: {call_xref_path}")
     try:
         # <-- MODIFICADO: Llamar a la nueva función sin project_root_dir -->
-        call_tree_lines = generate_call_tree_output(  # Pasar parámetros
-            call_graph, block_data, output_dir_abs
+        call_tree_lines = generate_call_tree_output(  # Pasar parámetros (el grafo ya está construido desde XML)
+            call_graph, block_data, output_dir_abs, max_call_depth, xref_source_subdir  # <-- Pasar max_call_depth
         )
         with open(call_xref_path, "w", encoding="utf-8") as f:
             [f.write(line + "\n") for line in call_tree_lines]
@@ -675,8 +768,17 @@ if __name__ == "__main__":
     cfg_call_xref_filename = group_config.get("call_xref_filename", "xref_calls_tree.md")
     cfg_db_usage_xref_filename = group_config.get("db_usage_xref_filename", "xref_db_usage_summary.md")
     cfg_plc_tag_xref_filename = group_config.get("plc_tag_xref_filename", "xref_plc_tags_summary.md")
-    cfg_max_call_depth = group_config.get("max_call_depth", 5)
-    cfg_max_users_list = group_config.get("max_users_list", 20)
+    # <-- MODIFICADO: Convertir a int y manejar posible error -->
+    try:
+        cfg_max_call_depth = int(group_config.get("max_call_depth", 5))
+    except (ValueError, TypeError):
+        print("Advertencia: Valor inválido para 'max_call_depth' en la configuración. Usando valor por defecto 5.", file=sys.stderr)
+        cfg_max_call_depth = 5
+    try:
+        cfg_max_users_list = int(group_config.get("max_users_list", 20))
+    except (ValueError, TypeError):
+        print("Advertencia: Valor inválido para 'max_users_list' en la configuración. Usando valor por defecto 20.", file=sys.stderr)
+        cfg_max_users_list = 20

     # Calcular rutas
     if not working_directory:
@@ -686,7 +788,10 @@ if __name__ == "__main__":
     # Calcular rutas basadas en la configuración
     plc_subdir_name = "PLC"  # Asumir nombre estándar
     project_root_dir = os.path.join(working_directory, plc_subdir_name)
-    xref_output_dir = os.path.join(project_root_dir, cfg_xref_output_dirname)  # Usar nombre de dir leído
+    # El directorio de salida XRef ahora estará probablemente al mismo nivel que 'PLC'
+    # o dentro de él, según la configuración. Usemos la configuración directamente.
+    # xref_output_dir = os.path.join(working_directory, cfg_xref_output_dirname)  # <-- Opción 1: Al nivel de working_dir
+    xref_output_dir = os.path.join(project_root_dir, cfg_xref_output_dirname)  # <-- Opción 2: Dentro de PLC (como estaba antes) - Mantenemos esta por consistencia con el código original
     if not os.path.isdir(project_root_dir):
         print(f"Error: Directorio del proyecto '{project_root_dir}' no encontrado.", file=sys.stderr)


@@ -0,0 +1,193 @@
## Technical Documentation: Parsing TIA Portal `_XRef.xml` Files for Call Tree Generation
**Version:** 1.0
**Date:** 2025-05-05
### 1. Introduction
This document describes the structure and interpretation of the XML files (`*_XRef.xml`) generated by the TIA Portal Openness `export_cross_references` function (available via libraries like `siemens_tia_scripting`). The primary goal is to enable software developers to programmatically parse these files to extract block call relationships and build a comprehensive call tree for a PLC program.
The `_XRef.xml` file contains detailed information about all objects referenced _by_ a specific source object (e.g., an OB, FB, or FC). By processing these files for all relevant blocks, a complete picture of the program's call structure can be assembled.
### 2. File Format Overview
The `_XRef.xml` file is a standard XML document. Its high-level structure typically looks like this:
```xml
<?xml version="1.0" encoding="utf-8"?>
<CrossReferences xmlns:i="..." xmlns="...">
  <Sources>
    <SourceObject>
      <Name>...</Name>
      <Address>...</Address>
      <Device>...</Device>
      <Path>...</Path>
      <TypeName>...</TypeName>
      <UnderlyingObject>...</UnderlyingObject>
      <Children />
      <References>
        <ReferenceObject>
          <Name>...</Name>
          <Address>...</Address>
          <Device>...</Device>
          <Path>...</Path>
          <TypeName>...</TypeName>
          <UnderlyingObject>...</UnderlyingObject>
          <Locations>
            <Location>
              <Access>...</Access>
              <Address>...</Address>
              <Name>...</Name>
              <ReferenceLocation>...</ReferenceLocation>
              <ReferenceType>Uses</ReferenceType>
            </Location>
          </Locations>
        </ReferenceObject>
      </References>
    </SourceObject>
  </Sources>
</CrossReferences>
```
### 3. Key XML Elements for Call Tree Construction
To build a call tree, you need to identify the _caller_ and the _callee_ for each block call. The following XML elements are essential:
1. **`<SourceObject>`:** Represents the block performing the calls (the **caller**).
- **`<Name>`:** The symbolic name of the caller block (e.g., `_CYCL_EXC`).
- **`<Address>`:** The absolute address (e.g., `%OB1`).
- **`<TypeName>`:** The type of the caller block (e.g., `LAD-Organization block`).
2. **`<ReferenceObject>`:** Represents an object being referenced by the `SourceObject`. This _could_ be the **callee**.
- **`<Name>`:** The symbolic name of the referenced object (e.g., `BlenderCtrl__Main`, `Co2_Counters_DB`).
- **`<Address>`:** The absolute address (e.g., `%FC2000`, `%DB1021`).
- **`<TypeName>`:** The type of the referenced object (e.g., `LAD-Function`, `Instance DB of Co2_Counters [FB1020]`). This is vital for identifying FCs and FBs (via their instance DBs).
3. **`<Location>`:** Specifies exactly how and where the `ReferenceObject` is used within the `SourceObject`.
- **`<Access>`:** **This is the most critical element for call trees.** Look for the value `Call`. This indicates a direct Function Call (FC). An access type of `InstanceDB` indicates the usage of an instance DB, which implies a Function Block (FB) call is occurring.
- **`<ReferenceLocation>`:** Provides human-readable context about where the reference occurs within the caller's code (e.g., `@_CYCL_EXC ▶ NW3 (Blender CTRL)`). Useful for debugging or visualization.
### 4. Data Extraction Strategy for Call Tree
A program parsing these files should follow these steps for each `_XRef.xml` file:
1. **Parse XML:** Load the `_XRef.xml` file using a suitable XML parsing library (e.g., Python's `xml.etree.ElementTree` or `lxml`).
2. **Identify Caller:** Navigate to the `<SourceObject>` element and extract its `<Name>`. This is the caller block for all references within this file.
3. **Iterate References:** Loop through each `<ReferenceObject>` within the `<References>` section of the `<SourceObject>`.
4. **Iterate Locations:** For each `<ReferenceObject>`, loop through its `<Location>` elements.
5. **Filter for Calls:** Check the text content of the `<Access>` tag within each `<Location>`.
- **If `Access` is `Call`:**
- The `<Name>` of the current `<ReferenceObject>` is the **callee** (an FC).
- Record the relationship: `Caller Name` -> `Callee Name (FC)`.
- **If `Access` is `InstanceDB`:**
- This signifies an FB call is happening using this instance DB.
- The `<Name>` of the current `<ReferenceObject>` is the Instance DB name (e.g., `Co2_Counters_DB`).
- To find the actual FB being called, examine the `<TypeName>` of this `ReferenceObject`. It usually contains the FB name/number (e.g., `Instance DB of Co2_Counters [FB1020]`). Extract the FB name (`Co2_Counters`) or number (`FB1020`). This is the **callee**.
- Record the relationship: `Caller Name` -> `Callee Name (FB)`.
6. **Store Relationships:** Store the identified caller-callee pairs in a suitable data structure.
### 5. Building the Call Tree Data Structure
After parsing one or more `_XRef.xml` files, the extracted relationships can be stored. Common approaches include:
- **Dictionary (Adjacency List):** A dictionary where keys are caller names and values are lists of callee names.
```python
call_tree = {
    '_CYCL_EXC': ['BlenderCtrl__Main', 'MessageScroll', 'ITC_MainRoutine', 'Co2_Counters',
                  'ProcedureProdBrixRecovery', 'Key Read & Write', 'GNS_PLCdia_MainRoutine'],
    'BlenderCtrl__Main': ['SomeOtherBlock', ...],
    # ... other callers
}
```
- **Graph Representation:** Using libraries like `networkx` in Python to create a directed graph where blocks are nodes and calls are edges. This allows for more complex analysis (e.g., finding paths, cycles).
- **Custom Objects:** Define `Block` and `Call` classes for a more object-oriented representation.
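Whichever representation is chosen, the plain adjacency dictionary already supports the queries a call-tree report needs. A minimal stdlib-only sketch (the block names below are illustrative, not taken from a real project) that computes reachability and renders a depth-limited, cycle-safe tree:

```python
call_tree = {
    '_CYCL_EXC': ['BlenderCtrl__Main', 'Co2_Counters'],
    'BlenderCtrl__Main': ['Co2_Counters'],
}

def reachable_from(call_tree, start):
    """All blocks reachable (directly or indirectly) from 'start'."""
    seen, stack = set(), [start]
    while stack:
        for callee in call_tree.get(stack.pop(), []):
            if callee not in seen:
                seen.add(callee)
                stack.append(callee)
    return seen

def render_tree(call_tree, root, max_depth=5, depth=0, path=()):
    """Indented call-tree lines; stops at max_depth and marks recursion."""
    lines = ['  ' * depth + root + (' (recursion!)' if root in path else '')]
    if root in path or depth >= max_depth:
        return lines
    for callee in call_tree.get(root, []):
        lines.extend(render_tree(call_tree, callee, max_depth, depth + 1, path + (root,)))
    return lines

print(sorted(reachable_from(call_tree, '_CYCL_EXC')))  # ['BlenderCtrl__Main', 'Co2_Counters']
```

A depth limit (like the `max_call_depth` setting a report generator might expose) keeps deeply nested or recursive programs readable.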
### 6. Handling Multiple Files
A single `_XRef.xml` file only details the references _from_ one `SourceObject`. To build a complete call tree for the entire program or PLC:
1. **Export References:** Use the Openness script to call `export_cross_references` for _all_ relevant OBs, FBs, and FCs in the project.
2. **Process All Files:** Run the parsing logic described above on each generated `_XRef.xml` file.
3. **Aggregate Results:** Combine the caller-callee relationships extracted from all files into a single data structure (e.g., merge dictionaries or add nodes/edges to the graph).
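Step 3 can be reduced to a small merge helper. A sketch — the `xref_dir` variable and the per-file parser (such as the `parse_xref_for_calls` example in Section 7) are assumptions about how the files are gathered:

```python
import glob
import os

def merge_call_maps(maps):
    """Merge per-file {caller: [callees]} dicts, de-duplicating callees."""
    merged = {}
    for m in maps:
        for caller, callees in m.items():
            bucket = merged.setdefault(caller, [])
            for callee in callees:
                if callee not in bucket:
                    bucket.append(callee)
    return merged

# Driver (hypothetical names):
# all_calls = merge_call_maps(
#     parse_xref_for_calls(p)
#     for p in glob.glob(os.path.join(xref_dir, '*_XRef.xml'))
# )
```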
### 7. Example (Conceptual Python using `xml.etree.ElementTree`)
```python
import xml.etree.ElementTree as ET
import re  # For extracting FB name from TypeName

def parse_xref_for_calls(xml_file_path):
    """Parses a _XRef.xml file and extracts call relationships."""
    calls = {}  # {caller: [callee1, callee2, ...]}
    try:
        tree = ET.parse(xml_file_path)
        root = tree.getroot()
        # Namespace handling might be needed depending on the xmlns
        ns = {'ns0': 'TestNamespace1'}  # Adjust namespace if different in your file

        for source_object in root.findall('.//ns0:SourceObject', ns):
            caller_name = source_object.findtext('ns0:Name', default='UnknownCaller', namespaces=ns)
            if caller_name not in calls:
                calls[caller_name] = []

            for ref_object in source_object.findall('.//ns0:ReferenceObject', ns):
                ref_name = ref_object.findtext('ns0:Name', default='UnknownRef', namespaces=ns)
                ref_type_name = ref_object.findtext('ns0:TypeName', default='', namespaces=ns)

                for location in ref_object.findall('.//ns0:Location', ns):
                    access_type = location.findtext('ns0:Access', default='', namespaces=ns)

                    if access_type == 'Call':
                        # Direct FC call
                        if ref_name not in calls[caller_name]:
                            calls[caller_name].append(ref_name)
                    elif access_type == 'InstanceDB':
                        # FB call via Instance DB
                        # Extract FB name/number from TypeName (e.g., "Instance DB of BlockName [FB123]")
                        match = re.search(r'Instance DB of (.*?) \[([A-Za-z]+[0-9]+)\]', ref_type_name)
                        callee_fb_name = 'UnknownFB'
                        if match:
                            # Prefer symbolic name if available, else use number
                            callee_fb_name = match.group(1) if match.group(1) else match.group(2)
                        elif 'Instance DB of' in ref_type_name:  # Fallback if regex fails
                            callee_fb_name = ref_type_name.split('Instance DB of ')[-1].strip()

                        if callee_fb_name not in calls[caller_name]:
                            calls[caller_name].append(callee_fb_name)

    except ET.ParseError as e:
        print(f"Error parsing XML file {xml_file_path}: {e}")
    except FileNotFoundError:
        print(f"Error: File not found {xml_file_path}")

    # Clean up entries with no calls
    calls = {k: v for k, v in calls.items() if v}
    return calls

# --- Aggregation Example ---
# all_calls = {}
# for xref_file in list_of_all_xref_files:
#     file_calls = parse_xref_for_calls(xref_file)
#     for caller, callees in file_calls.items():
#         if caller not in all_calls:
#             all_calls[caller] = []
#         for callee in callees:
#             if callee not in all_calls[caller]:
#                 all_calls[caller].append(callee)
# print(all_calls)
```
_Note: Namespace handling (`ns=...`) in ElementTree might need adjustment based on the exact default namespace declared in your XML files._
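One way to avoid hard-coding the namespace at all is to strip it once after parsing and then query with plain tag names. A sketch, under the assumption that only element tags (not attributes) need the namespace removed:

```python
import xml.etree.ElementTree as ET

def strip_namespaces(root):
    """Rewrite '{uri}Tag' tags in place so plain names work in find()/findtext()."""
    for elem in root.iter():
        if isinstance(elem.tag, str) and '}' in elem.tag:
            elem.tag = elem.tag.split('}', 1)[1]
    return root

sample = '''<CrossReferences xmlns="TestNamespace1">
  <Sources><SourceObject><Name>_CYCL_EXC</Name></SourceObject></Sources>
</CrossReferences>'''

root = strip_namespaces(ET.fromstring(sample))
print(root.findtext('.//SourceObject/Name'))  # _CYCL_EXC
```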
### 8. Considerations
- **Function Block Calls:** Remember that FB calls are identified indirectly via the `InstanceDB` access type and parsing the `<TypeName>` of the `ReferenceObject`.
- **System Blocks (SFC/SFB):** Calls to system functions/blocks should appear similarly to FC/FB calls and can be included in the tree. Their `<TypeName>` might indicate they are system blocks.
- **TIA Portal Versions:** While the basic structure is consistent, minor variations in tags or namespaces might exist between different TIA Portal versions. Always test with exports from your specific version.
- **Data References:** This documentation focuses on the call tree. The XML also contains `Read`, `Write`, `RW` access types, which can be parsed similarly to build a full cross-reference map for tags and data blocks.
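As a sketch of that fuller cross-reference map — assuming `Read`, `Write`, and `RW` appear in `<Access>` exactly as the call-related values do, and reusing the namespace assumption from Section 7 — the same traversal can aggregate which blocks touch each tag or DB:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def parse_xref_for_data_access(xml_file_path, ns_uri='TestNamespace1'):
    """Returns {reference_name: {access_type: set(of caller names)}}."""
    ns = {'ns0': ns_uri}
    usage = defaultdict(lambda: defaultdict(set))
    root = ET.parse(xml_file_path).getroot()
    for source in root.findall('.//ns0:SourceObject', ns):
        caller = source.findtext('ns0:Name', default='UnknownCaller', namespaces=ns)
        for ref in source.findall('.//ns0:ReferenceObject', ns):
            ref_name = ref.findtext('ns0:Name', default='UnknownRef', namespaces=ns)
            for loc in ref.findall('.//ns0:Location', ns):
                access = loc.findtext('ns0:Access', default='', namespaces=ns)
                if access in ('Read', 'Write', 'RW'):
                    usage[ref_name][access].add(caller)
    return usage
```

Merged across all exported files, this yields a "used by" summary per tag/DB alongside the call tree.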
### 9. Conclusion
The `_XRef.xml` files provide a detailed, machine-readable description of block references within a TIA Portal project. By parsing the XML structure, focusing on the `<SourceObject>`, `<ReferenceObject>`, and specifically the `<Access>` tag within `<Location>`, developers can reliably extract block call information and construct program call trees for analysis, documentation, or visualization purposes. Remember to aggregate data from multiple files for a complete program overview.

File diff suppressed because it is too large