Enhance IO Extraction and Markdown Generation in x3.py

- Updated directory structure in readme.md to include project-specific files.
- Refactored find_io_recursively to include module context for better IO address mapping.
- Modified generate_markdown_tree to generate PLC-specific hardware trees and improved markdown formatting.
- Added sanitization function for filenames to ensure valid output paths.
- Improved logging to provide detailed execution flow and output paths.
- Updated process_aml_file to extract and save global outputs, returning project data for further processing.
- Enhanced overall error handling and output messages for clarity.
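The filename sanitization mentioned above does not appear in the visible part of this diff; as a rough sketch of what such a helper usually looks like (the function name, replacement policy, and fallback are assumptions, not the commit's actual code):

```python
import re

def sanitize_filename(name: str, replacement: str = "_") -> str:
    """Replace characters that are invalid in Windows/POSIX filenames."""
    # Strip reserved characters and control codes, then collapse whitespace.
    sanitized = re.sub(r'[<>:"/\\|?*\x00-\x1f]', replacement, name)
    sanitized = re.sub(r"\s+", " ", sanitized).strip()
    # Never return an empty name; callers use the result to build output paths.
    return sanitized or "unnamed"
```

For example, `sanitize_filename('PLC: "Main/Rack?"')` yields the path-safe name `PLC_ _Main_Rack__`.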
Miguel 2025-05-12 14:06:21 +02:00
parent 88ff4a25a2
commit bf75f6d4d0
6 changed files with 478 additions and 393 deletions

.gitignore vendored

@@ -8,6 +8,7 @@ __pycache__/
# Distribution / packaging
.Python
temp/
build/
develop-eggs/
dist/


@@ -1,32 +1,32 @@
--- Execution Log: x2.py ---
Group: ObtainIOFromProjectTia
Working Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Start: 2025-05-12 13:33:41
End: 2025-05-12 13:36:50
Duration: 0:03:09.036254
Status: SUCCESS (Exit Code: 0)
--- STANDARD OUTPUT (STDOUT) ---
--- TIA Portal Project CAx Exporter and Analyzer ---
Selected Project: C:/Trabajo/SIDEL/06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)/InLavoro/PLC/_NEW/SAE196_c0.2/SAE196_c0.2.ap18
Using Output Directory (Working Directory): C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Will export CAx data to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Will generate summary to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Summary.md
Export log file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.log
Connecting to TIA Portal V18.0...
2025-05-12 13:34:08,368 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - Start TIA Portal, please acknowledge the security dialog.
2025-05-12 13:34:08,393 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Global OpenPortal - With user interface
Connected.
Opening project: SAE196_c0.2.ap18...
2025-05-12 13:34:56,050 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal OpenProject - Open project... C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\InLavoro\PLC\_NEW\SAE196_c0.2\SAE196_c0.2.ap18
Project opened.
Exporting CAx data for the project to C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml...
CAx data exported successfully.
Closing TIA Portal...
2025-05-12 13:36:46,602 [1] INFO Siemens.TiaPortal.OpennessApi18.Implementations.Portal ClosePortal - Close TIA Portal
TIA Portal closed.
Parsing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Markdown summary written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Summary.md


@@ -1,46 +1,48 @@
--- Execution Log: x3.py ---
Group: ObtainIOFromProjectTia
Working Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Start: 2025-05-12 13:56:30
End: 2025-05-12 13:56:34
Duration: 0:00:03.887946
Status: SUCCESS (Exit Code: 0)
--- STANDARD OUTPUT (STDOUT) ---
--- AML (CAx Export) to Hierarchical JSON and Obsidian MD Converter (v30 - Enhanced Module Info in Hardware Tree) ---
Using Working Directory for Output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Input AML: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Output Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
Output JSON: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
Output Main Tree MD: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_Hardware_Tree.md
Output IO Debug Tree MD: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
Processing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
Pass 1: Found 203 InternalElement(s). Populating device dictionary...
Pass 2: Identifying PLCs and Networks (Refined v2)...
Identified Network: PROFIBUS_1 (d645659a-3704-4cd6-b2c8-6165ceeed6ee) Type: Profibus
Identified Network: ETHERNET_1 (f0b1c852-7dc9-4748-888e-34c60b519a75) Type: Ethernet/Profinet
Identified PLC: PLC (a48e038f-0bcc-4b48-8373-033da316c62b) - Type: CPU 1516F-3 PN/DP OrderNo: 6ES7 516-3FP03-0AB0
Pass 3: Processing InternalLinks (Robust Network Mapping & IO)...
Found 116 InternalLink(s).
Mapping Device/Node 'E1' (NodeID:439930b8-1bbc-4cb2-a93b-2eed931f4b12, Addr:10.1.33.11) to Network 'ETHERNET_1'
--> Associating Network 'ETHERNET_1' with PLC 'PLC' (via Node 'E1' Addr: 10.1.33.11)
Mapping Device/Node 'P1' (NodeID:904bb0f7-df2d-4c1d-ab65-f45480449db1, Addr:1) to Network 'PROFIBUS_1'
--> Associating Network 'PROFIBUS_1' with PLC 'PLC' (via Node 'P1' Addr: 1)
Mapping Device/Node 'PB1' (NodeID:2784bae8-9807-475f-89bd-bcf44282f5f4, Addr:12) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:e9c5f60a-1da2-4c9b-979e-7d03a5b58a44, Addr:20) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:dd7201c2-e127-4a9d-b6ae-7a74a4ffe418, Addr:21) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:d8825919-3a6c-4f95-aef0-62c782cfdb51, Addr:22) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:27d0e31d-46dc-4fdd-ab82-cfb91899a27c, Addr:10) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:d91d5905-aa1a-485e-b4eb-8333cc2133c2, Addr:8) to Network 'PROFIBUS_1'
Mapping Device/Node 'PB1' (NodeID:0c5dfe06-786d-4ab6-b57c-8dfede56c2aa, Addr:40) to Network 'PROFIBUS_1'
Data extraction and structuring complete.
Generating JSON output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
JSON data written successfully.
IO upward debug tree written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
Found 1 PLC(s). Generating individual hardware trees...
Generating Hardware Tree for PLC 'PLC' (ID: a48e038f-0bcc-4b48-8373-033da316c62b) at: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
Markdown summary written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
Script finished.
--- ERRORS (STDERR) ---

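The "Pass 1: Found 203 InternalElement(s)" line in this log comes from walking the InternalElement nodes of the CAx export. As a rough, standalone illustration using the stdlib parser and the CAEX namespace commonly used by AutomationML exports (x3.py itself uses lxml, so its actual code differs):

```python
import xml.etree.ElementTree as ET

# Assumption: the export uses the standard CAEX namespace.
CAEX_NS = {"caex": "http://www.dke.de/CAEX"}

def count_internal_elements(xml_text: str) -> int:
    """Count every InternalElement (device) in a CAx/AML document, nested ones included."""
    root = ET.fromstring(xml_text)
    return len(root.findall(".//caex:InternalElement", CAEX_NS))

sample = """<CAEXFile xmlns="http://www.dke.de/CAEX">
  <InstanceHierarchy Name="Project">
    <InternalElement Name="PLC">
      <InternalElement Name="Rack"/>
    </InternalElement>
  </InstanceHierarchy>
</CAEXFile>"""
print(count_internal_elements(sample))  # 2
```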

@@ -4,6 +4,7 @@
### Directory structure
<working_directory>/
├── <Project_Name>_CAx_Export.aml
├── <PLC1_Name>/
│ ├── ProgramBlocks_XML/
│ │ └── ... (archivos XML de bloques)
@@ -30,7 +31,7 @@
│ └── xref_db_usage_summary.md
│ └── xref_plc_tags_summary.md
│ └── full_project_representation.md
│ └── <Project_Name>_CAx_Export_Hardware_Tree.md
├── <PLC2_Name>/
│ ├── ProgramBlocks_XML/


@@ -396,387 +396,423 @@ def extract_aml_data(root):
# --- Helper Function for Recursive IO Search ---
def find_io_recursively(device_id, project_data, module_context):
    """
    Recursively finds all IO addresses under a given device ID, using module_context
    for details of the main hardware module.
    module_context = {"id": ..., "name": ..., "order_number": ..., "type_name": ...}
    """
    io_list = []
    device_info = project_data.get("devices", {}).get(device_id)
    if not device_info:
        return io_list
    if device_info.get("io_addresses"):
        # Slot position comes from the current device_info (which holds the IO);
        # it is often found in attributes.PositionNumber for sub-elements.
        slot_pos = device_info.get("attributes", {}).get(
            "PositionNumber", device_info.get("position", "N/A")
        )
        for addr in device_info["io_addresses"]:
            io_list.append(
                {
                    "module_id": module_context["id"],
                    "module_name": module_context["name"],
                    "module_pos": slot_pos,  # Slot of the IO sub-element
                    "module_order_number": module_context["order_number"],
                    "module_type_name": module_context["type_name"],
                    **addr,
                }
            )
    children_ids = device_info.get("children_ids", [])
    for child_id in children_ids:
        if child_id != device_id:  # Basic loop prevention
            # module_context stays the same while recursing within a main module's structure.
            io_list.extend(find_io_recursively(child_id, project_data, module_context))
    return io_list
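To see what the new `module_context` threading produces, the helper can be exercised standalone. It is restated below in runnable form together with an invented two-level device structure (the IDs, names, and addresses are illustrative, not from the project):

```python
def find_io_recursively(device_id, project_data, module_context):
    """Collect IO addresses under device_id, tagging each with the owning module's details."""
    io_list = []
    device_info = project_data.get("devices", {}).get(device_id)
    if not device_info:
        return io_list
    if device_info.get("io_addresses"):
        # The slot comes from the sub-element that actually holds the IO.
        slot_pos = device_info.get("attributes", {}).get(
            "PositionNumber", device_info.get("position", "N/A")
        )
        for addr in device_info["io_addresses"]:
            io_list.append({
                "module_id": module_context["id"],
                "module_name": module_context["name"],
                "module_pos": slot_pos,
                "module_order_number": module_context["order_number"],
                "module_type_name": module_context["type_name"],
                **addr,
            })
    for child_id in device_info.get("children_ids", []):
        if child_id != device_id:  # basic loop prevention
            io_list.extend(find_io_recursively(child_id, project_data, module_context))
    return io_list

# Invented structure: a module whose IO lives on a sub-element one level down.
project_data = {
    "devices": {
        "mod-1": {
            "name": "DI 32x24VDC",
            "order_number": "N/A",
            "type_name": "DI 32",
            "children_ids": ["sub-1"],
        },
        "sub-1": {
            "name": "DI 32x24VDC_IO",
            "attributes": {"PositionNumber": "2"},
            "io_addresses": [{"type": "Input", "start": "0", "length": "32"}],
            "children_ids": [],
        },
    }
}
ctx = {"id": "mod-1", "name": "DI 32x24VDC", "order_number": "N/A", "type_name": "DI 32"}
ios = find_io_recursively("mod-1", project_data, ctx)
print(ios[0]["module_name"], ios[0]["module_pos"])  # DI 32x24VDC 2
```

Note how the IO entry carries the parent module's name and type while keeping the sub-element's slot, which is exactly what the per-module grouping in `generate_markdown_tree` relies on.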
# --- generate_markdown_tree function (v26 - Final Cleaned Version) ---
def generate_markdown_tree(project_data, md_file_path):
"""(v26) Generates final hierarchical Markdown with aesthetic improvements."""
markdown_lines = ["# Project Hardware & IO Summary (Tree View v26)", ""]
def generate_markdown_tree(project_data, md_file_path, target_plc_id):
"""(v29 Mod) Generates hierarchical Markdown for a specific PLC."""
if not project_data or not project_data.get("plcs"):
markdown_lines.append("*No PLC identified in the project data.*")
plc_info = project_data.get("plcs", {}).get(target_plc_id)
plc_name_for_title = "Unknown PLC"
if plc_info:
plc_name_for_title = plc_info.get('name', target_plc_id)
markdown_lines = [f"# Hardware & IO Summary for PLC: {plc_name_for_title}", ""]
if not plc_info:
markdown_lines.append(f"*Details for PLC ID '{target_plc_id}' not found in the project data.*")
try:
with open(md_file_path, "w", encoding="utf-8") as f:
f.write("\n".join(markdown_lines))
print(f"\nMarkdown summary written to: {md_file_path}")
print(f"Markdown summary (PLC not found) written to: {md_file_path}")
except Exception as e:
print(f"ERROR writing Markdown file {md_file_path}: {e}")
return
markdown_lines.append(f"Identified {len(project_data['plcs'])} PLC(s).")
# Content previously in the loop now directly uses plc_info and target_plc_id
markdown_lines.append(f"\n## PLC: {plc_info.get('name', target_plc_id)}")
type_name = plc_info.get("type_name", "N/A")
order_num = plc_info.get("order_number", "N/A")
firmware = plc_info.get("firmware_version", "N/A")
if type_name and type_name != "N/A":
markdown_lines.append(f"- **Type Name:** `{type_name}`")
if order_num and order_num != "N/A":
markdown_lines.append(f"- **Order Number:** `{order_num}`")
if firmware and firmware != "N/A":
markdown_lines.append(f"- **Firmware:** `{firmware}`")
for plc_id, plc_info in project_data.get("plcs", {}).items():
markdown_lines.append(f"\n## PLC: {plc_info.get('name', plc_id)}")
type_name = plc_info.get("type_name", "N/A")
order_num = plc_info.get("order_number", "N/A")
firmware = plc_info.get("firmware_version", "N/A")
if type_name and type_name != "N/A":
markdown_lines.append(f"- **Type Name:** `{type_name}`")
if order_num and order_num != "N/A":
markdown_lines.append(f"- **Order Number:** `{order_num}`")
if firmware and firmware != "N/A":
markdown_lines.append(f"- **Firmware:** `{firmware}`")
# ID removed
plc_networks = plc_info.get("connected_networks", {})
markdown_lines.append("\n- **Networks:**")
if not plc_networks:
markdown_lines.append(
" - *No network connections found associated with this PLC object.*"
)
else:
sorted_network_items = sorted(
plc_networks.items(),
key=lambda item: project_data.get("networks", {})
.get(item[0], {})
.get("name", item[0]),
)
for net_id, plc_addr_on_net in sorted_network_items:
net_info = project_data.get("networks", {}).get(net_id)
if not net_info:
markdown_lines.append(
f" - !!! Error: Network info missing for ID {net_id} !!!"
)
continue
plc_networks = plc_info.get("connected_networks", {})
markdown_lines.append("\n- **Networks:**")
if not plc_networks:
markdown_lines.append(
" - *No network connections found associated with this PLC object.*"
f" - ### {net_info.get('name', net_id)} ({net_info.get('type', 'Unknown')})"
)
else:
sorted_network_items = sorted(
plc_networks.items(),
key=lambda item: project_data.get("networks", {})
.get(item[0], {})
.get("name", item[0]),
markdown_lines.append(
f" - PLC Address on this Net: `{plc_addr_on_net}`"
)
markdown_lines.append(f" - **Devices on Network:**")
devices_on_this_net = net_info.get("devices_on_net", {})
def sort_key(item):
node_id, node_addr = item
try:
parts = [int(p) for p in re.findall(r"\d+", node_addr)]
return parts
except:
return [float("inf")]
plc_interface_and_node_ids = set()
for node in plc_info.get("network_nodes", []):
plc_interface_and_node_ids.add(node["id"])
interface_id_lookup = (
project_data["devices"].get(node["id"], {}).get("parent_id")
)
if interface_id_lookup:
plc_interface_and_node_ids.add(interface_id_lookup)
plc_interface_and_node_ids.add(target_plc_id) # Use target_plc_id here
other_device_items = sorted(
[
(node_id, node_addr)
for node_id, node_addr in devices_on_this_net.items()
if node_id not in plc_interface_and_node_ids
],
key=sort_key,
)
for net_id, plc_addr_on_net in sorted_network_items:
net_info = project_data.get("networks", {}).get(net_id)
if not net_info:
markdown_lines.append(
f" - !!! Error: Network info missing for ID {net_id} !!!"
)
continue
markdown_lines.append(
f" - ### {net_info.get('name', net_id)} ({net_info.get('type', 'Unknown')})"
)
markdown_lines.append(
f" - PLC Address on this Net: `{plc_addr_on_net}`"
)
markdown_lines.append(f" - **Devices on Network:**")
devices_on_this_net = net_info.get("devices_on_net", {})
def sort_key(item):
node_id, node_addr = item
try:
parts = [int(p) for p in re.findall(r"\d+", node_addr)]
return parts
except:
return [float("inf")]
plc_interface_and_node_ids = set()
for node in plc_info.get("network_nodes", []):
plc_interface_and_node_ids.add(node["id"])
interface_id = (
project_data["devices"].get(node["id"], {}).get("parent_id")
)
if interface_id:
plc_interface_and_node_ids.add(interface_id)
plc_interface_and_node_ids.add(plc_id)
other_device_items = sorted(
[
(node_id, node_addr)
for node_id, node_addr in devices_on_this_net.items()
if node_id not in plc_interface_and_node_ids
],
key=sort_key,
)
if not other_device_items:
markdown_lines.append(" - *None (besides PLC interfaces)*")
else:
# --- Display Logic with Sibling IO Aggregation & Aesthetics ---
for node_id, node_addr in other_device_items:
node_info = project_data.get("devices", {}).get(node_id)
if not node_info:
markdown_lines.append(
f" - !!! Error: Node info missing for ID {node_id} Addr: {node_addr} !!!"
)
continue
interface_id = node_info.get("parent_id")
interface_info = None
actual_device_id = None
actual_device_info = None
rack_id = None
rack_info = None
if interface_id:
interface_info = project_data.get("devices", {}).get(
interface_id
)
if interface_info:
actual_device_id = interface_info.get("parent_id")
if actual_device_id:
actual_device_info = project_data.get(
"devices", {}
).get(actual_device_id)
if actual_device_info:
potential_rack_id = actual_device_info.get(
"parent_id"
)
if potential_rack_id:
potential_rack_info = project_data.get(
"devices", {}
).get(potential_rack_id)
if potential_rack_info and (
"Rack"
in potential_rack_info.get("name", "")
or potential_rack_info.get("position")
is None
):
rack_id = potential_rack_id
rack_info = potential_rack_info
display_info_title = (
actual_device_info
if actual_device_info
else (interface_info if interface_info else node_info)
)
display_id_title = (
actual_device_id
if actual_device_info
else (interface_id if interface_info else node_id)
)
io_search_root_id = (
actual_device_id
if actual_device_info
else (interface_id if interface_info else node_id)
)
io_search_root_info = project_data.get("devices", {}).get(
io_search_root_id
)
# Construct Title
display_name = display_info_title.get("name", display_id_title)
via_node_name = node_info.get("name", node_id)
title_str = f"#### {display_name}"
if display_id_title != node_id:
title_str += f" (via {via_node_name} @ `{node_addr}`)"
else:
title_str += f" (@ `{node_addr}`)"
markdown_lines.append(f" - {title_str}")
# Display Basic Details
if not other_device_items:
markdown_lines.append(" - *None (besides PLC interfaces)*")
else:
# --- Display Logic with Sibling IO Aggregation & Aesthetics ---
for node_id, node_addr in other_device_items:
node_info = project_data.get("devices", {}).get(node_id)
if not node_info:
markdown_lines.append(
f" - Address (on net): `{node_addr}`"
f" - !!! Error: Node info missing for ID {node_id} Addr: {node_addr} !!!"
)
type_name_disp = display_info_title.get("type_name", "N/A")
order_num_disp = display_info_title.get("order_number", "N/A")
pos_disp = display_info_title.get("position", "N/A")
if type_name_disp and type_name_disp != "N/A":
markdown_lines.append(
f" - Type Name: `{type_name_disp}`"
)
if order_num_disp and order_num_disp != "N/A":
markdown_lines.append(
f" - Order No: `{order_num_disp}`"
)
if pos_disp and pos_disp != "N/A":
markdown_lines.append(
f" - Pos (in parent): `{pos_disp}`"
)
ultimate_parent_id = rack_id
if not ultimate_parent_id and actual_device_info:
ultimate_parent_id = actual_device_info.get("parent_id")
if (
ultimate_parent_id
and ultimate_parent_id != display_id_title
):
ultimate_parent_info = project_data.get("devices", {}).get(
ultimate_parent_id
)
ultimate_parent_name = (
ultimate_parent_info.get("name", "?")
if ultimate_parent_info
else "?"
)
markdown_lines.append(
f" - Parent Structure: `{ultimate_parent_name}`"
) # Removed ID here
continue
# --- IO Aggregation Logic (from v24) ---
aggregated_io_addresses = []
parent_structure_id = (
io_search_root_info.get("parent_id")
if io_search_root_info
else None
interface_id = node_info.get("parent_id")
interface_info_dev = None # Renamed to avoid conflict
actual_device_id = None
actual_device_info = None
rack_id = None
# rack_info = None # rack_info was not used
if interface_id:
interface_info_dev = project_data.get("devices", {}).get(
interface_id
)
io_search_root_name_disp = (
io_search_root_info.get("name", "?")
if io_search_root_info
if interface_info_dev:
actual_device_id = interface_info_dev.get("parent_id")
if actual_device_id:
actual_device_info = project_data.get(
"devices", {}
).get(actual_device_id)
if actual_device_info:
potential_rack_id = actual_device_info.get(
"parent_id"
)
if potential_rack_id:
potential_rack_info = project_data.get(
"devices", {}
).get(potential_rack_id)
if potential_rack_info and (
"Rack"
in potential_rack_info.get("name", "")
or potential_rack_info.get("position")
is None
):
rack_id = potential_rack_id
# rack_info = potential_rack_info # Not used
display_info_title = (
actual_device_info
if actual_device_info
else (interface_info_dev if interface_info_dev else node_info)
)
display_id_title = (
actual_device_id
if actual_device_info
else (interface_id if interface_info_dev else node_id)
)
io_search_root_id = (
actual_device_id
if actual_device_info
else (interface_id if interface_info_dev else node_id)
)
io_search_root_info = project_data.get("devices", {}).get(
io_search_root_id
)
# Construct Title
display_name = display_info_title.get("name", display_id_title)
via_node_name = node_info.get("name", node_id)
title_str = f"#### {display_name}"
if display_id_title != node_id:
title_str += f" (via {via_node_name} @ `{node_addr}`)"
else:
title_str += f" (@ `{node_addr}`)"
markdown_lines.append(f" - {title_str}")
# Display Basic Details
markdown_lines.append(
f" - Address (on net): `{node_addr}`"
)
type_name_disp = display_info_title.get("type_name", "N/A")
order_num_disp = display_info_title.get("order_number", "N/A")
pos_disp = display_info_title.get("position", "N/A")
if type_name_disp and type_name_disp != "N/A":
markdown_lines.append(
f" - Type Name: `{type_name_disp}`"
)
if order_num_disp and order_num_disp != "N/A":
markdown_lines.append(
f" - Order No: `{order_num_disp}`"
)
if pos_disp and pos_disp != "N/A":
markdown_lines.append(
f" - Pos (in parent): `{pos_disp}`"
)
ultimate_parent_id = rack_id
if not ultimate_parent_id and actual_device_info:
ultimate_parent_id = actual_device_info.get("parent_id")
if (
ultimate_parent_id
and ultimate_parent_id != display_id_title
):
ultimate_parent_info = project_data.get("devices", {}).get(
ultimate_parent_id
)
ultimate_parent_name = (
ultimate_parent_info.get("name", "?")
if ultimate_parent_info
else "?"
)
markdown_lines.append(
f" - Parent Structure: `{ultimate_parent_name}`"
)
if parent_structure_id:
parent_structure_info = project_data.get("devices", {}).get(
parent_structure_id
)
parent_structure_name = (
parent_structure_info.get("name", "?")
if parent_structure_info
else "?"
)
search_title = f"parent '{parent_structure_name}'"
sibling_found_io = False
for dev_scan_id, dev_scan_info in project_data.get(
"devices", {}
).items():
if (
dev_scan_info.get("parent_id")
== parent_structure_id
):
io_from_sibling = find_io_recursively(
dev_scan_id, project_data
)
if io_from_sibling:
aggregated_io_addresses.extend(io_from_sibling)
sibling_found_io = True
# --- IO Aggregation Logic (from v24) ---
aggregated_io_addresses = []
parent_structure_id = (
io_search_root_info.get("parent_id")
if io_search_root_info
else None
)
io_search_root_name_disp = (
io_search_root_info.get("name", "?")
if io_search_root_info
else "?"
)
if parent_structure_id:
parent_structure_info = project_data.get("devices", {}).get(
parent_structure_id
)
parent_structure_name = (
parent_structure_info.get("name", "?")
if parent_structure_info
else "?"
)
search_title = f"parent '{parent_structure_name}'"
sibling_found_io = False
for dev_scan_id, dev_scan_info in project_data.get(
"devices", {}
).items():
if (
not sibling_found_io and not aggregated_io_addresses
): # Only show message if list still empty
markdown_lines.append(
f" - *No IO Addresses found in modules under {search_title} (ID: {parent_structure_id}).*"
dev_scan_info.get("parent_id") == parent_structure_id
):
# This dev_scan_info is the module
module_context_for_sibling = {
"id": dev_scan_id,
"name": dev_scan_info.get("name", dev_scan_id),
"order_number": dev_scan_info.get("order_number", "N/A"),
"type_name": dev_scan_info.get("type_name", "N/A")
}
io_from_sibling = find_io_recursively(
dev_scan_id, project_data, module_context_for_sibling
)
elif io_search_root_id:
search_title = f"'{io_search_root_name_disp}'"
aggregated_io_addresses = find_io_recursively(
io_search_root_id, project_data
)
if not aggregated_io_addresses:
markdown_lines.append(
f" - *No IO Addresses found in modules under {search_title} (ID: {io_search_root_id}).*"
)
else:
if io_from_sibling:
aggregated_io_addresses.extend(io_from_sibling)
sibling_found_io = True
if (
not sibling_found_io and not aggregated_io_addresses
): # Only show message if list still empty
markdown_lines.append(
f" - *Could not determine structure to search for IO addresses.*"
f" - *No IO Addresses found in modules under {search_title} (ID: {parent_structure_id}).*"
)
# --- End IO Aggregation ---
# Display aggregated IO Addresses with Siemens format (Cleaned)
if aggregated_io_addresses:
elif io_search_root_id:
search_title = f"'{io_search_root_name_disp}'"
module_context_for_root = {
"id": io_search_root_id,
"name": io_search_root_info.get("name", io_search_root_id),
"order_number": io_search_root_info.get("order_number", "N/A"),
"type_name": io_search_root_info.get("type_name", "N/A")
}
aggregated_io_addresses = find_io_recursively(
io_search_root_id, project_data, module_context_for_root
)
if not aggregated_io_addresses:
markdown_lines.append(
f" - **IO Addresses (Aggregated from Structure):**"
) # Removed redundant search root name
sorted_agg_io = sorted(
aggregated_io_addresses,
key=lambda x: (
(
int(x.get("module_pos", "9999"))
if x.get("module_pos", "9999").isdigit()
else 9999
),
x.get("module_name", ""),
x.get("type", ""),
(
int(x.get("start", "0"))
if x.get("start", "0").isdigit()
else float("inf")
),
f" - *No IO Addresses found in modules under {search_title} (ID: {io_search_root_id}).*"
)
else:
markdown_lines.append(
f" - *Could not determine structure to search for IO addresses.*"
)
# --- End IO Aggregation ---
# Display aggregated IO Addresses with Siemens format (Cleaned)
if aggregated_io_addresses:
markdown_lines.append(
f" - **IO Addresses (Aggregated from Structure):**"
)
sorted_agg_io = sorted(
aggregated_io_addresses,
key=lambda x: (
(
int(x.get("module_pos", "9999"))
if str(x.get("module_pos", "9999")).isdigit() # Ensure it's a string before isdigit
else 9999
),
)
last_module_id_key = None
for addr_info in sorted_agg_io:
current_module_id_key = (
addr_info.get("module_name", "?"),
addr_info.get("module_pos", "?"),
)
if current_module_id_key != last_module_id_key:
markdown_lines.append(
f" - **From Module:** {addr_info.get('module_name','?')} (Pos: {addr_info.get('module_pos','?')})"
)
last_module_id_key = current_module_id_key
# --- Siemens IO Formatting (from v25.1 - keep fixes) ---
io_type = addr_info.get("type", "?")
start_str = addr_info.get("start", "?")
length_str = addr_info.get("length", "?")
area_str = addr_info.get("area", "?")
siemens_addr = f"FMT_ERROR" # Default error
length_bits = 0
try:
start_byte = int(start_str)
length_bits = int(length_str)
length_bytes = math.ceil(
length_bits / 8.0
) # Use float division
if length_bits > 0 and length_bytes == 0:
length_bytes = 1 # Handle len < 8 bits
end_byte = start_byte + length_bytes - 1
prefix = "P?"
if io_type.lower() == "input":
prefix = "EW"
elif io_type.lower() == "output":
prefix = "AW"
siemens_addr = f"{prefix} {start_byte}..{end_byte}"
except Exception: # Catch any error during calc/format
siemens_addr = (
f"FMT_ERROR({start_str},{length_str})"
)
markdown_lines.append(
f" - `{siemens_addr}` (Len={length_bits} bits)" # Simplified output
)
# --- End Siemens IO Formatting ---
# IO Connections logic remains the same...
links_from = project_data.get("links_by_source", {}).get(
display_id_title, []
x.get("module_name", ""),
x.get("type", ""),
(
int(x.get("start", "0"))
if str(x.get("start", "0")).isdigit() # Ensure it's a string
else float("inf")
),
),
)
links_to = project_data.get("links_by_target", {}).get(
display_id_title, []
)
io_conns = []
for link in links_from:
if "channel" in link["source_suffix"].lower():
target_str = f"{link.get('target_device_name', link['target_id'])}:{link['target_suffix']}"
if link["target_id"] == display_id_title:
target_str = link["target_suffix"]
io_conns.append(
f"`{link['source_suffix']}` → `{target_str}`"
last_module_id_processed = None # Use the actual module ID for grouping
for addr_info in sorted_agg_io:
current_module_id_for_grouping = addr_info.get("module_id")
if current_module_id_for_grouping != last_module_id_processed:
module_name_disp = addr_info.get('module_name','?')
module_type_name_disp = addr_info.get('module_type_name', 'N/A')
module_order_num_disp = addr_info.get('module_order_number', 'N/A')
module_line_parts = [f"**{module_name_disp}**"]
if module_type_name_disp and module_type_name_disp != 'N/A':
module_line_parts.append(f"Type: `{module_type_name_disp}`")
if module_order_num_disp and module_order_num_disp != 'N/A':
module_line_parts.append(f"OrderNo: `{module_order_num_disp}`")
# Removed (Pos: ...) from this line as requested
markdown_lines.append(f" - {', '.join(module_line_parts)}")
last_module_id_processed = current_module_id_for_grouping
# --- Siemens IO Formatting (from v25.1 - keep fixes) ---
io_type = addr_info.get("type", "?")
start_str = addr_info.get("start", "?")
length_str = addr_info.get("length", "?")
# area_str = addr_info.get("area", "?") # Not used in final output string
siemens_addr = f"FMT_ERROR" # Default error
length_bits = 0
try:
start_byte = int(start_str)
length_bits = int(length_str)
length_bytes = math.ceil(
length_bits / 8.0
) # Use float division
if length_bits > 0 and length_bytes == 0:
length_bytes = 1 # Handle len < 8 bits
end_byte = start_byte + length_bytes - 1
prefix = "P?"
if io_type.lower() == "input":
prefix = "EW"
elif io_type.lower() == "output":
prefix = "AW"
siemens_addr = f"{prefix} {start_byte}..{end_byte}"
except Exception:
siemens_addr = (
f"FMT_ERROR({start_str},{length_str})"
)
# --- End Siemens IO Formatting ---
# IO Connections logic
links_from = project_data.get("links_by_source", {}).get(
display_id_title, []
)
links_to = project_data.get("links_by_target", {}).get(
display_id_title, []
)
io_conns = []
for link in links_from:
if "channel" in link["source_suffix"].lower():
target_str = f"{link.get('target_device_name', link['target_id'])}:{link['target_suffix']}"
if link["target_id"] == display_id_title:
target_str = link["target_suffix"]
io_conns.append(
f"`{link['source_suffix']}` → `{target_str}`"
)
for link in links_to:
if "channel" in link["target_suffix"].lower():
source_str = f"{link.get('source_device_name', link['source_id'])}:{link['source_suffix']}"
if link["source_id"] == display_id_title:
source_str = link["source_suffix"]
io_conns.append(
f"`{source_str}` → `{link['target_suffix']}`"
)
if io_conns:
markdown_lines.append(
f" - **IO Connections (Channels):**"
f" - `{siemens_addr}` (Len={length_bits} bits)"
)
for conn in sorted(list(set(io_conns))):
markdown_lines.append(f" - {conn}")
markdown_lines.append("") # Spacing
# --- *** END Display Logic *** ---
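The Siemens word-range computation used above (byte length rounded up from a bit length, `EW`/`AW` prefix by IO direction) can be sketched as a standalone helper; the function name here is illustrative, not part of x3.py:

```python
import math

def format_siemens_word_range(io_type, start_byte, length_bits):
    """Format a Siemens-style byte range (EW/AW start..end) from bit length."""
    # Round bit length up to whole bytes; lengths under 8 bits still occupy one byte
    length_bytes = math.ceil(length_bits / 8.0)
    if length_bits > 0 and length_bytes == 0:
        length_bytes = 1
    end_byte = start_byte + length_bytes - 1
    prefix = {"input": "EW", "output": "AW"}.get(io_type.lower(), "P?")
    return f"{prefix} {start_byte}..{end_byte}"

print(format_siemens_word_range("Input", 0, 16))   # → EW 0..1
print(format_siemens_word_range("Output", 10, 4))  # → AW 10..10
```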
try:
with open(md_file_path, "w", encoding="utf-8") as f:
@@ -923,10 +959,9 @@ def generate_io_upward_tree(project_data, md_file_path):
print(f"ERROR writing IO upward debug tree file {md_file_path}: {e}")
# --- process_aml_file function (unchanged from v22) ---
def process_aml_file(
aml_file_path, json_output_path, md_output_path, md_upward_output_path
):
# --- extract_and_save_global_outputs function (Refactored from process_aml_file) ---
def extract_and_save_global_outputs(aml_file_path, json_output_path, md_upward_output_path):
"""Extracts data from AML, saves global JSON and IO upward tree, returns project_data."""
# (Unchanged)
print(f"Processing AML file: {aml_file_path}")
if not os.path.exists(aml_file_path):
@@ -945,16 +980,20 @@ def process_aml_file(
except Exception as e:
print(f"ERROR writing JSON file {json_output_path}: {e}")
traceback.print_exc()
generate_markdown_tree(project_data, md_output_path) # v26 MD generation
# Generate and save the IO upward tree (global)
generate_io_upward_tree(
project_data, md_upward_output_path
) # v23 upward generation
)
return project_data
except ET.LxmlError as xml_err:
print(f"ERROR parsing XML file {aml_file_path} with lxml: {xml_err}")
traceback.print_exc()
return None
except Exception as e:
print(f"ERROR processing AML file {aml_file_path}: {e}")
traceback.print_exc()
return None
def select_cax_file(initial_dir=None): # Add initial_dir parameter
@@ -987,12 +1026,20 @@ def select_output_directory():
return dir_path
def sanitize_filename(name):
"""Sanitizes a string to be used as a valid filename or directory name."""
name = str(name) # Ensure it's a string
name = re.sub(r'[<>:"/\\|?*]', '_', name) # Replace forbidden characters
name = re.sub(r'\s+', '_', name) # Replace multiple whitespace with single underscore
name = name.strip('._') # Remove leading/trailing dots or underscores
return name if name else "Unnamed_Device"
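The sanitizer added in this commit can be exercised on its own; the body below reuses the function exactly as committed, with a couple of illustrative inputs:

```python
import re

def sanitize_filename(name):
    """Sanitizes a string to be used as a valid filename or directory name."""
    name = str(name)  # Ensure it's a string
    name = re.sub(r'[<>:"/\\|?*]', '_', name)  # Replace forbidden characters
    name = re.sub(r'\s+', '_', name)  # Replace multiple whitespace with single underscore
    name = name.strip('._')  # Remove leading/trailing dots or underscores
    return name if name else "Unnamed_Device"

print(sanitize_filename('PLC 1: Main/CPU'))  # → PLC_1__Main_CPU
print(sanitize_filename('   '))              # → Unnamed_Device
```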
# --- Main Execution ---
if __name__ == "__main__":
configs = load_configuration()
working_directory = configs.get("working_directory")
script_version = "v28 - Working Directory Integration" # Updated version
script_version = "v30 - Enhanced Module Info in Hardware Tree"
print(
f"--- AML (CAx Export) to Hierarchical JSON and Obsidian MD Converter ({script_version}) ---"
)
@@ -1031,21 +1078,40 @@ if __name__ == "__main__":
# Construct output file paths within the selected output directory (working_directory)
output_json_file = output_path / input_path.with_suffix(".hierarchical.json").name
output_md_file = output_path / input_path.with_name(f"{input_path.stem}_Hardware_Tree.md") # Simplified name
output_md_upward_file = output_path / input_path.with_name(f"{input_path.stem}_IO_Upward_Debug.md") # Simplified name
# Hardware tree MD name is now PLC-specific and handled below
output_md_upward_file = output_path / input_path.with_name(f"{input_path.stem}_IO_Upward_Debug.md")
print(f"Input AML: {input_path.resolve()}")
print(f"Output Directory: {output_path.resolve()}")
print(f"Output JSON: {output_json_file.resolve()}")
print(f"Output Main Tree MD: {output_md_file.resolve()}")
print(f"Output IO Debug Tree MD: {output_md_upward_file.resolve()}")
# Process the selected file and save outputs to the selected directory
process_aml_file(
# Process the AML file to get project_data and save global files
project_data = extract_and_save_global_outputs(
str(input_path),
str(output_json_file),
str(output_md_file),
str(output_md_upward_file),
)
if project_data:
# Now, generate the hardware tree per PLC
if not project_data.get("plcs"):
print("\nNo PLCs found in the project data. Cannot generate PLC-specific hardware trees.")
else:
print(f"\nFound {len(project_data['plcs'])} PLC(s). Generating individual hardware trees...")
for plc_id, plc_data_for_plc in project_data.get("plcs", {}).items():
plc_name_original = plc_data_for_plc.get('name', plc_id)
plc_name_sanitized = sanitize_filename(plc_name_original)
plc_doc_dir = output_path / plc_name_sanitized / "Documentation"
plc_doc_dir.mkdir(parents=True, exist_ok=True)
hardware_tree_md_filename = f"{input_path.stem}_Hardware_Tree.md"
output_plc_md_file = plc_doc_dir / hardware_tree_md_filename
print(f" Generating Hardware Tree for PLC '{plc_name_original}' (ID: {plc_id}) at: {output_plc_md_file.resolve()}")
generate_markdown_tree(project_data, str(output_plc_md_file), plc_id)
else:
print("\nFailed to process AML data. Halting before generating PLC-specific trees.")
print("\nScript finished.")
@@ -1,21 +1,36 @@
[16:58:28] Iniciando ejecución de x1.py en C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email...
[16:58:29] Working directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email
[16:58:29] Input directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email
[16:58:29] Output directory: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463
[16:58:29] Cronologia file: C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463\cronologia.md
[16:58:29] Attachments directory: C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email\adjuntos
[16:58:29] Beautify rules file: D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\EmailCrono\config\beautify_rules.json
[16:58:29] Found 1 .eml files
[16:58:29] Loaded 0 existing messages
[16:58:29] Processing C:\Trabajo\SIDEL\10 - E5.007095 - Modifica O&U - SAE463\Reporte\Email\R_ {EXT} E5.006894 - Modifica O&U - SAE463 New Analyzer.eml
[16:58:29] Aplicando reglas de prioridad 1
[16:58:29] Aplicando reglas de prioridad 2
[16:58:29] Aplicando reglas de prioridad 3
[16:58:29] Aplicando reglas de prioridad 4
[16:58:29] Estadísticas de procesamiento:
[16:58:29] - Total mensajes encontrados: 1
[16:58:29] - Mensajes únicos añadidos: 1
[16:58:29] - Mensajes duplicados ignorados: 0
[16:58:29] Writing 1 messages to C:/Users/migue/OneDrive/Miguel/Obsidean/Trabajo/VM/04-SIDEL/10 - E5.007095 - Modifica O&U - SAE463\cronologia.md
[16:58:29] Ejecución de x1.py finalizada (success). Duración: 0:00:00.434600.
[16:58:29] Log completo guardado en: D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\EmailCrono\log_x1.txt
[13:56:30] Iniciando ejecución de x3.py en C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport...
[13:56:30] --- AML (CAx Export) to Hierarchical JSON and Obsidian MD Converter (v30 - Enhanced Module Info in Hardware Tree) ---
[13:56:30] Using Working Directory for Output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
[13:56:34] Input AML: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
[13:56:34] Output Directory: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport
[13:56:34] Output JSON: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
[13:56:34] Output IO Debug Tree MD: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
[13:56:34] Processing AML file: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.aml
[13:56:34] Pass 1: Found 203 InternalElement(s). Populating device dictionary...
[13:56:34] Pass 2: Identifying PLCs and Networks (Refined v2)...
[13:56:34] Identified Network: PROFIBUS_1 (d645659a-3704-4cd6-b2c8-6165ceeed6ee) Type: Profibus
[13:56:34] Identified Network: ETHERNET_1 (f0b1c852-7dc9-4748-888e-34c60b519a75) Type: Ethernet/Profinet
[13:56:34] Identified PLC: PLC (a48e038f-0bcc-4b48-8373-033da316c62b) - Type: CPU 1516F-3 PN/DP OrderNo: 6ES7 516-3FP03-0AB0
[13:56:34] Pass 3: Processing InternalLinks (Robust Network Mapping & IO)...
[13:56:34] Found 116 InternalLink(s).
[13:56:34] Mapping Device/Node 'E1' (NodeID:439930b8-1bbc-4cb2-a93b-2eed931f4b12, Addr:10.1.33.11) to Network 'ETHERNET_1'
[13:56:34] --> Associating Network 'ETHERNET_1' with PLC 'PLC' (via Node 'E1' Addr: 10.1.33.11)
[13:56:34] Mapping Device/Node 'P1' (NodeID:904bb0f7-df2d-4c1d-ab65-f45480449db1, Addr:1) to Network 'PROFIBUS_1'
[13:56:34] --> Associating Network 'PROFIBUS_1' with PLC 'PLC' (via Node 'P1' Addr: 1)
[13:56:34] Mapping Device/Node 'PB1' (NodeID:2784bae8-9807-475f-89bd-bcf44282f5f4, Addr:12) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:e9c5f60a-1da2-4c9b-979e-7d03a5b58a44, Addr:20) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:dd7201c2-e127-4a9d-b6ae-7a74a4ffe418, Addr:21) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:d8825919-3a6c-4f95-aef0-62c782cfdb51, Addr:22) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:27d0e31d-46dc-4fdd-ab82-cfb91899a27c, Addr:10) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:d91d5905-aa1a-485e-b4eb-8333cc2133c2, Addr:8) to Network 'PROFIBUS_1'
[13:56:34] Mapping Device/Node 'PB1' (NodeID:0c5dfe06-786d-4ab6-b57c-8dfede56c2aa, Addr:40) to Network 'PROFIBUS_1'
[13:56:34] Data extraction and structuring complete.
[13:56:34] Generating JSON output: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export.hierarchical.json
[13:56:34] JSON data written successfully.
[13:56:34] IO upward debug tree written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\SAE196_c0.2_CAx_Export_IO_Upward_Debug.md
[13:56:34] Found 1 PLC(s). Generating individual hardware trees...
[13:56:34] Generating Hardware Tree for PLC 'PLC' (ID: a48e038f-0bcc-4b48-8373-033da316c62b) at: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
[13:56:34] Markdown summary written to: C:\Trabajo\SIDEL\06 - E5.007363 - Modifica O&U - SAE196 (cip integrato)\Reporte\IOExport\PLC\Documentation\SAE196_c0.2_CAx_Export_Hardware_Tree.md
[13:56:34] Script finished.
[13:56:34] Ejecución de x3.py finalizada (success). Duración: 0:00:03.887946.
[13:56:34] Log completo guardado en: D:\Proyectos\Scripts\ParamManagerScripts\backend\script_groups\ObtainIOFromProjectTia\log_x3.txt