Improved streaming variable management, ensuring that only active variables are transmitted and synchronizing streaming configurations at application startup. Fixed streaming initialization issues in auto-recovery and updated the configuration and system state files to reflect the recent changes. Also improved the user interface to correctly display the streaming status of variables.
parent 79479c368a
commit 3d4c2b3d42

@@ -4,6 +4,49 @@

### Latest Modifications (Current Session)

#### Streaming Status and Variable Enable Issues Fix

**Issues**: Three critical problems were affecting the streaming functionality:

1. Stream status showing "📡 Streaming: Active (undefined vars)" due to a property name mismatch
2. Auto-recovery not properly initializing streaming after an application reload
3. Variable Enable flags not being respected - all variables were exposed to PlotJuggler regardless of their individual streaming settings

**Root Cause Analysis**:

1. **Undefined vars**: The frontend expected `streaming_variables_count`, but the backend sent `total_streaming_variables`
2. **Auto-recovery**: UDP socket setup was missing during dataset restoration, so streaming threads started but data never reached PlotJuggler
3. **Variable filtering**: The system only checked the `streaming_variables` list and ignored each variable's individual `streaming: true/false` flag

**Solution Implementation**:

**Stream Status Fix**:

- Added a `streaming_variables_count` property to the backend status response for frontend compatibility
- Implemented dual-layer filtering: variables must be in the `streaming_variables` list AND have the `streaming: true` flag (see the sketch after this list)
- Updated the status calculation to count only truly active streaming variables

**Auto-Recovery Enhancement**:

- Modified `attempt_auto_recovery()` to set up the UDP socket before activating datasets (see the sketch after this list)
- Ensures the complete streaming infrastructure is established during automatic restoration
- Proper error handling if the UDP socket setup fails during recovery
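
A condensed sketch of the corrected recovery order. `setup_udp_socket()` and the log messages match the `main.py` diff below; `activate_dataset()` and the surrounding class context are assumptions shown only to illustrate the ordering:

```python
# Condensed sketch of the fixed recovery order. setup_udp_socket() and the
# logging calls match the main.py diff below; activate_dataset() is an
# assumed helper, included only to illustrate the ordering.
def attempt_auto_recovery(self):
    if not self.last_state.get("should_stream", False):
        return
    self.logger.info("Attempting to restore streaming...")

    # The fix: establish the UDP transport BEFORE activating datasets, so
    # the streaming threads started during activation can reach PlotJuggler.
    if not self.setup_udp_socket():
        self.logger.warning("Failed to setup UDP socket during auto-recovery")
        return  # abort instead of starting threads that send nowhere

    # Restore the datasets that were active before the restart.
    for dataset_id in self.last_state.get("active_datasets", []):
        self.activate_dataset(dataset_id)
```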

**Variable Enable Filtering**:

- Enhanced `dataset_streaming_loop()` to filter variables using both criteria: presence in the streaming list AND the individual streaming flag
- Updated `toggle_variable_streaming()` to keep list membership and individual flags consistent
- Added a `sync_streaming_variables()` function to fix existing data inconsistencies (condensed sketch after this list)
- Automatic synchronization on application startup ensures data integrity
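
A condensed, standalone version of the synchronization logic (the full method, with logging and persistence via `save_datasets()`, appears in the `main.py` diff below; the function name here is shortened for illustration):

```python
# Condensed sketch of sync_streaming_variables(): make every variable's
# individual `streaming` flag agree with the dataset's streaming_variables
# list. Returns True when an inconsistency was repaired.
def sync_streaming_flags(datasets: dict) -> bool:
    changed = False
    for dataset_info in datasets.values():
        listed = set(dataset_info.get("streaming_variables", []))
        for var_name, var_config in dataset_info.get("variables", {}).items():
            should_stream = var_name in listed
            if var_config.get("streaming", False) != should_stream:
                var_config["streaming"] = should_stream  # repair the flag
                changed = True
    return changed
```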

**Technical Changes**:

- **Backend Status**: Now returns both `total_streaming_variables` and `streaming_variables_count` for compatibility (see the sketch after this list)
- **Streaming Filter**: Double verification before sending data to PlotJuggler
- **Data Consistency**: Automatic synchronization of streaming flags with the streaming variables lists
- **Auto-Recovery**: UDP socket initialization included in the restoration process
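
For reference, a sketch of the relevant slice of the status payload after the fix (keys taken from the `main.py` diff below; the values are illustrative):

```python
# Illustrative slice of the status response; both counter keys now carry
# the same dual-filtered count (values are made up for this example).
status = {
    "plc_connected": True,
    "active_datasets_count": 1,
    "total_variables": 4,
    "total_streaming_variables": 4,  # original key, kept for existing consumers
    "streaming_variables_count": 4,  # duplicate key added for the frontend
}
```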

**User Experience Impact**:

- Accurate variable count display in the stream status
- Automatic streaming restoration after an application restart
- Precise control over which variables are actually streamed to PlotJuggler
- Consistent behavior between UI settings and actual data transmission

### Previous Modifications

#### Frontend Table Update Bug Fix

**Issue**: When adding variables using the "➕ Add Variable" button, the variables were successfully added to the backend dataset, but the variables table in the frontend did not refresh to show the newly added variable.

@@ -6,6 +6,7 @@ __pycache__/

 # C extensions
 *.so
 *.csv
 *.lock

 # Distribution / packaging
 .Python

@@ -1067,8 +1067,200 @@
       "prefix": "mixer",
       "sampling_interval": 1.0
     }
   },
+  {
+    "timestamp": "2025-07-17T17:57:48.090115",
+    "level": "info",
+    "event_type": "Application started",
+    "message": "Application initialization completed successfully",
+    "details": {}
+  },
+  {
+    "timestamp": "2025-07-17T17:57:48.111503",
+    "level": "info",
+    "event_type": "plc_connection",
+    "message": "Successfully connected to PLC 10.1.33.11",
+    "details": {
+      "ip": "10.1.33.11",
+      "rack": 0,
+      "slot": 2
+    }
+  },
+  {
+    "timestamp": "2025-07-17T17:57:48.115971",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T17:59:20.355788",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T17:59:20.358047",
+    "level": "info",
+    "event_type": "streaming_started",
+    "message": "Multi-dataset streaming started: 1 datasets activated",
+    "details": {
+      "activated_datasets": 1,
+      "total_datasets": 2,
+      "udp_host": "127.0.0.1",
+      "udp_port": 9870
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:00:01.148713",
+    "level": "info",
+    "event_type": "Application started",
+    "message": "Application initialization completed successfully",
+    "details": {}
+  },
+  {
+    "timestamp": "2025-07-17T18:00:01.174850",
+    "level": "info",
+    "event_type": "plc_connection",
+    "message": "Successfully connected to PLC 10.1.33.11",
+    "details": {
+      "ip": "10.1.33.11",
+      "rack": 0,
+      "slot": 2
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:00:01.179115",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:12.766609",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:12.769824",
+    "level": "info",
+    "event_type": "streaming_started",
+    "message": "Multi-dataset streaming started: 1 datasets activated",
+    "details": {
+      "activated_datasets": 1,
+      "total_datasets": 2,
+      "udp_host": "127.0.0.1",
+      "udp_port": 9870
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:26.427183",
+    "level": "info",
+    "event_type": "Application started",
+    "message": "Application initialization completed successfully",
+    "details": {}
+  },
+  {
+    "timestamp": "2025-07-17T18:01:26.451766",
+    "level": "info",
+    "event_type": "plc_connection",
+    "message": "Successfully connected to PLC 10.1.33.11",
+    "details": {
+      "ip": "10.1.33.11",
+      "rack": 0,
+      "slot": 2
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:26.456954",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:30.909879",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:01:30.911944",
+    "level": "info",
+    "event_type": "streaming_started",
+    "message": "Multi-dataset streaming started: 1 datasets activated",
+    "details": {
+      "activated_datasets": 1,
+      "total_datasets": 2,
+      "udp_host": "127.0.0.1",
+      "udp_port": 9870
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:20:09.887378",
+    "level": "info",
+    "event_type": "Application started",
+    "message": "Application initialization completed successfully",
+    "details": {}
+  },
+  {
+    "timestamp": "2025-07-17T18:20:09.913286",
+    "level": "info",
+    "event_type": "plc_connection",
+    "message": "Successfully connected to PLC 10.1.33.11",
+    "details": {
+      "ip": "10.1.33.11",
+      "rack": 0,
+      "slot": 2
+    }
+  },
+  {
+    "timestamp": "2025-07-17T18:20:09.917270",
+    "level": "info",
+    "event_type": "dataset_activated",
+    "message": "Dataset activated: DAR",
+    "details": {
+      "dataset_id": "dar",
+      "variables_count": 4,
+      "streaming_count": 4,
+      "prefix": "dar"
+    }
+  }
 ],
-"last_updated": "2025-07-17T17:43:27.747295",
-"total_entries": 95
+"last_updated": "2025-07-17T18:20:09.917270",
+"total_entries": 113
 }

main.py (101 lines changed)

@@ -86,6 +86,7 @@ class PLCDataStreamer:
         # Load configuration from files
         self.load_configuration()
         self.load_datasets()  # Load multiple datasets configuration
+        self.sync_streaming_variables()  # Synchronize streaming variables configuration
         self.load_system_state()
         self.load_events_log()

@@ -187,6 +188,42 @@ class PLCDataStreamer:
         except Exception as e:
             self.logger.error(f"Error saving datasets: {e}")

+    def sync_streaming_variables(self):
+        """Synchronize streaming variables configuration - ensure variables in streaming_variables list have streaming=true"""
+        try:
+            sync_needed = False
+            for dataset_id, dataset_info in self.datasets.items():
+                streaming_vars = dataset_info.get("streaming_variables", [])
+                variables_config = dataset_info.get("variables", {})
+
+                for var_name in streaming_vars:
+                    if var_name in variables_config:
+                        # If variable is in streaming list but doesn't have streaming=true, fix it
+                        if not variables_config[var_name].get("streaming", False):
+                            variables_config[var_name]["streaming"] = True
+                            sync_needed = True
+                            self.logger.info(
+                                f"Synchronized streaming flag for variable '{var_name}' in dataset '{dataset_id}'"
+                            )
+
+                # Also ensure variables not in streaming list have streaming=false
+                for var_name, var_config in variables_config.items():
+                    if var_name not in streaming_vars and var_config.get(
+                        "streaming", False
+                    ):
+                        var_config["streaming"] = False
+                        sync_needed = True
+                        self.logger.info(
+                            f"Disabled streaming flag for variable '{var_name}' in dataset '{dataset_id}'"
+                        )
+
+            if sync_needed:
+                self.save_datasets()
+                self.logger.info("Streaming variables configuration synchronized")
+
+        except Exception as e:
+            self.logger.error(f"Error synchronizing streaming variables: {e}")
+
     def create_dataset(
         self, dataset_id: str, name: str, prefix: str, sampling_interval: float = None
     ):

@@ -323,12 +360,13 @@ class PLCDataStreamer:

         if area == "db":
             var_config["db"] = db
-        if area in ["e", "a", "mb"]:
+        if area in ["e", "a", "mb"] or (area == "db" and bit is not None):
             var_config["bit"] = bit

         # Add to dataset
         self.datasets[dataset_id]["variables"][name] = var_config

         # Update streaming variables list if streaming is enabled
         if streaming:
             if name not in self.datasets[dataset_id]["streaming_variables"]:
                 self.datasets[dataset_id]["streaming_variables"].append(name)

@@ -340,7 +378,9 @@ class PLCDataStreamer:

         # Log the addition
         area_description = {
-            "db": f"DB{db}.{offset}",
+            "db": (
+                f"DB{db}.DBX{offset}.{bit}" if bit is not None else f"DB{db}.{offset}"
+            ),
             "mw": f"MW{offset}",
             "m": f"M{offset}",
             "pew": f"PEW{offset}",

@@ -403,6 +443,10 @@ class PLCDataStreamer:
         if name not in self.datasets[dataset_id]["variables"]:
             raise ValueError(f"Variable '{name}' not found in dataset '{dataset_id}'")

+        # Update the individual variable streaming flag
+        self.datasets[dataset_id]["variables"][name]["streaming"] = enabled
+
         # Update the streaming variables list
         if enabled:
             if name not in self.datasets[dataset_id]["streaming_variables"]:
                 self.datasets[dataset_id]["streaming_variables"].append(name)

@@ -537,12 +581,14 @@ class PLCDataStreamer:
         # Write to CSV (all variables)
         self.write_dataset_csv_data(dataset_id, all_data)

-        # Get filtered data for streaming
+        # Get filtered data for streaming - only variables that are in streaming_variables list AND have streaming=true
         streaming_variables = dataset_info.get("streaming_variables", [])
+        dataset_vars_config = dataset_info.get("variables", {})
         streaming_data = {
             name: value
             for name, value in all_data.items()
             if name in streaming_variables
+            and dataset_vars_config.get(name, {}).get("streaming", False)
         }

         # Send filtered data to PlotJuggler

@@ -838,6 +884,13 @@ class PLCDataStreamer:
         if self.last_state.get("should_stream", False):
             self.logger.info("Attempting to restore streaming...")

+            # Setup UDP socket first
+            if not self.setup_udp_socket():
+                self.logger.warning(
+                    "Failed to setup UDP socket during auto-recovery"
+                )
+                return
+
             # Restore active datasets
             restored_datasets = self.last_state.get("active_datasets", [])
             activated_count = 0

@@ -1022,6 +1075,7 @@ class PLCDataStreamer:
                 f"For bit areas ({area}), bit position must be specified (0-7)"
             )

+        # Validate the bit range for all areas that support it
         if bit is not None and (bit < 0 or bit > 7):
             raise ValueError("Bit position must be between 0 and 7")

@@ -1037,8 +1091,8 @@ class PLCDataStreamer:
         if area == "db":
             var_config["db"] = db

-        # Add bit position for bit areas
-        if area in ["e", "a", "mb"]:
+        # Add bit position for bit areas and DB with specific bit
+        if area in ["e", "a", "mb"] or (area == "db" and bit is not None):
             var_config["bit"] = bit

         self.variables[name] = var_config

@@ -1056,7 +1110,9 @@ class PLCDataStreamer:

         # Updated area description to include bit addresses
         area_description = {
-            "db": f"DB{db}.{offset}",
+            "db": (
+                f"DB{db}.DBX{offset}.{bit}" if bit is not None else f"DB{db}.{offset}"
+            ),
             "mw": f"MW{offset}",
             "m": f"M{offset}",
             "pew": f"PEW{offset}",

@@ -1353,7 +1409,12 @@ class PLCDataStreamer:
             value = struct.unpack(">h", raw_data)[0]
         elif var_type == "bool":
             raw_data = self.plc.db_read(db, offset, 1)
-            value = bool(raw_data[0] & 0x01)
+            if bit is not None:
+                # Use snap7.util.get_bool for specific bit extraction
+                value = snap7.util.get_bool(raw_data, 0, bit)
+            else:
+                # Default to bit 0 for backward compatibility
+                value = bool(raw_data[0] & 0x01)
         elif var_type == "dint":
             raw_data = self.plc.db_read(db, offset, 4)
             value = struct.unpack(">l", raw_data)[0]

@@ -1726,9 +1787,18 @@ class PLCDataStreamer:
         total_variables = sum(
             len(dataset["variables"]) for dataset in self.datasets.values()
         )
-        total_streaming_vars = sum(
-            len(dataset["streaming_variables"]) for dataset in self.datasets.values()
-        )
+
+        # Count only variables that are in streaming_variables list AND have streaming=true
+        total_streaming_vars = 0
+        for dataset in self.datasets.values():
+            streaming_vars = dataset.get("streaming_variables", [])
+            variables_config = dataset.get("variables", {})
+            active_streaming_vars = [
+                var
+                for var in streaming_vars
+                if variables_config.get(var, {}).get("streaming", False)
+            ]
+            total_streaming_vars += len(active_streaming_vars)

         return {
             "plc_connected": self.connected,

@@ -1739,6 +1809,7 @@ class PLCDataStreamer:
             "active_datasets_count": len(self.active_datasets),
             "total_variables": total_variables,
             "total_streaming_variables": total_streaming_vars,
+            "streaming_variables_count": total_streaming_vars,  # Add this for frontend compatibility
             "sampling_interval": self.sampling_interval,
             "current_dataset_id": self.current_dataset_id,
             "datasets": {

@@ -1746,7 +1817,15 @@ class PLCDataStreamer:
                 "name": info["name"],
                 "prefix": info["prefix"],
                 "variables_count": len(info["variables"]),
-                "streaming_count": len(info["streaming_variables"]),
+                "streaming_count": len(
+                    [
+                        var
+                        for var in info.get("streaming_variables", [])
+                        if info.get("variables", {})
+                        .get(var, {})
+                        .get("streaming", False)
+                    ]
+                ),
                 "sampling_interval": info.get("sampling_interval"),
                 "enabled": info.get("enabled", False),
                 "active": dataset_id in self.active_datasets,

@@ -14,26 +14,25 @@
       "area": "db",
       "offset": 18,
       "type": "real",
-      "streaming": false,
+      "streaming": true,
       "db": 2122
     },
     "UR29_Brix_Digital": {
       "area": "db",
       "offset": 40,
       "type": "real",
-      "streaming": false,
+      "streaming": true,
       "db": 2120
     },
     "CTS306_Conditi": {
       "area": "db",
       "offset": 18,
       "type": "real",
-      "streaming": false,
+      "streaming": true,
       "db": 2124
     }
   },
   "streaming_variables": [
     "PEW300",
     "UR29_Brix_Digital",
     "UR62_Brix",
     "CTS306_Conditi"

@@ -55,7 +54,7 @@
   "active_datasets": [
     "dar"
   ],
-  "current_dataset_id": "mixer",
+  "current_dataset_id": "dar",
   "version": "1.0",
-  "last_update": "2025-07-17T17:43:40.796590"
+  "last_update": "2025-07-17T18:21:25.684433"
 }

@@ -1 +1 @@
-32896
+37704

@@ -7,5 +7,5 @@
     ]
   },
   "auto_recovery_enabled": true,
-  "last_update": "2025-07-17T17:14:45.397366"
+  "last_update": "2025-07-17T18:20:09.918278"
 }