Implemented an independent UDP streaming control system, separating automatic CSV recording from manual transmission to PlotJuggler. New API endpoints were added to start and stop UDP streaming, and event handling and state logging were improved. The system configuration and state files were updated to reflect these changes, and the user interface was improved to clarify the distinction between recording and streaming operations.

This commit is contained in:
Miguel 2025-07-20 23:30:12 +02:00
parent a37cb8be3b
commit 5138a2e7cd
13 changed files with 1040 additions and 576 deletions


@ -12,22 +12,176 @@ It must be as simple as possible to allow the pack using PyInstaller
Variables are defined in DataSets or groups with different polling times. DataSets enable data exchange between machines as they are JSON files that allow setting various polling times. CSV files are also created with a suffix from the DataSet, making it easier to use the data externally.
* DataSets can be active or inactive, which determines if they are saved in the corresponding CSV.
* Variables can be active or inactive for streaming.
* Variables can be active or inactive for UDP streaming to PlotJuggler.
Real-Time Transmission (Streaming): Sends data in real time using the UDP protocol, allowing applications like PlotJuggler to receive and visualize data live.
For streaming, there is an update interval for the data, but this interval only updates available data. It does not mean the data is read at that speed. Streaming uses only a reading cache of active variables. If variables are not active, they are not included in streaming.
**Automatic CSV Recording**: When the PLC is connected, all datasets with variables are automatically activated and begin recording data to CSV files. This recording is continuous and independent of other operations.
Live Monitoring: The web interface displays current variable values and system status, updating in real time.
**Real-Time UDP Transmission (PlotJuggler Streaming)**: Sends data in real time using the UDP protocol, allowing applications like PlotJuggler to receive and visualize data live. This is a manual control separate from automatic CSV recording.
For UDP streaming, there is an update interval for the data, but this interval only updates available data. It does not mean the data is read at that speed. UDP streaming uses only a reading cache of active variables. If variables are not marked for streaming, they are not included in UDP transmission.
**Live Web Monitoring**: The web interface displays current variable values and system status, updating in real time from the streaming cache.
Frontend monitoring also uses only the cache of active variables.
In summary, variables are read from the PLC only if they are active and at the speed set in their DataSet. Each read variable is stored in a memory cache. JSON streaming and frontend monitoring have their own intervals and read values only from this cache, which helps protect the PLC from overload.
In summary, variables are read from the PLC only if they are active and at the speed set in their DataSet. Each read variable is stored in a memory cache. CSV recording is automatic when PLC is connected, while UDP streaming and frontend monitoring have their own intervals and read values only from this cache, which helps protect the PLC from overload.
The application is designed to be persistent across restarts, restoring the previous state as much as possible. This includes reconnecting to the PLC, reactivating DataSets, and resuming streaming if they were active before.
The application is designed to be persistent across restarts, restoring the previous state as much as possible. This includes reconnecting to the PLC, reactivating DataSets, and resuming operations if they were active before.
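The UDP transmission described above can be sketched as a minimal sender. PlotJuggler's "UDP Server" data source, configured with its JSON message parser, accepts flat JSON datagrams like the one below. The function name and variable names are illustrative, not the project's actual API; only the default host/port (`127.0.0.1:9870`) come from this document:

```python
import json
import socket
import time


def send_to_plotjuggler(sock: socket.socket, values: dict,
                        host: str = "127.0.0.1", port: int = 9870) -> None:
    """Send one cached sample to PlotJuggler as a flat JSON datagram.

    Hypothetical sketch: the real application reads `values` from its
    in-memory cache of active variables, not from the caller.
    """
    message = {"timestamp": time.time(), **values}
    sock.sendto(json.dumps(message).encode("utf-8"), (host, port))


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_to_plotjuggler(sock, {"dar_speed": 42.0, "dar_temp": 18.5})
sock.close()
```

Because UDP is connectionless, the sender works whether or not PlotJuggler is listening, which is why streaming can be toggled without touching the PLC read loop.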
## Modifications
### Latest Modifications (Current Session)
#### Critical Fix: CSV Recording vs UDP Streaming Separation and Thread Join Error Resolution
**Issue**: The system raised a critical `RuntimeError: cannot join current thread` when stopping streaming, and there was confusion between CSV recording (which should be automatic) and UDP streaming (which should be manual). Stopping UDP streaming also stopped CSV recording, violating the system design.
**Root Cause Analysis**:
1. **Thread Join Error**: In `dataset_streaming_loop` (line 552), the executing thread called `self.stop_dataset_streaming()`, which then attempted `thread.join()` on itself
2. **Incorrect Mixed Architecture**: The `start_streaming()` and `stop_streaming()` methods activated/deactivated entire datasets, affecting both CSV and UDP
3. **Conflated Concepts**: There was no real separation between automatic recording and manual streaming to PlotJuggler
**Solution**: Implemented a complete separation between CSV recording (automatic) and UDP streaming (manual), removing the architectural confusion and resolving the thread join error.
**Implementation**:
**New Control Architecture**:
- **CSV Recording**: Automatic while the PLC is connected, independent of UDP streaming
- **UDP Streaming**: Manual, only for PlotJuggler; does not affect CSV recording
- **Dataset Threads**: Handle both, but with independent flags (`csv_recording_enabled`, `udp_streaming_enabled`)
**Technical Changes in DataStreamer** (`core/streamer.py`):
- **New Control Flags**: Independent `udp_streaming_enabled` and `csv_recording_enabled`
- **Separate Methods**:
- `start_csv_recording()` / `stop_csv_recording()` - Automatic recording control
- `start_udp_streaming()` / `stop_udp_streaming()` - Manual UDP streaming control
- **Thread Join Fix**: Check `thread != threading.current_thread()` before joining
- **Improved Dataset Loop**: The loop no longer calls `stop_dataset_streaming()` internally
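A minimal sketch of the flag separation and the join guard described in these bullets. The class and method bodies are hypothetical simplifications, not the project's actual `DataStreamer`; only the flag and method names come from this document:

```python
import threading


class DataStreamerSketch:
    """Illustrative stand-in for DataStreamer: two independent flags,
    plus the current-thread guard that fixes the join error."""

    def __init__(self):
        self.csv_recording_enabled = False
        self.udp_streaming_enabled = False
        self.dataset_threads = {}  # dataset_id -> threading.Thread

    def start_csv_recording(self) -> bool:
        self.csv_recording_enabled = True
        return True

    def stop_csv_recording(self) -> None:
        # Only clears the flag: dataset threads keep running so that
        # UDP streaming can continue independently.
        self.csv_recording_enabled = False

    def start_udp_streaming(self) -> bool:
        self.udp_streaming_enabled = True
        return True

    def stop_udp_streaming(self) -> None:
        self.udp_streaming_enabled = False

    def stop_dataset_thread(self, dataset_id: str) -> None:
        thread = self.dataset_threads.pop(dataset_id, None)
        # Guard against "cannot join current thread": never join the
        # thread that is executing this call.
        if thread is not None and thread is not threading.current_thread():
            thread.join(timeout=5.0)
```

Toggling either flag leaves the other untouched, which is the whole point of the separation: `stop_udp_streaming()` cannot interrupt CSV recording.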
**Updated Dataset Loop Logic**:
```python
# 📝 CSV Recording: Always write if enabled (automatic)
if self.csv_recording_enabled:
    self.write_dataset_csv_data(dataset_id, all_data)

# 📡 UDP Streaming: Only if UDP streaming is enabled (manual)
if self.udp_streaming_enabled:
    # Send filtered data to PlotJuggler
    if streaming_data:
        self.send_to_plotjuggler(streaming_data)
```
**PLC Connection Changes** (`core/plc_data_streamer.py`):
- **Auto-start CSV Recording**: `connect_plc()` now starts CSV recording automatically
- **Complete Disconnection**: `disconnect_plc()` stops both CSV recording and UDP streaming
- **Improved Logging**: Clear messages about what is being started and stopped
**Updated Auto-Recovery** (`core/instance_manager.py`):
- **Separate Recovery**: Restores CSV recording automatically, and UDP streaming only if it was active
- **Correct Order**: CSV recording first, then UDP streaming if needed
**New API Endpoints** (`main.py`):
- **CSV Recording**: `/api/csv/recording/start` and `/api/csv/recording/stop`
- **UDP Streaming**: `/api/udp/streaming/start` and `/api/udp/streaming/stop`
- **Legacy Compatibility**: Old endpoints keep working, but only control UDP
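A hedged sketch of what the new UDP endpoints might look like in a Flask `main.py`. Only the route paths come from the list above; the in-memory streamer stub, the POST method, and the JSON response shape are assumptions for illustration:

```python
from flask import Flask, jsonify

app = Flask(__name__)


class _StreamerStub:
    """Stand-in for the real DataStreamer; only tracks the UDP flag."""

    def __init__(self):
        self.udp_streaming_enabled = False

    def start_udp_streaming(self) -> bool:
        self.udp_streaming_enabled = True
        return True

    def stop_udp_streaming(self) -> None:
        self.udp_streaming_enabled = False


streamer = _StreamerStub()


@app.route("/api/udp/streaming/start", methods=["POST"])
def udp_streaming_start():
    ok = streamer.start_udp_streaming()
    return jsonify({"success": ok,
                    "udp_streaming": streamer.udp_streaming_enabled})


@app.route("/api/udp/streaming/stop", methods=["POST"])
def udp_streaming_stop():
    streamer.stop_udp_streaming()
    return jsonify({"success": True,
                    "udp_streaming": streamer.udp_streaming_enabled})
```

Because the handlers only flip the streaming flag, hitting these routes has no effect on CSV recording, matching the separation described above.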
**Updated Frontend**:
- **Streaming.js**: Uses the new independent UDP endpoints
- **Status.js**: Control buttons use the correct endpoints for UDP streaming
- **Visual Separation**: Clear distinction between CSV recording and UDP streaming in the interface
**Benefits of the New System**:
- **Error Resolution**: Completely eliminated the `RuntimeError: cannot join current thread`
- **Correct Operation**: CSV recording continues automatically, independent of UDP streaming
- **Granular Control**: Users can control UDP streaming without affecting data recording
- **Robustness**: A more stable and predictable system for industrial operation
- **Conceptual Clarity**: Clear separation between automatic recording and manual streaming
**Corrected Operation Flow**:
1. **Connect PLC** → Automatically starts CSV recording for all datasets with variables
2. **CSV Recording** → Always active while the PLC is connected, independent of other controls
3. **UDP Streaming** → Independent manual control for sending data to PlotJuggler
4. **Disconnect PLC** → Stops both CSV recording and UDP streaming
**Technical Fix Details**:
- **Thread Management**: Removed the self-cleanup call from `dataset_streaming_loop`
- **State Flags**: Independent flags for each type of operation
- **Error Prevention**: Current-thread check before join operations
- **Resource Management**: Resources are closed correctly without affecting active threads
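The loop-termination fix in these details can be sketched as follows. The function signature and helper callables are illustrative assumptions; the point shown is that the loop exits by simply returning when its stop event is set, instead of calling a stop method that would join the current thread:

```python
import threading


def dataset_streaming_loop(stop_event: threading.Event, interval: float,
                           read_fn, csv_fn, udp_fn, flags: dict) -> None:
    """Hypothetical loop body: one PLC read per interval, fanned out to
    CSV and UDP according to the independent flags. On shutdown it just
    returns -- it never calls back into stop_dataset_streaming(), so the
    thread never tries to join itself."""
    while not stop_event.is_set():
        data = read_fn()  # single PLC read per dataset interval
        if flags.get("csv_recording_enabled"):
            csv_fn(data)
        if flags.get("udp_streaming_enabled"):
            udp_fn(data)
        stop_event.wait(interval)  # wakes early if the event is set
```

Using `Event.wait()` instead of `time.sleep()` also lets the thread react to a stop request without waiting out a full polling interval.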
This change resolves the reported critical issue and establishes a solid architecture that correctly separates automatic functions from manual ones, in line with the system's original design.
#### Automatic Recording on PLC Connection and Interface Improvements
**Issue**: The application required manual activation of datasets for recording after connecting to the PLC, and the interface had several usability issues including non-functional status buttons and redundant diagnostic functions.
**Solution**: Implemented automatic dataset activation upon PLC connection and streamlined the interface by removing unnecessary functions and clarifying the distinction between automatic CSV recording and manual UDP streaming.
**Implementation**:
**Automatic Recording System**:
- **PLC Connection Trigger**: When connecting to PLC, all datasets with variables are automatically activated for recording
- **Immediate Data Collection**: Recording begins instantly without manual intervention
- **Smart Activation**: Only datasets with defined variables are activated, preventing empty dataset processing
- **Error Handling**: Graceful handling of activation failures with detailed logging
- **State Persistence**: Auto-activated datasets are saved in system state for recovery
**Backend Changes** (`core/plc_data_streamer.py`):
- **Enhanced `connect_plc()` Method**: Now automatically activates datasets with variables
- **Activation Logging**: Detailed logging of which datasets were auto-activated
- **Error Recovery**: Individual dataset activation failures don't prevent others from starting
- **Event Logging**: Enhanced connection events include auto-activation statistics
**Interface Streamlining**:
- **Removed Redundant Functions**: Eliminated `diagnose-btn` and `diagnoseConnection()` function
- **Removed Manual Refresh**: Eliminated `refresh-values-btn` and `refreshVariableValues()` function
- **Automatic Monitoring**: Variable values are now automatically monitored when PLC is connected
- **Live Display System**: Replaced manual refresh with automatic live display from cache
**Status Button Fix** (`static/js/status.js`):
- **Event Listener Issue**: Fixed `status-connect-btn` not working during streaming updates
- **Dynamic Button Handling**: Added event listeners in `updateStatusFromStream()` function
- **Consistent Behavior**: All status buttons now work regardless of update method
- **Robust Implementation**: Proper event listener management for all dynamic buttons
**Conceptual Separation**:
- **UDP Streaming**: Now clearly labeled as "PlotJuggler UDP Streaming" - manual control for data visualization
- **CSV Recording**: Automatic and continuous when PLC is connected - no manual intervention required
- **Live Display**: Optional real-time display of variable values in web interface
- **Independent Operation**: CSV recording works independently of UDP streaming
**Interface Updates** (`templates/index.html`):
- **Section Renaming**: "Multi-Dataset Streaming Control" → "PlotJuggler UDP Streaming Control"
- **Status Bar Updates**: "Streaming" → "UDP Streaming" for clarity
- **Button Text Changes**: "Start All Active Datasets" → "Start UDP Streaming"
- **Information Panels**: Updated descriptions to clarify automatic vs manual operations
- **Variable Management**: Removed manual refresh and diagnose buttons, simplified workflow
**JavaScript Enhancements** (`static/js/variables.js`):
- **Auto-Start Live Display**: `autoStartLiveDisplay()` function replaces manual refresh
- **Streaming Indicator Updates**: Modified `updateStreamingIndicator()` for new button structure
- **Function Cleanup**: Removed `refreshVariableValues()`, `diagnoseConnection()`, and related functions
- **Automatic Integration**: Live display starts automatically when dataset changes and PLC is connected
**New Workflow**:
1. **Connect PLC** → Automatically activates all datasets with variables and begins recording
2. **CSV Recording** → Always active when PLC connected (independent of UDP streaming)
3. **UDP Streaming** → Manual control only for PlotJuggler data visualization
4. **Live Display** → Optional real-time display of cached values in web interface
**Benefits**:
- **Simplified Operation**: No need to remember to activate recording manually
- **Immediate Data Collection**: Recording starts as soon as PLC connection is established
- **Clear Separation**: Distinct understanding of automatic recording vs manual streaming
- **Reduced Complexity**: Eliminated redundant diagnostic and refresh functions
- **Better UX**: More intuitive workflow with fewer manual steps
- **Robust Interface**: Fixed status buttons work consistently in all scenarios
**Technical Improvements**:
- **Event Listener Management**: Proper handling of dynamically created buttons
- **Automatic State Management**: System automatically manages recording state
- **Error Resilience**: Individual failures don't prevent overall system operation
- **Performance Optimization**: Removed unnecessary manual refresh operations
- **Code Cleanup**: Eliminated redundant functions and simplified codebase
This represents a significant improvement in user experience and system automation, making the application more suitable for production industrial environments where reliability and simplicity are paramount.
#### Instance Lock Verification and Cleanup System
**Issue**: When the application was terminated unexpectedly (crash, forced shutdown, etc.), the lock file `plc_streamer.lock` would remain in the filesystem with a stale PID, preventing new instances from starting even though no actual process was running. Additionally, the lock verification logic needed better user feedback.


@ -2751,8 +2751,341 @@
"udp_host": "127.0.0.1",
"udp_port": 9870
}
},
{
"timestamp": "2025-07-20T23:12:13.679345",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-20T23:12:31.445814",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:12:31.449815",
"level": "info",
"event_type": "plc_connection",
"message": "Successfully connected to PLC 10.1.33.249 and auto-activated 1 datasets for recording",
"details": {
"ip": "10.1.33.249",
"rack": 0,
"slot": 2,
"auto_activated_datasets": 1,
"dataset_names": [
"DAR"
]
}
},
{
"timestamp": "2025-07-20T23:12:34.845486",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:12:34.849625",
"level": "info",
"event_type": "streaming_started",
"message": "Multi-dataset streaming started: 1 datasets activated",
"details": {
"activated_datasets": 1,
"total_datasets": 2,
"udp_host": "127.0.0.1",
"udp_port": 9870
}
},
{
"timestamp": "2025-07-20T23:13:51.856604",
"level": "info",
"event_type": "dataset_deactivated",
"message": "Dataset deactivated: DAR",
"details": {
"dataset_id": "dar"
}
},
{
"timestamp": "2025-07-20T23:13:51.859613",
"level": "info",
"event_type": "streaming_stopped",
"message": "Multi-dataset streaming stopped: 1 datasets deactivated",
"details": {}
},
{
"timestamp": "2025-07-20T23:17:52.196719",
"level": "info",
"event_type": "streaming_stopped",
"message": "Multi-dataset streaming stopped: 0 datasets deactivated",
"details": {}
},
{
"timestamp": "2025-07-20T23:17:52.200719",
"level": "info",
"event_type": "streaming_stopped",
"message": "Multi-dataset streaming stopped: 0 datasets deactivated",
"details": {}
},
{
"timestamp": "2025-07-20T23:17:52.203718",
"level": "info",
"event_type": "plc_disconnection",
"message": "Disconnected from PLC 10.1.33.249",
"details": {}
},
{
"timestamp": "2025-07-20T23:17:54.026988",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:17:54.031992",
"level": "info",
"event_type": "plc_connection",
"message": "Successfully connected to PLC 10.1.33.249 and auto-activated 1 datasets for recording",
"details": {
"ip": "10.1.33.249",
"rack": 0,
"slot": 2,
"auto_activated_datasets": 1,
"dataset_names": [
"DAR"
]
}
},
{
"timestamp": "2025-07-20T23:18:12.560727",
"level": "info",
"event_type": "dataset_deactivated",
"message": "Dataset deactivated: DAR",
"details": {
"dataset_id": "dar"
}
},
{
"timestamp": "2025-07-20T23:18:14.249955",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:18:16.469647",
"level": "info",
"event_type": "dataset_deactivated",
"message": "Dataset deactivated: DAR",
"details": {
"dataset_id": "dar"
}
},
{
"timestamp": "2025-07-20T23:18:18.238188",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:22:53.579226",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-20T23:23:13.161965",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-20T23:24:05.283706",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-20T23:24:05.319632",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:24:05.323152",
"level": "info",
"event_type": "csv_recording_started",
"message": "CSV recording started: 1 datasets activated",
"details": {
"activated_datasets": 1,
"total_datasets": 2
}
},
{
"timestamp": "2025-07-20T23:24:30.952393",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-20T23:24:31.002955",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:24:31.009499",
"level": "info",
"event_type": "csv_recording_started",
"message": "CSV recording started: 1 datasets activated",
"details": {
"activated_datasets": 1,
"total_datasets": 2
}
},
{
"timestamp": "2025-07-20T23:24:51.202452",
"level": "info",
"event_type": "udp_streaming_started",
"message": "UDP streaming to PlotJuggler started",
"details": {
"udp_host": "127.0.0.1",
"udp_port": 9870,
"datasets_available": 1
}
},
{
"timestamp": "2025-07-20T23:24:59.356614",
"level": "info",
"event_type": "udp_streaming_stopped",
"message": "UDP streaming to PlotJuggler stopped (CSV recording continues)",
"details": {}
},
{
"timestamp": "2025-07-20T23:25:04.088165",
"level": "info",
"event_type": "csv_recording_stopped",
"message": "CSV recording stopped (dataset threads continue for UDP streaming)",
"details": {}
},
{
"timestamp": "2025-07-20T23:25:04.094297",
"level": "info",
"event_type": "udp_streaming_stopped",
"message": "UDP streaming to PlotJuggler stopped (CSV recording continues)",
"details": {}
},
{
"timestamp": "2025-07-20T23:25:04.248255",
"level": "info",
"event_type": "dataset_deactivated",
"message": "Dataset deactivated: DAR",
"details": {
"dataset_id": "dar"
}
},
{
"timestamp": "2025-07-20T23:25:04.253860",
"level": "info",
"event_type": "plc_disconnection",
"message": "Disconnected from PLC 10.1.33.249 (stopped recording and streaming)",
"details": {}
},
{
"timestamp": "2025-07-20T23:25:07.496515",
"level": "info",
"event_type": "dataset_activated",
"message": "Dataset activated: DAR",
"details": {
"dataset_id": "dar",
"variables_count": 6,
"streaming_count": 4,
"prefix": "dar"
}
},
{
"timestamp": "2025-07-20T23:25:07.500914",
"level": "info",
"event_type": "csv_recording_started",
"message": "CSV recording started: 1 datasets activated",
"details": {
"activated_datasets": 1,
"total_datasets": 2
}
},
{
"timestamp": "2025-07-20T23:25:07.505843",
"level": "info",
"event_type": "plc_connection",
"message": "Successfully connected to PLC 10.1.33.249 and auto-started CSV recording for 1 datasets",
"details": {
"ip": "10.1.33.249",
"rack": 0,
"slot": 2,
"auto_started_recording": true,
"recording_datasets": 1,
"dataset_names": [
"DAR"
]
}
},
{
"timestamp": "2025-07-20T23:25:10.215131",
"level": "info",
"event_type": "udp_streaming_started",
"message": "UDP streaming to PlotJuggler started",
"details": {
"udp_host": "127.0.0.1",
"udp_port": 9870,
"datasets_available": 1
}
}
],
"last_updated": "2025-07-20T23:03:40.714677",
"total_entries": 260
"last_updated": "2025-07-20T23:25:10.215131",
"total_entries": 294
}


@ -20,7 +20,7 @@ class InstanceManager:
"""Safely remove lock file with retry logic for Windows compatibility"""
if not os.path.exists(self.lock_file):
return True
for attempt in range(max_retries):
try:
os.remove(self.lock_file)
@ -28,38 +28,44 @@ class InstanceManager:
except PermissionError as e:
if platform.system() == "Windows" and attempt < max_retries - 1:
if self.logger:
self.logger.debug(f"Lock file removal attempt {attempt + 1} failed (Windows), retrying in {delay}s...")
self.logger.debug(
f"Lock file removal attempt {attempt + 1} failed (Windows), retrying in {delay}s..."
)
time.sleep(delay)
delay *= 1.5 # Exponential backoff
else:
if self.logger:
self.logger.warning(f"Failed to remove lock file after {max_retries} attempts: {e}")
self.logger.warning(
f"Failed to remove lock file after {max_retries} attempts: {e}"
)
return False
except Exception as e:
if self.logger:
self.logger.warning(f"Unexpected error removing lock file: {e}")
return False
return False
def acquire_instance_lock(self) -> bool:
"""Acquire lock to ensure single instance execution with improved stale lock detection"""
try:
print("🔍 Checking for existing instances...")
# Check if lock file exists
if os.path.exists(self.lock_file):
lock_should_be_removed = False
removal_reason = ""
old_pid = None
# Try to read PID from existing lock file
try:
with open(self.lock_file, "r") as f:
old_pid = int(f.read().strip())
if self.logger:
self.logger.info(f"Found existing lock file with PID: {old_pid}")
self.logger.info(
f"Found existing lock file with PID: {old_pid}"
)
# Check if process is still running
if psutil.pid_exists(old_pid):
@ -67,61 +73,86 @@ class InstanceManager:
try:
proc = psutil.Process(old_pid)
cmdline = " ".join(proc.cmdline())
# More specific check - only block if it's really our application
if (
("main.py" in cmdline and "S7_snap7_Stremer_n_Log" in cmdline)
(
"main.py" in cmdline
and "S7_snap7_Stremer_n_Log" in cmdline
)
or ("plc_streamer" in cmdline.lower())
or ("PLCDataStreamer" in cmdline)
):
print(f"❌ Another instance of PLC Streamer is already running (PID: {old_pid})")
print(
f"❌ Another instance of PLC Streamer is already running (PID: {old_pid})"
)
print(f" Command: {cmdline}")
print("💡 Stop the other instance first or wait for it to finish")
print(
"💡 Stop the other instance first or wait for it to finish"
)
if self.logger:
self.logger.error(f"Another instance is already running (PID: {old_pid})")
self.logger.error(
f"Another instance is already running (PID: {old_pid})"
)
self.logger.error(f"Command line: {cmdline}")
return False
else:
# Different Python process, remove stale lock
lock_should_be_removed = True
removal_reason = f"Found lock file from different application (PID {old_pid})"
if self.logger:
self.logger.info(f"Found different Python process (PID: {old_pid}), removing stale lock")
self.logger.info(f"Different process command: {cmdline}")
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
self.logger.info(
f"Found different Python process (PID: {old_pid}), removing stale lock"
)
self.logger.info(
f"Different process command: {cmdline}"
)
except (
psutil.NoSuchProcess,
psutil.AccessDenied,
psutil.ZombieProcess,
):
# Process disappeared or can't access it, remove stale lock
lock_should_be_removed = True
removal_reason = f"Process {old_pid} is not accessible"
if self.logger:
self.logger.info(f"Process {old_pid} is not accessible, removing stale lock")
self.logger.info(
f"Process {old_pid} is not accessible, removing stale lock"
)
else:
# Old process is dead, remove stale lock file
lock_should_be_removed = True
removal_reason = f"Found stale lock file (PID {old_pid} doesn't exist)"
removal_reason = (
f"Found stale lock file (PID {old_pid} doesn't exist)"
)
if self.logger:
self.logger.info(f"Removed stale lock file - PID {old_pid} doesn't exist")
self.logger.info(
f"Removed stale lock file - PID {old_pid} doesn't exist"
)
except (ValueError, IOError, UnicodeDecodeError):
# Invalid lock file, remove it
lock_should_be_removed = True
removal_reason = "Invalid or corrupted lock file"
if self.logger:
self.logger.info("Removing invalid lock file")
# Perform safe removal if needed
if lock_should_be_removed:
print(f"🧹 {removal_reason}, removing it")
if not self._safe_remove_lock_file():
print(f"⚠️ Unable to remove lock file. Trying to continue...")
if self.logger:
self.logger.warning("Failed to remove lock file, but continuing with initialization")
self.logger.warning(
"Failed to remove lock file, but continuing with initialization"
)
# Create new lock file with current PID (with retry for Windows)
lock_created = False
@ -134,13 +165,17 @@ class InstanceManager:
except PermissionError as e:
if platform.system() == "Windows" and attempt < 2:
if self.logger:
self.logger.debug(f"Lock file creation attempt {attempt + 1} failed, retrying...")
self.logger.debug(
f"Lock file creation attempt {attempt + 1} failed, retrying..."
)
time.sleep(0.5)
else:
raise e
if not lock_created:
raise PermissionError("Unable to create lock file after multiple attempts")
raise PermissionError(
"Unable to create lock file after multiple attempts"
)
# Register cleanup function only once
if not self._cleanup_registered:
@ -148,14 +183,16 @@ class InstanceManager:
self._cleanup_registered = True
print(f"✅ Instance lock acquired successfully (PID: {os.getpid()})")
if self.logger:
self.logger.info(f"Instance lock acquired: {self.lock_file} (PID: {os.getpid()})")
self.logger.info(
f"Instance lock acquired: {self.lock_file} (PID: {os.getpid()})"
)
return True
except Exception as e:
print(f"⚠️ Error acquiring instance lock: {e}")
if self.logger:
self.logger.error(f"Error acquiring instance lock: {e}")
return False
@ -202,31 +239,33 @@ class InstanceManager:
return None
def attempt_auto_recovery(self, config_manager, plc_client, data_streamer) -> bool:
"""Attempt to restore previous system state"""
"""Attempt to restore previous state based on system state file"""
if not config_manager.auto_recovery_enabled:
if self.logger:
self.logger.info("Auto-recovery disabled, skipping state restoration")
return False
if self.logger:
self.logger.info("Attempting auto-recovery of previous state...")
recovery_success = False
try:
# Try to restore connection
if config_manager.last_state.get("should_connect", False):
if self.logger:
self.logger.info("Attempting to restore PLC connection...")
if plc_client.connect():
# Try to reconnect to PLC
if plc_client.connect(
config_manager.plc_config["ip"],
config_manager.plc_config["rack"],
config_manager.plc_config["slot"],
):
if self.logger:
self.logger.info("PLC connection restored successfully")
# Try to restore streaming if connection was successful
# 🔑 NEW: Restore CSV recording (automatic)
recording_restored = data_streamer.start_csv_recording()
if recording_restored and self.logger:
self.logger.info("CSV recording restored successfully")
# 🔑 NEW: Restore UDP streaming if it was active (manual)
if config_manager.last_state.get("should_stream", False):
if self.logger:
self.logger.info("Attempting to restore streaming...")
self.logger.info("Attempting to restore UDP streaming...")
# Setup UDP socket first
if not data_streamer.setup_udp_socket():
@ -236,48 +275,30 @@ class InstanceManager:
)
return False
# Restore active datasets
restored_datasets = config_manager.last_state.get(
"active_datasets", []
)
activated_count = 0
# Start UDP streaming
udp_restored = data_streamer.start_udp_streaming()
if udp_restored and self.logger:
self.logger.info("UDP streaming restored successfully")
for dataset_id in restored_datasets:
if dataset_id in config_manager.datasets:
try:
data_streamer.activate_dataset(dataset_id)
activated_count += 1
except Exception as e:
if self.logger:
self.logger.warning(
f"Failed to restore dataset {dataset_id}: {e}"
)
# Update system state
config_manager.save_system_state(
connected=True,
streaming=data_streamer.is_streaming(),
active_datasets=data_streamer.get_active_datasets(),
)
if activated_count > 0:
recovery_success = True
if self.logger:
self.logger.info(
f"Streaming restored successfully: {activated_count} datasets activated"
)
else:
if self.logger:
self.logger.warning(
"Failed to restore streaming: no datasets activated"
)
else:
recovery_success = True # Connection restored successfully
return True
else:
if self.logger:
self.logger.warning("Failed to restore PLC connection")
else:
recovery_success = True # No connection was expected
return False
except Exception as e:
if self.logger:
self.logger.error(f"Error during auto-recovery: {e}")
recovery_success = False
self.logger.error(f"Auto-recovery failed: {e}")
return False
return recovery_success
return False
def wait_for_safe_startup(self, delay_seconds: float = 1.0):
"""Wait for a safe startup delay to ensure previous instance cleanup"""


@ -94,7 +94,7 @@ class PLCDataStreamer:
# PLC Connection Methods
def connect_plc(self) -> bool:
"""Connect to PLC"""
"""Connect to PLC and automatically start CSV recording for datasets with variables"""
success = self.plc_client.connect(
self.config_manager.plc_config["ip"],
self.config_manager.plc_config["rack"],
@ -102,17 +102,51 @@ class PLCDataStreamer:
)
if success:
self.config_manager.save_system_state(
connected=True,
streaming=self.data_streamer.is_streaming(),
active_datasets=self.data_streamer.get_active_datasets(),
)
self.event_logger.log_event(
"info",
"plc_connection",
f"Successfully connected to PLC {self.config_manager.plc_config['ip']}",
self.config_manager.plc_config,
)
# 🔑 NEW: Automatically start CSV recording (not UDP streaming)
recording_started = self.data_streamer.start_csv_recording()
if recording_started:
activated_datasets = list(self.data_streamer.get_active_datasets())
dataset_names = [
self.config_manager.datasets[ds_id]["name"]
for ds_id in activated_datasets
]
if self.logger:
self.logger.info(
f"Auto-started CSV recording for {len(activated_datasets)} datasets"
)
for dataset_name in dataset_names:
self.logger.info(f"Recording: {dataset_name}")
# Log connection with auto-recording info
activation_msg = f"Successfully connected to PLC {self.config_manager.plc_config['ip']}"
if activated_datasets:
activation_msg += f" and auto-started CSV recording for {len(activated_datasets)} datasets"
self.event_logger.log_event(
"info",
"plc_connection",
activation_msg,
{
**self.config_manager.plc_config,
"auto_started_recording": True,
"recording_datasets": len(activated_datasets),
"dataset_names": dataset_names,
},
)
else:
# Connection successful but no recording started
self.event_logger.log_event(
"info",
"plc_connection",
f"Successfully connected to PLC {self.config_manager.plc_config['ip']} (no datasets for recording)",
{
**self.config_manager.plc_config,
"auto_started_recording": False,
"recording_datasets": 0,
},
)
else:
self.event_logger.log_event(
"error",
@ -124,8 +158,21 @@ class PLCDataStreamer:
return success
def disconnect_plc(self):
"""Disconnect from PLC"""
self.data_streamer.stop_streaming()
"""Disconnect from PLC and stop all recording/streaming"""
# Stop both CSV recording and UDP streaming
self.data_streamer.stop_csv_recording()
self.data_streamer.stop_udp_streaming()
# Deactivate all datasets
active_datasets_copy = self.config_manager.active_datasets.copy()
for dataset_id in active_datasets_copy:
try:
self.data_streamer.deactivate_dataset(dataset_id)
except Exception as e:
if self.logger:
self.logger.warning(f"Error deactivating dataset {dataset_id}: {e}")
# Disconnect from PLC
self.plc_client.disconnect()
self.config_manager.save_system_state(
@ -135,7 +182,7 @@ class PLCDataStreamer:
self.event_logger.log_event(
"info",
"plc_disconnection",
f"Disconnected from PLC {self.config_manager.plc_config['ip']}",
f"Disconnected from PLC {self.config_manager.plc_config['ip']} (stopped recording and streaming)",
)
# Configuration Methods


@ -24,19 +24,14 @@ def resource_path(relative_path):
class DataStreamer:
"""Handles data streaming, CSV recording, and dataset management
🔑 CORE PRINCIPLE: Single PLC Read per Dataset Interval
========================================================
This class implements the application's core principle of reading PLC variables
only once per dataset at their configured sampling intervals, then using cached
values for all other operations (CSV recording, UDP streaming, web interface).
🔑 CORE PRINCIPLE: CSV Recording vs UDP Streaming Independence
===========================================================
This class implements strict separation between:
1. CSV Recording: Automatic, always active when PLC connected
2. UDP Streaming: Manual control for PlotJuggler visualization
Data Flow:
1. dataset_streaming_loop() reads ALL variables in a dataset at configured interval
2. read_dataset_variables() performs the actual PLC read and updates cache
3. All other functions (APIs, streaming, frontend) use get_cached_dataset_values()
4. NO direct PLC reads outside of the streaming loops
This protects the PLC from overload and ensures data consistency across all outputs.
Each dataset thread handles both CSV writing and UDP streaming,
but UDP transmission is controlled by independent flag.
"""
def __init__(self, config_manager, plc_client, event_logger, logger=None):
@ -48,9 +43,10 @@ class DataStreamer:
# UDP streaming setup
self.udp_socket = None
self.udp_streaming_enabled = False # 🔑 Independent UDP control
# Streaming state
self.streaming = False
# CSV recording state (automatic when PLC connected)
self.csv_recording_enabled = False
# Dataset streaming threads and files
self.dataset_threads = {} # dataset_id -> thread object
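The separation the new docstring describes reduces to two independent booleans that every dataset loop consults. A minimal sketch of the intended state transitions (the class name is hypothetical; the flag names mirror the diff):

```python
class StreamerFlags:
    """Sketch of the two independent controls described above (hypothetical class)."""

    def __init__(self):
        self.csv_recording_enabled = False   # automatic: tied to PLC connection
        self.udp_streaming_enabled = False   # manual: PlotJuggler on demand

    def on_plc_connect(self):
        # connect_plc() auto-starts CSV recording, never UDP streaming
        self.csv_recording_enabled = True

    def on_plc_disconnect(self):
        # disconnect_plc() stops both recording and streaming
        self.csv_recording_enabled = False
        self.udp_streaming_enabled = False


flags = StreamerFlags()
flags.on_plc_connect()
print(flags.csv_recording_enabled, flags.udp_streaming_enabled)
# -> True False
```

Starting UDP streaming later only flips `udp_streaming_enabled`; disconnecting clears both, matching the `connect_plc`/`disconnect_plc` changes in this commit.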
@ -451,13 +447,13 @@ class DataStreamer:
)
def dataset_streaming_loop(self, dataset_id: str):
"""Streaming loop for a specific dataset"""
"""Streaming loop for a specific dataset - handles both CSV and UDP"""
dataset_info = self.config_manager.datasets[dataset_id]
interval = self.config_manager.get_dataset_sampling_interval(dataset_id)
if self.logger:
self.logger.info(
f"Dataset '{dataset_info['name']}' streaming loop started (interval: {interval}s)"
f"Dataset '{dataset_info['name']}' loop started (interval: {interval}s)"
)
consecutive_errors = 0
@ -479,36 +475,61 @@ class DataStreamer:
if all_data:
consecutive_errors = 0
# Write to CSV (all variables)
self.write_dataset_csv_data(dataset_id, all_data)
# 📝 CSV Recording: Always write if enabled (automatic)
if self.csv_recording_enabled:
self.write_dataset_csv_data(dataset_id, all_data)
# Get filtered data for streaming - only variables that are in streaming_variables list AND have streaming=true
streaming_variables = dataset_info.get("streaming_variables", [])
dataset_vars_config = dataset_info.get("variables", {})
streaming_data = {
name: value
for name, value in all_data.items()
if name in streaming_variables
and dataset_vars_config.get(name, {}).get("streaming", False)
}
# 📡 UDP Streaming: Only if UDP streaming is enabled (manual)
if self.udp_streaming_enabled:
# Get filtered data for streaming - only variables that are in streaming_variables list AND have streaming=true
streaming_variables = dataset_info.get(
"streaming_variables", []
)
dataset_vars_config = dataset_info.get("variables", {})
streaming_data = {
name: value
for name, value in all_data.items()
if name in streaming_variables
and dataset_vars_config.get(name, {}).get(
"streaming", False
)
}
# Send filtered data to PlotJuggler
if streaming_data:
self.send_to_plotjuggler(streaming_data)
# Send filtered data to PlotJuggler
if streaming_data:
self.send_to_plotjuggler(streaming_data)
# Log data
timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
if self.logger:
csv_count = len(all_data) if self.csv_recording_enabled else 0
udp_count = 0
if self.udp_streaming_enabled:
streaming_variables = dataset_info.get(
"streaming_variables", []
)
dataset_vars_config = dataset_info.get("variables", {})
udp_count = len(
[
name
for name in all_data.keys()
if name in streaming_variables
and dataset_vars_config.get(name, {}).get(
"streaming", False
)
]
)
self.logger.info(
f"[{timestamp}] Dataset '{dataset_info['name']}': CSV: {len(all_data)} vars, Streaming: {len(streaming_data)} vars"
f"[{timestamp}] Dataset '{dataset_info['name']}': CSV: {csv_count} vars, UDP: {udp_count} vars"
)
else:
consecutive_errors += 1
if consecutive_errors >= max_consecutive_errors:
self.event_logger.log_event(
"error",
"dataset_streaming_error",
f"Multiple consecutive read failures for dataset '{dataset_info['name']}' ({consecutive_errors}). Stopping streaming.",
"dataset_loop_error",
f"Multiple consecutive read failures for dataset '{dataset_info['name']}' ({consecutive_errors}). Stopping dataset.",
{
"dataset_id": dataset_id,
"consecutive_errors": consecutive_errors,
@ -525,8 +546,8 @@ class DataStreamer:
consecutive_errors += 1
self.event_logger.log_event(
"error",
"dataset_streaming_error",
f"Error in dataset '{dataset_info['name']}' streaming loop: {str(e)}",
"dataset_loop_error",
f"Error in dataset '{dataset_info['name']}' loop: {str(e)}",
{
"dataset_id": dataset_id,
"error": str(e),
@ -537,8 +558,8 @@ class DataStreamer:
if consecutive_errors >= max_consecutive_errors:
self.event_logger.log_event(
"error",
"dataset_streaming_error",
f"Too many consecutive errors for dataset '{dataset_info['name']}'. Stopping streaming.",
"dataset_loop_error",
f"Too many consecutive errors for dataset '{dataset_info['name']}'. Stopping dataset.",
{
"dataset_id": dataset_id,
"consecutive_errors": consecutive_errors,
@ -548,10 +569,10 @@ class DataStreamer:
time.sleep(1) # Wait before retry
# Clean up when exiting
self.stop_dataset_streaming(dataset_id)
# 🔑 FIXED: Do NOT call stop_dataset_streaming from within the loop
# The thread will be cleaned up externally when needed
if self.logger:
self.logger.info(f"Dataset '{dataset_info['name']}' streaming loop ended")
self.logger.info(f"Dataset '{dataset_info['name']}' loop ended")
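The variable filter applied before each UDP send in the loop above can be sketched in isolation; a variable is transmitted only if it is listed in `streaming_variables` AND its own config has `streaming: true`:

```python
def filter_streaming_data(all_data, dataset_info):
    """Return only the variables flagged for UDP streaming
    (logic mirrored from the dataset loop above)."""
    streaming_variables = dataset_info.get("streaming_variables", [])
    vars_config = dataset_info.get("variables", {})
    return {
        name: value
        for name, value in all_data.items()
        if name in streaming_variables
        and vars_config.get(name, {}).get("streaming", False)
    }


dataset_info = {
    "streaming_variables": ["temp", "pressure"],
    "variables": {"temp": {"streaming": True}, "pressure": {"streaming": False}},
}
print(filter_streaming_data({"temp": 21.5, "pressure": 1.2, "level": 3}, dataset_info))
# -> {'temp': 21.5}
```

`pressure` is listed but disabled, and `level` is not listed at all, so only `temp` reaches PlotJuggler while all three still land in the CSV.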
def start_dataset_streaming(self, dataset_id: str):
"""Start streaming thread for a specific dataset"""
@ -583,7 +604,8 @@ class DataStreamer:
if dataset_id in self.dataset_threads:
# The thread will detect this and stop
thread = self.dataset_threads[dataset_id]
if thread.is_alive():
# 🔑 FIXED: Check if we're not trying to join the current thread
if thread.is_alive() and thread != threading.current_thread():
thread.join(timeout=2)
del self.dataset_threads[dataset_id]
@ -649,31 +671,26 @@ class DataStreamer:
{"dataset_id": dataset_id},
)
def start_streaming(self) -> bool:
"""Start data streaming - activates all datasets with variables"""
# 🔑 NEW: CSV Recording Methods (Automatic)
def start_csv_recording(self) -> bool:
"""Start automatic CSV recording for all datasets with variables"""
if not self.plc_client.is_connected():
self.event_logger.log_event(
"error", "streaming_error", "Cannot start streaming: PLC not connected"
"error",
"csv_recording_error",
"Cannot start CSV recording: PLC not connected",
)
return False
if not self.config_manager.datasets:
self.event_logger.log_event(
"error",
"streaming_error",
"Cannot start streaming: No datasets configured",
"csv_recording_error",
"Cannot start CSV recording: No datasets configured",
)
return False
if not self.setup_udp_socket():
self.event_logger.log_event(
"error",
"streaming_error",
"Cannot start streaming: UDP socket setup failed",
)
return False
# Activate all datasets that have variables
# Activate all datasets that have variables for CSV recording
activated_count = 0
for dataset_id, dataset_info in self.config_manager.datasets.items():
if dataset_info.get("variables"):
@ -689,12 +706,99 @@ class DataStreamer:
if activated_count == 0:
self.event_logger.log_event(
"error",
"streaming_error",
"Cannot start streaming: No datasets with variables configured",
"csv_recording_error",
"Cannot start CSV recording: No datasets with variables configured",
)
return False
self.streaming = True
self.csv_recording_enabled = True
self.config_manager.save_system_state(
connected=self.plc_client.is_connected(),
streaming=self.udp_streaming_enabled,
active_datasets=self.config_manager.active_datasets,
)
self.event_logger.log_event(
"info",
"csv_recording_started",
f"CSV recording started: {activated_count} datasets activated",
{
"activated_datasets": activated_count,
"total_datasets": len(self.config_manager.datasets),
},
)
return True
def stop_csv_recording(self):
"""Stop CSV recording but keep dataset threads for potential UDP streaming"""
self.csv_recording_enabled = False
# Close all CSV files but keep threads running
for dataset_id in list(self.dataset_csv_files.keys()):
if dataset_id in self.dataset_csv_files:
self.dataset_csv_files[dataset_id].close()
del self.dataset_csv_files[dataset_id]
del self.dataset_csv_writers[dataset_id]
del self.dataset_csv_hours[dataset_id]
# Reset modification file flag
self.dataset_using_modification_files.pop(dataset_id, None)
self.config_manager.save_system_state(
connected=self.plc_client.is_connected(),
streaming=self.udp_streaming_enabled,
active_datasets=self.config_manager.active_datasets,
)
self.event_logger.log_event(
"info",
"csv_recording_stopped",
"CSV recording stopped (dataset threads continue for UDP streaming)",
)
# 🔑 NEW: UDP Streaming Methods (Manual)
def start_udp_streaming(self) -> bool:
"""Start UDP streaming to PlotJuggler (independent of CSV recording)"""
if not self.plc_client.is_connected():
self.event_logger.log_event(
"error",
"udp_streaming_error",
"Cannot start UDP streaming: PLC not connected",
)
return False
if not self.config_manager.datasets:
self.event_logger.log_event(
"error",
"udp_streaming_error",
"Cannot start UDP streaming: No datasets configured",
)
return False
if not self.setup_udp_socket():
self.event_logger.log_event(
"error",
"udp_streaming_error",
"Cannot start UDP streaming: UDP socket setup failed",
)
return False
# Ensure datasets are active (for data availability)
activated_count = 0
for dataset_id, dataset_info in self.config_manager.datasets.items():
if (
dataset_info.get("variables")
and dataset_id not in self.config_manager.active_datasets
):
try:
self.activate_dataset(dataset_id)
activated_count += 1
except Exception as e:
if self.logger:
self.logger.warning(
f"Failed to activate dataset {dataset_id}: {e}"
)
self.udp_streaming_enabled = True
self.config_manager.save_system_state(
connected=self.plc_client.is_connected(),
streaming=True,
@ -703,29 +807,19 @@ class DataStreamer:
self.event_logger.log_event(
"info",
"streaming_started",
f"Multi-dataset streaming started: {activated_count} datasets activated",
"udp_streaming_started",
"UDP streaming to PlotJuggler started",
{
"activated_datasets": activated_count,
"total_datasets": len(self.config_manager.datasets),
"udp_host": self.config_manager.udp_config["host"],
"udp_port": self.config_manager.udp_config["port"],
"datasets_available": len(self.config_manager.active_datasets),
},
)
return True
def stop_streaming(self):
"""Stop streaming - deactivates all active datasets"""
self.streaming = False
# Stop all dataset streaming threads
active_datasets_copy = self.config_manager.active_datasets.copy()
for dataset_id in active_datasets_copy:
try:
self.deactivate_dataset(dataset_id)
except Exception as e:
if self.logger:
self.logger.warning(f"Error deactivating dataset {dataset_id}: {e}")
def stop_udp_streaming(self):
"""Stop UDP streaming to PlotJuggler (CSV recording continues)"""
self.udp_streaming_enabled = False
# Close UDP socket
if self.udp_socket:
@ -735,23 +829,31 @@ class DataStreamer:
self.config_manager.save_system_state(
connected=self.plc_client.is_connected(),
streaming=False,
active_datasets=set(),
active_datasets=self.config_manager.active_datasets,
)
datasets_stopped = len(active_datasets_copy)
self.event_logger.log_event(
"info",
"streaming_stopped",
f"Multi-dataset streaming stopped: {datasets_stopped} datasets deactivated",
"udp_streaming_stopped",
"UDP streaming to PlotJuggler stopped (CSV recording continues)",
)
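The UDP path controlled by these start/stop methods ends in a fire-and-forget datagram per cycle. A sketch of that send; the flat JSON layout here is an assumption rather than code from the diff (PlotJuggler's UDP server accepts JSON messages):

```python
import json
import socket
import time


def build_payload(data, timestamp=None):
    """Serialize one sample as JSON; flat key/value layout is assumed."""
    message = {"timestamp": timestamp if timestamp is not None else time.time()}
    message.update(data)
    return json.dumps(message).encode("utf-8")


def send_to_plotjuggler_sketch(data, host="127.0.0.1", port=9870):
    """Fire-and-forget UDP send; host/port defaults are assumptions,
    the real values come from udp_config."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(build_payload(data), (host, port))
    finally:
        sock.close()


print(build_payload({"temp": 21.5}, timestamp=0.0))
# -> b'{"timestamp": 0.0, "temp": 21.5}'
```

Because UDP is connectionless, stopping the stream only needs to clear the flag and close the socket, which is exactly what `stop_udp_streaming` does while CSV recording keeps running.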
# 🔑 LEGACY METHODS (for backward compatibility)
def start_streaming(self) -> bool:
"""Legacy method: Start UDP streaming (CSV recording should be started separately)"""
return self.start_udp_streaming()
def stop_streaming(self):
"""Legacy method: Stop UDP streaming only (CSV recording continues)"""
self.stop_udp_streaming()
def is_streaming(self) -> bool:
"""Check if streaming is active"""
return self.streaming
"""Check if UDP streaming is active"""
return self.udp_streaming_enabled
def is_csv_recording(self) -> bool:
"""Check if CSV recording is active"""
return bool(self.dataset_csv_files) and self.streaming
return self.csv_recording_enabled and bool(self.dataset_csv_files)
def get_active_datasets(self) -> Set[str]:
"""Get set of currently active dataset IDs"""

main.py

@ -304,7 +304,6 @@ def connect_plc():
@app.route("/api/plc/disconnect", methods=["POST"])
def disconnect_plc():
"""Disconnect from PLC"""
streamer.stop_streaming()
streamer.disconnect_plc()
return jsonify({"success": True, "message": "Disconnected from PLC"})
@ -1080,22 +1079,29 @@ def set_current_dataset():
@app.route("/api/streaming/start", methods=["POST"])
def start_streaming():
"""Start streaming"""
"""Start UDP streaming (legacy endpoint - now only starts UDP streaming)"""
error_response = check_streamer_initialized()
if error_response:
return error_response
if streamer.start_streaming():
return jsonify({"success": True, "message": "Streaming started"})
return jsonify({"success": True, "message": "UDP streaming started"})
else:
return jsonify({"success": False, "message": "Error starting streaming"}), 500
return (
jsonify({"success": False, "message": "Error starting UDP streaming"}),
500,
)
@app.route("/api/streaming/stop", methods=["POST"])
def stop_streaming():
"""Stop streaming"""
"""Stop UDP streaming (legacy endpoint - now only stops UDP streaming)"""
error_response = check_streamer_initialized()
if error_response:
return error_response
streamer.stop_streaming()
return jsonify({"success": True, "message": "Streaming stopped"})
return jsonify({"success": True, "message": "UDP streaming stopped"})
@app.route("/api/sampling", methods=["POST"])
@ -1116,9 +1122,13 @@ def update_sampling():
@app.route("/api/csv/start", methods=["POST"])
def start_csv_recording():
"""Start CSV recording independently"""
if streamer.start_csv_recording():
def start_csv_recording_legacy():
"""Start CSV recording independently (legacy endpoint)"""
error_response = check_streamer_initialized()
if error_response:
return error_response
if streamer.data_streamer.start_csv_recording():
return jsonify({"success": True, "message": "CSV recording started"})
else:
return (
@ -1128,12 +1138,73 @@ def start_csv_recording():
@app.route("/api/csv/stop", methods=["POST"])
def stop_csv_recording():
"""Stop CSV recording independently"""
streamer.stop_csv_recording()
def stop_csv_recording_legacy():
"""Stop CSV recording independently (legacy endpoint)"""
error_response = check_streamer_initialized()
if error_response:
return error_response
streamer.data_streamer.stop_csv_recording()
return jsonify({"success": True, "message": "CSV recording stopped"})
@app.route("/api/csv/recording/start", methods=["POST"])
def start_csv_recording():
"""Start CSV recording independently of UDP streaming"""
error_response = check_streamer_initialized()
if error_response:
return error_response
if streamer.data_streamer.start_csv_recording():
return jsonify({"success": True, "message": "CSV recording started"})
else:
return (
jsonify({"success": False, "message": "Error starting CSV recording"}),
500,
)
@app.route("/api/csv/recording/stop", methods=["POST"])
def stop_csv_recording():
"""Stop CSV recording independently of UDP streaming"""
error_response = check_streamer_initialized()
if error_response:
return error_response
streamer.data_streamer.stop_csv_recording()
return jsonify({"success": True, "message": "CSV recording stopped"})
# 🔑 NEW: UDP Streaming Control (Independent)
@app.route("/api/udp/streaming/start", methods=["POST"])
def start_udp_streaming():
"""Start UDP streaming to PlotJuggler independently of CSV recording"""
error_response = check_streamer_initialized()
if error_response:
return error_response
if streamer.data_streamer.start_udp_streaming():
return jsonify(
{"success": True, "message": "UDP streaming to PlotJuggler started"}
)
else:
return (
jsonify({"success": False, "message": "Error starting UDP streaming"}),
500,
)
@app.route("/api/udp/streaming/stop", methods=["POST"])
def stop_udp_streaming():
"""Stop UDP streaming to PlotJuggler independently of CSV recording"""
error_response = check_streamer_initialized()
if error_response:
return error_response
streamer.data_streamer.stop_udp_streaming()
return jsonify({"success": True, "message": "UDP streaming to PlotJuggler stopped"})
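Taken together, the routes above give each subsystem its own POST pair. A hypothetical client-side helper (the paths are copied from the routes; the helper itself and the `localhost:5000` base are assumptions):

```python
# Paths copied from the Flask routes above; the mapping helper is a sketch.
CONTROL_ENDPOINTS = {
    ("csv", "start"): "/api/csv/recording/start",
    ("csv", "stop"): "/api/csv/recording/stop",
    ("udp", "start"): "/api/udp/streaming/start",
    ("udp", "stop"): "/api/udp/streaming/stop",
}


def control_url(base, subsystem, action):
    """Build the POST URL for a recording/streaming control action."""
    return base.rstrip("/") + CONTROL_ENDPOINTS[(subsystem, action)]


print(control_url("http://localhost:5000/", "udp", "start"))
# -> http://localhost:5000/api/udp/streaming/start
```

An HTTP client would then, e.g., `requests.post(control_url(base, "udp", "start"))` to begin PlotJuggler streaming without touching CSV recording.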
@app.route("/api/status")
def get_status():
"""Get current status"""


@ -70,5 +70,5 @@
],
"current_dataset_id": "dar",
"version": "1.0",
"last_update": "2025-07-20T23:03:40.707669"
"last_update": "2025-07-20T23:25:07.495201"
}


@ -182,7 +182,7 @@ function initDatasetListeners() {
}
// Auto-refresh values for the new dataset
autoRefreshOnDatasetChange();
autoStartLiveDisplay();
} else {
showMessage(data.message, 'error');
}


@ -45,14 +45,14 @@ function updateStatus() {
});
}
// Update streaming status
// Update UDP streaming status
if (data.streaming) {
streamStatus.innerHTML = '📡 Streaming: Active <div style="margin-top: 8px;"><button type="button" id="status-streaming-btn">⏹️ Stop</button></div>';
streamStatus.innerHTML = '📡 UDP Streaming: Active <div style="margin-top: 8px;"><button type="button" id="status-streaming-btn">⏹️ Stop</button></div>';
streamStatus.className = 'status-item status-streaming';
// Add event listener to the stop-streaming button
// Add event listener to the stop UDP streaming button
document.getElementById('status-streaming-btn').addEventListener('click', function () {
fetch('/api/streaming/stop', { method: 'POST' })
fetch('/api/udp/streaming/stop', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
@ -60,12 +60,12 @@ function updateStatus() {
});
});
} else {
streamStatus.innerHTML = '📡 Streaming: Inactive <div style="margin-top: 8px;"><button type="button" id="status-start-btn">▶️ Start</button></div>';
streamStatus.innerHTML = '📡 UDP Streaming: Inactive <div style="margin-top: 8px;"><button type="button" id="status-start-btn">▶️ Start</button></div>';
streamStatus.className = 'status-item status-idle';
// Add event listener to the start-streaming button
// Add event listener to the start UDP streaming button
document.getElementById('status-start-btn').addEventListener('click', function () {
fetch('/api/streaming/start', { method: 'POST' })
fetch('/api/udp/streaming/start', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
@ -169,18 +169,70 @@ function updateStatusFromStream(status) {
if (status.plc_connected) {
plcStatus.innerHTML = '🔌 PLC: Connected <div style="margin-top: 8px;"><button type="button" id="status-disconnect-btn">❌ Disconnect</button></div>';
plcStatus.className = 'status-item status-connected';
// Add event listener to the new disconnect button
const disconnectBtn = document.getElementById('status-disconnect-btn');
if (disconnectBtn) {
disconnectBtn.addEventListener('click', function () {
fetch('/api/plc/disconnect', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
});
});
}
} else {
plcStatus.innerHTML = '🔌 PLC: Disconnected <div style="margin-top: 8px;"><button type="button" id="status-connect-btn">🔗 Connect</button></div>';
plcStatus.className = 'status-item status-disconnected';
// Add event listener to the connect button
const connectBtn = document.getElementById('status-connect-btn');
if (connectBtn) {
connectBtn.addEventListener('click', function () {
fetch('/api/plc/connect', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
});
});
}
}
// Update streaming status
// Update UDP streaming status
if (status.streaming) {
streamStatus.innerHTML = '📡 Streaming: Active <div style="margin-top: 8px;"><button type="button" id="status-streaming-btn">⏹️ Stop</button></div>';
streamStatus.innerHTML = '📡 UDP Streaming: Active <div style="margin-top: 8px;"><button type="button" id="status-streaming-btn">⏹️ Stop</button></div>';
streamStatus.className = 'status-item status-streaming';
// Add event listener to the stop UDP streaming button
const stopBtn = document.getElementById('status-streaming-btn');
if (stopBtn) {
stopBtn.addEventListener('click', function () {
fetch('/api/udp/streaming/stop', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
});
});
}
} else {
streamStatus.innerHTML = '📡 Streaming: Inactive <div style="margin-top: 8px;"><button type="button" id="status-start-btn">▶️ Start</button></div>';
streamStatus.innerHTML = '📡 UDP Streaming: Inactive <div style="margin-top: 8px;"><button type="button" id="status-start-btn">▶️ Start</button></div>';
streamStatus.className = 'status-item status-idle';
// Add event listener to the start UDP streaming button
const startBtn = document.getElementById('status-start-btn');
if (startBtn) {
startBtn.addEventListener('click', function () {
fetch('/api/udp/streaming/start', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
});
});
}
}
// Update CSV recording status


@ -1,26 +1,34 @@
/**
* Data streaming management for PlotJuggler
* UDP streaming management for PlotJuggler (independent of CSV recording)
*/
// Initialize listeners for streaming control
// Initialize listeners for UDP streaming control
function initStreamingListeners() {
// Start streaming
// Start UDP streaming
document.getElementById('start-streaming-btn').addEventListener('click', function () {
fetch('/api/streaming/start', { method: 'POST' })
fetch('/api/udp/streaming/start', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
})
.catch(error => {
console.error('Error starting UDP streaming:', error);
showMessage('Error starting UDP streaming', 'error');
});
});
// Stop streaming
// Stop UDP streaming
document.getElementById('stop-streaming-btn').addEventListener('click', function () {
fetch('/api/streaming/stop', { method: 'POST' })
fetch('/api/udp/streaming/stop', { method: 'POST' })
.then(response => response.json())
.then(data => {
showMessage(data.message, data.success ? 'success' : 'error');
updateStatus();
})
.catch(error => {
console.error('Error stopping UDP streaming:', error);
showMessage('Error stopping UDP streaming', 'error');
});
});


@ -101,173 +101,22 @@ function toggleStreaming(varName, enabled) {
});
}
// Refresh variable values from the PLC
function refreshVariableValues() {
if (!currentDatasetId) {
showMessage('Please select a dataset first', 'warning');
return;
// Auto-start live display when dataset changes (if PLC is connected)
function autoStartLiveDisplay() {
if (currentDatasetId) {
// Check if PLC is connected by fetching status
fetch('/api/status')
.then(response => response.json())
.then(status => {
if (status.plc_connected && !isStreamingVariables) {
startVariableStreaming();
showMessage('Live display started automatically for active dataset', 'info');
}
})
.catch(error => {
console.error('Error checking PLC status:', error);
});
}
const refreshBtn = document.getElementById('refresh-values-btn');
const lastRefreshTime = document.getElementById('last-refresh-time');
// Disable button and show loading state
refreshBtn.disabled = true;
refreshBtn.innerHTML = '⏳ Reading...';
fetch(`/api/datasets/${currentDatasetId}/variables/values`)
.then(response => response.json())
.then(data => {
if (data.success) {
// Update variable values in the table
Object.keys(data.values).forEach(varName => {
const valueCell = document.getElementById(`value-${varName}`);
if (valueCell) {
const value = data.values[varName];
valueCell.textContent = value;
// Color coding and tooltip based on value status
if (value === 'ERROR' || value === 'FORMAT_ERROR') {
valueCell.style.color = 'var(--pico-color-red-500)';
// Add tooltip with detailed error if available
const errorDetail = data.detailed_errors && data.detailed_errors[varName];
if (errorDetail) {
valueCell.title = `Error: ${errorDetail}`;
valueCell.style.cursor = 'help';
}
} else {
valueCell.style.color = 'var(--pico-color-green-600)';
valueCell.title = `Value: ${value}`;
valueCell.style.cursor = 'default';
}
}
});
// Update timestamp, statistics, and source information
if (data.timestamp) {
const stats = data.stats;
const source = data.source || 'cache';
const isCache = data.is_cached;
// Create source indicator (always cache now)
const sourceIcon = '📊';
const sourceText = 'from streaming cache';
let statsText = '';
if (stats && stats.total > 0) {
statsText = `<br/><small style="color: var(--pico-muted-color);">📈 ${stats.success}/${stats.total} variables</small>`;
}
lastRefreshTime.innerHTML = `
Last refresh: ${data.timestamp}<br/>
<small style="color: var(--pico-color-green-600);">
${data.message}
</small>${statsText}<br/>
<small style="color: var(--pico-muted-color);">
${sourceIcon} ${sourceText}
</small>
${data.cache_info ? `<br/><small style="color: var(--pico-muted-color);">${data.cache_info}</small>` : ''}
`;
}
// Show appropriate message
if (data.warning) {
showMessage(data.warning, 'warning');
// Log detailed error information to the console for debugging
if (data.detailed_errors && Object.keys(data.detailed_errors).length > 0) {
console.warn('Variable read errors:', data.detailed_errors);
}
} else {
showMessage(data.message, 'success');
}
} else {
// Handle the different failure cases
const errorType = data.error_type;
if (errorType === 'dataset_inactive') {
// Dataset is not active - guide the user to activate it
showMessage(`⚠️ ${data.message}`, 'warning');
clearVariableValues('DATASET INACTIVE');
lastRefreshTime.innerHTML = `
Last refresh attempt: ${data.timestamp}<br/>
<small style="color: var(--pico-color-amber-500);">
Dataset not active - activate dataset to populate cache
</small><br/>
<small style="color: var(--pico-muted-color);">
💡 Use "Activate" button in dataset controls above
</small>
`;
} else if (errorType === 'plc_disconnected') {
// PLC not connected - guide the user to connect
showMessage(`🔌 ${data.message}`, 'warning');
clearVariableValues('PLC OFFLINE');
lastRefreshTime.innerHTML = `
Last refresh attempt: ${data.timestamp}<br/>
<small style="color: var(--pico-color-red-500);">
🔌 PLC not connected - cache cannot be populated
</small><br/>
<small style="color: var(--pico-muted-color);">
💡 Connect to PLC first, then activate dataset
</small>
`;
} else if (errorType === 'no_cache_available') {
// No cache yet - streaming is starting up
showMessage(`${data.message}`, 'info');
clearVariableValues('READING...');
lastRefreshTime.innerHTML = `
Last refresh attempt: ${data.timestamp}<br/>
<small style="color: var(--pico-color-blue-500);">
Cache being populated by streaming process
</small><br/>
<small style="color: var(--pico-muted-color);">
${data.note || 'Please wait for next reading cycle'}
</small>
`;
} else {
// Complete failure case or other errors
showMessage(`${data.message}`, 'error');
clearVariableValues('ERROR');
const source = data.source || 'cache';
const sourceIcon = '📊';
const sourceText = 'from streaming cache';
lastRefreshTime.innerHTML = `
Last refresh attempt: ${data.timestamp}<br/>
<small style="color: var(--pico-color-red-500);">
${data.message}
</small><br/>
<small style="color: var(--pico-muted-color);">
${sourceIcon} ${sourceText}
</small>
`;
// Show detailed error information if available
if (data.detailed_errors && Object.keys(data.detailed_errors).length > 0) {
console.error('Detailed variable errors:', data.detailed_errors);
}
}
}
})
.catch(error => {
console.error('Error refreshing variable values:', error);
showMessage('Network error retrieving cached variable values', 'error');
clearVariableValues('COMM ERROR');
lastRefreshTime.innerHTML = `
Last refresh attempt: ${new Date().toLocaleString()}<br/>
<small style="color: var(--pico-color-red-500);">
🌐 Network error communicating with server
</small>
`;
})
.finally(() => {
// Re-enable button
refreshBtn.disabled = false;
refreshBtn.innerHTML = '🔄 Refresh Values';
});
}
// Clear all variable values and set a status message
@ -280,15 +129,7 @@ function clearVariableValues(statusMessage = '--') {
});
}
// Auto-refresh values when the dataset changes (optional)
function autoRefreshOnDatasetChange() {
if (currentDatasetId) {
// Pequeño retraso para asegurar que la tabla está cargada
setTimeout(() => {
refreshVariableValues();
}, 500);
}
}
// Start real-time variable streaming
function startVariableStreaming() {
@ -446,16 +287,14 @@ function updateVariableValuesFromStream(data) {
// Update streaming indicator
function updateStreamingIndicator(isStreaming) {
const refreshBtn = document.getElementById('refresh-values-btn');
if (refreshBtn) {
const toggleBtn = document.getElementById('toggle-streaming-btn');
if (toggleBtn) {
if (isStreaming) {
refreshBtn.innerHTML = '🔄 Live Streaming';
refreshBtn.disabled = true;
refreshBtn.title = 'Real-time streaming is active - values update automatically';
toggleBtn.innerHTML = '⏹️ Stop Live Display';
toggleBtn.title = 'Stop live variable display';
} else {
refreshBtn.innerHTML = '🔄 Refresh Values';
refreshBtn.disabled = false;
refreshBtn.title = 'Click to refresh variable values';
toggleBtn.innerHTML = '▶️ Start Live Display';
toggleBtn.title = 'Start live variable display';
}
}
}
@ -481,157 +320,3 @@ function toggleRealTimeStreaming() {
}
}
// Diagnostic helper for connection and variable problems
function diagnoseConnection() {
if (!currentDatasetId) {
showMessage('No dataset selected for diagnosis', 'error');
return;
}
const diagnoseBtn = document.getElementById('diagnose-btn');
const originalText = diagnoseBtn.innerHTML;
// Disable button and show diagnosing state
diagnoseBtn.disabled = true;
diagnoseBtn.innerHTML = '🔍 Diagnosing...';
// Build the diagnostic report
let diagnosticReport = [];
diagnosticReport.push('=== PLC CONNECTION DIAGNOSTICS ===');
diagnosticReport.push(`Dataset: ${currentDatasetId}`);
diagnosticReport.push(`Timestamp: ${new Date().toLocaleString()}`);
diagnosticReport.push('');
// Step 1: Check PLC connection status
fetch('/api/status')
.then(response => response.json())
.then(statusData => {
diagnosticReport.push('1. PLC Connection Status:');
diagnosticReport.push(` Connected: ${statusData.plc_connected ? 'YES' : 'NO'}`);
diagnosticReport.push(` PLC IP: ${statusData.plc_config.ip}`);
diagnosticReport.push(` Rack: ${statusData.plc_config.rack}`);
diagnosticReport.push(` Slot: ${statusData.plc_config.slot}`);
diagnosticReport.push('');
if (!statusData.plc_connected) {
diagnosticReport.push(' ❌ PLC is not connected. Please check:');
diagnosticReport.push(' - Network connectivity to PLC');
diagnosticReport.push(' - PLC IP address, rack, and slot configuration');
diagnosticReport.push(' - PLC is powered on and operational');
diagnosticReport.push('');
showDiagnosticResults(diagnosticReport);
return;
}
// Step 2: Fetch dataset information
return fetch('/api/datasets')
.then(response => response.json())
.then(datasetData => {
const dataset = datasetData.datasets[currentDatasetId];
if (!dataset) {
diagnosticReport.push('2. Dataset Status:');
diagnosticReport.push(' ❌ Dataset not found');
showDiagnosticResults(diagnosticReport);
return;
}
diagnosticReport.push('2. Dataset Information:');
diagnosticReport.push(` Name: ${dataset.name}`);
diagnosticReport.push(` Variables: ${Object.keys(dataset.variables).length}`);
diagnosticReport.push(` Active: ${dataset.enabled ? 'YES' : 'NO'}`);
diagnosticReport.push('');
// Step 3: Test variable reads with diagnostics
diagnosticReport.push('3. Variable Reading Test:');
return fetch(`/api/datasets/${currentDatasetId}/variables/values`)
.then(response => response.json())
.then(valueData => {
if (valueData.success) {
const stats = valueData.stats || {};
diagnosticReport.push(` ✅ Success: ${stats.success || 0}/${stats.total || 0} variables read`);
if (stats.failed > 0) {
diagnosticReport.push(` ⚠️ Failed: ${stats.failed} variables had errors`);
diagnosticReport.push('');
diagnosticReport.push('4. Variable-Specific Errors:');
if (valueData.detailed_errors) {
Object.keys(valueData.detailed_errors).forEach(varName => {
diagnosticReport.push(` ${varName}: ${valueData.detailed_errors[varName]}`);
});
}
} else {
diagnosticReport.push(' ✅ All variables read successfully');
}
} else {
diagnosticReport.push(` ❌ Complete failure: ${valueData.message}`);
diagnosticReport.push('');
diagnosticReport.push('4. Detailed Error Information:');
if (valueData.detailed_errors) {
Object.keys(valueData.detailed_errors).forEach(varName => {
diagnosticReport.push(` ${varName}: ${valueData.detailed_errors[varName]}`);
});
}
diagnosticReport.push('');
diagnosticReport.push('5. Troubleshooting Suggestions:');
if (valueData.error_type === 'connection_error') {
diagnosticReport.push(' - Check PLC network connection');
diagnosticReport.push(' - Verify PLC is responding to network requests');
diagnosticReport.push(' - Check firewall settings');
} else if (valueData.error_type === 'all_failed') {
diagnosticReport.push(' - Verify variable memory addresses are correct');
diagnosticReport.push(' - Check if data blocks exist in PLC program');
diagnosticReport.push(' - Ensure variable types match PLC configuration');
}
}
showDiagnosticResults(diagnosticReport);
});
});
})
.catch(error => {
diagnosticReport.push('❌ Diagnostic failed with network error:');
diagnosticReport.push(` ${error.message}`);
diagnosticReport.push('');
diagnosticReport.push('Troubleshooting:');
diagnosticReport.push(' - Check web server connection');
diagnosticReport.push(' - Refresh the page and try again');
showDiagnosticResults(diagnosticReport);
})
.finally(() => {
// Re-enable button
diagnoseBtn.disabled = false;
diagnoseBtn.innerHTML = originalText;
});
}
// Show diagnostic results in the console and as a message
function showDiagnosticResults(diagnosticReport) {
const reportText = diagnosticReport.join('\n');
// Log to the console for detailed analysis
console.log(reportText);
// Show a summary message to the user
const errorCount = reportText.match(/❌/g)?.length || 0;
const warningCount = reportText.match(/⚠️/g)?.length || 0;
const successCount = reportText.match(/✅/g)?.length || 0;
let summaryMessage = 'Diagnosis completed. ';
if (errorCount > 0) {
summaryMessage += `${errorCount} errors found. `;
}
if (warningCount > 0) {
summaryMessage += `${warningCount} warnings found. `;
}
if (successCount > 0) {
summaryMessage += `${successCount} checks passed. `;
}
summaryMessage += 'Check browser console (F12) for detailed report.';
const messageType = errorCount > 0 ? 'error' : (warningCount > 0 ? 'warning' : 'success');
showMessage(summaryMessage, messageType);
}

View File

@ -7,5 +7,5 @@
]
},
"auto_recovery_enabled": true,
- "last_update": "2025-07-20T23:03:40.714677"
+ "last_update": "2025-07-20T23:25:10.215131"
}

View File

@ -41,7 +41,7 @@
</div>
</div>
<div class="status-item" id="stream-status">
- 📡 Streaming: Inactive
+ 📡 UDP Streaming: Inactive
<div style="margin-top: 8px;">
<button type="button" id="status-streaming-btn">⏹️ Stop</button>
</div>
@ -161,31 +161,29 @@
<!-- Variables Management Section -->
<div id="variables-management" style="display: none;">
- <!-- Real-time Streaming Control -->
+ <!-- Real-time Variable Monitoring Info -->
<div
style="margin-bottom: 1rem; padding: 1rem; background: var(--pico-card-background-color); border-radius: var(--pico-border-radius); border: var(--pico-border-width) solid var(--pico-border-color);">
<div
style="display: flex; justify-content: space-between; align-items: center; flex-wrap: wrap; gap: 1rem;">
<div>
- <strong>🔄 Real-time Variable Streaming</strong>
+ <strong>📊 Automatic Variable Monitoring</strong>
<br>
<small style="color: var(--pico-muted-color);">
- Enable live updates of variable values without page refresh
+ Variables are automatically monitored and recorded when PLC is connected and dataset is active
</small>
</div>
<div style="display: flex; gap: 0.5rem; align-items: center;">
<button type="button" id="toggle-streaming-btn" class="outline"
onclick="toggleRealTimeStreaming()">
- ▶️ Start Live Streaming
- </button>
- <button type="button" id="refresh-values-btn" onclick="refreshVariableValues()">
- 🔄 Refresh Values
+ ▶️ Start Live Display
</button>
</div>
</div>
<div id="last-refresh-time"
style="margin-top: 0.5rem; font-size: 0.9em; color: var(--pico-muted-color);">
- Click "Refresh Values" to read current variable values
+ Live display shows cached values from automatic monitoring
</div>
</div>
@ -252,13 +250,6 @@
<div style="display: flex; justify-content: space-between; align-items: center; margin-bottom: 1rem;">
<h4 style="margin: 0;">📊 Variables in Dataset</h4>
<div style="display: flex; align-items: center; gap: 0.5rem; flex-wrap: wrap;">
<button type="button" id="refresh-values-btn" class="outline" onclick="refreshVariableValues()">
🔄 Refresh Values
</button>
<button type="button" id="diagnose-btn" class="secondary" onclick="diagnoseConnection()"
title="Run connection and variable diagnostics">
🔍 Diagnose
</button>
<div style="margin-left: 1rem;">
<span id="last-refresh-time"
style="color: var(--pico-muted-color); font-size: 0.9rem;"></span>
@ -330,22 +321,22 @@
</div>
</article>
- <!-- Multi-Dataset Streaming Control -->
+ <!-- PlotJuggler Streaming Control -->
<article>
- <header>🚀 Multi-Dataset Streaming Control</header>
+ <header>📡 PlotJuggler UDP Streaming Control</header>
<div class="info-section">
- <p><strong>📡 Streaming Mode:</strong> Only variables marked for streaming in active datasets are sent to PlotJuggler</p>
- <p><strong>💾 CSV Recording:</strong> Each active dataset automatically records ALL its variables to separate CSV files</p>
+ <p><strong>📡 UDP Streaming:</strong> Sends only variables marked for streaming to PlotJuggler via UDP</p>
+ <p><strong>💾 Automatic Recording:</strong> When PLC is connected, all datasets with variables automatically record to CSV files</p>
<p><strong>📁 File Organization:</strong> records/[dd-mm-yyyy]/[prefix]_[hour].csv (e.g., temp_14.csv,
pressure_14.csv)</p>
- <p><strong>⏱️ Individual Sampling:</strong> Each dataset can have its own sampling interval or use the global one</p>
+ <p><strong>⏱️ Independent Operation:</strong> CSV recording works independently of UDP streaming - always active when PLC is connected</p>
</div>
<div class="controls">
- <button id="start-streaming-btn">▶️ Start All Active Datasets</button>
- <button class="secondary" id="stop-streaming-btn">⏹️ Stop All Streaming</button>
+ <button id="start-streaming-btn">📡 Start UDP Streaming</button>
+ <button class="secondary" id="stop-streaming-btn">⏹️ Stop UDP Streaming</button>
<button class="outline" onclick="location.reload()">🔄 Refresh Status</button>
</div>
</article>
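The CSV file layout and the UDP streaming behavior described in this panel can be sketched on the backend side roughly as follows. This is an illustrative sketch only: the function names, port 9870, and exact JSON shape are assumptions (PlotJuggler's UDP server source accepts flat JSON objects), not this tool's actual implementation.

```python
import json
import socket
import time
from datetime import datetime


def csv_path(prefix: str, now: datetime) -> str:
    """Build the records/[dd-mm-yyyy]/[prefix]_[hour].csv path used for recording."""
    return f"records/{now.strftime('%d-%m-%Y')}/{prefix}_{now.hour}.csv"


def build_udp_sample(values: dict) -> bytes:
    """One sample per datagram: a flat {variable: value} JSON object plus a timestamp."""
    return json.dumps({"timestamp": time.time(), **values}).encode("utf-8")


def send_sample(values: dict, addr=("127.0.0.1", 9870)) -> None:
    """Send one sample of the streaming-enabled variables to PlotJuggler (hypothetical address)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(build_udp_sample(values), addr)


print(csv_path("temp", datetime(2025, 7, 20, 14, 30)))  # records/20-07-2025/temp_14.csv
```

On the PlotJuggler side, the UDP Server streaming source would be configured with the same port and its JSON message parser.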