Implemented a persistent event logging system with on-disk storage and real-time display in the web interface. Added functionality for loading and saving system state, along with improvements to single-instance management and automatic recovery. The interface was also updated to display the event log, including filtering options and a clear-view function.
This commit is contained in: parent cc729d8f82 · commit 5e3b1ae76e

@@ -4,6 +4,71 @@
### Latest Modifications (Current Session)

#### Persistent Application Events Log

**Decision**: Implemented a comprehensive event logging system with persistent storage and a real-time web interface.

**Rationale**: Industrial applications require detailed audit trails and event monitoring for troubleshooting, compliance, and operational analysis. Previous logging was limited to console output, with no persistent storage or web access.

**Implementation**:

**Persistent Event Storage**:
- Created `application_events.json` file for structured event storage
- JSON-based format with timestamp, level, event type, message, and detailed metadata (see the example entry below)
- Automatic log rotation with a configurable maximum number of entries (1000 events)
- UTF-8 encoding support for international character sets
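For reference, a representative entry as stored in `application_events.json` (values taken from the log file added in this commit):

```json
{
  "timestamp": "2025-07-17T14:39:03.938409",
  "level": "info",
  "event_type": "plc_connection",
  "message": "Successfully connected to PLC 10.1.33.11",
  "details": {
    "ip": "10.1.33.11",
    "rack": 0,
    "slot": 2
  }
}
```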
**Event Categories**:
- **Connection Events**: PLC connect/disconnect with connection parameters
- **Configuration Changes**: PLC settings, UDP settings, and sampling interval updates with before/after values
- **Variable Management**: Variable addition/removal with complete configuration details
- **Streaming Operations**: Start/stop streaming with variable counts and settings
- **CSV Recording**: Start/stop recording with file paths and variable counts
- **Error Events**: Connection failures, streaming errors, and configuration errors with detailed error information
- **System Events**: Application startup, shutdown, and recovery operations

**Enhanced Error Handling**:
- Consecutive error detection in the streaming loop, with automatic shutdown after 5 failures (see the sketch below)
- Detailed error context including error messages and retry counts
- Graceful degradation with error logging instead of silent failures
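The detection pattern, condensed from the streaming-loop changes in `main.py` (a simplified sketch; the full loop also writes CSV data and sends UDP packets):

```python
consecutive_errors = 0
max_consecutive_errors = 5

while self.streaming:
    try:
        all_data = self.read_all_variables()
        if all_data:
            consecutive_errors = 0  # reset on any successful read
        else:
            consecutive_errors += 1
    except Exception as e:
        consecutive_errors += 1
        self.log_event(
            "error",
            "streaming_error",
            f"Error in streaming loop: {str(e)}",
            {"error": str(e), "consecutive_errors": consecutive_errors},
        )
        time.sleep(1)  # wait before retry

    if consecutive_errors >= max_consecutive_errors:
        self.log_event(
            "error",
            "streaming_error",
            "Too many consecutive errors. Stopping streaming.",
            {"consecutive_errors": consecutive_errors},
        )
        break  # automatic shutdown after 5 consecutive failures
```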
**Web Interface Integration**:
- New "Application Events Log" section at the bottom of the main page
- Real-time log display with automatic refresh every 10 seconds
- Configurable event limit (25, 50, 100, 200 events)
- Color-coded log levels (info: gray, warning: orange, error: red)
- Event type icons for quick visual identification
- Expandable details view showing complete event metadata
- Manual refresh and clear-view functions

**Technical Architecture**:
- Thread-safe logging with automatic file persistence
- RESTful API endpoint `/api/events` for log data retrieval
- Structured event format with consistent metadata fields
- Monospace font display for improved readability
- Responsive design with a mobile-friendly log viewer

**API Enhancements**:
- GET `/api/events?limit=N` endpoint for retrieving recent events (example exchange below)
- Response includes the total event count and the size of the current selection
- Error handling with proper HTTP status codes
- Configurable event limit with a maximum safety cap (200 events per request)
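A typical exchange with the new endpoint (response abridged; field names match the implementation in `main.py`, and the total reflects the log file committed here):

```
GET /api/events?limit=2

{
  "success": true,
  "events": [ ...two most recent event objects... ],
  "total_events": 22,
  "showing": 2
}
```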
**User Experience Benefits**:
- Immediate visibility into system operations and issues
- Historical event tracking across application restarts
- Detailed troubleshooting information for technical support
- Real-time monitoring of system health and operations
- Professional logging interface suitable for industrial environments

**Storage Efficiency**:
- Automatic log size management to prevent disk space issues
- Efficient JSON serialization with minimal storage overhead
- Fast event retrieval with in-memory caching

**Impact**: Operators and technicians now have comprehensive visibility into all system operations, significantly improving troubleshooting capabilities and providing detailed audit trails for industrial compliance requirements.

### Previous Modifications

#### Persistent Configuration System

**Decision**: Implemented JSON-based persistence for both the PLC configuration and the variables setup.

@@ -102,9 +167,118 @@ Configuration handling is separated into distinct methods for PLC settings, UDP
**User Experience**: Users can now record all process data for historical analysis while sending only the relevant variables to real-time visualization tools, reducing network traffic and improving PlotJuggler performance.

#### Industrial-Grade Reliability Enhancements

**Decision**: Implemented comprehensive system state persistence and auto-recovery mechanisms for industrial environment resilience.

**Rationale**: Industrial applications require maximum uptime and automatic recovery from power outages, system failures, and unexpected interruptions without manual intervention.

**Implementation**:

**Persistent Streaming Configuration**:
- Modified the variable storage format to include streaming state (`"streaming": true/false`); see the example below
- Variables now remember which ones are enabled for PlotJuggler streaming across application restarts
- Automatic migration from the old format ensures backward compatibility
- Streaming configuration persists independently from variable definitions
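The resulting on-disk format, taken from the updated `plc_variables.json`:

```json
"UR29_Brix": {
    "db": 2121,
    "offset": 18,
    "type": "real",
    "streaming": true
}
```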
**System State Persistence**:
- Created `system_state.json` file to track connection, streaming, and CSV recording states (format shown below)
- Automatic state saving on every significant system change (connect, disconnect, start/stop streaming)
- State includes the last known configuration and auto-recovery preferences
- Timestamp tracking for state changes and recovery attempts
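The committed `system_state.json` illustrates the format:

```json
{
    "last_state": {
        "should_connect": true,
        "should_stream": true,
        "should_record_csv": true
    },
    "auto_recovery_enabled": true,
    "last_update": "2025-07-17T15:42:38.054690"
}
```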
**Single Instance Control**:
- Implemented PID-based instance locking using the `psutil` library for Windows compatibility
- Process verification ensures only legitimate instances are detected (checks the command line for 'main.py' or 'plc')
- Automatic cleanup of stale lock files from terminated processes
- Graceful handling of concurrent startup attempts (see the sketch below)
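The core of the check, condensed from `acquire_instance_lock()` in `main.py` (simplified; the full method also handles unreadable lock files and `psutil` access errors):

```python
# Verify an existing lock before starting
if os.path.exists(self.lock_file):
    with open(self.lock_file, "r") as f:
        old_pid = int(f.read().strip())
    if psutil.pid_exists(old_pid):
        cmdline = " ".join(psutil.Process(old_pid).cmdline())
        if "main.py" in cmdline or "plc" in cmdline.lower():
            return False  # a legitimate instance is already running
    os.remove(self.lock_file)  # stale lock left by a dead process

# Claim the lock with our own PID and register cleanup on exit
with open(self.lock_file, "w") as f:
    f.write(str(os.getpid()))
atexit.register(self.release_instance_lock)
```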
**Auto-Recovery System**:
- Automatic restoration of the previous connection state on application startup
- Intelligent recovery sequence: PLC connection → streaming/CSV recording restoration
- Configurable auto-recovery with an enable/disable option
- Retry mechanism with exponential backoff for failed recovery attempts

**Robust Error Handling**:
- Maximum retry system (3 attempts) for critical failures
- Graceful shutdown procedure with proper resource cleanup
- Instance lock release on application termination
- Comprehensive error logging and user feedback

**Technical Architecture**:
- State management integrated into the existing configuration system
- Thread-safe state persistence with automatic file handling
- Cross-platform compatibility (Windows focus, using `psutil` instead of `fcntl`)
- Memory-efficient state tracking with minimal performance impact

**Industrial Benefits**:
- Zero-configuration restart after power failures
- Prevents operator confusion from multiple running instances
- Maintains data continuity during system interruptions
- Reduces manual intervention requirements in automated environments

**Dependencies Added**:
- `psutil==5.9.5` for cross-platform process management and instance control

### Future Considerations

The persistent configuration system provides a foundation for more advanced features such as configuration profiles, backup/restore functionality, and remote configuration management.

The English interface and standardized design make the application ready for potential integration with larger industrial monitoring systems or deployment in international environments.

The industrial-grade reliability enhancements ensure the application meets the stringent uptime requirements of production environments and can recover automatically from common industrial disruptions such as power outages and system reboots.

#### Dynamic CSV File Creation on Variable Modifications

**Decision**: Implemented automatic creation of new CSV files with a timestamp when variables are modified during active recording.

**Rationale**: When variables are added or removed during active CSV recording, the changed data structure requires a new file to maintain data integrity. Continuing to write to the same file would result in misaligned columns and data corruption. The timestamped filename provides clear traceability of when variable configuration changes occurred.

**Implementation**:

**File Naming Strategy**:
- Standard hourly files: `hour.csv` (e.g., `14.csv` for 2:00 PM)
- Modification files: `_hour_min_sec.csv` (e.g., `_14_25_33.csv` for 2:25:33 PM)
- Maintains chronological order while clearly identifying modification points (see the sketch below)
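The naming logic, condensed from the updated `get_csv_file_path()`:

```python
now = datetime.now()
if use_modification_timestamp:
    filename = f"_{now.strftime('%H_%M_%S')}.csv"  # e.g. _14_25_33.csv
else:
    filename = f"{now.strftime('%H')}.csv"  # e.g. 14.csv
path = os.path.join("records", now.strftime("%d-%m-%Y"), filename)
```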
**Automatic File Management**:
- Detects variable additions and removals during active CSV recording
- Automatically closes the current CSV file when variables are modified
- Immediately creates a new file with a modification timestamp
- Writes new headers matching the updated variable configuration
- Resets internal state to ensure proper file rotation continues

**Enhanced `get_csv_file_path()` Method**:
- Added a `use_modification_timestamp` parameter for conditional file naming
- Preserves backward compatibility with the existing hourly rotation
- Generates a precise timestamp (`%H_%M_%S`) for modification tracking

**New `create_new_csv_file_for_variable_modification()` Method**:
- Triggered automatically by the `add_variable()` and `remove_variable()` methods
- Only activates when CSV recording is active, to avoid unnecessary file creation
- Handles file closure, creation, and header writing in a single atomic operation
- Comprehensive error handling with detailed logging

**Event Logging Integration**:
- Records CSV file creation events with the modification reason
- Logs file paths, variable counts, and timestamps for audit trails
- Distinguishes between regular rotation and modification-triggered files

**Technical Benefits**:
- Maintains data integrity across variable configuration changes
- Provides a clear audit trail of when the system configuration was modified
- Enables precise correlation between data files and system state
- Supports continuous operation without manual intervention

**Data Continuity**:
- Zero data loss during variable modifications
- Seamless transition between files without interrupting recording
- Automatic header generation ensures proper CSV structure
- Maintains sampling rate and timing precision

This enhancement ensures that CSV data remains structured and analyzable even when the monitoring configuration evolves during operation, which is critical for long-running industrial processes where monitoring requirements may change.

**Bug Fix - File Continuation Logic**:
- Added a `using_modification_file` flag to track when a modification timestamp file is active
- Modified `setup_csv_file()` to respect this flag and continue using the modification file until the hour naturally changes (see the sketch below)
- Prevents the system from immediately reverting to standard hourly files after creating modification files
- Ensures data continuity in the intended timestamped file rather than switching back to regular rotation
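The guard, condensed from the updated `setup_csv_file()`:

```python
# Keep writing to the modification-timestamp file until the hour rolls over
if (
    self.using_modification_file
    and self.current_hour == current_hour
    and self.current_csv_file is not None
):
    return  # stay on the current timestamped file
```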

@@ -5,6 +5,7 @@ __pycache__/
# C extensions
*.so
*.csv

# Distribution / packaging
.Python
@@ -0,0 +1,258 @@
{
  "events": [
    {
      "timestamp": "2025-07-17T14:39:03.917840",
      "level": "info",
      "event_type": "Application started",
      "message": "Application initialization completed successfully",
      "details": {}
    },
    {
      "timestamp": "2025-07-17T14:39:03.938409",
      "level": "info",
      "event_type": "plc_connection",
      "message": "Successfully connected to PLC 10.1.33.11",
      "details": {
        "ip": "10.1.33.11",
        "rack": 0,
        "slot": 2
      }
    },
    {
      "timestamp": "2025-07-17T14:39:03.939409",
      "level": "info",
      "event_type": "csv_started",
      "message": "CSV recording started for 3 variables",
      "details": {
        "variables_count": 3,
        "output_directory": "records\\17-07-2025"
      }
    },
    {
      "timestamp": "2025-07-17T14:39:03.940403",
      "level": "info",
      "event_type": "streaming_started",
      "message": "Streaming started with 3 variables",
      "details": {
        "variables_count": 3,
        "streaming_variables_count": 3,
        "sampling_interval": 0.1,
        "udp_host": "127.0.0.1",
        "udp_port": 9870
      }
    },
    {
      "timestamp": "2025-07-17T14:56:25.611805",
      "level": "info",
      "event_type": "Application started",
      "message": "Application initialization completed successfully",
      "details": {}
    },
    {
      "timestamp": "2025-07-17T14:56:25.640479",
      "level": "info",
      "event_type": "plc_connection",
      "message": "Successfully connected to PLC 10.1.33.11",
      "details": {
        "ip": "10.1.33.11",
        "rack": 0,
        "slot": 2
      }
    },
    {
      "timestamp": "2025-07-17T14:56:25.642459",
      "level": "info",
      "event_type": "csv_started",
      "message": "CSV recording started for 3 variables",
      "details": {
        "variables_count": 3,
        "output_directory": "records\\17-07-2025"
      }
    },
    {
      "timestamp": "2025-07-17T14:56:25.643467",
      "level": "info",
      "event_type": "streaming_started",
      "message": "Streaming started with 3 variables",
      "details": {
        "variables_count": 3,
        "streaming_variables_count": 3,
        "sampling_interval": 0.1,
        "udp_host": "127.0.0.1",
        "udp_port": 9870
      }
    },
    {
      "timestamp": "2025-07-17T15:25:37.659374",
      "level": "info",
      "event_type": "variable_added",
      "message": "Variable added: CTS306_Conditi -> DB2124.18 (real)",
      "details": {
        "name": "CTS306_Conditi",
        "db": 2124,
        "offset": 18,
        "type": "real",
        "total_variables": 4
      }
    },
    {
      "timestamp": "2025-07-17T15:25:37.662879",
      "level": "info",
      "event_type": "csv_file_created",
      "message": "New CSV file created after variable modification: _15_25_37.csv",
      "details": {
        "file_path": "records\\17-07-2025\\_15_25_37.csv",
        "variables_count": 4,
        "reason": "variable_modification"
      }
    },
    {
      "timestamp": "2025-07-17T15:42:38.033187",
      "level": "info",
      "event_type": "Application started",
      "message": "Application initialization completed successfully",
      "details": {}
    },
    {
      "timestamp": "2025-07-17T15:42:38.052471",
      "level": "info",
      "event_type": "plc_connection",
      "message": "Successfully connected to PLC 10.1.33.11",
      "details": {
        "ip": "10.1.33.11",
        "rack": 0,
        "slot": 2
      }
    },
    {
      "timestamp": "2025-07-17T15:42:38.053687",
      "level": "info",
      "event_type": "csv_started",
      "message": "CSV recording started for 4 variables",
      "details": {
        "variables_count": 4,
        "output_directory": "records\\17-07-2025"
      }
    },
    {
      "timestamp": "2025-07-17T15:42:38.055690",
      "level": "info",
      "event_type": "streaming_started",
      "message": "Streaming started with 4 variables",
      "details": {
        "variables_count": 4,
        "streaming_variables_count": 4,
        "sampling_interval": 0.1,
        "udp_host": "127.0.0.1",
        "udp_port": 9870
      }
    },
    {
      "timestamp": "2025-07-17T15:43:12.383366",
      "level": "info",
      "event_type": "variable_added",
      "message": "Variable added: test -> DB2124.14 (real)",
      "details": {
        "name": "test",
        "db": 2124,
        "offset": 14,
        "type": "real",
        "total_variables": 5
      }
    },
    {
      "timestamp": "2025-07-17T15:43:12.385360",
      "level": "info",
      "event_type": "csv_file_created",
      "message": "New CSV file created after variable modification: _15_43_12.csv",
      "details": {
        "file_path": "records\\17-07-2025\\_15_43_12.csv",
        "variables_count": 5,
        "reason": "variable_modification"
      }
    },
    {
      "timestamp": "2025-07-17T15:43:12.407642",
      "level": "error",
      "event_type": "streaming_error",
      "message": "Error in streaming loop: dictionary changed size during iteration",
      "details": {
        "error": "dictionary changed size during iteration",
        "consecutive_errors": 1
      }
    },
    {
      "timestamp": "2025-07-17T15:43:33.392876",
      "level": "error",
      "event_type": "streaming_error",
      "message": "Error in streaming loop: dictionary changed size during iteration",
      "details": {
        "error": "dictionary changed size during iteration",
        "consecutive_errors": 1
      }
    },
    {
      "timestamp": "2025-07-17T15:43:33.394375",
      "level": "info",
      "event_type": "variable_removed",
      "message": "Variable removed: test",
      "details": {
        "name": "test",
        "removed_config": {
          "db": 2124,
          "offset": 14,
          "type": "real",
          "streaming": false
        },
        "total_variables": 4
      }
    },
    {
      "timestamp": "2025-07-17T15:43:33.397370",
      "level": "info",
      "event_type": "csv_file_created",
      "message": "New CSV file created after variable modification: _15_43_33.csv",
      "details": {
        "file_path": "records\\17-07-2025\\_15_43_33.csv",
        "variables_count": 4,
        "reason": "variable_modification"
      }
    },
    {
      "timestamp": "2025-07-17T15:43:37.383086",
      "level": "info",
      "event_type": "config_change",
      "message": "UDP configuration updated: 127.0.0.1:9870",
      "details": {
        "old_config": {
          "host": "127.0.0.1",
          "port": 9870
        },
        "new_config": {
          "host": "127.0.0.1",
          "port": 9870
        }
      }
    },
    {
      "timestamp": "2025-07-17T15:43:38.917840",
      "level": "info",
      "event_type": "config_change",
      "message": "PLC configuration updated: 10.1.33.11:0/2",
      "details": {
        "old_config": {
          "ip": "10.1.33.11",
          "rack": 0,
          "slot": 2
        },
        "new_config": {
          "ip": "10.1.33.11",
          "rack": 0,
          "slot": 2
        }
      }
    }
  ],
  "last_updated": "2025-07-17T15:43:38.917840",
  "total_entries": 22
}

675 main.py
@@ -14,11 +14,14 @@ import time
import logging
import threading
from datetime import datetime
-from typing import Dict, Any, Optional
+from typing import Dict, Any, Optional, List
import struct
import os
import csv
from pathlib import Path
import atexit
import psutil
import sys

app = Flask(__name__)
app.secret_key = "plc_streamer_secret_key"
@@ -30,6 +33,8 @@ class PLCDataStreamer:
        # Configuration file paths
        self.config_file = "plc_config.json"
        self.variables_file = "plc_variables.json"
        self.state_file = "system_state.json"
        self.events_log_file = "application_events.json"

        # Default configuration
        self.plc_config = {"ip": "192.168.1.100", "rack": 0, "slot": 2}
@@ -46,6 +51,9 @@ class PLCDataStreamer:
        self.current_csv_writer = None
        self.current_hour = None
        self.csv_headers_written = False
        self.using_modification_file = (
            False  # Flag to track if using modification timestamp file
        )

        # System states
        self.plc = None
@@ -55,12 +63,43 @@ class PLCDataStreamer:
        self.stream_thread = None
        self.sampling_interval = 0.1

        # Auto-recovery settings
        self.auto_recovery_enabled = True
        self.last_state = {
            "should_connect": False,
            "should_stream": False,
            "should_record_csv": False,
        }

        # Single instance control
        self.lock_file = "plc_streamer.lock"
        self.lock_fd = None

        # Events log for persistent logging
        self.events_log = []
        self.max_log_entries = 1000  # Maximum number of log entries to keep

        # Setup logging first
        self.setup_logging()

        # Load configuration from files
        self.load_configuration()
        self.load_variables()
        self.load_system_state()
        self.load_events_log()

        # Acquire instance lock and attempt auto-recovery
        if self.acquire_instance_lock():
            # Small delay to ensure previous instance has fully cleaned up
            time.sleep(1)
            self.log_event(
                "info",
                "Application started",
                "Application initialization completed successfully",
            )
            self.attempt_auto_recovery()
        else:
            raise RuntimeError("Another instance of the application is already running")

    def setup_logging(self):
        """Configure the logging system"""
@@ -108,8 +147,16 @@ class PLCDataStreamer:
            if os.path.exists(self.variables_file):
                with open(self.variables_file, "r") as f:
                    self.variables = json.load(f)

                # Load streaming configuration
                self.streaming_variables.clear()
                for var_name, var_config in self.variables.items():
                    # If streaming property doesn't exist, default to False for backward compatibility
                    if var_config.get("streaming", False):
                        self.streaming_variables.add(var_name)

                self.logger.info(
-                   f"Variables loaded from {self.variables_file}: {len(self.variables)} variables"
+                   f"Variables loaded from {self.variables_file}: {len(self.variables)} variables, {len(self.streaming_variables)} enabled for streaming"
                )
            else:
                self.logger.info(
@@ -118,9 +165,143 @@ class PLCDataStreamer:
        except Exception as e:
            self.logger.error(f"Error loading variables: {e}")

    def load_system_state(self):
        """Load system state from JSON file"""
        try:
            if os.path.exists(self.state_file):
                with open(self.state_file, "r") as f:
                    state_data = json.load(f)
                self.last_state = state_data.get("last_state", self.last_state)
                self.auto_recovery_enabled = state_data.get(
                    "auto_recovery_enabled", True
                )
                self.logger.info(f"System state loaded from {self.state_file}")
            else:
                self.logger.info("No system state file found, starting with defaults")
        except Exception as e:
            self.logger.error(f"Error loading system state: {e}")

    def save_system_state(self):
        """Save current system state to JSON file"""
        try:
            state_data = {
                "last_state": {
                    "should_connect": self.connected,
                    "should_stream": self.streaming,
                    "should_record_csv": self.csv_recording,
                },
                "auto_recovery_enabled": self.auto_recovery_enabled,
                "last_update": datetime.now().isoformat(),
            }

            with open(self.state_file, "w") as f:
                json.dump(state_data, f, indent=4)
            self.logger.debug("System state saved")
        except Exception as e:
            self.logger.error(f"Error saving system state: {e}")

    def attempt_auto_recovery(self):
        """Attempt to restore previous system state"""
        if not self.auto_recovery_enabled:
            self.logger.info("Auto-recovery disabled, skipping state restoration")
            return

        self.logger.info("Attempting auto-recovery of previous state...")

        # Try to restore connection
        if self.last_state.get("should_connect", False):
            self.logger.info("Attempting to restore PLC connection...")
            if self.connect_plc():
                self.logger.info("PLC connection restored successfully")

                # Try to restore streaming if connection was successful
                if self.last_state.get("should_stream", False):
                    self.logger.info("Attempting to restore streaming...")
                    if self.start_streaming():
                        self.logger.info("Streaming restored successfully")
                    else:
                        self.logger.warning("Failed to restore streaming")

                # Try to restore CSV recording if needed
                elif self.last_state.get("should_record_csv", False):
                    self.logger.info("Attempting to restore CSV recording...")
                    if self.start_csv_recording():
                        self.logger.info("CSV recording restored successfully")
                    else:
                        self.logger.warning("Failed to restore CSV recording")
            else:
                self.logger.warning("Failed to restore PLC connection")

    def acquire_instance_lock(self) -> bool:
        """Acquire lock to ensure single instance execution"""
        try:
            # Check if lock file exists
            if os.path.exists(self.lock_file):
                # Read PID from existing lock file
                with open(self.lock_file, "r") as f:
                    try:
                        old_pid = int(f.read().strip())

                        # Check if process is still running
                        if psutil.pid_exists(old_pid):
                            # Get process info to verify it's our application
                            try:
                                proc = psutil.Process(old_pid)
                                cmdline = " ".join(proc.cmdline())
                                if "main.py" in cmdline or "plc" in cmdline.lower():
                                    self.logger.error(
                                        f"Another instance is already running (PID: {old_pid})"
                                    )
                                    return False
                            except (psutil.NoSuchProcess, psutil.AccessDenied):
                                # Process doesn't exist or can't access, continue
                                pass

                        # Old process is dead, remove stale lock file
                        os.remove(self.lock_file)
                        self.logger.info("Removed stale lock file")

                    except (ValueError, IOError):
                        # Invalid lock file, remove it
                        os.remove(self.lock_file)
                        self.logger.info("Removed invalid lock file")

            # Create new lock file with current PID
            with open(self.lock_file, "w") as f:
                f.write(str(os.getpid()))

            # Register cleanup function
            atexit.register(self.release_instance_lock)

            self.logger.info(
                f"Instance lock acquired: {self.lock_file} (PID: {os.getpid()})"
            )
            return True

        except Exception as e:
            self.logger.error(f"Error acquiring instance lock: {e}")
            return False

    def release_instance_lock(self):
        """Release instance lock"""
        try:
            # Remove lock file
            if os.path.exists(self.lock_file):
                os.remove(self.lock_file)
                self.logger.info("Instance lock released")

        except Exception as e:
            self.logger.error(f"Error releasing instance lock: {e}")

    def save_variables(self):
        """Save variables configuration to JSON file"""
        try:
            # Update streaming state in variables before saving
            for var_name in self.variables:
                self.variables[var_name]["streaming"] = (
                    var_name in self.streaming_variables
                )

            with open(self.variables_file, "w") as f:
                json.dump(self.variables, f, indent=4)
            self.logger.info(f"Variables saved to {self.variables_file}")
@@ -129,36 +310,92 @@ class PLCDataStreamer:
    def update_plc_config(self, ip: str, rack: int, slot: int):
        """Update PLC configuration"""
        old_config = self.plc_config.copy()
        self.plc_config = {"ip": ip, "rack": rack, "slot": slot}
        self.save_configuration()
        self.logger.info(f"PLC configuration updated: {self.plc_config}")

        config_details = {"old_config": old_config, "new_config": self.plc_config}
        self.log_event(
            "info",
            "config_change",
            f"PLC configuration updated: {ip}:{rack}/{slot}",
            config_details,
        )

    def update_udp_config(self, host: str, port: int):
        """Update UDP configuration"""
        old_config = self.udp_config.copy()
        self.udp_config = {"host": host, "port": port}
        self.save_configuration()
        self.logger.info(f"UDP configuration updated: {self.udp_config}")

        config_details = {"old_config": old_config, "new_config": self.udp_config}
        self.log_event(
            "info",
            "config_change",
            f"UDP configuration updated: {host}:{port}",
            config_details,
        )

    def update_sampling_interval(self, interval: float):
        """Update sampling interval"""
        old_interval = self.sampling_interval
        self.sampling_interval = interval
        self.save_configuration()
        self.logger.info(f"Sampling interval updated: {interval}s")

        config_details = {"old_interval": old_interval, "new_interval": interval}
        self.log_event(
            "info",
            "config_change",
            f"Sampling interval updated: {interval}s",
            config_details,
        )

    def add_variable(self, name: str, db: int, offset: int, var_type: str):
        """Add a variable for polling"""
-       self.variables[name] = {"db": db, "offset": offset, "type": var_type}
+       self.variables[name] = {
+           "db": db,
+           "offset": offset,
+           "type": var_type,
+           "streaming": False,
+       }
        self.save_variables()
        self.logger.info(f"Variable added: {name} -> DB{db}.{offset} ({var_type})")

        variable_details = {
            "name": name,
            "db": db,
            "offset": offset,
            "type": var_type,
            "total_variables": len(self.variables),
        }
        self.log_event(
            "info",
            "variable_added",
            f"Variable added: {name} -> DB{db}.{offset} ({var_type})",
            variable_details,
        )
        self.create_new_csv_file_for_variable_modification()

    def remove_variable(self, name: str):
        """Remove a variable from polling"""
        if name in self.variables:
            var_config = self.variables[name].copy()
            del self.variables[name]
            # Also remove from streaming variables if present
            self.streaming_variables.discard(name)
            self.save_variables()
            self.logger.info(f"Variable removed: {name}")

            variable_details = {
                "name": name,
                "removed_config": var_config,
                "total_variables": len(self.variables),
            }
            self.log_event(
                "info",
                "variable_removed",
                f"Variable removed: {name}",
                variable_details,
            )
            self.create_new_csv_file_for_variable_modification()

    def toggle_streaming_variable(self, name: str, enabled: bool):
        """Enable or disable a variable for streaming"""
@@ -167,6 +404,10 @@ class PLCDataStreamer:
            self.streaming_variables.add(name)
        else:
            self.streaming_variables.discard(name)

        # Save changes to persist streaming configuration
        self.save_variables()

        self.logger.info(
            f"Variable {name} streaming: {'enabled' if enabled else 'disabled'}"
        )
@@ -177,22 +418,95 @@ class PLCDataStreamer:
        day_folder = now.strftime("%d-%m-%Y")
        return os.path.join("records", day_folder)

-   def get_csv_file_path(self) -> str:
+   def get_csv_file_path(self, use_modification_timestamp: bool = False) -> str:
        """Get the complete file path for current hour's CSV file"""
        now = datetime.now()
-       hour = now.strftime("%H")

+       if use_modification_timestamp:
+           # Create filename with hour_min_sec format for variable modifications
+           time_suffix = now.strftime("%H_%M_%S")
+           filename = f"_{time_suffix}.csv"
+       else:
+           # Standard hourly format
+           hour = now.strftime("%H")
+           filename = f"{hour}.csv"

        directory = self.get_csv_directory_path()
-       return os.path.join(directory, f"{hour}.csv")
+       return os.path.join(directory, filename)

    def ensure_csv_directory(self):
        """Create CSV directory structure if it doesn't exist"""
        directory = self.get_csv_directory_path()
        Path(directory).mkdir(parents=True, exist_ok=True)

    def create_new_csv_file_for_variable_modification(self):
        """Create a new CSV file when variables are modified during active recording"""
        if not self.csv_recording:
            return

        try:
            # Close current file if open
            if self.current_csv_file:
                self.current_csv_file.close()
                self.logger.info(
                    f"Closed previous CSV file due to variable modification"
                )

            # Create new file with modification timestamp
            self.ensure_csv_directory()
            csv_path = self.get_csv_file_path(use_modification_timestamp=True)

            self.current_csv_file = open(csv_path, "w", newline="", encoding="utf-8")
            self.current_csv_writer = csv.writer(self.current_csv_file)

            # Mark that we're using a modification file and set current hour
            self.using_modification_file = True
            self.current_hour = datetime.now().hour

            # Write headers with new variable configuration
            if self.variables:
                headers = ["timestamp"] + list(self.variables.keys())
                self.current_csv_writer.writerow(headers)
                self.current_csv_file.flush()
                self.csv_headers_written = True

            self.logger.info(
                f"New CSV file created after variable modification: {csv_path}"
            )
            self.log_event(
                "info",
                "csv_file_created",
                f"New CSV file created after variable modification: {os.path.basename(csv_path)}",
                {
                    "file_path": csv_path,
                    "variables_count": len(self.variables),
                    "reason": "variable_modification",
                },
            )

        except Exception as e:
            self.logger.error(
                f"Error creating new CSV file after variable modification: {e}"
            )
            self.log_event(
                "error",
                "csv_error",
                f"Failed to create new CSV file after variable modification: {str(e)}",
                {"error": str(e)},
            )

    def setup_csv_file(self):
        """Setup CSV file for the current hour"""
        current_hour = datetime.now().hour

        # If we're using a modification file and the hour hasn't changed, keep using it
        if (
            self.using_modification_file
            and self.current_hour == current_hour
            and self.current_csv_file is not None
        ):
            return

        # Check if we need to create a new file
        if self.current_hour != current_hour or self.current_csv_file is None:
            # Close previous file if open
@@ -210,6 +524,9 @@ class PLCDataStreamer:
            self.current_csv_writer = csv.writer(self.current_csv_file)
            self.current_hour = current_hour

            # Reset modification file flag when creating regular hourly file
            self.using_modification_file = False

            # Write headers if it's a new file
            if not file_exists and self.variables:
                headers = ["timestamp"] + list(self.variables.keys())
@@ -255,15 +572,32 @@ class PLCDataStreamer:
    def start_csv_recording(self) -> bool:
        """Start CSV recording"""
        if not self.connected:
            self.logger.error("PLC not connected")
            self.log_event(
                "error", "csv_error", "Cannot start CSV recording: PLC not connected"
            )
            return False

        if not self.variables:
            self.logger.error("No variables configured")
            self.log_event(
                "error",
                "csv_error",
                "Cannot start CSV recording: No variables configured",
            )
            return False

        self.csv_recording = True
        self.logger.info("CSV recording started")
        self.save_system_state()

        csv_details = {
            "variables_count": len(self.variables),
            "output_directory": self.get_csv_directory_path(),
        }
        self.log_event(
            "info",
            "csv_started",
            f"CSV recording started for {len(self.variables)} variables",
            csv_details,
        )
        return True

    def stop_csv_recording(self):
@@ -275,8 +609,10 @@ class PLCDataStreamer:
            self.current_csv_file = None
            self.current_csv_writer = None
            self.current_hour = None
            self.using_modification_file = False

        self.logger.info("CSV recording stopped")
        self.save_system_state()
        self.log_event("info", "csv_stopped", "CSV recording stopped")

    def connect_plc(self) -> bool:
        """Connect to S7-315 PLC"""
@@ -290,12 +626,35 @@ class PLCDataStreamer:
            )

            self.connected = True
            self.logger.info(f"Connected to PLC {self.plc_config['ip']}")
            self.save_system_state()

            connection_details = {
                "ip": self.plc_config["ip"],
                "rack": self.plc_config["rack"],
                "slot": self.plc_config["slot"],
            }
            self.log_event(
                "info",
                "plc_connection",
                f"Successfully connected to PLC {self.plc_config['ip']}",
                connection_details,
            )
            return True

        except Exception as e:
            self.connected = False
            self.logger.error(f"Error connecting to PLC: {e}")
            error_details = {
                "ip": self.plc_config["ip"],
                "rack": self.plc_config["rack"],
                "slot": self.plc_config["slot"],
                "error": str(e),
            }
            self.log_event(
                "error",
                "plc_connection_failed",
                f"Failed to connect to PLC {self.plc_config['ip']}: {str(e)}",
                error_details,
            )
            return False

    def disconnect_plc(self):
@@ -304,9 +663,20 @@ class PLCDataStreamer:
            if self.plc:
                self.plc.disconnect()
            self.connected = False
            self.logger.info("Disconnected from PLC")
            self.save_system_state()

            self.log_event(
                "info",
                "plc_disconnection",
                f"Disconnected from PLC {self.plc_config['ip']}",
            )
        except Exception as e:
            self.logger.error(f"Error disconnecting from PLC: {e}")
            self.log_event(
                "error",
                "plc_disconnection_error",
                f"Error disconnecting from PLC: {str(e)}",
                {"error": str(e)},
            )

    def read_variable(self, var_config: Dict[str, Any]) -> Any:
        """Read a specific variable from the PLC"""
@@ -388,6 +758,9 @@ class PLCDataStreamer:
            f"Starting streaming with interval of {self.sampling_interval}s"
        )

        consecutive_errors = 0
        max_consecutive_errors = 5

        while self.streaming:
            try:
                start_time = time.time()
@@ -396,6 +769,9 @@ class PLCDataStreamer:
                all_data = self.read_all_variables()

                if all_data:
                    # Reset error counter on successful read
                    consecutive_errors = 0

                    # Write to CSV (all variables)
                    self.write_csv_data(all_data)
@@ -411,6 +787,16 @@ class PLCDataStreamer:
                    self.logger.info(
                        f"[{timestamp}] CSV: {len(all_data)} vars, Streaming: {len(streaming_data)} vars"
                    )
                else:
                    consecutive_errors += 1
                    if consecutive_errors >= max_consecutive_errors:
                        self.log_event(
                            "error",
                            "streaming_error",
                            f"Multiple consecutive read failures ({consecutive_errors}). Stopping streaming.",
                            {"consecutive_errors": consecutive_errors},
                        )
                        break

                # Maintain sampling interval
                elapsed = time.time() - start_time
@@ -418,20 +804,47 @@ class PLCDataStreamer:
                    time.sleep(sleep_time)

            except Exception as e:
-               self.logger.error(f"Error in streaming loop: {e}")
-               break
+               consecutive_errors += 1
+               self.log_event(
+                   "error",
+                   "streaming_error",
+                   f"Error in streaming loop: {str(e)}",
+                   {"error": str(e), "consecutive_errors": consecutive_errors},
+               )
+
+               if consecutive_errors >= max_consecutive_errors:
+                   self.log_event(
+                       "error",
+                       "streaming_error",
+                       "Too many consecutive errors. Stopping streaming.",
+                       {"consecutive_errors": consecutive_errors},
+                   )
+                   break
+
+               time.sleep(1)  # Wait before retry

    def start_streaming(self) -> bool:
        """Start data streaming"""
        if not self.connected:
            self.logger.error("PLC not connected")
            self.log_event(
                "error", "streaming_error", "Cannot start streaming: PLC not connected"
            )
            return False

        if not self.variables:
            self.logger.error("No variables configured")
            self.log_event(
                "error",
                "streaming_error",
                "Cannot start streaming: No variables configured",
            )
            return False

        if not self.setup_udp_socket():
            self.log_event(
                "error",
                "streaming_error",
                "Cannot start streaming: UDP socket setup failed",
            )
            return False

        # Start CSV recording automatically
@@ -442,7 +855,21 @@ class PLCDataStreamer:
        self.stream_thread.daemon = True
        self.stream_thread.start()

        self.logger.info("Streaming and CSV recording started")
        self.save_system_state()

        streaming_details = {
            "variables_count": len(self.variables),
            "streaming_variables_count": len(self.streaming_variables),
            "sampling_interval": self.sampling_interval,
            "udp_host": self.udp_config["host"],
            "udp_port": self.udp_config["port"],
        }
        self.log_event(
            "info",
            "streaming_started",
            f"Streaming started with {len(self.streaming_variables)} variables",
            streaming_details,
        )
        return True

    def stop_streaming(self):
@@ -458,7 +885,10 @@ class PLCDataStreamer:
            self.udp_socket.close()
            self.udp_socket = None

        self.logger.info("Streaming and CSV recording stopped")
        self.save_system_state()
        self.log_event(
            "info", "streaming_stopped", "Data streaming and CSV recording stopped"
        )

    def get_status(self) -> Dict[str, Any]:
        """Get current system status"""
@@ -476,9 +906,84 @@ class PLCDataStreamer:
            ),
        }

    def log_event(
        self, level: str, event_type: str, message: str, details: Dict[str, Any] = None
    ):
        """Add an event to the persistent log"""
        try:
            event = {
                "timestamp": datetime.now().isoformat(),
                "level": level,  # info, warning, error
                "event_type": event_type,  # connection, disconnection, error, config_change, etc.
                "message": message,
                "details": details or {},
            }

            self.events_log.append(event)

            # Limit log size
            if len(self.events_log) > self.max_log_entries:
                self.events_log = self.events_log[-self.max_log_entries :]

            # Save to file
            self.save_events_log()

            # Also log to regular logger
            if level == "error":
                self.logger.error(f"[{event_type}] {message}")
            elif level == "warning":
                self.logger.warning(f"[{event_type}] {message}")
            else:
                self.logger.info(f"[{event_type}] {message}")

        except Exception as e:
            self.logger.error(f"Error adding event to log: {e}")

    def load_events_log(self):
        """Load events log from JSON file"""
        try:
            if os.path.exists(self.events_log_file):
                with open(self.events_log_file, "r", encoding="utf-8") as f:
                    data = json.load(f)
                    self.events_log = data.get("events", [])
                    # Limit log size on load
                    if len(self.events_log) > self.max_log_entries:
                        self.events_log = self.events_log[-self.max_log_entries :]
                self.logger.info(f"Events log loaded: {len(self.events_log)} entries")
            else:
                self.events_log = []
                self.logger.info("No events log file found, starting with empty log")
        except Exception as e:
            self.logger.error(f"Error loading events log: {e}")
            self.events_log = []

    def save_events_log(self):
        """Save events log to JSON file"""
        try:
            log_data = {
                "events": self.events_log,
                "last_updated": datetime.now().isoformat(),
                "total_entries": len(self.events_log),
            }
            with open(self.events_log_file, "w", encoding="utf-8") as f:
                json.dump(log_data, f, indent=2, ensure_ascii=False)
        except Exception as e:
            self.logger.error(f"Error saving events log: {e}")

    def get_recent_events(self, limit: int = 50) -> List[Dict[str, Any]]:
        """Get recent events from the log"""
        return self.events_log[-limit:] if self.events_log else []


-# Global streamer instance
-streamer = PLCDataStreamer()
+# Global streamer instance (will be initialized in main)
+streamer = None


def check_streamer_initialized():
    """Check if streamer is initialized, return error response if not"""
    if streamer is None:
        return jsonify({"error": "Application not initialized"}), 503
    return None


@app.route("/images/<filename>")
@@ -490,6 +995,8 @@ def serve_image(filename):
@app.route("/")
def index():
    """Main page"""
    if streamer is None:
        return "Application not initialized", 503
    return render_template(
        "index.html", status=streamer.get_status(), variables=streamer.variables
    )
@@ -498,6 +1005,10 @@ def index():
@app.route("/api/plc/config", methods=["POST"])
def update_plc_config():
    """Update PLC configuration"""
    error_response = check_streamer_initialized()
    if error_response:
        return error_response

    try:
        data = request.get_json()
        ip = data.get("ip", "10.1.33.11")
@@ -529,6 +1040,10 @@ def update_udp_config():
@app.route("/api/plc/connect", methods=["POST"])
def connect_plc():
    """Connect to PLC"""
    error_response = check_streamer_initialized()
    if error_response:
        return error_response

    if streamer.connect_plc():
        return jsonify({"success": True, "message": "Connected to PLC"})
    else:
@@ -598,6 +1113,10 @@ def get_streaming_variables():
@app.route("/api/streaming/start", methods=["POST"])
def start_streaming():
    """Start streaming"""
    error_response = check_streamer_initialized()
    if error_response:
        return error_response

    if streamer.start_streaming():
        return jsonify({"success": True, "message": "Streaming started"})
    else:
@@ -650,20 +1169,102 @@ def stop_csv_recording():
@app.route("/api/status")
def get_status():
    """Get current status"""
    if streamer is None:
        return jsonify({"error": "Application not initialized"}), 503
    return jsonify(streamer.get_status())


-if __name__ == "__main__":
-    # Create templates directory if it doesn't exist
-    os.makedirs("templates", exist_ok=True)
-
-    print("🚀 Starting Flask server for PLC S7-315 Streamer")
-    print("📊 Web interface available at: http://localhost:5050")
-    print("🔧 Configure your PLC and variables through the web interface")
-
-    try:
-        app.run(debug=True, host="0.0.0.0", port=5050)
-    except KeyboardInterrupt:
-        print("\n⏹️ Stopping server...")


@app.route("/api/events")
def get_events():
    """Get recent events from the application log"""
    error_response = check_streamer_initialized()
    if error_response:
        return error_response

    try:
        limit = request.args.get("limit", 50, type=int)
        limit = min(limit, 200)  # Maximum 200 events per request

        events = streamer.get_recent_events(limit)
        return jsonify(
            {
                "success": True,
                "events": events,
                "total_events": len(streamer.events_log),
                "showing": len(events),
            }
        )
    except Exception as e:
        return jsonify({"success": False, "error": str(e)}), 500


def graceful_shutdown():
    """Perform graceful shutdown"""
    print("\n⏹️ Performing graceful shutdown...")
    try:
        streamer.stop_streaming()
        streamer.disconnect_plc()
        streamer.release_instance_lock()
        print("✅ Shutdown completed successfully")
    except Exception as e:
        print(f"⚠️ Error during shutdown: {e}")


def main():
    """Main application entry point with error handling and recovery"""
    max_retries = 3
    retry_count = 0

    while retry_count < max_retries:
        try:
            # Create templates directory if it doesn't exist
            os.makedirs("templates", exist_ok=True)

            print("🚀 Starting Flask server for PLC S7-315 Streamer")
            print("📊 Web interface available at: http://localhost:5050")
            print("🔧 Configure your PLC and variables through the web interface")

            # Initialize streamer (this will handle instance locking and auto-recovery)
            global streamer

            # Start Flask application
            app.run(debug=False, host="0.0.0.0", port=5050, use_reloader=False)

            # If we reach here, the server stopped normally
            break

        except RuntimeError as e:
            if "Another instance" in str(e):
                print(f"❌ {e}")
                print("💡 Tip: Stop the other instance or wait for it to finish")
                sys.exit(1)
            else:
                print(f"⚠️ Runtime error: {e}")
                retry_count += 1

        except KeyboardInterrupt:
            print("\n⏸️ Received interrupt signal...")
            graceful_shutdown()
            break

        except Exception as e:
            print(f"💥 Unexpected error: {e}")
            retry_count += 1

        if retry_count < max_retries:
            print(f"🔄 Attempting restart ({retry_count}/{max_retries})...")
            time.sleep(2)  # Wait before retry
        else:
            print("❌ Maximum retries reached. Exiting...")
            graceful_shutdown()
            sys.exit(1)


if __name__ == "__main__":
    try:
        # Initialize streamer instance
        streamer = PLCDataStreamer()
        main()
    except Exception as e:
        print(f"💥 Critical error during initialization: {e}")
        sys.exit(1)

@@ -2,11 +2,25 @@
    "UR29_Brix": {
        "db": 2121,
        "offset": 18,
-       "type": "real"
+       "type": "real",
+       "streaming": true
    },
    "UR62_Brix": {
        "db": 2122,
        "offset": 18,
-       "type": "real"
+       "type": "real",
+       "streaming": true
    },
+   "UR29_Brix_Digital": {
+       "db": 2120,
+       "offset": 40,
+       "type": "real",
+       "streaming": true
+   },
+   "CTS306_Conditi": {
+       "db": 2124,
+       "offset": 18,
+       "type": "real",
+       "streaming": true
+   }
}
File diff suppressed because it is too large.
@ -1,2 +1,3 @@
|
|||
Flask==2.3.3
|
||||
python-snap7==1.3
|
||||
python-snap7==1.3
|
||||
psutil==5.9.5
|
|
@@ -0,0 +1,9 @@
{
    "last_state": {
        "should_connect": true,
        "should_stream": true,
        "should_record_csv": true
    },
    "auto_recovery_enabled": true,
    "last_update": "2025-07-17T15:42:38.054690"
}
@@ -250,6 +250,97 @@
        color: #1f2937;
    }

    .log-container {
        max-height: 400px;
        overflow-y: auto;
        background-color: #1f2937;
        border-radius: 8px;
        padding: 15px;
        border: 1px solid #374151;
    }

    .log-entry {
        display: flex;
        flex-direction: column;
        margin-bottom: 10px;
        padding: 8px;
        border-radius: 5px;
        font-family: 'Courier New', monospace;
        font-size: 12px;
        border-left: 3px solid transparent;
    }

    .log-entry.log-info {
        background-color: #374151;
        border-left-color: #6b7280;
        color: #e5e7eb;
    }

    .log-entry.log-warning {
        background-color: #451a03;
        border-left-color: #f59e0b;
        color: #fef3c7;
    }

    .log-entry.log-error {
        background-color: #450a0a;
        border-left-color: #ef4444;
        color: #fecaca;
    }

    .log-header {
        display: flex;
        justify-content: space-between;
        align-items: center;
        font-weight: bold;
        margin-bottom: 4px;
    }

    .log-timestamp {
        font-size: 10px;
        opacity: 0.7;
    }

    .log-type {
        background-color: rgba(255, 255, 255, 0.1);
        padding: 2px 6px;
        border-radius: 3px;
        font-size: 10px;
        text-transform: uppercase;
    }

    .log-message {
        margin-bottom: 4px;
        word-wrap: break-word;
    }

    .log-details {
        font-size: 10px;
        opacity: 0.8;
        background-color: rgba(0, 0, 0, 0.2);
        padding: 4px;
        border-radius: 3px;
        margin-top: 4px;
        white-space: pre-wrap;
    }

    .log-controls {
        display: flex;
        gap: 10px;
        margin-bottom: 15px;
        flex-wrap: wrap;
        align-items: center;
    }

    .log-stats {
        background-color: #f9fafb;
        border: 1px solid #e5e7eb;
        border-radius: 5px;
        padding: 8px 12px;
        font-size: 12px;
        color: #374151;
    }

    @media (max-width: 768px) {
        .header h1 {
            font-size: 2rem;
@@ -452,6 +543,41 @@
            <button class="btn" onclick="location.reload()">🔄 Refresh Status</button>
        </div>
    </div>

    <!-- Application Events Log -->
    <div class="card">
        <h2>📋 Application Events Log</h2>
        <div class="info-section">
            <p><strong>📝 Event Tracking:</strong> Connection events, configuration changes, errors and system
                status</p>
            <p><strong>💾 Persistent Storage:</strong> Events are saved to disk and persist between application
                restarts</p>
        </div>

        <div class="log-controls">
            <button class="btn" onclick="refreshEventLog()">🔄 Refresh Log</button>
            <button class="btn" onclick="clearLogView()">🧹 Clear View</button>
            <select id="log-limit" onchange="refreshEventLog()">
                <option value="25">Last 25 events</option>
                <option value="50" selected>Last 50 events</option>
                <option value="100">Last 100 events</option>
                <option value="200">Last 200 events</option>
            </select>
            <div class="log-stats" id="log-stats">
                Loading log statistics...
            </div>
        </div>

        <div class="log-container" id="events-log">
            <div class="log-entry log-info">
                <div class="log-header">
                    <span>📡 System</span>
                    <span class="log-timestamp">Loading...</span>
                </div>
                <div class="log-message">Loading application events...</div>
            </div>
        </div>
    </div>
</div>

<script>
@@ -695,12 +821,131 @@
        });
    });

    // Application Events Log Functions
    function formatTimestamp(isoString) {
        const date = new Date(isoString);
        return date.toLocaleString('es-ES', {
            year: 'numeric',
            month: '2-digit',
            day: '2-digit',
            hour: '2-digit',
            minute: '2-digit',
            second: '2-digit'
        });
    }

    function getEventIcon(eventType) {
        const icons = {
            'plc_connection': '🔗',
            'plc_connection_failed': '❌',
            'plc_disconnection': '🔌',
            'plc_disconnection_error': '⚠️',
            'streaming_started': '▶️',
            'streaming_stopped': '⏹️',
            'streaming_error': '❌',
            'csv_started': '💾',
            'csv_stopped': '📁',
            'csv_error': '❌',
            'config_change': '⚙️',
            'variable_added': '➕',
            'variable_removed': '➖',
            'application_started': '🚀'
        };
        return icons[eventType] || '📋';
    }

    function createLogEntry(event) {
        const logEntry = document.createElement('div');
        logEntry.className = `log-entry log-${event.level}`;

        const hasDetails = event.details && Object.keys(event.details).length > 0;

        logEntry.innerHTML = `
            <div class="log-header">
                <span>${getEventIcon(event.event_type)} ${event.event_type.replace(/_/g, ' ').toUpperCase()}</span>
                <span class="log-timestamp">${formatTimestamp(event.timestamp)}</span>
            </div>
            <div class="log-message">${event.message}</div>
            ${hasDetails ? `<div class="log-details">${JSON.stringify(event.details, null, 2)}</div>` : ''}
        `;

        return logEntry;
    }

    function refreshEventLog() {
        const limit = document.getElementById('log-limit').value;

        fetch(`/api/events?limit=${limit}`)
            .then(response => response.json())
            .then(data => {
                if (data.success) {
                    const logContainer = document.getElementById('events-log');
                    const logStats = document.getElementById('log-stats');

                    // Clear existing entries
                    logContainer.innerHTML = '';

                    // Update statistics
                    logStats.textContent = `Showing ${data.showing} of ${data.total_events} events`;

                    // Add events (reverse order to show newest first)
                    const events = data.events.reverse();

                    if (events.length === 0) {
                        logContainer.innerHTML = `
                            <div class="log-entry log-info">
                                <div class="log-header">
                                    <span>📋 System</span>
                                    <span class="log-timestamp">${new Date().toLocaleString('es-ES')}</span>
                                </div>
                                <div class="log-message">No events found</div>
                            </div>
                        `;
                    } else {
                        events.forEach(event => {
                            logContainer.appendChild(createLogEntry(event));
                        });
                    }

                    // Auto-scroll to top to show newest events
                    logContainer.scrollTop = 0;
                } else {
                    console.error('Error loading events:', data.error);
                    showMessage('Error loading events log', 'error');
                }
            })
            .catch(error => {
                console.error('Error fetching events:', error);
                showMessage('Error fetching events log', 'error');
            });
    }

    function clearLogView() {
        const logContainer = document.getElementById('events-log');
        logContainer.innerHTML = `
            <div class="log-entry log-info">
                <div class="log-header">
                    <span>🧹 System</span>
                    <span class="log-timestamp">${new Date().toLocaleString('es-ES')}</span>
                </div>
                <div class="log-message">Log view cleared. Click refresh to reload events.</div>
            </div>
        `;

        const logStats = document.getElementById('log-stats');
        logStats.textContent = 'Log view cleared';
    }

    // Update status every 5 seconds
    setInterval(updateStatus, 5000);

    // Update event log every 10 seconds
    setInterval(refreshEventLog, 10000);

    // Initial update
    updateStatus();
    loadStreamingStatus();
    refreshEventLog();
</script>
</body>