diff --git a/.doc/MemoriaDeEvolucion.md b/.doc/MemoriaDeEvolucion.md

2025-08-11 - Favicon and logos

- User request summary: Change the logo to `record.png` and also show it in the browser tab (favicon).

- Key changes:
  - `frontend/index.html`: updated the `icon` and `shortcut icon` links to force a reload.
  - `main.py`: `/favicon.ico` route that serves `static/icons/record.png` to cover the browser's automatic request.
  - `frontend/src/App.jsx`: replaced the visual logos with `/static/icons/record.png` in the header and navbar.

- Decisions/Notes:
  - Standard adopted for static assets in React:
    - `frontend/public/`: public files used without imports (favicon, robots.txt, public images). Served from the root: `/favicon.ico`, `/record.png`.
    - `frontend/src/assets/`: assets used by React components, imported with `import img from '...';` so that Vite applies hashing and optimization.
  - For the favicon, do not use files under `frontend/src/…`. It must live in `frontend/public` and be linked as `/favicon.ico`.
  - Added `?v=2` to bypass the browser cache.

# Project Evolution Memory

## PLC S7-315 Streamer & Logger

## Functional Description of the Application

This application is a web server (built with the Flask framework in Python) that acts as an intermediary to monitor a Siemens S7 PLC through the SNAP7 library and record its data in CSV format, intended to run on a low-resource PC connected to the PLC.
It must be as simple as possible so it can be packaged with PyInstaller, and it must be able to run completely offline.

#### Its key functions are:

Variables are defined in DataSets (groups) with different polling times. DataSets enable data exchange between machines, as they are JSON files that allow setting various polling times.
CSV files are also created with a suffix taken from the DataSet, making it easier to use the data externally.

* DataSets can be active or inactive, which determines whether they are saved to the corresponding CSV.
* Variables can be active or inactive for UDP streaming to PlotJuggler.

**Automatic CSV Recording**: When the PLC is connected, all datasets with variables are automatically activated and begin recording data to CSV files. This recording is continuous and independent of other operations.

**Real-Time UDP Transmission (PlotJuggler Streaming)**: Sends data in real time over UDP so that applications like PlotJuggler can receive and visualize it live. This is a manual control, separate from automatic CSV recording.
UDP streaming has its own update interval, but that interval only controls how often the available data is pushed out; it does not mean the PLC is read at that rate. UDP streaming reads exclusively from the cache of active variables. Variables not marked for streaming are not included in the UDP transmission.

**Live Web Monitoring**: The web interface displays current variable values and system status, updating in real time from the streaming cache.
Frontend monitoring also uses only the cache of active variables.

In summary, variables are read from the PLC only if they are active, and at the rate set in their DataSet. Each value read is stored in a memory cache. CSV recording is automatic while the PLC is connected, whereas UDP streaming and frontend monitoring have their own intervals and read values only from this cache, which helps protect the PLC from overload.

The application is designed to be persistent across restarts, restoring the previous state as much as possible. This includes reconnecting to the PLC, reactivating DataSets, and resuming operations if they were active before.
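The read path described above can be sketched as follows (a minimal illustration with hypothetical names, not the actual streamer code): each dataset thread polls only its own active variables at its own interval and publishes into a shared cache, which CSV recording, UDP streaming, and the web UI then read from.

```python
import threading
import time

class ReadCache:
    """Thread-safe snapshot cache shared by CSV, UDP and web readers."""
    def __init__(self):
        self._lock = threading.Lock()
        self._values = {}

    def update(self, dataset_id, data):
        with self._lock:
            self._values[dataset_id] = dict(data)

    def snapshot(self):
        with self._lock:
            return {k: dict(v) for k, v in self._values.items()}

def poll_dataset(read_plc, cache, dataset_id, variables, interval, stop_event):
    """One loop per dataset: only active variables touch the PLC,
    and only at this dataset's own sampling interval."""
    while not stop_event.is_set():
        data = {name: read_plc(name) for name in variables}
        cache.update(dataset_id, data)
        stop_event.wait(interval)
```

Consumers never call `read_plc` themselves; they only take `cache.snapshot()`, which is what keeps the PLC load bounded regardless of how many readers exist.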
## Modifications

### Latest Modifications (Current Session)

#### Unified React Dashboard (Status + Collapsible Config + Events Preview)
User prompt summary: "Make the main React page more intuitive, like the legacy UI: status like `index.html`/`status.js`, add the event log at the bottom, and make Config a collapsible tab on the same page."

Decisions:
- Create a new React Dashboard as the main route with: legacy-like status controls, a collapsible Config section using RJSF with tabs for `plc`, `datasets`, `plots`, and a recent Events table at the bottom.
- Keep existing dedicated pages (`/status`, `/events`, `/config`, `/plots`) for deeper navigation.

Implementation:
- `frontend/src/pages/Dashboard.jsx`:
  - Status bar with PLC Connect/Disconnect and UDP Start/Stop mirrored from legacy behavior; live updates via SSE `/api/stream/status`.
  - Collapsible Config editor using `@rjsf/core` + custom widgets, schema selector tabs, Import/Export, Save.
  - Recent Events preview table (last 50) with quick Refresh and a link to the full Events page.
- `frontend/src/services/api.js`: added helpers `connectPlc`, `disconnectPlc`, `startUdpStreaming`, `stopUdpStreaming`.
- `frontend/src/App.jsx`: route `/` now renders the new Dashboard.

Notes:
- Variables and comments kept in English as per project rules. No fallback code added.
- This makes the SPA landing page operationally useful, similar to the legacy UI, while maintaining modular pages.

#### React SPA Migration and RJSF Implementation
User prompt summary: "Migrate the legacy `index.html` to a React SPA using Bootstrap 5. Use `react-jsonschema-form` (RJSF) to create forms from JSON schemas, especially for `plc_config.json` in a modal. Also, create a dedicated page for real-time plots and use a table editor for arrays."

Decisions:
- We will replace the Jinja2-based frontend with a React Single-Page Application (SPA).
- We will use `react-bootstrap` for the UI components to align with Bootstrap 5.
- We will use `react-jsonschema-form` (`@rjsf/core`) to dynamically generate forms from our existing JSON schemas.
- Due to dependency conflicts with `@rjsf/bootstrap-4` and `@rjsf/react-bootstrap`, we will create custom form widgets using `react-bootstrap` components to ensure compatibility and full control over the UI.

Implementation:
- `frontend/package.json`: Installed `@rjsf/core` and `@rjsf/validator-ajv8`. Removed attempts to install theme packages.
- `frontend/src/pages/Plots.jsx`: Created a placeholder page for the real-time plotting feature.
- `frontend/src/components/PLCConfigModal.jsx`:
  - Refactored the modal to use `@rjsf/core` instead of a themed version.
  - Implemented custom widgets (`TextWidget`, `UpDownWidget`) using `react-bootstrap` components (`BSForm.Control`).
  - Created a `uiSchema` to map the custom widgets to the corresponding fields in the `plc_config.json` schema.
- This approach resolves the dependency issues and provides a flexible foundation for building the rest of the forms.

Notes:
- Installing the RJSF theme packages proved problematic due to version incompatibilities with `react-bootstrap`. The custom widget approach is more robust.
- The next steps are to create custom widgets for other form fields (such as booleans and selects) and to implement the table editor for arrays.

#### React SPA: initial routes and API consumption (Status, Events)
User prompt summary: "The React frontend server in main.py seems to work; migrate the remaining views incrementally (Status, Datasets/Variables, Plotting, Events, Config Editor)."

Decisions:
- Continue the incremental migration, building the SPA with a router and reusable API services.
- Prioritize low-risk pages: `Status` and `Events` as the first views.
Implementation:
- `frontend/src/App.jsx`: added a router with routes `/`, `/status`, `/events`, plus a navigation bar.
- `frontend/src/pages/Status.jsx`: page consuming `/api/status` with a refresh button.
- `frontend/src/pages/Events.jsx`: page consuming `/api/events?limit=100` with a responsive table.
- `frontend/src/services/api.js`: basic fetch client (`getStatus`, `getEvents`, `getJson`, `postJson`, `putJson`).
- `frontend/src/main.jsx`: wrapped the app in `BrowserRouter`.
- `frontend/package.json`: added the `react-router-dom` dependency.

Notes:
- Next steps: migrate `Datasets/Variables`, `Plotting`, and the `Config Editor` using the same API client.

#### RJSF + React-Bootstrap for the Config Editor and PLC modal; dedicated Plots page
User prompt summary: "Use Bootstrap in the legacy UI, migrate to React with RJSF (Bootstrap 5 theme) for JSON Schema-based forms; arrays as tables; a separate page for plots; a modal for `plc_config.json`."

Decisions:
- Adopt `react-jsonschema-form` with the Bootstrap 5 theme to edit `plc`, `datasets`, and `plots` from their schemas.
- Create a `Config` page and a dedicated PLC config modal using RJSF.
- Add an independent `Plots` page to migrate the real-time charts.

Implementation:
- `frontend/package.json`: added deps `@rjsf/core`, `@rjsf/bootstrap-5`, `@rjsf/validator-ajv8`, `react-bootstrap`.
- `frontend/src/services/api.js`: functions `listSchemas`, `getSchema`, `readConfig`, `writeConfig`.
- `frontend/src/pages/Config.jsx`: schema selector, JSON import/export, RJSF form with save.
- `frontend/src/components/PLCConfigModal.jsx`: modal with RJSF for `plc`.
- `frontend/src/pages/Plots.jsx`: scaffolding for the plots page.
- `frontend/src/App.jsx`: routes `/config` and `/plots`, button to open the PLC modal.

Notes:
- Next: tabular editor for arrays (dataset variables) with an editable table; then migrate real-time plotting.
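For context, the `listSchemas`/`readConfig`/`writeConfig` helpers talk to the backend's `/api/config/*` endpoints, which map config keys to JSON files on disk. A stdlib-only sketch of such a store (class and behavior are illustrative, not the real backend code):

```python
import json
from pathlib import Path

class ConfigStore:
    """Illustrative stand-in for the backend behind /api/config/*:
    each config key maps to a JSON file that can be listed, read and written."""
    def __init__(self, base_dir):
        self.files = {
            "plc": Path(base_dir) / "plc_config.json",
            "datasets": Path(base_dir) / "plc_datasets.json",
            "plots": Path(base_dir) / "plot_sessions.json",
        }

    def list_keys(self):
        """Keys the frontend shows as schema tabs."""
        return sorted(self.files)

    def read(self, key):
        """Return the file's JSON content, or an empty object if missing."""
        path = self.files[key]
        return json.loads(path.read_text()) if path.exists() else {}

    def write(self, key, data):
        """Persist the payload received from the RJSF form."""
        self.files[key].write_text(json.dumps(data, indent=2))
```

Export is then just `read()` serialized for download; import is a `write()` with the uploaded JSON.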
#### React + Vite + Bootstrap Migration Kickoff
User prompt summary: "Move everything to Bootstrap and React with Vite; start refactoring the project (main.py, index.html)."

Decisions:
- Keep the Flask backend and `/api/*` APIs intact.
- Add CORS to allow development against Vite (`localhost:5173`).
- Serve the React (Vite) build at `/app` and keep the Jinja UI at `/legacy` during the transition.

Implementation:
- `main.py`: added Flask-Cors; new routes `/app` and `/app/assets/*` serving `frontend/dist`; `/legacy` keeps the current template; `/` now serves the React SPA.
- `requirements.txt`: added `Flask-Cors`.
- `templates/index.html`: notice with a link to `/app` (new UI).
- `frontend/`: created the base Vite React project with Bootstrap (package.json, vite.config.js with a proxy to Flask, index.html, src/main.jsx, src/App.jsx).

Notes:
- Incremental migration: sections (Status, Datasets/Variables, Plotting, Events, Config Editor) will be moved to React in stages.

#### Config Editor UI Compact Layout and Schema Enhancements
User prompt summary: Improve the JSONForm editor interface so fields do not take the full screen width; configure limits; define field names and descriptions.

Decisions:
- Apply a compact CSS layout to both JSONForm and the fallback renderer, using auto-fit columns.
- Enrich the schemas with `title`, `description`, limits (`minimum`, `maximum`, `minLength`, `maxLength`, `pattern`), and formats where applicable.

Implementation:
- styles.css: added a grid on `#jsonform-form` and `.config-editor-form` + `.object-group` for 2-3 columns depending on width, with compact gaps.
- plc.schema.json: titles and descriptions for `ip`, `rack`, `slot`; `format: ipv4` for `ip`; `maximum` and `description` for `sampling_interval`.
- datasets.schema.json: titles/descriptions for `name` and `prefix` (with `pattern` and length limits), limits for `offset` (max 8191) and `db` (max 9999), titles for `area/type/bit/streaming`, titles for root and dataset properties; `sampling_interval` with `maximum`.
- plots.schema.json: titles and descriptions for `name`, `variables`, `time_window`, `y_min`, `y_max`, and the trigger fields.

Notes:
- Standard JSON Schema does not define layout; the column layout is handled in CSS/the JSONForm form. The fallback renderer now also looks compact without depending on JSONForm.

#### Config Editor UI: JSONForm as primary renderer (simple forms from JSON Schema)
User prompt summary: Prefer a simpler form-based editor for JSON content (avoid exposing too many controls); evaluate alternatives to JSONEditor; prefer JSONForm (jQuery + Bootstrap-like forms) over tree/code editors.

Decisions:
- Keep backend `/api/config/*` unchanged; swap the frontend renderer to JSONForm for a simpler UX.
- Load JSONForm via CDN with minimal deps (jQuery + Underscore). Avoid a full Bootstrap migration; styling remains with Pico.css.
- Keep JSONEditor as a fallback (tree/code) for advanced edits.

Implementation:
- templates/index.html: added CDN includes for jQuery, Underscore, and JSONForm; kept the JSONEditor include for fallback.
- static/js/config_editor.js: prefer JSONForm when available, generating forms directly from JSON Schema; wired Save to submit JSONForm and PUT the values; Import rebuilds the form with the imported JSON; Export uses the last known values, or JSONEditor when active.

Notes:
- JSONForm benefits: simple, guided forms; schema-driven validation; no React/bundler required.
- Licensing: MIT; OK for commercial/offline packaging.
- Handsontable not used due to licensing constraints; Tabulator remains an alternative for tabular arrays if needed.
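The limits added to the schemas (`minimum`, `maximum`, length bounds, `pattern`) can also be enforced server-side. A stdlib-only sketch of that kind of check; the backend actually relies on the `jsonschema` package, so this only illustrates the keywords involved:

```python
import re

def validate_field(value, rules):
    """Check one value against a small subset of JSON Schema keywords:
    minimum/maximum for numbers, minLength/maxLength/pattern for strings."""
    errors = []
    if isinstance(value, (int, float)) and not isinstance(value, bool):
        if "minimum" in rules and value < rules["minimum"]:
            errors.append(f"below minimum {rules['minimum']}")
        if "maximum" in rules and value > rules["maximum"]:
            errors.append(f"above maximum {rules['maximum']}")
    elif isinstance(value, str):
        if "minLength" in rules and len(value) < rules["minLength"]:
            errors.append("shorter than minLength")
        if "maxLength" in rules and len(value) > rules["maxLength"]:
            errors.append("longer than maxLength")
        # JSON Schema 'pattern' is unanchored, hence re.search, not fullmatch
        if "pattern" in rules and not re.search(rules["pattern"], value):
            errors.append("pattern mismatch")
    return errors

# Rules mirroring the documented limits: offset up to 8191, db up to 9999
OFFSET_RULES = {"minimum": 0, "maximum": 8191}
DB_RULES = {"minimum": 1, "maximum": 9999}
```

Keeping the same bounds in the schema files means JSONForm rejects bad input in the browser and the backend rejects it again on PUT.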
#### Schema-based Config Editor & API
Decision: Add a dynamic JSON Schema-based editor to manage `plc_config.json`, `plc_datasets.json`, and `plot_sessions.json`, with import/export.

Implementation:
- Backend: new `ConfigSchemaManager` class (`core/schema_manager.py`), `/api/config/*` endpoints to list schemas, read/write, and export.
- Schemas: `schemas/plc.schema.json`, `schemas/datasets.schema.json`, `schemas/plots.schema.json`.
- Frontend: "Config Editor" tab and `static/js/config_editor.js` script that generates forms from the schema (objects, arrays, enums, booleans with labels), plus import/export.
- Dependencies: `jsonschema` (optional validation on the backend).

UX notes: first working version. Agreed next step: evaluate JSON Schema UI libraries to improve the editor's visual design and ergonomics.

#### Real-Time Plotting System Implementation
**Decision**: Implement a complete real-time plotting system with boolean-variable triggers that uses the recording cache exclusively.

**Rationale**: Users needed real-time visualization of PLC variables without adding load to the system. The plotting system must be independent of the existing UDP streaming and use only cached data to stay efficient.
**Implementation**:

**Backend Architecture** (`core/plot_manager.py`):
- **PlotSession Class**: Handles individual plotting sessions with their own configuration
- **PlotManager Class**: Manages all sessions with thread safety
- **Trigger System**: Boolean variables that automatically restart the trace
- **Cache Integration**: Uses recording-cache data exclusively (no extra PLC load)

**Key Features**:
- **Boolean Trigger**: Bool variables can restart traces automatically
- **Multiple Sessions**: Multiple independent plot sessions running simultaneously
- **Performance Optimized**: Uses only the recording cache, no additional PLC reads
- **Flexible Configuration**: Time window, Y range, configurable trigger

**Frontend Implementation** (`static/js/plotting.js`):
- **Chart.js Integration**: Modern, responsive charts
- **Auto-update System**: Automatic refresh every 500 ms
- **Modal Interface**: Intuitive creation of new sessions
- **Real-time Controls**: Start/Stop/Pause/Clear per session

**API Endpoints** (`main.py`):
- **GET /api/plots**: Status of all sessions
- **POST /api/plots**: Create a new session
- **DELETE /api/plots/**: Delete a session
- **POST /api/plots//control**: Session control (start/stop/pause/clear)
- **GET /api/plots//data**: Data for Chart.js
- **GET /api/plots/variables**: Available variables (active datasets only)

**Tab System Integration**:
- **New Tab Interface**: Tab system to organize functionality
- **Plotting Tab**: Dedicated to the plotting system
- **Events Tab**: For system logs and events
- **Responsive Design**: Adapts to different screen sizes

**Technical Benefits**:
- **Zero PLC Load**: Adds no extra PLC reads
- **Cache Efficiency**: Reuses data from the recording system
- **Thread Safety**: Safe concurrent operations
- **Memory Management**: Bounded deques to avoid memory leaks

**User Experience**:
- **Intuitive Interface**: Easy plot creation via modal
- **Real-time Feedback**: Live statistics for each session
- **Visual Controls**: Clear buttons to control sessions
- **Error Handling**: Robust error handling and user feedback

**Trigger System Details**:
- **Boolean Variables Only**: Only BOOL variables can be triggers
- **Configurable Logic**: Trigger on True or False depending on configuration
- **Automatic Restart**: Clears data and restarts the trace when activated
- **State Tracking**: Keeps the trigger state to detect changes

**Integration with Existing System**:
- **DataStreamer Enhancement**: Automatic integration with the existing cache
- **No Breaking Changes**: Compatible with the existing system
- **Performance Neutral**: Does not affect main system performance
- **Event Logging**: Detailed logs of plotting operations

**Configuration Options**:
- **Time Window**: Configurable, 10-3600 seconds
- **Y-Axis Range**: Automatic or manual (min/max)
- **Variable Selection**: Only variables from active datasets
- **Trigger Configuration**: Bool variable + logic (True/False)

**Industrial Benefits**:
- **Process Monitoring**: Real-time visualization of critical variables
- **Trigger Analysis**: Analysis of specific events with automatic restart
- **Multiple Views**: Different perspectives on the same process
- **Historical Context**: Keeps temporal context with configurable windows

**Dependencies Added**:
- **Chart.js**: Charting library (CDN)
- **Flask-SocketIO**: For future WebSocket improvements
- **Date-fns**: Date handling for Chart.js

This implementation provides a powerful real-time monitoring tool without compromising system performance, preserving the existing architecture while adding significant functionality for industrial process analysis.
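The trigger-plus-bounded-buffer behavior described above can be sketched as follows (hypothetical names; the real `PlotSession` is more elaborate): a `deque` with `maxlen` caps memory use, and an edge on the boolean trigger that matches the configured logic clears the trace before new points are appended.

```python
from collections import deque

class PlotTrace:
    """Sketch of a plot session buffer: bounded history plus a boolean
    trigger that restarts the trace when it activates."""
    def __init__(self, max_points=1000, trigger_on=True):
        self.points = deque(maxlen=max_points)  # bounded: no memory leaks
        self.trigger_on = trigger_on            # restart on True or on False
        self._last_trigger = None               # state tracking for edge detection

    def feed(self, timestamp, value, trigger=None):
        if trigger is not None:
            # Restart the trace only on an edge matching the configured logic
            if trigger == self.trigger_on and self._last_trigger != self.trigger_on:
                self.points.clear()
            self._last_trigger = trigger
        self.points.append((timestamp, value))
```

A level that stays active does not keep clearing the trace; only the transition does, which is what makes repeated event captures possible.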
#### Critical Fix: CSV Recording vs UDP Streaming Separation and Thread Join Error Resolution
**Issue**: The system raised a critical `RuntimeError: cannot join current thread` when stopping streaming, and there was confusion between CSV recording (which must be automatic) and UDP streaming (which must be manual). Stopping UDP streaming also stopped CSV recording, violating the system design.

**Root Cause Analysis**:
1. **Thread Join Error**: In `dataset_streaming_loop` (line 552), the running thread itself called `self.stop_dataset_streaming()`, which tried to `thread.join()` on itself
2. **Incorrect Mixed Architecture**: The `start_streaming()` and `stop_streaming()` methods activated/deactivated whole datasets, affecting both CSV and UDP
3. **Mixed Concepts**: There was no real separation between automatic recording and manual streaming to PlotJuggler

**Solution**: Implemented a complete separation between CSV recording (automatic) and UDP streaming (manual), removing the architectural confusion and fixing the thread join error.
**Implementation**:

**New Control Architecture**:
- **CSV Recording**: Automatic while the PLC is connected, independent of UDP streaming
- **UDP Streaming**: Manual, for PlotJuggler only; does not affect CSV recording
- **Dataset Threads**: Handle both, but with independent flags (`csv_recording_enabled`, `udp_streaming_enabled`)

**Technical Changes in DataStreamer** (`core/streamer.py`):
- **New Control Flags**: independent `udp_streaming_enabled` and `csv_recording_enabled`
- **Separate Methods**:
  - `start_csv_recording()` / `stop_csv_recording()` - automatic recording control
  - `start_udp_streaming()` / `stop_udp_streaming()` - manual UDP streaming control
- **Thread Join Fix**: `thread != threading.current_thread()` check before joining
- **Improved Dataset Loop**: the loop no longer calls `stop_dataset_streaming()` internally

**Updated Dataset Loop Logic**:
```python
# 📝 CSV Recording: Always write if enabled (automatic)
if self.csv_recording_enabled:
    self.write_dataset_csv_data(dataset_id, all_data)

# 📡 UDP Streaming: Only if UDP streaming is enabled (manual)
if self.udp_streaming_enabled:
    # Send filtered data to PlotJuggler
    if streaming_data:
        self.send_to_plotjuggler(streaming_data)
```

**PLC Connection Changes** (`core/plc_data_streamer.py`):
- **Auto-start CSV Recording**: `connect_plc()` now starts CSV recording automatically
- **Full Disconnect**: `disconnect_plc()` stops both CSV recording and UDP streaming
- **Improved Logging**: clear messages about what is being started/stopped

**Updated Auto-Recovery** (`core/instance_manager.py`):
- **Separate Recovery**: restores CSV recording automatically, and UDP streaming only if it was active
- **Correct Order**: CSV recording first, then UDP streaming if needed

**New API Endpoints** (`main.py`):
- **CSV Recording**: `/api/csv/recording/start` and `/api/csv/recording/stop`
- **UDP Streaming**: `/api/udp/streaming/start` and `/api/udp/streaming/stop`
- **Legacy Compatibility**: the old endpoints keep working, but only affect UDP

**Frontend Updated**:
- **Streaming.js**: uses the new independent UDP endpoints
- **Status.js**: control buttons use the correct endpoints for UDP streaming
- **Visual Separation**: clear distinction between CSV recording and UDP streaming in the UI

**Benefits of the New System**:
- **Error Resolution**: the `RuntimeError: cannot join current thread` is completely eliminated
- **Correct Operation**: CSV recording continues automatically, independent of UDP streaming
- **Granular Control**: the user can control UDP streaming without affecting data recording
- **Robustness**: a more stable and predictable system for industrial operation
- **Conceptual Clarity**: clear separation between automatic recording and manual streaming

**Corrected Operation Flow**:
1. **Connect PLC** → automatically starts CSV recording for all datasets with variables
2. **CSV Recording** → always active while the PLC is connected, independent of other controls
3. **UDP Streaming** → independent manual control for sending data to PlotJuggler
4. **Disconnect PLC** → stops both CSV recording and UDP streaming

**Technical Fix Details**:
- **Thread Management**: removed the self-cleanup from `dataset_streaming_loop`
- **State Flags**: independent flags for each type of operation
- **Error Prevention**: current-thread check before join operations
- **Resource Management**: resources are closed correctly without affecting active threads

This change fixes the reported critical problem and establishes a solid architecture that correctly separates automatic functions from manual ones, in line with the system's original design.
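The core of the thread-join fix boils down to one guard: never `join()` the thread you are currently running on. A minimal sketch (function name is illustrative):

```python
import threading

def stop_worker(thread, stop_event, timeout=2.0):
    """Signal a worker to stop and join it, guarding against the
    'cannot join current thread' RuntimeError that occurs when the
    worker's own loop triggers the shutdown."""
    stop_event.set()
    if thread is not None and thread is not threading.current_thread():
        thread.join(timeout=timeout)
    # When called from the worker itself, we only set the flag and
    # let the loop fall through and exit on its own.
```

This is why the dataset loop no longer calls its own stop method: it just observes the flag and returns, while external callers get a proper join.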
#### Automatic Recording on PLC Connection and Interface Improvements
**Issue**: The application required manual activation of datasets for recording after connecting to the PLC, and the interface had several usability issues, including non-functional status buttons and redundant diagnostic functions.

**Solution**: Implemented automatic dataset activation upon PLC connection and streamlined the interface by removing unnecessary functions and clarifying the distinction between automatic CSV recording and manual UDP streaming.

**Implementation**:

**Automatic Recording System**:
- **PLC Connection Trigger**: When connecting to the PLC, all datasets with variables are automatically activated for recording
- **Immediate Data Collection**: Recording begins instantly without manual intervention
- **Smart Activation**: Only datasets with defined variables are activated, preventing empty dataset processing
- **Error Handling**: Graceful handling of activation failures with detailed logging
- **State Persistence**: Auto-activated datasets are saved in the system state for recovery

**Backend Changes** (`core/plc_data_streamer.py`):
- **Enhanced `connect_plc()` Method**: Now automatically activates datasets with variables
- **Activation Logging**: Detailed logging of which datasets were auto-activated
- **Error Recovery**: Individual dataset activation failures don't prevent others from starting
- **Event Logging**: Enhanced connection events include auto-activation statistics

**Interface Streamlining**:
- **Removed Redundant Functions**: Eliminated `diagnose-btn` and the `diagnoseConnection()` function
- **Removed Manual Refresh**: Eliminated `refresh-values-btn` and the `refreshVariableValues()` function
- **Automatic Monitoring**: Variable values are now automatically monitored when the PLC is connected
- **Live Display System**: Replaced manual refresh with an automatic live display fed from the cache

**Status Button Fix** (`static/js/status.js`):
- **Event Listener Issue**: Fixed `status-connect-btn` not working during streaming updates
- **Dynamic Button Handling**: Added event listeners in the `updateStatusFromStream()` function
- **Consistent Behavior**: All status buttons now work regardless of the update method
- **Robust Implementation**: Proper event listener management for all dynamic buttons

**Conceptual Separation**:
- **UDP Streaming**: Now clearly labeled "PlotJuggler UDP Streaming" - manual control for data visualization
- **CSV Recording**: Automatic and continuous while the PLC is connected - no manual intervention required
- **Live Display**: Optional real-time display of variable values in the web interface
- **Independent Operation**: CSV recording works independently of UDP streaming

**Interface Updates** (`templates/index.html`):
- **Section Renaming**: "Multi-Dataset Streaming Control" → "PlotJuggler UDP Streaming Control"
- **Status Bar Updates**: "Streaming" → "UDP Streaming" for clarity
- **Button Text Changes**: "Start All Active Datasets" → "Start UDP Streaming"
- **Information Panels**: Updated descriptions to clarify automatic vs manual operations
- **Variable Management**: Removed manual refresh and diagnose buttons, simplified workflow

**JavaScript Enhancements** (`static/js/variables.js`):
- **Auto-Start Live Display**: `autoStartLiveDisplay()` function replaces manual refresh
- **Streaming Indicator Updates**: Modified `updateStreamingIndicator()` for the new button structure
- **Function Cleanup**: Removed `refreshVariableValues()`, `diagnoseConnection()`, and related functions
- **Automatic Integration**: Live display starts automatically when the dataset changes and the PLC is connected

**New Workflow**:
1. **Connect PLC** → Automatically activates all datasets with variables and begins recording
2. **CSV Recording** → Always active when the PLC is connected (independent of UDP streaming)
3. **UDP Streaming** → Manual control only for PlotJuggler data visualization
4. **Live Display** → Optional real-time display of cached values in the web interface

**Benefits**:
- **Simplified Operation**: No need to remember to activate recording manually
- **Immediate Data Collection**: Recording starts as soon as the PLC connection is established
- **Clear Separation**: Distinct understanding of automatic recording vs manual streaming
- **Reduced Complexity**: Eliminated redundant diagnostic and refresh functions
- **Better UX**: More intuitive workflow with fewer manual steps
- **Robust Interface**: Fixed status buttons work consistently in all scenarios

**Technical Improvements**:
- **Event Listener Management**: Proper handling of dynamically created buttons
- **Automatic State Management**: The system automatically manages recording state
- **Error Resilience**: Individual failures don't prevent overall system operation
- **Performance Optimization**: Removed unnecessary manual refresh operations
- **Code Cleanup**: Eliminated redundant functions and simplified the codebase

This represents a significant improvement in user experience and system automation, making the application more suitable for production industrial environments where reliability and simplicity are paramount.

#### Instance Lock Verification and Cleanup System
**Issue**: When the application was terminated unexpectedly (crash, forced shutdown, etc.), the lock file `plc_streamer.lock` would remain in the filesystem with a stale PID, preventing new instances from starting even though no actual process was running. Additionally, the lock verification logic needed better user feedback.

**Solution**: Enhanced the existing `InstanceManager.acquire_instance_lock()` method with improved PID verification, stale lock cleanup, and clear user feedback during startup, ensuring robust instance management without creating duplicate verification systems.
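The stale-lock check can be sketched as follows: read the PID from the lock file and probe it with `os.kill(pid, 0)`, which sends no signal but fails if the process is gone. This is a POSIX-style illustration; the real `InstanceManager` implementation may differ, especially on Windows.

```python
import os

def is_lock_stale(lock_path):
    """Return True if the lock file is missing, unreadable, or holds a
    PID that no longer corresponds to a running process."""
    try:
        with open(lock_path) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return True  # missing or corrupt lock counts as stale
    try:
        os.kill(pid, 0)  # signal 0: existence probe only, nothing is sent
    except ProcessLookupError:
        return True      # no such process: the lock is stale
    except PermissionError:
        return False     # process exists but belongs to another user
    return False
```

On startup, a stale lock is deleted with a clear message before acquiring a fresh one; a live PID aborts startup instead.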
- -**Implementation**: - - -#### Dataset Management and Variables Integration -**Issue**: The Dataset Management section was occupying excessive space and was conceptually redundant with the Variables section, since changing the dataset automatically updates the variable list. The interface felt disconnected and inefficient. - -**Solution**: Integrated Dataset Management and Variables into a single, more compact section that provides better user experience and space efficiency. - -**Implementation**: - -**Unified Interface Design**: -- **Combined Header**: Dataset selector and management controls moved to the header of the variables section -- **Compact Status Bar**: Dataset information displayed in a horizontal status bar instead of separate sections -- **Integrated Workflow**: Dataset selection directly shows variables, eliminating redundant UI elements -- **Space Optimization**: Reduced vertical space usage by approximately 40% while maintaining all functionality - -**New Layout Structure**: -- **Header Integration**: Dataset selector, New/Delete buttons in the main section header -- **Status Bar**: Horizontal display of dataset name, prefix, sampling, variable counts, and activation controls -- **Variables Section**: Form and table for variable management appear only when dataset is selected -- **No Dataset Message**: Helpful placeholder when no dataset is selected - -**Technical Changes**: -- **HTML Structure**: Merged separate `
` sections into single integrated section -- **JavaScript Updates**: Modified `updateDatasetInfo()` to handle new DOM structure -- **CSS Adjustments**: Optimized layout for better space utilization -- **Modal Integration**: Maintained dataset creation modal functionality - -**User Experience Improvements**: -- **Intuitive Flow**: Select dataset → immediately see and manage variables -- **Reduced Cognitive Load**: Less visual separation between related concepts -- **Better Space Usage**: More content visible without scrolling -- **Consistent Interface**: Dataset and variables feel like a unified system - -**Visual Design**: -- **Header Controls**: Dataset selector and action buttons in single row -- **Status Information**: Compact horizontal layout with key dataset metrics -- **Responsive Design**: Maintains mobile compatibility with flex-wrap layouts -- **Professional Appearance**: Clean, industrial-grade interface suitable for production environments - -**Benefits**: -- **Space Efficiency**: 40% reduction in vertical space usage -- **Logical Flow**: Dataset selection naturally leads to variable management -- **Reduced Redundancy**: Eliminates duplicate information display -- **Better UX**: More intuitive workflow for industrial users -- **Maintained Functionality**: All original features preserved in more efficient layout - -This integration represents a significant UX improvement that makes the interface more professional and efficient while maintaining all the powerful multi-dataset functionality. - -#### Streaming Status and Variable Enable Issues Fix -**Issues**: Three critical problems were affecting the streaming functionality: -1. Stream status showing "📡 Streaming: Active (undefined vars)" due to property name mismatch -2. Auto-recovery not properly initializing streaming after application reload -3. Variable Enable flags not being respected - all variables exposed to plotJuggler regardless of individual streaming settings - -**Root Cause Analysis**: -1. 
**Undefined vars**: The frontend expected `streaming_variables_count` but the backend sent `total_streaming_variables`
2. **Auto-recovery**: UDP socket setup was missing during dataset restoration, causing streaming threads to start but data not to reach PlotJuggler
3. **Variable filtering**: The system only checked the `streaming_variables` list but ignored the individual variable `streaming: true/false` flags

**Solution Implementation**:

**Stream Status Fix**:
- Added a `streaming_variables_count` property to the backend status response for frontend compatibility
- Implemented dual-layer filtering: variables must be in the `streaming_variables` list AND have the `streaming: true` flag
- Updated status calculation to count only truly active streaming variables

**Auto-Recovery Enhancement**:
- Modified `attempt_auto_recovery()` to set up the UDP socket before activating datasets
- Ensures the complete streaming infrastructure is established during automatic restoration
- Proper error handling if UDP socket setup fails during recovery

**Variable Enable Filtering**:
- Enhanced `dataset_streaming_loop()` to filter variables using both criteria: presence in the streaming list AND the individual streaming flag
- Updated `toggle_variable_streaming()` to maintain consistency between list membership and individual flags
- Added a `sync_streaming_variables()` function to fix existing data inconsistencies
- Automatic synchronization on application startup ensures data integrity

**Technical Changes**:
- **Backend Status**: Now returns both `total_streaming_variables` and `streaming_variables_count` for compatibility
- **Streaming Filter**: Double verification before sending data to PlotJuggler
- **Data Consistency**: Automatic synchronization of streaming flags with streaming variables lists
- **Auto-Recovery**: UDP socket initialization included in the restoration process

**User Experience Impact**:
- Accurate variable count display in stream status
- Automatic streaming restoration after application restart
- Precise control over which variables are actually streamed to PlotJuggler
- Consistent behavior between UI settings and actual data transmission

### Previous Modifications

#### Frontend Table Update Bug Fix
**Issue**: When adding variables using the "➕ Add Variable" button, the variables were successfully added to the backend dataset but the variables table in the frontend did not refresh to show the newly added variable.

**Root Cause**: The `loadDatasetVariables()` function was empty, containing only placeholder comments. After a successful variable addition, the code called `loadDatasets()`, which updated the dataset information but never regenerated the HTML table content.

**Solution**: Implemented a complete `loadDatasetVariables()` function that:
- Dynamically regenerates the variables table HTML from current dataset data
- Properly formats memory areas (DB, MW, E.bit, A.bit, MB.bit, etc.)
- Restores streaming checkbox states from the dataset configuration
- Adds edit and remove buttons with proper event handlers

**Technical Changes**:
- **`loadDatasetVariables()`**: Now fully functional; rebuilds the table DOM from `currentDatasets[datasetId].variables`
- **`editVariable()`**: Updated to use local dataset data instead of API calls for better performance
- **Edit form submission**: Modified to use dataset-specific API endpoints (`/api/datasets/<dataset_id>/variables`)
- **Memory area formatting**: Consistent display format for all PLC address types

**User Experience Impact**: Variables now appear immediately in the table after a successful addition, providing instant visual feedback and eliminating confusion about whether the operation succeeded.

#### Multi-Dataset Architecture Implementation
**Decision**: Completely redesigned the application to support multiple independent datasets with separate CSV files, custom prefixes, and individual sampling intervals.
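To make the decision above concrete, a per-dataset record with a custom prefix can be mapped to hourly, day-foldered CSV paths. This is a minimal sketch; the `csv_path` helper and the dict fields are illustrative assumptions, not the actual implementation:

```python
from datetime import datetime
from pathlib import Path

# Hypothetical dataset record: unique ID, display name, CSV prefix,
# optional per-dataset sampling interval (falls back to the global one).
dataset = {
    "id": "temperature_sensors",
    "name": "Temperature Sensors",
    "prefix": "temp",
    "sampling_interval": 0.5,  # seconds
}

def csv_path(base_dir: str, prefix: str, now: datetime) -> Path:
    """Build the hourly CSV path following the prefix_hour.csv convention."""
    day_folder = now.strftime("%d-%m-%Y")  # e.g. 17-07-2025
    return Path(base_dir) / day_folder / f"{prefix}_{now.hour}.csv"

path = csv_path("records", dataset["prefix"], datetime(2025, 7, 17, 14, 30))
# e.g. records/17-07-2025/temp_14.csv
```

Each dataset thread would derive its own path this way, so datasets never contend for the same file handle.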

**Rationale**: Industrial monitoring often requires grouping variables logically (temperature sensors, pressure sensors, flow meters, etc.) with different sampling rates and separate data files for analysis. The original single-dataset approach was limiting for complex industrial scenarios where different process areas need different monitoring strategies.

**Implementation**:

**New Dataset Structure**:
- **Dataset Management**: Each dataset has a unique ID, descriptive name, CSV prefix, and optional custom sampling interval
- **Independent Variables**: Each dataset contains its own set of PLC variables with individual streaming configurations
- **Separate CSV Files**: Each dataset generates files with the format `prefix_hour.csv` (e.g., `temp_14.csv`, `pressure_14.csv`)
- **Individual Sampling**: Datasets can use the global sampling interval or define their own for specialized monitoring needs

**Core Architecture Changes**:
- Replaced the single variable collection with a `datasets` dictionary structure
- Each dataset runs in an independent thread with its own CSV writer and streaming logic
- Dataset activation/deactivation controls which data streams are active
- Current dataset selection for variable editing and management

**API Enhancements**:
- `GET/POST /api/datasets` - List and create datasets
- `DELETE /api/datasets/<dataset_id>` - Remove datasets
- `POST /api/datasets/<dataset_id>/activate|deactivate` - Control dataset streaming
- `POST /api/datasets/<dataset_id>/variables` - Add variables to a specific dataset
- `DELETE /api/datasets/<dataset_id>/variables/<variable_name>` - Remove variables from a dataset
- `POST /api/datasets/<dataset_id>/variables/<variable_name>/streaming` - Toggle streaming per variable
- `POST /api/datasets/current` - Set the current editing dataset

**User Interface Redesign**:
- **Dataset Selector**: Dropdown to choose the current dataset for editing
- **Dataset Creation Modal**: Professional form for creating new datasets with validation
- **Activate/Deactivate Controls**: Independent activation of datasets for
streaming
- **Dataset Information Panel**: Real-time display of dataset status, variables count, and streaming configuration
- **Per-Dataset Variable Management**: Variables are now managed within the selected dataset context

**Technical Benefits**:
- **Scalable Architecture**: Easy to add new monitoring groups without affecting existing ones
- **Independent Operations**: Each dataset can start/stop streaming independently
- **Flexible Sampling**: Critical processes can have faster sampling while others use standard rates
- **Organized Data Storage**: Separate CSV files make data analysis and processing more manageable
- **Thread Safety**: Each dataset operates in an isolated thread for optimal performance

**CSV File Organization**:
```
records/
├── 17-07-2025/
│   ├── temp_14.csv      # Temperature sensors data for 2PM
│   ├── pressure_14.csv  # Pressure sensors data for 2PM
│   ├── flow_14.csv      # Flow meters data for 2PM
│   └── digital_14.csv   # Digital I/O data for 2PM
```

**Configuration Structure**:
```json
{
  "datasets": {
    "temperature_sensors": {
      "name": "Temperature Sensors",
      "prefix": "temp",
      "variables": { ... },
      "streaming_variables": [...],
      "sampling_interval": 0.5,
      "enabled": true
    }
  },
  "active_datasets": ["temperature_sensors"],
  "current_dataset_id": "temperature_sensors"
}
```

**Migration Strategy**:
- Removed the legacy single-dataset system completely (no backward compatibility needed, as the application was not yet operational)
- Created a sample configuration with temperature and digital input datasets
- Updated all API endpoints to work with the dataset-based architecture

**Industrial Impact**:
- **Process Optimization**: Different sampling rates for different process criticality levels
- **Data Organization**: Logical grouping makes data analysis more efficient for process engineers
- **Resource Management**: Only necessary datasets consume system resources
- **Maintenance Efficiency**: Easier troubleshooting when issues are isolated to specific process areas
- **Scalability**: Can easily add new monitoring areas without a system redesign

**User Experience**:
- **Intuitive Workflow**: Select dataset → manage variables → activate streaming
- **Visual Feedback**: Clear status indicators for each dataset's state
- **Professional Interface**: Modal dialogs and organized controls suitable for industrial environments
- **Error Prevention**: Validation prevents common configuration mistakes

This represents a fundamental architectural improvement that transforms the application from a simple variable logger into a comprehensive multi-process monitoring platform suitable for complex industrial environments.

#### Expanded Data Types and Memory Areas Support
**Decision**: Extended the application to support MW (Memory Words), PEW (Process Input Words), PAW (Process Output Words), and individual bit addressing (E5.1, A3.7, M10.0), plus additional Siemens PLC data types.
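The bit addressing mentioned in the decision above (E5.1, A3.7, M10.0) boils down to splitting `<area><byte>.<bit>` and masking one bit out of a byte buffer. A minimal sketch, with an illustrative parser (not the app's actual code) and the same bit extraction `snap7.util.get_bool()` performs:

```python
import re

# Pattern for Siemens bit addresses: E (input bits), A (output bits), M (memory bits)
BIT_ADDRESS = re.compile(r"^(E|A|M)(\d+)\.([0-7])$")

def parse_bit_address(address: str):
    """Return (area, byte_index, bit_index) for an address like 'E5.1'."""
    match = BIT_ADDRESS.match(address.upper())
    if not match:
        raise ValueError(f"Unsupported bit address: {address}")
    area, byte_index, bit_index = match.groups()
    return area, int(byte_index), int(bit_index)

def get_bool(buffer: bytearray, byte_index: int, bit_index: int) -> bool:
    """Bit extraction from a byte buffer, as snap7.util.get_bool does."""
    return bool(buffer[byte_index] & (1 << bit_index))

# E5.1 = input byte 5, bit 1; the buffer stands in for an eb_read() result
area, byte_i, bit_i = parse_bit_address("E5.1")
buffer = bytearray(8)
buffer[5] = 0b00000010  # bit 1 of byte 5 is set
assert get_bool(buffer, byte_i, bit_i) is True
```

In the real client the buffer would come from the matching snap7 call per area (`eb_read`, `ab_read`, `mb_read`), with the byte index adjusted to the read offset.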
- -**Rationale**: The original implementation was limited to Data Blocks (DB) only, which restricted access to other important PLC memory areas commonly used in industrial applications. MW (Markers/Memory), PEW (Process Inputs), and PAW (Process Outputs) are essential for monitoring peripheral I/O and internal PLC memory, providing comprehensive visibility into the complete PLC system state. - -**Implementation**: - -**New Memory Areas Supported**: -- **MW/M (Memory Words/Markers)**: Internal PLC memory for flags, intermediate calculations, and program logic -- **PEW/PE (Process Input Words)**: Direct access to analog and digital input peripherals -- **PAW/PA (Process Output Words)**: Direct access to analog and digital output peripherals -- **DB (Data Blocks)**: Existing support maintained for backward compatibility - -**Additional Data Types Added**: -- **word**: 16-bit unsigned integer (0-65535) -- **byte**: 8-bit unsigned integer (0-255) -- **uint**: 16-bit unsigned integer (same as word, alternative naming) -- **udint**: 32-bit unsigned integer (0-4294967295) -- **sint**: 8-bit signed integer (-128 to 127) -- **usint**: 8-bit unsigned integer (same as byte, alternative naming) - -**Technical Architecture Changes**: - -**Enhanced Variable Configuration**: -- Added `area` field to variable configuration with validation for supported area types -- Modified `add_variable()` method signature to include area parameter: `add_variable(name, area, db, offset, var_type)` -- Backward compatibility maintained for existing DB-based configurations -- DB parameter now optional and only required for DB area type - -**Smart Area Detection**: -- Automatic area type validation with descriptive error messages -- Support for both short and long area names (e.g., "mw"/"m", "pew"/"pe", "paw"/"pa") -- Case-insensitive area specification for user convenience - -**snap7 Library Integration**: -- Utilized snap7's dedicated functions for optimal performance: - - `mb_read()` for 
Memory/Markers access - - `eb_read()` for Process Input access - - `ab_read()` for Process Output access - - `db_read()` for Data Block access (existing) -- Each area uses appropriate snap7 function rather than generic `read_area()` for better performance - -**Enhanced Variable Display**: -- Dynamic area description generation for logging and display -- Format examples: "MW100", "PEW256", "PAW64", "DB1.20" -- Clear identification of memory area in event logs and status displays - -**Individual Bit Addressing Support**: -- Added support for individual bit monitoring using Siemens standard notation -- **E (Process Input Bits)**: E5.1 = Input byte 5, bit 1 (sensors, limit switches) -- **A (Process Output Bits)**: A3.7 = Output byte 3, bit 7 (actuators, indicator lamps) -- **MB (Memory Bits)**: M10.0 = Memory byte 10, bit 0 (internal flags, state variables) -- Uses `snap7.util.get_bool()` for reliable bit extraction from byte arrays -- Web interface automatically restricts data type to BOOL for bit areas -- Dynamic bit position selector (0-7) appears only for bit areas -- Format examples: "E5.1", "A3.7", "M10.0" - -**Variable Editing Functionality**: -- Added comprehensive variable editing system with modal interface -- **Edit Button**: ✏️ Edit button added alongside Remove button in variables table -- **Modal Form**: Professional modal dialog with form validation and dynamic field visibility -- **API Support**: GET endpoint for fetching variable configuration, PUT endpoint for updates -- **Data Preservation**: Maintains streaming state when variables are modified -- **Name Changes**: Supports changing variable names with duplicate validation -- **Field Validation**: Dynamic UI that adapts to memory area type (DB fields, bit selectors) -- **Seamless UX**: Modal closes on successful update with automatic page refresh - -**API Enhancements**: -- Updated REST API validation to include all new data types and areas -- Comprehensive input validation with descriptive error 
messages -- Support for all area types in variable addition endpoint - -**Error Handling**: -- Robust validation for unsupported area types with clear error messages -- Data type validation against complete supported type list -- Graceful fallback for invalid configurations - -**Industrial Benefits**: -- **Comprehensive I/O Monitoring**: Direct access to process inputs and outputs without requiring DB mapping -- **Memory Diagnostics**: Ability to monitor internal PLC flags and calculation results -- **Flexible Data Types**: Support for various integer sizes optimizes memory usage and precision -- **Complete System Visibility**: Monitor entire PLC memory space including peripherals and internal state - - ---- - -#### CSV Recording Management and File Rotation System -**Issue**: The system lacked control over CSV storage location and had no mechanism to prevent disk space exhaustion from accumulated CSV files over time. Users needed visibility into storage usage and automated cleanup capabilities. - -**Solution**: Implemented comprehensive CSV recording management with configurable storage directory and intelligent file rotation system based on size, time, and space constraints. 
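The rotation criteria just described (total size in MB, maximum days or hours, with hours overriding days) can be sketched as a pure selection function; the function name and the `(path, size, mtime)` tuple format are illustrative assumptions, not the actual ConfigManager code:

```python
from typing import List, Optional, Tuple

# Each entry: (path, size_in_bytes, modified_unix_time)
FileInfo = Tuple[str, int, float]

def select_files_to_delete(files: List[FileInfo], now: float,
                           max_total_mb: float = 1000,
                           max_days: int = 30,
                           max_hours: Optional[int] = None) -> List[str]:
    """Pick CSV files to delete: first drop files older than the retention
    window (hours take priority over days), then drop the oldest remaining
    files until the combined size is under the limit."""
    max_age_s = max_hours * 3600 if max_hours is not None else max_days * 86400
    oldest_first = sorted(files, key=lambda f: f[2])

    doomed = [f for f in oldest_first if now - f[2] > max_age_s]
    kept = [f for f in oldest_first if now - f[2] <= max_age_s]

    total = sum(f[1] for f in kept)
    limit = max_total_mb * 1024 * 1024
    for f in kept:  # oldest first
        if total <= limit:
            break
        doomed.append(f)
        total -= f[1]
    return [f[0] for f in doomed]
```

Keeping the selection separate from the actual `os.remove()` calls makes the policy easy to test without touching the filesystem.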
- -**Implementation**: - -**Configurable Storage Directory**: -- **Dynamic Path Configuration**: Users can specify custom directory for CSV file storage -- **Absolute Path Support**: Full path display and validation for storage location -- **Directory Auto-Creation**: System automatically creates directory structure as needed -- **Path Validation**: Ensures directory accessibility and write permissions - -**Intelligent File Rotation System**: -- **Multi-Criteria Cleanup**: Rotation based on total size (MB), maximum days, or maximum hours -- **Priority Logic**: Hours override days when both are specified for precise control -- **Automated Scheduling**: Configurable cleanup intervals (default 24 hours) -- **Manual Cleanup**: On-demand cleanup execution for immediate space management - -**Storage Monitoring and Analytics**: -- **Real-time Directory Statistics**: Display total files, combined size, oldest/newest file timestamps -- **Day-Folder Breakdown**: Individual statistics for each day's recording folder -- **Disk Space Integration**: Shows available space and estimated recording time remaining -- **Visual Progress Indicators**: Clear display of storage utilization and trends - -**Configuration Management**: -- **Persistent Settings**: All rotation settings stored in main configuration file -- **Validation Layer**: Input validation for size limits, time ranges, and directory paths -- **Hot Configuration**: Changes applied immediately without system restart -- **Backup-Friendly**: Configuration preserved during system migrations - -**Web Interface Integration**: -- **Dedicated Configuration Section**: Comprehensive CSV management panel in web interface -- **Real-time Updates**: Live display of current configuration and directory statistics -- **Interactive Forms**: User-friendly inputs with validation and helpful tooltips -- **Status Monitoring**: Visual indicators for cleanup status and disk space usage - -**Technical Architecture**: -- **ConfigManager 
Enhancement**: Extended with CSV-specific configuration methods -- **DataStreamer Integration**: Cleanup execution integrated with streaming lifecycle -- **Event Logging**: All cleanup activities logged with detailed statistics -- **Error Handling**: Graceful handling of file access errors and permission issues - -**User Experience Features**: -- **Information Panels**: Expandable sections showing detailed directory statistics -- **Manual Controls**: One-click manual cleanup with confirmation dialogs -- **Configuration Preview**: Real-time display of current settings and their effects -- **Progress Feedback**: Clear messages for successful operations and error conditions - -**Industrial Benefits**: -- **Continuous Operation**: Prevents disk space exhaustion in long-running industrial systems -- **Data Lifecycle Management**: Automated retention policies for regulatory compliance -- **Storage Optimization**: Intelligent cleanup preserves recent data while managing space -- **Operational Visibility**: Clear insight into data storage patterns and system health -- **Maintenance Automation**: Reduces manual intervention requirements in production environments - -**Default Configuration**: -- **Base Directory**: "records" (configurable) -- **Rotation Enabled**: True -- **Size Limit**: 1000 MB (1 GB) -- **Time Retention**: 30 days -- **Cleanup Interval**: 24 hours - -**API Endpoints**: -- **GET /api/csv/config**: Retrieve current CSV configuration and disk statistics -- **POST /api/csv/config**: Update CSV configuration parameters -- **POST /api/csv/cleanup**: Trigger manual cleanup operation -- **GET /api/csv/directory/info**: Get detailed directory statistics and file information - ---- - -#### Real-Time Variable Value Display System -**Need**: Users requested the ability to see current values of PLC variables in the web interface, especially when variables are being read for CSV recording. 
Since Flask doesn't natively support real-time streaming like WebSockets, a manual refresh approach was implemented.

**Solution**: Added a "Current Value" column to the variables table with a refresh button that reads current values from the PLC on demand, providing immediate feedback on variable states without continuous polling.

**Implementation**:

**Backend Enhancement**:
- **New API Endpoint**: `GET /api/datasets/<dataset_id>/variables/values`
- **PLC Value Reading**: Utilizes the existing `PLCClient.read_multiple_variables()` method
- **Value Formatting**: Smart formatting based on data type (REAL with 3 decimals, BOOL as TRUE/FALSE, integers as whole numbers)
- **Error Handling**: Graceful handling of PLC connection issues and read errors
- **Timestamp**: Includes the read timestamp for user reference

**Frontend Enhancements**:
- **Table Column**: Added a "Current Value" column to the variables table
- **Refresh Button**: Manual refresh button with loading state indication
- **Visual Feedback**: Color-coded values (green for successful reads, red for errors, gray for offline)
- **Auto-Refresh**: Automatic value refresh when switching between datasets
- **Timestamp Display**: Shows last refresh time for user awareness

**User Experience Features**:
- **Loading States**: Button shows "⏳ Reading..."
during PLC communication -- **Status Messages**: Clear feedback about read operations and PLC connection status -- **Error Indication**: Displays "ERROR", "PLC OFFLINE", or "COMM ERROR" as appropriate -- **Dataset Context**: Values are cleared when no dataset is selected -- **Connection Awareness**: Checks PLC connection before attempting reads - -**Technical Implementation**: -- **Frontend Function**: `refreshVariableValues()` handles the refresh operation -- **Value Cells**: Each variable has a uniquely identified cell for value display -- **Format Logic**: Server-side formatting ensures consistent display across data types -- **Event Integration**: Integrates with existing dataset management system -- **CSS Styling**: Monospace font for values, visual indicators for different states - -**Benefits**: -- **Immediate Feedback**: Users can verify PLC communication and variable values instantly -- **Debugging Aid**: Helps troubleshoot PLC configuration and connectivity issues -- **Process Monitoring**: Allows monitoring of critical process variables during configuration -- **No Continuous Polling**: Efficient manual refresh approach reduces network overhead -- **User Control**: Users decide when to refresh, maintaining performance -- **Clear Status**: Visual indicators provide clear information about system state - - ---- - -#### Optimized Value Display Using Streaming Cache -**Improvement**: Modified the variable value refresh system to use cached values from the streaming process instead of making new PLC reads, improving efficiency and consistency with CSV data. - -**Rationale**: The original implementation made direct PLC reads every time the user clicked refresh, which was inefficient and could show different values than those being written to CSV files. Since the streaming system already reads values continuously, it makes more sense to display those exact values. 
- -**Implementation**: - -**Backend Cache System**: -- **Value Cache**: `last_read_values{}` stores the most recent values read during streaming -- **Timestamp Cache**: `last_read_timestamps{}` records when values were last read -- **Error Cache**: `last_read_errors{}` maintains specific error information for each variable -- **Automatic Updates**: Cache is updated every streaming cycle with actual CSV data -- **Cache Management**: Values are cleared when datasets are deactivated or streaming stops - -**Smart Data Source Selection**: -- **Primary**: Uses cached values from streaming process (exactly what's in CSV) -- **Fallback**: Direct PLC read only when no cache available (streaming not active) -- **Source Indication**: Clear labeling of data source for user awareness -- **Consistency**: Ensures displayed values match CSV file contents - -**Enhanced User Interface**: -- **Source Indicators**: Visual icons showing data origin (📊 cache vs 🔗 direct) -- **Timestamp Accuracy**: Uses actual read timestamp from streaming process -- **Performance Indication**: Shows "from streaming cache" or "direct PLC read" -- **Cache Availability**: Automatically falls back to direct reads when needed - -**Operational Benefits**: -- **Reduced PLC Load**: Eliminates unnecessary duplicate reads from PLC -- **Data Consistency**: Shows exactly the same values being written to CSV -- **Better Performance**: No communication delays when displaying cached values -- **Network Efficiency**: Reduces PLC network traffic and potential timeouts -- **Real-time Accuracy**: Values reflect the actual streaming process state - -**User Experience Improvements**: -- **Instant Response**: Cached values display immediately without PLC communication -- **Source Transparency**: Users know whether they're seeing live or cached data -- **Streaming Awareness**: Interface clearly indicates when streaming is providing data -- **Fallback Reliability**: System still works when streaming is not active - 
**Technical Implementation**:
- **Cache Integration**: `DataStreamer.get_cached_dataset_values()` provides cached data access
- **Source Tracking**: Response includes source information (`cache` vs `plc_direct`)
- **Error Preservation**: Cached errors from the streaming process are preserved and displayed
- **Automatic Cleanup**: Cache is cleared when streaming stops or datasets are deactivated

---

## 📊 Dynamic Refresh Rate Control for Real-Time Charts

### User Request Summary:
The user wanted to select the refresh rate of the real-time charts instead of having a fixed value, with no impact on the PLC since values are always read from the cache.

### Implementation Details:

**UI Changes**:
- **Refresh Rate Input**: Changed from a dropdown to an editable number input field for custom millisecond values
- **Flexible Input**: Users can enter any value between 100ms and 60,000ms with automatic validation
- **Dual Location Support**: Input field appears both in the main tab and in sub-tabs for consistency
- **Visual Integration**: Uses a clock emoji (⏱️) as label, an "ms" unit indicator, and compact styling matching existing controls
- **User Experience**: Enter key support for immediate application, auto-clamping to valid ranges

**Technical Implementation**:
- **Dynamic Configuration**: Modified `createStreamingChartConfig()` to use a dynamic refresh rate instead of a hardcoded 1000ms
- **Session Tracking**: Added a `refreshRates` Map to track individual session refresh rates
- **Real-time Updates**: The `updateRefreshRate()` function dynamically changes chart refresh without recreation
- **Interval Management**: Properly handles both ChartJS streaming intervals and manual fallback intervals
- **Synchronization**: Both selectors (main/tab) stay synchronized when changed
- **Memory Management**: Refresh rates are properly cleaned up when sessions are removed

**Key Features**:
- **No PLC Impact**: Only affects the visualization refresh rate, not data
collection from the PLC cache
- **Instant Changes**: Refresh rate changes take effect immediately without chart recreation
- **Persistence**: Each plot session maintains its own independent refresh rate
- **Fallback Support**: Works with both streaming mode and manual refresh fallback
- **User Experience**: Intuitive controls integrated seamlessly with the existing UI

**Code Changes**:
- **plotting.js**: Added the refreshRates Map, the updateRefreshRate() function, and the dynamic config
- **tabs.js**: Added the refresh rate selector to sub-tab controls
- **styles.css**: Added styling for refresh-rate-control and refresh-rate-selector
- **Memory cleanup**: Ensured refresh rates are deleted when sessions are removed

**Benefits**:
- **Flexible Visualization**: Users can optimize the refresh rate based on their monitoring needs
- **Performance Control**: Slower refresh rates reduce CPU usage for long-term monitoring
- **Independent Operation**: Each chart can have a different refresh rate as needed
- **Cache-Based**: No additional load on the PLC communication system

---

### 📝 Update: Changed to Editable Input Field

**User Request**: Change the combo box to an editable field with the time in ms.

**Changes Made**:
- **Replaced dropdown**: The `select` dropdown was replaced with a numeric input field accepting millisecond values
- **Direct value entry**: Users can now enter exact millisecond values instead of preset options
- **Enhanced validation**: Auto-clamping to the valid range (100-60,000ms) with console warnings
- **Improved UX**: Enter key support for immediate application, "ms" unit label for clarity
- **Consistent styling**: Updated CSS classes from `.refresh-rate-selector` to `.refresh-rate-input`

**Technical Benefits**:
- **Precision Control**: Users can set exact refresh rates like 1500ms, 750ms, etc.
- **Wide Range**: Supports from 100ms (high-frequency) to 60s (long-term monitoring)
- **Input Validation**: Automatic range enforcement prevents invalid values
- **Real-time Feedback**: Immediate visual and console feedback for out-of-range values

---

### 🐛 Bug Fix: Refresh Rate Not Working

**Issue**: The refresh rate was not being applied correctly due to throttles and hardcoded intervals.

**Problems Found**:
1. **Fixed Throttle**: `onStreamingRefresh` had a fixed 800ms limit that prevented faster refresh rates
2. **Fixed Manual Interval**: Fallback mode used a fixed 400ms instead of the configured refresh rate
3. **Plugin Integration**: Streaming mode did not correctly restart the internal intervals

**Corrections Made**:
- **Dynamic Throttle**: Changed to use 80% of the configured refresh rate (minimum 100ms)
- **Dynamic Manual Refresh**: `startManualRefresh` now uses the refresh rate configured per session
- **Improved Streaming**: Better interval restart for the chartjs-streaming plugin
- **Enhanced Debugging**: Detailed logs to diagnose refresh rate problems

**Technical Implementation**:
```javascript
// Dynamic throttle based on the refresh rate
const minInterval = Math.max(refreshRate * 0.8, 100);

// Manual interval using the dynamic refresh rate
sessionData.manualInterval = setInterval(() => {
  this.onStreamingRefresh(sessionId, sessionData.chart);
}, refreshRate);

// Improved restart of the streaming plugin
streaming.intervalId = setInterval(() => {
  if (!chart.scales.x.realtime.pause && typeof chart.scales.x.realtime.onRefresh === 'function') {
    chart.scales.x.realtime.onRefresh(chart);
  }
  if (chart.scales.x.updateRealTimeData) {
    chart.scales.x.updateRealTimeData();
  }
  chart.update('quiet');
}, finalRefreshRateMs);
```

**Result**: The refresh rate now works correctly in both streaming and fallback modes, with detailed logs for monitoring.

---

## Automatic Reconnection with Exponential Backoff
**Date**: 2025-01-08
**Request**: "Implement automatic reconnection when communication with the PLC is lost. After losing communication, the client should close the connection and retry with exponential backoff: 1s, 2s, 4s, 8s, 16s, 32s, up to a maximum of 1 minute. It then keeps retrying every 1 minute."

**Problem**: The system had no automatic reconnection capability when communication with the PLC was lost. Timeout and connection-abort errors were logged, but no reconnection was attempted.

**Technical Implementation**:

### PLCClient Enhanced with Automatic Reconnection
```python
# New properties for reconnection management
self.reconnection_enabled = True
self.is_reconnecting = False
self.consecutive_failures = 0
self.max_backoff_seconds = 60  # Maximum 1 minute backoff
self.base_backoff_seconds = 1  # Start with 1 second

# Exponential backoff algorithm
def _calculate_backoff_delay(self) -> float:
    if self.consecutive_failures == 0:
        return 0
    power = self.consecutive_failures - 1
    delay = self.base_backoff_seconds * (2 ** power)
    return min(delay, self.max_backoff_seconds)

# Connection error detection
def _is_connection_error(self, error_str: str) -> bool:
    connection_error_indicators = [
        'connection timed out', 'connection abort', 'recv tcp',
        'send tcp', 'socket', 'timeout', 'connection refused',
        'connection reset', 'broken pipe', 'network is unreachable'
    ]
    error_lower = str(error_str).lower()
    return any(indicator in error_lower for indicator in connection_error_indicators)
```

### Automatic Reconnection Process
- **Error Detection**: Connection errors are detected automatically in `read_variable()`
- **Connection Close**: The connection is forcibly closed before each reconnection attempt
- **Background Thread**: Reconnection runs in a separate thread so it does not block operations
- **Backoff Sequence**: 1s → 2s → 4s → 8s → 16s → 32s → 60s (max) → then continues every 60s
- **Thread Safety**: Locks prevent multiple simultaneous reconnection attempts

### API Integration
```python
# New API endpoints
/api/plc/reconnection/status   # GET  - Detailed reconnection status
/api/plc/reconnection/enable   # POST - Enable automatic reconnection
/api/plc/reconnection/disable  # POST - Disable automatic reconnection

# Enhanced status response
{
    "plc_reconnection": {
        "enabled": true,
        "active": false,
        "consecutive_failures": 0,
        "next_delay_seconds": 0,
        "max_backoff_seconds": 60
    },
    "plc_connection_info": {
        "connected": true,
        "reconnection_enabled": true,
        "is_reconnecting": false,
        "consecutive_failures": 0
    }
}
```

### Frontend Integration
```javascript
// Dynamic status display with reconnection info
if (reconnectionInfo.active) {
    statusText = '🔌 PLC: Reconnecting...';
    statusClass = 'status-item status-reconnecting';
    reconnectionDetails = `🔄 Next attempt in ${nextDelay}s (failure #${failures})`;
}

// New CSS class with animation
.status-reconnecting {
    background: #ff9800;
    color: white;
    animation: pulse 2s infinite;
}
```

### Key Features Implemented
1. **Automatic Detection**: Identifies connection errors specific to the S7 protocol
2. **Exponential Backoff**: Sequence 1s, 2s, 4s, 8s, 16s, 32s, 60s with a 1-minute maximum
3. **Clean Reconnection**: Fully closes the connection before each attempt
4. **Visual Feedback**: Animated visual state during reconnection in the web interface
5. **API Control**: Endpoints to enable/disable and monitor reconnection
6. **Thread Safety**: Safe thread handling to avoid conflicts
7. **Graceful Shutdown**: Correct thread cleanup on disconnect

**Result**: The system now reconnects automatically when communication with the PLC is lost, using exponential backoff up to a 1-minute maximum, with real-time visual indication of the reconnection state.
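Tying the pieces above together, the retry loop around the backoff calculation can be sketched as follows; `connect`, `on_success`, and the injected `sleep` are illustrative assumptions, not the actual PLCClient internals:

```python
import time

def backoff_delay(consecutive_failures: int, base: float = 1.0, max_s: float = 60.0) -> float:
    """Delay before the next attempt: 1s, 2s, 4s, ... capped at 60s."""
    if consecutive_failures == 0:
        return 0.0
    return min(base * (2 ** (consecutive_failures - 1)), max_s)

def reconnection_loop(connect, on_success, max_attempts: int = 100, sleep=time.sleep):
    """Retry connect() with exponential backoff; on the first successful
    attempt, notify the registered callback and stop."""
    failures = 0
    for _ in range(max_attempts):
        if connect():
            on_success()  # e.g. resume the dataset streaming threads
            return True
        failures += 1
        sleep(backoff_delay(failures))
    return False
```

Injecting `sleep` keeps the loop testable without real delays, and the `on_success` hook is where the dataset auto-resume described in the next entry would plug in.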
### ⚡ Automatic Dataset Resume After Reconnection

**Problem**: Although reconnection worked, streaming/recording did not resume automatically after a successful reconnection.

**Solution**: Callback system plus tracking of active datasets:

```python
# Callback system in PLCClient
self.reconnection_callbacks = []  # List of callback functions

def _notify_reconnection_success(self):
    """Notify all registered callbacks about successful reconnection"""
    for callback in self.reconnection_callbacks:
        try:
            callback()
        except Exception as e:
            if self.logger:
                self.logger.error(f"Error in reconnection callback: {e}")

# DataStreamer registers for reconnection callbacks
self.plc_client.add_reconnection_callback(self._on_plc_reconnected)

# Track datasets before disconnection
def _track_disconnection(self):
    """Track which datasets were active before disconnection for auto-resume"""
    self.datasets_before_disconnection = set()
    for dataset_id in self.config_manager.active_datasets:
        if (dataset_id in self.dataset_threads and
                self.dataset_threads[dataset_id].is_alive()):
            self.datasets_before_disconnection.add(dataset_id)

# Resume datasets after reconnection
def _on_plc_reconnected(self):
    """Callback method called when PLC reconnects successfully"""
    datasets_to_resume = self.datasets_before_disconnection.copy()
    resumed_count = 0

    for dataset_id in datasets_to_resume:
        if (dataset_id in self.config_manager.datasets and
                dataset_id in self.config_manager.active_datasets):

            thread_exists = dataset_id in self.dataset_threads
            thread_alive = thread_exists and self.dataset_threads[dataset_id].is_alive()
            if not thread_exists or not thread_alive:
                self.start_dataset_streaming(dataset_id)
                resumed_count += 1
```

### 🔄 Complete Reconnection Flow

1. **Connection Lost** → Error detected in `read_variable()`
2. **Track Active Datasets** → Active datasets are saved before the connection is lost
3. **Dataset Loops End** → The while loops exit because `is_connected()` returns `False`
4. **Background Reconnection** → The exponential-backoff process starts
5. **Successful Reconnection** → The connection is re-established and callbacks are notified
6. **Auto-Resume Streaming** → Dataset loops that were active are restarted automatically
7. **CSV Recording Resumed** → Recording resumes automatically
8. **Visual Update** → The interface shows the reconnected state

**Result**: The system now reconnects AND automatically resumes streaming/recording for all datasets that were active before the disconnection, providing full service continuity.

## 🚨 Critical Bug Fix: Dead Thread Cleanup

### Problem Detected

During testing it was discovered that, although tracking and auto-resume executed correctly (visible in the logs), **the new threads produced no data**. The log showed:

- ✅ Tracking working: "Tracked 1 active datasets for auto-resume: ['DAR']"
- ✅ Auto-resume executing: "Successfully resumed streaming for dataset 'DAR'"
- ❌ No data afterwards: no more "Dataset 'DAR': CSV: 6 vars, UDP: 0 vars" log lines

### Root Cause

Bug in `start_dataset_streaming()` in `core/streamer.py`:

```python
# PROBLEMATIC CODE - BEFORE
if dataset_id in self.dataset_threads:
    return True  # Already running ← ❌ BUG!
```

**The problem**: When the PLC disconnected, the thread died but **remained in the dictionary**. On auto-resume, the method saw the existing entry and returned `True` without checking whether the thread was actually alive, so **the new thread was never created**.

### Implemented Solution

```python
# CORRECTED CODE - AFTER
if dataset_id in self.dataset_threads:
    existing_thread = self.dataset_threads[dataset_id]
    if existing_thread.is_alive():
        return True  # Already running and alive
    else:
        # Clean up dead thread
        del self.dataset_threads[dataset_id]

# Continue to create the new thread...
```
### Key Changes

- **Liveness check**: Now checks `thread.is_alive()` before returning
- **Automatic cleanup**: Dead threads are removed from the dictionary automatically
- **Improved logging**: Dead-thread cleanup is logged for debugging

### Impact

This fix completely resolves the situation where:

- Reconnection worked ✅
- Tracking worked ✅
- Auto-resume executed ✅
- But **data did not resume** ❌

**Now**: Fully functional system, with automatic reconnection AND data auto-resume working at 100%.
\ No newline at end of file
diff --git a/application_events.json b/application_events.json index d0651e4..2cafa99 100644 --- a/application_events.json +++ b/application_events.json @@ -9259,8 +9259,92 @@ "event_type": "application_started", "message": "Application initialization completed successfully", "details": {} + }, + { + "timestamp": "2025-08-12T09:43:30.914932", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T09:44:48.333195", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T10:39:12.071678", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T11:47:27.789149", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T12:29:32.566924", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T14:30:28.890776", + "level": "info", + "event_type": "application_started", + "message":
"Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T14:35:07.292689", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T14:40:08.698091", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T14:44:03.411647", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T14:50:27.446910", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T15:00:13.141898", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} + }, + { + "timestamp": "2025-08-12T15:06:11.269817", + "level": "info", + "event_type": "application_started", + "message": "Application initialization completed successfully", + "details": {} } ], - "last_updated": "2025-08-12T09:13:36.619106", - "total_entries": 862 + "last_updated": "2025-08-12T15:06:11.269817", + "total_entries": 874 } \ No newline at end of file diff --git a/config/schema/datasets.schema.json b/config/schema/datasets.schema.json index 1629d96..ca67c97 100644 --- a/config/schema/datasets.schema.json +++ b/config/schema/datasets.schema.json @@ -2,7 +2,7 @@ "$schema": "https://json-schema.org/draft/2020-12/schema", "$id": "datasets.schema.json", "title": "Datasets Configuration", - "description": "Esquema para editar plc_datasets.json (múltiples datasets y variables)", + "description": "Schema to edit plc_datasets.json (multiple datasets and variables)", "type": "object", "additionalProperties": false, "properties": { @@ 
-15,14 +15,14 @@ "name": { "type": "string", "title": "Dataset Name", - "description": "Nombre legible del dataset", + "description": "Human-readable name of the dataset", "minLength": 1, "maxLength": 60 }, "prefix": { "type": "string", "title": "CSV Prefix", - "description": "Prefijo para archivos CSV", + "description": "Prefix for CSV files", "pattern": "^[a-zA-Z0-9_-]+$", "minLength": 1, "maxLength": 20 @@ -103,7 +103,7 @@ }, "streaming_variables": { "type": "array", - "title": "Streaming Variables", + "title": "Streaming variables", "items": { "type": "string" }, @@ -114,8 +114,8 @@ "number", "null" ], - "title": "Sampling Interval (s)", - "description": "Vacío para usar el intervalo global", + "title": "Sampling interval (s)", + "description": "Leave empty to use the global interval", "minimum": 0.01, "maximum": 10 }, diff --git a/config/schema/plc.schema.json b/config/schema/plc.schema.json index 4291732..8b44514 100644 --- a/config/schema/plc.schema.json +++ b/config/schema/plc.schema.json @@ -2,10 +2,12 @@ "$id": "plc.schema.json", "$schema": "https://json-schema.org/draft/2020-12/schema", "additionalProperties": false, - "description": "Esquema para editar plc_config.json", + "dependencies": {}, + "description": "Schema to edit plc_config.json", "properties": { "csv_config": { "additionalProperties": false, + "dependencies": {}, "properties": { "cleanup_interval_hours": { "default": 24, @@ -15,80 +17,65 @@ }, "last_cleanup": { "title": "Last Cleanup", - "type": [ - "string", - "null" - ] + "type": "string" }, "max_days": { "default": 30, "minimum": 1, "title": "Max Days", - "type": [ - "integer", - "null" - ] + "type": "integer" }, "max_hours": { "default": null, "minimum": 1, "title": "Max Hours", - "type": [ - "integer", - "null" - ] + "type": "integer" }, "max_size_mb": { "default": 1000, "minimum": 1, "title": "Max Size (MB)", - "type": [ - "integer", - "null" - ] + "type": "integer" }, "records_directory": { "default": "records", + "description": 
"Directory to save *.csv files", "title": "Records Directory", "type": "string" }, "rotation_enabled": { "default": true, - "enum": [ - true, - false - ], "options": { "enum_titles": [ "Activate", "Deactivate" ] }, - "title": "Rotation", + "title": "Rotation Active", "type": "boolean" } }, "required": [ + "cleanup_interval_hours", "records_directory", - "rotation_enabled", - "cleanup_interval_hours" + "rotation_enabled" ], "title": "CSV Recording", "type": "object" }, "plc_config": { "additionalProperties": false, + "dependencies": {}, "properties": { "ip": { - "description": "Dirección IP del PLC (S7-31x)", - "format": "ipv4", + "description": "IP of PLC (S7-31x)", "pattern": "^.+$", "title": "PLC IP", "type": "string" }, "rack": { "default": 0, - "description": "Número de rack (0-7)", + "description": "Rack of PLC", "maximum": 7, "minimum": 0, "title": "Rack", @@ -96,7 +83,7 @@ }, "slot": { "default": 2, - "description": "Número de slot (generalmente 2)", + "description": "Normally 2", "maximum": 31, "minimum": 0, "title": "Slot", @@ -113,7 +100,7 @@ }, "sampling_interval": { "default": 0.1, - "description": "Intervalo global de muestreo en segundos", + "description": "interval sampling in seconds", "maximum": 10, "minimum": 0.01, "title": "Sampling Interval (s)", @@ -121,11 +108,13 @@ }, "udp_config": { "additionalProperties": false, + "dependencies": {}, "properties": { "host": { "default": "127.0.0.1", + "description": "Normally this is 127.0.0.1", "pattern": "^.+$", - "title": "UDP Host", + "title": "UDP Host IP", "type": "string" }, "port": { @@ -144,10 +133,10 @@ } }, "required": [ + "csv_config", "plc_config", - "udp_config", "sampling_interval", - "csv_config" + "udp_config" ], "title": "PLC & UDP Configuration", "type": "object" diff --git a/config/schema/plots.schema.json b/config/schema/plots.schema.json index 35a8c57..996727d 100644 --- a/config/schema/plots.schema.json +++ b/config/schema/plots.schema.json @@ -2,7 +2,7 @@ "$schema": 
"https://json-schema.org/draft/2020-12/schema", "$id": "plots.schema.json", "title": "Plot Sessions", - "description": "Esquema para editar plot_sessions.json (sesiones de gráfica)", + "description": "Schema to edit plot_sessions.json (plot sessions)", "type": "object", "additionalProperties": false, "properties": { @@ -14,12 +14,12 @@ "name": { "type": "string", "title": "Plot Name", - "description": "Nombre de la sesión de gráfica" + "description": "Human-readable name of the plot session" }, "variables": { "type": "array", "title": "Variables", - "description": "Variables a graficar", + "description": "Variables to be plotted", "items": { "type": "string" }, @@ -27,8 +27,8 @@ }, "time_window": { "type": "integer", - "title": "Time Window (s)", - "description": "Ventana temporal en segundos", + "title": "Time window (s)", + "description": "Time window in seconds", "minimum": 5, "maximum": 3600, "default": 60 @@ -39,7 +39,7 @@ "null" ], "title": "Y Min", - "description": "Vacío para auto" + "description": "Leave empty for auto" }, "y_max": { "type": [ @@ -47,7 +47,7 @@ "null" ], "title": "Y Max", - "description": "Vacío para auto" + "description": "Leave empty for auto" }, "trigger_variable": { "type": [ diff --git a/config/schema/ui/datasets.uischema.json b/config/schema/ui/datasets.uischema.json index b4fd928..20e3cfb 100644 --- a/config/schema/ui/datasets.uischema.json +++ b/config/schema/ui/datasets.uischema.json @@ -9,31 +9,31 @@ "ui:placeholder": "temp" }, "sampling_interval": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "enabled": { - "ui:widget": "CheckboxWidget" + "ui:widget": "checkbox" }, "variables": { "ui:description": "Variables inside this dataset", "items": { "area": { - "ui:widget": "SelectWidget" + "ui:widget": "select" }, "db": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "offset": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "bit": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "type": { - 
"ui:widget": "SelectWidget" + "ui:widget": "select" }, "streaming": { - "ui:widget": "CheckboxWidget" + "ui:widget": "checkbox" } } } diff --git a/config/schema/ui/plc.uischema.json b/config/schema/ui/plc.uischema.json index 9bbf5d8..867d27b 100644 --- a/config/schema/ui/plc.uischema.json +++ b/config/schema/ui/plc.uischema.json @@ -1,22 +1,23 @@ { "csv_config": { "cleanup_interval_hours": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, + "last_cleanup": {}, "max_days": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "max_hours": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "max_size_mb": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "records_directory": { "ui:placeholder": "records" }, "rotation_enabled": { - "ui:widget": "CheckboxWidget" + "ui:widget": "checkbox" }, "ui:layout": [ [ @@ -30,31 +31,126 @@ }, { "name": "max_days", - "width": 3 + "width": 2 + }, + { + "name": "max_hours", + "width": 2 + }, + { + "name": "max_size_mb", + "width": 2 + } + ], + [ + { + "name": "records_directory", + "width": 10 + }, + { + "name": "rotation_enabled", + "width": 2 } ] + ], + "ui:order": [ + "cleanup_interval_hours", + "last_cleanup", + "max_days", + "max_hours", + "max_size_mb", + "records_directory", + "rotation_enabled" ] }, "plc_config": { "ip": { - "ui:placeholder": "192.168.1.100" + "ui:placeholder": "192.168.1.100", + "ui:column": 6 }, "rack": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown", + "ui:column": 3 }, "slot": { - "ui:widget": "UpDownWidget" - } + "ui:widget": "updown", + "ui:column": 3 + }, + "ui:layout": [ + [ + { + "name": "ip", + "width": 6 + }, + { + "name": "rack", + "width": 3 + }, + { + "name": "slot", + "width": 3 + } + ] + ], + "ui:order": [ + "ip", + "rack", + "slot" + ] }, "sampling_interval": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "udp_config": { "host": { "ui:placeholder": "127.0.0.1" }, "port": { - "ui:widget": "UpDownWidget" - } - } + "ui:widget": "updown" + }, + 
"ui:layout": [ + [ + { + "name": "host", + "width": 6 + }, + { + "name": "port", + "width": 6 + } + ] + ], + "ui:order": [ + "host", + "port" + ] + }, + "ui:layout": [ + [ + { + "name": "plc_config", + "width": 6 + }, + { + "name": "udp_config", + "width": 6 + } + ], + [ + { + "name": "csv_config", + "width": 10 + }, + { + "name": "sampling_interval", + "width": 2 + } + ] + ], + "ui:order": [ + "csv_config", + "plc_config", + "sampling_interval", + "udp_config" + ] } \ No newline at end of file diff --git a/config/schema/ui/plots.uischema.json b/config/schema/ui/plots.uischema.json index 5b9765c..f7d1272 100644 --- a/config/schema/ui/plots.uischema.json +++ b/config/schema/ui/plots.uischema.json @@ -2,19 +2,19 @@ "plots": { "items": { "time_window": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "y_min": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "y_max": { - "ui:widget": "UpDownWidget" + "ui:widget": "updown" }, "trigger_enabled": { - "ui:widget": "CheckboxWidget" + "ui:widget": "checkbox" }, "trigger_on_true": { - "ui:widget": "CheckboxWidget" + "ui:widget": "checkbox" } } } diff --git a/frontend/package.json b/frontend/package.json index e69e053..7804058 100644 --- a/frontend/package.json +++ b/frontend/package.json @@ -10,12 +10,13 @@ }, "dependencies": { "@rjsf/core": "^5.24.12", - "@rjsf/fluent-ui": "^5.24.12", + "@rjsf/chakra-ui": "^5.24.12", "@rjsf/validator-ajv8": "^5.24.12", - "@fluentui/react": "^8.120.2", - "bootstrap": "^5.3.3", + "@chakra-ui/react": "^2.8.2", + "@emotion/react": "^11.13.0", + "@emotion/styled": "^11.13.0", + "framer-motion": "^11.2.12", "react": "^18.2.0", - "react-bootstrap": "^2.10.4", "react-dom": "^18.2.0", "react-router-dom": "^6.26.1" }, diff --git a/frontend/src/App.jsx b/frontend/src/App.jsx index 705462a..92d879a 100644 --- a/frontend/src/App.jsx +++ b/frontend/src/App.jsx @@ -1,6 +1,7 @@ import React from 'react' import recLogo from './assets/logo/record.png' import { Routes, Route, Link } from 
'react-router-dom' +import { Box, Container, Flex, HStack, Select, Button, Heading, Text, useColorMode, useColorModeValue, Stack } from '@chakra-ui/react' import StatusPage from './pages/Status.jsx' import EventsPage from './pages/Events.jsx' import ConfigPage from './pages/Config.jsx' @@ -10,59 +11,102 @@ import DashboardPage from './pages/Dashboard.jsx' function Home() { return ( -
-
-

+ + + REC PLC S7-31x Streamer & Logger (React) -

-

React base ready. We will migrate views incrementally.

- Go to legacy mode -
+ + + React base ready. We will migrate views incrementally. + + + + Esta es una vista inicial de React. Probaremos el consumo de APIs y luego migraremos módulos (Status, Datasets/Variables, Plotting, Events, Config Editor) de forma incremental. + + + Acciones rápidas + + + + + + + + ) +} -
- Esta es una vista inicial de React. Probaremos el consumo de APIs y luego migraremos módulos (Status, Datasets/Variables, Plotting, Events, Config Editor) de forma incremental. -
+function ColorModeSelector() { + const { colorMode, setColorMode } = useColorMode() + const bg = useColorModeValue('gray.100', 'gray.700') + const [selection, setSelection] = React.useState(() => { + return localStorage.getItem('ui-color-mode-preference') || 'system' + }) -
-

Acciones rápidas

- -
-
+ React.useEffect(() => { + if (selection === 'system') { + try { localStorage.removeItem('chakra-ui-color-mode') } catch { /* ignore */ } + const mq = window.matchMedia('(prefers-color-scheme: dark)') + setColorMode(mq.matches ? 'dark' : 'light') + const handler = (e) => setColorMode(e.matches ? 'dark' : 'light') + mq.addEventListener?.('change', handler) + return () => mq.removeEventListener?.('change', handler) + } else { + try { localStorage.setItem('chakra-ui-color-mode', selection) } catch { /* ignore */ } + setColorMode(selection) + } + }, [selection, setColorMode]) + + return ( + + + ) } function NavBar() { + const navBg = useColorModeValue('gray.100', 'gray.800') return ( - + + + + + REC + PLC Streamer + + + + + + + + + + + ) } function App() { const [showPLCModal, setShowPLCModal] = React.useState(false) return ( - <> + -
- -
+ + + } /> } /> @@ -71,7 +115,7 @@ function App() { } /> setShowPLCModal(false)} /> - +
) } diff --git a/frontend/src/components/PLCConfigModal.jsx b/frontend/src/components/PLCConfigModal.jsx index 5f159e5..3febdfc 100644 --- a/frontend/src/components/PLCConfigModal.jsx +++ b/frontend/src/components/PLCConfigModal.jsx @@ -1,25 +1,35 @@ import React, { useEffect, useState } from 'react'; -import { Modal, Button } from 'react-bootstrap'; -import Form from '@rjsf/core'; +import { + Modal, + ModalOverlay, + ModalContent, + ModalHeader, + ModalBody, + ModalFooter, + ModalCloseButton, + Button, + Alert, + AlertIcon, +} from '@chakra-ui/react' +import Form from '@rjsf/chakra-ui'; import validator from '@rjsf/validator-ajv8'; import { getSchema, readConfig, writeConfig } from '../services/api.js'; -import { widgets } from './rjsf/widgets.jsx'; +// Chakra theme widgets are used by default const uiSchema = { - 'ui:widget': 'TextWidget', plc_config: { - rack: { 'ui:widget': 'UpDownWidget' }, - slot: { 'ui:widget': 'UpDownWidget' }, + rack: { 'ui:widget': 'updown' }, + slot: { 'ui:widget': 'updown' }, }, udp_config: { - port: { 'ui:widget': 'UpDownWidget' }, + port: { 'ui:widget': 'updown' }, }, - sampling_interval: { 'ui:widget': 'UpDownWidget' }, + sampling_interval: { 'ui:widget': 'updown' }, csv_config: { - max_size_mb: { 'ui:widget': 'UpDownWidget' }, - max_days: { 'ui:widget': 'UpDownWidget' }, - max_hours: { 'ui:widget': 'UpDownWidget' }, - cleanup_interval_hours: { 'ui:widget': 'UpDownWidget' }, + max_size_mb: { 'ui:widget': 'updown' }, + max_days: { 'ui:widget': 'updown' }, + max_hours: { 'ui:widget': 'updown' }, + cleanup_interval_hours: { 'ui:widget': 'updown' }, }, }; @@ -55,29 +65,34 @@ export default function PLCConfigModal({ show, onClose }) { }; return ( - - - PLC Configuration - - - {msg &&
{msg}
} - {schema && ( -
setFormData(formData)} - widgets={widgets} - uiSchema={uiSchema} - > - -
- )} -
- - - + + + + PLC Configuration + + + {msg && ( + + {msg} + + )} + {schema && ( +
setFormData(formData)} + uiSchema={uiSchema} + > + +
+ )} +
+ + + +
); } diff --git a/frontend/src/components/rjsf/LayoutObjectFieldTemplate.jsx b/frontend/src/components/rjsf/LayoutObjectFieldTemplate.jsx new file mode 100644 index 0000000..2201f7a --- /dev/null +++ b/frontend/src/components/rjsf/LayoutObjectFieldTemplate.jsx @@ -0,0 +1,79 @@ +import React from 'react' +import { SimpleGrid, Box, Heading, Text, Stack } from '@chakra-ui/react' + +// Simple ObjectFieldTemplate supporting ui:layout +// uiSchema example: +// { +// "ui:layout": [ +// [ { "name": "fieldA", "width": 6 }, { "name": "fieldB", "width": 6 } ], +// [ { "name": "fieldC", "width": 12 } ] +// ] +// } + +export default function LayoutObjectFieldTemplate(props) { + const { TitleField, DescriptionField, title, description, properties = [], uiSchema } = props + const layout = uiSchema && uiSchema['ui:layout'] + + if (!layout) { + return ( + + {title && ( + TitleField ? ( + + ) : ( + {title} + ) + )} + {description && ( + DescriptionField ? ( + + ) : ( + {description} + ) + )} + + {properties.map((prop) => ( + {prop.content} + ))} + + + ) + } + + // Map property name to its renderer + const propMap = new Map(properties.map((p) => [p.name, p])) + + return ( + + {title && ( + TitleField ? ( + + ) : ( + {title} + ) + )} + {description && ( + DescriptionField ? 
( + + ) : ( + {description} + ) + )} + {layout.map((row, rowIdx) => ( + + {row.map((cell, cellIdx) => { + const prop = propMap.get(cell.name) + if (!prop) return null + const col = Math.min(Math.max(cell.width || 12, 1), 12) + return ( + {prop.content} + ) + })} + + ))} + + ) +} + + + diff --git a/frontend/src/components/rjsf/widgets.jsx b/frontend/src/components/rjsf/widgets.jsx index 85e7830..e4b3b3c 100644 --- a/frontend/src/components/rjsf/widgets.jsx +++ b/frontend/src/components/rjsf/widgets.jsx @@ -1,5 +1,6 @@ import React from 'react'; -import { Form as BSForm } from 'react-bootstrap'; +// Deprecated: Bootstrap widgets were used before migrating to Chakra UI theme +// Legacy Bootstrap widgets no longer used after migrating to Chakra UI export const TextWidget = ({ id, placeholder, required, readonly, disabled, label, value, onChange, onBlur, onFocus, autofocus, options, schema }) => ( diff --git a/frontend/src/main.jsx b/frontend/src/main.jsx index 490eceb..02e5907 100644 --- a/frontend/src/main.jsx +++ b/frontend/src/main.jsx @@ -1,14 +1,18 @@ import React from 'react' import { createRoot } from 'react-dom/client' import App from './App.jsx' -import 'bootstrap/dist/css/bootstrap.min.css' import { BrowserRouter } from 'react-router-dom' +import { ChakraProvider, ColorModeScript } from '@chakra-ui/react' +import theme from './theme.js' createRoot(document.getElementById('root')).render( - - - + + + + + + ) diff --git a/frontend/src/pages/Config.jsx b/frontend/src/pages/Config.jsx index 48f226a..d0f7e3e 100644 --- a/frontend/src/pages/Config.jsx +++ b/frontend/src/pages/Config.jsx @@ -1,9 +1,9 @@ import React, { useEffect, useMemo, useState } from 'react' -import { Container, Row, Col, Button, ButtonGroup, Dropdown, DropdownButton } from 'react-bootstrap' -import Form from '@rjsf/fluent-ui' +import { Container, Heading, HStack, Button, Menu, MenuButton, MenuList, MenuItem, useColorModeValue, Alert, AlertIcon, Spacer } from '@chakra-ui/react' +import Form from 
'@rjsf/chakra-ui' import validator from '@rjsf/validator-ajv8' import { listSchemas, getSchema, readConfig, writeConfig } from '../services/api.js' -import { widgets } from '../components/rjsf/widgets.jsx' +import LayoutObjectFieldTemplate from '../components/rjsf/LayoutObjectFieldTemplate.jsx' function buildUiSchema(schema) { if (!schema || typeof schema !== 'object') return undefined @@ -13,9 +13,9 @@ function buildUiSchema(schema) { const type = s.type // handle oneOf/anyOf by taking first option for ui mapping const resolved = type || (Array.isArray(s.oneOf) && s.oneOf[0]?.type) || (Array.isArray(s.anyOf) && s.anyOf[0]?.type) - if (resolved === 'string') return { 'ui:widget': 'TextWidget' } - if (resolved === 'integer' || resolved === 'number') return { 'ui:widget': 'UpDownWidget' } - if (resolved === 'boolean') return { 'ui:widget': 'CheckboxWidget' } + if (resolved === 'string') return { 'ui:widget': 'text' } + if (resolved === 'integer' || resolved === 'number') return { 'ui:widget': 'updown' } + if (resolved === 'boolean') return { 'ui:widget': 'checkbox' } if (resolved === 'object' && s.properties) { const ui = {} for (const [key, prop] of Object.entries(s.properties)) { @@ -120,31 +120,29 @@ export default function ConfigPage() { } return ( - - -

Config Editor

- - + + + Config Editor + + Schema: {currentId} + {available.map(id => ( - setCurrentId(id)}> - {id} - + setCurrentId(id)}>{id} ))} - - - - - - - - - + + + + + + + + {message && ( -
{message}
+ {message} )} {schema && ( @@ -155,11 +153,12 @@ export default function ConfigPage() { onSubmit={handleSave} onChange={({ formData }) => setFormData(formData)} uiSchema={uiSchema} + templates={{ ObjectFieldTemplate: LayoutObjectFieldTemplate }} > -
- - -
+ + + + )}
diff --git a/frontend/src/pages/Dashboard.jsx b/frontend/src/pages/Dashboard.jsx index b5f75d8..f88f6ed 100644 --- a/frontend/src/pages/Dashboard.jsx +++ b/frontend/src/pages/Dashboard.jsx @@ -1,6 +1,9 @@ import React, { useEffect, useMemo, useRef, useState } from 'react' -import Form from '@rjsf/fluent-ui' +import { Link } from 'react-router-dom' +import { Box, Container, Flex, Grid, GridItem, HStack, Heading, Text, Button, Badge, Table, Thead, Tbody, Tr, Th, Td, Alert, AlertIcon, Card, CardBody, useColorModeValue } from '@chakra-ui/react' +import Form from '@rjsf/chakra-ui' import validator from '@rjsf/validator-ajv8' +import LayoutObjectFieldTemplate from '../components/rjsf/LayoutObjectFieldTemplate.jsx' import { getStatus, getEvents, @@ -18,58 +21,45 @@ function StatusBar({ status }) { const plcConnected = !!status?.plc_connected const streaming = !!status?.streaming const csvRecording = !!status?.csv_recording + const muted = useColorModeValue('gray.600', 'gray.300') return ( -
-
-
-
-
-
🔌 PLC: {plcConnected ? 'Connected' : 'Disconnected'}
- {status?.plc_reconnection?.enabled && ( -
- 🔄 Auto-reconnection: {status?.plc_reconnection?.active ? 'reconnecting…' : 'enabled'} -
- )} -
-
- {plcConnected ? ( - - ) : ( - - )} -
-
-
-
-
-
-
-
-
📡 UDP Streaming: {streaming ? 'Active' : 'Inactive'}
-
-
- {streaming ? ( - - ) : ( - - )} -
-
-
-
-
-
-
-
💾 CSV: {csvRecording ? 'Recording' : 'Inactive'}
- {status?.disk_space_info && ( -
- 💽 {status.disk_space_info.free_space} free · ⏱️ ~{status.disk_space_info.recording_time_left} -
- )} -
-
-
-
+ + + + 🔌 PLC: {plcConnected ? 'Connected' : 'Disconnected'} + {status?.plc_reconnection?.enabled && ( + + 🔄 Auto-reconnection: {status?.plc_reconnection?.active ? 'reconnecting…' : 'enabled'} + + )} + + {plcConnected ? ( + + ) : ( + + )} + + + + + 📡 UDP Streaming: {streaming ? 'Active' : 'Inactive'} + + {streaming ? ( + + ) : ( + + )} + + + + 💾 CSV: {csvRecording ? 'Recording' : 'Inactive'} + {status?.disk_space_info && ( + + 💽 {status.disk_space_info.free_space} free · ⏱️ ~{status.disk_space_info.recording_time_left} + + )} + + ) } @@ -79,9 +69,9 @@ function buildUiSchema(schema) { if (!s || typeof s !== 'object') return undefined const type = s.type const resolved = type || (Array.isArray(s.oneOf) && s.oneOf[0]?.type) || (Array.isArray(s.anyOf) && s.anyOf[0]?.type) - if (resolved === 'string') return { 'ui:widget': 'TextWidget' } - if (resolved === 'integer' || resolved === 'number') return { 'ui:widget': 'UpDownWidget' } - if (resolved === 'boolean') return { 'ui:widget': 'CheckboxWidget' } + if (resolved === 'string') return { 'ui:widget': 'text' } + if (resolved === 'integer' || resolved === 'number') return { 'ui:widget': 'updown' } + if (resolved === 'boolean') return { 'ui:widget': 'checkbox' } if (resolved === 'object' && s.properties) { const ui = {} for (const [key, prop] of Object.entries(s.properties)) { @@ -102,6 +92,7 @@ export default function DashboardPage() { const [status, setStatus] = useState(null) const [statusError, setStatusError] = useState('') const sseRef = useRef(null) + const muted = useColorModeValue('gray.600', 'gray.300') const [schemas, setSchemas] = useState({}) const available = useMemo(() => { @@ -213,66 +204,70 @@ export default function DashboardPage() { }, [currentSchemaId]) return ( -
-
-

PLC S7-31x Streamer & Logger

-
Unified dashboard: status, config and events
-
+ + + PLC S7-31x Streamer & Logger + Unified dashboard: status, config and events + - {statusError &&
{statusError}
} + {statusError && {statusError}} {status && } {['plc', 'datasets', 'plots'].map((sectionId) => ( -
-
-
-
🧩 {sectionId}
-
+ + + + 🧩 {sectionId} + -
-
+ + -
-
+ + ))} -
-

📋 Recent Events

-
- - Open Events -
-
+ + + + -
- - - - - - - - - + +
TimeLevelMessage
+ + + + + + + + {events.map((ev, idx) => ( - - - - - + + + + + ))} {events.length === 0 && ( - + )} - -
TimeLevelMessage
{ev.timestamp || '-'}{ev.level || ev.type || 'INFO'} -
{ev.message || ev.event || '-'}
- {ev.details &&
{typeof ev.details === 'object' ? JSON.stringify(ev.details) : String(ev.details)}
} -
{ev.timestamp || '-'}{ev.level || ev.type || 'INFO'} + {ev.message || ev.event || '-'} + {ev.details && ( + + {typeof ev.details === 'object' ? JSON.stringify(ev.details) : String(ev.details)} + + )} +
No events
No events
-
-
+ + + +
) } @@ -288,8 +283,8 @@ function SectionControls({ sectionId }) { })() }, [sectionId]) return ( -
-
+ }}>💾 Save + ) } @@ -351,7 +346,7 @@ function SectionForm({ sectionId }) { return () => { mounted = false } }, [sectionId]) - if (loading || !localSchema) return
Loading {sectionId}…
+ if (loading || !localSchema) return Loading {sectionId}… return (
diff --git a/frontend/src/pages/Events.jsx b/frontend/src/pages/Events.jsx index 1dec762..188ad09 100644 --- a/frontend/src/pages/Events.jsx +++ b/frontend/src/pages/Events.jsx @@ -1,5 +1,6 @@ import React, { useEffect, useState } from 'react' import { getEvents } from '../services/api.js' +import { Container, Heading, HStack, Button, Alert, AlertIcon, Table, Thead, Tbody, Tr, Th, Td, Box, Text, useColorModeValue } from '@chakra-ui/react' export default function EventsPage() { const [events, setEvents] = useState([]) @@ -28,44 +29,48 @@ export default function EventsPage() { }, []) return ( -
[Events page JSX lost in extraction — the hunk swaps the Bootstrap layout for the Chakra components imported above. Recoverable content: an "Events" heading; a toolbar with a link to /api/events; an {error} alert; a "Cargando eventos..." loading message; and a Time/Level/Message table rendering {ev.timestamp || '-'}, {ev.level || ev.type || 'INFO'}, {ev.message || ev.event || '-'} and, when present, JSON-stringified {ev.details}.]
   )
 }
diff --git a/frontend/src/theme.js b/frontend/src/theme.js
new file mode 100644
index 0000000..4a4557f
--- /dev/null
+++ b/frontend/src/theme.js
@@ -0,0 +1,68 @@
+import { extendTheme } from '@chakra-ui/react'
+import { mode } from '@chakra-ui/theme-tools'
+
+const config = {
+  initialColorMode: 'system',
+  useSystemColorMode: true,
+}
+
+const styles = {
+  global: (props) => ({
+    'html, body, #root': { height: '100%' },
+    body: {
+      backgroundColor: mode('gray.50', 'gray.700')(props),
+      color: mode('gray.800', 'gray.100')(props),
+    },
+    // Override common Bootstrap surfaces so they respect color mode
+    '.navbar': {
+      backgroundColor: mode('#f8f9fa', '#2D3748')(props),
+      borderColor: mode('#dee2e6', '#4A5568')(props),
+      color: 'inherit',
+    },
+    '.navbar *': { color: 'inherit' },
+    '.card': {
+      backgroundColor: mode('#ffffff', '#2D3748')(props),
+      borderColor: mode('#dee2e6', '#4A5568')(props),
+      color: 'inherit',
+    },
+    '.card *': { color: 'inherit' },
+    '.alert': {
+      backgroundColor: mode('#f8f9fa', '#2A4365')(props),
+      color: mode('#0c5460', '#bee3f8')(props),
+      borderColor: mode('#b8daff', '#2C5282')(props),
+    },
+    '.table': {
+      color: 'inherit',
+    },
+    '.table-striped tbody tr:nth-of-type(odd)': {
+      backgroundColor: mode('rgba(0,0,0,.05)', 'rgba(255,255,255,.06)')(props),
+    },
+    '.form-label': { color: 'inherit' },
+    '.btn.btn-outline-primary': {
+      borderColor: mode('#0d6efd', '#90cdf4')(props),
+      color: mode('#0d6efd', '#90cdf4')(props),
+    },
+    // In dark mode, align Bootstrap CSS variables to dark tokens
+    ...(props.colorMode === 'dark'
+      ?
+        {
+          ':root': {
+            '--bs-body-bg': '#2D3748',
+            '--bs-body-color': '#E2E8F0',
+            '--bs-border-color': '#4A5568',
+            '--bs-card-bg': '#2D3748',
+            '--bs-card-color': '#E2E8F0',
+            '--bs-heading-color': '#EDF2F7',
+            '--bs-link-color': '#90cdf4',
+            '--bs-table-bg': 'transparent',
+            '--bs-table-color': '#E2E8F0',
+            '--bs-table-striped-bg': 'rgba(255,255,255,0.06)',
+            '--bs-navbar-color': '#E2E8F0',
+          },
+        }
+      : {}),
+  }),
+}
+
+const theme = extendTheme({ config, styles })
+
+export default theme
diff --git a/schemas/datasets.schema.json b/schemas/datasets.schema.json
index 1629d96..ca67c97 100644
--- a/schemas/datasets.schema.json
+++ b/schemas/datasets.schema.json
@@ -2,7 +2,7 @@
   "$schema": "https://json-schema.org/draft/2020-12/schema",
   "$id": "datasets.schema.json",
   "title": "Datasets Configuration",
-  "description": "Esquema para editar plc_datasets.json (múltiples datasets y variables)",
+  "description": "Schema to edit plc_datasets.json (multiple datasets and variables)",
   "type": "object",
   "additionalProperties": false,
   "properties": {
@@ -15,14 +15,14 @@
       "name": {
         "type": "string",
         "title": "Dataset Name",
-        "description": "Nombre legible del dataset",
+        "description": "Human-readable name of the dataset",
         "minLength": 1,
         "maxLength": 60
       },
       "prefix": {
         "type": "string",
         "title": "CSV Prefix",
-        "description": "Prefijo para archivos CSV",
+        "description": "Prefix for CSV files",
         "pattern": "^[a-zA-Z0-9_-]+$",
         "minLength": 1,
         "maxLength": 20
       },
@@ -103,7 +103,7 @@
       },
       "streaming_variables": {
         "type": "array",
-        "title": "Streaming Variables",
+        "title": "Streaming variables",
         "items": {
           "type": "string"
         },
@@ -114,8 +114,8 @@
         "type": [
           "number",
           "null"
         ],
-        "title": "Sampling Interval (s)",
-        "description": "Vacío para usar el intervalo global",
+        "title": "Sampling interval (s)",
+        "description": "Leave empty to use the global interval",
         "minimum": 0.01,
         "maximum": 10
       },
diff --git a/schemas/plc.schema.json b/schemas/plc.schema.json
index ab4b575..93dba6f 100644
--- a/schemas/plc.schema.json
+++ b/schemas/plc.schema.json
@@ -2,7 +2,7 @@
   "$schema": "https://json-schema.org/draft/2020-12/schema",
   "$id": "plc.schema.json",
   "title": "PLC & UDP Configuration",
-  "description": "Esquema para editar plc_config.json",
+  "description": "Schema to edit plc_config.json",
   "type": "object",
   "additionalProperties": false,
   "properties": {
@@ -14,14 +14,14 @@
       "ip": {
         "type": "string",
         "title": "PLC IP",
-        "description": "Dirección IP del PLC (S7-31x)",
+        "description": "IP address of the PLC (S7-31x)",
         "format": "ipv4",
         "pattern": "^.+$"
       },
       "rack": {
         "type": "integer",
         "title": "Rack",
-        "description": "Número de rack (0-7)",
+        "description": "Rack number (0-7)",
         "minimum": 0,
         "maximum": 7,
         "default": 0
       },
@@ -29,7 +29,7 @@
       "slot": {
         "type": "integer",
         "title": "Slot",
-        "description": "Número de slot (generalmente 2)",
+        "description": "Slot number (usually 2)",
         "minimum": 0,
         "maximum": 31,
         "default": 2
       },
@@ -69,7 +69,7 @@
         "minimum": 0.01,
         "maximum": 10,
         "title": "Sampling Interval (s)",
-        "description": "Intervalo global de muestreo en segundos",
+        "description": "Global sampling interval in seconds",
         "default": 0.1
       },
       "csv_config": {
diff --git a/schemas/plots.schema.json b/schemas/plots.schema.json
index 35a8c57..996727d 100644
--- a/schemas/plots.schema.json
+++ b/schemas/plots.schema.json
@@ -2,7 +2,7 @@
   "$schema": "https://json-schema.org/draft/2020-12/schema",
   "$id": "plots.schema.json",
   "title": "Plot Sessions",
-  "description": "Esquema para editar plot_sessions.json (sesiones de gráfica)",
+  "description": "Schema to edit plot_sessions.json (plot sessions)",
   "type": "object",
   "additionalProperties": false,
   "properties": {
@@ -14,12 +14,12 @@
       "name": {
         "type": "string",
         "title": "Plot Name",
-        "description": "Nombre de la sesión de gráfica"
+        "description": "Human-readable name of the plot session"
       },
       "variables": {
         "type": "array",
         "title": "Variables",
-        "description": "Variables a graficar",
+        "description": "Variables to be plotted",
         "items": {
           "type": "string"
         },
@@ -27,8 +27,8 @@
       },
       "time_window": {
         "type": "integer",
-        "title": "Time Window (s)",
-        "description": "Ventana temporal en segundos",
+        "title": "Time window (s)",
+        "description": "Time window in seconds",
         "minimum": 5,
         "maximum": 3600,
         "default": 60
       },
@@ -39,7 +39,7 @@
           "number",
           "null"
         ],
         "title": "Y Min",
-        "description": "Vacío para auto"
+        "description": "Leave empty for auto"
       },
       "y_max": {
@@ -47,7 +47,7 @@
         "type": [
           "number",
           "null"
         ],
         "title": "Y Max",
-        "description": "Vacío para auto"
+        "description": "Leave empty for auto"
       },
       "trigger_variable": {
         "type": [