Refactor configuration management and schema handling

- Updated the JSON schema for plot variables to improve clarity and structure.
- Modified the UI schema for dataset variables to improve layout and user experience.
- Revamped the plot definitions UI schema for better organization and usability.
- Enhanced the dataset manager to support filtered variable configuration based on the selected dataset.
- Implemented a unified JSON manager for streamlined CRUD operations on configuration files.
- Improved error handling and validation for JSON schema loading and configuration management.
- Updated the main application logic to use the new JSON and schema managers for configuration operations.
- Added validation for dataset definition and PLC configuration schemas.
This commit is contained in:
parent 276efb117d
commit 2ed5acf039
@ -25,12 +25,23 @@ This is a **dual-stack industrial automation system** for Siemens S7-315 PLCs co
- UDP streaming: Manual control for PlotJuggler visualization
- Each dataset thread handles both, but UDP transmission is independently controlled

### 3. Schema-Driven Configuration
All configuration uses JSON Schema validation in `config/schema/`:
- `plc.schema.json`: PLC connection + UDP settings
- `dataset-*.schema.json`: Variable definitions and datasets
- `plot-*.schema.json`: Plot configuration and variables
- UI schemas in `config/schema/ui/` for RJSF form generation

### 3. Schema-Driven Configuration with RJSF
All configuration uses JSON Schema validation with React JSON Schema Forms (RJSF):
- **Frontend-First Validation**: RJSF handles all form generation and validation
- **Backend API Simplification**: Flask provides simple CRUD operations for JSON files
- **Array-Based Data Structure**: All configurations use an array format for RJSF compatibility
- **Three Form Types**:
  - Type 1: Single-object forms (PLC config)
  - Type 2: Array management forms (dataset definitions, plot definitions)
  - Type 3: Filtered array forms with combo selectors (variables linked to datasets/plots)

### 4. JSON Configuration Structure
**CRITICAL**: All JSON files use array-based structures for RJSF compatibility:
- `plc_config.json`: Single object with `udp_config` containing `sampling_interval`
- `dataset_definitions.json`: `{"datasets": [{"id": "DAR", "name": "...", ...}]}`
- `dataset_variables.json`: `{"variables": [{"dataset_id": "DAR", "variables": [...]}]}`
- `plot_definitions.json`: `{"plots": [{"id": "plot1", "name": "...", ...}]}`
- `plot_variables.json`: `{"variables": [{"plot_id": "plot1", "variables": [...]}]}`
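The array-based layout means every lookup is a search by `id` rather than a dict key access. A minimal sketch of what working with this format looks like on the backend (the `find_by_id` helper is illustrative, not an actual function from this codebase):

```python
def find_by_id(items, item_id, key="id"):
    """Return the first entry whose identifier matches, or None."""
    return next((item for item in items if item.get(key) == item_id), None)

# Example dataset_definitions.json content in the array-based format
dataset_definitions = {
    "datasets": [
        {"id": "DAR", "name": "DAR", "prefix": "gateway_phoenix", "sampling_interval": 1.0},
        {"id": "Fast", "name": "Fast", "prefix": "fast", "sampling_interval": 0.61},
    ]
}

dar = find_by_id(dataset_definitions["datasets"], "DAR")
print(dar["prefix"])  # gateway_phoenix
```

The same pattern applies to `plots`, `dataset_variables`, and `plot_variables`, with `plot_id`/`dataset_id` as the foreign key.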

## Development Workflows
@ -72,11 +83,12 @@ npm run build # Production build to dist/
- **Pattern**: RJSF forms for configuration, custom tables for data management

### API Endpoints Structure
Flask routes in `main.py` follow REST patterns:
- `/api/config/*`: Configuration CRUD operations
Flask routes in `main.py` follow simplified REST patterns with unified JSON handling:
- `/api/config/*`: Unified configuration CRUD operations for all JSON files
- `/api/plc/*`: PLC connection and status
- `/api/streaming/*`: Data streaming controls
- `/api/plots/*`: Plot session management
- **API Philosophy**: Backend provides simple file I/O, frontend handles all validation via RJSF
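The "simple file I/O" philosophy can be sketched as a unified JSON manager behind the `/api/config/*` routes — one read path and one write path for every configuration file. This is a sketch under stated assumptions: the class name, the whitelist, and the on-disk layout are illustrative, not the actual implementation:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical whitelist of configuration files served by /api/config/*
ALLOWED = {"plc_config", "dataset_definitions", "dataset_variables",
           "plot_definitions", "plot_variables"}

class JsonConfigManager:
    """Unified CRUD access to the JSON configuration files (illustrative)."""

    def __init__(self, config_dir):
        self.config_dir = Path(config_dir)

    def _path(self, name):
        if name not in ALLOWED:
            raise ValueError(f"Unknown config: {name}")
        return self.config_dir / f"{name}.json"

    def read(self, name):
        path = self._path(name)
        return json.loads(path.read_text()) if path.exists() else {}

    def write(self, name, data):
        self._path(name).write_text(json.dumps(data, indent=2))
        return data

with tempfile.TemporaryDirectory() as tmp:
    mgr = JsonConfigManager(tmp)
    mgr.write("plot_definitions", {"plots": [{"id": "plot_1", "name": "UR29"}]})
    print(mgr.read("plot_definitions")["plots"][0]["name"])  # UR29
```

A Flask route would then just dispatch the `{config-name}` path segment to `read`/`write`, keeping all semantic validation on the RJSF side.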

## Important Conventions
@ -89,6 +101,7 @@ Flask routes in `main.py` follow REST patterns:
- **FormTable.jsx**: Single-row forms per item using RJSF schemas
- **DatasetFormManager/PlotFormManager**: Master-detail table management
- Chakra UI components with consistent styling via `theme.js`
- **RJSF Integration**: All forms auto-generated from JSON Schema, no hardcoded form fields

### 3. Thread Safety
- Data streaming uses thread-safe collections and proper cleanup
@ -97,6 +110,9 @@ Flask routes in `main.py` follow REST patterns:

### 4. Schema Evolution
Follow existing patterns in `config/schema/` - all forms are auto-generated from JSON Schema + UI Schema combinations. Never hardcode form fields.
- **Array-First Design**: All multi-item configurations use array structures for RJSF type 2 forms
- **Unified Validation**: JSON Schema validation both client-side (RJSF) and server-side (jsonschema library)
- **Schema-UI Separation**: Data schemas in `/config/schema/`, UI schemas in `/config/schema/ui/`
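The server-side half of that unified validation boils down to structural checks on the array format. A minimal hand-rolled sketch of what the `jsonschema` library enforces for `dataset_definitions.json` (the function is illustrative; the real backend is described as using the `jsonschema` package against the Draft-07 schemas in `config/schema/`):

```python
def validate_dataset_definitions(data):
    """Check the array-based dataset_definitions structure; return a list of errors."""
    errors = []
    datasets = data.get("datasets")
    if not isinstance(datasets, list):
        errors.append("'datasets' must be an array")
        return errors
    for i, item in enumerate(datasets):
        # Mirrors the schema's required fields: id, name, prefix
        for field in ("id", "name", "prefix"):
            if field not in item:
                errors.append(f"datasets[{i}]: missing required field '{field}'")
    return errors

ok = {"datasets": [{"id": "Test", "name": "test", "prefix": "test", "sampling_interval": 1}]}
bad = {"datasets": [{"name": "no-id"}]}
print(validate_dataset_definitions(ok))   # []
print(validate_dataset_definitions(bad))  # missing 'id' and 'prefix' errors
```

With the actual library this collapses to `jsonschema.validate(data, schema)`, raising `ValidationError` on failure.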

### 5. Development Context
- Use `.doc/MemoriaDeEvolucion.md` for understanding recent changes and decisions
@ -123,4 +139,92 @@ Follow existing patterns in `config/schema/` - all forms are auto-generated from

### Notes
Always write software variables and comments in English.
Development targets Windows; after testing, the application must work completely offline, without any CDN dependencies.
## RJSF Configuration Management

### Form Type Architecture
The system implements three distinct RJSF form patterns:

**Type 1: Single Object Forms**
- Used for: PLC configuration (`plc_config.json`)
- Structure: Single JSON object with nested properties
- RJSF Pattern: Direct object form rendering
- Example: Connection settings, UDP configuration with `sampling_interval`

**Type 2: Array Management Forms**
- Used for: Dataset definitions (`dataset_definitions.json`), Plot definitions (`plot_definitions.json`)
- Structure: `{"datasets": [...]}` or `{"plots": [...]}`
- RJSF Pattern: Array form with add/remove/edit capabilities
- Critical: Root must be array wrapper for RJSF compatibility

**Type 3: Filtered Array Forms with Combo Selectors**
- Used for: Variables linked to datasets/plots (`dataset_variables.json`, `plot_variables.json`)
- Structure: Array with foreign key references (`dataset_id`, `plot_id`)
- RJSF Pattern: Filtered forms based on selected dataset/plot
- Workflow: Select parent → Edit associated variables
- **Implementation**: Combo selector + dynamic schema generation for selected item
- **Key Functions**: `getSelectedDatasetVariables()`, `updateSelectedDatasetVariables()`
### RJSF Best Practices and Common Pitfalls
**Critical Widget Guidelines**:
- **Arrays**: Never specify `"ui:widget": "array"` - arrays use built-in ArrayField component
- **Valid Widgets**: text, textarea, select, checkbox, updown, variableSelector
- **Widget Registry**: All widgets must be registered in `AllWidgets.jsx`
- **Custom Widgets**: Use specific widget names, avoid generic type names

**Schema Structure Rules**:
- **Array Items**: Always include `title` property for array item schemas
- **UI Layout**: Use `"ui:layout"` for grid-based field arrangement
- **Field Templates**: Leverage `LayoutObjectFieldTemplate` for responsive layouts
- **Error Handling**: RJSF errors often indicate missing widgets or malformed schemas
### Type 3 Form Implementation Pattern
```javascript
// Step 1: Parent Selector (Combo)
const [selectedItemId, setSelectedItemId] = useState('')

// Step 2: Filtered Data Helper
const getSelectedItemData = () => {
  return allData.find(item => item.parent_id === selectedItemId) || defaultData
}

// Step 3: Update Helper
const updateSelectedItemData = (newData) => {
  const updated = allData.map(item =>
    item.parent_id === selectedItemId ? { ...item, ...newData } : item
  )
  // allData is an array, so store the mapped array directly
  setAllData(updated)
}

// Step 4: Dynamic Schema Generation
const dynamicSchema = {
  type: "object",
  properties: { /* fields specific to selected item */ }
}
```
### JSON Schema Migration Notes
- **Legacy to Array**: All object-based configs converted to array format
- **ID Fields**: Added explicit `id` fields to all array items for referencing
- **Validation**: Unified validation using `jsonschema` library server-side + RJSF client-side
- **Backward Compatibility**: Migration handled in backend for existing configurations
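The legacy-to-array conversion described above can be sketched in a few lines — the old key of each object becomes the new item's explicit `id`. The helper name is illustrative, not the actual backend function:

```python
def migrate_to_array(config, container_key):
    """Convert {"datasets": {"DAR": {...}}} into {"datasets": [{"id": "DAR", ...}]}."""
    legacy = config.get(container_key)
    if isinstance(legacy, list):  # already in the new array format
        return config
    migrated = [{"id": key, **value} for key, value in legacy.items()]
    return {container_key: migrated}

legacy = {"datasets": {"DAR": {"name": "DAR", "prefix": "gateway_phoenix"}}}
print(migrate_to_array(legacy, "datasets"))
# {'datasets': [{'id': 'DAR', 'name': 'DAR', 'prefix': 'gateway_phoenix'}]}
```

The `isinstance` guard makes the migration idempotent, which is what backward compatibility requires: already-migrated files pass through unchanged.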

### Development Debugging Guide
**RJSF Error Resolution**:
- `No widget 'X' for type 'Y'`: Check widget registration in `AllWidgets.jsx`
- Array rendering errors: Remove `"ui:widget"` specification from array fields
- Schema validation failures: Use `validate_schema.py` to test JSON structure
- Form not displaying: Verify schema structure matches expected Type 1/2/3 pattern

**Type 3 Form Debugging**:
- Combo not showing options: Check parent data loading and `availableItems` array
- Form not updating: Verify `selectedItemId` state and helper functions
- Data not persisting: Check `updateSelectedItemData()` logic and save operations
- Schema errors: Ensure dynamic schema generation matches data structure

**Frontend-Backend Integration**:
- API endpoint naming: Use consistent `/api/config/{config-name}` pattern
- JSON structure validation: Backend uses `jsonschema`, frontend uses RJSF validation
- Error handling: Both client and server should handle array format gracefully
- Configuration loading: Always verify API response structure before setting form data
@ -1,77 +1,5 @@
{
"events": [
{
"timestamp": "2025-07-17T14:56:25.611805",
"level": "info",
"event_type": "Application started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-17T14:56:25.640479",
"level": "info",
"event_type": "plc_connection",
"message": "Successfully connected to PLC 10.1.33.11",
"details": {
"ip": "10.1.33.11",
"rack": 0,
"slot": 2
}
},
{
"timestamp": "2025-07-17T14:56:25.642459",
"level": "info",
"event_type": "csv_started",
"message": "CSV recording started for 3 variables",
"details": {
"variables_count": 3,
"output_directory": "records\\17-07-2025"
}
},
{
"timestamp": "2025-07-17T14:56:25.643467",
"level": "info",
"event_type": "streaming_started",
"message": "Streaming started with 3 variables",
"details": {
"variables_count": 3,
"streaming_variables_count": 3,
"sampling_interval": 0.1,
"udp_host": "127.0.0.1",
"udp_port": 9870
}
},
{
"timestamp": "2025-07-17T15:25:37.659374",
"level": "info",
"event_type": "variable_added",
"message": "Variable added: CTS306_Conditi -> DB2124.18 (real)",
"details": {
"name": "CTS306_Conditi",
"db": 2124,
"offset": 18,
"type": "real",
"total_variables": 4
}
},
{
"timestamp": "2025-07-17T15:25:37.662879",
"level": "info",
"event_type": "csv_file_created",
"message": "New CSV file created after variable modification: _15_25_37.csv",
"details": {
"file_path": "records\\17-07-2025\\_15_25_37.csv",
"variables_count": 4,
"reason": "variable_modification"
}
},
{
"timestamp": "2025-07-17T15:42:38.033187",
"level": "info",
"event_type": "Application started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-07-17T15:42:38.052471",
"level": "info",

@ -10497,8 +10425,57 @@
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T22:31:56.301635",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T22:38:57.295676",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T22:46:37.365559",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T22:53:47.634252",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T23:22:49.824873",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T23:25:32.057929",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
},
{
"timestamp": "2025-08-13T23:30:43.779703",
"level": "info",
"event_type": "application_started",
"message": "Application initialization completed successfully",
"details": {}
}
],
"last_updated": "2025-08-13T22:11:17.400295",
"last_updated": "2025-08-13T23:30:43.779703",
"total_entries": 1000
}
@ -1,18 +1,27 @@
{
"datasets": {
"DAR": {
"created": "2025-08-08T15:47:18.566053",
"enabled": true,
"name": "DAR",
"prefix": "gateway_phoenix",
"sampling_interval": 1
},
"Fast": {
"created": "2025-08-09T02:06:26.840011",
"enabled": true,
"name": "Fast",
"prefix": "fast",
"sampling_interval": 0.61
}
"datasets": [
{
"created": "2025-08-08T15:47:18.566053",
"enabled": true,
"id": "DAR",
"name": "DAR",
"prefix": "gateway_phoenix",
"sampling_interval": 1.01
},
{
"created": "2025-08-09T02:06:26.840011",
"enabled": true,
"id": "Fast",
"name": "Fast",
"prefix": "fast",
"sampling_interval": 0.62
},
{
"enabled": true,
"id": "Test",
"name": "test",
"prefix": "test",
"sampling_interval": 1
}
]
}
@ -1,14 +1,37 @@
{
"dataset_variables": [
"variables": [
{
"dataset_id": "DAR",
"variables": [
{
"dataset_id": "DAR",
"variables": {},
"streaming_variables": []
"name": "UR29_Brix",
"area": "db",
"db": 1011,
"offset": 1322,
"type": "real",
"streaming": true
},
{
"dataset_id": "Fast",
"variables": {},
"streaming_variables": []
"name": "UR29_ma",
"area": "db",
"db": 1011,
"offset": 1296,
"type": "real",
"streaming": true
},
{
"name": "fUR29_Brix",
"area": "db",
"db": 1011,
"offset": 1322,
"type": "real",
"streaming": true
}
]
]
},
{
"dataset_id": "Fast",
"variables": []
}
]
}
@ -1,14 +1,15 @@
{
"plots": {
"plot_1": {
"name": "UR29",
"time_window": 75,
"y_min": null,
"y_max": null,
"trigger_variable": null,
"trigger_enabled": false,
"trigger_on_true": true,
"session_id": "plot_1"
}
"plots": [
{
"id": "plot_1",
"name": "UR29",
"session_id": "plot_1",
"time_window": 75,
"trigger_enabled": false,
"trigger_on_true": true,
"trigger_variable": null,
"y_max": null,
"y_min": null
}
]
}
@ -1,29 +1,29 @@
{
"plot_variables": [
"variables": [
{
"plot_id": "plot_1",
"variables": [
{
"plot_id": "plot_1",
"variables": {
"var_1": {
"variable_name": "UR29_Brix",
"color": "#3498db",
"enabled": true
},
"var_2": {
"variable_name": "UR29_ma",
"color": "#e74c3c",
"enabled": true
},
"var_3": {
"variable_name": "fUR29_Brix",
"color": "#2ecc71",
"enabled": true
},
"var_4": {
"variable_name": "fUR29_ma",
"color": "#f39c12",
"enabled": true
}
}
"color": "#3498db",
"enabled": true,
"variable_name": "UR29_Brix"
},
{
"color": "#e74c3c",
"enabled": true,
"variable_name": "UR29_ma"
},
{
"color": "#2ecc71",
"enabled": true,
"variable_name": "fUR29_Brix"
},
{
"color": "#f39c12",
"enabled": true,
"variable_name": "fUR29_ma"
}
]
]
}
]
}
@ -56,6 +56,7 @@
}
},
"required": [
"id",
"name",
"prefix"
],

@ -64,10 +65,7 @@
"dependencies": {}
},
"title": "Datasets",
"type": ["array", "object"],
"additionalProperties": {
"$ref": "#/properties/datasets/items"
}
"type": "array"
}
},
"required": [
@ -1,11 +1,12 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "dataset-variables.schema.json",
"title": "Dataset Variables",
"description": "Schema for variables assigned to each dataset",
"type": "object",
"additionalProperties": false,
"properties": {
"dataset_variables": {
"variables": {
"type": "array",
"title": "Dataset Variables Collection",
"description": "Array of dataset variable configurations",

@ -18,11 +19,17 @@
"description": "Unique identifier for the dataset"
},
"variables": {
"type": "object",
"title": "Dataset Variables",
"additionalProperties": {
"type": "array",
"title": "Variables",
"description": "Array of PLC variables for this dataset",
"items": {
"type": "object",
"properties": {
"name": {
"type": "string",
"title": "Variable Name",
"description": "Human-readable name for the variable"
},
"area": {
"type": "string",
"title": "Memory Area",

@ -82,34 +89,27 @@
"streaming": {
"type": "boolean",
"title": "Stream to PlotJuggler",
"description": "Include this variable in UDP streaming",
"default": false
}
},
"required": [
"name",
"area",
"offset",
"type"
]
}
},
"streaming_variables": {
"type": "array",
"title": "Streaming variables",
"items": {
"type": "string"
},
"default": []
}
},
"required": [
"dataset_id",
"variables",
"streaming_variables"
"variables"
]
}
}
},
"required": [
"dataset_variables"
"variables"
]
}
@ -1,75 +1,69 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "plot-definitions.schema.json",
"additionalProperties": false,
"title": "Plot Definitions",
"description": "Schema for plot session definitions (metadata only, no variables)",
"type": "object",
"additionalProperties": false,
"properties": {
"plots": {
"additionalProperties": {
"type": "array",
"title": "Plot Definitions",
"description": "Array of plot session configurations",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string",
"title": "Plot ID",
"description": "Unique identifier for the plot"
},
"name": {
"description": "Human-readable name of the plot session",
"type": "string",
"title": "Plot Name",
"type": "string"
"description": "Human-readable name of the plot session"
},
"session_id": {
"title": "Session Id",
"type": "string"
"type": "string",
"title": "Session ID",
"description": "Session identifier for this plot"
},
"time_window": {
"default": 60,
"description": "Time window in seconds",
"maximum": 3600,
"minimum": 5,
"type": "integer",
"title": "Time window (s)",
"type": "integer"
},
"trigger_enabled": {
"default": false,
"title": "Enable Trigger",
"type": "boolean"
},
"trigger_on_true": {
"default": true,
"title": "Trigger on True",
"type": "boolean"
},
"trigger_variable": {
"title": "Trigger Variable",
"type": [
"string",
"null"
]
},
"y_max": {
"description": "Leave empty for auto",
"title": "Y Max",
"type": [
"number",
"null"
]
"description": "Time window in seconds",
"minimum": 5,
"maximum": 3600,
"default": 60
},
"y_min": {
"description": "Leave empty for auto",
"type": ["number", "null"],
"title": "Y Min",
"type": [
"number",
"null"
]
"description": "Leave empty for auto"
},
"y_max": {
"type": ["number", "null"],
"title": "Y Max",
"description": "Leave empty for auto"
},
"trigger_variable": {
"type": ["string", "null"],
"title": "Trigger Variable"
},
"trigger_enabled": {
"type": "boolean",
"title": "Enable Trigger",
"default": false
},
"trigger_on_true": {
"type": "boolean",
"title": "Trigger on True",
"default": true
}
},
"required": [
"name",
"time_window"
],
"type": "object"
},
"title": "Plot Definitions",
"type": "object"
"required": ["id", "name", "time_window"]
}
}
},
"required": [
"plots"
],
"title": "Plot Definitions",
"type": "object"
"required": ["plots"]
}
@ -1,11 +1,12 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "plot-variables.schema.json",
"title": "Plot Variables",
"description": "Schema for variables assigned to each plot session",
"type": "object",
"additionalProperties": false,
"properties": {
"plot_variables": {
"variables": {
"type": "array",
"title": "Plot Variables Collection",
"description": "Array of plot variable configurations",

@ -18,16 +19,16 @@
"description": "Unique identifier for the plot session"
},
"variables": {
"type": "object",
"title": "Plot Variables",
"description": "Variables configuration for plotting with colors",
"additionalProperties": {
"type": "array",
"title": "Variables",
"description": "Array of variables for this plot with visualization settings",
"items": {
"type": "object",
"properties": {
"variable_name": {
"type": "string",
"title": "Variable Name",
"description": "Select a variable from available dataset variables"
"description": "Name of the variable to plot (must exist in dataset variables)"
},
"color": {
"type": "string",

@ -45,8 +46,7 @@
},
"required": [
"variable_name",
"color",
"enabled"
"color"
]
}
}

@ -59,6 +59,6 @@
}
},
"required": [
"plot_variables"
"variables"
]
}
@ -1,5 +1,5 @@
{
"dataset_variables": {
"variables": {
"ui:description": "⚙️ Configure PLC variables for each dataset - specify memory areas, data types, and streaming options",
"ui:options": {
"addable": true,

@ -9,8 +9,7 @@
"items": {
"ui:order": [
"dataset_id",
"variables",
"streaming_variables"
"variables"
],
"dataset_id": {
"ui:widget": "text",

@ -25,8 +24,9 @@
"orderable": true,
"removable": true
},
"additionalProperties": {
"items": {
"ui:order": [
"name",
"area",
"db",
"offset",

@ -36,9 +36,13 @@
],
"ui:layout": [
[
{
"name": "name",
"width": 4
},
{
"name": "area",
"width": 3
"width": 2
},
{
"name": "db",

@ -46,10 +50,6 @@
},
{
"name": "offset",
"width": 3
},
{
"name": "bit",
"width": 2
},
{

@ -58,12 +58,21 @@
}
],
[
{
"name": "bit",
"width": 3
},
{
"name": "streaming",
"width": 12
"width": 9
}
]
],
"name": {
"ui:widget": "text",
"ui:placeholder": "Variable name",
"ui:help": "📝 Human-readable name for this variable"
},
"area": {
"ui:widget": "select",
"ui:help": "PLC memory area (DB=DataBlock, MW=MemoryWord, etc.)",

@ -178,15 +187,10 @@
"ui:help": "📡 Enable real-time streaming to PlotJuggler for visualization"
}
}
},
"streaming_variables": {
"ui:widget": "checkboxes",
"ui:description": "📡 Streaming Variables",
"ui:help": "Variables that are streamed in real-time to PlotJuggler. This list is automatically updated when you enable/disable streaming on individual variables above."
}
}
},
"ui:order": [
"dataset_variables"
"variables"
]
}
@ -1,77 +1,109 @@
{
"plots": {
"ui:description": "🎯 Configure plot sessions - set time windows, Y axis ranges, and trigger conditions",
"ui:options": {
"addable": true,
"orderable": true,
"removable": true
},
"items": {
"ui:order": [
"id",
"name",
"session_id",
"time_window",
"y_min",
"y_max",
"trigger_variable",
"trigger_enabled",
"trigger_on_true"
],
"ui:layout": [
[
{
"name": "id",
"width": 3
},
{
"name": "name",
"width": 4
},
{
"name": "session_id",
"width": 3
},
{
"name": "time_window",
"width": 2
}
],
[
{
"name": "y_min",
"width": 3
},
{
"name": "y_max",
"width": 3
},
{
"name": "trigger_variable",
"width": 3
},
{
"name": "trigger_enabled",
"width": 3
}
],
[
{
"name": "trigger_on_true",
"width": 12
}
]
],
"id": {
"ui:widget": "text",
"ui:placeholder": "plot_1",
"ui:help": "🆔 Unique identifier for this plot"
},
"name": {
"ui:widget": "text",
"ui:placeholder": "My Plot",
"ui:help": "📊 Human-readable name for the plot"
},
"session_id": {
"ui:widget": "text",
"ui:placeholder": "plot_1",
"ui:help": "🔗 Session identifier (usually same as ID)"
},
"time_window": {
"ui:widget": "updown"
},
"trigger_enabled": {
"ui:widget": "checkbox"
},
"trigger_on_true": {
"ui:widget": "checkbox"
},
"y_max": {
"ui:widget": "updown"
"ui:widget": "updown",
"ui:help": "⏱️ Time window in seconds (5-3600)"
},
"y_min": {
"ui:widget": "updown"
"ui:widget": "updown",
"ui:help": "📉 Minimum Y axis value (leave empty for auto)"
},
"y_max": {
"ui:widget": "updown",
"ui:help": "📈 Maximum Y axis value (leave empty for auto)"
},
"trigger_variable": {
"ui:widget": "text",
"ui:help": "🎯 Variable name to use as trigger (optional)"
},
"trigger_enabled": {
"ui:widget": "checkbox",
"ui:help": "✅ Enable trigger-based recording"
},
"trigger_on_true": {
"ui:widget": "checkbox",
"ui:help": "🔄 Trigger when variable becomes true (vs false)"
}
},
"ui:description": "Plot session configuration (time window, Y axis, triggers)",
"ui:layout": [
[
{
"name": "session_id",
"width": 2
},
{
"name": "name",
"width": 4
},
{
"name": "trigger_variable",
"width": 4
},
{
"name": "trigger_enabled",
"width": 2
}
],
[
{
"name": "time_window",
"width": 4
},
{
"name": "y_min",
"width": 4
},
{
"name": "y_max",
"width": 4
}
]
],
"session_id": {
"ui:column": 2
},
"name": {
"ui:column": 4
},
"trigger_variable": {
"ui:column": 4
},
"trigger_enabled": {
"ui:column": 2
},
"time_window": {
"ui:column": 4
},
"y_min": {
"ui:column": 4
},
"y_max": {
"ui:column": 4
}
}
},
"ui:order": [
"plots"
]
}
@ -1,5 +1,5 @@
{
"plot_variables": {
"variables": {
"ui:description": "📊 Configure plot variables with colors and settings for real-time visualization",
"ui:options": {
"addable": true,

@ -14,7 +14,7 @@
"plot_id": {
"ui:widget": "text",
"ui:placeholder": "Enter unique plot identifier",
"ui:help": "🆔 Unique identifier for this plot session"
"ui:help": "🆔 Unique identifier for this plot session (must match existing plot)"
},
"variables": {
"ui:description": "🎨 Plot Variable Configuration",

@ -24,38 +24,32 @@
"orderable": true,
"removable": true
},
"additionalProperties": {
"items": {
"ui:order": [
"variable_name",
"enabled",
"color"
"color",
"enabled"
],
"ui:layout": [
[
{
"name": "variable_name",
"width": 12
}
],
[
{
"name": "enabled",
"width": 6
},
{
"name": "color",
"width": 6
"width": 3
},
{
"name": "enabled",
"width": 3
}
]
],
"variable_name": {
"ui:widget": "VariableSelectorWidget",
"ui:help": "🔍 Select a variable from the available dataset variables",
"ui:description": "Choose from existing PLC variables defined in your datasets"
},
"enabled": {
"ui:widget": "checkbox",
"ui:help": "📊 Enable this variable to be displayed in the real-time plot"
"ui:widget": "text",
"ui:placeholder": "UR29_Brix",
"ui:help": "<22> Name of the variable to plot (must exist in dataset variables)"
},
"color": {
"ui:widget": "color",

@ -77,12 +71,16 @@
"#16a085"
]
}
},
"enabled": {
"ui:widget": "checkbox",
"ui:help": "📊 Enable this variable to be displayed in the real-time plot"
}
}
}
}
},
"ui:order": [
"plot_variables"
"variables"
]
}
@ -256,7 +256,7 @@ function StatusBar({ status, onRefresh }) {
}

// Pure RJSF Configuration Panel with Full UI Schema Layout Support
function ConfigurationPanel({ schemas, currentSchemaId, onSchemaChange, schemaData, formData, onFormChange, onSave, saving, message }) {
function ConfigurationPanel({ schemaData, formData, onFormChange, onSave, saving, message }) {
const cardBg = useColorModeValue('white', 'gray.700')
const borderColor = useColorModeValue('gray.200', 'gray.600')

@ -264,7 +264,7 @@ function ConfigurationPanel({ schemas, currentSchemaId, onSchemaChange, schemaDa
return (
<Card bg={cardBg} borderColor={borderColor}>
<CardBody>
<Text>Loading configuration...</Text>
<Text>Loading PLC configuration...</Text>
</CardBody>
</Card>
)

@ -273,27 +273,12 @@
return (
<Card bg={cardBg} borderColor={borderColor}>
<CardHeader>
<Flex align="center">
<Box>
<Heading size="md">🔧 Configuration Editor</Heading>
<Text fontSize="sm" color="gray.500" mt={1}>
Pure RJSF configuration management with full UI Schema layout support (ui:layout, ui:widgets, custom field templates)
</Text>
</Box>
<Spacer />
<Select
value={currentSchemaId}
onChange={(e) => onSchemaChange(e.target.value)}
width="200px"
size="sm"
>
{schemas?.map(schemaInfo => (
<option key={schemaInfo.id} value={schemaInfo.id}>
{schemaInfo.title || schemaInfo.id}
</option>
))}
</Select>
</Flex>
<Box>
<Heading size="md">🔧 PLC & UDP Configuration</Heading>
<Text fontSize="sm" color="gray.500" mt={1}>
Configure PLC connection settings and UDP streaming parameters
</Text>
</Box>
{message && (
<Alert status="success" mt={2}>
<AlertIcon />
@ -337,6 +322,7 @@ function DatasetManager() {
const [variablesConfig, setVariablesConfig] = useState(null)
const [datasetsSchemaData, setDatasetsSchemaData] = useState(null)
const [variablesSchemaData, setVariablesSchemaData] = useState(null)
const [selectedDatasetId, setSelectedDatasetId] = useState('')
const [loading, setLoading] = useState(true)
const toast = useToast()

@ -354,6 +340,11 @@ function DatasetManager() {
setVariablesConfig(variablesData)
setDatasetsSchemaData(datasetsSchemaResponse)
setVariablesSchemaData(variablesSchemaResponse)

// Auto-select first dataset if none selected
if (!selectedDatasetId && datasetsData?.datasets?.length > 0) {
setSelectedDatasetId(datasetsData.datasets[0].id)
}
} catch (error) {
toast({
title: '❌ Failed to load dataset data',

@ -404,6 +395,41 @@ function DatasetManager() {
}
}

// Get filtered variables for selected dataset (Type 3 Form Pattern)
const getSelectedDatasetVariables = () => {
if (!variablesConfig?.variables || !selectedDatasetId) {
return { variables: [] }
}

const datasetVars = variablesConfig.variables.find(v => v.dataset_id === selectedDatasetId)
return datasetVars || { variables: [] }
}

// Update variables for selected dataset (Type 3 Form Pattern)
const updateSelectedDatasetVariables = (newVariableData) => {
if (!variablesConfig?.variables || !selectedDatasetId) return

const updatedVariables = variablesConfig.variables.map(v =>
v.dataset_id === selectedDatasetId
? { ...v, ...newVariableData }
: v
)

// If dataset not found, add new entry
if (!variablesConfig.variables.find(v => v.dataset_id === selectedDatasetId)) {
updatedVariables.push({
dataset_id: selectedDatasetId,
...newVariableData
})
}

const updatedConfig = { ...variablesConfig, variables: updatedVariables }
setVariablesConfig(updatedConfig)
}

// Available datasets for combo selector
const availableDatasets = datasetsConfig?.datasets || []

useEffect(() => {
loadDatasetData()
}, [])
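The `getSelectedDatasetVariables` / `updateSelectedDatasetVariables` helpers implement a filter-then-merge pattern over the array-based config: find the entry whose `dataset_id` matches the selection, merge edits into it, and append a new entry if none exists. A minimal Python sketch of the same logic (the dataset IDs below are hypothetical examples):

```python
# Sketch of the "Type 3" filtered-form pattern: config is an array of
# {dataset_id, ...} entries; edits are merged into the matching entry,
# or a new entry is appended if the dataset has no entry yet.
def get_dataset_variables(config, dataset_id):
    """Return the entry for dataset_id, or an empty stub."""
    for entry in config["variables"]:
        if entry["dataset_id"] == dataset_id:
            return entry
    return {"variables": []}

def update_dataset_variables(config, dataset_id, new_data):
    """Merge new_data into the matching entry, appending if missing."""
    updated = [
        {**entry, **new_data} if entry["dataset_id"] == dataset_id else entry
        for entry in config["variables"]
    ]
    if not any(entry["dataset_id"] == dataset_id for entry in config["variables"]):
        updated.append({"dataset_id": dataset_id, **new_data})
    return {**config, "variables": updated}

# Hypothetical dataset IDs for illustration only.
config = {"variables": [{"dataset_id": "dar", "variables": [{"name": "UR29_Brix"}]}]}
config = update_dataset_variables(config, "test", {"variables": []})
print(get_dataset_variables(config, "dar")["variables"][0]["name"])  # UR29_Brix
```

Returning a new object instead of mutating in place mirrors how the React state setters above expect fresh references.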
@ -473,37 +499,140 @@ function DatasetManager() {
</TabPanel>

<TabPanel p={0} pt={4}>
{variablesSchemaData?.schema && variablesConfig && (
<Card>
<CardHeader>
<Heading size="md">Dataset Variables Configuration</Heading>
<Text fontSize="sm" color="gray.500" mt={1}>
Raw JSON configuration for variables assigned to each dataset
</Text>
</CardHeader>
<CardBody>
<Form
schema={variablesSchemaData.schema}
uiSchema={variablesSchemaData.uiSchema}
formData={variablesConfig}
validator={validator}
widgets={allWidgets}
templates={{ ObjectFieldTemplate: LayoutObjectFieldTemplate }}
onSubmit={({ formData }) => saveVariables(formData)}
onChange={({ formData }) => setVariablesConfig(formData)}
>
<HStack spacing={2} mt={4}>
<Button type="submit" colorScheme="blue">
💾 Save Variables
</Button>
<Button variant="outline" onClick={loadDatasetData}>
🔄 Reset
</Button>
</HStack>
</Form>
</CardBody>
</Card>
)}
{/* Type 3 Form: Filtered Array Forms with Combo Selectors */}
<Card>
<CardHeader>
<Heading size="md">Dataset Variables Configuration</Heading>
<Text fontSize="sm" color="gray.500" mt={1}>
Type 3 Form: Select a dataset, then configure its variables (combo + filtered form pattern)
</Text>
</CardHeader>
<CardBody>
{/* Step 1: Dataset Selector (Combo) */}
<VStack spacing={4} align="stretch">
<Box>
<Text fontSize="sm" fontWeight="bold" mb={2}>
🎯 Step 1: Select Dataset
</Text>
<Select
value={selectedDatasetId}
onChange={(e) => setSelectedDatasetId(e.target.value)}
placeholder="Choose a dataset to configure..."
size="md"
>
{availableDatasets.map(dataset => (
<option key={dataset.id} value={dataset.id}>
📊 {dataset.name} ({dataset.id})
</option>
))}
</Select>
{availableDatasets.length === 0 && (
<Text fontSize="sm" color="orange.500" mt={2}>
⚠️ No datasets available. Configure datasets first in the "Dataset Definitions" tab.
</Text>
)}
</Box>

{/* Step 2: Filtered Variables Form */}
{selectedDatasetId && (
<Box>
<Divider mb={4} />
<Text fontSize="sm" fontWeight="bold" mb={2}>
⚙️ Step 2: Configure Variables for Dataset "{selectedDatasetId}"
</Text>

{/* Create a simplified schema for single dataset variables */}
{(() => {
const selectedDatasetVars = getSelectedDatasetVariables()

// Create a simplified schema for just this dataset's variables
const singleDatasetSchema = {
type: "object",
properties: {
variables: {
type: "array",
title: "Variables",
description: `PLC variables to record in dataset ${selectedDatasetId}`,
items: {
type: "object",
properties: {
name: { type: "string", title: "Variable Name" },
area: {
type: "string",
title: "Memory Area",
enum: ["db", "mw", "m", "pew", "pe", "paw", "pa", "e", "a", "mb"],
default: "db"
},
db: { type: "integer", title: "DB Number", minimum: 1, maximum: 9999 },
offset: { type: "integer", title: "Offset", minimum: 0, maximum: 8191 },
bit: { type: "integer", title: "Bit Position", minimum: 0, maximum: 7 },
type: {
type: "string",
title: "Data Type",
enum: ["real", "int", "dint", "bool", "word", "byte"],
default: "real"
},
streaming: { type: "boolean", title: "Stream to UDP", default: false }
},
required: ["name", "area", "offset", "type"]
}
}
}
}

const singleDatasetUiSchema = {
variables: {
items: {
"ui:layout": [[
{ "name": "name", "width": 3 },
{ "name": "area", "width": 2 },
{ "name": "db", "width": 1 },
{ "name": "offset", "width": 2 },
{ "name": "type", "width": 2 },
{ "name": "streaming", "width": 2 }
]]
}
}
}

return (
<Form
schema={singleDatasetSchema}
uiSchema={singleDatasetUiSchema}
formData={selectedDatasetVars}
validator={validator}
widgets={allWidgets}
templates={{ ObjectFieldTemplate: LayoutObjectFieldTemplate }}
onSubmit={({ formData }) => {
updateSelectedDatasetVariables(formData)
saveVariables(variablesConfig)
}}
onChange={({ formData }) => updateSelectedDatasetVariables(formData)}
>
<HStack spacing={2} mt={4}>
<Button type="submit" colorScheme="blue">
💾 Save Variables for {selectedDatasetId}
</Button>
<Button variant="outline" onClick={loadDatasetData}>
🔄 Reset
</Button>
</HStack>
</Form>
)
})()}
</Box>
)}

{!selectedDatasetId && availableDatasets.length > 0 && (
<Box textAlign="center" py={8}>
<Text color="gray.500">
👆 Select a dataset above to configure its variables
</Text>
</Box>
)}
</VStack>
</CardBody>
</Card>
</TabPanel>
</TabPanels>
</Tabs>
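The IIFE above constructs a JSON Schema on the fly for the selected dataset's variables. A Python sketch of the same construction, paired with a minimal required-keys check instead of full JSON Schema validation (the dataset ID and variable values below are hypothetical):

```python
# Sketch mirroring the frontend's singleDatasetSchema construction,
# reduced to the parts needed for a required-keys check.
def build_single_dataset_schema(dataset_id):
    return {
        "type": "object",
        "properties": {
            "variables": {
                "type": "array",
                "description": f"PLC variables to record in dataset {dataset_id}",
                "items": {
                    "type": "object",
                    "required": ["name", "area", "offset", "type"],
                },
            }
        },
    }

def missing_required(variable, schema):
    """Return the required keys absent from a variable entry."""
    required = schema["properties"]["variables"]["items"]["required"]
    return [key for key in required if key not in variable]

schema = build_single_dataset_schema("dar")  # hypothetical dataset ID
var = {"name": "UR29_Brix", "area": "db", "db": 1, "offset": 0, "type": "real"}
print(missing_required(var, schema))  # []
```

In the real application this validation is performed by RJSF's validator against the full schema; the sketch only illustrates the shape being built.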
@ -613,8 +742,6 @@ export default function Dashboard() {
const [statusLoading, setStatusLoading] = useState(true)
const [statusError, setStatusError] = useState('')

const [schemas, setSchemas] = useState([])
const [currentSchemaId, setCurrentSchemaId] = useState('plc')
const [schemaData, setSchemaData] = useState(null)
const [formData, setFormData] = useState(null)
const [saving, setSaving] = useState(false)

@ -683,28 +810,18 @@ export default function Dashboard() {
}
}, [])

// Load schemas
const loadSchemas = useCallback(async () => {
try {
const schemasData = await api.listSchemas()
setSchemas(schemasData.schemas || [])
} catch (error) {
console.error('Failed to load schemas:', error)
}
}, [])

// Load specific config
const loadConfig = useCallback(async (schemaId) => {
// Load PLC config
const loadConfig = useCallback(async () => {
try {
const [schemaResponse, configData] = await Promise.all([
api.getSchema(schemaId),
api.readConfig(schemaId)
api.getSchema('plc'),
api.readConfig('plc')
])
setSchemaData(schemaResponse)
setFormData(configData)
setMessage('')
} catch (error) {
console.error(`Failed to load config ${schemaId}:`, error)
console.error('Failed to load PLC config:', error)
}
}, [])

@ -712,8 +829,8 @@ export default function Dashboard() {
const saveConfig = useCallback(async (data) => {
try {
setSaving(true)
await api.writeConfig(currentSchemaId, data)
setMessage(`✅ Configuration saved successfully`)
await api.writeConfig('plc', data)
setMessage(`✅ PLC configuration saved successfully`)
setTimeout(() => setMessage(''), 3000)
setFormData(data)
} catch (error) {

@ -721,7 +838,7 @@ export default function Dashboard() {
} finally {
setSaving(false)
}
}, [currentSchemaId])
}, [])

// Load events
const loadEvents = useCallback(async () => {

@ -739,18 +856,12 @@ export default function Dashboard() {
// Effects
useEffect(() => {
loadStatus()
loadSchemas()
loadConfig()
loadEvents()

const cleanup = subscribeSSE()
return cleanup
}, [loadStatus, loadSchemas, loadEvents, subscribeSSE])

useEffect(() => {
if (currentSchemaId) {
loadConfig(currentSchemaId)
}
}, [currentSchemaId, loadConfig])
}, [loadStatus, loadConfig, loadEvents, subscribeSSE])

if (statusLoading) {
return (

@ -794,9 +905,6 @@ export default function Dashboard() {
<TabPanels>
<TabPanel p={0} pt={4}>
<ConfigurationPanel
schemas={schemas}
currentSchemaId={currentSchemaId}
onSchemaChange={setCurrentSchemaId}
schemaData={schemaData}
formData={formData}
onFormChange={setFormData}
main.py
@ -12,6 +12,7 @@ from datetime import datetime
import os
import sys
from core import PLCDataStreamer
from utils.json_manager import JSONManager, SchemaManager

app = Flask(__name__)
CORS(

@ -50,8 +51,10 @@ def project_path(*parts: str) -> str:
return os.path.join(base_dir, *parts)

# Global streamer instance (will be initialized in main)
# Global instances
streamer = None
json_manager = JSONManager()
schema_manager = SchemaManager()

def check_streamer_initialized():

@ -148,103 +151,40 @@ def serve_react_index(path: str = ""):

# ==============================
# Config Schemas & Editor API
# Unified JSON Configuration API
# ==============================

@app.route("/api/config/schemas", methods=["GET"])
def list_config_schemas():
"""List available schemas - UNIFIED SYSTEM."""
error_response = check_streamer_initialized()
if error_response:
return error_response

"""List all available configuration schemas."""
try:
# Unified system: scan the schema directory
schema_dir = "config/schema"
schemas = []

if os.path.exists(schema_dir):
for filename in os.listdir(schema_dir):
if filename.endswith(".schema.json"):
schema_id = filename.replace(".schema.json", "")
schema_path = os.path.join(schema_dir, filename)

try:
with open(schema_path, "r", encoding="utf-8") as f:
schema_data = json.load(f)

schemas.append(
{
"id": schema_id,
"title": schema_data.get("title", schema_id),
"description": schema_data.get("description", ""),
}
)
except Exception as e:
if streamer.logger:
streamer.logger.warning(
f"Could not load schema '{schema_id}': {e}"
)
continue

return jsonify(
{"success": True, "schemas": sorted(schemas, key=lambda x: x["id"])}
)

schemas = schema_manager.list_available_schemas()
return jsonify({"success": True, "schemas": schemas})
except Exception as e:
return jsonify({"success": False, "error": str(e)}), 500

@app.route("/api/config/schema/<schema_id>", methods=["GET"])
def get_config_schema(schema_id):
"""Get a specific schema in JSON Schema format - UNIFIED SYSTEM."""
error_response = check_streamer_initialized()
if error_response:
return error_response

"""Get a specific JSON schema with optional UI schema."""
try:
# Unified system: read directly from the schema file
schema_path = f"config/schema/{schema_id}.schema.json"
ui_schema_path = f"config/schema/ui/{schema_id}.uischema.json"

# Read the main schema
try:
with open(schema_path, "r", encoding="utf-8") as f:
schema = json.load(f)
except FileNotFoundError:
# Get main schema
schema = schema_manager.get_schema(schema_id)
if not schema:
return (
jsonify({"success": False, "error": f"Schema '{schema_id}' not found"}),
404,
)
except json.JSONDecodeError as e:
return (
jsonify(
{
"success": False,
"error": f"Invalid JSON in schema '{schema_id}': {str(e)}",
}
),
500,
)

# Try to read the optional UI schema
ui_schema = None
if os.path.exists(ui_schema_path):
try:
with open(ui_schema_path, "r", encoding="utf-8") as f:
ui_schema = json.load(f)
except Exception as e:
# The UI schema is optional; continue without it
if streamer.logger:
streamer.logger.warning(
f"Could not load UI schema for '{schema_id}': {e}"
)
# Get optional UI schema
ui_schema = schema_manager.get_ui_schema(schema_id)

resp = {"success": True, "schema": schema}
if ui_schema is not None:
resp["ui_schema"] = ui_schema
return jsonify(resp)
response = {"success": True, "schema": schema}
if ui_schema:
response["ui_schema"] = ui_schema

return jsonify(response)

except Exception as e:
return jsonify({"success": False, "error": str(e)}), 500
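Both the old inline endpoint and the new `SchemaManager.list_available_schemas` discover schemas by scanning a directory for `*.schema.json` files and pulling the title out of each file. A self-contained sketch of that scan, using a temporary directory so it runs anywhere (the helper name is illustrative, not the project's API):

```python
import json
import os
import tempfile

# Sketch of the schema-directory scan behind /api/config/schemas:
# every "<id>.schema.json" file becomes an entry whose title is read
# from the schema document itself.
def list_schemas(schema_dir):
    schemas = []
    for filename in os.listdir(schema_dir):
        if filename.endswith(".schema.json"):
            schema_id = filename[: -len(".schema.json")]
            with open(os.path.join(schema_dir, filename), encoding="utf-8") as f:
                schema = json.load(f)
            schemas.append({"id": schema_id, "title": schema.get("title", schema_id)})
    return sorted(schemas, key=lambda s: s["id"])

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "plc.schema.json"), "w", encoding="utf-8") as f:
        json.dump({"title": "PLC Configuration"}, f)
    print(list_schemas(tmp))  # [{'id': 'plc', 'title': 'PLC Configuration'}]
```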
@ -252,138 +192,39 @@ def get_config_schema(schema_id):

@app.route("/api/config/<config_id>", methods=["GET"])
def read_config(config_id):
"""Read current configuration (plc/datasets/plots) - UNIFIED SYSTEM."""
error_response = check_streamer_initialized()
if error_response:
return error_response

"""Read configuration data from JSON file."""
try:
# Unified system: read directly from the corresponding JSON file
config_files = {
"plc": "config/data/plc_config.json",
"dataset-definitions": "config/data/dataset_definitions.json",
"dataset-variables": "config/data/dataset_variables.json",
"plot-definitions": "config/data/plot_definitions.json",
"plot-variables": "config/data/plot_variables.json",
}

if config_id not in config_files:
return (
jsonify(
{
"success": False,
"error": f"Configuration '{config_id}' not supported",
}
),
404,
)

file_path = config_files[config_id]

# Read the JSON file directly
try:
with open(file_path, "r", encoding="utf-8") as f:
data = json.load(f)
return jsonify({"success": True, "data": data})
except FileNotFoundError:
return (
jsonify(
{
"success": False,
"error": f"Configuration file '{file_path}' not found",
}
),
404,
)
except json.JSONDecodeError as e:
return (
jsonify(
{
"success": False,
"error": f"Invalid JSON in '{file_path}': {str(e)}",
}
),
500,
)

data = json_manager.read_json(config_id)
return jsonify({"success": True, "data": data})
except ValueError as e:
return jsonify({"success": False, "error": str(e)}), 400
except Exception as e:
return jsonify({"success": False, "error": str(e)}), 500

@app.route("/api/config/<config_id>", methods=["PUT"])
def write_config(config_id):
"""Overwrite configuration from the JSON request body - UNIFIED SYSTEM."""
error_response = check_streamer_initialized()
if error_response:
return error_response

"""Write configuration data to JSON file."""
try:
payload = request.get_json(force=True, silent=False)
if not payload:
return jsonify({"success": False, "error": "No JSON data provided"}), 400

# Unified system: write directly to the corresponding JSON file
config_files = {
"plc": "config/data/plc_config.json",
"dataset-definitions": "config/data/dataset_definitions.json",
"dataset-variables": "config/data/dataset_variables.json",
"plot-definitions": "config/data/plot_definitions.json",
"plot-variables": "config/data/plot_variables.json",
}
# Write the data
json_manager.write_json(config_id, payload)

if config_id not in config_files:
return (
jsonify(
{
"success": False,
"error": f"Configuration '{config_id}' not supported",
}
),
404,
)

file_path = config_files[config_id]

# Validate JSON against the schema if one exists
schema_path = f"config/schema/{config_id}.schema.json"
if os.path.exists(schema_path):
# Notify backend to reload if it's PLC config
if config_id == "plc" and streamer:
try:
with open(schema_path, "r", encoding="utf-8") as f:
schema = json.load(f)

import jsonschema

jsonschema.validate(payload, schema)
except jsonschema.ValidationError as e:
return (
jsonify(
{
"success": False,
"error": f"Schema validation failed: {e.message}",
"validation_error": str(e),
}
),
400,
)
streamer.config_manager.load_configuration()
except Exception as e:
# If validation itself fails, continue without validating
if streamer.logger:
streamer.logger.warning(
f"Schema validation skipped for {config_id}: {e}"
)

# Write the JSON file directly
os.makedirs(os.path.dirname(file_path), exist_ok=True)
with open(file_path, "w", encoding="utf-8") as f:
json.dump(payload, f, indent=2, ensure_ascii=False)

# If it is the PLC config, refresh the config manager
if config_id == "plc":
streamer.config_manager.load_configuration()
# Log the error but don't fail the save operation
print(f"Warning: Could not reload config in backend: {e}")

return jsonify(
{
"success": True,
"message": f"Configuration '{config_id}' saved successfully",
"file_path": file_path,
}
)
@ -395,70 +236,57 @@ def write_config(config_id):

@app.route("/api/config/<config_id>/export", methods=["GET"])
def export_config(config_id):
"""Export configuration as a JSON download - UNIFIED SYSTEM."""
"""Export configuration as downloadable JSON file."""
try:
data = json_manager.read_json(config_id)

# Prepare download response
content = json.dumps(data, indent=2, ensure_ascii=False)
filename = f"{config_id}_export_{datetime.now().strftime('%Y%m%d_%H%M%S')}.json"

response = Response(content, mimetype="application/json")
response.headers["Content-Disposition"] = f"attachment; filename={filename}"
return response

except ValueError as e:
return jsonify({"success": False, "error": str(e)}), 400
except Exception as e:
return jsonify({"success": False, "error": str(e)}), 500

@app.route("/api/config/<config_id>/reload", methods=["POST"])
def reload_config(config_id):
"""Notify backend to reload configuration from JSON files."""
error_response = check_streamer_initialized()
if error_response:
return error_response

try:
# Unified system: read directly from the corresponding JSON file
config_files = {
"plc": "config/data/plc_config.json",
"dataset-definitions": "config/data/dataset_definitions.json",
"dataset-variables": "config/data/dataset_variables.json",
"plot-definitions": "config/data/plot_definitions.json",
"plot-variables": "config/data/plot_variables.json",
}
if config_id == "plc":
streamer.config_manager.load_configuration()
elif config_id in ["dataset-definitions", "dataset-variables"]:
# Reload dataset configuration
streamer.load_datasets()
elif config_id in ["plot-definitions", "plot-variables"]:
# Reload plot configuration if needed
pass

if config_id not in config_files:
return (
jsonify(
{
"success": False,
"error": f"Configuration '{config_id}' not supported",
}
),
404,
)

file_path = config_files[config_id]

# Read the JSON file directly
try:
with open(file_path, "r", encoding="utf-8") as f:
data = json.load(f)
except FileNotFoundError:
return (
jsonify(
{
"success": False,
"error": f"Configuration file '{file_path}' not found",
}
),
404,
)
except json.JSONDecodeError as e:
return (
jsonify(
{
"success": False,
"error": f"Invalid JSON in '{file_path}': {str(e)}",
}
),
500,
)

# Prepare response with download headers
content = json.dumps(data, indent=2)
filename = f"{config_id}_export.json"
resp = Response(content, mimetype="application/json")
resp.headers["Content-Disposition"] = f"attachment; filename={filename}"
return resp
return jsonify(
{
"success": True,
"message": f"Configuration '{config_id}' reloaded successfully",
}
)

except Exception as e:
return jsonify({"success": False, "error": str(e)}), 500

# ==============================
# Operational API (PLC Control, Streaming, etc.)
# ==============================

@app.route("/api/plc/config", methods=["POST"])
def update_plc_config():
"""Update PLC configuration"""
@ -0,0 +1,143 @@
"""
Unified JSON handling utilities for the application.
Simple CRUD operations for configuration files.
"""

import json
import os
from typing import Dict, Any, Optional, List


class JSONManager:
    """Simplified JSON file manager for configuration data."""

    def __init__(self, base_path: str = "config/data"):
        self.base_path = base_path
        self.config_files = {
            "plc": "plc_config.json",
            "dataset-definitions": "dataset_definitions.json",
            "dataset-variables": "dataset_variables.json",
            "plot-definitions": "plot_definitions.json",
            "plot-variables": "plot_variables.json",
        }

        # Ensure data directory exists
        os.makedirs(self.base_path, exist_ok=True)

    def get_file_path(self, config_id: str) -> str:
        """Get full file path for a config ID."""
        if config_id not in self.config_files:
            raise ValueError(f"Unknown config ID: {config_id}")
        return os.path.join(self.base_path, self.config_files[config_id])

    def read_json(self, config_id: str) -> Dict[str, Any]:
        """Read JSON data from file."""
        file_path = self.get_file_path(config_id)

        try:
            with open(file_path, "r", encoding="utf-8") as f:
                return json.load(f)
        except FileNotFoundError:
            return self._get_default_data(config_id)
        except json.JSONDecodeError as e:
            raise ValueError(f"Invalid JSON in {file_path}: {str(e)}")

    def write_json(self, config_id: str, data: Dict[str, Any]) -> None:
        """Write JSON data to file."""
        file_path = self.get_file_path(config_id)

        # Ensure directory exists
        os.makedirs(os.path.dirname(file_path), exist_ok=True)

        with open(file_path, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=2, ensure_ascii=False)

    def _get_default_data(self, config_id: str) -> Dict[str, Any]:
        """Get default data structure for each config type."""
        defaults = {
            "plc": {
                "csv_config": {
                    "max_days": 30,
                    "max_size_mb": 1000,
                    "records_directory": "records",
                    "rotation_enabled": True,
                },
                "plc_config": {"ip": "192.168.1.100", "rack": 0, "slot": 2},
                "udp_config": {
                    "host": "127.0.0.1",
                    "port": 9870,
                    "sampling_interval": 1.0,
                },
            },
            "dataset-definitions": {"datasets": []},
            "dataset-variables": {"dataset_variables": []},
            "plot-definitions": {"plots": []},
            "plot-variables": {"plot_variables": []},
        }
        return defaults.get(config_id, {})

    def list_available_configs(self) -> List[str]:
        """List all available config IDs."""
        return list(self.config_files.keys())

    def file_exists(self, config_id: str) -> bool:
        """Check if config file exists."""
        try:
            file_path = self.get_file_path(config_id)
            return os.path.exists(file_path)
        except ValueError:
            return False


class SchemaManager:
    """Simple schema file manager."""

    def __init__(self, schema_path: str = "config/schema"):
        self.schema_path = schema_path
        self.ui_schema_path = os.path.join(schema_path, "ui")

    def get_schema(self, schema_id: str) -> Optional[Dict[str, Any]]:
        """Get JSON schema by ID."""
        schema_file = os.path.join(self.schema_path, f"{schema_id}.schema.json")

        try:
            with open(schema_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            return None

    def get_ui_schema(self, schema_id: str) -> Optional[Dict[str, Any]]:
        """Get UI schema by ID."""
        ui_schema_file = os.path.join(self.ui_schema_path, f"{schema_id}.uischema.json")

        try:
            with open(ui_schema_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            return None

    def list_available_schemas(self) -> List[Dict[str, str]]:
        """List all available schemas."""
        schemas = []

        if not os.path.exists(self.schema_path):
            return schemas

        for filename in os.listdir(self.schema_path):
            if filename.endswith(".schema.json"):
                schema_id = filename.replace(".schema.json", "")

                # Try to get title from schema
                schema = self.get_schema(schema_id)
                title = (
                    schema.get("title", schema_id.title())
                    if schema
                    else schema_id.title()
                )
                description = schema.get("description", "") if schema else ""

                schemas.append(
                    {"id": schema_id, "title": title, "description": description}
                )

        return sorted(schemas, key=lambda x: x["id"])
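A key behavior of `JSONManager.read_json` is that a missing file is not an error: it falls back to `_get_default_data`, so first-run reads always return a usable structure. A self-contained sketch of that read-with-fallback pattern (the helper name and sample data are illustrative):

```python
import json
import os
import tempfile

# Sketch of JSONManager's read-with-fallback behavior: a missing file
# yields the default structure instead of raising, so first-run reads
# always succeed.
def read_json_or_default(file_path, default):
    try:
        with open(file_path, encoding="utf-8") as f:
            return json.load(f)
    except FileNotFoundError:
        return default

base = tempfile.mkdtemp()
path = os.path.join(base, "dataset_definitions.json")
print(read_json_or_default(path, {"datasets": []}))  # {'datasets': []}

with open(path, "w", encoding="utf-8") as f:
    json.dump({"datasets": [{"id": "dar"}]}, f)  # "dar" is a hypothetical ID
print(read_json_or_default(path, {"datasets": []}))  # {'datasets': [{'id': 'dar'}]}
```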
@ -1,19 +1,39 @@
import json
import jsonschema

# Load schema and data
with open("config/schema/plc.schema.json", "r") as f:
# Load schema and data for dataset-definitions
with open("config/schema/dataset-definitions.schema.json", "r") as f:
schema = json.load(f)

with open("config/data/plc_config.json", "r") as f:
with open("config/data/dataset_definitions.json", "r") as f:
data = json.load(f)

# Validate
try:
jsonschema.validate(data, schema)
print("✅ Validation successful!")
print("✅ Dataset definitions validation successful!")
except jsonschema.ValidationError as e:
print("❌ Validation error:")
print("❌ Dataset definitions validation error:")
print(f'Property path: {".".join(str(x) for x in e.absolute_path)}')
print(f"Message: {e.message}")
print(f"Failed value: {e.instance}")
except Exception as e:
print(f"❌ Other error: {e}")

print("\n" + "=" * 50 + "\n")

# Also validate the PLC config
with open("config/schema/plc.schema.json", "r") as f:
plc_schema = json.load(f)

with open("config/data/plc_config.json", "r") as f:
plc_data = json.load(f)

try:
jsonschema.validate(plc_data, plc_schema)
print("✅ PLC config validation successful!")
except jsonschema.ValidationError as e:
print("❌ PLC config validation error:")
print(f'Property path: {".".join(str(x) for x in e.absolute_path)}')
print(f"Message: {e.message}")
print(f"Failed value: {e.instance}")
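The script above reports the property path of the first violation via `jsonschema`. A dependency-free sketch of the same idea for the array-based dataset definitions, useful when `jsonschema` is unavailable (the required keys and dataset values are assumptions for illustration):

```python
# Dependency-free sketch of the check performed above with jsonschema:
# verify each dataset entry has the assumed required keys and report
# the dotted path of the first violation.
def first_violation(data, required=("id", "name")):
    for i, dataset in enumerate(data.get("datasets", [])):
        for key in required:
            if key not in dataset:
                return f"datasets.{i}.{key}"
    return None

good = {"datasets": [{"id": "dar", "name": "DAR"}]}  # hypothetical values
bad = {"datasets": [{"id": "dar"}]}
print(first_violation(good))  # None
print(first_violation(bad))   # datasets.0.name
```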