feat: Add detailed application event logging and improve historical data loading from CSV files
- Added multiple application startup, dataset activation, and plot session creation events to application_events.json.
- Improved the get_historical_data function in main.py with more robust error handling and detailed debug messages.
- Updated the system state in system_state.json to reflect the changes in active datasets and the last update timestamp.
- Documented the improvements in the Evolution Memory, highlighting the implementation of historical data loading and CSV header validation.
parent 91718e7bf7
commit 4f7b55bd0d

@ -693,3 +693,40 @@ ChartjsPlot render → Chart.js integration → streaming setup
- Cause: the `Form` used `formData={data[key]}` and `onChange` was not controlled in edit mode, so any re-render restored the original value.
- Solution: `FormTable.jsx` now uses a local `editingFormData` state while `editingKey === key`. It is initialized when Edit is pressed, `onChange` updates `editingFormData`, and the `Form`'s `formData` is fed from that state until the user saves or cancels (a sketch follows below).
- Impact: while editing an item, changes across fields are kept correctly until Save is pressed.
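Below is a minimal sketch of that edit-mode pattern, assuming an RJSF-style `Form` (a `formData` prop, `onChange`/`onSubmit` receiving `{ formData }`) passed in from the real `FormTable.jsx`; all other names are illustrative.

```jsx
// Minimal sketch of the edit-mode state described above; only the
// editingKey / editingFormData flow mirrors the actual fix.
import React, { useState } from 'react';

function FormTableEntry({ Form, itemKey, schema, data, onSave }) {
  const [editingKey, setEditingKey] = useState(null);
  const [editingFormData, setEditingFormData] = useState(null);
  const isEditing = editingKey === itemKey;

  return (
    <div>
      <Form
        schema={schema}
        // While editing, the Form is fed from local state so a re-render does
        // not restore the original value from data[itemKey].
        formData={isEditing ? editingFormData : data[itemKey]}
        readonly={!isEditing}
        onChange={({ formData }) => isEditing && setEditingFormData(formData)}
        onSubmit={({ formData }) => { onSave(itemKey, formData); setEditingKey(null); }}
      />
      {!isEditing && (
        <button onClick={() => { setEditingKey(itemKey); setEditingFormData(data[itemKey]); }}>
          Edit
        </button>
      )}
    </div>
  );
}
```

Keeping the in-progress values in local state means the parent's `data` is only touched on Save, so re-renders cannot clobber the edits.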
## 2025-08-14

- Request (summary): implement loading of historical data from CSV files when plots start, so they can begin immediately with data covering the configured span. Fix Chart.js errors related to addEventListener being called on null elements.
- Decisions and changes:
- Fixed a critical error in `frontend/src/components/ChartjsPlot.jsx` where `addEventListener` was called on null DOM elements during Chart.js initialization/cleanup.
- Added exhaustive DOM validations: the canvas must be mounted, a 2D context must be available, and the element must be in the DOM tree before creating charts (see the guard sketch after this list).
- Improved the cleanup function with try-catch to avoid errors during component unmount, especially under React StrictMode.
- Enabled the historical data loading functionality that had been temporarily disabled.
- The `/api/plots/historical` API already existed and worked correctly, using pandas to read CSV files organized by date (DD-MM-YYYY).
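The DOM guards mentioned above are condensed below from the `ChartjsPlot.jsx` changes further down in this diff; `canvasRef`, `chartRef`, and `chartConfig` follow the component's own names, the rest is a sketch.

```js
// Guard checks before creating a Chart.js instance (condensed from the
// ChartjsPlot.jsx changes shown in the diff below).
const canvas = canvasRef.current;
if (!canvas || !canvas.getContext) {
  console.warn('⚠️ Canvas not properly mounted, delaying chart creation...');
  return;
}
if (!document.contains(canvas)) {
  console.warn('⚠️ Canvas not in DOM, skipping chart creation...');
  return;
}
const ctx = canvas.getContext('2d');
if (!ctx) {
  console.warn('⚠️ Could not get 2D context from canvas');
  return;
}
// Only after these checks is the chart created.
chartRef.current = new Chart(ctx, chartConfig);
```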
- Relevant technical knowledge:
- CSV files are stored in `records/{date}/` with the naming pattern `{prefix}_{hour}.csv` and contain a timestamp plus the dataset variables.
- The historical load looks for data within the specified time window (time_window) and pre-populates the chart datasets with it (see the grouping sketch after this list).
- Chart.js with the streaming plugin requires strict DOM validation to avoid event-listener errors on null elements.
- The backend keeps backward compatibility for variable formats (arrays, objects with keys, and the variable_name format).
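A condensed sketch of the pre-population step, following the `dataByVariable` grouping visible in the component diff below; the response points are assumed to have the `{ timestamp, variable, value }` shape returned by the endpoint.

```js
// Group historical points per variable, sort chronologically, and assign them
// to the matching chart datasets before streaming starts.
const dataByVariable = {};
historicalData.forEach(({ variable, timestamp, value }) => {
  if (!dataByVariable[variable]) dataByVariable[variable] = [];
  dataByVariable[variable].push({ x: new Date(timestamp), y: value });
});

enabledVariables.forEach((variableInfo, index) => {
  const points = dataByVariable[variableInfo.name] || [];
  if (points.length > 0) {
    points.sort((a, b) => a.x - b.x);          // chronological order
    chart.data.datasets[index].data = points;  // pre-populate the dataset
  }
});
chart.update('quiet');  // redraw without animating the backfill
```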
- Historical loading architecture:
- Frontend: `loadHistoricalData()` → POST `/api/plots/historical` with the variables and time_window (a request/response sketch follows this list).
- Backend: searches the CSV files for the relevant dates, filters by time range, and extracts data for the matching variables.
- Chart.js: pre-populates the datasets with chronologically ordered historical data before starting real-time streaming.
- Benefit: plots show immediate context without waiting for new data from the PLC.
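A minimal sketch of the frontend call and of the response shape, based on the `loadHistoricalData()` fetch and the Flask endpoint shown in the diffs below; the variable names in the usage comment are purely illustrative.

```js
// POST /api/plots/historical — request body and response shape.
async function loadHistoricalData(variables, timeWindow) {
  const response = await fetch('/api/plots/historical', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Accept': 'application/json' },
    body: JSON.stringify({ variables, time_window: timeWindow }),
  });
  if (!response.ok) throw new Error(`HTTP ${response.status}`);

  // Response: { data: [{ timestamp, variable, value }, ...],
  //             time_range: { start, end }, variables_found, total_points }
  const payload = await response.json();
  return payload.data || [];
}

// Usage (illustrative variable names): pre-load the last 60 s for two variables.
// const points = await loadHistoricalData(['temp_1', 'pressure_1'], 60);
```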
- Additional error fixes:
- **HTTP 500 error**: added exhaustive debugging to the `/api/plots/historical` endpoint, with detailed logs to diagnose problems with pandas, CSV files, and paths.
- **"Canvas is already in use" error**: improved the Chart.js validation and cleanup system for React StrictMode, which runs useEffect twice in development.
- **"Cannot set properties of null" error**: strengthened the DOM validation before creating charts, verifying that the canvas is mounted, a 2D context is available, and the element is in the DOM tree.
- **Increased delay**: React StrictMode needs 50 ms instead of 10 ms to complete cleanup between renders (see the retry sketch after this list).
- **Final validation**: comprehensive check before creating a chart that prevents canvas conflicts with the Chart.js registry.
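A condensed sketch of the delayed creation with registry re-checks used to survive StrictMode's doubled mount/cleanup, following the `checkAndCreate` retry logic in the diff below; `setError` and `chartRef` come from the component, the helper name is illustrative.

```js
// Wait 50 ms, then verify the canvas is really free before creating the chart;
// retry a few times with backoff if a previous instance is still registered.
const createWhenCanvasIsFree = (ctx, chartConfig, maxRetries = 3) => {
  let retryCount = 0;
  const checkAndCreate = () => {
    const existing = Chart.getChart(ctx.canvas);  // Chart.js registry lookup
    if (existing || chartRef.current) {
      retryCount += 1;
      if (retryCount >= maxRetries) {
        setError('Chart initialization failed: Canvas cleanup failed');
        return;
      }
      try { existing?.destroy(); } catch (e) { console.warn('Retry cleanup error:', e); }
      chartRef.current = null;
      setTimeout(checkAndCreate, 20 * retryCount);  // back off and retry
      return;
    }
    chartRef.current = new Chart(ctx, chartConfig); // canvas is free now
  };
  setTimeout(checkAndCreate, 50);  // 50 ms instead of 10 ms for StrictMode
};
```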
- Safety states implemented:
- Verification of Chart.js dependencies when the component starts.
- Validation of the canvas DOM before obtaining the 2D context.
- Aggressive cleanup with try-catch to avoid errors on unmount (a cleanup sketch follows this list).
- Detection of existing charts in the Chart.js registry before creating new ones.
- Error handling with loading/error states that report problems to the user.
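A condensed sketch of the unmount cleanup, following the useEffect cleanup in the component diff below: pause the realtime scale, clear its timer, then stop and destroy the chart, all wrapped in try-catch so unmount never throws.

```js
// useEffect cleanup on unmount (condensed from ChartjsPlot.jsx).
return () => {
  try {
    const chart = chartRef.current;
    if (chart) {
      const rt = chart.options?.scales?.x?.realtime;
      if (rt) {
        rt.pause = true;             // stop the streaming plugin first
        if (rt._timer) {
          clearInterval(rt._timer);  // kill any pending realtime timer
          rt._timer = null;
        }
      }
      chart.stop();                  // cancel running animations
      chart.destroy();               // release the canvas from the registry
    }
  } catch (error) {
    console.warn('⚠️ Chart cleanup error:', error);
  } finally {
    chartRef.current = null;
  }
};
```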
File diff suppressed because it is too large
@ -7,31 +7,31 @@ let dependenciesValid = false;
|
|||
|
||||
const checkChartDependencies = () => {
|
||||
if (dependenciesChecked) return dependenciesValid;
|
||||
|
||||
|
||||
try {
|
||||
if (typeof window === 'undefined') {
|
||||
console.warn('⚠️ Window not available, skipping dependency check');
|
||||
return false;
|
||||
}
|
||||
|
||||
|
||||
// Check for Chart.js
|
||||
if (!window.Chart) {
|
||||
console.error('❌ Chart.js not loaded');
|
||||
return false;
|
||||
}
|
||||
|
||||
|
||||
// Check for chartjs-plugin-streaming
|
||||
const hasStreamingPlugin = !!(
|
||||
window.Chart.registry?.scales?.get?.('realtime') ||
|
||||
window.ChartStreaming ||
|
||||
window.chartjsPluginStreaming
|
||||
);
|
||||
|
||||
|
||||
if (!hasStreamingPlugin) {
|
||||
console.error('❌ chartjs-plugin-streaming not loaded or realtime scale not registered');
|
||||
return false;
|
||||
}
|
||||
|
||||
|
||||
console.log('✅ Chart.js dependencies verified');
|
||||
dependenciesValid = true;
|
||||
return true;
|
||||
|
@ -146,10 +146,10 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
const loadHistoricalData = useCallback(async (variables, timeWindow) => {
|
||||
try {
|
||||
console.log(`📊 Loading historical data for ${variables.length} variables (${timeWindow}s window)...`);
|
||||
|
||||
|
||||
const response = await fetch('/api/plots/historical', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Accept': 'application/json'
|
||||
},
|
||||
|
@ -158,11 +158,11 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
time_window: timeWindow
|
||||
})
|
||||
});
|
||||
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`HTTP ${response.status}`);
|
||||
}
|
||||
|
||||
|
||||
const data = await response.json();
|
||||
console.log(`📊 Historical data response:`, data);
|
||||
return data.data || [];
|
||||
|
@ -186,6 +186,19 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
|
||||
const Chart = window.Chart;
|
||||
|
||||
// Ensure canvas is properly mounted in DOM and has context
|
||||
const canvas = canvasRef.current;
|
||||
if (!canvas || !canvas.getContext) {
|
||||
console.warn('⚠️ Canvas not properly mounted, delaying chart creation...');
|
||||
return;
|
||||
}
|
||||
|
||||
// Additional check for canvas being in DOM
|
||||
if (!document.contains(canvas)) {
|
||||
console.warn('⚠️ Canvas not in DOM, skipping chart creation...');
|
||||
return;
|
||||
}
|
||||
|
||||
// Ensure zoom plugin is registered only if available to avoid plugin errors
|
||||
let zoomAvailable = false;
|
||||
try {
|
||||
|
@ -212,74 +225,110 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
throw new Error('Realtime scale not available. Ensure chartjs-plugin-streaming v2.x is loaded after Chart.js.');
|
||||
}
|
||||
|
||||
const ctx = canvasRef.current.getContext('2d');
|
||||
const ctx = canvas.getContext('2d');
|
||||
if (!ctx) {
|
||||
console.warn('⚠️ Could not get 2D context from canvas');
|
||||
return;
|
||||
}
|
||||
|
||||
// CRITICAL: More aggressive cleanup check - StrictMode protection
|
||||
// CRITICAL: Enhanced cleanup for React StrictMode
|
||||
let needsCleanup = false;
|
||||
const existingChartInRegistry = Chart.getChart(ctx.canvas);
|
||||
const hasChartReference = !!chartRef.current;
|
||||
const hasCanvasChartProperty = !!ctx.canvas.chartjs;
|
||||
|
||||
|
||||
if (existingChartInRegistry || hasChartReference || hasCanvasChartProperty) {
|
||||
console.log(`🚨 FORCE CLEANUP - Registry: ${!!existingChartInRegistry}, Ref: ${hasChartReference}, Canvas: ${hasCanvasChartProperty}`);
|
||||
|
||||
needsCleanup = true;
|
||||
console.log(`🚨 AGGRESSIVE CLEANUP - Registry: ${!!existingChartInRegistry}, Ref: ${hasChartReference}, Canvas: ${hasCanvasChartProperty}`);
|
||||
|
||||
// Stop all timers first - most critical to prevent async errors
|
||||
const cleanupTimers = (chart) => {
|
||||
if (!chart) return;
|
||||
|
||||
try {
|
||||
// Stop streaming plugin timers
|
||||
const rt = chart.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
if (rt._onProgress) {
|
||||
rt._onProgress = null;
|
||||
}
|
||||
}
|
||||
|
||||
// Stop Chart.js animations
|
||||
if (chart.stop) chart.stop();
|
||||
|
||||
// Clear any chart-level timers
|
||||
if (chart._bufferedRender) {
|
||||
chart._bufferedRender = null;
|
||||
}
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error cleaning timers:', e);
|
||||
}
|
||||
};
|
||||
|
||||
// Cleanup existing registry entry
|
||||
if (existingChartInRegistry) {
|
||||
try {
|
||||
const rt = existingChartInRegistry.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
existingChartInRegistry.update('none');
|
||||
}
|
||||
existingChartInRegistry.stop();
|
||||
cleanupTimers(existingChartInRegistry);
|
||||
existingChartInRegistry.destroy();
|
||||
console.log('✅ Force destroyed registry chart');
|
||||
console.log('✅ Registry chart destroyed');
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error force destroying registry chart:', e);
|
||||
console.warn('⚠️ Error destroying registry chart:', e);
|
||||
// Force removal from registry even if destroy fails
|
||||
try {
|
||||
delete Chart.instances[existingChartInRegistry.id];
|
||||
} catch (registryError) {
|
||||
console.warn('⚠️ Could not remove from registry:', registryError);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
// Cleanup reference
|
||||
if (chartRef.current) {
|
||||
try {
|
||||
const rt = chartRef.current.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
chartRef.current.update('none');
|
||||
}
|
||||
chartRef.current.stop();
|
||||
cleanupTimers(chartRef.current);
|
||||
chartRef.current.destroy();
|
||||
console.log('✅ Force destroyed reference chart');
|
||||
console.log('✅ Reference chart destroyed');
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error force destroying reference chart:', e);
|
||||
console.warn('⚠️ Error destroying reference chart:', e);
|
||||
} finally {
|
||||
chartRef.current = null;
|
||||
}
|
||||
}
|
||||
|
||||
// Force cleanup canvas properties
|
||||
if (ctx.canvas.chartjs) {
|
||||
delete ctx.canvas.chartjs;
|
||||
|
||||
// Force cleanup canvas properties and reset
|
||||
try {
|
||||
if (ctx.canvas.chartjs) {
|
||||
delete ctx.canvas.chartjs;
|
||||
}
|
||||
|
||||
// Clear canvas ID if it exists
|
||||
if (ctx.canvas.id) {
|
||||
const chartId = ctx.canvas.id;
|
||||
if (Chart.instances && Chart.instances[chartId]) {
|
||||
delete Chart.instances[chartId];
|
||||
}
|
||||
}
|
||||
|
||||
// Reset canvas completely
|
||||
ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
|
||||
ctx.canvas.style.width = '';
|
||||
ctx.canvas.style.height = '';
|
||||
|
||||
// Force canvas to lose focus
|
||||
if (document.activeElement === ctx.canvas) {
|
||||
ctx.canvas.blur();
|
||||
}
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error resetting canvas:', e);
|
||||
}
|
||||
|
||||
// Aggressive canvas reset
|
||||
ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
|
||||
ctx.canvas.style.width = '';
|
||||
ctx.canvas.style.height = '';
|
||||
|
||||
// Force blur to remove focus
|
||||
if (document.activeElement === ctx.canvas) {
|
||||
ctx.canvas.blur();
|
||||
}
|
||||
|
||||
console.log('🧹 Force cleanup completed');
|
||||
|
||||
console.log('🧹 Aggressive cleanup completed');
|
||||
}
|
||||
|
||||
// Enhanced chart cleanup - check for multiple potential chart instances
|
||||
|
@ -293,10 +342,10 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
rt.pause = true;
|
||||
existingChart.update('none');
|
||||
}
|
||||
|
||||
|
||||
// Stop any running animations and timers
|
||||
existingChart.stop();
|
||||
|
||||
|
||||
// Destroy the chart instance
|
||||
existingChart.destroy();
|
||||
console.log('✅ Existing Chart.js instance destroyed');
|
||||
|
@ -309,7 +358,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
if (chartRef.current) {
|
||||
try {
|
||||
console.log('🔄 Destroying chart reference...');
|
||||
|
||||
|
||||
// Stop streaming plugin timers
|
||||
const rt = chartRef.current.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
|
@ -321,13 +370,13 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
}
|
||||
chartRef.current.update('none');
|
||||
}
|
||||
|
||||
|
||||
// Clear any animation frames and timers
|
||||
chartRef.current.stop();
|
||||
|
||||
|
||||
// Destroy the chart
|
||||
chartRef.current.destroy();
|
||||
|
||||
|
||||
console.log('✅ Chart reference destroyed successfully');
|
||||
} catch (destroyError) {
|
||||
console.warn('⚠️ Error destroying chart reference:', destroyError);
|
||||
|
@ -344,11 +393,11 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
|
||||
// Clear the canvas completely and reset size
|
||||
ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
|
||||
|
||||
|
||||
// Reset canvas styling to ensure clean state
|
||||
ctx.canvas.style.width = '';
|
||||
ctx.canvas.style.height = '';
|
||||
|
||||
|
||||
// Force canvas to lose focus if it has it
|
||||
if (document.activeElement === ctx.canvas) {
|
||||
ctx.canvas.blur();
|
||||
|
@ -359,13 +408,13 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
|
||||
const datasets = enabledVariables.map((variableInfo, index) => {
|
||||
const color = variableInfo.color || getColor(variableInfo.name, index);
|
||||
|
||||
|
||||
// Get style configuration with defaults
|
||||
const lineTension = (typeof config.line_tension === 'number') ? config.line_tension : 0.4;
|
||||
const stepped = config.stepped === true;
|
||||
const pointRadius = (typeof config.point_radius === 'number') ? config.point_radius : 1;
|
||||
const pointHoverRadius = (typeof config.point_hover_radius === 'number') ? config.point_hover_radius : 4;
|
||||
|
||||
|
||||
return {
|
||||
label: variableInfo.label, // Use display label for chart legend
|
||||
data: [],
|
||||
|
@ -390,20 +439,17 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
// Load historical data to pre-populate the chart
|
||||
const timeWindow = config.time_window || 60;
|
||||
const variableNames = enabledVariables.map(v => v.name);
|
||||
|
||||
// TEMPORARILY DISABLED: Historical data loading due to backend HTTP 500 error
|
||||
// TODO: Fix backend /api/plots/historical endpoint
|
||||
console.log(`📊 Historical data loading temporarily disabled for ${variableNames.length} variables`);
|
||||
|
||||
/*
|
||||
if (variableNames.length > 0) {
|
||||
|
||||
// Load historical data only if session is active (started)
|
||||
if (variableNames.length > 0 && session?.is_active && !session?.is_paused) {
|
||||
setIsLoadingHistorical(true);
|
||||
try {
|
||||
console.log(`📊 Loading historical data for ${variableNames.length} variables (${timeWindow}s window)...`);
|
||||
const historicalData = await loadHistoricalData(variableNames, timeWindow);
|
||||
|
||||
|
||||
if (historicalData.length > 0) {
|
||||
console.log(`📊 Loaded ${historicalData.length} historical data points`);
|
||||
|
||||
|
||||
// Group data by variable and add to appropriate dataset
|
||||
const dataByVariable = {};
|
||||
historicalData.forEach(point => {
|
||||
|
@ -416,7 +462,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
y: value
|
||||
});
|
||||
});
|
||||
|
||||
|
||||
// Add historical data to datasets
|
||||
enabledVariables.forEach((variableInfo, index) => {
|
||||
const historicalPoints = dataByVariable[variableInfo.name] || [];
|
||||
|
@ -427,7 +473,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
console.log(`📊 Added ${historicalPoints.length} historical points for ${variableInfo.name}`);
|
||||
}
|
||||
});
|
||||
|
||||
|
||||
// Update data points counter
|
||||
const totalHistoricalPoints = Object.values(dataByVariable).reduce((sum, points) => sum + points.length, 0);
|
||||
setDataPointsCount(totalHistoricalPoints);
|
||||
|
@ -439,8 +485,9 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
} finally {
|
||||
setIsLoadingHistorical(false);
|
||||
}
|
||||
} else if (variableNames.length > 0) {
|
||||
console.log(`📊 Historical data loading skipped - session not active (is_active: ${session?.is_active}, is_paused: ${session?.is_paused})`);
|
||||
}
|
||||
*/
|
||||
|
||||
const yMinInitial = (typeof config.y_min === 'number' && isFinite(config.y_min)) ? config.y_min : undefined;
|
||||
const yMaxInitial = (typeof config.y_max === 'number' && isFinite(config.y_max)) ? config.y_max : undefined;
|
||||
|
@ -507,16 +554,16 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
...(zoomAvailable ? {
|
||||
zoom: {
|
||||
// Solo habilitar zoom/pan en modo fullscreen
|
||||
pan: {
|
||||
enabled: !!session?.isFullscreen,
|
||||
mode: 'x',
|
||||
modifierKey: 'shift'
|
||||
pan: {
|
||||
enabled: !!session?.isFullscreen,
|
||||
mode: 'x',
|
||||
modifierKey: 'shift'
|
||||
},
|
||||
zoom: {
|
||||
drag: { enabled: !!session?.isFullscreen },
|
||||
wheel: { enabled: !!session?.isFullscreen },
|
||||
pinch: { enabled: !!session?.isFullscreen },
|
||||
mode: 'x'
|
||||
zoom: {
|
||||
drag: { enabled: !!session?.isFullscreen },
|
||||
wheel: { enabled: !!session?.isFullscreen },
|
||||
pinch: { enabled: !!session?.isFullscreen },
|
||||
mode: 'x'
|
||||
}
|
||||
}
|
||||
} : {})
|
||||
|
@ -539,33 +586,80 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
}
|
||||
};
|
||||
|
||||
// Final safety check before creating new chart
|
||||
if (ctx.canvas.chartjs) {
|
||||
console.warn('⚠️ Canvas still has Chart.js reference, forcing cleanup...');
|
||||
try {
|
||||
const existingChart = window.Chart.getChart(ctx.canvas);
|
||||
if (existingChart) {
|
||||
existingChart.destroy();
|
||||
// Final comprehensive check with retry logic for React StrictMode
|
||||
let retryCount = 0;
|
||||
const maxRetries = 3;
|
||||
|
||||
const checkAndCreate = async () => {
|
||||
await new Promise(resolve => setTimeout(resolve, 10)); // Small delay for cleanup completion
|
||||
|
||||
const finalExistingChart = Chart.getChart(ctx.canvas);
|
||||
const hasCanvasProperty = !!ctx.canvas.chartjs;
|
||||
const hasReference = !!chartRef.current;
|
||||
|
||||
if (finalExistingChart || hasCanvasProperty || hasReference) {
|
||||
retryCount++;
|
||||
console.warn(`⚠️ Attempt ${retryCount}: Chart still exists after cleanup`);
|
||||
console.warn(`Registry: ${!!finalExistingChart}, Canvas: ${hasCanvasProperty}, Ref: ${hasReference}`);
|
||||
|
||||
if (retryCount < maxRetries) {
|
||||
// Force additional cleanup attempt
|
||||
if (finalExistingChart) {
|
||||
try {
|
||||
finalExistingChart.destroy();
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error in retry cleanup:', e);
|
||||
}
|
||||
}
|
||||
if (hasCanvasProperty) {
|
||||
delete ctx.canvas.chartjs;
|
||||
}
|
||||
if (hasReference) {
|
||||
chartRef.current = null;
|
||||
}
|
||||
|
||||
// Wait and retry
|
||||
setTimeout(checkAndCreate, 20 * retryCount);
|
||||
return;
|
||||
} else {
|
||||
console.error('❌ Failed to cleanup chart after maximum retries');
|
||||
setError('Chart initialization failed: Canvas cleanup failed');
|
||||
setIsLoading(false);
|
||||
return;
|
||||
}
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error cleaning up existing chart reference:', e);
|
||||
}
|
||||
delete ctx.canvas.chartjs;
|
||||
|
||||
// Safe to create chart
|
||||
try {
|
||||
console.log('🚀 Creating new Chart.js instance...');
|
||||
chartRef.current = new Chart(ctx, chartConfig);
|
||||
console.log('✅ Chart created successfully');
|
||||
} catch (chartError) {
|
||||
console.error('❌ Chart creation failed:', chartError);
|
||||
setError(`Chart creation failed: ${chartError.message}`);
|
||||
setIsLoading(false);
|
||||
return;
|
||||
}
|
||||
|
||||
// Continue with post-creation setup
|
||||
sessionDataRef.current.isRealTimeMode = true;
|
||||
sessionDataRef.current.noDataCycles = 0;
|
||||
const initialPaused = !session?.is_active || session?.is_paused;
|
||||
sessionDataRef.current.ingestPaused = initialPaused;
|
||||
sessionDataRef.current.isPaused = initialPaused;
|
||||
console.log(`✅ Plot ${session?.session_id}: Real-time Streaming enabled`);
|
||||
|
||||
setIsLoading(false);
|
||||
setError(null);
|
||||
};
|
||||
|
||||
// If cleanup was needed, wait a bit longer before checking
|
||||
if (needsCleanup) {
|
||||
setTimeout(checkAndCreate, 50);
|
||||
} else {
|
||||
checkAndCreate();
|
||||
}
|
||||
|
||||
console.log('🚀 Creating new Chart.js instance...');
|
||||
chartRef.current = new Chart(ctx, chartConfig);
|
||||
sessionDataRef.current.isRealTimeMode = true;
|
||||
sessionDataRef.current.noDataCycles = 0;
|
||||
// Sync ingest pause state with initial chart pause
|
||||
const initialPaused = !session?.is_active || session?.is_paused;
|
||||
sessionDataRef.current.ingestPaused = initialPaused;
|
||||
sessionDataRef.current.isPaused = initialPaused;
|
||||
console.log(`✅ Plot ${session?.session_id}: Real-time Streaming enabled`);
|
||||
|
||||
setIsLoading(false);
|
||||
setError(null);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error creating chart:', error);
|
||||
setError(error.message);
|
||||
|
@ -593,11 +687,11 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
if (!response.ok) return;
|
||||
|
||||
const plotData = await response.json();
|
||||
|
||||
|
||||
// Add new data to chart
|
||||
const pointsAdded = addNewDataToStreaming(plotData, now);
|
||||
updatePointsCounter(plotData);
|
||||
|
||||
|
||||
if (pointsAdded > 0) {
|
||||
console.log(`📊 Plot ${sessionId}: Added ${pointsAdded} points to chart`);
|
||||
}
|
||||
|
@ -763,12 +857,71 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
sessionData.userOverrideUntil = Date.now() + 3000;
|
||||
}, []);
|
||||
|
||||
const resumeStreaming = useCallback(() => {
|
||||
const resumeStreaming = useCallback(async () => {
|
||||
const sessionData = sessionDataRef.current;
|
||||
if (!chartRef.current) return;
|
||||
|
||||
// Load historical data when resuming if chart is empty
|
||||
const chart = chartRef.current;
|
||||
const hasHistoricalData = chart.data.datasets.some(dataset => dataset.data && dataset.data.length > 0);
|
||||
|
||||
if (!hasHistoricalData && session?.config) {
|
||||
const config = session.config;
|
||||
const enabledVariables = getEnabledVariables(config.variables);
|
||||
const timeWindow = config.time_window || 60;
|
||||
const variableNames = enabledVariables.map(v => v.name);
|
||||
|
||||
if (variableNames.length > 0) {
|
||||
setIsLoadingHistorical(true);
|
||||
try {
|
||||
console.log(`📊 Loading historical data on start for ${variableNames.length} variables (${timeWindow}s window)...`);
|
||||
const historicalData = await loadHistoricalData(variableNames, timeWindow);
|
||||
|
||||
if (historicalData.length > 0) {
|
||||
console.log(`📊 Loaded ${historicalData.length} historical data points on start`);
|
||||
|
||||
// Group data by variable and add to appropriate dataset
|
||||
const dataByVariable = {};
|
||||
historicalData.forEach(point => {
|
||||
const { variable, timestamp, value } = point;
|
||||
if (!dataByVariable[variable]) {
|
||||
dataByVariable[variable] = [];
|
||||
}
|
||||
dataByVariable[variable].push({
|
||||
x: new Date(timestamp),
|
||||
y: value
|
||||
});
|
||||
});
|
||||
|
||||
// Add historical data to datasets
|
||||
enabledVariables.forEach((variableInfo, index) => {
|
||||
const historicalPoints = dataByVariable[variableInfo.name] || [];
|
||||
if (historicalPoints.length > 0) {
|
||||
// Sort points by timestamp to ensure proper order
|
||||
historicalPoints.sort((a, b) => a.x - b.x);
|
||||
chart.data.datasets[index].data = historicalPoints;
|
||||
console.log(`📊 Added ${historicalPoints.length} historical points for ${variableInfo.name} on start`);
|
||||
}
|
||||
});
|
||||
|
||||
// Update chart with historical data
|
||||
chart.update('quiet');
|
||||
|
||||
// Update data points counter
|
||||
const totalHistoricalPoints = Object.values(dataByVariable).reduce((sum, points) => sum + points.length, 0);
|
||||
setDataPointsCount(totalHistoricalPoints);
|
||||
} else {
|
||||
console.log(`📊 No historical data found on start for variables: ${variableNames.join(', ')}`);
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('⚠️ Failed to load historical data on start:', error);
|
||||
} finally {
|
||||
setIsLoadingHistorical(false);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (sessionData.isRealTimeMode) {
|
||||
const chart = chartRef.current;
|
||||
const rt = chart.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = false;
|
||||
|
@ -784,7 +937,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
sessionData.ingestPaused = false;
|
||||
sessionData.isPaused = false;
|
||||
sessionData.userOverrideUntil = Date.now() + 3000;
|
||||
}, [startManualRefresh]);
|
||||
}, [startManualRefresh, loadHistoricalData, getEnabledVariables, session?.config, setIsLoadingHistorical, setDataPointsCount]);
|
||||
|
||||
const clearChart = useCallback(() => {
|
||||
if (!chartRef.current) return;
|
||||
|
@ -800,7 +953,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
|
||||
const resetZoom = useCallback(() => {
|
||||
if (!chartRef.current) return;
|
||||
|
||||
|
||||
try {
|
||||
// Try to reset zoom using the zoom plugin
|
||||
if (chartRef.current.resetZoom) {
|
||||
|
@ -822,12 +975,12 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
const updateConfig = useCallback(async (newConfig) => {
|
||||
try {
|
||||
console.log(`🔄 Updating configuration for plot session ${session?.session_id}...`);
|
||||
|
||||
|
||||
const oldConfig = resolvedConfigRef.current;
|
||||
resolvedConfigRef.current = { ...oldConfig, ...newConfig };
|
||||
|
||||
|
||||
// Check if chart recreation is needed
|
||||
const needsRecreation = !oldConfig ||
|
||||
const needsRecreation = !oldConfig ||
|
||||
oldConfig.line_tension !== newConfig.line_tension ||
|
||||
oldConfig.stepped !== newConfig.stepped ||
|
||||
oldConfig.point_radius !== newConfig.point_radius ||
|
||||
|
@ -835,7 +988,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
oldConfig.time_window !== newConfig.time_window ||
|
||||
oldConfig.y_min !== newConfig.y_min ||
|
||||
oldConfig.y_max !== newConfig.y_max;
|
||||
|
||||
|
||||
if (needsRecreation) {
|
||||
console.log(`🔄 Chart needs recreation due to configuration changes`);
|
||||
await createStreamingChart();
|
||||
|
@ -855,7 +1008,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
setIsRefreshing(true);
|
||||
try {
|
||||
console.log(`🔄 Refreshing configuration for plot session ${session.session_id}...`);
|
||||
|
||||
|
||||
// Fetch latest session configuration from server
|
||||
const response = await fetch(`/api/plots/${session.session_id}/config`);
|
||||
if (!response.ok) {
|
||||
|
@ -863,15 +1016,15 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
}
|
||||
|
||||
const updatedSession = await response.json();
|
||||
|
||||
|
||||
// Update the resolved config with the latest configuration
|
||||
if (updatedSession.success && updatedSession.config) {
|
||||
const oldConfig = resolvedConfigRef.current;
|
||||
resolvedConfigRef.current = updatedSession.config;
|
||||
console.log(`✅ Configuration refreshed for plot session ${session.session_id}`);
|
||||
|
||||
|
||||
// Only recreate the chart if there are significant changes
|
||||
const needsRecreation = !oldConfig ||
|
||||
const needsRecreation = !oldConfig ||
|
||||
JSON.stringify(oldConfig.variables) !== JSON.stringify(updatedSession.config.variables) ||
|
||||
oldConfig.time_window !== updatedSession.config.time_window ||
|
||||
oldConfig.y_min !== updatedSession.config.y_min ||
|
||||
|
@ -880,7 +1033,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
oldConfig.stepped !== updatedSession.config.stepped ||
|
||||
oldConfig.point_radius !== updatedSession.config.point_radius ||
|
||||
oldConfig.point_hover_radius !== updatedSession.config.point_hover_radius;
|
||||
|
||||
|
||||
if (needsRecreation) {
|
||||
console.log(`🔄 Chart needs recreation due to configuration changes`);
|
||||
await createStreamingChart();
|
||||
|
@ -904,7 +1057,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
React.useEffect(() => {
|
||||
// Update sessionId ref when session changes
|
||||
sessionDataRef.current.sessionId = session?.session_id || null;
|
||||
|
||||
|
||||
if (typeof session?.onChartReady === 'function') {
|
||||
session.onChartReady({
|
||||
pauseStreaming,
|
||||
|
@ -944,80 +1097,109 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
// Initialize chart when config is resolved - simplified approach
|
||||
useEffect(() => {
|
||||
console.log(`🔍 useEffect triggered - sessionId: ${session?.session_id}, hasCanvas: ${!!canvasRef.current}, hasChart: ${!!chartRef.current}`);
|
||||
|
||||
|
||||
// Check dependencies first
|
||||
if (!checkChartDependencies()) {
|
||||
setError('Chart.js dependencies not loaded. Please refresh the page.');
|
||||
return;
|
||||
}
|
||||
|
||||
|
||||
// Only create chart when we have ALL requirements AND no existing chart
|
||||
if (session?.session_id && canvasRef.current && !chartRef.current) {
|
||||
const config = session?.config;
|
||||
if (config) {
|
||||
console.log(`🎯 Creating chart for session ${session.session_id} - conditions met`);
|
||||
|
||||
// Additional safety check - wait a tiny bit to ensure cleanup is complete
|
||||
|
||||
// Additional safety check - wait longer for React StrictMode to complete cleanup
|
||||
setTimeout(() => {
|
||||
// Double-check that we still need to create a chart
|
||||
// Double-check that we still need to create a chart and cleanup is complete
|
||||
if (chartRef.current || !canvasRef.current) {
|
||||
console.log('⏭️ Chart creation cancelled - state changed during delay');
|
||||
return;
|
||||
}
|
||||
|
||||
|
||||
// Extra check for Chart.js registry to ensure cleanup completed
|
||||
if (canvasRef.current) {
|
||||
const ctx = canvasRef.current.getContext('2d');
|
||||
const registryChart = typeof window.Chart !== 'undefined' ? window.Chart.getChart(ctx?.canvas) : null;
|
||||
if (registryChart) {
|
||||
console.log('⏭️ Chart creation cancelled - registry chart still exists');
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
resolvedConfigRef.current = config;
|
||||
createStreamingChart();
|
||||
}, 10);
|
||||
}, 50);
|
||||
} else {
|
||||
console.log(`⚠️ Session ${session.session_id} has no config, skipping chart creation`);
|
||||
}
|
||||
} else {
|
||||
console.log(`⏭️ Skipping chart creation - sessionId: ${!!session?.session_id}, canvas: ${!!canvasRef.current}, chart: ${!!chartRef.current}`);
|
||||
}
|
||||
|
||||
|
||||
return () => {
|
||||
console.log('🧹 Cleaning up chart component on unmount...');
|
||||
try {
|
||||
// Enhanced cleanup - check for Chart.js registry first
|
||||
if (canvasRef.current) {
|
||||
if (canvasRef.current && document.contains(canvasRef.current)) {
|
||||
const ctx = canvasRef.current.getContext('2d');
|
||||
const existingChart = Chart.getChart(ctx.canvas);
|
||||
|
||||
if (existingChart) {
|
||||
console.log('🧹 Cleaning up Chart.js instance on unmount...');
|
||||
const rt = existingChart.options?.scales?.x?.realtime;
|
||||
if (ctx && typeof window.Chart !== 'undefined') {
|
||||
const existingChart = window.Chart.getChart(ctx.canvas);
|
||||
|
||||
if (existingChart) {
|
||||
console.log('🧹 Cleaning up Chart.js instance on unmount...');
|
||||
const rt = existingChart.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
// Force stop any active timers
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
try {
|
||||
existingChart.update('none');
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error updating chart during cleanup:', e);
|
||||
}
|
||||
}
|
||||
try {
|
||||
existingChart.stop();
|
||||
existingChart.destroy();
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error destroying chart during cleanup:', e);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up our reference too
|
||||
if (chartRef.current) {
|
||||
console.log('🧹 Cleaning up chart reference on unmount...');
|
||||
try {
|
||||
const rt = chartRef.current.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
// Force stop any active timers
|
||||
// Force stop any active timers
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
existingChart.update('none');
|
||||
try {
|
||||
chartRef.current.update('none');
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error updating chart reference during cleanup:', e);
|
||||
}
|
||||
}
|
||||
existingChart.stop();
|
||||
existingChart.destroy();
|
||||
chartRef.current.stop();
|
||||
chartRef.current.destroy();
|
||||
} catch (e) {
|
||||
console.warn('⚠️ Error destroying chart reference during cleanup:', e);
|
||||
} finally {
|
||||
chartRef.current = null;
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up our reference too
|
||||
if (chartRef.current) {
|
||||
console.log('🧹 Cleaning up chart reference on unmount...');
|
||||
const rt = chartRef.current.options?.scales?.x?.realtime;
|
||||
if (rt) {
|
||||
rt.pause = true;
|
||||
// Force stop any active timers
|
||||
if (rt._timer) {
|
||||
clearInterval(rt._timer);
|
||||
rt._timer = null;
|
||||
}
|
||||
chartRef.current.update('none');
|
||||
}
|
||||
chartRef.current.stop();
|
||||
chartRef.current.destroy();
|
||||
chartRef.current = null;
|
||||
}
|
||||
|
||||
|
||||
// Clean up canvas references completely
|
||||
if (canvasRef.current) {
|
||||
const ctx = canvasRef.current.getContext('2d');
|
||||
|
@ -1027,7 +1209,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
|
||||
ctx.canvas.style.width = '';
|
||||
ctx.canvas.style.height = '';
|
||||
|
||||
|
||||
// Force canvas to lose focus
|
||||
if (document.activeElement === ctx.canvas) {
|
||||
ctx.canvas.blur();
|
||||
|
@ -1036,7 +1218,7 @@ const ChartjsPlot = ({ session, height = '400px' }) => {
|
|||
} catch (error) {
|
||||
console.warn('⚠️ Chart cleanup error:', error);
|
||||
}
|
||||
|
||||
|
||||
// Clean up any manual intervals
|
||||
if (sessionDataRef.current.manualInterval) {
|
||||
clearInterval(sessionDataRef.current.manualInterval);
|
||||
|
229 main.py
@ -1824,81 +1824,157 @@ def get_plot_variables():
|
|||
@app.route("/api/plots/historical", methods=["POST"])
|
||||
def get_historical_data():
|
||||
"""Get historical data from CSV files for plot initialization"""
|
||||
print("🔍 DEBUG: Historical endpoint called")
|
||||
try:
|
||||
data = request.get_json()
|
||||
print(f"🔍 DEBUG: Request data: {data}")
|
||||
|
||||
if not data:
|
||||
print("❌ DEBUG: No data provided")
|
||||
return jsonify({"error": "No data provided"}), 400
|
||||
|
||||
variables = data.get('variables', [])
|
||||
time_window_seconds = data.get('time_window', 60)
|
||||
|
||||
|
||||
variables = data.get("variables", [])
|
||||
time_window_seconds = data.get("time_window", 60)
|
||||
|
||||
print(f"🔍 DEBUG: Variables: {variables}")
|
||||
print(f"🔍 DEBUG: Time window: {time_window_seconds}")
|
||||
|
||||
if not variables:
|
||||
print("❌ DEBUG: No variables specified")
|
||||
return jsonify({"error": "No variables specified"}), 400
|
||||
|
||||
# Import here to avoid circular imports
|
||||
import pandas as pd
|
||||
import glob
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
|
||||
# Import pandas and glob (datetime already imported globally)
|
||||
try:
|
||||
print("🔍 DEBUG: Importing pandas...")
|
||||
import pandas as pd
|
||||
|
||||
print("🔍 DEBUG: Importing glob...")
|
||||
import glob
|
||||
|
||||
print("🔍 DEBUG: Importing timedelta...")
|
||||
from datetime import timedelta
|
||||
|
||||
print("🔍 DEBUG: All imports successful")
|
||||
except ImportError as e:
|
||||
print(f"❌ DEBUG: Import failed: {e}")
|
||||
return jsonify({"error": f"pandas import failed: {str(e)}"}), 500
|
||||
except Exception as e:
|
||||
print(f"❌ DEBUG: Unexpected import error: {e}")
|
||||
import traceback
|
||||
|
||||
traceback.print_exc()
|
||||
return jsonify({"error": f"Import error: {str(e)}"}), 500
|
||||
|
||||
# Calculate time range
|
||||
end_time = datetime.now()
|
||||
start_time = end_time - timedelta(seconds=time_window_seconds)
|
||||
|
||||
try:
|
||||
print("🔍 DEBUG: Calculating time range...")
|
||||
end_time = datetime.now()
|
||||
start_time = end_time - timedelta(seconds=time_window_seconds)
|
||||
print(f"🔍 DEBUG: Time range calculated: {start_time} to {end_time}")
|
||||
except Exception as e:
|
||||
print(f"❌ DEBUG: Time calculation error: {e}")
|
||||
import traceback
|
||||
|
||||
traceback.print_exc()
|
||||
return jsonify({"error": f"Time calculation failed: {str(e)}"}), 500
|
||||
|
||||
# Get records directory
|
||||
records_dir = os.path.join(os.path.dirname(__file__), 'records')
|
||||
if not os.path.exists(records_dir):
|
||||
return jsonify({"data": []})
|
||||
|
||||
historical_data = []
|
||||
|
||||
# Get date folders to search (today and yesterday in case time window spans days)
|
||||
today = end_time.strftime('%d-%m-%Y')
|
||||
yesterday = (end_time - timedelta(days=1)).strftime('%d-%m-%Y')
|
||||
|
||||
try:
|
||||
print("🔍 DEBUG: Getting records directory...")
|
||||
records_dir = os.path.join(os.path.dirname(__file__), "records")
|
||||
print(f"🔍 DEBUG: Records directory: {records_dir}")
|
||||
print(f"🔍 DEBUG: Records dir exists: {os.path.exists(records_dir)}")
|
||||
|
||||
if not os.path.exists(records_dir):
|
||||
print("🔍 DEBUG: Records directory not found, returning empty data")
|
||||
return jsonify({"data": []})
|
||||
|
||||
historical_data = []
|
||||
|
||||
# Get date folders to search (today and yesterday in case time window spans days)
|
||||
print("🔍 DEBUG: Calculating date folders...")
|
||||
today = end_time.strftime("%d-%m-%Y")
|
||||
yesterday = (end_time - timedelta(days=1)).strftime("%d-%m-%Y")
|
||||
print(f"🔍 DEBUG: Searching dates: {yesterday}, {today}")
|
||||
except Exception as e:
|
||||
print(f"❌ DEBUG: Records directory error: {e}")
|
||||
import traceback
|
||||
|
||||
traceback.print_exc()
|
||||
return jsonify({"error": f"Records directory error: {str(e)}"}), 500
|
||||
|
||||
date_folders = []
|
||||
for date_folder in [yesterday, today]:
|
||||
folder_path = os.path.join(records_dir, date_folder)
|
||||
print(
|
||||
f"🔍 DEBUG: Checking folder: {folder_path}, exists: {os.path.exists(folder_path)}"
|
||||
)
|
||||
if os.path.exists(folder_path):
|
||||
date_folders.append(folder_path)
|
||||
|
||||
|
||||
print(f"🔍 DEBUG: Found date folders: {date_folders}")
|
||||
|
||||
# Search for CSV files with any of the required variables
|
||||
for folder_path in date_folders:
|
||||
csv_files = glob.glob(os.path.join(folder_path, '*.csv'))
|
||||
|
||||
csv_files = glob.glob(os.path.join(folder_path, "*.csv"))
|
||||
print(f"🔍 DEBUG: CSV files in {folder_path}: {csv_files}")
|
||||
|
||||
for csv_file in csv_files:
|
||||
try:
|
||||
print(f"🔍 DEBUG: Processing CSV file: {csv_file}")
|
||||
|
||||
# Read first line to check if any required variables are present
|
||||
with open(csv_file, 'r') as f:
|
||||
with open(csv_file, "r") as f:
|
||||
header_line = f.readline().strip()
|
||||
if not header_line:
|
||||
print(f"🔍 DEBUG: Empty header in {csv_file}, skipping")
|
||||
continue
|
||||
|
||||
headers = [h.strip() for h in header_line.split(',')]
|
||||
|
||||
|
||||
headers = [h.strip() for h in header_line.split(",")]
|
||||
print(f"🔍 DEBUG: Headers in {csv_file}: {headers}")
|
||||
|
||||
# Check if any of our variables are in this file
|
||||
matching_vars = [var for var in variables if var in headers]
|
||||
print(f"🔍 DEBUG: Matching variables: {matching_vars}")
|
||||
|
||||
if not matching_vars:
|
||||
print(
|
||||
f"🔍 DEBUG: No matching variables in {csv_file}, skipping"
|
||||
)
|
||||
continue
|
||||
|
||||
|
||||
# Read the CSV file
|
||||
print(f"🔍 DEBUG: Reading CSV file with pandas...")
|
||||
df = pd.read_csv(csv_file)
|
||||
|
||||
if 'timestamp' not in df.columns:
|
||||
print(f"🔍 DEBUG: CSV loaded, shape: {df.shape}")
|
||||
|
||||
if "timestamp" not in df.columns:
|
||||
print(f"🔍 DEBUG: No timestamp column in {csv_file}, skipping")
|
||||
continue
|
||||
|
||||
|
||||
# Convert timestamp to datetime
|
||||
df['timestamp'] = pd.to_datetime(df['timestamp'])
|
||||
|
||||
print(f"🔍 DEBUG: Converting timestamps...")
|
||||
df["timestamp"] = pd.to_datetime(df["timestamp"])
|
||||
print(
|
||||
f"🔍 DEBUG: Timestamp range: {df['timestamp'].min()} to {df['timestamp'].max()}"
|
||||
)
|
||||
print(f"🔍 DEBUG: Filter range: {start_time} to {end_time}")
|
||||
|
||||
# Filter by time range
|
||||
mask = (df['timestamp'] >= start_time) & (df['timestamp'] <= end_time)
|
||||
mask = (df["timestamp"] >= start_time) & (
|
||||
df["timestamp"] <= end_time
|
||||
)
|
||||
filtered_df = df[mask]
|
||||
|
||||
print(f"🔍 DEBUG: Filtered dataframe shape: {filtered_df.shape}")
|
||||
|
||||
if filtered_df.empty:
|
||||
print(f"🔍 DEBUG: No data in time range for {csv_file}")
|
||||
continue
|
||||
|
||||
|
||||
# Extract data for matching variables only
|
||||
print(f"🔍 DEBUG: Extracting data for variables: {matching_vars}")
|
||||
for _, row in filtered_df.iterrows():
|
||||
timestamp = row['timestamp']
|
||||
timestamp = row["timestamp"]
|
||||
for var in matching_vars:
|
||||
if var in row:
|
||||
try:
|
||||
|
@ -1906,50 +1982,71 @@ def get_historical_data():
|
|||
value = row[var]
|
||||
if pd.isna(value):
|
||||
continue
|
||||
|
||||
|
||||
# Handle boolean values
|
||||
if isinstance(value, str):
|
||||
if value.lower() == 'true':
|
||||
if value.lower() == "true":
|
||||
value = True
|
||||
elif value.lower() == 'false':
|
||||
elif value.lower() == "false":
|
||||
value = False
|
||||
else:
|
||||
try:
|
||||
value = float(value)
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
historical_data.append({
|
||||
'timestamp': timestamp.isoformat(),
|
||||
'variable': var,
|
||||
'value': value
|
||||
})
|
||||
|
||||
historical_data.append(
|
||||
{
|
||||
"timestamp": timestamp.isoformat(),
|
||||
"variable": var,
|
||||
"value": value,
|
||||
}
|
||||
)
|
||||
except Exception as e:
|
||||
# Skip invalid values
|
||||
continue
|
||||
|
||||
|
||||
except Exception as e:
|
||||
# Skip files that can't be read
|
||||
print(f"Warning: Could not read CSV file {csv_file}: {e}")
|
||||
continue
|
||||
|
||||
|
||||
# Sort by timestamp
|
||||
historical_data.sort(key=lambda x: x['timestamp'])
|
||||
|
||||
return jsonify({
|
||||
"data": historical_data,
|
||||
"time_range": {
|
||||
"start": start_time.isoformat(),
|
||||
"end": end_time.isoformat()
|
||||
},
|
||||
"variables_found": list(set([item['variable'] for item in historical_data])),
|
||||
"total_points": len(historical_data)
|
||||
})
|
||||
|
||||
except ImportError:
|
||||
return jsonify({"error": "pandas is required for historical data processing"}), 500
|
||||
historical_data.sort(key=lambda x: x["timestamp"])
|
||||
|
||||
print(f"🔍 DEBUG: Total historical data points found: {len(historical_data)}")
|
||||
print(
|
||||
f"🔍 DEBUG: Variables found: {list(set([item['variable'] for item in historical_data]))}"
|
||||
)
|
||||
|
||||
return jsonify(
|
||||
{
|
||||
"data": historical_data,
|
||||
"time_range": {
|
||||
"start": start_time.isoformat(),
|
||||
"end": end_time.isoformat(),
|
||||
},
|
||||
"variables_found": list(
|
||||
set([item["variable"] for item in historical_data])
|
||||
),
|
||||
"total_points": len(historical_data),
|
||||
}
|
||||
)
|
||||
|
||||
except ImportError as e:
|
||||
return (
|
||||
jsonify(
|
||||
{
|
||||
"error": f"pandas is required for historical data processing: {str(e)}"
|
||||
}
|
||||
),
|
||||
500,
|
||||
)
|
||||
except Exception as e:
|
||||
return jsonify({"error": str(e)}), 500
|
||||
import traceback
|
||||
|
||||
traceback.print_exc()
|
||||
return jsonify({"error": f"Internal server error: {str(e)}"}), 500
|
||||
|
||||
|
||||
@app.route("/api/status")
|
||||
|
|
|
@ -3,11 +3,11 @@
    "should_connect": true,
    "should_stream": true,
    "active_datasets": [
      "DAR",
      "Test",
      "Fast",
      "DAR"
      "Fast"
    ]
  },
  "auto_recovery_enabled": true,
  "last_update": "2025-08-14T23:27:17.699618"
  "last_update": "2025-08-15T00:17:13.675666"
}
Binary file not shown.