Energy Consumption Data Verification (ECDV) is a critical technical process within Energy Management Systems (EMS) that ensures the accuracy and reliability of energy consumption monitoring data. With the acceleration of global energy management standardization, the ISO 50001 energy management system standard explicitly requires enterprises to establish data quality control procedures for energy consumption. China's Implementation Plan for Promoting the Construction of Online Monitoring Systems for Key Energy-Consuming Units also mandates that key energy-consuming units pass validity verification before uploading data. Data verification aims to identify and correct erroneous data caused by metering equipment failures, transmission anomalies, or human factors, thereby providing a credible foundation for energy audits, efficiency evaluations, and carbon accounting. Typical verification covers real-time monitoring data of various energy sources such as electricity, gas, heat, and chilled water, addressing three dimensions: data completeness, reasonableness, and consistency.
In terms of technological evolution, traditional manual sampling methods can no longer meet the verification demands for massive datasets in modern energy management systems. The Fujian Provincial Technical Specification for Public Building Energy Consumption Monitoring Systems (DBJ/T13-158-2012) was the first to propose automated verification rules, including range checks, sudden change checks, and correlation checks. Furthermore, patent document CN201710382164 describes a validation process that classifies anomalies into natural disaster-type (non-human) and human error-type (human-induced) categories, enhancing data credibility through multiple verification methods such as power validation and total quantity validation. With the application of Internet of Things (IoT) and artificial intelligence technologies, modern ECDV systems have transitioned from passive error correction to active early warning, serving as a foundational component of energy digital transformation.
The energy consumption data verification technical system comprises three key methodological modules:
Basic Verification targets the data acquisition process by monitoring the status of metering equipment and conducting communication diagnostics to identify hardware failures such as sensor malfunctions or signal interruptions. This method requires all metering devices to comply with standard protocols such as DL/T645 and Modbus, and undergo regular accuracy calibration—for instance, electronic electricity meters must meet the 0.5S accuracy class standard.
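A minimal sketch of such a basic check is shown below. The function name, fault codes, and the five-minute communication timeout are illustrative assumptions, not part of any cited standard; real systems would derive the valid range and timeout from the meter's protocol profile and calibration record.

```python
from datetime import datetime, timedelta

# Hypothetical basic-verification step: flag a meter whose last heartbeat is
# stale (possible signal interruption) or whose reading falls outside the
# sensor's physical range (possible hardware malfunction).
def basic_check(reading, last_seen, now, valid_range,
                timeout=timedelta(minutes=5)):
    """Return a list of fault codes for one meter sample."""
    faults = []
    if now - last_seen > timeout:
        faults.append("COMM_TIMEOUT")   # communication diagnostic failed
    lo, hi = valid_range
    if not (lo <= reading <= hi):
        faults.append("OUT_OF_RANGE")   # reading physically implausible
    return faults
```

An empty fault list means the sample passes to the next verification tier; any fault code routes the sample to maintenance workflows instead.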
Logical Verification applies physical principles and statistical rules to construct validation rules based on the operational laws of energy systems. These include power reasonableness checks (whether equipment operating power exceeds rated ranges), energy balance verification (whether the difference between input and consumed energy falls within an allowable error band), and temporal continuity checks (whether non-natural data breakpoints occur). In an electronics factory case study, the system successfully identified abnormal electricity data caused by current transformer ratio errors by comparing the rated power of air conditioning units with actual monitored values.
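The three logical rules above can be sketched as simple predicates. The tolerance values (10% power margin, 2% balance error, 15-minute continuity gap) are illustrative assumptions; actual thresholds would come from equipment nameplates and the site's metering error budget.

```python
def power_reasonable(power_kw, rated_kw, tolerance=0.10):
    """Power check: operating power must not exceed the rated range
    by more than the given tolerance."""
    return power_kw <= rated_kw * (1 + tolerance)

def energy_balanced(input_kwh, consumed_kwh, max_error=0.02):
    """Energy balance: |input - sum(consumed)| must fall within an
    allowable error band proportional to the input."""
    return abs(input_kwh - sum(consumed_kwh)) <= input_kwh * max_error

def continuous(timestamps_s, max_gap_s=900):
    """Temporal continuity: no non-natural breakpoint longer than
    max_gap_s between consecutive samples."""
    return all(b - a <= max_gap_s
               for a, b in zip(timestamps_s, timestamps_s[1:]))
```

In the electronics-factory case described above, a failed `power_reasonable` comparison against the air-conditioning units' rated power is exactly the kind of signal that exposed the transformer ratio error.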
Correlation Verification represents a more complex validation tier that identifies hidden errors by analyzing inter-system data correlations. For example, HVAC energy consumption data must be coupled with environmental parameters such as indoor temperature, humidity, and fresh air volume; lighting electricity curves should be associated with occupancy sensor data and natural illumination levels. Manufacturing facilities need to align production rhythms with energy consumption curves—e.g., a three-shift electronics factory should exhibit stable daily energy consumption patterns, with significant fluctuations triggering anomaly alerts. Commercial buildings focus on verifying the reasonableness of energy consumption shifts under time-of-use electricity pricing strategies to prevent data manipulation aimed at reducing electricity costs.
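One common way to implement such a coupling check is a correlation coefficient between the energy series and its physical driver (e.g., HVAC kWh versus indoor temperature). The sketch below hand-rolls Pearson's r; the 0.6 threshold is an illustrative assumption, and production systems would typically use richer models than a single coefficient.

```python
import statistics

# Hypothetical correlation check: flag the series if it no longer tracks
# its physical driver (e.g., HVAC consumption vs. indoor temperature).
def correlation_check(energy, driver, min_r=0.6):
    me, md = statistics.fmean(energy), statistics.fmean(driver)
    cov = sum((e - me) * (d - md) for e, d in zip(energy, driver))
    se = sum((e - me) ** 2 for e in energy) ** 0.5
    sd = sum((d - md) ** 2 for d in driver) ** 0.5
    if se == 0 or sd == 0:
        return False        # a constant series carries no correlation signal
    return cov / (se * sd) >= min_r   # Pearson's r against the threshold
```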
Technologically, modern ECDV systems generally adopt a layered verification architecture. The edge computing layer, deployed on data acquisition terminals, performs basic verification and simple logical checks to filter abnormal data locally. The platform layer, leveraging energy big data centers, runs machine learning algorithms and expert rule libraries to execute complex correlation analyses and business rule validation.
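The division of labor between the two layers might look like the following sketch, where the edge layer forwards only range-valid samples and quarantines the rest with a flag for the platform layer. The field names and flag string are hypothetical.

```python
# Hypothetical edge-layer filter: apply cheap local checks at the acquisition
# terminal; forward clean samples and quarantine flagged ones so the platform
# layer only runs its expensive correlation/rule analysis on plausible data.
def edge_filter(samples, lo, hi):
    forwarded, quarantined = [], []
    for s in samples:
        if lo <= s["value"] <= hi:
            forwarded.append(s)
        else:
            s["flag"] = "EDGE_RANGE_FAIL"
            quarantined.append(s)
    return forwarded, quarantined
```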
Energy consumption data anomalies can be categorized into three types based on their root causes: device-level, system-level, and operation-level.
Device-level anomalies primarily stem from metering equipment failures or parameter drift, including pulse signal loss (e.g., magnetic interference in mechanical water meters), transmitter zero drift (e.g., uncalibrated pressure sensors over time), and current transformer ratio errors (e.g., system parameters not updated after retrofits). These anomalies typically manifest as data breakpoints, constant values, or out-of-range fluctuations. Resolution mainly involves device maintenance or parameter correction.
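The "constant value" signature of a stuck or failed meter is straightforward to detect; a minimal sketch, assuming a run of six identical consecutive samples is suspicious (the threshold is illustrative):

```python
def detect_constant(values, min_run=6):
    """Constant-value detection: a meter stuck at one reading for min_run
    consecutive samples suggests pulse loss or a frozen transmitter."""
    run = 1
    for a, b in zip(values, values[1:]):
        run = run + 1 if a == b else 1
        if run >= min_run:
            return True
    return False
```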
System-level anomalies arise from energy system operational states, such as harmonic interference from frequency converters causing electricity meter counting errors, water leakage leading to inaccurate water flow statistics, or synchronization time differences between multiple stations resulting in balance errors. Addressing these issues requires systemic solutions like hardware filtering or timestamp calibration. For instance, an electronics factory project resolved both electricity metering anomalies and quality control problems by installing active power filters.
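For the timestamp-calibration case, one simple software-side remedy is to snap each station's readings to a common sampling grid before running the balance check, so that multi-station comparisons cover identical periods. The 15-minute grid below is an illustrative assumption.

```python
# Hypothetical timestamp calibration: snap (timestamp_s, value) pairs to the
# nearest grid point so readings from different stations line up before a
# multi-station energy-balance check.
def align_to_grid(samples, period_s=900):
    return {round(ts / period_s) * period_s: v for ts, v in samples}
```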
Operation-level anomalies involve human factors, including unintentional errors (e.g., incorrect meter reading intervals, unit conversion mistakes) and deliberate tampering (e.g., circumventing energy consumption quota assessments). The verification method proposed in patent CN201710382164 effectively detects human intervention by analyzing differences in energy consumption patterns between workdays and rest days, as well as hourly data change gradients. Handling mechanisms involve automatically storing suspicious data in a temporary database for managerial review and confirmation before deciding on data repair, elimination, or acceptance. To prevent data falsification, advanced systems have incorporated blockchain technology to ensure immutability across the entire "acquisition-transmission-storage" process of metering data.
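The workday/rest-day comparison and hourly gradient test can be sketched as follows. This is not the patented method itself, only an illustrative simplification; the ratio and gradient thresholds are assumptions that a real system would calibrate per facility.

```python
import statistics

def pattern_suspicious(workday_kwh, restday_kwh,
                       min_ratio=1.3, max_gradient=0.5):
    """Flag possible human intervention when the workday/rest-day ratio is
    implausibly flat, or when hourly consumption jumps exceed max_gradient
    as a fraction of the preceding hour."""
    ratio = statistics.fmean(workday_kwh) / statistics.fmean(restday_kwh)
    flat = ratio < min_ratio                      # facility never "rests"
    steep = any(abs(b - a) / max(a, 1e-9) > max_gradient
                for a, b in zip(workday_kwh, workday_kwh[1:]))
    return flat or steep
```

A `True` result would route the affected records to the temporary database for managerial review, as described above.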
Anomaly data processing strategies adhere to the "minimal intervention" principle. For brief, minor anomalies (less than 15% deviation), methods like linear interpolation or historical data under similar conditions are used for repair. Persistent anomalies trigger equipment maintenance procedures, during which data are estimated using alternative metering methods. All anomaly events and handling measures are recorded in a data quality log to form a complete audit trail.
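The linear-interpolation repair for a brief gap can be sketched as below, assuming the samples immediately before and after the anomaly are trusted; the function name and index convention are illustrative.

```python
def repair_gap(series, i, j):
    """Linear interpolation over a brief anomaly: replace series[i:j] with
    values interpolated between the last good point (i-1) and the next
    good point (j), per the minimal-intervention principle."""
    a, b = series[i - 1], series[j]
    n = j - i + 1
    repaired = list(series)
    for k in range(i, j):
        repaired[k] = a + (b - a) * (k - i + 1) / n
    return repaired
```

For example, repairing two bad samples between readings of 10 and 16 yields the evenly spaced values 12 and 14; the repair event would then be written to the data quality log for the audit trail.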