Having messed around with the system a lot over the last few days, I let it ‘settle’ for 24 hours and then compared the measured values against the meter readings. I have a dual-tariff supply that is separately metered (the old-fashioned way): Peak electricity reads 131% of the metered value, while Off-Peak reads 92% of the metered value. I’ve compared both the P-derived and E-derived kWh measurements; they’re identical.
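For reference, here's how I worked out the correction factors those ratios imply (just a quick sketch of my arithmetic, nothing firmware-specific):

```python
# Correction factors to bring the emonTx energy feeds in line with the
# utility meter, from my 24-hour comparison.
peak_ratio = 1.31      # emonTx Peak kWh / metered Peak kWh
offpeak_ratio = 0.92   # emonTx Off-Peak kWh / metered Off-Peak kWh

peak_correction = 1 / peak_ratio        # scale Peak down by ~24%
offpeak_correction = 1 / offpeak_ratio  # scale Off-Peak up by ~9%

print(f"Peak correction:     x{peak_correction:.3f}")      # x0.763
print(f"Off-Peak correction: x{offpeak_correction:.3f}")   # x1.087
```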
Both Peak (CT1) and Off-Peak (CT2) use 100 A sensors, and the calibration looks like the default to me. I listed out the firmware settings and they are:
emonTx V4 CM Continuous Monitoring V1.5.4
OpenEnergyMonitor.org
Loaded EEPROM config
Settings:
Band 433 MHz, Group 210, Node 17, 7 dBm
Calibration:
vCal = 807.86
assumedV = 240.00
i1Cal = 300.30
i1Lead = 3.20
i2Cal = 300.30
i2Lead = 3.20
i3Cal = 150.15
i3Lead = 3.20
i4Cal = 60.06
i4Lead = 3.20
i5Cal = 60.06
i5Lead = 3.20
i6Cal = 60.06
i6Lead = 3.20
datalog = 9.80
pulses = 1
pulse period = 100
RF off
temp_enable = 1
JSON Format Off
Reference voltage calibration: 1.0304
AC present - Real Power calc enabled
MSG:1,Vrms:244.95,P1:236,P2:1,P3:0,P4:0,P5:0,P6:2,E1:36992,E2:76480,E3:19,E4:6,E5:0,E6:163,T1:23.25,T2:21.75,pulse:0
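If I understand the calibration scheme correctly (an assumption on my part: iCal is the CT's rated current divided by its 0.333 V full-scale output), the listed values do line up with the stock constants for 100 A, 50 A and 20 A CTs:

```python
# Assumption: emonTx4 current calibration constants are derived as
# CT rated current (A) / 0.333 V (the CT's full-scale voltage output).
for rating in (100, 50, 20):
    print(f"{rating:>3} A CT -> iCal = {rating / 0.333:.2f}")
# 100 A -> 300.30, 50 A -> 150.15, 20 A -> 60.06, matching the dump above
```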
I can live with this and will put correction factors into the feeds to compensate, but the errors are larger than I’d have expected and in opposite directions, which seems strange. Does this point to an issue with the firmware?
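In case it helps clarify what I mean by compensating in the feeds, this is the kind of per-feed scaling I have in mind (a hypothetical sketch; the feed names and helper are mine, not any real API):

```python
# Hypothetical per-feed correction factors (metered / measured ratios
# from my 24-hour comparison); names are illustrative only.
CORRECTIONS = {"peak_kwh": 1 / 1.31, "offpeak_kwh": 1 / 0.92}

def corrected(feed: str, value: float) -> float:
    """Scale a raw emonTx reading by its feed's correction factor."""
    return value * CORRECTIONS.get(feed, 1.0)

# e.g. E1 = 36992 Wh = 36.992 kWh raw on the Peak feed
print(f"{corrected('peak_kwh', 36.992):.2f} kWh")
```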