Difference between CT sensors and SmartMeter?

I am starting to figure out what is needed to monitor a panel and sub-panel.

Right now I’m pulling net energy data using an Eagle-200 via ZigBee from the PG&E SmartMeter. It’s net because I have a solar PV system.

Anyway, has anyone seen any differences between the data pulled directly from the meter versus CT sensors on the mains?

I’m in California, USA.

Thank you
Greg

Apart from the obvious point that your smart meter is by definition 100% accurate (even if, in reality, it isn’t), while component tolerances will affect analogue measurements made via a c.t., a lot depends on how the meter is programmed and what it ignores. The meter manufacturers and suppliers tend to be very tight-lipped about that.

Hi Greg

This may be too late for you, but I’ve only just joined this community, so I thought I’d send you a reply anyway! I’ve posted about long-term monitoring to SD cards using CT/VT and meter pulse outputs. As long as you calibrate the CT and (more importantly) the VT at the outset, I found that this method would typically agree with the meter data to within 5% over long periods; if you then have the meter readings, you can adjust the logged result to match. This was achieved using the recommended low-cost 100 A clip-on CT, so you could probably improve on that by using a lower-rated CT commensurate with the maximum current you expect to see. I also compared profiles between this method of logging and logging of meter pulse data, and found surprisingly good correspondence, even over lightly loaded periods.
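The adjustment step can be sketched in a few lines. This is just my own illustration of scaling logged interval energies so their total matches the meter; the function name, variable names, and example figures are all invented, not from any Emon tool:

```python
def adjust_logged_energy(logged_kwh, meter_start_kwh, meter_end_kwh):
    """Scale logged interval energies so their total matches the
    energy registered by the utility meter over the same period."""
    meter_total = meter_end_kwh - meter_start_kwh
    logged_total = sum(logged_kwh)
    scale = meter_total / logged_total  # typically within ~5% of 1.0
    return [e * scale for e in logged_kwh]

# Example: CT/VT logging totalled 4.8 kWh while the meter advanced 5.0 kWh
intervals = [0.3, 0.5, 0.7, 1.1, 0.9, 0.6, 0.7]  # logged kWh per interval
adjusted = adjust_logged_energy(intervals, 1000.0, 1005.0)
```

This preserves the shape of the logged profile while pinning the total to the meter reading.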

Just to be clear, I assume you are using a CT and voltage source (via Emon products?) to get true power rather than relying only on a current reading and assumed voltage? I find that the voltage varies quite a lot, particularly in rural areas with lots of PV, and you may have phase variations between the voltage and current (non-unity power factor), so it is important to have the voltage profile for accuracy.
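To show why sampling the voltage matters, here is a minimal sketch comparing true (real) power, computed from simultaneous voltage and current samples, against a current-only estimate with an assumed voltage. The 120 V / 10 A figures and the 30° current lag are invented purely for illustration:

```python
import math

def real_power(samples_v, samples_i):
    """Real power: mean of instantaneous v*i over whole cycles."""
    return sum(v * i for v, i in zip(samples_v, samples_i)) / len(samples_v)

# Synthesise one mains cycle: 120 V RMS, 10 A RMS, current lagging 30 degrees
N = 1000
v = [120 * math.sqrt(2) * math.sin(2 * math.pi * n / N) for n in range(N)]
i = [10 * math.sqrt(2) * math.sin(2 * math.pi * n / N - math.radians(30))
     for n in range(N)]

p_true = real_power(v, i)   # Vrms * Irms * cos(30 deg), about 1039 W
p_assumed = 120 * 10        # current-only estimate: 1200, off by ~15% here
```

A non-unity power factor (the phase variation Simon mentions) makes the current-only figure overestimate real power, and any deviation of the actual voltage from the assumed 120 V adds further error on top.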

All the best

Simon (North Wales, UK)

And to add to that, you must also calibrate very carefully to remove the errors due to manufacturing tolerances, in the c.t. and v.t. in particular. If you use the ‘shop’ transformers, their tolerances alone could introduce an error of 8% (USA/EU) or 6% (UK) if not calibrated, and that’s before counting phase errors, which will adversely affect the real power calculation and which depend heavily on the quantities being measured.
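The way those amplitude tolerances stack can be sketched as follows. Since power is proportional to voltage times current, the c.t. and v.t. errors compound multiplicatively; the 3% and 5% split below is my own assumption for illustration (the post only gives the combined figures):

```python
def worst_case_power_error(ct_tol, vt_tol):
    """Worst-case fractional power error when CT and VT amplitude
    errors compound: power scales as (1 + e_ct) * (1 + e_vt)."""
    high = (1 + ct_tol) * (1 + vt_tol) - 1
    low = 1 - (1 - ct_tol) * (1 - vt_tol)
    return max(high, low)

# e.g. a 3% CT with a 5% VT stacks to roughly 8% before calibration
err = worst_case_power_error(0.03, 0.05)
```

Calibration against a known reference removes most of this fixed amplitude error, which is why the earlier posts stress doing it at the outset; phase errors are a separate matter and vary with load.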