The article doesn’t make it clear how they measured the “true” values, or how those values relate to the meter’s operating range, but as the result has been verified by the body responsible for national standards (VSL, the Dutch Metrology Institute), it would seem to be a pretty damning indictment of “electronic” meters and particularly of designs that rely on Rogowski coils.
Other than the current-sensing element having a highly non-linear frequency response, I’m struggling to see how a meter, which has to use an algorithm similar to emonLib’s, can be wrong by a factor of more than five.
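To see why such a large error is surprising, here’s a minimal sketch of the emonLib-style calculation as I understand it (my own simplification, not emonLib’s actual source): sample voltage and current together, multiply, and average over whole cycles. Done that way, waveform shape largely drops out of the result.

```python
# Sketch of an emonLib-style real-power calculation (assumed, simplified):
# real power is the mean of the instantaneous v*i product over whole cycles.
import math

def real_power(v_samples, i_samples):
    """Average instantaneous power over the sampled window."""
    assert len(v_samples) == len(i_samples)
    return sum(v * i for v, i in zip(v_samples, i_samples)) / len(v_samples)

# Example: 230 V RMS, 5 A RMS, in phase, 50 Hz, sampled at an assumed
# 2500 Hz (50 samples per cycle) for 20 whole cycles.
n = 1000
v = [230 * math.sqrt(2) * math.sin(2 * math.pi * 50 * k / 2500) for k in range(n)]
i = [5 * math.sqrt(2) * math.sin(2 * math.pi * 50 * k / 2500) for k in range(n)]
print(round(real_power(v, i)))  # → 1150 (W), i.e. 230 V x 5 A
```

As long as the samples faithfully represent the waveforms, multiply-and-average is robust; the gross errors therefore point at the sensing/acquisition front end rather than the arithmetic.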
There’s more detail in the underlying paper referenced in that article. The really gross errors (> 500%) seem to be restricted to Rogowski-coil meters subjected to very fast current edges (slew rates of 1.1 A/μs). Their theory is that the integration stage is being pushed into saturation, but it’s an educated guess, since they don’t have much visibility into how the meter works beyond cracking it open and seeing what sort of current sensor it uses.
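The saturation theory is at least plausible on a back-of-envelope basis (my numbers below, not the paper’s): a Rogowski coil outputs a voltage proportional to di/dt, so a fast-switching load produces a front-end spike enormously larger than a clean 50 Hz current of comparable amplitude would.

```python
# Back-of-envelope check of the integrator-saturation theory. A Rogowski
# coil's output voltage is proportional to di/dt; compare the reported
# 1.1 A/us edge against the peak di/dt of an ordinary sinusoidal load.
import math

SLEW = 1.1e6  # A/s: the 1.1 A/us current edge reported in the paper

def peak_didt_sine(i_rms, f=50.0):
    """Peak di/dt of a sinusoidal current: 2*pi*f*Ipeak."""
    return 2 * math.pi * f * i_rms * math.sqrt(2)

clean = peak_didt_sine(10)  # assumed 10 A RMS resistive load for comparison
print(SLEW / clean)         # roughly 250x larger coil output voltage
```

A front end scaled for normal mains di/dt could easily clip on a spike two orders of magnitude larger, and a clipped signal integrates to the wrong current.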
As a side bar, they noted that several cheap energy monitors (but not the revenue meters) were confused by all the PLT (Powerline Telecommunications) disturbances on the voltage waveform. Apparently there’s a fair bit of this SmartGrid chatter going on all the time in Holland, so the voltage typically shows high levels of harmonic distortion at the comms frequencies. If the energy monitor’s sampling frequency is close to the comms frequency, and the designers didn’t include anti-aliasing filters, the disturbance can alias down and appear as a genuine signal at 50 Hz.
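The aliasing mechanism is easy to demonstrate with made-up numbers (the frequencies below are illustrative, not the actual PLT bands or any monitor’s real sample rate): a disturbance just above the sampling rate produces exactly the same sample values as a low-frequency signal.

```python
# Illustration of aliasing: a tone just above the sample rate is
# indistinguishable, after sampling, from a nearby low-frequency tone.
# Here a 2050 Hz disturbance sampled at 2000 Hz looks exactly like 50 Hz.
import math

FS = 2000.0  # assumed (hypothetical) monitor sample rate, Hz

def sampled(freq, n=20):
    """Sample a unit sine of the given frequency at FS."""
    return [math.sin(2 * math.pi * freq * k / FS) for k in range(n)]

plt_tone = sampled(2050.0)  # disturbance just above the sample rate
mains = sampled(50.0)       # genuine 50 Hz component
print(max(abs(a - b) for a, b in zip(plt_tone, mains)))  # ~0: identical samples
```

That’s exactly why an analogue anti-aliasing filter ahead of the ADC matters: once the samples are taken, no amount of clever firmware can tell the two signals apart.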
Here is a follow-up ‘survey’-type report by a UK-based meter designer/manufacturer, giving a somewhat more detailed explanation of the problem:
(The BBC “Money Box” program referenced might be unavailable outside the UK.)