Calibration value for a Hall effect sensor with voltage output

Hello to the community!

I’d like to create a simple energy monitor based on an Arduino. The goal is simply to measure AC current. To do so, I’ll use the EmonLib library.
My CT sensor is a YHDC HSTS016L. Its output is a voltage: 2.5 V ± 0.625 V, so I have no need of a burden resistor.

In this configuration, what would be the calibration value to set for the function "emon1.current(pin, calibration)"?

Thanks in advance for your help!

Welcome, Dimitri, to the OEM forum.

That is not a current transformer; it is a Hall effect sensor, which is a totally different device.

However, the calibration constant is defined in exactly the same way: it is the current that gives you 1 V at the ADC input.

You can get that from the data sheet, and it depends on which model you have.

For example, for the 100 A version, the rated input current is 100 A, and the rated output is 2.5 ± 0.625 V. (Note: these are d.c. values; the rated rms current for a true sine wave is 100 A / √2 ≈ 70.7 A.)

Unfortunately, it is not clear what these values mean, nor what the Vref is or how it affects operation. The data sheet is very badly written.

The way most other manufacturers of Hall effect devices specify their products is with a graph showing output voltage against input current, which is very easy to interpret.

I interpret the data sheet to say that the quiescent output voltage is 2.5 V ± 15 mV, and the rated change in output voltage is 0.625 V for a change in input current of 100 A.

If that is the case, the change of input current that would give you a 1.0 V change in output voltage is 100 A ÷ 0.625 V = 160 A, so your calibration constant is 160. It is of course different for the other versions in the range.
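As a minimal sketch of how that value would be used (assuming the 100 A version, a 5 V Arduino whose ADC can read the 2.5 V ± 0.625 V output directly, the sensor wired to analog input A0, and EmonLib's standard current-only example; the pin and sample count are assumptions, not from this thread):

```cpp
// Minimal EmonLib sketch for the HSTS016L (100 A version).
// Pin A0 and the sample count are assumptions for illustration;
// adjust them to match your own wiring.
#include "EmonLib.h"

EnergyMonitor emon1;

void setup() {
  Serial.begin(9600);
  // 160 = the current (in A) that produces a 1 V change at the
  // ADC input (100 A / 0.625 V), as derived above.
  emon1.current(A0, 160);
}

void loop() {
  double Irms = emon1.calcIrms(1480);  // 1480 samples per reading
  Serial.print("Irms: ");
  Serial.println(Irms);
  delay(1000);
}
```

EmonLib's calcIrms() tracks and removes the d.c. offset in software, so the sensor's 2.5 V quiescent level can be fed straight to the ADC input without any extra biasing components.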
