Hi, I’m Andrea and I’m a new user.

I’m testing a self-built shield for consumption reading, similar to the energy-monitoring shield for Arduino.

The AC-AC adapter is a 220 V to 9 V PCB transformer, and there are 5 CT sensors connected.

Everything works: I have done a ‘temporary’ calibration and I can get a power factor very close to 1.0, but only in a certain range of values. Below 80 W the power factor drops to 0.40, and the real power and apparent power are actually very different.

I do not understand if the cause could be the transformer, the calibration, or the choice of components. I tried to recalibrate the sensors for lower powers empirically, but that made the readings at higher powers worse. What should I do? Is this normal? Would it be wrong (and above all, is it even possible) to recalibrate the sensors automatically based on the power band being read? Thank you.

If you look at the test report (in ‘Learn’) for the YHDC SCT-013-000, you will see that the phase error changes rapidly at low and high currents. This is typical of all current transformers; the only difference is that if you buy an expensive “revenue” or metering-grade c.t., it will happen at a lower current and so give you a wider usable band. So yes, it is normal. If you look at the specification, you will see that the c.t.’s accuracy is specified over a limited range, normally given as a percentage of the rated current. In the case of the SCT-013-000, non-linearity is specified as ±3% over the range 10%–120% of rated input current; phase error is not even mentioned!

You could try to apply a correction based on the measured current. I know how to do it, but it means rewriting emonLib and I have not yet tried to write the code to put it into practice.
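One possible shape for such a correction (this is not emonLib code, and the PHASECAL values in the table are invented purely for illustration) is a small table of PHASECAL values measured at a few currents during calibration, interpolated at run time from the RMS current of the previous mains cycle:

```cpp
#include <stddef.h>

// Hypothetical correction table: PHASECAL measured at a few RMS
// currents during calibration. The numbers here are made up; real
// values would come from calibrating against a known resistive load.
struct PhasecalPoint { float amps; float phasecal; };

static const PhasecalPoint kTable[] = {
  {0.3f,  2.40f},   // large phase error at low current
  {1.0f,  1.90f},
  {5.0f,  1.45f},
  {50.0f, 1.28f},   // within the c.t.'s well-behaved range
};
static const size_t kPoints = sizeof(kTable) / sizeof(kTable[0]);

// Linear interpolation of PHASECAL for the measured RMS current,
// clamped at both ends of the table.
float phasecalFor(float irms) {
  if (irms <= kTable[0].amps) return kTable[0].phasecal;
  if (irms >= kTable[kPoints - 1].amps) return kTable[kPoints - 1].phasecal;
  for (size_t i = 1; i < kPoints; ++i) {
    if (irms <= kTable[i].amps) {
      float t = (irms - kTable[i - 1].amps) /
                (kTable[i].amps - kTable[i - 1].amps);
      return kTable[i - 1].phasecal +
             t * (kTable[i].phasecal - kTable[i - 1].phasecal);
    }
  }
  return kTable[kPoints - 1].phasecal;
}
```

The catch, as noted above, is that emonLib applies one fixed PHASECAL per channel, so using a current-dependent value means modifying the library itself.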

You do not say which c.t. you are using, or what your maximum current is. Don’t forget that, depending on the wire size and the c.t., you might be able to have a multi-turn primary winding, which will increase the sensitivity and reduce the error of your c.t. (E.g., to test the 100 A SCT-013-000 c.t. up to 250 A, I used a 50-turn coil for the primary winding with a current of 5 A flowing. I routinely use a 20-turn coil with 5 A to give me an apparent current of 100 A.)

Hi, thanks for your answer.

I’m using five SCT-013-000 c.t.s in a 220–230 V domestic system with a maximum of 16 A, with a photovoltaic system with on-site exchange; the circuits are the house, the shed and the tavern.

The intent is to measure the various consumptions in order to split the costs, and to verify the power fed into the grid.

I do not know how many turns the transformer has.

However, your answer made clear to me why I had difficulty with the readings at various times of the day!

I’ll see what I can do at the software level and how to optimize the hardware.

Thank you very much!

That c.t. is rated at 100 A, so I am not surprised that you have a problem with an 80 W load (~350 mA).

If it is possible to do this, you can pass the wire (the transformer primary winding) through the centre of the c.t. up to 6 times. Six turns of your 16 A circuit will give you a maximum reading of 96 A (6 × 16 A). Each extra turn lowers the current at which the error appears, but the current you read will be *n* times the true current. You change the calibration constant (multiply it by 1/n) to correct that.
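As a sketch of the arithmetic (assuming the standard 100 A : 50 mA c.t. with an 18 Ω burden, so a single-turn calibration constant of 2000 / 18 ≈ 111.1):

```cpp
// With an n-turn primary, the c.t. sees n times the true current, so
// the constant that converts its output back to true amps is divided
// by n.
float calibrationForTurns(float singleTurnCal, int turns) {
  return singleTurnCal / turns;
}

// The highest true current you can measure is the c.t.'s rating
// divided by the number of turns: e.g. 6 turns of a 16 A circuit
// looks like 96 A to a 100 A c.t., just inside its rated range.
float maxTrueCurrent(float ratedAmps, int turns) {
  return ratedAmps / turns;
}
```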

Sorry, the sensor is the SCT-013-000 (50 mA output).

Maybe I can assume an ideal maximum of 3 kW from the grid and 3 kW from the solar panels, about 30 A.

I do not understand: must every sensor (all 5) have the primary cable wound through it, and is that the 220 V cable that feeds the transformer, or the cable on which I am making the measurement?

I ordered this kind when I still found some ‘confusion’ on the internet about the internal burden resistor, the number of turns, and Arduino analog readings. Then I found OpenEnergyMonitor and now I’m trying.

The transformer is the PCB version and the circuit is on a prototype board. The readings are not bad: I have tested them a bit with multimeters, with a power meter, and against the solar inverter, with various loads. The problem I see is only at low consumption.

Is it better if I change the sensor for a 30 A one?

What would you recommend to me?

Reading here:

https://learn.openenergymonitor.org/electricity-monitoring/ct-sensors/measurement-implications-of-adc-resolution-at-low-current-values

I thought about the possibility of increasing the resolution by oversampling the analog readings of the 10-bit Arduino ADC; maybe accuracy could improve if the number of samples is increased. Am I saying something stupid?
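For reference, the usual oversampling-and-decimation recipe looks like this (a sketch, not emonLib code; `samples` stands in for successive `analogRead()` results):

```cpp
#include <stdint.h>

// To gain n extra bits of resolution over a 10-bit ADC, sum 4^n
// consecutive readings and right-shift the sum by n bits. This only
// works if there is at least ~1 LSB of noise on the input to dither
// the readings between adjacent codes.
uint32_t oversample(const uint16_t *samples, unsigned extraBits) {
  uint32_t count = 1UL << (2 * extraBits);   // 4^extraBits readings
  uint32_t sum = 0;
  for (uint32_t k = 0; k < count; ++k) sum += samples[k];
  return sum >> extraBits;                   // (10 + extraBits)-bit result
}
```

Note that taking 4^n samples per reading costs sample rate, which matters when you are also trying to sample many points per mains cycle.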

The SCT-013-000 is a **current transformer**. It measures current. The voltage of the wire on which it sits does not make any difference. The current transformer needs two windings, but it comes with only one, the secondary winding. You supply the *primary* winding, and that is the wire you clip it onto.

To read up to 33 A with your SCT-013-000, you can pass the main wire onto which you clip the c.t. three times through the c.t. That way, you make a primary winding of 3 turns. So instead of your c.t. being 100 A : 50 mA, it becomes a 33.33 A : 50 mA one.

I think that will not make any real improvement. The SCT-013-030 is almost identical inside, but it has an internal burden resistor.

Oversampling might help with the accuracy problem that you have. Your problem is that the phase error of the c.t. becomes large at low currents. You can improve that by reducing the value of the burden resistor, but then you have a smaller voltage to measure. Oversampling might then help a little.

Now it’s all much clearer! Thanks a lot, I will try!

Thank you again

Hi, here I am again…

I’ve tested your solution in two ways:

First, I made a 3-turn primary with 4 mm² wire and changed the calibration constant from 111.1 (I have an 18 Ω burden resistor; that value is the starting point given in ‘Learn’, and in any case it was corrected during a test with a 1000 W resistive load) to 111.1 / 3 = 37.03.

The error scaled down, but the power factor is far from 1.0.

For the second test I tried 10 turns and 1/10 of the calibration, but with 1.5 mm² wire and either no load or a maximum load of 20–30 W of LED lamps. With no load I get a power reading of 3.5 W, and with a load of about 150 W the reading is correct.

I don’t want to try more load because I need a correctly sized wire: 1.5 mm² is too small, and 4 mm² is more correct for a maximum of 16 A.

Now my question is this: the ratio between real power and apparent power is near 1.0 above 100 W with only 1 turn, and with 10 turns at no load or a small load I see a correct real power, but cos φ is completely wrong (0.06!). Considering that cos φ represents the phase displacement between real power and apparent power, usually present with an inductive or capacitive load, what should I consider correct? The real power, when I have a load that I know, even if the power factor is wrong? Does the c.t. look like an inductive load when the load is too small? Is the problem only the sensitivity of the c.t.? Do I have to settle for this error and therefore not consider readings below 50 W as correct? Or should I find sensors that allow me to wrap more turns of adequately sized cable? Would amplifying the output of the c.t. improve the accuracy, or would it only make the reading scale coarser?

From the link that I posted, the problem at low load is only the false reading from the sensor, which produces a wrong result in the calculation of the measurement…

Thanks for your patience… and sorry for the number of questions!

Thanks

Cos(φ) is not the same as power factor *unless* both your voltage wave and your current wave are pure sine waves - and usually, you only have that in a textbook! In the real world, your voltage might be close to a sine wave, but your current waveform can be anything. You only need to browse this site to see examples of current waveforms that are a long way from being a sine wave. The current will be the same shape as, and exactly in phase with, the voltage wave only when you have a pure resistance as your load. EmonLib calculates power factor according to the definition: it is the ratio of real power to apparent power.
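A minimal sketch of that definition, computed from simultaneous voltage and current samples. Fed a square-wave current exactly in phase with a sine-wave voltage (a crude stand-in for a distorted load), it returns a power factor of about 0.90 even though the phase shift is zero, which is exactly why PF and cos φ are not the same thing:

```cpp
#include <math.h>

// Power factor as emonLib defines it: real power / apparent power.
// v[] and i[] are simultaneous samples over a whole number of cycles.
float powerFactor(const float *v, const float *i, int n) {
  double sumP = 0.0, sumV2 = 0.0, sumI2 = 0.0;
  for (int k = 0; k < n; ++k) {
    sumP  += (double)v[k] * i[k];   // instantaneous power
    sumV2 += (double)v[k] * v[k];
    sumI2 += (double)i[k] * i[k];
  }
  double apparent = sqrt(sumV2 / n) * sqrt(sumI2 / n);  // Vrms * Irms
  return apparent > 0.0 ? (float)((sumP / n) / apparent) : 0.0f;
}
```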

You must use a purely resistive load when calibrating - your LED lamp could easily have a current waveform that is not a sine wave, and a power factor that is not 1.0. (I installed some 3 W LED lamps for a friend 2 days ago; the power factor of those is 0.4. If you calibrated your instrument with that, it would be very wrong.)

Let me explain again what happens when you measure a power. Both the c.t. and the v.t. have a phase error. There is also a time difference between when you read the voltage sample and when you read the current sample. That time difference *looks like* a phase error. The transformer errors are in the same direction so they will tend to cancel each other, but not completely. The timing error will add to or subtract from that, depending on which comes first. Phasecal is a mathematical trick that tries to move the voltage wave in time so that it aligns precisely with the current wave. If that is not done correctly, it will look as if your load is reactive when it is not, and so the power factor will be wrong.
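The interpolation itself is one line; this mirrors the phase-shifted voltage calculation inside emonLib’s calcVI:

```cpp
// Estimate the voltage at the instant the current was sampled by
// interpolating between the last two voltage samples. PHASECAL = 1.0
// means "use the latest sample as-is"; values above 1 extrapolate
// forward in time, values below 1 shift back towards the older sample.
float phaseShiftedV(float lastFilteredV, float filteredV, float phasecal) {
  return lastFilteredV + phasecal * (filteredV - lastFilteredV);
}
```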

Unfortunately, the error in the v.t. depends on the voltage, and the error in the c.t. depends on the current. If you spend a lot of money (and I mean **a lot**), then you can buy transformers that have very small errors. But a c.t. will always have a phase error at very low currents (relative to the rated current). You cannot avoid that - it is because of the power needed to magnetize the core.

The SCT006 (from our shop) is good up to (but not above) 16 A with the 22 Ω burden of the emonTx. It is designed for a 5 Ω burden so I did not test it with a higher resistance burden. I tested that down to 25 mA, which is about 6 W, the report is in ‘Learn’. Compare that to the SCT-013-000 which I tested down to 250 mA, at which current it started to show increasing phase errors, as you have seen.

So you are right: