# Emon Calibration

I want to use EmonLib.h to measure the AC power of an induction motor with an Arduino Uno. I have used a 220 V / 2 V potential transformer to lower the voltage and a 5:1 CT to lower the current. The potential transformer's output is further reduced by a voltage divider made of two 1 kΩ resistors. Then a DC offset is added, along with a biased clipping circuit, so as to protect the Arduino from any voltage spikes. The DC offset circuit not only adds a DC level to the AC signal but also outputs a signal with half the peak-to-peak voltage of the original. Please click on the link below to see the circuit. Both of the 5 V DC sources here are taken from Arduino pins.
Circuit Image

The outputs are fed to pin A0 for voltage and pin A1 for current. The current transformer has a 1 Ω burden resistance across it. I have gone through the calibration theory given on the site.
Calibration Theory

I cannot help from that description alone. Why can you not calibrate properly? Why are your results wrong? Is the voltage wrong, is the current wrong, or are both wrong? What do you measure, and what did you expect to measure?

You must, of course, change all the numbers given in the instructions to the actual numbers that you measure from your transformer and your other components. Also remember that the manufacturer's numbers might not be accurate, because the real value can differ from the catalogue value due to manufacturing tolerances.

If you post your maths, I might be able to help. And attach the circuit diagram here. I don’t look at 3rd party sites.

Sir,
Both of my values are wrong, but I think solving one will give me the method to solve the other. I am measuring the phase voltage of a star-connected system, and I expect 220 V to be displayed. Kindly go through the circuit diagram and voltage constant calculation attached here.

Your problem is not only calibration - your input circuit is wrong.

First, look at the circuit here and compare it with yours.

Second, is your transformer voltage 2 V? The reason I question it is because all small transformers give the rated voltage at full load. Under no load, which is essentially what 2 × 1 kΩ resistors are, then the voltage will be greater than 2 V, by a percentage called the “regulation”. A guess would be 20 - 25%, say around 2.5 V, but that’s by no means guaranteed. If you don’t know it for your transformer, just measure the mains voltage and the output voltage, and use that ratio in the calibration maths.

Finally, your clamping circuit is wrong. There is no need for D2, all that is doing is putting a constant 0.65 V or thereabouts voltage drop in series with the input. In fact, D2 should go to the GND rail as you are supposed to have one diode that conducts to GND when the input goes below GND, and one diode that conducts to Vcc when the input goes above Vcc. You only have the second one. You really need another resistor, say 1 kΩ, where you have D2 now, so as to limit the current in the input’s protection diodes to less than 1 mA. ( 1 kΩ will limit it to one diode forward drop ÷ 1 kΩ ≈ 0.6 mA, and won’t materially affect the calibration - it must be less than 10 kΩ.)

Your CT input will look exactly the same, except that my R1 is your burden resistor and my R2 is a short circuit. Remember that you’ll need at least a 1 W resistor for your burden, and it will get very hot.

When you have done those changes, I think your voltage calibration constant will be about 176, and your current calibration will be 5.

Sir,
I have made the changes according to the instructions above, plus a few more. I have changed my potential transformer; this one is 220 V / 7 V. Instead of a voltage divider circuit to scale down the voltage, I have used a potentiometer so that I can vary the voltage and see if it is working properly. And due to the unavailability of a 470 kΩ resistor, I have used 22 kΩ in the biasing circuit. A few things I am not sure about are:

1. Have I connected D2 in the proper position?
2. If I have connected it properly, then where should I connect point B (see the circuit below)?
3. Is the voltage constant the input voltage at the PT divided by the input at the Arduino, i.e. Vab, or the voltage at point A with respect to ground?

There was no need to change from the 1 kΩ resistors to 22 kΩ; we only use the highest practical value to keep the current low to extend battery life. You're not running on batteries, so it does not matter. And there was no need to change the transformer; all you needed was to know its actual voltage. I'd calculated that it was unlikely to exceed the input range of the Arduino, which is about 1.75 V rms, so your 2 V transformer's regulation would have had to be impossibly poor before that became a problem.

Diode D2 should connect between GND and the junction of D1, 100 Ω and 1 kΩ but with the polarity reversed so that the cathode is the signal and the anode is GND. So it will only conduct from GND to the input when the input voltage goes below GND (0 V).

I would not use a potentiometer. You can do so if you wish, but it is a component that can fail, or be moved accidentally, and you have the ability to alter the calibration in software anyway. All you need to do is make sure that, with the highest mains voltage that it is possible to have where you live, the alternating rms voltage at the Arduino input pin does not exceed 1.75 V.

For calibration, forget the direct voltages. We are only concerned with the alternating component. And of course it doesn’t matter whether you use rms, peak or peak-peak, it is the ratio that matters, not how you measure it.

Sir,
I have done as you directed and got proper results, but I still do not understand the calibration part fully. With the 220/2 V transformer and the 1 kΩ voltage divider circuit I get a voltage calibration constant of 340, and with the 220/7 V PT and the potentiometer I get a voltage calibration constant of 325. The results with 176, as you suggested, were wrong. The way I calculated it is by measuring the rms input at the PT and the alternating rms output after biasing and clipping, which is also the input at the Arduino pins; then I divide the former by the latter to get the constant. Please tell me whether this is the correct way or not.

Apart from this, one more problem is arising. I did all these tests with a bulb, i.e. a unity-power-factor load, on a single phase. Now I want to measure voltage, current, and power factor on a single phase of a three-phase delta-connected induction motor. So I have connected a three-phase autotransformer, connected the PT at its output terminals, specifically across neutral and one phase, and similarly connected the CT on any one phase. My aim is to get the phase voltage, current, and power factor of a running motor so that I can draw its power circle. But when I do this, I get a sinusoidal waveform for current but a distorted waveform for voltage, which I guess is due to harmonics. This distortion would change the rms value at the Arduino inputs and give me erroneous results. How do I handle this problem?

My value of 176 for the voltage calibration assumed your original transformer, divider and a guess at the transformer regulation, which is why I wrote "about". You can never calculate the exact calibration value, because you do not know the exact value of every component. The way you measured - mains voltage and Arduino pin voltage - is actually what all the maths about transformer ratio, regulation and divider ratio amounts to, and if I'm not wrong, the last line of the explanation says exactly that. The only way that you are wrong is that this number does not take account of the ADC reference voltage not being exactly 5 V. The accurate way to calibrate is to look at the voltage that your Arduino is reporting in software, and compare that with the mains voltage that you measure.
The real reason for calculating the calibration constant is to show whether there is a gross error somewhere - which usually means a wrong or faulty component. If the value you calculate from knowing the component values and transformer ratio is close to the value you get by trial and error, then you can be confident that everything is probably correct. (This is how we found that your input circuit was wrong - proving the method works!)