Can I use my doorbell transformer to measure voltage for real power measurement?

The first thing I notice is that your voltage measurements are about 10 volts low
(unless, of course, your line voltage really is 110 V, but it should be closer to 120 V).

Have you measured your line voltage with a reasonably accurate meter?

The other thing that sticks out is your measured power factor.
PF can never be greater than 1, so you have an issue there as well.

Yes, I measure 115.x V with my multimeter and the Kill A Watt agrees. I’ll keep testing, but I figured I’d get some input on what I’m seeing so far. I was thinking my calibration values were to blame, but I’ve double-checked and I think I have them calculated properly.

If your line voltage measures ~115 VAC and your device is reporting ~110 VAC, that says to me that at least one cal factor is off.

Then there’s the Power Factor. Something’s off there too.

I’ll keep testing. Was hoping one of you would see an obvious error based on my numbers but you’ve given me some things to look out for as I progress :slight_smile: appreciate the feedback

Keep hammerin’ at it!



How did you measure the line voltage and the output of the transformer at the same time? What I’m suggesting is, unless you’ve got a pair of multimeters, the voltage probably changed between measurements. Probably not all of the discrepancy will be due to this alone, but some will be.

But what happens next, is this voltage (I don’t think you’ve mentioned the value) going straight into the ADC, or do you divide it again? If you do, what’s the value and tolerance of these components?

Two ADCs? Do they have the same voltage reference? If your ADCs’ reference is not what you think it is, there’s another source of calibration error.
And bear in mind that the law of natural perversity says all the individual errors must add up in the worst possible combination.

There’s definitely an approximation somewhere in your maths, because you should be using the same voltage and current (as numbers) for both the real and the apparent power calculation, so you shouldn’t be getting a power factor greater than 1. If you’re using the old discrete sample version of emonLib, then it’s almost certainly the interpolation in that which is the source of this error. If it worries you, you can use the maths from these contributions: EmonLib: Inaccurate power factor and
Rms calculations in EmonLib and Learn documentation - #3 by mafheldt;
this has been incorporated into emonLibDB (but not into either emonLibCM or emonLib).
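
To illustrate what I mean (a minimal sketch only, not the actual emonLib code): if the same sets of samples feed both calculations, the real power can never come out larger than the apparent power, so the ratio stays at or below 1.

    // Sketch only - not emonLib. With the SAME sample arrays used for both
    // calculations, |real power| <= apparent power (Cauchy-Schwarz), so PF <= 1.
    #include <math.h>

    double powerFactor(const double v[], const double i[], int n)
    {
      double sumP = 0, sumV2 = 0, sumI2 = 0;
      for (int k = 0; k < n; k++) {
        sumP  += v[k] * i[k];     // instantaneous power
        sumV2 += v[k] * v[k];     // for Vrms
        sumI2 += i[k] * i[k];     // for Irms
      }
      double realPower     = sumP / n;
      double apparentPower = sqrt(sumV2 / n) * sqrt(sumI2 / n);
      return realPower / apparentPower;   // magnitude cannot exceed 1
    }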

I’m out right now so I’m going to read this in depth when I get home. Just figured I’d share my little test setup so you can see how I’m measuring the voltage and the transformer at the same time. I’m using a Kill A Watt and an AC line splitter. I can see the voltage on the Kill A Watt and with my meter on the test points of the AC line splitter. The transformer wire can’t be seen in this photo; I have it running to my bench setup on a wire from my panel.

OK, it wasn’t clear how you were measuring the voltages. Remember, I only know what’s written here.

Hopefully my photo helped shed some light on my setup.

My voltage input uses the setup suggested in the openenergymonitor docs for reading the AC voltage. R1 measures 9.96K and R2 measures 328. When I measure the voltage on the transformer with my multimeter it reads 20V. The transformer is marked 9V, so I’m assuming the 20V would drop if it was loaded? I did my calculations for reducing the transformer voltage using 24V to make sure I would safely stay within range. With this divider it should drop to 0.765V. Then I’m adding a 1.65V offset, and that is the signal I’m reading with my ADC. You had mentioned you usually use 1.1V for 3.3V ADCs. Could this be part of my problem?
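
For what it’s worth, here’s the quick sanity check I did on those numbers (just a rough sketch; the 3.3V supply and sine-wave assumptions are mine):

    // Rough sanity check of the divider + offset, using the values above.
    // Assumptions: 3.3 V supply, 1.65 V mid-rail offset, sine-wave input.
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
      const double R1 = 9960.0, R2 = 328.0;   // measured divider resistors
      const double Vtx = 24.0;                // worst-case transformer rms used for headroom
      const double Voffset = 1.65;            // DC bias added after the divider

      double ratio = R2 / (R1 + R2);          // ~0.0319
      double Vrms  = Vtx * ratio;             // ~0.765 V rms at the ADC pin
      double Vpk   = Vrms * sqrt(2.0);        // ~1.08 V peak

      printf("rms at ADC: %.3f V\n", Vrms);
      printf("swing at ADC: %.2f V to %.2f V\n", Voffset - Vpk, Voffset + Vpk);
      return 0;
    }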

Here is how I calculated the calcVI calibration constant. I’m not super confident in my work here.

Rd = (9.96K + 328) / 328 = 31.36
Rt = 120 / 20 = 6

6 * 31.36 = 188.16
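
In code form, that’s the same arithmetic (a sketch only; note it uses the no-load 20V figure, so the real ratio will change once the divider loads the transformer):

    // Sketch of the calcVI voltage calibration constant from my measured values.
    // Caveat: 20 V is the no-load secondary voltage, so this is only approximate.
    const double R1 = 9960.0, R2 = 328.0;
    const double Vprimary = 120.0, Vsecondary = 20.0;

    const double Rd   = (R1 + R2) / R2;        // divider attenuation, ~31.37
    const double Rt   = Vprimary / Vsecondary; // transformer ratio, 6
    const double VCAL = Rt * Rd;               // ~188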

Yes, I’m running two ADCs. They are multi-channel, but since they operate at 3.3k SPS I didn’t want to use multiple channels (it slows things down) and opted to go with a dedicated ADC for each. The 12-bit ADS1015 has an internal voltage reference and its output value is based on the PGA setting. I’ve modified the EmonLib code to work with this. For example, the line:

double V_RATIO = VCAL *((SupplyVoltage/1000.0) / (ADC_COUNTS));

is now:

double V_RATIO = VCAL * (_gain[VOLTAGE_ADC] / 2047.0f);

_gain is an array containing the gain settings of each ADC. In that code, the value is 4.09600019
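
So the conversion I’m effectively assuming is (a sketch, not the library code; I’m treating 4.096V as the full-scale voltage for the ±4.096V PGA setting, with +2047 counts as positive full scale on the 12-bit result):

    // Sketch of the counts-to-volts conversion I'm assuming for the ADS1015.
    // Assumption: 'fs' is the PGA full-scale in volts (4.096 for the ±4.096 V
    // setting) and the result is signed 12-bit, so positive full scale is ~2047 counts.
    #include <stdint.h>

    double adsCountsToVolts(int16_t counts, double fs)
    {
      return counts * (fs / 2047.0);   // volts at the ADC pin
    }
    // Multiplying by VCAL then gives mains volts per count, which is what the
    // modified V_RATIO line above does.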

I suspect I may have an issue in the low-pass filter, but my Irms value is spot on, so that makes me feel like it’s correct:

    //-----------------------------------------------------------------------------
    // B) Apply digital low pass filters to extract the 2.5 V or 1.65 V dc offset,
    //     then subtract this - signal is now centred on 0 counts.
    //-----------------------------------------------------------------------------
    //offsetV = offsetV + ((sampleV-offsetV)/ ADC_COUNTS);
    offsetV = offsetV + ((_gain[VOLTAGE_ADC]*(sampleV-offsetV)) / 2047.0);
    filteredV = sampleV - offsetV;
    //offsetI = offsetI + ((sampleI-offsetI) / ADC_COUNTS);
    offsetI = offsetI + ((_gain[PHASE_1_ADC] *(sampleI-offsetI)) / 2047.0);
    filteredI = sampleI - offsetI;

I’m going to give that a read and keep testing things. I appreciate your feedback. I’m sure you get tired of answering the same questions from different idiots who venture down their own path rather than following a tried and true recipe :joy:

Ouch! It’s 9 V at the rated output. This rise is called the “regulation” of the transformer, and I call that, at 120%, abysmal. Our a.c. adapter is bad at about 30% - the transformer supplying your home would probably be better than 5% (for power distribution transformers - UK practice - this is calculated the other way: as the drop at full load rather than the rise at no load). It won’t hurt to measure the voltage with your 10 kΩ load - I’d be prepared to bet it’s no longer 20 V.
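
For the record, the figure I’m quoting uses the usual no-load-rise definition:

$$\text{regulation} = \frac{V_{\text{no-load}} - V_{\text{rated}}}{V_{\text{rated}}} \times 100\% = \frac{20 - 9}{9} \times 100\% \approx 122\%$$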

The easy way to remember, and maybe calculate, the calibration constant is that it’s the mains voltage that would give exactly 1 V at the ADC input. As long as you measure both as either peak, or peak to peak, or rms, it doesn’t matter, because it’s a unitless number. I don’t know the transformer rating, so I can’t calculate the voltage drop you expect with your 10 kΩ load, but it may well go a long way towards explaining the 4% discrepancy.
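
Using your no-load numbers as an example (so treat the result as approximate):

$$\text{VCAL} = \frac{V_{\text{mains}}}{V_{\text{at ADC}}} = \frac{120}{20 / 31.37} \approx 188$$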

At the rms value, there shouldn’t be a problem with this. It’s not using the full range of the ADC, but so what? The supply voltage varies by only a few percent from the nominal, whereas with current it goes from 0% up to (maybe) 100% – it’s when the current is at the low end that you need to make sure you’re using the whole of the range of the ADC. You’re never likely to read the voltage below 90% of nominal, so losing a bit on the voltage measurement will make no real difference overall.

Just one word of caution… I am assuming it’s also providing power to the bell. Has anyone rung the bell yet?

I tried to do something similar a while back, not measuring the voltage but using the supply to the bell via a resistive divider to drive some logic (i.e. so my microcontroller knew when the bell rang).

I had an old-fashioned electro-mechanical bell in the house, and the voltage spikes when the bell operated were enough to fry my digital inputs! If this is a risk, you probably want some sort of diode clamp on the signal. I am sure someone on here can advise.

OK, so I feel my calibration constant is correct. One other thing I’m just figuring out now is that if I measure the voltage at various other outlets in my house, it’s more like 118V vs the 115V I’m seeing on the outlet I’m testing with. I’m assuming there’s some load on that line that’s causing the voltage to be lower. Hmmm.

You raise a good point. I need to check that. I’m leaning towards ditching this doorbell transformer and getting an AC-AC adapter, I think.

I think that was always the better option.

Agreed. I just need to find a good one that’s available to me here in Canada.

Here’s one from Jameco for $15.

https://www.jameco.com/z/ADU090150A2231-Jameco-ReliaPro-AC-to-AC-Wall-Adapter-Transformer-9-Volt-1500mA-Black-Straight-2-5mm-Female-Plug_112336.html

Here’s the datasheet:

https://www.jameco.com/Jameco/Products/ProdDS/112336.pdf

Does that look ok?

Edit - added excerpt of electrical data from datasheet, to post. Moderator - BT

@Robert.Wall am I correct to say that particular transformer has ~17% regulation which would make it a good candidate?

I really should have searched first

If the transformer’s load is constant, your only concern is accurately knowing the output voltage with this load.

Regulation matters when the load changes and the resulting voltage change affects either this load or other apparatus fed by the transformer.

(Consider what would happen if your house was fed by a big version of your doorbell transformer: it would be 120 V when all your appliances and lights were on, and about 260 V when you had just one light on. Not a satisfactory state of affairs.)

Yes, the Jameco one looks OK - you’ll need to measure the output because the no-load voltage is still subject to the 5% manufacturing tolerance. It appears to be available from Amazon (Canada) for $17.


Ordered one. Thanks!