Australian AC Voltage Sensor Adapter:


Just wondering if there is the possibility of purchasing an AC Voltage Sensor Adapter with an Australian plug from either the shop or elsewhere?

As far as I’m aware, the shop stocks only UK, US and EU style adapters. You can however use any a.c. output adapter, or a 9 V transformer (if you put it in a box). The output should be a nominal 9 V; in reality, the calibration of the standard (factory-loaded) sketches is set for 11.6 V, which is the open-circuit voltage we expect. If you do go that route, you will probably need a programmer to adjust the calibration (though, with the exception of phase error correction, it can be done in emonHub).
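To illustrate the calibration arithmetic, here's a minimal sketch. The 268.97 starting constant is an assumed example figure for a sketch calibrated against the 11.6 V (no-load) UK adapter; check your own sketch for the actual value.

```python
# Sketch: rescaling an existing voltage calibration constant for an
# adapter with a different no-load (open-circuit) output voltage.

def rescale_vcal(vcal_old, vnoload_old, vnoload_new):
    """The constant relates mains volts to adapter output volts,
    so it scales inversely with the adapter's no-load voltage."""
    return vcal_old * vnoload_old / vnoload_new

# Assumed starting point: 268.97 with an 11.6 V (no-load) UK adapter.
# Swapping in a 10.4 V (no-load) adapter calls for a larger constant:
print(round(rescale_vcal(268.97, 11.6, 10.4), 1))  # → 300.0
```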

A local one was discussed here:

The link to the Jaycar product page was posted by dBC in post #3 just above it.

Curses! I knew I’d seen a mention of one somewhere, but couldn’t find it on my crib notes.

Did you ever get round to testing that Jaycar one for phase shift?

There’s no mention of the no-load output voltage, so calibration will be essential. (The no-load voltage must not exceed 14.3 V at the maximum mains voltage you experience, 260 V if dBC’s mains is anything to go by - which relates to about 13.2 V at the nominal 240 V. It’s unlikely that a nominal 9 V output will reach that on no-load.)
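As a quick worked check of those numbers (the adapter output scales linearly with mains voltage):

```python
# The 14.3 V no-load ceiling applies at the worst-case mains voltage;
# scale it back to find the corresponding limit at nominal mains.
v_ceiling = 14.3     # maximum acceptable no-load output (V)
v_mains_max = 260.0  # worst-case mains voltage (V)
v_nominal = 240.0    # nominal Australian mains (V)

limit_at_nominal = v_ceiling * v_nominal / v_mains_max
print(round(limit_at_nominal, 1))  # → 13.2
```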

I didn’t, but I haven’t forgotten either. The paying job seems to be consuming significant time this year; it’s already May and I’m sure it was only just Christmas!

That’s the same VT I’ve been using in all my stm32 experiments with the emontxshield demo stuff (STM32 Development). So I can add some data, at least for my specimen. My no-load voltage ratio is almost exactly 1:23, i.e. when I put in 230V I get out 10V. And my overall VT+emontxshield voltage ratio is almost exactly 300. When I put in 230V I get out 766mV. The transformer divides by 23 and the shield divides by 13 (nominally), for a total division of 299 (measured at closer to 300).
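Those ratios can be sanity-checked in a couple of lines (the 766 mV figure is the measured output quoted above):

```python
# Nominal division: 23 (transformer) x 13 (shield divider, nominal).
vt_ratio = 23.0
shield_ratio = 13.0
print(vt_ratio * shield_ratio)   # → 299.0  (nominal total)

# Measured overall ratio from 230 V in to 766 mV out, close to 300:
print(round(230.0 / 0.766, 1))
```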

My phase measurements have all been relative to the SCT013 phase error, not absolute. At 230V, 10A, I found I had a net phase error of ~4.8 degrees. I then calibrated that away using the simple ADC lag technique discussed in the stm32 thread, so now at 230V and 10A I have no phase error at all.
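The ADC-lag technique itself is described in the stm32 thread; purely to show the arithmetic behind it, here's how a phase error converts into an equivalent sample delay. The 9 kHz sample rate below is an assumed figure for illustration, not taken from that thread.

```python
def phase_error_in_samples(phase_deg, mains_hz, sample_rate_hz):
    """Convert a phase error into the equivalent number of ADC samples:
    each sample spans 360 * f_mains / f_sample degrees of the mains cycle."""
    deg_per_sample = 360.0 * mains_hz / sample_rate_hz
    return phase_deg / deg_per_sample

# Hypothetical: a 4.8 degree net error at 50 Hz, sampling at 9 kHz.
print(phase_error_in_samples(4.8, 50.0, 9000.0))  # → 2.4
```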

[Section about distortion at higher voltages retracted, as I’ve now discovered much of the distortion at the higher voltages is actually coming from the calibrator, which is limited to driving just 20 mA on its V output].

In theory I should be able to bypass the VT altogether, set the calibrator to ~10V and feed that straight into the AC jack on the shield. Then the phase error will be entirely down to the CT, so I can measure that at 10A (or any other current for that matter) and then turn those relative VT phase errors into actual VT phase errors. I’ll add that to the list of things to try when I get a chance.


I might need to set it quite a bit lower than 10V with that 10uF cap loading it up.

The 10 μF is tying the bias mid-rail voltage to GND a.c.-wise; it’s not loading the c.t. (i.e. the calibrator) output.

If the calibrator output is tied to the same GND as the emonTx GND (and you can’t apply a d.c. offset), then you need to a.c. couple the output to the biased input, and in that case you’ll need a load (>10 kΩ or so) between the ADC input and the bias mid-rail to establish the d.c. conditions.

Yeh, thanks… not sure what I was thinking there… but I was clearly mis-remembering where that cap goes.

Actually, the calibrator outputs are isolated (with some common-mode restrictions). In theory, I think I ought to be able to:

  1. replace the VT with the calibrator’s V output (suitably scaled down to ~10V or so) and fully characterise the CT (that’s more like my typical use of it, since my energy monitor doesn’t involve a VT but instead uses shunts).

  2. replace the CT with the calibrator’s I output (suitably scaled down to some tens of mA) and somewhat characterise the VT - limited by the restriction of only being able to pump 20 mA into it before the V signal starts to distort.

That way there’ll only be one source of phase errors at a time as there’s only one transformer (CT or VT) involved at a time.

Sounds good to me. The common-mode voltage should only be 2.5 V maximum (5 V Arduino Shield).

Can I ask a really dumb question in regard to the original question… Is there anything wrong with buying the UK AC sensor and using a UK to AU adapter? Don’t both countries run on the same voltage?

In theory, no. In practice, it’s better not to have adapters.

Yes, and frequency.

For the Jaycar adapter, we also have:
Voltage calibration coefficient - 300.0 (no-load = 10.4 V)
Unfortunately, we don’t have details of its phase error.

I have a sample.

Powertech MP-3027: 0.92° at 230 V / 50 Hz. Rated 1000 mA, 445 g.

There is also

DCSS AC910: 2.17°. Rated 1000 mA, 327 g.

As usual, the heavier transformer exhibits lower phase shift.

These were both measured across a 13 kΩ load.

I am running with an adapter and it works fine, but obviously it depends just how deep into calibration and accuracy you want to get. I have updated my P1 & P2 scales slightly and emon monitoring is now very close to what my smart meter is gathering in terms of totals.

nodename = emonpi
    names = power1,power2,power1pluspower2,vrms,t1,t2,t3,t4,t5,t6,pulsecount
    datacodes = h, h, h, h, h, h, h, h, h, h, L
    scales = 1.07,1.16,1,0.01,0.1,0.1,0.1,0.1,0.1,0.1,1
    units = W,W,W,V,C,C,C,C,C,C,p
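For anyone tuning these values: emonHub decodes each field according to its datacode and then multiplies it by the matching entry in `scales`. A rough illustration using the names and scales from the fragment above (the raw values are made up):

```python
# emonHub-style scaling: each decoded raw value is multiplied by its
# scale factor before being passed on.
names  = ["power1", "power2", "power1pluspower2", "vrms"]
scales = [1.07, 1.16, 1, 0.01]
raw    = [350, 412, 762, 24012]   # hypothetical decoded integers

scaled = {name: value * scale for name, value, scale in zip(names, raw, scales)}
print(round(scaled["vrms"], 2))   # → 240.12  (volts, after the 0.01 scale)
```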

I am running with three adapters. They were pretty much spot-on against the voltage measured with a multimeter, whereas the Jaycar unit was considerably off. I don’t have the exact measurements, but all three are behaving quite consistently and reliably.

All a.c. adapters will have a similar way of specifying the output: there will be the nominal voltage at full load, which will have a manufacturing tolerance, usually a few percent. Then there might be two ways of specifying the output at no-load - which is a good approximation to our usage: they might specify the no-load voltage, or they might specify the regulation as a percentage, i.e. the percentage rise in voltage above the voltage at full load.

If you use the UK adapter, that’s nominally a 9 V unit: the output is 9 V ± 5% at full load, but 11.6 V ± 3% at no-load. So the regulation is nearly 30% - not unusual for a very small transformer.
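That regulation figure comes straight from the two voltages, using the UK adapter's numbers above:

```python
# Regulation: percentage rise of the no-load voltage above full-load.
v_full_load = 9.0    # nominal output at full load (V)
v_no_load = 11.6     # open-circuit (no-load) output (V)

regulation_pct = (v_no_load - v_full_load) / v_full_load * 100.0
print(round(regulation_pct, 1))  # → 28.9
```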

I just bought myself an Ideal Power 77DS-12-09 without entirely knowing what I was doing (made sure volts matched and current was not lower). It arrived today, I plugged it in and my reported voltage has dropped from ~240V (from the Ideal Power DE-06-09 that came with my EmonPi many years ago) to ~225V.

Does anyone have any details on its calibration? I don’t have the gear needed to do such a calibration myself.

If nobody has details on its calibration, what gear do I need to do such a calibration?

You’re right that it needs a new calibration constant. Unfortunately, the data sheet fails to give the parameter which is important and which would enable me to calculate the calibration for you – this is the open-circuit voltage. (This is because the 9 V output is at full load. As we use the adapter, it has a load so small you can discount it. Consequently, the voltage is higher than 9.0 V, but they don’t say how much.)

Do you still have the DE-06-09? Or a smart meter, or something else, maybe a PV inverter, that displays voltage? If you only have the old adapter, the best you can do is plug that in, note the voltage it displays, change to the new one and note the voltage again. Then adjust the calibration constant by the ratio of the two voltages.
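A sketch of that ratio adjustment (the 268.97 starting constant and the two readings below are illustrative, not measured):

```python
# Note the reported voltage with the old adapter, swap in the new one
# without changing anything else, note the reported voltage again, then
# scale the calibration constant by the ratio of the two readings.
def adjust_cal(cal_old, v_reported_old, v_reported_new):
    return cal_old * v_reported_old / v_reported_new

# Illustrative figures: old adapter reports 240 V, new adapter reports
# 225 V with the same (unchanged) constant of 268.97.
print(round(adjust_cal(268.97, 240.0, 225.0), 1))  # → 286.9
```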

If you have none of these, then it’s time to beg, borrow or buy a multimeter. Measure the mains voltage and adjust the calibration to give the same value.

Some years ago I reviewed various makes & models available in the UK (it’s in Docs → Learn → Electricity Monitoring → Current and Voltage) but it’s well out of date now. If you wish, post links to a few that are available locally; I can give you a bit of guidance, but it will be based on what I read, probably not on experience of any particular one.

Oh, I have a multimeter somewhere, but I use it so rarely that I’ll have to review before I stick some probes in a power point.

So the relationship between the mains voltage and adapter voltage is linear? Or is near enough to linear at normal grid voltages?

Do be careful. Read the manual first – any doubts come back here and ask. Select the highest a.c. range you have, and keep your fingers out. :face_with_hand_over_mouth:

The adapter (it’s only a plain isolating transformer, nothing else inside the case) is certainly linear over the normal range of mains voltages. I wouldn’t like to vouch for it above about 255 V – I’d expect it to read low as you get above that.