240VAC to 12VAC Transformers

Yes I have. All I can say is that I don’t have one of those adapters, so I’ll give it the benefit of the doubt.

While it’s not explicitly stated how the phase difference was measured there, the diagrams imply it was done by measuring the time difference at the zero crossings. My experience is that this is not accurate. When I look at those plots, there is a clear phase difference at zero and along the slopes on either side, but when you look at the peaks, there is virtually no difference. I believe that’s why my method yields different results: it weights that part of the curve much more than the zero-voltage part, since the high-amplitude samples near the peaks dominate the sums. That’s also what is happening in power factor calculations.
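To see the difference concretely, here’s a minimal Python/NumPy sketch (not IoTaWatt code; the sample rate, the 2 degree true shift, and the third-harmonic distortion are made-up values for illustration). The distortion moves the zero crossing but is orthogonal to the fundamental, so the amplitude-weighted dot-product estimate stays close to the true shift while the zero-crossing estimate wanders:

```python
import numpy as np

fs, f = 10000, 50                # sample rate and line frequency (assumed)
t = np.arange(0, 1 / f, 1 / fs)  # exactly one 50 Hz cycle
th = 2 * np.pi * f * t

# Reference voltage and a secondary lagging it by a true 2 degrees,
# offset from t=0 so the zero crossings land mid-array. The small
# third harmonic mimics distortion near the zero crossings.
a = np.sin(th - np.radians(10))
b = np.sin(th - np.radians(12)) + 0.01 * np.sin(3 * th + 1.0)

# Power-factor-style estimate: acos of the normalized dot product.
# The high-amplitude samples near the peaks dominate the sums, and a
# harmonic is orthogonal to the fundamental, so the distortion barely
# moves this number.
pf = np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))
print("dot product  : %.2f deg" % np.degrees(np.arccos(pf)))

# Zero-crossing estimate: interpolated time of the first rising
# crossing of each waveform. The distortion shifts the crossing, so
# this estimate moves noticeably even though the peaks line up.
def rising_zero(x):
    i = np.where((x[:-1] < 0) & (x[1:] >= 0))[0][0]
    return (i - x[i] / (x[i + 1] - x[i])) / fs

print("zero crossing: %.2f deg" % ((rising_zero(b) - rising_zero(a)) * f * 360))
```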

I get about 2 degrees for the Ideal UK adapter. The crappiest 120V adapter that I’ve tested is in the 5 degree range, but most of the adapters with reasonable-weight cores are around 2. The Ideal 120V adapter is 1.55 degrees. I don’t believe it changes by two or three degrees with voltage variation.

I use the same technique to measure the shift in the CTs. When I measure the net phase difference between a VT and a CT, it is very close (within 0.1-0.2 degrees) to the difference between the phase leads that I measure for the individual transformers. So it all works for me, and the accuracy of the device seems to be right on as a result.

When doing these calculations, I use a technique where one of the signals is numerically shifted by 30-40 degrees, the phase difference is calculated, and then the artificial shift is subtracted. That moves the calculation into a region where the cosine has greater slope, and so better resolution.
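Here’s a rough sketch of that idea, again in Python/NumPy rather than the actual firmware; the 20-sample (36 degree) shift, sample rate, and noise level are assumptions for illustration:

```python
import numpy as np

fs, f = 10000, 50                # sample rate and line frequency (assumed)
t = np.arange(0, 1 / f, 1 / fs)  # exactly one cycle, so np.roll is a
th = 2 * np.pi * f * t           # clean circular time shift

rng = np.random.default_rng(2)
a = np.sin(th) + rng.normal(0, 1e-3, th.size)                  # reference
b = np.sin(th - np.radians(2)) + rng.normal(0, 1e-3, th.size)  # lags 2 deg

def phase_deg(x, y):
    # Angle between two waveforms via acos of the normalized dot product.
    pf = np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y))
    return np.degrees(np.arccos(pf))

# Direct measurement: near 0 degrees the cosine is almost flat, so a
# tiny error in the dot product becomes a large error in the angle.
print("direct : %.3f deg" % phase_deg(a, b))

# Shift-and-subtract: delay b by 20 samples (20 * 1.8 = 36 degrees),
# measure where the cosine has real slope, then remove the shift.
k = 20
offset = k * 360.0 * f / fs
print("shifted: %.3f deg" % (phase_deg(a, np.roll(b, k)) - offset))
```

Because the sampled window here is a whole number of cycles, the circular roll is an exact time shift, so subtracting it back out introduces no error of its own.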

But the proof of the pudding is in the eating. Look at some of your low-PF loads with IoTaWatt and see how the phase correction compares to a known standard.