MIT researchers develop a low-cost device to monitor home power consumption

Here is a news report that I came across today.

My guess is it’s Hall-effect devices that sense the field on the outside of a cable. How they achieve “self-calibration” would be very interesting. The linked press release from MIT goes into somewhat more detail, and is well worth reading.


From the article:

Other groups have attempted to use wireless sensors to pick up the very
faint magnetic and electric fields near a wire, but such systems have
required a complex alignment process since the fields in some places can
cancel each other out. The MIT team solved the problem by using an
array of five sensors, each slightly offset from the others, and a
calibration system that tracks the readings from each sensor and figures
out which one is positioned to give the strongest signal.

Yes, I read that, but how does it know what the strongest signal represents in amps? If they can figure that out with any degree of accuracy, it’s pretty clever. Invoking Biot-Savart with the core of a CT is one thing, but inferring the same from five spot measurements seems something of a leap of faith. “The proof of the pudding…” springs to mind. No doubt we’ll find out in due course.
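For what it’s worth, the Biot-Savart step itself is trivial if you assume a single straight conductor at a known distance; that’s precisely the assumption a clip-on device can’t rely on, which is where the cleverness has to live. A minimal sketch of the inversion (made-up numbers, idealized geometry):

```python
# Minimal sketch: inferring current from one field reading via Biot-Savart
# for an infinite straight wire, B = mu0 * I / (2 * pi * r).
# Assumes the sensor distance r is known exactly -- the hard part in practice.
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def current_from_field(b_tesla, r_m):
    """Invert Biot-Savart to estimate the current in a straight conductor."""
    return 2 * math.pi * r_m * b_tesla / MU0

# 10 A at 5 mm produces B = mu0 * 10 / (2 * pi * 0.005) = 4e-4 T;
# inverting that reading should recover 10 A.
print(current_from_field(4e-4, 0.005))  # ~10.0
```

Get r wrong by 20% and the current estimate is off by 20% too, which is presumably why the five-sensor array plus calibration is needed.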

A few years back I really wanted to make a noninvasive sensor for simple appliance status monitoring (the on/off state of our heating system’s pump motor). I dropped the idea later, but I found a product by SparkFun (or was it Seeed? who knows) based on the Allegro A1324 linear Hall-effect sensor. It was an unusual board with two of the chips in a special arrangement. One had to align the board properly on the wire and fasten it very firmly. The sensors’ outputs were fed to op-amps to make their readout usable by the simple ADCs of a microcontroller. It worked with surprisingly decent accuracy, at least according to the blog post where the product was first shown.

I think the new MIT invention is different in several ways. If they position the sensors around the wire, they might be able to sense magnetic field vectors and, with a lot of math, make sense of the data. Of course they would probably make a few assumptions that simplify the calculations, like the mains frequency, expected field strength, known waveforms, etc. I don’t really know how it works in reality, but if it does what it advertises, $25–30 is not bad at all. Cheap Chinese clones will appear in no time… Very interesting.
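The “lot of math” could plausibly be a linear inverse problem: in the quasi-static regime each sensor’s reading is a linear combination of the conductor currents, so with more sensors than conductors you can solve by least squares. This is purely my speculation; the gain matrix below is made up for illustration, and in the real device it would presumably come from the calibration step:

```python
# Hypothetical sketch: recover conductor currents from an array of field
# sensors, assuming each reading is a linear mix of the currents.
import numpy as np

np.random.seed(0)

# Made-up coupling of 5 sensors to the 2 conductors (live and neutral).
G = np.array([[1.0, 0.3],
              [0.8, 0.5],
              [0.6, 0.7],
              [0.4, 0.9],
              [0.2, 1.1]])

true_I = np.array([10.0, -10.0])  # live and neutral carry opposite currents
readings = G @ true_I + np.random.normal(0, 0.01, 5)  # noisy sensor outputs

I_est, *_ = np.linalg.lstsq(G, readings, rcond=None)
print(I_est)  # close to [10, -10]
```

The catch, of course, is knowing G in the first place, which is exactly what the paper’s calibration technique claims to sidestep.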

Too right you are.

I noticed “Ph.D.” after two of the authors’ names, and MEng after another. The third is the MIT Professor of EE.
Waaaay above my pay grade!

Given some of the developments that have been made at MIT (e.g. the Apollo Guidance Computer)
I’m hoping the device doesn’t take forever to reach the market.
(Or that a larger corporate concern doesn’t buy it so they can “bury” it)

If you keep clicking, you end up here:

Current and Voltage Reconstruction From Non-Contact Field Measurements
Non-contact electromagnetic field sensors can monitor voltage and current in multiple-conductor cables from a distance. Knowledge of the cable and sensor geometry is generally required to determine the transformation that recovers voltages and currents from the sensed electromagnetic fields. This paper presents a new calibration technique that enables the use of non-contact sensors without the prior knowledge of conductor geometry. Calibration of the sensors is accomplished with a reference load or through observation of in situ loads.

I didn’t look that far because IEEE stuff is usually behind a paywall. That doesn’t surprise me in the least. It looks like the reporter/press office was less concerned with accuracy than with writing something that might get a headline.

A bit more searching and I ended up at this publicly accessible paper out of Berkeley:

which looks like it could be the basis for this startup:

You don’t need to wait for the MIT version to get to market - Modern Device stock a simpler sensor.
The original intent was go/no-go detection of current passing in a mains lead, by contact only - it turned out that with tweaking (and a two-point calibration) it does a reasonable job of actually measuring the amperage. The trick is that, unlike at a distance, very close up the two opposing magnetic fields do not exactly cancel - avoiding the need to split out the individual conductors that a conventional CT solution requires.
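The two-point calibration mentioned above is just fitting a straight line through two known loads. A sketch, with made-up raw ADC values and loads:

```python
# Sketch of a two-point calibration: measure the raw sensor output at two
# known loads, fit a line, then use it to convert readings to amps.
def fit_two_point(raw_lo, amps_lo, raw_hi, amps_hi):
    """Return (gain, offset) so that amps = gain * raw + offset."""
    gain = (amps_hi - amps_lo) / (raw_hi - raw_lo)
    offset = amps_lo - gain * raw_lo
    return gain, offset

# Calibrate against, say, a 60 W bulb (~0.26 A at 230 V) and a 2 kW
# kettle (~8.7 A). The raw values 512 and 1800 are invented ADC counts.
gain, offset = fit_two_point(512, 0.26, 1800, 8.7)

def to_amps(raw):
    return gain * raw + offset

print(to_amps(1000))  # interpolated reading in amps
```

This only holds if the sensor is reasonably linear between the two points, which for a clipped-on Hall sensor is plausible but worth checking against a third load.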

Sure, it’s not instrument-grade current measurement, but this class of device can be a solution when you just need non-invasive detection of whether current is or isn’t flowing, or even of heavy/medium/light/standby current levels, such as figuring out what state the washing machine has reached.

The MD sensor needs to be calibrated against a known load.
Looks like it’d be great for detecting go/no-go conditions.
What I really like about the MIT development is the self-calibration.

So not exactly self-calibration after all, which is to be expected as it appears to clip onto one side of a cable only.

Yes, assuming that referenced paper is still as good as it gets (and I suspect it is), they’ve no inherent way to convert their final result into actual watt-hours without some external help: either a reference load or a long-term comparison with the revenue meter. Without one of those they can only get to within an unknown linear scaling factor, which I guess is sufficient to make conclusions like “27% of your energy is used by your fridge”, and leave it to the end user to multiply the 27% by whatever their revenue meter reads, if they particularly need the result in kWh.
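In other words, the end user supplies the absolute scale themselves. A tiny worked example with invented numbers:

```python
# The device reports relative shares; the revenue meter supplies the
# absolute scale. Numbers are made up for illustration.
fridge_share = 0.27          # "27% of your energy is used by your fridge"
meter_reading_kwh = 350.0    # monthly total from the revenue meter

fridge_kwh = fridge_share * meter_reading_kwh
print(fridge_kwh)  # 94.5 kWh
```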

Ah yes, this is the one I mentioned in my previous post. It’s Modern Device, I even forgot their name…