OpenEnergyMonitor Community

Incorrect CT clamp reading

Hi,

I have a YHDC SCT-013 50 A/1 V connected to an ESP8266. Reading the documentation, I came up with the following calculations:

AREF = 3.3V

Primary peak-current = RMS current × √2 = 50 A × 1.414 = 70.7 A
Secondary peak-current = Primary peak-current / no. of turns = 70.7 A / 2000 = 0.0354 A
Ideal burden resistance = (AREF/2) / Secondary peak-current = 1.65 V / 0.0354 A = 46.61 Ω

Therefore, I used a 47 Ω resistor, with two 10k Ω for the potential divider.

#include "EmonLib.h"                  // Include Emon Library

EnergyMonitor emon1;                  // Create an instance

void setup()
{
  Serial.begin(9600);

  emon1.current(PIN_A0, 42.55);       // Current: input pin, calibration.
}

void loop()
{
  double Irms = emon1.calcIrms(5588); // Calculate Irms only

  Serial.print("Apparent power: ");
  Serial.print(Irms * 230.0);         // Apparent power
  Serial.println("W");
  Serial.print("Current RMS: ");
  Serial.print(Irms);                 // Irms
  Serial.println("A");
  Serial.println();
  delay(1000);
}

Using the hardware and sample code above, the sensor does produce an output, but I believe the calibration is off. I understand the calibration represents the mains current that gives 1 V at the ADC input, but I am unsure how to proceed correctly from there - I think the value I used was calculated for a 30 A current.

I have the CT sensor clipped around the live wire of my computer monitor. According to the monitor's documentation, it should draw between 20 W and 55 W, yet the ESP8266 serial monitor prints a value of 15-16 W.

Here is a snippet of the serial data:

Apparent power: 15.85W
Current RMS: 0.07A

Apparent power: 15.67W
Current RMS: 0.07A

Apparent power: 16.04W
Current RMS: 0.07A

Is there anything obvious I am doing wrong?

Welcome, Jonathan, to the OEM forum.

I’m afraid you’ve missed the important part - the 1 V output versions of the SCT-013-xxx are voltage output because the burden resistor is internal. A real current transformer is just that - it outputs a current, not a voltage - so it needs a burden resistor, because your ADC wants a voltage. Yours doesn’t need, or want, a second burden.

If you remove your 47 Ω burden, and recalculate the calibration based on the c.t. ratio being 50 A : 1 V, then your results should come good.

You must get asked these questions all the time - I did try to read up before asking - so thank you both for answering and for how quickly you did!

Removing the extra burden resistor instantly looks better, but now I cannot recall where I saw the formula to calculate the calibration factor. Are you able to help with that?

For the Atmel 328P and emonLib, it is the current that gives 1.0 V at the ADC input - irrespective of whether the burden is internal or external (and equally, for the voltage input, the voltage that gives 1 V at the input). However, I don’t know the inner details of the ESP8266, so I cannot help more than that - you would need to work it out from first principles, i.e. knowing the ADC reference voltage and the count that this voltage represents.
You can look at how it’s done in emonlib.cpp, and if the numbers for your ESP are the same, it will work - otherwise you’ll need to either modify your copy of emonLib or apply an additional scale factor to the result.

But one word of warning - Espressif aren’t known for high-performance ADCs - see [TW#12287] ESP32 ADC accuracy · Issue #164 · espressif/esp-idf · GitHub and the comments that follow.

Thanks for the feedback, I’ll digest that.

Regarding the ADC performance, would you consider the ADS1115 16-Bit ADC a good module for multi-channel analogue reading?

I’ve not used that myself, others I think have (search the forums?) with satisfactory results.

Here’s one example… take note of the speed at which they can sample to make sure it is suitable for your needs.

There are many other posts with good information as well, just search (at the top of this page using the magnifying glass) for ADS1115.


Lastly, assuming I had all calculations and components theoretically correct, how could I confirm the readings were actually accurate? I assume either a reading from the supply meter, should there be one, or a plug-in meter for appliances.

That depends on the standard you apply to “actually accurate”.

The usual definition I use is “Your supplier’s meter is always 100% accurate, even though it might not be.” By that I mean the meter is regarded as being reliable and good. To prove that, you’d need to borrow or hire a meter with a certificate of calibration that guarantees an accuracy better than whatever it is you are testing. A readily available plug-in meter will come with a normal accuracy tolerance, but almost certainly not with its calibration certified.
