I am trying to use some smaller solid-core CT sensors that I will be installing into a rack mount PDU. I wanted to test these CTs before moving forward. My tests seem to work after I calibrate the CTs at a specific amperage, but the readings fall apart at other amperages.
The two CTs I have on hand are the PE-51719NL and the CST306-3A. The former has 3 pins and is listed with a turn count of 200CT (which I assume means 200 turns to the center tap and another 200 turns to the end). The latter has just 2 pins and 200 turns. I have tried both of these at 200 turns with both 18 Ohm and 33 Ohm burden resistors.
I did some sampling with my “Arduino oscilloscope” to get an idea of what the sine waves look like after connecting the CST, and I encountered this strange output at 8 to 10 A, which is about half the rated maximum. I was expecting a sine wave with larger amplitude. Can anyone tell me what’s wrong with my test circuit? I followed this diagram when setting it up, and I am using the 18 Ohm resistor for both outputs above.
The fun maths I did:
Primary peak current = 20 A × 1.414 = 28.28 A (I will only be measuring a 15 A max circuit, 12 A sustained, so 20 A seemed a safe design point)
Secondary peak current = 28.28 A ÷ 200 = 0.1414 A
Ideal burden resistance = 2.5 V ÷ 0.1414 A ≈ 17.68 Ohm (2.5 V being half the 5 V rail of an Arduino Mega)
Last question: At 0.5 A, I am seeing a fluctuation of ±0.09 V. The math seems to suggest it should be ±0.045 V (0.5 A primary ÷ 200 turns = 2.5 mA secondary × 18 Ohm = 0.045 V). Is this correct?
Thanks in advance for any advice on this. I am fairly new to electronics on this level.