Welcome to OEM, @gerrishp2.
No, it is the difference between the phase errors of the current transformer and the voltage transformer.
Both the transformers we use have a phase error that varies with the quantity being measured. You can look at the test reports in the ‘Learn’ section to see the individual errors on the samples tested, and bear in mind that those will vary from one manufacturing batch to the next, especially if the materials change - the ferrite in the case of the c.t., or the steel for the v.t. If you want to know why transformers have phase errors, you need to study the phasor diagrams of the transformers.
So the default value is a “best overall approximation” - it’s probably not correct for you, but until you calibrate at the voltage and current that you actually use, it’s a sensible starting point.
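To make the idea concrete, here is a minimal Python sketch of the interpolation technique that EmonLib-style sketches use (the PHASECAL coefficient) to compensate the *net* phase error - the difference between the c.t. and v.t. errors. This is an illustration only, not the firmware code: the 50 Hz mains frequency, 2500 Hz sample rate and 2° net error below are made-up numbers for the demonstration.

```python
import math

def phase_corrected_voltage(v_prev, v_now, phasecal):
    # Linearly interpolate between the previous and current voltage
    # samples. A coefficient below 1 delays the voltage waveform by a
    # fraction of one sample period, compensating the net phase error.
    return v_prev + phasecal * (v_now - v_prev)

F = 50.0                          # assumed mains frequency, Hz
FS = 2500.0                       # assumed sample rate, Hz
PHASE_ERR = math.radians(2.0)     # assumed net (c.t. minus v.t.) error
N = int(FS / F) * 20              # 20 whole cycles of samples

def measure_power_factor(phasecal):
    # Accumulate real power and RMS sums over synthetic waveforms where
    # the current lags the voltage by PHASE_ERR, then return the
    # apparent power factor: 1.0 means the error is fully corrected.
    sum_p = sum_v2 = sum_i2 = 0.0
    v_prev = 0.0
    for n in range(N):
        t = n / FS
        v_now = math.sin(2 * math.pi * F * t)
        i_now = math.sin(2 * math.pi * F * t - PHASE_ERR)
        v_corr = phase_corrected_voltage(v_prev, v_now, phasecal)
        sum_p += v_corr * i_now
        sum_v2 += v_corr * v_corr
        sum_i2 += i_now * i_now
        v_prev = v_now
    return sum_p / math.sqrt(sum_v2 * sum_i2)

# phasecal = 1.0 leaves the voltage untouched; delaying it by the
# right fraction of a sample brings the power factor back to ~1.
pc = 1.0 - PHASE_ERR / (2 * math.pi * F) * FS
print(measure_power_factor(1.0), measure_power_factor(pc))
```

The point of the exercise: because only the *difference* of the two errors matters, a single interpolation coefficient can correct it, but the right coefficient depends on the operating voltage and current - which is why calibrating at your own conditions beats the default.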
If you want transformers with a phase error that approximates to zero, you need deep pockets.