Can I use my doorbell transformer to measure voltage for real power measurement?

How did you measure the line voltage and the output of the transformer at the same time? What I’m suggesting is that, unless you’ve got a pair of multimeters, the mains voltage probably changed between the two measurements. Not all of the discrepancy will be down to this alone, but some of it will be.

But what happens next: is this voltage (I don’t think you’ve mentioned the value) going straight into the ADC, or do you divide it down again? If you do, what’s the value and tolerance of those components?
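Purely for illustration (I’m guessing at values here, substitute your own): if you divide with, say, 100 kΩ over 10 kΩ using 1% resistors, the nominal ratio is 10 k ÷ 110 k ≈ 0.0909, but with both resistors at the ends of their tolerance band it can lie anywhere between about 0.0893 and 0.0926, i.e. roughly ±1.8% on the voltage before the ADC even sees it.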

Two ADCs? Do they have the same voltage reference? If your ADC’s reference is not what you think it is, there’s another source of calibration error.
And bear in mind that the law of natural perversity says all the individual errors must add up in the worst possible combination.
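To put some numbers on that (these are assumed figures for illustration, not yours): a transformer whose ratio is 3% out, a divider contributing ±1.8% and an ADC reference that’s 2% adrift stack up to nearly 7% on the voltage alone in the worst case, and any error on the current channel multiplies on top of that when you compute power.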

There’s definitely an approximation somewhere in your maths, because you should be using the same voltage and current (as numbers) for both the real and the apparent power calculation, so you shouldn’t be getting a power factor greater than 1. If you’re using the old discrete-sample version of emonLib, then it’s almost certainly the interpolation in that which is the source of this error. If it worries you, you can use the maths from these contributions: EmonLib: Inaccurate power factor and Rms calculations in EmonLib and Learn documentation - #3 by mafheldt; this has been incorporated into emonLibDB (but not into either emonLibCM or emonLib).
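
Just to illustrate the point, here’s a minimal sketch (not emonLib’s code; the waveforms, amplitudes, phase and sample count are all made up) showing that when real and apparent power are computed from the same set of sample pairs, the power factor cannot come out above 1:

```cpp
// Minimal sketch (not emonLib) showing why the power factor cannot exceed 1
// when real and apparent power are derived from the SAME sample pairs.
// The waveforms, amplitudes and sample count below are made up for illustration.
#include <cmath>
#include <cstdio>

int main()
{
    const int    N     = 2000;                      // samples over a whole number of cycles
    const double pi    = 3.14159265358979;
    const double phase = 0.5;                       // assumed current lag in radians

    double sumP = 0.0, sumVsq = 0.0, sumIsq = 0.0;

    for (int n = 0; n < N; ++n)
    {
        double t = 2.0 * pi * 5.0 * n / N;          // 5 complete cycles across the buffer
        double v = 325.0 * std::sin(t);             // instantaneous voltage sample
        double i = 10.0  * std::sin(t - phase);     // instantaneous current sample

        sumP   += v * i;                            // accumulate instantaneous power
        sumVsq += v * v;                            // accumulate squared voltage
        sumIsq += i * i;                            // accumulate squared current
    }

    double realPower     = sumP / N;                // mean of v * i
    double Vrms          = std::sqrt(sumVsq / N);
    double Irms          = std::sqrt(sumIsq / N);
    double apparentPower = Vrms * Irms;
    double powerFactor   = realPower / apparentPower;

    std::printf("Real %.1f W   Apparent %.1f VA   PF %.4f\n",
                realPower, apparentPower, powerFactor);
    return 0;
}
```

The point is that real power is the mean of v × i while apparent power is the product of the two rms values from the very same samples, so the ratio is bounded by 1 (Cauchy-Schwarz); a power factor above 1 can only appear if the two calculations don’t use exactly the same numbers.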