Calculate Gas Boiler % flame - nearly accurate - from temperatures only

EDITED: as I dither between this working… or not! The first two comments below may no longer make sense.

Following Mike Henderson’s clever idea of calculating a proxy for gas usage of a gas boiler from a CT measuring the electric current of the combustion fan:

Could the Flame % be calculated knowing only the two temperature measurements:

• Boiler Flow, and Return

Yes, but we need to be careful to exclude times when the boiler is OFF: by ignoring all samples where the flow/return temperature delta is lower than the minimum it can be when the boiler is on, at minimum flame.

Mike’s method uses the fan’s electrical current to know when the boiler is lit… but from pipe temps only, we don’t have that ‘signal’.

Details below.

Details

Like the CT method Mike describes, this also assumes that the boiler’s water pump is running at a fixed speed.

Basically, the ‘work done’ as the boiler modulates its flame up and down can ONLY show up as a change in the delta between Flow out and Return in. Ain’t nowhere else for it to go.

So when I plotted the delta - it struck me that it is a proxy for Flame %.
The graphs of Delta and Flame %

To work out the formula, I used Mike’s approach: watching the boiler’s app report Flame % and comparing it to my delta.

I got 100% flame at a delta of 20 °C, and 10% (the boiler maker’s claimed minimum) down at 6 °C.
So the maths for my specific boiler were:

• 14 °C range - for a 10-100% flame range: i.e. 6.43% per °C
• so scale (delta °C less 6) by that
• then add the 10% of flame back
• then multiply by the boiler power (32 kW)
• add ‘log’ lines to save kWh and cumulative daily energy used.
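The bullet steps above can be sketched in Python. The calibration (10% flame at 6 °C delta, 100% at 20 °C, 32 kW rated power) is taken from the numbers quoted above; the function names are my own:

```python
# Sketch of the delta-to-power arithmetic described above, for this
# specific boiler's calibration: 10% flame at a 6 degC delta,
# 100% flame at a 20 degC delta, rated power 32 kW.

def flame_percent(delta_c: float) -> float:
    """Map flow/return delta (degC) to flame %: a 14 degC span covers
    the 10-100% modulation range, i.e. ~6.43% per degC."""
    return 10.0 + (delta_c - 6.0) * (90.0 / 14.0)

def heat_output_kw(delta_c: float, rated_kw: float = 32.0) -> float:
    """Instantaneous heat output, assuming output scales linearly
    with flame %."""
    return rated_kw * flame_percent(delta_c) / 100.0
```

A 20 °C delta then gives 100% flame (32 kW), and the 6 °C minimum-flame delta gives 10% (3.2 kW).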

Did you know a flame conducts electricity? You could detect the presence of a flame that way.

Actually… the deltaT across the boiler IS enough info to detect when the gas boiler is off.

I edited the post to update for that.

It drops very rapidly. In my case, immediately below the minimum delta when the boiler is at its lowest 10% flame (delta = 6 °C), it drops to 2.5 °C and below.

So line 4 in my spec above does the filtering: it excludes anything less than 4 °C.
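That OFF-detection filter can be sketched as below. The 4 °C cutoff is the number quoted above (an OFF boiler quickly falls to ~2.5 °C, well under the 6 °C minimum-flame delta); the names are my own:

```python
# Below the 6 degC minimum-flame delta, an OFF boiler quickly drops
# to ~2.5 degC, so a 4 degC cutoff cleanly separates ON from OFF.
OFF_THRESHOLD_C = 4.0

def on_samples(deltas: list[float]) -> list[float]:
    """Keep only flow/return deltas recorded while the boiler is lit."""
    return [d for d in deltas if d >= OFF_THRESHOLD_C]
```

Samples that pass this filter are then fed into the delta-to-flame-% formula; everything else is treated as boiler OFF.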

I’ll report back in a week - if I get the excellent accuracy that Mike’s CT based approach achieved.

That CT method is likely to be more accurate, as it measures inputs, whereas the above measures outputs of the boiler, and it will be less accurate if, for example, the combustion efficiency changes as the boiler flame modulates.

Maybe a combination of the approaches would be a way to prove either way if any specific boiler suffers from that varying efficiency under modulation?

And here is your challenge sir:-) Taken just moments ago

Comparing smart meter against Emoncms calc.

You have piqued my interest. Need to give this a go. I have all the data, but I’m yet to be convinced

Well done Mike - you have really impressive accuracy!
Closer than 0.1%!

Mine - I am still working out the constants.

I think if your ambient temperature never changed and your radiator TRVs remained at 100% OPEN for a reasonably long firing time, you might get some results. I’m not convinced that you’ll get accurate results for short-burst firing like an Evohome system does.

Mike - is your Evohome connected to your boiler by OpenTherm or ON/OFF?

If the former, then (the theory says) Evohome will tell the boiler to modulate down / start heating earlier on cold mornings.

ie not short firing bursts.
Or am I missing something.

Oh no, my boiler isn’t special enough to support OpenTherm. Unfortunately, it only supports ON/OFF, but it does make a fair attempt at holding the flow temperature to what I’ve set. It is a very old contraption.

The Evohome controller makes a fair attempt at keeping a low flow temperature even though it can’t measure it. The latest software uses that “advanced load scaling”, which as far as I can tell is just a fancy description for turning it on and off more often.

So other than a pump, a PCB, and maybe a solenoid/valve, there isn’t much of a variation in the power used by the boiler prior to the point that it fires.

The good thing about that though is that once I had identified what electrical power the boiler pulls just at the point before firing, I then have a good reference for the minimum firing electrical wattage; the maximum firing electrical wattage is then just whatever I see as the maximum reached.

I identified the range of wattage as 94-124 watts. Everything below 94 watts appears to be flow checking, purging, or some other safety feature.

I don’t call this 94-124 watts 0-100% firing. The boiler doesn’t drop to 0% firing; it’s only able to drop to around 33% due to its poor turndown ratio. I would like a boiler with a better turndown ratio because that would help me run a lower flow temperature, but that’s a whole other story.

That gives me a feed for the firing rate range of 33-100%. I then know the range of the boiler power in kW from the boiler manual, so that’s just a bit of maths to give a kW feed.
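That maths can be sketched as follows. It uses the numbers quoted in the thread (94-124 W electrical range, 33-100% firing rate, and the 28 kW rating mentioned later); the function names are my own, and the optional correction factor stands in for the “fudge factor” described below:

```python
# Sketch of the CT-based method: boiler electrical draw -> firing rate -> kW.
# Quoted numbers: 94 W = minimum firing (33%, the turndown limit),
# 124 W = 100% firing, 28 kW rated output.

def firing_rate_pct(watts: float, w_min: float = 94.0,
                    w_max: float = 124.0) -> float:
    """Linear map from electrical wattage to firing rate (%).
    Below w_min the boiler is flow-checking/purging, not firing."""
    if watts < w_min:
        return 0.0
    frac = (min(watts, w_max) - w_min) / (w_max - w_min)
    return 33.0 + frac * (100.0 - 33.0)

def gas_power_kw(watts: float, rated_kw: float = 28.0,
                 fudge: float = 1.0) -> float:
    """Gas input power, with an optional drift-correction factor
    (e.g. 0.98) applied to the calculated firing rate."""
    return rated_kw * (firing_rate_pct(watts) * fudge) / 100.0
```

Integrating this kW feed over time then gives the kWh figure that can be compared against the smart meter.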

I have in my calculation an adjustment factor (fudge factor) which I seem to have to make small adjustments to perhaps twice a year when I see the results drifting from what my smart meter shows.
I think as the boiler ages things change a bit, then there is also its annual service which does alter it.

What I don’t compensate for is Air Density since my boiler is in the loft and the temperature of the inlet air doesn’t have a wide range. If it had been an issue then it’s an easy calculation to add that in for a correction.

The other is supply voltage because that will have a small effect. I see a range of 236-250v over the time I’ve had the emonpi installed. I did think about creating a feed to record the wattage of the boiler when only the pump was running and compare this with voltage over time to see how the pump wattage is affected. At the moment I’m not compensating for this either.

I think this only works well because it’s a fairly simple boiler. I dread to think about what would be involved to get the same results from a boiler with a modulating pump.

Thanks Mike

that’s helpful - confirms I’m on the right tracks.

I have in my calculation an adjustment factor (fudge factor) which I seem to have to make small adjustments to perhaps twice a year.

Where do you apply your fudge in your formula? Does the electrical range (94-124 watts) change over time? Or does the boiler’s kWh produced (when elec. is at 124 watts) gradually drop over time, due to less gas flow… or?


then there is also its annual service which does alter it.

Does the boiler increase its efficiency after a service - more kWh heat out for the same gas in?

My correction is done at the point where I calculate the firing rate from wattage, as I do see the electrical range changing. So I have a correction of Calculated Firing Rate × 0.98 at the moment. I could just enter the new range, but this simple factor seems to work; it’s definitely just a hack :-)

During the service, the boiler engineer fiddles with (adjusts?) the gas valve and I spend a few days adjusting my calcs again.

It would be a lot easier if I could place the CT inside the boiler and just measure FAN wattage, but the Boiler Engineer would take a dim view when it came to service time and there was extra equipment inside the boiler :-)

oh, can that actually be worked out? How am I going to measure the Heat Output of the boiler?

Really, at the moment I just assume that at all times it’s performing exactly as it says in the boiler specification. That is, it is a 28 kW boiler at full pelt, and I assume that the maximum electrical wattage I see is the boiler at full pelt, and so 28 kW.

Are you thinking that a lower wattage suggests higher efficiency? It might just be that the Boiler Engineer has adjusted the gas valve minimum/maximum mixture ratios in such a way that it is no longer 28 kW but perhaps lower or higher, in an attempt to get CO levels within limits. Maybe that’s what I’m accounting for with my correction factor.

I think to do that with any accuracy I’d have to fit a gas flow meter, use that and perhaps flow temperatures over time.

Not forgetting that the calorific value of the gas varies too, so knowing only the quantity isn’t enough. The average calorific value is shown on your bill, but how much does it vary over what time scale? You’d need at least some idea of this.

Modern boilers build up slowly. Mine takes quite a while to get going (system boiler - not Combi). I took it up with the manufacturer and their response was - “as designed - more efficient”.

I’d also expect them to ‘modulate’ - i.e. at full pelt I get a 20°C deltaT; as that delta drops (return gets warmer and output at target temperature) I’d expect the boiler to reduce heat input.

can that actually be worked out? How am I going to measure the Heat Output of the boiler?

In short - the deltaT at ‘flame 100%’ is the measure of the boiler’s ‘gas to heat’ efficiency. If it drops:

• either - the flame is less efficient - boiler at 100% no longer produces same delta
• OR possibly an indicator of the flame-pipes becoming gunged up = less gas used. But you’d be able to spot that in your Smart meter.

Robert.Wall highlights that there are other factors too - like the varying calorific value of gas.
But if “delta T at max flame” increases immediately after a service, then that step change cannot be down to a change in gas properties - especially if it had been nice and consistent for days/weeks before.

Are you thinking that a lower wattage suggests higher efficiency?

It’s not the wattage - it’s the modulation - the question is: does the efficiency (gas quantity → heat) remain linear as the boiler modulates down?

IF not, then your method (CTs) or mine (deltaTs) may be inaccurate, even though we calibrated the calcs nicely for 100% on and ‘minimum modulation as per the maker’s specs’.

Right now I have googled widely, but found no manufacturer’s docs that show a graph of efficiency % vs modulation.

Don’t trust the boiler manufacturer manual.

Mine is 20+ years old but even so… With weather cold enough (and running a bath slowly) to have the flame on continuously for half an hour, I went outside with a notebook and wrote down the number on the plain old clicky-numbers mechanical gas meter, half an hour apart. I multiplied by ‘about 11.2’ to convert from m^3 of gas to kWh, and multiplied by two to get kWh per hour, which is in units of kW.

In this way, and from cutting a few wires in this older Vaillant, I discovered that its gas inlet has two throttle valves in series along the gas inlet pipe: an ON/OFF valve that opens when connected to UK mains (I’ve not tested whether euro-212 Volts 50Hz AC would also be enough), and a second one which, if not connected, lets through enough gas for 12kW (about half of the rated maximum of this boiler) - its usual state. 1 to 5 Volts DC on this second solenoid does nothing; 7 Volts DC increases the flame ‘a little bit’, with a steady increase from about 6 to about 12 Volts DC. That is, while there is mains AC on the first solenoid and also >11V DC on the second solenoid, this boiler runs at its maximum burn rate. It wasn’t using enough gas for more than 22kW that day.
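The meter-reading arithmetic in that paragraph can be written as a small sketch (the ~11.2 m³-to-kWh factor and the half-hour interval are the numbers from the post; the names are my own):

```python
# Average gas input power from two readings of a mechanical gas meter.
M3_TO_KWH = 11.2  # approximate conversion quoted: m^3 of gas -> kWh

def mean_gas_power_kw(meter_start_m3: float, meter_end_m3: float,
                      hours: float) -> float:
    """kWh burned between the readings, divided by elapsed hours,
    giving average power in kW."""
    return (meter_end_m3 - meter_start_m3) * M3_TO_KWH / hours
```

For example, 1 m³ burned over half an hour works out to 11.2 kWh × 2, i.e. an average of 22.4 kW.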

Is there interest for me to test this plain old gas combi boiler some more ?

I’ve gone with the presumption that running my gas boiler at a somewhat mediocre outlet temperature (<60 °C), always at its lowest 50% flame size, should give the lowest flue-gas heat waste. Presumably the sales reps liked to sell bigger numbers, so this model was rigged to flare up and massively increase the flame size and burner sound when you run the kitchen hot water tap, with not a huge change to how hot the water arrives out of the tap. Should I try testing more? It is a 20+ year old Vaillant.