Alternative balancing ideas and charging from solar

No it won’t. It’s how all “top balance” BMS systems work, whether that be diyBMS, Batrium, Electrodacus, Daly etc.

They all drain power from the highest cell(s) when they exceed a limit.


@GadgetUK, Thanks for paying attention to my post and responding.

Yes, and I know all about them. My mistake was in how I assumed others must be setting them. I assumed they would drive the pack up to the knee voltage and hold it there. I was forgetting that you’d set the absorption to the knee voltage, then set float to below that. I assume the settings also let you specify an even lower voltage that triggers the run back up to the knee, right? That mistake is what led me to be concerned that others were relentlessly holding their packs at the knee.
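For concreteness, the three setpoints described above (absorption, float, and a lower re-bulk trigger) boil down to simple hysteresis logic. This is just a sketch with made-up LiFePO4 numbers of my own, not any particular charge controller's firmware:

```python
# Illustrative three-setpoint charger logic (hypothetical values).
ABSORB_V = 3.60   # per-cell "knee" / absorption target
FLOAT_V  = 3.40   # voltage the cells settle back to after charging stops
REBULK_V = 3.33   # don't restart charging until the pack sags below this

def charger_enabled(cell_v: float, was_charging: bool) -> bool:
    """Whole-charge hysteresis: stop at absorption, restart below re-bulk."""
    if was_charging:
        return cell_v < ABSORB_V   # keep charging until we reach the knee
    return cell_v < REBULK_V       # off: wait for the sag before restarting
```

The gap between `FLOAT_V` and `REBULK_V` is what prevents the controller from running straight back up to the knee the moment charging stops.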

This mistake came about because the Electrodacus runs the pack up to the knee, then waits until the pack drops back down to float, then runs it right back up. I was doing the same, until I realized this was dumb.

(For example, with LiFePO4 the knee might be 3.6 V per cell and the float might be 3.4 V.)

Yes, I am in the process of debugging a controller that uses 2x INA226 chips to measure current. I have also modified the controller software to take advantage of this and it provides a %SOC.
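As a sketch of what that %SOC bookkeeping looks like (illustrative only; the class and names are mine, not the actual controller code), coulomb counting just integrates the signed current samples taken from the INA226 readings:

```python
class CoulombCounter:
    """Minimal coulomb-counting SOC estimator (illustrative sketch).
    Feed it signed current samples, e.g. derived from INA226 shunt readings."""

    def __init__(self, capacity_ah: float, soc_percent: float = 100.0):
        self.capacity_as = capacity_ah * 3600.0                    # amp-seconds
        self.charge_as = self.capacity_as * soc_percent / 100.0

    def sample(self, amps: float, dt_s: float) -> float:
        """amps > 0 is charge, < 0 is discharge; returns updated %SOC."""
        self.charge_as += amps * dt_s
        self.charge_as = max(0.0, min(self.capacity_as, self.charge_as))
        return 100.0 * self.charge_as / self.capacity_as

    def reset_full(self):
        """Call when the first cell hits the knee: we know we're at 100%."""
        self.charge_as = self.capacity_as
```

The `reset_full()` call is where the occasional run up to the knee zeroes out the accumulated integration error.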

I did not intend to imply that if I stop at 90% SOC the software will ignore individual cell voltages. Charging will always be shut off if any one cell hits the knee (and loads shut off at the low knee, of course).

I will modify the code to run up to the knee at a much lower frequency (e.g. once per month), for the dual purpose of eliminating accumulated SOC error and measuring how well the cells are balanced. When the first cell hits the knee, charging will be shut off, SOC will be reset to 100%, and I will check the cell differences. If some are lagging by too much, I will tell the modules to give the high ones a “haircut”. That haircut can be an arbitrarily low current over a long period of time.
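The “check the differences, then haircut” step could be as simple as flagging cells that finish too far above the lowest one. A sketch (the 20 mV tolerance is an arbitrary example value of mine):

```python
def haircut_targets(cell_volts, tolerance_v=0.02):
    """After the monthly run to the knee, flag cells sitting too far above
    the lowest cell so their modules can bleed them down slowly.
    Illustrative only; tolerance_v is an arbitrary example."""
    low = min(cell_volts)
    return [i for i, v in enumerate(cell_volts) if v - low > tolerance_v]
```

The modules can then bleed the flagged cells at a low current over days, since there is no deadline once charging has stopped.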

Yes, I know. I removed that post because I only posted it to get you to read my “better balancing algorithm” post.

However, I see no comfort in all BMSs doing this. Every trip to the knee stresses the cells, right? I don’t know the relative degradation between a battery that never reaches the knee and one that gets there every day, so I don’t know how much benefit there is in avoiding it, but if we measure current and thus know the SOC, we reduce the trips to the knee by an order of magnitude.

When using whole-pack voltage to turn the charge on/off, as most people seem to be doing, the amount of time the highest cell stays at the knee is proportional to how far ahead it is of the other cells. This is of course why everyone says that balancing is important. I don’t do this, which is why I say balancing is a nice-to-have.

But notice that the JLCPCB-assembled module can only drain at most 0.85 A, and that’s with some cooling. @Stuart, you pointed me to the paper stating that at the knee a lithium pack will still be taking about 0.03 C of current, right? For any rational ratio of modules to cell Ah, there is no chance the module can drain current fast enough to prevent driving that cell deeper into the knee while it waits for the other cells to catch up.
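The mismatch is easy to quantify with the numbers from this thread (280 Ah cells, roughly 0.03 C still flowing at the knee, 0.85 A module bleed limit):

```python
# Worked numbers for the drain-rate mismatch (values taken from the thread).
cell_ah = 280.0                  # cell capacity in Ah
knee_current = 0.03 * cell_ah    # ~0.03 C still flowing at the knee: 8.4 A
module_drain = 0.85              # max bleed current of the assembled module

# The module can bleed off only about a tenth of the current still being
# pushed into the high cell, so it cannot hold that cell at the knee alone.
shortfall = knee_current - module_drain
print(f"{knee_current:.2f} A at the knee vs {module_drain} A of bleed "
      f"-> {shortfall:.2f} A still driving the cell upward")
```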

This means that the more you need to balance the cells, the more you shouldn’t use the modules to do this. Assuming of course that you are using whole pack voltage as the charge stop.

John,

I’ve read through your posts about balancing. I posted earlier about balancing large packs, but it seems more needs to be said. I’ll give it a shot.

What you’ve observed (large current input plus small shunt current equals overcharged cell) is correct and valid. What you seem to believe about charging or discharging lithium cells to the ‘knee’ is not - those aren’t damage zones. The knee when charging LiFePO4 is at around 3.4 V per cell, give or take, depending on the internal resistance of the cell. Using 80% of capacity will take the cell over the knee on both charge and discharge. Remember that the knee appears on the high side while electrons are being pushed into the battery, and Ohm’s law reminds us that cell voltage will sit above the resting voltage for as long as charge current flows. Once charging stops, cell voltages will settle back down to around 3.4 as the small surface charge bleeds off. Same for discharge: voltage will sag past the knee, and will rapidly bounce back up once the load is removed.

[Ultimately the chemistry can be charged to more than 4.0 V per cell, but we don’t want to go there (or down to the 2.0 or 2.1 V minimum) if we want to extend cell life. Using about 80% of capacity is the usual practice, though 70% is good as well for longer life. We could get into the weeds here, since LiFePO4 provides the longest calendar life when kept at about 50% SOC, but I’m guessing we buy the cells to do work, not take up space in storage!]

You noted that you’re feeding your battery straight from solar panels and are not using a charge controller (as you don’t see the value of MPPT). This is the reason you’re having trouble with your charging, and it’s why you’re struggling to add new capability to the BMS. Frankly, the capability you seek is part of the reason one should be using a charge controller, or at least a proper lithium charger with the appropriate CC/CV algorithm. A proper charger will have an input for an external end-of-charge trigger. This is critical because solar panels and chargers are across the entire battery, but lithium must always be managed at the cell level.

Finally, there have been a number of comments suggesting that balancing is Really Important™ for lithium packs. It depends. If one has a small pack in a laptop or other device, then that extra 4 or 5% of maximum capacity might mean one doesn’t have to recharge the laptop until reaching the motel at the end of the day. In that instance, when the mission requires extracting the absolute maximum from a battery, balancing can be useful. For larger batteries, though, it’s not. My experience has been with stationary backup using LiFePO4 cells and solar charging, and with batteries of at least 60 Ah in electric-vehicle service - from 70 V motorcycle packs to 16 kWh 400 V EV batteries. Large batteries do not need regular balancing, and certainly don’t need to be balanced every charge. It wastes energy and it keeps the cells at high states of charge for long periods of time (longer if shunt current is low), which shortens cycle life. For a larger house battery, there likely won’t be enough generation to fully charge the battery every day anyway, though PV is cheap enough these days that a couple of extra panels is less expensive than a larger battery in many cases. There are no absolutes. :slight_smile:

I’m using an Outback charge controller running a modified lead-acid charge algo. Lead acid is charged with the CC/CV stages we need, but it also has a ‘float’ stage (overcomes the self-discharge of lead-acid chemistry) and a ‘balance’ stage (provides a controlled overcharge to balance lead-acid cells). Float and balance are turned off, and must be, for all lithium chemistry cells. The BMS I’m currently using is analog and has cell-level sensors for high and low voltage. A high-voltage event on any single cell triggers the charger to stop charging the entire pack. This allows the BMS cell board to bleed the high cell down if necessary. Most of the time this won’t be necessary, since cell voltage will drop once charge current is removed. A couple of times each year I’ll connect a power supply to the weak cells and bring them up one at a time, to feed those OCD moments when I want all of the lines on a chart to look straight-ish. It’s not necessary, since the balance is worth 1 or 2 amp-hours at most, and that’s lost in the noise on a 24 V pack of 400 Ah cells.

There are some chargers and BMSs that can do more sophisticated battery management, but even the ones in my Outlander PHEV with multiple controllers and proper Coulomb counting into and out of the battery still struggle to keep all of the cells balanced to an OCD level of straight-lines-on-a-chart-ish-ness (<-- technical term LOL). There’s no need for this level of management for EVs or house batteries.

Please do add a proper CC/CV charger to your system, and use the BMS to tell the charger to stop (a relay on the BMS controlling a large relay in the cable from the panels, for one example). Your battery will thank you with a long life.

Best,
Andy


Right, and thus the typical MPPT or PWM charge controller is a poor fit. They can only look at pack voltage, and there is really no point in looking at pack voltage. They must be turned on and off by the BMS, so why not eliminate the thing and use a simple relay and diode?

Were you under the impression that I wired the panels straight to the battery, without a relay controlled by the BMS to disconnect them? I have stated countless times that my DIYBMS shuts off the panels when any ONE cell reaches the critical voltage (see below; you objected to “knee”, so I’ll use a different name), and resumes charging when ALL cells are below Vgood. I have also stated that my setup does not pay attention to whole-pack voltage. It is exhausting attempting to write so that people do not jump to silly conclusions.

All we need between a 60-cell panel and an 8-cell LiFePO4 battery is a diode and a relay. The diode prevents current from draining the battery at night, and the relay disconnects the panel when the BMS says to. I totally understand the utility of an MPPT if you cannot have the 60:8 ratio.

Maybe the reason people keep telling me that I need an MPPT is because the stock DIYBMS rules cannot do hysteresis. I modified the DIYBMS rules to do hysteresis, and of course it does this at the cell level. I suspect everyone just assumes that an MPPT is required, then goes about patiently explaining how I need to make up for the fact that it cannot see individual cell voltages, and how to set the lead-acid values (absorption and float) to be correct for Li.

Oh, maybe others are assuming that my panels can produce more current than my batteries can take. This is not the case. My panels will produce at most 60 A. My battery is eight 280 Ah cells. In other words, no constant-current limiting is needed.

Additionally, the “constant voltage” is really nothing more than “turn off when any one cell gets to 3.6” and “turn on when all cells are below 3.4”.
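That rule is small enough to write down in full. A sketch of the cell-level hysteresis (illustrative, not the actual modified DIYBMS rule code):

```python
def charge_allowed(cell_volts, was_charging, v_stop=3.6, v_resume=3.4):
    """Cell-level hysteresis (illustrative sketch): stop when ANY cell
    reaches v_stop, resume only when ALL cells are back below v_resume."""
    if was_charging:
        return all(v < v_stop for v in cell_volts)   # any cell at 3.6 stops it
    return all(v < v_resume for v in cell_volts)     # all below 3.4 resumes it
```

Note that both decisions are per-cell; the whole-pack voltage never appears.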

Exactly. We agree there is no point to balancing. I believe the reason others insist balancing is vital is that their MPPT controller cannot be turned on/off by the BMS and can only see whole-pack voltage.

The feature I am adding to the controller software and modules that I will deploy will do that OCD automatically. This is not vital, and I won’t have trouble doing it. I’m doing it for fun, really.

You are just defining “knee” differently than I was. The higher you go past, or the longer you stay at, the whatever-you-want-to-call-it voltage you feel is too high, the more damage you do. I am not debating what that voltage is for this chemistry or that, or how far, or how much damage. I just used “knee” so I didn’t have to type that longer technical term.

Maybe you are saying that having my batteries go from 3.6 to 3.4 multiple times per day won’t affect longevity. That may be true, but it is irrelevant to me, because there’s no capacity benefit to not going past 3.4, and the algorithm I will put in place will only rarely go up to 3.6.

Sadly the market suggests otherwise, and charge controllers feature in lots of the DIY powerwall forums.

I think in general terms, with a good pack the cells remain very closely balanced, especially if the BMS is able to do its job. At the point where there is a significant imbalance, you’ve got a problem with a cell in the pack.

Most good charge controllers have an interface to allow you to tell them to stop the charge. Either through a switch input, or a software control.


John, can you run us through the reasons why you think a decent MPPT solar charge controller isn’t a good fit for your circumstances?

From my perspective, the BMS should almost be seen as the “last resort”; it should be there to protect the battery it’s managing.

There isn’t a problem with a solar charge controller measuring the pack voltage - that’s how it knows when to start and stop charging; the charger should stop automatically at a safe “fully charged” voltage. It will probably have temperature sensing too.

The BMS should force it to stop if it needs to, based on rules/logic, to protect the battery it’s managing.

You seem to be trying to make a BMS controller operate and control the whole ecosystem. I think you will have an easier time buying a solar charger and then just worrying about management of the battery.


So, we are not able to comprehend the engineering reasons to use a charge controller? Instead we use the logic that if everyone else is doing it, it must be right?

Notice that this provides no reason to use the charge controller. It just explains how to overcome its shortcomings.

Absolutely not. I posted a link to an article that explained why a charge controller is needed and compared the difference between PWM and MPPT charge controllers. That was good enough for me to understand.

Personally I don’t need a charge controller, as I’m not using solar PV directly with my planned powerwall. I’ve got a nice Victron inverter that I’ll be using to handle the charging and discharging. The BMS will be able to stop either the charging or discharging using a volt-free contact input on the Victron.

The only thing I found in that article that seemed like it was a reason to use a charge controller was the float.

(I will use round LiFePO4 numbers, and N is the number of battery cells.)

Once the pack reaches “absorption” (3.6N), the charger drops to float (3.4N). From that time until the loads exceed the current the solar can supply, the net current in/out of the pack will be zero. That sounds like a benefit compared to my situation, where my pack will be oscillating between say 85% and 90%, or between 3.35N and 3.4N (I’m just picking numbers to convey the concept).

I have not found anything that suggests this is some sort of obvious benefit. Someone would have to measure the degraded cell capacity where the cells rest for a fraction of the day, compared to the situation where the cells are oscillating. If cell capacity were a nice function of the count of + to - current-flow events, that would make a case for wanting something that can maintain that net-zero current. The theory is that it would provide one + to - current transition per day. Of course you’d have to get the clouds and trees to behave nicely, and you shouldn’t turn on an air conditioner that exceeds the charge current. I mention the trees, clouds, and user behavior to make it clear this would be a very difficult comparison to make in order to justify spending $xxx on a charge controller.

Notice that the article explains how a PWM controller works: “The switch is once again “flicked” ON and OFF as needed (pulse width modulated) to hold the battery voltage at the float voltage”. They mention no negatives regarding the fact that we are now doing orders of magnitude more + to - current-change events.

If that were the case, I don’t see why others have not stated this benefit. As far as I can tell, these charge controllers were created because lead-acid must have three stages of charge, and it was easy enough to tell users how to set them to charge Li. The sellers of MPPTs gain nothing by pointing out that if you have a 60:8 ratio between PV cells and LiFePO4 cells, you only need a relay/diode, plus of course some software in the BMS that you must have anyway.

I believe that is called micro-cycling.

There is a phenomenon where solar panels, when cold, actually put out a higher voltage than their rated open-circuit voltage. It has been discussed on other forums.

The only concern I have is that your only defence against overcharging your cells is a relay controlled by the BMS. What if this were to fail somehow?

The argument for having a charge controller is that there would be two lines of defence against overcharging the pack. Say, for instance, the charge controller fails to switch off; the BMS would then step in, and vice versa.

The other thing I note you arguing is that the only thing you gain by balancing is capacity. However, suppose you have a 3-cell pack. Cells 1 and 2 charge to 3.8 V and cell 3 charges to 3.9 V. The higher the voltage on a cell, the quicker it degrades. Fast-forward, and the capacity of cell 3 has dropped faster than cells 1 and 2, and now it charges to 4 V while the other two sit at 3.75 V.

This unnecessarily decreases your capacity and stresses/kills your expensive cells.


I’m not redefining the knee. The engineers that work with batteries, including the folks that invented the things, reference the ‘knee’ as the point on the LiFePO4 charge or discharge curve where the cell voltage takes a dramatic departure from about 3.2-3.3 V at rest (higher when charging; lower when discharging) and heads either rapidly up towards 4.0, or rapidly down towards 2.0. I have zero objection to the term.

In the constant-current stage, energy from the solar panels is fed to the battery very efficiently for the bulk phase of charging. Once the predetermined end voltage is reached, the cell is not yet fully charged. The charger transitions to constant voltage: this keeps the cell voltage from running away, as lithium is wont to do, and also provides the slower absorption phase where the remaining 15-25% of the charge happens. This is NOT the same as ‘turn the relay off when a cell reaches a voltage’. Your method only performs the bulk phase of charging. If you have determined that you need, for example, 70% of your capacity, and you select your end-of-charge (and low-voltage cut) points for that capacity, and you don’t perform the CV absorption phase, then you’re only actually filling the battery to 55% SOC. Fast charging with a high current input requires a longer period of absorption, because the elevated current raises cell voltage and will trigger the high-voltage disconnect sooner. The lower the rate of charge, the more accurate the transition from CC to CV.
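To make the distinction concrete, here is a toy CC/CV current command. The numbers and the simple linear taper are my own illustration, standing in for the real feedback loop in a charger; a relay-only scheme effectively implements only the first branch:

```python
def cc_cv_setpoint(cell_v, i_max=28.0, v_cv=3.60, taper_gain=200.0):
    """Toy CC/CV current command (illustrative values, not a real charger).
    Full current while below the CV point, then a linear taper toward zero
    as the charger holds the cell at v_cv during absorption."""
    if cell_v < v_cv:
        return i_max                            # constant-current (bulk) phase
    # constant-voltage (absorption) phase: shed current to hold v_cv
    return max(0.0, i_max - taper_gain * (cell_v - v_cv))
```

A bare relay gives only `i_max` or zero; the taper branch is the absorption phase that the relay-and-diode scheme skips.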

The end phase of the charge must be managed - the current must be tapered off - because if it’s not, cell temperature will rise. Leaving the cell connected to the full current from the supply will shorten cell life at best and destroy them at worst.

Most LiFePO4 cells won’t remain at 3.6 V per cell once the charger is disconnected. There’s very little energy there, and they will settle fairly rapidly to about 3.4. If damage is going to occur at that end of the curve, it’ll be from exceeding cell voltage or by allowing cells to overheat. Unless you define damage as the capacity loss that results from regular cycles to 100%, then sure, charge only to 80% or 70%. The ‘knee’ is not an indicator of damage at either end of the curve. Since most of the LiFePO4 charge/discharge curve is relatively horizontal, the bends provide us confirmation that we’ve gotten near to the end of the charge or discharge process.

I doubt that others think balancing is important because they can’t shut off their controller, whether the front end of the charger uses MPPT or PWM. I also doubt it’s because the BMS can’t do hysteresis. It’s clear from their comments that they understand this is a function of the charger or charge controller, not the BMS. If some here are assuming that your panels can produce more current than your battery can take, they’re correct. End of charge for your 280 Ah battery is marked by the current tapering down to 0.05-0.1 C; that’s 14 to 28 A for your battery. Trying to push 60 A into a battery late in the charge will increase cell temperature. This is abuse and will damage the battery. At the extreme, it can cause a fire.

No new software is necessary for the current DIYBMS to balance the battery when it’s used in a proper system. It appears you’re simply trying to change the BMS because you don’t want to use a charger/charge controller.

It is possible to manage a battery manually. It’s also possible to manage it with only a charge controller/charger. It’s also possible to manage it with a BMS that only indicates cell high or low voltage. Each of those, though, requires manual monitoring and intervention to protect the battery because each of those configurations is incomplete.

When managing any type of lithium chemistry, you deactivate float, balance, and desulphate. These must not be used. Float is CV; float is ‘trickle charging’; float causes plating of electrodes. Balance and desulphate are designed to overcharge the battery. Lead acid can tolerate this, but lithium cells cannot.

Managing lead acid batteries with a relay and a diode went out in the 1970s, and was never a thing for lithium. Modding a BMS to support that is not an improvement.


That board design is an interesting concept; easier to expand by just putting jumpers between the boards.

I have a relatively large 100 Ah prismatic-cell LiFePO4 battery and have had it for a couple of years now. Just mentioning that its factory-installed BMS does not balance the cells. All the single BMS does is read the voltage and temperature of each cell, and if the maximum or minimum is exceeded on any one cell, the battery is disconnected. The voltage is measured to three decimal places, and the cells are usually within 0.001 V of each other. I will admit that when I first got the battery I had issues: two cells were severely out of balance and I had to balance them manually, but since then I have not had issues with them going out of balance. Out of curiosity, I used to have an MPPT solar controller on it, but I found that it sucked when operating with my battery. Today I dropped the output voltage of my solar panels to just less than my battery’s required voltage and installed a boost converter set to the optimal input voltage for my battery. Now it works much better, as it lets the BMS handle the charging process instead of the solar controller and BMS fighting against each other all the time, each trying to charge the battery the way it wanted to.

And for people’s curiosity, here’s a picture of my battery’s BMS. This particular one handles 16 cells. It also provides heating support and is good to 100 A discharge; if more than 100 A charge/discharge is required, a contactor relay will need to be installed.