I'm working on an off-grid, PLC-based battery bank monitor and dump load controller to replace commercial units that don't work well with our lithium-ion battery bank. In addition, I plan to switch the hydro turbine from nominal 24 Vdc to nominal 48 Vdc by changing from a DD to a DW or SD connection, then run it through an SSR to do MPPT and step down to battery voltage via PWM, for better generator efficiency and lower line losses.
Initially, I'd like to monitor system voltage at the following points:
- Battery bank, nominal 24 Vdc (future upgrade will likely change this to 48 Vdc)
- DC bus in the controller I'm building. When everything is working as planned, this will be connected to the battery bank and running at the same voltage. But the battery bank's safety system may disconnect the bank for OVP/UVP or high differential voltage between cells. In that case, we still want to keep the DC bus voltage correct to protect other components and, when sufficient solar and/or turbine power is present, keep things running without batteries (see the sketch after this list).
- Hydro turbine input, 0..60 Vdc (higher OCV to be confirmed; this may affect the voltage divider and zener diode needs for this input)
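For illustration, here's roughly how I intend to use the bank and bus measurements together to detect a safety disconnect. This is just a Python sketch of the comparison logic (the real implementation will be Structured Text on the Beckhoff); the 2 V threshold and the function name are placeholders of mine, not from any datasheet:

```python
def battery_disconnected(v_bank: float, v_bus: float,
                         delta_max: float = 2.0) -> bool:
    """Flag a likely battery-bank disconnect (OVP/UVP or cell-delta trip).

    With the safety contactor closed, the bank and bus dividers should
    read within a few hundred mV of each other; a large sustained
    difference suggests the bank has been taken offline.
    delta_max is a placeholder threshold in volts.
    """
    return abs(v_bank - v_bus) > delta_max

# Example: bank tripped on UVP while the turbine holds the bus up.
print(battery_disconnected(v_bank=20.1, v_bus=27.4))  # True
```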
The present plan is to use a Beckhoff PLC (due to familiarity with programming these) and an ES3068 0..10 Vdc input card to read the voltages, as well as inputs from Hall-effect sensors to read current at various points in the system.
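For reference, the scaling I have in mind from the card's raw process value back to the real voltage looks like this. I'm assuming the terminal presents 0..10 V as 0..0x7FFF in its default 16-bit representation (worth verifying against the EL30xx documentation), and the 11:1 ratio comes from the 10 kΩ/1 kΩ divider discussed below:

```python
FULL_SCALE_V = 10.0      # ES3068 input range, 0..10 Vdc
RAW_FULL_SCALE = 0x7FFF  # assumed default value presentation; check the docs
DIVIDER_RATIO = (10_000 + 1_000) / 1_000  # 10 k over 1 k -> 11:1

def raw_to_volts(raw: int) -> float:
    """Convert the terminal's raw process value to the measured source voltage."""
    v_at_terminal = raw * FULL_SCALE_V / RAW_FULL_SCALE
    return v_at_terminal * DIVIDER_RATIO

# Example: 26.0 V at the battery -> ~2.36 V at the terminal -> raw ~7745
print(round(raw_to_volts(7745), 2))  # ~26.0
```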
I understand the voltage divider concept, but this is the first time I'm applying one, so I'm learning as I go. I'm thinking of using 10 kΩ and 1 kΩ resistors for the divider, which gives a readable range in excess of 100 V while still giving good resolution in the expected operating range. Given the input card's internal resistance of >130 kΩ, it seems that going much higher in resistor values will adversely affect accuracy. Or am I missing something here? I would like to minimize the parasitic load (and heat dissipation) of the resistors while maintaining sensing accuracy within 100 mV around battery voltage (24-28 Vdc).
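To sanity-check that resistor choice, here's the arithmetic I've been doing as a small Python script: the loading error from the card's input resistance (using the 130 kΩ figure as worst case), the per-LSB resolution at the source assuming an effective 12-bit conversion over 0..10 V, and the parasitic dissipation at a few input voltages. All the assumptions are noted in the comments:

```python
R_TOP = 10_000.0      # ohms, high side of divider
R_BOT = 1_000.0       # ohms, low side of divider
R_IN = 130_000.0      # ohms, card input resistance (datasheet minimum, assumed)
ADC_BITS = 12         # effective resolution I expect from the terminal
FULL_SCALE_V = 10.0   # card input range

ratio_unloaded = R_BOT / (R_TOP + R_BOT)
r_bot_loaded = R_BOT * R_IN / (R_BOT + R_IN)      # R_BOT || R_IN
ratio_loaded = r_bot_loaded / (R_TOP + r_bot_loaded)

# Loading error is systematic, so it can be calibrated out in software.
err_pct = (ratio_loaded / ratio_unloaded - 1) * 100
print(f"loading error: {err_pct:.2f}% -> {26 * err_pct / 100:+.3f} V at 26 V")

# Resolution referred back to the measured source, per LSB
lsb = FULL_SCALE_V / 2**ADC_BITS / ratio_unloaded
print(f"resolution: {lsb * 1000:.1f} mV per LSB at the source")

# Parasitic dissipation at battery float, turbine nominal, and turbine OCV
for v in (28.0, 60.0, 100.0):
    p_total = v**2 / (R_TOP + R_BOT)
    p_top = (v * R_TOP / (R_TOP + R_BOT))**2 / R_TOP
    print(f"{v:5.1f} V: total {p_total * 1000:6.1f} mW, "
          f"10 k resistor {p_top * 1000:6.1f} mW")
```

If I've done this right, the ~0.7% loading error works out to roughly -180 mV at 26 V, which exceeds my 100 mV target on its own; but since it's systematic, I assume it can simply be scaled out in the PLC program, and the ~27 mV per-LSB resolution stays comfortably inside the target. The dissipation numbers also suggest the 10 kΩ resistor should be rated for about 1 W if the turbine OCV really reaches 100 V.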
Considering possible failure modes, I'm thinking defined-interruption-behaviour resistors such as these two may be a good way to go:
My thinking is that if the 10 kΩ resistor on the positive side of the divider fails, it should fail open to protect the PLC input from overvoltage; a zero-voltage reading will then indicate a failed resistor or other open-circuit condition on that input. However, if the 1 kΩ resistor on the negative side of the divider fails open, we need a way to keep the PLC input below 30 Vdc. I'd like the input to stay high in this failure case; even over 10 Vdc is okay, as long as it stays under 30 Vdc. Would a zener diode like the Microsemi 1N5929A (15 V) connected in parallel across the 1 kΩ resistor be appropriate? Would it affect the PLC input reading when the voltage is in the expected 0..100 Vdc range? Or is there a better way to do this?
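To check whether the zener idea holds up, here's the failure-case arithmetic as I understand it, again in Python. I'm using the 1N5929A's nominal 15 V zener voltage; the 1 µA leakage figure in the last step is a conservative guess of mine, not a datasheet number:

```python
R_TOP = 10_000.0  # ohms, the surviving high-side resistor
R_BOT = 1_000.0   # ohms, the low-side resistor (open in the failure case)
V_Z = 15.0        # nominal zener voltage of the 1N5929A

# Failure case: 1 kOhm open, zener clamps the PLC input at ~Vz.
for v_in in (28.0, 60.0, 100.0):
    i_z = max(v_in - V_Z, 0.0) / R_TOP  # current forced through the zener
    print(f"{v_in:5.1f} V in: zener {i_z * 1000:5.2f} mA / "
          f"{V_Z * i_z * 1000:6.1f} mW, "
          f"10 k resistor {(v_in - V_Z) * i_z * 1000:6.1f} mW")

# Normal operation: the divider output tops out near 9.1 V at 100 V in,
# well under Vz, so only zener leakage loads the node. That leakage sees
# the node impedance R_TOP || R_BOT (~909 ohms).
r_node = R_TOP * R_BOT / (R_TOP + R_BOT)
print(f"error per uA of leakage: {1e-6 * r_node * 1000:.2f} mV at the terminal, "
      f"~{1e-6 * r_node * 11 * 1000:.0f} mV referred to the source")
```

If these numbers are right, even at 100 V in the zener only dissipates ~130 mW, far below its rating, while the 10 kΩ resistor sees ~0.7 W, so in this fault the resistor's power rating matters more than the zener's. And in the normal 0..100 V range the clamp never conducts beyond leakage, with roughly 10 mV of error per µA of leakage (referred to the source), which seems tolerable against the 100 mV target.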
The overall goal is high reliability, with failure modes that don't result in the destruction of costly components. Am I headed in the right direction, or am I missing something? Is there a better way to get these voltages into the PLC accurately, reliably, and safely?