I frequently work on projects in which I use optocouplers to isolate digital +5 VDC control signals (for example, from a microcontroller) from the rest of the circuit. However, since these work by illuminating an LED inside the device, they can load the microcontroller pins with several tens of milliamps. What would be the best practice for buffering this control signal with an additional stage, so that the microcontroller effectively sees a high impedance and the current it needs to provide is reduced?
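To put rough numbers on the load, here is a sketch of the usual series-resistor arithmetic. The forward voltage and drive current below are assumed typical values for a garden-variety optocoupler (e.g. a 4N25-class part), not figures from any specific datasheet:

```python
# Illustrative arithmetic only -- Vf and I_led are assumed typical values.
V_supply = 5.0   # logic supply (V)
V_f = 1.2        # assumed optocoupler LED forward voltage (V)
I_led = 0.010    # assumed target LED drive current (A)

# Series resistor that sets the LED current from a 5 V logic pin
R_series = (V_supply - V_f) / I_led
print(f"Series resistor: {R_series:.0f} ohms")  # -> 380 ohms
```

Even at a modest 10 mA per channel, that current comes straight out of the microcontroller pin unless a buffer stage supplies it instead.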
Just naively off the top of my head, I can think of a few things which might work:
- Simply use an op amp as a unity-gain buffer amplifier.
- Use a dedicated comparator chip to compare the input signal against, for example, +2.5 VDC.
- Use a MOSFET as a kind of signal amplifier.
However, upon doing some reading, I have come across a whole range of chips that I have never used before but that sound like they may be designed for this kind of thing. For example:
- A Differential Line Driver (MC3487)
- A Differential Line Receiver (DS90C032)
- A Line Transceiver (SN65MLVD040)
- Buffer gates and drivers (SN74LS07, SN74ABT126)
I really have no experience with any of these and am a little overwhelmed by the amount of stuff available! Can anyone help me understand the differences between these devices, and which of them would or wouldn't be suitable in this case? Is there a best or standard way of achieving what I describe?
edit:
Since I could be switching up to around 30 outputs, I do not want to be concerned at all about loading the microcontroller, and so will not be considering connecting the optocouplers directly to the DIO pins. I think I will therefore go for a logic buffer IC. I am going to try using one SN74LVC1G125 "Single Bus Buffer Gate With 3-State Output" per channel and see how that works out.
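For context on why 30 direct-driven channels worried me, a quick budget check. The 200 mA total-package figure is an assumed ballpark for a small AVR-class microcontroller, not taken from a particular datasheet:

```python
# Rough current-budget sanity check; both limits below are assumptions.
channels = 30
I_per_channel = 0.010        # A, assumed LED drive per optocoupler
I_total = channels * I_per_channel

mcu_package_limit = 0.200    # A, assumed total-current limit for the MCU package

print(f"Total LED drive: {I_total * 1000:.0f} mA")          # -> 300 mA
print(f"Assumed MCU package limit: {mcu_package_limit * 1000:.0f} mA")
print("Over budget!" if I_total > mcu_package_limit else "Within budget")
```

Under those assumptions the direct-drive total exceeds the package limit before considering per-pin limits, which is what pushed me toward external buffers.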