The reg is rated to 300mA
That's one of its many limits! Not the only one. You're thermally limited.
Let's see what that means.
Linear voltage regulators work like this:
They use an internal pass transistor as an "adjustable resistor", which is continuously adjusted so that, given the current drawn at the output, the voltage dropped across that transistor keeps the output at the desired voltage.
In other words: they're designed so that the energy per charge (per electron) corresponding to the voltage difference between input and output is converted to heat.
That gives us the very simple formula of the power converted to heat:
$$P_\text{linear reg., heat} = \left(V_\text{in}-V_\text{out}\right)\cdot I_\text{out}\text.$$
In your case: an 11 V to 16 V drop × 50 mA of current = 550 mW to 800 mW of idle power dissipation.
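The arithmetic above can be checked with a couple of lines (the drop range and load current are the figures from the question):

```python
def linear_reg_heat(v_drop, i_out):
    """Power burned in a linear regulator: (V_in - V_out) * I_out, in watts."""
    return v_drop * i_out

i_out = 0.050  # 50 mA load current, from the question
for v_drop in (11.0, 16.0):  # V, input-to-output drop range
    p = linear_reg_heat(v_drop, i_out)
    print(f"{v_drop:.0f} V drop x {i_out * 1000:.0f} mA = {p * 1000:.0f} mW")
```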
Go into your data sheet and look for "thermal specifications" or similar, then for "thermal resistance" for your package variant. If you've got a heat sink attached to some part of your chip, take the thermal resistance from junction to that part, add the thermal resistance of the heat sink, and multiply the total by the power converted to heat to get the temperature rise.
You sadly didn't specify the package variant, so I'm assuming SOT-23-5. That's got a junction-to-ambient thermal resistance of 235°C/W, so 550 mW will heat it up by more than 100°C. Yowza!
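A quick sketch of that calculation, under the same SOT-23-5 assumption (θ_JA = 235 °C/W) and an assumed 25 °C ambient:

```python
theta_ja = 235.0   # °C/W, junction-to-ambient for SOT-23-5 (assumed package)
t_ambient = 25.0   # °C, assumed room temperature

for p_heat in (0.55, 0.80):  # W, the dissipation range from above
    rise = theta_ja * p_heat
    print(f"{p_heat * 1000:.0f} mW -> rise of {rise:.0f} °C, "
          f"Tj ~ {t_ambient + rise:.0f} °C")
```

Either end of that range is well past the absolute maximum junction temperature of any regulator you're likely to find.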
So, clearly, this was a bit of a design mistake:
- When reliably dropping large voltage differences, the way to go is usually a switch-mode regulator. Linear regulators "burn" the entire voltage difference between input and output. Switch-mode regulators instead store energy and deliver it at a different voltage than the input – which means you have a bit of losses here and there, but you don't burn all the energy per electron corresponding to the input–output voltage difference.
- You didn't calculate how much power you'd be converting to heat in your linear voltage regulator, and thus didn't pick one to which you could have attached a sufficiently sized heat sink.
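To put a number on the first point: here's a rough comparison of heat generated by a linear regulator versus a switch-mode (buck) regulator for the same conversion. The 5 V output and the 90 % buck efficiency are assumed, typical values, not from the question or any specific datasheet:

```python
v_in, v_out, i_out = 16.0, 5.0, 0.050  # assumed example operating point

# Linear: efficiency is simply Vout/Vin; everything else becomes heat.
p_out = v_out * i_out
p_linear_heat = (v_in - v_out) * i_out
eff_linear = p_out / (p_out + p_linear_heat)

# Buck at an assumed 90 % efficiency: heat is the remaining 10 % of input power.
eff_buck = 0.90
p_buck_heat = p_out / eff_buck - p_out

print(f"linear: {eff_linear:.0%} efficient, {p_linear_heat * 1000:.0f} mW of heat")
print(f"buck:   {eff_buck:.0%} efficient, {p_buck_heat * 1000:.0f} mW of heat")
```

Roughly a twenty-fold reduction in dissipated power at this operating point – which is the difference between needing a heat sink and not noticing the regulator is warm.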