LEDs are driven with current, NOT with voltage. If the LED's forward voltage is 3V, then anything over 3V connected directly across it will act as a short circuit, with "infinite" current (finite in practice, but well over any spec).
You therefore always* need a current-limiting resistor. If your supply is 4.5V and the LED drops 3V, then you will have 1.5V across the resistor. By picking different values for the resistor you can set different currents through it, and current through the resistor = current through the LED.
For example, a 1.5k resistor with 1.5V across it gives 1mA through the resistor = 1mA through the LED. Halve the resistance (for example, two 1.5k in parallel give 750 Ohm) and you have 2mA through the LED, and so on.
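The resistor math above is just Ohm's law on the leftover voltage; a minimal sketch (using this answer's 4.5V supply and 3V LED drop, not values from any datasheet):

```python
# Series resistor math from the answer above (illustrative values, not a spec sheet).
V_SUPPLY = 4.5   # supply voltage (V)
V_LED = 3.0      # LED forward voltage drop (V)

def led_current_mA(resistor_ohms):
    """Current through the series resistor = current through the LED (series circuit)."""
    v_resistor = V_SUPPLY - V_LED             # voltage left over for the resistor
    return v_resistor / resistor_ohms * 1000  # Ohm's law I = V/R, converted to mA

print(led_current_mA(1500))  # 1.5k -> 1.0 mA
print(led_current_mA(750))   # two 1.5k in parallel -> 2.0 mA
```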
How to get maximum brightness: if you want 700mA through the resistor with a 1.5V voltage drop, that's 1.5V/0.7A ≈ 2.14 Ohm. Get something slightly larger than that; you don't want to exceed the maximum current. Also, your resistor has to handle that current, which is not little: 1.5V × 0.7A = 1.05W! That's a lot! You should probably connect a few resistors in parallel; it will be easier.
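The same two formulas (R = V/I and P = V·I) give both the resistor value and the power it must survive; a quick sketch with the numbers above:

```python
# Sizing a series resistor for a target LED current, and its power dissipation.
V_RESISTOR = 1.5   # volts across the resistor (4.5 V supply - 3 V LED drop)
I_TARGET = 0.7     # target current in amps (700 mA)

r = V_RESISTOR / I_TARGET   # Ohm's law: R = V / I
p = V_RESISTOR * I_TARGET   # power the resistor dissipates: P = V * I

print(round(r, 2))  # ~2.14 ohm -- pick the next standard value up
print(round(p, 2))  # ~1.05 W -- needs a power resistor, or several small ones in parallel
```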
And no, you will never reach the brightness it had right before it burned out, because it was already operating out of spec. There is a limit to how bright an LED can be without burning.
*You can use a dedicated constant-current LED driver with no "visible" resistor, but that's a whole other circuit and worth a separate discussion; it may actually be feasible in your case.
EDIT: If you're using a coin cell, you can get 5mA, MAYBE 10mA out of it; it simply can't source more. With AA/AAA cells you can maybe pull a few dozen mA; don't count on more either.
![LED with series current-limiting resistor](https://cdn.statically.io/img/i.sstatic.net/peZfN.png)