I am making a lithium battery discharger/tester to test some 18650 cells that I pulled from old laptop battery packs, because I am too poor to buy brand new cells, which cost more than 5 USD each. Using two 20 ohm 1 W 2512 chip resistors in parallel gives me a sweet little tiny 10 ohm 2 W resistor.

A lithium battery at full charge is 4.2 V. By my calculation, the maximum current should be less than 4.2 V / 10 ohm = 420 mA, and the power P = I^2 * R = 0.42^2 * 10 = 1.76 W. So a 2 W resistor should be able to handle a 1.76 W load. Unfortunately, that's not the case. My resistors are extremely hot to the touch (hot-glue-melting kind of hot). After running for an hour or so, even the other side of the PCB is burning hot to the touch.

But to my relief, other than being super hot (oh boy, I wish my girlfriend were that hot too), I don't find any actual problem. First of all, the current from my test battery is constant at 360 mA, which tells me the resistance is not changing much. Second of all, nothing is burning, and after removing the load and measuring those 1% accuracy resistors, they still read 10 ohm.
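For reference, here is the arithmetic above as a minimal Python sketch (the resistor and voltage values are just the ones from my setup):

```python
# Quick sanity check of the load calculation above.
V_FULL = 4.2        # fully charged Li-ion cell voltage (V)
R_LOAD = 20.0 / 2   # two 20 ohm resistors in parallel -> 10 ohm

i_max = V_FULL / R_LOAD         # Ohm's law: worst-case current
p_max = i_max ** 2 * R_LOAD     # dissipated power: P = I^2 * R

print(f"I_max = {i_max * 1000:.0f} mA")  # 420 mA
print(f"P_max = {p_max:.2f} W")          # 1.76 W
```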
Here we go, after my boring story, my first question: can heat really kill resistors (especially this kind of chip resistor) when they are operated below their rated wattage? My own understanding is that heat does kill a resistor eventually, but it should still last a pretty long time, right? My second question: what is an ideal load for discharging a 4.2 V lithium battery at about 500 mA?
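For that second question, the naive Ohm's-law sizing I have in mind is below (just a sketch; it ignores derating and the fact that the current drops as the cell voltage falls):

```python
V_FULL = 4.2    # starting voltage of a full cell (V)
I_TARGET = 0.5  # desired discharge current (A)

r_load = V_FULL / I_TARGET  # required resistance: 8.4 ohm
p_load = V_FULL * I_TARGET  # dissipation at full charge: 2.1 W

print(f"R = {r_load:.1f} ohm, P = {p_load:.1f} W at full charge")
```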
After soldering a huge nail to one side of the pad as a heat sink, as others have suggested, things improved a little, if at all. The resistors themselves are still super hot, while the whole nail is hot to the touch. So my conclusion is that a huge PCB trace is simply not enough to dissipate the heat generated by these two tiny resistors.