In short, there are no nice standard formulas for this. One can make some order-of-magnitude calculations, though.
The key formula you need is the inverse-square law: the intensity from a spherical energy source falls off with the inverse square of distance, $$I(r)=\frac{E}{4\pi r^2},$$ where $E$ is the total energy released. The useful thing is that if you know that a source with energy release $E_1$ has certain effects at distance $r_1$ (say, a supernova being dangerous at 50 light years), then you get the same effects from a source with energy release $E_2$ at a distance $$r_2=\sqrt{\frac{E_2}{E_1}}r_1.$$ So a hypernova 10 times more luminous than a supernova would be dangerous out to about 158 light years (assuming the initial 50 light year range - this is where different papers and models might quibble a fair bit). Notice that because of the square root you need a very large luminosity increase to get a vast range this way.
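If you want to plug in your own numbers, here is a minimal Python sketch of that scaling; the 50 light year supernova danger radius is the debatable reference assumption.

```python
# A minimal sketch of the inverse-square scaling above.  The 50 light year
# supernova danger radius is the (debatable) reference assumption.
def scaled_danger_radius(energy_ratio, r1_ly=50.0):
    """Distance with the same effect as the reference source at r1_ly,
    for a source releasing energy_ratio times more energy."""
    return energy_ratio ** 0.5 * r1_ly

print(scaled_danger_radius(10.0))  # hypernova at 10x the energy: ~158 ly
```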
Hypernovas are likely progenitors of gamma-ray bursts, which release most of their energy along fairly narrow jets, 2-20 degrees across. That would amp up the range significantly. If the jets cover a fraction $f$ of the sky, the range becomes $$r_{2,\mathrm{GRB}}=\sqrt{\frac{E_2}{f E_1}}r_1,$$ where $E_1$ and $E_2$ are the true total energy releases. For two opposite jets of half-opening angle $\theta$, $f=1-\cos(\theta)$, so we should expect $f$ to be between 0.00015 and 0.015, increasing the range by a factor of 81 down to 8.1 if one is unlucky enough to be in the beam.
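The same sketch with beaming included; the half-opening angles of 1 and 10 degrees just restate the 2-20 degree jets above, nothing new.

```python
import math

# Sketch of the beamed-range estimate; half-opening angles of 1 and 10
# degrees correspond to the 2-20 degree jets mentioned above.
def beam_fraction(half_angle_deg):
    """Sky fraction covered by two opposite jets: f = 1 - cos(theta)."""
    return 1.0 - math.cos(math.radians(half_angle_deg))

def beamed_danger_radius(energy_ratio, half_angle_deg, r1_ly=50.0):
    """Danger radius along the jet axis: r2 = sqrt(E2 / (f E1)) * r1."""
    return math.sqrt(energy_ratio / beam_fraction(half_angle_deg)) * r1_ly

for theta in (1.0, 10.0):
    print(theta, beam_fraction(theta), beamed_danger_radius(10.0, theta))
# f spans ~0.00015-0.015; the 10x hypernova reaches ~1,300-13,000 ly in-beam
```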
The exact effect on different planets depends on many complex factors. Planetary evaporation is not a simple process unless the star delivers far more energy than the gravitational binding energy. The absorbed energy $\pi \epsilon R^2 I(r)$, where $0<\epsilon<1$ is the absorption efficiency, must be much larger than the binding energy $3GM^2/5R$, or $$I(r) \gg \frac{3GM^2}{5\pi\epsilon R^3}.$$ This threshold scales as the product of mass and density, so to evaporate a ten times more massive planet of the same density you need ten times more radiation.
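To get a feel for the threshold, a short sketch with Earth numbers, taking $\epsilon=0.1$ as an assumed absorption efficiency:

```python
import math

# Minimal sketch of the evaporation threshold above, with Earth numbers.
# eps = 0.1 is an assumed absorption efficiency, not a measured value.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.97e24        # planet mass (Earth), kg
R = 6.371e6        # planet radius (Earth), m
eps = 0.1          # assumed absorption efficiency

threshold = 3 * G * M**2 / (5 * math.pi * eps * R**3)    # J/m^2
print(f"fluence must far exceed {threshold:.1e} J/m^2")  # ~1.8e19 J/m^2
```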
Merely melting the surface or blowing off the atmosphere would depend on a lot of geophysical details. $\epsilon$ depends a lot on wavelength, on whether a plasma layer forms, and on similar messy issues. The time course of the light curve also matters, since some of the processes will be hydrodynamic flows rather than instant shockwaves or long-term heating.
A crude model might be that the absorbed energy initially heats up the atmosphere by $$\Delta T = \frac{\epsilon I(r)}{C_P M_a},$$ where $C_P$ is the specific heat capacity and $M_a$ the mass of a one square meter column of atmosphere. For Earth, $M_a=10300$ kg and $C_P=1.00$ kJ/kg·K. A $10^{47}$ J hypernova at a distance of 50 light years radiating evenly would deposit $10^{47}/4\pi (50\ \text{ly})^2\approx 35\cdot 10^9$ J/m$^2$, heating the atmosphere by 345 K if we assume $\epsilon=0.1$ - enough to broil us, but not enough to evaporate the oceans (for water the column mass is 1000 kg per meter of depth, and $C_P=4.2$ kJ/kg·K). Were it just a supernova, 100 times less energetic, the effect is a balmy 3.45 K. On the other hand, with a focused GRB beam we can get roughly 70 to 7,000 times more heating (at which point the above formula stops being valid, since the air ionises and becomes plasma).
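And the heating estimate itself as a sketch, with the same assumptions: $\epsilon=0.1$, and $10^{45}$ J for the supernova, from the "100 times less energetic" comparison.

```python
import math

LY = 9.4607e15   # metres per light year

# Sketch of the atmospheric-heating estimate above; eps = 0.1 as assumed
# there, and 1e45 J for the "100 times less energetic" supernova.
def column_heating(E, r_ly, eps=0.1, M_a=10300.0, C_P=1000.0):
    """Initial temperature rise (K) of a 1 m^2 atmospheric column from an
    isotropic release of E joules at a distance of r_ly light years."""
    fluence = E / (4.0 * math.pi * (r_ly * LY) ** 2)  # J/m^2
    return eps * fluence / (C_P * M_a)

print(column_heating(1e47, 50.0))  # hypernova: ~345 K
print(column_heating(1e45, 50.0))  # supernova: ~3.45 K
```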