I've done a similar calculation for a "star" made of water ("Minimum size of a 'water star'").
The essence of the calculation is that deuterium ignites only if the central temperature reaches the ignition temperature before the core becomes degenerate. The central temperature can be estimated from the virial theorem and the perfect gas law; it depends on the mean particle mass $\mu$ and on $M/R$.
The radius at which a mass $M$ becomes electron-degenerate depends on $\mu$ and the mean mass per electron $\mu_e$.
The minimum mass is found by setting $R$ equal to the radius at which degeneracy sets in and requiring the central temperature there to equal the deuterium ignition temperature. For that temperature we can take the central temperature of a "star" of normal composition that just ignites deuterium at about 13 Jupiter masses (the result given by detailed models).
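To make the scaling explicit (a sketch, keeping only proportionalities): the virial/perfect-gas estimate gives $kT_c \propto \mu M/R$, while the (non-relativistic) electron Fermi energy is $E_F \propto n_e^{2/3} \propto (M/\mu_e R^3)^{2/3}$. Degeneracy sets in when $kT_c \sim E_F$, which happens at

$$R_{\rm deg} \propto \mu^{-1}\mu_e^{-2/3} M^{-1/3},$$

so the maximum central temperature reached during contraction is

$$kT_{\rm max} \propto \frac{\mu M}{R_{\rm deg}} \propto \mu^2 \mu_e^{2/3} M^{4/3}.$$

Setting $T_{\rm max} = T_{\rm ign}$ and solving for the mass gives

$$M_{\rm min} \propto T_{\rm ign}^{3/4}\, \mu^{-3/2}\mu_e^{-1/2}.$$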
A normal composition has $\mu = 16/27$ atomic mass units and $\mu_e = 8/7$ atomic mass units (for an ionised hydrogen/helium mixture with $X = 0.75$, $Y = 0.25$). Fully ionised pure deuterium contributes one ion and one electron per 2 atomic mass units, so $\mu = 1$ and $\mu_e = 2$.
The minimum mass turns out to be $\propto \mu^{-3/2}\mu_e^{-1/2}$. So scaling the 13 Jupiter mass result by $(\mu_{\rm n}/\mu_{\rm D})^{3/2}(\mu_{e,\rm n}/\mu_{e,\rm D})^{1/2} \simeq 0.34$ for the differing compositions, we get a new minimum mass of roughly 4.5 Jupiter masses.
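As a quick numerical check, here is a sketch in Python. The mass fractions $X = 0.75$, $Y = 0.25$ and the 13 Jupiter-mass normalisation are taken from above; the deuterium values assume full ionisation.

```python
# Check of the scaling M_min ∝ mu^(-3/2) * mu_e^(-1/2), normalised to the
# 13 Jupiter-mass deuterium-burning limit for a normal-composition star.

def mean_weights(X, Y):
    """Mean particle mass mu and mean mass per electron mu_e (in atomic
    mass units) for a fully ionised hydrogen/helium mixture with
    hydrogen and helium mass fractions X and Y."""
    mu = 1.0 / (2.0 * X + 0.75 * Y)   # ions + electrons per unit mass
    mu_e = 2.0 / (1.0 + X)            # electrons only
    return mu, mu_e

mu_n, mue_n = mean_weights(0.75, 0.25)   # normal composition: 16/27, 8/7
mu_d, mue_d = 1.0, 2.0                   # fully ionised pure deuterium

M_norm = 13.0  # Jupiter masses, deuterium-burning limit from detailed models
M_deut = M_norm * (mu_n / mu_d) ** 1.5 * (mue_n / mue_d) ** 0.5

print(f"mu = {mu_n:.4f}, mu_e = {mue_n:.4f}")
print(f"Minimum 'deuterium star' mass ~ {M_deut:.1f} M_Jup")  # ~ 4.5
```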
For a given composition, the deuterium-burning threshold is a factor of several below the hydrogen (protium) ignition threshold (about 80 Jupiter masses for a normal composition), principally because the ignition temperature for deuterium is a few times lower than that for hydrogen.
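As a rough consistency check on that last point, we can scale the 13 Jupiter-mass limit up by $(T_{\rm H}/T_{\rm D})^{3/4}$ at fixed composition. The ignition temperatures used here are representative round values I am assuming ($\sim 3\times 10^{6}$ K for hydrogen, $\sim 4\times 10^{5}$ K for deuterium), so this is an order-of-magnitude sketch only.

```python
# Crude check that M_min ∝ T_ign^(3/4) at fixed composition, using
# assumed representative ignition temperatures.

T_D, T_H = 4e5, 3e6          # K: approximate D and H ignition temperatures
M_D = 13.0                   # M_Jup: deuterium-burning limit, normal composition

M_H_est = M_D * (T_H / T_D) ** 0.75
print(f"Estimated hydrogen-burning limit ~ {M_H_est:.0f} M_Jup")
```

This lands within a factor of about 1.4 of the $\sim 80$ Jupiter-mass hydrogen-burning limit from detailed models, which is reasonable for a pure scaling argument.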