
I have $n$ data points, all of which are measurements of angles that I know to be increasing linearly with equal spacing. I am trying to find the best-fit line for this data. Researching it, the von Mises distribution seems to be the most commonly used for linear-circular data, but I can't find any papers that find the best-fit line, so I am attempting it myself. This is what I have so far:

We want to fit the data to the equation:

$\theta = \alpha + \beta i$

Since the data are evenly spaced, we use the index $i$ as the $x$-values; this makes the spacing $1$, and the summation limits below centre the indices symmetrically around $0$.

So we want to minimise the sum of squared errors (measured as chord lengths on the unit circle):

$$Q'\left( {\alpha ,\beta } \right) = \sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {{{\left( {\cos \left( {\alpha + \beta i} \right) - \cos \left( {{\theta _i}} \right)} \right)}^2} + {{\left( {\sin \left( {\alpha + \beta i} \right) - \sin \left( {{\theta _i}} \right)} \right)}^2}}$$

Each term simplifies, since $(\cos A - \cos B)^2 + (\sin A - \sin B)^2 = 2 - 2\cos(A - B) = 4\sin^2\frac{A-B}{2}$:

$${\left( {\cos \left( {\alpha + \beta i} \right) - \cos \left( {{\theta _i}} \right)} \right)^2} + {\left( {\sin \left( {\alpha + \beta i} \right) - \sin \left( {{\theta _i}} \right)} \right)^2} = 4{\sin ^2}\frac{{\left( {\alpha + i\beta - {\theta _i}} \right)}}{2}\;$$

So we can minimise:

$$Q\left( {\alpha ,\beta } \right) = \sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {{{\sin }^2}\frac{{\left( {\alpha + i\beta - {\theta _i}} \right)}}{2}}$$

So we can set $$\frac{{\partial Q}}{{\partial \alpha }} = \frac{1}{2}\sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {\sin {\left( {\alpha + i\beta - {\theta _i}} \right)}}=0$$

and

$$\frac{{\partial Q}}{{\partial \beta }} = \frac{1}{2}\sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} i {\sin {\left( {\alpha + i\beta - {\theta _i}} \right)}}=0$$

and solve for $\alpha$ and $\beta$.

But that is as far as I get.

How can we solve these equations given the measurements $\theta_i$?

Note: simple linear regression actually works very well when the differences between successive angles are less than $\pi$, but it gets trickier when the difference exceeds $\pi$, especially if the standard deviation of the measurement error is large.
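For concreteness, the stationarity conditions can be attacked numerically by handing the residuals $r_i = \sin\frac{\alpha + i\beta - \theta_i}{2}$, whose sum of squares is exactly $Q$, to a generic nonlinear least-squares solver. A sketch in Python — SciPy, the synthetic unwrapped data, and the median-difference starting guess are all illustrative assumptions, and the starting guess matters because $Q$ has many local minima in $\beta$:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n = 9
i = np.arange(n) - (n - 1) / 2            # indices centred on zero, spacing 1
alpha_true, beta_true = 0.4, 3.6          # spacing deliberately larger than pi
# synthetic unwrapped measurements; real data may be wrapped mod 2*pi
theta = alpha_true + beta_true * i + 0.05 * rng.standard_normal(n)

# Q(alpha, beta) = sum_i sin^2((alpha + i*beta - theta_i) / 2)
def resid(p):
    a, b = p
    return np.sin((a + b * i - theta) / 2.0)

x0 = [theta[n // 2], np.median(np.diff(theta))]   # heuristic starting guess
fit = least_squares(resid, x0)
alpha_hat, beta_hat = fit.x
```

Since $Q$ is invariant under $\beta \mapsto \beta + 2\pi$, the slope is only recovered modulo $2\pi$, so a good starting guess is essential.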

  • I think that you should have a look at scikit-guess.readthedocs.io/en/latest/_downloads/… This is a book by @jjacquelin. The method is not the conventional one but it works very well. – Commented Jun 9, 2023 at 8:21
  • Thanks Claude, that looks very interesting. – Commented Jun 9, 2023 at 9:28
  • If you just want something simple that works, why not just do a regression on the angles? You just have to be a little careful around your start/end to make sure your angles are increasing as you go. – Eric, Commented Jun 15, 2023 at 21:51
  • If you are fixing $r=1$ and only worried about fitting $\theta$... is there a reason why you're not just plotting $n$ versus $\theta$ and doing a plain-old linear fit in that? – DotCounter, Commented Jun 15, 2023 at 23:47
  • @AmateurDotCounter and Eric: that is what I do when the differences between successive angles are less than $\pi$, but it gets trickier when the difference exceeds $\pi$. – Commented Jun 16, 2023 at 15:21

2 Answers


Considering the position error as

$$ \delta_k^2 = \|e^{i(\alpha+(k-1)\beta)}-e^{i\theta_k}\|^2 = (\cos(\alpha+(k-1)\beta)-\cos\theta_k)^2+(\sin(\alpha+(k-1)\beta)-\sin\theta_k)^2 $$

we have

$$ E(\alpha,\beta) = \sum_{k=1}^n(\cos(\alpha+(k-1)\beta)-\cos\theta_k)^2+(\sin(\alpha+(k-1)\beta)-\sin\theta_k)^2 $$

The conditions for a minimum are

$$ \cases{\frac{\partial E}{\partial\alpha} = f_1(\alpha,\beta) = 0\\ \frac{\partial E}{\partial\beta} = f_2(\alpha,\beta) = 0} $$

Now, calling $F = (f_1,f_2)$, we use a Newton-like iteration to solve $F=0$:

$$ \delta X = X-X_0 = -M^{-1}(X_0)F(X_0) $$

with $X = (\alpha,\beta)$ and initial guess $X_0=\left(\theta_1,\frac{2\pi}{n}\right)$. A MATHEMATICA script implementing this procedure follows.

Clear["Global`*"]
n = 9;
(* synthetic data: angles evenly spaced on the circle plus noise *)
Table[theta[k] = 2 (k - 1) Pi/n + RandomReal[{-1, 1}/n], {k, 1, n}];
(* squared chord distance between fitted and measured points *)
delta = (-Cos[alpha + (k - 1) beta] + Cos[theta[k]])^2 + (-Sin[alpha + (k - 1) beta] + Sin[theta[k]])^2;
obj = Sum[delta, {k, 1, n}];
F = Grad[obj, {alpha, beta}];  (* gradient {f1, f2} *)
M = Grad[F, {alpha, beta}];    (* Hessian *)
iM = Inverse[M];
X0 = {theta[1], 2 Pi/n};       (* initial guess *)
X = {alpha, beta};
imax = 30;
error = 10^-6;

(* Newton iteration: stop when gradient and step are both small *)
For[i = 1, i <= imax, i++,
 F0 = F /. Thread[X -> X0];
 iM0 = iM /. Thread[X -> X0];
 dX = -iM0.F0;
 Print[{X0, F0}];
 If[Max[Norm[F0], Norm[dX]] < error, Break[]];
 X0 = dX + X0
 ]

(* measured points in red, fitted points in blue *)
gr1 = Graphics[Table[{Red, PointSize[0.02], Point[{Cos[theta[k]], Sin[theta[k]]}]}, {k, 1, n}]];
gr2 = Graphics[Table[{Blue, PointSize[0.02], Point[{Cos[alpha + (k-1) beta], Sin[alpha + (k-1) beta]}]} /. Thread[X -> X0], {k, 1, n}]];
Show[gr1, gr2]

(figure: measured points in red, fitted points in blue)
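For readers without Mathematica, the same Newton iteration can be sketched in Python. NumPy is an assumption here; the gradient and Hessian of $E$ are written out in closed form using $E = \sum_k \left(2 - 2\cos(\alpha + (k-1)\beta - \theta_k)\right)$, and the synthetic data mirror the script above:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 9
k = np.arange(n)                            # plays the role of k-1 above
theta = 2 * np.pi * k / n + rng.uniform(-1.0, 1.0, n) / n

# E = sum_k 2 - 2 cos(phi_k), phi_k = alpha + k*beta - theta_k,
# so gradient and Hessian have closed forms:
def grad_hess(a, b):
    phi = a + b * k - theta
    s, c = np.sin(phi), np.cos(phi)
    g = 2.0 * np.array([s.sum(), (k * s).sum()])
    H = 2.0 * np.array([[c.sum(),       (k * c).sum()],
                        [(k * c).sum(), (k * k * c).sum()]])
    return g, H

x = np.array([theta[0], 2 * np.pi / n])     # initial guess (theta_1, 2 pi/n)
for _ in range(30):
    g, H = grad_hess(*x)
    dx = -np.linalg.solve(H, g)             # Newton step -H^-1 g
    if max(np.linalg.norm(g), np.linalg.norm(dx)) < 1e-6:
        break
    x = x + dx
alpha_hat, beta_hat = x
```

Because the chord-length objective is periodic, the iteration only finds the local minimum nearest the initial guess $(\theta_1, 2\pi/n)$.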

NOTE

Following an observation made by @Eric, simple regression directly on the angles suffices, as the fitted values agree. With

$$ E(\alpha,\beta) = \sum_k(\alpha+(k-1)\beta-\theta_k)^2 $$

$$ \cases{\frac{\partial E}{\partial\alpha} = n\alpha + \frac{(n-1)n}{2}\beta -\sum_k\theta_k = 0\\ \frac{\partial E}{\partial\beta} = \frac{(n-1)n}{2}\alpha+\left(\sum_k (k-1)^2\right)\beta - \sum_k (k-1)\theta_k = 0} $$

solab = Solve[{n alpha + (n - 1) n/2 beta - Sum[theta[k], {k, 1, n}] == 0, (n - 1) n/2 alpha + Sum[(k-1)^2, {k, 1, n}] beta - Sum[(k-1) theta[k], {k, 1, n}] == 0}, {alpha, beta}][[1]]
gr3 = Graphics[Table[{Green, PointSize[0.02], Point[{Cos[alpha + (k-1) beta], Sin[alpha + (k-1) beta]}]} /. solab, {k, 1, n}]];
Show[gr1, gr2, gr3]

NOTE-2

After understanding that the data $\theta_k$ can wind around the circle many times, this final version should solve the problem (I hope).

Defining $$E(\alpha,\beta) = \sum_{k=1}^n\left(\alpha+(k-1)\beta -\theta_k\right)^2$$

we have

$$ \cases{\frac{\partial E}{\partial\alpha} = \sum_{k=1}^n\left(\alpha+(k-1)\beta -\theta_k\right)=0\\ \frac{\partial E}{\partial\beta} = \sum_{k=1}^n(k-1)\left(\alpha+(k-1)\beta -\theta_k\right)=0} $$

or

$$ \left(\matrix{n&\sum_{k=1}^n(k-1)\\ \sum_{k=1}^n(k-1)& \sum_{k=1}^n(k-1)^2}\right)\left(\matrix{\alpha\\ \beta}\right)=\left(\matrix{\sum_{k=1}^n\theta_k\\ \sum_{k=1}^n(k-1)\theta_k}\right) $$

which is implemented in the next script:

Clear["Global`*"]
n = 8;
lambda = 1.3;
beta0 = lambda Pi;  (* true spacing, deliberately larger than Pi *)
(* synthetic unwrapped data winding several times around the circle *)
Table[theta[k] = (k - 1) beta0 + RandomReal[{-1, 1}/15], {k, 1, n}];
(* least squares on the unwrapped angles; delta plays the role of beta *)
obj = Sum[(alpha + (k - 1) delta - theta[k])^2, {k, 1, n}]
solalphadelta = Solve[Grad[obj, {alpha, delta}] == 0, {alpha, delta}][[1]]
gr1 = Graphics[Table[{Red, PointSize[0.02], Point[{Cos[theta[k]], Sin[theta[k]]}]}, {k, 1, n}]];
gr2 = Graphics[Table[{Blue, PointSize[0.02], Point[{Cos[alpha + (k - 1) delta], Sin[alpha + (k - 1) delta]}]} /. solalphadelta, {k, 1, n}]];
gr3 = Graphics[Circle[{0, 0}, 1]];
Show[gr1, gr2, gr3]

(figure: measured points in red, fitted points in blue, unit circle)
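The normal equations above are a plain $2\times 2$ linear solve and can be reproduced outside Mathematica. A Python sketch under the same assumptions (unwrapped measurements $\theta_k$ that may wind the circle many times; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
beta0 = 1.3 * np.pi                         # true spacing, larger than pi
k = np.arange(n)                            # plays the role of k-1 above
theta = k * beta0 + rng.uniform(-1.0, 1.0, n) / 15   # unwrapped measurements

# Normal equations:
# [[n, sum k], [sum k, sum k^2]] [alpha, beta]^T = [sum theta, sum k*theta]^T
A = np.array([[n, k.sum()], [k.sum(), (k * k).sum()]], dtype=float)
b = np.array([theta.sum(), (k * theta).sum()])
alpha_hat, beta_hat = np.linalg.solve(A, b)
```

Because the fit is done on the unwrapped angles, the recovered slope matches $\beta_0 = 1.3\pi$ itself, not its value mod $2\pi$.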

  • Does this work if $\beta>\pi$ with a large error standard deviation? – Commented Jun 16, 2023 at 16:25
  • We corrected an index by passing it from $k$ to $k-1$. Hope this fixes the issues. – Cesareo, Commented Jun 16, 2023 at 20:01
  • In my example diagram I have shown a case where $n=8$ and $\beta=2\pi/n$, but this is not generally true. I am most interested in cases where $\beta > \pi$. – Commented Jun 17, 2023 at 2:33
  • Please have a look at NOTE-2. – Cesareo, Commented Jun 23, 2023 at 8:31
  • Your code gives me errors. I don't understand your solution. Can you explain more? – Commented Jun 27, 2023 at 3:11

Let us consider $n$ points distributed with angles $\theta_i$, which we want to fit by $n$ equidistant points with angles $\gamma_i=\alpha+i\beta$, where $\beta = \frac{2\pi}{n}$, for $i=0,\ldots,n-1$.

Then we need to minimize the quantity (with angles expressed in radians):

$$Q(\alpha) =\sum_i (\gamma_i-\theta_i)^2=\sum_i (\alpha+i\beta-\theta_i)^2$$

that is

$$\frac {dQ(\alpha)}{d\alpha}= \sum_i 2(\alpha+i\beta-\theta_i)=0 \implies n\alpha+\frac{n(n-1)}2\beta-\sum_i \theta_i=0$$

$$\implies \alpha =\frac{\sum_i \theta_i}n-\frac{n-1}{n}\pi$$
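A quick numeric check of this closed form, under the answer's assumption that the spacing is fixed at $\beta = 2\pi/n$ (Python and the synthetic data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
beta = 2 * np.pi / n                       # spacing fixed by assumption
i = np.arange(n)
alpha_true = 0.3
theta = alpha_true + i * beta + rng.uniform(-0.05, 0.05, n)

# closed form: alpha = mean(theta) - (n-1)/n * pi
alpha_hat = theta.mean() - (n - 1) / n * np.pi
```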

  • In my example diagram I have shown a case where $n=8$ and $\beta=2\pi/n$, but this is not generally true. I am most interested in cases where $\beta>\pi$. – Commented Jun 17, 2023 at 2:34
  • @MichaelMcLaughlin So the measurements can make more than $1$ lap around the circle? – user, Commented Jun 18, 2023 at 17:08
  • Yes. Many laps even. – Commented Jun 18, 2023 at 20:48
  • @MichaelMcLaughlin Thanks! I'll try to revise for a general solution. – user, Commented Jun 19, 2023 at 10:40
