
According to Shapiro and Wilk (1965), Lemma 3,

$W$ has the lower bound $na_1^2/(n-1)$.

To find this bound, they solve the following optimization problem:

$$\max\quad y'y$$

$$\text{s.t.}\quad 1'y=0,\quad a'y = 1,\quad y_1 \le y_2 \le \cdots \le y_n,$$

$$\text{where}\quad a'a = 1,\quad -a_i = a_{n-i+1},$$

$$|a_1|=|a_n| > |a_2|=|a_{n-1}| > \cdots > |a_{n/2}| = |a_{n/2+1}| \quad \text{if } n \text{ is even},$$

$$|a_1|=|a_n| > |a_2|=|a_{n-1}| > \cdots > a_{(n+1)/2} = 0 \quad \text{if } n \text{ is odd}.$$

In the paper, the authors list the vertices of the convex feasible region and evaluate $y'y$ at each of them to find the maximum.

These vertices are

$\left(\frac{n-1}{na_1}, \frac{-1}{na_1},\ldots, \frac{-1}{na_1} \right)$

$\left(\frac{n-2}{n(a_1+a_2)}, \frac{n-2}{n(a_1+a_2)},\frac{-2}{n(a_1+a_2)},\ldots, \frac{-2}{n(a_1+a_2)} \right)$

$\vdots$

$\left( \frac{1}{n(a_1+\ldots+a_{n-1})}, \frac{1}{n(a_1+\ldots+a_{n-1})},\ldots,\frac{-(n-1)}{n(a_1+\ldots+a_{n-1} )}\right ).$
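As a numerical sanity check (not part of the paper), one can verify that each listed vertex satisfies the constraints $1'y=0$, $a'y=1$, and $y_1\le\cdots\le y_n$. The exact Shapiro–Wilk coefficients require $m$ and $V$; the sketch below uses the crude stand-in $a \propto m$ with Blom's approximate normal scores, which at least has the required antisymmetry $-a_i = a_{n-i+1}$ and unit norm — this is an assumption for illustration only.

```python
import numpy as np
from scipy.stats import norm

n = 5
# Crude stand-in for the Shapiro-Wilk coefficients: a proportional to the
# (approximate, Blom) expected normal order statistics, normalized so a'a = 1.
m = norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))
a = m / np.linalg.norm(m)

# Vertex k has its first k coordinates equal to (n-k)/(n(a_1+...+a_k))
# and its last n-k coordinates equal to -k/(n(a_1+...+a_k)).
for k in range(1, n):
    s = a[:k].sum()
    y = np.full(n, -k / (n * s))
    y[:k] = (n - k) / (n * s)
    assert abs(y.sum()) < 1e-12          # 1'y = 0
    assert abs(a @ y - 1.0) < 1e-12      # a'y = 1
    assert np.all(np.diff(y) >= 0)       # y_1 <= ... <= y_n
print("all vertices satisfy the constraints")
```

Note that $a_1+\cdots+a_k<0$ for $k<n$, which is why the first $k$ coordinates are the small (negative) ones and the ordering constraint holds.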

I could not follow this part of the proof, because I do not see how the vertices of the convex feasible region are obtained.

My question is: how does one find the vertices of the intersection of two (or more) hyperplanes (together with the ordering constraints $y_1 \le \cdots \le y_n$)?



About $W$

Let $m'=(m_1,m_2,\ldots,m_n)$ denote the vector of expected values of standard normal order statistics, and let $V=(v_{ij})$ be the corresponding $n\times n$ covariance matrix. That is, if $x_1\leq x_2\leq\cdots\leq x_n$ denotes an ordered random sample of size $n$ from a standard normal distribution, then $$ \mathbb{E}(x_i) = m_i \qquad \mbox{and} \qquad \operatorname{cov}(x_i,x_j)=v_{ij}. $$

Let $y'=(y_1,\ldots,y_n)$ denote a vector of ordered random observations. The objective is to derive a test of the hypothesis that this is a sample from a normal distribution with unknown mean $\mu$ and unknown variance $\sigma^2$.

Clearly, if the $\{y_i\}$ are a normal sample, then $y_i$ may be expressed as $$y_i = \mu + \sigma x_i\quad (i=1,2,\ldots,n).$$ By the generalized least-squares theorem, the best linear unbiased estimates of $\mu$ and $\sigma$ are the quantities that minimize the quadratic form $(y-\mu 1 -\sigma m)'V^{-1}(y-\mu 1 -\sigma m)$, where $1'=(1,1,\ldots,1)$ (this quadratic form is the squared Mahalanobis distance). These estimates are:

$$ \hat{\mu}= {m'V^{-1}(m1'-1m')V^{-1}y \over 1'V^{-1}1m'V^{-1}m-(1'V^{-1}m)^2}, \qquad \mbox{and} \qquad \hat{\sigma}= {1'V^{-1}(1m'-m1')V^{-1}y \over 1'V^{-1}1m'V^{-1}m-(1'V^{-1}m)^2}.$$

For symmetric distributions, $1'V^{-1}m=0$, and hence $$ \hat{\mu}={1'V^{-1}y \over 1'V^{-1}1}, \qquad \mbox{and} \qquad \hat{\sigma}= {m'V^{-1}y\over m'V^{-1}m} .$$

The $W$ test statistic for normality is defined by $$ W={R^4\hat{\sigma}^2 \over C^2S^2}={b^2 \over S^2}={(a'y)^2\over S^2}= {\left (\sum_{i=1}^{n}a_iy_i \right )^2 \over \sum_{i=1}^{n}(y_i-\bar{y})^2} ,\quad \mbox{where}$$ \begin{align*} &R^2=m'V^{-1}m,\\ &C^2=m'V^{-1}V^{-1}m,\\ &a'=(a_1,\ldots,a_n)={m'V^{-1}\over (m'V^{-1}V^{-1}m)^{1\over2}},\\ &b=R^2\hat{\sigma}/C. \end{align*}
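To make the definition concrete, here is a small sketch that computes $W$ for a normal sample. The exact coefficients $a$ require $m$ and $V$; as an assumption for illustration, the code takes $a \propto m$ (i.e. treating $V$ as proportional to $I$) with Blom's normal scores, and also calls `scipy.stats.shapiro`, which computes the statistic via Royston's approximation of the coefficients — so the two values agree only roughly, not exactly.

```python
import numpy as np
from scipy.stats import norm, shapiro

rng = np.random.default_rng(0)
n = 20
y = np.sort(rng.normal(size=n))        # ordered sample

# Stand-in coefficients: a proportional to the approximate normal scores m.
# (This replaces a' = m'V^{-1} / (m'V^{-1}V^{-1}m)^{1/2}; it is not exact.)
m = norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))
a = m / np.linalg.norm(m)

W = (a @ y) ** 2 / np.sum((y - y.mean()) ** 2)
W_royston, _ = shapiro(y)              # scipy's version of the same statistic

# By Cauchy-Schwarz (a'a = 1, 1'a = 0), W always lies in (0, 1].
assert 0 < W <= 1 and 0 < W_royston <= 1
```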

Lemma 1. $W$ is scale and origin invariant.

Proof: Let $K=cY+d$ with $c\neq 0$. Then \begin{align*} W(K) &= {\left (\sum_{i=1}^{n}a_i K_i \right )^2 \over \sum_{i=1}^{n}(K_i-\bar{K})^2} = {\left (\sum_{i=1}^{n}a_i(cY_i + d) \right )^2 \over \sum_{i=1}^{n}((cY_i + d)-(c\bar{Y}+d))^2} = {\left (\sum_{i=1}^{n}ca_i Y_i + \sum_{i=1}^{n}a_i d \right )^2 \over c^2\sum_{i=1}^{n}(Y_i-\bar{Y})^2}\\ &= {c^2 \left (\sum_{i=1}^{n}a_i Y_i \right )^2 \over c^2 \sum_{i=1}^{n}(Y_i-\bar{Y})^2} = {\left (\sum_{i=1}^{n}a_i Y_i \right )^2 \over \sum_{i=1}^{n}(Y_i-\bar{Y})^2} = W(Y), \qquad \because\: \sum_{i=1}^{n} a_i =0. \quad\qquad\qquad \square \end{align*}

By Lemma 1, $W$ is unchanged if we center the sample (so $1'y=0$) and scale it so that $a'y=1$; then $\sum_{i=1}^{n}(y_i-\bar{y})^2 = y'y$ and $W = 1/(y'y)$. Minimizing $W = {\left (\sum_{i=1}^{n}a_iy_i \right )^2 \over \sum_{i=1}^{n}(y_i-\bar{y})^2}$ therefore becomes the problem:

$$\min\quad {1 \over y'y} \quad\Longleftrightarrow\quad \max\quad y'y$$

$$\text{s.t.}\quad 1'y=0,\quad a'y = 1,\quad y_1 \le y_2 \le \cdots \le y_n,$$

$$\text{where}\quad a'a = 1,\quad -a_i = a_{n-i+1},$$

$$|a_1|=|a_n| > |a_2|=|a_{n-1}| > \cdots > |a_{n/2}| = |a_{n/2+1}| \quad \text{if } n \text{ is even},$$

$$|a_1|=|a_n| > |a_2|=|a_{n-1}| > \cdots > a_{(n+1)/2} = 0 \quad \text{if } n \text{ is odd}.$$
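Evaluating $y'y$ at the listed vertices then yields the bound of Lemma 3: under the stated conditions on $a$, the maximum of $y'y$ is attained at the first (equivalently, by symmetry, the last) vertex, where $y'y = (n-1)/(na_1^2)$, so $\min W = 1/\max y'y = na_1^2/(n-1)$. A numerical sketch (again with the crude stand-in $a \propto m$ from Blom's normal scores, not the exact coefficients):

```python
import numpy as np
from scipy.stats import norm

n = 7
m = norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))  # Blom scores
a = m / np.linalg.norm(m)          # stand-in coefficients with a'a = 1

# At vertex k, y'y = k(n-k) / (n (a_1+...+a_k)^2), for k = 1, ..., n-1.
k = np.arange(1, n)
s = np.cumsum(a)[:-1]              # partial sums a_1 + ... + a_k
yy = k * (n - k) / (n * s ** 2)

# The maximum is at the first vertex, so min W = n a_1^2 / (n-1).
assert np.isclose(yy.max(), (n - 1) / (n * a[0] ** 2))
W_min = 1.0 / yy.max()
print(f"lower bound n*a1^2/(n-1) = {W_min:.4f}")
```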

You can read the paper here: https://www.jstor.org/stable/2333709

  • Please state the definition of $W$. I believe this proof is false: the feasible region contains an $(n-2)$-dimensional hyperplane, so the maximization problem is unbounded. – daw, Feb 28 at 12:45
  • Whatever they are doing, I think the minimum of $W$ is zero, achieved for vectors $y$ orthogonal to $a$. – daw, Feb 28 at 12:53
  • It seems that they are assuming $y_1 \le y_2 \le \dots$. Then this would be OK. – daw, Feb 28 at 14:18
  • @daw Thank you for the comments. I didn't catch the constraint $y_1 \le \cdots \le y_n$. I have summarized the information about $W$ above. Again, thank you so much for your consideration. – 박원빈, Feb 29 at 8:36
