Suppose we have a Bayesian game where the information structure is defined as $P^X=\{(X,\mathcal{X},P_\theta)\}_{\theta\in\Theta}$, where the signal generated by the information structure in each state $\theta\in\Theta$ is a random variable taking values in $(X,\mathcal{X})$ endowed with the probability distribution $P_\theta$. If we have another information structure $Q^Y=\{(Y,\mathcal{Y},Q_\theta)\}_{\theta\in\Theta}$, what assumptions are needed so that we can compare these information structures, and how is Blackwell's Equivalence Theorem on information structures applied to make the comparison?

Also, if one of the two information structures were an expansion of the other, say $P^X$ is an expansion of $Q^Y$, what would this change?


1 Answer

To answer the first part of your question: we do not need any further assumptions to compare experiments, beyond some measurability issues (but see the comments below on infinite versions of the theorem).

Before going on, I'll fix notation that is standard in the game theory literature (and convenient for me).

An experiment (or information structure) for a given state space $\Omega$ is a tuple $(S,\pi)$, where $S$ is the set of signals endowed with an appropriate $\sigma$-algebra, and $$\pi: \Omega \rightarrow \Delta (S)$$ is a signal function; $\pi(s|\omega)$ is the probability that $s$ is realized given state $\omega$. In your notation, $X$ corresponds to $S$, and $P_{\theta}(x)=\pi(x|\theta)$.

To state Blackwell's Equivalence Theorem, we need a few more definitions.

An experiment $(S',\pi')$ is a $\textbf{garbling}$ of $(S,\pi)$ if there exists a map $g: S \rightarrow \Delta(S')$ such that $$\pi'(s'|\omega)=\sum_{s\in S}g(s'|s)\pi(s|\omega),$$ or more compactly $$\pi'= g \circ \pi,$$ where $\circ$ denotes the composition of two stochastic maps.

Intuitively, a garbling $g: S \rightarrow \Delta(S')$ can be thought of as adding noise, blurring the information generated by $\pi$. Since $g(s'|s)$ is the probability that the signal realization $s\in S$ is changed into the signal realization $s'\in S'$, $\pi'$ is a noisier, more uncertain signal function than $\pi$.

When the state space and the signal spaces $S, S'$ are finite, with $|\Omega|=m$, $|S|=n$, and $|S'|=n'$, one can represent garbling with matrices. Let $\pi_i(s_j)=\pi(s_j|\omega_i)$, so the experiment $(S,\pi)$ is represented by the $m \times n$ row-stochastic matrix $$\Pi=\begin{pmatrix} \pi_1(s) \\ \pi_2(s) \\ \vdots \\ \pi_m(s) \end{pmatrix}.$$ Then $(S',\pi')$ is a garbling of $(S,\pi)$ if and only if there exists an $n \times n'$ stochastic matrix $G$ such that $\Pi'=\Pi G$.
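
As a minimal numerical sketch of the matrix representation (all matrices below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Garbling as a matrix product: PI' = PI @ G, where G is row-stochastic
# and G[j, k] = g(s'_k | s_j). All numbers are made up for illustration.
PI = np.array([[0.8, 0.2],
               [0.3, 0.7]])    # m x n: rows are states, columns signals
G = np.array([[0.9, 0.1],
              [0.2, 0.8]])     # n x n': adds noise to each realization
PI_prime = PI @ G              # the garbled experiment

assert np.allclose(PI.sum(axis=1), 1.0)        # PI is row-stochastic
assert np.allclose(G.sum(axis=1), 1.0)         # so is G,
assert np.allclose(PI_prime.sum(axis=1), 1.0)  # hence so is PI @ G
```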

Obviously, garbling gives only a $\textit{partial}$ order, not a total one: many information structures cannot be directly compared with each other.

We say that $(S,\pi)$ is $\textbf{more informative}$ than $(S',\pi')$ if, for $\textit{any}$ finite action set $A$ and any utility function $u: A \times \Omega \rightarrow \mathbb{R}$, the decision maker (weakly) prefers $(S,\pi)$ over $(S',\pi')$, i.e., obtains a weakly higher optimal expected payoff. This definition is very strong, since it requires $\textit{every}$ decision maker to prefer $(S,\pi)$ over $(S',\pi')$.
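
To make the comparison concrete, here is a small sketch (hypothetical payoffs and experiments, names are mine) that computes a decision maker's optimal expected payoff under an experiment; garbling can never raise it:

```python
import numpy as np

# Optimal expected payoff of a decision maker who observes the experiment
# and best-responds to each signal. All names and numbers are illustrative.
def experiment_value(mu, PI, U):
    """mu: prior; PI[i, j] = pi(s_j | omega_i); U[k, i] = u(a_k, omega_i)."""
    joint = mu[:, None] * PI          # joint[i, j] = mu(omega_i) pi(s_j | omega_i)
    payoffs = U @ joint               # payoffs[k, j]: expected payoff of a_k at s_j
    return payoffs.max(axis=0).sum()  # best-respond signal by signal, sum over s

mu = np.array([0.5, 0.5])
PI = np.array([[0.8, 0.2], [0.3, 0.7]])
U = np.eye(2)                         # "match the state" payoffs
G = np.array([[0.9, 0.1], [0.2, 0.8]])
# A garbling can never raise the value of information:
assert experiment_value(mu, PI @ G, U) <= experiment_value(mu, PI, U) + 1e-12
```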

We say that a distribution of actions conditional on the state, $\chi: \Omega \rightarrow \Delta(A)$, is $\textbf{feasible}$ under $(S,\pi)$ if there exists a strategy for the DM, $\sigma: S \rightarrow \Delta(A)$, such that $\chi=\sigma \circ \pi$. In other words, $\chi$ is feasible if one can construct a behavioral strategy $\sigma$ as a function of signal realizations only, such that the resulting action distribution $\sigma \circ \pi$ is $\chi$. The set of feasible distributions of actions is the DM's effective choice set.
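
In matrix form, with hypothetical numbers, feasibility is again just a product of stochastic matrices:

```python
import numpy as np

# chi = sigma o pi in matrix form: CHI = PI @ SIGMA, where SIGMA[j, k] is
# the probability of action a_k after signal s_j. Illustrative numbers.
PI = np.array([[0.8, 0.2],
               [0.3, 0.7]])      # states x signals
SIGMA = np.array([[1.0, 0.0],    # play a_1 for sure after s_1
                  [0.1, 0.9]])   # mostly a_2 after s_2
CHI = PI @ SIGMA                 # a feasible state-conditional action dist.
assert np.allclose(CHI.sum(axis=1), 1.0)
```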

Also, fix some prior belief over states, $\mu \in \Delta (\Omega)$. Given a signal realization $s$, a Bayesian agent updates his belief to the posterior $\mu_s(\omega)=\frac{\mu(\omega)\pi(s|\omega)}{\sum_{\omega'\in\Omega}\mu(\omega')\pi(s|\omega')}$. Since each signal realization $s$ corresponds to a posterior belief $\mu_s$, and signal realizations are generated probabilistically, each experiment $(S,\pi)$ induces a distribution over posteriors $\tau\in\Delta(\Delta(\Omega))$. Assuming WLOG that $\mu_s\neq \mu_{s'}$ for distinct $s\neq s'$, one sees that $\tau(\mu_s)=\pi(s)=\sum_{\omega\in\Omega}\pi(s|\omega)\mu(\omega)$. In fact, a straightforward calculation shows that $\sum_{\mu_s\in\operatorname{supp}(\tau)}\tau(\mu_s)\mu_s=\mu$. This equation is usually referred to as the "Bayes plausibility" constraint, and it simply says that the expected posterior equals the prior.
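
A quick sketch of this computation (toy prior and experiment, chosen by me):

```python
import numpy as np

# Posteriors and Bayes plausibility for a finite experiment (toy numbers).
mu = np.array([0.5, 0.5])              # prior over two states
PI = np.array([[0.8, 0.2],
               [0.3, 0.7]])            # states x signals

p_s = mu @ PI                          # unconditional signal probabilities pi(s)
posteriors = (mu[:, None] * PI / p_s).T  # row j = posterior mu_{s_j} over states
# Bayes plausibility: the expected posterior equals the prior.
assert np.allclose(p_s @ posteriors, mu)
```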

We say a distribution $F$ is a $\textbf{mean-preserving spread}$ of $G$ if $G$ second-order stochastically dominates $F$ and $F$ and $G$ have the same mean. There are alternative equivalent definitions, which I'll skip.
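
In the two-state case a posterior reduces to a number in $[0,1]$, so the mean-preserving-spread condition is one-dimensional and easy to check. A sketch (the function and all numbers are mine, for illustration; it uses the identity $\int_0^t F(x)\,dx=\sum_i p_i\max(t-x_i,0)$ for a step CDF, and the fact that for piecewise-linear integrated CDFs it suffices to compare them at the kink points):

```python
import numpy as np

def integrated_cdf(support, probs, t):
    # Exact value of the integral of the step CDF from 0 to t.
    return float(np.sum(probs * np.clip(t - support, 0.0, None)))

def is_mps(sup_f, pr_f, sup_g, pr_g):
    """F is a mean-preserving spread of G?  (G SOSD F, equal means.)"""
    if not np.isclose(np.dot(sup_f, pr_f), np.dot(sup_g, pr_g)):
        return False                    # means must coincide
    kinks = np.union1d(sup_f, sup_g)    # enough to check the kink points
    return all(integrated_cdf(sup_f, pr_f, t)
               >= integrated_cdf(sup_g, pr_g, t) - 1e-12 for t in kinks)

# tau spreads posteriors further from the prior 0.5 than tau' does:
print(is_mps(np.array([0.2, 0.8]), np.array([0.5, 0.5]),
             np.array([0.4, 0.6]), np.array([0.5, 0.5])))  # True
```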

Now we have all the building blocks to state the beautiful Blackwell Equivalence Theorem (Blackwell 1951, 1953).

The following are equivalent:

  1. $(S',\pi')$ is a garbling of $(S, \pi)$.
  2. Any feasible distribution of actions under $(S',\pi')$ is also feasible under $(S,\pi)$.
  3. $(S,\pi)$ is more informative than $(S',\pi')$.
  4. $\tau$ is a mean-preserving spread of $\tau'$, where $\tau$ and $\tau'$ are the distributions of posteriors induced by $(S,\pi)$ and $(S',\pi')$, respectively.

The term $\textbf{Blackwell order}$ refers to the $\textit{partial}$ order over the set of all experiments under which $\pi \succeq \pi'$ if and only if the four equivalent conditions above hold.
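
For finite experiments, condition 1 can be checked mechanically: finding a garbling matrix is a linear feasibility problem. A sketch using scipy (the function name and all numbers are mine, for illustration):

```python
import numpy as np
from scipy.optimize import linprog

def find_garbling(PI, PI_prime):
    """Look for a row-stochastic G with PI @ G = PI_prime (condition 1).

    Linear feasibility problem over the entries of G; returns G or None.
    """
    m, n = PI.shape
    n_p = PI_prime.shape[1]
    A_eq, b_eq = [], []
    for i in range(m):                 # match PI @ G = PI_prime entrywise
        for k in range(n_p):
            row = np.zeros(n * n_p)
            row[k::n_p] = PI[i]        # coefficient of g[j, k] is PI[i, j]
            A_eq.append(row); b_eq.append(PI_prime[i, k])
    for j in range(n):                 # each row of G must sum to one
        row = np.zeros(n * n_p)
        row[j * n_p:(j + 1) * n_p] = 1.0
        A_eq.append(row); b_eq.append(1.0)
    res = linprog(np.zeros(n * n_p), A_eq=np.array(A_eq),
                  b_eq=np.array(b_eq), bounds=[(0, 1)] * (n * n_p))
    return res.x.reshape(n, n_p) if res.success else None

PI = np.array([[0.8, 0.2], [0.3, 0.7]])
G_true = np.array([[0.9, 0.1], [0.2, 0.8]])
print(find_garbling(PI, PI @ G_true))  # recovers some valid garbling matrix
```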

Hope this answers your first question.

For the second part: we say an information structure $(S^*,\pi^*)$ is a $\textbf{combination}$ of $(S,\pi)$ and $(S',\pi')$ if $$S^*=S \times S'$$ and $$\sum_{s'\in S'}\pi^*(s,s'|\omega)=\pi(s|\omega), \qquad \sum_{s\in S}\pi^*(s,s'|\omega)=\pi'(s'|\omega).$$ Note that this definition puts no restriction on the correlation structure of $S$ and $S'$; the only requirement is that $\pi^*$ marginalizes to $\pi$ and $\pi'$.
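
For instance, one valid combination draws the two signals independently conditional on the state; a sketch with hypothetical numbers, checking the two marginalization conditions:

```python
import numpy as np

# One valid combination: draw s and s' independently given the state, so
# pi*(s_j, s'_k | omega_i) = pi(s_j | omega_i) * pi'(s'_k | omega_i).
# Any correlation with the right marginals would also qualify. Toy numbers.
PI = np.array([[0.8, 0.2], [0.3, 0.7]])     # states x S
PI_p = np.array([[0.6, 0.4], [0.5, 0.5]])   # states x S'
PI_star = np.einsum('ij,ik->ijk', PI, PI_p)  # states x S x S'
# The two marginalization conditions in the definition:
assert np.allclose(PI_star.sum(axis=2), PI)
assert np.allclose(PI_star.sum(axis=1), PI_p)
```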

An information structure $(S^*,\pi^*)$ is an $\textbf{expansion}$ of $(S,\pi)$ if there exists some information structure $(S',\pi')$ such that $(S^*,\pi^*)$ is a combination of $(S,\pi)$ and $(S',\pi')$.

If $(S^*,\pi^*)$ is an expansion of $(S,\pi)$, then $(S^*,\pi^*)$ is at least as informative as $(S,\pi)$. (The ranking can be strict: the extra component $S'$ may carry additional information, so an expansion need not be Blackwell equivalent to the original.) The weak ranking is straightforward from the equivalence conditions above; there may be other proofs, but I'll use the feasibility condition here.

Assume $\chi: \Omega \rightarrow \Delta(A)$ is feasible under $(S,\pi)$. Then there exists some $\sigma: S \rightarrow \Delta(A)$ such that $$\chi_{\omega}(a)=\sum_{s}\pi(s|\omega)\sigma(a|s).$$ Define $\sigma^*: S \times S' \rightarrow \Delta(A)$ by $\sigma^*(a|s,s')=\sigma(a|s)$ for all $s \in S$, $s'\in S'$. We have $$\chi_{\omega}(a)=\sum_{s}\pi(s|\omega)\sigma(a|s)=\sum_{s}\sum_{s'}\pi^*(s,s'|\omega)\sigma(a|s)=\sum_{s,s'}\pi^*(s,s'|\omega)\sigma^*(a|s,s').$$ The second equality follows from the marginalization condition in the definition of a combination, and the third is just the definition of $\sigma^*$. Thus the expansion $(S^*,\pi^*)$ is more informative than $(S,\pi)$.

The converse direction does not hold in general: a strategy $\sigma^*$ under $(S^*,\pi^*)$ may condition on $s'$, which is unobservable under $(S,\pi)$, so one cannot simply set $\sigma(a|s)=\sigma^*(a|s,s')$. What is true is that $(S,\pi)$ is a garbling of $(S^*,\pi^*)$ via the projection $(s,s')\mapsto s$, and, conversely, any information structure more informative than $(S,\pi)$ is Blackwell equivalent to some expansion of $(S,\pi)$.
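
The projection argument can be checked numerically; a quick sketch (again with made-up numbers):

```python
import numpy as np

# The original experiment is a garbling of any expansion: the deterministic
# projection (s, s') -> s is itself a garbling matrix. Toy numbers again.
PI = np.array([[0.8, 0.2], [0.3, 0.7]])
PI_p = np.array([[0.6, 0.4], [0.5, 0.5]])
n, n_p = PI.shape[1], PI_p.shape[1]
PI_star = np.einsum('ij,ik->ijk', PI, PI_p).reshape(PI.shape[0], n * n_p)
G_proj = np.zeros((n * n_p, n))
for j in range(n):
    for k in range(n_p):
        G_proj[j * n_p + k, j] = 1.0      # forget the s' coordinate
assert np.allclose(PI_star @ G_proj, PI)  # (S, pi) is a garbling of (S*, pi*)
```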

Hope the answer helped! Any comments are welcome.

  • Actually, infinite versions of the theorem usually require a lot more than some measurability assumptions. Usually, one needs to assume the existence of densities. – Michael Greinecker, Jan 9, 2022 at 17:15
  • @MichaelGreinecker Thank you for your comment. Can you elaborate more on the density issues? Any references would be helpful. – djsteve, Jan 10, 2022 at 13:10
  • The standard reference for these kinds of questions is the book "Mathematical Theory of Statistics" by Strasser, which makes for really, really hard reading. The most readable reference I know is Chapter 1 of the book "Statistical Experiments and Decisions: Asymptotic Theory" by Shiryaev and Spokoiny, which is still quite technical. There you can look for the assumption that a statistical experiment is "dominated," which means that each probability measure admits a density with respect to a single $\sigma$-finite measure. – Michael Greinecker, Jan 10, 2022 at 15:20
