Here is a one-dimensional rendering of a 1964 version of the Pitman-Koopman-Darmois-Fisher theorem, with a proof, as presented by Peter Bickel during Erich Lehmann's course in Berkeley, recovered from the notes of a then-graduate student (who kindly sent them to me!). My own comments are within square brackets.
Theorem Let $\theta$ be a real parameter, $\theta\in\Theta\subset\mathbb R$, and let $p_\theta(\cdot)$ be the common density of the $n$ real-valued i.i.d. $X_i$'s.
Assume $(X_1,\ldots,X_n)$ admits a [real] sufficient statistic
$T(X_1,\ldots,X_n)$ such that
- $\frac{\partial T}{\partial x_i}$ exists for all $i$'s and $x_i$'s
- the support $A=\{x; p_\theta(x)>0\}$ is the same open set for all $\theta$'s
- $\frac{\partial^2 \log p_\theta(x)}{\partial x\,\partial\theta}$ exists for all $(\theta,x)$'s, is continuous in $(\theta,x)$, and different from
zero on $A\times\Theta$ [the notes write $\partial^2T/\partial x_i\partial\theta$, but since $T$ does not involve $\theta$ this must be the cross-derivative of $\log p_\theta$, which is what the proof actually uses].
Then there exist functions $C(\cdot)$, $Q(\cdot)$, $T^*(\cdot)$, and $h(\cdot)$ such that
$$p_\theta(x)=C(\theta)\exp\{Q(\theta)\cdot T^*(x)\}h(x)$$
[when $x\in A$ and $\theta\in\Theta$].
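As an illustration [mine, not in the notes], the Gaussian location family $\mathcal N(\theta,1)$ already displays this shape, since completing the square gives

```latex
p_\theta(x)
  = \underbrace{\tfrac{1}{\sqrt{2\pi}}\,e^{-\theta^2/2}}_{C(\theta)}\,
    \exp\{\underbrace{\theta}_{Q(\theta)}\cdot\underbrace{x}_{T^*(x)}\}\,
    \underbrace{e^{-x^2/2}}_{h(x)}
```

The fixed-support assumption is not cosmetic: the $\mathcal U(0,\theta)$ family admits the sufficient statistic $\max_i x_i$ and yet is not of exponential form, precisely because its support moves with $\theta$.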
Proof. By the factorisation theorem,
$$p_\theta(x_1)\cdots p_\theta(x_n) = h_\theta(T(x_1,\ldots,x_n))\cdot g(x_1,\ldots,x_n)$$
Let
$$A^*=\{(x_1,\ldots,x_n,\theta)|p_\theta(x_1)\cdots p_\theta(x_n)>0\}$$
which is open. Then
$$\sum_{i=1}^n \log\, p_\theta(x_i)=\log\,h_\theta(T(x_1,\ldots,x_n))+\log\,g(x_1,\ldots,x_n)$$
and
$$\frac{\partial^2}{\partial x_i\partial\theta} \sum_{j=1}^n \log\,p_\theta(x_j)=\frac{\partial^2}{\partial x_i\partial\theta} \,\log\,h_\theta(T(x_1,\ldots,x_n))=\frac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta} \cdot \frac{\partial T}{\partial x_i}$$
This reduces to
$$\frac{\partial^2}{\partial x\partial\theta} \log\, p_\theta(x)\Big|_{x=x_i}= \frac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta} \cdot \frac{\partial T(x_1,\ldots,x_n)}{\partial x_i}$$
[Since the LHS only depends on $\theta$ and $x_i$, this implies that ${\partial T(x_1,\ldots,x_n)}\big/{\partial x_i}$ only depends on $x_i$, hence that $T(x_1,\ldots,x_n)$ is of the form $\sum_i \tilde T(x_i)+C$]
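A concrete instance of the bracketed remark [again my own illustration]: for the $\mathcal N(0,\theta)$ variance family, the sufficient statistic and its partial derivatives are

```latex
T(x_1,\ldots,x_n)=\sum_{i=1}^n x_i^2,
\qquad
\frac{\partial T}{\partial x_i}=2x_i
```

so each $\partial T/\partial x_i$ indeed involves $x_i$ alone, with $\tilde T(x)=x^2$.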
Set $\theta=\theta_0$ and define
$$u(x)=\frac{\partial}{\partial\theta} \log\, p_\theta(x)\Big|_{\theta=\theta_0}$$
Then
$$\frac{\text d u(x)}{\text dx}\Big|_{x=x_i}=
\frac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta}\Big|_{\theta=\theta_0} \cdot \frac{\partial T(x_1,\ldots,x_n)}{\partial x_i}$$
and
$$\sum_{i=1}^n u(x_i)=\frac{\partial\log\,h_\theta(T)}{\partial\theta}\Big|_{\theta=\theta_0}=f(T)$$
Note that, by the third assumption,
$$\frac{\text d u(x)}{\text dx}\ne 0\quad\text{and hence}\quad
\frac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta}\Big|_{\theta=\theta_0}\ne 0$$
Then
$$\dfrac{\dfrac{\partial^2\log\,p_\theta(x)}{\partial x\partial\theta}\Big|_{x=x_i}}{\dfrac{\text d u(x)}{\text dx}\Big|_{x=x_i}}=\dfrac{\dfrac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta}}{\dfrac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta}\Big|_{\theta=\theta_0}}\tag{1}$$
We claim that the RHS of (1) is a function of $\theta$ only.
Indeed, fix $\theta$ and note that the LHS of (1) depends on the sample only through $x_i$. Take two samples $(x_1,\ldots,x_n)\ne(y_1,\ldots,y_n)$. With $i=1$, the samples $(x_1,y_2,\ldots,y_n)$ and $(x_1,\ldots,x_n)$ share the same first coordinate, hence give the same LHS and therefore the same RHS. With $i=2$, the samples $(x_1,y_2,\ldots,y_n)$ and $(y_1,\ldots,y_n)$ share the same second coordinate and again give the same RHS. Thus the RHS takes the same value at $(x_1,\ldots,x_n)$ and at $(y_1,\ldots,y_n)$: it is independent of the sample, i.e., a function of $\theta$ alone. Then
\begin{align}\dfrac{\partial^2\log\,h_\theta(T)}{\partial T\partial\theta}&=\nu(\theta)f^\prime(T)\\
\log\,h_\theta(T)&=\nu^\star(\theta)f(T)+\gamma(\theta)+\rho(T)
\end{align}
where $\nu(\theta)$ denotes the RHS of (1), $f'(T)=\partial^2\log h_\theta(T)\big/\partial T\partial\theta\big|_{\theta=\theta_0}$, $\nu^\star$ is a primitive of $\nu$, and the $T$-integration term $\rho(T)$, a function of the data alone, is absorbed into $g(x_1,\ldots,x_n)$ [the notes write a constant $C$ in place of $\rho(T)$].
Thus
$$p_\theta(x_1)\cdots p_\theta(x_n) = C^\star(\theta)\,\exp\left\{\nu^\star(\theta)\sum_{i=1}^n u(x_i)\right\}\cdot g(x_1,\ldots,x_n)$$
and
\begin{align}\log\,g(x_1,\ldots,x_n) &= \sum_{i=1}^n \log\,p_\theta(x_i)-\log\,C^\star(\theta)- \nu^\star(\theta)\sum_{i=1}^nu(x_i)\\
&:= \sum_{i=1}^n \log \tilde h(x_i)\end{align}
[the middle expression is free of $\theta$ and, apart from the constant $\log C^\star(\theta)$, a sum of functions of the separate $x_i$'s, whence the factorised form.]
Hence
$$p_\theta(x_1)\cdots p_\theta(x_n) = C^\star(\theta)\,\exp\left\{\nu^\star(\theta)\sum_{i=1}^n u(x_i)\right\}\cdot \tilde h(x_1)\cdots
\tilde h(x_n)$$
leading to
$$p_\theta(x)= C(\theta)\,\exp\left\{\nu^\star(\theta)u(x)\right\}\cdot \tilde h(x)$$
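The whole mechanism can be checked symbolically on a concrete family [my own sketch, not in the notes, assuming sympy is available]: for $X\sim\mathcal N(0,\theta)$ with $\theta$ the variance, the ratio in (1) should be free of $x$, and $u(\cdot)$ should be affine in the sufficient statistic $x^2$:

```python
import sympy as sp

# Concrete check of the proof for X ~ N(0, theta), theta being the variance.
x = sp.symbols('x', real=True)
th, th0 = sp.symbols('theta theta0', positive=True)

logp = -x**2 / (2*th) - sp.log(2*sp.pi*th) / 2   # log p_theta(x)

# Cross-derivative appearing on the LHS of (1), and its value at theta = theta0
cross = sp.diff(logp, x, th)
ratio = sp.simplify(cross / cross.subs(th, th0))
print(ratio)   # a function of theta and theta0 only: x has cancelled

# u(x) = d/dtheta log p_theta(x) at theta0: affine in T*(x) = x^2
u = sp.expand(sp.diff(logp, th).subs(th, th0))
print(u)
```

Here `cross` equals $x/\theta^2$, so the ratio simplifies to $\theta_0^2/\theta^2$, matching the claim that the RHS of (1) carries no dependence on the sample.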