This is solely a reference request. I have heard a few versions of the following theorem:
If the joint moment generating function $\mathbb{E}[e^{uX+vY}] = \mathbb{E}[e^{uX}]\mathbb{E}[e^{vY}]$ whenever the expectations are finite, then $X,Y$ are independent.
And there is a similar version for characteristic functions. Could anyone provide a serious reference that proves one or both of these theorems?
- I'm not at all sure that this is true. (Certainly it's true when "if" and "then" are interchanged.) – Michael Hardy, Jan 26, 2013 at 6:18
- If this were true, any two random variables with subexponential tails on both sides would be independent. Please revise this into a plausible statement. – Did, Jan 26, 2013 at 10:57
- In this post a theorem with a more general result was presented: math.stackexchange.com/questions/1802289/… – Carlos H. Mendoza-Cardenas, May 29, 2016 at 13:55
2 Answers
Theorem (Kac's theorem) Let $X,Y$ be $\mathbb{R}^d$-valued random variables. Then the following statements are equivalent.
1. $X,Y$ are independent.
2. $\forall \eta,\xi \in \mathbb{R}^d: \mathbb{E}e^{\imath\, (X,Y) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath\, X \cdot \xi} \cdot \mathbb{E}e^{\imath\, Y \cdot \eta}$
Proof:
- $(1) \Rightarrow (2)$: Straightforward; apply $\mathbb{E}(f(X) \cdot g(Y)) = \mathbb{E}(f(X)) \cdot \mathbb{E}(g(Y))$ with $f(x) = e^{\imath\, x \cdot \xi}$ and $g(y) = e^{\imath\, y \cdot \eta}$.
- $(2) \Rightarrow (1)$: Let $(\tilde{X},\tilde{Y})$ be such that $\tilde{X}$, $\tilde{Y}$ are independent, $\tilde{X} \sim X$, $\tilde{Y} \sim Y$. Then $$\mathbb{E}e^{\imath\, (X,Y) \cdot (\xi,\eta)} \stackrel{(2)}{=} \mathbb{E}e^{\imath\, X \cdot \xi} \cdot \mathbb{E}e^{\imath\, Y \cdot \eta} = \mathbb{E}e^{\imath\, \tilde{X} \cdot \xi} \cdot \mathbb{E}e^{\imath\, \tilde{Y} \cdot \eta} = \mathbb{E}e^{\imath\, (\tilde{X},\tilde{Y}) \cdot (\xi,\eta)},$$ i.e. the characteristic functions of $(X,Y)$ and $(\tilde{X},\tilde{Y})$ coincide. By the uniqueness of the Fourier transform, $(X,Y) \sim (\tilde{X},\tilde{Y})$. Consequently, $X$ and $Y$ are independent.
Remark: It is not important that $X$ and $Y$ are vectors of the same dimension. The same reasoning works if, say, $X$ is an $\mathbb{R}^k$-valued random variable and $Y$ an $\mathbb{R}^d$-valued random variable.
Reference (not for the given proof, but for the result): David Applebaum, B.V. Rajarama Bhat, Johan Kustermans, J. Martin Lindsay, Michael Schuermann, Uwe Franz, *Quantum Independent Increment Processes I: From Classical Probability to Quantum Stochastic Calculus*, Theorem 2.1.
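Not part of the original answer, but the equivalence is easy to probe numerically. Below is a minimal sketch in Python with NumPy (the helper `ecf_gap`, the sample size, and the grid are illustrative choices of mine, not taken from any reference): it compares the empirical joint characteristic function with the product of the empirical marginal characteristic functions, once for an independent pair and once for a dependent pair with the same marginals.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def ecf_gap(x, y, xis, etas):
    """Max gap between the empirical joint CF E[e^{i(xi X + eta Y)}]
    and the product of the empirical marginal CFs, over a grid."""
    gaps = []
    for xi in xis:
        for eta in etas:
            joint = np.mean(np.exp(1j * (xi * x + eta * y)))
            prod = np.mean(np.exp(1j * xi * x)) * np.mean(np.exp(1j * eta * y))
            gaps.append(abs(joint - prod))
    return max(gaps)

grid = np.linspace(-3, 3, 13)

# Independent pair: the gap should be near zero (Monte Carlo error only).
x = rng.standard_normal(n)
y = rng.standard_normal(n)
print("independent:", ecf_gap(x, y, grid, grid))

# Dependent pair with the same marginals: the gap is clearly nonzero.
z = rng.standard_normal(n)
print("dependent:  ", ecf_gap(z, z, grid, grid))
```

For the independent pair the maximal gap is on the order of the Monte Carlo error, while for the dependent pair it stays bounded away from zero, which is exactly the dichotomy in statement (2).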
- Do we need $X,Y \in L^1$? I don't see it being used in the proof. – nomadicmathematician, Jun 13, 2016 at 21:04
- @takecare You are right; it's not needed for the proof. – saz, Jun 14, 2016 at 5:38
- @AnselB If $\mathbb{E}e^{i (X \xi + Y \eta)} = \mathbb{E}e^{i \xi X} \mathbb{E}e^{i \eta Y}$ for all $\xi$, $\eta$, then $X$ and $Y$ are independent; that's exactly what the proof shows. If you prefer, think about it like this: denote by $\mathbb{P}_X$ and $\mathbb{P}_Y$ the distributions of $X$ and $Y$, respectively. Then the characteristic function of the product measure $\mu = \mathbb{P}_X \times \mathbb{P}_Y$ is given by $$\hat{\mu}(\xi,\eta) = \mathbb{E}e^{i \xi X} \mathbb{E}e^{i \eta Y},$$ which is, by assumption, also the characteristic function of $(X,Y)$. – saz, Feb 4, 2017 at 6:29
- @takecare Yes... if $$\mathbb{E}\exp \left( i \sum_{j=1}^n \xi_j X_j \right) = \prod_{j=1}^n \mathbb{E}\exp(i \xi_j X_j)$$ for any $\xi_1,\ldots,\xi_n$, then the random variables are independent. – saz, Jun 4, 2017 at 4:45
- @John I'm using characteristic functions (which is essentially nothing but the Fourier transform... or the inverse Fourier transform, up to a constant, depending on which definition you are using). Note that $\mathbb{R}^d \ni \xi \mapsto \mathbb{E}(e^{i \xi X})$ is the characteristic function, not the moment generating function. – saz, Nov 20, 2019 at 14:02
Building on the answer by saz: if $X$ and $Y$ have a joint density, here is another proof of $(2) \Rightarrow (1)$. By the inverse Fourier transform (assuming the characteristic function is integrable, so that the pointwise inversion formula applies): $$f_\mathbf{X}(\mathbf{x})=\frac{1}{(2\pi)^n}\int_{\mathbb{R}^n}{e^{-j\mathbf{v}'\mathbf{x}}\phi_\mathbf{X}(\mathbf{v})\,d\mathbf{v}},$$
where $\mathbf{x}$ and $\mathbf{v}$ are column vectors; in this case, $\mathbf{x} = [x\ y]'$ and $\mathbf{v} = [v_1\ v_2]'$.
Therefore, using $(2)$ in the form $\phi_{XY}(v_1,v_2) = \phi_X(v_1)\phi_Y(v_2)$, the double integral factors: $$f_{XY}(x,y)=\frac{1}{(2\pi)^2}\iint{e^{-j(v_1x+v_2y)}\phi_{XY}(v_1,v_2)}\,dv_1\,dv_2\\=\frac{1}{2\pi}\int{e^{-jv_1x}\phi_{X}(v_1)}\,dv_1 \cdot \frac{1}{2\pi}\int{e^{-jv_2y}\phi_{Y}(v_2)}\,dv_2\\=f_X(x)f_Y(y).$$
Since factorization of the joint probability density function (pdf) into the product of the marginal pdfs characterizes independence for continuous random variables, this proves the claim. An analogous argument (with sums in place of integrals) works for discrete random variables.
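As a sanity check (my own addition, not part of the answer above): for a standard normal, whose characteristic function is $\phi(v) = e^{-v^2/2}$, the one-dimensional inversion integral can be evaluated numerically and compared against the known density; under $(2)$ the two-dimensional inversion splits into the product of two such one-dimensional integrals, exactly as in the display above. A minimal sketch in Python, using SciPy's `quad` for the quadrature and `norm` for the reference density:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# CF of a standard normal: phi(v) = exp(-v^2 / 2).
phi = lambda v: np.exp(-v**2 / 2)

def f_via_inversion(x):
    """1-D Fourier inversion f(x) = (1/2pi) * int e^{-jvx} phi(v) dv.
    The integrand reduces to cos(vx) phi(v) because phi is real and even."""
    val, _ = quad(lambda v: np.cos(v * x) * phi(v), -np.inf, np.inf)
    return val / (2 * np.pi)

# The inversion integral should reproduce the standard normal pdf.
for x in (0.0, 0.5, 1.0, 2.0):
    print(x, f_via_inversion(x), norm.pdf(x))

# Under (2) the joint CF factors, so the 2-D inversion integral splits into
# the product of two 1-D inversions: f_{XY}(x, y) = f_X(x) * f_Y(y).
x0, y0 = 0.5, 1.0
print(f_via_inversion(x0) * f_via_inversion(y0),
      norm.pdf(x0) * norm.pdf(y0))
```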
- Why do you assume $X$ and $Y$ have a joint density function? – kimchi lover, Oct 13, 2019 at 19:45
- You are right. I didn't think of that. I edited my answer correspondingly. – Shuang Liu, Oct 13, 2019 at 21:04