Stochastic Process 1 -- Preliminaries
Propositions:
- If \(g: \mathbb{R} \rightarrow \mathbb{R}\) is a measurable function and \(X\) is a random variable, then \(Y = g(X)\) is also a random variable.
- If \(g\) is a strictly monotone, differentiable function, let \(g(\mathbb{R})\) be the image of \(\mathbb{R}\) under \(g.\) Suppose that \(X\) has a continuous density \(f_X.\) Then the random variable \(Y = g(X)\) has density \[ f_Y(y) = \frac{f_X(g^{-1}(y))}{|g'(g^{-1}(y))|}\,\mathbb{1}_{g(\mathbb{R})}(y). \] Here \(\mathbb{1}_{g(\mathbb{R})}\) is the indicator function of \(g(\mathbb{R}).\) The formula can be generalized to piecewise strictly monotone functions.
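As a quick illustration (a standard example, not from the original notes): take \(g(x) = e^x\) and \(X\) standard normal. Then \(g(\mathbb{R}) = (0,\infty),\) \(g^{-1}(y) = \ln y,\) and \(|g'(g^{-1}(y))| = e^{\ln y} = y,\) so the formula yields the lognormal density \[ f_Y(y) = \frac{1}{y\sqrt{2\pi}}\, e^{-(\ln y)^2/2}\,\mathbb{1}_{(0,\infty)}(y). \]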
The expectation of a random variable \(X\) is defined as \[ \mathbf{E}X = \int_{\Omega}X(\omega)\mu(d\omega) = \int_{-\infty}^{\infty}x\,P_X(dx). \] Here \(\mu\) is the probability measure and \(P_X\) is the probability distribution of \(X.\) The connection between \(\mu\) and \(P_X\) is given by \[ P_X(\Delta) := \mu(\{\omega: X(\omega) \in \Delta\}). \] The connection between the probability measure and the density function is given by \[ \mu(X \in [x, x+\delta)) \sim f_X(x)\delta \quad \text{as } \delta \downarrow 0. \] The density function is defined as a nonnegative measurable function \(f_X\) such that \[ F_X(x) = \int_{-\infty}^x f_X(y)dy, \] where \(F_X\) is the distribution function.
From the above definitions we can derive a Stieltjes integral expression for the expectation: \(P_X(dx) := \mu(\{\omega: X(\omega) \in dx\}) \sim f_X(x)dx = dF_X(x),\) so that \[ \mathbf{E}X = \int_{-\infty}^{\infty} x\, dF_X(x). \] Similarly, let \(g\) be a measurable function. Then the expectation of \(g(X)\) is given by \[ \mathbf{E}g(X) = \int_{-\infty}^{\infty}g(x)\,dF_X(x). \]
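For instance (a standard check, not in the original notes): if \(X\) is uniform on \([0,1],\) then \(dF_X(x) = dx\) on \([0,1],\) and for \(g(x) = x^2\) the formula gives \[ \mathbf{E}X^2 = \int_0^1 x^2\,dx = \frac{1}{3}. \]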
Some inequalities for random variables:
1. Hölder: \(|\mathbf{E} XY| \le \mathbf{E}^{1/p} |X|^p\, \mathbf{E}^{1/q} |Y|^q,\) where \(1/p+1/q = 1,\ p>1,\ q>1.\)
2. Jensen: \(g(\mathbf{E}X) \le \mathbf{E}g(X)\) for convex \(g.\)
3. Chebyshev: \(P(X \ge \epsilon) \le \frac{\mathbf{E} X}{\epsilon}\) for nonnegative \(X.\)
4. \(P(|Y - \mathbf{E}Y| \ge \delta) \le \frac{Var\, Y}{\delta^2}\) for \(Y\) with finite variance.
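Inequality 4 is a one-line consequence of 3 (a derivation not spelled out in the notes): apply 3 to the nonnegative variable \((Y - \mathbf{E}Y)^2\) with \(\epsilon = \delta^2,\) \[ P(|Y - \mathbf{E}Y| \ge \delta) = P((Y - \mathbf{E}Y)^2 \ge \delta^2) \le \frac{\mathbf{E}(Y - \mathbf{E}Y)^2}{\delta^2} = \frac{Var\, Y}{\delta^2}. \]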
Borel-Cantelli: 1. Let \(A_1, A_2, \ldots\) be a sequence of events, and define the event \[ \limsup_n A_n := \bigcap_{n=1}^{\infty} \bigcup_{m=n}^{\infty} A_m. \] If \(\sum_{n=1}^{\infty} P(A_n) < \infty,\) then \(P(\limsup_n A_n) = 0.\)
2. If \(A_1, A_2, \ldots\) are independent events and \(\sum_{n=1}^{\infty} P(A_n) = \infty,\) then \(P(\limsup_n A_n) = 1.\)
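A standard illustration (not from the original notes): let \(A_n\) be the event that the \(n\)-th toss of a fair coin is heads. The events are independent and \(\sum_n P(A_n) = \sum_n \frac{1}{2} = \infty,\) so by part 2, \(P(\limsup_n A_n) = 1:\) heads occurs infinitely often with probability one.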
Convergence conditions: 1. Convergence in mean: \(\mathbf{E} |X_n - X| \rightarrow 0.\) 2. Convergence in mean square: \(\mathbf{E}|X_n - X|^2 \rightarrow 0.\) 3. Convergence in probability: \(P(|X_n - X| \ge \epsilon) \rightarrow 0\) for every \(\epsilon > 0.\)
An immediate result is that 1 and 2 each imply 3.
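Indeed, applying inequality 3 above to the nonnegative variable \(|X_n - X|\) gives \[ P(|X_n - X| \ge \epsilon) \le \frac{\mathbf{E}|X_n - X|}{\epsilon} \rightarrow 0, \] and similarly \(P(|X_n - X| \ge \epsilon) = P(|X_n - X|^2 \ge \epsilon^2) \le \mathbf{E}|X_n - X|^2/\epsilon^2 \rightarrow 0.\)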
The characteristic function of random variable \(X\) is \[ \varphi(\alpha) := \int_{-\infty}^{\infty} e^{i\alpha x}dF_X(x) = \mathbf{E}e^{i\alpha X}. \]
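For example (a standard computation, not in the original notes): if \(X\) is standard normal, then \[ \varphi(\alpha) = \int_{-\infty}^{\infty} e^{i\alpha x}\, \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\, dx = e^{-\alpha^2/2}. \]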
The covariance of two random variables \(X, Y\) is defined as \[ Cov(X,Y) := \mathbf{E}[(X - \mathbf{E}X)(Y - \mathbf{E}Y)] = \mathbf{E}XY - \mathbf{E}X\mathbf{E}Y. \]
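The second equality is just linearity of expectation (a step worth writing out once): \[ \mathbf{E}[(X - \mathbf{E}X)(Y - \mathbf{E}Y)] = \mathbf{E}XY - \mathbf{E}X\,\mathbf{E}Y - \mathbf{E}X\,\mathbf{E}Y + \mathbf{E}X\,\mathbf{E}Y = \mathbf{E}XY - \mathbf{E}X\,\mathbf{E}Y. \]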
A necessary and sufficient condition for \(X_1, X_2, \ldots, X_n\) to be independent is the following factorization of the joint distribution function: \[ F_{(X_1,\ldots,X_n)}(x_1,\ldots,x_n) = \prod_{k=1}^{n} F_{X_k}(x_k). \] Here \(F_{X_k}(x_k) = P(X_k < x_k),\ k = 1,2,\ldots,n,\) are the marginal distribution functions.
Conditional expectation:
The conditional expectation of a random variable given an event \(B\) is defined for \(P(B)>0\) by the formula: \[ \mathbf{E}\{X\mid B\} := \int_{\Omega} X(\omega)P(d\omega \mid B) = \frac{\mathbf{E} \{X \mathbb{1}_B\}}{P(B)}. \]
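For a concrete example (not from the original notes): let \(X\) be the outcome of a fair die roll and \(B = \{X \text{ is even}\}.\) Then \(P(B) = \frac{1}{2}\) and \(\mathbf{E}\{X\mathbb{1}_B\} = \frac{2+4+6}{6} = 2,\) so \(\mathbf{E}\{X \mid B\} = 2/\frac{1}{2} = 4.\)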
The conditional expectation of a random variable given a \(\sigma\)-algebra \(\mathcal{Q}\) is defined as follows. If \(\mathcal{Q}\) is the \(\sigma\)-algebra generated by a finite number of disjoint sets \(B_k,\ k=1,\ldots,m,\) then for \(\omega \in B_k,\) \[ \mathbf{E}\{X \mid \mathcal{Q}\}(\omega) := \mathbf{E}\{X \mid B_k\} = \frac{\mathbf{E}\{X \mathbb{1}_{B_k}\}}{P(B_k)}. \] This motivates the general definition:
Let \(X\) be a random variable with finite expectation. The conditional expectation \(\mathbf{E}\{X \mid \mathcal{Q}\}\) of \(X\) given a \(\sigma\)-algebra \(\mathcal{Q} \subset \mathcal{F}\) is the \(\mathcal{Q}\)-measurable random variable such that \[ \int_B \mathbf{E}\{X \mid \mathcal{Q}\}\, dP = \int_B X\, dP \] for every \(B \in \mathcal{Q}.\)
Furthermore, the conditional probability \(P(A\mid \mathcal{Q})\) is defined by \(P(A\mid \mathcal{Q}) := \mathbf{E}\{\mathbb{1}_A \mid \mathcal{Q}\}.\)
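Continuing the die example (again illustrative, not from the original notes): if \(\mathcal{Q}\) is generated by the partition \(B_1 = \{X \text{ even}\},\ B_2 = \{X \text{ odd}\},\) then \(\mathbf{E}\{X \mid \mathcal{Q}\}\) is the random variable equal to \(4\) on \(B_1\) and \(3\) on \(B_2.\) One checks the defining property directly: \[ \int_{B_1} \mathbf{E}\{X \mid \mathcal{Q}\}\, dP = 4 \cdot \frac{1}{2} = 2 = \frac{2+4+6}{6} = \int_{B_1} X\, dP. \]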