## Probability

For pairwise disjoint events $A_1, A_2, \ldots$, countable additivity states

$$\mathbb{P}\left(\bigcup_{i=1}^{\infty} A_{i}\right)=\sum_{i=1}^{\infty} \mathbb{P}\left(A_{i}\right)$$

1. $\mathbb{P}(\varnothing) = 0$, since $\mathbb{P}(\Omega) + \mathbb{P}(\varnothing) = \mathbb{P}(\Omega \cup \varnothing) = \mathbb{P}(\Omega) \Rightarrow \mathbb{P}(\varnothing) = 0$
2. $A \subset B \Rightarrow \mathbb{P}(A) \leq \mathbb{P}(B)$, since $\mathbb{P}(A) + \mathbb{P}(B - A) = \mathbb{P}(B)$ and $\mathbb{P}(B - A) \geq 0 \Rightarrow \mathbb{P}(A) \leq \mathbb{P}(B)$
3. $0 \leq \mathbb{P}(A) \leq 1$
4. $\mathbb{P}\left(A^c\right) = 1 - \mathbb{P}(A)$
5. $A \cap B = \varnothing \Rightarrow \mathbb{P}(A \cup B) = \mathbb{P}(A) + \mathbb{P}(B)$ — finite additivity for disjoint events, a special case of countable additivity.
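The properties above can be verified mechanically on a small finite space. A minimal sketch, assuming a fair six-sided die as the sample space (the events $A$, $B$, $C$ below are arbitrary choices for illustration):

```python
from fractions import Fraction

# Assumed sample space: one fair die, outcomes equally likely.
omega = set(range(1, 7))

def prob(event):
    # Classical probability on a finite, equally likely sample space.
    return Fraction(len(event), len(omega))

A = {2, 4, 6}        # even outcomes
B = {2, 3, 4, 5, 6}  # a superset of A
C = {1, 3, 5}        # disjoint from A

assert prob(set()) == 0                  # property 1: P(empty set) = 0
assert A <= B and prob(A) <= prob(B)     # property 2: monotonicity
assert prob(omega - A) == 1 - prob(A)    # property 4: complement rule
assert A & C == set()
assert prob(A | C) == prob(A) + prob(C)  # property 5: additivity for disjoint events
```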

## Probability on Finite Sample Spaces

When all outcomes in $\Omega$ are equally likely,

$$\mathbb{P}(A) = \dfrac{|A|}{|\Omega|}$$
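This counting formula is easy to check by enumeration. A sketch, assuming two fair dice as the sample space and "the dice sum to 7" as the event:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered outcomes of rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))

# Event A: the two dice sum to 7.
A = [w for w in omega if sum(w) == 7]

# Classical probability: P(A) = |A| / |Omega| = 6/36 = 1/6.
p = Fraction(len(A), len(omega))
print(p)  # 1/6
```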

## Conditional Probability

$\mathbb{P}(A \mid B) = \dfrac{\mathbb{P}(AB)}{\mathbb{P}(B)}$, defined when $\mathbb{P}(B) > 0$.

$\mathbb{P}(A B)=\mathbb{P}(A \mid B) \mathbb{P}(B)=\mathbb{P}(B \mid A) \mathbb{P}(A)$
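Both factorizations of $\mathbb{P}(AB)$ can be confirmed on a concrete space. A sketch, again assuming two fair dice (the specific events are illustrative choices):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))

def prob(event):
    # Classical probability on the two-dice sample space.
    return Fraction(len(event), len(omega))

A = {w for w in omega if sum(w) == 7}  # dice sum to 7
B = {w for w in omega if w[0] == 3}    # first die shows 3

# Definition: P(A|B) = P(AB) / P(B).
p_A_given_B = prob(A & B) / prob(B)

# Multiplication rule: both factorizations recover P(AB).
assert prob(A & B) == p_A_given_B * prob(B)
assert prob(A & B) == (prob(A & B) / prob(A)) * prob(A)
print(p_A_given_B)  # 1/6
```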

## Bayes' Theorem

Law of total probability: if $A_1, \ldots, A_k$ is a partition of $\Omega$, then

$\mathbb{P}(B)=\sum_{i=1}^{k} \mathbb{P}\left(B \mid A_{i}\right) \mathbb{P}\left(A_{i}\right)$

[Important] Bayes' Theorem

$\mathbb{P}\left(A_{i} \mid B\right)=\dfrac{\mathbb{P}\left(B \mid A_{i}\right) \mathbb{P}\left(A_{i}\right)}{\sum_{j} \mathbb{P}\left(B \mid A_{j}\right) \mathbb{P}\left(A_{j}\right)}$

$\mathbb{P}(A)$ — probability of eating at 华莱士

$\mathbb{P}(A^c)$ — probability of not eating at 华莱士

$\mathbb{P}(B \mid A)$ — probability of an upset stomach given eating at 华莱士

$\mathbb{P}(B \mid A^c)$ — probability of an upset stomach given not eating at 华莱士

$$\mathbb{P}(A \mid B) = \dfrac{\mathbb{P}(B \mid A)\, \mathbb{P}(A)}{\mathbb{P}(B)} = \dfrac{\mathbb{P}(B \mid A)\, \mathbb{P}(A)}{\mathbb{P}(B \mid A)\, \mathbb{P}(A) + \mathbb{P}(B \mid A^c)\, \mathbb{P}(A^c)}$$
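A numeric sketch of this two-case Bayes formula; the 10%, 50%, and 5% figures below are made-up assumptions for illustration only, not data from the example:

```python
from fractions import Fraction

# Hypothetical inputs (assumptions for illustration):
p_A = Fraction(1, 10)           # P(A): ate at 华莱士
p_B_given_A = Fraction(1, 2)    # P(B|A): upset stomach given A
p_B_given_Ac = Fraction(1, 20)  # P(B|A^c): upset stomach given not A

p_Ac = 1 - p_A  # complement rule

# Total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c).
p_B = p_B_given_A * p_A + p_B_given_Ac * p_Ac

# Bayes: P(A|B) = P(B|A)P(A) / P(B).
p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)  # 10/19
```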

## Exercises

♞1

Fill in the details in the proof of Theorem 2.8. Also, prove the monotone decreasing case.

Assume $A_1 \subset A_2 \subset \cdots$ is monotone increasing and let $A = \bigcup_{i=1}^{\infty} A_i$. Disjointify:

$B_1 = A_1$

$B_2 = \{\omega \in \Omega : \omega \in A_2 \wedge \omega \not\in A_1\}$

$B_3 = \{\omega \in \Omega : \omega \in A_3 \wedge \omega \not\in A_2 \wedge \omega \not\in A_1 \}$

The $B_i$ are pairwise disjoint with $\bigcup_{i=1}^{n} B_i = A_n$ and $\bigcup_{i=1}^{\infty} B_i = A$, so by finite and countable additivity

$$\lim_{n\to\infty} \mathbb{P}(A_n) = \lim_{n\to\infty} \sum_{i=1}^{n} \mathbb{P}(B_i) = \sum_{i=1}^{+\infty } \mathbb{P}(B_i) = \mathbb{P}(A)$$

Equivalently, since the sequence is increasing ($A_{n-1} \subset A_n$), the disjoint pieces simplify to

$B_1 = A_1$

$B_2 = A_2 - A_1$

$B_n = A_n - A_{n-1}$

For the monotone decreasing case, let $A_1 \supset A_2 \supset \cdots$ and $A = \bigcap_{i=1}^{\infty} A_i$. Then $A_1^c \subset A_2^c \subset \cdots$ is increasing with $\bigcup_{i} A_i^c = A^c$, so by the increasing case

$$\lim_{n\to\infty} \mathbb{P}(A_n) = 1 - \lim_{n\to\infty} \mathbb{P}(A_n^c) = 1 - \mathbb{P}(A^c) = \mathbb{P}(A)$$
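The continuity argument can be checked on a concrete space. A sketch, assuming $\Omega = \{1, 2, 3, \ldots\}$ with $\mathbb{P}(\{k\}) = 2^{-k}$ and the increasing events $A_n = \{1, \ldots, n\}$ (all choices here are assumptions for illustration):

```python
from fractions import Fraction

def p_A(n):
    # P(A_n) as the sum of the disjoint pieces B_i = A_i - A_{i-1} = {i},
    # each with probability 2^{-i}.
    return sum(Fraction(1, 2**k) for k in range(1, n + 1))

# Closed form: P(A_n) = 1 - 2^{-n}, which increases to 1 = P(union of all A_n).
for n in (1, 5, 20):
    assert p_A(n) == 1 - Fraction(1, 2**n)
```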

♞2

Prove the statements in equation (2.1).