Bayes' Theorem is useful for determining something like ``how likely is
XYZ to have disease A if they pass test B?'' because it lets us convert
conditionals in the other direction (e.g. from P(test | disease) to
P(disease | test)).
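A quick numerical sketch of the disease/test idea (the prevalence and test
accuracy numbers below are made up purely for illustration):
    # Hypothetical numbers: 1% prevalence, 95% sensitivity, 90% specificity.
    p_disease = 0.01
    p_pos_given_disease = 0.95      # P(test positive | disease)
    p_pos_given_healthy = 0.10      # P(test positive | no disease)

    # Law of total probability for P(test positive).
    p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

    # Bayes' Theorem: P(disease | positive) = P(positive | disease) P(disease) / P(positive)
    print(p_pos_given_disease * p_disease / p_pos)   # ~0.088, still under 10% despite a "good" test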
Independent Random Events
(C, \bb B, P) is a probability space
With A, B \in \bb B and A, B \subseteq C, they are independent iff
P(A\cap B) = P(A)P(B).
A group of events A_1, ..., A_n in \bb B is
(1) pairwise independent iff P(A_i \cap A_j) = P(A_i)P(A_j) (i \neq j).
(2) triplewise independent iff P(A_i \cap A_j \cap A_k) =
P(A_i)P(A_j)P(A_k) (i \neq j \neq k \neq i).
(3) mutually independent iff for every subset S of {A_1, ..., A_n},
P(intersection over S) = product of P(A) over all A in S.
3 implies 1 and 2; the reverse implications fail, and 1 and 2 do not imply
each other (see the enumeration sketch below: pairwise independent but not
mutually independent).
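A small enumeration sketch of the standard counterexample: two fair coin
flips, with events ``first is heads'', ``second is heads'', and ``the two
flips agree''. The event names A, B, E are just for this example.
    from itertools import product
    from fractions import Fraction

    # Sample space: the four outcomes of two fair coin flips, each with probability 1/4.
    outcomes = list(product("HT", repeat=2))
    prob = {w: Fraction(1, 4) for w in outcomes}

    def P(event):
        return sum(prob[w] for w in event)

    A = {w for w in outcomes if w[0] == "H"}      # first flip is heads
    B = {w for w in outcomes if w[1] == "H"}      # second flip is heads
    E = {w for w in outcomes if w[0] == w[1]}     # the two flips agree

    # Pairwise independence holds for every pair:
    print(P(A & B) == P(A) * P(B),
          P(A & E) == P(A) * P(E),
          P(B & E) == P(B) * P(E))                # True True True
    # Mutual independence fails: P(A ∩ B ∩ E) = 1/4 but P(A)P(B)P(E) = 1/8.
    print(P(A & B & E) == P(A) * P(B) * P(E))     # False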
Independence of A and B can also be defined equivalently as:
P(A | B) = P(A) (assuming P(B) > 0).
A, B are conditionally independent given C if P(A \cap B | C) = P(A|C)P(B|C).
Random Variables
[Informally: a random variable assigns a value to each outcome w in C.]
X = X(w) : C \to D where D is the range (space) of X.
The preimage X^{-1}(A) = {w \in C : X(w) \in A} always exists, even when X
itself has no inverse function; that is what the next line uses.
P_X(A) = P({all w : X(w) in A})
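A tiny concrete sketch of this induced probability (sample space: two fair
coin flips; X counts heads), computing P_X by pulling events back to C:
    from itertools import product
    from fractions import Fraction

    # Sample space: two fair coin flips, each outcome with probability 1/4.
    P = {w: Fraction(1, 4) for w in product("HT", repeat=2)}

    def X(w):
        # X counts the number of heads, so D = {0, 1, 2}.
        return w.count("H")

    def P_X(A):
        # P_X(A) = P({w : X(w) in A})
        return sum(p for w, p in P.items() if X(w) in A)

    print(P_X({1}))          # 1/2
    print(P_X({0, 1, 2}))    # 1, i.e. P_X(D) = 1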
Key Properties
1) P_X(A) is a probability set function on D.
2) P_X(A) \geq 0
3) P_X(D) = 1
4) P_X(\emptyset) = 0
5) P_X(A) = 1 - P_X(D \setminus A)
6,7) monotonicity, sigma-additivity.
A discrete r.v. has a countable range D.
Ex: Binomial r.v.
X ~ Binomial(n, p)
n in N, p in (0,1)
D = {0, 1, ... n}
P(X = x) = (n choose x)p^x(1-p)^{n-x}
X ~ Poisson(\lambda)
D = {0, 1, 2, ...}.
P(X = x) = \lambda^x e^{-\lambda}/x!
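A quick sanity-check sketch of both pmfs, verifying each sums to 1 (the
Poisson sum is truncated since its range is infinite; n, p, lambda are
arbitrary):
    from math import comb, exp, factorial

    def binomial_pmf(x, n, p):
        # P(X = x) = (n choose x) p^x (1-p)^(n-x)
        return comb(n, x) * p**x * (1 - p)**(n - x)

    def poisson_pmf(x, lam):
        # P(X = x) = lam^x e^(-lam) / x!
        return lam**x * exp(-lam) / factorial(x)

    n, p, lam = 10, 0.3, 2.5
    print(sum(binomial_pmf(x, n, p) for x in range(n + 1)))    # 1.0 (up to float error)
    print(sum(poisson_pmf(x, lam) for x in range(200)))        # ~1.0 after truncating the infinite sum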
Probability Mass Function (pmf)
For a r.v. with countable range D,
P_X(x) := P(X = x) (if x \in D, 0 otherwise)
Properties of P_X(x), x \in D:
(Correspond directly to probability set function properties)
1) Typically P_X(x) > 0 for all x \in D (requiring only \geq 0 is also acceptable).
2) The sum of P_X(x) over all x \in D equals 1.
3) {X \in A} is shorthand for {w \in C : X(w) \in A}.
r.v. of continuous type
Ex: Let X uniformly take values in [0, 1].
P(X \in (a, b]) = b - a for 0 \leq a < b \leq 1.
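A Monte Carlo sketch of the uniform case (interval endpoints and sample
size chosen arbitrarily): the empirical frequency of landing in (a, b]
should come out close to b - a.
    import random

    random.seed(0)
    a, b, n = 0.2, 0.7, 100_000
    hits = sum(a < random.random() <= b for _ in range(n))
    print(hits / n)     # should be close to b - a = 0.5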
Cumulative Distribution Function (cdf)
Defined for discrete and continuous type r.v.
F_X(x) := P(X \leq x).
F_X : R -> [0,1] [couldn't it be from any ordered domain?]
1) 0 \leq F_X \leq 1
2) non-decreasing
3) right-continuous
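A sketch of F_X for the Binomial(n, p) example above, built by summing the
pmf; it illustrates properties 1-3 (values in [0, 1], non-decreasing, and
right-continuous, since it is a step function jumping at each x in D):
    from math import comb

    def binomial_cdf(t, n, p):
        # F_X(t) = P(X <= t): sum the pmf over all x in D with x <= t.
        return sum(comb(n, x) * p**x * (1 - p)**(n - x)
                   for x in range(n + 1) if x <= t)

    n, p = 10, 0.3
    F = [binomial_cdf(t, n, p) for t in range(-1, n + 1)]
    print(F[0], F[-1])                                  # 0.0 and 1.0
    print(all(u <= v for u, v in zip(F, F[1:])))        # non-decreasing: True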