author    Holden Rohrer <hr@hrhr.dev>  2021-09-21 17:12:46 -0400
committer Holden Rohrer <hr@hrhr.dev>  2021-09-21 17:12:46 -0400
commit    32f4af5f369fa9f0b2988ecad7797f4bec3661c3 (patch)
tree      7ce1c56011914681d6e2ffb5737dcdf1078d3930 /zhilova/04_events
parent    b8433c9909bc5d29df16fd3011251a0a214d2b1a (diff)
notes and homework
Diffstat (limited to 'zhilova/04_events')
-rw-r--r--  zhilova/04_events  89
1 file changed, 89 insertions, 0 deletions
diff --git a/zhilova/04_events b/zhilova/04_events
new file mode 100644
index 0000000..551d4cc
--- /dev/null
+++ b/zhilova/04_events
@@ -0,0 +1,89 @@
+Bayes' Theorem is useful for answering questions like ``how likely is
+XYZ to have disease A if they test positive on test B?'' because it
+lets us convert conditionals in the other direction (e.g. test given
+disease).
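+
+A quick numeric sketch of that conversion in Python (the sensitivity,
+false positive rate, and prevalence below are made-up values):
+
+    # Bayes: P(disease | positive) from P(positive | disease)
+    p_pos_given_dis = 0.99   # assumed sensitivity of the test
+    p_pos_given_no  = 0.05   # assumed false positive rate
+    p_dis = 0.001            # assumed prevalence of the disease
+    p_pos = p_pos_given_dis*p_dis + p_pos_given_no*(1 - p_dis)
+    p_dis_given_pos = p_pos_given_dis*p_dis / p_pos
+    print(p_dis_given_pos)   # about 0.019: a positive test alone is weak evidence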
+
+ Independent Random Events
+(C, \bb B, P) is a probability space
+For A, B \in \bb B (so A, B \subseteq C), A and B are independent iff
+P(A \cap B) = P(A)P(B).
+
+A group of events A_1, ..., A_n in \bb B is
+
+(1) pairwise independent iff P(A_i \cap A_j) = P(A_i)P(A_j) (i \neq j).
+
+(2) triplewise independent iff P(A_i \cap A_j \cap A_k) =
+P(A_i)P(A_j)P(A_k) (i \neq j \neq k \neq i).
+
+(3) mutually independent iff for every subset S of {A_1, ..., A_n},
+P(intersection of the events in S) = product of P(A_i) over A_i in S.
+
+(3) implies (2) and (1), but (2) doesn't imply (1), and (1) doesn't
+imply (3): pairwise independence is strictly weaker than mutual
+independence (see the sketch below).
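+
+A quick Python sketch of the classic two-fair-coin-flips example,
+showing pairwise but not mutual independence (the event names here are
+ours, not from class):
+
+    from itertools import product
+    outcomes = list(product('HT', repeat=2))   # four equally likely outcomes
+    def P(event):
+        return len(event) / len(outcomes)
+    A = {w for w in outcomes if w[0] == 'H'}   # first flip heads
+    B = {w for w in outcomes if w[1] == 'H'}   # second flip heads
+    E = {w for w in outcomes if w[0] == w[1]}  # flips agree (E, to avoid the sample space's C)
+    print(P(A & B) == P(A)*P(B),               # pairwise: all three True
+          P(A & E) == P(A)*P(E),
+          P(B & E) == P(B)*P(E))
+    print(P(A & B & E), P(A)*P(B)*P(E))        # 0.25 vs 0.125: not mutually independent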
+
+Independence can also be defined equivalently (assuming P(B) > 0) as:
+P(A | B) = P(A)
+
+A, B are conditionally independent given an event E (with P(E) > 0) if
+P(A \cap B | E) = P(A|E)P(B|E).
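+
+A small sketch of why conditional independence differs from plain
+independence (assumed setup, not from class: pick one of two biased
+coins at random, then flip it twice):
+
+    coins = [0.9, 0.1]           # heads probabilities of the two coins
+    def p_two_flips(h1, h2, p):  # P(flip outcomes | coin); factors by construction
+        return (p if h1 else 1 - p) * (p if h2 else 1 - p)
+    # Conditionally independent given the coin, but not unconditionally:
+    # P(HH) != P(H)^2.
+    p_hh = sum(0.5 * p_two_flips(True, True, p) for p in coins)
+    p_h = sum(0.5 * (p_two_flips(True, True, p) + p_two_flips(True, False, p))
+              for p in coins)
+    print(p_hh, p_h**2)          # 0.41 vs 0.25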
+
+ Random Variables
+
+[What lol]
+
+X = X(w) : C \to D, where D is the range of X.
+
+X needn't have an inverse function, but the inverse image
+X^{-1}(A) = {w \in C : X(w) \in A} always makes sense as a set, and
+that is what the induced probability below uses.
+
+P_X(A) = P({all w : X(w) in A})
+
+Key Properties
+
+1) P_X(A) is a probability set function on D.
+2) P_X(A) \geq 0
+3) P_X(D) = 1
+4) P_X(\emptyset) = 0
+5) P_X(A) = 1 - P_X(D \setminus A)
+6,7) monotonicity, sigma-additivity.
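+
+A tiny Python sketch of the induced distribution P_X (the sample space
+and X below are made up: count heads in two fair coin flips):
+
+    from fractions import Fraction
+    sample = ['HH', 'HT', 'TH', 'TT']          # equally likely outcomes
+    X = {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}   # X(w) = number of heads
+    def P_X(A):                                # P_X(A) = P({w : X(w) in A})
+        return Fraction(sum(1 for w in sample if X[w] in A), len(sample))
+    D = {0, 1, 2}
+    print(P_X(D), P_X(set()))                  # 1 and 0 (properties 3, 4)
+    print(P_X({2}), 1 - P_X(D - {2}))          # both 1/4 (property 5)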
+
+ Discrete r.v.s have a countable range D.
+Ex: Binomial r.v.
+
+X ~ Binomial(n, p)
+n in N, p in (0,1)
+
+D = {0, 1, ... n}
+
+P(X = x) = (n choose x)p^x(1-p)^{n-x}
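+
+A quick pmf check in Python (assumed n = 5, p = 0.3):
+
+    from math import comb
+    n, p = 5, 0.3
+    def binom_pmf(x):            # (n choose x) p^x (1-p)^(n-x)
+        return comb(n, x) * p**x * (1 - p)**(n - x)
+    print([round(binom_pmf(x), 4) for x in range(n + 1)])
+    print(sum(binom_pmf(x) for x in range(n + 1)))   # ~1.0: the pmf sums to 1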
+
+X ~ Poisson(\lambda)
+
+D = {0, 1, 2, ...}.
+
+P(X = x) = \lambda^x e^{-\lambda}/x!
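+
+Same kind of check for the Poisson pmf (assumed \lambda = 2; the sum is
+truncated at 50 terms, so it is only approximately 1):
+
+    from math import exp, factorial
+    lam = 2.0
+    def pois_pmf(x):             # lambda^x e^(-lambda) / x!
+        return lam**x * exp(-lam) / factorial(x)
+    print(round(pois_pmf(0), 4), round(pois_pmf(2), 4))   # 0.1353, 0.2707
+    print(sum(pois_pmf(x) for x in range(50)))            # ~1.0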
+
+ Probability Mass Function (pmf)
+
+For a r.v. X with countable range D,
+
+P_X(x) := P(X = x) (if x \in D, 0 otherwise)
+
+Properties of P_X(x), x \in D:
+ (Correspond directly to probability set function properties)
+
+1) Typically P_X(x) > 0 for all x \in D; \geq 0 is also acceptable.
+
+2) \sum_{x \in D} P_X(x) = 1.
+
+3) {X \in A} is shorthand for {w \in C : X(w) \in A}, so
+P(X \in A) = \sum_{x \in A \cap D} P_X(x).
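+
+A short illustration of property 3 with the Binomial(5, 0.3) pmf
+sketched above and A = {0, 1}:
+
+    from math import comb
+    n, p = 5, 0.3
+    pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)
+    A = {0, 1}
+    print(sum(pmf(x) for x in A))    # P(X in A) = P(X <= 1), about 0.5282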
+
+ r.v. of continuous type
+Ex: Let X uniformly take values in [0, 1].
+P(X \in (a, b]) = b - a for 0 \leq a < b \leq 1. Note P(X = x) = 0 for
+every individual x, so a pmf can't describe a continuous r.v.
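+
+A rough Monte Carlo sanity check of the uniform case (a = 0.2 and
+b = 0.7 are arbitrary choices of ours):
+
+    import random
+    random.seed(0)
+    a, b = 0.2, 0.7
+    trials = 100_000
+    hits = sum(1 for _ in range(trials) if a < random.random() <= b)
+    print(hits / trials)         # close to b - a = 0.5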
+
+ Cumulative Distribution Function (cdf)
+Defined for both discrete- and continuous-type r.v.s.
+
+F_X(x) := P(X \leq x).
+
+F_X : R -> [0,1] [couldn't it be from any ordered domain?]
+1) 0 \leq F_X \leq 1
+2) non-decreasing
+3) right-continuous
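+
+A closing sketch: the cdf of the Binomial(5, 0.3) example above. The
+printed values show the step shape (non-decreasing, flat between the
+integers, reaching 1 by x = 5):
+
+    from math import comb, floor
+    n, p = 5, 0.3
+    pmf = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)
+    def F(x):                    # F_X(x) = P(X <= x)
+        return sum(pmf(k) for k in range(floor(x) + 1)) if x >= 0 else 0.0
+    for x in [-1, 0, 0.5, 1, 2.7, 5]:
+        print(x, round(F(x), 4))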