Diffstat (limited to 'zhilova')
-rw-r--r-- | zhilova/08_jensen |  37 |
-rw-r--r-- | zhilova/hw1.tex   | 133 |
2 files changed, 170 insertions, 0 deletions
diff --git a/zhilova/08_jensen b/zhilova/08_jensen
index 20a8158..efe8bdb 100644
--- a/zhilova/08_jensen
+++ b/zhilova/08_jensen
@@ -21,3 +21,40 @@ f is strictly convex <=> f' is strictly increasing on (a,b)
 (2) If f is twice differentiable on (a,b)
 f is convex <=> f'' \geq 0 on (a,b)
 f is strictly convex <=> f'' > 0 on (a,b)
+
+Transformations of an r.v.
+
+Where X is an r.v. with pdf f_X(x), cdf F_X(x).
+
+Y := g(X). f_Y(y) = ?
+
+(Case 1) g is differentiable and invertible on D_X (range of X).
+  f_Y(y) = f_X(g^{-1}(y)) * |d/dy g^{-1}(y)|
+  Also:
+  if g is monotonically increasing, F_Y(y) = F_X(g^{-1}(y))
+  if g is monotonically decreasing, F_Y(y) = 1 - F_X(g^{-1}(y))
+
+(Case 2) g is piecewise bijective.
+g is bijective on each D_j, where D_X = \cup_{j=1}^k D_j with
+D_i \cap D_j = \emptyset if i =/= j (i.e. D_1, ..., D_k is a partition of D_X).
+
+Then apply Case 1 on each piece and sum:
+  f_Y(y) = \sum_{j=1}^k f_X(g_j^{-1}(y)) * |d/dy g_j^{-1}(y)| * Indicator(y in range of g_j)
+
+F_Y(y) = P(g(X) \leq y)
+= \int_R f_X(x) * Indicator{x : g(x) \leq y} dx
+= \sum_{j=1}^k \int_R f_X(x) * Indicator{x in D_j : g_j(x) \leq y} dx.
+If g_j is monotonically increasing on D_j, the j-th indicator is
+equivalent to x in D_j and x \leq g_j^{-1}(y).
+
+This gives rise to several standard transformations. Ex: scale
+transformation ( g(x) = cx ), location-scale transformation ( g(x) =
+cx+d ).
+
+Def: X has a symmetric distribution (about 0) when f_X(x) = f_X(-x).
+
+If X has a symmetric distribution and E|X| < \infty, then EX = 0:
+
+EX = \int_{-\infty}^\infty x f_X(x) dx
+   = \int_0^\infty x f_X(x) + (-x) f_X(x) dx
+   = 0, by symmetry (f_X(-x) = f_X(x)) and splitting the integral at 0.
diff --git a/zhilova/hw1.tex b/zhilova/hw1.tex
new file mode 100644
index 0000000..d9f5d22
--- /dev/null
+++ b/zhilova/hw1.tex
@@ -0,0 +1,133 @@
+\newfam\rsfs
+\newfam\bbold
+\def\scr#1{{\fam\rsfs #1}}
+\def\bb#1{{\fam\bbold #1}}
+\let\oldcal\cal
+\def\cal#1{{\oldcal #1}}
+\font\rsfsten=rsfs10
+\font\rsfssev=rsfs7
+\font\rsfsfiv=rsfs5
+\textfont\rsfs=\rsfsten
+\scriptfont\rsfs=\rsfssev
+\scriptscriptfont\rsfs=\rsfsfiv
+\font\bbten=msbm10
+\font\bbsev=msbm7
+\font\bbfiv=msbm5
+\textfont\bbold=\bbten
+\scriptfont\bbold=\bbsev
+\scriptscriptfont\bbold=\bbfiv
+
+\def\Pr{\bb P}
+\def\E{\bb E}
+\newcount\qnum
+\def\q{\afterassignment\qq\qnum=}
+\def\qq{\qqq{\number\qnum}}
+\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
+\def\fr#1#2{{#1\over #2}}
+\def\var{\mathop{\rm var}\nolimits}
+
+\q1
+
+Let $A \setminus B := \{x \in A : x\not\in B\}.$
+$(A\setminus B) \cap B = \emptyset$ by definition, so these two sets are
+disjoint. Therefore, if $x \in A$ and $x \in B,$ then $x\not\in
+A\setminus B$ and $x\not\in B\setminus A;$ since $A\setminus B$ and
+$B\setminus A$ also do not intersect each other, the three sets
+$A\setminus B,$ $A\cap B,$ $B\setminus A$ are pairwise disjoint.
+$$\Pr(A\cup B) = \Pr((A\setminus B)\cup(A\cap B)\cup(B\setminus A)) =
+\Pr(A\setminus B) + \Pr(A\cap B) + \Pr(B\setminus A) + \Pr(A\cap B) -
+\Pr(A\cap B)$$$$ = \Pr((A\setminus B)\cup(A\cap B)) + \Pr((B\setminus
+A)\cup(A\cap B)) - \Pr(A\cap B) = \Pr(A) + \Pr(B) - \Pr(A\cap B).$$
+
+\q2
+
+$$\Pr(A\cap B) + \Pr(A\cap B^C) = \Pr(A\cap B) + \Pr(\{x\in A:x\not\in
+B\}) = \Pr((A\cap B) \cup (A\setminus B)) = \Pr(A),$$
+by disjointness and set arithmetic.
+
+\q3
+
+$$\Pr(A_1 \cup (A_2^C \cap A_3^C)) = \Pr(A_1) + \Pr(A_2^C \cap A_3^C) -
+\Pr(A_1\cap A_2^C\cap A_3^C) = 1/6 + \Pr(A_2^C)\Pr(A_3^C) -
+\Pr(A_1)\Pr(A_2^C)\Pr(A_3^C)$$$$ = 1/6 + 25/36 - 25/216 = 161/216.$$
+
+\q4
+
+\noindent{\it (a)}
+
+Pairwise exclusive implies mutually exclusive:
+$$A_1\cap A_2\cap A_3 = A_1\cap (A_2\cap A_3) = A_1\cap\emptyset =
+\emptyset,$$
+and see {\it (b)} for why the given probabilities are therefore
+impossible.
+
+\noindent{\it (b)}
+
+No: if they were, all three sets would be disjoint, giving
+$$\Pr(A_1\cup A_2\cup A_3) = \Pr(A_1) + \Pr(A_2) + \Pr(A_3) = 1/2 + 1/4
++ 1/3 > 1,$$
+violating the requirement that $\Pr(A) \leq 1$ for every event $A.$
+
+\q5
+
+\noindent{\it (a)}
+$$\E(Y) = \E(2 - 3X) = \E(2) - 3\E(X) = 2 - 3\cdot2 = -4.$$
+
+\noindent{\it (b)}
+$$\var Y = \E(Y^2) - \E(Y)^2 = \E((2-3X)^2) - 16 = \E(4 - 12X + 9X^2) -
+16 = 4 - 12\E(X) + 9\E(X^2) - 16 = 18,$$
+by linearity of expectation.
+
+\q6
+
+$$P_X(k) = \left\{\vbox{\halign{$#$\hfil&\hskip3em $#$\hfil\cr{\lambda^k
+e^{-\lambda}\over k!}&k \geq 0\cr 0&{\rm otherwise}\cr}}\right..$$
+
+\noindent{\it (a)}
+
+$$\E(e^{tX}) = \sum_{x=0}^\infty e^{tx}P_X(x)
+= \sum_{x=0}^\infty {e^{tx}\lambda^xe^{-\lambda}\over x!}
+= e^{-\lambda}\sum_{x=0}^\infty {e^{(t+\ln \lambda)x}\over x!}
+= e^{-\lambda}\sum_{x=0}^\infty {(\lambda e^t)^x\over x!}
+= e^{-\lambda}e^{\lambda e^t},
+$$
+by the Taylor series of $e^x$ evaluated at $\lambda e^t.$
+
+\noindent{\it (b)}
+
+The third-order moment $\E(X^3)$ is the third derivative of the mgf
+$\E(e^{tX}) = e^{\lambda(e^t - 1)}$ at $t=0.$ Differentiating three
+times (product rule at each step) gives
+$$\left(\lambda e^t + 3(\lambda e^t)^2 + (\lambda e^t)^3\right)
+e^{\lambda(e^t - 1)},$$
+which at $t=0$ equals $\lambda + 3\lambda^2 + \lambda^3.$
+
+\q7
+
+\noindent{\it (a)}
+
+The pmf is $1/16$ for $X = 0$ and $X=4,$ $4/16$ for $X=1$ and $X=3,$ and
+$6/16$ for $X=2,$ or in other terms, $P_X(k) = {{4\choose k}\over 2^4}.$
+
+\noindent{\it (b)}
+
+This can be immediately computed as $$\Pr(\hbox{$X$ is odd}) = \Pr(X = 1)
++ \Pr(X = 3) = 1/2,$$ by disjointness of those events.
+
+\q8
+
+Chebyshev's inequality gives us
+$$\Pr(|X-\E X| < .1) \geq 1-{\sigma^2\over .1^2} \geq .95.$$
+The largest allowed variance is $\sigma^2 = .05\cdot.1^2 = .0005,$ so
+$\sigma \leq .1\sqrt{.05} \approx .022.$
+
+\q9
+
+\noindent{\it (a)}
+
+On $\{x \geq 0\} = \{y \geq 1\},$
+$$F_X(x) = 1 - e^{-\lambda x}.$$
+$$F_Y(y) = F_X(e^y) = 1 - e^{-\lambda e^y}.$$
+$$f_Y(y) = \lambda e^y e^{-\lambda e^y} = \lambda e^{y-\lambda e^y}.$$
+
+\noindent{\it (b)}
+
+$\lambda < 1$ gives convergence.
+
+\bye
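As a standalone sanity check of the piecewise change-of-variables formula added in 08_jensen above (not part of the commit itself), here is a minimal Python sketch. It assumes numpy and scipy are available; the choice Y = X^2 with X standard normal is an illustrative example, not something taken from the notes.

import numpy as np
from scipy import stats

# Case 2 of the notes: g(x) = x^2 is bijective on D_1 = (-inf, 0) and
# D_2 = [0, inf), with inverse branches -sqrt(y) and +sqrt(y), so the
# summed formula gives f_Y(y) = (f_X(sqrt(y)) + f_X(-sqrt(y))) / (2 sqrt(y)).
y = np.linspace(0.05, 6.0, 200)
f_formula = (stats.norm.pdf(np.sqrt(y)) + stats.norm.pdf(-np.sqrt(y))) / (2.0 * np.sqrt(y))

# For X ~ N(0,1), Y = X^2 is chi-square with 1 degree of freedom, so the
# formula should reproduce that density up to floating-point error.
print(np.max(np.abs(f_formula - stats.chi2.pdf(y, df=1))))

The printed gap is at rounding-error level; a different piecewise-bijective g would only change the inverse-branch terms in f_formula.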
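Similarly, the corrected third moment in hw1.tex question 6(b) can be cross-checked numerically. This is again only a sketch, not part of the commit; lambda = 2.5 and the truncation at k = 200 are arbitrary choices for illustration.

import numpy as np
from scipy import stats

lam = 2.5
k = np.arange(0, 200)                                      # Poisson mass beyond k = 200 is negligible here
third_moment = np.sum(k**3 * stats.poisson.pmf(k, lam))    # direct sum of k^3 * pmf
print(third_moment, lam + 3*lam**2 + lam**3)               # both approximately 36.875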