-rw-r--r--   li/hw4.tex        | 140
-rw-r--r--   zhilova/08_jensen |  37
-rw-r--r--   zhilova/hw1.tex   | 133
3 files changed, 310 insertions, 0 deletions
diff --git a/li/hw4.tex b/li/hw4.tex
new file mode 100644
index 0000000..558f9cb
--- /dev/null
+++ b/li/hw4.tex
@@ -0,0 +1,140 @@
+\def\bmatrix#1{\left[\matrix{#1}\right]}
+
+ {\noindent\bf Section 2.4}
+
+{\noindent\bf 6.}
+
+{\it (a)}
+
+It has a two-sided inverse if $r = m = n.$
+
+{\it (b)}
+
+It has infinitely many solutions if $r = m < n.$
+
+{\noindent\bf 12.}
+
+{\it (a)}
+
+Matrix $A$ is of rank 1 and equal to
+$$\bmatrix{1\cr 0\cr 2}\bmatrix{1&0&0&3}$$
+
+{\it (b)}
+
+Matrix $A$ is of rank 1 and equal to
+$$\bmatrix{2\cr 6}\bmatrix{1&-1}$$
+
+{\noindent\bf 18.}
+
+The row space has basis:
+$$\{\bmatrix{0\cr1\cr2\cr3\cr4}, \bmatrix{0\cr0\cr0\cr1\cr2}\},$$
+the null space has basis:
+$$\{\bmatrix{1\cr0\cr0\cr0\cr0}, \bmatrix{0\cr-2\cr1\cr0\cr0},
+\bmatrix{0\cr2\cr0\cr-2\cr1}\},$$
+the column space has basis:
+$$\{\bmatrix{1\cr1\cr0},\bmatrix{3\cr4\cr1}\},$$
+and the left null space has basis:
+$$\{\bmatrix{1\cr-1\cr1}\}.$$
+
+{\noindent\bf 32.}
+
+$A$ has the xy-plane as its column space and the z-axis as its left null
+space; its row space is the yz-plane and its null space is the x-axis.
+
+$I+A$ has full rank, so its column space and row space are ${\bf R}^3,$
+and its null space and left null space are the zero subspace $\{0\}.$
+
+\iffalse % practice problems
+
+{\noindent\bf 2.}
+
+{\noindent\bf 3.}
+
+{\noindent\bf 8.}
+
+{\noindent\bf 9.}
+
+{\noindent\bf 10.}
+
+{\noindent\bf 16.}
+
+{\noindent\bf 17.}
+
+{\noindent\bf 21.}
+
+{\noindent\bf 25.}
+
+{\noindent\bf 27.}
+
+{\noindent\bf 35.}
+
+{\noindent\bf 37.}
+
+\fi
+
+ {\noindent\bf Section 2.6}
+
+{\noindent\bf 16.}
+
+$$\bmatrix{0&1&0&0\cr 0&0&1&0\cr 0&0&0&1\cr 1&0&0&0}$$
+If $A$ maps $(x_1, x_2, x_3, x_4)$ to $(x_2, x_3, x_4, x_1),$ then $A^2$
+maps $x$ to $(x_3, x_4, x_1, x_2)$ and $A^3$ maps $x$ to $(x_4, x_1,
+x_2, x_3).$ Since $AA^3 = A^4 = I,$ we get $A^3 = A^{-1}$ by the
+definition of the inverse.
+
+{\noindent\bf 28.}
+
+{\it (a)}
+
+Range is $V^2,$ and kernel is $0.$
+
+{\it (b)}
+
+Range is $V^2,$ and kernel has basis $(0, 0, 1).$
+
+{\it (c)}
+
+Range is $0,$ and kernel is $V^2.$
+
+{\it (d)}
+
+Range is the subspace with basis $(1, 1)$ and kernel has basis $(0, 1).$
+
+{\noindent\bf 36.}
+
+{\it (a)}
+
+$$\bmatrix{2&5\cr 1&3}$$
+
+{\it (b)}
+
+$$\bmatrix{3&-5\cr -1&2}$$
+
+{\it (c)}
+
+Because, by linearity, if $(2, 6) \mapsto (1, 0),$ then $.5(2,6) =
+(1, 3) \mapsto (.5, 0).$
+
+{\noindent\bf 44.}
+
+This is equivalent to a 180$^\circ$ rotation.
+
+\iffalse % practice problems
+
+{\noindent\bf 6.}
+
+{\noindent\bf 7.}
+
+{\noindent\bf 8.}
+
+{\noindent\bf 9.}
+
+{\noindent\bf 17.}
+
+{\noindent\bf 40.}
+
+{\noindent\bf 45.}
+
+\fi
+
+\bye
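As a quick consistency check for 2.4.18: the null space is the orthogonal
complement of the row space, and the left null space is the orthogonal
complement of the column space, so each basis vector above must be
orthogonal to the basis of the complementary space. With the bases listed,
$$(0,1,2,3,4)\cdot(0,2,0,-2,1) = 2 - 6 + 4 = 0, \qquad
(0,0,0,1,2)\cdot(0,2,0,-2,1) = -2 + 2 = 0,$$
$$(1,-1,1)\cdot(1,1,0) = 1 - 1 = 0, \qquad
(1,-1,1)\cdot(3,4,1) = 3 - 4 + 1 = 0.$$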
diff --git a/zhilova/08_jensen b/zhilova/08_jensen
index 20a8158..efe8bdb 100644
--- a/zhilova/08_jensen
+++ b/zhilova/08_jensen
@@ -21,3 +21,40 @@ f is strictly convex <=> f' is strictly increasing on (a,b)
 (2) If f is twice differentiable on (a,b)
 f is convex <=> f'' \geq 0 on (a,b)
 f is strictly convex <=> f'' > 0 on (a,b)
+
+Transformations of an r.v.
+
+Let X be an r.v. with pdf f_X(x) and cdf F_X(x).
+
+Y := g(X). f_Y(y) = ?
+
+(Case 1) g is differentiable and invertible on D_X (the range of X).
+  f_Y(y) = f_X(g^{-1}(y)) * |d/dy g^{-1}(y)|
+  Also:
+  if g is monotonically increasing, F_Y(y) = F_X(g^{-1}(y))
+  if g is monotonically decreasing, F_Y(y) = 1 - F_X(g^{-1}(y))
+
+(Case 2) g is piecewise bijective.
+g is bijective on each D_j, where D_X = \cup_{j=1}^k D_j with
+D_i \cap D_j = \emptyset if i =/= j (i.e. D_1, ..., D_k is a partition of D_X).
+
+Then apply Case 1 piecewise and sum over the pieces:
+  f_Y(y) = \sum_{j=1}^k f_X(g_j^{-1}(y)) * |d/dy g_j^{-1}(y)| * Indicator(y in range of g_j)
+
+F_Y(y) = P(g(X) \leq y) = \sum_{j=1}^k P(g_j(X) \leq y, X \in D_j)
+= \int_R f_X(x) * Indicator{x : g(x) \leq y} dx
+= \sum_{j=1}^k \int_{D_j} f_X(x) * Indicator{x : g_j(x) \leq y} dx
+If g_j is monotonically increasing, the j-th indicator is equivalent to
+x \leq g_j^{-1}(y).
+
+This gives rise to several common transformations, e.g. the scale
+transformation ( g(x) = cx ) and the scale-location transformation
+( g(x) = cx+d ).
+
+Def: A distribution is symmetric (about 0) when f_X(x) = f_X(-x).
+
+If X has a symmetric distribution and E|X| < \infty, then EX = 0:
+
+EX = \int_{-\infty}^\infty x f_X(x) dx
+   = \int_0^\infty ( x f_X(x) + (-x) f_X(-x) ) dx = \int_0^\infty ( x f_X(x) - x f_X(x) ) dx
+   = 0, using f_X(-x) = f_X(x).
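A short worked example of Case 2, using only the formulas in the notes:
take $g(x) = x^2$ with $D_1 = (-\infty, 0)$ and $D_2 = [0, \infty),$ so
that $g_1^{-1}(y) = -\sqrt{y},$ $g_2^{-1}(y) = \sqrt{y},$ and
$|{d\over dy}\,g_j^{-1}(y)| = 1/(2\sqrt{y})$ for $y > 0.$ The sum formula gives
$$f_Y(y) = {f_X(-\sqrt{y}) + f_X(\sqrt{y}) \over 2\sqrt{y}}, \qquad y > 0,$$
which reduces to $f_X(\sqrt{y})/\sqrt{y}$ when the distribution of $X$ is
symmetric in the sense defined above; for $X$ standard normal this is the
chi-square density with one degree of freedom.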
diff --git a/zhilova/hw1.tex b/zhilova/hw1.tex
new file mode 100644
index 0000000..d9f5d22
--- /dev/null
+++ b/zhilova/hw1.tex
@@ -0,0 +1,133 @@
+\newfam\rsfs
+\newfam\bbold
+\def\scr#1{{\fam\rsfs #1}}
+\def\bb#1{{\fam\bbold #1}}
+\let\oldcal\cal
+\def\cal#1{{\oldcal #1}}
+\font\rsfsten=rsfs10
+\font\rsfssev=rsfs7
+\font\rsfsfiv=rsfs5
+\textfont\rsfs=\rsfsten
+\scriptfont\rsfs=\rsfssev
+\scriptscriptfont\rsfs=\rsfsfiv
+\font\bbten=msbm10
+\font\bbsev=msbm7
+\font\bbfiv=msbm5
+\textfont\bbold=\bbten
+\scriptfont\bbold=\bbsev
+\scriptscriptfont\bbold=\bbfiv
+
+\def\Pr{\bb P}
+\def\E{\bb E}
+\newcount\qnum
+\def\q{\afterassignment\qq\qnum=}
+\def\qq{\qqq{\number\qnum}}
+\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
+\def\fr#1#2{{#1\over #2}}
+\def\var{\mathop{\rm var}\nolimits}
+
+\q1
+
+Let $A \setminus B := \{x \in A : x\not\in B\}.$
+$(A\setminus B) \cap B = \emptyset$ by definition, so the two are disjoint.
+Likewise, any $x$ with $x \in A \land x \in B$ lies in neither
+$A\setminus B$ nor $B\setminus A,$ and the two set differences do not
+intersect each other, so the three sets below are pairwise disjoint.
+$$\Pr(A\cup B) = \Pr((A\setminus B)\cup(A\cap B)\cup(B\setminus A)) =
+\Pr(A\setminus B) + \Pr(A\cap B) + \Pr(B\setminus A) + \Pr(A\cap B) -
+\Pr(A\cap B)$$$$ = \Pr((A\setminus B)\cup(A\cap B)) + \Pr((B\setminus
+A)\cup(A\cap B)) - \Pr(A\cap B) = \Pr(A) + \Pr(B) - \Pr(A\cap B).$$
+
+\q2
+
+$$\Pr(A\cap B) + \Pr(A\cap B^C) = \Pr(A\cap B) + \Pr(\{x\in A:x\not\in
+B\}) = \Pr((A\cap B) \cup (A\setminus B)) = \Pr(A),$$
+by disjointness and set arithmetic.
+
+\q3
+
+$$\Pr(A_1 \cup (A_2^C \cap A_3^C)) = \Pr(A_1) + \Pr(A_2^C \cap A_3^C) -
+\Pr(A_1\cap A_2^C\cap A_3^C) = 1/6 + \Pr(A_2^C)\Pr(A_3^C) -
+\Pr(A_1)\Pr(A_2^C)\Pr(A_3^C)$$$$ = 1/6 + 25/36 - 25/216 = 161/216.$$
+
+\q4
+
+\noindent{\it (a)}
+
+Pairwise exclusivity implies mutual exclusivity:
+$$A_1\cap A_2\cap A_3 = A_1\cap (A_2\cap A_3) = A_1\cap\emptyset =
+\emptyset,$$
+and see {\it (b)} for a description of why these probabilities are
+therefore impossible.
+
+\noindent{\it (b)}
+
+No, if they were, all three sets would be disjoint, giving
+$$\Pr(A_1\cup A_2\cup A_3) = \Pr(A_1) + \Pr(A_2) + \Pr(A_3) = 1/2 + 1/4 +
+1/3 > 1,$$
+violating a property of the probability function ($0 \leq \Pr(A) \leq 1$).
+
+\q5
+
+\noindent{\it (a)}
+$$\E(Y) = \E(2 - 3X) = \E(2) - 3\E(X) = 2 - 3\cdot 2 = -4.$$
+
+\noindent{\it (b)}
+$$\var Y = \E(Y^2) - \E(Y)^2 = \E((2-3X)^2) - 16 = \E(4 - 12X + 9X^2) -
+16 = 4 - 12\E(X) + 9\E(X^2) - 16 = 18,$$
+by linearity of expectation.
+
+\q6
+
+$$P_X(k) = \left\{\vbox{\halign{$#$\hfil&\hskip3em $#$\hfil\cr{\lambda^k
+e^{-\lambda}\over k!}&k \geq 0\cr 0&{\rm otherwise}\cr}}\right..$$
+
+\noindent{\it (a)}
+
+$$\E(e^{tX}) = \sum_{x=0}^\infty e^{tx}P_X(x)
+= \sum_{x=0}^\infty {e^{tx}\lambda^xe^{-\lambda}\over x!}
+= e^{-\lambda}\sum_{x=0}^\infty {e^{(t+\ln \lambda)x}\over x!}
+= e^{-\lambda}\sum_{x=0}^\infty {(\lambda e^t)^x\over x!}
+= e^{-\lambda}e^{\lambda e^t},
+$$
+by the Taylor series of $e^u$ evaluated at $u = \lambda e^t.$
+
+\noindent{\it (b)}
+
+The third moment $\E(X^3)$ is $M'''(0),$ where $M(t) = \E(e^{tX}) =
+e^{\lambda(e^t - 1)}$ is the mgf; differentiating three times and
+setting $t = 0$ gives $$\E(X^3) = \lambda^3 + 3\lambda^2 + \lambda.$$
+
+\q7
+
+\noindent{\it (a)}
+
+The pmf is $1/16$ for $X = 0$ and $X=4,$ $4/16$ for $X=1$ and $X=3,$ and
+$6/16$ for $X=2,$ or in other terms, $P_X(k) = {{4\choose k}\over 2^4}.$
+
+\noindent{\it (b)}
+
+This can be immediately computed as $$\Pr(\hbox{$X$ is odd}) = \Pr(X = 1)
++ \Pr(X = 3) = 1/2,$$ by disjointness of those events.
+
+\q8
+
+Chebyshev's inequality gives us
+$$\Pr(|X-\E X| < .1) \geq 1-{\sigma^2\over .1^2} \geq .95.$$
+The largest admissible variance is $\sigma^2 = \var X = .05\cdot .1^2 =
+.0005,$ so $\sigma \leq .1\sqrt{.05} \approx .022.$
+
+\q9
+
+{\it (a)}
+
+On $\{x \geq 0\} = \{y \geq 1\},$
+$$F_X(x) = 1 - e^{-\lambda x}.$$
+$$F_Y(y) = F_X(e^y) = 1 - e^{-\lambda e^y}.$$
+$$f_Y(y) = \lambda e^y e^{-\lambda e^y} = \lambda e^{y-\lambda e^y}.$$
+
+{\it (b)}
+
+$\lambda < 1$ gives convergence.
+
+\bye
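As a cross-check of the answer to 6(b), the Poisson factorial moments give
the same value without differentiating the mgf:
$$\E\bigl(X(X-1)(X-2)\bigr) = \sum_{x=3}^\infty x(x-1)(x-2)\,{\lambda^x
e^{-\lambda}\over x!} = \lambda^3 \sum_{x=3}^\infty {\lambda^{x-3}
e^{-\lambda}\over (x-3)!} = \lambda^3,$$
and expanding $X(X-1)(X-2) = X^3 - 3X^2 + 2X$ with $\E(X) = \lambda$ and
$\E(X^2) = \lambda^2 + \lambda$ gives
$$\E(X^3) = \lambda^3 + 3(\lambda^2 + \lambda) - 2\lambda = \lambda^3 +
3\lambda^2 + \lambda.$$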