Probability

Calculus

Logarithmic Differentiation:

$$\dfrac{df(x)}{dx}=f(x)(\dfrac{d\ln f(x)}{dx})\text{, since }\dfrac{d\ln f(x)}{dx}=\dfrac{df(x)/dx}{f(x)}$$
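As a quick numerical sanity check (an illustrative sketch, with \(f(x)=x^x\) chosen as an assumed example): since \(\ln f(x)=x\ln x\), logarithmic differentiation gives \(f'(x)=x^x(\ln x+1)\).

```python
import math

def f(x):
    # f(x) = x^x, easiest to differentiate via ln f(x) = x ln x
    return x ** x

def f_prime_logdiff(x):
    # f'(x) = f(x) * d/dx[ln f(x)] = x^x * (ln x + 1)
    return f(x) * (math.log(x) + 1)

# Compare against a central finite difference at x = 2
h = 1e-6
numeric = (f(2 + h) - f(2 - h)) / (2 * h)
print(abs(numeric - f_prime_logdiff(2)) < 1e-4)  # True
```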


Integration by Parts:

$$\int{udv}=uv-\int{vdu}$$

Tabular Method of Integration by Parts:

$$\begin{matrix} u & {} & dv \\ f & {} & g \\ {} & \overset{+}{\mathop{\searrow }}\, & {} \\ {{f}^{(1)}} & {} & {{g}^{(-1)}} \\ {} & \overset{-}{\mathop{\searrow }}\, & {} \\ {{f}^{(2)}} & {} & {{g}^{(-2)}} \\ {} & \overset{+}{\mathop{\searrow }}\, & {} \\ {{f}^{(3)}} & {} & {{g}^{(-3)}} \\ {} & \overset{-}{\mathop{\searrow }}\, & {} \\ {\vdots} & {} & {\vdots} \\ {f^{(n-1)}} & {} & {g^{(-(n-1))}} \\ {} & {\overset{(-1)^{n-1}}{\mathop{\searrow }}} & {} \\ {f^{(n)}} & {} & {g^{(-n)}} \\ \end{matrix}$$
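As a worked instance of the tabular method (the integrand \(\int x^2e^x\,dx\) is an assumed example): differentiating \(x^2\) down to \(0\) against repeated antiderivatives of \(e^x\), with alternating signs, gives \((x^2-2x+2)e^x\). A numerical check:

```python
import math

def integrand(x):
    return x * x * math.exp(x)      # u = x^2, dv = e^x dx

def antiderivative(x):
    # Tabular method: derivatives x^2, 2x, 2, 0 paired with integrals
    # e^x, e^x, e^x and signs +, -, + gives (x^2 - 2x + 2) e^x
    return (x * x - 2 * x + 2) * math.exp(x)

# Trapezoidal check of the antiderivative on [0, 1]
n = 10_000
h = 1.0 / n
approx = (integrand(0) + integrand(1)) / 2 + sum(integrand(i * h) for i in range(1, n))
approx *= h
exact = antiderivative(1) - antiderivative(0)   # e - 2
print(abs(approx - exact) < 1e-6)  # True
```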


Mean

With PDF:

$$E[X]=\int_{-\infty}^{\infty}{xf(x)dx}$$

With Survival Function:

$$E[X]=\int_{0}^{\infty}{S(x)dx}=\int_{0}^{\infty}{(1-F(x))dx}$$
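A numerical check of the survival-function formula (the Exponential rate \(\lambda=2\) is an assumed example, where \(E[X]=1/\lambda\)):

```python
import math

lam = 2.0                      # assumed rate for an Exponential(lambda) example
def S(x):                      # survival function S(x) = e^{-lambda x}
    return math.exp(-lam * x)

# E[X] = integral of S(x) over [0, inf), truncated at x = 20 where
# the tail e^{-40} is negligible; trapezoidal rule
n, b = 200_000, 20.0
h = b / n
mean = h * ((S(0) + S(b)) / 2 + sum(S(i * h) for i in range(1, n)))
print(abs(mean - 1 / lam) < 1e-6)  # True
```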

With a Function:

$$E[g(X)]=\int_{-\infty}^{\infty}{g(x)f_X(x)dx}$$ $$E[g(X)]=\int_{0}^{\infty}{g'(x)S_X(x)dx}\text{, for domain }x\ge 0$$ $$E[g(X)|j\le X\le k]=\dfrac{\int_{j}^{k}{g(x)f_X(x)dx}}{Pr(j\le X\le k)}$$

The Mean of the Minimum of a Continuous R.V. and a Constant:

$$E[\min(X,k)]=\int_{-\infty}^{k}{xf(x)dx}+k(1-F(k))$$ If \(f(x)=0\) for \(x<0\), then:

    > \(E[X]=\int_{0}^{\infty}{(1-F(x))dx}\)

    > \(E[\min(X,k)]=\int_{0}^{k}{(1-F(x))dx}\)
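Checking the capped-mean identity numerically (Exponential with \(\lambda=1\) and cap \(k=1\) are assumed example values, for which \(E[\min(X,k)]=(1-e^{-\lambda k})/\lambda\)):

```python
import math

lam, k = 1.0, 1.0              # assumed Exponential(1) capped at k = 1

def F(x):                      # cdf of Exponential(lambda)
    return 1 - math.exp(-lam * x)

# E[min(X, k)] = integral of (1 - F(x)) over [0, k]; trapezoidal rule,
# compared against the closed form (1 - e^{-lambda k}) / lambda
n = 100_000
h = k / n
capped_mean = h * (((1 - F(0)) + (1 - F(k))) / 2
                   + sum(1 - F(i * h) for i in range(1, n)))
closed_form = (1 - math.exp(-lam * k)) / lam
print(abs(capped_mean - closed_form) < 1e-8)  # True
```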


Variance and Other Moments

Variance:

$$Var(X)=E[X^{2}]-\mu^{2}$$ $$Var(aX+b)=a^2Var(X)$$ $$CV[X] = \dfrac{SD[X]}{E[X]}$$


Bernoulli Shortcut:

If \(Pr(X=a)=1-p\) and \(Pr(X=b)=p\), then \(Var(X)=(b-a)^{2}p(1-p)\)
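The shortcut agrees with the direct moment computation; a quick check (the values \(a=3\), \(b=7\), \(p=0.25\) are assumed for illustration):

```python
# Two-point r.v.: Pr(X = a) = 1 - p, Pr(X = b) = p (illustrative values)
a, b, p = 3.0, 7.0, 0.25

# Direct moments: Var(X) = E[X^2] - E[X]^2
mean = a * (1 - p) + b * p
second = a * a * (1 - p) + b * b * p
var_direct = second - mean ** 2

# Bernoulli shortcut: (b - a)^2 p (1 - p)
var_shortcut = (b - a) ** 2 * p * (1 - p)
print(var_direct, var_shortcut)  # 3.0 3.0
```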



Covariance

$$Cov(X,Y)=E[(X-\mu_X)(Y-\mu_Y)]$$ $$Var(X+Y)=Var(X)+Var(Y)+2Cov(X,Y)$$ $$Var(aX+bY)=a^2Var(X)+b^2Var(Y)+2abCov(X,Y)$$


If two variables are independent, then the expected value of their product equals the product of their expectations.

$$E[XY]=E[X]E[Y]$$

Note: Independence implies covariance of 0, but not conversely.
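A standard counterexample for the converse (the distribution is an assumed illustration): let \(X\) be uniform on \(\{-1,0,1\}\) and \(Y=X^2\). Then \(Cov(X,Y)=0\), yet \(Y\) is completely determined by \(X\).

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: zero covariance without independence
pmf = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

E_X = sum(p * x for (x, y), p in pmf.items())
E_XY = sum(p * x * y for (x, y), p in pmf.items())
E_Y = sum(p * y for (x, y), p in pmf.items())
cov = E_XY - E_X * E_Y
print(cov)  # 0

# Yet X and Y are dependent: Pr(Y = 0 | X = 0) = 1 while Pr(Y = 0) = 1/3
pr_y0 = sum(p for (x, y), p in pmf.items() if y == 0)
print(pr_y0)  # 1/3
```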




Covariance Rules:

$$Cov(X,X)=Var(X)$$ $$Cov(X,Y)=Cov(Y,X)$$ $$Cov(aX,bY)=abCov(X,Y)$$ $$Cov(X,aY+bZ)=aCov(X,Y)+bCov(X,Z)$$
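The bilinearity rule \(Cov(aX,bY)=abCov(X,Y)\) can be verified on a small joint pmf (the probabilities and \(a=2\), \(b=3\) are assumed for illustration):

```python
from fractions import Fraction

# Small joint pmf (illustrative numbers) to check Cov(aX, bY) = ab Cov(X, Y)
pmf = {(0, 0): Fraction(2, 10), (0, 1): Fraction(3, 10),
       (1, 0): Fraction(1, 10), (1, 1): Fraction(4, 10)}

def cov(fx, fy):
    # Cov(f(X), g(Y)) = E[f(X)g(Y)] - E[f(X)]E[g(Y)] over the joint pmf
    e_x = sum(p * fx(x) for (x, y), p in pmf.items())
    e_y = sum(p * fy(y) for (x, y), p in pmf.items())
    e_xy = sum(p * fx(x) * fy(y) for (x, y), p in pmf.items())
    return e_xy - e_x * e_y

a, b = 2, 3
print(cov(lambda x: a * x, lambda y: b * y)
      == a * b * cov(lambda x: x, lambda y: y))  # True
```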


Covariance Matrix:

$$\Sigma=\begin{pmatrix} \sigma _{X}^{2} & {{\sigma }_{XY}} \\ {{\sigma }_{XY}} & \sigma _{Y}^{2} \\ \end{pmatrix}$$


Correlation Coefficient:

$$\rho_{X,Y}=Corr(X,Y)=\dfrac{Cov(X,Y)}{\sqrt{Var(X)}\sqrt{Var(Y)}}$$



Joint and Marginal Distributions

Joint Density Function:

$$Pr(a<X\le b,\,c<Y\le d)=\int_{a}^{b}{\int_{c}^{d}{f(x,y)dydx}}$$
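Evaluating the double integral numerically (the joint density \(f(x,y)=x+y\) on the unit square is an assumed example; direct integration gives \(Pr(X\le\tfrac12,Y\le\tfrac12)=\tfrac18\)):

```python
# Midpoint double integral of an assumed joint density f(x, y) = x + y
# on the unit square; Pr(X <= 1/2, Y <= 1/2) = 1/8 by direct integration
def f(x, y):
    return x + y

n = 400
h = 0.5 / n
prob = sum(f((i + 0.5) * h, (j + 0.5) * h)
           for i in range(n) for j in range(n)) * h * h
print(abs(prob - 0.125) < 1e-6)  # True
```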


Marginal Distribution:

$$f_X(x)=\int_{-\infty}^{\infty}{f(x,y)dy}$$ $$f_Y(y)=\int_{-\infty}^{\infty}{f(x,y)dx}$$

If the variables are independent, then:

$$f(x,y)=g(x)h(y)$$


Joint Moments:

$$E[g(X,Y)]=\int{\int{g(x,y)f(x,y)dydx}}$$

If the variables are independent, then:

$$E[g(X)h(Y)] = E[g(X)]E[h(Y)]$$



Conditional Distributions

Conditional Density:

$$f_{X|Y}(x|y)=\dfrac{f(x,y)}{f_Y(y)}$$


Law of Total Probability:

$$f_X(x)=\int_{-\infty}^{\infty}{f_{X|Y}(x|y)f_Y(y)dy}$$


Bayes' Theorem:

$$f_{X|Y}(x|y)=\dfrac{f_{Y|X}(y|x)f_X(x)}{f_Y(y)}=\dfrac{f_{Y|X}(y|x)f_X(x)}{\int_{-\infty}^{\infty}{f_{Y|X}(y|w)f_X(w)dw}}$$
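A continuous worked example (the model is assumed for illustration): let \(Y\sim\text{Uniform}(0,1)\) and \(X|Y=y\sim\text{Uniform}(0,y)\), so \(f_{X|Y}(x|y)=1/y\). The law of total probability gives \(f_X(x)=\int_x^1(1/y)dy=-\ln x\), and Bayes' theorem gives the posterior \(f_{Y|X}(y|x)=(1/y)/(-\ln x)\) on \((x,1)\), which must integrate to 1:

```python
import math

# Assumed model: Y ~ Uniform(0,1), X|Y=y ~ Uniform(0,y), so f_{X|Y}(x|y) = 1/y.
# Marginal: f_X(x) = -ln x; posterior: f_{Y|X}(y|x) = (1/y) / (-ln x) on (x, 1)
x = 0.2
f_X = -math.log(x)

def posterior(y):
    return (1 / y) / f_X

# The posterior density must integrate to 1 over (x, 1): midpoint rule check
n = 100_000
h = (1 - x) / n
total = sum(posterior(x + (i + 0.5) * h) for i in range(n)) * h
print(abs(total - 1) < 1e-6)  # True
```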


Conditional Moments:

$$E[g(X)|Y=y]=\int_{-\infty}^{\infty}{g(x)\dfrac{f(x,y)}{f_Y(y)}dx}$$


Double Expectation:

$$E[g(X)]=E_Y[E_X[g(X)|Y]]$$


Specifically,

    > \(E[X]=E_Y[E_X[X|Y]]\)

    > \(E[X^2]=E_Y[E_X[X^2|Y]]\)


Conditional Variance:

$$Var(X)=E[Var(X|Y)]+Var(E[X|Y])$$
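Both the double-expectation and conditional-variance formulas can be checked exactly on a toy mixture (the values are assumed for illustration): \(Y\sim\text{Bernoulli}(\tfrac12)\), with \(X|Y{=}0\) uniform on \(\{0,2\}\) and \(X|Y{=}1\) uniform on \(\{10,12\}\).

```python
from fractions import Fraction

# Toy mixture (illustrative values): Y ~ Bernoulli(1/2),
# X | Y=0 uniform on {0, 2}, X | Y=1 uniform on {10, 12}
half = Fraction(1, 2)
branches = {0: [0, 2], 1: [10, 12]}   # Y-value -> equally likely X-values

cond_mean = {y: sum(vs) * half for y, vs in branches.items()}
cond_var = {y: sum(v * v for v in vs) * half - cond_mean[y] ** 2
            for y, vs in branches.items()}

# Var(X) = E[Var(X|Y)] + Var(E[X|Y])
e_cond_var = sum(half * v for v in cond_var.values())
mean_of_means = sum(half * m for m in cond_mean.values())
var_of_means = sum(half * m * m for m in cond_mean.values()) - mean_of_means ** 2
total = e_cond_var + var_of_means

# Direct computation: unconditionally, X is uniform on {0, 2, 10, 12}
vals = [0, 2, 10, 12]
e_x = sum(vals) * Fraction(1, 4)
var_direct = sum(v * v for v in vals) * Fraction(1, 4) - e_x ** 2
print(total == var_direct, mean_of_means == e_x)  # True True
```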



Order Statistics

Maximum & Minimum:

$$X_{(1)}=\min(X_1,X_2,\ldots,X_n)$$ $$X_{(n)}=\max(X_1,X_2,\ldots,X_n)$$

For i.i.d. r.v.'s:

For the minimum,

$$S_{X_{(1)}}(x)=[S_X(x)]^n\text{, and the pdf is }f_{X_{(1)}}(x)=n(1-F_X(x))^{n-1}f_X(x)$$

For the maximum,

$$F_{X_{(n)}}(x)=[F_X(x)]^n\text{, and the pdf is }f_{X_{(n)}}(x)=n(F_X(x))^{n-1}f_X(x)$$
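A Monte Carlo check of both tail formulas (i.i.d. \(\text{Uniform}(0,1)\) with \(n=3\) and \(x=0.5\) are assumed example values, so both probabilities equal \(0.125\)):

```python
import random

# For i.i.d. Uniform(0,1) with n = 3:
# Pr(min > x) = (1-x)^3 and Pr(max <= x) = x^3
random.seed(0)
N, n, x = 200_000, 3, 0.5
min_count = max_count = 0
for _ in range(N):
    sample = [random.random() for _ in range(n)]
    min_count += min(sample) > x
    max_count += max(sample) <= x

print(abs(min_count / N - (1 - x) ** n) < 0.01)  # True
print(abs(max_count / N - x ** n) < 0.01)        # True
```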


\(k^{th}\) Order Statistic:

$$f_{X_{(k)}}(x)=\dfrac{n!}{(k-1)!\,(n-k)!}F_X(x)^{k-1}f_X(x)(1-F_X(x))^{n-k}$$

For Uniform r.v. on \([0,\theta]\),

$$E[X_{(k)}]=\dfrac{k\theta}{n+1}$$
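A simulation check of the uniform order-statistic mean (\(\theta=1\), \(n=4\), \(k=2\) are assumed example values, giving \(k\theta/(n+1)=0.4\)):

```python
import random

# Uniform on [0, theta] with theta = 1, n = 4:
# the 2nd order statistic should average k*theta/(n+1) = 2/5
random.seed(1)
N, n, k, theta = 200_000, 4, 2, 1.0
total = 0.0
for _ in range(N):
    total += sorted(random.uniform(0, theta) for _ in range(n))[k - 1]

print(abs(total / N - k * theta / (n + 1)) < 0.01)  # True
```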


  • Last modified: 2023/10/25 00:12
  • by lph