#+TITLE: Probability and Statistics ( BTech CSE )
#+AUTHOR: Anmol Nawani
#+LATEX_HEADER: \usepackage{amsmath}
# *Statistics
+ Some other identities
\[ P(\overline{A} \cap B) + P(A \cap B) = P(B) \]
\[ P(A \cap \overline{B}) + P(A \cap B) = P(A) \]
* Probability Function
It is a mathematical function that gives the probability of occurrence of the different possible outcomes. The variables used to represent these possible outcomes are called *random variables* and are written as capital letters, e.g. $X$, $Y$. We use these random variables as follows:
\\
Suppose $X$ represents the outcome of flipping two coins.
\[ X = \{HH, HT, TT, TH\} \]
Labelling each outcome with a number, we can represent it as
\[ X = \{0, 1, 2, 3\} \]
Now we can write a probability function $P(X=x)$ for flipping two coins as :
#+attr_latex: :align |c|c|
|-----+----------|
| $x$ | $P(X=x)$ |
|-----+----------|
| 0 | 0.25 |
| 1 | 0.25 |
| 2 | 0.25 |
| 3 | 0.25 |
|-----+----------|
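The table above can be built by enumerating the sample space directly. A minimal Python sketch (the 0–3 labels are the arbitrary ones used in the table):

```python
from itertools import product
from fractions import Fraction

# Enumerate the sample space of two fair coin flips and label
# each outcome 0..3, matching the table above.
outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT
pmf = {i: Fraction(1, len(outcomes)) for i, _ in enumerate(outcomes)}
```

Each of the four equally likely outcomes gets probability $1/4$, and the probabilities sum to 1.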
Another example is throwing two dice and our random variable $X$ is sum of those two dice.
#+attr_latex: :align |c|c|
|-----+----------------|
| $x$ | $P(X=x)$ |
|-----+----------------|
| 2 | $1/36$ |
| 3 | $2/36$ |
| 4 | $3/36$ |
| 5 | $4/36$ |
| 6 | $5/36$ |
| 7 | $6/36$ |
| 8 | $5/36$ |
| 9 | $4/36$ |
| 10 | $3/36$ |
| 11 | $2/36$ |
| 12 | $1/36$ |
|-----+----------------|
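The dice-sum table can likewise be derived by counting outcomes over the 36-point sample space; a short Python sketch:

```python
from itertools import product
from fractions import Fraction

# P(X = s), where X is the sum of two fair dice:
# count how many of the 36 ordered outcomes give each sum.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    counts[d1 + d2] = counts.get(d1 + d2, 0) + 1
pmf = {s: Fraction(c, 36) for s, c in counts.items()}
```

For example, a sum of 7 arises from 6 of the 36 outcomes, giving $P(X=7) = 6/36$ as in the table.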
** Types of probability functions (Continuous and Discrete random variables)
Based on the range of the random variable, the probability function has two different names.
+ For discrete random variables it is called a Probability Distribution function.
+ For continuous random variables it is called a Probability Density function.
* Probability Mass Function
If we can get a function such that,
\[ f(x) = P(X=x) \]
then $f(x)$ is called a *Probability Mass Function* (PMF).
** Properties of Probability Mass Function
Suppose a PMF
\[ f(x) = P(X=x) \]
Then,
*** For discrete variables
\[ \sum_x f(x) = 1 \]
\[ E(X^n) = \sum_x x^n f(x) \]
For $E(X)$, the summation is over all possible values of x.
\[ Mean = E(X) = \sum_x x f(x) \]
\[ Variance = E(X^2) - (E(X))^2 = \sum_x x^2 f(x) - \left( \sum_x x f(x) \right)^2 \]
To get probabilities
\[ P(a \le X \le b) = \sum_{x=a}^{b} f(x) \]
\[ P(a < X \le b) = \left(\sum_{x=a}^{b} f(x)\right) - f(a) \]
\[ P(a \le X < b) = \left(\sum_{x=a}^{b} f(x)\right) - f(b) \]
Basically, we just add all $f(x)$ values from range of samples we need.
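These discrete formulas can be applied directly to the dice-sum table above. A Python sketch, using exact fractions:

```python
from fractions import Fraction

# PMF of the sum of two fair dice (the table above):
# P(X = s) = (6 - |s - 7|) / 36 for s = 2..12.
pmf = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

mean = sum(x * p for x, p in pmf.items())           # E(X)
ex2 = sum(x * x * p for x, p in pmf.items())        # E(X^2)
variance = ex2 - mean ** 2                          # E(X^2) - (E(X))^2
prob_4_to_6 = sum(pmf[x] for x in range(4, 7))      # P(4 <= X <= 6)
```

This gives $E(X) = 7$, $Variance = 35/6$, and $P(4 \le X \le 6) = (3+4+5)/36 = 1/3$.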
*** For continuous variables
\[ \int_{-\infty}^{\infty} f(x) dx = 1 \]
\[ E(X^n) = \int_{-\infty}^{\infty} x^n f(x) dx \]
The integrals run only over the possible values of $x$; outside that range $f(x)$ is taken to be 0.
\[ Mean = E(X) = \int_{-\infty}^{\infty} x f(x) dx \]
\[ Variance = E(X^2) - (E(X))^2 = \int_{-\infty}^{\infty} x^2 f(x) dx - ( \int_{-\infty}^{\infty} x f(x) dx )^2 \]
To get the probability from a to b (inclusive or exclusive does not matter for a continuous variable, since $P(X=a) = 0$).
\[ P(a < X < b) = \int_{a}^{b} f(x) dx \]
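The continuous formulas can be sanity-checked numerically with a Riemann sum over a truncated range. A sketch using the density $f(x) = \lambda e^{-\lambda x}$ with $\lambda = 2$, an illustrative choice (this density reappears later as the exponential distribution):

```python
import math

# Left Riemann sums for f(x) = 2 e^(-2x) on [0, 20]
# (the tail beyond x = 20 is negligible).
lam, dx = 2.0, 1e-4
xs = [i * dx for i in range(int(20 / dx))]

total = sum(lam * math.exp(-lam * x) * dx for x in xs)            # should be ~1
mean = sum(x * lam * math.exp(-lam * x) * dx for x in xs)         # E(X) ~ 1/lam
p_0_1 = sum(lam * math.exp(-lam * x) * dx for x in xs if x < 1.0) # P(0 < X < 1)
```

The three sums approximate the normalization integral, the mean $1/\lambda = 0.5$, and the interval probability $1 - e^{-2}$.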
** Some properties of mean and variance
+ Mean
\[ E(aX) = aE(X) \]
\[ E(a) = a \]
\[ E(X + Y) = E(X) + E(Y) \]
+ Variance
If
\[ V(X) = E(X^2) - (E(X))^2 \]
Then
\[ V(aX) = a^2 V(X) \]
\[ V(a) = 0 \]
* Moment Generating Function
The moment generating function is given by
\[ M(t) = E(e^{tX}) \]
** For discrete
\[ M(t) = \sum_x e^{tx} f(x) \]
** For continuous
\[ M(t) = \int_{-\infty}^{\infty} e^{tx} f(x) dx \]
** Calculation of Moments $E(X^n)$ using the MGF
\[ E(X^n) = (\frac{d^n}{dt^n} M(t))_{t=0} \]
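This differentiation rule can be illustrated numerically with central differences, using the binomial MGF $M(t) = (q + pe^t)^n$ from the next section as a concrete case ($n = 10$, $p = 0.3$ are arbitrary choices for this sketch):

```python
import math

# MGF of a binomial distribution with n = 10, p = 0.3.
n, p = 10, 0.3
q = 1 - p
M = lambda t: (q + p * math.exp(t)) ** n

# Numerical derivatives of M at t = 0 give the moments:
# E(X) = M'(0), E(X^2) = M''(0).
h = 1e-4
ex1 = (M(h) - M(-h)) / (2 * h)            # ~ np = 3
ex2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # ~ npq + (np)^2 = 11.1
```

The first derivative recovers the mean $np$ and the second recovers $E(X^2) = npq + (np)^2$.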
* Binomial Distribution
A binomial distribution gives the probability of a given number of successes when a trial with known success probability is repeated *n* independent times.
It deals with discrete random variables.
\[ X = \{ 0,1,2, .... n \} \]
where *n* is the number of trials.
\[ P(X=x) = \ ^nC_x\ (p)^x(q)^{n-x} \]
Here
\[ n \rightarrow number\ of\ trials \]
\[ x \rightarrow number\ of\ successes \]
\[ p \rightarrow probability\ of\ success \]
\[ q \rightarrow probability\ of\ failure \]
\[ p = 1 - q \]
+ Mean
\[ Mean = np \]
+ Variance
\[ Variance = npq \]
+ Moment Generating Function
\[ M(t) = (q + pe^t)^n \]
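The PMF, mean, and variance above can be verified by direct summation; a minimal sketch ($n = 8$, $p = 0.25$ are arbitrary choices):

```python
import math

# Binomial PMF: P(X = x) = C(n, x) p^x q^(n-x).
n, p = 8, 0.25
q = 1 - p
pmf = [math.comb(n, x) * p ** x * q ** (n - x) for x in range(n + 1)]

total = sum(pmf)                                          # should be 1
mean = sum(x * pr for x, pr in enumerate(pmf))            # np = 2
var = sum(x * x * pr for x, pr in enumerate(pmf)) - mean ** 2  # npq = 1.5
```

Summing the PMF gives 1, and the computed mean and variance match $np$ and $npq$.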
** Additive Property of Binomial Distribution
For a random variable $X$, a binomial distribution is denoted as
\[ X \sim B(n,p) \]
Here,
\[ n \rightarrow number\ of\ trials \]
\[ p \rightarrow probability\ of\ success \]
+ Property
If given,
\[ X_1 \sim B(n_1, p) \]
\[ X_2 \sim B(n_2, p) \]
Then,
\[ X_1 + X_2 \sim B(n_1 + n_2, p) \]
+ *NOTE*
If
\[ X_1 \sim B(n_1, p_1) \]
\[ X_2 \sim B(n_2, p_2) \]
Then $X_1 + X_2$ does not, in general, follow a binomial distribution.
** Using a binomial distribution
We can use a binomial distribution to easily calculate the probability of multiple trials when the probability of one trial is known. For example, the probability of a duplet (both dice showing the same number) when two dice are thrown is $\frac{6}{36}$. \\
Suppose we now want the probability of getting 3 duplets when a pair of dice is thrown 5 times. In this case:
\[ number\ of\ trials\ (n) = 5 \]
\[ number\ of\ duplets\ we\ want\ probability\ for\ (x) = 3 \]
\[ probability\ of\ duplet\ (p) = \frac{6}{36} \]
\[ q = 1 - p = 1 - \frac{6}{36} \]
So using binomial distribution,
\[ P(probability\ of\ 3\ duplets) = P(X=3) = \ ^5C_3 \left(\frac{6}{36}\right)^3 \left(\frac{30}{36}\right)^{5-3} \]
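Evaluating the expression above numerically, as a quick sketch:

```python
import math

# P(X = 3): 3 duplets in n = 5 throws of a pair of dice,
# with duplet probability p = 6/36 per throw.
n, x, p = 5, 3, 6 / 36
q = 1 - p
prob = math.comb(n, x) * p ** x * q ** (n - x)  # = 250/7776, about 0.032
```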
* Poisson Distribution
A limiting case of the binomial distribution where *n* is indefinitely large, *p* is very small, and *$\lambda = np$* stays finite.
\[ P(X=x) = \frac{e^{-\lambda}\lambda^x}{x!}\ if\ x = 0, 1, 2 ..... \]
\[ P(X=x) = 0\ otherwise \]
\[ \lambda = np \]
+ Mean
\[ Mean = \lambda \]
+ Variance
\[ Variance = \lambda \]
+ Moment Generating Function
\[ M(t) = e^{\lambda\left(e^{t}-1\right)} \]
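The limiting relationship with the binomial can be seen numerically: for large $n$ and small $p$, the binomial probabilities are close to the Poisson ones. A sketch with $n = 1000$, $p = 0.002$ (arbitrary illustrative values, so $\lambda = 2$):

```python
import math

# Compare binomial(n, p) with Poisson(lambda = np)
# for large n and small p.
n, p = 1000, 0.002
lam = n * p  # = 2

def binom(x):
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson(x):
    return math.exp(-lam) * lam ** x / math.factorial(x)
```

For small $x$ the two probabilities agree to about three decimal places, and the agreement improves as $n$ grows with $\lambda$ fixed.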
** Additive property
If $X_1, X_2, X_3, \dots, X_n$ follow Poisson distributions with parameters $\lambda_1, \lambda_2, \lambda_3, \dots, \lambda_n$ \\
Then their sum follows a Poisson distribution with the summed parameter,
\[ X_1 + X_2 + X_3 + \dots + X_n \sim P(\lambda_1 + \lambda_2 + \lambda_3 + \dots + \lambda_n) \]
* Exponential Distribution
A continuous probability distribution which has probability density function
\[ f(x) = \lambda e^{-\lambda x}\ ,\ when\ x \ge 0 \]
\[ f(x) = 0 \ ,\ otherwise \]
\[ where\ \lambda > 0 \]
+ Mean
\[ Mean = \frac{1}{\lambda} \]
+ Variance
\[ Variance = \frac{1}{\lambda^2} \]
+ Moment Generating Function
\[ M(t) = \frac{\lambda}{\lambda - t}\ ,\ t < \lambda \]
** Memory Less Property
\[ P[X > (s + t) \mid X > t] = P(X > s) \]
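The memoryless property follows from the survival function $P(X > x) = e^{-\lambda x}$, and is easy to check numerically. A sketch ($\lambda = 1.5$, $s = 0.7$, $t = 2.0$ are arbitrary values):

```python
import math

# Memoryless property of the exponential distribution,
# via the survival function P(X > x) = e^(-lambda x).
lam, s, t = 1.5, 0.7, 2.0
surv = lambda x: math.exp(-lam * x)

lhs = surv(s + t) / surv(t)  # P(X > s + t | X > t)
rhs = surv(s)                # P(X > s)
```

Both sides reduce to $e^{-\lambda s}$, so the time already waited ($t$) drops out entirely.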
* Normal Distribution
Suppose a probability function with random variable X has mean \mu and variance \sigma^2.
We denote the normal distribution by $X \sim N(\mu,\sigma)$ \\
The probability density function is
\[ f(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}\right) \]
\[ -\infty < x < \infty \]
\[ -\infty < \mu < \infty \]
\[ \sigma > 0 \]
Here, $exp(x) = e^x$
+ Moment Generating Function
\[ M(t) = exp\left( \mu t + \frac{\sigma^2 t^2}{2} \right) \]
** Odd Moments
The odd central moments (moments about the mean) are zero,
\[ E[(X - \mu)^{2n + 1}] = 0 \ , \ n = 0, 1, 2, \dots \]
** Even Moments
\[ E[(X - \mu)^{2n}] = 1 \cdot 3 \cdot 5 \cdots (2n-3)(2n-1)\, \sigma^{2n} \ , \ n = 0, 1, 2, \dots \]
** Properties
+ In a normal distribution
\[ Mean = Mode = Median \]
+ For normal distribution, mean deviation about mean is
\[ \sigma \sqrt{ \frac{2}{\pi} } \]
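The mean-deviation formula can be checked by numerically integrating $|x - \mu| f(x)$ against the normal density. A sketch with $\mu = 1$, $\sigma = 2$ (arbitrary choices):

```python
import math

# Numerical check of E|X - mu| = sigma * sqrt(2/pi)
# for a normal density, via a Riemann sum over [mu - 10s, mu + 10s].
mu, sigma, dx = 1.0, 2.0, 1e-3
pdf = lambda x: math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

xs = [mu - 10 * sigma + i * dx for i in range(int(20 * sigma / dx))]
mean_dev = sum(abs(x - mu) * pdf(x) * dx for x in xs)
```

The sum approximates $\sigma\sqrt{2/\pi} \approx 1.596$ for these parameters.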
** Additive property
Suppose distributions X_1, X_2, X_3, \dots, X_n have means \mu_1, \mu_2, \mu_3, \dots, \mu_n and variances \sigma_1^2, \sigma_2^2, \sigma_3^2, \dots, \sigma_n^2 respectively.
\\
Then X_1 + X_2 + X_3 + \dots + X_n will have mean *( \mu_1 + \mu_2 + \mu_3 + \dots + \mu_n )* and variance *( \sigma_1^2 + \sigma_2^2 + \sigma_3^2 + \dots + \sigma_n^2 )*
+ Additive Case
Given,
\[ X_1 \sim N(\mu_1, \sigma_1) \]
\[ X_2 \sim N(\mu_2, \sigma_2) \]
Then,
\[ a X_1 + b X_2 \sim N \left( a \mu_1 + b \mu_2, \sqrt{ a^2 \sigma_1^2 + b^2 \sigma_2^2} \right) \]
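This additive case can be checked by a seeded simulation: sample $aX_1 + bX_2$ many times and compare the empirical mean and standard deviation with the formula. A sketch (all parameter values are arbitrary choices for illustration):

```python
import random
import statistics

# Simulate a*X1 + b*X2 with X1 ~ N(1, 0.5), X2 ~ N(3, 1.5),
# a = 2, b = -1, and compare against the additive-case formula.
random.seed(0)
a, b = 2.0, -1.0
mu1, s1, mu2, s2 = 1.0, 0.5, 3.0, 1.5

samples = [a * random.gauss(mu1, s1) + b * random.gauss(mu2, s2)
           for _ in range(200_000)]
mean = statistics.fmean(samples)   # ~ a*mu1 + b*mu2 = -1
sd = statistics.pstdev(samples)    # ~ sqrt(a^2 s1^2 + b^2 s2^2) = sqrt(3.25)
```

The empirical mean and standard deviation land close to $a\mu_1 + b\mu_2 = -1$ and $\sqrt{a^2\sigma_1^2 + b^2\sigma_2^2} = \sqrt{3.25}$.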
