And we know P(A) = 0.5, P(B) = 0.3 and P(C) = 0.2. Also P(P|A) = 0.20, P(P|B) =
+ Some other Identities
\[ P(\overline{A} \cap B) + P(A \cap B) = P(B) \]
\[ P(A \cap \overline{B}) + P(A \cap B) = P(A) \]
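These complement identities can be checked by enumeration. Below is a minimal sketch in Python; the events $A$ (even roll) and $B$ (roll greater than 3) over a single fair die are hypothetical examples chosen only for illustration.

```python
from fractions import Fraction

# Sample space: one roll of a fair die (hypothetical example events).
omega = set(range(1, 7))
A = {2, 4, 6}          # even roll
B = {4, 5, 6}          # roll greater than 3

def p(event):
    # Probability of an event as an exact fraction over the sample space.
    return Fraction(len(event), len(omega))

A_bar = omega - A
B_bar = omega - B

# P(A-bar ∩ B) + P(A ∩ B) = P(B)
assert p(A_bar & B) + p(A & B) == p(B)
# P(A ∩ B-bar) + P(A ∩ B) = P(A)
assert p(A & B_bar) + p(A & B) == p(A)
```

Using `Fraction` keeps every probability exact, so the identities hold with `==` rather than approximate floating-point comparison.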
* Probability Function
It is a mathematical function that gives the probability of occurrence of the different possible outcomes. We use variables, called *random variables*, to represent these possible outcomes. They are written as capital letters, for example $X$, $Y$, etc. We use these random variables as follows:
\\
Suppose $X$ represents the outcome of flipping two coins.
\[ X = \{HH, HT, TT, TH\} \]
We can represent it as,
\[ X = \{0, 1, 2, 3\} \]
Now we can write a probability function $P(X=x)$ for flipping two coins as :
#+attr_latex: :align |c|c|
|-----+----------|
| $x$ | $P(X=x)$ |
|-----+----------|
| 0 | 0.25 |
| 1 | 0.25 |
| 2 | 0.25 |
| 3 | 0.25 |
|-----+----------|
Another example is throwing two dice, where our random variable $X$ is the sum of the two dice.
#+attr_latex: :align |c|c|
|-----+----------------|
| $x$ | $P(X=x)$ |
|-----+----------------|
| 2 | $1/36$ |
| 3 | $2/36$ |
| 4 | $3/36$ |
| 5 | $4/36$ |
| 6 | $5/36$ |
| 7 | $6/36$ |
| 8 | $5/36$ |
| 9 | $4/36$ |
| 10 | $3/36$ |
| 11 | $2/36$ |
| 12 | $1/36$ |
|-----+----------------|
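The table above can be reproduced by enumerating all 36 equally likely outcomes of the two dice. A minimal sketch in Python:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Enumerate all 36 equally likely outcomes of two dice and tally the sums.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

assert pmf[2] == Fraction(1, 36)    # only (1,1) sums to 2
assert pmf[7] == Fraction(6, 36)    # six outcomes sum to 7
assert sum(pmf.values()) == 1       # probabilities sum to 1
```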
** Types of probability functions (continuous and discrete random variables)
Based on the range of the random variable, the probability function has two different names.
+ For discrete random variables it is called a Probability Distribution Function.
+ For continuous random variables it is called a Probability Density Function.
* Probability Mass Function
If we can get a function such that,
\[ f(x) = P(X=x) \]
then $f(x)$ is called a *Probability Mass Function* (PMF).
** Properties of Probability Mass Function
Suppose a PMF
\[ f(x) = P(X=x) \]
Then,
*** For discrete variables
\[ \sum_x f(x) = 1 \]
\[ E(X^n) = \sum_x x^n f(x) \]
In both cases the summation runs over all possible values of $x$.
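These two properties can be checked against the dice-sum PMF from earlier. A minimal sketch in Python (the helper name `moment` is my own, not from the notes):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# PMF of the sum of two dice, built by enumeration.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

def moment(n):
    # E(X^n) = sum over x of x^n * f(x)
    return sum(x**n * f for x, f in pmf.items())

assert sum(pmf.values()) == 1    # sum of f(x) over all x is 1
assert moment(1) == 7            # E(X) = 7 for the sum of two dice
```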
* Binomial Distribution
A binomial distribution is used to calculate the probability of a given number of successes when a trial with known success probability is repeated *n* times, i.e., over *n* trials.
A binomial distribution deals with discrete random variables.
\[ X = \{ 0, 1, 2, \ldots, n \} \]
where *n* is the number of trials.
\[ P(X=x) = \ ^nC_x\ (p)^x(q)^{n-x} \]
Here
\[ n \rightarrow number\ of\ trials \]
\[ x \rightarrow number\ of\ successes \]
\[ p \rightarrow probability\ of\ success \]
\[ q \rightarrow probability\ of\ failure \]
\[ p = 1 - q \]
+ Mean
\[ Mean = np \]
+ Variance
\[ Variance = npq \]
+ Moment Generating Function
\[ M(t) = (q + pe^t)^n \]
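The PMF, mean, and variance formulas above can be verified numerically. A minimal sketch in Python; the parameters $n = 10$, $p = 1/4$ are arbitrary illustrative choices:

```python
from fractions import Fraction
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = nCx * p^x * q^(n-x), with q = 1 - p
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

n, p = 10, Fraction(1, 4)                     # illustrative parameters
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

mean = sum(x * f for x, f in enumerate(pmf))
var = sum(x**2 * f for x, f in enumerate(pmf)) - mean**2

assert sum(pmf) == 1
assert mean == n * p               # Mean = np
assert var == n * p * (1 - p)      # Variance = npq
```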
** Additive Property of Binomial Distribution
For a random variable $X$, the binomial distribution is represented as
\[ X \sim B(n, p) \]
Here,
\[ n \rightarrow number\ of\ trials \]
\[ p \rightarrow probability\ of\ success \]
+ Property
If given,
\[ X_1 \sim B(n_1, p) \]
\[ X_2 \sim B(n_2, p) \]
Then,
\[ X_1 + X_2 \sim B(n_1 + n_2, p) \]
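The additive property can be checked by convolving the two PMFs, since the PMF of a sum of independent discrete variables is the convolution of their PMFs. A minimal sketch in Python; the helper names and the parameters $n_1 = 3$, $n_2 = 5$, $p = 1/3$ are illustrative assumptions:

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, p):
    # Full PMF of B(n, p) as a list indexed by x = 0..n.
    q = 1 - p
    return [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

def convolve(f, g):
    # PMF of the sum of two independent discrete random variables.
    h = [Fraction(0)] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            h[i + j] += fi * gj
    return h

p = Fraction(1, 3)
# B(3, p) + B(5, p) has exactly the PMF of B(8, p).
assert convolve(binom_pmf(3, p), binom_pmf(5, p)) == binom_pmf(8, p)
```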
+ *NOTE*
If
\[ X_1 \sim B(n_1, p_1) \]
\[ X_2 \sim B(n_2, p_2) \]
Then $X_1 + X_2$ does not, in general, follow a binomial distribution (unless $p_1 = p_2$).
** Using a binomial distribution
We can use the binomial distribution to easily calculate the probability over multiple trials if the probability of a single trial is known. For example, the probability of a duplet (both dice showing the same number) when two dice are thrown is $\frac{6}{36}$. \\
Suppose now we want the probability of exactly 3 duplets when a pair of dice is thrown 5 times. In this case:
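This setup plugs directly into the binomial PMF, with a throw of the pair as one trial and a duplet as a success. A minimal sketch in Python:

```python
from fractions import Fraction
from math import comb

# One trial = one throw of a pair of dice; success = a duplet.
p = Fraction(6, 36)               # probability of a duplet, 1/6
q = 1 - p
n, x = 5, 3                       # 5 throws, exactly 3 duplets

# P(X = 3) = 5C3 * (1/6)^3 * (5/6)^2
prob = comb(n, x) * p**x * q**(n - x)
assert prob == Fraction(125, 3888)
```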