Classical Definition - P(A) = (number of ways the event can occur) / (number of outcomes in S), assuming all outcomes are equally likely
Relative Frequency - Probability of an event is proportion of times the event occurs in a long series of repetitions of an experiment / process.
- E.g. the probability of getting a 2 from a rolled die is 1/6
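A minimal sketch of the classical definition, assuming a fair six-sided die (the `classical_probability` helper is hypothetical, not from the notes):

```python
from fractions import Fraction

def classical_probability(event, sample_space):
    # Classical definition: (# outcomes in the event) / (# outcomes in S),
    # assuming all outcomes are equally likely
    return Fraction(len(event), len(sample_space))

S = {1, 2, 3, 4, 5, 6}  # sample space for one fair die
A = {2}                 # event: rolling a 2
print(classical_probability(A, S))  # 1/6
```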
Subjective Probability - Probability of an event is a measure of how sure person making the statement is that the event will happen
Discrete - Consists of a finite or countably infinite set of sample points
- Simple definition: let S={a1, a2, a3, …} be a discrete sample space; then
- 0 ≤ P(ai) ≤ 1 for each i
- ∑ P(ai) = 1, where the sum runs over all ai in S
- Odds in favor of an event A is the probability the event occurs divided by the probability it does not occur, or simply P(A) / (1 − P(A)); odds against the event is the reciprocal of this, or simply (1 − P(A)) / P(A)
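The two conditions and the odds formulas can be sketched with a made-up PMF (all values here are hypothetical):

```python
# Hypothetical PMF on a three-point discrete sample space
pmf = {"a1": 0.2, "a2": 0.5, "a3": 0.3}

# 0 <= P(ai) <= 1 for every sample point
assert all(0 <= p <= 1 for p in pmf.values())
# the probabilities sum to 1
assert abs(sum(pmf.values()) - 1) < 1e-12

def odds_in_favor(p):
    return p / (1 - p)       # P(A) / (1 - P(A))

def odds_against(p):
    return (1 - p) / p       # reciprocal of odds in favor

print(odds_in_favor(0.2))    # probability 0.2 gives odds of 1 to 4
```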
Addition Rule - If we can do job 1 in p ways and job 2 in q ways, then we can do either job 1 OR job 2 (but not both) in p + q ways
Multiplication Rule - If we can do job 1 in p ways, and for each of these ways we can do job 2 in q ways, then we can do both job 1 AND job 2 in p × q ways
- n×(n−1)×⋯×1 ordered arrangements of length n using each symbol once and only once. This is denoted by n!
- Permutation - n×(n−1)×⋯×(n−k+1) ordered arrangements of length k ≤ n using each symbol at most once. This product is denoted by n^(k) (read “n to k factors”). Note that n^(k) = n!/(n−k)! (ORDER DOES MATTER)
- Combination - n!/(k!(n−k)!) unordered selections of k of the n symbols (ORDER DOES NOT MATTER)
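Python's standard `math.perm` and `math.comb` compute exactly these counts; a quick sketch checking them against the factorial formulas with n = 5, k = 2:

```python
import math

n, k = 5, 2
# Permutation: n^(k) = n!/(n - k)!  (order matters)
assert math.perm(n, k) == math.factorial(n) // math.factorial(n - k) == 20
# Combination: n!/(k!(n - k)!)     (order does not matter)
assert math.comb(n, k) == math.factorial(n) // (math.factorial(k) * math.factorial(n - k)) == 10
```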
Inclusion Exclusion Principle - P(A∪B)=P(A)+P(B)−P(A∩B)
- 3 events: P(A∪B∪C)=P(A)+P(B)+P(C)−[P(A∩B)+P(A∩C)+P(B∩C)]+P(A∩B∩C)
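A sketch verifying inclusion-exclusion on one roll of a fair die, with illustrative events A = “even” and B = “greater than 3”:

```python
# One roll of a fair die; illustrative events
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even
B = {4, 5, 6}   # greater than 3

def prob(event):
    return len(event) / len(S)  # classical probability on S

# P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = 3/6 + 3/6 - 2/6 = 4/6
assert abs(prob(A | B) - (prob(A) + prob(B) - prob(A & B))) < 1e-12
assert abs(prob(A | B) - 4 / 6) < 1e-12
```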
Conditional Probability - Let A and B be two events such that P(B) > 0; then P(A|B) = P(A∩B) / P(B)
Independence - P(A ∩ B)=P(A)⋅P(B)
Law of Total Probability - If S = B1 ∪ B2 ∪ ⋯ ∪ Bn where Bi ∩ Bj = ∅ for i ≠ j (the Bk partition S), then for any event A, P(A) = ∑ P(A|Bk)·P(Bk), summing k from 1 to n
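A worked sketch of the law, using a hypothetical two-set partition (the values 0.3/0.7 and 0.9/0.2 are invented for illustration):

```python
# Hypothetical partition of S into B1, B2 with P(B1) = 0.3, P(B2) = 0.7,
# and conditional probabilities P(A|B1) = 0.9, P(A|B2) = 0.2
P_B = [0.3, 0.7]
P_A_given_B = [0.9, 0.2]

# P(A) = Σ P(A|Bk)·P(Bk) = 0.9·0.3 + 0.2·0.7 = 0.41
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
assert abs(P_A - 0.41) < 1e-12
```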
Probability Mass Function - f(x) = P(X = x), mapping each possible outcome x to the probability of it happening
Cumulative Distribution Function - F(x) = P(X ≤ x), the running total of the PMF over all outcomes up to and including x
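The PMF-to-CDF relationship can be sketched with a hypothetical four-point distribution:

```python
from itertools import accumulate

# Hypothetical PMF on x = 0, 1, 2, 3
pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}

# CDF: F(x) = P(X <= x), a running total of the PMF
cdf = dict(zip(pmf, accumulate(pmf.values())))

assert abs(cdf[2] - 0.8) < 1e-12   # F(2) = 0.1 + 0.4 + 0.3
assert abs(cdf[3] - 1.0) < 1e-12   # the CDF reaches 1 at the last point
```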
Binomial Distribution - n independent trials with 2 possible outcomes
- success: probability p
- failure: probability 1 - p
- X = # of successes in n trials; then X follows a Binomial Distribution with parameters n and p
- X ~ Bin(n, p)
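The Binomial PMF isn't written out above; the standard formula is f(x) = C(n, x)·p^x·(1 − p)^(n − x), which can be sketched as:

```python
import math

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

# X ~ Bin(4, 0.5): P(X = 2) = C(4, 2)/2^4 = 6/16 = 0.375
assert abs(binom_pmf(2, 4, 0.5) - 0.375) < 1e-12
# the PMF sums to 1 over x = 0, ..., n
assert abs(sum(binom_pmf(x, 4, 0.5) for x in range(5)) - 1) < 1e-12
```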
Geometric Distribution - X is number of trials required to observe first success in a sequence of experiments
- X ~ Geo(p), then E(X) = 1/p and Var(X) = (1 − p)/p^2
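A numerical sketch checking the Geometric moments, using the standard PMF P(X = x) = (1 − p)^(x−1)·p on a truncated support (p = 0.25 is an arbitrary choice):

```python
p = 0.25  # arbitrary success probability

def geo_pmf(x):
    # P(X = x) = (1 - p)^(x - 1) * p, x = 1, 2, ... (trials until first success)
    return (1 - p) ** (x - 1) * p

xs = range(1, 5000)  # truncated support; remaining tail mass is negligible
mean = sum(x * geo_pmf(x) for x in xs)
var = sum(x**2 * geo_pmf(x) for x in xs) - mean**2

assert abs(mean - 1 / p) < 1e-9           # E(X) = 1/p = 4
assert abs(var - (1 - p) / p**2) < 1e-9   # Var(X) = (1 - p)/p^2 = 12
```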
Negative Binomial Distribution - X ~ NB(r, p)
- X is the number of trials required to observe the rth success
- E(X) = r/p
- Var(X) = r(1 − p)/p^2
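Same idea for the Negative Binomial, using the standard PMF P(X = x) = C(x − 1, r − 1)·p^r·(1 − p)^(x − r) for x = r, r + 1, … (r = 3, p = 0.5 are arbitrary):

```python
import math

r, p = 3, 0.5  # arbitrary parameters

def nb_pmf(x):
    # P(X = x) = C(x - 1, r - 1) * p^r * (1 - p)^(x - r), x = r, r + 1, ...
    return math.comb(x - 1, r - 1) * p**r * (1 - p) ** (x - r)

xs = range(r, 2000)  # truncated support; remaining tail mass is negligible
mean = sum(x * nb_pmf(x) for x in xs)
var = sum(x**2 * nb_pmf(x) for x in xs) - mean**2

assert abs(mean - r / p) < 1e-9               # E(X) = r/p = 6
assert abs(var - r * (1 - p) / p**2) < 1e-9   # Var(X) = r(1 - p)/p^2 = 6
```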
Poisson Distribution - X ~ Po(λ)
- P(X = x) = f(x) = e^(−λ) λ^x / x!, for x = 0, 1, 2, …, and λ > 0
- λ - rate / mean “average # of successes”
- E(X)=λ=μ
- Var(X)=λ
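A sketch checking that the Poisson PMF sums to 1 and has mean and variance λ (λ = 3 is an arbitrary choice):

```python
import math

lam = 3.0  # arbitrary rate

def poisson_pmf(x):
    # f(x) = e^(-λ) * λ^x / x!
    return math.exp(-lam) * lam**x / math.factorial(x)

xs = range(0, 100)  # truncated support; tail mass beyond 100 is negligible
total = sum(poisson_pmf(x) for x in xs)
mean = sum(x * poisson_pmf(x) for x in xs)
var = sum(x**2 * poisson_pmf(x) for x in xs) - mean**2

assert abs(total - 1) < 1e-9   # PMF sums to 1
assert abs(mean - lam) < 1e-9  # E(X) = λ
assert abs(var - lam) < 1e-9   # Var(X) = λ
```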