
Probability

Introduction to Probability

Probability is a field of mathematics that deals with uncertainty and provides tools to measure and analyze how likely events are to occur. It begins with basic concepts such as outcomes, events, and sample spaces, forming the foundation for calculating likelihoods.
Central to probability is the concept of probability measures, which assign values between 0 and 1 to events, indicating their likelihood. A value of 0 means an event is impossible, while 1 signifies certainty. Key principles include independence (events that do not influence each other) and conditional probability, which considers the likelihood of an event given that another has occurred.
Probability also introduces random variables, which assign numerical values to outcomes. These variables are categorized as either discrete (taking specific values, like the result of rolling a die) or continuous (taking any value within a range, like a measured temperature). Summary measures such as the expectation (average value) and the variance (spread or variability) describe the behavior of random variables.
Advanced topics include distributions, such as the binomial, normal, and Poisson distributions, which model specific types of random phenomena. These tools are essential for understanding patterns in random processes and making informed predictions.
Probability is widely applied in science, engineering, finance, and everyday decision-making. It forms the basis for statistics, enabling data-driven insights and predictions, and supports fields like machine learning, risk analysis, and quantum mechanics. By studying probability, students develop skills to reason about uncertainty and draw conclusions from incomplete information.
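The expectation and variance mentioned above can be computed directly from a distribution's outcomes and probabilities. As a minimal Python sketch (using exact fractions for a fair six-sided die):

```python
from fractions import Fraction

# Fair six-sided die: outcomes 1..6, each with probability 1/6.
outcomes = range(1, 7)
p = Fraction(1, 6)

mean = sum(k * p for k in outcomes)                    # E[X]
variance = sum((k - mean) ** 2 * p for k in outcomes)  # Var(X)

print(mean)      # 7/2
print(variance)  # 35/12
```

Using `Fraction` keeps the arithmetic exact, so the familiar values E[X] = 3.5 and Var(X) = 35/12 come out with no rounding error.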

Probability Formulas

Explore probability formulas with explanations and examples

Simple Probability

P(A) = \frac{\text{Number of favorable outcomes}}{\text{Total number of possible outcomes}}

Probability Range of an Event

0 \leq P(A) \leq 1

Complement Rule

P(A') + P(A) = 1

Conditional Probability Basic Formula

P(A \mid B) = \frac{P(A \cap B)}{P(B)}

Bayes' Theorem

P(A \mid B) = \frac{P(B \mid A) \times P(A)}{P(B)}
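Conditional probability and Bayes' theorem can be checked by brute-force counting over a small sample space. A sketch with two fair dice, where A is "the sum is 8" and B is "the first die shows 3" (an illustrative example, not part of the reference):

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered rolls of two fair dice.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) by counting favorable outcomes over the sample space."""
    return Fraction(sum(1 for w in space if event(w)), len(space))

A = lambda w: w[0] + w[1] == 8   # sum is 8
B = lambda w: w[0] == 3          # first die shows 3

p_A, p_B = prob(A), prob(B)
p_AB = prob(lambda w: A(w) and B(w))

# Conditional probability: P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = p_AB / p_B
# Bayes' theorem: P(B|A) = P(A|B) P(B) / P(A)
p_B_given_A = p_A_given_B * p_B / p_A

print(p_A_given_B)  # 1/6
print(p_B_given_A)  # 1/5
```

The Bayes result agrees with direct counting: of the five rolls summing to 8, exactly one has a 3 on the first die.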

Probability of Both Events Occurring (Multiplication Rule, Independent Events)

P(A \cap B) = P(A) \times P(B)

Probability of Either Event Occurring (Addition Rule)

P(A \cup B) = P(A) + P(B) - P(A \cap B)

Probability of Neither Event Occurring (Independent Events)

P(\neg A \cap \neg B) = P(\neg A) \times P(\neg B)

Probability of Exactly One Event Occurring

P(\text{exactly one of } A \text{ or } B) = P(A \cap \neg B) + P(\neg A \cap B)

General Formula for Multiple Independent Events

P(A \cap B \cap C) = P(A) \times P(B) \times P(C)

Probability of Both Disjoint Events Occurring

P(A \cap B) = 0

Probability of Either Disjoint Event Occurring (Addition Rule)

P(A \cup B) = P(A) + P(B)

Probability of Neither Disjoint Event Occurring

P(\neg A \cap \neg B) = 1 - P(A) - P(B)

Conditional Probability for Disjoint Events

P(A \mid B) = 0 \quad \text{and} \quad P(B \mid A) = 0

Generalization to Multiple Disjoint Events

P(A \cup B \cup C \cup \ldots) = P(A) + P(B) + P(C) + \ldots
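The distinction between independent and disjoint events trips many students up. The rules above can be verified by exhaustive counting on two dice (a self-contained Python sketch; the event choices are illustrative):

```python
from fractions import Fraction
from itertools import product

space = list(product(range(1, 7), repeat=2))
prob = lambda e: Fraction(sum(1 for w in space if e(w)), len(space))

# Independent events: parity of one die says nothing about the other.
A = lambda w: w[0] % 2 == 0
B = lambda w: w[1] % 2 == 0
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)  # multiplication rule

# Disjoint events: the sum cannot be 2 and 12 at once.
C = lambda w: sum(w) == 2
D = lambda w: sum(w) == 12
assert prob(lambda w: C(w) and D(w)) == 0
assert prob(lambda w: C(w) or D(w)) == prob(C) + prob(D)   # addition rule
# Neither disjoint event occurring is 1 - P(C) - P(D),
# NOT P(not C) * P(not D): disjoint events are never independent.
assert prob(lambda w: not C(w) and not D(w)) == 1 - prob(C) - prob(D)
print("all checks passed")
```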

Binomial Distribution

Probability Mass Function (PMF)

P(X = k) = \binom{n}{k} p^{k} (1 - p)^{n - k}

Cumulative Distribution Function (CDF)

P(X \leq k) = \sum_{i=0}^{k} \binom{n}{i} p^{i} (1 - p)^{n - i}

Mean (Expected Value)

\mu = E[X] = n p

Variance

\sigma^2 = \operatorname{Var}(X) = n p (1 - p)

Standard Deviation

\sigma = \sqrt{n p (1 - p)}
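The binomial PMF and CDF translate almost symbol-for-symbol into Python using `math.comb`. A minimal sketch (fair-coin numbers chosen for illustration):

```python
from math import comb

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p): comb(n,k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def binom_cdf(n, p, k):
    """P(X <= k): sum the PMF from 0 to k."""
    return sum(binom_pmf(n, p, i) for i in range(k + 1))

n, p = 10, 0.5
print(binom_pmf(n, p, 5))      # 0.24609375  (exactly 5 heads in 10 tosses)
print(binom_cdf(n, p, n))      # 1.0 (total probability)
print(n * p, n * p * (1 - p))  # mean 5.0, variance 2.5
```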

Poisson Distribution

Probability Mass Function (PMF)

P(X = k) = \frac{e^{-\lambda} \lambda^{k}}{k!}

Cumulative Distribution Function (CDF)

P(X \leq k) = e^{-\lambda} \sum_{i=0}^{k} \frac{\lambda^{i}}{i!}

Mean (Expected Value)

\mu = E[X] = \lambda

Variance

\sigma^2 = \operatorname{Var}(X) = \lambda

Standard Deviation

\sigma = \sqrt{\lambda}
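A short Python sketch evaluates the Poisson PMF and CDF and confirms numerically that the mean equals \lambda (the value \lambda = 3 is an arbitrary example):

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

def poisson_cdf(lam, k):
    """P(X <= k): partial sum of the PMF."""
    return sum(poisson_pmf(lam, i) for i in range(k + 1))

lam = 3.0
# Truncating at 100 terms is safe: the tail beyond is astronomically small.
mean = sum(k * poisson_pmf(lam, k) for k in range(100))
print(round(mean, 6))                 # 3.0, matching E[X] = lam
print(round(poisson_cdf(lam, 2), 4))  # 0.4232
```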

Geometric Distribution

Probability Mass Function (PMF)

P(X = k) = (1 - p)^{k - 1} p

Cumulative Distribution Function (CDF)

P(X \leq k) = 1 - (1 - p)^{k}

Mean (Expected Value)

\mu = E[X] = \frac{1}{p}

Variance

\sigma^2 = \operatorname{Var}(X) = \frac{1 - p}{p^{2}}

Standard Deviation

\sigma = \sqrt{\frac{1 - p}{p^{2}}}
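These geometric formulas use the "number of trials until the first success" convention (k = 1, 2, ...). A sketch that checks the closed-form CDF and the mean 1/p, with p = 0.25 as an example:

```python
from math import isclose

def geom_pmf(p, k):
    """P(X = k): first success on trial k (k = 1, 2, ...)."""
    return (1 - p) ** (k - 1) * p

p = 0.25
# The closed-form CDF 1 - (1-p)^k matches summing the PMF term by term.
k = 6
cdf_sum = sum(geom_pmf(p, i) for i in range(1, k + 1))
assert isclose(cdf_sum, 1 - (1 - p) ** k)

# Mean 1/p: on average 4 trials to see a success that has chance 1/4.
mean = sum(i * geom_pmf(p, i) for i in range(1, 10_000))
print(round(mean, 6))  # 4.0
print((1 - p) / p**2)  # variance: 12.0
```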

Negative Binomial Distribution

Probability Mass Function (PMF)

P(X = k) = \binom{k - 1}{r - 1} p^{r} (1 - p)^{k - r}

Cumulative Distribution Function (CDF), where $I_p$ is the regularized incomplete beta function

P(X \leq k) = I_{p}(r, k - r + 1)

Mean (Expected Value)

\mu = E[X] = \frac{r}{p}

Variance

\sigma^2 = \operatorname{Var}(X) = \frac{r (1 - p)}{p^{2}}

Standard Deviation

\sigma = \sqrt{\frac{r (1 - p)}{p^{2}}}
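Here X counts the trial on which the r-th success occurs (k = r, r+1, ...). A sketch that checks the PMF sums to 1 and reproduces the mean r/p (r = 3, p = 0.5 are example values):

```python
from math import comb, isclose

def nbinom_pmf(r, p, k):
    """P(X = k): r-th success occurs on trial k (k = r, r+1, ...)."""
    return comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

r, p = 3, 0.5
# Truncate the infinite support at 200; the tail is negligible for p = 0.5.
pmf = [nbinom_pmf(r, p, k) for k in range(r, 200)]
assert isclose(sum(pmf), 1.0)          # probabilities sum to 1

mean = sum(k * nbinom_pmf(r, p, k) for k in range(r, 200))
print(round(mean, 6))      # 6.0  = r / p
print(r * (1 - p) / p**2)  # variance: 6.0
```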

Hypergeometric Distribution

Probability Mass Function (PMF)

P(X = k) = \frac{\binom{K}{k} \binom{N - K}{n - k}}{\binom{N}{n}}

Mean (Expected Value)

\mu = E[X] = n \left( \frac{K}{N} \right)

Variance

\sigma^2 = \operatorname{Var}(X) = n \left( \frac{K}{N} \right) \left( \frac{N - K}{N} \right) \left( \frac{N - n}{N - 1} \right)

Standard Deviation

\sigma = \sqrt{n \left( \frac{K}{N} \right) \left( \frac{N - K}{N} \right) \left( \frac{N - n}{N - 1} \right)}

Probability of At Least $k$ Successes

P(X \geq k) = 1 - \sum_{i=0}^{k - 1} P(X = i)
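A concrete hypergeometric check in Python: drawing a 5-card hand from a 52-card deck and counting aces (K = 4). The deck example is illustrative, not part of the reference:

```python
from math import comb

def hyper_pmf(N, K, n, k):
    """P(X = k): k marked items in a draw of n from N items, K of them marked."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

N, K, n = 52, 4, 5
mean = sum(k * hyper_pmf(N, K, n, k) for k in range(0, K + 1))
assert abs(mean - n * K / N) < 1e-9       # mean is n * K / N

# "At least k successes" via the complement rule, here k = 1:
at_least_one = 1 - hyper_pmf(N, K, n, 0)
print(round(at_least_one, 4))  # 0.3412 (chance of at least one ace)
```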

Multinomial Distribution

Probability Mass Function (PMF)

P(X_1 = x_1, \dots, X_k = x_k) = \frac{n!}{x_1! x_2! \dots x_k!} p_1^{x_1} p_2^{x_2} \dots p_k^{x_k}

Mean (Expected Value) of Each Outcome

E[X_i] = n p_i

Variance of Each Outcome

\operatorname{Var}(X_i) = n p_i (1 - p_i)

Covariance Between Outcomes

\operatorname{Cov}(X_i, X_j) = -n p_i p_j

Correlation Coefficient Between Outcomes

\rho_{ij} = \frac{\operatorname{Cov}(X_i, X_j)}{\sqrt{\operatorname{Var}(X_i) \operatorname{Var}(X_j)}} = \frac{-p_i p_j}{\sqrt{p_i (1 - p_i) p_j (1 - p_j)}}
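The multinomial PMF is a direct product of a multinomial coefficient and the cell probabilities. A sketch computing the probability that six rolls of a fair die show each face exactly once (an illustrative choice of counts):

```python
from math import factorial, prod

def multinom_pmf(xs, ps):
    """P(X1=x1, ..., Xk=xk) for n = sum(xs) trials with cell probabilities ps."""
    n = sum(xs)
    coef = factorial(n) // prod(factorial(x) for x in xs)
    return coef * prod(p**x for p, x in zip(ps, xs))

# Fair die rolled 6 times: each face appears exactly once.
ps = [1 / 6] * 6
print(round(multinom_pmf([1] * 6, ps), 6))  # 0.015432  (= 720 / 6**6)

# Per-cell moments: E[Xi] = n*pi, Var(Xi) = n*pi*(1-pi), Cov(Xi,Xj) = -n*pi*pj
n, pi, pj = 6, 1 / 6, 1 / 6
print(n * pi, n * pi * (1 - pi), -n * pi * pj)
```

The negative covariance reflects that the cell counts compete for the same n trials: one count running high forces the others low.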

Discrete Uniform Distribution

Probability Mass Function (PMF)

P(X = k) = \frac{1}{b - a + 1}

Mean (Expected Value)

\mu = E[X] = \frac{a + b}{2}

Variance

\sigma^2 = \operatorname{Var}(X) = \frac{(b - a + 1)^2 - 1}{12}

Standard Deviation

\sigma = \sqrt{\frac{(b - a + 1)^2 - 1}{12}}

Cumulative Distribution Function (CDF)

P(X \leq k) = \frac{k - a + 1}{b - a + 1} \quad \text{for } k = a, a+1, \dots, b
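The closed-form mean and variance above can be confirmed against a direct sum with exact arithmetic. A sketch on {1, ..., 10} (the endpoints are an arbitrary example):

```python
from fractions import Fraction

# Discrete uniform on {a, ..., b}.
a, b = 1, 10
n = b - a + 1
pmf = Fraction(1, n)

mean = sum(Fraction(k) * pmf for k in range(a, b + 1))
var = sum((k - mean) ** 2 * pmf for k in range(a, b + 1))

assert mean == Fraction(a + b, 2)        # (a + b) / 2
assert var == Fraction(n**2 - 1, 12)     # ((b - a + 1)^2 - 1) / 12
assert sum(pmf for _ in range(a, b + 1)) == 1
print(mean, var)  # 11/2 33/4
```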

Negative Hypergeometric Distribution

Probability Mass Function (PMF)

P(X = k) = \frac{\binom{k - 1}{r - 1} \binom{N - k}{K - r}}{\binom{N}{K}}

Mean (Expected Value)

\mu = E[X] = \frac{r(N + 1)}{K + 1}

Variance

\sigma^2 = \operatorname{Var}(X) = \frac{r (N + 1)(K + 1 - r)(N - K)}{(K + 1)^{2} (K + 2)}

Standard Deviation

\sigma = \sqrt{\frac{r (N + 1)(K + 1 - r)(N - K)}{(K + 1)^{2} (K + 2)}}

Cumulative Distribution Function (CDF), as a hypergeometric tail: $X \leq k$ exactly when the first $k$ draws contain at least $r$ successes

P(X \leq k) = \sum_{j=r}^{\min(k, K)} \frac{\binom{K}{j} \binom{N - K}{k - j}}{\binom{N}{k}}
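Here X is the draw on which the r-th success appears when sampling without replacement from N items containing K successes. A sketch that enumerates the full support exactly and checks the mean and variance formulas (N = 20, K = 5, r = 2 are example values):

```python
from fractions import Fraction
from math import comb

def pmf(N, K, r, k):
    """P(X = k): r-th success appears on draw k, sampling without replacement."""
    return Fraction(comb(k - 1, r - 1) * comb(N - k, K - r), comb(N, K))

N, K, r = 20, 5, 2
ks = range(r, N - K + r + 1)          # full support of X
assert sum(pmf(N, K, r, k) for k in ks) == 1

mean = sum(k * pmf(N, K, r, k) for k in ks)
var = sum(k**2 * pmf(N, K, r, k) for k in ks) - mean**2

assert mean == Fraction(r * (N + 1), K + 1)
assert var == Fraction(r * (N + 1) * (K + 1 - r) * (N - K),
                       (K + 1) ** 2 * (K + 2))
print(mean, var)  # 7 10
```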

Logarithmic (Log-Series) Distribution

Probability Mass Function (PMF)

P(X = k) = -\frac{1}{\ln(1 - p)} \frac{p^{k}}{k}

Mean (Expected Value)

\mu = E[X] = \frac{-p}{(1 - p) \ln(1 - p)}

Variance

\sigma^2 = \operatorname{Var}(X) = \frac{-p (p + \ln(1 - p))}{(1 - p)^{2} [\ln(1 - p)]^{2}}

Standard Deviation

\sigma = \sqrt{\operatorname{Var}(X)}

Generating Function

G_X(s) = \frac{\ln(1 - p s)}{\ln(1 - p)}
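A final sketch checks the logarithmic distribution numerically: the PMF sums to 1 and the truncated mean matches the closed form (p = 0.5 is an example value, for which the mean is 1/ln 2):

```python
from math import log, isclose

def log_pmf(p, k):
    """P(X = k) for the logarithmic (log-series) distribution, k = 1, 2, ..."""
    return -1 / log(1 - p) * p**k / k

p = 0.5
# Truncate the infinite support at 200; the remaining tail is negligible.
pmf = [log_pmf(p, k) for k in range(1, 200)]
assert isclose(sum(pmf), 1.0)              # probabilities sum to 1

mean = sum(k * log_pmf(p, k) for k in range(1, 200))
expected_mean = -p / ((1 - p) * log(1 - p))
assert isclose(mean, expected_mean)
print(round(mean, 6))  # 1.442695  (= 1 / ln 2)
```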
