Probability
Introduction to Probability
Probability is a field of mathematics that deals with uncertainty and provides tools to measure and analyze how likely events are to occur. It begins with basic concepts such as outcomes, events, and sample spaces, which form the foundation for calculating likelihoods. Central to probability is the concept of a probability measure, which assigns values between 0 and 1 to events, indicating their likelihood. A value of 0 means an event is impossible, while 1 signifies certainty. Key principles include independence (events that do not influence each other) and conditional probability, the likelihood of an event given that another has occurred.
Probability also introduces random variables, which assign numerical values to outcomes. These variables are categorized as either discrete (taking specific values, like the result of rolling a die) or continuous (taking any value within a range, like a temperature measurement). Summary measures such as the expected value (average value) and variance (spread or variability) describe the behavior of random variables.
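As a concrete illustration of these summary measures, the following sketch computes the expected value and variance of a fair six-sided die by direct summation; the die is a standard textbook example, not taken from any particular dataset.

```python
# A fair six-sided die as a discrete random variable: compute the
# expected value E[X] = sum of x*P(x) and the variance E[(X - mu)^2].
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # each face is equally likely

mu = sum(x * p for x in outcomes)
var = sum((x - mu) ** 2 * p for x in outcomes)

assert abs(mu - 3.5) < 1e-9       # expected value is 3.5
assert abs(var - 35 / 12) < 1e-9  # variance is 35/12, about 2.92
```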
Advanced topics include distributions, such as the binomial, normal, and Poisson distributions, which model specific types of random phenomena. These tools are essential for understanding patterns in random processes and making informed predictions.
Probability is widely applied in science, engineering, finance, and everyday decision-making. It forms the basis for statistics, enabling data-driven insights and predictions, and supports fields like machine learning, risk analysis, and quantum mechanics. By studying probability, students develop skills to reason about uncertainty and draw conclusions from incomplete information.
Probability Formulas
This page presents essential probability formulas organized by categories, ranging from basic principles to advanced distributions. Each formula includes detailed explanations, example calculations, and practical use cases, making it a helpful resource for students and practitioners working with probability theory and statistical analysis.
Simple Probability
$P(A) = \dfrac{\text{Number of favorable outcomes}}{\text{Total number of possible outcomes}}$
Probability Range of an Event
$0 \le P(A) \le 1$
Complement Rule
$P(A') + P(A) = 1$
Conditional Probability Basic Formula
$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$
Bayes' Theorem
$P(A \mid B) = \dfrac{P(B \mid A)\,P(A)}{P(B)}$
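A worked example of Bayes' theorem helps here. The sketch below uses made-up diagnostic-test numbers (1% prevalence, 99% sensitivity, 5% false-positive rate) chosen purely for illustration.

```python
# Bayes' theorem on a hypothetical diagnostic test. All numbers here
# are illustrative assumptions, not data from any real test.
p_disease = 0.01              # P(A): prior probability of disease
p_pos_given_disease = 0.99    # P(B|A): sensitivity
p_pos_given_healthy = 0.05    # P(B|not A): false-positive rate

# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
assert abs(p_disease_given_pos - 1 / 6) < 1e-9  # about 16.7%
```

Despite the positive result, the posterior probability of disease is only about 1/6, because the disease is rare: most positives come from the large healthy population.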
Probability of Both Independent Events Occurring (Multiplication Rule)
$P(A \cap B) = P(A) \times P(B)$
Probability of Either Event Occurring (Addition Rule)
$P(A \cup B) = P(A) + P(B) - P(A \cap B)$
Probability of At Least One Event Not Occurring
$P(\neg A \cup \neg B) = 1 - P(A \cap B)$
Probability of Exactly One Event Occurring
$P(\text{exactly one of } A \text{ or } B) = P(A \cap \neg B) + P(\neg A \cap B)$
General Formula for Multiple Independent Events
$P(A \cap B \cap C) = P(A) \times P(B) \times P(C)$
Probability of Both Disjoint Events Occurring
$P(A \cap B) = 0$
Probability of Either Disjoint Event Occurring (Addition Rule)
$P(A \cup B) = P(A) + P(B)$
Probability of Neither Disjoint Event Occurring
$P(\neg A \cap \neg B) = 1 - P(A) - P(B)$
Conditional Probability for Disjoint Events
$P(A \mid B) = 0 \quad \text{and} \quad P(B \mid A) = 0$
Generalization to Multiple Disjoint Events
$P(A \cup B \cup C \cup \ldots) = P(A) + P(B) + P(C) + \ldots$
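These rules can be sanity-checked by brute-force enumeration of a small sample space. The sketch below uses a six-sided die with events chosen for illustration.

```python
from fractions import Fraction

# Verify the addition rule by enumerating a die's sample space.
sample_space = set(range(1, 7))
A = {2, 4, 6}   # "even"
B = {1, 2, 3}   # "at most three"
C = {1, 3, 5}   # "odd" (disjoint from A)

def P(event):
    # Classical probability: favorable outcomes / total outcomes.
    return Fraction(len(event), len(sample_space))

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Disjoint events: the intersection is empty, so the overlap term vanishes.
assert P(A & C) == 0
assert P(A | C) == P(A) + P(C)
```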
Binomial Distribution
Probability Mass Function (PMF)
$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}$
Cumulative Distribution Function (CDF)
$P(X \le k) = \sum_{i=0}^{k} \binom{n}{i} p^i (1 - p)^{n - i}$
Mean (Expected Value)
$\mu = E[X] = np$
Variance
$\sigma^2 = \mathrm{Var}(X) = np(1 - p)$
Standard Deviation
$\sigma = \sqrt{np(1 - p)}$
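A quick numerical check of these formulas, using illustrative parameters n = 10 and p = 0.3: the sketch builds the PMF from `math.comb` and confirms it sums to 1 and reproduces the stated mean and variance.

```python
import math

# Binomial(n = 10, p = 0.3); the parameters are illustrative.
n, p = 10, 0.3

def binom_pmf(k):
    # P(X = k) = C(n, k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

pmf = [binom_pmf(k) for k in range(n + 1)]
mean = sum(k * pmf[k] for k in range(n + 1))
var = sum((k - mean) ** 2 * pmf[k] for k in range(n + 1))

assert abs(sum(pmf) - 1) < 1e-9            # probabilities sum to 1
assert abs(mean - n * p) < 1e-9            # mu = np = 3.0
assert abs(var - n * p * (1 - p)) < 1e-9   # sigma^2 = np(1-p) = 2.1
```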
Poisson Distribution
Probability Mass Function (PMF)
$P(X = k) = \dfrac{e^{-\lambda} \lambda^k}{k!}$
Cumulative Distribution Function (CDF)
$P(X \le k) = e^{-\lambda} \sum_{i=0}^{k} \dfrac{\lambda^i}{i!}$
Mean (Expected Value)
$\mu = E[X] = \lambda$
Variance
$\sigma^2 = \mathrm{Var}(X) = \lambda$
Standard Deviation
$\sigma = \sqrt{\lambda}$
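The equality of mean and variance is easy to verify numerically. The sketch below uses an illustrative rate of λ = 4 and truncates the infinite sums, which is harmless since the tail is negligible.

```python
import math

# Poisson with lambda = 4 (illustrative rate).
lam = 4.0

def poisson_pmf(k):
    # P(X = k) = e^(-lam) lam^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

ks = range(100)  # 100 terms is ample for lam = 4; the tail is ~0
total = sum(poisson_pmf(k) for k in ks)
mean = sum(k * poisson_pmf(k) for k in ks)
var = sum((k - lam) ** 2 * poisson_pmf(k) for k in ks)

assert abs(total - 1) < 1e-9
assert abs(mean - lam) < 1e-9   # E[X] = lambda
assert abs(var - lam) < 1e-9    # Var(X) = lambda as well
```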
Geometric Distribution
Probability Mass Function (PMF)
$P(X = k) = (1 - p)^{k - 1} p$
Cumulative Distribution Function (CDF)
$P(X \le k) = 1 - (1 - p)^k$
Mean (Expected Value)
$\mu = E[X] = \dfrac{1}{p}$
Variance
$\sigma^2 = \mathrm{Var}(X) = \dfrac{1 - p}{p^2}$
Standard Deviation
$\sigma = \dfrac{\sqrt{1 - p}}{p}$
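The closed-form CDF can be checked against a direct sum of the PMF. The parameter p = 0.25 below is illustrative; X counts the trials up to and including the first success.

```python
# Geometric distribution with p = 0.25 (illustrative).
p = 0.25

def geom_pmf(k):
    # P(X = k) = (1 - p)^(k - 1) * p for k = 1, 2, ...
    return (1 - p) ** (k - 1) * p

# The closed-form CDF 1 - (1 - p)^k matches summing the PMF directly.
for k in (1, 3, 10):
    assert abs(sum(geom_pmf(i) for i in range(1, k + 1))
               - (1 - (1 - p) ** k)) < 1e-12

# A truncated series for the mean agrees with E[X] = 1/p = 4.
mean = sum(k * geom_pmf(k) for k in range(1, 2000))
assert abs(mean - 1 / p) < 1e-9
```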
Negative Binomial Distribution
Probability Mass Function (PMF)
$P(X = k) = \binom{k - 1}{r - 1} p^r (1 - p)^{k - r}$
Cumulative Distribution Function (CDF)
$P(X \le k) = I_p(r, k - r + 1)$, where $I_p$ denotes the regularized incomplete beta function
Mean (Expected Value)
$\mu = E[X] = \dfrac{r}{p}$
Variance
$\sigma^2 = \mathrm{Var}(X) = \dfrac{r(1 - p)}{p^2}$
Standard Deviation
$\sigma = \dfrac{\sqrt{r(1 - p)}}{p}$
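Here X counts the trials needed to reach the r-th success. A numerical sketch with illustrative parameters r = 3, p = 0.4 confirms the mean and variance formulas by summing the PMF.

```python
import math

# Negative binomial: trials until the r-th success (r = 3, p = 0.4,
# both chosen for illustration).
r, p = 3, 0.4

def nbinom_pmf(k):
    # P(X = k) = C(k-1, r-1) p^r (1-p)^(k-r) for k >= r
    return math.comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

ks = range(r, 500)  # tail beyond 500 trials is negligible
assert abs(sum(nbinom_pmf(k) for k in ks) - 1) < 1e-9

mean = sum(k * nbinom_pmf(k) for k in ks)
var = sum((k - mean) ** 2 * nbinom_pmf(k) for k in ks)
assert abs(mean - r / p) < 1e-9               # E[X] = r/p = 7.5
assert abs(var - r * (1 - p) / p**2) < 1e-9   # Var = r(1-p)/p^2 = 11.25
```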
Hypergeometric Distribution
Probability Mass Function (PMF)
$P(X = k) = \dfrac{\binom{K}{k} \binom{N - K}{n - k}}{\binom{N}{n}}$
Mean (Expected Value)
$\mu = E[X] = n \dfrac{K}{N}$
Variance
$\sigma^2 = \mathrm{Var}(X) = n \dfrac{K}{N} \cdot \dfrac{N - K}{N} \cdot \dfrac{N - n}{N - 1}$
Standard Deviation
$\sigma = \sqrt{n \dfrac{K}{N} \cdot \dfrac{N - K}{N} \cdot \dfrac{N - n}{N - 1}}$
Probability of At Least $k$ Successes
$P(X \ge k) = 1 - \sum_{i=0}^{k-1} P(X = i)$
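The PMF can be evaluated exactly with binomial coefficients. The sketch below draws n = 5 items from a population of N = 20 containing K = 8 successes; the parameters are illustrative.

```python
import math

# Hypergeometric: n = 5 draws without replacement from N = 20 items,
# K = 8 of which are "successes" (illustrative parameters).
N, K, n = 20, 8, 5

def hyper_pmf(k):
    # P(X = k) = C(K, k) C(N-K, n-k) / C(N, n)
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

ks = range(max(0, n - (N - K)), min(n, K) + 1)  # support of X
pmf = {k: hyper_pmf(k) for k in ks}
mean = sum(k * q for k, q in pmf.items())

assert abs(sum(pmf.values()) - 1) < 1e-12
assert abs(mean - n * K / N) < 1e-9   # E[X] = nK/N = 2.0
```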
Multinomial Distribution
Probability Mass Function (PMF)
$P(X_1 = x_1, \ldots, X_k = x_k) = \dfrac{n!}{x_1!\, x_2! \cdots x_k!}\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$
Mean (Expected Value) of Each Outcome
$E[X_i] = n p_i$
Variance of Each Outcome
$\mathrm{Var}(X_i) = n p_i (1 - p_i)$
Covariance Between Outcomes
$\mathrm{Cov}(X_i, X_j) = -n p_i p_j \quad (i \ne j)$
Correlation Coefficient Between Outcomes
$\rho_{ij} = \dfrac{\mathrm{Cov}(X_i, X_j)}{\sqrt{\mathrm{Var}(X_i)\,\mathrm{Var}(X_j)}} = \dfrac{-p_i p_j}{\sqrt{p_i (1 - p_i)\, p_j (1 - p_j)}}$
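The negative covariance between counts can be verified by exact enumeration rather than simulation. The sketch below uses a small illustrative case, n = 4 trials over three outcomes, and enumerates all 3⁴ trial sequences.

```python
import itertools
from collections import Counter

# Multinomial with n = 4 trials over three outcomes (illustrative
# probabilities). Verify Cov(X_i, X_j) = -n p_i p_j by exact enumeration.
n = 4
probs = [0.5, 0.3, 0.2]

ex = [0.0, 0.0, 0.0]  # E[X_i]
exy = 0.0             # E[X_0 * X_1]
for seq in itertools.product(range(3), repeat=n):
    p_seq = 1.0
    for outcome in seq:
        p_seq *= probs[outcome]       # probability of this exact sequence
    counts = Counter(seq)
    for i in range(3):
        ex[i] += counts[i] * p_seq
    exy += counts[0] * counts[1] * p_seq

assert abs(ex[0] - n * probs[0]) < 1e-9          # E[X_i] = n p_i
cov01 = exy - ex[0] * ex[1]
assert abs(cov01 + n * probs[0] * probs[1]) < 1e-9  # Cov = -0.6
```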
Discrete Uniform Distribution
Probability Mass Function (PMF)
$P(X = k) = \dfrac{1}{b - a + 1}$
Mean (Expected Value)
$\mu = E[X] = \dfrac{a + b}{2}$
Variance
$\sigma^2 = \mathrm{Var}(X) = \dfrac{(b - a + 1)^2 - 1}{12}$
Standard Deviation
$\sigma = \sqrt{\dfrac{(b - a + 1)^2 - 1}{12}}$
Cumulative Distribution Function (CDF)
$P(X \le k) = \dfrac{k - a + 1}{b - a + 1} \quad \text{for } k = a, a + 1, \ldots, b$
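These formulas can be confirmed with exact rational arithmetic; the fair die (a = 1, b = 6) is the standard instance.

```python
from fractions import Fraction

# Discrete uniform on {a, ..., b}; a fair die is a = 1, b = 6.
a, b = 1, 6
m = b - a + 1
pmf = Fraction(1, m)  # every value has probability 1/(b - a + 1)

mean = sum(Fraction(k) * pmf for k in range(a, b + 1))
var = sum((Fraction(k) - mean) ** 2 * pmf for k in range(a, b + 1))

assert mean == Fraction(a + b, 2)      # (a + b) / 2 = 7/2
assert var == Fraction(m**2 - 1, 12)   # ((b - a + 1)^2 - 1) / 12 = 35/12
```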
Negative Hypergeometric Distribution
Probability Mass Function (PMF)
$P(X = k) = \dfrac{\binom{k - 1}{r - 1} \binom{N - k}{K - r}}{\binom{N}{K}}$
Mean (Expected Value)
$\mu = E[X] = \dfrac{r(N + 1)}{K + 1}$
Variance
$\sigma^2 = \dfrac{r(N + 1)(N - K)(K + 1 - r)}{(K + 1)^2 (K + 2)}$
Standard Deviation
$\sigma = \sqrt{\dfrac{r(N + 1)(N - K)(K + 1 - r)}{(K + 1)^2 (K + 2)}}$
Cumulative Distribution Function (CDF)
$P(X \le k) = \sum_{i=r}^{k} P(X = i)$ (sum of the PMF; there is no simple elementary closed form)
Logarithmic Distribution
Probability Mass Function (PMF)
$P(X = k) = \dfrac{-1}{\ln(1 - p)} \cdot \dfrac{p^k}{k}$
Mean (Expected Value)
$\mu = E[X] = \dfrac{-p}{(1 - p)\ln(1 - p)}$
Variance
$\sigma^2 = \mathrm{Var}(X) = \dfrac{-p\,\bigl(p + \ln(1 - p)\bigr)}{(1 - p)^2 [\ln(1 - p)]^2}$
Standard Deviation
$\sigma = \sqrt{\mathrm{Var}(X)}$
Generating Function
$G_X(s) = \dfrac{\ln(1 - ps)}{\ln(1 - p)}$
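The normalizing constant comes from the series $\sum_{k \ge 1} p^k / k = -\ln(1 - p)$, which the sketch below verifies numerically with an illustrative parameter p = 0.6.

```python
import math

# Logarithmic distribution with p = 0.6 (illustrative parameter).
p = 0.6
c = -1 / math.log(1 - p)  # normalizing constant -1/ln(1 - p)

def log_pmf(k):
    # P(X = k) = c * p^k / k for k = 1, 2, ...
    return c * p**k / k

ks = range(1, 500)  # tail beyond 500 is negligible for p = 0.6
assert abs(sum(log_pmf(k) for k in ks) - 1) < 1e-9  # PMF sums to 1

mean = sum(k * log_pmf(k) for k in ks)
# E[X] = -p / ((1 - p) ln(1 - p))
assert abs(mean - (-p / ((1 - p) * math.log(1 - p)))) < 1e-9
```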
Probability Terms and Definitions
Probability
A measure of the likelihood that a specific event will occur, expressed as a value between 0 and 1.
Random Experiment
A process or action that produces uncertain outcomes, such as rolling a die or tossing a coin.
Sample Space
The set of all possible outcomes of a random experiment.
Event
A subset of the sample space, representing one or more outcomes of interest in a random experiment.
Elementary Event
A single outcome from the sample space that cannot be decomposed further.
Set Operations
Mathematical operations (like union, intersection, and complement) used to combine or relate sets.
Null Set
A set with no elements, representing an impossible event in probability.
Union of Sets
A set that contains all elements from either or both of the sets being combined.
Intersection of Sets
A set containing only the elements that are common to all sets being compared.
Disjoint Sets
Sets that have no elements in common.
Venn Diagram
A graphical representation of sets and their relationships using overlapping circles.
Complement of a Set
The set of elements in the sample space that are not in the given set.
De Morgan's Laws
Mathematical rules relating the complement of a union or intersection of sets to the intersection or union of their complements.
Relative Frequency
The ratio of the number of times an event occurs to the total number of trials or observations.
Probability Measure
A function assigning a probability value to events within a sample space while satisfying certain axioms.
Axiomatic Probability
A formal framework defining probability using a set of axioms ensuring logical consistency.
Elementary Properties of Probability
Basic rules of probability, including values between 0 and 1, and relationships between events like union and intersection.
Equally Likely Events
Events with the same probability of occurrence in an experiment.
Conditional Probability
The probability of one event occurring given that another event has occurred.
Bayes' Rule
A formula to update the probability of an event based on new information about related events.
Total Probability
A theorem that expresses the probability of an event as the sum of probabilities of it occurring under different conditions.
Independent Events
Events whose occurrences are not influenced by each other.
Mutual Exclusiveness
A condition where two or more events cannot occur simultaneously.
Bonferroni's Inequality
A relationship providing bounds for the probability of the union of events.
Boole's Inequality
An upper bound on the probability of the union of several events.
Bernoulli Experiment
A random experiment with exactly two possible outcomes, typically labeled as success and failure.
Sequence of Bernoulli Trials
Repeated independent Bernoulli experiments where the probability of success remains constant across trials.
Random Variable
A function that assigns numerical values to outcomes in a sample space, enabling the study of probabilities of events.
Cumulative Distribution Function
A function that gives the probability that a random variable is less than or equal to a given value.
Probability Mass Function
A function that specifies the probability of each possible value for a discrete random variable.
Probability Density Function
A function whose integral over an interval gives the probability that a continuous random variable falls within that interval.
Discrete Random Variable
A random variable with a countable set of possible values.
Continuous Random Variable
A random variable with an uncountable set of values, typically forming an interval on the real number line.
Expected Value
The weighted average of all possible values of a random variable, reflecting its long-term average.
Variance
A measure of the spread or dispersion of a random variable, calculated as the average squared deviation from the mean.
Standard Deviation
The square root of the variance, providing a measure of spread in the same units as the random variable.
Bernoulli Distribution
A discrete distribution describing the outcome of a single trial with two possible outcomes, success and failure.
Binomial Distribution
A discrete distribution of the number of successes in a fixed number of independent Bernoulli trials.
Poisson Distribution
A discrete distribution modeling the number of events occurring in a fixed interval, assuming events occur independently.
Uniform Distribution
A distribution where all outcomes in a specified range are equally likely.
Exponential Distribution
A continuous distribution describing the time between events in a Poisson process, with the memoryless property.
Normal Distribution
A continuous distribution characterized by its bell-shaped curve, symmetric about the mean.
Rayleigh Distribution
A continuous distribution often used in signal processing, describing the magnitude of a vector in two dimensions.
Gamma Distribution
A continuous distribution that generalizes the exponential distribution, used in reliability and queuing models.
Markov Property
The memoryless property where the future state depends only on the current state and not on past states.
Central Limit Theorem
A theorem stating that the sum of many independent random variables tends toward a normal distribution, regardless of the original distributions.
Hypergeometric Distribution
A discrete distribution describing probabilities in draws without replacement from a finite population.
Geometric Distribution
A discrete distribution representing the number of trials needed to get the first success in repeated Bernoulli trials.
Chebyshev's Inequality
A statistical inequality providing a bound on the probability that a random variable deviates from its mean.
Markov Inequality
An inequality bounding the probability of a non-negative random variable exceeding a given value.
Bivariate Random Variable
A pair of random variables considered together, forming a two-dimensional vector defined on the same sample space.
Joint Cumulative Distribution Function
A function that gives the probability that two random variables simultaneously take on values less than or equal to specific values.
Marginal Distribution
The probability distribution of one random variable obtained by summing or integrating out the other variable in a joint distribution.
Joint Probability Mass Function
A function giving the probability that two discrete random variables simultaneously take on specific values.
Joint Probability Density Function
A function representing the probability density of two continuous random variables taking on specific values.
Covariance
A measure of how two random variables change together, indicating the direction of their relationship.
Correlation Coefficient
A normalized measure of the linear relationship between two variables, ranging from -1 to 1.
Conditional Probability Mass Function
The probability distribution of a discrete random variable given that another discrete random variable takes a specific value.
Conditional Probability Density Function
The probability density of a continuous random variable given that another continuous random variable takes a specific value.
Conditional Expectation
The expected value of one random variable given the value of another random variable.
Conditional Variance
The variance of a random variable given that another random variable takes on a specific value.
N-Variate Random Variables
A set of multiple random variables considered as a vector, defining a multi-dimensional space.
Multinomial Distribution
A generalization of the binomial distribution for more than two possible outcomes in each trial.
Bivariate Normal Distribution
A distribution where two continuous random variables are jointly normally distributed.
N-Variate Normal Distribution
A generalization of the bivariate normal distribution to more than two dimensions.
Independent Random Variables
Random variables whose outcomes do not influence each other's probabilities.
Orthogonal Random Variables
Random variables satisfying E[XY] = 0, i.e., the expected value of their product is zero.
Uncorrelated Random Variables
Random variables with zero correlation coefficient, implying no linear relationship.
Moment of a Random Variable
A quantitative measure of the shape of the variable's probability distribution, derived as the expected value of its powers.
Central Limit Theorem for N-Variate
A theorem stating that the sum of multiple independent random variables approximates a multivariate normal distribution under certain conditions.
Function of a Random Variable
A rule that assigns a new random variable based on a transformation of an existing one, typically denoted as Y=g(X).
Probability Density Function of a Transformed Variable
The function that describes the distribution of probabilities for a random variable obtained through transformation.
Moment Generating Function
A function used to describe all moments of a random variable, defined as the expected value of e^(tX) for a real parameter t.
Characteristic Function
The Fourier transform of a probability distribution, useful for studying the properties and behaviors of random variables.
Weak Law of Large Numbers
A theorem stating that the sample mean of independent, identically distributed random variables converges in probability to their true mean as the sample size increases.
Strong Law of Large Numbers
A theorem that states the sample mean almost surely converges to the true mean as the sample size grows infinitely large.
Central Limit Theorem
A fundamental result in probability theory stating that the sum of a large number of independent, identically distributed random variables will be approximately normally distributed.
A structured guide to probability terminology, progressing from foundational definitions through set theory and random variables to more advanced distributions. The content covers both theoretical aspects and practical applications, making probability concepts more accessible for study and reference.