Probability Terms and Definitions

Basic Definitions

Probability



Definition:

A measure of the likelihood that a specific event will occur, expressed as a value between 0 and 1.

Random Experiment



Definition:

A process or action that produces uncertain outcomes, such as rolling a die or tossing a coin.

Sample Space



Definition:

The set of all possible outcomes of a random experiment.

Event



Definition:

A subset of the sample space, representing one or more outcomes of interest in a random experiment.

Elementary Event



Definition:

A single outcome from the sample space that cannot be decomposed further.

Set Operations

Set Operations



Definition:

Mathematical operations (like union, intersection, and complement) used to combine or relate sets.

Null Set



Definition:

A set with no elements, representing an impossible event in probability.

Union Of Sets



Definition:

A set that contains all elements from either or both of the sets being combined.

Intersection Of Sets



Definition:

A set containing only the elements that are common to all sets being compared.

Disjoint Sets



Definition:

Sets that have no elements in common.

Venn Diagram



Definition:

A graphical representation of sets and their relationships using overlapping circles.

Complement Of A Set



Definition:

The set of elements in the sample space that are not in the given set.

De Morgan's Laws



Definition:

Mathematical rules relating the complement of a union or intersection of sets to the intersection or union of their complements.
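
The set operations above, including De Morgan's laws, can be checked directly with Python's built-in set type (the universe and the sets here are arbitrary examples):

```python
# Arbitrary example sets inside a small universal set.
universe = set(range(10))
A = {1, 2, 3}
B = {3, 4, 5}

def complement(s):
    """Elements of the universe not in s."""
    return universe - s

# De Morgan's laws: the complement of a union is the intersection
# of the complements, and vice versa.
assert complement(A | B) == complement(A) & complement(B)
assert complement(A & B) == complement(A) | complement(B)
```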

Basic Probability

Relative Frequency



Definition:

The ratio of the number of times an event occurs to the total number of trials or observations.
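
For example, the relative frequency of heads in simulated fair-coin tosses settles near 0.5 (a minimal Python sketch; the seed is arbitrary and only makes the run reproducible):

```python
import random

random.seed(1)  # arbitrary seed for a reproducible run
tosses = 10_000
heads = sum(random.random() < 0.5 for _ in range(tosses))
rel_freq = heads / tosses  # settles near 0.5 as the number of tosses grows
```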

Probability Measure



Definition:

A function assigning a probability value to events within a sample space while satisfying certain axioms.

Axiomatic Probability



Definition:

A formal framework defining probability using a set of axioms ensuring logical consistency.

Elementary Properties Of Probability



Definition:

Basic rules of probability, including values between 0 and 1, and relationships between events like union and intersection.

Equally Likely Events



Definition:

Events with the same probability of occurrence in an experiment.

Independent Events



Definition:

Events whose occurrences are not influenced by each other.

Mutual Exclusiveness



Definition:

A condition where two or more events cannot occur simultaneously.

Conditional Probability

Conditional Probability



Definition:

The probability of one event occurring given that another event has occurred.
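
In symbols, for events A and B with P(B) > 0:

```latex
P(A \mid B) = \frac{P(A \cap B)}{P(B)}
```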

Bayes' Rule



Definition:

A formula to update the probability of an event based on new information about related events.
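
A small Python sketch of Bayes' rule. The screening numbers below (1% prevalence, 99% sensitivity, 5% false-positive rate) are hypothetical and chosen only for illustration:

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """P(A | B) = P(B | A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical screening test: 1% prevalence, 99% sensitivity,
# 5% false-positive rate among the healthy.
p_disease = 0.01
p_pos_given_disease = 0.99
p_pos_given_healthy = 0.05

# Total probability of a positive result.
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

p_disease_given_pos = bayes(p_pos_given_disease, p_disease, p_pos)
```

Despite the accurate test, `p_disease_given_pos` works out to about 0.17, because the condition is rare to begin with.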

Total Probability



Definition:

A theorem that expresses the probability of an event as the sum of probabilities of it occurring under different conditions.
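
For a partition B_1, ..., B_n of the sample space:

```latex
P(A) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)
```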

Conditional Probability Mass Function



Definition:

The probability distribution of a discrete random variable given that another discrete random variable takes a specific value.

Conditional Probability Density Function



Definition:

The probability density of a continuous random variable given that another continuous random variable takes a specific value.

Inequalities

Bonferroni's Inequality



Definition:

A family of inequalities bounding the probability of a union of events by truncating the inclusion-exclusion expansion; the simplest form gives P(A ∩ B) ≥ P(A) + P(B) − 1.

Boole's Inequality



Definition:

An upper bound on the probability of the union of several events.

Chebyshev's Inequality



Definition:

A statistical inequality providing a bound on the probability that a random variable deviates from its mean.
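
The bound can be verified exactly for a fair six-sided die using exact rational arithmetic (a minimal sketch; the threshold k = 2 is arbitrary):

```python
from fractions import Fraction

# Exact check of Chebyshev's bound for a fair six-sided die.
values = range(1, 7)
mean = Fraction(sum(values), 6)                                # 7/2
variance = sum((Fraction(v) - mean) ** 2 for v in values) / 6  # 35/12

k = 2  # deviation threshold
p_deviate = Fraction(sum(1 for v in values if abs(v - mean) >= k), 6)
bound = variance / k ** 2

assert p_deviate <= bound  # 1/3 <= 35/48
```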

Markov Inequality



Definition:

An inequality bounding the probability of a non-negative random variable exceeding a given value.
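
For a non-negative random variable X and any a > 0:

```latex
P(X \ge a) \le \frac{\mathbb{E}[X]}{a}
```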

Random Variables

Bernoulli Experiment



Definition:

A random experiment with exactly two possible outcomes, typically labeled as success and failure.

Sequence Of Bernoulli Trials



Definition:

Repeated independent Bernoulli experiments where the probability of success remains constant across trials.

Random Variable



Definition:

A function that assigns numerical values to outcomes in a sample space, enabling the study of probabilities of events.

Discrete Random Variable



Definition:

A random variable with a countable set of possible values.

Continuous Random Variable



Definition:

A random variable with an uncountable set of values, typically forming an interval on the real number line.

Independent Random Variables



Definition:

Random variables whose outcomes do not influence each other's probabilities.

Orthogonal Random Variables



Definition:

Random variables whose expected product is zero (E[XY] = 0); for zero-mean variables this coincides with being uncorrelated.

Uncorrelated Random Variables



Definition:

Random variables with zero correlation coefficient, implying no linear relationship.

Distribution Functions

Cumulative Distribution Function



Definition:

A function that gives the probability that a random variable is less than or equal to a given value.

Probability Mass Function



Definition:

A function that specifies the probability of each possible value for a discrete random variable.

Probability Density Function



Definition:

A function whose integral over an interval gives the probability that a continuous random variable falls within that interval.

Probability Density Function Of A Transformed Variable



Definition:

The function that describes the distribution of probabilities for a random variable obtained through transformation.

Statistical Measures

Expected Value



Definition:

The weighted average of all possible values of a random variable, reflecting its long-term average.

Variance



Definition:

A measure of the spread or dispersion of a random variable, calculated as the average squared deviation from the mean.

Standard Deviation



Definition:

The square root of the variance, providing a measure of spread in the same units as the random variable.
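
The three measures above can be computed from first principles for a fair six-sided die (a minimal Python sketch):

```python
import math

# Fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(v * p for v, p in zip(values, probs))                    # 3.5
variance = sum((v - mean) ** 2 * p for v, p in zip(values, probs))  # 35/12
std_dev = math.sqrt(variance)
```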

Covariance



Definition:

A measure of how two random variables change together, indicating the direction of their relationship.

Correlation Coefficient



Definition:

A normalized measure of the linear relationship between two variables, ranging from -1 to 1.
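
Covariance and the correlation coefficient from first principles (these are the population versions, dividing by n; the data points are arbitrary):

```python
def covariance(xs, ys):
    """Population covariance of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def correlation(xs, ys):
    """Pearson correlation coefficient, always in [-1, 1]."""
    return covariance(xs, ys) / (covariance(xs, xs) * covariance(ys, ys)) ** 0.5

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]  # exactly linear in xs, so the correlation is 1
assert abs(correlation(xs, ys) - 1.0) < 1e-12
```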

Conditional Expectation



Definition:

The expected value of one random variable given the value of another random variable.

Conditional Variance



Definition:

The variance of a random variable given that another random variable takes on a specific value.

Moment Of A Random Variable



Definition:

A quantitative measure of the shape of the variable's probability distribution, derived as the expected value of its powers.

Probability Distributions

Bernoulli Distribution



Definition:

A discrete distribution describing the outcome of a single trial with two possible outcomes, success and failure.

Binomial Distribution



Definition:

A discrete distribution of the number of successes in a fixed number of independent Bernoulli trials.
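
The binomial probability mass function follows directly from counting arrangements of successes (a minimal sketch using the standard library):

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(exactly k successes in n independent trials, each succeeding with probability p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability of exactly 3 heads in 5 fair coin tosses.
p3 = binomial_pmf(3, 5, 0.5)  # 10/32 = 0.3125
```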

Poisson Distribution



Definition:

A discrete distribution modeling the number of events occurring in a fixed interval, assuming events occur independently.
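
The Poisson probability mass function in a few lines (the rate of 2 events per interval is an arbitrary example value):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(exactly k events in the interval) when events arrive at average rate lam."""
    return lam ** k * exp(-lam) / factorial(k)

# With an average of 2 events per interval, P(no events) = e^(-2).
p0 = poisson_pmf(0, 2.0)
```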

Uniform Distribution



Definition:

A distribution where all outcomes in a specified range are equally likely.

Exponential Distribution



Definition:

A continuous distribution describing the time between events in a Poisson process, with the memoryless property.
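
The memoryless property can be checked numerically from the survival function P(X > x) = e^(-λx) (a minimal sketch; the rate and waiting times are arbitrary):

```python
from math import exp

def survival(x: float, lam: float) -> float:
    """P(X > x) for an exponential variable with rate lam."""
    return exp(-lam * x)

# Memoryless property: having already waited s, the chance of waiting
# at least t more equals the unconditional chance of waiting at least t.
lam, s, t = 0.5, 2.0, 3.0  # arbitrary rate and times
lhs = survival(s + t, lam) / survival(s, lam)  # P(X > s + t | X > s)
rhs = survival(t, lam)
assert abs(lhs - rhs) < 1e-12
```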

Normal Distribution



Definition:

A continuous distribution characterized by its bell-shaped curve, symmetric about the mean.

Rayleigh Distribution



Definition:

A continuous distribution, common in signal processing, describing the magnitude of a two-dimensional vector whose components are independent, zero-mean normal variables.

Gamma Distribution



Definition:

A continuous distribution that generalizes the exponential distribution, used in reliability and queuing models.

Hypergeometric Distribution



Definition:

A discrete distribution describing probabilities in draws without replacement from a finite population.

Geometric Distribution



Definition:

A discrete distribution representing the number of trials needed to get the first success in repeated Bernoulli trials.

Multinomial Distribution



Definition:

A generalization of the binomial distribution for more than two possible outcomes in each trial.

Bivariate Normal Distribution



Definition:

A distribution where two continuous random variables are jointly normally distributed.

N-Variate Normal Distribution



Definition:

A generalization of the bivariate normal distribution to more than two dimensions.

Advanced Concepts

Markov Property



Definition:

The memoryless property where the future state depends only on the current state and not on past states.

Central Limit Theorem



Definition:

A theorem stating that the suitably normalized sum of many independent, identically distributed random variables with finite variance tends toward a normal distribution, regardless of the original distribution.
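
A quick simulation (the seed is arbitrary): each observation below is a sum of 30 uniform draws, so by the CLT the sums are approximately normal with mean 30 × 0.5 = 15 and variance 30 × 1/12 = 2.5.

```python
import random
import statistics

random.seed(7)  # arbitrary seed for a reproducible run
# Each observation is the sum of 30 independent Uniform(0, 1) draws.
sums = [sum(random.random() for _ in range(30)) for _ in range(5_000)]

sample_mean = statistics.mean(sums)      # close to 15
sample_var = statistics.variance(sums)   # close to 2.5
```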

Central Limit Theorem For N-Variate



Definition:

A theorem stating that the sum of multiple independent random variables approximates a multivariate normal distribution under certain conditions.

Function Of A Random Variable



Definition:

A rule that assigns a new random variable based on a transformation of an existing one, typically denoted as Y=g(X).

Moment Generating Function



Definition:

A function used to describe all moments of a random variable, defined as the expected value of e^(tX) for a real parameter t.

Characteristic Function



Definition:

The Fourier transform of a probability distribution, useful for studying the properties and behaviors of random variables.

Weak Law Of Large Numbers



Definition:

A theorem stating that the sample mean of independent, identically distributed random variables converges in probability to their true mean as the sample size increases.

Strong Law Of Large Numbers



Definition:

A theorem that states the sample mean almost surely converges to the true mean as the sample size grows infinitely large.
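
The two laws differ in their mode of convergence; writing X̄_n for the sample mean of n observations with true mean μ:

```latex
\text{Weak law:}\quad \lim_{n \to \infty} P\big(\,|\bar{X}_n - \mu| > \varepsilon\,\big) = 0 \ \ \text{for every } \varepsilon > 0,
\qquad
\text{Strong law:}\quad P\Big(\lim_{n \to \infty} \bar{X}_n = \mu\Big) = 1.
```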

Multivariate Probability

Bivariate Random Variable



Definition:

A pair of random variables considered together, forming a two-dimensional vector defined on the same sample space.

Joint Cumulative Distribution Function



Definition:

A function that gives the probability that two random variables simultaneously take on values less than or equal to specific values.

Marginal Distribution



Definition:

The probability distribution of one random variable obtained by summing or integrating out the other variable in a joint distribution.
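
For instance, with a joint PMF stored as a dictionary (the probability values are made up for illustration), the marginal of X is obtained by summing over all values of Y:

```python
# Hypothetical joint PMF of two binary random variables X and Y.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal distribution of X: sum the joint probabilities over all y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p
# marginal_x is approximately {0: 0.3, 1: 0.7}
```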

Joint Probability Mass Function



Definition:

A function giving the probability that two discrete random variables simultaneously take on specific values.

Joint Probability Density Function



Definition:

A function representing the probability density of two continuous random variables taking on specific values.

N-Variate Random Variables



Definition:

A set of multiple random variables considered as a vector, defining a multi-dimensional space.